CS 585: Fall 2010: Computational Photography

[Header image: false-color images]

Course: CS 585 – Intermediate Topics in Computer Science: Computational Photography
Instructor: Dr. Nathan Jacobs
Contact: jacobs@cs.uky.edu
Time: MWF 3:00 pm – 3:50 pm
Location: Anderson Tower (a.k.a. FPAT), Room 259

Introduction

Computational Photography is a field of research at the convergence of computer graphics, computer vision and photography. Its goal is to overcome the limitations of the traditional camera by using computational techniques and alternative camera designs to produce a richer, more vivid, perhaps more perceptually meaningful representation of our visual world.

The aim of this advanced undergraduate/graduate course is to study ways in which visual samples of the real world (images and video) can be used to generate compelling computer graphics imagery. We will learn how to acquire, represent, and render scenes using images captured by digital cameras. Several popular algorithms will be presented, with an emphasis on using these techniques to build practical systems. This hands-on emphasis will be reflected in the programming assignments, in which students will have the opportunity to acquire their own images of indoor and outdoor scenes and develop the image analysis and synthesis tools needed to render and view the scenes on the computer.

Topics include:

  • Cameras (traditional and generalized)
  • Image Formation (lenses, shutters, sensors and apertures)
  • Visual Perception
  • Image and Video Processing (filtering, anti-aliasing, pyramids)
  • Image Manipulation (warping, morphing, mosaicing, matting, compositing; a short compositing sketch follows this list)
  • Modeling and Synthesis using Lots of Images (e.g. summarizing a year of images of a scene with one image)
  • High Dynamic Range Imaging and Tone Mapping
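
A minimal sketch of the matting and compositing idea mentioned above, written in Matlab since that is the language used for in-class examples (the file names are placeholders, and this is only an illustration, not a required approach):

    % Composite a foreground over a background using an alpha matte.
    % All three images are assumed to be the same size.
    F = im2double(imread('foreground.png'));   % color foreground image (placeholder file name)
    B = im2double(imread('background.png'));   % color background image (placeholder file name)
    a = im2double(imread('matte.png'));        % grayscale alpha matte with values in [0, 1]
    a = repmat(a, [1 1 3]);                    % replicate the matte across the three color channels
    C = a .* F + (1 - a) .* B;                 % the standard "over" compositing equation
    imwrite(C, 'composite.png');               % save the composite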

Prerequisites

Knowledge of computer programming, linear algebra and calculus is assumed.  Previous experience with computer vision, image processing and/or computer graphics will be very helpful but is not required.  Please contact me if you are interested in the topic but are unsure if you are prepared.

Schedule

Date Topic Additional Information
W 8/25 Jacobs: introduction slides pdf
F 8/27 Jacobs: hand out assignment, intro to Matlab; programming assignment one is out
M 8/30 guest lecture: Matlab and image processing
W 9/1 guest lecture: color
F 9/3 Jacobs: image formation slides pdf
M 9/6 Labor Day (no class)
W 9/8 Jacobs: cameras, color, and compression slides pdf
F 9/10 Jacobs: image processing (spatial filtering) slides pdf
M 9/13 Jacobs: image processing (point processing, frequency domain, morphological) slides pdf
W 9/15 Jacobs: project debriefing, odds and ends; programming assignment two is out
F 9/17 Jacobs: warping and retargeting slides pdf; read: Seam Carving for Content-Aware Image Resizing
M 9/20 Jacobs: compositing, blending, and matting I slides pdf; read: Burt and Adelson, A Multiresolution Spline with Application to Image Mosaics, ACM ToG 1983; Perez et al., Poisson Image Editing, ACM ToG 2003

W 9/22 Jacobs: compositing, blending, and matting II slides pdf; read: Agarwala et al., Interactive Digital Photomontage, SIGGRAPH 2004
F 9/24 Jacobs: compositing, blending, and matting III slides pdf; read: Error-Tolerant Image Compositing, ECCV 2010; A Bayesian Approach to Digital Matting, CVPR 2001
M 9/27 project day
W 9/29 Jacobs: overview of project 3, panorama construction 1 slides pdf; programming assignment three is out
F 10/1 Jacobs: feature-based alignment slides pdf; read: Multi-Image Matching using Multi-Scale Oriented Patches; Recognising Panoramas
M 10/4 Jacobs: feature-based alignment continued
W 10/6 Jacobs: Image Morphing slides pdf
F 10/8 Jacobs: Video Textures slides pdf
M 10/11 Jacobs: Image Synthesis slides pdf
W 10/13 Jacobs: Data Driven Scene Completion slides pdf
F 10/15 Jacobs: Geometric and radiometric camera calibration slides pdf
M 10/18 midterm review + misc
W 10/20 project 3 presentations + misc
F 10/22 midterm
M 10/25 Jacobs: HDR slides pdf
W 10/27 Jacobs: Tone Mapping using the Bilateral Filter slides pdf
F 10/29 Jacobs: Vignette and Exposure Calibration and Compensation slides pdf; read: Vignette and Exposure Calibration and Compensation, Goldman and Chen; programming assignment four is out
M 11/01 Jacobs: Coded Aperture; read: Image and Depth from a Conventional Camera with a Coded Aperture, Levin, Fergus, Durand and Freeman, SIGGRAPH 07 pdf; project 4 due on Tuesday at 11:59pm
W 11/03 Jacobs: Light Fields
F 11/05 Harris: Multi-Flash Imaging; read: Non-photorealistic Camera: Depth Edge Detection and Stylized Rendering using Multi-Flash Imaging, Raskar et al.
M 11/08 Ti: Reinterpretable Imager; read: Reinterpretable Imager: Towards Variable Post-Capture Space, Angle and Time Resolution in Photography
W 11/10 Unnikrishnan: Coded Strobing; read: Coded Strobing Photography: Compressive Sensing of High-speed Periodic Events, Ashok Veeraraghavan, Dikpal Reddy, Ramesh Raskar
F 11/12 Islam: Motion Deblurring; read: Motion-based Motion Deblurring, M. Ben-Ezra and S.K. Nayar
M 11/15 Fu: Panoramas; read: Multi-Image Matching using Multi-Scale Oriented Patches
W 11/17 Mihail: Shift-Map Image Editing; read: the pdf on the Shift-Map Image Editing website
F 11/19 Jacobs: TBD; present project plan (5 minutes)
M 11/22 Project Work Day
W 11/24 Thanksgiving Holiday, no class
F 11/26 Thanksgiving Holiday, no class
M 11/29 Jacobs: Image-Based Lighting slides pdf; brief project updates
W 12/01 Jacobs: Compressive Sensing
F 12/03 Jacobs: TBD
M 12/06 Final Project Presentations (roughly 20 minutes each)
W 12/08 Final Project Presentations
F 12/10 No class
M 12/13 Final project report due

Reading List

Active Research Groups (see recent publications)

Extended Sensing (coded exposure and aperture)

  • Reinterpretable Imager: Towards Variable Post-Capture Space, Angle and Time Resolution in Photography; Agrawal, Veeraraghavan, and Raskar, EUROGRAPHICS 2010, pdf
  • Image and Depth from a Conventional Camera with a Coded Aperture; Levin, Fergus, Durand and Freeman, SIGGRAPH 2007, pdf
  • Programmable Aperture Camera Using LCoS, Nagahara et al, ECCV 2010, pdf
  • An Introduction to Compressive Sampling; Candes and Wakin, pdf

Flash/No-Flash

  • Non-photorealistic Camera: Depth Edge Detection and Stylized Rendering using Multi-Flash Imaging, Raskar et al, link
  • Digital Photography with Flash and No-Flash Image Pairs, Petschnigg et al, link

Light Fields

  • Light Field Rendering, Levoy and Hanrahan, SIGGRAPH 96, pdf, project
  • The Lumigraph; Gortler et al, SIGGRAPH 96, pdf

Panorama and Alignment

  • Multi-Image Matching using Multi-Scale Oriented Patches; Brown, Szeliski, and Winder, pdf
  • Squaring the Circle in Panoramas, Zelnik-Manor et al, pdf
  • Recognising Panoramas; Brown and Lowe, pdf
  • Seamless Image Stitching in the Gradient Domain; Levin et al, pdf
  • Scene Collages and Flexible Camera Arrays; Nomura, Zhang, and Nayar, EUROGRAPHICS 2007, pdf

Camera Calibration and HDR

  • Recovering High Dynamic Range Radiance Maps from Photographs; Debevec and Malik, SIGGRAPH 1997, pdf
  • Radiometric Self Calibration; Mitsunaga and Nayar, pdf
  • Vignette and Exposure Calibration and Compensation; Goldman and Chen, ICCV 2005, pdf

Post Processing

  • Poisson Image Editing; Perez et al, pdf
  • Seeing Mt. Rainier: Lucky Imaging for Multi-Image Denoising, Sharpening, and Haze Removal; Joshi and Cohen, ICCP 2010, pdf
  • Seam Carving for Content-Aware Image Resizing; Avidan and Shamir, pdf
  • A Multiresolution Spline With Application to Image Mosaics; Burt and Adelson, pdf
  • A Closed Form Solution to Natural Image Matting; Levin, Lischinski, and Weiss, pdf
  • Error-tolerant Image Compositing; Tao et al, pdf
  • Fast Bilateral Filtering for the Display of High-Dynamic-Range Images; Durand and Dorsey, pdf

Grading

Your final grade will be based approximately on the following distribution:

  • 50% programming assignments (results, report, and presentation)
  • 10% participation (paper discussions)
  • 20% midterm examination
  • 20% final project (including a written report and presentation)

Grades will be assigned according to the following scale:

A = 90-100
B = 80-89
C = 70-79
D = 60-69

Late Assignment Policy

Each student may use three free “late days” over the whole course.  The first 24 hours after the due date and time count as one late day, up to 48 hours counts as two, and up to 72 hours counts as three.  Once the free “late days” are used, each additional late day costs 20% of the assignment grade.
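For example, a student who has already used all three free late days and submits an assignment 30 hours after the deadline would be charged two additional late days, for a 40% penalty on that assignment.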

Programming assignments

There will be 5 programming assignments.  For each assignment, students will be expected to submit their source code, sample program outputs, and give a brief in-class presentation.  In addition, students will create a web page summarizing the results, methods used, and lessons learned.

Final Project

In addition to a set of predefined programming assignments, students will be allowed some freedom in choosing a final project, which will most likely be an implementation or an extension of recent research results.  Students may work individually or in groups of two.

As with the programming assignments, students will create a webpage summarizing the results, submit their source code, and give an in-class presentation.  In addition, students will prepare a written report in the style of a conference or workshop submission (more details will be given as the semester proceeds).

Details

Textbook

There will be no required textbook; instead, readings will be drawn from original research papers, Computer Vision: Algorithms and Applications (Szeliski), and other web resources.

Assignments and Programming Languages

Students can use any programming language and environment they desire.  However, in-class examples will be given in Matlab, and students are strongly encouraged to use Matlab with the Image Processing Toolbox for all work.  Experience has shown that using lower-level languages, such as C++, significantly hinders progress on projects.  Students may find Python with SciPy, NumPy, and OpenCV an acceptable alternative to Matlab.
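
As a rough sense of what day-to-day work in Matlab looks like, here is a minimal sketch that reads an image, smooths it with a Gaussian filter from the Image Processing Toolbox, and saves the result (the file names are placeholders):

    img  = im2double(imread('photo.jpg'));   % read an image and convert to double in [0, 1]
    gray = rgb2gray(img);                    % convert to grayscale
    h    = fspecial('gaussian', [9 9], 2);   % 9x9 Gaussian kernel with sigma = 2
    blur = imfilter(gray, h, 'replicate');   % spatial filtering, replicating pixels at the image border
    imshow(blur);                            % display the result
    imwrite(blur, 'photo_blurred.png');      % write it back out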

If you need help with Matlab, I highly recommend the following resources:

Cameras

Students are encouraged to obtain a digital camera (preferably one which allows manual control, even more preferably a DSLR) for this course.  This is not a requirement, but it will make the class more fun for you.

Learning Outcomes

Students will understand the image formation process of modern cameras. In addition, they will understand ways in which adding computation to the imaging process extends and improves the standard camera. More specifically, students will:

  1. Understand the relationship between light and pixel intensity values for standard cameras (i.e. the image formation process; see the brief note after this list)
  2. Analyze methods for and design a system that implements camera calibration
  3. Understand basic algorithms commonly found in modern cameras, such as auto-focus, auto-color balance, and auto-exposure compensation
  4. Understand several low-level algorithms for post-processing intensity measurements to form images (e.g. warping, demosaicing, morphing, compositing, flash/no-flash, and super-resolution)
  5. Analyze methods for and design systems that solve higher-level computational photography problems, such as time-lapse summarization, panorama construction, high-dynamic range imaging, and image re-targeting
  6. Analyze and understand several alternatives to the standard camera
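
A brief note on outcome 1: one common simplified model (the one used in the Debevec and Malik paper in the reading list above) treats a pixel value Z as Z = f(E * t), where E is the irradiance reaching the pixel, t is the exposure time, and f is the camera's (usually nonlinear) response function. Recovering E from one or more observed values of Z is the heart of the radiometric calibration and high dynamic range topics.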

Course Evaluation Questions

The course has helped me:

  1. Understand the process of digital image formation
  2. Understand the effect of and methods for camera calibration
  3. Understand and be able to apply computational photography algorithms, such as warping, aligning, and compositing
  4. Understand how computation can expand the capabilities of the traditional camera

Acknowledgments

The materials from this class rely on slides prepared by others including: Robert Pless, Alexei A. Efros, Steve Seitz, Rick Szeliski, Paul Debevec, Stephen Palmer, Paul Heckbert, David Forsyth, Steve Marschner and others, as noted in the slides. Feel free to use these slides for academic or research purposes, but please maintain all acknowledgments.

Similar Courses at other Universities

Links

Academic Dishonesty

Short version: Use good judgment and ask me if you have questions.  In written reports and presentations, you should acknowledge all sources you used (including others with whom you discussed your project).

Longer version: Individual work (programming, exams) must be your own. You may discuss ideas with others, but no sharing of computer code or other work will be allowed. Group projects allow the sharing of ideas and computer code within the group; no sharing of work between groups will be acceptable. The University of Kentucky’s guidelines regarding academic dishonesty will be strictly enforced (see section 6.3 of the student code for details).

