Computational Photography: Computational Optics
Jongmin Baek
CS 478 Lecture, Wednesday, February 29, 2012


Page 1 (source: graphics.stanford.edu/.../lectures/02292012_computational_optics.pdf)

Computational Photography: Computational Optics

Jongmin Baek

CS 478 Lecture, Feb 29, 2012

Page 2

Camera as a Black Box

(figure: the world's 4D light field, parameterized by (s, t, u, v), enters the imaging system; the sensor records a 2D image)

An imaging system is a function that maps 4D input to 2D output.
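This black-box view can be made concrete with a toy discretization. Everything below is made up for illustration: a random 4D array stands in for the light field L(u, v, s, t), and two different reductions to 2D stand in for two settings of the imaging system.

```python
import numpy as np

# Hypothetical discretized 4D light field L[u, v, s, t]:
# (u, v) indexes position on the aperture, (s, t) position on the sensor.
rng = np.random.default_rng(0)
U, V, S, T = 4, 4, 8, 8
L = rng.random((U, V, S, T))

# A conventional exposure integrates every aperture sample equally,
# collapsing the 4D light field to a 2D image: one linear map among many.
image = L.mean(axis=(0, 1))                  # shape (S, T)

# Stopping down the aperture is a *different* map over the same input:
# only the central aperture samples contribute.
stopped_down = L[1:3, 1:3].mean(axis=(0, 1))

print(image.shape, stopped_down.shape)       # (8, 8) (8, 8)
```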


Page 3

Camera as a Black Box

(figure: the same 4D-light-field-to-2D-image diagram, now with several alternative imaging systems)

By changing parameters (e.g. focus), we can obtain a different mapping.


Page 4

Camera as a Black Box

• What is the space of all mappings we can reasonably obtain?

• Clearly not all f: (ℝ⁴→ℝ) → (ℝ²→ℝ)

• Are all mappings useful?

• Consider f: x ↦ (g: y ↦ 0), ∀x ∈ (ℝ⁴→ℝ).

• Do all mappings yield “images”?

Page 5

Overview

• Coded Aperture

• Spatial coding

• Amplitude

• Phase

• Temporal coding

• Wavelength coding

• Other stuff

Page 6

“Hand-wavy” Wave Optics Tutorial

(figure: isotropic emitter, thin lens, pixel)

We want all these waves to interfere constructively at the pixel.

Page 7

“Hand-wavy” Wave Optics Tutorial

(figure: isotropic emitter, thin lens, pixel)

We want all these waves to interfere destructively at the pixel.

Page 8

“Hand-wavy” Wave Optics Tutorial

• Lens

• Controls how wavefronts from the scene interfere at the sensor.

• Ideally, all wavefronts from a single point source interfere constructively at a pixel, and other wavefronts interfere destructively at that pixel.

• “Perfect” imaging system.

Page 9

“Hand-wavy” Wave Optics Tutorial

• A perfect imaging system is impossible.

• Defocus blur: It’s hard to make all the waves interfere 100% constructively, for objects at arbitrary depth.

• Diffraction: It’s hard to make something interfere 100% constructively, and something 𝜀-away interfere 100% destructively.

• But...

Page 10

“Hand-wavy” Wave Optics Tutorial

... after some math ... (refer to any optics textbook)

Page 11

“Hand-wavy” Wave Optics Tutorial

• Sinusoidal patterns are perfectly imaged.

• Same frequency, potentially lower magnitude

Page 12

Imaging in Fourier Domain

• Any signal can be written as a sum of sinusoids.

• We know how each sinusoid is imaged.

• Imaging is linear.

• Figure out what the imaging system does to each signal, and add up results!
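The last two slides can be checked numerically. A toy sketch (the box PSF and all sizes are arbitrary): blurring a sinusoid returns a sinusoid at the same frequency, scaled by the system's response at that frequency.

```python
import numpy as np

n = 256
x = np.arange(n)
sinusoid = np.cos(2 * np.pi * 5 * x / n)      # 5 cycles across the domain

# Toy PSF: a normalized 9-sample box blur, centered at sample 0
# (circular convolution, so shift-invariance is exact on this domain).
psf = np.zeros(n)
psf[:9] = 1 / 9
psf = np.roll(psf, -4)

blurred = np.real(np.fft.ifft(np.fft.fft(sinusoid) * np.fft.fft(psf)))

# The output is the same cosine, scaled by the kernel's frequency
# response at 5 cycles: same frequency, lower magnitude.
gain = np.real(np.fft.fft(psf))[5]
assert np.allclose(blurred, gain * sinusoid, atol=1e-9)
print("gain at 5 cycles:", round(gain, 4))
```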

Page 13

Imaging in Fourier Domain

(figure: FOURIER TRANSFORM decomposes the input into sinusoids with weights α₁, α₂, α₃; the imaging system maps each weight; INVERSE FOURIER TRANSFORM recomposes the output)

Page 14

Imaging in Fourier Domain

• (Traditional) Imaging system

• A multiplicative filter in Fourier domain.

• This filter is called the Optical Transfer Function (OTF).

• The magnitude of the filter is called the Modulation Transfer Function (MTF).

• A convolution in the spatial domain.

• This kernel is called the Point Spread Function (PSF).
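A sketch of these definitions on a 1D sensor line (the Gaussian PSF here is hypothetical; any blur kernel works the same way):

```python
import numpy as np

n = 128
# Toy PSF: a Gaussian blur kernel centered on the line, unit total light.
psf = np.exp(-0.5 * (np.arange(n) - n // 2) ** 2 / 2.0)
psf /= psf.sum()

otf = np.fft.fft(np.fft.ifftshift(psf))   # OTF: Fourier transform of the PSF
mtf = np.abs(otf)                         # MTF: magnitude of the OTF

# DC gain is 1 (blur preserves total light), and the MTF falls off with
# frequency: fine detail is attenuated, coarse detail mostly survives.
print(round(mtf[0], 6), mtf[1] > mtf[10] > mtf[40])
```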

Page 15

Aperture Coding

• Why insist on a circular aperture?

• What kind of aperture should we use?

(Levin 2007)

Page 16

Circular Aperture

• Let’s consider the circular aperture.

• Imagine 2D world.

• The aperture is now a 1D slit.

(figure: transmittance as a function of x: a rect profile)

Page 17

Circular Aperture

(figure: point spread function and modulation transfer function, focused case)

Page 18

Circular Aperture

MTF for a 1D slit at various misfocus ψ

(figures stolen from self)

Page 19

Circular Aperture

MTF as a function of misfocus, at various frequencies

Page 20

Stopping Down

MTF as a function of misfocus, at various frequencies

Aperture size at 100%, 80%, 60%, 40% with equal exposure.

Page 21

Desiderata

• Given an aperture, we can generate these plots mathematically*.

• What kind of aperture do we want?

• What kind of plot is ideal?

*Are you sure you want to know?
OTF(fx, fy, ψ) = ∬ p(t1 − fx/2, t2 − fy/2) p*(t1 + fx/2, t2 + fy/2) e^{2i(t1·fx + t2·fy)ψ} dt1 dt2,
where p is the aperture's pupil function.

For aperture with large features, one can estimate the PSF by the aperture shape, scaled by misfocus.
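The footnoted integral can be evaluated numerically. Below is a 1D discretization for a slit aperture p(t) = 1 on [−1/2, 1/2], with constants and normalization simplified; in focus it reproduces the triangle-shaped MTF behind the earlier plots.

```python
import numpy as np

def otf_1d(f, psi, samples=400001):
    # Riemann-sum version of the OTF integral for a 1D slit aperture.
    t = np.linspace(-1.0, 1.0, samples)
    p_minus = (np.abs(t - f / 2) <= 0.5).astype(float)
    p_plus = (np.abs(t + f / 2) <= 0.5).astype(float)
    vals = p_minus * p_plus * np.exp(2j * t * f * psi)
    return vals.sum() * (t[1] - t[0])

def mtf(f, psi):
    return np.abs(otf_1d(f, psi)) / np.abs(otf_1d(0.0, 0.0))

# In focus (psi = 0), the slit's MTF is the triangle max(0, 1 - |f|):
print([round(mtf(f, 0.0), 3) for f in (0.25, 0.5, 0.75)])   # [0.75, 0.5, 0.25]
# Misfocus only ever attenuates a frequency, never amplifies it:
print(mtf(0.5, 8.0) < mtf(0.5, 0.0))                        # True
```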

Page 22

Depth Invariance?

• Do we want the frequency response to be constant w.r.t. misfocus (equivalently, depth)?

• Would be useful for all-focus image.

• Do we want the frequency response to vary wildly w.r.t. misfocus (equivalently, depth)?

• Would be useful as depth cues, for depthmap generation.


Page 24

Image and Depth from a Conventional Camera with a Coded Aperture

Levin et al., SIGGRAPH 2007

• Pick an aperture whose OTF varies much with depth.

• Random search.

• Restrict search to binary 11×11 patterns.

• Maximize K-L divergence among OTFs.

• Calculate PSF for each depth.
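A toy version of this search loop. The paper scores apertures by K-L divergence between distributions of blurred images under an image prior; as a crude stand-in, the sketch below uses K-L divergence between normalized PSF power spectra, with the scaled-aperture PSF approximation from a few slides back. All sizes and scoring details are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def psf_at_depth(aperture, scale):
    # Crude approximation: the PSF is the aperture shape, scaled up by
    # misfocus ("scale" plays the role of depth), embedded in a fixed grid.
    reps = np.repeat(np.repeat(aperture, scale, axis=0), scale, axis=1)
    out = np.zeros((44, 44))
    out[:reps.shape[0], :reps.shape[1]] = reps
    return out / out.sum()

def kl(p, q, eps=1e-8):
    return np.sum(p * np.log((p + eps) / (q + eps)))

def score(aperture, scales=(1, 2, 3)):
    # Normalized power spectra of the depth-dependent PSFs, scored by
    # total pairwise K-L divergence: higher = more depth-discriminative.
    spectra = [np.abs(np.fft.fft2(psf_at_depth(aperture, s))) ** 2 for s in scales]
    spectra = [sp / sp.sum() for sp in spectra]
    return sum(kl(spectra[i], spectra[j])
               for i in range(len(spectra)) for j in range(len(spectra)) if i != j)

# Random search over binary 11x11 aperture patterns.
best = max((rng.integers(0, 2, (11, 11)) for _ in range(200)), key=score)
print(best.shape, np.isfinite(score(best)))
```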

Page 25

Image and Depth from a Conventional Camera with a Coded Aperture

Levin et al., SIGGRAPH 2007

• Steps

• Take a picture.

Page 26

Image and Depth from a Conventional Camera with a Coded Aperture

Levin et al., SIGGRAPH 2007

• Steps

• Try deconvolving with each candidate PSF.

• Convolve again with PSF, subtract from picture to compute error.

• For each region, pick the PSF (hence depth) that gives the minimal error.

• Regularize depthmap.
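A minimal 1D sketch of this select-by-reconvolution-error loop. The paper deconvolves with a sparse derivative prior; here that is replaced by simple regularized (Wiener-style) inversion, and the candidate PSFs are made-up box blurs standing in for depths.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 256

def box_psf(width):
    # Candidate PSF for one hypothesized depth: a centered box blur.
    k = np.zeros(n)
    k[:width] = 1.0 / width
    return np.roll(k, -(width // 2))

candidates = {depth: box_psf(w) for depth, w in enumerate((3, 7, 15))}

# Simulate a capture blurred by the depth-1 PSF (width 7), plus noise.
scene = rng.random(n)
captured = np.real(np.fft.ifft(np.fft.fft(scene) * np.fft.fft(candidates[1])))
captured += 0.001 * rng.standard_normal(n)

def residual(psf, y, reg=1e-3):
    H = np.fft.fft(psf)
    # Deconvolve (regularized inverse), re-blur, compare with the capture.
    est = np.fft.fft(y) * np.conj(H) / (np.abs(H) ** 2 + reg)
    resim = np.real(np.fft.ifft(est * H))
    return np.sum((resim - y) ** 2)

best_depth = min(candidates, key=lambda d: residual(candidates[d], captured))
print("selected depth:", best_depth)   # 1, the true depth
```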

Page 27

Image and Depth from a Conventional Camera with a Coded Aperture

Levin et al., SIGGRAPH 2007

Resulting Depthmap

Page 28

Image and Depth from a Conventional Camera with a Coded Aperture

Levin et al., SIGGRAPH 2007

• You can do all this with a circular aperture.

• The result won’t be as good, though.

Page 29

Next Step

• Instead of modulating the aperture amplitude (transmittance), we could modulate the phase as well.

• Upside: No light lost.

• Downside: Larger space of unknowns.

Page 30

Phase Coding

• (Parabolic) Lens already modulates the phase.

• Add an additional refractive element.

(figure: a lens plus an added phase plate)

Page 31

Depth Invariance?

• Do we want the frequency response to be constant w.r.t. misfocus?

• Would be useful for all-focus image.

• Do we want the frequency response to vary wildly w.r.t. misfocus?

• Would be useful as depth cues, for depthmap generation.

Page 32

Extended Depth of Field through Wavefront Coding

Dowski et al., Applied Optics 1995

• Design a phase plate such that the MTF is the same across depth.

• A regular lens is parabolic, or quadratic.

• Instead, use a lens whose profile is cubic.

(figure: regular lens profile vs. cubic phase plate profile)

Page 33

Extended Depth of Field through Wavefront Coding

Dowski et al., Applied Optics 1995

• How does it work?

• A regular lens is parabolic, or quadratic.

• The 2nd derivative determines plane of focus.

Page 34

Extended Depth of Field through Wavefront Coding

Dowski et al., Applied Optics 1995

• How does it work?

• A regular lens is parabolic, or quadratic.

• The 2nd derivative determines plane of focus.

(figure: parabolic lens profiles with varying 2nd derivative)

Page 35

Extended Depth of Field through Wavefront Coding

Dowski et al., Applied Optics 1995

• How does it work?

• A regular lens is parabolic, or quadratic.

• The 2nd derivative determines plane of focus.

• A cubic lens is locally quadratic with varying 2nd derivative.

• Different parts of the lens “focus” at different depth!

Page 36

Extended Depth of Field through Wavefront Coding

Dowski et al., Applied Optics 1995

• How does it work?

• Therefore, regardless of depth, the object will be:

• in focus (small PSF) for some parts of the lens

• blurry (large PSF) for other parts of the lens

• The overall PSF will be the sum.

• More or less depth-invariant.

• Deconvolve with a single PSF to recover scene.
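A toy geometric-ray check of this argument. Assume a ray through aperture position x lands, up to constants, at ψ·x + φ′(x), where φ(x) = a·x³ is the plate profile and ψ the misfocus; the PSF is then the density of landing positions. The constants and units are made up.

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 200001)   # aperture coordinate
a = 10.0                             # cubic strength (made-up units)

def psf_hist(psi, cubic=True):
    # Landing position of each ray; phi'(x) = 3*a*x^2 for the cubic plate,
    # 0 for a plain (already-focused) lens with pure defocus psi*x.
    landing = psi * x + (3 * a * x**2 if cubic else 0.0)
    h, _ = np.histogram(landing, bins=60, range=(-8, 40), density=True)
    return h

# With the cubic plate, the PSF barely changes between two depths...
d_cubic = np.abs(psf_hist(1.0) - psf_hist(4.0)).sum()
# ...while plain defocus blur changes drastically with depth:
d_plain = np.abs(psf_hist(1.0, cubic=False) - psf_hist(4.0, cubic=False)).sum()
print(d_cubic < d_plain)   # True
```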

Page 37

Extended Depth of Field through Wavefront Coding

Dowski et al., Applied Optics 1995

(figure: regular lens vs. cubic phase plate, deblurred)

Page 38

Depth from Diffracted Rotation
Greengard et al., Optics Letters 2006

Aside: Can also design phase plate to be depth-variant.

Page 39

4D Frequency Analysis of Computational Cameras for Depth of Field Extension

Levin et al., SIGGRAPH 2009

• Similar idea

• Have parts of the lens focus at different depths.

“Lattice Focal Lens”

Page 40

4D Frequency Analysis of Computational Cameras for Depth of Field Extension

Levin et al., SIGGRAPH 2009

• Similar idea

• Have parts of the lens focus at different depths.

(figure: regular lens vs. lattice focal lens, deconvolved)

Page 41

Diffusion Coded Photography for Extended Depth of Field
Cossairt et al., SIGGRAPH 2010

• Put a radial diffuser in front of the lens.

Page 42

Diffusion Coded Photography for Extended Depth of Field
Cossairt et al., SIGGRAPH 2010

• Idea

• Add a random diffuser (surface gradient is sampled randomly from a probability distribution)

• This makes the PSF stochastic, and ultimately less dependent on ray angles, leading to depth invariance.


Page 43

Diffusion Coded Photography for Extended Depth of Field
Cossairt et al., SIGGRAPH 2010


PSF is indeed depth-invariant.

Page 44

Diffusion Coded Photography for Extended Depth of Field
Cossairt et al., SIGGRAPH 2010

(figure: regular photos vs. deblurred output)

Page 45

Next Step

• We’ve looked at techniques that modulate the aperture spatially.

• Why not try temporally?

• Change modulation over time.

Page 46

Flexible Depth of Field Photography
Nagahara et al., ECCV 2008

• Translate the sensor over the exposure time.

• Equivalent to simulating lens of different focal lengths over time.


Page 48

Flexible Depth of Field Photography
Nagahara et al., ECCV 2008

• Other applications

• Could move the sensor non-linearly

• Discontinuous depth of field?

• Combine with rolling shutter

• Tilt-shift

Page 49

Flexible Depth of Field Photography
Nagahara et al., ECCV 2008

Page 50

More ways for Temporal Coding

• One can also code the aperture temporally by engaging the shutter over time.

• Could even use an electronic shutter.

Page 51

Coded Exposure Photography: Motion Deblurring using Fluttered Shutter

Raskar et al., SIGGRAPH 2006

• LCD shutter flutters in order to block/unblock light during exposure.

Page 52

Coded Exposure Photography: Motion Deblurring using Fluttered Shutter

Raskar et al., SIGGRAPH 2006
(slide stolen from Ramesh Raskar)

Page 53

Coded Exposure Photography: Motion Deblurring using Fluttered Shutter

Raskar et al., SIGGRAPH 2006
(slide stolen from Ramesh Raskar)

Creates a better-conditioned motion blur!

Page 54

Coded Exposure Photography: Motion Deblurring using Fluttered Shutter

Raskar et al., SIGGRAPH 2006

Motion blur can be inverted easily!

Page 55

Next Step

• We’ve tried modulating capture based on

• where the ray passes through the aperture

• when the ray passes through the aperture

• Instead, let’s move the entire camera.

Page 56

Motion Invariant Photography
Levin et al., SIGGRAPH 2008

• Motivation

• If there is an object that travels at a constant speed, you can image it sharply by moving the camera linearly at some velocity.


Page 57

Motion Invariant Photography
Levin et al., SIGGRAPH 2008

• The entire camera moves during exposure in a parabola.

Page 58

Motion Invariant Photography
Levin et al., SIGGRAPH 2008

• If there is an object that travels parallel to the image plane, at some point in time the camera motion will mirror the object exactly.

• Object is momentarily imaged sharply.

• At other times, it will be somewhat blurry, blurry, very blurry, etc.

• Above happens independent of object speed!

• Motion-invariant motion blur!
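A toy check of this claim. With camera position c(t) = a·t², an object at constant speed v traces x(t) = v·t − a·t² relative to the camera; up to a shift of v²/(4a), the blur kernel has (nearly) the same shape for every speed in the working range, unlike the box blur a static camera would record. All constants below are made up.

```python
import numpy as np

a = 5.0                                      # parabolic sweep strength
t = np.linspace(-1.0, 1.0, 400001)           # exposure interval
bins = np.linspace(-8.0, 0.5, 171)           # common histogram grid

def parabolic_kernel(v):
    # Blur path under parabolic camera motion, known shift removed so
    # that kernels for different speeds can be compared directly.
    path = v * t - a * t**2 - v**2 / (4 * a)
    h, _ = np.histogram(path, bins=bins, density=True)
    return h

def static_kernel(v):
    # For comparison: a static camera sees a plain box blur of width 2|v|.
    h, _ = np.histogram(v * t, bins=bins, density=True)
    return h

d_parab = np.abs(parabolic_kernel(1.0) - parabolic_kernel(2.0)).sum()
d_static = np.abs(static_kernel(1.0) - static_kernel(2.0)).sum()
print(d_parab < d_static)   # True: parabolic blur is nearly speed-invariant
```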


Page 59

Motion Invariant Photography
Levin et al., SIGGRAPH 2008

Alt-tab to video?

Page 60

Motion Invariant Photography
Levin et al., SIGGRAPH 2008

(figure: scene, captured, deblurred)

Page 61

Next Step

• While we are at it, let’s move both the lens and the sensor, independently.

Page 62

Image Destabilization: Programmable Defocus using Lens and Sensor Motion

Mohan et al., ICCP 2009


Page 63

Image Destabilization: Programmable Defocus using Lens and Sensor Motion

Mohan et al., ICCP 2009


• Translate both the lens and the sensor laterally.

• Depending on their relative speed, there exists a 3D point in the scene that is imaged by the same pixel.

• Remains sharp.

• Other points are effectively motion-blurred.

Page 64

Image Destabilization: Programmable Defocus using Lens and Sensor Motion

Mohan et al., ICCP 2009


(figure: regular camera vs. result)

Page 65

Next Step

• Can we modulate capture based on something entirely different?

• Wavelength?

Page 66

Spectral Focal Sweep: Extended Depth of Field from Chromatic Aberration

Cossairt et al., ICCP 2010

• Have a lens that maximizes axial chromatic aberration.

• Different wavelengths focus at different depths!

• If the scene spectrum is broadband,

• We’re effectively doing a focal sweep!


Page 67

Spectral Focal Sweep: Extended Depth of Field from Chromatic Aberration

Cossairt et al., ICCP 2010


Page 68

Spectral Focal Sweep: Extended Depth of Field from Chromatic Aberration

Cossairt et al., ICCP 2010

(figure: conventional camera vs. SFS lens vs. deblurred output)

Page 69

Spectral Focal Sweep: Extended Depth of Field from Chromatic Aberration

Cossairt et al., ICCP 2010

(figure: conventional camera vs. deblurred output)

For color, transform to YUV and deblur Y only.

Page 70

Other Cool Stuff

• Coded aperture projection

• Periodic motion

• http://www.umiacs.umd.edu/~dikpal/Projects/codedstrobing.html

• Interaction with rolling shutter

Page 71

Coded Aperture Projection
Grosse et al., SIGGRAPH 2010

• Pick coded aperture that creates depth-invariant blur.

• Could be adaptive.

• Before projecting, convolve image with the inverse of that aperture. (Ensures that the image looks fine.)
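A 1D sketch of this pre-correction step (the blur and all sizes are made up): divide the image by the projector's OTF before projection, so that the subsequent optical blur cancels.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 128
image = rng.random(n)

# Toy depth-invariant projector blur: a centered 5-sample box PSF.
psf = np.zeros(n)
psf[:5] = 0.2
psf = np.roll(psf, -2)
H = np.fft.fft(psf)

# Regularized inverse filter; a real system needs |H| bounded away from
# zero, which is exactly why the aperture code is chosen carefully.
pre = np.real(np.fft.ifft(np.fft.fft(image) * np.conj(H) / (np.abs(H) ** 2 + 1e-6)))

# "Projection" then applies the optical blur, recovering the image.
projected = np.real(np.fft.ifft(np.fft.fft(pre) * H))
print(np.abs(projected - image).max())   # small: the blur has been cancelled
```

One caveat: a real projector cannot emit the negative intensities an inverse filter can produce; handling that constraint is part of the actual method.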


Page 72

Coded Aperture Projection
Grosse et al., SIGGRAPH 2010

Depth of field increased!

Page 73

Questions?

Page 74

Cited Papers

• Levin et al., “Image and Depth from a Conventional Camera with a Coded Aperture.” SIGGRAPH, 2007.

• Dowski et al., “Extended Depth of Field through Wavefront Coding.” Applied Optics, 1995.

• Greengard et al., “Depth from Diffracted Rotation.” Optics Letters, 2006.

• Levin et al., “4D Frequency Analysis of Computational Cameras for Depth of Field Extension.” SIGGRAPH, 2009.

• Cossairt et al., “Diffusion Coded Photography for Extended Depth of Field.” SIGGRAPH, 2010.

• Nagahara et al., “Flexible Depth-of-Field Photography.” ECCV, 2008.

• Raskar et al., “Coded Exposure Photography: Motion Deblurring using Fluttered Shutter.” SIGGRAPH, 2006.

• Levin et al., “Motion Invariant Photography.” SIGGRAPH, 2008.

• Mohan et al., “Image Destabilization: Programmable Defocus using Lens and Sensor Motion.” ICCP, 2009.

• Cossairt and Nayar. “Spectral Focal Sweep: Extended Depth of Field from Chromatic Aberration.” ICCP, 2010.

• Grosse et al. “Coded Aperture Projection.” SIGGRAPH, 2010.
