
Copyright is held by the author / owner(s). SIGGRAPH 2012, Los Angeles, California, August 5 – 9, 2012. ISBN 978-1-4503-1435-0/12/0008

HDRchitecture: Real-time Stereoscopic HDR Imaging for Extreme Dynamic Range

Steve Mann, Raymond Lo, Jason Huang, Valmiki Rampersad, and Ryan Janzen
University of Toronto, Faculty of Engineering, Arts and Sciences, and Forestry

Figure 1: The EyeTap welding helmet causes the eyes themselves, in effect, to become both cameras and displays by providing exact POE (Point-of-Eye) capture for a stereo head-up display.

1. BACKGROUND ON HDR
The history of HDR (high dynamic range) imaging in digital photography goes back 19 years, as Robertson et al. state [2]:

“The first report of digitally combining multiple pictures of the same scene to improve dynamic range appears to be Mann [1].”

HDR combines multiple differently exposed pictures of the same subject matter to reveal a much greater dynamic range than is possible in a single exposure.
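As an illustration only (not the comparametric method of Mann [1]), the following NumPy sketch merges a set of differently exposed, already-linearized frames into a single radiance estimate. The function name merge_exposures, the triangular weighting, and the assumption of known exposure times are ours, not the paper's.

    import numpy as np

    def merge_exposures(images, exposure_times):
        """Estimate a linear radiance map from differently exposed frames.

        images: list of float arrays in [0, 1], assumed already linearized.
        exposure_times: relative exposure time for each frame.
        A triangular weight favours mid-range pixels, which are well
        exposed in at least one of the frames.
        """
        num = np.zeros_like(images[0], dtype=np.float64)
        den = np.zeros_like(images[0], dtype=np.float64)
        for img, t in zip(images, exposure_times):
            w = 1.0 - np.abs(2.0 * img - 1.0)   # low weight near 0 and 1
            num += w * img / t                  # back-project to scene radiance
            den += w
        return num / np.maximum(den, 1e-6)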

2. REALTIME HDR VIDEO
In this demonstration, we present extreme HDR adapted for use in GlassEyes™ (EyeTap) electric arc welding helmets, as well as for a general-purpose seeing aid.

Our “WeldCam HDRchitecture” (abbreviated “HDRchitecture”) system uses one or more cameras and optional active illumination, and can be used by welding schools and professionals to inspect welding in real time. We present HDRchitecture as either a fixed camera system (e.g. for use on a tripod), or as a stereo EyeTap cybernetic welding helmet (see Figure 1) that records and streams live video to observers, nearby or remote. By capturing a dynamic range of more than a million to one, we can see details that cannot be seen by the human eye or any currently existing commercially available cameras.

We present a highly parallelizable and computationally efficient HDR reconstruction and tonemapping algorithm for extreme dynamic range scenes. In comparison to existing HDR work, our system runs in real-time and requires no user intervention or fine-tuning of parameters. It renders high-quality images at resolutions up to 1920×1080. HDRchitecture uses FPGAs, GPUs, or multi-core CPUs for real-time (30 to 120 frames/sec) stereoscopic HDR processing.
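To illustrate why such processing parallelizes well, here is a minimal, generic tonemapping sketch (not necessarily the operator used in HDRchitecture): apart from two global reductions, every output pixel depends only on its own input, so the computation maps naturally onto GPU threads, FPGA pipelines, or CPU cores.

    import numpy as np

    def tonemap(radiance, eps=1e-6):
        """Compress an HDR radiance estimate into [0, 1] for display.

        Log compression followed by normalization; each output pixel is a
        pure function of its own input plus two global min/max reductions,
        which is what makes the operator embarrassingly parallel.
        """
        log_r = np.log(radiance + eps)
        lo, hi = float(log_r.min()), float(log_r.max())
        return (log_r - lo) / max(hi - lo, eps)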

Figure 2: (Left) State-of-the-art digital cameras cannot capture the extreme dynamic range of TIG welding. The electric arc is overexposed, yet the surroundings are underexposed. (Right) Single frame from the real-time HDR video glasses. Our system captures 120 frames/sec.

Figure 3: HDR technology developed for welding can be used for everyday life. Augmediated™ Reality and HDR Wearable Computing, etc., http://eyetap.org/publications

Our initial FPGA-based hardware configuration fits inside a shirt pocket, and can be built into ordinary eyeglasses if desired. There are two HDMI camera inputs, one for the left eye and the other for the right eye, as well as HDMI outputs fed back to the left and right eyes after processing of the video signals. The circuit board facilitates processing by way of a Xilinx FPGA, but our approach also works with Altera FPGAs.
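Reusing the merge_exposures and tonemap sketches above, the per-eye dataflow described here could be expressed roughly as follows; stereo_hdr_pipeline and its frame-list interface are hypothetical stand-ins for the actual HDMI capture and FPGA plumbing.

    def stereo_hdr_pipeline(left_frames, right_frames, exposure_times):
        """Illustrative per-eye flow: each camera input feeds an
        independent merge + tonemap stage whose output is routed back
        to the matching eye's display."""
        left_out = tonemap(merge_exposures(left_frames, exposure_times))
        right_out = tonemap(merge_exposures(right_frames, exposure_times))
        return left_out, right_out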

A goal of the demonstration is to show the development of HDR eyeglasses as a general-purpose seeing aid for everyday life (Fig. 3).

3. REFERENCES
[1] S. Mann. Compositing multiple pictures of the same scene. In Proceedings of the 46th Annual IS&T Conference, volume 2, 1993.
[2] M. Robertson, S. Borman, and R. Stevenson. Estimation-theoretic approach to dynamic range enhancement using multiple exposures. Journal of Electronic Imaging, 12:219, 2003.