
Evaluation of high dynamic range photography as a luminance data acquisition system

MN Inanici PhD

Lawrence Berkeley National Laboratory, Lighting Research Group, Berkeley, California, USA

Received 17 January 2005; revised 21 July 2005; accepted 9 August 2005

In this paper the potential, limitations and applicability of the High Dynamic Range (HDR) photography technique are evaluated as a luminance mapping tool. Multiple exposure photographs of static scenes were taken with a commercially available digital camera to capture the wide luminance variation within the scenes. The camera response function was computationally derived by using Photosphere software, and was used to fuse the multiple photographs into an HDR image. The vignetting effects and point spread function of the camera and lens system were determined. Laboratory and field studies showed that the pixel values in the HDR photographs correspond to the physical quantity of luminance with reasonable precision and repeatability.

1. Introduction

It is possible to measure the photometric properties of a scene by means of point-by-point measuring devices. However, these measurements take a long time, are prone to errors due to measurement uncertainties in the field, and the obtained data may be too coarse for analysing the lighting distribution and variation. There is a need for a tool that can capture the luminances within a large field of view at a high resolution, in a quick and inexpensive manner.

Photography has the potential for this kind of data collection. Photograph based photometry is not a new approach; various researchers have utilized film based photographs and CCD outputs for luminance measurements. The common approach in early studies was to acquire the calibration functions for a certain camera and illuminant, which involved lengthy and complex physical measurements performed in laboratory conditions, accompanied with extended data analysis.1,2 RGB values were converted to the CIE tristimulus values through the calibration functions. These early studies were based on single photographs, which provide a limited dynamic range of luminances. More recent techniques3 employ multiple photographs, which allow a larger luminance range to be captured. Some of these techniques require expensive equipment. All of these studies require costly and tedious calibration processes, and yield calibration functions that are strictly device dependent.

In HDR photography, multiple exposure photographs are taken to capture the wide luminance variation within a scene. The camera response function is computationally derived through a self-calibration process from the multiple exposure photographs; therefore the HDR photography technique is applicable to all cameras that have multiple exposure capabilities. The camera response function is used to fuse the photograph sequences into a single HDR image.

Address for correspondence: MN Inanici is currently at: University of Washington, Department of Architecture, Box 355720, Seattle, WA 98195-5720, USA. E-mail: inanici@w.washington.edu

Lighting Res. Technol. 38, 2 (2006) pp. 123–136

© The Chartered Institution of Building Services Engineers 2006    10.1191/1365782806li164oa

HDR photography was not specifically developed for lighting measurement purposes. The objective of the present study is to evaluate the appropriateness (i.e., accuracy, applicability and limitations) of the technique as a luminance data acquisition system.

2. Equipment and software

2.1 Equipment

The multiple exposure photographs were taken with a Nikon Coolpix 5400 digital camera mounted on a tripod and fitted with a fisheye lens (Nikon FC-E9). The fisheye lens has a focal length of 5.6 mm and an angle of view of 190°. Reference physical measurements were taken with a calibrated hand-held luminance meter with a 1/3° field of view (Minolta LS-110).

2.2 Software

The multiple exposure photographs were processed using the software called Photosphere.4 All photographs were taken with the camera settings shown in Table 1. It is especially important to fix the white balance for achieving consistent colour space transitions. Changing either the aperture size (f-stop) or the shutter speed (exposure time) can vary the exposure values. Shutter speed is a more reliable measure than aperture size.5,6 Therefore, exposure variations were achieved with a fixed aperture size (f/4.0), varying only the shutter speed in manual exposure mode (2 s to 1/4000 s).

Table 1 Camera settings

Feature              Setting     Feature             Setting
White balance        Daylight    Image size          2592 × 1944 pixels
Best shot selector   Off         Sensitivity         100 ISO
Image adjustment     Normal      Image sharpening    Off
Saturation control   Normal      Lens                Fisheye
Auto-bracketing      Off         Noise reduction     Off
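
For illustration, the bracketing described above can be written out as a sequence of nominal shutter speeds at the fixed f/4.0 aperture. This is a minimal sketch: the paper states only the shutter-speed range (2 s to 1/4000 s), so the one-stop spacing used here is an assumption.

```python
# Sketch of a one-stop bracketing sequence spanning the stated range
# (2 s to 1/4000 s); the actual step size used in the study is not stated.
shutter_times_s = [2, 1, 1/2, 1/4, 1/8, 1/15, 1/30, 1/60, 1/125,
                   1/250, 1/500, 1/1000, 1/2000, 1/4000]
for t in shutter_times_s:
    print(f"aperture f/4.0, exposure time {t:.6f} s")
```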

[Figure 1 plots the derived camera response curves: pixel value (0–255) against relative scene luminance (0–1.0), as computed by Photosphere.]

Figure 1 Camera response curve for Nikon 5400 as determined by Photosphere


An interior scene with daylight that had large and smooth gradients throughout the interior and exterior views was selected for determining the camera's natural response function.7 Photosphere generated the camera response curve for the Nikon 5400 that was used in this study, based on 10 exposure sequences in three channels (RGB), as follows (Figure 1):

R = 1.53994x³ − 0.99492x² + 0.46536x − 0.01037

G = 1.31795x³ − 0.69784x² + 0.38994x − 0.01005

B = 1.67667x³ − 1.09256x² + 0.42334x − 0.00745

The curves are polynomial functions that model the accumulated radiometric non-linearities of the image acquisition process (such as gamma correction, A/D conversion, image digitizer, various mappings) without addressing the individual source of each non-linearity.5,6 The technique, known as radiometric self-calibration,5 is a computationally derived calibration process used to relate the pixel values to the real world luminances. Camera response curves vary considerably between different cameras; therefore radiometric self-calibration has to be applied to each camera. However, the only input to the process is a set of multiple exposure photographs.
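
As an illustration only (Photosphere derives and applies these functions internally), the reported polynomials can be evaluated directly to map a normalized pixel value to a relative, linearized value per channel. This is a minimal sketch, not the author's HDRLab code.

```python
# Response polynomials reported in the text, mapping a normalized pixel value
# x in [0, 1] to a relative linearized value for each channel.
RESPONSE = {
    "R": (1.53994, -0.99492, 0.46536, -0.01037),
    "G": (1.31795, -0.69784, 0.38994, -0.01005),
    "B": (1.67667, -1.09256, 0.42334, -0.00745),
}

def linearize(pixel_value_8bit, channel):
    """Evaluate the third-order response polynomial for one channel."""
    x = pixel_value_8bit / 255.0
    a3, a2, a1, a0 = RESPONSE[channel]
    return a3 * x**3 + a2 * x**2 + a1 * x + a0

print(linearize(255, "G"))   # = 1.0 at full scale, by construction
```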

Once the camera response curve is determined, Photosphere can fuse any photograph sequence into an HDR image. HDR images can be stored in image formats such as Radiance RGBE8 and LogLuv TIFF,9 where the pixel values can extend over the luminance span of the human visual system (from 10⁻⁶ to 10⁸ cd/m²).

2.3 Luminance calculation

For analysing the HDR images from Photosphere, computational procedures were implemented (referred to as HDRLab). The routines were written in Matlab, and they allow the user to extract and process per-pixel lighting data from the HDR images saved in Radiance RGBE format. CIE XYZ values for each pixel were quantified from floating point RGB values based on the standard colour space (sRGB) reference primaries,10 CIE Standard Illuminant D65, and the standard CIE Colorimetric Observer with 2° field of view. The transformation process is performed as follows (a brief code sketch is given after the list):

1) Floating point RGB is calculated from RGBE integer values:8

   red = (R/255) · 2^(E−128)
   green = (G/255) · 2^(E−128)
   blue = (B/255) · 2^(E−128)

2) CIE chromaticities for the reference primaries and CIE Standard Illuminant D65 are as follows:10

   R: (x, y, z) = (0.64, 0.33, 0.03)
   G: (x, y, z) = (0.30, 0.60, 0.10)
   B: (x, y, z) = (0.15, 0.06, 0.79)
   D65: (x, y, z) = (0.3127, 0.3290, 0.3583)

3) The RGB to XYZ matrix is constructed from the matrix of RGB chromaticities ('K'), which are differentially scaled to achieve the unit white point balance ('W'). Further information on RGB to XYZ conversion can be found in Wyszecki and Stiles12 and Glassner:13

   K = | 0.64  0.33  0.03 |
       | 0.30  0.60  0.10 |
       | 0.15  0.06  0.79 |

   W = [Xn/Yn  1  Zn/Yn] = [0.9505  1.0  1.0891]

   V = W·K⁻¹ = [0.6444  1.1919  1.2032] = [Gr  Gg  Gb]

   G = | Gr  0   0  |   | 0.6444  0       0      |
       | 0   Gg  0  | = | 0       1.1919  0      |
       | 0   0   Gb |   | 0       0       1.2032 |

   X = 0.4124·R + 0.3576·G + 0.1805·B
   Y = 0.2127·R + 0.7151·G + 0.0722·B
   Z = 0.0193·R + 0.1192·G + 0.9505·B

4) It is possible to calibrate images and cameras in Photosphere with a physical luminance measurement of a selected region in the scene. This feature can be applied as a constant ('k') either to the pixel values in an image or to the camera response function. Luminance (L) is calculated as:

   L = k · (0.2127·R + 0.7151·G + 0.0722·B)  (cd/m²)
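
A minimal Python sketch of steps 1 and 4 follows: decoding Radiance RGBE pixels to floating point RGB and computing a calibrated luminance. It is illustrative only, not the author's HDRLab (Matlab) implementation, and the special RGBE case E = 0 (true black) is ignored for brevity.

```python
import numpy as np

def rgbe_to_rgb(rgbe):
    """Decode Radiance RGBE pixels (uint8 array, shape (..., 4)) to floating
    point RGB, per step 1: component/255 * 2^(E - 128). (E = 0 not handled.)"""
    mantissa = rgbe[..., :3].astype(np.float64) / 255.0
    exponent = np.ldexp(1.0, rgbe[..., 3].astype(int) - 128)   # 2^(E - 128)
    return mantissa * exponent[..., None]

def luminance(rgb, k=1.0):
    """Per-pixel luminance, per step 4: L = k*(0.2127 R + 0.7151 G + 0.0722 B),
    where k is the scene calibration constant."""
    return k * (0.2127 * rgb[..., 0] + 0.7151 * rgb[..., 1] + 0.0722 * rgb[..., 2])

# Example: a single pixel with mantissas 128 and exponent byte 129 (scale x2).
pixel = np.array([[128, 128, 128, 129]], dtype=np.uint8)
print(luminance(rgbe_to_rgb(pixel), k=1.0))
```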

3. Measurement setups and analyses

To evaluate the HDR technique, a suite of measurements and analyses were undertaken. Key sets of measurements involved the comparison of luminances of various surfaces as determined from HDR images and the luminance meter. The measurements were repeated in different settings, which varied from controlled laboratory environments (black painted room without daylight) to office spaces and outdoors.

[Figure 2 plots measured and HDR-derived luminances (cd/m²) and the error (%) for each greyscale and coloured target under incandescent lighting. Average error percentages: all targets 7.9%, greyscale targets 4.6%, coloured targets 11.2%.]

Figure 2 Dark room with greyscale and coloured targets illuminated with an incandescent lamp


The interior settings had different light sources and different illuminances. Precautions were taken to ensure that the equipment and the people who made the measurements did not cast shadows on the targets. The measurements were made with five sets of paper targets. The first target consisted of 24 squares (two sets of 12 squares), which had reflectances ranging from 4% to 87%. The other four targets had a grey square target in the middle with a 28% reflectance. This square was enveloped by a) white (87% reflectance), b) black (4%), c) white-then-black, d) black-then-white surroundings. The Macbeth ColorChecker chart was also used as a target.

The centre of each target was measured with the Minolta luminance meter, and each measurement setup was captured with the HDR images. When daylight was present in the setup, the physical measurements were made twice: before and after the multiple exposure photographs were taken. There was a time lag between the physical measurements and the multiple exposure photographs; therefore, bracketing the photograph sequences with physical measurements allowed the variation of the lighting conditions to be characterized. A rectangular block of pixels (approximately corresponding to the size and location of the spot used for the luminance meter) was extracted from the digital images for each target. The selection area was fitted inside each target and excluded the borders, where the luminances could change drastically. The minimum, maximum, mean and standard deviation of each block were studied to ensure that luminance within the selected region was uniform. This approach was considered to be more reliable than processing 1 or 2 pixel values.
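
A minimal sketch of this per-target check, assuming a NumPy luminance array (the array name and block size here are hypothetical), could look as follows; the percentage error against the meter reading is computed in the obvious way.

```python
import numpy as np

def block_stats(lum_map, row, col, half=5):
    """Statistics of a (2*half+1) x (2*half+1) block of luminances centred on a
    target, used to check that the selected region is uniform."""
    block = lum_map[row - half:row + half + 1, col - half:col + half + 1]
    return block.min(), block.max(), block.mean(), block.std()

def percent_error(hdr_luminance, meter_luminance):
    """Percentage error of the HDR-derived value against the meter reading."""
    return 100.0 * abs(hdr_luminance - meter_luminance) / meter_luminance

# Hypothetical example: a roughly uniform 20 cd/m^2 target region.
lum_map = np.random.uniform(19.5, 20.5, size=(1944, 2592))
lo, hi, mean, sd = block_stats(lum_map, row=972, col=1296)
print(lo, hi, mean, sd, percent_error(mean, meter_luminance=20.3))
```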

3.1 Accuracy

A laboratory space was illuminated with different light sources, such as an incandescent lamp, a projector (using a 500 W tungsten lamp), fluorescent (T12, T8 and T5 lamps

[Figure 3 plots measured and HDR-derived luminances (cd/m²) and the error (%) for each greyscale and coloured target under fluorescent lighting. Average error percentages: all targets 5.3%, greyscale targets 3.0%, coloured targets 7.4%.]

Figure 3 Dark room with greyscale and coloured targets illuminated with a T5 fluorescent (3000 K) lamp


[Figure 4 plots measured and HDR-derived luminances (cd/m²) and the error (%) for each greyscale and coloured target under metal halide lighting. Average error percentages: all targets 4.8%, greyscale targets 3.8%, coloured targets 5.8%.]

Figure 4 Dark room with greyscale and coloured targets illuminated with a metal halide lamp

[Figure 5 plots measured and HDR-derived luminances (cd/m²) and the error (%) for each greyscale and coloured target in the office space. Average error percentages: all targets 7.2%, greyscale targets 5.2%, coloured targets 10.2%.]

Figure 5 Coloured and greyscale targets in an office space illuminated with daylighting and electric lighting


with CCT of 6500 K, 3500 K and 3000 K, respectively), metal halide (MH) and high pressure sodium (HPS) lamps. The accuracy tests were carried out with Target 1 and a Macbeth ColorChecker chart. Figures 2–4 show the accuracy obtained for incandescent,

[Figure 7 plots the before and after luminance meter readings, the HDR-derived luminances (cd/m², up to about 18 000), and the error (%) for 24 greyscale targets. Average error: 11.6%.]

Figure 7 Outdoor scene with greyscale targets under clear sky

[Figure 6 plots the before and after luminance meter readings, the HDR-derived luminances (cd/m², up to about 6000), and the error (%) for 22 greyscale targets. Average error: 5.8%.]

Figure 6 Outdoor scene with greyscale targets under overcast sky


fluorescent and MH light sources. Although the spectral properties of the light sources were quite different, the accuracy proved to be reasonable (error margin within 10%) for all targets. The error margins for greyscale and coloured targets varied depending on the spectral power distribution of the light sources. Figure 5 shows an office space illuminated with daylight and electric lighting. Figures 6 and 7 illustrate outdoor scenes with dynamic lighting conditions. The measurement setups shown here are chosen to provide a wide spectrum of luminous environments. Additional setups have been measured, and the results reveal similar error margins.11

3.2 Vignetting/light fall-off

The Nikon FC-E9 fisheye lens uses equidistant projection to produce an image (i.e., the distances between two points are projected by a constant scaling factor in all directions). Equidistant fisheye lenses exhibit noticeable light fall-off (which will be referred to as vignetting) for the pixels far from the optical axis. It is therefore necessary to address and correct this aberration. The approach adopted was to determine the vignetting function of the Nikon FC-E9 fisheye lens and use this function to devise a 'digital filter' in HDRLab to compensate for the luminance loss.

The camera was rotated with respect to the target in increments of 5° until the camera field of view was covered. The luminance of a target was determined from HDR photographs for each camera position and compared with the physical measurement. Figure 8(a) illustrates the normalized data points and the polynomial function derived to fit these points. The square of the correlation (r²) between the measured points and the derived vignetting function is 99.12%. As can be seen from the figure, vignetting effects are insignificant in the centre of the image, and the maximum effect is at the periphery, with 23% luminance loss.

Figure 8(b) is the digital filter developed in HDRLab based on the vignetting function. This filter is a matrix that has the same resolution as the fisheye image, and it compensates for the vignetting effect based on the pixel locations. The vignetting function shown in Figure 8(b) is generated for an aperture size of f/4.0. Note that vignetting is strongly dependent on the aperture size and increases dramatically with wider apertures.

[Figure 8(a) plots the measured, normalized vignetting data against angle (0–180°) together with the fitted polynomial y = −3×10⁻⁹x⁴ + 1×10⁻⁶x³ − 0.0002x² + 0.0101x + 0.7704 (R² = 0.9912); Figure 8(b) shows the vignetting effect (0–1) as a function of pixel location (x and y coordinates).]

Figure 8 The vignetting function of the Nikon FC-E9 fisheye lens: a) measured data points and the vignetting function derived to fit the measured points; b) digital filter developed based on the vignetting function


Therefore, it is recommended that smaller apertures are used, so as to reduce vignetting effects while capturing the calibration sequence which determines the camera response curves.7
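
A sketch of such a 'digital filter' is given below. It assumes an equidistant fisheye image centred in the frame and, as a stand-in for the measured polynomial of Figure 8, uses a simple linear fall-off from 1.0 on the optical axis to 0.77 at the image rim (the 23% maximum loss reported above); the HDRLab filter itself uses the fitted function.

```python
import numpy as np

def vignetting_filter(width, height, edge_loss=0.23):
    """Same-resolution correction matrix for a centred, equidistant fisheye
    image: ~1.0 on the optical axis, (1 - edge_loss) at the rim. Dividing the
    luminance map by this matrix compensates for the light fall-off. The linear
    fall-off is only a placeholder for the measured vignetting function."""
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    y, x = np.mgrid[0:height, 0:width]
    r = np.clip(np.hypot(x - cx, y - cy) / min(cx, cy), 0.0, 1.0)  # 0 centre, 1 rim
    return 1.0 - edge_loss * r

# corrected = luminance_map / vignetting_filter(2592, 1944)
```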

3.3 Point spread function

It would have been very convenient to interpret each pixel in the photograph as a luminance measurement but, unfortunately, in most optical systems some portion of the pixel signal comes from surrounding areas. Light entering the camera is spread out and scattered by the optical structure of the lens, which is referred to as the point spread function (PSF).14

A point light source is used to quantify the PSF. The source is located far enough from the camera to ensure that, theoretically, it covers less than one pixel area. Without any point spread, the image of the light source should fit onto one pixel. The point spreading is illustrated in Figure 9(a) and (b) as close-up views.

The aperture size, exposure time and eccentricity (distance from the optical centre) affect the PSF.

Figure 10 Histograms of error percentages between the measured and captured luminances for a) greyscale targets, b) coloured targets

Figure 9 a) Close-up view of point spreading; the square in (b) shows the pixel that corresponds to the actual light source; c) PSF as quantified from the HDR image; d) PSF as a function of eccentricity (distance from the optical centre) in the camera field of view, from 0 to 90° in 15° intervals


The aperture size was kept constant throughout the capturing processes; therefore it was not a parameter affecting the luminances. The effects of exposure time and eccentricity were recorded, and they were plotted in RGB channels. The PSF of the HDR image is shown in Figure 9(c). The effect of the eccentricity was studied at 15° intervals, and it is illustrated in Figure 9(d). The spread affected a limited number of the neighbouring pixels. For most architectural lighting applications it seems to be a small effect. However, it is important to note that the general scattering of light is such that the cumulative effect of a bright background surrounding a dark target can lead to larger measurement errors. This is indeed evident in the measurement results for darker targets shown in Section 3.1. Unfortunately, it is not feasible to quantify the general scattering. Consequently, point spread is one of the accumulative factors in the error margin, which was quantified as less than 10% on average in Section 3.1.
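
A crude sketch of how a PSF estimate could be pulled out of an HDR capture of a distant point source follows (the array name is hypothetical, and the paper's actual analysis also separates the effects of exposure time and eccentricity).

```python
import numpy as np

def estimate_psf(lum_map, half=10):
    """Locate the brightest pixel (the point source) and return the surrounding
    neighbourhood normalized so that the peak is 1; non-zero values in the
    neighbouring pixels indicate point spread."""
    row, col = np.unravel_index(np.argmax(lum_map), lum_map.shape)
    patch = lum_map[row - half:row + half + 1, col - half:col + half + 1]
    return patch / patch.max()

# psf = estimate_psf(luminance_map)   # luminance_map: HDR capture of the point source
# print(psf[10, 10], psf[10, 11])     # peak (patch centre) vs. immediate neighbour
```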

4. Discussion

The HDR images used in this paper were generated with the camera response functions shown in Figure 1. The luminances were calculated with the transformation functions given in Section 2.3. The calibration feature of Photosphere was used for fine-tuning the luminances with a grey target in each scene. The vignetting effect was corrected by utilizing the function shown in Figure 8.

The luminances extracted from the HDR images indicate reasonable accuracy when compared with physical measurements. Figure 10 presents histograms of error percentages for 485 coloured and greyscale target points under a wide range of electric and daylighting conditions. The minimum and maximum measured target luminances are 0.5 cd/m² and 12 870 cd/m², respectively. The average error percentages for all, greyscale, and coloured targets were 7.3%, 5.8% and 9.3%, respectively. The square of the correlation (r²) between the measured and captured luminances was found to be 98.8%. There is an increased error for the darker greyscale targets. As mentioned in Section 3.3, the general scattering in the lens and sensor affects the darker regions of the images disproportionately. The consequence of this scattering is an over-estimation of the luminances of the darker regions. Since the luminance is quite low for these targets, small differences between the measured and captured values yield higher error percentages.

The results also revealed increased errors for the saturated colours (Figure 10b). In Photosphere, a separate polynomial function is derived to fit each RGB channel. However, it is important to note that the RGB values produced by the camera are mixed between the different red, green and blue sensors of the CCD. The CCD sensors have coloured filters that pass red, green or blue light. With the large sensor resolution, it was assumed that enough green (red, blue)-filtered pixels receive enough green (red, blue) light to ensure that the image would yield reasonable results. The sensor arrays are usually arranged in a Bayer (mosaic) pattern, such that 50% of the pixels have green filters and 25% of the pixels have red and blue filters, respectively. When the image is saved to a file, algorithms built within the camera employ interpolation between the neighbouring pixels.15 The HDR algorithm assumes that the computed response functions preserve the chromaticity of the corresponding scene points.5 In an effort to keep the RGB transformations constant within the camera, the white balance setting was kept constant throughout the capturing process. The camera response functions were generated with these constraints. Likewise, the luminance calculations were approximated based on sRGB reference primaries, with the assumption that sRGB provides a reasonable approximation to the camera sensor primaries.


This work shows that the HDR photography technique is a useful tool for capturing luminances over a wide range within 10% accuracy. It is worth mentioning that luminance meter measurements also have expected errors: V(λ) match (3%), UV response (0.2%), IR response (0.2%), directional response (2%), effect from the surrounding field (1%), linearity error (0.2%), fatigue (0.1%), polarization (0.1%) and errors of focus (0.4%). The percentage values are published by the CIE;16 they correspond with representative errors that are collected from the best available commercial instruments.

It is not suggested that HDR photography is a substitute for luminance meter measurements. It requires calibration against a point or area of a reliable standard target with a reliable luminance meter to have absolute validity. Yet it provides a measurement capability with the advantage of collecting high resolution luminance data within a large field of view quickly and efficiently, which is not possible to achieve with a luminance meter. It uses equipment that is within the budgets of lighting practitioners and researchers. Additionally, the self-calibration algorithm in Photosphere provides quick and easy camera response functions, compared to the lengthy calibration measurements required in prior research.

The HDR technique requires stable conditions over the period of the capturing process. Dynamic lighting conditions, resulting in significant light changes between differently exposed photographs, can compromise the accuracy of the end result. It is strongly advisable to measure a single target in the scene to be used as a calibration feature. Finally, as with any measurement and simulation tool, the user should be aware of the limitations and expected errors to be able to interpret the results meaningfully.

HDR imagery provides data for qualitative and quantitative lighting analysis, since HDR images can be studied for visual analysis by adjusting the exposure to different ranges and post-processed to:

1) extract photometric information on a pixel scale; this information can be utilized for statistical and mathematical analysis;

2) generate false colour images (a range of colours is assigned to represent a range of luminances) and iso-contour lines (coloured contour lines are superimposed on an image to demonstrate the distribution of the luminances), and calculate glare with automated analysis tools17,18 (a minimal false-colour sketch is given after this list).
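
As an illustration of the false-colour case only, a minimal sketch (assuming a NumPy luminance map and a matplotlib colormap; the luminance limits below are arbitrary) is:

```python
import numpy as np
import matplotlib.pyplot as plt

def false_colour(lum_map, lum_min=0.1, lum_max=10000.0):
    """Map luminances to a colour scale on a log axis: each luminance in
    [lum_min, lum_max] is assigned a colour, producing a false-colour image."""
    logl = np.log10(np.clip(lum_map, lum_min, lum_max))
    norm = (logl - np.log10(lum_min)) / (np.log10(lum_max) - np.log10(lum_min))
    return plt.get_cmap("jet")(norm)      # RGBA array at the image resolution

# plt.imshow(false_colour(luminance_map)); plt.show()
```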

Acknowledgements

This work was sponsored by the Assistant Secretary for Energy Efficiency and Renewable Energy, Office of Building Technology, Building Technologies Program of the US Department of Energy, under Contract No. DE-AC03-76SF00098. I thank Greg Ward (Anyhere Software) for his continuous guidance and feedback throughout the study. I am grateful for the technical help and advice from Jim Galvin (LBNL) on photography and physical measurement issues. I also thank Steve Johnson (LBNL) and Francis Rubinstein (LBNL) for their suggestions.

5. References

1 Uetani Y. Measurement of CIE tristimulus values XYZ by color video images: development of video colorimetry. J. Arch. Planning Env. Eng. 2001; 543: 25–31.
2 Belia L, Minichiello F, Sibilio S. Setting up a CCD photometer for lighting research and design. Building Env. 2002; 37: 1099–1106.
3 Coutelier B, Dumortier D. Luminance calibration of the Nikon Coolpix 990 digital camera. Proceedings of the CIE Conference, San Diego: International Commission on Illumination, 2003.
4 Ward G. Photosphere. http://www.anyhere.com


5 Mitsunaga T, Nayar SK. Radiometric self-calibration. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Fort Collins: Institute of Electrical and Electronics Engineers, 1999.
6 Debevec PE, Malik J. Recovering high dynamic range radiance maps from photographs. Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques, Los Angeles: SIGGRAPH, 1997.
7 Ward G. Photosphere Quick-start. http://www.anyhere.com
8 Ward G. Real pixels. In Arvo J, editor. Graphics Gems II: Academic Press, 1991: 80–83.
9 Ward G. The LogLuv encoding for full gamut, high dynamic range images. ACM Journal of Graphics Tools 1998; 3: 15–31.
10 Stokes M, Chandrasekar S. Standard color space for the internet: sRGB. http://www.w3.org/Graphics/Color/sRGB
11 Inanici MN, Galvin J. Evaluation of High Dynamic Range photography as a luminance mapping technique. LBNL Report No. 57545. Berkeley: Lawrence Berkeley National Laboratory, 2004.
12 Wyszecki G, Stiles WS. Color science: concepts and methods, quantitative data and formulae. John Wiley and Sons, 2000.
13 Glassner AS. Principles of digital image synthesis. Morgan Kaufmann Publishers, 1995.
14 Du H, Voss KJ. Effects of point-spread function on calibration and radiometric accuracy of CCD camera. Applied Optics 2004; 43: 665–70.
15 Busch DD. Mastering digital photography. Muska & Lipman Publishing, 2004.
16 International Commission on Illumination (CIE). Methods of characterizing illuminance meters and luminance meters: performance, characteristics and specification. Vienna: CIE, 1987.
17 Ward G. Radiance Visual Comfort Calculation. 1992. http://radsite.lbl.gov/radiance/refer/Notes/glare.html
18 Wienold J, Reetz C, Kuhn T. Evalglare. 3rd International Radiance Workshop, Fribourg, Switzerland, 11–12 October 2004. http://www.radiance-online.org/radiance-workshop3/cd/Wienold_extabs.pdf

Discussion

Comment 1 on 'Evaluation of high dynamic range photography as a luminance data acquisition system' by MN Inanici
J Mardaljevic (IESD, De Montfort University, The Gateway, Leicester, UK)

The appearance of this paper is very timely, as I suspect that many of us in the lighting research community are still unfamiliar with HDR imaging and the possibilities that it offers. I think it is fair to say that this work presents a significant advance over the techniques described by Moore et al. in 2000.1 The queries that follow are prompted by a desire to anticipate the practicalities involved in using the technique for experimental work. In particular, how much of the work described here would need to be repeated to achieve reliable results? It is noted in Section 4 that the calibration feature of the Photosphere software was applied 'in each scene'. It would be instructive to know the degree of the variation in the calibration factors across the scenes. Related to this, could Dr Inanici comment on what would be the likely impact on estimated luminance if the camera was calibrated just once against a spot measurement and then used for long-term monitoring? The FC-E9 lens that was used is described as a 'Fisheye Converter'. Is it the case that vignetting is likely to be greatest using such a lens, and are more moderate wide-angle and telephoto lenses less prone to this effect? It should be noted that Ward's Photosphere software (which runs under Mac OS X) is freely available from the URL given in the references.

Reference

1 Moore TA, Graves H, Perry MJ, Carter DJ. Approximate field measurement of surface luminance using a digital camera. Lighting Res. Technol. 2000; 32: 111.


Comment 2 on 'Evaluation of high dynamic range photography as a luminance data acquisition system' by MN Inanici
RH Simons (23 Oaklands Park, Bishops Stortford, Hertfordshire, UK)

The HDR photography system investigated by Mehlika Inanici seems principally aimed at measuring the luminances of interiors. I see no reason why this method could not be applied to recording the luminances of the road in road lighting installations.

I mention this because the last time I had to measure the luminance distribution of a road surface, I was thwarted by the weather. I had to do this on a motorway which was closed for repairs and was going to be opened the next day to traffic. This, then, was a good opportunity for making the luminance measurements the client required. These had to be taken with a Spectra Pritchard Luminance Meter mounted on a special tripod, which allowed the meter to be pointed accurately at the 30 measuring points on the road. As might be imagined, this was a time consuming business. When it came to measuring the last row of points there was a slight sprinkling of rain; slight, but enough to change the reflection properties of the road completely. As the road could not be closed on the next day, because of the expense and the disruption caused to the traffic, the measurements were never done.

This is a situation where HDR photography would have been a boon. I wonder whether you have had any experience of using it for road lighting measurements. I did use photography for this application in the 1950s, when I used a large plate camera. The system was crude and I would not like to hazard a guess at the accuracy. At that time we were dogged by the formation of ghost images of the road lighting luminaires diametrically opposite the images of the luminaires. We ended up using one element of a Rapid Rectilinear lens (which has two elements, each consisting of cemented doublets), which certainly eliminated the ghost images. I imagine that the modern zoom lenses, with as many as eight components separated by air spaces, are very prone to give ghost images and to degradation of the image because of scattered light in the lens. Have you any advice on choice of lens?

Author's response to J Mardaljevic and RH Simons
MN Inanici

The HDR photography technique presented in this paper is a camera independent, low cost and accessible solution for high resolution and large field of view luminance data acquisition. As Dr Mardaljevic points out, the software is freely available from the URL given in Ward4 and it is user-friendly. Most of the work presented in this paper was done to evaluate the technique as a luminance measurement tool; hence it is not necessary to repeat the whole study to adopt the technique. Nevertheless, some of the steps have to be repeated when a different camera is employed.

It is necessary to generate a new camera response curve for each camera that is going to be used as a luminance data acquisition system. It is an easy process, and the resultant camera response curve can be used for generating the subsequent HDR images. The HDR photography technique has been tested with a few different camera models, and the error margins with different models were similar to the values reported in this paper. Yet, as with any measurement device, it is recommended that a few test measurements are performed to verify the accuracy.

Dr Mardaljevic enquires about the degree of variation in the calibration factors among the scenes. The calibration factors for the scenes presented in this paper vary between 0.87 and 2.09. The calibration factor is determined by dividing pixel values of a selected region in the scene by the physical luminance measurement (Minolta luminance meter) of the same region.


A calibration value of '1' obviously means that no calibration would be necessary. A calibration value of '2' means that the pixel value of a selected region in the scene is twice the value measured by a luminance meter. The calibration factors for Figures 2–7 are as follows: Figure 2 (laboratory space illuminated with an incandescent lamp), 0.87; Figure 3 (fluorescent lamp), 0.95; Figure 4 (metal halide lamp), 0.86; Figure 5 (office space), 0.98; Figure 6 (outdoor scene with overcast sky), 1.9; and Figure 7 (outdoor scene with clear sky), 2.09. The utilization of a calibration factor becomes especially important with high dynamic range scenes, such as daylit and sunlit interiors and exterior environments. Therefore, it is necessary to use the calibration factor across the scenes to assure accuracy within a 10% error margin. Yet it would be interesting to look at the variability of a calibration factor for a single scene over time.

The author agrees with Dr Mardaljevic that vignetting is higher with fisheye lenses, and wide-angle and telephoto lenses should be expected to be less prone to vignetting. It is also important to note that aperture size has a significant effect on vignetting, and it is recommended that smaller apertures are used for reducing the vignetting effects.

Due to the limited scope of the project, the author did not attempt to evaluate it for road measurements. One of the advantages of the HDR photography technique as a luminance mapping tool is its ability to offer fast data collection. Therefore, as Mr Simons indicates, the HDR photography technique has great potential in recording the luminances of the road in road lighting installations.

The author thanks the discussants for their comments, which serve to further describe the practicality and applicability of the HDR photography technique as a luminance data acquisition system. The author also thanks the anonymous discussant who has pointed to the following as additional references.

References

1 Rea MS, Jeffrey IG. A new luminance and image analysis system for lighting and vision: equipment and calibration. J. Illumin. Eng. Soc. N. Am. 1990; 19: 64–72.
2 Illuminating Engineering Society of North America (IESNA). IESNA approved method for digital photographic photometry of luminaires and lamps. New York: IESNA, LM-77-05, 2005.
3 Reinhard E, Ward G, Pattanaik S, Debevec P. High dynamic range imaging: acquisition, display and image-based lighting. Morgan Kaufmann Publishers, 2005.


Page 2: Evaluation of high dynamic range photography as a luminance …faculty.washington.edu/inanici/Publications/LRTPublished.pdf · 2006-05-19 · Evaluation of high dynamic range photography

specifically developed for lighting measure-ment purposes The objective of the presentstudy is to evaluate the appropriateness (ieaccuracy applicability and limitations) of thetechnique as a luminance data acquisitionsystem

2 Equipment and softwares

21 Equipments

The multiple exposure photographs weretaken with a Nikon Coolpix 5400 digitalcamera mounted on a tripod and fitted witha fisheye lens (Nikon FC-E9) The fisheye lenshas a focal length of 56 mm and an angle ofview of 1908 Reference physical measure-ments were taken with a calibrated hand

held luminance meter with 138 field of view(Minolta LS110)

22 Softwares

The multiple exposure photographs wereprocessed using the software called Photo-sphere4 All photographs were taken with thecamera settings shown in Table 1 It isespecially important to fix the white balancefor achieving consistent colour space transi-tions Changing either the aperture size(f-stop) or the shutter speed (exposure time)can vary the exposure values Shutter speed isa more reliable measure than aperture size56

Therefore exposure variations were achievedwith a fixed aperture size (f40) and varyingonly the shutter speed in manual exposuremode (2 s to 14000 s)

Table 1 Camera settings

Feature Setting Feature Setting

White balance Daylight Image size 2592 pixels1944 pixelsBest shot selector Off Sensitivity 100 ISOImage adjustment Normal Image sharpening OffSaturation control Normal Lens FisheyeAuto-bracketing Off Noise reduction Off

Camera response curve

00

02

04

06

08

10

0Pixel255

mdc( ecnanimu

L enecS2 )

10

Photosphere

Figure 1 Camera response curve for Nikon 5400 as determined by Photosphere

124 MN Inanici

Lighting Res Technol 382 (2006) pp 123136

An interior scene with daylight that hadlarge and smooth gradients throughout theinterior and exterior views was selected fordetermining the camerarsquos natural responsefunction7 Photosphere generated the cameraresponse curve for the Nikon 5400 that wasused in this study based on 10 exposuresequences in three channels (RGB) as follows(Figure 1)

R153994x3 099492x2 046536x

001037

G 131795x3 069784x2 038994x

001005

B167667x3 109256x2 042334x

000745

The curves are polynomial functionsthat model the accumulated radiometric non-linearities of the image acquisition process(such as gamma correction AD conversionimage digitizer various mappings) withoutaddressing the individual source of eachnon-linearity56 The technique known asradiometric self-calibration5 is a computa-tionally derived calibration process used torelate the pixel values to the real worldluminances Camera response curves varyconsiderably between different cameras there-fore radiometric self-calibration has to beapplied to each camera However the onlyinput to the process is a set of multipleexposure photographs

Once the camera response curve is deter-mined Photosphere can fuse any photographsequence into a HDR image HDR images canbe stored in image formats such as RadianceRGBE8 and LogLuv TIFF9 where the pixelvalues can extend over the luminance spanof the human visual system (from 106 to108 cdm2)

23 Luminance calculationFor analysing the HDR images from Photo-

sphere computational procedures were imple-

mented (referred to as HDRLab) The routines

were written in Matlabdagger and they allow theuser to extract and process per-pixel lightingdata from the HDR images saved in RadianceRGBE format CIE XYZ values for each pixelwere quantified from floating point RGBvalues based on the standard colour space(sRGB) reference primaries10 CIE StandardIlluminant D65 and standard CIE Colori-metric Observer with 28 field of view Thetransformation process is performed as follows

1) Floating point RGB is calculated fromRGBE integer values8

red R

255+2(E128) green

G

255+2(E128)

blueB

255+2(E128)

2) CIE chromaticities for the reference pri-maries and CIE Standard Illuminant D65are as follows10

R(x y z) (064 033 003)

G(x y z) (030 060010)

B(x y z) (015 006 079)

D65(x y z) (03127 03290 03583)

3) RGB to XYZ matrix is constructed withthe matrix of RGB chromaticities (lsquoKrsquo)which are differentially scaled to achievethe unit white point balance (lsquoWrsquo)Further information on RGB to XYZconversion can be found in Wyszecki andStiles12 and Glassner13

K rx ry rz

gx gy gz

bx by bz

24

35

064 033 003

030 060 010

015 006 079

24

35

W frac14Xn

Yn

1Zn

Yn

09505 10 10891frac12

High dynamic range photography 125

Lighting Res Technol 382 (2006) pp 123136

V frac14WK1 frac14 06444 11919 12032frac12

Gr Gg Gh

G Gr 0 0

0 Gg 0

0 0 Gb

24

35

06466 0 0

0 11919 0

0 0 12032

24

35

X frac14 04124 +R03576 +G01805 +B

Y frac14 02127 +R07151 +G00722 +B

Z frac14 00193 +R01192 +G09505 +B

4) It is possible to calibrate images and

cameras in Photosphere with a physical

luminance measurement of a selected

region in the scene This feature can be

applied as a constant (lsquok rsquo) either to thepixel values in an image or to the cameraresponse function Luminance (L) iscalculated as

Lk+(02127+R07151

+G00722+B) (cd=m2)

3 Measurement setups and analyses

To evaluate the HDR technique a suite ofmeasurements and analyses were undertakenKey sets of measurements involved the com-parison of luminances of various surfaces asdetermined from HDR images and the lumi-nance meter The measurements were repeatedin different settings which varied from con-trolled laboratory environments (blackpainted room without daylight) to officespaces and outdoors The interior settings

0

5

10

15

20

25

21G

kcalb

eulb

elprup

53 lartuen

eulb hsilprup

11G

nik s kr ad

egailo f

der

n ayc

5 lartu en

yks eulb

01G

atnegam

nee rg

r ewolf eul b

der etaredom

9G

56 lar tue n

eg naro

8G

niks th gi l

7G

neerg hsiulb

6G

neerg wolley

wolley egnaro

8 lartu en

5G

wolley

4G

3G

etihw

2G

1G

Greyscale and coloured targets

mdc( ecnanimu

L2 )

0

10

20

30

40

50

60

70

80

90

100

)( rorr

E

Measured HDR error

IncandescentAverage error percentagesAll 79Grayscale targets 46Colored targets 112

Figure 2 Dark room with greyscale and coloured targets illuminated with an incandescent lamp

126 MN Inanici

Lighting Res Technol 382 (2006) pp 123136

had different light sources and different illu-minances Precautions were taken to ensurethat the equipment and the people who madethe measurements did not cast shadows on thetargets The measurements were made withfive sets of paper targets The first targetconsisted of 24 squares (two sets of 12squares) which had reflectances ranging from4 to 87 The other four targets had a greysquare target in the middle with a 28reflectance This square was enveloped by a)white (87 reflectance) b) black (4)c) white-then-black d) black-then-white sur-roundings The Macbeth ColorCheckerdagger

chart was also used as a targetThe centre of each target was measured with

the Minolta luminance meter and each mea-surement setup was captured with the HDRimages When daylight was present in thesetup the physical measurements were madetwice before and after the multiple exposurephotographs were taken There was a time lagbetween the physical measurements and the

multiple exposure photographs thereforebracketing the photograph sequences withphysical measurements allowed the variationof the lighting conditions to be characterizedA rectangular block of pixels (approximatelycorresponding to the size and location of thespot used for the luminance meter) wasextracted from the digital images for eachtarget The selection area was fitted insideeach target and excluded the borders wherethe luminances could change drastically Theminimum maximum mean and standarddeviation of each block were studied to ensurethat luminance within the selected region wasuniform This approach was considered to bemore reliable than processing 1 or 2 pixelvalues

31 AccuracyA laboratory space was illuminated with

different light sources such as an incandescentlamp a projector (using a 500W tungstenlamp) fluorescent (T12 T8 and T5 lamps

0

10

20

30

40

50

60

70

80

21G

kcalb

eulb

elprup

11G

inks krad

53 lartuen

eulb hsilp rup

egailof

der

nay c

etihw

yks eulb

01G

rewolf eu lb

neerg

6G

9G

der etaredom

egnaro

8G

7G

n iks thgil

neerg hs iu lb

5G

neerg wolley

wolley egnaro

8 lartuen

4G

wolley

3G

2G

etihw

1G

Greyscale and coloured targets

mdc( ecnanimu

L2 )

0

10

20

30

40

50

60

70

80

90

100

)( rorr

E

Measured HDR error

Fluorescent

Average error percentagesAll 53Grayscale targets 30Colored targets 74

Figure 3 Dark room with greyscale and coloured targets illuminated with a T5 fluorescent (3000K) lamp

High dynamic range photography 127

Lighting Res Technol 382 (2006) pp 123136

0

100

200

300

400

500

600

700

800

900

1000

kcalb

eulb

5 3 lartuen

niks krad

egailof

nayc

11G

n ee rg

der etaredom

01G

niks thgil

neerg wolley

9G

wolley

7G

6G

4G

2G

Greyscale and coloured targets

mdc( ecnanimu

L2)

0

10

20

30

40

50

60

70

80

90

100

)( rorr

E

Measured HDR error

Metal HalideAverage error percentagesAll 48Grayscale targets 38Colored targets 58

Figure 4 Dark room with greyscale and coloured targets illuminated with a metal halide lamp

0

50

100

150

200

250

300

1G

3G

5G

7G

8G

01G

21G

niks krad

egailof

d er

nayc

71G

81G

atnegam

1 2G

d er etar edom

32G

4 2G

6 2G

8 2G

9 2G

03G

niks thgil

3 3G

4 3G

5 3G

73G

neerg wolley

0 4G

Greyscale and coloured targets

mdc( ecnanimu

L2 )

0

10

20

30

40

50

60

70

80

90

100

)( rorr

E

Measured HDR error

Average error percentagesAll 72Grayscale targets 52Colored targets 102

Figure 5 Coloured and greyscale targets in an office space illuminated with daylighting and electric lighting

128 MN Inanici

Lighting Res Technol 382 (2006) pp 123136

with CCT of 6500K 3500K and 3000Krespectively) metal halide (MH) and highpressure sodium (HPS) lamps The accuracy

tests were carried out with Target 1 and aMacbeth ColorCheckerdagger chart Figures 24show the accuracy obtained for incandescent

0

2000

4000

6000

8000

10000

12000

14000

16000

18000

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24

Greyscale targets

mdc( ecnan im u

L2 )

0

10

20

30

40

50

60

70

80

90

100

)( rorr

E

before after HDR Error

Average error 116

Figure 7 Outdoor scene with greyscale targets under clear sky

0

1000

2000

3000

4000

5000

6000

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22

Greyscale targets

mdc( ecnanimu

L2 )

0

10

20

30

40

50

60

70

80

90

100

)( rorr

E

before after HDR Error

Average error 58

Figure 6 Outdoor scene with greyscale targets under overcast sky

High dynamic range photography 129

Lighting Res Technol 382 (2006) pp 123136

fluorescent and MH light sources Althoughthe spectral properties of the light sourceswere quite different the accuracy proved to bereasonable (error margin within 10) for alltargets The error margins for greyscale andcoloured targets varied depending on thespectral power distribution of the lightsources Figure 5 shows an office spaceilluminated with daylight and electric lightingFigures 6 and 7 illustrate outdoor scenes withdynamic lighting conditions The measure-ment setups shown here are chosen to providea wide spectrum of luminous environmentsAdditional setups have been measured and theresults reveal similar error margins11

32 VignettingLight fall offThe Nikon FCE9 fisheye lens uses equidi-

stant projection to produce an image (ie thedistances between two points are projectedby a constant scaling factor in all directions)The equidistant fisheye lenses exhibit notice-able light fall off (which will be referred toas vignetting) for the pixels far from theoptical axis It is therefore necessary toaddress and correct this aberration Theapproach adopted was to determine thevignetting function of the Nikon FCE9 fisheye

lens and use this function to devise a lsquodigitalfilterrsquo in HDRLab to compensate for theluminance loss

The camera was rotated with respect to thetarget in increments of 58 until the camerafield of view was covered The luminance of atarget was determined from HDR photo-graphs for each camera position and com-pared with the physical measurement Figure 8(a) illustrates the normalized data points andthe polynomial function derived to fit thesepoints The square of the correlation (r2)between the measured points and the derivedvignetting function is 9912 As it can beseen from the figure vignetting effects areinsignificant in the centre of the image and themaximum effect is at the periphery with 23luminance loss

Figure 8 (b) is the digital filter developed inHDRLab based on the vignetting functionThis filter is a matrix that has the sameresolution as the fisheye image and it com-pensates for the vignetting effect based on thepixel locations The vignetting function shownin Figure 8 (b) is generated for an aperture sizeof f40 Note that vignetting is stronglydependent on the aperture size and increasesdramatically with wider apertures Therefore

y = -3E-09x4 + 1E-06x3 - 00002x2 + 00101x + 07704

R2 = 09912

0

02

04

06

08

1

12

0 30 60 90 120 150 180

Field of view (deg)

tceffe gnittengiV

measured Polynomial fit of measured data

(a) (b)

Vig

netti

ng e

ffec

t

1

08

06

04

02

0Pixel location (x coordinate) Pixel location (y coordinate)

Figure 8 The vignetting function of Nikon FC9 fisheye lens a) Measured data points and the vignetting function derived to fitthe measured points b) digital filter developed based on the vignetting function

130 MN Inanici

Lighting Res Technol 382 (2006) pp 123136

it is recommended that smaller apertures areused so as to reduce vignetting effects whilecapturing the calibration sequence which de-termines the camera response curves7

33 Point spread functionIt would have been very convenient to

interpret each pixel in the photograph as aluminance measurement but unfortunately inmost optical systems some portion of the pixelsignal comes from surrounding areas Lightentering the camera is spread out and scat-tered by the optical structure of the lens

which is referred to as the point spreadfunction (PSF)14

A point light source is used to quantify thePSF The source is located far enough fromthe camera to ensure that theoretically itcovers less than one pixel area Without anypoint spread the image of the light sourceshould fit onto one pixel The point spreadingis illustrated in Figure 9 (a) and (b) as close-upviews

The aperture size exposure time and eccen-tricity (distance from the optical centre) affectPSF The aperture size was kept constant

Figure 10 Histograms for error percentages between the measured and captured luminances for a) greyscale targets b)coloured targets

Figure 9 a) Close-up view of point spreading the square in (b) shows the pixel that corresponds to the actual light source c)PSF as quantified from the HDR image d) PSF as a function of eccentricity (distance from the optical centre) in the camera fieldof view from 0 to 908 in 158 intervals

High dynamic range photography 131

Lighting Res Technol 382 (2006) pp 123136

throughout the capturing processes thereforeit was not a parameter affecting the lumi-nances The effects of exposure time andeccentricity were recorded and they wereplotted in RGB channels The PSF of theHDR image is shown in Figure 9 (c) Theeffect of the eccentricity was studied at 158intervals and it is illustrated in Figure 9 (d)The spread affected a limited number of theneighbouring pixels For most architecturallighting applications it seems to be a smalleffect However it is important to note thatthe general scattering of light is such that thecumulative effect of a bright backgroundsurrounding a dark target can lead to largermeasurement errors This is indeed evident inthe measurement results for darker targetsshown in Section 31 Unfortunately it is notfeasible to quantify the general scatteringConsequently point spread is one of theaccumulative factors in the error marginwhich was quantified as less than 10 onaverage in Section 31

4 Discussion

The HDR images used in this paper weregenerated with the camera response functionsshown in Figure 1 The luminances werecalculated with the transformation functionsgiven in Section 23 The calibration feature ofPhotosphere was used for fine-tuning theluminances with a grey target in each sceneThe vignetting effect was corrected by utilizingthe function shown in Figure 8

The luminances extracted from theHDR images indicate reasonable accuracywhen compared with physical measurementsFigure 10 presents a histogram for errorpercentages for 485 coloured and greyscaletarget points under a wide range of electricand daylighting conditions The minimumand maximum measured target luminancesare 05 cdm2 and 12870 cdm2 respectivelyThe average error percentages for all grey-scale and coloured targets were 73 58

and 93 respectively The square of thecorrelation (r2) between the measured andcaptured luminances was found to be 988There is an increased error for the darkergreyscale targets As mentioned in Section33 the general scattering in the lens andsensor affect the darker regions of the imagesdisproportionately The consequence of thisscattering is an over-estimation of the lumi-nances of the darker regions Since theluminance is quite low for these targetssmall differences between the measuredand captured values yield higher error per-centages

The results also revealed increased errorsfor the saturated colours (Figure 10b) InPhotosphere a separate polynomial functionis derived to fit to each RGB channel How-ever it is important to note that the RGBvalues produced by the camera are mixedbetween the different red green and bluesensors of the CCD The CCD sensors havecoloured filters that pass red green or bluelight With the large sensor resolution it wasassumed that enough green (red blue)-filteredpixels receive enough green (red blue) light toensure that the image would yield reasonableresults The sensor arrays are usually arrangedin a Bayer (mosaic) pattern such that 50 ofthe pixels have green and 25 of the pixelshave red and blue filters When the image issaved to a file algorithms built within thecamera employ interpolation between theneighbouring pixels15 The HDR algorithmassumes that the computed response functionspreserve the chromaticity of the correspondingscene points5 In an effort to keep the RGBtransformations constant within the camerathe white balance setting was kept constantthroughout the capturing process The cameraresponse functions were generated with theseconstraints Likewise the luminance calcula-tions were approximated based on sRGBreference primaries with the assumption thatsRGB provides a reasonable approximation tothe camera sensor primaries


This work shows that the HDR photography technique is a useful tool for capturing luminances over a wide range within 10% accuracy. It is worth mentioning that luminance meter measurements also have expected errors: V(λ) match (3%), UV response (0.2%), IR response (0.2%), directional response (2%), effect from the surrounding field (1%), linearity error (0.2%), fatigue (0.1%), polarization (0.1%) and errors of focus (0.4%). The percentage values are published by the CIE;16 they correspond to representative errors collected from the best available commercial instruments.

It is not suggested that HDR photography is a substitute for luminance meter measurements. It requires calibration against a point or an area of a reliable standard target, measured with a reliable luminance meter, to have absolute validity. Yet it provides a measurement capability with the advantage of collecting high resolution luminance data within a large field of view, quickly and efficiently, which is not possible to achieve with a luminance meter. It uses equipment that is within the budgets of lighting practitioners and researchers. Additionally, the self-calibration algorithm in Photosphere provides quick and easy camera response functions compared to the lengthy calibration measurements required in prior research.

The HDR technique requires stable conditions over the period of the capturing process. Dynamic lighting conditions, resulting in significant light changes between differently exposed photographs, can compromise the accuracy of the end result. It is strongly advisable to measure a single target in the scene to be used as a calibration feature. Finally, as with any measurement and simulation tool, the user should be aware of the limitations and expected errors to be able to interpret the results meaningfully.

HDR imagery provides data for qualitative and quantitative lighting analysis, since HDR images can be studied for visual analysis by adjusting the exposure to different ranges, and post-processed to:

- extract photometric information on a pixel scale; this information can be utilized for statistical and mathematical analysis;
- generate false colour images (a range of colours is assigned to represent a range of luminances) and iso-contour lines (coloured contour lines are superimposed on an image to demonstrate the distribution of the luminances), and calculate glare with automated analysis tools17,18 (a minimal false colour sketch follows this list).
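The sketch below maps a per-pixel luminance array to a false colour image. It is only an illustration of the idea: matplotlib, the 'jet' colour map and the logarithmic scaling are choices made here, not the tools used in the paper (which relies on Photosphere and the Radiance-based utilities cited above).

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import LogNorm

def false_colour(lum, vmin=0.5, vmax=13000.0, path="falsecolour.png"):
    """Render a luminance map (cd/m2) as a false colour image on a log scale,
    with a colour bar as the legend for the luminance ranges."""
    fig, ax = plt.subplots()
    im = ax.imshow(lum, cmap="jet", norm=LogNorm(vmin=vmin, vmax=vmax))
    fig.colorbar(im, ax=ax, label="Luminance (cd/m2)")
    ax.set_axis_off()
    fig.savefig(path, dpi=150, bbox_inches="tight")
    plt.close(fig)

# Synthetic example: a 4000 cd/m2 patch (e.g. a window) on a 30 cd/m2 wall
lum = np.full((200, 300), 30.0)
lum[60:140, 120:220] = 4000.0
false_colour(lum)
```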

Acknowledgements

This work was sponsored by the Assistant Secretary for Energy Efficiency and Renewable Energy, Office of Building Technology, Building Technologies Program of the US Department of Energy under Contract No. DE-AC03-76SF00098. I thank Greg Ward (Anyhere Software) for his continuous guidance and feedback throughout the study. I am grateful for the technical help and advice from Jim Galvin (LBNL) on photography and physical measurement issues. I also thank Steve Johnson (LBNL) and Francis Rubinstein (LBNL) for their suggestions.

5. References

1 Uetani Y. Measurement of CIE tristimulus values XYZ by color video images: development of video colorimetry. J Arch Planning Env Eng 2001; 543: 25–31.
2 Belia L, Minichiello F, Sibilio S. Setting up a CCD photometer for lighting research and design. Building Env 2002; 37: 1099–1106.
3 Coutelier B, Dumortier D. Luminance calibration of the Nikon Coolpix 990 digital camera. Proceedings of the CIE Conference, San Diego: International Commission on Illumination, 2003.
4 Ward G. Photosphere. http://www.anyhere.com
5 Mitsunaga T, Nayar SK. Radiometric self-calibration. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Fort Collins: Institute of Electrical and Electronics Engineers, 1999.
6 Debevec PE, Malik J. Recovering high dynamic range radiance maps from photographs. Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques, Los Angeles: SIGGRAPH, 1997.
7 Ward G. Photosphere Quick-start. http://www.anyhere.com
8 Ward G. Real pixels. In Arvo J, editor. Graphics Gems II: Academic Press, 1991: 80–83.
9 Ward G. The LogLuv encoding for full gamut, high dynamic range images. ACM Journal of Graphics Tools 1998; 3: 15–31.
10 Stokes M, Chandrasekar S. Standard color space for the internet: sRGB. http://www.w3.org/Graphics/Color/sRGB
11 Inanici MN, Galvin J. Evaluation of High Dynamic Range photography as a luminance mapping technique. LBNL Report No. 57545. Berkeley: Lawrence Berkeley National Laboratory, 2004.
12 Wyszecki G, Stiles WS. Color science: concepts and methods, quantitative data and formulae. John Wiley and Sons, 2000.
13 Glassner AS. Principles of digital image synthesis. Morgan Kaufmann Publishers, 1995.
14 Du H, Voss KJ. Effects of point-spread function on calibration and radiometric accuracy of CCD camera. Applied Optics 2004; 43: 665–70.
15 Busch DD. Mastering digital photography. Muska & Lipman Publishing, 2004.
16 International Commission on Illumination (CIE). Methods of characterizing illuminance meters and luminance meters: performance, characteristics and specification. Vienna: CIE, 1987.
17 Ward G. Radiance Visual Comfort Calculation. 1992. http://radsite.lbl.gov/radiance/refer/Notes/glare.html
18 Wienold J, Reetz C, Kuhn T. Evalglare. 3rd International Radiance Workshop, Fribourg, Switzerland, 11–12 October 2004. http://www.radiance-online.org/radiance-workshop3/cd/Wienold_extabs.pdf

Discussion

Comment 1 on 'Evaluation of high dynamic range photography as a luminance data acquisition system' by MN Inanici
J Mardaljevic (IESD, De Montfort University, The Gateway, Leicester, UK)

The appearance of this paper is very timely, as I suspect that many of us in the lighting research community are still unfamiliar with HDR imaging and the possibilities that it offers. I think it is fair to say that this work presents a significant advance over the techniques described by Moore et al. in 2000.1 The queries that follow are prompted by a desire to anticipate the practicalities involved in using the technique for experimental work. In particular, how much of the work described here would need to be repeated to achieve reliable results? It is noted in Section 4 that the calibration feature of the Photosphere software was applied 'in each scene'. It would be instructive to know the degree of the variation in the calibration factors across the scenes. Related to this, could Dr Inanici comment on what would be the likely impact on estimated luminance if the camera was calibrated just once against a spot measurement and then used for long-term monitoring? The FCE9 lens that was used is described as a 'Fisheye Converter'. Is it the case that vignetting is likely to be greatest using such a lens, and are more moderate wide-angle and telephoto lenses less prone to this effect? It should be noted that Ward's Photosphere software (which runs under Mac OS X) is freely available from the URL given in the references.

Reference

1 Moore TA, Graves H, Perry MJ, Carter DJ. Approximate field measurement of surface luminance using a digital camera. Lighting Res Technol 2000; 32: 111.


Comment 2 on 'Evaluation of high dynamic range photography as a luminance data acquisition system' by MN Inanici
RH Simons (23 Oaklands Park, Bishops Stortford, Hertfordshire, UK)

The HDR photography system investigated by Mehlika Inanici seems principally aimed at measuring the luminances of interiors. I see no reason why this method could not be applied to recording the luminances of the road in road lighting installations.

I mention this because the last time I had to measure the luminance distribution of a road surface I was thwarted by the weather. I had to do this on a motorway which was closed for repairs and was going to be opened the next day to traffic. This, then, was a good opportunity for making the luminance measurements the client required. These had to be taken with a Spectra Pritchard Luminance Meter mounted on a special tripod, which allowed the meter to be pointed accurately at the 30 measuring points on the road. As might be imagined, this was a time consuming business. When it came to measuring the last row of points there was a slight sprinkling of rain; slight, but enough to change the reflection properties of the road completely. As the road could not be closed on the next day, because of the expense and the disruption caused to the traffic, the measurements were never done.

This is a situation where HDR photography would have been a boon. I wonder whether you have had any experience of using it for road lighting measurements. I did use photography for this application in the 1950s, when I used a large plate camera. The system was crude and I would not like to hazard a guess at the accuracy. At that time we were dogged by the formation of ghost images of the road lighting luminaires diametrically opposite the images of the luminaires. We ended up using one element of a Rapid Rectilinear lens (which has two elements, each consisting of cemented doublets), which certainly eliminated the ghost images. I imagine that the modern zoom lenses, with as many as eight components separated by air spaces, are very prone to give ghost images and to degradation of the image because of scattered light in the lens. Have you any advice on choice of lens?

Author's response to J Mardaljevic and RH Simons
MN Inanici

The HDR photography technique presented in this paper is a camera independent, low cost and accessible solution for high resolution and large field of view luminance data acquisition. As Dr Mardaljevic points out, the software is freely available from the URL given in Ward4 and it is user-friendly. Most of the work presented in this paper was done to evaluate the technique as a luminance measurement tool, hence it is not necessary to repeat the whole study to adopt the technique. Nevertheless, some of the steps have to be repeated when a different camera is employed.

It is necessary to generate a new camera response curve for each camera that is going to be used as a luminance data acquisition system. It is an easy process, and the resultant camera response curve can be used for generating the subsequent HDR images. The HDR photography technique has been tested with a few different camera models, and the error margins with different models were similar to the values reported in this paper. Yet, as with any measurement device, it is recommended that a few test measurements are performed to verify the accuracy.

Dr Mardaljevic enquires about the degree of variation in the calibration factors among the scenes. The calibration factors for the scenes presented in this paper vary between 0.87 and 2.09. The calibration factor is determined by dividing the pixel values of a selected region in the scene by the physical luminance measurement (Minolta luminance meter) of the same region. A calibration value of '1' obviously means that no calibration would be necessary. A calibration value of '2' means that the pixel value of a selected region in the scene is twice the value measured by a luminance meter. The calibration factors for Figures 2-7 are as follows: Figure 2 (laboratory space illuminated with incandescent lamp) 0.87; Figure 3 (fluorescent lamp) 0.95; Figure 4 (metal halide lamp) 0.86; Figure 5 (office space) 0.98; Figure 6 (outdoor scene with overcast sky) 1.9; and Figure 7 (outdoor scene with clear sky) 2.09. The utilization of a calibration factor becomes especially important with high dynamic range scenes such as daylit and sunlit interiors and exterior environments. Therefore, it is necessary to use the calibration factor across the scenes to assure accuracy within a 10% error margin. Yet it would be interesting to look at the variability of a calibration factor for a single scene over time.
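A minimal sketch of this ratio is given below, with hypothetical scene data. Note that the factor defined here (pixel-derived luminance divided by the meter reading) is the reciprocal sense of the multiplicative constant k in the Section 2.3 luminance formula, so dividing the image by it is one consistent way to bring the HDR luminances into line with the meter; the function name and region coordinates are assumptions.

```python
import numpy as np

def calibration_factor(hdr_lum, region, meter_value):
    """HDR-derived luminance of a selected region divided by the luminance
    meter reading of the same region, as described above."""
    return float(hdr_lum[region].mean()) / meter_value

# Hypothetical scene: the image reads a 100 cd/m2 grey target as about 190 cd/m2
hdr_lum = np.random.uniform(150.0, 230.0, size=(480, 640))
region = (slice(200, 240), slice(300, 340))      # pixels covering the grey target
cf = calibration_factor(hdr_lum, region, meter_value=100.0)
print(f"calibration factor: {cf:.2f}")           # about 1.9, cf. the overcast-sky scene

# Dividing by the factor scales the whole image to agree with the meter
calibrated = hdr_lum / cf
```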

The author agrees with Dr Mardaljevic that vignetting is higher with fisheye lenses, and wide-angle and telephoto lenses should be expected to be less prone to vignetting. It is also important to note that aperture size has a significant effect on vignetting, and it is recommended that smaller apertures are used for reducing the vignetting effects.

Due to the limited scope of the project, the author did not attempt to evaluate it for road measurements. One of the advantages of the HDR photography technique as a luminance mapping tool is its ability to offer fast data collection. Therefore, as Mr Simons indicates, the HDR photography technique has great potential in recording the luminances of the road in road lighting installations.

The author thanks the discussants for their comments, which serve to further describe the practicality and applicability of the HDR photography technique as a luminance data acquisition system. The author also thanks the anonymous discussant who has pointed to the following as additional references.

References

1 Rea MS, Jeffrey IG. A new luminance and image analysis system for lighting and vision: equipment and calibration. J Illumin Eng Soc N Am 1990; 9: 64–72.

2 Illuminating Engineering Society of North America (IESNA). IESNA approved method for digital photographic photometry of luminaires and lamps. New York: IESNA LM-77-05, 2005.

3 Reinhard E, Ward G, Pattanaik S, Debevec P. High dynamic range imaging: acquisition, display and image-based lighting. Morgan Kaufmann Publishers, 2005.




Page 4: Evaluation of high dynamic range photography as a luminance …faculty.washington.edu/inanici/Publications/LRTPublished.pdf · 2006-05-19 · Evaluation of high dynamic range photography

V frac14WK1 frac14 06444 11919 12032frac12

Gr Gg Gh

G Gr 0 0

0 Gg 0

0 0 Gb

24

35

06466 0 0

0 11919 0

0 0 12032

24

35

X frac14 04124 +R03576 +G01805 +B

Y frac14 02127 +R07151 +G00722 +B

Z frac14 00193 +R01192 +G09505 +B

4) It is possible to calibrate images and

cameras in Photosphere with a physical

luminance measurement of a selected

region in the scene This feature can be

applied as a constant (lsquok rsquo) either to thepixel values in an image or to the cameraresponse function Luminance (L) iscalculated as

Lk+(02127+R07151

+G00722+B) (cd=m2)

3 Measurement setups and analyses

To evaluate the HDR technique a suite ofmeasurements and analyses were undertakenKey sets of measurements involved the com-parison of luminances of various surfaces asdetermined from HDR images and the lumi-nance meter The measurements were repeatedin different settings which varied from con-trolled laboratory environments (blackpainted room without daylight) to officespaces and outdoors The interior settings

0

5

10

15

20

25

21G

kcalb

eulb

elprup

53 lartuen

eulb hsilprup

11G

nik s kr ad

egailo f

der

n ayc

5 lartu en

yks eulb

01G

atnegam

nee rg

r ewolf eul b

der etaredom

9G

56 lar tue n

eg naro

8G

niks th gi l

7G

neerg hsiulb

6G

neerg wolley

wolley egnaro

8 lartu en

5G

wolley

4G

3G

etihw

2G

1G

Greyscale and coloured targets

mdc( ecnanimu

L2 )

0

10

20

30

40

50

60

70

80

90

100

)( rorr

E

Measured HDR error

IncandescentAverage error percentagesAll 79Grayscale targets 46Colored targets 112

Figure 2 Dark room with greyscale and coloured targets illuminated with an incandescent lamp

126 MN Inanici

Lighting Res Technol 382 (2006) pp 123136

had different light sources and different illu-minances Precautions were taken to ensurethat the equipment and the people who madethe measurements did not cast shadows on thetargets The measurements were made withfive sets of paper targets The first targetconsisted of 24 squares (two sets of 12squares) which had reflectances ranging from4 to 87 The other four targets had a greysquare target in the middle with a 28reflectance This square was enveloped by a)white (87 reflectance) b) black (4)c) white-then-black d) black-then-white sur-roundings The Macbeth ColorCheckerdagger

chart was also used as a targetThe centre of each target was measured with

the Minolta luminance meter and each mea-surement setup was captured with the HDRimages When daylight was present in thesetup the physical measurements were madetwice before and after the multiple exposurephotographs were taken There was a time lagbetween the physical measurements and the

multiple exposure photographs thereforebracketing the photograph sequences withphysical measurements allowed the variationof the lighting conditions to be characterizedA rectangular block of pixels (approximatelycorresponding to the size and location of thespot used for the luminance meter) wasextracted from the digital images for eachtarget The selection area was fitted insideeach target and excluded the borders wherethe luminances could change drastically Theminimum maximum mean and standarddeviation of each block were studied to ensurethat luminance within the selected region wasuniform This approach was considered to bemore reliable than processing 1 or 2 pixelvalues

31 AccuracyA laboratory space was illuminated with

different light sources such as an incandescentlamp a projector (using a 500W tungstenlamp) fluorescent (T12 T8 and T5 lamps

0

10

20

30

40

50

60

70

80

21G

kcalb

eulb

elprup

11G

inks krad

53 lartuen

eulb hsilp rup

egailof

der

nay c

etihw

yks eulb

01G

rewolf eu lb

neerg

6G

9G

der etaredom

egnaro

8G

7G

n iks thgil

neerg hs iu lb

5G

neerg wolley

wolley egnaro

8 lartuen

4G

wolley

3G

2G

etihw

1G

Greyscale and coloured targets

mdc( ecnanimu

L2 )

0

10

20

30

40

50

60

70

80

90

100

)( rorr

E

Measured HDR error

Fluorescent

Average error percentagesAll 53Grayscale targets 30Colored targets 74

Figure 3 Dark room with greyscale and coloured targets illuminated with a T5 fluorescent (3000K) lamp

High dynamic range photography 127

Lighting Res Technol 382 (2006) pp 123136

0

100

200

300

400

500

600

700

800

900

1000

kcalb

eulb

5 3 lartuen

niks krad

egailof

nayc

11G

n ee rg

der etaredom

01G

niks thgil

neerg wolley

9G

wolley

7G

6G

4G

2G

Greyscale and coloured targets

mdc( ecnanimu

L2)

0

10

20

30

40

50

60

70

80

90

100

)( rorr

E

Measured HDR error

Metal HalideAverage error percentagesAll 48Grayscale targets 38Colored targets 58

Figure 4 Dark room with greyscale and coloured targets illuminated with a metal halide lamp

0

50

100

150

200

250

300

1G

3G

5G

7G

8G

01G

21G

niks krad

egailof

d er

nayc

71G

81G

atnegam

1 2G

d er etar edom

32G

4 2G

6 2G

8 2G

9 2G

03G

niks thgil

3 3G

4 3G

5 3G

73G

neerg wolley

0 4G

Greyscale and coloured targets

mdc( ecnanimu

L2 )

0

10

20

30

40

50

60

70

80

90

100

)( rorr

E

Measured HDR error

Average error percentagesAll 72Grayscale targets 52Colored targets 102

Figure 5 Coloured and greyscale targets in an office space illuminated with daylighting and electric lighting

128 MN Inanici

Lighting Res Technol 382 (2006) pp 123136

with CCT of 6500K 3500K and 3000Krespectively) metal halide (MH) and highpressure sodium (HPS) lamps The accuracy

tests were carried out with Target 1 and aMacbeth ColorCheckerdagger chart Figures 24show the accuracy obtained for incandescent

0

2000

4000

6000

8000

10000

12000

14000

16000

18000

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24

Greyscale targets

mdc( ecnan im u

L2 )

0

10

20

30

40

50

60

70

80

90

100

)( rorr

E

before after HDR Error

Average error 116

Figure 7 Outdoor scene with greyscale targets under clear sky

0

1000

2000

3000

4000

5000

6000

1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22

Greyscale targets

mdc( ecnanimu

L2 )

0

10

20

30

40

50

60

70

80

90

100

)( rorr

E

before after HDR Error

Average error 58

Figure 6 Outdoor scene with greyscale targets under overcast sky

High dynamic range photography 129

Lighting Res Technol 382 (2006) pp 123136

fluorescent and MH light sources Althoughthe spectral properties of the light sourceswere quite different the accuracy proved to bereasonable (error margin within 10) for alltargets The error margins for greyscale andcoloured targets varied depending on thespectral power distribution of the lightsources Figure 5 shows an office spaceilluminated with daylight and electric lightingFigures 6 and 7 illustrate outdoor scenes withdynamic lighting conditions The measure-ment setups shown here are chosen to providea wide spectrum of luminous environmentsAdditional setups have been measured and theresults reveal similar error margins11

32 VignettingLight fall offThe Nikon FCE9 fisheye lens uses equidi-

stant projection to produce an image (ie thedistances between two points are projectedby a constant scaling factor in all directions)The equidistant fisheye lenses exhibit notice-able light fall off (which will be referred toas vignetting) for the pixels far from theoptical axis It is therefore necessary toaddress and correct this aberration Theapproach adopted was to determine thevignetting function of the Nikon FCE9 fisheye

lens and use this function to devise a lsquodigitalfilterrsquo in HDRLab to compensate for theluminance loss

The camera was rotated with respect to thetarget in increments of 58 until the camerafield of view was covered The luminance of atarget was determined from HDR photo-graphs for each camera position and com-pared with the physical measurement Figure 8(a) illustrates the normalized data points andthe polynomial function derived to fit thesepoints The square of the correlation (r2)between the measured points and the derivedvignetting function is 9912 As it can beseen from the figure vignetting effects areinsignificant in the centre of the image and themaximum effect is at the periphery with 23luminance loss

Figure 8 (b) is the digital filter developed inHDRLab based on the vignetting functionThis filter is a matrix that has the sameresolution as the fisheye image and it com-pensates for the vignetting effect based on thepixel locations The vignetting function shownin Figure 8 (b) is generated for an aperture sizeof f40 Note that vignetting is stronglydependent on the aperture size and increasesdramatically with wider apertures Therefore

y = -3E-09x4 + 1E-06x3 - 00002x2 + 00101x + 07704

R2 = 09912

0

02

04

06

08

1

12

0 30 60 90 120 150 180

Field of view (deg)

tceffe gnittengiV

measured Polynomial fit of measured data

(a) (b)

Vig

netti

ng e

ffec

t

1

08

06

04

02

0Pixel location (x coordinate) Pixel location (y coordinate)

Figure 8 The vignetting function of Nikon FC9 fisheye lens a) Measured data points and the vignetting function derived to fitthe measured points b) digital filter developed based on the vignetting function

130 MN Inanici

Lighting Res Technol 382 (2006) pp 123136

it is recommended that smaller apertures areused so as to reduce vignetting effects whilecapturing the calibration sequence which de-termines the camera response curves7

33 Point spread functionIt would have been very convenient to

interpret each pixel in the photograph as aluminance measurement but unfortunately inmost optical systems some portion of the pixelsignal comes from surrounding areas Lightentering the camera is spread out and scat-tered by the optical structure of the lens

which is referred to as the point spreadfunction (PSF)14

A point light source is used to quantify thePSF The source is located far enough fromthe camera to ensure that theoretically itcovers less than one pixel area Without anypoint spread the image of the light sourceshould fit onto one pixel The point spreadingis illustrated in Figure 9 (a) and (b) as close-upviews

The aperture size exposure time and eccen-tricity (distance from the optical centre) affectPSF The aperture size was kept constant

Figure 10 Histograms for error percentages between the measured and captured luminances for a) greyscale targets b)coloured targets

Figure 9 a) Close-up view of point spreading the square in (b) shows the pixel that corresponds to the actual light source c)PSF as quantified from the HDR image d) PSF as a function of eccentricity (distance from the optical centre) in the camera fieldof view from 0 to 908 in 158 intervals

High dynamic range photography 131

Lighting Res Technol 382 (2006) pp 123136

throughout the capturing processes thereforeit was not a parameter affecting the lumi-nances The effects of exposure time andeccentricity were recorded and they wereplotted in RGB channels The PSF of theHDR image is shown in Figure 9 (c) Theeffect of the eccentricity was studied at 158intervals and it is illustrated in Figure 9 (d)The spread affected a limited number of theneighbouring pixels For most architecturallighting applications it seems to be a smalleffect However it is important to note thatthe general scattering of light is such that thecumulative effect of a bright backgroundsurrounding a dark target can lead to largermeasurement errors This is indeed evident inthe measurement results for darker targetsshown in Section 31 Unfortunately it is notfeasible to quantify the general scatteringConsequently point spread is one of theaccumulative factors in the error marginwhich was quantified as less than 10 onaverage in Section 31

4 Discussion

The HDR images used in this paper weregenerated with the camera response functionsshown in Figure 1 The luminances werecalculated with the transformation functionsgiven in Section 23 The calibration feature ofPhotosphere was used for fine-tuning theluminances with a grey target in each sceneThe vignetting effect was corrected by utilizingthe function shown in Figure 8

The luminances extracted from theHDR images indicate reasonable accuracywhen compared with physical measurementsFigure 10 presents a histogram for errorpercentages for 485 coloured and greyscaletarget points under a wide range of electricand daylighting conditions The minimumand maximum measured target luminancesare 05 cdm2 and 12870 cdm2 respectivelyThe average error percentages for all grey-scale and coloured targets were 73 58

and 93 respectively The square of thecorrelation (r2) between the measured andcaptured luminances was found to be 988There is an increased error for the darkergreyscale targets As mentioned in Section33 the general scattering in the lens andsensor affect the darker regions of the imagesdisproportionately The consequence of thisscattering is an over-estimation of the lumi-nances of the darker regions Since theluminance is quite low for these targetssmall differences between the measuredand captured values yield higher error per-centages

The results also revealed increased errorsfor the saturated colours (Figure 10b) InPhotosphere a separate polynomial functionis derived to fit to each RGB channel How-ever it is important to note that the RGBvalues produced by the camera are mixedbetween the different red green and bluesensors of the CCD The CCD sensors havecoloured filters that pass red green or bluelight With the large sensor resolution it wasassumed that enough green (red blue)-filteredpixels receive enough green (red blue) light toensure that the image would yield reasonableresults The sensor arrays are usually arrangedin a Bayer (mosaic) pattern such that 50 ofthe pixels have green and 25 of the pixelshave red and blue filters When the image issaved to a file algorithms built within thecamera employ interpolation between theneighbouring pixels15 The HDR algorithmassumes that the computed response functionspreserve the chromaticity of the correspondingscene points5 In an effort to keep the RGBtransformations constant within the camerathe white balance setting was kept constantthroughout the capturing process The cameraresponse functions were generated with theseconstraints Likewise the luminance calcula-tions were approximated based on sRGBreference primaries with the assumption thatsRGB provides a reasonable approximation tothe camera sensor primaries

This work shows that the HDR photography technique is a useful tool for capturing luminances over a wide range within 10% accuracy. It is worth mentioning that luminance meter measurements also have expected errors: V(λ) match (3%), UV response (0.2%), IR response (0.2%), directional response (2%), effect from the surrounding field (1%), linearity error (0.2%), fatigue (0.1%), polarization (0.1%) and errors of focus (0.4%). The percentage values are published by the CIE;16 they correspond with representative errors that are collected from the best available commercial instruments.

It is not suggested that HDR photography is a substitute for luminance meter measurements. It requires calibration against a point or area of a reliable standard target with a reliable luminance meter to have absolute validity. Yet it provides a measurement capability with the advantage of collecting high resolution luminance data within a large field of view, quickly and efficiently, which is not possible to achieve with a luminance meter. It uses equipment that is within the budgets of lighting practitioners and researchers. Additionally, the self-calibration algorithm in Photosphere provides quick and easy camera response functions, compared to the lengthy calibration measurements required in prior research.

The HDR technique requires stable conditions over the period of the capturing process. Dynamic lighting conditions resulting in significant light changes between differently exposed photographs can compromise the accuracy of the end result. It is strongly advisable to measure a single target in the scene to be used as a calibration feature. Finally, as with any measurement and simulation tool, the user should be aware of the limitations and expected errors to be able to interpret the results meaningfully.

HDR imagery provides data for qualitative and quantitative lighting analysis, since HDR images can be studied for visual analysis by adjusting the exposure to different ranges, and post-processed to:

- extract photometric information on a pixel scale; this information can be utilized for statistical and mathematical analysis;

- generate false colour images (a range of colours is assigned to represent a range of luminances) and iso-contour lines (coloured contour lines are superimposed on an image to demonstrate the distribution of the luminances), and calculate glare with automated analysis tools17,18 (a sketch of the false-colour and iso-contour idea follows this list).
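As an illustration of the false-colour and iso-contour post-processing mentioned above, a minimal sketch follows; it assumes the HDR image is already available as a linear floating-point RGB array, reuses the sRGB-based luminance weighting assumed earlier, and stands in for, rather than reproduces, the automated glare tools cited above.

    import numpy as np
    import matplotlib.pyplot as plt

    def luminance_map(hdr_rgb, cf=1.0):
        # hdr_rgb: array of shape (height, width, 3) holding linear RGB values
        r, g, b = hdr_rgb[..., 0], hdr_rgb[..., 1], hdr_rgb[..., 2]
        return cf * 179.0 * (0.2127 * r + 0.7151 * g + 0.0722 * b)

    # Placeholder image: a smooth gradient standing in for a real HDR capture
    gradient = np.tile(np.linspace(0.0, 5.0, 640), (480, 1))
    demo = np.dstack([gradient, gradient, gradient])

    lum = luminance_map(demo)
    plt.imshow(lum, cmap='jet')                  # false colour: colour encodes luminance
    plt.colorbar(label='Luminance (cd/m2)')
    plt.contour(lum, levels=8, colors='black')   # iso-contour lines superimposed
    plt.title('False colour luminance map (synthetic example)')
    plt.show()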

Acknowledgements

This work was sponsored by the Assistant Secretary for Energy Efficiency and Renewable Energy, Office of Building Technology, Building Technologies Program of the US Department of Energy under Contract No. DE-AC03-76SF00098. I thank Greg Ward (Anyhere Software) for his continuous guidance and feedback throughout the study. I am grateful for the technical help and advice from Jim Galvin (LBNL) on photography and physical measurement issues. I also thank Steve Johnson (LBNL) and Francis Rubinstein (LBNL) for their suggestions.

5 References

1 Uetani Y. Measurement of CIE tristimulus values XYZ by color video images: development of video colorimetry. J Arch Planning Env Eng 2001; 543: 25–31.

2 Belia L, Minichiello F, Sibilio S. Setting up a CCD photometer for lighting research and design. Building Env 2002; 37: 1099–1106.

3 Coutelier B, Dumortier D. Luminance calibration of the Nikon Coolpix 990 digital camera. Proceedings of the CIE Conference, San Diego: International Commission on Illumination, 2003.

4 Ward G. Photosphere. <http://www.anyhere.com>.

5 Mitsunaga T, Nayar SK. Radiometric self-calibration. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Fort Collins: Institute of Electrical and Electronics Engineers, 1999.

6 Debevec PE, Malik J. Recovering high dynamic range radiance maps from photographs. Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques, Los Angeles: SIGGRAPH, 1997.

7 Ward G. Photosphere Quick-start. <http://www.anyhere.com>.

8 Ward G. Real pixels. In Arvo J, editor. Graphics Gems II: Academic Press, 1991: 80–83.

9 Ward G. The LogLuv encoding for full gamut, high dynamic range images. ACM Journal of Graphics Tools 1998; 3: 15–31.

10 Stokes M, Chandrasekar S. Standard color space for the internet: sRGB. <http://www.w3.org/Graphics/Color/sRGB>.

11 Inanici MN, Galvin J. Evaluation of High Dynamic Range photography as a luminance mapping technique. LBNL Report No. 57545, Berkeley: Lawrence Berkeley National Laboratory, 2004.

12 Wyszecki G, Stiles WS. Color science: concepts and methods, quantitative data and formulae. John Wiley and Sons, 2000.

13 Glassner AS. Principles of digital image synthesis. Morgan Kaufman Publishers, 1995.

14 Du H, Voss KJ. Effects of point-spread function on calibration and radiometric accuracy of CCD camera. Applied Optics 2004; 43: 665–70.

15 Busch DD. Mastering digital photography. Muska & Lipman Publishing, 2004.

16 International Commission on Illumination (CIE). Methods of characterizing illuminance meters and luminance meters: performance, characteristics and specification. Vienna: CIE, 1987.

17 Ward G. Radiance Visual Comfort Calculation. 1992. <http://radsite.lbl.gov/radiance/refer/Notes/glare.html>.

18 Wienold J, Reetz C, Kuhn T. Evalglare. 3rd International Radiance Workshop, Fribourg, Switzerland, 11–12 October 2004. <http://www.radiance-online.org/radiance-workshop3/cd/Wienold_extabs.pdf>.

Discussion

Comment 1 on 'Evaluation of high dynamic range photography as a luminance data acquisition system' by MN Inanici
J Mardaljevic (IESD, De Montfort University, The Gateway, Leicester, UK)

The appearance of this paper is very timely, as I suspect that many of us in the lighting research community are still unfamiliar with HDR imaging and the possibilities that it offers. I think it is fair to say that this work presents a significant advance over the techniques described by Moore et al. in 2000.1 The queries that follow are prompted by a desire to anticipate the practicalities involved in using the technique for experimental work. In particular, how much of the work described here would need to be repeated to achieve reliable results? It is noted in Section 4 that the calibration feature of the Photosphere software was applied 'in each scene'. It would be instructive to know the degree of the variation in the calibration factors across the scenes. Related to this, could Dr Inanici comment on what would be the likely impact on estimated luminance if the camera was calibrated just once against a spot measurement and then used for long-term monitoring? The FCE9 lens that was used is described as a 'Fisheye Converter'. Is it the case that vignetting is likely to be greatest using such a lens, and are more moderate wide-angle and telephoto lenses less prone to this effect? It should be noted that Ward's Photosphere software (which runs under Mac OS X) is freely available from the URL given in the references.

Reference

1 Moore TA, Graves H, Perry MJ, Carter DJ. Approximate field measurement of surface luminance using a digital camera. Lighting Res Technol 2000; 32: 111.

Comment 2 on 'Evaluation of high dynamic range photography as a luminance data acquisition system' by MN Inanici
RH Simons (23 Oaklands Park, Bishops Stortford, Hertfordshire, UK)

The HDR photography system investigated by Mehlika Inanici seems principally aimed at measuring the luminances of interiors. I see no reason why this method could not be applied to recording the luminances of the road in road lighting installations.

I mention this because the last time I had to measure the luminance distribution of a road surface I was thwarted by the weather. I had to do this on a motorway, which was closed for repairs and was going to be opened the next day to traffic. This, then, was a good opportunity for making the luminance measurements the client required. These had to be taken with a Spectra Pritchard Luminance Meter mounted on a special tripod, which allowed the meter to be pointed accurately at the 30 measuring points on the road. As might be imagined, this was a time-consuming business. When it came to measuring the last row of points there was a slight sprinkling of rain; slight, but enough to change the reflection properties of the road completely. As the road could not be closed on the next day, because of the expense and the disruption caused to the traffic, the measurements were never done.

This is a situation where HDR photography would have been a boon. I wonder whether you have had any experience of using it for road lighting measurements? I did use photography for this application in the 1950s, when I used a large plate camera. The system was crude and I would not like to hazard a guess at the accuracy. At that time we were dogged by the formation of ghost images of the road lighting luminaires, diametrically opposite the images of the luminaires. We ended up using one element of a Rapid Rectilinear lens (which has two elements, each consisting of cemented doublets), which certainly eliminated the ghost images. I imagine that the modern zoom lenses, with as many as eight components separated by air spaces, are very prone to give ghost images and to degradation of the image because of scattered light in the lens. Have you any advice on choice of lens?

Author's response to J Mardaljevic and RH Simons
MN Inanici

The HDR photography technique presented in this paper is a camera independent, low cost and accessible solution for high resolution and large field of view luminance data acquisition. As Dr Mardaljevic points out, the software is freely available from the URL given in Ward4 and it is user-friendly. Most of the work presented in this paper was done to evaluate the technique as a luminance measurement tool; hence it is not necessary to repeat the whole study to adopt the technique. Nevertheless, some of the steps have to be repeated when a different camera is employed.

It is necessary to generate a new camera response curve for each camera that is going to be used as a luminance data acquisition system. It is an easy process, and the resultant camera response curve can be used for generating the subsequent HDR images. The HDR photography technique has been tested with a few different camera models, and the error margins with different models were similar to the values reported in this paper. Yet, as with any measurement device, it is recommended that a few test measurements are performed to verify the accuracy.

Dr Mardaljevic enquires about the degree of variation in the calibration factors among the scenes. The calibration factors for the scenes presented in this paper vary between 0.87 and 2.09. The calibration factor is determined by dividing the pixel values of a selected region in the scene by the physical luminance measurement (Minolta luminance meter) of the same region.

A calibration value of '1' obviously means that no calibration would be necessary. A calibration value of '2' means that the pixel value of a selected region in the scene is twice the value measured by a luminance meter. The calibration factors for Figures 2–7 are as follows: Figure 2 (laboratory space illuminated with an incandescent lamp), 0.87; Figure 3 (fluorescent lamp), 0.95; Figure 4 (metal halide lamp), 0.86; Figure 5 (office space), 0.98; Figure 6 (outdoor scene with overcast sky), 1.9; and Figure 7 (outdoor scene with clear sky), 2.09. The utilization of a calibration factor becomes especially important with high dynamic range scenes, such as daylit and sunlit interiors and exterior environments. Therefore, it is necessary to use the calibration factor across the scenes to assure accuracy within a 10% error margin. Yet it would be interesting to look at the variability of a calibration factor for a single scene over time.
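A minimal sketch of this bookkeeping, with illustrative names only, is given below; it follows the definition stated here (CF = HDR-derived value of the region divided by the meter reading), so the correction is applied as a division.

    import numpy as np

    def calibration_factor(hdr_lum, region, meter_reading):
        # CF = mean HDR-derived luminance of the selected region / meter reading
        rows, cols = region                      # e.g. (slice(200, 220), slice(300, 320))
        return hdr_lum[rows, cols].mean() / meter_reading

    # Placeholder HDR luminance map and a hypothetical meter reading of 52 cd/m2
    hdr_lum = np.random.uniform(80.0, 120.0, size=(480, 640))
    cf = calibration_factor(hdr_lum, (slice(200, 220), slice(300, 320)), 52.0)
    calibrated = hdr_lum / cf                    # now agrees with the meter on that region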

The author agrees with Dr Mardaljevic that vignetting is higher with fisheye lenses, and wide-angle and telephoto lenses should be expected to be less prone to vignetting. It is also important to note that aperture size has a significant effect on vignetting, and it is recommended that smaller apertures are used for reducing the vignetting effects.

Due to the limited scope of the project, the author did not attempt to evaluate it for road measurements. One of the advantages of the HDR photography technique as a luminance mapping tool is its ability to offer fast data collection. Therefore, as Mr Simons indicates, the HDR photography technique has great potential in recording the luminances of the road in road lighting installations.

The author thanks the discussants for their comments, which serve to further describe the practicality and applicability of the HDR photography technique as a luminance data acquisition system. The author also thanks the anonymous discussant who has pointed to the following as additional references.

References

1 Rea MS, Jeffrey IG. A new luminance and image analysis system for lighting and vision: equipment and calibration. J Illumin Eng Soc N Am 1990; 9: 64–72.

2 Illuminating Engineering Society of North America (IESNA). IESNA approved method for digital photographic photometry of luminaires and lamps. New York: IESNA, LM-77-05, 2005.

3 Reinhard E, Ward G, Pattanaik S, Debevec P. High dynamic range imaging: acquisition, display and image-based lighting. Morgan Kaufmann Publishers, 2005.


throughout the capturing processes thereforeit was not a parameter affecting the lumi-nances The effects of exposure time andeccentricity were recorded and they wereplotted in RGB channels The PSF of theHDR image is shown in Figure 9 (c) Theeffect of the eccentricity was studied at 158intervals and it is illustrated in Figure 9 (d)The spread affected a limited number of theneighbouring pixels For most architecturallighting applications it seems to be a smalleffect However it is important to note thatthe general scattering of light is such that thecumulative effect of a bright backgroundsurrounding a dark target can lead to largermeasurement errors This is indeed evident inthe measurement results for darker targetsshown in Section 31 Unfortunately it is notfeasible to quantify the general scatteringConsequently point spread is one of theaccumulative factors in the error marginwhich was quantified as less than 10 onaverage in Section 31

4 Discussion

The HDR images used in this paper weregenerated with the camera response functionsshown in Figure 1 The luminances werecalculated with the transformation functionsgiven in Section 23 The calibration feature ofPhotosphere was used for fine-tuning theluminances with a grey target in each sceneThe vignetting effect was corrected by utilizingthe function shown in Figure 8

The luminances extracted from theHDR images indicate reasonable accuracywhen compared with physical measurementsFigure 10 presents a histogram for errorpercentages for 485 coloured and greyscaletarget points under a wide range of electricand daylighting conditions The minimumand maximum measured target luminancesare 05 cdm2 and 12870 cdm2 respectivelyThe average error percentages for all grey-scale and coloured targets were 73 58

and 93 respectively The square of thecorrelation (r2) between the measured andcaptured luminances was found to be 988There is an increased error for the darkergreyscale targets As mentioned in Section33 the general scattering in the lens andsensor affect the darker regions of the imagesdisproportionately The consequence of thisscattering is an over-estimation of the lumi-nances of the darker regions Since theluminance is quite low for these targetssmall differences between the measuredand captured values yield higher error per-centages

The results also revealed increased errorsfor the saturated colours (Figure 10b) InPhotosphere a separate polynomial functionis derived to fit to each RGB channel How-ever it is important to note that the RGBvalues produced by the camera are mixedbetween the different red green and bluesensors of the CCD The CCD sensors havecoloured filters that pass red green or bluelight With the large sensor resolution it wasassumed that enough green (red blue)-filteredpixels receive enough green (red blue) light toensure that the image would yield reasonableresults The sensor arrays are usually arrangedin a Bayer (mosaic) pattern such that 50 ofthe pixels have green and 25 of the pixelshave red and blue filters When the image issaved to a file algorithms built within thecamera employ interpolation between theneighbouring pixels15 The HDR algorithmassumes that the computed response functionspreserve the chromaticity of the correspondingscene points5 In an effort to keep the RGBtransformations constant within the camerathe white balance setting was kept constantthroughout the capturing process The cameraresponse functions were generated with theseconstraints Likewise the luminance calcula-tions were approximated based on sRGBreference primaries with the assumption thatsRGB provides a reasonable approximation tothe camera sensor primaries

132 MN Inanici

Lighting Res Technol 382 (2006) pp 123136

This work shows that the HDR photogra-phy technique is a useful tool for capturingluminances over a wide range within 10accuracy It is worth mentioning that lumi-nance meter measurements also have expectederrors V(l) match (3) UV response (02)IR response (02) directional response(2) effect from the surrounding field (1)linearity error (02) fatigue (01) polari-zation (01) and errors of focus (04) Thepercentage values are published by the CIE16

they correspond with representative errorsthat are collected from the best availablecommercial instruments

It is not suggested that HDR photographyis a substitute for luminance meter measure-ments It requires calibration against a pointor area of a reliable standard target with areliable luminance meter to have absolutevalidity Yet it provides a measurementcapability with the advantage of collectinghigh resolution luminance data within alarge field of view quickly and efficientlywhich is not possible to achieve with aluminance meter It uses equipment that iswithin the budgets of lighting practitionersand researchers Additionally the self-calibra-tion algorithm in Photosphere provides quickand easy camera response functions comparedto the lengthy calibration measurements re-quired in prior research

The HDR technique requires stable condi-tions over the period of the capturing processDynamic lighting conditions resulting insignificant light changes between differentlyexposed photographs can compromise theaccuracy of the end result It is stronglyadvisable to measure a single target in thescene to be used as a calibration featureFinally as with any measurement and simula-tion tool the user should be aware of thelimitations and expected errors to be able tointerpret the results meaningfully

HDR imagery provides data for qualitativeand quantitative lighting analysis since HDRimages can be studied for visual analysis by

adjusting the exposure to different ranges andpost-processed to

extract photometric information on a pixelscale this information can be utilized forstatistical and mathematical analysis

generate false colour images (a range ofcolours is assigned to represent a range ofluminances) iso-contour lines (colouredcontour lines are superimposed on an imageto demonstrate the distribution of theluminances) and calculate glare with auto-mated analysis tools1718

Acknowledgements

This work was sponsored by the AssistantSecretary for Energy Efficiency and Renew-able Energy Office of Building TechnologyBuilding Technologies Program of the USDepartment of Energy under Contract NoDE-AC03-76SF00098 I thank Greg Ward(Anyhere Software) for his continuous gui-dance and feedback throughout the study Iam grateful for the technical help and advicefrom Jim Galvin (LBNL) on photographyand physical measurement issues I alsothank Steve Johnson (LBNL) and FrancisRubinstein (LBNL) for their suggestions

5 References

1 Uetani Y Measurement of CIE tristimulusvalues XYZ by color video images develop-ment of video colorimetry J Arch PlanningEnv Eng 2001 543 2531

2 Belia L Minichiello F Sibilio S Setting up aCCD photometer for lighting research anddesign Building Env 2002 37 10991106

3 Coutelier B Dumortier D Luminance calibra-tion of the Nikon Coolpix 990 digital cameraProceedings of the CIE Conference San DiegoInternational Commission on Illumination2003

4 Ward G Photosphere Bhttpwwwanyherecom

High dynamic range photography 133

Lighting Res Technol 382 (2006) pp 123136

5 Mitsunaga T Nayar SK Radiometric self-calibration Proceedings of IEEE Conference onComputer Vision and Pattern Recognition FortCollins Institute of Electrical and ElectronicsEngineers 1999

6 Debevec PE Malik J Recovering highdynamic range radiance maps from photographsProceedings of the 24th Annual Conference onComputer Graphics and Interactive TechniquesLos Angeles SIGGRAPH 1997

7 Ward G Photosphere Quick-startBhttpwwwanyherecom

8 Ward G Real pixels In Arvo J editor Graphicsgems II Academic Press 1991 8083

9 Ward G The LogLuv Encoding for full gamuthigh dynamic range images ACM Journal ofGraphics Tools 1998 3 1531

10 Stokes M Chandrasekar S Standardcolor space for the internet sRGBBhttpwwww3orgGraphicsColorsRGB

11 Inanici MN Galvin J Evaluation of HighDynamic Range photography as a luminancemapping technique LBNL Report No 57545Berkeley Lawrence Berkeley NationalLaboratory 2004

12 Wyszecki G Stiles WS Color science conceptsand methods quantitative data and formulaeJohn Wiley and Sons 2000

13 Glassner AS Principles of digital imagesynthesis Morgan Kaufman Publishers1995

14 Du H Voss KJ Effects of point-spread func-tion on calibration and radiometric accuracy ofCCD camera Applied Optics 2004 43 66570

15 Busch DD Mastering digital photographyMuska amp Lipman Publishing 2004

16 International Commission on Illumination(CIE) Methods of characterizing illuminancemeters and luminance meters performancecharacteristics and specification Vienna CIE1987

17 Ward G Radiance Visual Comfort Calcula-tion 1992 BhttpradsitelblgovradiancereferNotesglarehtml

18 Wienold J Reetz C Kuhn T Evalglare3International Radiance WorkshopFribourg Switzerland 1112 October 2004Bhttpwwwradiance-onlineorgradiance-workshop3cdWienold_extabspdf

Discussion

Comment 1 on lsquoEvaluation of high dyna-mic range photography as a luminancedata acquisition systemrsquo by MN InaniciJ Mardaljevic (IESD De Montfort UniversityThe Gateway Leicester UK)

The appearance of this paper is very timely asI suspect that many of us in the lightingresearch community are still unfamiliar withHDR imaging and the possibilities that itoffers I think it is fair to say that this workpresents a significant advance over the tech-niques described by Moore et al in 20001 Thequeries that follow are prompted by a desire toanticipate the practicalities involved in usingthe technique for experimental work In parti-cular how much of the work described herewould need to be repeated to achieve reliableresults It is noted in Section 4 that thecalibration feature of the Photosphere soft-ware was applied lsquoin each scenersquo It would beinstructive to know the degree of the variationin the calibration factors across the scenesRelated to this could Dr Inanici comment onwhat would be the likely impact on estimatedluminance if the camera was calibratedjust once against a spot measurement andthen used for long-term monitoring TheFCE9 lens that was used is described as alsquoFisheye Converterrsquo Is it the case that vignet-ting is likely to be greatest using such a lensand are more moderate wide-angle and tele-photo lenses less prone to this effect It shouldbe noted that Wardrsquos Photosphere software(which runs under Mac OS X) is freelyavailable from the URL given in the refer-ences

Reference

1 Moore TA Graves H Perry MJ Carter DJApproximate field measurement of surfaceluminance using a digital camera Lighting ResTechnol 2000 32 111

134 MN Inanici

Lighting Res Technol 382 (2006) pp 123136

Comment 2 on lsquoEvaluation of high dyna-mic range photography as a luminancedata acquisition systemrsquo by MN InaniciRH Simons (23 Oaklands Park BishopsStortford Hertfordshire UK)

The HDR photography system investigated byMehlika Inanici seems principally aimed atmeasuring the luminances of interiors I see noreason why this method could not be appliedto recording the luminances of the road inroad lighting installations

I mention this because the last time I had to measure the luminance distribution of a road surface I was thwarted by the weather. I had to do this on a motorway which was closed for repairs and was going to be opened the next day to traffic. This, then, was a good opportunity for making the luminance measurements the client required. These had to be taken with a Spectra Pritchard Luminance Meter mounted on a special tripod which allowed the meter to be pointed accurately at the 30 measuring points on the road. As might be imagined, this was a time consuming business. When it came to measuring the last row of points there was a slight sprinkling of rain; slight, but enough to change the reflection properties of the road completely. As the road could not be closed on the next day, because of the expense and the disruption caused to the traffic, the measurements were never done.

This is a situation where HDR photography would have been a boon. I wonder whether you have had any experience of using it for road lighting measurements? I did use photography for this application in the 1950s, when I used a large plate camera. The system was crude and I would not like to hazard a guess at the accuracy. At that time we were dogged by the formation of ghost images of the road lighting luminaires, diametrically opposite the images of the luminaires. We ended up using one element of a Rapid Rectilinear lens (which has two elements, each consisting of cemented doublets), which certainly eliminated the ghost images. I imagine that the modern zoom lenses, with as many as eight components separated by air spaces, are very prone to give ghost images and to degradation of the image because of scattered light in the lens. Have you any advice on choice of lens?

Author's response to J Mardaljevic and RH Simons
MN Inanici

The HDR photography technique presented in this paper is a camera independent, low-cost and accessible solution for high resolution and large field of view luminance data acquisition. As Dr Mardaljevic points out, the software is freely available from the URL given in Ward4 and it is user-friendly. Most of the work presented in this paper was done to evaluate the technique as a luminance measurement tool; hence it is not necessary to repeat the whole study to adopt the technique. Nevertheless, some of the steps have to be repeated when a different camera is employed.

It is necessary to generate a new camera response curve for each camera that is going to be used as a luminance data acquisition system. It is an easy process, and the resultant camera response curve can be used for generating the subsequent HDR images. The HDR photography technique has been tested with a few different camera models, and the error margins with the different models were similar to the values reported in this paper. Yet, as with any measurement device, it is recommended that a few test measurements are performed to verify the accuracy.
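To illustrate how such a response curve is subsequently used, the sketch below (Python/NumPy) shows the general idea behind fusing a bracketed exposure sequence into a relative radiance map, in the spirit of the self-calibration methods of references 5 and 6. It is not Photosphere's implementation: the polynomial coefficients, the weighting scheme and the function names are illustrative assumptions.

    import numpy as np

    def inverse_response(pixel, coeffs=(0.0, 0.7, 0.2, 0.1)):
        # Map normalized pixel values (0..1) to relative exposure. The coefficients
        # here are placeholders; in practice they come from the camera response
        # curve derived through the self-calibration process.
        p = np.clip(pixel, 0.0, 1.0)
        return sum(c * p**i for i, c in enumerate(coeffs))

    def fuse_exposures(images, exposure_times):
        # images: list of (H, W) float arrays normalized to 0..1, one per exposure.
        # exposure_times: matching list of exposure times in seconds.
        # Returns a relative radiance map as a weighted average over the exposures.
        numerator = np.zeros_like(images[0], dtype=float)
        denominator = np.zeros_like(images[0], dtype=float)
        for img, t in zip(images, exposure_times):
            weight = np.clip(1.0 - 2.0 * np.abs(img - 0.5), 1e-3, None)  # favour mid-range pixels
            numerator += weight * inverse_response(img) / t
            denominator += weight
        return numerator / denominator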

Dr Mardaljevic enquires about the degree of variation in the calibration factors among the scenes. The calibration factors for the scenes presented in this paper vary between 0.87 and 2.09. The calibration factor is determined by dividing the pixel values of a selected region in the scene by the physical luminance measurement (Minolta luminance meter) of the same region. A calibration value of '1' obviously means that no calibration would be necessary. A calibration value of '2' means that the pixel value of a selected region in the scene is twice the value measured by a luminance meter. The calibration factors for Figures 2-7 are as follows: Figure 2 (laboratory space illuminated with incandescent lamp), 0.87; Figure 3 (fluorescent lamp), 0.95; Figure 4 (metal halide lamp), 0.86; Figure 5 (office space), 0.98; Figure 6 (outdoor scene with overcast sky), 1.9; and Figure 7 (outdoor scene with clear sky), 2.09. The utilization of a calibration factor becomes especially important with high dynamic range scenes such as daylit and sunlit interiors and exterior environments. Therefore, it is necessary to use the calibration factor across the scenes to assure accuracy within a 10% error margin. Yet it would be interesting to look at the variability of a calibration factor for a single scene over time.
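A minimal sketch of that calibration step is given below, assuming the HDR image has already been converted to a per-pixel luminance map; the function names and the target mask are hypothetical.

    import numpy as np

    def calibration_factor(hdr_luminance, target_mask, meter_luminance):
        # hdr_luminance: (H, W) array of luminances derived from the HDR image (cd/m2)
        # target_mask: boolean array selecting the grey target region
        # meter_luminance: spot luminance meter reading for the same target (cd/m2)
        return float(np.mean(hdr_luminance[target_mask]) / meter_luminance)

    def apply_calibration(hdr_luminance, factor):
        # A factor of 2 means the image reads twice the meter value, so the map
        # is divided by the factor to bring it into agreement with the meter.
        return hdr_luminance / factor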

The author agrees with Dr Mardaljevic that vignetting is higher with fisheye lenses, and wide-angle and telephoto lenses should be expected to be less prone to vignetting. It is also important to note that aperture size has a significant effect on vignetting, and it is recommended that smaller apertures are used to reduce the vignetting effects.

Due to the limited scope of the project, the author did not attempt to evaluate the technique for road measurements. One of the advantages of the HDR photography technique as a luminance mapping tool is its ability to offer fast data collection. Therefore, as Mr Simons indicates, the HDR photography technique has great potential for recording the luminances of the road in road lighting installations.

The author thanks the discussants for their comments, which serve to further describe the practicality and applicability of the HDR photography technique as a luminance data acquisition system. The author also thanks the anonymous discussant who has pointed to the following as additional references.

References

1 Rea MS, Jeffrey IG. A new luminance and image analysis system for lighting and vision: equipment and calibration. J. Illumin. Eng. Soc. N. Am. 1990; 9: 64-72.

2 Illuminating Engineering Society of North America (IESNA). IESNA approved method for digital photographic photometry of luminaires and lamps. New York: IESNA, LM-77-05, 2005.

3 Reinhard E, Ward G, Pattanaik S, Debevec P. High dynamic range imaging: acquisition, display and image-based lighting. Morgan Kaufmann Publishers, 2005.



with CCT of 6500 K, 3500 K and 3000 K, respectively), metal halide (MH) and high pressure sodium (HPS) lamps. The accuracy tests were carried out with Target 1 and a Macbeth ColorChecker chart.

Figure 7. Outdoor scene with greyscale targets under clear sky (x-axis: greyscale targets; y-axes: luminance (cd/m2) and error (%), before and after HDR; average error 11.6%).

Figure 6. Outdoor scene with greyscale targets under overcast sky (x-axis: greyscale targets; y-axes: luminance (cd/m2) and error (%), before and after HDR; average error 5.8%).


Figures 2-4 show the accuracy obtained for incandescent, fluorescent and MH light sources. Although the spectral properties of the light sources were quite different, the accuracy proved to be reasonable (error margin within 10%) for all targets. The error margins for greyscale and coloured targets varied depending on the spectral power distribution of the light sources. Figure 5 shows an office space illuminated with daylight and electric lighting. Figures 6 and 7 illustrate outdoor scenes with dynamic lighting conditions. The measurement setups shown here were chosen to provide a wide spectrum of luminous environments. Additional setups have been measured, and the results reveal similar error margins.11

3.2 Vignetting/Light fall-off

The Nikon FC-E9 fisheye lens uses equidistant projection to produce an image (i.e., the distances between two points are projected by a constant scaling factor in all directions). Equidistant fisheye lenses exhibit noticeable light fall-off (which will be referred to as vignetting) for pixels far from the optical axis. It is therefore necessary to address and correct this aberration. The approach adopted was to determine the vignetting function of the Nikon FC-E9 fisheye lens and use this function to devise a 'digital filter' in HDRLab to compensate for the luminance loss.

The camera was rotated with respect to the target in increments of 5° until the camera field of view was covered. The luminance of a target was determined from the HDR photographs for each camera position and compared with the physical measurement. Figure 8(a) illustrates the normalized data points and the polynomial function derived to fit these points. The square of the correlation (r2) between the measured points and the derived vignetting function is 99.12%. As can be seen from the figure, vignetting effects are insignificant in the centre of the image, and the maximum effect is at the periphery, with 23% luminance loss.
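The fitting step can be reproduced with an ordinary least-squares polynomial, as sketched below. The angle positions and luminance ratios are placeholder values standing in for the measured data, and the fourth-order form simply mirrors the trendline reported in Figure 8(a).

    import numpy as np

    # Placeholder data: field angle of the target (degrees from the optical axis) and
    # the ratio of HDR-derived to meter-measured luminance at each camera position.
    angles = np.arange(0.0, 95.0, 5.0)
    ratios = 1.0 - 0.23 * (angles / 90.0) ** 2   # synthetic values for illustration only

    # Fourth-order least-squares fit, analogous to the function shown in Figure 8(a)
    coeffs = np.polyfit(angles, ratios, deg=4)
    vignetting = np.poly1d(coeffs)

    # Square of the correlation between the data points and the fitted function
    r2 = np.corrcoef(ratios, vignetting(angles))[0, 1] ** 2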

Figure 8(b) is the digital filter developed in HDRLab based on the vignetting function. This filter is a matrix that has the same resolution as the fisheye image, and it compensates for the vignetting effect based on the pixel locations. The vignetting function shown in Figure 8(b) was generated for an aperture size of f/4.0. Note that vignetting is strongly dependent on the aperture size and increases dramatically with wider apertures.

Figure 8. The vignetting function of the Nikon FC-E9 fisheye lens: (a) measured data points and the polynomial fit to them, y = -3E-09x^4 + 1E-06x^3 - 0.0002x^2 + 0.0101x + 0.7704 (R^2 = 0.9912), where x is the position in the field of view (0-180°) and y is the vignetting effect; (b) the digital filter developed from the vignetting function (vignetting effect as a function of pixel location).


It is therefore recommended that smaller apertures are used so as to reduce vignetting effects while capturing the calibration sequence which determines the camera response curves.7
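A sketch of a digital filter of the kind described above is given below: each pixel's distance from the optical centre is converted to a field angle (equidistant projection) and the luminance is divided by the fitted vignetting function. The centre and radius arguments are assumptions about the image layout, and the vignetting argument is any callable such as the polynomial fitted in the previous sketch.

    import numpy as np

    def vignetting_correction(hdr_luminance, centre, radius, vignetting):
        # hdr_luminance: (H, W) luminance map from the fisheye HDR image
        # centre: (row, col) of the optical centre in pixels
        # radius: pixel distance from the centre corresponding to a 90-degree field angle
        # vignetting: callable mapping field angle (degrees) to relative transmission (1 = no loss)
        rows, cols = np.indices(hdr_luminance.shape)
        r = np.hypot(rows - centre[0], cols - centre[1])
        angle = 90.0 * np.clip(r / radius, 0.0, 1.0)  # equidistant projection: angle proportional to r
        return hdr_luminance / vignetting(angle)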

3.3 Point spread function

It would have been very convenient to interpret each pixel in the photograph as a luminance measurement, but unfortunately in most optical systems some portion of the pixel signal comes from surrounding areas. Light entering the camera is spread out and scattered by the optical structure of the lens, which is referred to as the point spread function (PSF).14

A point light source is used to quantify the PSF. The source is located far enough from the camera to ensure that, theoretically, it covers less than one pixel area. Without any point spread, the image of the light source should fit onto one pixel. The point spreading is illustrated in Figure 9(a) and (b) as close-up views.

The aperture size, exposure time and eccentricity (distance from the optical centre) affect the PSF.

Figure 10. Histograms of error percentages between the measured and captured luminances for (a) greyscale targets and (b) coloured targets.

Figure 9. (a), (b) Close-up views of point spreading; the square in (b) shows the pixel that corresponds to the actual light source. (c) PSF as quantified from the HDR image. (d) PSF as a function of eccentricity (distance from the optical centre) in the camera field of view, from 0 to 90° in 15° intervals.


The aperture size was kept constant throughout the capturing process; therefore it was not a parameter affecting the luminances. The effects of exposure time and eccentricity were recorded, and they were plotted in RGB channels. The PSF of the HDR image is shown in Figure 9(c). The effect of the eccentricity was studied at 15° intervals, and it is illustrated in Figure 9(d). The spread affected a limited number of the neighbouring pixels. For most architectural lighting applications it seems to be a small effect. However, it is important to note that the general scattering of light is such that the cumulative effect of a bright background surrounding a dark target can lead to larger measurement errors. This is indeed evident in the measurement results for darker targets shown in Section 3.1. Unfortunately, it is not feasible to quantify the general scattering. Consequently, point spread is one of the cumulative factors in the error margin, which was quantified as less than 10% on average in Section 3.1.
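As a rough, hypothetical illustration, the spread can be summarized as the fraction of the point source's total signal that lands outside the geometrically correct pixel; the patch extraction and the source pixel coordinates are assumed to be known.

    import numpy as np

    def spread_fraction(hdr_patch, source_pixel):
        # hdr_patch: small luminance array cropped around the imaged point source
        # source_pixel: (row, col) of the pixel that geometrically corresponds to the source
        total = float(hdr_patch.sum())
        return (total - float(hdr_patch[source_pixel])) / total

    # Evaluating this for patches captured at different eccentricities (e.g., every
    # 15 degrees from the optical axis) gives a profile comparable to Figure 9(d).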

4 Discussion

The HDR images used in this paper were generated with the camera response functions shown in Figure 1. The luminances were calculated with the transformation functions given in Section 2.3. The calibration feature of Photosphere was used for fine-tuning the luminances with a grey target in each scene. The vignetting effect was corrected by utilizing the function shown in Figure 8.

The luminances extracted from the HDR images indicate reasonable accuracy when compared with physical measurements. Figure 10 presents histograms of error percentages for 485 coloured and greyscale target points under a wide range of electric and daylighting conditions. The minimum and maximum measured target luminances are 0.5 cd/m2 and 12 870 cd/m2, respectively. The average error percentages for all targets, greyscale targets and coloured targets were 7.3%, 5.8% and 9.3%, respectively. The square of the correlation (r2) between the measured and captured luminances was found to be 98.8%. There is an increased error for the darker greyscale targets. As mentioned in Section 3.3, the general scattering in the lens and sensor affects the darker regions of the images disproportionately. The consequence of this scattering is an over-estimation of the luminances of the darker regions. Since the luminance is quite low for these targets, small differences between the measured and captured values yield higher error percentages.
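The summary statistics reported here can be reproduced from paired measurements as in the sketch below; the array names are illustrative.

    import numpy as np

    def accuracy_summary(measured, captured):
        # measured: luminance meter readings for the targets (cd/m2)
        # captured: HDR-derived luminances for the same targets (cd/m2)
        errors = 100.0 * np.abs(captured - measured) / measured
        r2 = np.corrcoef(measured, captured)[0, 1] ** 2
        return errors.mean(), r2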

The results also revealed increased errors for the saturated colours (Figure 10(b)). In Photosphere, a separate polynomial function is derived to fit each RGB channel. However, it is important to note that the RGB values produced by the camera are mixed between the different red, green and blue sensors of the CCD. The CCD sensors have coloured filters that pass red, green or blue light. With the large sensor resolution, it was assumed that enough green (red, blue)-filtered pixels receive enough green (red, blue) light to ensure that the image would yield reasonable results. The sensor arrays are usually arranged in a Bayer (mosaic) pattern, such that 50% of the pixels have green filters and 25% each have red and blue filters. When the image is saved to a file, algorithms built into the camera employ interpolation between the neighbouring pixels.15 The HDR algorithm assumes that the computed response functions preserve the chromaticity of the corresponding scene points.5 In an effort to keep the RGB transformations constant within the camera, the white balance setting was kept constant throughout the capturing process. The camera response functions were generated with these constraints. Likewise, the luminance calculations were approximated based on sRGB reference primaries, with the assumption that sRGB provides a reasonable approximation to the camera sensor primaries.
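For orientation, a luminance estimate under that sRGB assumption would take the form sketched below. The Y coefficients are the standard sRGB values and the 179 lm/W factor is the white luminous efficacy convention used by Radiance-family tools; the paper's exact transformation is the one given in its Section 2.3, so the constants here should be read as assumptions rather than as the author's figures.

    def luminance_cd_m2(r, g, b, efficacy=179.0):
        # r, g, b: linearized sRGB radiance values recovered from the HDR image
        return efficacy * (0.2126 * r + 0.7152 * g + 0.0722 * b)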


This work shows that the HDR photography technique is a useful tool for capturing luminances over a wide range within 10% accuracy. It is worth mentioning that luminance meter measurements also have expected errors: V(λ) match (3%), UV response (0.2%), IR response (0.2%), directional response (2%), effect from the surrounding field (1%), linearity error (0.2%), fatigue (0.1%), polarization (0.1%) and errors of focus (0.4%). These percentage values are published by the CIE;16 they correspond to representative errors collected from the best available commercial instruments.

It is not suggested that HDR photography is a substitute for luminance meter measurements. It requires calibration against a point or area of a reliable standard target, with a reliable luminance meter, to have absolute validity. Yet it provides a measurement capability with the advantage of collecting high resolution luminance data within a large field of view quickly and efficiently, which is not possible to achieve with a luminance meter. It uses equipment that is within the budgets of lighting practitioners and researchers. Additionally, the self-calibration algorithm in Photosphere provides quick and easy camera response functions, compared to the lengthy calibration measurements required in prior research.

The HDR technique requires stable conditions over the period of the capturing process. Dynamic lighting conditions resulting in significant light changes between differently exposed photographs can compromise the accuracy of the end result. It is strongly advisable to measure a single target in the scene to be used as a calibration feature. Finally, as with any measurement and simulation tool, the user should be aware of the limitations and expected errors to be able to interpret the results meaningfully.

HDR imagery provides data for qualitative and quantitative lighting analysis, since HDR images can be studied for visual analysis by adjusting the exposure to different ranges, and post-processed to:

- extract photometric information on a pixel scale; this information can be utilized for statistical and mathematical analysis;

- generate false colour images (a range of colours is assigned to represent a range of luminances) and iso-contour lines (coloured contour lines are superimposed on an image to demonstrate the distribution of the luminances), and calculate glare with automated analysis tools.17,18
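A minimal false-colour rendering of a luminance map might look like the sketch below (matplotlib, logarithmic colour scale); the colour map, bounds and file name are arbitrary choices, and dedicated tools such as the Radiance utilities and Evalglare cited above (references 17 and 18) provide the full analysis.

    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.colors import LogNorm

    def false_colour_map(hdr_luminance, vmin=0.1, vmax=None, path="falsecolour.png"):
        # hdr_luminance: (H, W) array of luminances (cd/m2)
        vmax = vmax if vmax is not None else float(np.nanmax(hdr_luminance))
        plt.figure(figsize=(6, 4))
        plt.imshow(hdr_luminance, cmap="jet", norm=LogNorm(vmin=vmin, vmax=vmax))
        plt.colorbar(label="Luminance (cd/m2)")
        plt.axis("off")
        plt.savefig(path, dpi=150, bbox_inches="tight")
        plt.close()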

Acknowledgements

This work was sponsored by the Assistant Secretary for Energy Efficiency and Renewable Energy, Office of Building Technology, Building Technologies Program of the US Department of Energy, under Contract No. DE-AC03-76SF00098. I thank Greg Ward (Anyhere Software) for his continuous guidance and feedback throughout the study. I am grateful for the technical help and advice from Jim Galvin (LBNL) on photography and physical measurement issues. I also thank Steve Johnson (LBNL) and Francis Rubinstein (LBNL) for their suggestions.

5 References

1 Uetani Y. Measurement of CIE tristimulus values XYZ by color video images: development of video colorimetry. J. Arch. Planning Env. Eng. 2001; 543: 25-31.

2 Belia L, Minichiello F, Sibilio S. Setting up a CCD photometer for lighting research and design. Building and Environment 2002; 37: 1099-1106.

3 Coutelier B, Dumortier D. Luminance calibration of the Nikon Coolpix 990 digital camera: Proceedings of the CIE Conference, San Diego: International Commission on Illumination, 2003.

4 Ward G. Photosphere. http://www.anyhere.com


5 Mitsunaga T, Nayar SK. Radiometric self-calibration: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Fort Collins: Institute of Electrical and Electronics Engineers, 1999.

6 Debevec PE, Malik J. Recovering high dynamic range radiance maps from photographs: Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques, Los Angeles: SIGGRAPH, 1997.

7 Ward G. Photosphere Quick-start. http://www.anyhere.com

8 Ward G. Real pixels. In Arvo J, editor. Graphics gems II: Academic Press, 1991: 80-83.

9 Ward G. The LogLuv encoding for full gamut, high dynamic range images. ACM Journal of Graphics Tools 1998; 3: 15-31.

10 Stokes M, Chandrasekar S. Standard color space for the internet - sRGB. http://www.w3.org/Graphics/Color/sRGB

11 Inanici MN, Galvin J. Evaluation of High Dynamic Range photography as a luminance mapping technique. LBNL Report No. 57545. Berkeley: Lawrence Berkeley National Laboratory, 2004.

12 Wyszecki G, Stiles WS. Color science: concepts and methods, quantitative data and formulae. John Wiley and Sons, 2000.

13 Glassner AS. Principles of digital image synthesis. Morgan Kaufman Publishers, 1995.

14 Du H, Voss KJ. Effects of point-spread function on calibration and radiometric accuracy of CCD camera. Applied Optics 2004; 43: 665-70.

15 Busch DD. Mastering digital photography. Muska & Lipman Publishing, 2004.

16 International Commission on Illumination (CIE). Methods of characterizing illuminance meters and luminance meters: performance, characteristics and specification. Vienna: CIE, 1987.

17 Ward G. Radiance Visual Comfort Calculation, 1992. http://radsite.lbl.gov/radiance/refer/Notes/glare.html

18 Wienold J, Reetz C, Kuhn T. Evalglare: 3rd International Radiance Workshop, Fribourg, Switzerland, 11-12 October 2004. http://www.radiance-online.org/radiance-workshop3/cd/Wienold_extabs.pdf



Page 9: Evaluation of high dynamic range photography as a luminance …faculty.washington.edu/inanici/Publications/LRTPublished.pdf · 2006-05-19 · Evaluation of high dynamic range photography

it is recommended that smaller apertures areused so as to reduce vignetting effects whilecapturing the calibration sequence which de-termines the camera response curves7

33 Point spread functionIt would have been very convenient to

interpret each pixel in the photograph as aluminance measurement but unfortunately inmost optical systems some portion of the pixelsignal comes from surrounding areas Lightentering the camera is spread out and scat-tered by the optical structure of the lens

which is referred to as the point spreadfunction (PSF)14

A point light source is used to quantify thePSF The source is located far enough fromthe camera to ensure that theoretically itcovers less than one pixel area Without anypoint spread the image of the light sourceshould fit onto one pixel The point spreadingis illustrated in Figure 9 (a) and (b) as close-upviews

The aperture size exposure time and eccen-tricity (distance from the optical centre) affectPSF The aperture size was kept constant

Figure 10 Histograms for error percentages between the measured and captured luminances for a) greyscale targets b)coloured targets

Figure 9 a) Close-up view of point spreading the square in (b) shows the pixel that corresponds to the actual light source c)PSF as quantified from the HDR image d) PSF as a function of eccentricity (distance from the optical centre) in the camera fieldof view from 0 to 908 in 158 intervals

High dynamic range photography 131

Lighting Res Technol 382 (2006) pp 123136

throughout the capturing processes thereforeit was not a parameter affecting the lumi-nances The effects of exposure time andeccentricity were recorded and they wereplotted in RGB channels The PSF of theHDR image is shown in Figure 9 (c) Theeffect of the eccentricity was studied at 158intervals and it is illustrated in Figure 9 (d)The spread affected a limited number of theneighbouring pixels For most architecturallighting applications it seems to be a smalleffect However it is important to note thatthe general scattering of light is such that thecumulative effect of a bright backgroundsurrounding a dark target can lead to largermeasurement errors This is indeed evident inthe measurement results for darker targetsshown in Section 31 Unfortunately it is notfeasible to quantify the general scatteringConsequently point spread is one of theaccumulative factors in the error marginwhich was quantified as less than 10 onaverage in Section 31

4 Discussion

The HDR images used in this paper weregenerated with the camera response functionsshown in Figure 1 The luminances werecalculated with the transformation functionsgiven in Section 23 The calibration feature ofPhotosphere was used for fine-tuning theluminances with a grey target in each sceneThe vignetting effect was corrected by utilizingthe function shown in Figure 8

The luminances extracted from theHDR images indicate reasonable accuracywhen compared with physical measurementsFigure 10 presents a histogram for errorpercentages for 485 coloured and greyscaletarget points under a wide range of electricand daylighting conditions The minimumand maximum measured target luminancesare 05 cdm2 and 12870 cdm2 respectivelyThe average error percentages for all grey-scale and coloured targets were 73 58

and 93 respectively The square of thecorrelation (r2) between the measured andcaptured luminances was found to be 988There is an increased error for the darkergreyscale targets As mentioned in Section33 the general scattering in the lens andsensor affect the darker regions of the imagesdisproportionately The consequence of thisscattering is an over-estimation of the lumi-nances of the darker regions Since theluminance is quite low for these targetssmall differences between the measuredand captured values yield higher error per-centages

The results also revealed increased errorsfor the saturated colours (Figure 10b) InPhotosphere a separate polynomial functionis derived to fit to each RGB channel How-ever it is important to note that the RGBvalues produced by the camera are mixedbetween the different red green and bluesensors of the CCD The CCD sensors havecoloured filters that pass red green or bluelight With the large sensor resolution it wasassumed that enough green (red blue)-filteredpixels receive enough green (red blue) light toensure that the image would yield reasonableresults The sensor arrays are usually arrangedin a Bayer (mosaic) pattern such that 50 ofthe pixels have green and 25 of the pixelshave red and blue filters When the image issaved to a file algorithms built within thecamera employ interpolation between theneighbouring pixels15 The HDR algorithmassumes that the computed response functionspreserve the chromaticity of the correspondingscene points5 In an effort to keep the RGBtransformations constant within the camerathe white balance setting was kept constantthroughout the capturing process The cameraresponse functions were generated with theseconstraints Likewise the luminance calcula-tions were approximated based on sRGBreference primaries with the assumption thatsRGB provides a reasonable approximation tothe camera sensor primaries

132 MN Inanici

Lighting Res Technol 382 (2006) pp 123136

This work shows that the HDR photogra-phy technique is a useful tool for capturingluminances over a wide range within 10accuracy It is worth mentioning that lumi-nance meter measurements also have expectederrors V(l) match (3) UV response (02)IR response (02) directional response(2) effect from the surrounding field (1)linearity error (02) fatigue (01) polari-zation (01) and errors of focus (04) Thepercentage values are published by the CIE16

they correspond with representative errorsthat are collected from the best availablecommercial instruments

It is not suggested that HDR photographyis a substitute for luminance meter measure-ments It requires calibration against a pointor area of a reliable standard target with areliable luminance meter to have absolutevalidity Yet it provides a measurementcapability with the advantage of collectinghigh resolution luminance data within alarge field of view quickly and efficientlywhich is not possible to achieve with aluminance meter It uses equipment that iswithin the budgets of lighting practitionersand researchers Additionally the self-calibra-tion algorithm in Photosphere provides quickand easy camera response functions comparedto the lengthy calibration measurements re-quired in prior research

The HDR technique requires stable condi-tions over the period of the capturing processDynamic lighting conditions resulting insignificant light changes between differentlyexposed photographs can compromise theaccuracy of the end result It is stronglyadvisable to measure a single target in thescene to be used as a calibration featureFinally as with any measurement and simula-tion tool the user should be aware of thelimitations and expected errors to be able tointerpret the results meaningfully

HDR imagery provides data for qualitative and quantitative lighting analysis, since HDR images can be studied for visual analysis by adjusting the exposure to different ranges, and post-processed to:

– extract photometric information on a pixel scale; this information can be utilized for statistical and mathematical analysis;

– generate false colour images (a range of colours is assigned to represent a range of luminances) and iso-contour lines (coloured contour lines are superimposed on an image to demonstrate the distribution of the luminances), and calculate glare with automated analysis tools17,18 (see the sketch below).
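As an example of the last item, a basic false-colour rendering with iso-contour lines can be produced from a per-pixel luminance array with standard plotting tools. The sketch below is illustrative only: it assumes the luminance map has already been extracted from the HDR image, and it is not the automated analysis of references 17 and 18.

```python
import numpy as np
import matplotlib.pyplot as plt

def false_colour_map(luminance, out_path="falsecolour.png"):
    """Render a luminance array (cd/m^2) as a false-colour image with contours.

    A logarithmic scale is used because scene luminances typically span
    several orders of magnitude.
    """
    log_lum = np.log10(np.clip(luminance, 1e-2, None))
    plt.figure(figsize=(6, 4))
    image = plt.imshow(log_lum, cmap="jet")
    colorbar = plt.colorbar(image)
    colorbar.set_label("log10 luminance [cd/m^2]")
    plt.contour(log_lum, levels=8, colors="k", linewidths=0.3)  # iso-contour lines
    plt.axis("off")
    plt.savefig(out_path, dpi=150, bbox_inches="tight")
    plt.close()

# Example with a synthetic luminance field.
false_colour_map(np.random.lognormal(mean=3.0, sigma=1.0, size=(120, 160)))
```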

Acknowledgements

This work was sponsored by the Assistant Secretary for Energy Efficiency and Renewable Energy, Office of Building Technology, Building Technologies Program of the US Department of Energy under Contract No. DE-AC03-76SF00098. I thank Greg Ward (Anyhere Software) for his continuous guidance and feedback throughout the study. I am grateful for the technical help and advice from Jim Galvin (LBNL) on photography and physical measurement issues. I also thank Steve Johnson (LBNL) and Francis Rubinstein (LBNL) for their suggestions.

5 References

1. Uetani Y. Measurement of CIE tristimulus values XYZ by color video images: development of video colorimetry. J Arch Planning Env Eng 2001; 543: 25–31.
2. Belia L, Minichiello F, Sibilio S. Setting up a CCD photometer for lighting research and design. Building Env 2002; 37: 1099–1106.
3. Coutelier B, Dumortier D. Luminance calibration of the Nikon Coolpix 990 digital camera. Proceedings of the CIE Conference, San Diego: International Commission on Illumination, 2003.
4. Ward G. Photosphere. http://www.anyhere.com


5. Mitsunaga T, Nayar SK. Radiometric self-calibration. Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, Fort Collins: Institute of Electrical and Electronics Engineers, 1999.
6. Debevec PE, Malik J. Recovering high dynamic range radiance maps from photographs. Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques, Los Angeles: SIGGRAPH, 1997.
7. Ward G. Photosphere Quick-start. http://www.anyhere.com
8. Ward G. Real pixels. In Arvo J, editor. Graphics Gems II: Academic Press, 1991: 80–83.
9. Ward G. The LogLuv encoding for full gamut, high dynamic range images. ACM Journal of Graphics Tools 1998; 3: 15–31.
10. Stokes M, Chandrasekar S. Standard color space for the internet: sRGB. http://www.w3.org/Graphics/Color/sRGB
11. Inanici MN, Galvin J. Evaluation of High Dynamic Range photography as a luminance mapping technique. LBNL Report No. 57545, Berkeley: Lawrence Berkeley National Laboratory, 2004.
12. Wyszecki G, Stiles WS. Color science: concepts and methods, quantitative data and formulae. John Wiley and Sons, 2000.
13. Glassner AS. Principles of digital image synthesis. Morgan Kaufman Publishers, 1995.
14. Du H, Voss KJ. Effects of point-spread function on calibration and radiometric accuracy of CCD camera. Applied Optics 2004; 43: 665–70.
15. Busch DD. Mastering digital photography. Muska & Lipman Publishing, 2004.
16. International Commission on Illumination (CIE). Methods of characterizing illuminance meters and luminance meters: performance, characteristics and specification. Vienna: CIE, 1987.
17. Ward G. Radiance Visual Comfort Calculation. 1992. http://radsite.lbl.gov/radiance/refer/Notes/glare.html
18. Wienold J, Reetz C, Kuhn T. Evalglare. 3rd International Radiance Workshop, Fribourg, Switzerland, 11–12 October 2004. http://www.radiance-online.org/radiance-workshop3/cd/Wienold_extabs.pdf

Discussion

Comment 1 on 'Evaluation of high dynamic range photography as a luminance data acquisition system' by MN Inanici
J Mardaljevic (IESD, De Montfort University, The Gateway, Leicester, UK)

The appearance of this paper is very timely, as I suspect that many of us in the lighting research community are still unfamiliar with HDR imaging and the possibilities that it offers. I think it is fair to say that this work presents a significant advance over the techniques described by Moore et al. in 2000.1 The queries that follow are prompted by a desire to anticipate the practicalities involved in using the technique for experimental work. In particular, how much of the work described here would need to be repeated to achieve reliable results? It is noted in Section 4 that the calibration feature of the Photosphere software was applied 'in each scene'. It would be instructive to know the degree of the variation in the calibration factors across the scenes. Related to this, could Dr Inanici comment on what would be the likely impact on estimated luminance if the camera was calibrated just once against a spot measurement and then used for long-term monitoring? The FC-E9 lens that was used is described as a 'Fisheye Converter'. Is it the case that vignetting is likely to be greatest using such a lens, and are more moderate wide-angle and telephoto lenses less prone to this effect? It should be noted that Ward's Photosphere software (which runs under Mac OS X) is freely available from the URL given in the references.

Reference

1. Moore TA, Graves H, Perry MJ, Carter DJ. Approximate field measurement of surface luminance using a digital camera. Lighting Res Technol 2000; 32: 1–11.


Comment 2 on 'Evaluation of high dynamic range photography as a luminance data acquisition system' by MN Inanici
RH Simons (23 Oaklands Park, Bishops Stortford, Hertfordshire, UK)

The HDR photography system investigated by Mehlika Inanici seems principally aimed at measuring the luminances of interiors. I see no reason why this method could not be applied to recording the luminances of the road in road lighting installations.

I mention this because the last time I had to measure the luminance distribution of a road surface I was thwarted by the weather. I had to do this on a motorway which was closed for repairs and was going to be opened the next day to traffic. This, then, was a good opportunity for making the luminance measurements the client required. These had to be taken with a Spectra Pritchard Luminance Meter mounted on a special tripod, which allowed the meter to be pointed accurately at the 30 measuring points on the road. As may be imagined, this was a time consuming business. When it came to measuring the last row of points there was a slight sprinkling of rain; slight, but enough to change the reflection properties of the road completely. As the road could not be closed on the next day, because of the expense and the disruption caused to the traffic, the measurements were never done.

This is a situation where HDR photography would have been a boon. I wonder whether you have had any experience of using it for road lighting measurements. I did use photography for this application in the 1950s, when I used a large plate camera. The system was crude and I would not like to hazard a guess at the accuracy. At that time we were dogged by the formation of ghost images of the road lighting luminaires diametrically opposite the images of the luminaires. We ended up using one element of a Rapid Rectilinear lens (which has two elements, each consisting of cemented doublets), which certainly eliminated the ghost images. I imagine that the modern zoom lenses, with as many as eight components separated by air spaces, are very prone to give ghost images and to degradation of the image because of scattered light in the lens. Have you any advice on choice of lens?

Author's response to J Mardaljevic and RH Simons
MN Inanici

The HDR photography technique presented in this paper is a camera independent, low cost and accessible solution for high resolution and large field of view luminance data acquisition. As Dr Mardaljevic points out, the software is freely available from the URL given in Ward,4 and it is user-friendly. Most of the work presented in this paper was done to evaluate the technique as a luminance measurement tool; hence it is not necessary to repeat the whole study to adopt the technique. Nevertheless, some of the steps have to be repeated when a different camera is employed.

It is necessary to generate a new camera response curve for each camera that is going to be used as a luminance data acquisition system. It is an easy process, and the resultant camera response curve can be used for generating the subsequent HDR images. The HDR photography technique has been tested with a few different camera models, and the error margins with different models were similar to the values reported in this paper. Yet, as with any measurement device, it is recommended that a few test measurements are performed to verify the accuracy.
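To make the role of the response curve concrete, the sketch below shows the general multiple-exposure fusion idea described in references 5 and 6: each pixel value is linearized through the camera-specific inverse response function, divided by its exposure time and combined with a weighting that discounts under- and over-exposed values. This is a generic illustration, not Photosphere's actual implementation; the simple gamma used as the inverse response is a placeholder for the polynomial fitted by the software.

```python
import numpy as np

def inverse_response(z, gamma=2.2):
    """Placeholder inverse camera response: normalized pixel value -> relative radiance."""
    return np.power(z, gamma)

def weight(z):
    """Hat-shaped weighting that trusts mid-range pixel values the most."""
    return 1.0 - np.abs(2.0 * z - 1.0)

def fuse_exposures(images, exposure_times):
    """Fuse differently exposed images (pixel values in [0, 1]) into a relative radiance map."""
    numerator = np.zeros_like(images[0], dtype=float)
    denominator = np.zeros_like(images[0], dtype=float)
    for z, t in zip(images, exposure_times):
        w = weight(z)
        numerator += w * inverse_response(z) / t
        denominator += w
    return numerator / np.maximum(denominator, 1e-6)

# Example: three simulated exposures of the same synthetic scene.
rng = np.random.default_rng(0)
scene = rng.uniform(0.01, 5.0, size=(4, 4))
times = [1 / 60, 1 / 250, 1 / 1000]
shots = [np.clip((scene * t) ** (1 / 2.2), 0.0, 1.0) for t in times]
print(fuse_exposures(shots, times))
```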

Dr Mardaljevic enquires about the degree of variation in the calibration factors among the scenes. The calibration factors for the scenes presented in this paper vary between 0.87 and 2.09. The calibration factor is determined by dividing the pixel values of a selected region in the scene by the physical luminance measurement (Minolta luminance meter) of the same region.


A calibration value of '1' obviously means that no calibration would be necessary. A calibration value of '2' means that the pixel value of a selected region in the scene is twice the value measured by a luminance meter. The calibration factors for Figures 2–7 are given as follows: Figure 2 (laboratory space illuminated with incandescent lamp) 0.87; Figure 3 (fluorescent lamp) 0.95; Figure 4 (metal halide lamp) 0.86; Figure 5 (office space) 0.98; Figure 6 (outdoor scene with overcast sky) 1.9; and Figure 7 (outdoor scene with clear sky) 2.09. The utilization of a calibration factor becomes especially important with high dynamic range scenes such as daylit and sunlit interiors and exterior environments. Therefore, it is necessary to use the calibration factor across the scenes to assure accuracy within a 10% error margin. Yet it would be interesting to look at the variability of a calibration factor for a single scene over time.
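A minimal sketch of that procedure is given below, consistent with the definition stated above (image-derived value of the reference region divided by the meter reading); the array and region names are illustrative.

```python
import numpy as np

def calibrate(hdr_luminance, region_mask, meter_reading):
    """Scale an HDR-derived luminance map using one spot measurement.

    region_mask selects the reference target in the image; meter_reading is
    the luminance meter value for that target (cd/m^2). The calibration
    factor is the image-derived value of the region divided by the measured
    value, so the whole map is divided by it.
    """
    factor = hdr_luminance[region_mask].mean() / meter_reading
    return hdr_luminance / factor, factor

# Example: a synthetic map with a grey reference target in the centre.
lum = np.full((100, 100), 50.0)
mask = np.zeros_like(lum, dtype=bool)
mask[45:55, 45:55] = True
calibrated, factor = calibrate(lum, mask, meter_reading=25.0)
print(factor, calibrated[50, 50])  # factor 2.0; calibrated value matches the meter
```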

The author agrees with Dr Mardaljevic that vignetting is higher with fisheye lenses, and wide-angle and telephoto lenses should be expected to be less prone to vignetting. It is also important to note that aperture size has a significant effect on vignetting, and it is recommended that smaller apertures are used for reducing the vignetting effects.
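For completeness, a vignetting correction of the kind referred to here is usually applied by dividing each pixel by a fall-off factor that depends on its distance from the optical axis. The radial polynomial below is purely illustrative and stands in for the measured curve of Figure 8.

```python
import numpy as np

def correct_vignetting(luminance, falloff=(1.0, 0.0, -0.35)):
    """Divide out an assumed radial light fall-off.

    falloff holds polynomial coefficients in the normalized radius r
    (0 at the image centre, 1 at the corners): f(r) = c0 + c1*r + c2*r**2.
    """
    h, w = luminance.shape
    y, x = np.mgrid[0:h, 0:w]
    r = np.hypot(x - w / 2.0, y - h / 2.0)
    r = r / r.max()
    c0, c1, c2 = falloff
    return luminance / (c0 + c1 * r + c2 * r ** 2)

corrected = correct_vignetting(np.full((60, 80), 100.0))
print(corrected[0, 0], corrected[30, 40])  # corners are boosted more than the centre
```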

Due to the limited scope of the project, the author did not attempt to evaluate it for road measurements. One of the advantages of the HDR photography technique as a luminance mapping tool is its ability to offer fast data collection. Therefore, as Mr Simons indicates, the HDR photography technique has great potential in recording the luminances of the road in road lighting installations.

The author thanks the discussants for their comments, which serve to further describe the practicality and applicability of the HDR photography technique as a luminance data acquisition system. The author also thanks the anonymous discussant who has pointed to the following as additional references.

References

1. Rea MS, Jeffrey IG. A new luminance and image analysis system for lighting and vision: equipment and calibration. J Illumin Eng Soc N Am 1990; 9: 64–72.
2. Illuminating Engineering Society of North America (IESNA). IESNA approved method for digital photographic photometry of luminaires and lamps. New York: IESNA, LM-77-05, 2005.
3. Reinhard E, Ward G, Pattanaik S, Debevec P. High dynamic range imaging: acquisition, display and image-based lighting. Morgan Kaufmann Publishers, 2005.


