
Paper presented by the University of Granada at the HOIP'10 workshop organized by TECNALIA. More information at http://www.tecnalia.com/es/ict-european-software-institute/index.htm


COLOR DEGRADATION OF OBJECTS DUE TO THE ATMOSPHERE

Javier Hernández-Andrés, Raúl Luzón, Juan L. Nieves and Javier Romero
Color Imaging Lab, Department of Optics, University of Granada, SPAIN.

The light coming from an object at a certain distance from the observer is spatially and spectrally degraded by absorption and scattering in the atmosphere, which reduces its visibility and contrast and changes its color through a loss of saturation and, depending on the atmospheric conditions, a possible hue shift. In this paper we focus on the change in an object's color caused by the atmosphere under different weather conditions. Our eventual aim is to estimate the true color by correcting the degradation, based on a physical model and on the color content of images. We present here a preliminary study of color changes in natural scenes, intended to extract the information needed by the degradation-correction algorithms we will propose in the next research step.

In the field of image restoration, some approaches are based on statistical information about the scene [1] and others on physical models [2]. Most physics-based models build on the dichromatic model [3], in which the radiance reaching the camera is the sum of the light coming from the object, attenuated by the atmosphere, and the airlight:

\[ L(\lambda) = L_0(\lambda)\, e^{-\beta(\lambda)\, d} + L_\infty(\lambda) \left( 1 - e^{-\beta(\lambda)\, d} \right) \]

where L_0 is the object radiance, β is the attenuation coefficient, d is the object-camera distance and L_∞ is the horizon radiance. To estimate the original color of the object, these models require information about the meteorological conditions [4], the distances to the objects [5], or several images taken under different weather conditions [6].

Assuming Lambertian objects receiving an irradiance E_d(λ), the irradiance impinging on the camera is

\[ E_t(\lambda) = \Omega\, \frac{\rho(\lambda)}{\pi}\, E_d(\lambda)\, e^{-\beta(\lambda)\, d} + \Omega\, L_\infty(\lambda) \left( 1 - e^{-\beta(\lambda)\, d} \right) \qquad (1) \]

where Ω is the solid angle of the scene viewed from the camera and ρ(λ) is the spectral reflectance of the object.
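A minimal numeric sketch of the dichromatic attenuation-plus-airlight model above; the spectra and β values below are illustrative toy numbers, not the measured data:

```python
import numpy as np

def radiance_at_camera(L0, Linf, beta, d):
    """Dichromatic model: object light attenuated by the atmosphere
    plus airlight, evaluated per wavelength sample."""
    t = np.exp(-beta * d)              # direct transmission e^(-beta*d)
    return L0 * t + Linf * (1.0 - t)   # attenuated object term + airlight term

# Toy spectra on a coarse wavelength grid (illustrative values only).
L0   = np.array([0.20, 0.60, 0.30])        # object radiance at 450/550/700 nm
Linf = np.array([0.50, 0.50, 0.45])        # near-achromatic horizon radiance
beta = np.array([0.8, 0.5, 0.3]) * 1e-4    # attenuation per metre (blue highest)

print(radiance_at_camera(L0, Linf, beta, 0.0))   # at d = 0 this equals L0
```

At d = 0 the airlight term vanishes and the camera sees the object radiance; as d grows, the output tends to the horizon radiance regardless of the object.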

For overcast skies we take the object radiance to be [2,3]:

\[ L_0(\lambda) = L_\infty(\lambda) \iint f(\theta, \lambda)\, R(\theta, \varphi, \lambda)\, d\theta\, d\varphi \]

where R(θ,φ,λ) is the Bidirectional Reflectance Distribution Function (BRDF) of the object, θ and φ are the solid-angle variables viewed from the object, and f(θ,λ) depends on the overcast sky. Assuming the same horizon radiance at every point of the sky, for a Lambertian object we have

\[ L_0(\lambda) = L_\infty(\lambda)\, \rho(\lambda). \]

Therefore, the final expression for the irradiance impinging on the camera under an overcast sky [2,3] is

\[ E_t(\lambda) = \Omega\, \rho(\lambda)\, L_\infty(\lambda)\, e^{-\beta(\lambda)\, d} + \Omega\, L_\infty(\lambda) \left( 1 - e^{-\beta(\lambda)\, d} \right) \qquad (2) \]

For three clear days and three overcast days we measured the spectral irradiance on a vertical surface, E_d(λ), with a PR-650 spectroradiometer, and simultaneously the spectral radiance at the horizon, L_∞(λ). The measurements were taken under dense and low haze conditions on top of the Faculty of Science building in Granada (Spain). For these days the attenuation coefficient β was known at 450, 550 and 700 nm; for the remaining wavelengths in the visible range it was extrapolated assuming proportionality [7] to λ^{-n}.
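The λ^{-n} extrapolation can be done by a least-squares fit in log-log space to the three measured points. A short sketch with hypothetical β values (the measured coefficients are not reproduced here):

```python
import numpy as np

def extrapolate_beta(lam_known, beta_known, lam_query):
    """Fit beta(lambda) = k * lambda**(-n) to the measured points
    (least squares in log-log space) and evaluate it at new wavelengths."""
    slope, intercept = np.polyfit(np.log(lam_known), np.log(beta_known), 1)
    n, k = -slope, np.exp(intercept)
    return k * np.asarray(lam_query, dtype=float) ** (-n)

# beta measured at 450, 550 and 700 nm (hypothetical values, per metre)
lam_known  = np.array([450.0, 550.0, 700.0])
beta_known = np.array([1.2e-4, 0.9e-4, 0.6e-4])
beta_vis   = extrapolate_beta(lam_known, beta_known, np.arange(400, 701, 10))
```

With three points and one exponent the fit is overdetermined, which gives a quick consistency check on the power-law assumption.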

Fig 1. Color change in CIE 1931 of a color patch for different values of the distance d.

Using equations (1) and (2) we simulated the color changes of the 23 color patches of the Macbeth ColorChecker as the object-observer distance increases, under the different weather conditions, computing CIE 1931 chromaticity coordinates (x, y, Y) and CIELAB coordinates (L*, a*, b*) for the six days on which E_d(λ), L_∞(λ) and β were measured. Fig 1 shows the simulated CIE 1931 color change of an object with observation distance for the six days.


The contribution of the direct light decreases as the distance increases. Beyond a certain distance the airlight contribution becomes more noticeable than the attenuated direct term, and the object's color therefore approaches the horizon's color. This behaviour is similar for the overcast days (2, 5 and 6) but different for the clear days (3 and 4), where the attenuation term is more important than the airlight term. As Figure 2 shows, at very large distances all objects share the same color (the color of the horizon), as expected.
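This convergence of all object colors to the horizon color can be illustrated numerically with equation (2); the reflectances, horizon radiance and β below are illustrative toy values:

```python
import numpy as np

def camera_irradiance(rho, Linf, beta, d, omega=1.0):
    """Equation (2): camera irradiance for a Lambertian object of
    reflectance rho(lambda) under an overcast sky, at distance d."""
    t = np.exp(-beta * d)
    return omega * (rho * Linf * t + Linf * (1.0 - t))

Linf  = np.array([0.50, 0.50, 0.45])        # toy horizon radiance
beta  = np.array([0.8, 0.5, 0.3]) * 1e-4    # toy attenuation, per metre
red   = np.array([0.6, 0.2, 0.8])           # toy spectral reflectances
green = np.array([0.2, 0.7, 0.2])

def spectral_gap(d):
    """Largest per-band difference between the two objects' camera spectra."""
    return np.abs(camera_irradiance(red, Linf, beta, d)
                  - camera_irradiance(green, Linf, beta, d)).max()

for d in (0.0, 1e4, 1e5):
    print(f"d = {d:>8.0f} m   max spectral gap = {spectral_gap(d):.4f}")
```

The gap between any two objects decays as e^{-β(λ)d}, so their camera spectra (and hence their colors) merge into the horizon's.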

Figure 3, analogous to Figure 2 but for an overcast day, includes the chromaticity discrimination threshold at the horizon, taken as 3 CIELAB units. With this visibility criterion, based on chromatic discrimination, the visibility [7] would be 0.084 Mm (megameters), whereas the traditional criterion based on illuminance yields a visibility of 0.184 Mm at a wavelength of 550 nm. Using a threshold of just 1 CIELAB unit, the value becomes 0.140 Mm. The visibility computed with a colorimetric criterion is therefore lower than that obtained with the classical approach.
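A rough sketch of such a colorimetric visibility estimate. Here three toy spectral bands are treated directly as XYZ tristimulus values rather than integrating against the color-matching functions, so all numbers are purely illustrative; only the XYZ-to-CIELAB conversion is standard:

```python
import numpy as np

def xyz_to_lab(xyz, white):
    """Standard CIE XYZ -> CIELAB conversion against a reference white."""
    def f(t):
        d = 6.0 / 29.0
        return np.where(t > d**3, np.cbrt(t), t / (3 * d**2) + 4.0 / 29.0)
    fx, fy, fz = f(np.asarray(xyz, float) / np.asarray(white, float))
    return np.array([116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz)])

def colorimetric_visibility(rho, Linf, beta, threshold, d_max=1e7, steps=20001):
    """Smallest sampled distance at which the object's colour lies within
    `threshold` CIELAB units of the horizon colour, using equation (2)
    with the toy bands standing in for XYZ."""
    lab_horizon = xyz_to_lab(Linf, Linf)    # horizon acts as adapting white
    for d in np.linspace(0.0, d_max, steps):
        t = np.exp(-beta * d)
        xyz = rho * Linf * t + Linf * (1.0 - t)
        if np.linalg.norm(xyz_to_lab(xyz, Linf) - lab_horizon) < threshold:
            return d
    return d_max

Linf  = np.array([0.50, 0.50, 0.45])
beta  = np.array([0.8, 0.5, 0.3]) * 1e-4    # per metre
patch = np.array([0.6, 0.2, 0.8])           # toy reflectance
v3 = colorimetric_visibility(patch, Linf, beta, 3.0)
v1 = colorimetric_visibility(patch, Linf, beta, 1.0)
print(f"visibility: {v3:.0f} m at 3 dE, {v1:.0f} m at 1 dE")
```

As in the text, a stricter discrimination threshold (1 CIELAB unit) keeps the object distinguishable out to a larger distance than the 3-unit threshold.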

Fig 2. Color change in CIELAB of six color patches under a clear day for different d values

Fig 3. Color change in CIELAB of six color patches under an overcast day for different d values. The circle corresponds to the chromaticity discrimination threshold.
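For a single wavelength, the dichromatic model introduced earlier can be inverted to recover the object distance when L_0, L_∞ and β are known. A minimal sketch with illustrative values:

```python
import numpy as np

def distance_from_radiance(L, L0, Linf, beta):
    """Invert the single-wavelength dichromatic model
    L = L0*e^(-beta*d) + Linf*(1 - e^(-beta*d))  for the distance d."""
    ratio = (L - Linf) / (L0 - Linf)   # equals e^(-beta*d)
    if ratio <= 0:
        raise ValueError("object radiance indistinguishable from the horizon")
    return -np.log(ratio) / beta

# Round trip with illustrative values at one wavelength:
L0, Linf, beta, d_true = 0.6, 0.5, 5e-5, 2.0e4
L_meas = L0 * np.exp(-beta * d_true) + Linf * (1.0 - np.exp(-beta * d_true))
print(distance_from_radiance(L_meas, L0, Linf, beta))   # ~ 20000 m
```

The inversion degrades exactly where the paper says it does: as L approaches L_∞ the ratio tends to zero and small radiance errors produce large distance errors.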

The distance of an object can be estimated by comparing the contrast between the object and the background, but at large distances objects have poor contrast and there is no chromatic discrimination between them, as Figures 2 and 3 show: the branches of different objects tend to mix and cross at large distances. These preliminary results are essential for our final aim of correcting the color degradation of any color image without knowing either its distance or the weather conditions.

REFERENCES

1. I. Pitas and P. Kiniklis, "Multichannel Techniques in Color Image Enhancement and Modeling", IEEE Transactions on Image Processing, Vol. 5, No. 1, pp. 168-171, 1996.
2. K. Tan and J. P. Oakley, "Physics-Based Approach to Color Image Enhancement in Poor Visibility Conditions", Journal of the Optical Society of America, Vol. 18, No. 10, pp. 2460-2467, 2001.
3. S. G. Narasimhan and S. K. Nayar, "Chromatic Framework for Vision in Bad Weather", IEEE Conference on Computer Vision and Pattern Recognition, Vol. 1, pp. 598-605, 2000.
4. Y. Yitzhaky, I. Dror and N. S. Kopeika, "Restoration of Atmospherically Blurred Images According to Weather-Predicted Atmospheric Modulation Transfer Functions", Optical Engineering, Vol. 36, pp. 3064-3072, 1997.
5. S. G. Narasimhan and S. K. Nayar, "Contrast Restoration of Weather Degraded Images", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 25, No. 6, pp. 713-724, 2003.
6. S. G. Narasimhan and S. K. Nayar, "Vision in Bad Weather", Seventh IEEE International Conference on Computer Vision, Vol. 1, pp. 820-827, 1999.
7. W. E. K. Middleton, "Vision Through the Atmosphere", 2nd Edition, University of Toronto Press, 1952.

Acknowledgements

This work was supported by the "Proyecto de Excelencia de la Junta de Andalucía" through grant 30B4680301.