
Extended depth-of-field for visual systems: an overview Yufu Qu*, Liyan Liu**, Haijuan Yang

Key Laboratory of Precision Opto-mechatronics Technology of Education Ministry, Beijing University of Aeronautics & Astronautics, Beijing 100191, China

School of Instrumentation and Opto-Electronic Engineering, Beijing University of Aeronautics & Astronautics, Beijing 100191, China

ABSTRACT

In this paper, we first introduce the concept of the depth of field (DOF) in machine vision systems, which serves as the basic building block for our study. We then survey the fundamental methods and the current status of DOF extension, followed by a detailed analysis of the principles and performance of some representative extended depth-of-field (EDOF) technologies. Finally, we offer some predictions about the prospects of EDOF technologies.

Keywords: Machine vision, DOF extension, Vision measurement, Computational photography, Point spread function

1. INTRODUCTION

In a machine vision system, the DOF is the distance between the nearest and farthest objects in a scene that appear acceptably sharp in an image when the detector is fixed in position. The DOF depends mainly on the numerical aperture of the imaging lens and the magnification of the system. At high numerical apertures, the DOF is determined primarily by wave optics, while at lower numerical apertures the circle of confusion, calculated from geometric optics, dominates. Using a variety of criteria for determining when the image falls below an acceptable level of sharpness, several authors have proposed different formulas for the depth of field in a microscope. The total depth of field is given by the sum of the wave-optics and geometrical-optics contributions [1]:

$$d = \frac{\lambda n}{\mathrm{NA}^2} + \frac{n}{M\,\mathrm{NA}}\, e, \qquad (1)$$

where $d$ is the depth of field, $\lambda$ is the wavelength of the illuminating light, $n$ is the refractive index of the medium between the cover slip and the objective front lens element, and $\mathrm{NA}$ is the numerical aperture of the objective. The variable $e$ is the smallest distance that can be resolved by a detector placed in the image plane of the lens of the machine vision system, whose lateral magnification is $M$.
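As a quick numerical illustration of Eq. (1), the following sketch evaluates its two terms for a few objective settings; all parameter values are illustrative, not taken from a specific instrument.

```python
# Total depth of field per Eq. (1): wave-optics term plus geometric term.
def total_dof_um(wavelength_um, n, na, e_um, mag):
    """d = lambda*n/NA^2 + n*e/(M*NA); all lengths in micrometers."""
    wave_term = wavelength_um * n / na ** 2
    geom_term = n * e_um / (mag * na)
    return wave_term + geom_term

# Illustrative values: green light, air (n = 1), a 6.5 um detector pixel.
for na, mag in [(0.10, 4), (0.30, 10), (0.75, 40)]:
    d = total_dof_um(0.55, 1.0, na, 6.5, mag)
    print(f"NA = {na:.2f}, M = {mag:2d}x -> DOF = {d:5.1f} um")
```

The drop from tens of micrometers at low NA to roughly a micrometer at high NA is exactly the trade-off that motivates the EDOF methods surveyed below.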

According to the definition above, the DOF determines the range of sharp imaging along the optical axis: the larger the DOF, the wider the depth range that appears sharp. A larger DOF is desirable in many practical machine vision tasks, such as simultaneous inspection of the top surface and pins of an integrated chip, critical-size inspection of MEMS features at different heights, and defect and critical-size inspection on spherical or other complex surfaces. Unfortunately, the DOF is inversely related to the magnification of the machine vision system; when the magnification is large, the DOF is only several microns or even less. Larger DOF values can be achieved by reducing the magnification or the numerical aperture, but only at the expense of spatial resolution, luminous flux, and signal-to-noise ratio (SNR) [1].

*[email protected]; ** [email protected]

Optoelectronic Imaging and Multimedia Technology II, edited by Tsutomu Shimura, Guangyu Xu, Linmi Tao, Jesse Zheng, Proc. of SPIE Vol. 8558, 85580K · © 2012 SPIE

CCC code: 0277-786/12/$18 · doi: 10.1117/12.999728



In order to break through the DOF limitation of optical imaging systems, researchers have proposed dozens of DOF extension methods over the past fifty years that allow high magnification and a large DOF to coexist. In this paper, we classify EDOF technologies into four types, based on whether they add optical elements and whether they use image restoration: simple improvement (which neither adds optical elements nor uses image restoration), image restoration (which only uses image restoration techniques), element addition (which only adds optical elements), and computational photography (which both adds optical elements and uses image restoration). In the following sections, we analyze each of these four types in terms of its principle and characteristics and then summarize our results.

2. EXISTING EDOF TECHNOLOGIES

2.1 Simple improvement

DOF extension by simple improvement is achieved by changing the elements or parameters of the optical imaging system. These technologies can be subdivided into four types, based on the approach they use:

(1) Reducing the numerical aperture (NA), which decreases the luminous flux, SNR, and spatial resolution, as shown in Fig. 1.
(2) Immersing the lens in a solution whose refractive index is larger than that of air. Since the refractive index of such a solution is only slightly larger than that of air, the capability of this type of EDOF technology is limited.
(3) Increasing the pixel area of the detector. In this approach, the DOF is extended at the expense of axial resolution.
(4) Increasing the wavelength, for example by using red or infrared light. Likewise, an EDOF image is obtained at the cost of axial resolution. Moreover, the DOF extension ratio is limited to a factor of about 1.5 to 2.

Figure 1. The relationship between DOF and NA.

2.2 Image restoration

EDOF technology based on image restoration is realized by repeatedly altering the parameters of an optical imaging system and then extending the DOF with the aid of digital image processing. The representative technologies are of four types: focus variation [2], DOF superposition [3], spherical aberration [4], and chromatic aberration [5].

2.2.1 Focus variation

In this approach, the DOF is extended by varying the focus of the optical imaging system to image an object in the scene at different depths, and then merging the sharp pixels selected from the captured images to obtain an EDOF image [2]. As shown in Fig. 2, the pictures are captured at different focus values: the left one is focused on the near object, and the right one is focused on the object situated farther away. The farther object is blurred when the nearer object is in focus, and the nearer object is blurred when the farther object is in focus. Consequently, each single picture always contains some blurred portions.

The picture in Fig. 3 is acquired by fusing the two images in Fig. 2. As can be seen, the objects at both the near and the far distance become sharp, demonstrating successful DOF extension. However, this approach is not suitable for observing or detecting moving objects, because the images must be captured repeatedly. In addition, the captured images require substantial computation to process, which makes it difficult to present EDOF images in real time.
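A minimal sketch of the fusion step, assuming two co-registered grayscale frames such as those in Fig. 2; the local-sharpness criterion here (smoothed absolute Laplacian) is one common choice, not necessarily the one used in [2].

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def fuse_two(near_focus, far_focus, win=9):
    """Per-pixel fusion of two differently focused frames: keep the pixel
    from whichever frame is locally sharper (higher smoothed |Laplacian|)."""
    s_near = uniform_filter(np.abs(laplace(near_focus)), size=win)
    s_far = uniform_filter(np.abs(laplace(far_focus)), size=win)
    return np.where(s_near >= s_far, near_focus, far_focus)
```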


Figure 2. Pictures of a scene captured at different focus values.

Figure 3. EDOF image after fusion.

2.2.2 DOF superposition

In this approach, by moving a certain part of the optical imaging system or the objective along the optical axis, a thick specimen is imaged slice by slice at different focal depths. An EDOF image is then reconstructed by superimposing the image stack into a single two-dimensional (2-D) image. In a machine vision system, the three-dimensional (3-D) coordinates of a free-form surface can be obtained by DOF superposition. When the coordinates of the free-form surface shown in Fig. 4(a) are measured, the camera moves along the optical axis to capture images from each field of view. In this way, we obtain image sequences of the free-form surface at different focal depths, as presented in Fig. 4(b). Based on a focus evaluation factor known as the focus measure (FM), a set of sharp pixels is selected from those sequences [6,7]: by computing and comparing the relative FMs of the pixels, we can determine which are sharp. In this manner, images with large DOF values are obtained through image fusion, as indicated in Fig. 4(c).
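To make the FM-based selection concrete, here is a sketch of a per-pixel depth-from-focus step over an axial image stack; the second-difference focus measure stands in for the FMs of [6,7], and the z-step value is illustrative.

```python
import numpy as np

def depth_from_focus(stack, z_step_um=2.0):
    """Height map from an axial image stack (nz, h, w): for each pixel,
    pick the slice with the largest focus measure (here, the absolute
    second differences along x and y) and convert its index to a height."""
    fm = []
    for img in stack:
        d2x = np.abs(np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1) - 2 * img)
        d2y = np.abs(np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0) - 2 * img)
        fm.append(d2x + d2y)
    sharpest = np.argmax(np.stack(fm), axis=0)   # slice index per pixel
    return sharpest * z_step_um                  # free-form surface heights
```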

(a) Free-form surface. (b) Image sequences.


(c) Free-form surface topography after DOF extension.

Figure 4. 3-D coordinate measurement of free-form surface.

From the discussion above, it can be noted that DOF superposition depends on the FM. However, noise is unavoidable in an actual image, which introduces errors into the FM. Furthermore, the image fusion algorithm is extremely complicated. Although image fusion in the frequency domain avoids this complicated processing, the image quality becomes worse due to the false contour effect [8-10].

2.2.3 Spherical aberration

By increasing the spherical aberration of the imaging lens, an EDOF image can be obtained through deconvolution, based on the principle of optical aberrations [4]. Firstly, this method preserves the resolution of the system without the use of special phase elements. Further, the lens performance is azimuth-independent and achromatic over the visible range.
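The restoration step named here (and reused by several later methods) is typically a frequency-domain deconvolution; a minimal Wiener-filter sketch, where the noise-to-signal ratio nsr is an assumed tuning constant:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=1e-3):
    """Wiener deconvolution: X = Y * conj(H) / (|H|^2 + nsr). The PSF is
    assumed already zero-padded to the image size and centered; nsr
    regularizes frequencies where the MTF is weak, trading ringing
    against noise amplification."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    Y = np.fft.fft2(blurred)
    X = Y * np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(X))
```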

EDOF technology based on spherical aberration is the most advantageous method for extending the DOF in a low-power microscope system. This approach allows the lens aberrations to pre-blur the image in a controlled manner and takes advantage of digital restoration to acquire EDOF images. In a spherical aberration system, the DOF can be expressed as a change in focus, $\delta W_{20}$, which is related to the numerical aperture $\mathrm{NA}$ and the corresponding displacement $\delta z$ by the equation [11]

$$\delta W_{20} = 0.5\,(\mathrm{NA})^{2}\,\delta z. \qquad (2)$$

The depth of field can then be obtained from Eq. (2). When the images are captured by a black-and-white detector, there is only a slight difference between the modulation transfer function (MTF) curves for different wavelengths and depths. Moreover, the point spread function (PSF) retains a strong central lobe, contributing to good image reconstruction quality. However, for an off-axis field point of a polychromatic image, successful reconstruction also requires the absence of transverse chromatic aberration, which is a difficult constraint to satisfy. Therefore, it may be preferable to synthesize a polychromatic image by acquiring three monochromatic images, which increases the complexity of the optical system.

2.2.4 Chromatic aberration

This EDOF method is based on the phenomenon of chromatic dispersion, which gives a lens axial chromatic aberration. In this case, the focal length of a thin singlet refractive lens varies as a function of wavelength [12]:

$$\frac{1}{f(\lambda)} = \bigl(n(\lambda) - 1\bigr)\left(\frac{1}{R_1} + \frac{1}{R_2}\right), \qquad (3)$$

where $n(\lambda)$ is the index of refraction of the lens material and $R_1$ and $R_2$ are the radii of curvature of its surfaces.
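A short sketch of Eq. (3) under an assumed Cauchy dispersion model n(λ) = A + B/λ²; the coefficients (roughly BK7-like) and the surface radii are illustrative only:

```python
import numpy as np

A, B = 1.5046, 4200.0            # Cauchy coefficients, B in nm^2 (BK7-like)
R1, R2 = 50.0, 50.0              # surface radii in mm (hypothetical singlet)

lam = np.linspace(400.0, 700.0, 7)              # visible band, nm
n = A + B / lam ** 2                            # n(lambda)
f = 1.0 / ((n - 1.0) * (1.0 / R1 + 1.0 / R2))   # Eq. (3)
for w, fv in zip(lam, f):
    print(f"lambda = {w:3.0f} nm -> f = {fv:5.2f} mm")
# The ~1.6 mm spread in f across the band is the "focal sweep" that the
# spectral focal sweep camera described below exploits.
```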

If such a lens is used with a black-and-white image detector, the imaging system can be considered to possess a continuum of focal lengths simultaneously. This system is therefore called a spectral focal sweep (SFS) camera, by analogy with existing focal sweep techniques [13,14]. It differs, however, in that it extends the DOF without moving parts or special optical elements. Additionally, the SFS lens is considerably lighter and more compact, which simplifies the imaging system.

The SFS camera can be used to extend the DOF when the reflectance spectra of the objects being imaged are known, since the amount of focal sweep depends on those spectra. The more broadband an object's spectrum is,


the wider the focal sweep becomes. For the camera to function correctly, the object must possess a reasonably broad spectral reflectance distribution. Fortunately, the reflectance spectra of most real-world objects are sufficiently broadband [15], which makes this method applicable to a wide range of scenes. However, some naturally occurring spectra are extremely narrowband and hence unsuitable for producing a large focus range, leaving the PSF highly dependent on depth. When an SFS lens is used to photograph such a scene, the corresponding PSF introduces a significant number of artifacts after deconvolution.

2.3 Element addition

In this approach, the DOF is extended by altering certain parts of the optical system. It is primarily applied in optical sectioning microscopy [16], digital holographic microscopy [17], and white-light interferometer microscopy [18].

2.3.1 Optical sectioning microscopy

In optical sectioning microscopy, a thick specimen is imaged slice by slice at different focal depths with the aid of an optical sectioning technique in light microscopy. An EDOF image is then reconstructed by superimposing the images in the stack. To obtain an EDOF image of good quality, it is necessary to eliminate background contributions caused by out-of-focus light and scattering. Some optical sectioning techniques adopt mechanisms to suppress the out-of-focus signals; examples include confocal scanning microscopy [19], multi-photon fluorescence microscopy [20], selective plane illumination microscopy [21], and structured illumination microscopy [22]. Confocal scanning microscopy removes the contribution of out-of-focus light in a thick specimen to obtain a slice image, and it removes the effect of light scattered from other positions. This removal of unwanted light provides better contrast and permits 3-D reconstruction by computationally combining the image data from a stack of images [23].

The schematic diagram of confocal scanning microscopy is shown in Fig. 5(a). The excitation light from the laser passes through the fluorescence filter cube and is then redirected by an oscillating mirror; it is focused onto the specimen after passing through the microscope objective. A specimen stained with fluorescent dyes is excited to produce fluorescent light, which reaches the photomultiplier tube (PMT) detector after returning through the objective and the oscillating mirror and passing through the confocal aperture. As presented in Fig. 5(b), light emitted from the focused spot can reach the detector through the pinhole of the confocal aperture, but light emitted from other spots is blocked, making the images sharper. In Fig. 5(c), the fluorescent molecules produce an Airy diffraction pattern with a strong central lobe at the plane of the confocal pinhole, and the detected central Airy disc becomes sharper owing to the constraint of the confocal aperture.

Figure 5. The principle of confocal microscopy.

2.3.2 Digital holographic microscopy

Holography is an imaging method in which an interference fringe pattern, referred to as a hologram, is recorded in some material to store information about the light wave reflected from the imaged object. When the reference wave illuminates the hologram, the original object wave is reconstructed according to diffraction theory, producing a 3-D image that closely resembles the original object. This process of wavefront recording and reconstruction is called holography, which can also be described as interference recording and diffraction reconstruction [24].

When holography was first developed, holograms were recorded in chemical media such as silver-salt emulsions and gelatin. With the development of computational and digital sensor technology, especially the emergence of CCDs and the improvement of their resolution, it became possible to perform holography by digital recording and numerical reconstruction, without photosensitive materials or chemical processing. The result was digital holography, in which the recording material (the photographic plate) is replaced by a CCD sensor [25].

Although an EDOF image can be achieved by 3-D reconstruction based on holographic theory, digital holographic microscopy is not applied in high-accuracy online measurement systems, owing to the complexity of its structure and principle and its small field of view.

2.3.3 White-light interferometer microscopy

White-light interferometer microscopy recovers the topography of the measured object based on the principle of white-light interference, using white light as the source. Compared with a monochromatic source, white-light interference fringes possess a well-defined position of zero optical path difference (OPD), eliminating the fringe-order ambiguity that occurs with a monochromatic laser source. The characteristic of white-light interference fringes is that the fringe pattern has a maximum, referred to as the central fringe, which corresponds to the position of zero OPD and does not depend on wavelength; the fringes on both sides of the central fringe are chromatic. Positions can therefore be determined accurately from the central fringe, enabling measurement over a large DOF with the aid of a reliable absolute position reference [26-28].

There are four main types of white-light interference microscopy: Linnik [29], Michelson [30-32], Mirau [33,34], and confocal interference microscopy [35]. A schematic diagram of Mirau white-light interference microscopy is shown in Fig. 6. The light emitted from the white-light source enters the interference optical path after being reflected by a beam-splitter prism, and the interference cavity is formed between the Mirau interference objective and the measured object.

Figure 6. Schematic diagram of Mirau interference microscopy.
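The height-recovery step, described in detail in the next paragraph, reduces to locating the peak of the fringe envelope along the scan for every pixel. A minimal per-pixel sketch, assuming the fringe intensities have been recorded as a (nz, h, w) stack; the Hilbert-transform envelope used here is one common implementation choice:

```python
import numpy as np
from scipy.signal import hilbert

def height_map(intensity_stack, z_positions_um):
    """Per-pixel surface height from a white-light vertical scan.
    intensity_stack: (nz, h, w) CCD frames recorded while the PZT steps
    the objective through the array of positions z_positions_um."""
    ac = intensity_stack - intensity_stack.mean(axis=0)  # strip the DC level
    envelope = np.abs(hilbert(ac, axis=0))               # fringe envelope
    peak = np.argmax(envelope, axis=0)                   # zero-OPD slice
    return z_positions_um[peak]
```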


The measurement light and the reference light are focused onto a CCD, which records the white-light interference fringes. A micro-displacement device made of piezoelectric ceramic is rigidly connected to the Mirau interference objective, and the piezoelectric transducer (PZT) drives the objective to scan vertically. When the distance from the objective to a point on the surface of the measured object equals the optical path length of the reference arm, the OPD between the two arms of the interferometer becomes zero, and the intensity of the white-light interference signal attains its peak value. As the PZT scans the object vertically, the white-light interference signal from the object surface is recorded by the CCD, and the intensity-variation curve of each point during the scan is extracted from the recorded interferograms. The topography of the measured object is then obtained from the height values corresponding to the peak position of the envelope of the interference signal at each point.

2.4 Computational photography

In this approach, an element is added to the optical imaging system and the EDOF image is acquired by restoring the resulting blurred image, making the method independent of the degree of defocus. It has been the focus of research over the past ten years. It is referred to as the EDOF technology of computational photography because its core idea matches the concept of computational photography, which was proposed at a symposium at the Massachusetts Institute of Technology in 2005 [36]. Based on where the added element acts, we classify this technology into light-source, focal-plane, and aperture methods.

2.4.1 Light source

By modifying the illumination system, a spatial-frequency grid pattern is projected onto the object, so that only the portion of the object where the grid pattern is in focus is imaged effectively. We thus obtain an optically sectioned image of the object, but with the unwanted grid pattern superimposed; it is therefore necessary to remove the grid from the optically sectioned images [37].

A simple method, shown in Fig. 7, permits optically sectioned images to be obtained from a conventional microscope in real time. A sinusoidal mask is placed between the incoherent illumination source and the beam splitter of the microscope. With the grid in motion during the integration time of the camera, the sinusoidal fringes are projected onto the object and three images are taken corresponding to three relative spatial phases of the grid; an EDOF image can then be computed by image processing. Fig. 8 compares two images of the thick volume structure of a lily pollen grain, one an EDOF image obtained by image restoration and the other taken with conventional microscopy. The figure shows that the DOF is larger after DOF extension.
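The three-phase reconstruction of [37] combines the frames with a simple square-root-of-differences formula; a direct sketch, where i1, i2, i3 are the frames at grid phases 0, 2π/3, and 4π/3:

```python
import numpy as np

def sectioned_image(i1, i2, i3):
    """Optically sectioned image from three grid-illuminated frames taken
    at relative grid phases 0, 2*pi/3 and 4*pi/3: the in-focus (grid-
    modulated) signal survives, while the grid itself cancels out."""
    return np.sqrt((i1 - i2) ** 2 + (i2 - i3) ** 2 + (i3 - i1) ** 2)
```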

Figure 7. Schematic of the light source EDOF system.


Figure 8. Comparison of a lily pollen grain before and after DOF extension in a conventional microscopy scheme: (a) conventional image; (b) EDOF image.

2.4.2 Focal plane

This type of EDOF technology comprises methods such as using a multi-focus detector and moving the detector. In the following subsections, we introduce the basic principle of each approach.

2.4.2.1 Multi-focus detector

In a multi-focus camera, the imaging light path is divided into three paths that image focal planes separated by fixed intervals, and image processing is then used to obtain a large-DOF image. Compared with focus-variation approaches, this method needs to capture images only once to extend the DOF [38].

Fig. 9 shows the principle of the multi-focus camera. The dichroic prism block is re-coated so that each CCD receives one third of the incoming light in all spectral bands. Two of the three CCDs are shifted backwards or forwards along the optical axis so that they capture defocused images, and an optimization algorithm is then employed to compute the depth [39].

Figure 9. Structure of the multi-focus camera.
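To see why three axially shifted detectors encode depth, the thin-lens sketch below computes the blur-circle radius on each detector for a few object depths; the focal length, aperture, and detector spacings are hypothetical values, not taken from [39].

```python
import numpy as np

def blur_radius(u_mm, f_mm, sensor_mm, aperture_mm):
    """Thin-lens blur-circle radius on a detector placed sensor_mm behind
    the lens, for an object at depth u_mm."""
    v = 1.0 / (1.0 / f_mm - 1.0 / u_mm)          # in-focus image distance
    return 0.5 * aperture_mm * abs(sensor_mm - v) / v

f, A = 50.0, 10.0                                # mm (illustrative)
planes = [52.0, 52.6, 53.2]                      # three CCD distances, mm
for u in [800.0, 1200.0, 2000.0]:
    r = [blur_radius(u, f, p, A) for p in planes]
    print(f"u = {u:6.0f} mm -> blur radii (mm): " +
          ", ".join(f"{x:.3f}" for x in r))
# The pattern of relative blurs across the three planes is distinct for
# each depth, which is what the depth optimization exploits.
```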

2.4.2.2 Detector movement

By mounting the detector on a micro-actuator and translating it during the integration time, a blurred image is acquired; an EDOF image can then be obtained by deconvolution with a single PSF, as has been proven theoretically [40]. Fig. 10 shows diagrams and photographs of a prototype visual system with a moving detector.

The advantage of this approach is that the camera can extend the DOF flexibly and can also capture images with a tilted DOF. However, the PSF of the imaging system is depth-dependent: the image point corresponding to an object point at a fixed transverse coordinate shifts with depth, producing a magnification variation along the depth direction. If a uniform PSF is applied to the blurred images, artifacts appear in the restored images. Although an ideal image can be recovered by using a depth-dependent PSF, doing so requires pre-calibrating the PSF at different depths and determining the depth of the object, which is difficult to realize owing to the complexity of such a system.


Figure 10. (a) A scene point M at a distance u from the aperture is imaged at a point m at a distance v from the lens; if the detector is shifted to a distance p from the lens, M is imaged as a blurred circle centered at m'. (b) The DOF camera translates the detector along the optical axis during the integration time. (c) Mechanical structure of the DOF camera.
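A sketch of why the sweep helps: integrating pillbox defocus PSFs over the sweep yields nearly the same kernel for objects anywhere in the extended range, since every depth passes through focus at some instant. This is a simplified geometric model; the kernel sizes in pixels are illustrative.

```python
import numpy as np

def disk_psf(radius_px, size=65):
    """Normalized pillbox (geometric defocus) PSF for a blur-circle radius."""
    y, x = np.mgrid[-(size // 2):size // 2 + 1, -(size // 2):size // 2 + 1]
    d = ((x ** 2 + y ** 2) <= max(radius_px, 0.5) ** 2).astype(float)
    return d / d.sum()

def swept_psf(radii):
    """PSF integrated over one exposure while the detector sweeps, i.e. an
    equal-time average of the instantaneous defocus PSFs."""
    return sum(disk_psf(r) for r in radii) / len(radii)

# An object at the edge of the swept range sees blur radii covering
# 0..12 px; one in the middle sees 0..6 px twice. Both integrated PSFs
# share a similar sharp central peak, which is why a single deconvolution
# kernel works across the extended depth range.
psf_edge = swept_psf(np.abs(np.linspace(0.0, 12.0, 25)))
psf_mid = swept_psf(np.abs(np.linspace(-6.0, 6.0, 25)))
```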

2.4.3 Aperture

Much research has been conducted on adding elements in the aperture plane, because the DOF extension ratio of such methods can reach a factor of ten or even higher. The typical methods are aperture variation, masks, phase plates, lenses of grid focal length, and light field imaging. We give detailed information about each of these approaches below.

2.4.3.1 EDOF technology based on a variable aperture

In this approach, the imaging system includes an annular aperture of variable size. As the size of the aperture is changed, the image is integrated, and the result is then reconstructed by image processing, yielding a large-DOF image [41].

The principle of EDOF technology based on aperture variation is represented in Fig. 11. The aperture is divided into annular apertures of different radii. During the integration time of an image, each annular aperture is opened in turn to capture images, which are superimposed and averaged to obtain a blurred image; a deblurring algorithm is then employed to obtain a high-contrast EDOF image. The advantage of this method is that it is independent of the wavelength of the illumination, and the PSF curve is symmetric and sharp; nevertheless, a restoration error remains, appearing as a decrease in contrast.

Figure 11. Principle of aperture variation method.

Fig. 12 and Fig. 13 present the MTF curves of a traditional optical system and of the EDOF system based on aperture variation, respectively. Each figure shows curves for an in-focus object and a defocused object. From these figures, we can conclude that, compared with the traditional system, the MTF obtained by aperture variation is nearly independent of defocus.


Figure 12. MTF curve of the traditional optical system.

Figure 13. MTF curve of an EDOF system based on aperture variation.

2.4.3.2 EDOF technology based on masks

In this method, a mask is placed in the aperture plane of the optical system, and an EDOF image is obtained using an image recovery technique, which is simple and convenient. The disadvantages are that the luminous flux and spatial resolution decrease and that the PSF is depth-dependent: if the same PSF is used to restore the blurred image, only the in-focus plane is restored sharply, while other depths remain blurred. The masks can be of the following types: gray [42], circular ring [43], multi-ring [41], random [44], modified uniformly redundant array [45], aperture pair [46], the Levin mask [47], and the Veeraraghavan mask [48]. In the following analysis, we introduce the characteristics of some typical masks.

A gray mask modulates the exit pupil function so that the modulated PSF and MTF vary little with defocus. In 2007, Anat Levin designed the Levin mask [47] by adopting an evaluation criterion for depth resolution; the mask shown in Fig. 14 can be used both to obtain depth information from the image and to restore a full-focus image.

Figure 14. The Levin mask.

The Veeraraghavan mask, shown in Fig. 15, was obtained by Ashok Veeraraghavan in 2007 by optimizing a modified uniformly redundant array [48]. Owing to the large search space, the mask is limited to a 7 × 7 cell structure. A camera with this mask records modulated information about the four-dimensional light field, so a full-focus image and image depth information can be obtained by an image restoration algorithm.


Figure 15. The Veeraraghavan mask.

In 2009, Changyin Zhou put forward a new method of evaluating masks [44] and adopted a genetic algorithm that converges rapidly to an optimized mask of 13 × 13 cells, offering considerable improvements over the Veeraraghavan mask. The random mask and the mask optimized for a given noise level are presented in Fig. 16(a) and Fig. 16(b), respectively. The experimental results show that the image restored with the optimized mask is sharper and exhibits fewer artifacts and less noise.

Figure 16. The common random mask (a) and the optimized random mask (b) at a noise standard deviation of σ = 0.005.
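The flavor of this search can be sketched with a toy genetic algorithm whose fitness is a simplified expected Wiener-restoration error, a stand-in for the actual criterion of [44]; the 1/f²-style image prior, noise level, and GA settings are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def wiener_risk(mask, sigma=0.005, n=64):
    """Simplified expected Wiener-restoration error for a binary aperture
    mask: small risk means the mask's spectrum avoids deep zeros where
    noise would be amplified during deblurring."""
    K = np.abs(np.fft.fft2(mask, s=(n, n))) ** 2
    fy, fx = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n), indexing="ij")
    prior = 1.0 / (fx ** 2 + fy ** 2 + 1e-4)      # 1/f^2-like image prior
    return np.sum(sigma ** 2 * prior / (K * prior + sigma ** 2))

def optimize_mask(cells=13, pop=40, gens=60):
    """Toy genetic search over binary masks, in the spirit of [44]."""
    masks = rng.integers(0, 2, size=(pop, cells, cells)).astype(float)
    for _ in range(gens):
        scores = np.array([wiener_risk(m) for m in masks])
        parents = masks[np.argsort(scores)[: pop // 2]]        # selection
        kids = parents.copy()
        flip = rng.random(kids.shape) < 0.02                   # mutation
        kids[flip] = 1.0 - kids[flip]
        masks = np.concatenate([parents, kids])
    return masks[np.argmin([wiener_risk(m) for m in masks])]
```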

Later, Changyin Zhou designed a coded aperture pair with high depth-recovery accuracy [46]. In this design, the mask is extended to 33 × 33 cells, as presented in Fig. 17. This aperture pair can not only recover reliable depth information but also produce a full-focus EDOF image by combining the two images captured through the pair.

Figure 17. An optimized aperture pair pattern.

2.4.3.3 EDOF technology based on a phase plate

A phase plate placed in the pupil or aperture plane of the optical system modulates the wavefront, producing an image that is nearly independent of defocus. A sharp image with a large DOF is then attained by digitally demodulating (restoring) the image.

Wavefront coding, the most developed of these technologies, is widely applied [49-51]. The system consists of an optical part and a digital signal processing part, as illustrated in Fig. 18. A phase plate is placed in the aperture plane; after the incoherent wavefront is coded by the phase plate, a blurred image that is insensitive to defocus is formed on the detector. Digital image processing is then used to restore the image, producing a final image with a large DOF.

Compared with mask-based EDOF technology, the phase plate approach does not reduce the luminous flux or the spatial resolution, and hence it is widely applied in many fields. Common phase plate forms include the cubic phase plate [52], the exponential phase plate [53], the logarithmic phase plate [54], and the sinusoidal phase plate [55]. However, as in the detector-moving approach, the PSF of a phase plate system depends on depth; the approach also sacrifices some of the dynamic range and SNR of the imaging system.
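As a concrete example, the cubic phase plate of [52] can be simulated directly from the pupil function; the sketch below compares the PSF with and without defocus (the phase strength and grid size are illustrative choices, not values from the cited work).

```python
import numpy as np

def cubic_phase_psf(alpha, w20, n=256):
    """PSF of a wavefront-coded system: cubic pupil phase alpha*(x^3 + y^3)
    plus w20*(x^2 + y^2) waves of defocus over a unit circular pupil."""
    x = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(x, x)
    pupil = (X ** 2 + Y ** 2 <= 1.0).astype(float)
    phase = 2.0 * np.pi * (alpha * (X ** 3 + Y ** 3) + w20 * (X ** 2 + Y ** 2))
    field = np.fft.fftshift(np.fft.fft2(pupil * np.exp(1j * phase)))
    intensity = np.abs(field) ** 2
    return intensity / intensity.max()

# With a strong cubic term the PSF barely changes over several waves of
# defocus, so a single restoration kernel serves the whole depth range.
psf_focused = cubic_phase_psf(alpha=30.0, w20=0.0)
psf_defocused = cubic_phase_psf(alpha=30.0, w20=3.0)
```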


Figure 18. The principle of the wavefront coding system.

2.4.3.4 EDOF technology based on a lens of grid focal length

A large lens is formed by combining several small lenses with different focus values into an array [56]. A blurred image is obtained by imaging the object after placing the combined lens in the aperture plane, and the DOF is then extended with the aid of image restoration. This approach yields a large spatial-frequency power spectrum, but it sacrifices spatial resolution, as in the mask technology.

2.4.3.5 EDOF technology based on synthesized light field imaging

In this approach, the radiation characteristics of light are described in order to obtain object information from the light field. Light-field imaging records the information carried by the light wave field, from which the two-dimensional information in object space and the one-dimensional depth information are reconstructed [57-59].

In EDOF technology based on light field imaging, an array of micro-lenses is inserted into the optical system to form a synthesized light field [60]. How information about the object is captured in the light field is presented in Fig. 19(b). Each micro-lens records the light information from a different field of view, and the object is imaged onto the subunit of the detector behind each micro-lens, producing an image focal stack. A deconvolution algorithm then produces a series of transverse sections that display the volumetric shape of the object. The disadvantages are that the micro-lens array is complex, its manufacturing cost is high, and the required positioning accuracy of the micro-lenses is high.
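One way to produce the focal stack from the recorded light field is shift-and-add synthetic refocusing; a minimal sketch, assuming the light field has been resampled into a (nu, nv, h, w) array of sub-aperture views (integer-pixel shifts keep the example short):

```python
import numpy as np

def refocus(lightfield, slope):
    """Synthetic refocusing by shift-and-add over the (u, v) aperture views.
    lightfield: (nu, nv, h, w); slope: per-view pixel shift, which selects
    the refocus depth."""
    nu, nv, h, w = lightfield.shape
    out = np.zeros((h, w))
    for u in range(nu):
        for v in range(nv):
            du = int(round(slope * (u - nu // 2)))
            dv = int(round(slope * (v - nv // 2)))
            out += np.roll(lightfield[u, v], (du, dv), axis=(0, 1))
    return out / (nu * nv)
```

Sweeping slope over a range of values yields the focal stack from which the deconvolution described above recovers the object volume.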

Figure 19. Light field microscopy: (a) ordinary microscope; (b) light field microscope; (c) prototype.

3. CONCLUSIONS AND DISCUSSION

As discussed in the previous sections, the four types of EDOF technology have been widely researched. In particular, most studies have focused on the EDOF technology of computational photography, which combines optical imaging with computational signal processing. In this approach, a simple optical element is added to the system to modulate the light from the object, yielding a coded blurred image; an EDOF image is then obtained by demodulating the blurred image with digital signal processing. Owing to the simplicity and convenience of this method, the DOF extension ratio can be as large as a factor of ten or even several tens, and EDOF technology based on computational photography is therefore the primary direction of recently proposed DOF extension methods. Furthermore, an EDOF image can be obtained by imaging the object only once during the integration time, without the need for complex operations.


This technology satisfies requirements in the biological, medical, and manufacturing fields, and it is also valuable for visual measurement systems that require high accuracy.

ACKNOWLEDGMENTS

This work is supported by the National Natural Science Foundation of China under Grant No. 51105027.

REFERENCES

[1] Inoué, S. and Spring, K. R., [Video Microscopy: The Fundamentals], Plenum Press, New York & London, 48-50 (1997).
[2] Haeberli, P., "A multifocus method for controlling depth of field," Course Hero, Fall 2009, <http://www.coursehero.com/file/4293407/A-Multifocus-Method-for-Controlling-Depth-of-Field/> (October 1994). http://www.graficaobscura.com/depth/index.html
[3] Qu, Y. F., Pu, Z. B. and Zhao, H. J., "3D coordinate measurement methods of free form surface based on monocular vision," Scientific Instrument 27(10), 1318-1321 (2006). (In Chinese)
[4] Mouroulis, P., "Depth of field extension with spherical optics," Optics Express 16(17), 12995-13004 (2008).
[5] Cossairt, O. and Nayar, S., "Spectral focal sweep: Extended depth of field from chromatic aberrations," Proc. Int. Conf. Comput. Photography, 1-8 (2010).
[6] Li, Q., "Studies on the theory and implementation method of digital autofocus," Ph.D. dissertation, Zhejiang University (2004).
[7] Jenn-Kwe, T., "Analysis and application of autofocusing and three-dimensional shape recovery techniques based on image focus and defocus," Ph.D. dissertation, State University of New York (1997).
[8] Yang, X., Yang, W. and Pei, J. H., "Different focus points images fusion based on wavelet decomposition," Proc. Third International Conference on Information Fusion 1, 3-8 (2000).
[9] Duhamel, P. and Rioul, O., "Fast algorithms for discrete and continuous wavelet transforms," IEEE Transactions on Information Theory 38, 569-586 (1992).
[10] Li, H., Manjunath, B. S. and Mitra, S. K., "Multisensor image fusion using the wavelet transforms," Proc. First International Conference on Image Processing 1, 51-55 (1994).
[11] Mouroulis, P. and Macdonald, J., [Geometrical Optics and Optical Design], Oxford University Press, New York & Oxford, 324 (1997).
[12] Smith, W. J., [Modern Optical Engineering: The Design of Optical Systems], McGraw-Hill, New York, 80-90 (1966).
[13] Nagahara, H., Kuthirummal, S., Zhou, C. and Nayar, S., "Flexible depth of field photography," ECCV 4, 60-73 (2008).
[14] Häusler, G., "A method to increase the depth of focus by two step image processing," Optics Comm. 6(1), 38-42 (1972).
[15] Parkkinen, J., Hallikainen, J. and Jaaskelainen, T., "Characteristic spectra of Munsell colors," J. Opt. Soc. Am. 6(2), 318-322 (1989).
[16] Liu, S. and Hua, H., "Extended depth-of-field microscopy imaging with a variable focus microscope objective," Optics Express 19(1), 353-362 (2011).
[17] Marquet, P., Rappaz, B., Magistretti, P. J., Cuche, E., Emery, Y., Colomb, T. and Depeursinge, C., "Digital holographic microscopy: a noninvasive contrast imaging technique allowing quantitative visualization of living cells with subwavelength axial accuracy," Opt. Lett. 30(5), 468-470 (2005).
[18] Pavlicek, P. and Soubusta, J., "Theoretical measurement uncertainty of white-light interferometry on rough surfaces," Appl. Opt. 42(10), 1809-1813 (2003).
[19] Conchello, J. A. and Lichtman, J., "Optical sectioning microscopy," Nat. Methods 2(12), 920-931 (2005).
[20] Helmchen, F. and Denk, W., "Deep tissue two-photon microscopy," Nat. Methods 2(12), 932-940 (2005).
[21] Huisken, J., Swoger, J., Del Bene, F., Wittbrodt, J. and Stelzer, E. H. K., "Optical sectioning deep inside live embryos by selective plane illumination microscopy," Science 305(5686), 1007-1009 (2004).
[22] Neil, M. A. A., Juskaitis, R. and Wilson, T., "Method of obtaining optical sectioning by using structured light in a conventional microscope," Opt. Lett. 22(24), 1905-1907 (1997).
[23] Minsky, M., "Memoir on inventing the confocal scanning microscope," Scanning 10(4), 128-138 (1988).
[24] Gabor, D., "A new microscopic principle," Nature 161(4098), 777-778 (1948).
[25] Goodman, J. W. and Lawrence, R. W., "Digital image formation from electronically detected holograms," Applied Physics Letters 11(3), 77-79 (1967).


[26] Park, M. C. and Kim, S. W., "Direct quadratic polynomial fitting for fringe peak detection of white light scanning interferograms," Opt. Eng. 39(4), 952-959 (2000).
[27] Groot, P., Lega, X. C., Kramer, J. and Turzhitsky, M., "Determination of fringe order in white-light interference microscopy," Appl. Opt. 41(22), 4571-4578 (2002).
[28] Harasaki, A., Schmit, J. and Wyant, J. C., "Improved vertical-scanning interferometry," Appl. Opt. 39(13), 2107-2115 (2000).
[29] Pfortner, A. and Schwider, J., "Dispersion error in white-light Linnik interferometers and its implications for evaluation procedures," Appl. Opt. 40(34), 6223-6228 (2001).
[30] Dresel, T., Hausler, G. and Venzke, H., "Three-dimensional sensing of rough surfaces by coherence radar," Appl. Opt. 31(7), 919-925 (1992).
[31] Pavlicek, P. and Soubusta, J., "Theoretical measurement uncertainty of white-light interferometry on rough surfaces," Appl. Opt. 42(10), 1809-1813 (2003).
[32] Pavlicek, P. and Soubusta, J., "Measurement of the influence of dispersion on white-light interferometry," Appl. Opt. 43(4), 766-770 (2004).
[33] Groot, P. and Lega, X. C., "Signal modeling for low-coherence height-scanning interference microscopy," Appl. Opt. 43(25), 4821-4830 (2004).
[34] Kino, G. S. and Chim, S. C., "The Mirau correlation microscope," Appl. Opt. 29(26), 3775-3783 (1990).
[35] Shotton, D. M., "Confocal scanning optical microscopy and its applications for biological specimens," J. Cell Sci. 94(2), 175-206 (1989).
[36] Levoy, M., Durand, F. and Szeliski, R., "Symposium on Computational Photography and Video," SCPV, <http://scpv.csail.mit.edu/home.htm> (23 May 2005). http://scpv.csail.mit.edu/
[37] Neil, M. A. A., Juskaitis, R. and Wilson, T., "Method of obtaining optical sectioning by using structured light in a conventional microscope," Opt. Lett. 22(24), 1905-1907 (1997).
[38] Goodman, J. W. and Lawrence, R. W., "Digital image formation from electronically detected holograms," Appl. Phys. Lett. 11(3), 77-79 (1967).
[39] Hiura, S. and Matsuyama, T., "Depth measurement by the multi-focus camera," IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 953-959 (1998).
[40] Kuthirummal, S., Nagahara, H., Zhou, C. and Nayar, S. K., "Flexible depth of field photography," IEEE Transactions on Pattern Analysis and Machine Intelligence 33(1), 58-71 (2011).
[41] Hong, D., Cho, H., Kim, M. Y. and Moon, J. I., "Depth of field extension using variable annular pupil," Optomechatronic Technologies, 283-288 (2009).
[42] Bagheri, S. and Javidi, B., "Extension of depth of field using amplitude and phase modulation of the pupil function," Opt. Lett. 33(7), 757-759 (2008).
[43] Ojeda-Castaneda, J., Andres, P. and Diaz, A., "Annular apodizers for low sensitivity to defocus and to spherical aberration," Opt. Lett. 11(8), 487-489 (1986).
[44] Zhou, C. and Nayar, S., "What are good apertures for defocus deblurring?" IEEE International Conference on Computational Photography, 1-8 (2009).
[45] Gottesman, S. and Fenimore, E., "New family of binary arrays for coded aperture imaging," Appl. Opt. 28(20), 4344-4352 (1989).
[46] Zhou, C., Lin, S. and Nayar, S., "Coded aperture pairs for depth from defocus," International Conference on Computer Vision, 325-332 (2009).
[47] Levin, A., Fergus, R., Durand, F. and Freeman, W., "Image and depth from a conventional camera with a coded aperture," ACM Transactions on Graphics 26(3), No. 70 (2007).
[48] Veeraraghavan, A., Raskar, R., Agrawal, A., Mohan, A. and Tumblin, J., "Dappled photography: mask enhanced cameras for heterodyned light fields and coded aperture refocusing," ACM Transactions on Graphics 26(3), No. 69 (2007).
[49] Dowski, E. R. and Cathey, W. T., "Extended depth of field through wave-front coding," Appl. Opt. 34(11), 1859-1866 (1995).
[50] Porras, R., Vazquez, S., Castro et al., "Wavefront coding technology in the optical design of astronomical instruments," Proc. SPIE 5622, 796-800 (2004).
[51] Prischepa, I. A. and Dowski, E. R., Jr., "Wavefront coded zoom lens system," Proc. SPIE 4487, 83-93 (2001).
[52] Cathey, W. T. and Dowski, E. R., "New paradigm for imaging systems," Appl. Opt. 41(29), 6080-6092 (2002).
[53] Yang, Q., Liu, L. and Sun, J., "Optimized phase pupil masks for extended depth of field," Optics Communications 272(1), 56-66 (2007).


[54] Zhao, H. and Li, Y., "Optimized logarithmic phase masks used to generate defocus invariant modulation transfer function for wavefront coding system," Opt. Lett. 35(15), 2630-2632 (2010).
[55] Zhao, H. and Li, Y., "Optimized sinusoidal phase mask to extend the depth of field of an incoherent imaging system," Opt. Lett. 35(2), 267-269 (2010).
[56] Levin, A., Hasinoff, S., Green, P., Durand, F. and Freeman, W., "4D frequency analysis of computational cameras for depth of field extension," ACM Transactions on Graphics 28(3), No. 97 (2009).
[57] Levoy, M., Ng, R., Adams, A., Footer, M. and Horowitz, M., "Light field microscopy," ACM Transactions on Graphics 25(3), 924-934 (2006).
[58] Levoy, M., "Light field rendering," Proc. 23rd Annu. Conf. Comput. Graph. Interactive Tech., 31-42 (1996).
[59] Gortler, S. J., Grzeszczuk, R., Szeliski, R. and Cohen, M. F., "The lumigraph," Proc. 23rd Annu. Conf. Comput. Graph. Interactive Tech., 43-54 (1996).
[60] Liang, C. K., "Analysis, acquisition, and processing of light field for computational photography," Ph.D. dissertation, National Taiwan University (2008).
