week10 iris

Upload: rabi-syed

Post on 13-Apr-2018


TRANSCRIPT

  • 7/27/2019 Week10 Iris

    1/46

    Iris Modeling and Synthesis

    CPSC 601 Biometric Technologies Course


    Lecture Plan

    Motivation

    Iris structure

    Iris Image acquisition

    Methodology Iris Localization

    Iris features Matching

    Iris Synthesis

    Future Developments


    Anatomy of the human iris. The upper panel illustrates the structure of the iris seen in a transverse section. The lower panel illustrates the structure of the iris seen in a frontal sector.

    Iris Structure


    At a finer grain of analysis, the iris is composed of several

    layers. The posterior surface is composed of heavily

    pigmented epithelial cells that make it impenetrable to light.

    Anterior to this layer two muscles are located that work in

    cooperation to control the size of the pupil.

    The visual appearance of the iris is a direct result of its multilayered structure. Iris color results from the differential

    absorption of light impinging on the pigmented cells in the

    anterior border layer.

    Iris Structure


    The first source of evidence comes

    from clinical observations. During the course of examining large numbers of eyes, ophthalmologists have noted that the detailed spatial pattern of an iris seems to be unique. The pattern seems to vary little, at least past childhood.

    The second source of evidence comes from developmental biology. While the general structure of the iris is genetically determined, the particulars of its minutiae are critically dependent on circumstances (e.g. the initial conditions in the embryonic precursor to the iris).

    Anatomy of the iris visible in an optical image.

    Iris Structure - Uniqueness


    Another interesting aspect of the physical characteristics of the iris

    from a biometric point of view has to do with its dynamics. Due to the

    complex interplay of the iris's muscles, the diameter of the pupil is in a

    constant state of small oscillation at a rate of approximately 0.5 Hz.

    This movement could be monitored to ensure that a live specimen is

    being evaluated.

    Since the iris reacts very quickly to changes in impinging illumination,

    monitoring the reaction to a controlled illuminant could provide similar

    evidence.

    Iris Structure - Dynamics


    Acquisition of a high-quality iris image, while remaining non-invasive to human subjects, is one of the major challenges of automated iris recognition.

    Figure: Passive sensing approaches to iris image acquisition. The upper diagram shows a schematic diagram of the Daugman image acquisition rig. The lower diagram shows a schematic diagram of the Wildes et al. image acquisition rig.

    Iris Image Acquisition


    In order to cope with the inherent variability of ambient illumination, extant approaches to iris image sensing provide a controlled source of illumination as a part of their method.

    Research initiated at Sarnoff Corporation and subsequently transferred to Sensar Incorporated for refinement and commercialization has yielded the most non-invasive approach to iris image capture that has been documented to date.

    For capture, a subject merely needs to stand still and face forward with their head in an acquisition volume of 60° vertical by 45° horizontal and a distance of approximately 0.38 to 0.76 m, all measured from the front-center of the acquisition rig. Capture of an image that has proven suitable to drive iris recognition algorithms can then be achieved totally automatically, typically within 2-10 seconds.

    Iris Image Acquisition


    Figure: Active sensing approach to iris image acquisition.

    Iris Image Acquisition


    Following image acquisition, the portion of the image that corresponds to the iris needs to be localized from its surroundings.

    The iris image data can then be brought under a representation to yield an iris signature for matching against similarly acquired, localized and represented irises.

    Iris Image Localization


    The Daugman and Wildes et al. approaches make use of first derivatives of image intensity to signal the location of edges that correspond to the borders of the iris. Here, the notion is that the magnitude of the derivative across an imaged border will show a local maximum due to the local change of image intensity.

    Both systems model the various boundaries that delimit the iris with simple geometric models. For example, they both model the limbus and pupil with circular contours.

    The Wildes et al. system also explicitly models the upper and lower eyelids with parabolic arcs. In its initial implementation, the Daugman system simply excluded the upper- and lowermost portions of the image, where eyelid occlusion was most likely to occur; subsequent refinements include explicit eyelid localization.

    Iris Image Localization


    The two approaches differ mostly in the way that they search their parameter spaces to fit the contour models to the image information. The Daugman approach fits the circular contours via gradient ascent on the parameters (x_c, y_c, r) so as to maximize

    \left| G_\sigma(r) * \frac{\partial}{\partial r} \oint_{r, x_c, y_c} \frac{I(x, y)}{2\pi r} \, ds \right|

    where

    G_\sigma(r) = \frac{1}{\sqrt{2\pi}\,\sigma} \, e^{-(r - r_0)^2 / 2\sigma^2}

    is a radial Gaussian with center r_0 and standard deviation \sigma that smooths the image to select the spatial scale of edges under consideration, * symbolizes convolution, ds is an element of circular arc, and division by 2\pi r serves to normalize the integral.

    Iris Image Localization
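The radius search described on this slide can be sketched in a few lines. This is a toy illustration, not Daugman's implementation: the synthetic image, the nearest-pixel circle sampling and all names are invented here, and the search runs over radius only (center assumed known).

```python
import math

def make_eye(size=64, cx=32.0, cy=32.0, r_pupil=10.0):
    """Synthetic image: a dark disk (pupil) of radius r_pupil on a bright field."""
    return [[0.1 if (x - cx) ** 2 + (y - cy) ** 2 <= r_pupil ** 2 else 0.9
             for x in range(size)] for y in range(size)]

def circular_mean(img, cx, cy, r, n=64):
    """Approximate the normalized contour integral of I over a circle of radius r."""
    total = 0.0
    for k in range(n):
        t = 2.0 * math.pi * k / n
        total += img[int(round(cy + r * math.sin(t)))][int(round(cx + r * math.cos(t)))]
    return total / n

def daugman_radius(img, cx, cy, radii, sigma=1.0):
    """Pick the radius where the Gaussian-smoothed radial derivative peaks."""
    means = [circular_mean(img, cx, cy, r) for r in radii]
    deriv = [means[i + 1] - means[i] for i in range(len(means) - 1)]
    best_r, best_v = None, -1.0
    for i in range(len(deriv)):
        acc, w = 0.0, 0.0
        for j in range(max(0, i - 2), min(len(deriv), i + 3)):
            g = math.exp(-((i - j) ** 2) / (2.0 * sigma ** 2))
            acc += g * deriv[j]
            w += g
        if abs(acc / w) > best_v:
            best_v, best_r = abs(acc / w), radii[i]
    return best_r

img = make_eye()
r_hat = daugman_radius(img, 32.0, 32.0, list(range(5, 20)))
print(r_hat)  # the pupillary boundary sits near r = 10
```

Gradient ascent over (x_c, y_c, r) in the real system replaces the exhaustive radius sweep used here.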


    The Wildes et al. approach performs its contour fitting in two steps. First, the image intensity information is converted into a binary edge map. Second, the edge points vote, via a Hough transform, to instantiate particular contour parameter values.

    Iris Image Localization
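A toy sketch of that two-step idea, with an idealized edge map and a brute-force circular Hough accumulator. The sampling density, the toy edge map and all names are illustrative assumptions, not the Wildes et al. implementation:

```python
import math
from collections import Counter

def hough_circle(edge_points, radii, samples=180):
    """Each edge point votes for every (cx, cy, r) circle it could lie on;
    the accumulator cell with the most votes wins."""
    votes = Counter()
    for (x, y) in edge_points:
        for r in radii:
            for k in range(samples):
                t = 2.0 * math.pi * k / samples
                cell = (int(round(x - r * math.cos(t))),
                        int(round(y - r * math.sin(t))), r)
                votes[cell] += 1
    return votes.most_common(1)[0][0]

# Step 1 stand-in: a binary edge map reduced to its edge-point coordinates,
# here points on a circle of radius 8 centered at (20, 20).
edges = [(int(round(20 + 8 * math.cos(2 * math.pi * k / 40))),
          int(round(20 + 8 * math.sin(2 * math.pi * k / 40))))
         for k in range(40)]
best = hough_circle(edges, radii=[6, 7, 8, 9, 10])
print(best)  # a cell near (20, 20, 8) collects the most votes
```

Thresholding happens in step 1 (building the edge map); the voting step then tolerates gaps and outliers that would derail a local gradient search.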


    Both approaches to localizing the iris have proven to be successful in the targeted application. The histogram-based approach to model fitting should avoid problems with local minima that the active contour model's gradient descent procedure might experience. However, by operating more directly with the image derivatives, the active contour approach avoids the inevitable thresholding involved in generating a binary edge map.

    Illustrative results of iris localization. Given an acquired image, it is necessary to separate the iris from the surroundings. Taking as input an iris image, automated processing delineates that portion which corresponds to the iris.

    Iris Image Localization


    The distinctive spatial characteristics of the human iris are displayed at a variety of scales.

    The Daugman approach makes use of a decomposition derived from application of a two-dimensional version of Gabor filters to the image data. Since the Daugman system converts to polar coordinates (r, \theta) during matching, it is convenient to give the filters in a corresponding form:

    H(r, \theta) = e^{-i\omega(\theta - \theta_0)} \, e^{-(r - r_0)^2 / \alpha^2} \, e^{-(\theta - \theta_0)^2 / \beta^2}

    where \alpha and \beta co-vary in inverse proportion to \omega to generate a set of quadrature pair frequency-selective filters, with center locations specified by (r_0, \theta_0). These filters are particularly notable for their ability to achieve good joint localization in the spatial and frequency domains.

    Iris Modeling - Methodology
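The polar filter form above can be written down directly. This is a standalone illustration: the parameter values are arbitrary, and the inverse coupling of alpha and beta to omega is left to the caller rather than built in.

```python
import cmath
import math

def gabor(r, theta, r0, theta0, omega, alpha, beta):
    """Complex 2D Gabor value at polar position (r, theta): a complex
    exponential carrier under a Gaussian envelope in r and theta."""
    return (cmath.exp(-1j * omega * (theta - theta0))
            * math.exp(-((r - r0) ** 2) / alpha ** 2)
            * math.exp(-((theta - theta0) ** 2) / beta ** 2))

# Quadrature pair: the real part acts as an even (cosine) filter and the
# imaginary part as an odd (sine) filter over the same Gaussian envelope.
v = gabor(r=0.5, theta=1.0, r0=0.5, theta0=1.0, omega=8.0, alpha=0.2, beta=0.4)
print(v)  # at the filter center the envelope and carrier are both 1 -> (1+0j)
```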


    The Wildes et al. approach makes use of an isotropic bandpass decomposition derived from application of Laplacian of Gaussian (LoG) filters to the image data. The LoG filters can be specified via the form

    \nabla^2 G = -\frac{1}{\pi\sigma^4} \left( 1 - \frac{\rho^2}{2\sigma^2} \right) e^{-\rho^2 / 2\sigma^2}

    with \sigma the standard deviation of the Gaussian and \rho the radial distance of a point from the filter's center. In practice, the filtered image is realized as a Laplacian pyramid.

    Iris Modeling - Methodology
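A one-dimensional sketch of how such a bandpass decomposition is realized as a Laplacian pyramid. The [1, 2, 1]/4 kernel, the nearest-neighbour upsampling and the toy signal are simplifying assumptions, chosen only to keep the example short:

```python
def blur_downsample(signal):
    """Low-pass with a [1, 2, 1]/4 kernel, then keep every other sample."""
    padded = [signal[0]] + signal + [signal[-1]]
    smooth = [(padded[i - 1] + 2 * padded[i] + padded[i + 1]) / 4.0
              for i in range(1, len(padded) - 1)]
    return smooth[::2]

def upsample(signal, n):
    """Nearest-neighbour expansion back to length n (crude but sufficient here)."""
    out = []
    for s in signal:
        out.extend([s, s])
    return out[:n]

def laplacian_pyramid(signal, levels):
    """Each level stores signal minus an upsampled blurred copy: a bandpass
    residue, analogous to LoG filtering at successively coarser scales."""
    pyramid = []
    for _ in range(levels):
        low = blur_downsample(signal)
        pyramid.append([a - b for a, b in zip(signal, upsample(low, len(signal)))])
        signal = low
    pyramid.append(signal)  # coarsest low-pass residue
    return pyramid

pyr = laplacian_pyramid([float(x % 8) for x in range(32)], levels=3)
print([len(level) for level in pyr])  # [32, 16, 8, 4]
```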


    By retaining only the sign of the Gabor filter output, the representational approach that is used by Daugman yields a remarkably parsimonious representation of an iris. Indeed, a representation with a size of 256 bytes can be accommodated on the magnetic stripe affixed to the back of standard credit/debit cards. In contrast, the Wildes et al. representation is derived directly from the filtered image, for a size on the order of the number of bytes in the iris region of the originally captured image.

    Iris Modeling - Methodology
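The 256-byte figure follows from sign-only quantization: each complex filter response contributes two bits (the signs of its real and imaginary parts), so 1024 responses give 2048 bits = 256 bytes. The responses below are synthetic stand-ins, not real filter outputs:

```python
import cmath
import math

def quantize(responses):
    """Map each complex response to two sign bits (its phase quadrant)."""
    bits = []
    for z in responses:
        bits.append(1 if z.real >= 0 else 0)
        bits.append(1 if z.imag >= 0 else 0)
    return bits

# 1024 made-up complex responses sweeping the unit circle.
responses = [cmath.exp(1j * (2 * math.pi * k / 1024)) for k in range(1024)]
code = quantize(responses)
print(len(code), len(code) // 8)  # 2048 bits -> 256 bytes
```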


    Iris matching can be understood as a three-stage process.

    The first stage is concerned with establishing a spatial correspondence between two iris signatures that are to be compared.

    Given correspondence, the second stage is concerned with quantifying the goodness of match between two iris signatures.

    The third stage is concerned with making a decision about whether or not two signatures derive from the same physical iris, based on the goodness of match.

    Iris matching


    Given the combination of required subject participation and the capabilities of sensor platforms currently in use, the key geometric degrees of freedom that must be compensated for in the underlying iris data are shift, scaling and rotation. Shift accounts for offsets of the eye in the plane parallel to the camera's sensor array. Scale accounts for offsets along the camera's optical axis. Rotation accounts for deviation in angular position about the optical axis. Another degree of freedom of potential interest is that of pupil dilation.

    Iris matching


    Daugman's system uses radial scaling to compensate for overall size as well as a simple model of pupil variation based on linear stretching. The scaling serves to map Cartesian image coordinates (x, y) to polar image coordinates (r, \theta) according to

    x(r, \theta) = (1 - r) \, x_p(\theta) + r \, x_l(\theta)
    y(r, \theta) = (1 - r) \, y_p(\theta) + r \, y_l(\theta)

    where r lies on [0, 1] and \theta is cyclic over [0, 2\pi], while (x_p(\theta), y_p(\theta)) and (x_l(\theta), y_l(\theta)) are the coordinates of the pupillary and limbic boundaries in the direction \theta. Rotation is compensated for by brute-force search: explicitly shifting an iris signature in \theta by various amounts during matching.

    Iris matching
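For the special case of concentric circular boundaries, the mapping above reads as follows. The center and the boundary radii are illustrative values:

```python
import math

def rubber_sheet_point(r, theta, center, r_pupil, r_limbus):
    """Map normalized (r, theta), r in [0, 1], to Cartesian image coordinates
    by linear interpolation between the pupillary and limbic boundaries."""
    cx, cy = center
    xp = cx + r_pupil * math.cos(theta)   # pupillary boundary point x_p(theta)
    yp = cy + r_pupil * math.sin(theta)
    xl = cx + r_limbus * math.cos(theta)  # limbic boundary point x_l(theta)
    yl = cy + r_limbus * math.sin(theta)
    return ((1 - r) * xp + r * xl, (1 - r) * yp + r * yl)

# r = 0 lands on the pupil boundary, r = 1 on the limbus.
p0 = rubber_sheet_point(0.0, 0.0, (100.0, 100.0), 20.0, 60.0)
p1 = rubber_sheet_point(1.0, 0.0, (100.0, 100.0), 20.0, 60.0)
print(p0, p1)  # (120.0, 100.0) (160.0, 100.0)
```

Because r is normalized between the two boundaries, the same (r, theta) grid samples an iris consistently regardless of pupil dilation, which is the point of the linear-stretching model.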


    The Wildes et al. approach uses an image registration technique to compensate for both scaling and rotation. This approach geometrically projects an image, I_a(x, y), into alignment with a comparison image, I_c(x, y), according to a mapping function (u(x, y), v(x, y)) such that, for all (x, y), the image intensity value at (x, y) - (u(x, y), v(x, y)) in I_a is close to that at (x, y) in I_c. More precisely, the mapping function (u, v) is taken to minimize

    \iint_{x,y} \left( I_c(x, y) - I_a(x - u, y - v) \right)^2 \, dx \, dy

    while being constrained to capture a similarity transformation of image coordinates (x, y) to (x', y'), i.e.

    \begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} x \\ y \end{pmatrix} - s R(\phi) \begin{pmatrix} x \\ y \end{pmatrix}

    with s a scaling factor and R(\phi) a matrix representing rotation by \phi.

    Iris matching


    An appropriate match metric can be based on direct pointwise comparisons between primitives in the corresponding signature representations. The Daugman approach quantifies this matter by computing the percentage of mismatched bits between a pair of iris representations, i.e. the normalized Hamming distance. Letting A and B be two iris signatures to be compared, this quantity can be calculated as

    HD = \frac{1}{2048} \sum_{j=1}^{2048} A_j \oplus B_j

    with subscript j indexing bit position and \oplus denoting the exclusive-OR operator. The Wildes et al. system employs a somewhat more elaborate procedure to quantify the goodness of match. The approach is based on normalized correlation between two signatures (i.e. pyramid representations) of interest.

    Iris matching
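A sketch of normalized Hamming distance matching, combined with the brute-force rotation search described earlier (circular shifts of one signature). The 2048-bit codes here are random stand-ins, not real iris codes:

```python
import random

def hamming(a, b):
    """Fraction of mismatched bits: the normalized Hamming distance."""
    return sum(x ^ y for x, y in zip(a, b)) / len(a)

def best_match(a, b, max_shift=8):
    """Minimum distance over circular shifts, compensating in-plane rotation."""
    return min(hamming(a, b[s:] + b[:s]) for s in range(-max_shift, max_shift + 1))

random.seed(0)
code = [random.randint(0, 1) for _ in range(2048)]
same = code[3:] + code[:3]                        # same iris, slightly rotated
other = [random.randint(0, 1) for _ in range(2048)]  # an unrelated iris

print(best_match(code, same))   # 0.0: the shift search recovers the rotation
print(best_match(code, other))  # near 0.5 for independent codes
```

Independent codes disagree on roughly half their bits, which is why the authentic/impostor separation point sits well below 0.5.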


    The final subtask of matching is to evaluate the goodness-of-match values to make a final judgement as to whether two signatures under consideration do (authentic) or do not (impostor) derive from the same physical iris. In the Daugman approach, this amounts to choosing a separation point in the space of (normalized) Hamming distances between iris signatures. Distances smaller than the separation point will be taken as indicative of authentics; those larger will be taken as indicative of impostors.

    In the Wildes et al. approach, the decision-making process must combine the four goodness-of-match measurements that are calculated by the previous stage of processing (i.e. one for each passband in the Laplacian pyramid representation that comprises a signature) into a single accept/reject judgement.

    Iris matching
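For the Wildes et al. case, the combination step might be sketched as a linear discriminant over the four band scores. The weights and threshold below are invented for illustration; the actual system derives its combination rule from training data:

```python
def decide(scores, weights, threshold):
    """Accept iff the weighted combination of per-band scores exceeds threshold."""
    combined = sum(w * s for w, s in zip(weights, scores))
    return combined >= threshold

weights = [0.25, 0.25, 0.25, 0.25]  # one weight per pyramid band (illustrative)
print(decide([0.9, 0.85, 0.8, 0.9], weights, threshold=0.6))  # True: accept
print(decide([0.2, 0.1, 0.3, 0.2], weights, threshold=0.6))   # False: reject
```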


    Further developments could be focused on yielding more compact systems that can be easily incorporated into consumer products where access control is desired (e.g. automobiles, personal computers, various handheld devices).

    Can iris recognition be performed at greater subject-to-sensor distances while remaining unobtrusive?

    How much subject motion can be tolerated during image capture?

    Can performance be made more robust to uncontrolled ambient illumination?

    Toward iris recognition at a distance. An interesting direction for future research in iris recognition is to relax constraints observed by extant systems. As a step in this direction, an iris image captured at 10 m subject-to-sensor distance is shown.

    Iris Modeling - Future Developments


    At a more operational level of performance analysis, studies of iris recognition systems need to be performed wherein details of acquisition are systematically manipulated, documented and reported. Parameters of interest include geometric and photometric aspects of the experimental stage, length of time monitored and temporal lag between template construction and recognition attempt. Similarly, details of captured irises and relevant personal accessories need to be properly documented in these same studies (e.g. eye color, eyewear).

    Iris Modeling - Future Developments


    Iris Synthesis

    Classical Biometrics - Recognition: Fingerprints, Faces, Irises

    Inverse Problem - Synthesis

    Testing Recognition Methods


    Iris Synthesis - Goals

    Synthesis Of Biometric Databases

    Iris Database Augmentation

    Testing Recognition Methods

    Minimal User Input


    Iris Synthesis - Previous Work

    Iris Recognition - [Wildes 94, Daugman 04]

    Biometric Synthesis - [Yanushkevich et al. 04]

    Iris Synthesis - [Lefohn et al. 03, Cui et al. 04]


    Iris Synthesis

    An Ocularist's Approach to Human Iris Synthesis [Lefohn et al. 03]

    An Iris Image Synthesis Method Based on PCA and Super-Resolution [Cui et al. 04]


    PCA Approach

    Uses 75-dimensional PCA feature vector

    Randomization

    Super-resolution

    Great statistical results, low realism


    Ocularist's Approach

    Uses 30-70 layers

    Great results

    Domain-specific knowledge

    An ocularist's approach to human iris synthesis. Lefohn et al. 2003. Used with permission.


    New Approach

    Use a real iris sample; use sets of similar irises

    Capture characteristics - Chaikin reverse subdivision

    Combine characteristics - multiple iris donors

    See the works of L. Wecker, F. Samavati and M. Gavrilova on the subject.
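The Chaikin forward and reverse steps can be sketched as follows. Caveat: the simple local reconstruction below, coarse_i = (3·f_{2i} − f_{2i+1})/2, is exact only when the fine data came from Chaikin subdivision; the cited multiresolution work uses least-squares reverse-subdivision filters that also yield useful detail coefficients on real iris data.

```python
def chaikin(points):
    """One Chaikin subdivision (corner-cutting) step on a closed polygon."""
    n = len(points)
    fine = []
    for i in range(n):
        a, b = points[i], points[(i + 1) % n]
        fine.append(0.75 * a + 0.25 * b)
        fine.append(0.25 * a + 0.75 * b)
    return fine

def reverse_chaikin(fine):
    """Recover coarse points; exact when fine was produced by chaikin()."""
    return [(3 * fine[2 * i] - fine[2 * i + 1]) / 2 for i in range(len(fine) // 2)]

coarse = [0.0, 4.0, 8.0, 2.0]
fine = chaikin(coarse)
print(reverse_chaikin(fine))  # [0.0, 4.0, 8.0, 2.0]

# Details = fine minus re-subdivided coarse; zero here because the fine
# data is exactly subdivided. On a real iris signal the details carry the
# per-level characteristics that get swapped between donor irises.
details = [f - g for f, g in zip(fine, chaikin(reverse_chaikin(fine)))]
print(details)
```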


    Comparison

    PCA:
    Global features
    Not as efficient
    Realism

    Reverse Subdivision:
    Global & local features
    Linear implementation
    Realistic results


    Organization


    Method

    First step: Isolate the iris.

    Polar Transform

    Iris Stretching


    Multiresolution

    Data has many resolutions

    Levels of resolution have different meanings

    Reverse Subdivision - Details


    Decomposition


    Method

    Capture Details - Reverse Subdivision

    Details - All Characteristics

    Courtesy of: Michal Dobes and Libor Machala, Iris Database, http://www.inf.upol.cz/iris/


    Combinations


    Classifications

    Frequency of Data

    Number of Concentric Rings

    Courtesy of: Michal Dobes and Libor Machala, Iris Database, http://www.inf.upol.cz/iris/


    Database Size


    Courtesy of: Michal Dobes and Libor Machala, Iris Database, http://www.inf.upol.cz/iris/

    Original Set - Input Irises


    Courtesy of: Michal Dobes and Libor Machala, Iris Database, http://www.inf.upol.cz/iris/

    Output Irises


    Combinations


    Output Irises

    Courtesy of: Michal Dobes and Libor Machala, Iris Database, http://www.inf.upol.cz/iris/


    Future Work

    Post-Processing

    Multiple samples of each iris

    Statistical verification