A non-circular iris localization algorithm using image projection function and gray level statistics

Farmanullah Jan*, Imran Usman, Shahrukh Agha
Department of Electrical Engineering, COMSATS Institute of Information Technology, Park Road, Chak Shahzad, 44000 Islamabad, Pakistan
*Corresponding author. E-mail address: [email protected] (F. Jan).

Optik - Int. J. Light Electron Opt. (2012), http://dx.doi.org/10.1016/j.ijleo.2012.09.018

Article history: Received 11 April 2012; Accepted 23 September 2012
Keywords: Iris localization; Histogram; Biometrics; Iris recognition; Active contours

Abstract

Iris recognition technology identifies an individual from the iris texture with great precision. A typical iris recognition system comprises eye image acquisition, iris segmentation, feature extraction, and matching. However, the system precision greatly depends on accurate iris localization in the segmentation module. In this paper, we propose a reliable iris localization algorithm. First, we locate a coarse eye location in an eye image using the integral projection function (IPF). Next, we localize the pupillary boundary in a sub-image using a reliable technique based on histogram bisection, image statistics, eccentricity, and object geometry. After that, we localize the limbic boundary using a robust scheme based on radial gradients and an error distance transform. Finally, we regularize the actual iris boundaries using active contours. The proposed algorithm is tested on public iris databases: MMU V1.0, CASIA-IrisV1, and CASIA-IrisV3-Lamp. Experimental results demonstrate the superiority of the proposed algorithm over some of the contemporary techniques.

© 2012 Elsevier GmbH. All rights reserved.

1. Introduction

In recent years, information technology has undergone enormous developmental phases that have matured both software and hardware platforms; cell phones, automatic teller machines, and Google Maps are a few examples. Despite these advances, the life and assets of an individual are not safe. Every day, we hear about criminal activities such as cybercrimes, bank frauds, hacking of passwords and personal identification numbers, and so on. Most often, such happenings occur because of loopholes in traditional security systems [1–3], which use knowledge- and token-based techniques (e.g., keys, identity cards, passwords) that can be shared and/or hacked. Due to these imperfections, traditional security systems are now being replaced by biometric technology [1–3].

Biometric technology uses physiological and/or physical traits to authenticate the identity of an individual [2,4,5]. These include the retina, iris, signature, ear, face, smell, palm, fingerprint, gait, and DNA, among others [2,4–7]. Traits such as fingerprint, signature, voice, and face have long been in use for human identification and verification. However, these traits may change with subject aging and can be duplicated artificially as well. On the other hand, the iris texture cannot be copied and/or changed by any artificial means, except by intentional surgery [4]. In the human eye, the iris is an annulus located between the pupil and sclera, protected by the cornea. In addition to its stable, unique, and non-invasive properties, it has a quite complex structure comprising ridges, corona, furrows, crypts, freckles, and arching ligaments [4,5]. The literature reveals that the iris structure remains stable over the entire life of a subject, except for some negligible changes in the early life stages [4,5].

Iris technology has great potential applications, for example, in citizen registration, border-crossing control, points of sale, health care, and travel and immigration agencies. A typical iris recognition system includes image acquisition, iris segmentation, feature extraction, and matching and recognition [2]. Among these modules, iris segmentation plays a vital role in the overall system accuracy, because all subsequent modules follow its results. It localizes the iris inner (pupillary) and outer (limbic) boundaries at the pupil and sclera, respectively, and detects and excludes any other noise, such as eyelids, eyelashes, and/or specular reflections [3,4]. This study focuses on accurate localization of the pupillary and limbic boundaries only (as in [3,8]). In [4], Daugman localized these boundaries with a circle approximation using an integro-differential operator (IDO). Similarly, Wildes [5] used a combination of the circular Hough transform (CHT) and a gradient edge map to localize the iris. Following that, numerous researchers proposed different segmentation techniques [2] based on these pioneering ideas [4,5]. However, iris localization techniques [2,3,8] that use the CHT and/or IDO consume relatively more time than methods based on histograms and thresholding [3].





Khan et al. [3] proposed an iterative strategy based on the histogram and eccentricity to localize the pupillary boundary in a binary image, which has some flaws; for example, they first convert the eye image to a binary image and then check eccentricity by considering the whole image as a binary object. However, if an eye image contains other low intensity regions (e.g., eyebrows, eyelashes, and hair), then the resultant binary image may contain multiple objects, and this method will fail. Similarly, Ibrahim et al. [8] used the standard deviations of the pixels' coordinates of a binary object to localize the pupil. However, they also did not propose any method to handle multiple objects. Thus, to resolve this issue, we propose an iris localization technique that combines an adaptive threshold, eccentricity, area, and binary object geometry. Moreover, the authors in [3,8] localized the limbic boundary with a circle approximation. They used radial gradients in the horizontal direction to estimate the radius of the iris circle. To some extent this technique is quite effective, but they fail to extract a precise center for the iris circle. To estimate the center coordinates, they compute the y-coordinate from the absolute distances of the left and right boundary points having maximum gradients. Next, they assign the x-coordinate of the pupil circle to the x-coordinate of the iris circle. Due to this assignment, the iris circle may be pushed up or down along the y-axis, which may cause a significant number of iris pixels to be left outside, and vice versa. Thus, to resolve this problem too, we propose an effective scheme that does not even need an iris center: first, we extract the radius for the circular limbic boundary using radial gradients in two secure regions. Next, we mark the circular limbic boundary with its center at the pupil center. Finally, we apply active contours to regularize the pupillary and limbic boundaries. This regularization process effectively compensates for any offset in the pupil and iris centers.

The rest of the paper is structured as follows. Section 2 details the different modules of the proposed technique, whereas Section 3 presents experimental results and discussion. Limitations of the proposed technique are explained in Section 4, and finally, this study is concluded in Section 5.

2. Proposed iris localization method

The proposed technique includes the following modules: pupillary boundary localization (PBL), limbic boundary localization (LBL), and iris boundaries regularization (IBR).

2.1. Pupillary boundary localization

As indicated in Fig. 2(a), the input eye image generally contains regions such as skin, sclera, iris, pupil, eyelids, eyelashes, specular reflections, and eyebrows. Among these regions, the pupil, eyelashes, eyebrows, and hair may have relatively similar gray level intensities, but they fortunately differ from each other in their respective geometry; for instance, eyelashes, eyebrows, and hair may have a large length compared to their width. On the other hand, the pupil region is compact; therefore, its aspect ratio (i.e., length to width ratio) would always be within a specific threshold. Based on these arguments, we propose a robust scheme to localize the pupillary boundary with a circle approximation using eccentricity, area, geometry, and the image gray level statistics. Fig. 1 shows the flow diagram of the pupil localization scheme, which includes four stages that are elaborated in the following text.

2.1.1. Stage-I

This stage consists of the following two steps:


1. The literature reveals that the illuminator of the image acquisition device causes specular reflections in iris images [4]. As specular reflections encumber most iris segmentation techniques [2], we suppress them prior to starting the iris localization process. To begin with, first complement the gray level input eye image Ig(x,y) as Ig(x, y) = 255 − Ig(x, y). Next, use a 4-connectivity procedure [9] to detect and invert the gray level values of holes in the complemented image; a region of dark pixels surrounded by lighter pixels is called a hole (see Fig. 2(b)). After that, complement the image again and pass the resultant image through a median filter (window size [12 12]) to get the preprocessed eye image Ip(x,y). Finally, stretch the contrast of Ip(x,y) to the full gray scale range 0–255 [3,9], see Fig. 2(d).

Fig. 1. Flow chart of module PBL.

2. Use the vertical (Mv(x)) and horizontal (Mh(y)) integral projection functions [1] to locate a seed pixel (Sx,Sy) in the iris/pupil region, where Sx and Sy represent the x- and y-coordinates of the seed pixel, respectively.

Mv(x) = (1/n) Σ_{i=1}^{n} I(x, y_i),  x = 1, 2, 3, …, m,  (1)

Mh(y) = (1/m) Σ_{i=1}^{m} I(x_i, y),  y = 1, 2, 3, …, n,  (2)

where m and n represent the number of rows and columns of Ip(x,y), respectively. After computing Mv and Mh, extract the (Sx,Sy) coordinates as Sx = [xo ∈ {x = 1, 2, 3, …, m} such that Mv(x) is maximum at x = xo] and Sy = [yo ∈ {y = 1, 2, 3, …, n} such that Mh(y) is maximum at y = yo].

Fig. 3(a) and (b) shows the vertical and horizontal integral projection functions, respectively, whereas Fig. 3(c) shows a window-image Iw(x,y) centered at (Sx,Sy); each of its sides is set to 60% of the width of Ip(x,y). From now on, we use Iw(x,y) as the input image for further processing unless stated otherwise.
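The paper's experiments used MATLAB; as an illustrative sketch only (function names are ours, not the authors'), the Stage-I seed-pixel search of Eqs. (1) and (2) can be written in Python as:

```python
# Illustrative sketch (not the authors' code) of the Stage-I seed-pixel
# search of Eqs. (1)-(2): average the image along rows and columns and
# take the coordinates where each projection peaks.

def integral_projections(img):
    """Vertical Mv (one value per row) and horizontal Mh (one per column)."""
    m, n = len(img), len(img[0])                                   # m rows, n columns
    Mv = [sum(img[x]) / n for x in range(m)]                       # Eq. (1)
    Mh = [sum(img[i][y] for i in range(m)) / m for y in range(n)]  # Eq. (2)
    return Mv, Mh

def seed_pixel(img):
    """Coarse eye location (Sx, Sy): indices of the projection maxima."""
    Mv, Mh = integral_projections(img)
    Sx = max(range(len(Mv)), key=Mv.__getitem__)
    Sy = max(range(len(Mh)), key=Mh.__getitem__)
    return Sx, Sy

img = [[0, 0, 0, 0],
       [0, 9, 9, 0],
       [0, 0, 0, 0]]
print(seed_pixel(img))   # brightest row/column cross at (1, 1)
```

The window-image Iw(x,y) would then be cropped around this seed pixel before the histogram analysis of Stage-II.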


Fig. 2. (a) Gray level input eye image Ig(x,y). (b) Complemented image; the black dot in the pupil region represents a hole. (c) Complemented image with holes suppressed. (d) Preprocessed image Ip(x,y).


Fig. 3. (a) Horizontal IPF Mh(y); its peak corresponds to Sy = 123. (b) Vertical IPF Mv(x); its peak corresponds to Sx = 192. (c) Window-image Iw(x,y) marked in the preprocessed eye image Ip(x,y).

Fig. 4. (a) Histogram h of window-image Iw(x,y). (b) Histogram h1 showing positions of v1 (=17) and τ (=18). (c) Histogram h2 showing position of v2 (=25).




2.1.2. Stage-II

This stage involves the following two steps:

1. Compute a lower gray level saturated limit (τ) of Ig(x,y) [10], which represents the bottom 1% of all the gray values in Ig(x,y). Next, compute the histogram (h) of Iw(x,y). It is evident from h (Fig. 4(a)) that gray levels in the human eye generally contribute to three significant regions: a lower region, e.g., pupil, eyelashes, and eyebrows; a middle region that usually contains the iris; and an upper region that indicates the sclera and possibly the skin. Moreover, for near infrared eye images, the pupil gray level intensity is always found within the lower gray level range (0–128). However, we experimentally observed that the actual pupil gray intensity is always in the neighborhood of τ. Following that argument, we neglect the upper gray level range while localizing the pupillary boundary.

2. Extract a lower (roi1) and an upper (roi2) gray level range as roi1 = (0 − τ) and roi2 = (τ + 1 − 2τ). Then, locate the gray level values v1 and v2 corresponding to the maximum frequencies as v1 = [g1 ∈ roi1 such that h1(g) is maximum at g = g1] and v2 = [g2 ∈ roi2 such that h2(g) is maximum at g = g2], with h1 = h(roi1) and h2 = h(roi2).
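The histogram bisection above can be sketched as follows; this is an illustrative Python version (ours, not the authors' MATLAB), where `tau` stands in for the lower 1% saturated gray level of the eye image:

```python
# Illustrative sketch of Stage-II: find the peak gray levels v1 in
# roi1 = [0, tau] and v2 in roi2 = [tau + 1, 2*tau].

def lower_saturated_limit(pixels, fraction=0.01):
    """Gray level bounding the darkest `fraction` of all pixels."""
    ordered = sorted(pixels)
    k = max(1, int(fraction * len(ordered)))
    return ordered[k - 1]

def histogram(pixels, levels=256):
    h = [0] * levels
    for p in pixels:
        h[p] += 1
    return h

def bisect_histogram(window_pixels, tau):
    """Peak gray levels in the two lower ranges of the histogram."""
    h = histogram(window_pixels)
    roi1 = range(0, tau + 1)
    roi2 = range(tau + 1, min(2 * tau + 1, 256))
    v1 = max(roi1, key=h.__getitem__)
    v2 = max(roi2, key=h.__getitem__)
    return v1, v2
```

Here v1 (and, on a retry, v2) seeds the adaptive threshold of Stage-III; the upper gray range is never examined.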

2.1.3. Stage-III

This stage comprises the following two steps:

1. Compute an adaptive threshold (T) as follows: if parameter K is zero, then T = v1; else T = v2. Here, parameter K is initially set to zero.

2. Convert Iw(x, y) to a binary image Ibin(x, y) as


Ibin(x, y) = 1 if (T − λ ≤ Iw(x, y) ≤ T + λ), and 0 otherwise,  (3)

where λ is empirically set to 7, which compensates for gray level variations in the pupil region. Now, search Ibin(x, y) for any non-zero pixel (i.e., one). However, if no white pixel exists, then increment K by one and repeat the whole process from step 1, provided K is less than 2. Otherwise, repeat the entire process from step 2 of Stage-II for Ip(x,y) instead of Iw(x,y); the reason is that, if Ip(x,y) contains other low intensity regions, as mentioned earlier, the IPF may wrongly localize the seed pixel. However, we experimentally observed that this happens rarely.

Next, use the 4-connectivity procedure [9] to detect and invert the gray level values of holes in Ibin(x, y). Following that, perform a morphological open [11] operation on the resultant image, using a disk with radius three as the structuring element. This operation removes any spurious pixels and isolates any loosely connected binary objects as well. Fig. 5(a) and (b) shows a single and multiple objects (two objects in this case), respectively.
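The threshold selection and binarization of Stage-III can be sketched as below (an illustrative Python version, not the authors' code; `T` is the adaptive threshold, `lam` the empirical margin of 7, and `K` the retry flag):

```python
# Illustrative sketch of Stage-III: pick the adaptive threshold from
# the histogram peaks and binarize the window image per Eq. (3).

def adaptive_threshold(v1, v2, K):
    """T = v1 on the first pass (K == 0), v2 on the retry."""
    return v1 if K == 0 else v2

def binarize(window, T, lam=7):
    """Eq. (3): white where the gray level lies within T - lam .. T + lam."""
    return [[1 if (T - lam) <= p <= (T + lam) else 0 for p in row]
            for row in window]

def any_white(binary):
    """Stage-III check: does the binary image contain any non-zero pixel?"""
    return any(any(row) for row in binary)
```

If `any_white` fails, the text above increments K and retries with v2; hole filling and the morphological opening would follow on the binary image before Stage-IV.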

2.1.4. Stage-IV

As shown in Fig. 5(b), Ibin(x, y) may contain multiple objects; the low intensity regions mentioned earlier may result in these objects. In this case, localizing the pupillary boundary with only the eccentricity [3] or standard deviation [8] of the object pixels' coordinates may not be feasible. Thus, to resolve this issue, we supplement eccentricity with the area and geometry of the binary object to find an accurate pupil region with a circle approximation (xp,yp,rp), where (xp,yp) and rp represent the center and radius of the pupil circle, respectively. This stage involves the following two steps:

1. Use the 4-connectivity procedure to detect the jth object in Ibin(x, y), where parameter j is initially set to one. Next, compute its area a(j), eccentricity e(j), and boundary coordinates (X,Y). After that, extract the minimum (x1,y1) and maximum (x2,y2) values of the xy-coordinates as (x1, y1) = (min(X), min(Y)) and (x2, y2) = (max(X), max(Y)). Then, estimate the coarse length (lc) and width (wc) as (lc, wc) = (max(x2 − x1, y2 − y1), min(x2 − x1, y2 − y1)).

Now, if (wc > 0.6lc) is found false, then increment j by one and repeat the current step until all other objects in Ibin(x, y) are examined. Otherwise, compute the coarse center (xc,yc) and radius (rc) as

(xc, yc, rc) = ((x1 + x2)/2, (y1 + y2)/2, (x2 − x1)/4 + (y2 − y1)/4).

Also, register the center and radius parameters into a pupil circle vector (C) as C = [xc yc rc].


2. Compute the following conditions:


Fig. 5. (a) Binary image Ibin(x, y) showing marked pupil object (eye image is taken from MMU V1.0 [12]). (b) Binary image Ibin(x, y) showing correct pupil object (eye image is taken from CASIA-IrisV1 [13]). Similarly, (c) and (d) pupillary boundary marked with circle approximation for cases in (a) and (b), respectively.

f0 = ((e(j − 1) + 0.2) ≤ e(j) & e(j) < 0.5),
f1 = (e(j) ≤ (e(j − 1) − 0.2)),
f2 = (a(j) > a(j − 1)).

Next, perform the following test:

if ((f0 == 1 | f1 == 1) & (f2 == 1)) then
    (e(j − 1), a(j − 1), C(j − 1)) = (e(j), a(j), C(j))
else
    C(j) = C(j − 1)
end

Parameters e(j − 1) and a(j − 1) hold the previous values of eccentricity and area, respectively. Similarly, C(j − 1) holds the previous value of C. Repeat the entire process from step 1 until all the other objects are examined. However, if no correct pupil location is found, then increment K by one and repeat the entire process from step 1 of Stage-III, provided K is less than 2. Otherwise, repeat the entire process from step 1 of Stage-II for the image Ip(x,y) as the input image instead of Iw(x,y). Finally, obtain the pupil center and radius as (xp, yp, rp) = (Cj(1), Cj(2), Cj(3)).

Fig. 5(c) and (d) shows the pupillary boundary localized with (xp,yp,rp) in two different eye images.
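The Stage-IV geometry screen of step 1 can be sketched as follows (an illustrative Python fragment, ours rather than the authors'; the eccentricity/area bookkeeping of step 2 is omitted):

```python
# Illustrative sketch of the Stage-IV geometry screen: reject elongated
# binary objects (eyelash, brow, hair) via the coarse width/length test
# and derive the coarse pupil circle from the object's bounding box.

def coarse_pupil_circle(boundary_xy):
    """Return (xc, yc, rc) for a compact object, or None if elongated."""
    xs = [x for x, _ in boundary_xy]
    ys = [y for _, y in boundary_xy]
    x1, y1, x2, y2 = min(xs), min(ys), max(xs), max(ys)
    lc = max(x2 - x1, y2 - y1)          # coarse length
    wc = min(x2 - x1, y2 - y1)          # coarse width
    if not wc > 0.6 * lc:               # aspect-ratio screen from step 1
        return None
    xc = (x1 + x2) / 2
    yc = (y1 + y2) / 2
    rc = (x2 - x1) / 4 + (y2 - y1) / 4  # radius estimate from the text
    return xc, yc, rc
```

A roughly square object boundary passes the screen, while a long thin one (typical of an eyelash or eyebrow) is rejected and the next object is examined.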

2.2. Limbic boundary localization

This stage is usually considered more difficult than pupillary boundary localization, for the following reasons: first, the contrast between the iris and sclera regions is usually low. Second, eyelids and/or eyelashes may partially occlude the limbic boundary. Lastly, as the pupil always lies within the iris region, the pupil and limbic boundaries could be assumed to be two non-concentric circles; however, this assumption may not always be true [2]. Thus, to resolve these issues, we propose an effective strategy for limbic boundary localization (see Fig. 6). It includes

Fig. 6. Flow chart of module LBL (eye image with pupillary boundary marked → 1D gray level profiles in two secure regions → radial-gradient boundary points → error distance transform filtering → average distances of true points → radius of the circular limbic boundary → pupillary and limbic boundaries marked with circle approximation).


extracting 1D gray level profiles in two secure regions, i.e., S1 = (0–40°) and S2 = (140–180°) (see Fig. 7); then computing their gradients; followed by excluding wrong boundary points using an error distance transform; and finally, extracting the radius for the circular limbic boundary using the true boundary points. The following steps elaborate this process in more detail:

1. Initialize two vectors, a radius (ρ) and an angle (θ), as ρ = {1.5rp, 1.5rp + 1, 1.5rp + 2, …, 3.5rp} and θ = {0:π/180:π}, where rp is the pupil circle radius and θ is in radians. Next, extract the 1D gray level profiles (ψ1) in sector S1 as ψ1(q,t) = Ip(x(t), y(t)), where q = 1, 2, 3, …, m (here, m represents the size of S1), t = 1, 2, 3, …, n (here, n represents the size of ρ), x(t) = xp + ρ(t) cos(θ(q)), and y(t) = yp + ρ(t) sin(θ(q)).

Fig. 7 shows sectors S1 and S2, where the white lines show the radial segments, located at different discrete angles, along which the 1D gray level profiles are computed.

2. Use a first order difference equation [9] to compute an absolute radial gradient (γ1) as

γ1(t) = |ψ1(q, t + 1) − ψ1(q, t)|,  for q = 1, 2, 3, …, m − 1 and t = 1, 2, 3, …, n − 1.  (4)

Fig. 8(a) and (b) shows ψ1 (in S1 at angle θ = 0) and its corresponding γ1, respectively.


3. Compute a matrix of radial distances (D1) of points (along the limbic boundary) having the maximum radial gradient in S1 with respect to (xp,yp) as

Fig. 7. Ip(x,y) showing S1 and S2; each sector is 1.5rp pixels away from (xp,yp).


Fig. 8. (a) 1D gray level profile ψ1, taken at angle θ = 0 in S1. (b) Absolute radial gradient γ1 of ψ1 in (a).

D1 = [ρ(to) such that γ1(t) is maximum at t = to],  for q = 1, 2, 3, …, m and t = 1, 2, 3, …, n.  (5)

Fig. 9(a) illustrates the radial distances of the boundary points in D1. It is evident that most of these distances are close to each other, except a few that belong to wrong boundary points (Fig. 9(b)).

4. Compute an error distance transform (D̃1) by taking the absolute difference of each distance with all other entries in D1 as

D̃1(i, j) = |D1(i) − D1(j)|  for i, j = 1, 2, 3, …, n2,  (6)

It is evident that the majority of the error distances in D̃1 (see Fig. 9(c)) are below 3, whereas the error distances of the wrong boundary points are close to 10.

5. Discard a point in D1 that has 50% of its error distances (with other points in D1) greater than 5; a true boundary point has small error distances with the rest of the boundary points. D1 then contains only true limbic boundary points. Similarly, compute a distance matrix D2 for sector S2 as well. Finally, compute the radius (ri) of the circular limbic boundary as

ri = 0.5 ((1/K1) Σ_{i=1}^{K1} D1(i) + (1/K2) Σ_{j=1}^{K2} D2(j)),  (7)

where D1 and D2 contain true boundary points only, and K1 and K2 represent the number of points in D1 and D2, respectively. Fig. 10 shows the limbic boundary marked with circle approximation (xi,yi,ri), which has radius ri and (xp,yp) as its center (xi,yi).
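The error-distance filter and the radius estimate of Eq. (7) can be sketched as follows; this is an illustrative Python version (ours), operating on the lists of candidate radial distances from the two sectors:

```python
# Illustrative sketch of the LBL error-distance filter: drop a candidate
# boundary distance when at least half of its absolute differences with
# the others exceed 5 (Eq. (6) and step 5), then average the surviving
# distances of the two sectors (Eq. (7)).

def keep_true_distances(dists, tol=5):
    """Error distance transform followed by the 50% screen."""
    kept = []
    for i, di in enumerate(dists):
        errs = [abs(di - dj) for j, dj in enumerate(dists) if j != i]
        if sum(e > tol for e in errs) < 0.5 * len(errs):
            kept.append(di)
    return kept

def limbic_radius(d1, d2):
    """Eq. (7): half the sum of the mean true distances of S1 and S2."""
    k1 = keep_true_distances(d1)
    k2 = keep_true_distances(d2)
    return 0.5 * (sum(k1) / len(k1) + sum(k2) / len(k2))
```

An outlier such as a gradient peak on an eyelash (e.g., a distance of 80 among values near 50) is far from every other candidate and is discarded before averaging.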

2.3. Iris boundaries regularization

In the previous sections, we localized the pupillary boundary with a pupil circle (xp,yp,rp) and the limbic boundary with an iris circle (xi,yi,ri), as done in [3–5,8]. However, in reality these boundaries are not circular [14]; therefore, they should be localized with flexible curves, such as the active contours model [2,14]. Herein, we

Fig. 9. (a) Distances of boundary points having maximum gradient along the limbic boundary in D1. (b) Ip(x,y) showing the true and wrong points. Similarly, (c) error distance transform D̃1; values in the neighborhood of 10 are clues to the wrong boundary points. Eye image is taken from MMU V1.0 [12].

Fig. 10. Limbic boundary marked with circle approximation (xi ,yi ,ri).

propose a simple but effective scheme to regularize these boundaries locally. To demonstrate the proposed scheme, we proceed with the pupillary boundary as follows:

1. To begin with, consider a circular band in Ip(x,y); it is centered at the pupil center (xp,yp) and has its inner and outer circular boundaries located at (rp − ϑ) and (rp + ϑ), respectively. Here, ϑ is empirically set to 8 (see Fig. 11(a)).

2. Next, use a similar approach, as adopted in Section 2.2, to extract N regularly spaced points around the pupillary boundary in the stated band, where N is set to the perimeter of the pupil circle, i.e., N = 2πrp. Let rθ represent the angular radial distances of these points with respect to (xp,yp), for θ = {0:π/180:2π}. Filter rθ with a median filter having window size (3 × 3) to calm down any rapid transitions in the radial distances. Fig. 11(b) shows the pupillary boundary marked with rθ after the filtering process.

3. Filter rθ by the Fourier series [14] as follows. First, use the following expression to get M discrete Fourier series coefficients Cw (with M < N). For optimal results, M is set to 20 for the pupillary boundary and 15 for the limbic boundary.

Cw = Σ_{z=0}^{N−1} rθ(z) exp(−2πiwz/N),  w = 0, 1, 2, …, M − 1  (8)

Next, plug these coefficients Cw, w = 0, 1, 2, …, M − 1, into the following expression to get a smooth and closed boundary, whose resolution is controlled by M.

rθ(s) = (1/N) Σ_{q=0}^{M−1} Cq exp(2πiqs/N),  s = 0, 1, 2, …, N − 1  (9)

Lastly, use the following parametric equations to get the xy-coordinates of the pupillary boundary points:

X(s) = xp + rθ(s) cos(θs) and Y(s) = yp + rθ(s) sin(θs), for s = 0, 1, 2, …, N − 1.
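The Fourier filtering of Eqs. (8) and (9) can be sketched as below (an illustrative Python version, ours rather than the authors' MATLAB): keeping only the M lowest-order coefficients of the radial signature suppresses rapid transitions while preserving the overall boundary shape.

```python
import cmath

# Illustrative sketch of Eqs. (8)-(9): truncate the discrete Fourier
# series of the radial signature r_theta to its M lowest-order
# coefficients and reconstruct a smooth closed boundary.

def fourier_smooth(r, M):
    N = len(r)
    # Eq. (8): first M Fourier series coefficients of the signature.
    C = [sum(r[z] * cmath.exp(-2j * cmath.pi * w * z / N) for z in range(N))
         for w in range(M)]
    # Eq. (9): reconstruction whose resolution is controlled by M.
    return [sum(C[q] * cmath.exp(2j * cmath.pi * q * s / N)
                for q in range(M)).real / N
            for s in range(N)]
```

A constant signature passes through unchanged, while high-frequency radial jitter is attenuated by the truncation to M terms.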



Fig. 11. (a) Preprocessed eye image Ip(x,y) showing the outer and inner circular bands. (b) Pupillary boundary marked with rθ after the median filtering process. (c) Pupillary and the limbic boundaries after the regularization process. Eye image is taken from MMU V1.0 [12].

Fig. 11(c) shows the pupillary and limbic boundaries regularized with the proposed technique. It is obvious that the iris contours have been accurately regularized in the non-occluded iris parts. Similarly, in regions where the iris is partially occluded by the upper and/or lower eyelids, the circular approximation is retained. However, this does not affect the system performance, because these occluded parts are removed before entering the other modules of an iris recognition system.

3. Experimental results and discussion

We tested the proposed iris localization algorithm in MATLAB version 7.1, installed on a desktop PC with a 2.33 GHz CPU and 1.5 GB RAM, on a set of public iris databases: MMU V1.0 [12], CASIA-IrisV1 [13], and CASIA-IrisV3-Lamp [13]. For accuracy results, we used the accuracy rate (η) as proposed in [8,15], which is defined as

η = (c1/ct) × 100,  (10)

where c1 represents the number of correctly localized irises, and ct is the total number of eye images used in experimentation. Herein, η depends on the accuracy error (δerr), which is defined as

δerr = (|u1 − u2| / u1) × 100,  (11)

Fig. 12. (a–c) Some randomly selected accurate iris localization results for CASIA-IrisV1 [13], CASIA-IrisV3-Lamp [13], and MMU V1.0 [12], respectively.


where u1 and u2 represent the numbers of actual and detected iris pixels, respectively. Herein, the detected iris pixels are counted by a row-major scanning technique. A localized iris is considered accurate if δerr is less than 10%, and false otherwise. The following sections explain the experimental details.
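As a concrete illustration, Eqs. (10) and (11) and the 10% acceptance rule can be computed as follows (a minimal sketch; in practice u1 and u2 come from the ground-truth and detected iris masks):

```python
def accuracy_rate(c1, ct):
    """Eq. (10): percentage of correctly localized irises."""
    return (c1 / ct) * 100

def accuracy_error(u1, u2):
    """Eq. (11): relative difference between actual (u1) and
    detected (u2) iris pixel counts, as a percentage."""
    return (abs(u1 - u2) / u1) * 100

def is_accurate(u1, u2, tol=10.0):
    """A localization is accepted when the accuracy error is below 10%."""
    return accuracy_error(u1, u2) < tol

# Example: 9,500 detected vs. 10,000 actual iris pixels -> 5% error.
err = accuracy_error(10000, 9500)
```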

3.1. Iris localization results

As mentioned earlier, we tested the proposed algorithm on three public iris databases. CASIA-IrisV1 includes 756 images collected from 108 subjects, where each image is stored in BMP format with a resolution of 320 × 280 pixels. CASIA-IrisV3-Lamp includes 16,212 eye images acquired from 411 subjects; each image is stored in JPEG format with a resolution of 640 × 480 pixels. Similarly, MMU V1.0 contains 450 iris images collected from 45 individuals; each eye image is stored in BMP format with a resolution of 320 × 240 pixels.

Collectively, this set of databases offers noise such as specular reflections, non-uniform illumination, low contrast, off-axis eye images, and occlusions by hair, eyebrows, eyelashes, eyeglasses, and cosmetic lenses. For experimentation, we used all the images in CASIA-IrisV1 and MMU V1.0, whereas only 4080 eye images of the first 102 subjects from CASIA-IrisV3-Lamp were involved. Table 1 shows accuracy comparison results of the proposed iris localization technique on these databases against some contemporary techniques. Similarly, Fig. 12 demonstrates



Table 1
Comparison with other methods for CASIA-IrisV1, CASIA-IrisV3-Lamp, and MMU V1.0 (results for comparison are taken from the published work).

Method               Accuracy (%)
                     CASIA-IrisV1   CASIA-IrisV3-Lamp   MMU V1.0
Khan et al. [3]      100.00         –                   98.22
Ibrahim et al. [8]   99.90          98.28               99.77
Basit et al. [16]    99.60          –                   98.10
Masek [17]           –              79.02a              83.92a
Koh et al. [18]      –              99.00               –
Daugman [4]          –              96.00b              –
Proposed             100.0          99.2                99.55



a Results are taken from [8].
b Results are taken from [18].

some randomly selected irises that were accurately localized by the proposed method.

3.2. Temporal analysis

Computational cost is considered a critical parameter in iris recognition systems because of real-time constraints. Therefore, we analyzed the computational cost of the proposed technique using the MATLAB built-in 'profile' [19] facility, tuned in the real timer mode. It helps in debugging and optimizing MATLAB files, reporting the number of calls, parent functions, child functions, code line hit counts, code line execution times, and so on. We arbitrarily chose 500 iris images from each iris database and then estimated the average time per eye image. It takes 0.75, 0.85, and 0.06 s for CASIA-IrisV1, CASIA-IrisV3-Lamp, and MMU V1.0, respectively.
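As a language-neutral analogue of this measurement (the paper uses MATLAB's profiler; `localize` below is a hypothetical stand-in for the actual localization routine):

```python
import time

def average_time_per_image(images, localize):
    """Average wall-clock seconds spent localizing one iris,
    measured over a batch of eye images."""
    start = time.perf_counter()
    for img in images:
        localize(img)
    return (time.perf_counter() - start) / len(images)

# Stand-in workload purely for illustration:
avg = average_time_per_image(range(100), lambda img: sum(range(1000)))
```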

4. Limitations of the proposed method

As the proposed algorithm is based on the histogram and image gray level statistics, we do not recommend it for iris databases acquired in the visible light wavelength. However, it can be used in the visible light band with some improvements in the pupil localization scheme; moreover, the limbic boundary localization procedure is straightforwardly applicable to the visible band. In addition, the proposed technique does not successfully localize a pupil that is severely occluded by a thick strip of eyelashes having approximately the same gray level intensity.

5. Conclusion

In conclusion, this study proposes a robust iris localization technique. For fast processing, first, a window image containing the iris region is extracted using IPF. Next, an adaptive threshold is computed using histogram-bisection of the window image and the image's statistics. After that, the window image is converted to a binary image using the adaptive threshold. Following that, the eccentricity, area, and geometrical properties of the binary objects in the binary image are used to localize the circular pupillary boundary. Moreover, the circular limbic boundary is localized using radial gradients and an error distance transform in two secure regions. Finally, active contours are applied to obtain the non-circular iris inner and outer boundaries.
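The binarize-and-filter step summarized here can be illustrated in miniature (a simplified sketch, not the paper's implementation: the adaptive threshold is supplied directly, and blob area alone stands in for the full eccentricity/area/geometry test):

```python
from collections import deque

def binarize(img, thresh):
    """Binarize a grayscale image: pupil pixels are darker than the threshold."""
    return [[1 if px < thresh else 0 for px in row] for row in img]

def largest_component(mask):
    """Return the pixel set of the largest 4-connected foreground blob.
    (The paper additionally filters blobs by eccentricity and geometry;
    area alone is used here to keep the sketch short.)"""
    h, w = len(mask), len(mask[0])
    seen, best = set(), set()
    for y in range(h):
        for x in range(w):
            if mask[y][x] and (y, x) not in seen:
                blob, queue = set(), deque([(y, x)])
                seen.add((y, x))
                while queue:
                    cy, cx = queue.popleft()
                    blob.add((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            queue.append((ny, nx))
                if len(blob) > len(best):
                    best = blob
    return best

# Toy 5x5 "eye": a dark 2x2 pupil blob plus one dark stray pixel.
img = [[200] * 5 for _ in range(5)]
img[1][1] = img[1][2] = img[2][1] = img[2][2] = 30
img[4][4] = 20
pupil = largest_component(binarize(img, 60))
```

The stray dark pixel survives binarization but is rejected by the component filter, mirroring how object properties prune non-pupil blobs.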

As, on average, it takes less than a second to localize an iris in an eye image, it can safely be applied in real-time systems. Besides, it is tolerant to off-axis eye images, specular reflections, non-uniform illumination, and occlusions by contact lenses, eyelashes, eyelids, and hair. The proposed algorithm is tested on the public databases CASIA-IrisV1, CASIA-IrisV3-Lamp, and MMU V1.0. Experimental results and accuracy comparisons with other state-of-the-art techniques show satisfactory performance of the proposed algorithm.

Acknowledgments

COMSATS Institute of Information Technology, Islamabad campus, Pakistan, supports this work through its in-house PhD program. Thanks to Multimedia University, Malaysia, and the Chinese Academy of Sciences for providing free access to their respective databases.

References

[1] A. Basit, M.Y. Javed, Localization of iris in gray scale images using intensity gradient, Opt. Lasers Eng. 45 (2007) 1107–1114.

[2] K.W. Bowyer, K. Hollingsworth, P.J. Flynn, Image understanding for iris biometrics: a survey, Comput. Vis. Image Und. 110 (2008) 281–307.

[3] T.M. Khan, M. Aurangzeb Khan, S.A. Malik, S.A. Khan, T. Bashir, A.H. Dar, Automatic localization of pupil using eccentricity and iris using gradient based method, Opt. Lasers Eng. 49 (2011) 177–187.

[4] J.G. Daugman, High confidence visual recognition of persons by a test of statistical independence, IEEE Trans. Pattern Anal. Machine Intell. 15 (1993) 1148–1161.

[5] R.P. Wildes, Iris recognition: an emerging biometric technology, Proc. IEEE 85 (1997) 1348–1363.

[6] M. Dobes, L. Machala, P. Tichavsky, J. Pospisil, Human eye iris recognition using the mutual information, Optik 115 (2004) 399–404.

[7] M. Dobes, J. Martinek, D. Skoupil, Z. Dobesova, J. Pospisil, Human eye localization using the modified Hough transform, Optik 117 (2006) 468–473.

[8] M.T. Ibrahim, T.M. Khan, S.A. Khan, M.A. Khan, L. Guan, Iris localization using local histogram and other image statistics, Opt. Lasers Eng. 50 (2012) 645–654.

[9] R.C. Gonzalez, R.E. Woods, Digital Image Processing, 2nd ed., Prentice Hall, Upper Saddle River, NJ, 2001.

[10] Stretchlim: http://www.mathworks.com/help/toolbox/images/ref/stretchlim.html (accessed 10.04.12).

[11] Morphological operation: http://www.mathworks.com/help/toolbox/images/ref/imopen.html (accessed 10.04.12).

[12] MMU iris database: http://pesona.mmu.edu.my/∼ccteo/ (accessed 10.04.12).

[13] CASIA iris database: http://www.idealtest.org/findTotalDbByMode.do?mode=Iris (accessed 10.04.12).

[14] J. Daugman, New methods in iris recognition, IEEE Trans. Syst. Man Cybern. B 37 (2007) 1167–1175.

[15] S. Dey, D. Samanta, A novel approach to iris localization for iris biometric processing, Int. J. Biol. Biomed. Med. Sci. 3 (2008) 180–191.

[16] A. Basit, Iris localization using graylevel texture analysis and recognition using bit planes, Ph.D. Thesis, Department of Computer Engineering, College of Electrical and Mechanical Engineering, National University of Sciences and Technology, Rawalpindi, Pakistan, 2009.

[17] L. Masek, Recognition of Human Iris Patterns for Biometric Identification, Thesis, School of Computer Science and Software Engineering, The University of Western Australia, 2003.

[18] J. Koh, V. Govindaraju, V. Chaudhary, A robust iris localization method using an active contour model and Hough transform, in: 20th Int. Conf. Pattern Recognition (ICPR), Istanbul, Turkey, August 23–26, 2010, pp. 2852–2856.

[19] Mathworks website: http://www.mathworks.com/ (accessed 10.04.12).