
 

Fingerprint Attendance System

Using MATLAB Simulator

Danish Refai, Siddharth Upadhyay, Anurag Choudhary

B.Tech – EXTC – Trim IX
MPSTME, NMIMS Shirpur Campus


Certificate

This is to certify that Mr. Danish Refai (Gr. No.: C07009153), Mr. Siddharth Upadhyay (Gr. No.: C07009085), and Mr. Anurag Choudhary (Gr. No.: C07009819) have undertaken the minor project titled

Fingerprint Attendance System

 

The students have successfully completed the project report for the partial fulfilment of the B.Tech (EXTC) as per the norms prescribed by NMIMS University during the ninth trimester of the academic year 2009-10. The project report has been assessed and found to be satisfactory.

We wish them great success.

_______________________

Prof.  Shashikant  Patil

   (Project  Guide)

_______________________               _________________________

Prof. B. Shailendra                  Prof. R. R. Sedamkar

           (H.O.D  EXTC)                      (Associate  Dean)


Acknowledgments

This project could not have been accomplished without the splendid support and guidance of Prof. Shashikant Patil. His tremendous efforts and profound inputs have raised the class and standard of this report.

We would also like to acknowledge the invaluable assistance of IEEE Xplore, which provided the necessary documents and technical papers.

We would also like to thank Prof. Shashikant Patil for the collaboration on the IEEE internet archives, which made our research much more insightful.

We also credit the MPSTME Library for providing us with excellent resources for this project.

Last but not least, we would like to thank our Head of Department, Prof. B. Shailendra, for being a great source of inspiration for this project.


Table of Contents

Introduction
Introduction to Fingerprints
Physiology of Fingers
Extraction Process
Pore Extraction Technique
Scanning Resolution
Feature or Search Area
Finger Plasticity
Important Terms
MATLAB Simulation
Requirements
Preparations
Code
Functions
Stepwise Code Explanation
Load image
Enhancement
Binarize
Thinning
Minutiae
Termination
Bifurcation
Remarks
Process 1
Process 2
Process 3
ROI
Show ROI on Print
Suppress extrema minutiae
Orientation
Termination Orientation
Bifurcation Orientation
Save in a text file
Results
Storage
Screenshot of the Saved File
For Attendance System
Future Prospects
Appendix
Supplement Files
Bibliography


Introduction:

Modern digital technology has made it possible to manipulate multi-dimensional signals with systems that range from simple digital circuits to advanced parallel computers. The goal of this manipulation can be divided into three categories:

• Image Processing: image in -> image out
• Image Analysis: image in -> measurements out
• Image Understanding: image in -> high-level description out

In this project, we focus on the fundamental concepts of image processing, especially as applied to fingerprints. Space does not permit us to make more than a few introductory remarks about image analysis.

An image defined in the "real world" is considered to be a function of two real variables, for example, a(x,y) with a as the amplitude (e.g. brightness) of the image at the real coordinate position (x,y).

An image may be considered to contain sub-images sometimes referred to as regions-of-interest, ROIs, or simply regions. This concept reflects the fact that images frequently contain collections of objects each of which can be the basis for a region. In a sophisticated image processing system it should be possible to apply specific image processing operations to selected regions.

Introduction to Fingerprints

! Fingerprints have been used to secure commercial transactions since the days of ancient Babylon, where fingerprints have been found among the ruins on clay seals attached to business documents. Each fingerprint contains global features, which can be seen with the naked eye, and local features, also called minutia points, the tiny, unique characteristics of fingerprint ridges. Ridge patterns can be loops, arches, or whorls; minutia types associated with a ridge pattern include ridge endings, bifurcations, divergences (ridges so small that they appear as dots or islands), and enclosures (ridges that bifurcate and reunite around a ridgeless area).

(Embedded reference page: a column on biometric authentication by Alfred C. Weaver, IEEE Computer, February 2006, covering fingerprint and iris scanning.)


While two or more fingerprints can have the same global features, no known pair, at least since the first criminal fingerprint identification was made in 1892, have the same minutia. Fingerprint scanners detect ridge patterns and minutia and then characterize the minutia based upon orientation (the direction the minutia are facing), spatial frequency (how far apart the ridges are around a particular mark), curvature (rate of orientation change), and position (X,Y location relative to some fixed point). There are about 60 to 70 minutia points on each finger, and even identical twins have different minutia points.

These data describing the minutia provide the essential components of the template computed from the enrollment and bid samples. Whereas police fingerprinting stores the entire image, fingerprint scanning systems store only the template. An original image cannot be constructed from its data template alone.

Physiology of Fingers

(Fig. Physiology of the Skin)

The uniqueness of a configuration of pores depends on several factors, such as the number of pores involved, their respective shapes and sizes, the locations of these pores with respect to each other, and so on. These factors all are a function of morphology.

Pores are formed where sweat glands in the subcutaneous layer of the skin generate sweat ducts. These sweat ducts grow through the subcutaneous layer and dermis to the epidermis, where the open duct on the skinʼs surface presents itself as a pore. According to a 1973 study on skin-ridge formation, sweat glands begin to form in the fifth month of gestation. The sweat-gland ducts reach the surface of the epidermis in the sixth month, forming pores.


The epidermal ridges are not formed until after the sixth month; then, the pattern that has been forming in the glandular fold region is transferred to the epidermis. Hirsch and Schweichel concluded that several forces affect the epidermal pattern formation. One of these forces is the stabilization that occurs “when sweat gland secretion ducts open on to the surface, at regular intervals, in the papillary ridges.” These openings of the ducts on the surface are the pores, and the regularity of their appearance plays a significant part in the uniqueness of pore configurations. Once these pores form on the ridge, they are fixed at that location. Considerable research has shown that pores do not disappear, move, or spontaneously generate over time.

Extraction Process

Pore Extraction Technique

The method used to extract the pores as fingerprint features is critical to the matching routine. The pore's position, size, and shape are features that make it distinct from other objects in an image. Techniques used for the fingerprint data capture can be used to enhance the pore information. For example, high-resolution scanning and manipulation of the gain and contrast camera controls can highlight the pores. The position of the pore is determined by processing the gray-scale fingerprint image and transforming it to a skeleton representation. By applying models and processing routines to the skeleton of the fingerprint image, the pore locations can be extracted. Pores are transformed into isolated and connected short lines in the skeleton image. Given this information, the size of the pore can be determined by region-growing routines operating on a binary version of the fingerprint image.

Scanning Resolution

Some parameters become critical to the matching routine. For instance, the resolution at which the fingerprints are scanned determines the accuracy of feature location measurements. Inherently, there may be only one pore in a given 1x1 mm section of print, and at 1000 pixels per inch (ppi), this section is represented by approximately 40x40 (1600) pixels. In comparison, at a scanning resolution of 500 ppi, the same segment is represented by 20x20 (400) pixels. Therefore, the probability of another 1 mm² segment of print's matching with respect to pore position is either 1/1600 or 1/400 depending on the scanning resolution. It can be seen that the FAR (false accept rate) will be reduced at a higher scanning resolution at the cost of an increased FRR (false reject rate).
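The pixel counts quoted above follow directly from the scanning resolution; the following few lines of MATLAB (a small illustrative check, not part of fingerprint.m) reproduce the arithmetic:

ppi = [500 1000];               % scanning resolutions in pixels per inch
px_per_mm = ppi / 25.4;         % 25.4 mm per inch -> approx. 20 and 40 pixels per mm
patch_px = ceil(px_per_mm).^2   % approx. 400 and 1600 pixels per 1x1 mm patch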


Feature or Search Area

The scanning resolution issue can be made invariant by defining an absolute area to be associated with each feature (a feature area or search area). For instance, the location of a pore in the enrolled print segment may be determined to be (x,y), but in the corresponding live-scan segment its location may be shifted some distance (as a result of rotation, plasticity, or other distortion). For the purpose of matching these two segments, a search area is defined around the enrolled feature's position.


If the corresponding live-scan feature falls within this search area, the two features are considered to match with respect to position. The size of the search area is a parameter that influences the performance of the system (decreasing it produces a decreasing FAR and an increasing FRR; increasing it produces an increasing FAR and a decreasing FRR).

Practically, the search area should be large enough to account for effects such as plasticity of the finger and deviations in feature position due to variations in the data, as well as effects of the processing algorithms, but not so large that the areas associated with distinct features overlap. In a forensic comparison, plasticity and distortion of the finger are accounted for by human processing, but in an automated process, tolerances such as this must be incorporated to accommodate these inherent variations.
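The notion of a position match within a search area can be illustrated with a short MATLAB fragment (a minimal sketch with made-up coordinates, not part of the project code; the radius r plays the role of the search area described above):

enrolled = [120 85];     % enrolled feature position (x, y) - hypothetical values
livescan = [123 82];     % corresponding live-scan position, shifted by distortion
r = 5;                   % search-area radius in pixels
isMatch = hypot(livescan(1)-enrolled(1), livescan(2)-enrolled(2)) <= r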

Finger Plasticity

The distance between two features can change significantly due to plasticity of the finger. This relative change of position is generally not significant for nearby features within small areas of print. Therefore, when measuring the position of small high-density features such as pores, a local origin should be established. A minutia point can be used to establish a local origin.

Important Terms:

1. Attribute: a subfeature; the position, shape, and size are attributes of a pore.

2. Authentication: confirmation of proper identity.

3. Distribution (probability): the pdf of a random variable.

4. Distribution (spatial): the way that a set of points is positioned in space or in an image.

5. dpi: refers to scanning resolution measured in dots (pixels) per inch.

6. EER: equal error rate; the value at which the FAR and FRR are equal.


7. FAR: false accept error rate; fraction of attempts for which the system allows access to an imposter or invalid user.

8. FRR: false reject error rate; fraction of attempts for which a fingerprint system denies access to a valid user.

9. Feature: a characteristic; pores and minutia points are fingerprint features.

10.Feature area: search area; the area assigned to an individual feature in which no other feature is assumed to exist. A small area of fingerprint surrounding the feature location in which the featureʼs exact position is not important. Related to resolution and search area.

11.Feature characteristic (subfeature): attributes of features such as shape, size, and location for pores; type, orientation, and position for minutiae.

12.Feature configuration: a feature set for which the specific arrangement of the features within the area occupied by the set is known.

13.Feature position or location: defined as the center of mass of a pore or the center of the ridge at the point at which it ends (for end points) or branches (for bifurcations).

14.Feature Set: (a, b, c, ...); a group of features associated with a specified area of fingerprint.

15.Homogeneous: uniformly spatially distributed; the density of pores is constant over the entire area of print.

16. Identification: a scenario in which the identity of the user who presents a live-scan image to the system is unknown. The system must determine who the unknown user is from a database of valid users.

17. Isotropic: having the same properties independent of direction or orientation.

18.Latent: the fingerprint impression left on an objectʼs surface resulting from contact with a finger.

19.Live Scan: an image of the fingerprint acquired using an electronic scanner for the purpose of real-time fingerprint processing or matching. A live scan represents a pressed finger as opposed to a rolled print.

20.Match (of a feature): a feature represented in the enrolled template (or fingerprint) corresponds to a feature from the live-scan fingerprint.


21.Mismatch (of a feature): a feature represented in the enrolled template (or fingerprint) does not correspond to a feature from the live-scan fingerprint, or a feature from the live-scan print does not correspond to a feature in the enrolled template.

22.Minutia: a ridge structure that differs from the usual (normal) continuous and nondiverging flow; examples are ridge branches (bifurcations) and ridge ends.

23.M x N pixels: an area of fingerprint M pixels in width and N pixels in height.

24.Measurement Accuracy: the accuracy of determined locations of features; higher resolution allows more precise estimation of an actual featureʼs position, shape, and size.

25.Performance: measure of FRR and FAR for a given system; match time and cost should also be considered but are not addressed in this paper.

26.Pixel: can be used as a unit of length or area; magnitude is established by the magnification of the input image and the dimensions of the image produced by the frame-grabber.

27.Pore: opening of a sweat gland that is visible on the surface of the finger ridges.

28.ppi: refers to scanning resolution measured in pixels per inch of fingerprint.

29.Resolution Scanning: number of samples per unit length (or area) determined by the degree of magnification of the fingerprint image on the charge-coupled device (CCD) sensor.

30.Scanned Area: the area of fingerprint incident on the active area of the CCD sensor device and represented in the fingerprint image.

31.Search Area: a small area in which a feature is searched for; designed to account for detected feature position deviations due to noise, plasticity, distortion, or processing variations. Increasing the search area is equivalent to reducing the scanning resolution, reducing the accuracy of detection of the feature position.

32.Sub-feature: an attribute of a fingerprint feature.

33.Template: a set of data that is extracted from a fingerprint and then used to represent that finger. Fingerprints are matched against templates or templates are matched against templates.

34.Uniqueness: key to FAR; probability of occurrence of a configuration of features.


35.Verification: a scenario in which a user claims an identity (enters a personal identification number) and the system then authenticates the user's claim by matching his live-scan print against the template corresponding to his claimed identity.

MATLAB Simulation

Requirements

In order to perform a MATLAB simulation, we need the following:

1. MATLAB
2. Image Processing Toolbox
3. A sample fingerprint of 200x200 pixels

Preparations

1. Make a root folder where all the ʻ.mʼ files will be stored. This is the working Matlab directory.

2. Make sure the resolution of the sample fingerprint is 200 x 200 pixels in bitmap color mode.

3. Name the sample image as: Empreinte.bmp
4. The source fileʼs name will be ʻfingerprint.mʼ

Code

fingerprint.m

clear all,close all,clc

I=imread('Empreinte.bmp');
imshow(I)
set(gcf,'position',[1 1 600 600]);

J=I(:,:,1)>160;
imshow(J)
set(gcf,'position',[1 1 600 600]);

K=bwmorph(~J,'thin','inf');
imshow(~K)
set(gcf,'position',[1 1 600 600]);

fun=@minutie;
L = nlfilter(K,[3 3],fun);


(contd.) fingerprint.m

LTerm=(L==1);
imshow(LTerm)
LTermLab=bwlabel(LTerm);
propTerm=regionprops(LTermLab,'Centroid');
CentroidTerm=round(cat(1,propTerm(:).Centroid));
imshow(~K)
set(gcf,'position',[1 1 600 600]);
hold on
plot(CentroidTerm(:,1),CentroidTerm(:,2),'ro')

LBif=(L==3);
LBifLab=bwlabel(LBif);
propBif=regionprops(LBifLab,'Centroid','Image');
CentroidBif=round(cat(1,propBif(:).Centroid));
plot(CentroidBif(:,1),CentroidBif(:,2),'go')

D=6;

Distance=DistEuclidian(CentroidBif,CentroidTerm);
SpuriousMinutae=Distance<D;
[i,j]=find(SpuriousMinutae);
CentroidBif(i,:)=[];
CentroidTerm(j,:)=[];

Distance=DistEuclidian(CentroidBif);
SpuriousMinutae=Distance<D;
[i,j]=find(SpuriousMinutae);
CentroidBif(i,:)=[];

Distance=DistEuclidian(CentroidTerm);
SpuriousMinutae=Distance<D;
[i,j]=find(SpuriousMinutae);
CentroidTerm(i,:)=[];

hold off
imshow(~K)
hold on
plot(CentroidTerm(:,1),CentroidTerm(:,2),'ro')
plot(CentroidBif(:,1),CentroidBif(:,2),'go')
hold off

Kopen=imclose(K,strel('square',7));

KopenClean=imfill(Kopen,'holes');
KopenClean=bwareaopen(KopenClean,5);
imshow(KopenClean)

KopenClean([1 end],:)=0;
KopenClean(:,[1 end])=0;
ROI=imerode(KopenClean,strel('disk',10));
imshow(ROI)


(contd.) fingerprint.m

imshow(I)
hold on
imshow(ROI)
alpha(0.5)

hold on
plot(CentroidTerm(:,1),CentroidTerm(:,2),'ro')
plot(CentroidBif(:,1),CentroidBif(:,2),'go')
hold off

[m,n]=size(I(:,:,1));
indTerm=sub2ind([m,n],CentroidTerm(:,1),CentroidTerm(:,2));
Z=zeros(m,n);
Z(indTerm)=1;
ZTerm=Z.*ROI';
[CentroidTermX,CentroidTermY]=find(ZTerm);

indBif=sub2ind([m,n],CentroidBif(:,1),CentroidBif(:,2));
Z=zeros(m,n);
Z(indBif)=1;
ZBif=Z.*ROI';
[CentroidBifX,CentroidBifY]=find(ZBif);

imshow(I)
hold on
plot(CentroidTermX,CentroidTermY,'ro','linewidth',2)
plot(CentroidBifX,CentroidBifY,'go','linewidth',2)

Table=[3*pi/4  2*pi/3  pi/2   pi/3   pi/4
       5*pi/6  0       0      0      pi/6
       pi      0       0      0      0
      -5*pi/6  0       0      0     -pi/6
      -3*pi/4 -2*pi/3 -pi/2  -pi/3  -pi/4];

for ind=1:length(CentroidTermX)
    Klocal=K(CentroidTermY(ind)-2:CentroidTermY(ind)+2,CentroidTermX(ind)-2:CentroidTermX(ind)+2);
    Klocal(2:end-1,2:end-1)=0;
    [i,j]=find(Klocal);
    OrientationTerm(ind,1)=Table(i,j);
end
dxTerm=sin(OrientationTerm)*5;
dyTerm=cos(OrientationTerm)*5;
figure
imshow(K)
set(gcf,'position',[1 1 600 600]);
hold on
plot(CentroidTermX,CentroidTermY,'ro','linewidth',2)
plot([CentroidTermX CentroidTermX+dyTerm]',...
     [CentroidTermY CentroidTermY-dxTerm]','r','linewidth',2)


(contd.) fingerprint.m

for ind=1:length(CentroidBifX)
    Klocal=K(CentroidBifY(ind)-2:CentroidBifY(ind)+2,CentroidBifX(ind)-2:CentroidBifX(ind)+2);
    Klocal(2:end-1,2:end-1)=0;
    [i,j]=find(Klocal);
    if length(i)~=3
        CentroidBifY(ind)=NaN;
        CentroidBifX(ind)=NaN;
        OrientationBif(ind)=NaN;
    else
        for k=1:3
            OrientationBif(ind,k)=Table(i(k),j(k));
            dxBif(ind,k)=sin(OrientationBif(ind,k))*5;
            dyBif(ind,k)=cos(OrientationBif(ind,k))*5;
        end
    end
end

plot(CentroidBifX,CentroidBifY,'go','linewidth',2)
OrientationLinesX=[CentroidBifX CentroidBifX+dyBif(:,1);CentroidBifX CentroidBifX+dyBif(:,2);CentroidBifX CentroidBifX+dyBif(:,3)]';
OrientationLinesY=[CentroidBifY CentroidBifY-dxBif(:,1);CentroidBifY CentroidBifY-dxBif(:,2);CentroidBifY CentroidBifY-dxBif(:,3)]';
plot(OrientationLinesX,OrientationLinesY,'g','linewidth',2)

MinutiaTerm=[CentroidTermX,CentroidTermY,OrientationTerm];
MinutiaBif=[CentroidBifX,CentroidBifY,OrientationBif];
saveMinutia('John Doe',MinutiaTerm,MinutiaBif);

Functions

The list of all the function calls is given in the appendix along with their code. Ensure that all of them are placed in the same directory as the ʻfingerprint.mʼ file.

Sample Fingerprint:


Stepwise Code Explanation

Load image

The general shape of the fingerprint is typically used to pre-process the images and reduce the search in large databases. This uses the general directions of the lines of the fingerprint, and the presence of the core and the delta. Several categories have been defined in the Henry system: whorl, right loop, left loop, arch, and tented arch.

Most algorithms use minutiae, specific points such as ridge endings and bifurcations. Only the position and direction of these features are stored in the signature for further comparison.

I=imread('Empreinte.bmp');
imshow(I)
set(gcf,'position',[1 1 600 600]);

Enhancement

A critical step in automatic fingerprint matching is to automatically and reliably extract minutiae from the input fingerprint images. However, the performance of a minutiae extraction algorithm relies heavily on the quality of the input fingerprint images. In order to ensure that the performance of an automatic fingerprint identification/verification system would be robust with respect to the quality of the fingerprint images, it would be essential to incorporate a fingerprint enhancement algorithm in the minutiae extraction module.

In our case, the quality of the image is really good, and we won't need to enhance our image.
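For lower-quality scans, a contrast-enhancement step could be inserted before binarization. The fragment below is only a sketch using standard Image Processing Toolbox functions (adapthisteq is our illustrative choice and is not part of fingerprint.m):

Igray = I(:,:,1);          % work on a single color plane, as in the rest of the script
Ienh = adapthisteq(Igray); % local (adaptive) histogram equalization
imshow(Ienh)
set(gcf,'position',[1 1 600 600]);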


Binarize

We binarize the image. After this operation, ridges in the fingerprint are highlighted in black while furrows are white.

J=I(:,:,1)>160;
imshow(J)
set(gcf,'position',[1 1 600 600]);


Thinning

Ridge thinning eliminates the redundant pixels of ridges until the ridges are just one pixel wide.

K=bwmorph(~J,'thin','inf');
imshow(~K)
set(gcf,'position',[1 1 600 600]);

Minutiae

We filter the thinned ridge map with the filter "minutie", which computes the number of one-valued pixels in each 3x3 window:
* if the central pixel is 1 and has only 1 one-valued neighbor, then the central pixel is a termination;
* if the central pixel is 1 and has 3 one-valued neighbors, then the central pixel is a bifurcation;
* if the central pixel is 1 and has 2 one-valued neighbors, then the central pixel is an ordinary ridge pixel.

fun=@minutie;
L = nlfilter(K,[3 3],fun);


Termination

LTerm=(L==1);
imshow(LTerm)
LTermLab=bwlabel(LTerm);
propTerm=regionprops(LTermLab,'Centroid');
CentroidTerm=round(cat(1,propTerm(:).Centroid));
imshow(~K)
set(gcf,'position',[1 1 600 600]);
hold on
plot(CentroidTerm(:,1),CentroidTerm(:,2),'ro')


Bifurcation

LBif=(L==3);
LBifLab=bwlabel(LBif);
propBif=regionprops(LBifLab,'Centroid','Image');
CentroidBif=round(cat(1,propBif(:).Centroid));
plot(CentroidBif(:,1),CentroidBif(:,2),'go')

Remarks

We have a lot of spurious minutiae, which we now remove:
* Process 1: if the distance between a termination and a bifurcation is smaller than D, we remove these minutiae.
* Process 2: if the distance between two bifurcations is smaller than D, we remove this minutia.
* Process 3: if the distance between two terminations is smaller than D, we remove this minutia.

D=6;


Process 1

Distance=DistEuclidian(CentroidBif,CentroidTerm);
SpuriousMinutae=Distance<D;
[i,j]=find(SpuriousMinutae);
CentroidBif(i,:)=[];
CentroidTerm(j,:)=[];

Process 2

Distance=DistEuclidian(CentroidBif);
SpuriousMinutae=Distance<D;
[i,j]=find(SpuriousMinutae);
CentroidBif(i,:)=[];

Process 3

Distance=DistEuclidian(CentroidTerm);
SpuriousMinutae=Distance<D;
[i,j]=find(SpuriousMinutae);
CentroidTerm(i,:)=[];

hold off
imshow(~K)
hold on
plot(CentroidTerm(:,1),CentroidTerm(:,2),'ro')
plot(CentroidBif(:,1),CentroidBif(:,2),'go')
hold off


ROI

We have to determine a region of interest (ROI). For that, we consider the binary image and apply a closing followed by an erosion. The ROI tools of MATLAB could also be used through a GUI to define the ROI manually.

Kopen=imclose(K,strel('square',7));

KopenClean=imfill(Kopen,'holes');
KopenClean=bwareaopen(KopenClean,5);
imshow(KopenClean)
KopenClean([1 end],:)=0;
KopenClean(:,[1 end])=0;
ROI=imerode(KopenClean,strel('disk',10));
imshow(ROI)


Show ROI on Print:

imshow(I)
hold on
imshow(ROI)
alpha(0.5)

hold on
plot(CentroidTerm(:,1),CentroidTerm(:,2),'ro')
plot(CentroidBif(:,1),CentroidBif(:,2),'go')
hold off


Suppress extrema minutiae

Once we have defined the ROI, we can suppress the minutiae external to it.

[m,n]=size(I(:,:,1));
indTerm=sub2ind([m,n],CentroidTerm(:,1),CentroidTerm(:,2));
Z=zeros(m,n);
Z(indTerm)=1;
ZTerm=Z.*ROI';
[CentroidTermX,CentroidTermY]=find(ZTerm);

indBif=sub2ind([m,n],CentroidBif(:,1),CentroidBif(:,2));
Z=zeros(m,n);
Z(indBif)=1;
ZBif=Z.*ROI';
[CentroidBifX,CentroidBifY]=find(ZBif);

imshow(I)
hold on
plot(CentroidTermX,CentroidTermY,'ro','linewidth',2)
plot(CentroidBifX,CentroidBifY,'go','linewidth',2)


Orientation

Once we have determined the different minutiae, we have to find the orientation of each one.

Table=[3*pi/4  2*pi/3  pi/2   pi/3   pi/4
       5*pi/6  0       0      0      pi/6
       pi      0       0      0      0
      -5*pi/6  0       0      0     -pi/6
      -3*pi/4 -2*pi/3 -pi/2  -pi/3  -pi/4];

Termination Orientation

We have to find the orientation of each termination. To do so, we analyze the position of the ridge pixel on the boundary of a 5 x 5 bounding box centered on the termination, and compare this position with the Table variable. The Table variable gives the angle in radians.

for ind=1:length(CentroidTermX)
    Klocal=K(CentroidTermY(ind)-2:CentroidTermY(ind)+2,CentroidTermX(ind)-2:CentroidTermX(ind)+2);
    Klocal(2:end-1,2:end-1)=0;
    [i,j]=find(Klocal);
    OrientationTerm(ind,1)=Table(i,j);
end
dxTerm=sin(OrientationTerm)*5;
dyTerm=cos(OrientationTerm)*5;
figure
imshow(K)
set(gcf,'position',[1 1 600 600]);
hold on
plot(CentroidTermX,CentroidTermY,'ro','linewidth',2)
plot([CentroidTermX CentroidTermX+dyTerm]',...
     [CentroidTermY CentroidTermY-dxTerm]','r','linewidth',2)


Bifurcation Orientation

For each bifurcation we have three lines, so we apply the same process as in the termination case three times.

for ind=1:length(CentroidBifX)
    Klocal=K(CentroidBifY(ind)-2:CentroidBifY(ind)+2,CentroidBifX(ind)-2:CentroidBifX(ind)+2);
    Klocal(2:end-1,2:end-1)=0;
    [i,j]=find(Klocal);
    if length(i)~=3
        CentroidBifY(ind)=NaN;
        CentroidBifX(ind)=NaN;
        OrientationBif(ind)=NaN;
    else
        for k=1:3
            OrientationBif(ind,k)=Table(i(k),j(k));
            dxBif(ind,k)=sin(OrientationBif(ind,k))*5;
            dyBif(ind,k)=cos(OrientationBif(ind,k))*5;
        end
    end
end

plot(CentroidBifX,CentroidBifY,'go','linewidth',2)

OrientationLinesX=[CentroidBifX CentroidBifX+dyBif(:,1);CentroidBifX CentroidBifX+dyBif(:,2);CentroidBifX CentroidBifX+dyBif(:,3)]';

OrientationLinesY=[CentroidBifY CentroidBifY-dxBif(:,1);CentroidBifY CentroidBifY-dxBif(:,2);CentroidBifY CentroidBifY-dxBif(:,3)]';

plot(OrientationLinesX,OrientationLinesY,'g','linewidth',2)


Save in a text file

In this step, we save the minutiae to a text file.

MinutiaTerm=[CentroidTermX,CentroidTermY,OrientationTerm];
MinutiaBif=[CentroidBifX,CentroidBifY,OrientationBif];
saveMinutia('John Doe',MinutiaTerm,MinutiaBif);

Results

The final feature-extracted image is shown below:

(Fig. Extracted Images)

Storage

The positions and orientations of all the features, viz. terminations and bifurcations, are stored in a text file named ʻJohn_Doe_(Date).txtʼ. A screenshot of the text file is shown below.


Screenshot of the Saved File

For Attendance System:

Since the saved file contains all the feature information, each individual can store their fingerprint template in a server database, which can later be used to validate identities and mark attendance. The hardware implementation is reserved for future work.
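How the stored template could be used to mark attendance can be sketched in a few lines of MATLAB. The fragment below is only an illustration of the idea, not an implemented part of this project: it assumes the enrolled termination list has already been loaded back into a matrix EnrolledTerm (X, Y, Angle, in the same format as MinutiaTerm) and it reuses the DistEuclidian function from the appendix.

% Hypothetical attendance check: fraction of live-scan terminations lying
% within D pixels of some enrolled termination (positions only, angles ignored)
D = 6;                                     % same tolerance as used for spurious minutiae
Dist = DistEuclidian(MinutiaTerm(:,1:2),EnrolledTerm(:,1:2));
matched = sum(any(Dist < D, 2));           % live-scan terminations with a nearby enrolled one
score = matched / size(MinutiaTerm,1);     % fraction of matched features
if score > 0.6                             % illustrative threshold
    disp('Fingerprint accepted - attendance marked')
else
    disp('Fingerprint rejected')
end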


Future Prospects:

In the future, we hope to extend this project to feature validation, i.e., matching two fingerprints against each other. If that succeeds, we would like to implement the same system on hardware and operate it in real-time scenarios.

Since all the files are ʻ.mʼ files, they can be converted to ʻ.cʼ or ʻ.cppʼ sources, or further to ʻ.hexʼ files, and then burned onto a micro-controller. This would provide a real-time implementation of the system.
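As an illustration of the .m-to-C path, the small helper functions could be run through MATLAB Coder; the call below is only a sketch (it assumes MATLAB Coder is installed, and only code-generation-compatible functions qualify; the plotting parts of fingerprint.m would not convert directly):

% Sketch: generate C code for one of the helper functions with MATLAB Coder.
% -args fixes the input to a 3x3 logical window (the size nlfilter passes in).
% Note: minutie's matrix-valued 'if' condition may need to be made scalar
% (e.g. x(i(1),i(2))==0) before code generation succeeds.
codegen minutie -args {false(3,3)} -config:lib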


Appendix

Supplement Files

These files are to be put in the same root folder as ʻfingerprint.mʼ.

DistEuclidian.m

function D=DistEuclidian(dataset1,dataset2)

h = waitbar(0,'Distance Computation');
switch nargin
    case 1
        [m1,n1]=size(dataset1);
        m2=m1;
        D=zeros(m1,m2);
        for i=1:m1
            waitbar(i/m1)
            for j=1:m2
                if i==j
                    D(i,j)=NaN;
                else
                    D(i,j)=sqrt((dataset1(i,1)-dataset1(j,1))^2+(dataset1(i,2)-dataset1(j,2))^2);
                end
            end
        end
    case 2
        [m1,n1]=size(dataset1);
        [m2,n2]=size(dataset2);
        D=zeros(m1,m2);
        for i=1:m1
            waitbar(i/m1)
            for j=1:m2
                D(i,j)=sqrt((dataset1(i,1)-dataset2(j,1))^2+(dataset1(i,2)-dataset2(j,2))^2);
            end
        end
    otherwise
        error('only one or two input arguments')
end

close(h)

lissage.m

function y = lissage(x)
m=sum(x(:));
if m>3
    y=1;
else
    y=0;
end


minutie.m

function y=minutie(x)
i=ceil(size(x)/2);
if x(i,i)==0
    y=0;
else
    y=sum(x(:)) - 1;
end

saveMinutia.m

function saveMinutia(name,MinutiaFin,MinutiaSep)
name=strrep(name,' ','_');
date=datestr(now,29);
FileName=[name '_' date '.txt'];

file=fopen(FileName,'wt');
fprintf(file,'%s \n','-------------------------------------------------------------------');
fprintf(file,'%s \n',['Name: ' name]);
fprintf(file,'%s \n',['Date: ' date]);
fprintf(file,'%s','Number of Terminations: ');
fprintf(file,'%2.0f \n',size(MinutiaFin,1));
fprintf(file,'%s','Number of Bifurcations: ');
fprintf(file,'%2.0f \n',size(MinutiaSep,1));
fprintf(file,'%s \n','-------------------------------------------------------------------');
fprintf(file,'%s \n','-------------------------------------------------------------------');
fprintf(file,'%s \n','Terminations :');
fprintf(file,'%s \n','-------------------------------------------------------------------');
fprintf(file,'%s \n','X Y Angle');
fprintf(file,'%3.0f \t %3.0f \t %3.2f \n',MinutiaFin');
fprintf(file,'%s \n','-------------------------------------------------------------------');
fprintf(file,'%s \n','Bifurcations :');
fprintf(file,'%s \n','-------------------------------------------------------------------');
fprintf(file,'%s \n','X Y Angle 1 Angle 2 Angle 3');
fprintf(file,'%3.0f \t %3.0f \t %3.2f \t \t %3.2f \t \t %3.2f \n',MinutiaSep');
fclose(file);


Bibliography:

[1.] Huang D.S., Andréa Rowdy. “Fingerprint Analysis and Pattern Recognition”

[2.] Jain K. A. Ross and Jolene Ayer kenos Lama. “Biometric Template Security”

[3.] Miltonic D. D’Amico, A.K. Jain and Muhammad Kumara Khan. “Fingerprint Analysis”

[4.] Gourav Aggarwal, Embedded Solutions. “Authentication phase”

[5.] R.K. Rowe, “A Multi-modal Authentication for Fingerprint Spoof Detection”

[6.] A.K. Jain and A. Ross “A Multi modal Authentication Scheme”

[7.] MathWorks’ Image Processing Toolbox documentation
