
DoD ABIS: Quality Evaluation of Operational Multi-Modal Biometric Data

Bob Carter, Lockheed Martin, DoD Biometrics


Outline

• Motivation/Problem Statement
• Operational Setting
• Multi-Modal Biometric Data
  – Fingerprint
  – Face
  – Iris
• Challenges

NIST Workshop on Biometric Quality


Motivation

• Quality of data affects system performance
  – Processing time
  – Validity of results
• Quality-adaptive processing
  – Thresholds sensitive to quality of probe & gallery samples
• Multi-modal fusion
  – Quality drives order of processing
  – Quality factors into score/decision combination
  – Quality-sensitive thresholds
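The fusion idea above can be sketched in a few lines. This is an illustrative Python sketch, not the actual ABIS fusion algorithm; the weighting scheme (trust in a modality limited by the worse of its probe and gallery qualities) is an assumption for demonstration.

```python
# Illustrative quality-weighted score fusion (NOT the DoD ABIS algorithm):
# each modality's match score is weighted by the quality of its samples,
# with all values assumed normalized to [0.0, 1.0].

def fuse_scores(modalities):
    """modalities: list of (match_score, probe_quality, gallery_quality)."""
    # Assumption: the worst sample in a pair limits how much we trust it.
    weights = [min(pq, gq) for _, pq, gq in modalities]
    total = sum(weights)
    if total == 0:
        return 0.0  # no usable sample; defer to a human examiner
    return sum(s * w for (s, _, _), w in zip(modalities, weights)) / total

# Example: strong fingerprint capture, poor face capture.
fused = fuse_scores([(0.92, 0.9, 0.8), (0.40, 0.2, 0.7)])
```

With these inputs the fused score stays close to the high-quality fingerprint score (0.816), since the low-quality face sample gets little weight.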


Operational Setting

• DoD BMO Biometric Collection SOP
  – 10(14) finger images, 5 face photos, 2 iris images
• Overworked, under-trained collectors
  – often under stressful (life-threatening) conditions
  – often in a harsh environment (lighting, temperature, etc.)
• Substantial amount of legacy data (10+ years old)
  – paper fingerprint cards that have been exposed to severe environmental conditions
  – scanned images of Polaroid photos that have been stapled and exposed to the elements
• Highest reliability desired
  – National security at stake


Fingerprint

• Evaluation methods
• Data sample
• Quality findings


Finger Image Quality Evaluation

• NFIQ – NIST Finger Image Quality
  – Range of 1-5
  – Related to minutia matcher performance
• FIQM – Finger Image Quality Measurement
  – Range of 0-100
  – Related to human perception
• ENM – Equivalent Number of Minutia
  – Range of 0-85
  – Related to quality of print near each minutia and its neighbors
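The three metrics run on different scales and polarities (NFIQ improves as its score decreases, FIQM and ENM as theirs increase), so comparing or fusing them needs a common scale. A minimal sketch, assuming a simple linear mapping onto [0, 1] with 1.0 = best:

```python
# The ranges come from the slide above; the linear mapping itself is an
# assumption for illustration, not part of any of the three metrics.

def normalize_nfiq(nfiq):
    """NFIQ: 1 (best) .. 5 (worst); invert so higher is better."""
    return (5 - nfiq) / 4.0

def normalize_fiqm(fiqm):
    """FIQM: 0 .. 100, higher is better."""
    return fiqm / 100.0

def normalize_enm(enm):
    """ENM: 0 .. 85, higher is better."""
    return enm / 85.0
```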


Quality Measures

[Figure: example fingerprint images with their scores — NFIQ = 1 vs. NFIQ = 5; ENM = 25 vs. ENM = 20; ENM = 21 vs. ENM = 17]


Finger Quality Findings I

[Figure: NFIQ score distributions (scores 1-5, frequencies 0-70%) and ENM score distributions (scores 0-100, frequencies 0-25%) for the ABIS-1, ABIS-2, ABIS-3, and ABIS-3F data sets]


Finger Quality Findings II

[Figure: distributions of NFIQ (1-5), FIQM, and ENM (0-100) scores, N=21572]


Finger Quality Correlation?

                  NFIQ     ENM     ENM per Minutia   FIQM
NFIQ              1
ENM              -0.355    1
ENM per Minutia  -0.588    0.782   1
FIQM             -0.775    0.434   0.687             1
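A matrix like this can be produced from per-image scores with a standard correlation routine. The sketch below uses made-up placeholder values, not the ABIS data:

```python
import numpy as np

# Placeholder scores for seven hypothetical finger images (NOT ABIS data).
nfiq = np.array([1, 2, 5, 3, 4, 1, 2])      # 1 = best, 5 = worst
enm  = np.array([40, 35, 10, 25, 15, 45, 30])
fiqm = np.array([90, 70, 20, 55, 30, 95, 65])

# Pearson correlation matrix over the three metrics.
r = np.corrcoef(np.vstack([nfiq, enm, fiqm]))
```

Note that the NFIQ row is negative against the other metrics, as in the table: NFIQ improves as its score decreases, while ENM and FIQM improve as theirs increase.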


Face

• Evaluation methods
• Data sample
• Quality findings


Face Image Quality Evaluation

• Identix FaceIt Quality Assessment
  – 11 dimensions
    • darkness, brightness, exposure, focus, resolution, cropping, glasses, faceness, contrast, texture, and faceFindingConfidence
  – Overall Quality computed as:
    • minimum(darkness, brightness, focus, resolution, cropping, faceness, contrast)
    • 0.0-3.9: Bad
    • 4.0-6.9: Fair
    • 7.0-10.0: Good
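The overall-quality rule above is simple enough to sketch directly; the function and parameter names here are illustrative, not the vendor's API:

```python
# Sketch of the overall-quality rule: the minimum over seven of the
# eleven FaceIt dimensions, binned into Bad/Fair/Good. Names are
# illustrative, not the Identix FaceIt interface.

def overall_face_quality(darkness, brightness, focus, resolution,
                         cropping, faceness, contrast):
    """All dimension scores assumed on a 0.0-10.0 scale."""
    q = min(darkness, brightness, focus, resolution,
            cropping, faceness, contrast)
    if q < 4.0:
        label = "Bad"      # 0.0-3.9
    elif q < 7.0:
        label = "Fair"     # 4.0-6.9
    else:
        label = "Good"     # 7.0-10.0
    return q, label

# One poor dimension (focus = 3.5) drags the whole image to "Bad".
q, label = overall_face_quality(8.0, 7.5, 3.5, 9.0, 8.0, 9.5, 7.0)
```

Taking the minimum means a single bad dimension dominates, which matches the operational reality that one defect (e.g. severe blur) can make an otherwise good photo unmatchable.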


Face Data Quality Issues

• Cluttered background
• Legacy data – e.g. scans of 10+ year-old Polaroids
• Non-frontal pose
• Inconsistent lighting
• Multiple heads
• Low resolution


Face Quality Findings I

[Figure: Overall Quality distribution and CDF (N=27399); quality scores 0-10 on the x-axis, histogram frequencies 0-7% on the left axis, CDF 0-100% on the right axis]


Face Quality Findings II

[Figure: Overall Quality distribution and CDF for ground-truth mates (N=2023); quality scores 0-10 on the x-axis, histogram frequencies 0-7% on the left axis, CDF 0-100% on the right axis]


Face Identification Performance and Quality

[Figure: Match/False Non-Match vs. Overall Image Quality (1039 mated pairs); scatter of Gallery Image Quality against Probe Image Quality (both 0-10), with points classed as Rank 1-10, Rank 11-20, Rank 21-30, Rank 31+, or False Non-Match]


Iris

• Evaluation methods
• Quality findings


Iris Image Quality Evaluation

• Method of Kalka and Schmid from WVU
• 7 dimensions
  – Occlusion, motion blur, defocus blur, lighting, pixel counts, specular reflection, and off-angle
  – Overall quality computed by applying the Dempster-Shafer method using Murphy's rule to the normalized (0.0-1.0) dimensions
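Murphy's rule averages the belief masses from the individual sources and then combines that average with itself n-1 times via Dempster's rule. A minimal sketch over a two-hypothesis frame {good, bad} plus an uncertainty mass; how the seven quality dimensions map to belief masses is assumed here for illustration:

```python
# Sketch of Dempster-Shafer combination with Murphy's rule over the
# frame {good, bad}. Each mass function is a tuple (m_good, m_bad,
# m_theta) summing to 1, where m_theta is uncommitted belief. The
# mapping from quality dimensions to masses is an assumption.

def dempster(m1, m2):
    """Combine two mass functions with Dempster's rule of combination."""
    g1, b1, t1 = m1
    g2, b2, t2 = m2
    conflict = g1 * b2 + b1 * g2           # mass assigned to the empty set
    norm = 1.0 - conflict
    g = (g1 * g2 + g1 * t2 + t1 * g2) / norm
    b = (b1 * b2 + b1 * t2 + t1 * b2) / norm
    t = (t1 * t2) / norm
    return (g, b, t)

def murphy_combine(masses):
    """Murphy's rule: average the n masses, then Dempster-combine the
    average with itself n-1 times."""
    n = len(masses)
    avg = tuple(sum(m[i] for m in masses) / n for i in range(3))
    result = avg
    for _ in range(n - 1):
        result = dempster(result, avg)
    return result

# Three dimensions voting "good", one mostly uncertain:
g, b, t = murphy_combine([(0.8, 0.1, 0.1), (0.7, 0.2, 0.1),
                          (0.9, 0.05, 0.05), (0.3, 0.3, 0.4)])
```

Averaging first is what makes Murphy's rule robust to a single conflicting dimension, which plain Dempster combination handles badly.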


Iris Quality Findings I

[Figure: Histogram of Iris Image Quality scores (N=29657); quality scores 0-1 in 0.05 bins on the x-axis, bin counts 0-6000 on the left axis, CDF 0-100% on the right axis]


Relative Iris Quality

[Figure: Comparative Iris Quality — quality-score distributions (0-1, frequencies 0-30%) and CDFs (0-100%) for the ABIS, WVU, and CASIA data sets]

WVU and CASIA Iris Quality Scores courtesy of Nate Kalka, WVU


Iris Quality Findings II

[Figure: Histogram of Iris Image Quality (N=575); quality scores 0-1 in 0.05 bins on the x-axis, bin counts 0-100 on the left axis, CDF 0-100% on the right axis]


Iris Identification Performance and Quality

[Figure: Match Score vs. Quality of Probe and Gallery Images (274 pairs); scatter of Gallery Quality against Probe Quality (both 0-1), with points classed as True Match or False Non-Match]


Challenges

• Need real-time feedback at point of collection
• Need either
  – generic, algorithm-agnostic quality metrics
  – or algorithm (vendor)-specific quality metrics
• Want performance-predictive metrics
• Machine perception and/or human perception?
• Need to understand the tradeoff involving very low quality data
  – can we quantify diminishing returns?
  – can we justify excluding some samples?


Collaboration Opportunity

• We have plenty of real-world data.
  – Unfortunately, not for public dissemination
• However, we welcome the chance to evaluate new ideas using our data set for mutual benefit.
  – WVU – iris image quality assessment
  – BAH – finger image quality assessment
• POC: [email protected]


Questions?
