TRANSCRIPT
Copyright, 2000-04
Biometric Insecurity

Roger Clarke
Xamax Consultancy Pty Ltd, Canberra
Visiting Professor, Baker & McKenzie Cyberspace Law & Policy Centre, U.N.S.W.
Visiting Professor in eCommerce, Uni. of Hong Kong
Visiting Fellow, Dept of Computer Science, ANU

http://www.anu.edu.au/people/Roger.Clarke/....../DV/BiomInsec{.html, .ppt}
AusCERT Conference – 24 May 2004
Biometric Insecurity
AGENDA

1. The Security Intent
2. Biometric Technology
3. The Contexts of Biometrics Use
4. Quality Challenges in Biometrics Applications
5. Applications of Biometric Technology
6. A Long Litany of Failures
7. Conclusions
Security
• A Condition in which harm does not arise, despite the occurrence of threatening events
• Safeguards designed to achieve that Condition

Security Safeguards
• Measures to ensure that Vulnerabilities, when impinged upon by Threatening Events, do not result in undue Harm to Assets
What Did You Want To Do?
• Control access to Personal Devices?
• Control access to R&D premises?
• Prevent employees bundying on and off on behalf of their absent workmates?
• Detect benefits recipients double-dipping?
• Stop Israeli agents getting Kiwi passports?
• Intercept miscreants at border-posts?
• Identify the perpetrator of a crime?
• Stop terrorists getting onto planes?
• Establish a Stalinist social control regime?
2. Biometric Technology
Terminology
• 'A Biometric' is a measurable physical or behavioural characteristic of a human being
• So 'Biometrics' refers to measures of people
• 'Biometrics Technologies' are technologies that produce and process measures of people
• So ‘Biometrics’ also refers to technologies
A Taxonomy of Biometrics
• Appearance
height, weight, colour of skin, hair and eyes, visible physical markings, gender?, race??, facial hair??, wearing of glasses??
• Social Behaviour
habituated body-signals, general voice characteristics, style of speech, visible handicaps
• Bio-Dynamics
manner of writing one's signature, statistically-analysed voice characteristics, keystroke dynamics, esp. login-id, password
• Natural Physiography
skull measurements??, teeth and skeletal injuries?, thumbprint, fingerprint sets, handprints, retinal scans, capillary patterns (e.g. in earlobes), hand geometry, digit geometry, DNA-patterns
• Imposed Features
dog-tags, collars, bracelets and anklets, bar-codes and other kinds of brands, embedded micro-chips and transponders
Available Biometrics Technologies
• Variously Dormant or Extinct
  • Cranial Measures
  • Face Thermograms
  • Veins (hands, earlobes)
  • Retinal Scan
  • Handprint
  • Written Signature
  • Keystroke Dynamics
  • Skin Optical Reflectance
  • ...
• Currently in Vogue
  • Iris
  • Thumbprint
  • Hand Geometry
  • Voice
  • Face
• Special Case
  • DNA
• Promised
  • Body Odour
  • Multi-Attribute
The Biometric Process
[Diagram: 1. Enrolment / Registration — a Measuring Device captures a Reference Measure, or 'Master Template'. 2. Testing — a Measuring Device captures a Test Measure, or 'Live Template', which is passed to Matching and Analysis to produce a Result.]
Kinds of Templates
• Bit-Map / Image / Print
• Bit-Map Filtering / Compression
  • Select-in of useful features
  • Select-out of un-useful features
  • Arbitrary
• Hashed
  • Reversibly 'Hashed'
  • One-Way Hashed
• Encrypted
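The contrast between a reversibly 'hashed' (really, encoded) template and a one-way hashed one can be sketched as follows. This is a hypothetical illustration only: a real scheme must first canonicalise the noisy measure, because a one-way hash supports only bit-for-bit exact matching.

```python
import base64
import hashlib

raw_template = b"\x12\x80\x33\x41"  # stand-in for a bit-map-style template

# Reversible encoding: the original template can be recovered
reversible = base64.b64encode(raw_template)
assert base64.b64decode(reversible) == raw_template

# One-way hash: the template cannot be recovered from the digest,
# but matching now requires an identical canonical input
one_way = hashlib.sha256(raw_template).hexdigest()
print(one_way[:16])
```

The design consequence is the one the later 'Protections Required' slides draw: reversible and raw templates are attractive theft targets, whereas one-way forms limit what a captured store discloses.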
Categories of Biometric Application
• Authentication
1-to-1 / ref. measure from somewhere / tests 'entity assertions'
• Identification
1-to-many / ref. measures from a database that also contains data about population-members / generates an 'entity assertion'
• Vetting against a Blacklist
1-to-many / ref. measures and data of a small population of wanted or unwanted people / may create an 'entity assertion'
• Duplicate Detection
1-to-many / ref. measures of a large population / may create an assertion 'person already enrolled'
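The four comparison patterns can be contrasted in a minimal sketch, using exact equality as a stand-in for real fuzzy matching; all identifiers and measures are invented.

```python
population = {"alice": "T1", "bob": "T2"}  # identifier -> reference measure
blacklist = {"T9"}                          # measures of wanted/unwanted people

def authenticate(claimed_id, live):
    """1-to-1: tests an entity assertion ('I am alice')."""
    return population.get(claimed_id) == live

def identify(live):
    """1-to-many: generates an entity assertion from the whole database."""
    return next((pid for pid, ref in population.items() if ref == live), None)

def vet(live):
    """1-to-many against a small blacklist population."""
    return live in blacklist

def already_enrolled(live):
    """1-to-many duplicate detection across a large population."""
    return live in population.values()

print(authenticate("alice", "T1"), identify("T2"), vet("T9"), already_enrolled("T3"))
# True bob True False
```

The distinction matters for the later application slides: the 1-to-1 patterns scale easily, while each 1-to-many pattern multiplies the comparisons, and hence the error opportunities, by the population size.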
Motivations
• Foreground
  • Authorisations within a Context
  • Right to Perform a Function
  • Creation of Suspicion
  • Interception of 'Wanted Persons'
  • Deterrence of People from Locations
• Background
  • Collection of New Transaction Data
  • People-Location and People-Tracking
  • Creation / Enhancement of Biometrics Databases
3. The Contexts of Biometrics Use
Aspects of the Contexts of Use

The Subject's Knowledge and Consent
• Acquisition of Reference Measure(s)
• Acquisition of Test-Measure(s)

The Subject's Willingness
• Willing Participants
• Cowed Participants
• Casual Opponents
• Concerned Opponents
• Serious Opponents
Aspects of Biometrics Usage: The Information Associated with the Reference Measure
• commonly-used name(s)
• organisationally-assigned identifier(s)
• an attribute (e.g. 'suspected miscreant', 'arrest warrant outstanding', ...)
• an event
• a location
• stored data
• data trail(s)
Circumstances of Acquisition, especially of the Test-Measure
• Physically Supervised
  • Control-Points
  • Closed Spaces
  • Open Spaces
• Remote
  • At Individual Devices
  • Over Closed Networks
  • Over Open Networks
4. Quality Challenges in Biometric Applications

Dimensions of Quality
• Reference-Measure
• Association
• Test-Measure
• Comparison
• Result-Computation

Other Aspects of Quality
• Vulnerabilities
• Quality Measures
• Counter-Measures
• Spiralling Complexity
• Consequences
Reference-Measure Quality
• The Person's Feature ('Enrolment')
• The Acquisition Device
• The Environmental Conditions
• The Manual Procedures
• The Interaction between Subject and Device
• The Automated Processes
Association Quality
• Depends on a Pre-Authentication Process
• Subject to the Entry-Point Paradox
• Associates data with the 'Person Presenting', and hence entrenches criminal IDs
• Risk of an Artefact Substituted for, or Interpolated over, the Feature
Test-Measure Quality
• The Person's Feature ('Acquisition')
• The Acquisition Device
• The Environmental Conditions
• The Manual Procedures
• The Interaction between Subject and Device
• The Automated Processes
Comparison Quality
• Feature Uniqueness
• Feature Change:
  • Permanent
  • Temporary
• Ethnic/Cultural Bias
"Our understanding of the demographic factors affecting biometric system performance is ... poor" (Mansfield & Wayman, 2002)
• Material Differences in:
  • the Processes
  • the Devices
  • the Environment
  • the Interactions
• An Artefact:
  • Substituted
  • Interpolated
'Factors Affecting Performance' (Mansfield & Wayman, 2002)
• Demographics (youth, aged, ethnic origin, gender, occupation)
• Template Age
• Physiology (hair, disability, illness, injury, height, features, time of day)
• Appearance (clothing, cosmetics, tattoos, adornments, hair-style, glasses, contact lenses, bandages)
• Behaviour (language, accent, intonation, expression, concentration, movement, pose, positioning, motivation, nervousness, distractions)
• Environment (background, stability, sound, lighting, temperature, humidity, rain)
• Device (wear, damage, dirt)
• Use (interface design, training, familiarity, supervision, assistance)
Result-Computation Quality
• Print Filtering and Compression:
  • Arbitrary cf. Purpose-Built
• The Result-Generation Process
• The Threshold Setting:
  • Arbitrary? Rational? Empirical? Pragmatic?
• Exception-Handling Procedures:
  • Non-Enrolment
  • Non-Acquisition
  • 'Hits'
Consequences of Quality Problems
• A Tolerance Range has to be allowed
• 'False Positives' / 'False Acceptances' arise
• 'False Negatives' / 'False Rejections' arise
• Tighter Tolerances (to reduce False Negatives) increase the rate of False Positives; and vice versa
• The Scheme Sponsor sets (and re-sets) the Tolerances
• Frequent exceptions are mostly processed cursorily
• Occasional 'scares' slow everything, annoy everyone
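The tolerance trade-off can be made concrete with a small sketch. The match scores below are invented purely for illustration; the point is the shape of the trade-off, not the numbers.

```python
genuine  = [0.91, 0.85, 0.78, 0.70, 0.60]   # same-person match scores (invented)
impostor = [0.55, 0.48, 0.40, 0.30, 0.20]   # different-person scores (invented)

def rates(threshold):
    # False Rejection: a genuine score falling below the threshold
    frr = sum(s < threshold for s in genuine) / len(genuine)
    # False Acceptance: an impostor score reaching the threshold
    far = sum(s >= threshold for s in impostor) / len(impostor)
    return far, frr

for t in (0.35, 0.50, 0.65, 0.80):
    far, frr = rates(t)
    print(f"threshold={t:.2f}  FAR={far:.0%}  FRR={frr:.0%}")
```

Loosening the threshold (0.35) admits more impostors; tightening it (0.80) rejects more genuine users — exactly the 'and vice versa' of the slide.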
Conventional Measures of Quality
• Failure to Enrol Rate FTE(R)
• Failure to Acquire Rate FTA(R)
• False Non-Match Rate FNMR• False Match Rate FMR
• False Accept Rate FAR• False Reject Rate FRR
• Equal Error Rate (where FAR = FRR) EER
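As a sketch of how the Equal Error Rate relates to the other measures, the snippet below sweeps a threshold over invented score distributions and picks the operating point where FAR and FRR coincide. This is a toy illustration under stated assumptions, not a procedure from the talk.

```python
genuine  = [0.9, 0.8, 0.7, 0.6, 0.5]   # invented same-person scores
impostor = [0.6, 0.5, 0.4, 0.3, 0.2]   # invented different-person scores

def far_frr(t):
    far = sum(s >= t for s in impostor) / len(impostor)   # False Match / Accept
    frr = sum(s < t for s in genuine) / len(genuine)      # False Non-Match / Reject
    return far, frr

# sweep candidate thresholds and take the one where FAR and FRR are closest
eer_threshold = min((t / 100 for t in range(101)),
                    key=lambda t: abs(far_frr(t)[0] - far_frr(t)[1]))
far, frr = far_frr(eer_threshold)
print(f"EER ~ {far:.0%} at threshold {eer_threshold:.2f}")
```

A single EER figure compresses the whole trade-off curve into one number, which is convenient for vendors and, as the next slides suggest, convenient to quote selectively.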
"The Boy Who Cried Wolf" increased 1000-fold
Bruce Schneier, 30 September 2001
The hardest problem is the false alarms ....
Suppose this magically effective face-recognition software is 99.99% accurate. That is, if someone is a terrorist, there is a 99.99% chance that the software indicates "terrorist," and if someone is not a terrorist, there is a 99.99% chance that the software indicates "non-terrorist." Assume that 1 in 10 million flyers, on average, is a [known!] terrorist.
Is the system any good? No. It will generate 1000 false alarms for every 1 real terrorist. And every false alarm still means that all the security people go through all of their security procedures. Because the population of non-terrorists is so much larger than the number of terrorists, the test is useless.
What if it's 95% accurate?
• We get a 95% chance of finding the 1 [known] terrorist per 10 million passengers
• For people who are not [known] terrorists:
  • for 95%, the system gives a true negative
  • for 5%, the system gives a false positive
• Across 10 million passengers, there will be 500,000 false alarms
• That will delay people and planes, infuriate everyone, and result in the system being ignored most of the time
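The base-rate arithmetic behind both scenarios can be checked directly (figures from the slides; the helper function is just arithmetic, not anything from a real screening system):

```python
def screening_outcomes(passengers, terrorist_rate, accuracy):
    terrorists = passengers * terrorist_rate
    non_terrorists = passengers - terrorists
    false_alarms = non_terrorists * (1 - accuracy)  # non-terrorists flagged 'terrorist'
    return terrorists, false_alarms

# 99.99% accurate, 1 known terrorist per 10 million flyers
hits, alarms = screening_outcomes(10_000_000, 1e-7, 0.9999)
print(hits, round(alarms))   # ~1 real hit vs ~1000 false alarms

# 95% accurate, same base rate
hits, alarms = screening_outcomes(10_000_000, 1e-7, 0.95)
print(round(alarms))         # ~500,000 false alarms
```

Because the population of non-terrorists utterly swamps the handful of targets, even a tiny false-positive rate buries the one real hit, which is Schneier's point.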
Vulnerability Locations
• Biometric Capture Devices
• Biometric Storage Devices
• Connections to and within Local Systems
• Infrastructure – Local, Intermediating, Remote
• Networks to Remote Servers
• Back-End Processors
• Back-End Databases
Threats
• Live Biometric capture, theft
• Live Biometric simulation
• Live Biometric substitution
• Reference Biometric substitution
• Reference Biometric forgery
• Message interception, modification, insertion
• Stored Biometric capture, theft, change, substitution
• Threshold manipulation
• Device tampering
• Environmental tampering (e.g. lighting, jamming)
• Infrastructure manipulation (e.g. power-outage)
• Device or System override/backdoor/trojan utilisation
• Exception-Handling Procedures manipulation
• Subversion of fallback procedures for the Unenrollable
• Insider collusion
The Spiralling Complexity
• To address vulnerabilities and threats, add more security features and measures (e.g. liveness-testing to detect artefacts)
• But Measures beget Countermeasures
• Countermeasures demand more security features
• With more features, there's more complexity, and hence even more vulnerabilities
5. Applications of Biometric Technology
Physical Access Control to Personal Devices (Portables, Mobiles, iPods, ...)
• Personally enrol with one's own device
• Personally compare Test-Measure to Reference-Measure (a 1-to-1 Authentication application)
• Do it each time you unlock the device
• willing, controlled, consistent, ... fallback?
Physical Access Control to Sensitive Workplaces
• Impose measurement on relevant employees
• Compare Test-Measure Against a Small Database of Reference-Measures (a 1-to-many Identification application)
• Do it many times
• Do it under time-pressure (entry/exit)
• ...
The Authentication of Employee Identity Assertions
• Impose measurement on many employees
• Compare Test-Measure Against Reference-Measure captured at enrolment (a 1-to-1 Authentication application)
• Do it many times
• Do it under time-pressure (bundy on/off)
• ...
The Detection of Double-Dipping Benefits Recipients
• Impose measurement on millions of people
• Compare 1 Against a Large Population (a 1-to-very-many Identification application)
• Do it many times
• Do it under time-pressure (front-end check)
• Do it at leisure (off-line data-matching)
• Many potential matches, so closeness-testing, and ordering of putative positives
The Authentication of Applicants for Passports
• Impose measurement on millions of people
• New Enrolments:
  • 1 Reference-Measure newly captured
  • Evidence of Identity ('what you have')
• Re-Enrolments:
  • 1 Test-Measure against 1 or More Reference-Measures (a 1-to-1 Authentication application)
The Authentication of Presenters of Passports
After Enrolment
• 1 Test-Measure cf. 1 Reference-Measure (a 1-to-1 Authentication application)
• Outbound, do it under time-pressure
• Inbound, make them queue
• There is a Reference-Measure
• Effectiveness depends on many factors
The Authentication of Presenters of Passports
As Enrolment (i.e. at U.S. borders currently)
• 1 Measure (Test or Reference?)
• Outbound, do it under time-pressure
• Inbound, make them queue
• There is no Reference-Measure
• Effectiveness depends on:
  • the person holding the passport
  • the quality of the Measure's capture
• Entrenches existing ‘false IDs’
The Identification of the Perpetrator of a Crime
• Compare Test-Measure Against a Database (a 1-to-many Identification application)
• Latent Prints seldom reliably identify the Perpetrator
• Do it at leisure
• The People Sought may be:
  • 'Convicted Criminals', with Biometrics
  • Other Categories, with Biometrics
  • People for whom no Biometrics are held
The Prevention of Terrorist Access to Aircraft
• Compare Test-Measure Against a Stop-List (a 1-to-many Blacklist application)
• Do it many times
• Do it under time-pressure
• Many People Sought are not 'Known'
• Few People Sought have provided Biometrics
• There are no Reference-Measures for the People on the Stop-List
Biometrics and Single-Mission Terrorists
• “Biometrics ... can’t reduce the threat of the suicide bomber or suicide hijacker on his virgin mission. The contemporary hazard is a terrorist who travels under his own name, his own passport, posing as an innocent student or visitor until the moment he ignites his shoe-bomb or pulls out his box-cutter” (Jonas G., National Post, 19 Jan 2004)
• “it is difficult to avoid the conclusion that the chief motivation for deploying biometrics is not so much to provide security, but to provide the appearance of security” (The Economist, 4 Dec 2003)
6. The Long Litany of Failures
The Biometrics Industry is Mythical
• Publicly-available, independent evaluation of technologies and products is extremely rare
• Multiple reports show that most of the technologies and products don't work
• Technologies, products and suppliers continue to appear and disappear at a rapid rate
• Pilots almost never proceed to the next stage
• Anecdotally, installations are so ineffectual that they're a great embarrassment to everybody
• Unclear whether any supplier is financially viable
Fraudulent Misrepresentation of the Efficacy of Face Recognition
• The Tampa SuperBowl was an utter failure
• Ybor City FL was an utter failure
• Most 'deployments' became 'pilots' and then ceased
• Almost no quality-measures have ever been provided
• Not one person has been correctly identified by face recognition technology in open, public places
• No evidence exists of effectiveness
• Ample anecdotal evidence exists of the opposite
• The few independent testing results are atrocious!
Face Recognition Vendor Tests (Vendor-Friendly)
• FRVT (2000): “[in favourable conditions], the base false detection rate was 33%, with a false acceptance rate of 10%. This means that ... to detect 90% of terrorists we’d need to raise an alarm for one in every three people passing through the airport ...” (Greene T.C., The Register, 27 Sep 2001)
• FRVT (2002): “For the best face recognition systems, the recognition rate for faces captured outdoors, at a false accept rate of 1%, was only 50% ... [indoors], the best performer had a 90% verification rate at a false accept rate of 1%” [So imagine what the others were like!]
But it’s not just Face Recognition ...
• “people’s hands do not differ enough for [hand geometry] to be used as an identification system” (The Economist, 4 Dec 2003)
• “around 5% of people do not have readable fingerprints” (The Economist, 4 Dec 2003)
• “ ... comprehensive tests of 11 consumer-oriented biometric products ... found that the devices were ... more of the nature of toys than of serious security measures” (Leyden J., The Register, 22 May 2002)
• the sole large-scale application of iris technology (by the UNHCR in Afghanistan) has not been used to generate measures of reliability and quality
Oz Farce #1 – Customs ‘SmartGate’
• ‘Facial recognition’ technology applied to the authentication of QANTAS Aircrew at Mascot
• Testing was performed by a Govt agency (DSTO)
• Two overseas experts reviewed the data, and made noises that were both positive and negative
• Customs and DSTO steadfastly refuse to release meaningful information about:
  • the design
  • analyses of the justifications
  • the testing
• Impact Assessment appears to be an 'optional extra'
Oz Farce #2 – DFAT’s Passport Proposal
• Not consultation, but merely “to provide the Minister with confidence that all issues have been identified”
• No information was provided about the design, its justification, or its impacts. Once again, Impact Assessment appears to be an ‘optional extra’
• No response to requests for information
• The officers appear not to understand the technology, the system, or the questions
• When the questions became too difficult, they simply withdrew the item from the agenda
• Yet they've retained the elements in the Bill
The [U.S.] Fingerprinting of Foreigners
Bruce Schneier, 15 January 2004
According to the Bush administration, the [fingerprinting of foreigners is] designed to combat terrorism. As a security expert, it's hard for me to see how. The 9/11 terrorists would not have been deterred by this system; many of them entered the country legally on valid passports and visas. ... Capturing the biometric information of everyone entering the country doesn't make us safer.
... even if we could completely seal our borders, fingerprinting everyone still wouldn't keep terrorists out. ... there is no comprehensive fingerprint database for suspected terrorists.
... The next logical step is to fingerprint all visitors to the U.S., and then ... U.S. citizens. ... Perhaps the program can be extended to train rides, bus rides, entering and exiting government buildings. ...
7. Conclusions
Biometrics Must Be Banned
Massive Negative Social Impacts of Biometric Applications
• Queuing, Delays, Snowballing Delays
• Selectivity, Physical and Cultural Outliers
• False Positives suffer, and cluster
• Privacy of the physical person; of personal behaviour (incl. denial of anonymity and pseudonymity); of personal data; even of one's personal fate, through masquerade, id theft, access denial, identity denial
• Chilling Effect on contrarians, troublemakers, public interest advocates and representatives
• Stultification of innovation, dehumanisation
Protective Measures Are Required
• Frameworks
• Laws
• Open and Published:
  • Technology Info
  • Test Designs, Results
  • Justification / CBRAs / Business Cases
• Independent Testing
• Public Consultation
• Impact Assessment
• No Central Storage
• No Transmission
• Two-Way Device Authentication
• Compliance Audits
• Prohibitions on Non-Compliant Devices and Applications
Conclusion #1
• 11-September-2001 hysteria has warped the scene, and resulted in deeply unprofessional behaviour:
  • Biometric technologies don't work, yet are being applied to uses they're fundamentally unsuited for
  • Marketers sell snake-oil fervidly, without restraint
  • Government buyers suspend their disbelief in ways that are seriously dangerous to security
  • Biometrics as mantra deflects attention and investment away from security measures that could be effective
• Biometrics does not, and cannot, deliver security
• We're suffering from biometric insecurity
Conclusion #2: Biometrics Must be Banned
• Biometric technologies are highly dangerous to our security, and to our privacy and freedoms
• Their unregulated use cannot be permitted
• A ban must be imposed on their application ...
• ... until and unless a comprehensive and legally enforced regulatory regime has been established
Conclusion #3: And Technologists Should Be Thankful ...
• A ban may be the only means of saving the industry from its own undisciplined excesses
• If present practices continue:
  • public revulsion will build up and explode
  • the public mood will swing violently
  • biometrics will be marginalised
• By calling a halt, involving public interest advocates and representatives, and getting genuine controls into place before any further mis-fires are perpetrated, the industry might yet survive
X. Impacts & Implications
I&I – Imposition and Inconvenience
• Queuing, Delays, and Snowballing Delays
• Selectivity, especially with Human Operators
• Culture-Specific Sensitivities Are Ignored
• Physical and Cultural Outliers Suffer More
• For False Positives:
  • Suspicion, Embarrassment, Apparent Arrest
  • Need to Prosecute One's Innocence, but hamstrung by Information Asymmetry
  • Suspicion Compounded by Ethnic, Lingual, Cultural Differences, plus Biases, Bigotries
  • Increased Risk of Gross Impositions
Impacts & Implications: Privacy-Invasiveness
• Privacy of the Person
• Privacy of Personal Behaviour
• Denial of Anonymity and Pseudonymity
• Privacy of Personal Data
  • The Biometric Itself
  • Identifiers used in Particular Contexts
  • Data Associated with the Biometric
  • Additional Data from Other Sources
• Privacy of Personal Fate
Impacts & Implications: Threats to Personal Identity
• Masquerade
• Permanent Identity Theft
• Access Denial
• Identity Denial
Impacts & Implications: Social, Political, Ethical and Economic Dimensions
• Meaninglessness of Consents
• Denial of 'Personal Self-Determination'
• 'The Chilling Effect' on 'Different-Thinkers', Contrarians, Troublemakers, Public Interest Advocates and Representatives
• Stultification of Innovation
• Dehumanisation
X. Protections Required
Protections Required: The Context
• Frameworks, to ensure understanding of:
  • identification and authentication
  • when they can and cannot protect lives, property and data
• Laws, to preclude:
  • retention, and additional use, of data arising from biometrics in public areas
  • application of biometrics in employment, without strong and clear justification
Protections Required: The Technologies and Products
• A Privacy Strategy
• Privacy-Protective Architecture
• Open Information, cf. '[pseudo-]security by obscurity'
• Independent Testing, based on Published Guidelines
• Publication of all Test Results
Protections Required: Application Design Features
• No Central Storage of Reference Measures; Storage only on Each Person's Own Device
• No Storage of Test-Measures
• No Transmission of Measures; Devices as Closed and Secure as ATM PIN-Pads
• Two-Way Device Authentication
• Design Standards for Measuring Devices
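Two-way device authentication could look like the following sketch: each side proves knowledge of a provisioned shared key via an HMAC challenge-response, so a rogue reader or a rogue token fails the exchange. Key management is deliberately simplified, and all names are illustrative rather than from any real product.

```python
import hashlib
import hmac
import secrets

SHARED_KEY = secrets.token_bytes(32)   # provisioned into both devices (simplified)

def respond(key, challenge):
    """Prove knowledge of the key without revealing it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def mutual_auth(device_key, terminal_key):
    c1 = secrets.token_bytes(16)                   # terminal challenges device
    if not hmac.compare_digest(respond(device_key, c1),
                               respond(terminal_key, c1)):
        return False
    c2 = secrets.token_bytes(16)                   # device challenges terminal
    return hmac.compare_digest(respond(terminal_key, c2),
                               respond(device_key, c2))

print(mutual_auth(SHARED_KEY, SHARED_KEY))              # matching keys: True
print(mutual_auth(SHARED_KEY, secrets.token_bytes(32))) # impostor reader: False
```

Fresh random challenges mean a recorded response cannot be replayed, which addresses the message-interception and device-tampering threats listed earlier.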
Protections Required: The Application Design Process
• Consultation with the Affected Public, from the conception of the project onwards
• Privacy Impact Assessments, conducted in the open, and published
• Explicit Public Justification for Privacy-Invasive Features
Protections Required: Regulatory Measures
• Prohibition of the manufacture or import of non-compliant biometric measuring devices
• Prohibition of the installation and use of non-compliant biometric measuring devices
• Prohibition of the creation, maintenance and use of a database of biometrics
• Requirement for Compliance Audit of biometric measuring devices
Protections Required: A Sense of Reality
• Data, instead of 'security through obscurity':
  • documented technologies
  • metricated pilots
  • less suppression of outcomes
• Independent Testing
• Lawsuits for Fraudulent Misrepresentation
X. How To 'Do It Right'
Conventional Architecture, e.g. for Identification
[Diagram: a Sensor captures a Test-Measure and passes it to a Processing Module, which consults a central Database of Reference Measures (with Database Maintenance) and passes an Identifier and Other Data to the Application.]
Privacy-Sensitive Architecture, e.g. Authentication Against a Block-List
• Sensor gathers a Test-Measure
• Sensor provides it to a Secure Processing Module
• SPM accesses the Reference Measure on a Token
• SPM computes the Result
• SPM accesses the Relevant Data on the Token
• SPM checks the Relevant Data against the Block-List
• SPM provides only the Authentication Results:
  • 'The person does/does not match the token'
  • 'The person's identifier is/is not blocked'
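The Secure Processing Module flow can be sketched as follows, with exact equality standing in for real fuzzy matching and all identifiers invented. The key point is the return value: only the two yes/no results leave the module, never the measures or the identifier.

```python
BLOCK_LIST = {"ID-4321"}   # maintained separately from the application

def spm_authenticate(test_measure, token):
    """token carries the Reference Measure and Relevant Data (e.g. an identifier)."""
    person_matches_token = test_measure == token["reference_measure"]
    identifier_blocked = token["relevant_data"] in BLOCK_LIST
    # only the yes/no results are released to the Application
    return person_matches_token, identifier_blocked

token = {"reference_measure": "M-77", "relevant_data": "ID-1234"}
print(spm_authenticate("M-77", token))   # (True, False): matches token, not blocked
```

Because the reference measure lives only on the person's own token and the module emits nothing but booleans, this design avoids the central biometric database that the regulatory slide proposes prohibiting.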
Privacy-Sensitive Architecture, e.g. Authentication Against a Block-List
[Diagram: a Sensor captures a Test-Measure and passes it to a Secure Processing Module, which reads the Reference Measure and Relevant Data from the token, checks the Relevant Data against the Block List (with Block List Maintenance), and passes only Results (Y/N) to the Application.]