Human Factors of Security Systems
Andrew Patrick, Ph.D.
Information Security Group, Institute for Information Technology
http://iit-iti.nrc-cnrc.gc.ca
[email protected]
http://www.AndrewPatrick.ca
Andrew Patrick @ NRC, Feb. 14, 2007
•The Book
•a collection of articles on the human factors of security
•sections on authentication mechanisms, secure systems, privacy, and design
July 18-20, 2007
Carnegie Mellon University
Pittsburgh, PA
Symposium On Usable Privacy and Security
The conference. Tutorials, workshops, keynotes, the latest research.
Usable Security (USEC'07), February 15-16, 2007
Trinidad/Tobago
A workshop co-located with The Eleventh Conference on Financial Cryptography and Data Security (FC'07)
The other conference, happening this week.
• Financial Cryptography is an interesting conference focusing on security issues for the financial sector.
• Based in the Caribbean!
‘... security is only as good as its weakest link, and people are the
weakest link in the chain.’
- Bruce Schneier: ‘Secrets and Lies’ (2000).
When we talk about humans and security, this is the typical message that we hear.
“... the human side of computer security is easily exploited and constantly
overlooked. Companies spend millions of dollars on firewalls, encryption and secure access devices, and it’s money
wasted, because none of these measures address the weakest link in
the security chain”
- Kevin Mitnick, 2000
… and here is another. Mitnick was referring to his successful social engineering attacks.
Bio from wikipedia:
…a famous computer cracker, who was convicted of wire fraud and of breaking into the computer systems of Fujitsu, Motorola, Nokia, and Sun Microsystems. Mitnick served five years in prison (four years of it pre-trial), 8 months of that in solitary confinement, and was released on January 21, 2000. During his supervised release, which ended on January 21, 2003, he was initially restricted from using any communications technology other than a landline telephone.
He offers security consulting services through his company Mitnick Security Consulting, LLC and has co-authored two books on computer security. The books are The Art of Deception (2002), which focuses on social engineering, and The Art of Intrusion (2005), focusing on real stories of security exploits.
There is no problem so complex that it cannot simply be blamed on the pilot.
— Dr. Earl Wiener
I want to draw a parallel between the state of information security today and the state of airline safety 20 years ago. Back then, it was very common to blame airline accidents on “pilot error”, without investigating the factors that led to that error.
The same is true for information security today. We are too quick to blame the users, without spending enough time understanding, and remedying, the factors that lead to the users’ behavior.
A lot of progress was made in airline safety when a systems approach was adopted, and the same can be done for IT security.
So, what do users do?
They write their passwords down and store them in unsafe places.
And they choose easy-to-remember (and easy-to-guess) passwords.
They share their passwords with friends and coworkers.
And they give away their passwords to just about anybody who asks.
• And they continue to support the spam industry.
• One study showed that 33% of people have clicked on links in spam messages.
• And 10% have bought products advertised in spam!
“Email users 'sustaining spam industry'”, 23 March 2005
http://www.weboptimiser.com/search_engine_marketing_news/13020846.html
And, they are susceptible to phishing attacks – being lured to false web sites to give up their username/password credentials.
A recent report suggested that 5% of people respond to phishing messages.
Report on Phishing
A Report to the Minister of Public Safety and Emergency Preparedness Canada and the Attorney General of the United States
Binational Working Group on Cross-Border Mass Marketing Fraud October 2006
the blame trap
James Reason:
• blaming the fallible individuals at the front end is universal,natural, satisfying, and convenient
• but it does little to solve the problem
• and may put focus on the wrong person
• and leads to ineffective fixes (e.g., training, procedures, policies, supervision)
active errors
errors made by an individual that directly lead to a problem
e.g., sharing a password
latent errors
• delayed-action errors made by a system or organization
• consequences are indirect
• often related to design, construction, operation, etc.
• e.g., Chernobyl accident
• e.g., not supporting collaboration between workers
the organizational accident
James Reason, talking about Chernobyl
five phases:
1. organizational process leading to latent errors
2. error-producing conditions within specific places
3. active errors made by individuals
4. events in which one or more safeguards are by-passed
5. outcomes that vary in severity of consequences (free lessons vs. catastrophes)
Zippy delivery, 7 minutes or it’s free
Clip from prevent-it.ca advertisement, WSIB Ontario
Framework for Examining HCI and Security(Sasse et al., 2001)
Users
Technology
Goals & Tasks
Context
new topic: framework for understanding usability and security
cognitive psychology
• the study of the functions of the human mind
• features, characteristics, limitations
•thoughts, emotions, behaviors are systematic, predictable
•we can design with them in mind
human memory
characteristics of human memory
• limited capacity of working memory
• decay over time
• memory for gist & meaning rather than literal details
• strengthening by repetition, weakening by time
• cannot forget on demand
• recognition better than recall
• memory is not a set of compartments where experiences are locked away
• memory is meaningful: constructions, schemas, and so on
So, we should adopt techniques that make passwords more meaningful…
“Human Selection of Mnemonic Phrase-Based Passwords”
Cynthia Kuo, Sasha Romanosky, and Lorrie Faith Cranor (Carnegie Mellon University), SOUPS 2006
cognitive passwords (personal history or preferences)
– mother’s maiden name
… and we can base passwords on knowledge, rather than on arbitrary memories
http://www.nature.com/neuro/journal/v4/n11/full/nn739.html
… and we know that we are much better at recognition tasks than we are at recall tasks.
… and promote systems based on recognition rather than recall
… because the evidence is that we will remember passwords based on recognition better
Here is another example of a recognition-based authentication system
Déjà Vu
http://www.ischool.berkeley.edu/~rachna/dejavu/
relaxing “three strikes” rule can reduce password resets by 44%, with little
impact on vulnerability if strong passwords enforced
… and since memory is imperfect and reconstructive, we can accommodate it by relaxing some arbitrary restrictions
most failures to remember passwords are confusions, not forgetting (wrong password, old password)
"Ten strikes and you’re out": Increasing the number of login attempts can improve password usability (revised February 18, 2003)
Sacha Brostoff and Angela Sasse, HCISEC workshop, 2003
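A relaxed lockout policy like the one Brostoff and Sasse describe can be sketched in a few lines. The class and method names here are hypothetical, and a real system would persist counters and add rate limiting:

```python
# Minimal sketch of the "ten strikes" idea: a configurable attempt limit
# instead of a hard three-strikes lockout. Names are hypothetical; a real
# system would persist counters and rate-limit attempts.
class LoginPolicy:
    def __init__(self, max_attempts: int = 10):
        self.max_attempts = max_attempts
        self.failures = {}          # account -> consecutive failed attempts

    def attempt(self, account: str, password_ok: bool) -> str:
        if self.failures.get(account, 0) >= self.max_attempts:
            return "locked"         # requires a help-desk reset
        if password_ok:
            self.failures[account] = 0   # success clears the counter
            return "success"
        self.failures[account] = self.failures.get(account, 0) + 1
        if self.failures[account] >= self.max_attempts:
            return "locked"
        return "retry"
```

With ten attempts, a user who confuses current and old passwords a few times still gets in without a reset, while a guessing attacker gains little if strong passwords are enforced.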
knowledge and motivation
• security is at best an enabling task, and at worst a barrier to real work
• security consciousness can be linked to fear of being labeled “paranoid” or untrusting
• password sharing as a sign of trust
• perceptions of unimportance: nobody will target me, and what would they get if they did
• perceptions of helplessness: hackers will get in anyway
• security policies and procedures unrealistic and user not accountable
• using security encourages others to try to break in
… so we can train people why and how to use passwords
University of Michigan
technology
passwords or PINs
are you asking users to use passwords, or PINs, or both?
• arbitrary password restrictions?
• exactly 8 characters?
• must have punctuation?
number of passwords
I use a unique password for each login account that I have.
I have collected about 200 different passwords.
I use pwsafe to maintain them. Protected by a master key, and stored on an encrypted USB device.
single sign-on
we don’t use single sign-on, or even password synchronization, enough
ambushing
we often require people to change their password immediately, with no warning, when they want to do their primary task
public key systems (PKI)
people often point to PKI systems as the solution: users will have a cryptographic key, and use it to authenticate themselves
Why Johnny Can’t Encrypt
• only 33% of people could sign and encrypt email
• 25% accidentally exposed the secret that was to be protected
Why Johnny Can’t Encrypt: A Usability Evaluation of PGP 5.0
Alma Whitten and J. D. Tygar
on the surface, an attractive and simple interface…
people did not know that they had to get public keys from other people
if they knew they needed keys for other people, they had trouble obtaining them
people failed to develop a conceptual model of the security operations
goals and tasks
security interferes with “real” work
cost of authentication failures
time
embarrassment
teamwork and collaboration
• people usually work in groups, where information sharing is essential
• and yet, the information systems are often not designed to support the level of sharing that goes on
• with no other alternative, users will resort to nasty habits, such as password sharing
context
imagine you are using the machine and can’t remember your password, or can’t get your fingerprint to be recognized
imagine you are working in an organization that has this as their physical security measure
hot topics
new topic: current hot topics in usability and security
user authentication
traditionally has been the username and password
two-factor authentication
something you have (card, token)something you know (password, PIN)something you are or do (biometric, signature)
US banks
•Federal Financial Institutions Examination Council
•new requirements to implement two-factor authentication for Internet banking
one-time passwords
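The slide does not name a specific scheme, but one common family of one-time passwords is HOTP/TOTP (RFC 4226/6238). An HOTP-style generator is a reasonable sketch of the idea:

```python
# Sketch of an HOTP-style one-time password generator (RFC 4226), assuming
# a secret shared between the user's token and the server. TOTP is the
# same construction with a time-step counter.
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Derive a one-time password from a shared secret and a counter."""
    msg = struct.pack(">Q", counter)               # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Each counter value yields a fresh code, so a captured code is useless
# for a later login. RFC 4226's test secret gives "755224" at counter 0.
print(hotp(b"12345678901234567890", 0))
```

Each code is valid only once, which is why one-time passwords resist simple credential capture.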
challenge questions
off-line attacks
these authentication methods work against simple off-line attacks, where credentials are obtained by:
•data breaches
•simple phishing
•key loggers
dynamic attacks
they do not work against dynamic attacks
man-in-the-middle
the user does all the authentication
•bad software running on the client computer
•installed via luring, piggybacking, or viruses (e.g., a Super Bowl web site used a JavaScript exploit against unpatched computers)
•most common Trojans are shells designed to download specific software
•can hijack secure financial sessions and insert fraudulent transactions
•relies on the user to do all the authentication for it

The name comes from Greek mythology. The Greek siege of Troy had lasted for ten years when the Greeks devised a new ruse: a giant hollow wooden horse. It was built by Epeius and filled with Greek warriors led by Odysseus. The rest of the Greek army appeared to leave, but actually hid behind Tenedos. Meanwhile, a Greek spy, Sinon, convinced the Trojans that the horse was a gift despite the warnings of Laocoon and Cassandra; Helen and Deiphobus even investigated the horse; in the end, the Trojans accepted the gift. In ancient times it was customary for a defeated general to surrender his horse to the victorious general as a sign of respect. The horse was the sacred animal of Poseidon; during the contest with Athena over the patronage of Athens, Poseidon gave men the horse, and Athena gave the olive tree.
“Zombies” are computers that are being controlled remotely in order to do malicious tasks.
Zombies organized together into “botnets” for efficient control. These are often rented out to cyber criminals.
Estimates are that 80% of spam email is sent by zombies. New zombies are believed to be responsible for the massive increases in spam over the past 3-4 months.
Zombies are also used for phishing, distributed denial of service attacks, and click fraud.
•distributed denial of service attacks are often against anti-spam organizations and other spammers
site authentication
•authenticating the site to the user
•used to prevent phishing attacks
phishing
The link in the email does not lead to PayPal, but instead to a server in China.
This attack actually rewrites the address bar, making it harder to recognize as a scam.
•Notice that the user is asked for the PIN for their bank card!
•This is the most common type of phishing attack because of the ease of “cashing” (to be discussed later).
From the Anti-Phishing Working Group
The link shows tdcanadatrust.com, but the actual destination is tdcanadatrust.go.ro, and that is actually a server in Romania
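A crude heuristic for this kind of lure (visible link text naming one host while the href points somewhere else) might look like the sketch below. Real phishing filters use many more signals; this is only for illustration:

```python
# Crude sketch of a check for mismatched phishing links: visible link
# text that names one host while the href points at a different one.
# Easily fooled; real filters use many more signals.
from urllib.parse import urlparse

def looks_like_phish(link_text: str, href: str) -> bool:
    """Flag links whose visible text names a different host than the target."""
    if "//" not in link_text:
        link_text = "http://" + link_text        # bare hostname in the text
    shown = urlparse(link_text).hostname or ""
    actual = urlparse(href).hostname or ""
    return not actual.endswith(shown)

# tdcanadatrust.com vs. tdcanadatrust.go.ro: same-looking name,
# different destination.
print(looks_like_phish("www.tdcanadatrust.com",
                       "http://tdcanadatrust.go.ro/login"))  # → True
```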
The real destination for the link is niccakorea.com, a server in Korea
Notice how well the Scotiabank interface is replicated. There are few cues to indicate that this is not the authentic site.
From the Anti-Phishing Working Group, December 2006 data.
The most common host country for fake phishing sites is the USA, because of trojans and “zombie” computers.
SSL certificates
•supposed to provide site authentication
•but, as we have seen, anybody can buy a certificate
•and many web sites don’t use them properly (e.g., http on a page with a login form)
•see the Canadian Internet Registration Authority, cira.ca
•and browsers/users are not good at checking certificates
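For contrast, here is a sketch of what correct certificate checking looks like programmatically. Python's ssl module, via create_default_context(), enforces both chain validation and hostname matching by default; the point is that this checking is the library's job, not something users should eyeball:

```python
# Sketch of programmatic certificate checking with Python's ssl module.
# create_default_context() turns on chain validation and hostname
# matching by default.
import socket
import ssl

def fetch_cert_subject(host: str, port: int = 443) -> dict:
    """Connect over TLS (verifying the certificate) and return its subject."""
    ctx = ssl.create_default_context()      # CERT_REQUIRED + check_hostname
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return dict(item[0] for item in tls.getpeercert()["subject"])

# A mismatched or unverifiable certificate raises an SSL error here,
# instead of relying on the user to notice a browser warning.
```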
SiteKey
•method used by Bank of America to identify the site
•requires users to notice that the personalized keys are not present during this transaction
•combined with machine fingerprinting, so a challenge question is presented on login from a new machine
•does not protect against bad user behavior (see the Schechter article and my critique)
•does not protect against man-in-the-middle attacks and trojans
•at least one man-in-the-middle attack against SiteKey has been seen in the wild!
single sign-on, one master password
random unique password for each site
password is never stored, but regenerated as needed
site labeling, so imposter sites are revealed
passpet.org
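The Passpet idea of regenerating a unique per-site password from one master secret can be sketched as below. Passpet's actual key derivation and site-labeling scheme differ; this only illustrates the principle:

```python
# Sketch of Passpet-style per-site password derivation: one master
# secret, a unique password per site, nothing stored. (Passpet's actual
# KDF and labeling scheme differ; this only illustrates the idea.)
import base64
import hashlib

def site_password(master: str, site_label: str, length: int = 12) -> str:
    """Deterministically derive a per-site password from the master secret."""
    digest = hashlib.pbkdf2_hmac("sha256", master.encode(),
                                 site_label.encode(), 100_000)
    return base64.urlsafe_b64encode(digest).decode()[:length]

# The same inputs always regenerate the same password, and different
# site labels give unrelated passwords, so a phished site learns nothing
# about credentials for other sites.
```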
biometrics
NEW TOPIC: biometrics
[Chart: usability vs. accuracy for biometric types: hand, face, speech, iris, retina, signature, fingerprint, and an ideal; from Coventry book chapter]
fingerprints
•misconceptions about using scanners (rolling, swiping, tip-only, pressure)
• ergonomic issues with finger placement on scanners
eye positioning
glasses, small eyes, dark irises
feedback on false rejections
system provides feedback on the screen about eye position, but you can’t see it because you are looking at the sensor
feedback and training
transparency of operations
the best systems provide information about operations, such as capture, encoding, and recognition steps
integration
the best systems are well integrated into the host, such as one-touch unlocking of screen savers, or context-menu file encryption
public opinions
• security as an “enabling task” that gets in the way (Sasse)
• biometrics can be seen as unhygienic, stressful (Coventry)
• some fear of bodily harm to obtain biometric information (movies)
•union at Pearson airport objecting to iris recognition due to infra-red light
• lack of understanding about biometric templates and storage mechanisms
• opinion polls show weak support and concerns (e.g., TNS/TRUSTe 2005)
•more support in recent polls
• some cross-cultural differences (within Europe; US/Canada)
privacy
• deployments likely covered by provincial and/or federal privacy legislation
• privacy impact assessments would be required
• some tools and guidelines are emerging
• bioprivacy.org
• Ontario Privacy Commissioner recommendations for a possible Toronto anti-double-dipping deployment
• require: encryption, single use, no match to latents, strict access controls, separate storage of personal information
universal, public identifier
•biometrics provide a universal, public identifier
•this leads to universal risk
•what is usually needed is a specific, secret identifier
Clip from the film Minority Report, starring Tom Cruise
Sarnoff patent application for capturing iris biometric information from moving subjects, similar to Minority Report: an array of cameras captures multiple images until at least one good one is obtained.
increased risk
•biometrics are only as good as the registration process and the security mechanisms
•violations can lead to increased risk if there is a false sense of security
•can lead to a false sense of confidence in the authenticity of transactions
identity theft
NEW TOPIC: The term "identity theft" is an oxymoron. It is not possible to steal a human identity; human identity is a psychological thing, or the subject of philosophical conversation. (Wikipedia)
see Bruce Schneier’s essays for good insights
Video from Citibank advertisement
impersonation
The real crime of identity theft is impersonation – using some credentials to claim you are someone else
fraud
… and then the impersonator commits fraud, which is defined as deception for personal gain
This is not a new crime.
authentic user not present
It is important to note that when the fraud takes place, the authentic user is not present.
So attempts to train users, or provide strong authentication, are not going to do any good.
credentials for impersonation
The credentials used for the impersonation are not in the hands of the user, and may have been obtained without any involvement of the user (e.g., database breach)
credential receiver
The credential receiver (bank or other institution) is the only one in a position to catch and prevent fraud
wrong incentives
•but the institutions do not have the incentives to prevent fraud
•positive incentive to allow transactions
•little or no negative incentive if fraud is committed
•e.g., recent cases of mortgage fraud turned on whether the defrauded owner or the bank bears the loss
•In a 2005 ruling (Household Realty v. Liu), the court said that a fraudulent home sale was nonetheless valid once the land title had been registered. The Liu case involved a woman who sold the family home without the consent or knowledge of her husband.
•On February 6, 2007, the Ontario Court of Appeal closed a loophole that had allowed fraudsters to sell homes from under the unsuspecting feet of those who owned them. In a 5-0 ruling, the court said that banks and other lenders will pay a painful price if they do not diligently research mortgage applications to ensure that they are not being made unwitting parties to a swindle.
weak authentication
why am I receiving dozens of pre-filled credit applications each year? They are a favorite target during dumpster diving.
protect the transaction
•the fraud occurs at the time of the transaction (“cashing” after “phishing”)
•the solution is to protect the transaction, not the credential
•why do I not keep my credit card information secret? Because the bank protects the transactions
•verify the transaction based on risk parameters (see the RSA solution)
•limit possible transactions (e.g., view and internal transfers only)
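The idea of verifying transactions against risk parameters can be sketched as a toy score. Every signal, weight, and threshold here is invented for illustration; the point is that the check happens on the transaction, not on the login credential:

```python
# Toy sketch of transaction-side risk scoring. All signals, weights, and
# thresholds are invented for illustration.
def needs_verification(amount: float, payee_known: bool,
                       new_device: bool) -> bool:
    score = 0
    score += 2 if amount > 1000 else 0       # unusually large transfer
    score += 2 if not payee_known else 0     # never paid this payee before
    score += 1 if new_device else 0          # unrecognized machine
    return score >= 3                        # step-up verification or hold

# A routine bill payment goes through; a large transfer to a new payee
# from an unrecognized machine is held for verification.
print(needs_verification(50.0, True, False))      # → False
print(needs_verification(5000.0, False, True))    # → True
```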
“The Economy of Phishing”
Christopher Abad, Cloudmark
security administrators
•it is easy to fall into the blame trap again when thinking about security administrators
•but consider their job and work environment
•security operations are precarious and getting worse
•small changes have large impacts
•systems can already be insecure, awaiting hackers
•new threats daily
•insiders are the greatest threat
•they don’t own or control the equipment
•multiple, limited tools for monitoring
•limited views
•special expertise and complex interfaces
•24/7, real-time
•and when you get it right, nobody notices
taken from Steve Chan (UC Berkeley) presentation at SOUPS 2005
different work practices require different tools
GUIs don’t scale or provide the level of control
need for integrated tools and visualization
users make security errors because…
Conclusions…
designs give no/wrong cues
unattainable cognitive tasks
insufficient knowledge
a safety-critical approach to information security
I am arguing for a safety-critical approach to information security
Analyze the system as a whole, not just users, or technology, or operators
aligning primary and secondary goals
The users’ primary goal is often in conflict with the secondary goal of maintaining good security. This leads to the belief that there must be a trade-off between usability and security.
security and economics
“More and more people are coming to realise that security failures are often due to perverse incentives rather than to the lack of suitable technical protection mechanisms.” (Ross Anderson)
• the person guarding the system is not the person who suffers when the security mechanisms fail
Are users really the weakest link in the security chain?
… and just in case you think Microsoft Vista is going to solve all of our problems…
Clip from Apple Computer advertisement