TRANSCRIPT
Lecture 14: Aligning Usability and Security
Simson L. Garfinkel
CSCI E-170
http://e170.ex.com
© 2004 Simson L. Garfinkel
Administrivia
December 21: Today — last full lecture.
December 28: No class — eat turkey.
January 4: Lecture 15 — open for presentations.
January 11: Lecture 16 — open for presentations. Final projects due.
ps: I figured out how to do slides with LaTeX!
Projects?
How are things going?
Plans for presentations?
Anybody want to go on January 4th?
January 4th?
Extra topics?
CSCI E-170: Computer Security, Privacy, and Usability
“Security” has been viewed at odds with Privacy and Usability.
CSCI E-170 argues that they must go together. CSCI E-170 presents a framework for understanding these properties.
Bluetooth
Quick follow-up on how Bluetooth is running on the Mac.
L1 — Introduction to Security, Privacy and Usability
The role of policy:
• Security requires a policy that defines what is to be secured.
• Privacy requires a policy of how information is to be treated.
WIMP is more usable than the command-line.
Visibility and direct manipulation are the breakthrough concepts that made WIMP more usable than the command-line.
Usability: What is it?
• Satisfaction — interfaces we enjoy using.
• Efficiency — interfaces we are fast at using.
• Learnability — interfaces we can use without asking for help.
• Accuracy — interfaces we can use without making errors.
• Memorability — interfaces we can still use after a long time away.
Usability: How do we do it?
• Observe existing work practices
• Be consistent
• Employ iterative design
• Expose necessary information, not junk data
• Avoid confirmations, use undo instead.
• Design for responsiveness
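The “avoid confirmations, use undo instead” guideline above can be sketched in code. This is a hypothetical example (the class and method names are mine, not from any real toolkit): the destructive action runs immediately, and recovery is an undo rather than an up-front “Are you sure?” dialog.

```python
# Hypothetical sketch: act immediately and keep an undo stack,
# instead of interrupting the user with a confirmation dialog.
class UndoableStore:
    def __init__(self):
        self.items = {}
        self._undo = []  # stack of (action, key, old_value) records

    def delete(self, key):
        # No "Are you sure?" prompt: perform the deletion at once,
        # but remember enough state to reverse it.
        self._undo.append(("delete", key, self.items.pop(key)))

    def undo(self):
        action, key, old = self._undo.pop()
        if action == "delete":
            self.items[key] = old  # restore the deleted item


store = UndoableStore()
store.items["draft"] = "hello"
store.delete("draft")        # destructive action, no dialog
store.undo()                 # recovery path
print(store.items["draft"])  # prints "hello"
```

The design choice: confirmations train users to click through, while undo makes the safe path also the fast path.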
Why is this so hard?
Whitten & Tygar: It is inherently difficult to create interfaces for computer security applications.
Lots of reasons:
• The Secondary Goal Property
• The Hidden Failure Property
• The Abstraction Property¹
• The Barn Door Property
• The Weakest Link Property
¹ Security rules are easily understood by programmers but “alien and unintuitive” to everybody else.
This really feels like “blame the user.”
Why not make it invisible?
Whitten: Because you can’t!
If the user of the application depends on a security protection being enabled, and the possibility exists of it being disabled, then the user action must be completely disallowed —
— or the lack of the protection must be made visible to the user, and tools for remedying the problem should be made available.
Alternative Theory
[Venn diagram: within the universe of software developers, expertise in usability and expertise in security overlap in only a small “usable security” area.]
Perhaps Usability and Security are seen as antagonistic because:
• Our definition of “security” precludes creating systems that are usable.
Clark-Wilson Security Model
“A Comparison of Commercial and Military Computer Security Policies,” IEEE Symposium on Security and Privacy, 1987.
• Each datum in the system is a constrained data item (CDI) or an unconstrained data item (UDI).
• CDIs must be protected.
• Transformation procedures (TPs) change CDIs with well-formed transactions.
• Integrity verification procedures (IVPs) ensure that CDIs work as advertised.
➔ Clark-Wilson is a model in which integrity is more important than disclosure control.
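A minimal sketch of the Clark-Wilson vocabulary (a toy example of mine — the bank-balance invariant is my illustration, not from the paper): CDIs may only be changed through registered TPs, and an IVP checks the integrity constraint afterward.

```python
# Hypothetical toy model of Clark-Wilson roles.
class CDI:
    """Constrained data item: a balance that must never go negative."""
    def __init__(self, balance=0):
        self.balance = balance

def transfer(src: CDI, dst: CDI, amount: int):
    """A TP: a well-formed transaction that preserves the invariant."""
    if amount < 0 or src.balance < amount:
        raise ValueError("transaction would violate integrity")
    src.balance -= amount
    dst.balance += amount

def ivp(*cdis: CDI) -> bool:
    """An IVP: verify that every CDI satisfies its integrity constraint."""
    return all(c.balance >= 0 for c in cdis)

a, b = CDI(100), CDI(0)
transfer(a, b, 40)        # CDIs change only via the TP
print(ivp(a, b))          # prints True
```

The point is architectural: applications never touch CDIs directly, so integrity depends only on the small set of certified TPs and IVPs.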
What’s wrong with disclosure control?
• Disclosure control is hard to get right.
• Screw-ups can’t be reversed (the “Barn Door” property).
• Once data leaks, there is no telling how far it spreads.
It may be that usability and disclosure control are hard to reconcile, but usability and other kinds of security are easier.
Other factors complicating HCI-SEC
• “Security is like a chain”
• “Humans are the weakest link”
• Emphasis on bug fixing, rather than correct design. (Similar to NTSB reports.)
• Emphasis on cryptography.
• Researcher disinterest.
• Difficulty of performing user tests.
• Authentication is an attractive rathole. (Passwords, PKI, biometrics)
Psychological basis
• People exaggerate minor risks
• Unknown risks are perceived as riskier than known risks.
• Involuntary risks are perceived as more risky than voluntary risks.
Computer Security at a Crossroads
We must do better!
• Systems are now always-on
• Very powerful systems are connected.
• Viruses can do a lot of damage
AOL 2004 Survey
• Experts sent to 329 homes.
• 20% were currently infected by a virus.
• 63% said that they had been infected in the past.
• 80% had spyware or adware.
. . . Yet 70% believed they were safe from viruses and other online threats.
Why? 85% had some kind of anti-virus. . .
Yet 67% of those machines were not up-to-date.
“AOL survey finds rampant online threats, clueless users,” Computerworld, October 23, 2004. http://www.computerworld.com/securitytopics/security/story/0,10801,96918,00.html
Ideas for aligning security and usability
• A workable threat model.
• Improved Visibility
• Decreased Functionality
• Admonitions — security co-pilots.
A workable threat model
• Decreased emphasis on disclosure control — a hard sell in this era of identity theft.
• Attackers that are active, but who do not control the infrastructure.
➔ Digital signatures, not message encryption.
Improved Visibility
• The forensics work we did was really about visibility.
• Outlook address book failures — also about visibility.
• Other examples. . . ?
Decreased Functionality
• Don’t allow programs unrestricted access to files — Yee’s access by designation.
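Yee’s “access by designation” can be sketched as follows (a hypothetical example: `powerbox_open` stands in for a trusted file picker that runs outside the application’s authority). The program receives a handle only for the one file the user explicitly designated; it holds no ambient authority to open arbitrary paths.

```python
# Hypothetical sketch of access by designation.
import io

def powerbox_open(user_choice: str):
    # Stands in for a trusted file-picker dialog: the user's act of
    # choosing the file IS the grant of authority. Here we fake the
    # chosen file with an in-memory stream.
    return io.StringIO("user-designated contents")  # read handle only

def word_count(handle) -> int:
    # The application operates only on the handle it was given;
    # it cannot reach out and open other files on its own.
    return len(handle.read().split())

print(word_count(powerbox_open("report.txt")))  # prints 2
```

The contrast is with the usual design, where any program can call `open()` on any path the user can reach, so a compromised program inherits all of the user’s authority.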
Admonitions — Security Co-Pilots
Yee’s No-Surprise condition
“Definition: Security software is usable if the people who are expected to use it:”
If:
actors A = {A0, A1, . . . , An} (where A0 is the user)
perceived abilities P = {P0, P1, . . . , Pn}
real abilities R = {R0, R1, . . . , Rn}
Then the no-surprise condition requires that:
P0 ⊆ R0 and Pi ⊇ Ri for i > 0
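The condition can be checked mechanically if abilities are modeled as sets of action names (a toy model of mine, with actor 0 taken to be the user): the user may underestimate their own abilities, but must never underestimate anyone else’s.

```python
# Toy check of Yee's no-surprise condition over ability sets.
def no_surprise(perceived: list, real: list) -> bool:
    user_ok = perceived[0] <= real[0]        # P0 ⊆ R0: user's self-image
    others_ok = all(p >= r                   # Pi ⊇ Ri for i > 0:
                    for p, r in zip(perceived[1:], real[1:]))
    return user_ok and others_ok

# User underestimates themself: allowed (a pleasant surprise).
P = [{"read"}, {"read"}]
R = [{"read", "write"}, {"read"}]
print(no_surprise(P, R))   # prints True

# User underestimates another actor: a dangerous surprise.
P2 = [{"read"}, set()]
R2 = [{"read"}, {"write"}]
print(no_surprise(P2, R2)) # prints False
```

The asymmetry is the whole point: surprises about one’s own abilities are merely inconvenient, while surprises about an attacker’s abilities are security failures.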
Yee’s Guidelines for secure interaction design
General Principles:
• Path of least resistance — the most natural way to do a task should also be the safest.
• Appropriate boundaries — the interface should draw distinctions among objects and actions along boundaries that matter to the user.
Maintaining the actor-ability state:
• Explicit authorization — a user’s authority should only be granted to another actor through an explicit user action understood to imply granting.
• Visibility — the interface should let the user easily review any active authority relationships that could affect security decisions.
• Recoverability — the interface should let the user easily revoke authority that the user has granted, whenever revocation is possible.
• Expected ability — the interface should not give the user the impression of having authority that the user does not actually have.
Communicating with the user:
• Trusted path — the user’s communication channel to any entity that manipulates authority on the user’s behalf must be unspoofable and free of corruption.
• Identifiability — the interface should ensure that identical objects or actions appear identical, and that distinct objects or actions appear different.
• Expressiveness — the interface should provide enough expressive power to let users easily express security policies that fit their goals.
• Clarity — the effect of any authority-manipulating user action should be clearly apparent to the user before the action takes effect.
“Safe Staging”
Whitten — what does it mean?
Can we find examples of Safe Staging?
“Software with training wheels” — why not just use this idea?
“Metaphor Tailoring”
HCI-SEC Enhancing Techniques
• Explicit Install
• Consistent Vocabulary
• Distinguish Taint
• User Audit
• No Kit
• Run vs. Open
• Self-Signed Certs / Continuity of Identity
• More Secure vs. Less Secure — distinguish between similar operations that are more secure and less secure.
• Access Analysis — provide a facility for reporting the specific access rights and capabilities of a user or group.
• Disable new features by default.
• Match Expectations — match the security expectations created in the user’s mind with the security actually delivered by the tool.
• Leverage authentication.
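The “Self-Signed Certs / Continuity of Identity” technique above is essentially trust-on-first-use key pinning, as in SSH: the first key seen for a host is remembered, and a later change of key is surfaced loudly. A minimal hypothetical sketch (the function and pin store are my illustration):

```python
# Hypothetical sketch of continuity-of-identity (TOFU key pinning).
import hashlib

pins = {}  # host -> fingerprint of the first-seen public key

def check_host(host: str, public_key: bytes) -> str:
    fp = hashlib.sha256(public_key).hexdigest()
    if host not in pins:
        pins[host] = fp           # first contact: remember the key
        return "new"
    if pins[host] == fp:
        return "same identity"    # continuity preserved
    return "KEY CHANGED"          # warn loudly: possible impersonation

print(check_host("mail.example.com", b"key-A"))  # prints "new"
print(check_host("mail.example.com", b"key-A"))  # prints "same identity"
print(check_host("mail.example.com", b"key-B"))  # prints "KEY CHANGED"
```

No certificate authority is consulted; the security claim is weaker (“same party as last time”) but checkable, visible, and free of third-party trust.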
Thompson — Reflections on Trusting Trust
You’ve got to trust something. . .