EE515/IS523 Think Like an Adversary Lecture 6 Access Control/Usability Yongdae Kim


TRANSCRIPT

Page 1: EE515/IS523 Think Like an Adversary Lecture 6 Access Control/Usability Yongdae Kim

EE515/IS523 Think Like an Adversary
Lecture 6: Access Control/Usability
Yongdae Kim

Page 2:

Recap: http://security101.kr

E-mail policy: include [ee515] or [is523] in the subject of your e-mail.

Student survey: http://bit.ly/SiK9M3

Student presentation: send me an email.

Preproposal meeting: today after class.

Page 3:

Kerberos vs. PKI vs. IBE

Still under debate. Let's look at them one by one!

Page 4:

Kerberos (cont.)

Parties: A (client), B (server), T (trusted third party).

1. A → T: A, B, N_A
2. T → A: E_KBT(k, A, L), E_KAT(k, N_A, L, B)
3. A → B: E_KBT(k, A, L), E_k(A, T_A, A_subkey)
4. B → A: E_k(T_A, B_subkey)

- E_KBT(k, A, L): token for B
- E_KAT(k, N_A, L, B): token for A
- L: life-time
- N_A?
- E_k(A, T_A, A_subkey): to prove to B that A knows k
- T_A: time-stamp
- E_k(B, T_A, B_subkey): to prove to A that B knows k
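A minimal sketch of the exchange above in Python. A toy authenticated cipher (SHA-256 keystream plus HMAC tag) stands in for E_K; the names (`K_AT`, `ticket_for_B`, etc.) follow the slide's notation, and this is illustrative, not real Kerberos:

```python
# Toy model of the basic Kerberos exchange (illustrative only: a
# hash-based XOR stream + HMAC tag stands in for E_K).
import hashlib, hmac, json, secrets

def _keystream(key: bytes, n: int) -> bytes:
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def E(key: bytes, obj) -> bytes:            # "encrypt": tag || ciphertext
    pt = json.dumps(obj).encode()
    ct = bytes(a ^ b for a, b in zip(pt, _keystream(key, len(pt))))
    return hmac.new(key, ct, "sha256").digest() + ct

def D(key: bytes, blob: bytes):             # verify tag, then "decrypt"
    tag, ct = blob[:32], blob[32:]
    assert hmac.compare_digest(tag, hmac.new(key, ct, "sha256").digest())
    return json.loads(bytes(a ^ b for a, b in zip(ct, _keystream(key, len(ct)))))

K_AT, K_BT = secrets.token_bytes(32), secrets.token_bytes(32)  # long-term keys

# 1. A -> T: A, B, N_A
N_A = secrets.token_hex(8)

# 2. T -> A: E_KBT(k, A, L) [token for B], E_KAT(k, N_A, L, B) [token for A]
k = secrets.token_hex(16)                   # fresh session key
L = "lifetime"
ticket_for_B = E(K_BT, {"k": k, "client": "A", "L": L})
token_for_A  = E(K_AT, {"k": k, "N_A": N_A, "L": L, "server": "B"})

# A opens its token; the echoed nonce N_A proves freshness
msg = D(K_AT, token_for_A)
assert msg["N_A"] == N_A
k_A = msg["k"]

# 3. A -> B: ticket_for_B, E_k(A, T_A) -- B recovers k from the ticket;
# the authenticator proves to B that A knows k
k_B = D(K_BT, ticket_for_B)["k"]
auth = E(bytes.fromhex(k_A), {"client": "A", "T_A": "t0"})
assert D(bytes.fromhex(k_B), auth)["client"] == "A"
assert k_A == k_B == k
```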

Page 5:

Kerberos (Scalable)

Parties: A (client), B (server), T (AS: authentication server), G (TGS: ticket-granting server).

1. A → T: A, G, N_A
2. T → A: E_KGT(k_AG, A, L), E_KAT(k_AG, N_A, L, G)
3. A → G: E_KGT(k_AG, A, L), E_kAG(A, T_A), B, N_A'
4. G → A: E_kAG(k_AB, N_A', L, B), E_KGB(k_AB, A, L, N_A')
5. A → B: E_KGB(k_AB, A, L, N_A'), E_kAB(A, T_A', A_subkey)
6. B → A: E_kAB(T_A', B_subkey)

Page 6:

Public Key Certificate

Public-key certificates are a vehicle by which public keys may be stored, distributed, or forwarded over unsecured media.

The objective: make one entity's public key available to others such that its authenticity and validity are verifiable.

A public-key certificate is a data structure with two parts:
- Data part: cleartext data including a public key and a string identifying the party (subject entity) to be associated with it.
- Signature part: the digital signature of a certification authority over the data part, binding the subject entity's identity to the specified public key.
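A sketch of that two-part structure. An HMAC under a secret shared with the verifier stands in for the CA's digital signature (a real CA would use a public-key signature such as RSA or ECDSA), and all names here are illustrative:

```python
# Toy certificate: a data part (subject + public key) and a signature
# part over it. HMAC is a stand-in for the CA's real signature scheme.
import hmac, json, secrets

ca_secret = secrets.token_bytes(32)   # stand-in for the CA's signing key

def issue_cert(subject: str, public_key: str) -> dict:
    data = {"subject": subject, "public_key": public_key}     # data part
    sig = hmac.new(ca_secret, json.dumps(data, sort_keys=True).encode(),
                   "sha256").hexdigest()                      # signature part
    return {"data": data, "sig": sig}

def verify_cert(cert: dict) -> bool:
    expected = hmac.new(ca_secret,
                        json.dumps(cert["data"], sort_keys=True).encode(),
                        "sha256").hexdigest()
    return hmac.compare_digest(cert["sig"], expected)

cert = issue_cert("alice@example.com", "PUBKEY-BYTES")
assert verify_cert(cert)                         # authentic binding
cert["data"]["subject"] = "mallory@example.com"  # tampering breaks it
assert not verify_cert(cert)
```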

Page 7:

CA

A trusted third party whose signature on the certificate vouches for the authenticity of the public key bound to the subject entity. The significance of this binding must be provided by additional means, such as an attribute certificate or policy statement.

The subject entity must have a unique name within the system (a distinguished name).

The CA requires its own signature key pair, whose public key must be available authentically.

Can be off-line!

Page 8:

ID-based Cryptography

- No separate public key: the public key is the ID itself (email, name, etc.)
- PKG: private key generation center
  - SK_ID = PKG_S(ID)
  - The PKG's public key is public
  - The PKG distributes the private key associated with each ID
- Encryption: C = E_ID(M)
- Decryption: D_SK(C) = M

Page 9:

Discussion (PKI vs. Kerberos vs. IBE)

- On-line vs. off-line TTP: implications?
- Non-repudiation?
- Revocation?
- Scalability?
- Trust issues?

Page 10:

OS Security

OS security is essentially concerned with four problems:
- User authentication links users to processes.
- Access control is about deciding whether a process can access a resource.
- Protection is the task of enforcing these decisions: ensuring a process does not access resources improperly.
- Isolation is the separation of one process's resources from other processes.

Page 11:

Access Control

The OS mediates access requests between subjects and objects.

This mediation should (ideally) be impossible to avoid or circumvent.

[Diagram: Subject → access request → Reference monitor → Object]

Page 12:

Definitions Subjects make access requests on objects.

Subjects are the ones doing things in the system, like users, processes, and programs.

Objects are system resources, like memory, data structures, instructions, code, programs, files, sockets, devices, etc…

The type of access determines what to do to the object, for example execute, read, write, allocate, insert, append, list, lock, administer, delete, or transfer

Page 13:

Access Control

Discretionary Access Control (DAC):
- Access to objects (files, directories, devices, etc.) is permitted based on user identity.
- Each object is owned by a user. Owners can specify freely (at their discretion) how they want to share their objects with other users, by specifying which other users can have which form of access to their objects.
- Discretionary access control is implemented on any multi-user OS (Unix, Windows NT, etc.).

Mandatory Access Control (MAC):
- Access to objects is controlled by a system-wide policy, for example to prevent certain flows of information.
- In some forms, the system maintains security labels for both objects and subjects, based on which access is granted or denied.
- Labels can change as the result of an access.
- Security policies are enforced without the cooperation of users or application programs.
- Mandatory access control for Linux: http://www.nsa.gov/research/selinux/

Page 14:

Access Control Matrix

         Obj 1  Obj 2  Obj 3  ...  Obj n
Subj 1   rwl    rwlx   -      -    l
Subj 2   rwl    rlx    rwl    -    -
Subj 3   -      -      -      rl   r
...
Subj m   rl     lw     rl     rw   r
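The matrix can be stored column-wise (the ACLs of the following slides, kept with objects) or row-wise (capability lists, kept with subjects). A small sketch using the first rows of the matrix above:

```python
# Sparse access-control matrix: only granted (subject, object) cells
# are stored; '-' cells are simply absent.
matrix = {
    ("Subj1", "Obj1"): "rwl", ("Subj1", "Obj2"): "rwlx",
    ("Subj2", "Obj1"): "rwl", ("Subj2", "Obj2"): "rlx", ("Subj2", "Obj3"): "rwl",
}

def acl(obj):            # column: which subjects may access this object, and how
    return {s: r for (s, o), r in matrix.items() if o == obj}

def capabilities(subj):  # row: which objects this subject may access, and how
    return {o: r for (s, o), r in matrix.items() if s == subj}

assert acl("Obj1") == {"Subj1": "rwl", "Subj2": "rwl"}
assert capabilities("Subj2") == {"Obj1": "rwl", "Obj2": "rlx", "Obj3": "rwl"}
```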

Page 15:

Representations

An access control matrix can be represented internally in different ways:
- Access Control Lists (ACLs) store the columns with the objects
- Capability lists store the rows with the subjects
- Role-based systems group rights according to the "role" of a subject

      O1    O2   ...
S1    rwl   wl   -
S2    ida   wlk  -
S3    -     -    rl
...
Sm    rwlx  wi   w

Page 16:

Access Control Lists

The ACL for an object lists the access rights of each subject (usually users).

To check a request, look in the object’s ACL.

ACLs are used by most OSes and network file systems, e.g. NT, Unix, and AFS.

Page 17:

ACL Problems

To be secure, the OS must authenticate that the user is who (s)he claims to be.

To revoke a user’s access, we must check every object in the system.

There is often no good way to restrict a process to a subset of the user’s rights.

Page 18:

Capabilities

Capabilities store the allowed list of object accesses with each subject.

When the subject requests access to object O, it must provide a “ticket” granting access to O.

These tickets are stored in an OS-protected table associated to each process.

No widely-used OS uses pure capabilities. Some systems have “capability-like” features: e.g. Kerberos, NT, OLPC, Android

Page 19:

ACL vs. Capabilities

Capabilities do not require authentication: the OS just checks each ticket on access requests.

Capabilities can be passed, or delegated, from one process to another.

We can limit the privileges of a process, by removing unnecessary tickets from the table.
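A sketch of the ticket-table idea behind these three points; the process and object names are made up:

```python
# Per-process ticket tables: an access succeeds only if a matching
# (object, right) ticket is present. Tickets can be delegated to
# another process, or dropped to shrink a process's privileges.
tickets = {"proc_A": {("file.txt", "r"), ("file.txt", "w"), ("sock", "r")}}

def access(proc, obj, right):
    return (obj, right) in tickets.get(proc, set())

# Delegation: pass the read ticket on file.txt from proc_A to proc_B
tickets.setdefault("proc_B", set()).add(("file.txt", "r"))
assert access("proc_B", "file.txt", "r")
assert not access("proc_B", "file.txt", "w")   # only what was delegated

# Least privilege: drop the write ticket from proc_A
tickets["proc_A"].discard(("file.txt", "w"))
assert not access("proc_A", "file.txt", "w")
```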

Page 20:

Roles

[Diagram: subjects S1...Sm assigned rights directly on objects O1...On, versus assigned indirectly through roles R1, R2]

Page 21:

Unix/POSIX Access Control

kyd@dio (~) % id
uid=3259(kyd) gid=717(faculty) groups=717(faculty),1686(mess),1847(S07C8271),1910(F07C5471),2038(S08C8271)

kyd@dio (~) % ls -l News_and_Recent_Events.zip
-rw-rw-rw- 1 kyd faculty 714904 Feb 22 10:00 News_and_Recent_Events.zip

kyd@dio (/web/classes02/Spring-2011/csci5471) % ls -al
drwxrwsr-x  4 kyd  S11C5471  512 Jan 19 10:23 ./
drwxr-xr-x 46 root daemon   1024 Feb 17 23:04 ../
drwxrwsr-x  3 kyd  S11C5471  512 Feb 16 00:36 Assignment/
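The permission strings in the listing above (e.g. `drwxrwsr-x`) pack the file type plus three rwx triples for owner/group/other; Python's stdlib can render and test these bits. The numeric mode `0o2775` below is an assumed value matching that string:

```python
# Decode/encode Unix mode strings with the stdlib `stat` module.
import stat

mode = stat.S_IFDIR | 0o2775                 # directory, setgid, rwxrwxr-x
assert stat.filemode(mode) == "drwxrwsr-x"   # setgid turns group 'x' into 's'

# A DAC-style check of one access class:
assert bool(mode & stat.S_IWGRP)      # group members may write
assert not bool(mode & stat.S_IWOTH)  # others may not
```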

Page 22:

Mandatory Access Control policies

Restrictions to allowed information flows are not decided at the user’s discretion (as with Unix chmod), but instead enforced by system policies.

Mandatory access control mechanisms are aimed in particular at preventing policy violations by untrusted application software, which typically has at least the same access privileges as the invoking user.

Page 23:

Data Pump/Data Diode

Like "air gap" security, but with a one-way communication link that allows users to transfer data from the low-confidentiality to the high-confidentiality environment, but not vice versa.

Examples:Workstations with highly confidential material are configured to have read-only access to low confidentiality file servers.

Page 24:

The covert channel problem

Reference monitors see only intentional communications channels, such as files, sockets, memory.

However, there are many more “covert channels”, which were neither designed nor intended to transfer information at all.

A malicious high-level program can use these to transmit high-level data to a low-level receiving process, who can then leak it to the outside world.

Examples of covert channels:
- Resource conflicts: if a high-level process has already created a file F, a low-level process will fail when trying to create a file of the same name → 1 bit of information.
- Timing channels: processes can use the system clock to monitor their own progress and infer the current load, into which other processes can modulate information.
- Resource state: high-level processes can leave shared resources (disk head position, cache memory content, etc.) in states that influence the service response times for the next process.
- Hidden information in downgraded documents: steganographic embedding techniques can be used to get confidential information past a human downgrader (least-significant bits in digital photos, variations of punctuation/spelling/whitespace in plaintext, etc.).
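The first example (resource conflicts) can be sketched directly: the sender leaks one bit per round through whether an agreed-upon filename already exists. The directory and filenames are made up:

```python
# Toy resource-conflict covert channel: a "high-level" sender creates
# (or not) a file per round; the "low-level" receiver infers the bit
# from whether its own exclusive create fails.
import os, tempfile

shared_dir = tempfile.mkdtemp()   # stands in for a shared resource

def send_bit(bit, round_no):
    if bit:  # the high-level process claims the name first
        open(os.path.join(shared_dir, f"ch{round_no}"), "x").close()

def recv_bit(round_no):
    try:
        open(os.path.join(shared_dir, f"ch{round_no}"), "x").close()
        return 0          # create succeeded: sender did not claim it
    except FileExistsError:
        return 1          # create failed: name already taken

message = [1, 0, 1, 1, 0]
for i, b in enumerate(message):
    send_bit(b, i)
received = [recv_bit(i) for i in range(len(message))]
assert received == message
```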

Page 25:

User Interface Failures

Page 26:

Humans

"Humans are incapable of securely storing high-quality cryptographic keys, and they have unacceptable speed and accuracy when performing cryptographic operations. (They are also large, expensive to maintain, difficult to manage, and they pollute the environment. It is astonishing that these devices continue to be manufactured and deployed. But they are sufficiently pervasive that we must design our protocols around their limitations.)"

−− C. Kaufman, R. Perlman, and M. Speciner. Network Security: PRIVATE Communication in a PUBLIC World.

2nd edition. Prentice Hall, page 237, 2002.

Page 27:

Humans are the weakest link
- Most security breaches are attributed to "human error"
- Social engineering attacks proliferate
- Frequent security policy compliance failures
- Automated systems are generally more predictable and accurate than humans

Page 28:

Why are humans in the loop at all?

Don’t know how or too expensive to automate

Human judgments or policy decisions needed

Need to authenticate humans

Page 29:

The human threat

Malicious humans who will attack the system

Humans who are unmotivated to perform security-critical tasks properly or comply with policies

Humans who don’t know when or how to perform security-critical tasks

Humans who are incapable of performing security-critical tasks

Page 30:

Need to better understand humans in the loop

Do they know they are supposed to be doing something?

Do they understand what they are supposed to do?

Do they know how to do it?
Are they motivated to do it?
Are they capable of doing it?
Will they actually do it?

Page 31:
Page 32:

SSL Warnings

Page 33:

False Alarm Effect

"Detection system" ≈ "System": if the risk is not immediate, warning the user will decrease her trust in the system.

Page 34:

Patco Construction vs. Ocean Bank

A hacker stole ~$600K from Patco through Zeus. The transfers alarmed the bank, but the alarms were ignored.

The bank's authentication scheme was found "commercially unreasonable":
- It would "substantially increase the risk of fraud by asking for security answers for every $1 transaction"
- The bank "neither monitored that transaction nor provided notice before [it] completed"
- Available protections went unused: out-of-band authentication, user-selected picture tokens, monitoring of risk-scoring reports

Page 35:

Password Authentication

Page 36:

Definitions

- Identification: a claim about identity. Who or what I am (global or local).
- Authentication: confirming that claims are true. I am who I say I am; I have a valid credential.
- Authorization: granting permission based on a valid claim. Now that I have been validated, I am allowed to access certain resources or take certain actions.
- Access control system: a system that authenticates users and gives them access to resources based on their authorizations. Includes or relies upon an authentication mechanism; may include the ability to grant coarse- or fine-grained authorizations, and to revoke or delegate authorizations. Also includes an interface for policy configuration and management.

Page 37:

Building blocks of authentication

Factors:
- Something you know (or recognize)
- Something you have
- Something you are

Two factors are better than one, especially two factors from different categories.

What are some examples of each of these factors?

What are some examples of two-factor authentication?

Page 38:

Authentication mechanisms
- Text-based passwords
- Graphical passwords
- Hardware tokens
- Public key crypto protocols
- Biometrics

Page 39:

Evaluation
- Accessibility
- Memorability
- Security
- Cost
- Environmental considerations

Page 40:

Typical password advice

Page 41:

Typical password advice
- Pick a hard-to-guess password
- Don't use it anywhere else
- Change it often
- Don't write it down

So what do you do when every web site you visit asks for a password?

Page 42:

Bank = b3aYZ
Amazon = aa66x!
Phonebill = p$2$ta1

Page 43:
Page 44:

Problems with Passwords

Selection:
- Difficult to think of a good password
- Passwords people think of first are easy to guess

Memorability:
- Easy to forget passwords that aren't frequently used
- Difficult to remember "secure" passwords with a mix of upper & lower case letters, numbers, and special characters

Reuse:
- Too many passwords to remember
- A previously used password is memorable

Sharing:
- Often unintentional through reuse
- Systems aren't designed to support the way people work together and share information

Page 45:

Mnemonic Passwords

Phrase: "Four score and seven years ago, our Fathers"

- First letter of each word (with punctuation): Fsasya,oF
- Substitute numbers for words or similar-looking letters: 4sa7ya,oF
- Substitute symbols for words or similar-looking letters: 4s&7ya,oF

Source: Cynthia Kuo, SOUPS 2006
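The substitution steps above can be sketched as a tiny helper over the slide's phrase:

```python
# Build a mnemonic password: first letter of each word (keeping
# punctuation), with optional word-level substitutions.
def mnemonic(phrase: str, subs: dict) -> str:
    out = []
    for w in phrase.split():
        base = w.rstrip(",")
        out.append(subs.get(base.lower(), base[0]))
        if w.endswith(","):
            out.append(",")
    return "".join(out)

phrase = "Four score and seven years ago, our Fathers"
assert mnemonic(phrase, {}) == "Fsasya,oF"                    # first letters
assert mnemonic(phrase, {"four": "4", "seven": "7"}) == "4sa7ya,oF"
assert mnemonic(phrase, {"four": "4", "seven": "7", "and": "&"}) == "4s&7ya,oF"
```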

Page 46:

The Promise?
- Phrases help users incorporate different character classes in passwords
- Easier to think of character-for-word substitutions
- Virtually infinite number of phrases
- Dictionaries do not contain mnemonics

Source: Cynthia Kuo, SOUPS 2006

Page 47:

Mnemonic password evaluation

Mnemonic passwords are not a panacea for password creation

No comprehensive dictionary today. May become more vulnerable in the future:
- Many people start to use them
- Attackers incentivized to build dictionaries

Publicly available phrases should be avoided!

Source: Cynthia Kuo, SOUPS 2006

Page 48:

Password keeper software
- Runs on a PC or handheld
- Only one password to remember

Page 49:

Single sign-on

Log in once to get access to all your passwords.

Page 50:

Biometrics

Page 51:

Fingerprint Spoofing Devices

Tested devices: Microsoft Fingerprint Reader, APC Biometric Security device.

Success! Method:
- Flatten a very soft piece of wax against a hard surface
- Press the finger to be molded for 5 minutes
- Transfer the wax to a freezer for 10-15 minutes
- Firmly press modeling material into the cast
- Press against the fingerprint reader

Replicated several times.

Page 52:

Retina/Iris Scan

Retinal scan:
- Must be close to the camera (IR)
- Scanning can be invasive
- Not user friendly
- Expensive

Iris scan:
- Late to the game
- Requires advanced technology to properly capture the iris
- Users do not have to consent to have their identity tested

Page 53:

Graphical passwords

Page 54:

“Forgotten password” mechanism

Email password or magic URL to address on file

Challenge questions

Why not make this the normal way to access infrequently used sites?

Page 55:

Convenient SecureID 1

What problems does this approach solve?

What problems does it create?

Source:

http://worsethanfailure.com/Articles/Security_by_Oblivity.aspx

Page 56:

Convenient SecureID 2

What problems does this approach solve?

What problems does it create?

Previously available at:

http://fob.webhop.net/

Page 57:

Browser-based mutual authentication

Chris Drake's "Magic Bullet" proposal: http://lists.w3.org/Archives/Public/public-usable-authentication/2007Mar/0004.html
- User gets an ID, password (or alternative), image, and hotspot at enrollment
- Before the user is allowed to log in, they are asked to confirm the URL and SSL cert and click buttons
- Then the login box appears and the user enters username and password (or alternative)
- Server displays a set of images, including the user's image (or, if the user entered an incorrect password, a random set of images)
- User finds their image and clicks on the hotspot
- Image manipulation can help prevent replay attacks

What problems does this solve? What problems doesn't it solve? What kind of testing is needed?

Page 58:

Phishing

Page 59:

Spear Phishing (Targeted Phishing)

Personalized mail for a (small) group of targeted users: employees, Facebook friends, alumni, eCommerce customers. These groups can be obtained through identity theft!

The content of the email is personalized, different from Viagra phishing/spam.

Combined with other attacks:
- Zero-day vulnerability: unpatched
- Rootkit: below the OS kernel, impossible to detect with AV software
- Key logger: further obtains IDs/passwords
- APT (Advanced Persistent Threat): long-term surveillance

Page 60:

Examples of Spear Phishing


Page 61:

Good Phishing example


Page 62:

Policy and Usability

Page 63:
Page 64:

Cost of Reading Policy (Cranor et al.)

T_R = p × R × n
- p: the population of all Internet users
- R: the average time to read one policy
- n: the average number of unique sites Internet users visit annually

p = 221 million Americans online (Nielsen, May 2008)

R = average time to read a policy = (# words in policy) / (reading rate)
- To estimate words per policy: measured the policy length of the 75 most visited websites (reflects the policies people are most likely to visit)
- Reading rate = 250 WPM
- Mid estimate: 2,514 words / 250 WPM ≈ 10 minutes

Page 65:

n = number of unique sites per year. Nielsen estimates Americans visit 185 unique sites in a month, but that doesn't quite scale ×12, so n = 1,462 unique sites per year.

T_R = p × R × n = 221 million × 10 minutes × 1,462 sites

R × n ≈ 244 hours per year per person
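The arithmetic above, reproduced in a few lines (using the rounded 10-minute mid estimate gives the slide's 244-hour figure):

```python
# Cranor et al.'s back-of-the-envelope estimate T_R = p * R * n.
p = 221_000_000      # US Internet users online (Nielsen, May 2008)
R = 10               # minutes per policy: 2,514 words / 250 WPM, rounded
n = 1462             # unique sites visited per person per year

per_person_hours = R * n / 60
assert round(per_person_hours) == 244        # "244 hours per year per person"

T_R_hours = p * per_person_hours             # national total, in hours
```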

Page 66:

P3P: Platform for Privacy Preferences

A framework for automated privacy discussions:
- Web sites disclose their privacy practices in standard machine-readable formats
- Web browsers automatically retrieve P3P privacy policies and compare them to users' privacy preferences
- Sites and browsers can then negotiate about privacy terms

Page 67:
Page 68: