
© 2004 Simson Garfinkel

CSCI E-170: Computer Security, Usability & Privacy

Simson L. Garfinkel

[email protected]

http://www.simson.net/csci_e-170/

Erik Nordlander

[email protected]


Course Fundamentals

• Online:
  – [email protected]
  – http://www.simson.net/csci_e-170

• Textbooks:
  – Lots of stuff on the web
  – Database Nation (Garfinkel, 2000)
  – Web Security, Privacy & Commerce (Garfinkel, 2002)

• Class meetings:
  – Emerson 108, Tuesdays 5:30-7:30



Check the website!

• Announcements
• Assignments
• Notes
• Materials

• Mirror:
  – http://v.lcs.mit.edu/csci_e-170


Today’s Class

• Hour #1: Computer Security
  – What is it?
  – What is a security policy?
  – What does it include / not include?
  – Perimeter definition & risk assessment
  – Best practices
  – Saltzer’s Design Principles

• Hour #2: Understanding Privacy
  – Data disclosure
  – Fair information practices


What is Computer Security?

• COMPUTER SECURITY:
  – “A computer is secure if you can depend on it and its software to behave as you expect.” (Garfinkel & Spafford, 1991)


Brief History of Computer Security

• 1930s - Turing

• 1940s - Cracking codes

• 1950s - Batch computing
  – Deck of cards had account, no password

• 1960s - Interactive Computing
  – usernames & passwords

• 1971 - First reports of “hacking”


RFC 602 (1973)

Arpa Network Working Group                    Bob Metcalfe (PARC-MAXC)
Request for Comments: 602                     Dec 1973
NIC #21021

"The Stockings Were Hung by the Chimney with Care”

The ARPA Computer Network is susceptible to security violations for at least the three following reasons:

(1) Individual sites, used to physical limitations on machine access, have not yet taken sufficient precautions toward securing their systems against unauthorized remote use. For example, many people still use passwords which are easy to guess: their first names, their initials, their host name spelled backwards, a string of characters which are easy to type in sequence (e.g. ZXCVBNM).


RFC 602 (1973)

• (2) The TIP allows access to the ARPANET to a much wider audience than is thought or intended. TIP phone numbers are posted, like those scribbled hastily on the walls of phone booths and men's rooms. The TIP requires no user identification before giving service. Thus, many people, including those who used to spend their time ripping off Ma Bell, get access to our stockings in a most anonymous way.


RFC 602 (1973)

• (3) There is lingering affection for the challenge of breaking someone's system. This affection lingers despite the fact that everyone knows that it's easy to break systems, even easier to crash them.


RFC 602 (1973)

• All of this would be quite humorous and cause for raucous eye winking and elbow nudging, if it weren't for the fact that in recent weeks at least two major serving hosts were crashed under suspicious circumstances by people who knew what they were risking; on yet a third system, the system wheel password was compromised -- by two high school students in Los Angeles no less.

We suspect that the number of dangerous security violations is larger than any of us know and is growing. You are advised not to sit "in hope that Saint Nicholas would soon be there".
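As an aside, the guessable patterns the RFC complains about are easy to screen for mechanically. Here is a toy Python sketch of such a screen; the function and its parameters are our own illustration, not anything defined by the RFC:

```python
# A toy screen for the guessable patterns RFC 602 names; the function and
# parameters are our illustration, not anything from the RFC itself.

KEYBOARD_RUNS = ("qwertyuiop", "asdfghjkl", "zxcvbnm")

def obviously_guessable(password: str, first_name: str,
                        initials: str, host: str) -> bool:
    p = password.lower()
    return (
        p == first_name.lower()                      # their first name
        or p == initials.lower()                     # their initials
        or p == host.lower()[::-1]                   # host name backwards
        or any(p in run for run in KEYBOARD_RUNS)    # easy-to-type sequences
    )

print(obviously_guessable("ZXCVBNM", "Bob", "BM", "maxc"))           # True
print(obviously_guessable("cxam", "Bob", "BM", "maxc"))              # True (host reversed)
print(obviously_guessable("rfc602-stockings", "Bob", "BM", "maxc"))  # False
```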



Brief History of Computer Security Cont…

• 1980s - Emergence of the hacker underground
  – 1983 - WarGames
    • “Is it a game, or is it real?”
    • “War Dialing”
  – 1986 - The Cuckoo’s Egg
    • Cliff Stoll and the German Hackers

• January 15, 1990 - AT&T Network Crash
  – Operation Sun Devil
  – http://www.mit.edu/hacker/hacker.html



Goals of Computer Security

• Availability
  – Make sure you can use your system


Goals of Computer Security (2)

• Confidentiality
  – Keep your secrets secret!


Goals of Computer Security (3)

• Data Integrity
  – Prevent others from modifying your data
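One common technical reading of this goal is detection rather than prevention: record a cryptographic digest of known-good data and compare it later. A minimal sketch using Python's standard hashlib (the file path is illustrative only):

```python
import hashlib

# Minimal sketch of integrity *detection*: record a SHA-256 digest of the
# data while it is known-good, then compare later.

def digest(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

baseline = digest("/etc/hosts")   # taken when the file is known-good

# ... time passes, operators and attackers come and go ...

if digest("/etc/hosts") == baseline:
    print("file unchanged")
else:
    print("integrity violation: file was modified")
```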


Goals of Computer Security (4)

• Control
  – Regulate the use of your system


Goals of Computer Security (5)

• Audit
  – What happened?
  – How do we undo it?


Goals of Computer Security:

• Availability

• Confidentiality

• Data Integrity

• Control

• Audit

who/what are we protecting?

who/what are we protecting against?

how are we going to do it?


Different environments have different priorities

• Banking environment:
  – integrity, control and audit are more critical than confidentiality and availability

• Intelligence service:
  – confidentiality may come first, availability last

• Military on the battlefield:
  – availability may come first, audit may come last

• University:
  – integrity and availability may come first


Vulnerabilities, Threats, Attacks

• Most security texts focus on bad-guy attackers, worms, viruses, etc.

• Most continuity problems arise from:
  – Operator error
  – Software error
  – Environmental problems

• The best security measures protect against both inadvertent and malicious threats.


Class Participation:

• Threats to:
  – Availability?
  – Confidentiality?
  – Data Integrity?
  – Control?
  – Audit?


Security Policy

• Defines a security perimeter

Because you can’t secure everything


Security Policy

• Defines a security perimeter

• Standards codify what should be done

• Guidelines explain how it will be done


How do you create a policy?

• Option #1: Risk Assessment
  – Identify assets and their value
  – Identify the threats
  – Calculate the risks (see the sketch below)
  – Conduct a cost-benefit analysis
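A standard way to "calculate the risks" in option #1 is annualized loss expectancy (ALE): the loss from one incident times the expected number of incidents per year. A toy Python sketch with invented numbers:

```python
# Toy annualized-loss-expectancy (ALE) calculation; every figure here is
# invented for illustration.

assets = {
    # asset: (loss_per_incident_$, expected_incidents_per_year)
    "customer database": (500_000, 0.05),   # one breach in ~20 years
    "web server uptime": (20_000, 2.0),     # about two outages a year
}

for name, (loss, rate) in assets.items():
    ale = loss * rate   # expected annual loss from this threat
    print(f"{name}: ALE = ${ale:,.0f}/year")

# Cost-benefit step: a $30,000/year control that halves the database breach
# rate saves only ~$12,500/year of ALE -- not worth it by these (made-up)
# numbers.
```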


How do you create a policy?

• Option #2: Adopt “Best Practices.”


Techniques For Drafting Policies

• Assign a specific “owner” to everything that is to be protected.

• Be positive

• Be realistic in your expectations

• Concentrate on education and prevention


Threats to Consider:

• Human error

• “Hackers”
  – technical gurus, script kiddies, criminals looking for gain

• Disgruntled employees

• Organized crime
  – increasingly a threat! Breaking into hospitals, e-commerce sites, etc.

• Foreign espionage (it happens!)

• Cyber terrorists (it hasn’t happened yet)

• Information warfare attacks (depends on how you count)

• Microsoft / RIAA / MPAA

• Mom


Remember, Risk Cannot Be Eliminated

• You can purchase a UPS…
  – But the power failure may outlast the batteries
  – But the UPS may fail
  – But the cleaning crew may unplug it
  – But the UPS may crash due to a software error


Spaf’s first principle of security administration:

• “If you have responsibility for security, but have no authority to set rules or punish violators, your own role in the organization is to take the blame when something big goes wrong.”


Technical Design Principles

• “The Protection of Information in Computer Systems,” (Saltzer & Schroeder, 1975)

• Designed for securing operating systems, but generally applicable.


Saltzer & Schroeder’s Principles:

– Least Privilege
– Economy of Mechanism
– Complete Mediation
– Open Design
– Separation of Privilege
– Least Common Mechanism
– Psychological Acceptability


Least Privilege

• “Every user and process should have the minimum amount of access rights necessary. Least privilege limits the damage that can be done by malicious attackers and errors alike. Access rights should be explicitly required, rather than given to users by default.”
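As a concrete illustration of the principle (ours, not Saltzer & Schroeder's), a Unix daemon can do its one genuinely privileged operation first and then permanently give up root, so the rest of its life runs with minimal rights. A minimal sketch, assuming a Unix system and Python:

```python
import os
import socket

def drop_privileges(uid: int = 65534, gid: int = 65534) -> None:
    """Permanently switch to an unprivileged user (65534 is 'nobody' on many systems)."""
    os.setgid(gid)  # drop the group first, while we still have root
    os.setuid(uid)  # after this call, root privileges cannot be regained

if __name__ == "__main__" and os.getuid() == 0:
    # Do the one operation that genuinely needs root...
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("", 80))   # binding a port below 1024 requires root
    srv.listen(5)
    # ...then give the privilege up before touching any untrusted input.
    drop_privileges()
    print("now serving as uid", os.getuid())
```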


Economy of mechanism

• “The design of the system should be small and simple so that it can be verified and correctly implemented.”


Complete Mediation

• “Every access should be checked for proper authorization.”
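A toy sketch of what mediation looks like in code: every read passes through a single authorization check, and nothing is cached, so revoking access takes effect on the very next request. The class and all names are hypothetical:

```python
# Hypothetical sketch of complete mediation: a guarded store where *every*
# read goes through one authorization check -- no cached "already checked" flag.

class AccessDenied(Exception):
    pass

class GuardedStore:
    def __init__(self):
        self._docs = {}   # doc_id -> contents
        self._acl = {}    # doc_id -> set of authorized users

    def put(self, user, doc_id, contents):
        self._docs[doc_id] = contents
        self._acl.setdefault(doc_id, set()).add(user)

    def get(self, user, doc_id):
        # The check runs on every single access, so revocation is immediate.
        if user not in self._acl.get(doc_id, set()):
            raise AccessDenied(f"{user} may not read {doc_id}")
        return self._docs[doc_id]

    def revoke(self, user, doc_id):
        self._acl.get(doc_id, set()).discard(user)

store = GuardedStore()
store.put("alice", "grades.txt", "A-")
print(store.get("alice", "grades.txt"))   # allowed
store.revoke("alice", "grades.txt")
# store.get("alice", "grades.txt")        # would now raise AccessDenied
```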


Open Design

• “Security should not depend upon the ignorance of the attacker. This criterion precludes back doors in systems, which give access to users who know about them.”
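HMAC is a textbook instance of open design: the algorithm is completely public, and security rests only on the secret key. A minimal sketch with Python's standard hmac module:

```python
import hashlib
import hmac
import os

# Open design in miniature: the algorithm (HMAC-SHA256) is public; only the
# key is secret. An attacker who knows the whole design gains nothing.

key = os.urandom(32)                      # the only secret
msg = b"pay $10 to alice"
tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

# Verification uses the same public algorithm plus the secret key.
ok = hmac.compare_digest(tag, hmac.new(key, msg, hashlib.sha256).hexdigest())
print(ok)  # True
```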


Separation of privilege

• “Where possible, access to system resources should depend on more than one condition being satisfied.”
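The classic example is the two-person rule: an operation proceeds only when two independent approvals are both present. A toy sketch (the principal names are made up):

```python
# Toy sketch of separation of privilege: the operation requires two
# independent conditions -- here, approvals from two different officers
# (the classic "two-person rule"). The names are illustrative only.

def authorize_launch(approvals: set[str]) -> bool:
    required = {"officer_a", "officer_b"}   # hypothetical principals
    # Access is granted only when BOTH independent conditions hold.
    return required <= approvals

print(authorize_launch({"officer_a"}))               # False: one key is not enough
print(authorize_launch({"officer_a", "officer_b"}))  # True
```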


Least Common Mechanism

• “Users should be isolated from one another by the system. This limits both covert monitoring and cooperative efforts to override system security mechanisms.”
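A toy sketch of the idea: a mechanism shared by all users (here, a class-level buffer) is a potential channel between them, while per-user instances share nothing:

```python
# Sketch of least common mechanism: instead of one scratch buffer shared by
# every user (a channel users could leak through), give each user their own.

class SharedScratch:            # what the principle warns against
    buffer = []                 # class-level: common to every user

class IsolatedScratch:          # each user gets a private mechanism
    def __init__(self):
        self.buffer = []        # instance-level: no cross-user channel

a, b = IsolatedScratch(), IsolatedScratch()
a.buffer.append("alice's secret")
print(b.buffer)                 # [] -- nothing shared between users
```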


Psychological acceptability

• “The security controls must be easy to use so that they will be used and not bypassed.”

Privacy-Protecting Policies

Simson L. Garfinkel


The Big Idea:

• Many technical problems can be solved through the use of policy

• Technologists tend to overlook policy solutions because they:
  – Aren’t 100% effective
  – Don’t work across legislative boundaries
  – Are open to [possibly intentional] misinterpretation

• Example: the CAN-SPAM Act


On the other hand…

• Policy solutions can be more flexible than technical solutions
  – Policy can be “technology-neutral”
  – Policy doesn’t need to be upgraded
  – Policy doesn’t crash when there are typos
  – Policy can enable lawsuits that attack the human root of problems

The “Bad People” problem

• The world is filled with bad people.

• You can’t put them all in jail.

Evidence of “bad people”

• Decreasing inventory at stores
  – Shoplifting?
  – Employee theft?

• Merchandise purchased with “lost” credit cards
  – Perhaps the card was stolen
  – Perhaps the card wasn’t stolen

More Evidence...

• Money borrowed and not repaid

• Faked insurance claims

• Forged checks

Solution to the “bad person” problem

• Make a list of the bad people.

• Don’t do business with anybody on the list.

Examples of Solution...

• Retail Credit
  – List of people “known” not to repay their debts

• Medical Information Bureau (est. 1902)
  – List of people with “known” medical problems

• Chicago-area merchants (1950s)
  – List of “known” shoplifters

Typical Credit Report

• “Retired Army Lieutenant Colonel”
  – “A rather wild-tempered, unreasonable, and uncouth person…
  – “who abused his rank and wasn’t considered a well-adjusted person.
  – “He was known to roam the reservation at Ft. Hood and shoot cattle belonging to ranchers who had leased the grazing land from the Army.”

• —Hearings on the Retail Credit Company, 1968

Credit reports of the 1960s

• Contained information that was hearsay or just plain wrong.

• Records confused between individuals.

• No “statute of limitations” on the information.

• People frequently prohibited from seeing their own records.

Fair Credit Reporting Act, 1970

• Right to see your credit report.
• Right to challenge incorrect information.
• Information automatically removed from report after 7 years
  – 10 years for bankruptcy.
• Right to know who accesses your report.
• Right to a free credit report if you are denied credit.

Code of Fair Information Practices (1973)

• #1: There must be no personal data record-keeping systems whose very existence is secret.

• #2: There must be a way for a person to find out what information about the person is in a record and how it is used.

CFIP #3

• There must be a way for a person to prevent information about the person that was obtained for one purpose from being used or made available for other purposes without the person's consent.

CFIP #4

• There must be a way for a person to correct or amend a record of identifiable information about the person.

CFIP #5

• Any organization creating, maintaining, using, or disseminating records of identifiable personal data must assure the reliability of the data for their intended use and must take precautions to prevent misuses of the data.

CFIPs in Short

• No secret databanks.

• You are allowed to see your own record.

• Information obtained for one purpose can’t be used for another without consent.

• Ways for correcting or amending information.

• Prevention of misuse.

CFIPs…

• Good ideas: they match what we believe.

• FCRA - 1970

• 1980 OECD Guidelines

• 1999 Canada “C6”

• FTC’s “Notice, Choice, Security and Access”

CFIPs, cont.

• Good ideas: they match what we believe.

• Never passed into law.

• Adopted in Europe.

1980 OECD Guidelines

• “Guidelines on the Protection of Privacy and Transborder Flows of Personal Data”

• Collection Limitation Principle
  – “obtained by lawful and fair means”
  – “with the knowledge or consent” where appropriate

• Data Quality Principle
  – Data should be relevant and kept up-to-date.

1980 OECD Guidelines, Cont.

• Purpose Specification Principle
  – Purpose specified before the data is collected.

• Use Limitation Principle
  – Not to be used for purposes other than originally intended except:
    • With the consent of the data subject
    • By the authority of law

1980 OECD Guidelines, Cont.

• Security Safeguards Principle
  – “Reasonable security safeguards” to prevent loss, unauthorized access, destruction, use, modification or disclosure of data.

• Openness Principle
  – Clearly stated practices and policies.
  – No secret databases.

1980 OECD Guidelines, Cont.

• Individual Participation Principle
  – Individuals have the right to see their own records.
  – Right to challenge and demand correction or erasure. (Note the Steve Ross story!)

• Accountability Principle
  – “A data controller should be accountable for complying with measures which give effect to the principles stated above.”

1995 CSA “Privacy Standard”

• 1. Accountability
• 2. Identifying Purposes
• 3. Consent
• 4. Limiting Collection
• 5. Limiting Use, Disclosure, and Retention
• 6. Accuracy
• 7. Safeguards
• 8. Openness
• 9. Individual Access
• 10. Challenging Compliance

1999: Canada “C6”

• Comprehensive privacy law applies to both public and private sector

• National businesses, banks, etc.

• Medical records, prescriptions and insurance records (January 1, 2002)

• Law extends to all commercial activity in Canada (January 1, 2004)

What really makes C6 work...

Approaches to Privacy Enforcement

• Governmental Standards
  – Enforcement by regulatory agencies, states, etc.

• Industry Standards
  – “Codes of conduct”
  – Limited enforcement through licensing
  – Limited enforcement from government

• Unregulated Market
  – Reputation, or caveat emptor

HIPAA - 1996*

• Key Provisions:
  – Largely about health insurance portability, not about privacy
  – Privacy mandates are largely about security:
    • Firewalls, anti-virus, etc.
    • Designate a privacy officer
    • Post privacy policy
    • Require outsourcing companies to protect information.
    • Access to health information; procedures for correcting errors.
  – Enforced by the States (unfunded mandate); HHS enforces in “extreme cases.”

• (*privacy rule passed 2002)

COPPA (1998)

• Key Provisions:
  – Applies to online collection of info on children under 13
  – Requires “verifiable parental consent”
    • Very hard in most cases; letter, fax or phone call
    • Some exceptions — one-time response to “homework help”
  – Privacy notice must be posted on website

• http://www.ftc.gov/opa/1999/9910/childfinal.htm

GLB (2000)

• Consumers must be informed of privacy policies
  – Initial notice
  – Annual notice
  – Notices were mostly ignored!

• Consumers must have a chance to “opt-out”
  – Many different ways to “opt-out”


Identity Theft

• Personal stories from class?


Class Discussion:

• What online collaboration tool should we use?


Written Assignment:

• Create your own definition of security in no more than one paragraph. Submit online before Friday.

• Write a 950-word essay describing an incident in which you were personally involved. Be sure to include relevant details. Sanitize it for publication. Post online before class.


Reading Assignment

• Database Nation:
  – Chapter 1 - Privacy Under Attack
  – Chapter 2 - Database Nation

• Web Security, Privacy & Commerce:
  – Chapter 1 - The Web Security Landscape
  – Chapter 2 - The Architecture of the WWW
  – Chapter 8 - The Web’s War on Your Privacy