Privacy Preserving Content Protection
PhD Defense, Mina Deng
Promoter: Prof. Bart Preneel
COSIC, ESAT/SCD, KU Leuven
July 2010


DESCRIPTION

Mina Deng PhD defense presentation

TRANSCRIPT

Page 1: Mina Deng PhD defense

Privacy Preserving Content Protection

PhD Defense

Mina Deng

Promoter: Prof. Bart Preneel

COSIC, ESAT/SCD, KU Leuven

July, 2010

Page 2

Introduction

Page 3

The age of privacy is over?

Page 4

Privacy definitions

Individual rights

• “the right to be let alone” (Warren and Brandeis, 1890)

Informational self-determination

• “Privacy is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.” (Alan Westin, 1967)

Access and control

• “control access to oneself and to personal information about oneself ” (Adam Moore, 1998)

Pluralistic resemblance

• “Privacy is a plurality of different things.” “It is a set of protections against a related cluster

of problems” (Daniel Solove, 2008)

Privacy & data minimization

• “Data controllers should collect only the personal data they really need, and should keep it

only for as long as they need it”. (European Data Protection Directive 95/46/EC, 1995)

Page 5

Debate: privacy vs. security

– tradeoff: security & privacy

• get more of one at the expense of the other

• after 9/11, give up civil liberties & privacy for national security

• popular response: “I have nothing to hide”

• “The nothing to hide argument is an argument that the privacy interest is generally minimal to trivial, thus making the balance against security concerns a foreordained victory for security.” – Daniel Solove (privacy scholar)

+ need to have both

• “These two components of security – safety and privacy … I work from the assumption that you need to have both.” – Donald Kerr (US deputy director of national intelligence)

• “Security and privacy are not opposite ends of a seesaw. There is no security without privacy. And liberty requires both security and privacy.” – Bruce Schneier (security commentator)

Page 6

Content protection motivation

Page 7

Industry interests

Page 8

Content protection

Core techniques

Encryption: first line of defense

• + prevent unauthorized access

• – no content protection after decryption

[Diagram: symmetric encryption – Marge and Homer share one secret key for encrypting and decrypting plaintext; asymmetric encryption – Lisa & Bart encrypt with Homer’s public key, Homer decrypts with his private key]

Page 9

Content protection

Core techniques

Digital watermarking: second line of defense

• embed information imperceptibly

• e.g. to prove ownership

[Diagram: a watermark is embedded into the original content under a secret watermarking key; after distribution / processing / attack, the watermark is detected or extracted with the same secret key]

Page 10

Digital watermarking illustration

[Figure: a 64×64 watermark image embedded into a 512×512 original image; the extracted watermark matches with correlation = 0.9997]
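The kind of detection behind that correlation figure can be illustrated with a toy spread-spectrum-style check (an assumed, simplified model, not the exact scheme used in the thesis): a pseudo-random ±1 pattern is added to a host signal, and the detector measures normalized correlation.

```python
# Toy correlation-based watermark detection: a pseudo-random +/-1 pattern
# is added to a host signal; the detector computes normalized correlation.
import random

random.seed(7)
N = 4096
signal = [random.gauss(0, 10) for _ in range(N)]        # stand-in for image pixels
watermark = [random.choice((-1, 1)) for _ in range(N)]  # secret key -> pattern
alpha = 2.0                                             # embedding strength

marked = [s + alpha * w for s, w in zip(signal, watermark)]

def correlation(x, w):
    dot = sum(a * b for a, b in zip(x, w))
    norm = (sum(a * a for a in x) * sum(b * b for b in w)) ** 0.5
    return dot / norm

# The watermarked copy correlates strongly with the pattern;
# the unmarked original does not.
print(correlation(marked, watermark), correlation(signal, watermark))
```

The detector only needs the secret pattern, not the original image, which is why the key must stay secret.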

Page 11

Privacy issue in content protection

[Diagram: content lifecycle – creation, distribution, use control, usage monitoring, payment]

User’s privacy nightmare

Page 12

Research motivation

Conflict

• content protection interests of provider

• privacy rights of user

Can we reconcile privacy with protection of content?

Page 13

Outline

Introduction

Overview of contributions

Privacy threat analysis framework

Anonymous buyer-seller watermarking protocols

Conclusion

Page 14

Overall structure

Privacy preserving content protection: research questions → proposed solutions

• privacy analysis methodology: privacy threat framework (Ch 2)

• privacy preserving content protection systems:

• – content protection for commercial content: BSW protocols (Ch 3)

• – privacy protection for personal content: privacy-friendly eHealth (Ch 4) and personal rights management (Ch 5)

Page 15

Chapter 2. Privacy threat analysis framework

Page 16

Contribution (J.RE 2010)

Background

Threat modeling

• threats

• requirements

• countermeasures

Two pillars

• methodology

• knowledge: checklists & patterns

Security: methodological support

• goal-oriented: KAOS

• scenario-based: STRIDE

Problem & solution

Problem

• privacy threat analysis lacks a systematic approach

Our solution

• privacy threat analysis framework

• model threats to system elements

• instantiate threats using threat tree patterns

• elicit requirements from misuse cases

• select countermeasures according to requirements

Page 17

Chapter 3. Anonymous Buyer-Seller Watermarking Protocols

Page 18

Contribution (J.TIFS 2010, MMSEC 2009)

Background

Massive online distribution

• + efficiency and convenience

• – threats to intellectual property rights

Traditional assumption

• providers trustworthy: no illegal distribution, honest embedding

• not realistic!

Traceability discredited

• seller frames innocent buyer

• guilty buyer repudiates

Problem & solution

Problem

• copyright protection (provider)

• privacy protection (user)

Our solution

• limited trust in seller

• traceability: unique code embedded

• copyright protection & piracy tracing

• buyer’s revocable anonymity

• formal security analysis

• security of the protocol reduces to the security of the underlying watermarking scheme

Page 19

Chapter 4. Privacy-friendly architecture to manage distributed e-Health information

Page 20

Contribution (J.OIR 2009, E-Health Handbook 2009)

Background

E-Health system

• privacy sensitive content

• overview of patient’s medical history

Privacy threats

• cross-referencing of content & ID across providers

• intensive use of patient’s ID

• different sensitivity levels

Problem & solution

Problem

• content sharing – interoperability (healthcare provider)

• privacy protection (patient)

Our solution

• architecture for distributed e-health

• limited trust in healthcare service providers

• mediating service

• data anonymization

• practical validation

Page 21

Chapter 5. Personal rights management for individual privacy enforcement

Page 22

Contribution (PET’06, CMS’05)

Background

Personal content distribution

• (phone) cameras, blogs, social networks, search engines

• private pictures taken & published

• technology trends worsen the situation

Emerging privacy threats

• governments and industry

• normal individuals

Problem & solution

Problem

• privacy protection (an individual)

• personal content distribution (other individuals)

Our solution

• detection mechanism to control pictures taken by others

• no restriction & no privacy infringement for photographers

• distribution channel

• non-professional adversary

Page 23

Outline

Introduction

Overview of contributions

Privacy threat analysis framework

Anonymous buyer-seller watermarking protocols

Conclusion

Page 24

Privacy analysis framework

SYSTEM SPECIFIC

• high-level description

• assumptions & usage scenarios

METHODOLOGY

1. define Data Flow Diagram (DFD)

2. map privacy threats to DFD elements

3. identify misuse case scenarios

4. risk-based prioritization, using risk assessment techniques (not included)

5. elicit privacy requirements

6. select privacy enhancing solutions

KNOWLEDGE

• privacy threat tree patterns

• mapping threat components to DFD

• mapping privacy misuse cases to requirements

• mapping privacy objectives to solutions

Page 25

Privacy threat analysis – illustration

Privacy properties vs. privacy threats:

• Unlinkability – Linkability

• Anonymity & pseudonymity – Identifiability

• Plausible deniability – Non-repudiation

• Undetectability & unobservability – Detectability

• Confidentiality – Disclosure of information

• Content awareness – Content unawareness

• Policy and consent compliance – Policy and consent noncompliance

Mapping privacy threats to DFD elements (X = applicable):

Privacy threats | Entity | Data flow | Data store | Process
Linkability | X | X | X | X
Identifiability | X | X | X | X
Non-repudiation | | X | X | X
Detectability | | X | X | X
Information disclosure | | X | X | X
Content unawareness | X | | |
Consent/policy noncompliance | | X | X | X

[Diagrams: Data Flow Diagram; Threat Tree Pattern]
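The threat-to-element mapping above can be captured as a small lookup structure, for instance (a sketch; the names are illustrative, not from the thesis):

```python
# The threat-to-DFD-element mapping table as a lookup structure.
APPLIES_TO = {
    "linkability":            {"entity", "data flow", "data store", "process"},
    "identifiability":        {"entity", "data flow", "data store", "process"},
    "non-repudiation":        {"data flow", "data store", "process"},
    "detectability":          {"data flow", "data store", "process"},
    "information disclosure": {"data flow", "data store", "process"},
    "content unawareness":    {"entity"},
    "policy noncompliance":   {"data flow", "data store", "process"},
}

def threats_for(element):
    """List the privacy threat categories relevant to one DFD element type."""
    return sorted(t for t, elems in APPLIES_TO.items() if element in elems)

print(threats_for("entity"))  # threats that apply to external entities
```

A tool walking the DFD can then instantiate, per element, only the threat tree patterns the lookup returns.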

Page 26

Elicited privacy requirements &

mitigation strategies

n° | Threat scenario | Privacy requirement | Suggested mitigation strategy
1 | Linkability of social network data store | Unlinkability of data entries within the social network database | Protect the data store by applying data anonymization techniques, such as k-anonymity
2 | Linkability of data flow (user–portal) | Unlinkability of messages of user–portal communication | Employ anonymity system, e.g. Tor
3 | Linkability of entities (social network users) | Unlinkability of different pseudonyms (user IDs) of social network users | Technical enforcement: use anonymity system, such as Tor, for communication between user and social network web portal; user privacy self-awareness (revealing too much information online can be privacy invasive); channel and message confidentiality of the data flow: use anonymity system, such as Tor
4 | Identifiability at the social network data store | Anonymity of social network users, such that the user will not be identified from social network database entries | Protect the data store by applying data anonymization techniques, such as k-anonymity
5 | Identifiability at data flow of user data stream (user–portal) | Anonymity of social network users, such that the user will not be identified from user–portal communication | Technical enforcement: use anonymity system, such as Tor, for communication between user and social network web portal

Page 27

Outline

Introduction

Overview of contributions

Privacy threat analysis framework

Anonymous buyer-seller watermarking protocols

Conclusion

Page 28

Online transaction scenario

[Diagram: parties – Buyer, Seller, Group Manager, Judge]

Page 29

Anonymous buyer-seller watermarking

protocols

Building blocks

• homomorphic encryption: watermarking in encrypted domain

With ∘M and ∘C the operations on plaintexts and ciphertexts, the homomorphic property is:

E(m1 ∘M m2) = E(m1) ∘C E(m2), ∀ m1, m2 ∈ M

• group signature

• zero-knowledge proof

Properties

• traceability (seller’s security)

• non-repudiation (seller’s security)

• non-framing (buyer’s security)

• anonymity & unlinkability (buyer’s security)

Protocol phases: 1. registration → 2. watermark generation & embedding → 3. identification & arbitration
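This property can be seen concretely with textbook Paillier encryption, where multiplying ciphertexts adds the plaintexts. A minimal sketch with toy parameters (not secure; the actual protocols use a 1024-bit modulus):

```python
# Textbook Paillier: ciphertext multiplication == plaintext addition.
# Toy demo primes only -- real keys use ~1024-bit moduli.
import math
import random

p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1                                            # standard generator choice
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)    # lcm(p-1, q-1)
mu = pow(lam, -1, n)                                 # valid because g = n + 1

def encrypt(m):
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:                       # r must be a unit mod n
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(x) = (x - 1) / n, then multiply by mu mod n
    return ((pow(c, lam, n2) - 1) // n * mu) % n

c = (encrypt(42) * encrypt(58)) % n2   # operate on ciphertexts only...
assert decrypt(c) == 42 + 58           # ...the plaintexts were added
```

In the buyer-seller protocols this is what lets the seller combine and embed watermark components without ever decrypting the buyer’s contribution.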

Page 30

Registration phase

Buyer ↔ Group Manager (secure & authenticated channel)

gsk_i ← GSjoin(gpk, usk_i)
reg_i ← GSiss(gpk, isk, upk_i)

Group manager

• learns the buyer’s ID

Buyer joins the group

• obtains secret signature key gsk_i

Page 31

Watermark generation &

embedding phase

Buyer ↔ Seller (anonymous channel), with zero-knowledge proofs π1, π2

• fair encryption of the private key

• bit encryption of the watermark

(sk_B′, pk_B′) ← Kgen(1^k)
C ← JEnc(pk_J, sk_B′)
c_i ← BEnc(pk_B′, W_i), i = 1, …, l
σ ← GSsig(gpk, gsk_i, m), with m = (pk_B′, j, (c_i), C)

The seller embeds in the encrypted domain: WATemb(swk, X, BEnc(pk_B′, W))

Page 32

Watermark generation & embedding

Basic concept

•Seller & Buyer generate part of watermark

•Seller doesn’t know: buyer’s watermark & watermarked content delivered to the buyer

•Buyer doesn’t know: original content & seller’s watermark

Type I

•security (S & B)

•multiple transactions

X′ = X ⊕ V   (intermediate watermarked content: original content with index watermark)
E(Y) = E(X′ ⊕ σ(W)) = E(X′) ⊗ E(σ(W))   (final watermarked content; σ is a permutation, σ(W) the permuted buyer’s watermark)

Page 33

Watermark generation & embedding

Type II

•not limited to permutation tolerant watermarks

X′ = X ⊕ V   (intermediate watermarked content: original content with index watermark)
E(W) = E(W_S ⊕ W_B) = E(W_S + W_B) = E(W_S) × E(W_B)   (composite of seller’s and buyer’s watermarks via the additive homomorphism)
E(Y) = E(X′ ⊕ W) = E(X′) ⊗ E(W)   (final watermarked content)

Page 34

Watermark generation & embedding

Type III

•avoid double-watermark

W = W_S ∥ W_B   (composite watermark: seller’s and buyer’s parts concatenated)
E(W) = (E(W_1), …, E(W_l)), built bit by bit from the encrypted buyer bits (E(W_B,1), …, E(W_B,l))

E(W_i) = E(W_S,i ⊕ W_B,i) = E(W_B,i) if W_S,i = 0, and E(1) · E(W_B,i)^(−1) if W_S,i = 1

The index watermark and the intermediate composite watermark are then embedded into the original content to obtain the final watermarked content.
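The bit-wise step above, computing E(W_B,i ⊕ W_S,i) from E(W_B,i) when the seller knows W_S,i in the clear, can be sketched with the same toy Paillier setup (insecure demo parameters, illustrative function names):

```python
# XOR with a known bit in the Paillier-encrypted domain:
# E(b XOR 0) = E(b); E(b XOR 1) = E(1 - b) = E(1) * E(b)^(-1) mod n^2.
import math
import random

p, q = 293, 433                                      # toy primes, demo only
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)
mu = pow(lam, -1, n)

def encrypt(m):
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

def xor_known_bit(c_b, s):
    # Seller's side: s is the seller's plaintext bit, c_b the buyer's ciphertext.
    return c_b if s == 0 else (encrypt(1) * pow(c_b, -1, n2)) % n2

for b in (0, 1):
    for s in (0, 1):
        assert decrypt(xor_known_bit(encrypt(b), s)) == b ^ s
```

Because only ciphertexts are manipulated, the seller never learns the buyer’s watermark bits, which is what prevents framing.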

Page 35

Identification and arbitration phase

Seller

Judge

Group

Manager

W′ ← WATdet(swk, Y)   (seller detects the watermark in a found copy Y)
(B_i, τ) ← GSopen(gpk, osk, reg, m, s_m)   (group manager opens the group signature, revealing the buyer)

Secure & authenticated channel

Page 36

Implementation Type III BSW protocol

Parameters

• 512×512-pixel image, ≈ 2 Mbit

• watermark of 128 bits

• Paillier modulus N of 1024 bits

• run on a CPU at 2.4 GHz

Execution time

• registration: < 0.5 s

• identification & arbitration: < 2.5 s

• watermark generation & embedding (WGE): most computational load at the seller

Communication complexity (in exchanged bits)

• watermark generation & embedding: ≈ 8 Mbit

• identification & arbitration: ≈ 0.4 Mbit

• expansion factor: ≈ 4.2

Page 37

Outline

Introduction

Overview of contributions

Privacy threat analysis framework

Anonymous buyer-seller watermarking protocols

Conclusions

Page 38

Conclusions

Privacy threats emerge

• trust in providers

Need balance

• content protection (provider) & privacy protection (user)

Privacy

• like security, an embodied value

Build privacy in

• goal-oriented, framework

Content protection techniques

• also protect privacy

⇒ Yes, it is possible to reconcile privacy with protection of content

Page 39

List of publications

International Journals

Mina Deng, Kim Wuyts, Riccardo Scandariato, Bart Preneel, and Wouter Joosen. A privacy threat analysis framework: supporting the

elicitation and fulfillment of privacy requirements. Requirement Engineering Journal special issue on Data Privacy, to appear, 27 pages,

2010.

Alfredo Rial, Mina Deng, Tiziano Bianchi, Alessandro Piva, and Bart Preneel. Anonymous buyer-seller watermarking protocols: formal

definitions and security analysis. IEEE Transactions on Information Forensics and Security, to appear, 11 pages, 2010.

Mina Deng, Danny De Cock, and Bart Preneel. Towards a cross-context identity management framework in e-health. Online Information

Review, international journal 33(3):422-442, 2009.

Mina Deng and Bart Preneel. Attacks on two buyer-seller watermarking protocols and an improvement for revocable anonymity. International

Journal of Intelligent Information Technology Application, 1(2):53-64, 2008.

Book Chapters

Mina Deng, Danny De Cock, and Bart Preneel. An interoperable cross-context architecture to manage distributed personal e-health

information. In M. M. Cunha, R. Simoes, and A. Tavares, editors, Handbook of Research on Developments in e- Health and

Telemedicine: Technological and Social Perspectives, ISBN: 978-1-61520-670-4, chapter 27, pages 576-602. Hershey, PA, USA: IGI

Global, Inc., 2009.

Mina Deng and Bart Preneel. On secure buyer-seller watermarking protocols with revocable anonymity. In Kyeong Kang, editor, E-Commerce,

ISBN: 978-953-7619-98-5, chapter 11, pages 184-202. IN-TECH Education and Publishing, Vienna, Austria, 2009.

International conferences (Selected)

Mina Deng, Tiziano Bianchi, Alessandro Piva, and Bart Preneel. An efficient buyer-seller watermarking protocol based on composite signal

representation. In Proceedings of the 11th ACM workshop on Multimedia and security (MMSEC), pages 9-18, Princeton, New Jersey,

USA. ACM New York, NY, USA, 2009.

Mina Deng and Bart Preneel. On secure and anonymous buyer-seller watermarking protocol. In Abdelhamid Mellouk, Jun Bi, Guadalupe Ortiz,

Dickson K. W. Chiu, and Manuela Popescu, editors, Third International Conference on Internet and Web Applications and Services

(ICIW), pages 524-529, Athens, Greece. IEEE Computer Society, 2008.

Mina Deng, Lothar Fritsch, and Klaus Kursawe. Personal rights management – taming camera-phones for individual privacy enforcement. In

George Danezis and Philippe Golle, editors, Privacy Enhancing Technologies, 6th International Workshop (PET), Revised Selected Papers,

volume 4258 of Lecture Notes in Computer Science, pages 172-189, Cambridge, UK. Springer, 2006.

Page 40

Questions?

Thank you! ☺

[email protected]