Vote privacy: models and cryptographic underpinnings
Bogdan Warinschi, University of Bristol


Page 1:

Vote privacy: models and cryptographic underpinnings

Bogdan Warinschi University of Bristol

1

Page 2:

Aims and objectives

• Models are useful, desirable
• Cryptographic proofs are not difficult
• Have y’all do one cryptographic proof
• Have y’all develop a zero-knowledge protocol
• Have y’all prove one property for a zero-knowledge protocol

2

Page 3:

Models

3

Page 4:

Voting scheme

4

(Diagram: voters P1, …, Pn submit votes v1, v2, …, vn; the tally computes ρ(v1, v2, …, vn).)

• Votes: v1, v2, …, vn in V
• Result function: ρ : V* → Results
• Example: V = {0,1}, ρ(v1, v2, …, vn) = v1 + v2 + … + vn

Page 5:

Wish list

• Eligibility: only legitimate voters vote; each voter votes once
• Fairness: voting does not reveal early results
• Verifiability: individual, universal
• Privacy: no information about the individual votes is revealed
• Receipt-freeness: a voter cannot prove s/he voted in a certain way
• Coercion-resistance: a voter cannot interact with a coercer to prove that s/he voted in a certain way

5

Page 6:

Design-then-break paradigm

6

• …attack found
• …attack found
• …attack found
• …no attack found

Guarantees: no attack has been found yet

Page 7:

Security models

7

Mathematical descriptions of:
• What a system is
• How a system works
• What an attacker is
• What a break is

Advantages: clarifies the security notion; allows for security proofs (guarantees within clearly established boundaries)

Shortcomings: abstraction (implicit assumptions, missing details, e.g. trust in hardware, side-channels)

Page 8:

This talk

• Privacy-relevant cryptographic primitives
  • Asymmetric encryption
  • Non-interactive zero-knowledge proofs
• Privacy-relevant techniques
  • Homomorphicity
  • Rerandomization
  • Threshold cryptography
• Security models for encryption
• Security models for vote secrecy (Helios)

8

Page 9:

Game based models

10

(Diagram: the adversary exchanges queries and answers with a Challenger for the scheme π; the Challenger outputs 0/1.)

Security: π is secure if, for any adversary, the probability that the challenger outputs 1 is close to some fixed constant (typically 0, or ½)

Page 10:

ASYMMETRIC ENCRYPTION SCHEMES 11

Page 11:

Syntax

12

• Setup(ν): fixes parameters for the scheme

• KG(params): randomized algorithm that generates (PK,SK)

• ENCPK(m): randomized algorithm that generates an encryption of m under PK

• DECSK(C): deterministic algorithm that calculates the decryption of C under SK

Page 12:

Functional properties

• Correctness: for any PK,SK and M:

DECSK (ENCPK (M))=M

• Homomorphicity: for any PK, the function ENCPK(·) is homomorphic:

ENCPK(M1) ⊗ ENCPK(M2) = ENCPK(M1 + M2)

13

Page 13:

(exponent) ElGamal

14

• Setup(ν): produces a description of a group (G, ·) with generator g

• KG(G, g): x ← {1, …, |G|}; X ← g^x; output (X, x)

• ENCX(m): r ← {1, …, |G|}; (R, C) ← (g^r, g^m·X^r); output (R, C)

• DECx((R, C)): find t such that g^t = C/R^x; output t

Page 14:

Functional properties

• ENCX(m): (R, C) ← (g^r, g^m·X^r); output (R, C)

• DECx((R, C)): find t such that g^t = C/R^x; output t

• Correctness: the output t satisfies g^t = C/R^x = g^m·X^r/g^{xr} = g^m·X^r/X^r = g^m, so t = m

• Homomorphicity: (g^r, g^{v1}·X^r) ⊗ (g^s, g^{v2}·X^s) = (g^q, g^{v1+v2}·X^q), where q = r + s

15
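The scheme above can be run end to end in a few lines of Python. This is a toy sketch only: the group parameters p = 23, q = 11, g = 2 and the helper names `keygen`/`enc`/`dec` are illustrative choices of mine (real deployments use groups of cryptographic size), and decryption brute-forces the small discrete log exactly as the "find t" step prescribes.

```python
import random

# Toy group: g = 2 generates the subgroup of prime order q = 11 in Z_23^*.
# Illustrative parameters only; real schemes use groups of ~2048-bit size.
p, q, g = 23, 11, 2

def keygen():
    x = random.randrange(1, q)            # secret key
    return pow(g, x, p), x                # public key X = g^x

def enc(X, m):
    r = random.randrange(1, q)
    return pow(g, r, p), (pow(g, m, p) * pow(X, r, p)) % p   # (R, C) = (g^r, g^m * X^r)

def dec(x, R, C):
    gm = (C * pow(pow(R, x, p), -1, p)) % p                  # g^m = C / R^x
    return next(t for t in range(q) if pow(g, t, p) == gm)   # brute-force the small dlog

X, x = keygen()
R1, C1 = enc(X, 3)
R2, C2 = enc(X, 4)
# Homomorphic property: the componentwise product encrypts the sum 3 + 4.
Rs, Cs = (R1 * R2) % p, (C1 * C2) % p
print(dec(x, Rs, Cs))  # 7
```

Note that decryption is only feasible when the plaintext (here, a vote tally) is small enough to search, which is exactly the situation in homomorphic-tally voting.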

Page 15:

IND-CPA

16

Security for π = (Setup, Kg, Enc, Dec):

• par ← Setup(ν); (PK, SK) ← Kg(par)
• The challenger sends the public key PK to the adversary
• The adversary submits two messages M0, M1
• The challenger picks a bit b and returns C ← EncPK(Mb)
• The adversary outputs a guess d; it wins if d = b

π is IND-CPA secure if Pr[win] ~ 1/2

Good definition?

Theorem: If the DDH problem is hard in G then the ElGamal encryption scheme is IND-CPA secure.

Page 16:

SINGLE PASS VOTING SCHEME 17

Page 17:

Informal

18

(Diagram: each voter Pi with vote vi computes Ci ← ENCPK(vi) and posts it to the bulletin board BB; the tallier holding SK uses SK to obtain v1, …, vn, then computes and returns ρ(v1, v2, …, vn).)

Page 18:

Syntax of SPS schemes

• Setup(ν): generates (x, y, BB): the secret information for tallying, the public parameters of the scheme, and the initial BB
• Vote(y, v): the algorithm run by each voter to produce a ballot b
• Ballot(BB, b): run by the bulletin board; outputs a new BB and accept/reject
• Tallying(BB, x): run by the tallying authorities to calculate the final result

19

Page 19:

An implementation: Enc2Vote

• Let Π = (KG, ENC, DEC) be a homomorphic encryption scheme. Enc2Vote(Π) is:
• Setup(ν): KG generates (SK, PK, [])
• Vote(PK, v): b ← ENCPK(v)
• Process Ballot([BB], b): [BB] ← [BB, b]
• Tallying([BB], x): where [BB] = [b1, b2, …, bn], compute b = b1 ⊗ b2 ⊗ … ⊗ bn; result ← DECSK(b); output result

20
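Instantiated with the toy exponent ElGamal from earlier, Enc2Vote's tallying step is one loop and one decryption. Again a sketch under my own illustrative parameters; the bulletin board is just a Python list of ciphertexts.

```python
import random

# Toy exponent-ElGamal (illustrative parameters, as before).
p, q, g = 23, 11, 2
x = random.randrange(1, q)
X = pow(g, x, p)

def enc(X, m):
    r = random.randrange(1, q)
    return pow(g, r, p), (pow(g, m, p) * pow(X, r, p)) % p

def dec(x, R, C):
    gm = (C * pow(pow(R, x, p), -1, p)) % p
    return next(t for t in range(q) if pow(g, t, p) == gm)

# Enc2Vote: each ballot is an encryption of a 0/1 vote.
votes = [1, 0, 1, 1]
BB = [enc(X, v) for v in votes]          # the bulletin board

# Tallying: multiply all ballots componentwise, then decrypt once.
R, C = 1, 1
for Ri, Ci in BB:
    R, C = (R * Ri) % p, (C * Ci) % p
print(dec(x, R, C))  # 3 = 1 + 0 + 1 + 1
```

Only the aggregate ciphertext is ever decrypted, so individual votes never appear in the clear during tallying.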

Page 20:

Attack against privacy

21

(Diagram: P1 posts C1 ← ENCPK(v1), P2 posts C2 ← ENCPK(v2); the dishonest voter P3 copies C1 from the board and posts it as his own ballot. The board holds (C1, C2, C1); the tallier uses SK to obtain v1, v2, v3 and outputs ρ(v1, v2, v3) = 2v1 + v2.)

• Assume that votes are either 0 or 1
• If the result is 0 or 1 then v1 was 0; otherwise v1 was 1

FIX: weed out equal ciphertexts

Page 21:

New attack

22

(Diagram: P1 posts C1 ← ENCPK(v1), P2 posts C2 ← ENCPK(v2); P3 calculates C0 = ENCPK(0) and C = C1 ⊗ C0 = ENCPK(v1), and posts C. The board holds (C1, C2, C), with C ≠ C1, so nothing is weeded out; the tallier uses SK to obtain v1, v2, v3 and outputs ρ(v1, v2, v3) = 2v1 + v2.)

FIX: make sure ciphertexts cannot be mauled, and weed out equal ciphertexts

Page 22:

Non-malleable encryption (NM-CPA)

23

Non-malleability of π = (Setup, Kg, Enc, Dec):

• params ← Setup(ν); (PK, SK) ← Kg(params)
• The challenger sends the public key PK to the adversary
• The adversary submits two messages M0, M1
• The challenger picks a bit b and returns C ← EncPK(Mb)
• The adversary submits ciphertexts C1, C2, …, Cn (all different from C) and receives Mi ← DecSK(Ci), for i = 1..n
• The adversary outputs a guess d; it wins if d = b

Good definition?

Page 23:

ElGamal is not non-malleable

24

• Any homomorphic scheme is malleable: given EncPK(m) one can efficiently compute EncPK(m+1) (by multiplying with an encryption of 1)

• For ElGamal:
  • submit 0, 1 as the challenge messages
  • obtain c = (R, C)
  • submit (R, C·g) for decryption: if the response is 1 then b is 0; if the response is 2 then b is 1
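The mauling step is a one-liner against the toy ElGamal from earlier. This sketch (illustrative parameters and helper names of mine) plays both roles: the "challenger" encrypts a hidden bit, and the adversary recovers it with a single decryption of a different ciphertext.

```python
import random

# Toy exponent-ElGamal again (illustrative parameters only).
p, q, g = 23, 11, 2
x = random.randrange(1, q)
X = pow(g, x, p)

def enc(X, m):
    r = random.randrange(1, q)
    return pow(g, r, p), (pow(g, m, p) * pow(X, r, p)) % p

def dec(x, R, C):
    gm = (C * pow(pow(R, x, p), -1, p)) % p
    return next(t for t in range(q) if pow(g, t, p) == gm)

b = random.randrange(2)          # challenger's hidden bit
R, C = enc(X, b)                 # challenge ciphertext

Cm = (C * g) % p                 # maul: (R, C*g) encrypts b+1 and differs from (R, C)
recovered = dec(x, R, Cm) - 1    # a decryption oracle would answer b+1
assert recovered == b
```

The mauled ciphertext is never equal to the challenge, so a "weed out equal ciphertexts" rule does not stop it; only non-malleability does.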

Page 24:

Ballot secrecy for SPS [BCPSW11]

25

(Diagram: the challenger holds PK, SK, a bit b, and two boards BB0 and BB1; the adversary sees only BBb. For each pair of honest votes (h0, h1) named by the adversary, C0 ← VotePK(h0) is placed on BB0 and C1 ← VotePK(h1) on BB1. Ballots C submitted by the adversary are placed on both boards. The result is r ← TallySK(BB0). The adversary outputs a guess d and wins if d = b.)

Page 25:

27

Theorem: If s a non-malleable encryption scheme then Env2Vote() has vote secrecy.

PK

SK

h 0,h 1 BB

C’

C ENCPK(hb)

dresult

rF(H0,V)

h0,h1

C1, C2,…, Ct

d

v1, v2,…, vt

PK

CC’

PKParams Setup() (PK,SK ) Kg (params)

b C EncPK(Mb)

Mi DecPK(Ci), for i=1..n

win d=b

Page 26:

ZERO KNOWLEDGE PROOFS 28

Page 27:

Interactive proofs

29

(Diagram: the Prover, holding (X, w), exchanges messages M1, M2, M3, …, Mn with the Verifier, holding X; the Verifier outputs Accept/Reject.)

The prover wants to convince the verifier that something is true about X; formally, that Rel(X, w) holds for some w. Variant: the prover actually knows such a w.

Examples:
• Relg,h((X, Y), z) iff X = g^z and Y = h^z
• Relg,X((R, C), r) iff R = g^r and C = X^r
• Relg,X((R, C), r) iff R = g^r and C/g = X^r
• Relg,X((R, C), r) iff (R = g^r and C = X^r) or (R = g^r and C/g = X^r)

Page 28:

Properties (informal)

• Completeness: an honest prover always convinces an honest verifier of the validity of the statement
• Soundness: a dishonest prover can cheat only with small probability
• Zero knowledge: no other information is revealed
• Proof of knowledge: a witness can be extracted from a successful prover

30

Page 29:

Equality of discrete logs [CP92]

• Fix a group G and generators g and h
• Relg,h((X, Y), z) = 1 iff X = g^z and Y = h^z

• P → V: U := g^r, V := h^r (where r is a random exponent)
• V → P: c (where c is a random exponent)
• P → V: s := r + zc
• V checks: g^s = U·X^c and h^s = V·Y^c

31
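The three moves can be traced in Python. A toy run only: p = 23, q = 11, g = 2, h = 3 are illustrative parameters I picked so that both generators lie in the order-q subgroup, and exponents are reduced mod q (the group order), which is why the final checks go through.

```python
import random

# Toy run of the Chaum-Pedersen protocol. g = 2 and h = 3 both generate
# the order-11 subgroup of Z_23^*; parameters are illustrative only.
p, q, g, h = 23, 11, 2, 3

z = random.randrange(1, q)                 # the witness
X, Y = pow(g, z, p), pow(h, z, p)          # statement: dlog_g X = dlog_h Y

# P -> V: commitment
r = random.randrange(q)
U, V = pow(g, r, p), pow(h, r, p)

# V -> P: random challenge
c = random.randrange(q)

# P -> V: response (exponents live mod q, the group order)
s = (r + z * c) % q

# V checks both verification equations
assert pow(g, s, p) == (U * pow(X, c, p)) % p
assert pow(h, s, p) == (V * pow(Y, c, p)) % p
```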

Page 30:

Completeness

• If X = g^z and Y = h^z:

• P → V: U := g^r, V := h^r
• V → P: c
• P → V: s := r + zc
• V checks: g^s = U·X^c and h^s = V·Y^c

• The check succeeds: g^s = g^{r+zc} = g^r·g^{zc} = U·X^c (and similarly for h)

32

Page 31:

(Special) Soundness

• From two accepting transcripts with the same first message one can extract the witness
• ((U, V), c0, s0) and ((U, V), c1, s1) such that:
  • g^{s0} = U·X^{c0} and h^{s0} = V·Y^{c0}
  • g^{s1} = U·X^{c1} and h^{s1} = V·Y^{c1}

• Dividing: g^{s0-s1} = X^{c0-c1} and h^{s0-s1} = Y^{c0-c1}

• Dlogg X = (s0-s1)/(c0-c1) = Dlogh Y

33
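The extraction formula is one modular division. This sketch (same illustrative toy group) rewinds a prover by answering one commitment on two challenges, then recovers z exactly as the division step says:

```python
import random

# Special soundness for Chaum-Pedersen: two accepting transcripts sharing
# the commitment (U, V) but with different challenges reveal the witness.
p, q, g, h = 23, 11, 2, 3                   # illustrative toy group

z = random.randrange(1, q)                  # the prover's witness
X, Y = pow(g, z, p), pow(h, z, p)

r = random.randrange(q)
U, V = pow(g, r, p), pow(h, r, p)           # one commitment ...

c0, c1 = 4, 9                               # ... answered on two distinct challenges
s0, s1 = (r + z * c0) % q, (r + z * c1) % q

# z = (s0 - s1) / (c0 - c1) mod q; q is prime, so c0 - c1 is invertible
extracted = ((s0 - s1) * pow(c0 - c1, -1, q)) % q
assert extracted == z
```

This is why a cheating prover who can answer more than one challenge for the same commitment must "know" the witness.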

Page 32:

(HV) zero-knowledge

34

(Diagram: in the real execution, the prover, holding X and w with Rel(X, w), and the verifier, holding X, produce a transcript (R, c, s); the simulator produces a transcript (R, c, s) from X alone.)

There exists a simulator SIM that produces transcripts that are indistinguishable from those of the real execution.

Page 33:

Special zero-knowledge

35

(Diagram: as before, the real execution produces transcripts (R, c, s).)

Simulator of a special form:
• pick random c
• pick random s
• R ← SIM(c, s)

Page 34:

Special zero-knowledge for CP

• Accepting transcripts: ((U, V), c, s) such that g^s = U·X^c and h^s = V·Y^c

• Special simulator:
  • select random c
  • select random s
  • set U = g^s·X^{-c} and V = h^s·Y^{-c}
  • output ((U, V), c, s)

36
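The simulator can be checked directly: pick (c, s) first, solve for the commitment, and the transcript verifies without the witness ever being used. A sketch in the same illustrative toy group:

```python
import random

# Special simulator for Chaum-Pedersen: choose (c, s) first, then solve
# for a commitment that makes the transcript accept. No witness is used.
p, q, g, h = 23, 11, 2, 3                   # illustrative toy group
z = random.randrange(1, q)
X, Y = pow(g, z, p), pow(h, z, p)           # a true statement (z unused below)

c = random.randrange(q)
s = random.randrange(q)
U = (pow(g, s, p) * pow(X, -c, p)) % p      # U = g^s * X^{-c}
V = (pow(h, s, p) * pow(Y, -c, p)) % p      # V = h^s * Y^{-c}

# The simulated transcript ((U, V), c, s) passes verification:
assert pow(g, s, p) == (U * pow(X, c, p)) % p
assert pow(h, s, p) == (V * pow(Y, c, p)) % p
```

Since the simulated (U, V, c, s) has the same distribution as a real transcript, an honest-verifier transcript leaks nothing about z.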

Page 35:

OR-proofs [CDS95,C96]

37

(Diagram: a Σ-protocol for Rel1(X, w) with transcript (R1, c1, s1), and a Σ-protocol for Rel2(Y, w) with transcript (R2, c2, s2).)

Design a protocol for Rel3(X, Y, w) where: Rel3(X, Y, w) iff Rel1(X, w) or Rel2(Y, w)

Page 36:

OR-proofs

38

(Diagram: the prover, holding (X, Y, w), sends both commitments R1, R2; the verifier sends a single challenge c; the prover answers with s1 and s2.)

Page 37:

OR-proofs

39

(Diagram: the prover runs the real protocol for the branch it can prove, say Rel1(X, w); for the other branch it picks c2 itself and sets c1 = c - c2.)

Page 38:

OR-proofs

40

(Diagram: the prover sends R1, R2; receives c; answers with (c1, s1) and (c2, s2), where c1 = c - c2.)

To verify: check that c1+c2=c and that (R1,c1,s1) and (R2,c2,s2) are accepting transcripts for the respective relations.

Page 39:

Non-interactive proofs

41

(Diagram: the Prover, holding (X, w), sends a single proof π to the Verifier, holding X.)

Page 40:

The Fiat-Shamir/Blum transform

42

(Diagram: in the interactive protocol, the transcript is (R, c, s); in the transformed protocol the prover computes the challenge itself as c = H(X, R).)

The proof is (R, s). To verify: compute c = H(X, R), then check (R, c, s) as before.

Theorem: If (P, V) is an honest-verifier zero-knowledge Sigma protocol, then FS/B(P, V) is a simulation-sound extractable non-interactive zero-knowledge proof system (in the random oracle model).
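Applying the transform to Chaum-Pedersen is mechanical: hash the statement and commitment to get the challenge. A sketch under my usual illustrative parameters; SHA-256 reduced mod q stands in for the random oracle H, and the hash-to-Z_q encoding is a simplification of mine.

```python
import hashlib
import random

# Fiat-Shamir applied to Chaum-Pedersen: the challenge is derived by
# hashing the statement and the commitment. Illustrative toy parameters.
p, q, g, h = 23, 11, 2, 3

def H(*vals):
    # Sketch of a hash into Z_q; a real scheme needs a careful encoding.
    data = "|".join(map(str, vals)).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

z = random.randrange(1, q)
X, Y = pow(g, z, p), pow(h, z, p)

# Prove: commitment, hashed challenge, response. The proof is ((U, V), s).
r = random.randrange(q)
U, V = pow(g, r, p), pow(h, r, p)
c = H(X, Y, U, V)
s = (r + z * c) % q

# Verify: recompute the challenge and run the usual checks.
c2 = H(X, Y, U, V)
assert c2 == c
assert pow(g, s, p) == (U * pow(X, c2, p)) % p
assert pow(h, s, p) == (V * pow(Y, c2, p)) % p
```

Binding the challenge to the commitment through H is what removes the interaction: the prover can no longer choose c after seeing the commitment.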

Page 41:

ElGamal + PoK

• Let v ∈ {0,1} and (R, C) = (g^r, g^v·X^r)
• Set u = 1 - v
• Pick c, s at random
• Set Au = g^s·R^{-c}, Bu = X^s·(C·g^{-u})^{-c}

43

Page 42:

ElGamal + PoK

• Pick a at random; set Av = g^a, Bv = X^a

• h ← H(A0, B0, A1, B1)
• c' ← h - c
• s' ← a + r·c'
• Output ((R, C), A0, B0, A1, B1, s, s', c, c')

44

Theorem: ElGamal+PoK as defined is NM-CPA, in the random oracle model.

Theorem: Enc2Vote(ElGamal+PoK) has vote secrecy, in the random oracle model.

Page 43:

Random oracle [BR93,CGH98]

• Unsound heuristic: there exist schemes that are secure in the random oracle model for which any instantiation is insecure

• Efficiency vs security

45

Page 44:

Exercise: Distributed ElGamal decryption

46

Party Pi has secret key xi and public key Xi = g^{xi}

The parties share the secret key x = x1 + x2 + … + xk; the corresponding public key is X = ∏ Xi = g^{Σxi} = g^x

To decrypt (R, C): party Pi computes yi ← R^{xi} and proves that dlog(R, yi) = dlog(g, Xi); output C/(y1·y2·…·yk) = C/R^x

Exercise: design a non-interactive zero-knowledge proof that Pi behaves correctly
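The decryption part of the exercise (without the correctness proofs, which are the exercise itself) runs as follows. A sketch with k = 3 parties in my usual illustrative toy group; note that the full secret key x is never materialized.

```python
import random

# Distributed exponent-ElGamal decryption with k = 3 parties.
# Each party contributes y_i = R^{x_i}; no one ever holds x itself.
p, q, g = 23, 11, 2                        # illustrative toy group

xs = [random.randrange(1, q) for _ in range(3)]   # per-party shares x_i
X = 1
for xi in xs:
    X = (X * pow(g, xi, p)) % p            # joint public key X = g^{sum x_i}

m = 5
r = random.randrange(1, q)
R, C = pow(g, r, p), (pow(g, m, p) * pow(X, r, p)) % p   # encrypt m under X

ys = [pow(R, xi, p) for xi in xs]          # each party publishes y_i = R^{x_i}
y = 1
for yi in ys:
    y = (y * yi) % p                       # y = R^{sum x_i} = R^x

gm = (C * pow(y, -1, p)) % p               # g^m = C / R^x
m_rec = next(t for t in range(q) if pow(g, t, p) == gm)
print(m_rec)  # 5
```

In the full protocol each party would attach a Chaum-Pedersen proof that dlog(R, yi) = dlog(g, Xi), which is exactly the NIZK the exercise asks for.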

Page 45:

Ballot secrecy vs. vote privacy

47

• Assume:
  • ρ(v1, v2, …, vn) = (v1, v2, …, vn), or
  • ρ(v1, v2, …, vn) = v1 + v2 + … + vn, and the final result is 0 or n

• The result function itself reveals information about the votes; ballot secrecy does not account for this loss of privacy

Page 46:

AN INFORMATION THEORETIC APPROACH TO VOTE PRIVACY [BCPW12?]

48

Page 47:

Information theory

49

• Uncertainty regarding a certain value (e.g. the honest votes): assume the votes follow a distribution X (we use distributions and random variables interchangeably)

• Assume that F measures the difficulty that an unbounded adversary has in predicting X (e.g. if X ← {m0, m1} uniformly, then F(X) = 1)

Page 48:

Conditional privacy measure

50

• Let X, Y be distributed according to a joint distribution
• F(X | Y) measures the uncertainty regarding X given that Y is observed (by an unbounded adversary):
  • F(X | Y) ≤ F(X)
  • If X and Y are independent, then F(X | Y) = F(X)
  • If X is computable from Y, then F(X | Y) = 0
  • If Y' can be computed as a function of Y, then F(X | Y) ≤ F(X | Y')

Page 49:

Computational variant

51

• F(M| EncPK(M)) = ?

Page 50:

Computational variant

52

• F(M | EncPK(M)) = 0, since M is computable from EncPK(M) by an unbounded adversary

• How much uncertainty about X remains after a computationally bounded adversary sees Y?

• Look at Y' such that (X, Y) ≈ (X, Y')
• Define: Fc(X | Y) ≥ r if there exists Y' such that (X, Y) ≈ (X, Y') and F(X | Y') ≥ r

Page 51:

Example

53

Proposition: If ENC is an IND-CPA encryption scheme then Fc(M | EncPK(M)) ≥ F(M)

Proof:
• Since ENC is IND-CPA secure: (M, EncPK(M)) ≈ (M, EncPK(0))
• Hence Fc(M | EncPK(M)) ≥ F(M | EncPK(0)) = F(M)

Page 52:

Variation

54

• Consider random variables: T (target), L (leakage), R (result)

• Define: Fc(T | L, R) ≥ r if there exists L' such that (T, L, R) ≈ (T, L', R) and F(T | L', R) ≥ r

Page 53:

Application to voting

55

• D: distribution of the honest votes
• T: target function on supp(D), e.g.
  • T(v1, v2, …, vm) = vi
  • T(v1, v2, …, vm) = (vi = vj)?
  • other choices

• The adversary sees its view viewA(D, π), which includes ρ(D, vA)

Page 54:

Measure(s) for vote privacy

56

• Definition: For D a distribution on honest votes, T a target function, and π a protocol, define:

Mc(T, D, π) = infA Fc(T(D) | viewA(D, π), ρ(D, vA))

• Can also define an information-theoretic notion:

M(T, D, π) = infA F(T(D) | viewA(D, π), ρ(D, vA))

Page 55:

Privacy of idealized protocols

57

Proposition: M(T, D, π) = infA F(T(D) | ρ(D, vA))

Page 56:

Recall: vote secrecy for SPS

58

(Diagram: the ballot-secrecy game as before: C0 ← ENCPK(h0) on BB0, C1 ← ENCPK(h1) on BB1, adversary ballots C placed on both boards, the adversary sees BBb, the result is r ← TallySK(BB0), and the adversary wins if its guess d equals b.)

Page 57:

Recall: vote secrecy for SPS

59

(Diagram: the same game, with the honest votes on BB0 replaced by 0: C0 ← ENCPK(0).)

Proposition: A secure SPS implies that (D, viewA(D, π), ρ(D, vA)) ~ (D, viewA(0, π), ρ(D, vA))

Corollary: A secure SPS π implies that Mc(T, D, π) = M(T, D, π)

Proof: Mc(T, D, π) = infA Fc(T(D) | viewA(D, π), ρ(D, vA)) ≥ infA F(T(D) | viewA(0, π), ρ(D, vA)) = infA F(T(D) | ρ(D, vA)) = M(T, D, π)

Page 58:

Choice of entropy

• Average min-entropy: measures the probability that an observer guesses the target function of the votes
• Min min-entropy: measures the probability that an observer guesses the target function of the votes for the worst possible election outcome
• Min Hartley entropy: measures the minimum number of values that the target function can take for any assignment of votes

61

Page 59:

NOT COVERED 62

Page 60:

Threshold decryption

63

Party Pi has (xi, Xi = g^{xi}); x = x1 + x2 + … + xk; X = ∏ Xi = g^{Σxi} = g^x

To decrypt (R, C): Pi computes yi ← R^{xi} and proves that dlog(yi, R) = dlog(g, Xi); output C/(y1·y2·…·yk) = C/R^x

Page 61:

Simulation-based models [Groth05]

64

(Diagram: the real protocol π is compared with an ideal protocol π_ideal.)

Security: π is secure (as secure as π_ideal) if for any adversary there exists a simulator such that the overall outputs are indistinguishable

Page 62:

Games vs. simulation security

• Games
  • Not always intuitive
  • Difficult to design: the challenger/queries should reflect all potential uses of the system and permit access to all the information that can be gleaned

• Simulation
  • More intuitive (for simple systems)
  • Too demanding (e.g. adaptive security)

65

Page 63:

Dolev-Yao models [DKR09]

• Protocols specified in a process algebra (applied-pi calculus)

• Vote secrecy: P[vote1/v1, vote2/v2] ≈ P[vote2/v1, vote1/v2]

• Abstraction?
• Relation with the game-based definition?

67

Page 64:

Incoercibility/Receipt freeness

68

Page 65:

Mix-nets

69

Page 66:

Everlasting privacy

70

Page 67:

Commitments

71

Page 68:

Fully homomorphic encryption

72

Page 69:

Conclusions

• Models (symbolic, computational) are important
• Models, models, models…
• Proofs (symbolic, computational) are important
• Proofs, proofs?
• A first step towards a privacy measure

73

Page 70:

Thanks

74