Internet voting: designs and challenges
Mark Ryan, University of Birmingham
Designing Secure Systems, October 2015
Outline
1 Current situation and potential
2 Desired properties
3 Trust assumptions
4 Some cryptography
5 Example 1: Helios (2009)
6 Conclusions
Why can’t we vote online?
Potential risks
Vote fraud! Voters / candidates / organised criminals / foreign nations / hackers / lobbyists / secret service / military / ...
Privacy invasion: family members / employer / advertisers / ...
Disenfranchisement: elderly people / uneducated people / those already disenfranchised from the digital world
Electronic voting: potential
Electronic voting potentially offers
Efficiency
higher voter participation
greater accuracy
lower costs
Better security
vote-privacy even in the presence of corrupt election authorities
voter verification, i.e. the ability of voters and observers to check the declared outcome against the votes cast
Governments the world over have been trialling e-voting, e.g. the USA, UK, Canada, Brazil, the Netherlands and Estonia, but with mixed results.
It can also be useful for smaller-scale elections (student guild, shareholder voting, trade union ballots, local government).
Current situation
The potential benefits have turned out to be hard to realise.
In the UK:
The May 2007 elections included 5 local authorities that piloted a range of electronic voting machines.
The Electoral Commission report concluded that the implementation and security risk was significant and unacceptable, and recommended that no further e-voting take place until a sufficiently secure and transparent system is available.
In the USA:
Diebold controversy since 2003, when code leaked on the internet.
The Kohno/Stubblefield/Rubin/Wallach analysis concluded that the Diebold system fell far below even the most minimal security standards. Voters without insider privileges could cast unlimited votes without being detected.
Situation in USA, continued
In 2007, Secretary of State for California Debra Bowen commissioned a “top-to-bottom” review by computer science academics of the four machines certified for use in the state. The result was a catalogue of vulnerabilities, including
appalling software engineering practices, such as hardcoding crypto keys in source code, bypassing OS protection mechanisms, ...
susceptibility of voting machines to viruses that propagate from machine to machine, and that could maliciously cause votes to be recorded incorrectly or miscounted
“weakness-in-depth”: architecturally unsound systems in which even as known flaws are fixed, new ones are discovered.
In response to these reports, she decertified all four types of voting machine for regular use in California on 3 August 2007.
Situation in USA – 2008 election
Several other states followed California’s lead, and decertified electronic voting machines.
But other states have continued to use touch-screen systems, having invested massively. (E.g., the state of Colorado spent $41M on electronic voting systems for its 3M voters, on machines that California has now decertified...)
Diebold, one of the main suppliers, tried unsuccessfully to sell their e-voting business. Instead, they rebranded it ‘Premier Election Solutions’ and revised their profit forecasts downwards.
Current situation in Estonia
Estonia is a tiny former Soviet republic (pop. 1.4M), nicknamed “e-Stonia” because of its tech-savvy character.
The Oct. 2005 local election allowed voters to cast ballots on the internet. There were 9,317 electronic votes cast out of 496,336 votes in total (1.9% participated online).
Officials hailed the experiment a success, saying there were no reports of hacking or flaws. The system was based on Linux.
Voters need a special ID smartcard, a $24 device that reads the card, and a computer with internet access. About 80% of Estonian voters have the cards anyway, also used since 2002 for online banking and tax records.
2007 general election: 30,275 voters (3%) used internet voting.
Then used in elections in 2009, 2011, 2013, 2014, 2015.
2015 general election: 176,491 voters (30%) used internet voting.
Internet voting and coercion resistance
The possibility of coercion (e.g. by family members) seems very hard to avoid for internet voting.
In Estonia, the threat is somewhat mitigated:
The election system allows multiple online votes to be cast by the same person during the days of advance voting, with each vote cancelling the previous one.
The system gives priority to paper ballots; a paper ballot cancels any previous online ballot by the same person.
Where are we?
1 Current situation and potential
2 Desired properties
3 Trust assumptions
4 Some cryptography
5 Example 1: Helios (2009)
6 Conclusions
Desired properties

Verifiability
● Outcome of election is verifiable by voters and observers
● You don’t need to trust election software

Incoercibility
● Your vote is private
● even if you try to cooperate with a coercer
● even if the coercer is the election authorities

Usability
● Vote & go
● Verify any time
Examples

Some example voting methods, compared against the three properties (verifiable, usable, incoercible):
raising hands
website voting
anonymous credentials plus Tor
?
Internet voting in Estonia
The Estonian system is not verifiable.
In 2013, they introduced verifiability via a mobile device. But that is a very weak form of verification (the system could just be lying to you). “End-to-end verifiability” means you can verify that your vote was included in the outcome independently of any software or hardware that the election managers use.
Also, it is not coercion-resistant (even with re-voting), if election managers and sysadmins are considered part of the attacker.
So, it has neither of the two security properties we want!
Takoma Park (USA); and Victoria State (Aus)
To my knowledge, no instance of e2e-verifiable electronic voting has ever taken place in national political elections.
With two exceptions!
Takoma Park, a city in Maryland, USA. In 2009, 1722 votes were cast and 66 people checked their codes – almost 4%. The system, called Scantegrity, was used again in 2011.
Victoria State, Australia, in 2014. Again, a small number (<2000) of voters in special circumstances used an e2e-verifiable system called Pret-a-Voter, developed in a joint research project at the Universities of Birmingham, Surrey and Luxembourg.
E2e-verifiable voting has been used in non-political elections, such as the election of university officials, and elections in professional associations.
Voting system: desired properties in more detail
Individual verifiability: a voter can verify that her vote was really counted. We can split this into cast-as-intended and counted-as-cast.

Universal verifiability: a voter can verify that the published outcome really is the sum of all the votes.

Eligibility verifiability: a voter can verify that only eligible votes have been counted.

Fairness: no early results can be obtained which could influence the remaining voters.

Privacy: the fact that a particular voter voted in a particular way is not revealed to anyone.

Receipt-freeness: a voter cannot later prove to a coercer that she voted in a certain way.

Coercion-resistance: a voter cannot interactively cooperate with a coercer to prove that she voted in a certain way.

Usability: this is a collection of properties, including vote-and-go.
... and all this even in the presence of corrupt election authorities... and user devices with malware.
Are these properties even simultaneously satisfiable?
Contradiction?
Receipt-freeness: a voter cannot later prove to a coercer that she voted in a certain way.
Individual verifiability: a voter can verify that her vote was really counted, and if it wasn’t, can prove that.
Yes, but we need to define them carefully (= narrowly).
This tension between opacity and transparency is the reason electronic voting is hard (and interesting).
Where are we?
1 Current situation and potential
2 Desired properties
3 Trust assumptions
4 Some cryptography
5 Example 1: Helios (2009)
6 Conclusions
How could it be secure?
Trust assumption possibilities
Nothing is required-to-be-trusted
Some hardware or software is required-to-be-trusted
● but you can audit it on some cut-and-choose basis
● e.g. PaV, Helios 2.0
Some hardware or software is required-to-be-trusted
● but you can bring/make/source your own
● e.g. FOO, JCJ/Civitas
Some hardware or software is required-to-be-trusted
● and you just have to assume it is
● e.g. current DRE solutions
Security by trusted client software
[Diagram: client and server. The client software is trusted by the user, but does not need to be trusted by the authorities or other voters. The server side is not trusted by the user, and doesn’t need to be trusted by anyone.]
Where are we?
1 Current situation and potential
2 Desired properties
3 Trust assumptions
4 Some cryptography
5 Example 1: Helios (2009)
6 Conclusions
Symmetric key cryptography
We can abstract away from the details of the cryptography, with an equation:
sdec(k , senc(k , m)) = m
Equivalently, we could put the key as a subscript.
sdeck(senck(m)) = m
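To make the equation concrete, here is a toy sketch in Python using repeating-key XOR (an insecure illustration only, not a real cipher; since XOR is self-inverse, senc and sdec turn out to be the same function):

```python
from itertools import cycle

def senc(k: bytes, m: bytes) -> bytes:
    """Toy symmetric 'encryption': XOR the message with a repeating key.
    Insecure in practice; it only illustrates sdec(k, senc(k, m)) = m."""
    return bytes(a ^ b for a, b in zip(m, cycle(k)))

# XOR is self-inverse, so decryption is the same operation as encryption.
sdec = senc

k = b"secret key"
m = b"yes"
assert sdec(k, senc(k, m)) == m
```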
Public key cryptography
In public key cryptography, we generate a key from a random seed s in two parts: the private or secret key sk(s), and the public key pk(s). They are mathematically related:

pdec(sk(s), penc(pk(s), r, m)) = m

Here, r is a random value that is included in the encryption process.
Or, if you like,

pdec_sk(s)(penc^r_pk(s)(m)) = m
There are several cryptographic schemes that give us this powerful behaviour. The best known one, RSA, was created in 1978.
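For concreteness, here is a sketch of textbook RSA with artificially small, standard textbook numbers. Note one simplification: textbook RSA is deterministic, so the random r from the equation above has no counterpart here; real deployments add randomised padding for exactly that reason.

```python
# Textbook RSA with tiny primes, for illustration only.
p_, q_ = 61, 53
n = p_ * q_              # modulus, part of the public key: 3233
phi = (p_ - 1) * (q_ - 1)
e = 17                   # public exponent; gcd(e, phi) = 1
d = pow(e, -1, phi)      # private exponent: d * e ≡ 1 (mod phi)

def penc(m: int) -> int:
    """Encrypt with the public key (n, e)."""
    return pow(m, e, n)

def pdec(c: int) -> int:
    """Decrypt with the private key (n, d)."""
    return pow(c, d, n)

m = 42
assert pdec(penc(m)) == m
```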
Homomorphic public key crypto
Some schemes have a homomorphic property:
pdec_sk(s)( penc^r1_pk(s)(m1) ∗ penc^r2_pk(s)(m2) ) = m1 + m2
This means that you can combine two ciphertexts (by multiplying them), and you obtain a ciphertext that corresponds to having added the plaintexts together.
Insight: we could count votes in that way. Just multiply the ciphertexts – it corresponds to adding up the votes.
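This is how Helios-style tallying works in principle. A minimal Python sketch of exponential ElGamal (one scheme with this property), using toy parameters: the plaintext lives in the exponent, so multiplying ciphertexts adds plaintexts, and decryption finishes with a small brute-force search to recover the tally.

```python
import random

# Toy exponential ElGamal: penc(m) = (g^r, g^m * h^r) mod p.
# Parameters are tiny and for illustration only.
p, g = 10007, 5                      # public group parameters
x = random.randrange(2, p - 1)       # secret key
h = pow(g, x, p)                     # public key

def penc(m: int) -> tuple:
    r = random.randrange(2, p - 1)
    return pow(g, r, p), (pow(g, m, p) * pow(h, r, p)) % p

def mult(c1: tuple, c2: tuple) -> tuple:
    """Componentwise product of ciphertexts: adds the plaintexts."""
    return (c1[0] * c2[0]) % p, (c1[1] * c2[1]) % p

def pdec(c: tuple, max_m: int = 1000) -> int:
    a, b = c
    gm = (b * pow(a, p - 1 - x, p)) % p      # recover g^m
    for m in range(max_m + 1):               # small search for m itself
        if pow(g, m, p) == gm:
            return m
    raise ValueError("plaintext out of range")

# Three encrypted votes (1 = yes, 0 = no): multiply to tally, then decrypt.
votes = [penc(1), penc(1), penc(0)]
tally = votes[0]
for v in votes[1:]:
    tally = mult(tally, v)
assert pdec(tally) == 2
```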
Zero knowledge proofs
Suppose my vote consists of penc^r_pk(s)(0) for a “no” vote, or penc^r_pk(s)(1) for a “yes” vote.

Here, we assume the votes will be homomorphically combined, and then a trusted party will decrypt the result, which will be the number of “yes” votes.

But what stops me using a rogue software client that actually submits penc^r_pk(s)(100), or perhaps penc^r_pk(s)(−100)?

Answer: a zero-knowledge proof that the plaintext is either 0 or 1.

zkpv(penc^r_pk(s)(0)) = true
zkpv(penc^r_pk(s)(1)) = true
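The standard way to realise such a zkpv is a disjunctive (“OR”) Chaum-Pedersen proof, made non-interactive via Fiat-Shamir hashing. The following Python sketch (toy parameters, illustration only, not the exact Helios code) has the prover answer honestly for the plaintext it actually encrypted and simulate the other branch; the verifier learns that the plaintext is 0 or 1, but not which.

```python
import hashlib
import random

q = 11
p = 2 * q + 1          # safe prime: 23
g = 4                  # generates the order-q subgroup of squares mod 23

def H(*vals) -> int:
    """Fiat-Shamir challenge in Z_q."""
    data = ",".join(map(str, vals)).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def inv(y: int) -> int:
    return pow(y, -1, p)

def penc(h, m, r):
    """Exponential ElGamal: (g^r, g^m * h^r)."""
    return pow(g, r, p), (pow(g, m, p) * pow(h, r, p)) % p

def prove(h, m, r, a, b):
    """Prove (a, b) encrypts 0 or 1, given the real plaintext m and random r."""
    A, B, c, f = [0, 0], [0, 0], [0, 0], [0, 0]
    other = 1 - m
    # Simulated transcript for the branch we did NOT use.
    c[other], f[other] = random.randrange(q), random.randrange(q)
    A[other] = pow(g, f[other], p) * inv(pow(a, c[other], p)) % p
    d = b * inv(pow(g, other, p)) % p        # b / g^other
    B[other] = pow(h, f[other], p) * inv(pow(d, c[other], p)) % p
    # Honest commitment for the real branch.
    w = random.randrange(q)
    A[m], B[m] = pow(g, w, p), pow(h, w, p)
    chal = H(a, b, A[0], B[0], A[1], B[1])
    c[m] = (chal - c[other]) % q
    f[m] = (w + c[m] * r) % q
    return A, B, c, f

def verify(h, a, b, proof) -> bool:
    A, B, c, f = proof
    if (c[0] + c[1]) % q != H(a, b, A[0], B[0], A[1], B[1]):
        return False
    for i in (0, 1):
        d = b * inv(pow(g, i, p)) % p        # b / g^i
        if pow(g, f[i], p) != A[i] * pow(a, c[i], p) % p:
            return False
        if pow(h, f[i], p) != B[i] * pow(d, c[i], p) % p:
            return False
    return True

x = random.randrange(1, q)
h = pow(g, x, p)
for m in (0, 1):
    r = random.randrange(1, q)
    a, b = penc(h, m, r)
    assert verify(h, a, b, prove(h, m, r, a, b))
```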
Distributed decryption
Single-party decryption: we generate a key from a random seed s in two parts: the private or secret key sk(s), and the public key pk(s).

pdec_sk(s)(penc^r_pk(s)(m)) = m

Multiple-party decryption: we generate three secret keys sk1(s), sk2(s), sk3(s), and the public key pk(s). Let

pd1 = pdec_sk1(s)(penc^r_pk(s)(m))
pd2 = pdec_sk2(s)(penc^r_pk(s)(m))
pd3 = pdec_sk3(s)(penc^r_pk(s)(m))
and then stipulate
decCombination(pd1, pd2, pd3) = m
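One simple way to instantiate these equations (a sketch; real systems use threshold schemes so that a subset of trustees suffices) is ElGamal with an additively shared secret key x = x1 + x2 + x3: each party publishes a partial decryption a^xi of a ciphertext (a, b), and combining the shares strips off h^r without any single party ever holding x.

```python
import random

p, g = 10007, 5                      # toy ElGamal parameters
x1, x2, x3 = (random.randrange(1, p - 1) for _ in range(3))
x = (x1 + x2 + x3) % (p - 1)         # full secret key, never held by one party
h = pow(g, x, p)                     # joint public key

def penc(m, r):
    """Exponential ElGamal: (g^r, g^m * h^r)."""
    return pow(g, r, p), (pow(g, m, p) * pow(h, r, p)) % p

def partial(a, xi):
    """Trustee i's decryption share: a^xi = g^(r*xi)."""
    return pow(a, xi, p)

def combine(b, pd1, pd2, pd3):
    """Multiply the shares to get a^x = h^r, then divide it out of b."""
    hr = (pd1 * pd2 * pd3) % p
    return (b * pow(hr, -1, p)) % p  # recovers g^m

a, b = penc(7, random.randrange(1, p - 1))
gm = combine(b, partial(a, x1), partial(a, x2), partial(a, x3))
assert gm == pow(g, 7, p)
```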
Cut-and-choose auditing
Suppose Alice wants to give Bob an envelope with £100 in it.
But Bob is not allowed to open the envelope. So how can he be sure there is £100 in it?

Answer:

Alice prepares 10 envelopes.
Bob randomly chooses 9 of them, and opens them. He sees that there is £100 in each of the 9 he opens.
So he reasons, with overwhelming probability there is £100 in the remaining one.
Cut-and-choose auditing
Suppose your computer wants to give you a ciphertext penc^r_pk(s)(v) for a value v you have chosen (and r is a random chosen by the computer).

How can you know the encryption is properly done? For example, how can you know the v you chose is the one that was used?
The computer could give you the value r. Then you could independently calculate penc^r_pk(s)(v) on another computer. But that would allow you to prove to someone else that the ciphertext was an encryption of v.
How could you be sure the correct value v is inside, without learning r and hence without making you vulnerable to coercion?

Answer:

The computer prepares n ciphertexts, all with the same v, and different values of r.
You randomly choose n − 1 of them, and it gives you the r for those ones. So you see that those n − 1 were prepared correctly.
So you reason, with overwhelming probability the remaining one is correct as well.
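The probability argument can be checked with a small simulation (hypothetical function names): a cheating computer makes one of its n ciphertexts bad, and the cheat survives the audit only when the single unopened ciphertext happens to be the bad one, i.e. with probability 1/n.

```python
import random

def audit(n: int, cheat: bool) -> str:
    """One run of cut-and-choose: n ciphertexts, the voter opens n - 1."""
    bad = random.randrange(n) if cheat else None
    kept = random.randrange(n)          # the one the voter does not open
    if not cheat:
        return "ok"
    return "fooled" if kept == bad else "caught"

trials = 20000
fooled = sum(audit(10, True) == "fooled" for _ in range(trials))
# The cheat escapes detection in roughly 1/n = 10% of runs.
print(round(fooled / trials, 2))
```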
Where are we?
1 Current situation and potential
2 Desired properties
3 Trust assumptions
4 Some cryptography
5 Example 1: Helios (2009)
6 Conclusions
Helios: uses
The elections

“More than 100,000 votes have been cast using Helios.”

The President of the Université catholique de Louvain (2009)
25,000 potential voters
5000 registered, 4000 voted
30% of voters checked their vote
No valid complaints

Verifiability
Anyone can write code to verify the election; sample code provided

The Princeton University Undergraduate Student Government (2010)
The board of the IACR (2010)

No coercion resistance
Only recommended for low-coercion environments
Re-votes are allowed, but don’t help w.r.t. an “insider” coercer

[Adida/de Marneffe/Pereira/Quisquater 09]
Helios 2.0
The user prepares a ballot (encrypted vote) on her computer, together with ZKPs to show that it is a valid vote.

Cut-and-choose auditability provides assurance of correctness.

Not much guarantee of privacy on the client side.

Ballots are checked and homomorphically combined into a single encrypted outcome.

The outcome is decrypted by a threshold of talliers.

Proof of correct decryption.
Helios properties
Let us consider for each one whether Helios has it or not:
Property                                      Helios
Individual verifiability: cast as intended    ✓
Individual verifiability: counted as cast     ✓
Universal verifiability                       ✓
Eligibility verifiability                     ✗
Fairness                                      ✓
Privacy                                       ✓
Receipt-freeness                              ✗
Coercion-resistance                           ✗
Usability: vote-and-go                        ✓
Usability: other aspects                      ✓
Conclusions
Electronic voting is coming, whether we like it or not. Sadly, the current systems are woefully inadequate.
In the USA and UK, they don’t manage to satisfy basic security properties, like resistance to virus attacks and to tampering. They don’t even try to satisfy stronger properties, like privacy guarantees against corrupt election officials, or universal and individual verifiability.
There are many “academic” protocols which have better properties.
They may underestimate the importance of scalability, deployability and usability issues. We also need to examine their real security more thoroughly:
Attack trees. Weakest links, and defence-in-depth.
We do not have any satisfactory system for internet voting at the present time.