
Page 1: Update on the UOCAVA Working Group

TGDC Meeting, July 2011

Update on the UOCAVA Working Group

Andrew Regenscheid
Mathematician, Computer Security Division, ITL
http://vote.nist.gov

Page 2: Update on the UOCAVA Working Group


Overview

The TGDC UOCAVA working group has three outstanding task items:

- High-level guidelines for UOCAVA voting systems
- Narrative risk analysis of the current UOCAVA voting process and the demonstration project system
- Low-level guidelines for the demonstration project system

Page 3: Update on the UOCAVA Working Group


Meeting Objectives

This session of today's meeting has three objectives:

- Decide how to proceed on the high-level guidelines, including decisions on:
  - Intended scope and purpose
  - Auditability/verifiability guidelines
  - Usability/accessibility guidelines
  - Resolution of FVAP's comments
- Decide on a course of action for conducting a risk analysis of the current UOCAVA voting process
- Discuss the process/timeline for approaching demonstration project guidelines

Page 4: Update on the UOCAVA Working Group


High-Level Guidelines

EAC/NIST/FVAP UOCAVA Roadmap:

“EAC and the TGDC, with technical support from NIST, and input from FVAP, will identify high-level, non-testable guidelines for remote electronic absentee voting systems. This effort will focus on the desirable characteristics of such systems and serve as a needs analysis for future pilots and research; and for the purposes of driving industry to implement solutions.”

Page 5: Update on the UOCAVA Working Group


High-Level Guidelines

Purpose
- Fulfill the charge from the UOCAVA Roadmap
- Interpreted the UOCAVA Roadmap language as asking for aspirational, high-level guidelines intended to identify goals for future UOCAVA voting systems
- Intent is that these high-level guidelines would form the basis for the development of low-level guidelines for the demonstration project and future UOCAVA voting systems

Scope
- Included both demonstration project systems and future systems
- Guidelines intended to be all-encompassing, covering roughly the same scope as future low-level guidelines

Page 6: Update on the UOCAVA Working Group


High-Level Guidelines

- Goal was to identify a small number (~25) of high-level guidelines that covered all important topics
- Build consensus around high-level concepts, and flesh out details in low-level guidelines for the future
- Emphasis on aspirational goals; we recognized some may not be achievable today

Page 7: Update on the UOCAVA Working Group


High-Level Guidelines: Topics

The current high-level guidelines draft includes:

- Voting functions
- Auditability
- Quality assurance and configuration management
- Reliability and availability
- Usability and accessibility
- Security
- Interoperability

Page 8: Update on the UOCAVA Working Group


High-Level Guidelines: Process

- NIST staff initially drafted high-level guidelines in sections using:
  - Earlier drafts of high-level guidelines
  - Council of Europe's Legal, Operational and Technical Standards for E-Voting
  - Research done to support VVSG development
  - Existing relevant standards
- UOCAVA and U&A working group members reviewed and edited the guidelines
- Properties of the current UOCAVA voting system were taken into consideration, but did not limit the guidelines

Page 9: Update on the UOCAVA Working Group


Voting Functions

- Primary, basic guidelines expected from any voting system, e.g.:
  - One cast ballot counted per voter (hlg-2, 3)
  - Accurate and reproducible vote counts (hlg-4)
  - Supply voters with the correct ballot style (hlg-5)
- Some were derived from the CoE E-Voting standard

Page 10: Update on the UOCAVA Working Group


Auditability

Primary guideline: “The UOCAVA voting system shall create and preserve evidence to enable auditors to verify that it has operated correctly in an election, and to identify the cause if it has not.”

Two controversial proposed guidelines:

- “The audit system shall provide the ability to compare records and verify the correct operation of the UOCAVA voting system and the accuracy of the result, in an effort to detect fraud, to prove that all counted votes are authentic and that all authentic votes have been counted as cast.”
- “The UOCAVA voting system shall make it possible for voters to check whether their vote was cast and recorded as they intended, and shall make it possible for observers to check whether all cast votes have been counted and tallied correctly.”

Page 11: Update on the UOCAVA Working Group


Quality Assurance and Configuration Management

- System must be “fit for use”
- System must be developed, monitored and maintained in accordance with applicable best practices for quality assurance
- Documented, tested, and stable configuration
- Guidelines based on research done to support the VVSG 2.0 draft

Page 12: Update on the UOCAVA Working Group


Reliability and Availability

- Definition of critical failure: any functional failure, the occurrence of which jeopardizes the validity of the election, or casts doubt on the credibility of the election result
- Probability of critical failures and overall system availability must be fit for intended use (hlg-1, 3)
- Assure reliability of the system through application of best reliability engineering practices and standard reliability analysis procedures (a basic availability calculation is sketched after this list)
- Based on CoE guidelines and supporting VVSG 2.0 research
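As a minimal sketch of the kind of standard reliability analysis referenced above (the guidelines do not prescribe a particular model, and the MTBF/MTTR figures below are invented for illustration), the conventional steady-state availability calculation looks like this:

    # Steady-state availability from assumed reliability figures (illustrative only)
    mtbf_hours = 2000.0   # assumed mean time between failures
    mttr_hours = 4.0      # assumed mean time to repair
    availability = mtbf_hours / (mtbf_hours + mttr_hours)
    print(f"Estimated availability: {availability:.2%}")  # about 99.80% with these assumed figures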

Page 13: Update on the UOCAVA Working Group


Security

- Security guidelines were developed accepting risks of the current mail system, e.g.:
  - Low-level compromises of ballot secrecy are accepted (hlg-2)
  - Some low-level fraud is accepted; the goal is to prevent an undetectable change in the outcome of the election (hlg-3)
- Some new issues unique to electronic systems:
  - Strong user authentication for voters, administrators, officials (hlg-1)
  - Systems must be free of vulnerabilities that allow remote attacks (hlg-4)
  - Prevent malicious software on terminals from impacting election integrity (hlg-5)
  - Recommended use of penetration testing (hlg-6)

Page 14: Update on the UOCAVA Working Group


User-Centered Development (hlg-1)

- Develop with best practices in user-centered design and user testing
- Incorporate these principles throughout the system development cycle and as part of certification
- Evaluate system usability and accessibility via user testing with representative test participants
- Include usability evaluation of procedures and documentation for system administration

Page 15: Update on the UOCAVA Working Group


Accessibility (hlg-2, 5, 7)

- Make the system accessible to voters with disabilities
  - Built-in access features
  - Interoperability with personal assistive technology (PAT)
  - PAT as supplemental rather than necessary to ensure system accessibility
- Maintain privacy and independence throughout all phases of the voting process
  - Ballot marking, verification, and casting
  - Voter has the same accessibility accommodations throughout
- Comply with legal mandates

Page 16: Update on the UOCAVA Working Group


Best Design Practices (hlg-3, 4, 6)

- Follow human factors design best practices, for both system and ballot design where possible
  - EAC's report “Effective Designs for the Administration of Federal Elections”
  - American Institute of Graphic Arts (AIGA)'s report “Top 10 Election Design Guidelines”
- Adhere to current standards and guidelines
  - VVSG
  - World Wide Web Consortium (W3C)'s Web Accessibility Initiative (WAI), specifically the Web Content Accessibility Guidelines (WCAG 2.0) and WAI for Accessible Rich Internet Applications (WAI-ARIA)

Page 17: Update on the UOCAVA Working Group


More on Ballot Design

- FVAP expressed some concern over including ballot design in the high-level guidelines
- To clarify:
  - High-level guidelines are not intended to supersede State laws
  - Election Officials control formatting of ballot content
  - High-level guidelines are intended to address only those ballot design features controlled by the UOCAVA system
    - For example, navigation and user interface controls
  - The UOCAVA system should support implementation of good ballot design

Page 18: Update on the UOCAVA Working Group


More on Accessibility

- FVAP requested that the high-level guidelines focus on the demonstration project, which would limit the scope of accessibility
- Suggested that only Section 508 be referenced
- Implications of this are unclear:
  - Section 508 does require accessible design and some PAT interoperability
  - Section 508 “Refresh” on the horizon
  - How much of W3C's WAI guidelines should be implemented in the demonstration project?
  - Will we learn enough about accessibility from the demonstration project to inform future work?

Page 19: Update on the UOCAVA Working Group


Discussion/Questions


Next Topic: Risk Analysis

Open issues:
- Intended scope and purpose
- Auditability/verifiability guidelines
- Usability/accessibility guidelines
- Resolution of FVAP's comments

Page 20: Update on the UOCAVA Working Group


Risk Analysis

- TGDC Resolution #02-11 directs the UOCAVA Working Group to: “prepare a narrative risk assessment comparing the current UOCAVA voting process to electronic absentee voting systems used in a demonstration project with military voters.”
- Currently, the demonstration project system is not defined
- First step: analyzing risks in the current UOCAVA voting process

Page 21: Update on the UOCAVA Working Group


Risk Analysis: Transactional Failures

The current UOCAVA process has a number of transactional failure points between voter registration and ballot canvassing:

- Voter registration failures
- Ballot delivery failures
- Ballot marking errors
- Ballot return failures

These failures are observable and measurable. An analysis of these failures can lead us to an overall failure rate of the current process, as sketched below.
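As a minimal illustrative sketch (the per-stage failure rates below are invented, not measured data), the stage-level failure rates combine multiplicatively into an end-to-end failure rate for the process:

    # Hypothetical per-stage failure rates for the current by-mail UOCAVA process
    stage_failure_rates = {
        "voter_registration": 0.05,  # assumed rates, for illustration only
        "ballot_delivery": 0.10,
        "ballot_marking": 0.02,
        "ballot_return": 0.08,
    }

    # A ballot succeeds end-to-end only if every stage succeeds
    overall_success = 1.0
    for stage, failure_rate in stage_failure_rates.items():
        overall_success *= (1.0 - failure_rate)

    overall_failure = 1.0 - overall_success
    print(f"Estimated overall failure rate: {overall_failure:.1%}")  # about 23% with these assumed rates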

Page 22: Update on the UOCAVA Working Group


Risk Analysis: Identifying Risks

- Transactional failures are only one type of risk
- The UOCAVA working group can analyze one or more representative current UOCAVA voting processes to identify other potential risks (a sketch of one such risk entry follows this list):
  - What is the potential vulnerability?
  - Who is in a position to exploit it?
  - What is the impact of a successful exploit?
  - What is the probability of a successful exploit?
- Challenge #1: Impacts are not always easily quantifiable in comparable units. What is the value of a vote?
- Challenge #2: Probabilities for malicious attacks are notoriously difficult to estimate
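A minimal sketch of a risk-register entry answering those four questions, using conventional likelihood-times-impact scoring; the entries and numeric scores are hypothetical placeholders, not working group data:

    from dataclasses import dataclass

    @dataclass
    class Risk:
        vulnerability: str   # what could go wrong
        threat_actor: str    # who is in a position to exploit it
        impact: float        # consequence score (units are the hard part; see Challenge #1)
        probability: float   # estimated likelihood (hard for malicious attacks; see Challenge #2)

        def expected_impact(self) -> float:
            # Conventional expected-value scoring: likelihood times impact
            return self.impact * self.probability

    risks = [
        Risk("Ballot lost in postal return", "nobody (accidental)", impact=1.0, probability=0.08),
        Risk("Ballot altered in transit", "malicious insider", impact=2.0, probability=0.001),
    ]
    for r in risks:
        print(f"{r.vulnerability}: expected impact {r.expected_impact():.3f}")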

Page 23: Update on the UOCAVA Working Group


Risk Analysis: Comparing Risks

- It will be important to compare and balance risks between different types of systems, as well as different types of risks within a given system
- We can create quantifiable comparisons of impact
  - Example: comparing the impact of lost ballots and tampered ballots on the outcome of the election (see the sketch below)
  - Collaboration with NIST Statistical Engineering Division
  - Explore use of the EAC Election Operations Assessment Tool
- Qualitative comparisons will be done in other areas, such as malicious attacks or risks
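A minimal numerical sketch of that lost-vs-tampered comparison, with invented counts: in the worst case each lost ballot reduces the winner's margin by at most one, while each tampered (flipped) ballot shifts the margin by two, so the same number of tampered ballots has roughly twice the impact on the outcome:

    # Hypothetical comparison of lost vs. tampered ballots; counts are invented for illustration
    margin = 500            # assumed winner's margin in a contest
    lost_ballots = 300      # each lost ballot reduces the margin by at most 1
    tampered_ballots = 300  # each flipped vote shifts the margin by 2

    worst_case_margin_after_loss = margin - lost_ballots
    worst_case_margin_after_tampering = margin - 2 * tampered_ballots

    print(worst_case_margin_after_loss)       # 200: outcome still stands in the worst case
    print(worst_case_margin_after_tampering)  # -100: outcome could change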

Page 24: Update on the UOCAVA Working Group


Discussion/Questions


Next Topic: Demonstration Project Guidelines

Feedback on Risk Analysis Path Forward

Page 25: Update on the UOCAVA Working Group


UOCAVA Demonstration Project

- Work is building up to the implementation of a remote voting demonstration project for military voters
- EAC has tasked the TGDC with developing guidelines for the demonstration project system
- TGDC Resolution #02-11 stated TGDC's acceptance of this task, and directed the TGDC to develop guidelines for a demonstration project with simplifying assumptions:
  - Military voters only
  - Use of Common Access Card (CAC) for authentication
  - Use of professionally administered machines

Page 26: Update on the UOCAVA Working Group


Page 27: Update on the UOCAVA Working Group


Mitigated Risks

The simplifying assumptions mitigate some risks identified in NISTIR 7551, A Threat Analysis on UOCAVA Voting Systems:

- Use of the CAC mitigates authentication-related risks, including voter impersonation and phishing attacks
- Digitally signed ballots using the CAC could mitigate some malicious attacks on servers (see the sketch below)
- Use of professionally administered machines mitigates the risk of malicious software on voting terminals impacting ballot secrecy or integrity
- Use of a military network could help to mitigate some remote attacks on servers
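To make the digital signature point concrete, here is a minimal sketch of signing and verifying a ballot record. It uses a software-generated RSA key from the Python cryptography package as a stand-in for the private key held on a voter's CAC, and a hypothetical ballot string; it is an illustration of the general technique, not the demonstration project design:

    # Illustrative only: a software key stands in for the private key on a voter's CAC
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    ballot = b"contest=US-Senate;choice=Candidate-A"  # hypothetical ballot record

    # In the demonstration project concept, this key would live on the CAC, not in software
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    signature = private_key.sign(ballot, padding.PKCS1v15(), hashes.SHA256())

    # The election server verifies the signature before accepting the ballot;
    # any tampering with the ballot in transit or on the server invalidates it
    public_key = private_key.public_key()
    public_key.verify(signature, ballot, padding.PKCS1v15(), hashes.SHA256())  # raises InvalidSignature on mismatch
    print("Signature verified: ballot accepted as authentic")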

Page 28: Update on the UOCAVA Working Group


Other Risks

Other risks may need to be mitigated by other means, pending the results of the risk analysis:

- Network-based attacks may not be mitigated by the architecture
- Internet voting systems inherit many of the same potential risks as electronic polling place systems

Page 29: Update on the UOCAVA Working Group


Demonstration Project Prerequisites

- Several items need to be completed prior to development of the demonstration project guidelines
- TGDC tasked with the high-level guidelines and risk analysis
- TGDC/NIST also need:
  - Concept of operations of the demonstration system
  - Expected high-level system architecture
  - Clearly defined scope for the demonstration project system
    - How extensive will this project be? One-time only?
    - What functions must be provided?
    - Who decides appropriate tradeoffs and accepts risks?

Page 30: Update on the UOCAVA Working Group


Demonstration Project: Timeline

- Current work: complete near-term deliverables (i.e., high-level guidelines and risk analysis) intended to inform low-level guidelines development
- Demonstration project guidelines expected to take 24 months to develop, vet through a public comment period, and approve in the TGDC and EAC:
  - 12-month development process
  - 6-month vetting process
  - 6-month revision process
- For a 2016 demonstration project, guidelines would be needed by mid-2014

Page 31: Update on the UOCAVA Working Group


Discussion/Questions
