
An Information-Theoretic Model of Voting Systems

Ben Hosp∗, Poorvi L. Vora∗
Dept. of Computer Science
George Washington University
Washington DC 20052
{bhosp,poorvi}@gwu.edu

June 3, 2006

Abstract

This paper presents an information-theoretic model of a voting system, consisting of (a) definitions of the desirable qualities of integrity, privacy and verifiability, and (b) quantitative measures of how close a system is to being perfect with respect to each of these qualities. It describes the well-known trade-off between integrity and privacy in this model, and defines a concept of weak privacy, which is traded off with system verifiability. This paper is a simultaneous submission to VSRW06 and WOTE06. Both Program Committee Chairs are aware of the simultaneous submission and have approved it, as neither meeting will have printed proceedings.

1 Introduction

Elections in the United States have relied more and more upon computerized or electronic voting technology. Other democracies are also using electronic voting: examples include the UK’s early internet voting trial, and India’s use of a single type of dedicated electronic polling machine. Yet the literature does not provide a standard model with which to compare electronic voting systems with the electromechanical and paper-based systems they have replaced, or to compare them among themselves.

This paper presents a voting model based on information flow through an election system. Some of the more important desirable properties of voting systems – integrity, privacy and verifiability – are carefully defined in the model, and information-theoretic metrics for measuring deviation from the perfect system are presented. The advantages of this model are that (a) it provides a single framework in which to define and measure integrity, privacy and verifiability, and (b) the trade-offs among these criteria are explicit in the model.
In fact, the trade-offs among these criteria arise exactly because all the information required to verify and release vote counts can only be obtained from the votes themselves. This makes an information-theoretic approach the natural – if not the only – approach to studying them.

We make some important points about our approach here. The measures we propose – for integrity, privacy, verifiability and usability – are based on the concept of entropy. The type of entropy – computational or information-theoretic – can, in principle, depend on what is best for the specific setting; for simplicity of presentation, this paper addresses only information-theoretic entropy. One may argue about whether the exact measures proposed are the best ones – that is, whether one uses an average or maximum entropy measure, and whether one examines the system separately for each individual or aggregates across individuals. The main focus of this paper, however, is not the exact details of the measures (though these are, in our opinion, the best of several alternatives), but the model itself and the manner in which it exposes all the trade-offs.

∗ These authors were supported in part by NSF SGER 0505510.

Our uniform approach in determining measures has been that, when a system treats a particular user or group of users differently from others, we are actually dealing with multiple systems, each with its own measure. Hence, for example, if a system leaks more information on voters from location X than it does on voters from location Y, there are at least two measures of the system’s privacy: one for those voting in location X, and another for those voting in location Y.
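The distinction between an average and a maximum entropy measure can be made concrete with a short sketch (illustrative only; the functions and the posterior distribution below are not taken from the paper). Shannon entropy captures an adversary's average uncertainty about a single ballot, while min-entropy captures the worst case, determined by the adversary's single best guess:

```python
import math

def shannon_entropy(posterior):
    """Average uncertainty (in bits) an adversary has about one ballot."""
    return -sum(p * math.log2(p) for p in posterior if p > 0)

def min_entropy(posterior):
    """Worst-case uncertainty: driven only by the adversary's best guess."""
    return -math.log2(max(posterior))

# Hypothetical adversary posterior over 4 candidates after some leak.
# With no leak (uniform posterior) both measures would give 2 bits.
posterior = [0.7, 0.1, 0.1, 0.1]
print(round(shannon_entropy(posterior), 3))
print(round(min_entropy(posterior), 3))
```

Note that the two measures can rank systems differently: a skewed posterior can retain substantial average entropy while offering very little worst-case protection, which is why the choice between them is a genuine design decision.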
It is clear how this idea extends to, for example, usability: the usability measure of a particular system for the visually handicapped would, in general, differ from that for those who are not visually handicapped. Thus, while an entropy-based measure averages over randomly-obtained outputs, it need not average over populations that are treated distinctly by the system.

This paper provides an initial attempt at formalizing the framework, and presents the types of questions that can be examined using it.

2 Prior Work

[12] contains one of the earliest lists of voting system requirements, and many papers in the recent WOTE 2001 [14] and WEST 2002 [13] workshops also include overviews of voting system requirements [4, 5, 7]. None provide a means of measuring performance with respect to the requirements. Papers on evaluating voting technologies include [2, 6], and several other papers from the NIST Workshop on Threats to Voting Systems [11], in particular [8, 10], provide an evaluation with respect to threats to count integrity. [1] provides a mathematical definition of voting system privacy, and a related entropy-based privacy measure, from which our work draws heavily.

3 Election Goals

This section provides a brief list of desirable properties of election systems; the goals have been drawn from prior work such as [12, 4, 5, 7, 9].

1. Usability: Ballots should be “cast as intended,” meaning that an otherwise valid voter who intends to cast a vote for Candidate Alice should not be thwarted by election procedures or technology.

2. Integrity: Ballots should be “counted as cast,” meaning that the voting system should declare that Candidate Bob received m votes if and only if exactly m ballots marked for Candidate Bob were cast.

3. Privacy: The secret ballot principle should apply to the election; voter i should not have the contents of her ballot associated with her in any way by anyone – even with the collusion of many parties, including election officials and other voters.
Notably, this privacy should be involuntary, in the sense that even a set of colluding parties that includes voter i herself should not be able to prove the contents of her ballot once she has left the polling place.

One may note that a system that provides privacy also provides fairness [9]: partial election results should not be available to anyone during the election. (This requirement ensures that the election is fair to all candidates, as the revelation of partial counts might encourage supporters of a winning candidate to abstain from voting when they might have otherwise voted.) Fairness is implied by privacy because the revelation of partial vote counts reveals information about individual votes.

4. Verifiability: Both the general public – including non-voting observers – and the individual voter should be able to rest assured that the above goals have been met. Such assurance should not require real-time
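The claim that fairness is implied by privacy, because partial counts leak information about individual votes, can be illustrated with a toy two-candidate calculation (the numbers are hypothetical and not from the paper):

```python
import math

def binary_entropy(p):
    """Adversary's uncertainty (bits) about one two-candidate ballot with P(A) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Before any counts are revealed, assume a uniform prior on each ballot: 1 bit.
prior_bits = binary_entropy(0.5)

# Suppose a partial tally of 10 ballots shows 7 votes for candidate A. Treating
# those ballots as exchangeable, the adversary's posterior on any one of them
# becomes P(A) = 7/10, so the remaining uncertainty drops below the prior.
posterior_bits = binary_entropy(7 / 10)

assert posterior_bits < prior_bits  # the partial count leaked vote information
```

Any released partial count other than an exact tie therefore reduces the adversary's uncertainty about individual ballots, which is exactly why a perfectly private system cannot reveal running totals during the election.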