United States Election Assistance Committee


Chapter 6: General Core Requirements

6.1 General Design Requirements

Note: The ballot counter requirements from [VVSG2005] have been converted into functional requirements (Part 1: 4.3.5 "Ballot counter").

2 Comments

Comment by Seth Edelman (Voter)

All voting systems must be designed to support transactions using only one standardized, published, easy to understand format that is simple enough to be understood by people as well as read by computer devices with appropriate software. A good example would be EML.

Comment by E Smith/P Terwilliger (Manufacturer)

6.1-A. "justified" is subjective and untestable.
6.1-E. "ballot card" is not defined. How does it differ from a paper ballot?
6.1-G. Switches from "vote capture device" to "voting device".
6.2-A. In the discussion, "manually" is not defined. Is this intended to mean something done outside of the EMS? For example, it will never be possible to have write-in resolution be without human intervention - no software can think of every way to misspell a name.
6.3.1.3. Why refer to the definition of "failure" and then repeat a large part of that definition here?
6.3.1.3. This is not the dictionary definition of disenfranchisement.
6.3.1.3. "highly trained" is not defined.
6.3.1.3. Why is the standard tougher for a ballot activator than for a DRE or EBM?
6.3.1.5. "devices" should be "voting devices" or "programmed devices".
6.3.1.5. The requirements under this section are all mis-numbered.
6.3.1.5-A. Table 6-3 provides inappropriate precision. If the input data to an equation has a precision of 1 or 2 significant figures (table 6-2), then the output of the equation must be rounded to an equivalent precision. For example, 1.237 should be 1.2, 2.093 should be 2.1, etc.
6.3.1.5-B. "All systems" should be "voting system", "voting equipment", or some other defined term.
6.3.1.5-B. "protect against" is vague and untestable. "prevent further voting" is unclear. For the balance of the election? Forever? Is having an on-site spare acceptable?
6.3.1.5-C. "All systems" should be "voting system", "voting equipment", or some other defined term.
6.3.1.5-C. "any data input..." should be changed to "any single data input...". No technology can protect against multiple simultaneous failures (i.e., physical destruction of the equipment).
6.3.2-A. "All systems" should be "voting system", "voting equipment", or some other defined term.
6.3.2-B. "All systems" should be "voting system", "voting equipment", or some other defined term.
6.3.4. The subsections use the term "electronic voting system", which is undefined.
6.3.4.2. While ungrounded outlets "should be prohibited", this is a jurisdiction issue outside the scope of the VVSG.
6.3.4.2. At the end of the 3rd paragraph, "and manufacturer." should be added.
6.3.4.2. The 4th paragraph assumes compliance with NEC. This is an unwise assumption given the diverse nature of polling place environments.
6.3.4.3-B. The discussion makes the assumption that all voting systems use telephone communications. This is not true.
6.3.4.4-A. The frequency ranges where the 30 V/m level is to be applied are not defined. "commonly used" is not sufficient guidance.
6.3.4.4-C. No levels are specified.
6.3.5. The subsections use the term "electronic voting system", which is undefined.
6.3.5.1-A. Why is this only a polling place requirement? FCC part 15 rules also apply to office equipment.
6.3.5.1-C. The note about "cheater" plugs does not refer to any specific requirement; the earlier discussion did not contain any. This is also outside the VVSG - it is not under vendor control.
6.3.5.2-A. Why is this only a polling place requirement? FCC part 15 rules also apply to office equipment.
6.3.6.1-A. This should be clarified as only being relevant to voting devices that use communications networks.
6.4.1.3-A. "Credible" is vague and untestable.
6.4.1.3-A.2. "Ties" is undefined. What is the definition of "organization"? The 3 year limit is absurd, as is the overall requirement; for one, no for-profit corporation will announce the adoption of a standard, nor should they. Why the limitation on the other organizations not being voting system manufacturers? It would actually be better for the certification process if the manufacturers DID use the same conventions - it would drastically reduce the VSTL code review burden.
6.4.1.4-A.1. The discussion is counter to the requirement.
6.4.1.4-B.1. "small" and "easily identifiable" are vague and untestable.
6.4.1.7-B. "prevent" is untestable. This section conflicts with numerous parts of chapter 5 that allow/require software installation.
6.4.1.7-C. Why "voting devices" here, versus "programmed devices" in 6.4.1.7-B?
6.4.1.7-D. "SHALL provide the capability" implies that use of the capability is optional.
6.4.1.8-I. "electronic device" should be "programmed device".
6.4.1.8-J. "electronic device" should be "programmed device".
6.4.1.8-J. No device can specifically alert only an election official or administrator, unless it is suggested that the alert be suppressed when a lower role is using the device. All a device can do is provide an alert (visible, audible, whatever) for the current user.
6.4.1.8-K. "electronic device" should be "programmed device".
6.4.1.8-K. "To the extent possible" and "SHALL" are not compatible. No device can specifically alert only an election official or administrator, unless it is suggested that the alert be suppressed when a lower role is using the device. All a device can do is provide an alert (visible, audible, whatever) for the current user.
6.4.1.9-A. "all systems" should be "voting devices" or similar.
6.4.1.9-E. "operator" is not defined. How does it fit into the defined role structure?
6.4.2.1. Typo: "...standards that voting system to which manufacturers...".
6.4.2.2-A. "voting system"? Even if this is changed to "voting device" it still requires this identification tag on the COTS EMS hardware (server, workstation). "main body" is unclear.
6.4.2.2-A.1. "tamper resistant" and "difficult" are not testable.
6.4.2.2-B. "voting system" or "voting device"?
6.4.2.2-B.1. "Critical" is not defined. Including all software modules in this log (as per the link to 2:2.1-A.12) is impractical.
6.4.3-A. "proper" is not testable.
6.4.3-A.1. The section title does not match the actual requirement. The requirement is vague and untestable.
6.4.3-A.2. The section title does not match the actual requirement. The requirement is vague and untestable.
6.4.3-B. "suitable for" is untestable. It is also not the same as "meet or exceed" as in the discussion.
6.4.4-A. "normal use" is undefined and untestable. "deterioration" is undefined and untestable.
6.4.5-A. "electronic devices" should be "voting devices".
6.4.5-B. "voting systems" should be "voting devices". "easy" and "easily" are untestable.
6.4.5-C. How is a "data plate" different from a nameplate, label, or "identification tag" (6.4.2.2-A)? Why must the "data plate" be separate? "permanenty affixed" is an absolute and untestable. Is the "data plate" not required to be "permanently affixed"? Since subrequirement (c) does not call out a label (or any of its cousins), these advisory cautions, etc. can be displayed on-screen, correct? The information listed in subrequirement (a) is already part of the tag required in 1:6.4.2.2-A. Is this nameplate or label to be separate from the identification tag? If so, why? [Aside: Bob Naegele first wrote "data plate" in the 1990 standards. I have asked for clarification in every subsequent comment period...]
6.4.5-C. Typo: "manufacturer or manufacturer".
6.4.7-A. What does it mean for something to be "designated for storage"? What other option is there?
6.4.7-C. Not also activators and the entirety of polling place equipment?
6.4.7-C.1. Not also activators and the entirety of polling place equipment? Subrequirement (a) seems circular. Since 6.4.7-C designates all precinct equipment for storage, what does subrequirement (b) do that 6.4.7-A and 6.4.7-B do not?
6.4.7-D. Is this also the case for devices that are NOT designated for storage?
6.5.1. "Archivalness" is not a word.
6.5.1-A. How shall this be tested?
6.6. "Integratability" is not a word.
6.6. The first paragraph is irrelevant. Any such mixed configuration is illegal under these VVSG.
6.6-A. "maximize" is untestable. This requirement serves no purpose under these VVSG.
6.6-A.1. "standard" and "common" are vague and untestable. This requirement serves no purpose under these VVSG. The discussion does not provide any information.
6.6-B.2. Isn't an optical scan CVR the physical ballot that was scanned? There is nothing to export.
6.6-B.5. To whom is this program provided?
6.6-B.6. What is a "major device category"?
6.7. This section makes no sense.
6.1-A No obvious fraud

Voting systems SHALL contain no logic or functionality that cannot be justified in terms of a required system function or characteristic.

Applies To: Voting system

Test Reference: Part 3: 4.3 "Verification of Design Requirements", 4.5.2 "Security"

Source: New requirement

 
6.1-B Verifiably correct vote recording and tabulation

The vote recording and tabulation logic in a voting system SHALL be verifiably correct.

Applies To: Voting system

Test Reference: Part 3: 4.6 "Logic Verification"

DISCUSSION

The key word in this requirement is "verifiably." If a voting system is designed in such a way that it cannot be shown to count votes correctly despite full access to its designs, source code, etc., then it does not satisfy this requirement.

Source: New requirement

 

2 Comments

Comment by George Gilbert (Local Election Official)

How does hand tabulation of paper ballots meet this requirement?

Comment by George Ripley (Voter)

Timely and efficient election auditing depends on, among other things, getting necessary data quickly and easily -- often from a variety of different local jurisdictions that use different types of voting equipment. But not having such data in a single, standard format is a significant barrier to election auditing. Although the current draft 2007 VVSG "encourages" adoption of a standard data exchange format to facilitate interoperability between different hardware components and auditing, that is not sufficient. Voting systems should utilize a single, common XML-based data format for data import, export and exchange that is the same for all types and makes of equipment. All voting systems should support input, output and exchange of data using a single, public, standard, self-describing format that is easy for humans to read but also easily readable by other computer software without transcription -- for example, the Election Markup Language (EML).
6.1-C Voting system, minimum devices included

Voting systems SHALL contain at least one EMS and at least one vote-capture device.

Applies To: Voting system

Test Reference: Part 3: 4.2 "Physical Configuration Audit"

DISCUSSION

All voting systems must be capable of election definition, vote collection, counting and reporting. To accomplish this requires at least one EMS and at least one vote-capture device.

Source: Clarification of [VSS2002]

 

1 Comment

Comment by Fernando Morales (Manufacturer)

To comply with HAVA 2002 Section 301, add: "Not-by-mail paper ballot voting systems SHALL contain at least one EMS and at least one vote-capture device."
6.1-D Paper ballots, separate data from metadata

Paper ballots used by paper-based voting devices SHALL meet the following standards:

  1. Marks that identify the unique ballot style SHALL be outside the area in which votes are recorded, so as to minimize the likelihood that these marks will be mistaken for vote responses and the likelihood that recorded votes will obliterate these marks; and
  2. If alignment marks are used to locate the vote response fields on the ballot, these marks SHALL be outside the area in which votes are recorded, so as to minimize the likelihood that these marks will be mistaken for vote responses and the likelihood that recorded votes will obliterate these marks.

Applies To: Paper-based device

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

DISCUSSION

See also Requirement Part 2:4.5.4.2-B.

Source: [VSS2002] I.3.2.4.2.1

 

1 Comment

Comment by George Gilbert (Local Election Official)

How is this not amenable to "operational testing" per 4.3-A?
6.1-E Card holder

A frame or fixture for printed ballot cards is optional. However, if such a device is provided, it SHALL:

  1. Position the card properly; and
  2. Hold the ballot card securely in its proper location and orientation for voting.

Applies To: MMPB

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

Source: [VSS2002] I.3.2.4.2.5

 
6.1-F Ballot boxes

Ballot boxes and ballot transfer boxes, which serve as secure containers for the storage and transportation of voted ballots, SHALL:

  1. Provide specific points where ballots are inserted, with all other points on the box constructed in a manner that prevents ballot insertion; and
  2. If needed, contain separate compartments for the segregation of ballots that may require special handling or processing.

Applies To: Paper-based device

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

DISCUSSION

Requirement Part 1: 6.1-F.B should be understood in the context of Requirement Part 1:7.5.3-A.18, Requirement Part 1: 7.7.3-A and Requirement Part 1: 7.7.3-B. The differing options in how to handle separable ballots mean that separate compartments might not be required.

Source: [VSS2002] I.3.2.4.2.6

 

2 Comments

Comment by Frank Padilla (Voting System Test Laboratory)

"Secure" needs definition; subjective and not testable.

Comment by Frank Padilla (Voting System Test Laboratory)

6.1-F, item b.: "If needed" should be changed to 'if the manufacturer states or is required by the jurisdiction'.
6.1-G Vote-capture device activity indicator

Programmed vote-capture devices SHALL include an audible or visible indicator to provide the status of each voting device to election judges. This indicator SHALL:

  1. Indicate whether the device is in polls-opened or polls-closed state; and
  2. Indicate whether a voting session is in progress.

Applies To: Vote-capture device ∧ Programmed device

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

DISCUSSION

Polls-closed could be broken down into pre-voting and post-voting states as in Part 1: 8.2 "Vote-Capture Device State Model (informative)" or further divided into separate states for not-yet-tested, testing, ready/not ready (broken), and reporting.

Source: Clarified from [VSS2002] I.2.5.1.c and I.3.2.4.3.1

 

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

Audible indicator? How loud?
6.1-H Precinct devices operation

Precinct tabulators and vote-capture devices SHALL be designed for operation in any enclosed facility ordinarily used as a polling place.

Applies To: Precinct tabulator, Vote-capture device

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

Source: [VSS2002] I.3.2.2.1 / [VVSG2005] I.4.1.2.1

 

6.2 Voting Variations

The purpose of this formulaic requirement is to clarify that support for a given voting variation cannot be asserted at the system level unless device-level support is present. It is not necessarily the case that every device in the system would support every voting variation claimed at the system level; e.g., vote-capture devices used for in-person voting may have nothing in common with the vote-capture devices (typically MMPB) used for absentee voting. However, sufficient devices must be present to enable satisfaction of the system-level claim.

 
6.2-A System composition

Systems of the X class SHALL gather votes using vote-capture devices of the X device class, count votes using tabulators of the X device class, and perform election management tasks using an EMS of the X device class, where X is any of the voting variations (In-person voting, Absentee Voting, Review-required ballots, Write-ins, Split precincts, Straight party voting, Cross-Party Endorsement, Ballot Rotation, Primary Elections, Closed Primaries, Open Primaries, Provisional-Challenged Ballots, Cumulative Voting, N-of-M Voting, and Ranked Order Voting).

Applies To: In-person voting, Absentee voting, Review-required ballots, Write-ins, Split precincts, Straight party voting, Cross-party endorsement, Ballot rotation, Primary elections, Closed primaries, Open primaries, Provisional-challenged ballots, Cumulative voting, N-of-M voting, Ranked order voting

Test Reference: Part 3: 4.2 "Physical Configuration Audit"

DISCUSSION

If the voting system requires that absentee ballots be counted manually, then it does not conform to the absentee voting class. However, it may conform to the review-required ballots class.

If the voting system requires the allocation of write-in votes to specific candidates to be performed manually, then it does not conform to the write-ins class. However, it may conform to the review-required ballots class.

If the voting system requires that provisional/challenged ballots be counted manually, then it does not conform to the provisional-challenged ballots class. However, it may conform to the review-required ballots class.

Source: Conformance ramifications of system/device relationship

 

6.3 Hardware and Software Performance, General Requirements

This section contains requirements for hardware and software performance:

 


6.3.1 Reliability

The following sections provide the background and rationale for the reliability benchmarks appearing in Part 1: 6.3.1.5 "Requirements". Given the diversity among the many jurisdictions, there is no "typical" volume or "typical" configuration of voting system; it is nevertheless necessary to base the benchmarks on some rough estimates so that they are of the correct order of magnitude, albeit not optimal for every case.

 

1 Comment

Comment by E Smith - Sequoia Voting Systems (Manufacturer)

I attended the TGDC meetings and heard the discussions surrounding reliability calculation and the current FVSS/VVSG. I agreed with the TGDC members that the current 163 hour based requirement is lacking. However, I did not expect that the discussion would bring about so much scope creep and develop into this monster. While the goals are laudable and the detail and coverage extreme, there is really no cost/benefit analysis inherent in this model of reliability, nor is any reasonable field replacement model/election procedure taken into account that would lead to a more realistic failure rate. I suggest simplifying the model, assessing the realism inherent in the numbers, and a cost/benefit analysis with election jurisdictions as stakeholders - not just the limited input from election officials on the TGDC.

6.3.1.1 Classes of equipment

Because different classes of voting devices are used in different ways in elections, the kinds of volume against which their reliability is measured and the specific reliability that is required of them are different. The classes of voting devices for which estimates are provided are listed below. Please refer to the definitions of the parenthesized terms in Appendix A.

 

6.3.1.2 Estimated volume per election

The "typical" volumes described below are the volumes that medium-sized jurisdictions in western states need their equipment to handle in a high turn-out election, as of 2006. A county of 150 000 registered voters will have 120 000 ballots cast in a presidential election. A typical polling place will be set up to handle 2000 voters, which equals 60 polling places in a mid-sized county.

Central-count optical scanner: Medium-sized jurisdictions in western states need their central count equipment to scan 120 000 ballots in an election. Depending upon the actual throughput speeds of the scanners, they use 2 to 8 machines to handle the volume. "Typical" volume for a single scanner is the maximum tabulation rate that the manufacturer declares for the equipment times 8 hours.

Election Management System: The volume equals the total number of interactions with the vote gathering equipment required by the design configuration of the voting system to collect the election results from all the vote-capture devices.

The typical constant across the systems is that the Election Management System will interact once with each polling place for each class of equipment. Assuming our "typical" county with 60 polling places, one or more DREs in each polling place, and one or more optical scan devices, that totals 2×60=120 transactions per election.

The primary differences in the central count EMS environment are whether the optical scan devices are networked with the EMS or function independently.

In the networked environment, the device will interact with the EMS once per batch (typically around 250 ballots). So, 120 000/250=480 interactions.

In the non-networked environment, the results are handled similarly to the polling place uploads: results are copied off to media and uploaded to the EMS. Since central counting typically occurs over several days – especially in a vote-by-mail environment – the test should include several uploads from each scanner. 2 scanners × 4 days = 8 uploads.

To simplify these different cases to a single benchmark, we use the highest of the volumes (480 transactions), which leads to the lowest failure rate benchmark.
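The county sizing and EMS volume arithmetic above can be restated as a short sketch. The variable names are ours and the figures merely restate the assumptions in the text; nothing here is normative:

```python
# "Typical" mid-sized western county assumed in this section
# (illustrative restatement of the text's figures, not normative).
registered_voters = 150_000
ballots_cast = 120_000                    # high-turnout presidential election
voters_per_polling_place = 2_000

polling_places = ballots_cast // voters_per_polling_place   # 60 polling places

# EMS interactions: one upload per polling place per equipment class
ems_polling_place_uploads = 2 * polling_places              # 120 transactions

# Central count, networked: one interaction per batch of about 250 ballots
ems_networked_batches = ballots_cast // 250                 # 480 interactions

# Central count, non-networked: 2 scanners x 4 days of uploads
ems_non_networked_uploads = 2 * 4                           # 8 uploads

# The single benchmark uses the highest of the central-count volumes
ems_benchmark_volume = max(ems_networked_batches, ems_non_networked_uploads)
```

Using the highest volume (480) is conservative: it yields the lowest, and therefore strictest, failure rate benchmark.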

Precinct-count optical scanner: Polling place equipment has a maximum number of paper ballots that can be handled before the outtake bins fill up, usually around 2500.

Direct Recording Electronic: A typical ballot takes 3–5 minutes to vote, so the most a single DRE should be expected to handle is 150–200 voters in a 12 hour election day.

Electronically-assisted Ballot Marker: Voting typically takes longer than on a DRE. An individual unit should not be expected to handle more than 70 voters on election day.

Ballot activator: The volume of use for these devices matches the volume for the polling place, which in our assumed county is 2000 per polling place. Our assumed county would have 10–14 DREs per polling place with around 20 tokens. Each token would be used about 100 times.

Audit device: No information available.

The estimated volumes are summarized in Part 1: Table 6-1. The estimates for PCOS and CCOS have been generalized to cover precinct tabulator and central tabulator respectively, and a default volume based on the higher of the available estimates has been supplied for other vote-capture devices that may appear in the future. Audit devices are assumed to be comparable to activation devices in the numbers that are deployed.

Table 6-1 Estimated volumes per election by device class

Device class              | Estimated volume per device per election | Estimated volume per election
central tabulator         | Maximum tabulation rate times 8 hours    | 120 000 ballots
EMS                       | 480 transactions                         | 480 transactions
precinct tabulator        | 2000 ballots                             | 120 000 ballots
DRE                       | 200 voting sessions                      | 120 000 voting sessions
EBM                       | 70 voting sessions                       | 120 000 voting sessions
other vote-capture device | 200 voting sessions                      | 120 000 voting sessions
activation device         | 2000 ballot activations                  | 120 000 ballot activations
audit device              | 2000 ballots                             | 120 000 ballots

 

6.3.1.3 Manageable failures per election

The term failure is defined in Appendix A. In plain language, failures are equipment breakdowns, including software crashes, such that continued use without service or replacement is worrisome to impossible. Normal, routine occurrences like running out of paper are not considered failures. Misfeeds of ballots into optical scanners are handled by a separate benchmark (Requirement Part 1: 6.3.3-A), so these are not included as failures for the general reliability benchmark.

The following estimates express what failures would be manageable for a mid-sized county in a high-turnout election. Medium-sized counties send troubleshooters out to polling places to replace machines or resolve problems with them.

Any failure that results in all CVRs pertaining to a given ballot becoming unusable or that makes it impossible to determine whether or not a ballot was cast is called disenfranchisement. It is unacceptable for even one ballot to become unrecoverable or to end up in an unknown state. For example, an optical scanner that shreds a paper ballot, rendering it unreadable by human or machine, is assessed a disenfranchisement type failure; so is a DRE that is observed to "freeze" when the voter attempts to cast the ballot, providing no evidence one way or the other whether the ballot was cast.

Central-count optical scanner: No more than one machine breakdown per jurisdiction requiring repairs done by the manufacturer or highly trained personnel. Medium sized jurisdictions plan on having one backup machine for each election.

Election Management System: This is a critical system that must perform in an extremely time sensitive environment for a mid-sized county over a 3 to 4 hour period on election night. Any failure during the test that requires the manufacturer or highly trained personnel to recover should disqualify the system. Otherwise, as long as the manufacturer's documentation provides usable procedures for recovering from the failures and methods to verify results and recover any potentially missing election results, 1 failure is assessed for each 10 minutes of downtime (minimum 1 – no fractional failures are assessed). A total of 3 or more such failures disqualifies the system.

Precinct-count optical scanner: A failure in this class of machine has a negligible impact on the ability of voters to vote in the polling place. No more than 1 of the machines in an election should experience serious failures that would require the manufacturer or highly trained personnel to repair (e.g., will not boot). No more than 5 % of the machines in the election should experience failures that require the attention of a troubleshooter/poll worker (e.g., memory card failure).

Direct Recording Electronic and Electronically-assisted Ballot Marker: No more than 1 % of the machines in an election experience failures that would require the manufacturer or highly trained personnel to repair (e.g., won't boot) and no more than 3 % of the machines in an election experience failures that require the attention of a troubleshooter (e.g., printer jams, recalibration, etc.).

Ballot activator: The media/token should not fail more than 3 % of the time (the county will provide the polling place with more tokens than necessary). No more than 1 of the devices should fail (the device will be replaced by the county troubleshooter).

Audit device: No information available. If comparable to ballot activators, there should be at least 1 spare.

The manageable failure estimates are summarized in Part 1: Table 6-2. A "user-serviceable" failure is one that can be remedied by a troubleshooter and/or election official using only knowledge found in voting equipment user documentation; a "non-user-serviceable" failure is one that requires the manufacturer or highly trained personnel to repair.

Please note that the failures are relative to the collection of all devices of a given class, so the value 1 in the row for central tabulator means 1 failure among the 2 to 8 central tabulators that are required to count 120 000 ballots in 8 hours, not 1 failure per device.

Table 6-2 Estimated manageable failures per election by device class

Device class              | Failure type                  | Manageable failures per election
voting device (all)       | Disenfranchisement            | 0
central tabulator         | All¹                          | 1
EMS                       | Non-user-serviceable          | 0
EMS                       | User-serviceable (10 minutes) | 2
precinct tabulator        | Non-user-serviceable          | 1
precinct tabulator        | User-serviceable              | 5 % of devices = 3
DRE                       | Non-user-serviceable          | 1 % of devices = 6
DRE                       | User-serviceable              | 3 % of devices = 18
EBM                       | Non-user-serviceable          | 1 % of devices = 17
EBM                       | User-serviceable              | 3 % of devices = 51
other vote-capture device | Non-user-serviceable          | 1 % of devices = 6
other vote-capture device | User-serviceable              | 3 % of devices = 18
activation device         | Media/token                   | 3 % of tokens = 36
activation device         | Main unit                     | 1
audit device              | All                           | 1

¹ Apart from misfeeds, which are handled by a separate benchmark, TGDC experience is that central tabulator failures are never user-serviceable.
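The percentage-based entries in Table 6-2 can be reproduced from the Table 6-1 volumes. The fleet sizes below (total volume divided by volume per device) are our inference rather than figures stated in the guidelines, so this is only a consistency sketch:

```python
import math

# Fleet sizes implied by Table 6-1 for the assumed 120 000-volume county
# (inferred, not normative).
TOTAL = 120_000

fleets = {
    "DRE": TOTAL // 200,                  # 600 units at 200 sessions each
    "EBM": math.ceil(TOTAL / 70),         # 1715 units at 70 sessions each
    "precinct tabulator": TOTAL // 2000,  # 60 units at 2000 ballots each
}

def allowance(percent, n_devices):
    """Manageable failures: the given percent of the fleet, rounded down."""
    return (percent * n_devices) // 100

dre_nus = allowance(1, fleets["DRE"])                # 1 % of 600  = 6
dre_us = allowance(3, fleets["DRE"])                 # 3 % of 600  = 18
ebm_nus = allowance(1, fleets["EBM"])                # 1 % of 1715 = 17
ebm_us = allowance(3, fleets["EBM"])                 # 3 % of 1715 = 51
pct_us = allowance(5, fleets["precinct tabulator"])  # 5 % of 60   = 3
```

The computed allowances match the table's "= 6", "= 18", "= 17", "= 51" and "= 3" entries, which supports the reading that the percentages apply to the whole deployed fleet rather than to individual devices.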

 

3 Comments

Comment by Brian V. Jarvis (Local Election Official)

Recommend providing a definition of the term "manageable failure" that distinguishes it from a "failure."

Comment by E Smith/L Korb (Manufacturer)

This is another laudable attempt to improve voting system reliability that does not take into account the many external factors that can cause unit and system failures on Election Day. The mandated failure rate benchmarks will be read by many to presume that all certified voting systems will always meet these metrics under all real world conditions. External events outside of the realm of these guidelines (e.g. a major lightning storm) could easily produce power disturbances (in excess of 6.3.4.3-B.2) that would result in more than the one non-user-serviceable failure in a county deploying a precinct tabulator system.

Comment by Craig Burton, CTO, EveryoneCounts.com (Manufacturer)

We question the grouping of physical ballot destruction type failure with electronic ambiguity as to the receipt of a vote. A ballot fed to an optical scanner that is subsequently entirely destroyed by the scanner with clear evidence that the vote was not recorded electronically should not be classed as a disenfranchisement type failure because it would seem this failure can be unambiguously repaired. The voter is given another ballot. The problem of electronic failure or partial failure is much more serious as we can neither assure the voter of the ballot's inclusion in the count nor confidently issue a fresh ballot for fear of collecting a duplicate vote.

6.3.1.4 Derivation of benchmarks

We focus on one class of device and one type of failure at a time, and we assume that each failure is followed by repair or replacement of the affected device. This means that we consider two failures of the same device to be equivalent to one failure each of two different devices of the same class. The sense of "X % of the machines fail" is thus approximated by a simple failure count, which is X/100 times the number of devices. This then must be related to the total volume processed by the entire group of devices over the course of an election in order to determine the number of failures that would be manageable in an election of that size.

To reduce the likelihood of an unmanageable situation to an acceptably low level, a benchmark is needed such that the probability of occurrence of an unmanageable number of failures for the total volume estimated is "acceptably low." That "acceptably low level" is here defined to be a probability of no more than 1 %, except in the case of disenfranchisement, where the only acceptable probability is 0.

Under the simplifying assumption that failures occur randomly and follow a Poisson distribution, the probability of observing n or fewer failures for volume v and failure rate r is the value of the Poisson cumulative distribution function:

P(n, rv) = \sum_{k=0}^{n} \frac{e^{-rv} (rv)^{k}}{k!}

Consequently, given ve (the estimated total volume) and ne (the maximum manageable number of failures for volume ve), the desired benchmark rate rb is found by solving P(ne,rbve)=0.99 for rb. This sets the benchmark rate such that there remains a 1 % risk that a greater number of failures would occur with marginally conforming devices during an election in which they collectively process volume ve. In the case of disenfranchisement, that risk is unacceptable; hence the benchmark is simply set to zero.
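The solve step described above can be sketched numerically. The following Python fragment is illustrative only: the function names and the example values of ne and ve are invented for the sketch, not taken from the VVSG.

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for a Poisson random variable with mean mu."""
    return math.exp(-mu) * sum(mu**k / math.factorial(k) for k in range(n + 1))

def benchmark_rate(n_e, v_e, p=0.99, iters=200):
    """Solve P(n_e, r_b * v_e) = p for r_b by bisection; the CDF
    decreases monotonically as the rate (hence the mean) grows."""
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        mid = (lo + hi) / 2
        if poisson_cdf(n_e, mid * v_e) > p:
            lo = mid  # the rate can still grow while keeping P above p
        else:
            hi = mid
    return (lo + hi) / 2

# Invented example: at most 5 manageable failures over 1,000,000 ballots.
r_b = benchmark_rate(5, 1_000_000)
```

For the invented values above, the solver returns a rate on the order of 10−6 per ballot, the same order of magnitude as the Table 6-3 benchmarks.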

 

1 Comment

Comment by Craig Burton, CTO, EveryoneCounts.com (Manufacturer)

Compromise of voting machines may not be random enough for a Poisson distribution. Voting machines are distributed few-to-a-station and it is our opinion that they are likely to all be compromised within a station and so the minimal object for audit should be the station. This assumption of fraud is made in other places such as India which has evoting machines. There, it is assumed entire stations get thrown by corrupt officials. Ninety-five percent confidence is possibly not high enough for close elections. Confidence needs to be 95% that the effects of fraud/malfunction fall within the winning margin for the election. Deciding how many machines to audit really should depend on the outcome of the election with increasing effort for close races.

6.3.1.5 Requirements

6.3.1.5-A Failure rate benchmark

All devices SHALL achieve failure rates not exceeding those indicated in Part 1: Table 6-3.

Applies To: Voting device

Test Reference: Part 3: 5.3.2 "Critical values"

Source: Revised from [VSS2002] I.3.4.3 / [VVSG2005] I.4.3.3

Table 6-3 Failure rate benchmarks

Device class              | Failure type                  | Unit of volume    | Benchmark
voting device (all)       | Disenfranchisement            | —                 | 0
central tabulator         | All                           | ballot            | 1.237×10−6
EMS                       | Non-user-serviceable          | transaction       | 2.093×10−5
EMS                       | User-serviceable (10 minutes) | transaction       | 9.084×10−4
precinct tabulator        | Non-user-serviceable          | ballot            | 1.237×10−6
precinct tabulator        | User-serviceable              | ballot            | 6.860×10−6
DRE                       | Non-user-serviceable          | voting session    | 1.941×10−5
DRE                       | User-serviceable              | voting session    | 8.621×10−5
EBM                       | Non-user-serviceable          | voting session    | 8.013×10−5
EBM                       | User-serviceable              | voting session    | 3.058×10−4
other vote-capture device | Non-user-serviceable          | voting session    | 1.941×10−5
other vote-capture device | User-serviceable              | voting session    | 8.621×10−5
activation device         | Media/token                   | ballot activation | 2.027×10−4
activation device         | Main unit                     | ballot activation | 1.237×10−6
audit device              | All                           | ballot            | 1.237×10−6
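To give a concrete (and entirely invented) sense of scale, the following sketch multiplies two DRE rows of Table 6-3 by a hypothetical county's volume; none of the fleet or turnout figures come from the VVSG.

```python
# DRE rows of Table 6-3, in failures per voting session:
dre_non_user_serviceable = 1.941e-5
dre_user_serviceable = 8.621e-5

# Invented county: 200 DREs, each handling 150 voting sessions.
sessions = 200 * 150

# Expected failure counts for devices operating exactly at the benchmark rates:
expected_hard_failures = dre_non_user_serviceable * sessions  # ~0.58
expected_soft_failures = dre_user_serviceable * sessions      # ~2.59
```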

 

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

Define disenfranchisement.

6.3.1.5-B No single point of failure

All systems SHALL protect against a single point of failure that would prevent further voting at the polling place.

Applies To: Voting system

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

Source: [VSS2002] I.2.2.4.1.a / [VVSG2005] I.2.1.4.a

 

2 Comments

Comment by Gail Audette (Voting System Test Laboratory)

For single point failure and the benchmarks listed in Table 6-3, what entity (vendor or test lab) is responsible for conducting the fault tree analysis or similar analysis to provide pass/fail of this requirement?

Comment by Frank Padilla (Voting System Test Laboratory)

Test labs cannot determine this. Depends on systems utilized and number.

6.3.1.5-C Protect against failure of input and storage devices

All systems SHALL withstand, without loss of data, the failure of any data input or storage device.

Applies To: Voting system

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

Source: [VSS2002] I.2.2.4.1.e / [VVSG2005] I.2.1.4.e

 

6.3.2 Accuracy/error rate

Since accuracy is measured at the system level, it is not necessary to define different benchmarks for different classes of devices.

 

1 Comment

Comment by Cem Kaner (Academic)

Rather than defining a benchmark in terms of a typical case, VVSG should require the manufacturer to specify a maximum ballot complexity (number of contests is one important aspect of ballot complexity, number of alternatives per contest is another) and demonstrate acceptable results in simple, typical, and maximally complex cases. .......... If we relax the end-to-end testing requirement, we can assess accuracy empirically by generating large numbers of randomly-completed ballots by computer, feeding these ballots to the voting machine, and using the records of what was generated as the oracle for what was recorded. In an optical scan test, this is easily achieved by printing completed ballots. Similarly, by connecting a test computer to the input port for the keyboard, we can simulate human user input to DREs, allowing massive sets of automated tests, all with oracles. .......... (Affiliation Note: IEEE representative to TGDC)

6.3.2-A Satisfy integrity constraints

All systems SHALL satisfy the constraints in Part 1: 8.3 "Logic Model (normative)".

Applies To: Voting system

Test Reference: Part 3: 4.6 "Logic Verification"

Source: Formalization of general requirements

 
6.3.2-B End-to-End accuracy benchmark

All systems SHALL achieve a report total error rate of no more than 8×10−6 (1 / 125 000).

Applies To: Voting system

Test Reference: Part 3: 5.3.4 "Accuracy"

DISCUSSION

For the definition of report total error rate, see Requirement Part 3: 5.3.4-B.
This benchmark is derived from the "maximum acceptable error rate" used as the lower test benchmark in [VVSG2005]. That benchmark was defined as a ballot position error rate of 2×10−6 (1 / 500 000).

Given the diversity among the many jurisdictions, there is no "typical" ratio of votes to ballot positions; it is nevertheless necessary to base the benchmark on some rough estimates so that it is of the correct order of magnitude, albeit not optimal for every case. The rough estimates are as follows: in a presidential election, there will be approximately 20 contests, each vote-for-1, with an average of 4 candidates per contest, including the write-in position. (Some states will have fewer contests and some more. A few contests, like President, would have 8–13 candidates; most have 3 candidates including the write-in, and a few have 2.) The estimated ratio of votes to ballot positions is thus ¼.
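The scaling in the discussion above can be checked with a few lines of arithmetic (a sketch of the derivation, not normative text):

```python
# [VVSG2005] lower test benchmark, per ballot position:
ballot_position_error_rate = 2e-6  # 1 / 500 000

# Rough estimate above: 20 vote-for-1 contests, ~4 candidates each,
# so the ratio of votes to ballot positions is 20 / (20 * 4) = 1/4.
votes_per_position = 20 / (20 * 4)

# Rescaling the per-position rate to a per-vote (report total) rate:
report_total_error_rate = ballot_position_error_rate / votes_per_position
```

This reproduces the 8×10−6 (1 / 125 000) benchmark of Requirement 6.3.2-B.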

For paper-based tabulators, this general requirement is elaborated in Part 1: 7.7.5 "Accuracy".

Source: Generalized and clarified from [VSS2002] I.3.2.1 / [VVSG2005] I.4.1.1

Other accuracy-related requirements include Requirement Part 1: 6.4.1.7-D, Requirement Part 1: 7.1-E, Requirement Part 1: 7.1-F, Requirement Part 1: 7.5.4-A, and Requirement Part 1: 7.8.3.1-B.

 

1 Comment

Comment by George Gilbert (Local Election Official)

6.3.2-B--Does this benchmark apply to hand tabulated paper ballots? If not, how can hand tabulation, in auditing or recounts, be used to measure the accuracy of certified voting and tabulation systems? The reliance of Section 4 of Chapter 4 on hand tabulation of paper audit records seems to impel the establishment of a report total error rate consistent with, or more stringent than, the benchmark for automated tabulation systems.

6.3.3 Misfeed rate

6.3.3-A Misfeed rate benchmark

The misfeed rate SHALL NOT exceed 0.002 (1 / 500).

Applies To: Paper-based device ∧ Tabulator, EBM

Test Reference: Part 3: 5.3.5 "Misfeed rate"

DISCUSSION

Multiple feeds, misfeeds (jams), and rejections of ballots that meet all manufacturer specifications are all treated collectively as "misfeeds" for benchmarking purposes; i.e., only a single count is maintained.
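Under the collective-count convention in this discussion, the rate computation is trivial; the sketch below uses invented counts purely to show the bookkeeping.

```python
def misfeed_rate(multiple_feeds, jams, rejections, ballots_fed):
    """A single collective count: multiple feeds, misfeeds (jams), and
    rejections of in-spec ballots are all treated as "misfeeds"."""
    return (multiple_feeds + jams + rejections) / ballots_fed

# Invented test run: 5,000 in-spec ballots fed.
rate = misfeed_rate(multiple_feeds=1, jams=2, rejections=1, ballots_fed=5000)
meets_benchmark = rate <= 0.002  # 1/500, per Requirement 6.3.3-A
```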

Source: Merge of [VSS2002] I.3.2.5.1.4.b and I.3.2.5.2.c, reset benchmark

 

6.3.4 Electromagnetic Compatibility (EMC) immunity

The International Electrotechnical Commission (IEC) Technical Committee 77 on Electromagnetic Compatibility has defined [ISO95a] the concept of "ports" as the interface of an electronic device ("apparatus") with its electrical and electromagnetic environment, as illustrated in Part 1: Figure 6-1. In the sketch, the arrows point toward the apparatus, but in a complete assessment of the compatibility, one should also consider the other direction – that is, what disturbances ("emissions") can the apparatus inject into its environment.

Figure 6-1 Electrical and electromagnetic environment

Electrical and electromagnetic environment

Five of these ports involve conducted disturbances carried by metallic conductors, and the sixth, the "enclosure," allows radiated disturbances to impinge on the apparatus. In this context, the term "enclosure" should not be understood as limited to a physical entity (metallic or non-metallic, totally enclosed or with openings) but rather as simply the route whereby electromagnetic radiation couples with the circuitry and components of the apparatus.

In previous voting systems guidelines, possible interactions and immunity concerns have been described but perhaps not in explicit terms relating them to the concept of ports. In this updated version of the VVSG, the recitation of compatibility requirements is structured by considering the ports one at a time, plus some consideration of a possible interaction between ports:

  1. Power port – also described as "power supply" – via ordinary receptacles of the polling place
  2. Earth port – implied in the National Electric Code [NFPA05] stipulations for dealing with the power supply of the polling place
  3. Signal port – connection to the landline telephone of the polling place to the central tabulator
  4. Control port – inter-system connections such as voting station to precinct tabulator
  5. Enclosure port – considerations on immunity to radiated disturbances and electrostatic discharge
  6. Interaction between signal port and power port during surge events

Note: In this EMC section, the specified voltage and current levels are expressed in root mean square (rms) for power-frequency parameters and in peak value for surges and impulses.

 

1 Comment

Comment by E Smith/B Pevzner (Manufacturer)

Part 1 Chapter 6 Section 6.3.4 and 3 5 5.1.1 Comment: There are a number of test specifications being used/referred to – ANSI, IEEE, FCC, CISPR, Telcordia, IEC. Most of them are overlapping. Can one specification suite, such as IEC, be used as a base and maybe one or two more to fill the gaps?

6.3.4.2 Steady-state conditions

Adequate operation of an eventual surge-protective device and, more important, safety considerations demand that the power supply receptacles be of the three-prong type (Line, Neutral, and Equipment Grounding Conductor). The use of a "cheater" adapter for older type receptacles with only two-blade capacity and no dependable grounding conductor should be prohibited. Details on the safety considerations are addressed in Part 1: 3.2.8.2 "Safety".

The requirement of using a dedicated landline telephone service should also be satisfied for polling places.

Steady state conditions of a polling place are generally out of the control of the local jurisdiction.

However, for a polling place to ensure reliable voting, the power supply and telephone service need to be suitable for the purpose. Compliance with the National Electrical Code [NFPA05] is assumed to be required.

 
6.3.4.2-A Power supply – energy service provider

To obtain maximum flexibility of application, the voting system SHALL be powered by a 120 V, single phase power supply, as available in polling places, derived from typical energy service providers.

Applies To: Electronic device

Test Reference: Part 3: 3.1 "Inspection"

DISCUSSION

It is assumed that the AC power necessary to operate the voting system will be derived from the existing power distribution system of the facility housing the polling place. This single-phase power may be a leg of a 120/240 V single phase system, or a leg of a 120/208 V three-phase system, at a frequency of 60 Hz, according to the limits defined in [ANSI06], and premises wiring compliant with the [NFPA05], in particular its grounding requirements.

Source: [NFPA05]

 

1 Comment

Comment by George Gilbert (Local Election Official)

6.4.3.2-A—The most flexible, easiest to use DRE voting system my county has experienced required no external AC power source while in operation. It was powered by 6 D cell batteries which were capable of providing sufficient power to the system for far more that the hours the polls were open…and easy to replace if necessary. Polling places could be set up without the limitations of power outlet availability or location. No extension cords were needed and none interfered with access to or egress from the voting equipment. Polling place power surges or power failures did not affect the voting machines. I am told that the power requirements of modern (color capable) DRE machines exceed the capacity of available batteries…at least batteries that are practical for the application. This may not always be the case and the EAC would be well advised to avoid foreclosing this option in the VVSG. Based on our experience, battery operated DRE machines have numerous advantages over machines requiring external power.

6.3.4.2-B Telecommunications services provider

To avoid compromising voting integrity (accidentally or intentionally), the telephone connection of a voting system SHALL use a dedicated line (no extensions on the same telephone number) and be compatible with the requirements of the telephone service provider.

Applies To: Electronic device

Test Reference: Part 3: 3.1 "Inspection"

DISCUSSION

Communication (upon closing of the polls) between the polling place and the central tabulator is expected to be provided exclusively by the landline network of the telephone service provider connected to the facility housing the polling place. The use of cell phone communication is specifically prohibited.

Source: New requirement

 

1 Comment

Comment by Premier Election Solutions (Manufacturer)

Since the type of phone connection selected for use is controlled by the jurisdiction using the system, this is not testable by the VSTL, except for a review of documentation. Proposed Change: Remove this requirement.

6.3.4.3 Conducted disturbances immunity

As described in the introductory paragraphs of Part 1: 6.3.4 "Electromagnetic Compatibility (EMC) immunity", several ports of the voting system are gateways to possible electromagnetic disturbances, both inbound and outbound. This section dealing with conducted disturbances immunity addresses concerns about the power port and the communications ports (a combination of the in-house communications and communications to remote tabulating facilities).

Limitations of outbound conducted disturbances ("emissions" in EMC language) that might inject objectionable interference into the facility power distribution system or the telephone service connection are addressed in Part 1: 6.3.5 "Electromagnetic Compatibility (EMC) emission limits".

 
6.3.4.3-A Power port disturbances

All electronic voting systems SHALL withstand conducted electrical disturbances that affect the power ports of the system.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.2-A

DISCUSSION

The power distribution system of the polling place can be expected to be affected by several types of disturbances, ranging from very brief surges (microseconds) to longer durations (milliseconds) and ultimately the possibility of a long-term outage. These are addressed in the following requirements: A.1, A.2, A.3, and A.4.

NOTE: There are several scenarios of accidental conditions that can produce voltages far in excess of the deviations implied by [ANSI06] or [ITIC00], such as loss of a neutral conductor or commingling of distribution systems with low-voltage conductors (knocked-down poles, falling tree limbs). Such an event will produce massive failures of equipment other than voting systems in the building, and will be obvious to the officials conducting the polling. Hardware failure of the voting system can be expected. Fortunately, the occurrence of such events is quite rare, albeit not impossible, so such an extreme stress should not be included in the EMC requirements nor in the regimen of national certification testing – provided that the failure mode would not result in a safety hazard.

Source: [ANSI06], [IEEE02a], [ITIC00]

 
6.3.4.3-A.1 Combination Wave

All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, a "Combination Wave" surge of 6 kV 1.2/50 µs for high-impedance power ports and 3 kA 8/20 µs for low-impedance power ports, between line and neutral terminals.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.2-A.1

DISCUSSION

The so-called "Combination Wave" has been accepted by industry as representative of surges that might occur in low-voltage AC power systems and be imposed on connected loads.

Source: [IEEE02a]

 
6.3.4.3-A.2 Ring Waves

All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, a "Ring Wave" surge with a 0.5 µs rise time and a decaying oscillation at 100 kHz with a first peak voltage of 6 kV between the line and neutral terminals, and between the line and equipment grounding conductor terminals, and also 3 kV between the neutral and equipment grounding conductor terminals.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.2-A.2

DISCUSSION

This test waveform, proposed by IEEE since 1980 [IEEE80] as a "Standard Waveform" and more recently adopted by the IEC [ISO06c], represents common disturbances on AC power lines, but it was not included in previous versions of the VVSG. It originates during disturbances of power flow within the building, an occurrence more frequent than lightning surges. It is less likely than the Combination Wave to produce hardware destruction, but high levels still can produce hardware failure.

The "Power Quality" literature [Grebe96] and some standards [IEEE91] also cite "Decaying Ring Waves" or "Damped Oscillatory Waves" with lower frequencies but lesser amplitudes typically associated with the switching of power-factor correction capacitors. These can be significant for surge-protective device survival and possibly disruption of the operation of switched-mode power supplies. However, inclusion of the Combination Wave, the Ring Wave, and the Swells in these immunity criteria should be sufficient to ensure immunity against these lower frequency and lower amplitude decaying ring waves.

Source: [IEEE02a]

 
6.3.4.3-A.3 Electrical Fast Transient Burst

All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, a burst of repetitive fast transients with a waveform of 5/50 ns, each burst lasting 15 ms, from a 2 kV source.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.2-A.3

DISCUSSION

While the fast transients involved in this immunity requirement do not propagate very far and are not expected to travel from the energy supply provider, they can be induced within a facility if cable runs are exposed to switching disturbances in other load circuits. Unlike the preceding two disturbances that are deemed to represent possibly destructive surges, the Electrical Fast Transient (EFT) Burst has been developed to demonstrate equipment immunity to these non-destructive but disruptive transients. Their repetitive profile increases the probability that a disruption might occur when the logic circuits go through a transition. It is important to recognize that this test, which does not represent the actual environment, is one of interference immunity, not a test of withstanding energy stress.

Source: [IEEE02a]

 
6.3.4.3-A.4 Outages, sags and swells

All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, a complete loss of power lasting two hours and also a temporary overvoltage of up to 120 % of nominal system voltage lasting up to 0.5 second, and a permanent overvoltage of up to 110 % of nominal system voltage.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.2-A.4

DISCUSSION

Because the VVSG stipulates a two-hour backup, generally implemented by a floating battery pack, sag immunity is inherently ensured. However, the floating battery, unless buffered by a switch-mode power supply with inherent cut-off in case of a large swell, might not ensure inherent immunity against swells (short-duration system overvoltages). The information technology industry has adopted a recommendation that IT equipment should be capable of operating correctly for swells reaching 120 % of the nominal system voltage with durations ranging from 3 ms to 0.5 s, and for permanent overvoltages up to 110 % of nominal system voltage.
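The swell envelope cited above can be expressed as a small predicate. This is an illustrative reading of the discussion, with the function name invented here.

```python
def swell_within_tolerance(voltage_pct, duration_s):
    """True if a swell stays inside the cited envelope:
    up to 110 % of nominal indefinitely, and
    up to 120 % of nominal for durations of 0.5 s or less."""
    if voltage_pct <= 110:
        return True
    return voltage_pct <= 120 and duration_s <= 0.5
```

So, for example, a 115 % swell lasting 0.2 s falls within the envelope, while the same swell sustained for 2 s does not.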

Source: [ITIC00]

 
6.3.4.3-B Communications (telephone) port disturbances

All electronic voting systems SHALL withstand conducted electrical disturbances that affect the telephone ports of the system.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.2-B

DISCUSSION

Voting equipment, by being connected to the outside service provider via premises wiring, can be exposed to a variety of electromagnetic disturbances. These have been classified as lightning-induced, power-fault-induced, power contact, Electrical Fast Transient (EFT), and presence of steady-state induced voltage. Within a complex voting system installed in a polling place, there is also a possibility that the various pieces of equipment can be exposed to emissions from other pieces of connected equipment. In the context of VVSG compatibility, not only must the voting system equipment be immune to these disturbances, but the public switched telephone network must also be protected against harm originating from customer premises equipment, in this context the voting system equipment. Protection of the network is discussed in Part 1: 6.3.5 "Electromagnetic Compatibility (EMC) emission limits". Immunity to disturbances impinging on the voting system telephone port is addressed in the following requirements: B.1, B.2, B.3, B.4, B.5, and B.6.

Source: [Telcordia06]

 
6.3.4.3-B.1 Emissions from other connected equipment

All elements of an electronic voting system SHALL be able to withstand the conducted emissions generated by other elements of the voting system.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.2-B.1

DISCUSSION

This requirement is an issue of inherent compatibility among the diverse elements of a voting system, not compatibility with the polling place environment or with subscriber equipment other than that making up the voting system. Security requirements dictate that the voting system's outgoing communications be provided by a dedicated landline telephone service, excluding other subscriber terminal equipment otherwise used by entities occupying the facility, when telephone communication with central tabulators is established.

Source: [Telcordia06], [ANSI02]

 
6.3.4.3-B.2 Lightning-induced disturbances

All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, the stresses induced into the telephone network by lightning events, which can propagate to the telephone port of the voting system. The necessary immunity level is 1 kV for high-impedance ports and 100 A for low-impedance ports, both with a 10/1000 µs waveshape.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.2-B.2

DISCUSSION

Lightning events (direct flashes to the network or voltages induced in the network by nearby flashes to earth) can be at the origin of voltage or current surges impinging upon the interface of the premises wiring with the landline network. Surge protection in the Network Interface Device (primary protection NID) is not universally provided, especially in dense urban locations; therefore, the immunity level of the telephone port should be demonstrated as required by the Telcordia Generic Requirements.

Source: [Telcordia06]

 

1 Comment

Comment by Premier Election Solutions (Manufacturer)

It seems that the "disruption of normal operation," should not be the standard, but, rather, "no loss of data." Although a lightning strike may require the re-transmission of data, this is not an event that would impair a successful tabulation of results. Proposed Change: Change the requirement to read as follows: All electronic voting systems SHALL be able to withstand, without loss of data, the stresses induced into the telephone network by lightning events, which can propagate to the telephone port of the voting system. The necessary immunity level is 1 kV for highimpedance ports and 100 A for low-impedance ports, both with a 10/1000 µs waveshape.

6.3.4.3-B.3 Power fault-induced disturbances

All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, the stresses induced into the network by power faults occurring in adjacent power distribution systems. The necessary immunity level is 600 V at 1 A for a 1 s application.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.2-B.3

DISCUSSION

For overhead telephone landline cables that share the pole with power distribution cables (medium-voltage as well as low-voltage), as well as direct burial of adjacent telephone and power cables, large power system faults can induce significant voltages and the resulting currents in the telephone network.

Source: [Telcordia06]

 

1 Comment

Comment by Premier Election Solutions (Manufacturer)

Disruption of normal operations should not be a requirement of this section but simply no loss of data. There would be nothing wrong with having to re-transmit the data if a power fault induced disturbance occurred during the transmission of data. Proposed Change: Change the requirement to read as follows: All electronic voting systems SHALL be able to withstand, without loss of data, the stresses induced into the network by power faults occurring in adjacent power distribution systems. The necessary immunity level is 600 V at 1 A for a 1 s application.

6.3.4.3-B.4 Power contact disturbances

All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, the stresses appearing at the telephone port as a result of accidental contact between the telephone network cables and nearby power distribution cables. The necessary immunity level between ground and the T/R conductors at 60 Hz is 600 V for short durations and 277 V for indefinite durations.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.2-B.4

DISCUSSION

Outside of the polling place building, accidental contact between the telephone network cables and power distribution cables (sharing poles for overhead, or sharing trenches for underground) can inject substantial 60 Hz current and voltages into the telephone network. Within the polling place facility, while not at high probability, instances have been noted whereby contractors working in a facility can provoke a similar injection of 60 Hz current or voltage into the premises telephone wiring. The 600 V level cited in the above requirement is associated with an accidental contact with primary power lines, promptly cleared by the power system protection, while the 277 V level is associated with an accidental contact with low-voltage distribution system that might not be cleared by the power system protection.

Source: [Telcordia06]

 

1 Comment

Comment by Premier Election Solutions (Manufacturer)

Disruption of normal operations should not be a requirement of this section but simply no loss of data. There would be nothing wrong with having to re-transmit the data if a power contact disturbance occurred during the transmission of data. Proposed Change: Change the requirement to read as follows: All electronic voting systems SHALL be able to withstand, without loss of data, the stresses appearing at the telephone port as a result from an accidental contact between the telephone network cables and nearby power distribution cables. The necessary immunity level between ground and the T/R conductors at 60 Hz is 600 V for short durations and 277 V for indefinite durations.

6.3.4.3-B.5 Electrical Fast Transient (EFT)

All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, the disturbances associated with an EFT burst of 5/50 ns pulses, each burst lasting 15 ms, from a 0.25 kV source.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.2-B.5

DISCUSSION

Electrical Fast Transient bursts emulate the interference associated with electromagnetic coupling between the premises wiring of the telephone service and the premises wiring of the power distribution system in which switching surges can occur. Because these switching surges are random events, the occurrence of interference varies with the timing of their occurrence with respect to the transitions of the circuits. It is important to recognize that this requirement deals with interference immunity, not with withstanding energy stress. Immunity against such high-frequency coupling has been added to the requirements listed by [Telcordia06], effective January 1, 2008.

Source: [Telcordia06], [ISO04b]

 
6.3.4.3-B.6 Steady-state induced voltage

All electronic voting systems SHALL be able to withstand, without disruption of normal operation or loss of data, the disturbances associated with steady-state induced voltages and currents. The necessary immunity level is ≥126 dBrn (50 V).

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.2-B.6

DISCUSSION

Voting systems interfacing with the telephone service provider plant can be subject to the interfering effects of steady-state voltages induced from nearby power lines. Through electromagnetic coupling, normal operating currents on these power lines can induce common-mode (longitudinal) voltages and currents in the outside cable plant. The 60 Hz and 180 Hz components of the induced voltage spectrum can interfere with signaling and supervisory functions for data transmission from a polling place toward a central tabulator. Higher frequencies can produce audible noise in voice-band transmission.
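As a check on the 126 dBrn figure in the requirement: assuming the usual telephony reference for dBrn (decibels above 1 pW of reference noise) and an assumed 600 Ω line impedance, the level converts to roughly 50 V, consistent with the parenthetical value. A sketch:

```python
import math

DBRN_REF_WATTS = 1e-12  # dBrn reference noise: 1 pW (-90 dBm), the usual telephony convention
IMPEDANCE_OHMS = 600    # assumed telephone line impedance

power_watts = DBRN_REF_WATTS * 10 ** (126 / 10)        # ~3.98 W
voltage_rms = math.sqrt(power_watts * IMPEDANCE_OHMS)  # ~49 V, i.e. roughly 50 V
```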

Source: [Telcordia06]

 
6.3.4.3-C Interaction between power port and telephone port

All electronic voting systems connected to both a power supply and a landline telephone system SHALL withstand the potential difference caused by the flow of surge current in the facility grounding network.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.2-C

DISCUSSION

A voting system connected via its power port to the power distribution system of the facility and via its telephone port to the telephone service provider can experience a potentially damaging stress between the two ports during the expected operation of the telephone network interface device in the event of a surge occurring in the telephone system. Because the level of potential differences during a surge event is principally the result of the local configuration of the premises wiring and grounding systems, and thus beyond the control of the local polling entity, inherent immunity of the voting system can be achieved by incorporating a surge reference equalizer that provides the necessary bonding between the input power port and the telephone port during a surge event.

Source: [IEEE02], [IEEE05]

 

6.3.4.4 Radiated disturbances immunity

This section discusses radiated disturbances impacting the enclosure port of the voting system, including electromagnetic fields originating from adjacent or distant sources, as well as a particular radiation associated with electrostatic discharge.

Emission limits for radiated (and conducted) disturbances are addressed in Part 1: 6.3.5.2 "Radiated emissions".

 
6.3.4.4-A Electromagnetic field immunity (80 MHz to 6.0 GHz)

All electronic voting systems SHALL withstand, without disruption of normal operation or loss of data, exposure to radiated electromagnetic fields of ≥10 V/m over the entire frequency range of 80 MHz to 6.0 GHz, and ≥30 V/m within frequency bands commonly used by portable transmitters.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.3-A

DISCUSSION

The proliferation of portable transmitters (cellular telephones and personal communications systems) used by the general population and the common communications transmitters used by security, public safety, amateur radio, and other services increases the likelihood that the voting equipment covered in the VVSG will be exposed to the radiated electromagnetic fields from these devices. Also, other wireless devices (wireless local area networks, etc.), communications and broadcast transmitters may be operating in the vicinity and need to be considered. Since it may be impractical to eliminate nearby radio-frequency sources, voting systems must demonstrate immunity to these signals in order to operate to a high standard of reliability. This requirement is intended to ensure intrinsic immunity to the electromagnetic environment.

Source: [ANSI97], [ISO06a], [ISO06d]

 

2 Comments

Comment by E Smith/J Homewood (Manufacturer)

The requirement is for induced RF testing to be done at 10 V/m, and then in addition "30 V/m within frequency bands commonly used by portable transmitters". There is no specification of what those frequencies are. Additionally, even 10 V/m is considered dangerous to humans; how can 30 V/m from a cell phone or portable transmitter be considered reasonable? The testing section that refers to this requirement, Part 3: 5.1.1.3-C, specifies using IEC 61000-4-3, which specifies 10 V/m.

Comment by David B. Aragon (General Public)

I am a wireless engineer. My employer makes 802.11 (WiFi) but not voting equipment. I am speaking as an individual professional and IEEE member. Specifying this requirement in V/m units is unusual. At the frequencies to which this section applies, electrical and magnetic fields are commonly treated together as electromagnetic field strength. Other Federal bodies (e.g. FCC, OSHA) usually specify RF field strength in units of power per area (W/m^2 or mW/cm^2). Using a commonly accepted conversion factor, the proposed 10 V/m and 30 V/m thresholds correspond to 0.027 mW/cm^2 and 0.24 mW/cm^2 respectively. (By comparison, the consensus safe prolonged exposure level for humans is 1 mW/cm^2, i.e. about four times the larger of the proposed VVSG figures.) Unless there is a compelling reason to use V/m in this case, VVSG should specify limits in milliwatts per square centimeter, to better align with regulations and with the measuring equipment vendors use to comply with regulations. The actual values of the proposed limits are reasonable, but only if the equipment is not itself an RF transmitter. Thus, this section is consistent with Part I, Chapter 5.6.1-A which prohibits RF wireless technology. If the equipment were itself to radiate RF at power levels permitted by FCC for WiFi or Wimax, then the field could locally exceed the 30 V/m (or equivalently 0.24 mW/cm^2) that this section requires the equipment to tolerate.
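The commenter's conversion between field strength and power density follows from the free-space wave impedance (Z₀ ≈ 377 Ω): S = E²/Z₀. A minimal C++ sketch of the arithmetic (the function name is ours, invented for illustration):

```cpp
#include <cassert>
#include <cmath>

// Far-field power density from electric field strength: S = E^2 / Z0,
// where Z0 (about 377 ohms) is the impedance of free space.
// 1 W/m^2 equals 0.1 mW/cm^2.
double mw_per_cm2(double volts_per_meter) {
    const double Z0 = 377.0;  // ohms, free-space wave impedance
    double w_per_m2 = volts_per_meter * volts_per_meter / Z0;
    return w_per_m2 * 0.1;    // convert W/m^2 to mW/cm^2
}
// mw_per_cm2(10.0) is about 0.027 and mw_per_cm2(30.0) about 0.24,
// matching the figures quoted in the comment above.
```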

6.3.4.4-B Electromagnetic field immunity (150 kHz to 80 MHz)

All electronic voting systems SHALL withstand, without disruption of normal operation or loss of data, exposure to radio-frequency energy induced on cables in the frequency range of 150 kHz to 80 MHz at a 10 V level.

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.3-B

DISCUSSION

The dominant coupling mechanism of radiated electromagnetic fields to equipment electronics at frequencies below 80 MHz is considered to be through currents induced on interconnecting cables. At these frequencies, the wavelengths are such that typical circuit components are electrically very small and thus inefficient in coupling energy directly from the radiated electromagnetic fields. The interconnecting cables, on the other hand, tend to be on the order of the signal wavelengths and may act as efficient and possibly resonant antennas. Thus, the radiated electromagnetic fields will efficiently induce currents on these cables that are connected directly to the equipment electronics.
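The scale of the effect can be checked with a one-line wavelength calculation (an illustrative sketch; the function name is ours):

```cpp
// Free-space wavelength for a given frequency: lambda = c / f. A cable
// whose length approaches a half wavelength can act as a resonant antenna.
double wavelength_m(double freq_hz) {
    const double c = 299792458.0;  // speed of light, m/s
    return c / freq_hz;
}
// At 150 kHz, lambda is roughly 2 km, so ordinary cables are electrically
// short; at 80 MHz, lambda is about 3.75 m, putting the half-wave
// resonance (about 1.9 m) right in the range of typical cord lengths.
```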

Source: [ANSI97], [ISO06b]

 
6.3.4.4-C Electrostatic discharge immunity

All electronic voting systems SHALL withstand, without disruption of normal operation or loss of data, electrostatic discharges associated with human contact and contact with mobile equipment (service carts, wheelchairs, etc.).

Applies To: Electronic device

Test Reference: Part 3: 5.1.1.3-C

DISCUSSION

Electrostatic discharge events can originate from direct contact between an "intruder" (person or object) charged at a potential different from that of the units of the voting system, or from an approaching person about to touch the equipment – an "air discharge." The resulting discharge current can induce disturbances in the circuits of the equipment.

Note: The immunity addressed in this section is concerned with normal operations and procedures at the polling place. It does not include immunity to electrostatic discharges that might occur when service personnel open the enclosure and handle internal components.

Source: [ANSI93], [ISO01]

 

6.3.5 Electromagnetic Compatibility (EMC) emission limits

"Emission limits" are the companion of "Immunity Requirements" – both are necessary to achieve electromagnetic compatibility. In contrast with immunity requirements that are expressed as withstand levels for the equipment, emission limits requirements are expressed as compliance with consensus-derived limits on the parameters of the disturbances injected in the electromagnetic environment by the operation of the voting system.

 

6.3.5.1 Conducted emissions

Electronic voting systems, by their nature, can generate currents or voltages that will exit via their connecting cables to the power supply or to the telephone service provider of the voting facility. To ensure compatibility, industry standards or mandatory regulations have been developed to define maximum levels of such emissions.

 
6.3.5.1-A Power port connection to the facility power supply

All electronic voting systems installed in a polling place SHALL comply with emission limits affecting the power supply connection to the energy service provider according to Federal Regulations [FCC07].

Applies To: Electronic device

Test Reference: Part 3: 5.1.2.1 "Conducted emissions limits"

DISCUSSION

The normal operation of an electronic system can produce disturbances that will travel upstream and affect the power supply system of the polling place, creating a potential deviation from the expected electromagnetic compatibility of the system. The issue is whether these actual disturbances (after possible mitigation means incorporated in the equipment) reach a level high enough to exceed stipulated limits, which include the following categories:

  1. Harmonic emissions associated with the load current drawn by the voting system. However, given the low values of the current drawn by the voting system, these emissions do not represent a significant issue, as explained in [IEEE92]. They are only mentioned here for the sake of completeness in reciting the range of disturbances and therefore do not require testing.
  2. High-frequency conducted emissions (distinct from the harmonic spectrum) into the power cord by coupling from high-frequency switching or data transmission inherent to the system operation. These are addressed in the mandatory certification requirements of [FCC07], Class B.

Source: [IEEE92], [FCC07]

 
6.3.5.1-B Telephone port connection to the public network

All electronic voting systems installed in a polling place SHALL comply with emission limits stipulated by the industry-recognized organizations of telephone service providers Telcordia [Telcordia06] and TIA [ANSI02].

Applies To: Electronic device

Test Reference: Part 3: 5.1.2.1-A

DISCUSSION

Regulatory emission limits for protecting the network (public switched telephone network) from harm via customer premises equipment are contained in the source documents [Telcordia06], [ANSI02], and [FCC07a]; compliance with these documents is considered mandatory for offering the equipment on the market.

Source: [Telcordia06], [ANSI02], [FCC07a]

 
6.3.5.1-C Leakage via grounding port

All electronic voting systems installed in a polling place SHALL comply with limits of leakage currents effectively established by the trip threshold of all listed Ground Fault Circuit Interrupters (GFCIs), if any, installed in the branch circuit supplying the voting system.

Applies To: Electronic device

Test Reference: Part 3: 5.1.3.2-A

DISCUSSION

Excessive leakage current is objectionable for two reasons:

  1. For a branch circuit or wall receptacle that could be provided with a GFCI (depending upon the wiring practice applied at the particular polling place), leakage current above the GFCI built-in trip point would cause the GFCI to trip and therefore disable the operation of the system.
  2. Should the power cord lose the connection to the equipment grounding conductor of the receptacle, a personnel hazard would occur. (Note the prohibition of "cheater" adapters in the discussion of general requirements for the polling place.)
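The first point can be made concrete with a small sketch. Class A GFCIs (UL 943) trip at a nominal 5 mA of ground-fault current, and leakage currents of devices sharing one protected branch circuit are additive; the per-device figures below are hypothetical:

```cpp
// Class A GFCIs (UL 943) must trip at roughly 4-6 mA of ground-fault
// current; 5 mA is the nominal threshold. Leakage from devices sharing
// one protected branch circuit is additive.
bool trips_gfci(double per_device_leakage_ma, int device_count,
                double trip_threshold_ma = 5.0) {
    return per_device_leakage_ma * device_count >= trip_threshold_ma;
}
// Four hypothetical devices leaking 0.75 mA each (3.0 mA total) hold;
// eight such devices (6.0 mA total) would trip the GFCI and disable
// every voting device on the circuit.
```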

This requirement is related to safety considerations as discussed in Part 1: 3.2.8.2 "Safety" – in particular the requirement to have the voting system comply with [UL05].

Note: According to [NFPA05], a bond between the equipment grounding conductor and the neutral conductor is prohibited downstream from the entrance service panel. GFCIs are designed to trip if such a prohibited bond is detected by the GFCI.

Source: [UL06], [NFPA05]

 

6.3.5.2 Radiated emissions

6.3.5.2-A Radiated radio frequency emissions

All electronic voting systems installed in a polling place SHALL comply with emission limits according to the Rules and Regulations of the Federal Communications Commission, Part 15, Class B [FCC07] for radiated radio-frequency emissions.

Applies To: Electronic device

Test Reference: Part 3: 5.1.2.2-A

DISCUSSION

Electronic equipment in general and modern high-speed digital electronic circuits in particular have the potential to produce unintentional radiated and conducted radio-frequency emissions over wide frequency ranges. These unintentional signals can interfere with the normal operation of other equipment, especially radio receivers, in close proximity. The requirements of [FCC07] and [ANSI06a] are intended to minimize this possible interference and control the level of unwanted radio-frequency signals in the environment.

Source: [FCC07]

 

6.3.6 Other requirements

In addition to the requirements associated with EMC discussed in the preceding sections, there are other requirements, including dielectric withstand, personnel safety considerations (addressed in Part 1: 3.2.8.2 "Safety") and hardware failure modes (which can also be a safety issue) [UL05].

 

6.3.6.1 Dielectric withstand

6.3.6.1-A Dielectric stresses

All electronic voting systems SHALL be able to withstand the dielectric test stresses associated with connection to the network, characterized by limits of the admissible leakage current.

Applies To: Electronic device

Test Reference: Part 3:5.1.3.1-A

DISCUSSION

Dielectric withstand levels, stipulated by industry-consensus telephone standards as a condition for connecting equipment to the network, involve the insulation and leakage current limits between elements of the voting system hardware, including the following:

  1. Network and device or accessible circuitry which might in turn connect to the user;
  2. Network and hazardous power system; and
  3. Power equipment.

Source: [Telcordia06]

 

6.4 Workmanship

This section contains requirements for voting system materials, and for good design and construction workmanship for software and hardware:

  • Software engineering practices;
  • Quality assurance and configuration management;
  • General build quality;
  • Durability;
  • Security and audit architectural requirements;
  • Maintainability;
  • Temperature and humidity; and
  • Equipment transportation and storage.
 

1 Comment

Comment by Alec Yasinsac (Academic)

Absence of Development Process Assessment: The absence of any development process analysis effort is an important VVSG Draft omission. Recommend that the following requirement be inserted as the new section 6.4.1: 6.4.1 Developer Capability Maturity This section defines the development standard required for all software developers submitting voting systems for VVSG certification. While the subsequent sections in Section 6.4 address product requirements, this subsection designates the proven process quality that must be achieved for voting system developers in the following two requirements. (1) Only developers that have achieved Capability Maturity Model Integrated (CMMI) Level 4 (Quantitatively Managed), certified through the Standard CMMI Appraisal Method for Process Improvement (SCAMPI) process, may submit voting systems for VVSG certification. (2) Developers may only submit their own products that were developed using the certified CMMI Level 4 processes. DISCUSSION The voting application is critical to the health of any democracy, and software engineering excellence is dependent on mature development processes that allow developers to repeat successful practices and avoid risky practices, procedures, and approaches. Requiring mature development processes offers the best hope for creating quality, testable voting system software.

6.4.1 Software engineering practices

This section describes essential design and performance characteristics of the logic used in voting systems. The requirements of this section are intended to ensure that voting system logic is reliable, robust, testable, and maintainable.

The general requirements of this section apply to logic used to support the entire range of voting system activities. Although this section emphasizes software, the standards described also influence hardware design considerations.

While there is no best way to design logic, the use of outdated and ad hoc practices is a risk factor for unreliability, unmaintainability, etc. Consequently, these VVSG require the use of modern programming practices. The use of widely recognized and proven logic design methods will facilitate the analysis and testing of voting system logic.

 

2 Comments

Comment by Carl Hage (General Public)

This whole section is inappropriate-- arbitrary rules are specified, and the most important requirement is missing-- code comments! The "rules" listed in this section could lead to confusing and unreliable code-- depending on the routine being written. Splitting code into small subroutines could detract from readability and testability, and a goto statement (e.g. exception handling) on occasion can simplify the code and improve reliability and readability. What is completely missing is a requirement for comments in the code documenting the behavior. Too often code is assumed to be self documenting, and sometimes the only comments are just a repeat of the code, e.g. "WaitForLoop(); // Wait for the Loop" or "i++; /* Increment i */" rather than "i++; /* Advance to the next character*/". It would be appropriate to require every variable be documented with a comment at its declaration. Rather than list arbitrary rules, I suggest giving some general guidelines/requirements, e.g. the code is well structured, and well documented, and easily testable. The reviewers that certify the software should be asked to fill a survey form that rates the software. This specification could define the rating system, e.g. Is the code well structured: - Well structured code, exceeding requirements and above normal - Well structured code, easy to read and easy for testing - Structured code, meets minimum requirements - Does not meet requirements: (check those that apply) - difficult to understand - possible reliability problems - testability problems - spaghetti code - excessive subroutine nesting - spaghetti objects/excessive confusing inheritance - other: Likewise the code should be rated for documentation and possibly testing. Is the code easy to understand-- - Well documented, much better than normal - Well documented, easy to understand - Documented, could be improved - Poorly documented, additional notes required for reading - Unacceptable - product is disqualified

Comment by Gail Audette (Voting System Test Laboratory)

Is dead code being allowed? Please provide specific requirements for the permitted and non-permitted use of hard-coded passwords.

6.4.1.1 Scope

The design requirements of this section apply to all application logic, regardless of the ownership of the logic or the ownership and location of the hardware on which the logic is installed or operates. Although it would be desirable for COTS software to conform to the design requirements on workmanship, its conformity to those requirements could not be assessed without access to the source code; hence, the design requirements are scoped to exclude COTS software. However, where there are functional requirements, the behaviors of COTS software and hardware are constrained. (N.B., the definition of COTS precludes any application logic from receiving a COTS designation.)

Third-party logic, border logic, and configuration data are not required to conform to the design requirements on workmanship, but manufacturers are required to supply that source code and data to the test lab to enable a complete review of the application logic (Requirement Part 2: 3.4.7.2-E, Requirement Part 2: 3.8-D).

 

1 Comment

Comment by Brian V. Jarvis (Local Election Official)

While fully understanding your rationale for design requirements being scoped to exclude COTS software (and hardware), you do point out in the preceding paragraph that "the use of outdated and ad hoc practices is a risk factor for unreliability, unmaintainability, etc. Consequently, these VVSG require the use of modern programming practices." Recommend that all COTS suppliers be certified to a current best practice development standard (e.g., ISO9001, CMMI). While not imposing design requirements on workmanship, using only certified suppliers would provide confidence that the supplier has suitable infrastructure, resources, information, equipment, measuring and monitoring devices, and environmental conditions to produce a conforming product.

6.4.1.2 Selection of programming languages

6.4.1.2-A Acceptable programming languages

Application logic SHALL be produced in a high-level programming language that has all of the following control constructs:

  1. Sequence;
  2. Loop with exit condition (e.g., for, while, and/or do-loops);
  3. If/Then/Else conditional;
  4. Case conditional; and
  5. Block-structured exception handling (e.g., try/throw/catch).

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

The intent of this requirement is clarified in Part 1: 6.4.1.5 "Structured programming" with discussion and examples of specific programming languages.

By excluding border logic, this requirement allows the use of assembly language for hardware-related segments, such as device controllers and handler programs. It also allows the use of an externally-imposed language for interacting with an Application Program Interface (API) or database query engine. However, the special code should be insulated from the bulk of the code, e.g. by wrapping it in callable units expressed in the prevailing language, to minimize the number of places that special code appears. Cf. [MIRA04] Rule 2.1: "Assembly language shall be encapsulated and isolated."
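The "wrap it in callable units" recommendation can be sketched as follows; here the C library's strtol stands in for an externally imposed interface, and the function name and scenario are invented for illustration:

```cpp
#include <cstdlib>
#include <optional>
#include <string>

// Sketch: confine an externally imposed, error-prone interface (here the
// C library's strtol) to a single checked wrapper expressed in the
// prevailing language. The rest of the application calls only the
// wrapper, so special code appears in exactly one place.
std::optional<long> parseBallotCount(const std::string& text) {
    char* end = nullptr;
    long value = std::strtol(text.c_str(), &end, 10);
    if (end == text.c_str() || *end != '\0' || value < 0) {
        return std::nullopt;  // reject empty input, trailing junk, negatives
    }
    return value;
}
```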

Acceptable programming languages are also constrained by Requirement Part 1: 6.4.1.7-A.3 and Requirement Part 1: 6.4.1.7-A.4, which effectively prohibit the invention of new languages.

Source: [VVSG2005] I.5.2.1, I.5.2.4 and II.5.4.1
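The five required constructs can be seen together in one short routine. The sketch below uses C++ purely as one example of a conforming language; the ballot-marking scenario and function name are invented:

```cpp
#include <stdexcept>

// Exercises the five constructs required of an acceptable language.
// Returns the mark count, or -1 (overvote), -2 (undervote),
// -3 (invalid contest definition).
int classifyMarks(int marks, int allowed) {
    int result = 0;                       // 1: sequence
    for (int i = 0; i < marks; ++i) {     // 2: loop with exit condition
        ++result;
    }
    if (result > allowed) {               // 3: if/then/else conditional
        result = -1;                      //    overvote
    } else {
        switch (result) {                 // 4: case conditional
            case 0:  result = -2; break;  //    undervote
            default: break;
        }
    }
    try {                                 // 5: block-structured exceptions
        if (allowed <= 0) throw std::invalid_argument("bad contest");
    } catch (const std::invalid_argument&) {
        result = -3;
    }
    return result;
}
```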

 

3 Comments

Comment by Troy D. Straszheim (Academic)

I feel that the legitimate implementation languages should be enumerated here. C, C++, Java, Python. They should be languages that are "open", so that the language implementation itself is subject to peer review, and so that the programming constituency can verify the integrity of the code. E.g. this paragraph doesn't really address the situation where a Voting Machine Control Language Compiler is implemented in assembly; the compiler itself could contain malicious code. I disagree that exception handling should be here as a requirement. It is plenty possible to write secure code without exceptions. Again, in my view it makes more sense to specify languages rather than language features.

Comment by Kevin Baas (Voter)

I agree with Troy: "[T]he language implementation itself [must be] subject to peer review". Enumerating required control structures is absolutely pointless. If it's Turing complete and reasonably easy to program, it would therefore, necessarily, have a good set of control structures, so the issue need not be forced. And if it doesn't, so what? This wouldn't affect vote security. Likewise, it need not be a "high-level" language. There's absolutely no point in forcing that. High-level languages introduce security risks, as there's more software between the code and the computer (by this I mean the compiler or interpreter) that could contain malicious or errant code. Mid-level languages like C or C++ would have less such security risk because many people know how to implement a compiler for them and would be able to peer-review the language implementation. It seems that lower-level languages would be more secure, because there's less black box between the code and the execution. And as I said on a different comment, one also should be able to peer-review and test the processor that runs the instructions, and therefore the machine code must be made open. If someone argues against this and says that making the machine code open would be a security risk, well, here's the rebuttal: The machine code instruction specifications for the computer you're using right now are published publicly, and that clearly doesn't pose a security risk. The reverse does, however, because you can't tell what's inside a black box. So long as there's a black box in the machine, it could be doing just about anything. And so long as that box lies in between a vote and a tally, it doesn't matter what anything else does - what's inside that black box ultimately determines what the tally is.

Comment by Kevin Wilson (Voting System Test Laboratory)

Suggests that SQL (which could be construed as a database query engine language) might be allowed only in encapsulated usage (border logic). SQL is not included in the Table 6-4 (1-6.4.1.5) even though some modern databases have SQL variants (T-SQL SQL Server 2000+, PL/SQL Oracle 9i+) that contain constructs like sequence, loop with exit condition, if/then/else conditional, case conditional, block-structured exception handling. In those cases can SQL be treated as an acceptable application logic language?

6.4.1.2-A.1 COTS language extensions are acceptable

Requirement Part 1: 6.4.1.2-A MAY be satisfied by using COTS extension packages to add missing control constructs to languages that could not otherwise conform.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

For example, C99 [ISO99] does not support block-structured exception handling, but the construct can be retrofitted using (e.g.) [Sourceforge00] or another COTS package.

The use of non-COTS extension packages or manufacturer-specific code for this purpose is not acceptable, as it would place an unreasonable burden on the test lab to verify the soundness of an unproven extension (effectively a new programming language). The package must have a proven track record of performance supporting the assertion that it would be stable and suitable for use in voting systems, just as the compiler or interpreter for the base programming language must.

Source: Tightening of [VVSG2005] I.5.2.4 and II.5.4.1

 

2 Comments

Comment by Michael Roth (Academic)

Approved languages should be explicitly listed. The specifications, at least as I've read them here, seem to allow any arbitrary language, provided that general language constructs such as try/catch are implemented in some form, specifically by usage of "COTS". The language here isn't clear, but based on your reference to an open-source implementation of exceptions for C99, it would seem that the requirement, in simple terms, is: any arbitrary language, with any 3rd-party implementations of missing functionality. This seems to leave a lot of room for a lot of possible issues, as the document does not place any heavy emphasis on how the languages themselves will be analyzed to ensure proper behavior. By explicitly listing exactly what languages, and exactly what "COTS" extensions, are allowed, you can better focus on specific testing protocols that can exploit known issues with particular languages such as buffer overflows (in languages without built-in run-time bounds checking for dynamically-allocated data structures, like C), memory leaks (in non-garbage-collecting languages like C/C++ and Pascal), and other issues which may compromise proper system behavior or allow for possible security exploits. Thank you for your efforts. Sincerely, Michael Roth

Comment by Kevin Wilson (Voting System Test Laboratory)

States that "If the 'three year rule' was satisfied at the time that a system was first submitted for testing, it is considered satisfied for the purpose of subsequent reassessments of that system." Are the 2002 and 2005 EAC Guidelines "grandfathered in" as a result of this statement or do the sections 6.4.1.4-6.4.1.5 apply to both legacy and new code? In reference to this, especially when dealing with legacy voting devices that contain limited memory and/or computing power, the introduction of third party tools such as except might not fit on the target platform. Are such devices to be retired because of these software workmanship standards?

6.4.1.3 Selection of general coding conventions

6.4.1.3-A Acceptable coding conventions

Application logic SHALL adhere to a published, credible set of coding rules, conventions or standards (herein simply called "coding conventions") that enhance the workmanship, security, integrity, testability, and maintainability of applications.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

Coding conventions that are excessively specialized or simply inadequate may be rejected on the grounds that they do not enhance one or more of workmanship, security, integrity, testability, and maintainability.

See the discussion for Requirement Part 1: 6.4.1.2-A regarding border logic.

Source: Rewrite of [VSS2002] I.4.2.6

 

2 Comments

Comment by Alan A. Jorgensen, Ph.D., for the Association for Software Testing Special Interest Group on eVoting (Advocacy Group)

Comments on Part1, Chapter 6, 6.4.1.3 Selection of general coding conventions The VVSG should adopt the term "coding standards" instead of "coding conventions". "Coding standards" are more rigorous by industry-accepted definition than "coding conventions", and more closely describe the level of sophistication and rigor that ought to be applied to the code running our voting machines, therefore the use of the phrase "coding conventions" is inappropriate in the VVSG. The VVSG should require vendors to supply the coding standard to which their code complies, and the publications from which it is derived. The VVSG should contain sections on minimum coding standards. For example: 6.4.1.3-B Minimum Coding Standards Vendor coding standards SHALL meet or exceed the minimum coding standards of the following sections: 6.4.1.3-B.1 Naming of procedures and data All names of procedures and data SHALL be clearly and specifically descriptive of items they represent. Abbreviations SHALL NOT be allowed. Names of procedures SHALL be imperative verb clauses except when returning a data value where the procedure SHALL be named to be descriptive of the data value returned. Names of data items SHALL be noun clauses descriptive of the data items they represent. Example Review of the SAVIOC code from SAVIOC Voting Systems reveals frequent if not total use of * Abbreviations such as "WIallowed", "ln", etc. * Generic names such as "CharPtr", "templine", etc. * Coded names such as "lc_N", "MNprty", etc. * Non descriptive procedure names such as "first_pass", "party_adds", etc. Example The code in the VVSG must conform to these requirements as well. See Part 1, Chapter 6.4.1.5-A.1 Legacy library units must be wrapped for an example of non-compliant variable names and data items. 6.4.1.3-B.2 Parameterization of Constants Constants SHALL be named, defined and explained. Only names of constants SHALL be used in procedural code. Constants SHALL be appropriately named as the data they represent. 
Example Review of the SAVIOC code from SAVIOC Voting Systems reveals frequent use of unidentified constants: * char digit[3]; // Numeric character What is 3? * allowed = 80-(locCselect); What is 80? * window(1,25, 78,25); What are these numbers? What is the purpose and nature of this window? 6.4.1.3-B.3 Coding Comments Comments embedded in code SHALL be complete, grammatical English language sentences that explain the code to which they refer. Comments embedded in code SHALL NOT contain abbreviations nor program identifiers (except parenthetically). Comments embedded in code SHALL contain only ASCII printable and whitespace characters. Comments SHALL NOT be used to disable the compilation of sections of code. Example See the non-compliant comments, constants, and variable names in the code in Part 3, Chapter 5.3.2 Critical values of the VVSG.

Comment by Sascha Davis (Voting System Test Laboratory)

How many coding conventions are the vendors allowed to provide us? If multiple coding standards are allowed, then how do VSTLs handle any conflicting standards? Do standards submitted by the vendor overrule the 2007 standards in conflicting areas? How are different interpretations of the same standard now handled?

6.4.1.3-A.1 Published

Coding conventions SHALL be considered published if and only if they appear in a publicly available book, magazine, journal, or new media with analogous circulation and availability, or if they are publicly available on the Internet.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

This requirement attempts to clarify the "published, reviewed, and industry-accepted" language appearing in previous iterations of the VVSG, but the intent of the requirement is unchanged.

Following are examples of published coding conventions (links valid as of 2007-02). These are only examples and are not necessarily the best available for the purpose.

  1. Ada: Christine Ausnit-Hood, Kent A. Johnson, Robert G. Pettit, IV, and Steven B. Opdahl, Eds., Ada 95 Quality and Style, Lecture Notes in Computer Science #1344, Springer-Verlag, 1995-06. Content available at http://www.iste.uni-stuttgart.de/ps/ada-doc/style_guide/cover.html and elsewhere.
  2. C++: Mats Henricson and Erik Nyquist, Industrial Strength C++, Prentice-Hall, 1997. Content available at http://hem.passagen.se/erinyq/industrial/.
  3. C#: "Design Guidelines for Class Library Developers," Microsoft. http://www.msdn.microsoft.com/library/default.asp?url=/library/en-us/cpgenref/html/cpconnetframeworkdesignguidelines.asp.
  4. Java: "Code Conventions for the Java™ Programming Language," Sun Microsystems. http://java.sun.com/docs/codeconv/.

Source: Clarification of [VSS2002] I.4.2.6

 
6.4.1.3-A.2 Credible

Coding conventions SHALL be considered credible if and only if at least two different organizations with no ties to the creator of the rules or to the manufacturer seeking conformity assessment, and which are not themselves voting equipment manufacturers, independently decided to adopt them and made active use of them at some time within the three years before conformity assessment was first sought.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

This requirement attempts to clarify the "published, reviewed, and industry-accepted" language appearing in previous iterations of the VVSG, but the intent of the requirement is unchanged.

Coding conventions evolve, and it is desirable for voting systems to be aligned with modern practices. If the "three year rule" was satisfied at the time that a system was first submitted for testing, it is considered satisfied for the purpose of subsequent reassessments of that system. However, new systems must meet the three year rule as of the time that they are first submitted for testing, even if they reuse parts of older systems.

Source: Clarification of [VSS2002] I.4.2.6

 

1 Comment

Comment by Craig Burton, CTO, EveryoneCounts.com (Manufacturer)

We applaud the standards clearly defining coding practices with language examples. VSS02 was inconsistent, referring to 'no dynamic languages' but then being unclear on, for example, the use of Java. We are pleased Java figures in the VVSG as it forms the basis of much advanced work in secure transactions. Also, the standards do not go so far as to assert the Capability Maturity Model or Common Criteria for security. We too do not think these are necessary at this stage, but we assert there is at least one new Common Criteria protection profile being created for e-voting systems in general. VVSG should consider creating or using such a profile as it could inherit from the international standard for computer security (ISO 15408). We assert that any developer of e-voting systems should at least be willing to surrender or allow random selection of coding examples and attest to their being representative of the entire system. The EAC's appointed code auditor should then get final say as to whether the coding style obfuscates the process it automates. In this case the auditor should be able to require a rewrite before the code can be used. This would allow for the 'professional judgment' required at other sections in Chapter 6. This should not be left to providers. We think the details asserted in VVSG section 6.4.1.3 should instead be left for a professional third-party code auditor to establish and report conformance.

6.4.1.4 Software modularity and programming

6.4.1.4-A Modularity

Application logic SHALL be designed in a modular fashion.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

See module. The modularity rules described here apply to the component submodules of a library.

Source: Extracted and revised from [VSS2002] I.4.2.3

 

1 Comment

Comment by Brian V. Jarvis (Local Election Official)

The link to "module" in the DISCUSSION is broken.
6.4.1.4-A.1 Module testability

Each module SHALL have a specific function that can be tested and verified independently of the remainder of the code.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

In practice, some additional modules (such as library modules) may be needed to compile the module under test, but the modular construction allows the supporting modules to be replaced by special test versions that support test objectives.

Source: Extracted and revised from [VSS2002] I.4.2.3.a

 

1 Comment

Comment by Brian V. Jarvis (Local Election Official)

Recommend that a related testability requirement be established indicating that the functionality of a module being tested and verified is traceable to a defined (or derived) requirement. (This ensures traceability of requirements throughout the development lifecycle.)
6.4.1.4-B Module size and identification

Modules SHALL be small and easily identifiable.

Test Reference: Part 3: 4.5.1 "Workmanship"

Source: Revision of [VSS2002] II.5.4.2.i, as revised by Section 6.6.4.2, Paragraph i of [P1583] and subsequent issues[5]

 

2 Comments

Comment by Gail Audette (Voting System Test Laboratory)

Subjective requirement - not testable.

Comment by Premier Election Solutions (Manufacturer)

Given the extensive requirements and complexity of election systems, keeping modules small (depending on the definition of small) may not be achievable. Please define the meaning of "small and easily identifiable."
6.4.1.4-B.1 Callable unit length limit

No more than 50 % of all callable units (functions, methods, operations, subroutines, procedures, etc.) SHOULD exceed 25 lines of code in length, excluding comments, blank lines, and initializers for read-only lookup tables; no more than 5 % of all callable units SHOULD exceed 60 lines in length; and no callable units SHOULD exceed 180 lines in length.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

"Lines," in this context, are defined as executable statements or flow control statements with suitable formatting.

Source: Revision of [VSS2002] II.5.4.2.i, as revised by Section 6.6.4.2, Paragraph i of [P1583][5]

 

2 Comments

Comment by Gail Audette (Voting System Test Laboratory)

This is a SHOULD requirement, therefore, will not be adhered to. This does not speak to the complexity of the code - for example the number of paths through the code or complexity. For reliability, maintainability, and testability (at unit, integration, and functional test levels), the complexity of the code is a stronger metric than simple length limits. Similarly, the depth of inheritance structure metric also provides a strong indication as to the code's ability to be tested and maintained.

Comment by Richard Carback (Academic)

The number of lines of code it takes to do certain tasks is language dependent. This requirement might be hard to follow with certain languages.
6.4.1.4-B.2 Lookup tables in separate files

Read-only lookup tables longer than 25 lines SHOULD be placed in separate files from other source code if the programming language permits it.

Test Reference: Part 3: 4.5.1 "Workmanship"

 

1 Comment

Comment by Gail Audette (Voting System Test Laboratory)

Doesn't this limit allowable languages? By this requirement, current voting system applications would be made obsolete. Is this the intention?

6.4.1.5 Structured programming

Note: Specific programming languages are identified to support the discussion. In no case does such identification imply recommendation or endorsement, nor does it imply that the programming languages identified are necessarily the best or only languages acceptable for voting system use.

Table 6-4 Presence of high-level concepts of control flow in the coding conventions of earlier versions of VVSG and in various programming languages

Concept                              VSS/VVSG  Ada  C    C++  C#   Java  VB 8
Sequence                             Yes       Yes  Yes  Yes  Yes  Yes   Yes
Loop with exit condition             Yes       Yes  Yes  Yes  Yes  Yes   Yes
If/Then/Else conditional             Yes       Yes  Yes  Yes  Yes  Yes   Yes
Case conditional                     Yes       Yes  Yes  Yes  Yes  Yes   Yes
Named block exit                     No        Yes  No   No   No   Yes   No[1]
Block-structured exception handling  No        Yes  No   Yes  Yes  Yes   Yes

Column key: VSS/VVSG = coding conventions of VSS [GPO90], [VSS2002], and VVSG [VVSG2005]; Ada [ISO87] [ISO95]; C [ISO90] [ISO99]; C++ [ISO98] [ISO03a]; C# [ISO03b] [ISO06]; Java [Java05]; VB 8 = Visual Basic 8 [MS05].

The requirement to follow coding conventions serves two purposes. First, by requiring specific risk factors to be mitigated, coding conventions support integrity and maintainability of voting system logic. Second, by making the logic more transparent to a reviewer, coding conventions facilitate test lab evaluation of the logic's correctness to a level of assurance beyond that provided by operational testing.

Prominent among the requirements addressing logical transparency is the requirement to use high-level control constructs and to refrain from using the low-level arbitrary branch (a.k.a. goto). As is reflected in Part 1: Table 6-4, most high-level concepts for control flow were established by the time the first edition of the Guidelines was published and are supported by all of the programming languages that were examined as probable candidates for voting system use as of this iteration. However, two additional concepts have been slower to gain universal support.

The first additional concept, called here the "named block exit," is the ability to exit a specific block from within an arbitrary number of nested blocks, as opposed to only being able to exit the innermost block, without resorting to goto. The absence of named block exit from some languages is not cause for concern here because deeply nested blocks are themselves detrimental to the transparency of logic and most coding conventions encourage restructuring them into separate callable units.
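The restructuring recommended here can be sketched in C++ (the function and its names are ours): a doubly nested search that would otherwise call for a named block exit or a goto is extracted into its own callable unit, so an ordinary return exits all of the loops at once.

```cpp
#include <vector>

// Extracting the nested loops into their own callable unit lets an
// ordinary `return` stand in for a named block exit: it leaves both
// loops at once, with no goto and no label.
bool containsValue(const std::vector<std::vector<int>>& grid, int target) {
    for (const auto& row : grid)
        for (int cell : row)
            if (cell == target)
                return true;   // exits both loops simultaneously
    return false;
}
```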

The second additional concept, called here "block-structured exception handling," is the ability to associate exception handlers with blocks of logic, and implicitly, the presence of the exception concept in the programming language. (This simply means try/throw/catch or equivalent statements, and should not be confused with the specific implementation known as Structured Exception Handling (SEH) [Pietrek97].[2]) Unlike deeply nested blocks, exceptions cannot be eliminated by restructuring logic. "When exceptions are not used, the errors cannot be handled but their existence is not avoided." [ISO00a]

Previous versions of VVSG required voting systems to handle such errors by some means, preferably using programming language exceptions ([VVSG2005] I.5.2.3.e), but there was no unambiguous requirement for the programming language to support exception handling. These Guidelines require programming language exceptions because without them, the programmer must check for every possible error condition in every possible location, which both obfuscates the application logic and creates a high likelihood that some or many possible errors will not be checked. Additionally, these Guidelines require block-structured exception handling because, like all unstructured programming, unstructured exception handling obfuscates logic and makes its verification by the test lab more difficult. "One of the major difficulties of conventional defensive programming is that the fault tolerance actions are inseparably bound in with the normal processing which the design is to provide. This can significantly increase design complexity and, consequently, can compromise the reliability and maintainability of the software." [Moulding89]
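The separation [Moulding89] describes can be sketched in C++ (the parsing functions are invented for illustration): the normal processing occupies the try block, while the fault-tolerance actions are confined to the catch block instead of being interleaved as error checks after every statement.

```cpp
#include <stdexcept>
#include <string>

// Hypothetical parser used only to illustrate the structure; it throws
// on bad input rather than returning an error code.
int parsePositive(const std::string& text) {
    int value = std::stoi(text);  // throws std::invalid_argument on garbage
    if (value <= 0)
        throw std::out_of_range("value must be positive");
    return value;
}

// Normal flow and fault handling are textually separated: the try block
// reads as the intended processing, the catch block as the recovery.
int parseOrDefault(const std::string& text, int fallback) {
    try {
        return parsePositive(text);   // the "normal processing"
    } catch (const std::exception&) {
        return fallback;              // fault tolerance, kept apart
    }
}
```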

Existing voting system logic implemented in programming languages that do not support block-structured exception handling can be brought into compliance either through migration to a newer programming language (most likely, a descendant of the same language that would require minimal changes) or through the use of a COTS package that retrofits block-structured exception handling onto the previous language with minimal changes. While the latter path may at first appear to be less work, it should be noted that many library functions may need to be adapted to throw exceptions when exceptional conditions arise, whereas in a programming environment that had exceptions to begin with the analogous library functions would already do this (see Requirement Part 1: 6.4.1.5-A.1).

 

1 Comment

Comment by Sue Sautermeister (Local Election Official)

spacing is in error 6.4.1.5a
6.4.1.5-A Block-structured exception handling

Application logic SHALL handle exceptions using block-structured exception handling constructs.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

See Part 1: 6.4.1.5 "Structured programming".

Source: Extension of [VVSG2005] requirements for structured programming

  
6.4.1.5-A.1 Legacy library units must be wrapped

If application logic makes use of any COTS or third-party logic callable units that do not throw exceptions when exceptional conditions occur, those callable units SHALL be wrapped in callable units that check for the relevant error conditions and translate them into exceptions, and the remainder of application logic SHALL use only the wrapped version.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

For example, if an application written in C99 [ISO99] + cexcept [Sourceforge00] used the malloc function of libc, which returns a null pointer in case of failure instead of throwing an exception, the malloc function would need to be wrapped. Here is one possible implementation:

#include <stdlib.h>   /* malloc, size_t */
#include "cexcept.h"  /* COTS package providing Try/Catch/Throw */

void *checkedMalloc (size_t size) {
	void *ptr = malloc (size);
	if (!ptr)
		Throw bad_alloc;  /* bad_alloc: exception value declared via cexcept */
	return ptr;
}
#define malloc checkedMalloc  /* later calls to malloc use the wrapper */

Wrapping legacy functions avoids the need to check for errors after every invocation, which both obfuscates the application logic and creates a high likelihood that some or many possible errors will not be checked for. In C++, it would be preferable to use one of the newer mechanisms that already throw exceptions on failure and avoid use of legacy functions altogether.
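As a sketch of the "newer mechanisms" mentioned above: in C++, operator new and the standard containers already throw std::bad_alloc on allocation failure, so no wrapper is needed and the caller performs no null-pointer check.

```cpp
#include <new>
#include <vector>
#include <cstddef>

// std::vector's allocation throws std::bad_alloc on failure, so the
// error surfaces as an exception without any wrapping of malloc.
bool tryAllocate(std::size_t n) {
    try {
        std::vector<char> buffer(n);   // throws std::bad_alloc if exhausted
        return true;
    } catch (const std::bad_alloc&) {
        return false;
    }
}
```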

Source: New requirement

 
6.4.1.5-B Unstructured control flow is prohibited

Application logic SHALL contain no unstructured control constructs.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

See the discussion for Requirement Part 1: 6.4.1.2-A regarding border logic.

Source: Generalization and summary of [VVSG2005] I.5.2.4 and II.5.4.1

 

6.4.1.5-B.1 Arbitrary branches

Arbitrary branches (a.k.a. gotos) are prohibited.

Test Reference: Part 3: 4.5.1 "Workmanship"

Source: Generalization and summary of [VVSG2005] I.5.2.4 and II.5.4.1

 
6.4.1.5-B.2 Intentional exceptions

Exceptions SHALL only be used for abnormal conditions. Exceptions SHALL NOT be used to redirect the flow of control in normal ("non-exceptional") conditions.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

"Intentional exceptions" cannot be used as a substitute for arbitrary branch. Normal, expected events, such as reaching the end of a file that is being read from beginning to end or receiving invalid input from a user interface, are not exceptional conditions and should not be implemented using exception handlers.
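A C++ sketch of the distinction (the function is illustrative): reaching end of input is an expected event and is handled by ordinary loop control, not by an exception handler.

```cpp
#include <istream>
#include <sstream>
#include <string>
#include <vector>

// End of input is a normal, expected event, so it is handled by the
// loop condition; no exception is thrown or caught to end the loop.
std::vector<std::string> readAllLines(std::istream& in) {
    std::vector<std::string> lines;
    std::string line;
    while (std::getline(in, line))   // loop ends normally at end of input
        lines.push_back(line);
    return lines;
}
```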

Source: [VSS2002] I.4.2.4.d, II.5.4.1.c / [VVSG2005] I.5.2.4.a.iii, II.5.4.1

 
6.4.1.5-B.3 Unstructured exception handling

Unstructured exception handling (e.g., On Error GoTo, setjmp/longjmp, or explicit tests for error conditions after every executable statement) is prohibited.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

The internal use of such constructs by a COTS extension package that adds block-structured exception handling to a programming language that otherwise would not have it, as described in Requirement Part 1: 6.4.1.2-A.1, is allowed. Analogously, it is not a problem that source code written in a high-level programming language is compiled into low-level machine code that contains arbitrary branches. It is only the direct use of low-level constructs in application logic that presents a problem.

Source: Extension of [VVSG2005] requirements for structured programming

 
6.4.1.5-C Separation of code and data

Application logic SHALL NOT compile or interpret configuration data or other input data as a programming language.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

The requirement in [VVSG2005] read "Operator intervention or logic that evaluates received or stored data shall not re-direct program control within a program routine." That attempt to define what it means to compile or interpret data as a programming language caused confusion.

Distinguishing what is a programming language from what is not requires some professional judgment. However, in general, sequential execution of imperative instructions is a characteristic of conventional programming languages that should not be exhibited by configuration data. Configuration data must be declarative or informative in nature, not imperative.

For example: it is permissible for configuration data to contain a template that informs a report generating application as to the form and content of a report that it should generate, but it is not permissible for configuration data to contain instructions that are executed or interpreted to generate a report, essentially embedding the logic of the report generator inside the configuration data.
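A sketch of the permissible case (the placeholder syntax and function name are ours): the configuration supplies a purely declarative template, and all imperative logic for filling it resides in the application.

```cpp
#include <cstddef>
#include <map>
#include <string>

// The template is pure data: placeholders of the form {name} describe
// the report's shape but contain no executable instructions. All the
// imperative logic lives here, in application code.
std::string fillTemplate(const std::string& tmpl,
                         const std::map<std::string, std::string>& values) {
    std::string out;
    std::size_t pos = 0;
    while (pos < tmpl.size()) {
        std::size_t open = tmpl.find('{', pos);
        if (open == std::string::npos) { out += tmpl.substr(pos); break; }
        std::size_t close = tmpl.find('}', open);
        if (close == std::string::npos) { out += tmpl.substr(pos); break; }
        out += tmpl.substr(pos, open - pos);
        auto it = values.find(tmpl.substr(open + 1, close - open - 1));
        out += (it != values.end()) ? it->second : std::string();
        pos = close + 1;
    }
    return out;
}
```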

The reasons for this requirement are (1) mingling code and data is bad design, and (2) embedding logic within configuration data is an evasion of the conformity assessment process for application logic.

See also Requirement Part 1: 6.4.1.7-A.3 and Requirement Part 1: 6.4.1.7-A.4.

Source: Clarification of [VSS2002] I.4.2.4.d and II.5.4.1.c / [VVSG2005] I.5.2.4.a.iii and II.5.4.1 paragraph 4

 

6.4.1.6 Comments

6.4.1.6-A Header Comments

Application logic modules SHOULD include header comments that provide at least the following information for each callable unit (function, method, operation, subroutine, procedure, etc.):

  1. The purpose of the unit and how it works (if not obvious);
  2. A description of input parameters, outputs and return values, exceptions thrown, and side-effects;
  3. Any protocols that must be observed (e.g., unit calling sequences);
  4. File references by name and method of access (read, write, modify, append, etc.);
  5. Global variables used (if applicable);
  6. Audit event generation;
  7. Date of creation; and
  8. Change log (revision record).

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

Header comments and other commenting conventions should be specified by the selected coding conventions in a manner consistent with the idiom of the programming language chosen. If the coding conventions specify a coding style and commenting convention that make header comments redundant, then they may be omitted. Otherwise, in the event that the coding conventions fail to specify the content of header comments, the non-redundant portions of this generic guideline should be applied.

Change logs need not cover the nascent period, but they must go back as far as the first baseline or release that is submitted for testing, and should go back as far as the first baseline or release that is deemed reasonably coherent.
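As a sketch of a header comment covering the items listed above (the function, dates, and change-log entries are invented for illustration):

```cpp
#include <stdexcept>

/*
 * addVote -- increments the stored tally for one contest choice.
 * Inputs:   tally, the current count for the choice.
 * Returns:  the new tally (tally + 1).
 * Throws:   std::overflow_error if the counter would exceed its maximum.
 * Protocol: caller must hold the record lock before invoking.
 * Files:    none accessed.
 * Globals:  none used.
 * Audit:    caller is expected to log a "vote recorded" audit event.
 * Created:  2007-03-01 (illustrative date).
 * Changes:  2007-03-15 added overflow check (illustrative entry).
 */
long addVote(long tally) {
    if (tally == 2147483647L)   // illustrative counter maximum
        throw std::overflow_error("vote counter at maximum");
    return tally + 1;
}
```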

Source: Revised from [VSS2002] I.4.2.7.a

 

2 Comments

Comment by Carl Hage (General Public)

The requirements for commenting code are inadequate. Besides header comments, the code must contain comments that clearly describe the operation of the code such that a reader can easily understand what is happening and readily identify errors or shortcomings. When data requires checking (e.g. parameter string length) comments must explain what has been checked and what needs to be checked. Every variable should be documented. Documentation in headers should be suitable to write test programs. Comments should explain concepts, not just repeat the code. Object-oriented code where the behavior of a routine is inherited requires special consideration. Detailed comments may need to be duplicated, and all inherited behavior must be well documented. Typical object-oriented code is often so complex and undocumented, using grep over dozens of modules is required to find source code to identify behavior. It may be easy to re-use code and inherit behavior, but that does not necessarily lead to documented code and behavior. The level of commenting should be reviewed and rated.

Comment by Premier Election Solutions (Manufacturer)

This requirement should only apply to callable units that perform some non-trivial operations. Callable units that simply wrap data accessing (i.e., properties) or provide simplistic operations (sum, min, etc.) should not require header comments. Proposed Change: Change the requirement to read as follows: Application logic modules SHOULD include header comments that provide at least the following information for each callable unit (function, method, operation, subroutine, procedure, etc.) that provides non-trivial operation:

6.4.1.7 Executable code and data integrity

Portions of this section are from or derived from [P1583], as noted in requirements and discussion text[3],[4].

 

1 Comment

Comment by Brian V. Jarvis (Local Election Official)

Recommend that COTS software should not be used in the voting machine if the source code is unobtainable. Otherwise workmanship, integrity, and maintainability of applications is unverifiable which undermines all the code and data integrity rules being implemented.
6.4.1.7-A Code coherency

Application logic SHALL conform to the following subrequirements.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

This scopes the following subrequirements to application logic. For COTS software whose source code is unobtainable, they would be unverifiable.

 

1 Comment

Comment by Brian V. Jarvis (Local Election Official)

In addition to the code rules mentioned in this section and sub-sections, recommend that additional code rules be established for the following: - dead code - deactivated code - patches/patched code
6.4.1.7-A.1 Self-modifying code

Self-modifying code is prohibited.

Test Reference: Part 3: 4.5.1 "Workmanship"

Source: [VSS2002] I.4.2.2

 
6.4.1.7-A.2 Unsafe concurrency

Application logic SHALL be free of race conditions, deadlocks, livelocks, and resource starvation.

Test Reference: Part 3: 3.1 "Inspection", 3.2 "Functional Testing"

Source: New requirement

 

1 Comment

Comment by Gail Audette (Voting System Test Laboratory)

Unsafe concurrency is able to be checked by some of the static analysis tools on the market, but only for the more recent languages. How does this requirement apply to the older languages currently in use?
6.4.1.7-A.3 Code integrity, no strange compilers

If compiled code is used, it SHALL only be compiled using a COTS compiler.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

This prohibits the use of arbitrary, nonstandard compilers and consequently the invention of new programming languages.

Source: New requirement

 

2 Comments

Comment by Richard Carback (Academic)

The compiler should be independently acquired by test labs, and not provided by the vendor.

Comment by Craig Burton, CTO, EveryoneCounts.com (Manufacturer)

This comment also applies to 6.4.1.7-A.4 Interpreted code, specific COTS interpreter We assert that a known trusted compiler or interpreter is needed to avert the situation in the classic ACM Reflections On Trusting Trust (http://www.ece.cmu.edu/~ganger/712.fall02/papers/p761-thompson.pdf). That is, NIST or EAC should offer to compile the provider's code or similarly furnish the compiled interpreter the provider will use.
6.4.1.7-A.4 Interpreted code, specific COTS interpreter

If interpreted code is used, it SHALL only be run under a specific, identified version of a COTS runtime interpreter.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

This ensures that (1) no arbitrary, nonstandard interpreted languages are used, and (2) the software tested and approved during the conformity assessment process does not change behavior because of a change to the interpreter.

Source: [P1583] Section 5.6.2.2

 

2 Comments

Comment by Carl Hage (General Public)

Usage of COTS software (or unnecessarily complex application software) should be minimized. Using an interpreted language in a voting or ballot counting device should be prohibited (requiring verification of the interpreter) as an unnecessary security risk and source of unverifiable code. An EMS may have a complexity that is best addressed via an interpreted language, but the output (e.g. election definitions) should be public and verifiable.

Comment by ACCURATE (Aaron Burstein) (Academic)

The current VVSG draft requires any interpreted code to "run under a specific, identified version of a COTS runtime interpreter" (Part 1:6.4.1.7-A.4). At least one recently developed voting system prototype (Ka-Ping Yee's PVote) makes extensive use of interpreted code that runs under a custom interpreter. This system offers the prospect of dramatically simplifying the code necessary to support voting; but, under plausible readings of the VVSG draft's interpreted code restrictions, this system would not conform with the VVSG. Though we recognize the difficulties that interpreted code poses with respect to assuring the integrity of voting system software, such promising systems warrant a closer look at the interpreted code restriction. We recommend that the Commission reconsider the VVSG draft's basic approach to interpreted code, perhaps with input from NIST or an ancillary process for establishing related guidelines. We also recommend that the EAC explore using the innovation class (see ACCURATE's comments on Part 1:2.7.2) to manage the certification of voting systems that contain interpreted code under a more permissive standard than the current 1:6.4.1.7-A.4.
6.4.1.7-B Prevent tampering with code

Programmed devices SHALL prevent replacement or modification of executable or interpreted code (e.g., by other programs on the system, by people physically replacing the memory or medium containing the code, or by faulty code) except where this access is necessary to conduct the voting process.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

This requirement may be partially satisfied through a combination of read-only memory (ROM), the memory protection implemented by most popular COTS operating systems, error checking as described in Part 1: 6.4.1.8 "Error checking", and access and integrity controls.
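One building block for the integrity controls mentioned above can be sketched as a simple digest comparison (FNV-1a is used only for illustration; a deployed system would use a cryptographic hash and a protected reference value):

```cpp
#include <cstddef>
#include <cstdint>

// FNV-1a digest over a code image. Comparing against a reference value
// recorded at installation detects accidental or naive modification.
// Illustrative only: real integrity controls need a cryptographic hash.
std::uint64_t fnv1a(const unsigned char* data, std::size_t len) {
    std::uint64_t h = 1469598103934665603ULL;   // FNV offset basis
    for (std::size_t i = 0; i < len; ++i) {
        h ^= data[i];
        h *= 1099511628211ULL;                  // FNV prime
    }
    return h;
}

bool imageUnmodified(const unsigned char* image, std::size_t len,
                     std::uint64_t expected) {
    return fnv1a(image, len) == expected;
}
```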

Source: Rewording/expansion of [VSS2002] I.4.2.2

 

3 Comments

Comment by Gail Audette (Voting System Test Laboratory)

How can a device prevent replacement or modification of executable code? Does this then disallow code upgrades to systems?

Comment by Premier Election Solutions (Manufacturer)

In general it is not possible to prevent modification but it is possible to detect modification and prevent the system from using modified data. This section should be changed from prevent to detect. Proposed Change: Change the requirement to read as follows: 6.4.1.7-B Detect tampering with code Programmed devices SHALL detect replacement or modification of executable or interpreted code (e.g., by other programs on the system, by people physically replacing the memory or medium containing the code, or by faulty code) except where this access is necessary to conduct the voting process.

Comment by Craig Burton, CTO, EveryoneCounts.com (Manufacturer)

This comment also concerns 6.4.1.7-C Prevent tampering with data. These are very important, and we think the standards visit these more briefly than they visit programming conventions. The standards don't seem to make reference to the software engineering information security standards BS7799 (ISO 27001), ISO 17799, or ISO 90003:2004. DRE devices could greatly benefit from a locked-in internal WORM device. This would trap all kinds of errors or intrusions and could be removed with keys and inspected at random. It could capture all votes, logs, events and so on to a DVD-R UDF disk. Even the system software could come off this disk.
6.4.1.7-C Prevent tampering with data

All voting devices SHALL prevent access to or manipulation of configuration data, vote data, or audit records (e.g., by physical tampering with the medium or mechanism containing the data, by other programs on the system, or by faulty code) except where this access is necessary to conduct the voting process.

Applies To: Voting device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

This requirement may be partially satisfied through a combination of the memory protection implemented by most popular COTS operating systems, error checking as described in Part 1: 6.4.1.8 "Error checking", and access and integrity controls. Systems using mechanical counters to store vote data must protect the counters from tampering. If vote data are stored on paper, the paper must be protected from tampering. Modification of audit records after they are created is never necessary.

Source: Rewording/expansion of [VSS2002] I.4.2.2

 

2 Comments

Comment by Gail Audette (Voting System Test Laboratory)

How can a device prevent replacement or modification of executable code? Does this then disallow code upgrades to systems?

Comment by Premier Election Solutions (Manufacturer)

In general it is not possible to prevent modification but it is possible to detect modification and prevent the system from using modified data. This section should be changed from prevent to detect. Proposed Change: Change the requirement to read as follows: 6.4.1.7-C Detect tampering with data All voting devices SHALL detect access to or manipulation of configuration data, vote data, or audit records (e.g., by physical tampering with the medium or mechanism containing the data, by other programs on the system, or by faulty code) except where this access is necessary to conduct the voting process.
6.4.1.7-D Monitor I/O errors

Programmed devices SHALL provide the capability to monitor the transfer quality of I/O operations, reporting the number and types of errors that occur and how they were corrected.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

Source: [VSS2002] I.2.2.2.1.e

 

6.4.1.8 Error checking

This section contains requirements for application logic to avoid, detect, and prevent well-known types of errors that could compromise voting integrity and security[5],[6]. Additional advice from the security perspective is available at [CERT06] and related sites, esp. [DHS06].

 

1 Comment

Comment by Diane Gray (Voting System Test Laboratory)

Reference to IEEE Standard P1583 (IEEE Draft Standard for the Evaluation of Voting Equipment…) is cited. However, Appendix A End Notes No. 5 states the material is from an unapproved draft of the proposed IEEE Standard…and IEEE recommends it not be utilized for any conformance/compliance purposes. Question: if it is unapproved, why is it cited in this document?
6.4.1.8-A Detect garbage input

Programmed devices SHALL check information inputs for completeness and validity.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

This general requirement applies to all programmed devices, while the specific ones following are only enforceable for application logic.

Source: [NIST05] [S-I-10]

 

1 Comment

Comment by Gail Audette (Voting System Test Laboratory)

How is this tested in a static analysis or white box?
6.4.1.8-A.1 Defend against garbage input

Programmed devices SHALL ensure that incomplete or invalid inputs do not lead to irreversible error.

Test Reference: Part 3: 4.5.1 "Workmanship"

Source: [VSS2002] I.2.2.5.2.2.f

 

1 Comment

Comment by Gail Audette (Voting System Test Laboratory)

How is this tested in a static analysis or white box?
6.4.1.8-B Mandatory internal error checking

Application logic that is vulnerable to the following types of errors SHALL check for these errors at run time and respond defensively (as specified by Requirement Part 1: 6.4.1.8-F) when they occur:

  1. Out-of-bounds accesses of arrays or strings (includes buffers used to move data);
  2. Stack overflow errors;
  3. CPU-level exceptions such as address and bus errors, dividing by zero, and the like;
  4. Variables that are not appropriately handled when out of expected boundaries;
  5. Numeric overflows; or
  6. Known programming language specific vulnerabilities.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

It is acceptable, even expected, that logic verification will show that some error checks cannot logically be triggered and some exception handlers cannot logically be invoked. These checks and exception handlers are not redundant – they provide defense-in-depth against faults that escape detection during logic verification.

See also Requirement Part 1: 7.5.6-A.

Source: [P1583] Section 5.6.2.2 expansion of [VSS2002] I.4.2.2, modified

 
6.4.1.8-B.1 Array overflows

If the application logic uses arrays, vectors, or any analogous data structures and the programming language does not provide automatic run-time range checking of the indices, the indices SHALL be range-checked on every access.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

Range checking code should not be duplicated before each access. Clean implementation approaches include:

  1. Consistently using dedicated accessors (functions, methods, operations, subroutines, procedures, etc.) that range-check the indices;
  2. Defining and consistently using a new data type or class that encapsulates the range-checking logic;
  3. Declaring the array using a template that causes all accessors to be range-checked; or
  4. Declaring the array index to be a data type whose enforced range is matched to the size of the array.

Range-enforced data types or classes may be provided by the programming environment or they may be defined in application logic.

If acceptable values of the index do not form a contiguous range, a map structure may be more appropriate than a vector.

Source: Expansion of [VSS2002] I.4.2.2
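Approach 1 from the discussion can be sketched as follows: a single dedicated accessor performs the range check, so the comparison is never duplicated before each access. (In standard C++, std::vector::at provides the same guarantee.)

```cpp
#include <cstddef>
#include <stdexcept>
#include <vector>

// A dedicated accessor that range-checks the index on every access,
// rather than repeating the bounds test at each call site.
template <typename T>
T& checked_at(std::vector<T>& v, std::size_t i) {
    if (i >= v.size())
        throw std::out_of_range("array index out of range");
    return v[i];
}
```
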

 

1 Comment

Comment by E Smith/M Faulk (Manufacturer)

Per the draft VVSG: Range checking code should not be duplicated before each access. This clause is difficult to reconcile with the remainder of this section.
6.4.1.8-B.2 Stack overflows

If stack overflow does not automatically result in an exception, the application logic SHALL explicitly check for and prevent stack overflow.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

Embedded system developers use a variety of techniques for avoiding stack overflow. Commonly, the stack is monitored and warnings and exceptions are thrown when thresholds are crossed. In non-embedded contexts, stack overflow often manifests as a CPU-level exception related to memory segmentation, in which case it can be handled pursuant to Requirement Part 1: 6.4.1.8-B.3 and Requirement Part 1: 6.4.1.9-D.2.

Source: Added precision
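One portable software-level technique, offered here only as a sketch (the depth budget kMaxDepth is an assumption, and real systems would derive it from the platform's stack size), is to track recursion depth explicitly and throw before the machine stack can be exhausted:

```cpp
#include <stdexcept>

// Assumed recursion budget; in practice this would be sized against the
// platform's actual stack allocation with a safety margin.
constexpr int kMaxDepth = 10000;

// Recursive routine instrumented with an explicit depth guard, so stack
// exhaustion surfaces as a catchable exception rather than a crash.
long factorial(int n, int depth = 0) {
    if (depth > kMaxDepth)
        throw std::runtime_error("recursion depth limit exceeded");
    return (n <= 1) ? 1L : n * factorial(n - 1, depth + 1);
}
```
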

 
6.4.1.8-B.3 CPU traps

The application logic SHALL implement such handlers as are needed to detect and respond to CPU-level exceptions.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

For example, under Unix a CPU-level exception would manifest as a signal, so a signal handler is needed. If the platform supports it, it is preferable to translate CPU-level exceptions into software-level exceptions so that all exceptions can be handled in a consistent fashion within the voting application; however, not all platforms support it.

Source: Added precision
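The translation the discussion describes can be sketched in POSIX terms (this assumes a Unix-like platform, and whether integer division by zero traps at all is CPU-specific): the signal handler escapes via siglongjmp to a recovery point, where an ordinary C++ exception is thrown.

```cpp
#include <setjmp.h>
#include <signal.h>
#include <stdexcept>

// Recovery point established before the operation that may trap.
static sigjmp_buf recovery_point;

// SIGFPE handler: escape the handler context back to the recovery point.
extern "C" void trap_handler(int) {
    siglongjmp(recovery_point, 1);
}

// Translate a CPU-level arithmetic trap into a software-level exception.
int checked_div(int numerator, int denominator) {
    signal(SIGFPE, trap_handler);
    if (sigsetjmp(recovery_point, 1) != 0)
        throw std::runtime_error("arithmetic trap (SIGFPE)");
    volatile int den = denominator;  // keep the division a run-time operation
    return numerator / den;          // divide-by-zero traps on many CPUs
}
```

This is a sketch of the idea, not production code: real handlers installed with sigaction, careful async-signal-safety, and per-platform trap behavior all need attention.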

 
6.4.1.8-B.4 Garbage input parameters

All scalar or enumerated type parameters whose valid ranges as used in a callable unit (function, method, operation, subroutine, procedure, etc.) do not cover the entire ranges of their declared data types SHALL be range-checked on entry to the unit.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

This applies to parameters of numeric types, character types, temporal types, and any other types for which the concept of range is well-defined.[7] In cases where the restricted range is frequently used and/or associated with a meaningful concept within the scope of the application, the best approach is to define a new class or data type that encapsulates the range restriction, eliminating the need for range checks on each use.

This requirement differs from Requirement Part 1: 6.4.1.8-A, which deals with user input that is expected to contain errors, while this requirement deals with program internal parameters, which are expected to conform to the expectations of the designer. User input errors are a normal occurrence; the errors discussed here are grounds for throwing exceptions.

Source: Elaboration on Requirement Part 1: 6.4.1.8-B.d, which is an expansion of [VSS2002] I.4.2.2
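The "new class that encapsulates the range restriction" approach can be sketched like this (the name ContestIndex and the 0..98 range are hypothetical): the check runs once, in the constructor, so callable units that accept a ContestIndex need no per-call range check.

```cpp
#include <stdexcept>

// Hypothetical range-encapsulating type: the restricted range is enforced
// at construction, so a valid-by-construction value circulates thereafter.
class ContestIndex {
public:
    explicit ContestIndex(int value) : value_(value) {
        if (value_ < 0 || value_ > 98)
            throw std::out_of_range("contest index outside 0..98");
    }
    int value() const { return value_; }
private:
    int value_;
};
```
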

 
6.4.1.8-B.5 Numeric overflows

If the programming language does not provide automatic run-time detection of numeric overflow, all arithmetic operations that could potentially overflow the relevant data type SHALL be checked for overflow.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

This requirement should be approached in a manner similar to Requirement Part 1: 6.4.1.8-B.1. Overflow checking should be encapsulated as much as possible.

Source: Added precision
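In the encapsulated style the discussion recommends, the overflow pre-condition test lives in one helper rather than being repeated at every arithmetic site. A minimal sketch for 32-bit addition:

```cpp
#include <cstdint>
#include <limits>
#include <stdexcept>

// Overflow-checked addition: the pre-condition test is written once here,
// so callers never duplicate the comparison inline.
std::int32_t checked_add(std::int32_t a, std::int32_t b) {
    if ((b > 0 && a > std::numeric_limits<std::int32_t>::max() - b) ||
        (b < 0 && a < std::numeric_limits<std::int32_t>::min() - b))
        throw std::overflow_error("32-bit addition would overflow");
    return a + b;
}
```

Analogous helpers cover subtraction and multiplication; some toolchains also provide built-ins (e.g., compiler overflow intrinsics) that can serve the same purpose.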

 
6.4.1.8-C Recommended internal error checking

Application logic that is vulnerable to the following types of errors SHOULD check for these errors at run time and respond defensively (as specified by Requirement Part 1: 6.4.1.8-F) when they occur:

  1. Pointer variable errors; and
  2. Dynamic memory allocation and management errors

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

Source: [P1583] Section 5.6.2.2 expansion of [VSS2002] I.4.2.2, modified

 
6.4.1.8-C.1 Pointers

If application logic uses pointers or a similar mechanism for specifying absolute memory locations, the application logic SHOULD validate pointers or addresses before they are used.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

Improper overwriting should be prevented in general as required by Requirement Part 1: 6.4.1.7-B and Requirement Part 1: 6.4.1.7-C. Nevertheless, even if read-only memory would prevent the overwrite from succeeding, an attempted overwrite indicates a logic fault that must be corrected.

Pointer use that is fully encapsulated within a standard platform library is treated as COTS software.

Source: Slight revision of [P1583] 6.6.4.2.e

 
6.4.1.8-D Memory mismanagement

If dynamic memory allocation is performed in application logic, the application logic SHOULD be instrumented and/or analyzed with a COTS tool for detecting memory management errors.

Applies To: Programmed device

Test Reference: Part 3: 4.4 "Manufacturer Practices for Quality Assurance and Configuration Management"

DISCUSSION

Dynamic memory allocation that is fully encapsulated within a standard platform library is treated as COTS software. This is "should" not "shall" only because such tooling may not be available or applicable in all cases. See [Valgrind07] discussion of supported platforms and the barriers to portability.

 

1 Comment

Comment by Kevin Wilson (Voting System Test Laboratory)

Discussion states that "system developers should test" and Test Reference references code review. How is a VSTL expected to test this requirement?
6.4.1.8-E Nullify freed pointers

If pointers are used, any pointer variables that remain within scope after the memory they point to is deallocated SHALL be set to null or marked as invalid (pursuant to the idiom of the programming language used).

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

If this is not done automatically by the programming environment, a callable unit should be dedicated to the task of deallocating memory and nullifying pointers. Equivalently, "smart pointers" like the C++ std::auto_ptr can be used to avoid the problem. One should not add assignments after every deallocation in the source code.

In languages using garbage collection, memory is not deallocated until all pointers to it have gone out of scope, so this requirement is moot.

Source: New requirement
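The "dedicated callable unit" the discussion suggests can be sketched as follows; freeing and nullifying happen together, so no separate "set to null" line is needed after each delete. (In modern C++, std::unique_ptr, the successor to std::auto_ptr, makes the dangling state unrepresentable in the first place.)

```cpp
// Dedicated deallocation unit: deletes the object and nullifies the
// caller's pointer variable in a single operation.
template <typename T>
void destroy(T*& p) {
    delete p;        // deleting a null pointer is a safe no-op
    p = nullptr;     // the variable can no longer dangle
}
```
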

 
6.4.1.8-F React to errors detected

The detection of any of the errors enumerated in Requirement Part 1: 6.4.1.8-B and Requirement Part 1: 6.4.1.8-C SHALL be treated as a complete failure of the callable unit in which the error was detected. An appropriate exception SHALL be thrown and control SHALL pass out of the unit forthwith.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

 
6.4.1.8-G Do not disable error checks

Error checks detailed in Requirement Part 1: 6.4.1.8-B and Requirement Part 1: 6.4.1.8-C SHALL remain active in production code.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

These errors are incompatible with voting integrity, so masking them is unacceptable.

Manufacturers should not implement error checks using the C/C++ assert() macro. It is often disabled, sometimes automatically, when software is compiled in production mode. Furthermore, it does not appropriately throw an exception, but instead aborts the program.

"Inevitably, the programmed validity checks of the defensive programming approach will result in run-time overheads and, where performance demands are critical, many checks are often removed from the operational software; their use is restricted to the testing phase where they can identify the misuse of components by faulty designs. In the context of producing complex systems which can never be fully tested, this tendency to remove the protection afforded by programmed validity checks is most regrettable and is not recommended here." [Moulding89]
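A minimal always-active alternative to assert(), sketched here under the requirement's own terms (the helper name require is an assumption): the check cannot be compiled out by NDEBUG, and it throws, per Requirement Part 1: 6.4.1.8-F, instead of aborting the program.

```cpp
#include <stdexcept>

// Production-safe validity check: unlike assert(), it survives release
// builds and raises a catchable exception rather than calling abort().
inline void require(bool condition, const char* message) {
    if (!condition)
        throw std::logic_error(message);
}
```
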

 
6.4.1.8-H Roles authorized to respond to errors

Exceptions resulting from failed error checks or CPU-level exceptions SHALL require intervention by an election official or administrator before voting can continue.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

These errors are incompatible with voting integrity, so masking them is unacceptable.

 
6.4.1.8-I Diagnostics

Electronic devices SHALL include a means of identifying device failure and any corrective action needed.

Applies To: Electronic device

Test Reference: Part 3: 4.5.1 "Workmanship"

Source: Generalized from [VSS2002] I.2.4.1.2.2.c and I.2.4.1.3.d

 
6.4.1.8-J Equipment health monitoring

Electronic devices SHOULD proactively detect equipment failures and alert an election official or administrator when they occur.

Applies To: Electronic device

Test Reference: Part 3: 4.5.1 "Workmanship"

Source: Response to Issue #2147

 
6.4.1.8-K Election integrity monitoring

To the extent possible, electronic devices SHALL proactively detect or prevent basic violations of election integrity (e.g., stuffing of the ballot box or the accumulation of negative votes) and alert an election official or administrator if they occur.

Applies To: Electronic device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

Equipment can only verify those conditions that are within the scope of what the equipment does. However, insofar as the equipment can detect something that is blatantly wrong, it should do so and raise the alarm. This provides defense-in-depth to supplement procedural controls and auditing practices.

Source: Response to Issue #2147

 

6.4.1.9 Recovery

For specific requirements regarding misfed paper ballots or hangs during the vote-casting function, see Requirement Part 1: 3.2.2.1-F and Requirement Part 1: 3.2.2.2-F, Requirement Part 1: 7.7.4-A and Requirement Part 1: 7.7.4-B.

 
6.4.1.9-A System shall survive device failure

All systems SHALL be capable of resuming normal operation following the correction of a failure in any device.

Applies To: Voting system

Test Reference: Part 3: 4.5.1 "Workmanship"

Source: Extrapolated from [VSS2002] I.2.2.3

 

1 Comment

Comment by Brian V. Jarvis (Local Election Official)

Recommend that this requirement be modified to require manufacturers implement: - a quality management system certified to ISO 9001:2000 - QA & CM programs that conform to ISO 9000:2005 and ISO 10007:2003
6.4.1.9-B Failures shall not compromise voting or audit data

Exceptions and system recovery SHALL be handled in a manner that protects the integrity of all recorded votes and audit log information.

Test Reference: Part 3: 4.5.1 "Workmanship"

Source: Extracted and generalized from [VSS2002] I.4.2.3.e

 
6.4.1.9-C Device shall survive component failure

All voting devices SHALL be capable of resuming normal operation following the correction of a failure in any component (e.g., memory, CPU, ballot reader, printer) provided that catastrophic electrical or mechanical damage has not occurred.

Applies To: Voting device

Test Reference: Part 3: 4.5.1 "Workmanship"

Source: Reworded from [VSS2002] I.2.2.3.b and c

 
6.4.1.9-D Controlled recovery

Error conditions SHALL be corrected in a controlled fashion so that system status may be restored to the initial state existing before the error occurred.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

"Initial state" refers to the state existing at the start of a logical transaction or operation. Transaction boundaries must be defined in a conscientious fashion to minimize the damage. Language changed to "may" because election officials responding to the error condition might want the opportunity to select a different state (e.g., controlled shutdown with memory dump for later analysis).

Source: Generalization from [VSS2002] I.2.2.5.2.2.g.

 
6.4.1.9-D.1 Nested error conditions

Nested error conditions that are corrected without reset, restart, reboot, or shutdown of the voting device SHALL be corrected in a controlled sequence so that system status may be restored to the initial state existing before the first error occurred.

Test Reference: Part 3: 4.5.1 "Workmanship"

Source: Slight relaxation of [VSS2002] I.2.2.5.2.2.g

 
6.4.1.9-D.2 Reset CPU error states

CPU-level exceptions that are corrected without reset, restart, reboot, or shutdown of the voting device SHALL be handled in a manner that restores the CPU to a normal state and allows the system to log the event and recover as with a software-level exception.

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

System developers should test to see how CPU-level exceptions are handled and make any changes necessary to ensure robust recovery. Invocation of any other error routine while the CPU is in an exception handling state is to be avoided – software error handlers often do not operate as intended when the CPU is in an exception handling state.

If the platform supports it, it is preferable to translate CPU-level exceptions into software-level exceptions so that all exceptions can be handled in a consistent fashion within the voting application; however, not all platforms support it.

Source: Added precision

 

1 Comment

Comment by Gail Audette (Voting System Test Laboratory)

"Please define difficult - fingernails, scissors or blow torch? Please define tamper resistant."
6.4.1.9-E Coherent checkpoints

When recovering from non-catastrophic failure of a device or from any error or malfunction that is within the operator's ability to correct, the system SHALL restore the device to the operating condition existing immediately prior to the error or failure, without loss or corruption of voting data previously stored in the device.

Applies To: Programmed device

Test Reference: Part 3: 4.5.1 "Workmanship"

DISCUSSION

If, as discussed in Requirement Part 1: 6.4.1.9-D, the system is left in something other than the last known good state for diagnostic reasons, this requirement clarifies that it must revert to the last known good state before being placed back into service.

Source: [VSS2002] I.2.2.3.a

 

6.4.2 Quality assurance and configuration management

The quality assurance and configuration management requirements discussed in this section help assure that voting systems conform to the requirements of the VVSG. Quality Assurance is a manufacturer function with associated practices that is initiated prior to system development and continues throughout the maintenance life cycle of the voting system. Quality Assurance focuses on building quality into a system and reducing dependence on system tests at the end of the life cycle to detect deficiencies, thus helping ensure that the system:

  • Meets stated requirements and objectives;
  • Adheres to established standards and conventions;
  • Functions consistent with related components and meets dependencies for use within the jurisdiction; and
  • Reflects all changes approved during its initial development, internal testing, qualification, and, if applicable, additional certification processes.

Configuration management is a set of activities and associated practices that ensures full knowledge and control of the components of a system, starting with its initial development, progressing through its ongoing maintenance and enhancement, and including its operational life cycle.

 

6.4.2.1 Standards based framework for Quality Assurance and Configuration Management

The requirement in this section establishes the quality assurance and configuration management standards to which voting system manufacturers must conform. The requirement to develop a Quality and Configuration Management manual, and the detailed requirements on that manual, are contained in Part 2, Chapter 2.

 
6.4.2.1-A List of standards

Voting system manufacturers SHALL implement a quality assurance and configuration management program that is conformant with the recognized ISO standards in these areas:

  1. ISO 9000:2005 [ISO05];
  2. ISO 9001:2000 [ISO00]; and
  3. ISO 10007:2003 [ISO03].
 

1 Comment

Comment by Premier Election Solutions (Manufacturer)

Putting contact, address, and certification information on the label is not appropriate as that information may change with time. Proposed Change: Change item (c) to read as follows: c. Identification of the manufacturer; and

6.4.2.2 Configuration Management requirements

This section specifies the key configuration management requirements for voting system manufacturers. The requirements include those for equipment tags and configuration logs. Continuation of the program, in the form of usage logs, is the responsibility of State and local officials.

 
6.4.2.2-A Identification of systems

Each voting system SHALL have an identification tag that is attached to the main body.

Applies To: Voting system

Test Reference: Part 3: 3.1 "Inspection", 4.4.2 "Examination of voting systems submitted for testing"

Source: New requirement

 
6.4.2.2-A.1 Secure tag

The tag SHALL be tamper-resistant and difficult to remove.

Applies To: Voting system

Test Reference: Part 3: 3.1 "Inspection", 4.4.2 "Examination of voting systems submitted for testing"

Source: New requirement

 

1 Comment

Comment by Premier Election Solutions (Manufacturer)

Requiring a list of all of the critical components does not seem realistic nor practical for the Log, nor would most election administrators understand what this information is for. Please clarify the intent for requiring a list of critical components in the log. If the intent is not clear, then remove this requirement.
6.4.2.2-A.2 Tag contents

The tag SHALL contain the following information:

  1. The voting system model identification in the form of a model number and possibly a model name. The model identification specifies the exact variant or version of the system;
  2. The serial number that uniquely identifies the system;
  3. Identification of the manufacturer, including address and contact information for technical service, and manufacturer certification information; and
  4. Date of manufacture of the voting system.

Applies To: Voting system

Test Reference: Part 3: 3.1 "Inspection", 4.4.2 "Examination of voting systems submitted for testing"

Source: New requirement

 

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

"Proper workmanship" is subjective and not testable.
6.4.2.2-B The Voting System Configuration Log

For each voting system manufactured, a Voting System Configuration Log SHALL be established.

Applies To: Voting system

Test Reference: Part 3: 3.1 "Inspection", 4.4.2 "Examination of voting systems submitted for testing"

DISCUSSION

The Log is initialized with the configuration data supplied by the manufacturer. From that point on, it functions like a diary of the system. Entries are made by election officials whenever any change occurs: every exception, disruption, anomaly, and failure is recorded. Every time the cover is opened for inspection, or a repair or maintenance is performed, an entry details what was done, which component was replaced with which other component, and any diagnosis of failures or exceptions.

Source: New requirement

 
6.4.2.2-B.1 Contents

The Log SHALL contain the following information:

  1. The information on the system tag described in Requirement 6.4.2.2-A.2;
  2. The identification of all critical parts, components, and assemblies of the system; and
  3. The complete historical record, as developed by the manufacturer per Requirement Part 2: 2.1-A.12, of all critical parts, components, and assemblies included in the voting system.

Applies To: Voting system

Test Reference: Part 3: 3.1 "Inspection", 4.4.2 "Examination of voting systems submitted for testing"

DISCUSSION

The list of critical parts, components, and assemblies should be consistent with the rules for determining which of these entities is critical, as specified in the Quality and Configuration Manual. See Requirement Part 2: 2.1-A.6.

Source: New requirement

 

2 Comments

Comment by Carl Hage (General Public)

Missing is a requirement to prevent bloatware. Reviewers of the system should rate the complexity of the software and reliance of external COTS software. COTS and general subsystems should only be used where necessary-- minimizing the total amount of code. Choices in system design can dramatically affect the amount of software used. For example, a tab-delimited data file could be used, easily readable and writable by simple custom written software-- easy to review. On the other hand data could be stored in an SQL database, requiring a huge unverifyable software system. The GUI code could be self-contained, drawing characters or lines on the screen at pixel locations, or 5 layers of GUI subsystems could be used. For a vote collecting or counting system, COTS and external subsystems should be minimized. All source code should be made available to the certification labs. Unnecessary COTS components should be removed. Reviewers should count the total number of lines of code in the operating system and total lines of code (or KB of non-comment code) in the application including COTS modules. Reviewers should rate the design as compact or unnecessarily large or complex. If COTS is used and cannot be reviewed and tested along with the application, a demerit should be issued. Simpler (smaller) self-contained code is better. While some believe "reusing" code is the ultimate goal, security, reliability, and observability require minimizing external subsystems and unneeded software.

Comment by Frank Padilla (Voting System Test Laboratory)

Is this a lab or manufacturer's requirement to verify?

The Log SHALL be kept on a medium that allows the writing, but not the modification or deletion, of records.

Applies To: Voting system

Test Reference: Part 3: 3.1 "Inspection", 4.4.2 "Examination of voting systems submitted for testing"

Source: New requirement

 

6.4.3 General build quality

6.4.3-A General build quality

All manufacturers of voting systems SHALL practice proper workmanship.

Applies To: Voting system

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

Source: New requirement

 

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

How is this tested? HALT testing? The requirement is very subjective.
6.4.3-A.1 High quality products

All manufacturers SHALL adopt and adhere to practices and procedures to ensure that their products are free from damage or defect that could make them unsatisfactory for their intended purpose.

Applies To: Voting system

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

Source: [VSS2002] I.3.4.7.a / [VVSG2005] I.4.3.7.a

 
6.4.3-A.2 High quality parts

All manufacturers SHALL ensure that components provided by external suppliers are free from damage or defect that could make them unsatisfactory or hazardous when used for their intended purpose.

Applies To: Voting system

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

Source: [VSS2002] I.3.4.7.b / [VVSG2005] I.4.3.7.b

 
6.4.3-B Suitability of COTS Components

Manufacturers SHALL ensure that all COTS components included in their voting systems are designed to be suitable for their intended use under the requirements specified by these VVSG.

Applies To: Voting system

Test Reference: Requirement Part 3: 4.1-B

DISCUSSION

For example, if the operating and/or storage environmental conditions specified by the manufacturer of a printer do not meet or exceed the requirements of these VVSG, a system that includes that printer cannot be found conforming.

Source: New requirement

 

6.4.4 Durability

6.4.4-A Durability

Voting systems SHALL be designed to withstand normal use without deterioration for a period of ten years.

Applies To: Voting system

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

Source: [VSS2002] I.3.4.2 / [VVSG2005] I.4.3.2

 
6.4.4-B Durability of paper

Paper specified for use with the voting system SHALL conform to the applicable specifications contained within the Government Paper Specification Standards, February 1999 No. 11, or the government standards that have superseded them.

Applies To: Voting system

Test Reference: Part 3: 4.1 "Initial Review of Documentation"

DISCUSSION

This is to ensure that paper records will be of adequate quality to survive the handling necessary for recounts, audits, etc. without problematic degradation. The Government Paper Specification Standards include different specifications for different kinds of paper. As of 2007-04-05, the Government Paper Specification Standards, February 1999 No. 11, are available at http://www.gpo.gov/acquisition/paperspecs.htm [GPO99].

Source: New requirement

 

6.4.5 Maintainability

Maintainability represents the ease with which maintenance actions can be performed based on the design characteristics of equipment and software and the processes the manufacturer and election officials have in place for preventing failures and for reacting to failures. Maintainability includes the ability of equipment and software to self-diagnose problems and to make non-technical election workers aware of a problem. Maintainability addresses all scheduled and unscheduled events, which are performed to:

  • Determine the operational status of the system or a component;
  • Determine if there is a problem with the equipment and be able to take it off-line (out of service) while retaining all cast ballot data;
  • Adjust, align, tune, or service components;
  • Repair or replace a component having a specified operating life or replacement interval;
  • Repair or replace a component that exhibits an undesirable predetermined physical condition or performance degradation;
  • Repair or replace a component that has failed;
  • Ensure that, by following manufacturer protocols provided in the TDP, all repairs or replacements of devices or components during election use preserve all stored ballot data and/or election results, as appropriate; and
  • Verify the restoration of a component, or the system, to operational status.

Maintainability is determined based on the presence of specific physical attributes that aid system maintenance activities, and the ease with which the testing laboratory can perform system maintenance tasks. Although a more quantitative basis for assessing maintainability, such as the mean time to repair the system, is desirable, laboratory testing of a system is conducted before it is approved for sale and thus before a broader base of maintenance experience can be obtained.

 
6.4.5-A Electronic device maintainability

Electronic devices SHALL exhibit the following physical attributes:

  1. Labels and the identification of test points;
  2. Built-in test and diagnostic circuitry or physical indicators of condition;
  3. Labels and alarms related to failures; and
  4. Features that allow non-technicians to perform routine maintenance tasks.

Applies To: Electronic device

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

Source: [VSS2002] I.3.4.4.1 / [VVSG2005] I.4.3.4.1

 

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

What is the definition of a routine maintenance task? What would be classified as a non-technician?
6.4.5-B System maintainability

Voting systems SHALL allow for:

  1. A non-technician to easily detect that the equipment has failed;
  2. A trained technician to easily diagnose problems;
  3. Easy access to components for replacement;
  4. Easy adjustment, alignment, and tuning of components; and
  5. Low false alarm rates (i.e., indications of problems that do not exist).

Applies To: Voting system

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

Source: [VSS2002] I.3.4.4.2 / [VVSG2005] I.4.3.4.2

 

2 Comments

Comment by Gail Audette (Voting System Test Laboratory)

Provide testable benchmarks for A thru E. Easy and low are not testable.

Comment by Frank Padilla (Voting System Test Laboratory)

Very subjective. What is a trained technician? How do you test easily?
6.4.5-C Nameplate and labels

All voting devices SHALL:

  1. Display a permanently affixed nameplate or label containing the name of the manufacturer, the name of the device, its part or model number, its revision identifier, its serial number, and, if applicable, its power requirements;
  2. Display a separate data plate containing a schedule for and list of operations required to service or to perform preventive maintenance, or a reference to where this can be found in the Voting Equipment User Documentation; and
  3. Display advisory caution and warning instructions to ensure safe operation of the equipment and to avoid exposure to hazardous electrical voltages and moving parts at all locations where operation or exposure may occur.

Applies To: Voting device

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

Source: [VSS2002] I.3.4.6

 

1 Comment

Comment by Premier Election Solutions (Manufacturer)

This requirement should be reconciled with section 6.4.2.2-A.2 Tag contents. Proposed Change: Remove this requirement and merge it with 6.4.2.2-A.2.

6.4.6 Temperature and humidity

6.4.6-A Operating temperature and humidity

Voting systems SHALL be capable of operation in temperatures ranging from 5 °C to 40 °C (41 °F to 104 °F) and relative humidity from 5 % to 85 %, non-condensing.[8]

Applies To: Voting system

Test Reference: Part 3: 5.1.5 "Operating environmental testing"

Source: [P1583] 5.4.5[5]

 

6.4.7 Equipment transportation and storage

This section addresses items such as touchscreens going out of calibration, memory packs failing after delivery from the central office to the precinct, and high rates of system failure when equipment is taken out of storage.

 

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

What is transit considered to be? This is very subjective.
6.4.7-A Survive transportation

Voting devices designated for storage between elections SHALL continue to meet all applicable requirements after transit to and from the place of use.

Applies To: Voting device

Test Reference: Part 3: 5.1 "Hardware"

Source: [VSS2002] I.2.6.a / [VVSG2005] I.2.5.a, generalized

 
6.4.7-B Survive storage

Voting devices designated for storage between elections SHALL continue to meet all applicable requirements after storage between elections.

Applies To: Voting device

Test Reference: Part 3: 5.1 "Hardware"

Source: [VSS2002] I.2.6.b / [VVSG2005] I.2.5.b, generalized

 
6.4.7-C Precinct devices storage

Precinct tabulators and vote-capture devices SHALL be designed for storage in any enclosed facility ordinarily used as a warehouse, with prominent instructions as to any special storage requirements.

Applies To: Precinct tabulator, Vote-capture device

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

Source: [VSS2002] I.3.2.2.1 / [VVSG2005] I.4.1.2.1

 

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

Very subjective. "Enclosed facility ordinarily used as a warehouse" would vary greatly by region and type of product stored.
6.4.7-C.1 Design for storage and transportation

Precinct tabulators and vote-capture devices SHALL:

  1. Provide a means to safely and easily handle, transport, and install polling place equipment, such as wheels or a handle or handles; and
  2. Be capable of using, or be provided with, a protective enclosure rendering the equipment capable of withstanding (1) impact, shock and vibration loads accompanying surface and air transportation, and (2) stacking loads accompanying storage.

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

Source: [VSS2002] I.3.3.3 / [VVSG2005] I.4.2.3

 

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

Very subjective. What is "easily handle, transport and install"? This would depend on the location and system.
6.4.7-D Transportation and storage conditions benchmarks

Voting devices SHALL meet specific minimum performance requirements for transportation and storage.

Applies To: Voting device

Test Reference: Part 3: 5.1 "Hardware"

DISCUSSION

The requirements simulate exposure to physical shock and vibration associated with handling and transportation by surface and air common carriers, and to temperature conditions associated with delivery and storage in an uncontrolled warehouse environment.

Source: [VSS2002] I.3.2.2.14, modified by [P1583] 5.4.6[5]

 
6.4.7-D.1 Storage temperature

Voting devices SHALL withstand high and low storage temperatures ranging from –20 °C to 60 °C (–4 °F to 140 °F).

Applies To: Voting device

Test Reference: Part 3: 5.1 "Hardware"

Source: [VSS2002] I.3.2.2.14.a, modified by [P1583] 5.4.6.a[5]

 
6.4.7-D.2 Bench handling

Voting devices SHALL withstand bench handling equivalent to the procedure of MIL-STD-810D, Method 516.3, Procedure VI [MIL83].

Applies To: Voting device

Test Reference: Part 3: 5.1 "Hardware"

Source: [VSS2002] I.3.2.2.14.b

 
6.4.7-D.3 Vibration

Voting devices SHALL withstand vibration equivalent to the procedure of MIL-STD-810D, Method 514.3, Category 1—Basic Transportation, Common Carrier [MIL83].

Applies To: Voting device

Test Reference: Part 3: 5.1 "Hardware"

Source: [VSS2002] I.3.2.2.14.c

 
6.4.7-D.4 Storage humidity

Voting devices SHALL withstand uncontrolled humidity equivalent to the procedure of MIL-STD-810D, Method 507.2, Procedure I-Natural Hot-Humid [MIL83].

Applies To: Voting device

Test Reference: Part 3: 5.1 "Hardware"

Source: [VSS2002] I.3.2.2.14.d

 

6.5 Archival Requirements

6.5.1 Archivalness of media

See Appendix A for the definition of archivalness.

 
6.5.1-A Records last at least 22 months

All systems SHALL maintain the integrity of election management, voting and audit data, including CVRs, during an election and for a period of at least 22 months afterward, in temperatures ranging from 5 °C to 40 °C (41 °F to 104 °F) and relative humidity from 5 % to 85 %, non-condensing.

Applies To: Voting system

Test Reference: Part 3: 4.3 "Verification of Design Requirements"

DISCUSSION

See also Requirement Part 1: 6.5.2, Part 1: 6.5.3 and Requirement Part 2: 4.4.8-C.

Source: Merged from [VSS2002] I.2.2.11 and I.3.2.3.2; temperature and humidity harmonized with Requirement Part 1: 6.4.6-A

 

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

Does this include all paper products?

6.5.2 Procedures required for correct system functioning

The requirements for voting systems are written assuming that these procedures will be followed.

Statutory period of retention: All printed copy records produced by the election database and ballot processing systems must be labeled and archived for a period of at least 22 months after the election. ([VSS2002] I.2.2.11) See also Requirement Part 1: 6.5.1-A and Part 1: 6.5.3.

 

6.5.3 Period of retention (informative)

This informative section provides extended discussion for Requirement Part 1: 6.5.1-A and Part 1: 6.5.2.

United States Code Title 42, Sections 1974 through 1974e, states that election administrators must preserve for 22 months "all records and paper that came into (their) possession relating to an application, registration, payment of poll tax, or other act requisite to voting." This retention requirement applies to systems that will be used at any time for voting of candidates for federal offices (e.g., Member of Congress, United States Senator, and/or Presidential Elector). Therefore, all systems must provide for maintaining the integrity of voting and audit data during an election and for a period of at least 22 months thereafter.

Because the purpose of this law is to assist the federal government in discharging its law enforcement responsibilities in connection with civil rights and elections crimes, its scope must be interpreted in keeping with that objective. The appropriate state or local authority must preserve all records that may be relevant to the detection and prosecution of federal civil rights or election crimes for the 22-month federal retention period, if the records were generated in connection with an election that was held in whole or in part to select federal candidates. It is important to note that Section 1974 does not require that election officials generate any specific type or classification of election record. However, if a record is generated, Section 1974 comes into force and the appropriate authority must retain the records for 22 months.

For 22-month document retention, the general rule is that all printed copy records produced by the election database and ballot processing systems must be so labeled and archived. Regardless of system type, all audit trail information spelled out in Part 1: 5.7 must be retained in its original format, whether that be real-time logs generated by the system, or manual logs maintained by election personnel. The election audit trail includes not only in-process logs of election night (and subsequent processing of absentee or provisional ballots), but also time logs of baseline ballot definition formats, and system readiness and testing results.

In many voting systems, the source of election-specific data (and ballot styles) is a database or file. In precinct count systems, this data is used to program each machine, establish ballot layout, and generate tallying files. It is not necessary to retain this information on electronic media if there is an official, authenticatable printed copy of all final database information. However, it is recommended that the state or local jurisdiction also retain electronic records of the aggregate data for each device so that reconstruction of an election is possible without data re-entry. The same requirement and recommendation apply to vote results generated by each precinct device or system.

 

6.6 Integratability and Data Export/Interchange

The requirements in this section deal with making voting device interfaces and data formats transparent and interchangeable. Transparency and interchangeability allow systems and devices to work across different manufacturers and allow data to be conveniently aggregated and analyzed across different platforms. The requirements address (a) integratability of hardware and (b) common public formats for data. The requirements in this section do not address or mandate true interoperability of interfaces and data; however, they do reduce the barriers to interoperability.

Integratability deals with the physical and technical aspects of connections between systems and devices, which include hardware and firmware, protocols, etc. Basic integratability of devices is achieved through use of common, standard hardware interfaces and interface protocols such as USB. Thus, a printer port must not be proprietary; it must use a common hardware interface and interface protocol, with the goal being that printers of similar type should be interchangeable.

Systems and devices that are integratable are designed such that components of systems may be compatible or can be made compatible with each other through some moderate amount of effort, for example, by writing "glue code." For example, an audit device may be designed to work with a DRE, but it may require adaptations to protocols for signaling or data exchange. Adapting the audit interface to the DRE may require some amount of software modification but should still be within reasonable bounds.

The barriers to interoperability are further reduced if all systems support the same commonly agreed upon, publicly-available data format for ballot definition, records and reports. The advantages to using common data formats include:

  • Common formats for specifying election programming data, such as ballot definition files, promote greater accuracy and reduce duplication;
  • Common exported data formats can assist in aggregating results and conducting analyses and audits across manufacturers' systems; and
  • Common formats for use in data reports can be mapped as necessary to locality-specific reports as opposed to requiring the device to export the report in the locality-specific format.

Although these requirements do not mandate a specific standard data format, manufacturers are encouraged to use consensus-based, publicly available formats such as the OASIS Election Markup Language (EML) standard [OASIS07] or those emanating from the IEEE Voting System Electronic Data Interchange Project 1622 [P1622].
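To make the benefit of a consensus-based, machine-readable format concrete, the sketch below tallies cast vote records directly from a structured export, with no transcription from a printed report. This is illustrative only: the element and attribute names are hypothetical and are not taken from the EML schema or any manufacturer's format.

```python
import xml.etree.ElementTree as ET

# Hypothetical, EML-style cast vote record export. Element and attribute
# names here are invented for illustration, not drawn from a published schema.
CVR_EXPORT = """
<CastVoteRecords device="scanner-042" precinct="12">
  <CVR id="1">
    <Contest name="Governor"><Selection>Candidate A</Selection></Contest>
  </CVR>
  <CVR id="2">
    <Contest name="Governor"><Selection>Candidate B</Selection></Contest>
  </CVR>
</CastVoteRecords>
"""

def tally(xml_text):
    """Count selections per contest directly from the structured export."""
    counts = {}
    for contest in ET.fromstring(xml_text).iter("Contest"):
        per_contest = counts.setdefault(contest.get("name"), {})
        for sel in contest.iter("Selection"):
            per_contest[sel.text] = per_contest.get(sel.text, 0) + 1
    return counts

print(tally(CVR_EXPORT))
# → {'Governor': {'Candidate A': 1, 'Candidate B': 1}}
```

Because the export is structured rather than a print-oriented report (such as a PDF), any auditor's software can compute totals like this directly, which is the interoperability benefit the common-format requirements aim for.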

The requirements in this section mandate the following:

  • Common hardware interfaces;
  • Non-restrictive, publicly available formats for data export and interchange; and
  • Documentation for the format and for how the manufacturer has implemented it, including sample source code for reading the format.

The requirements promote, but do not mandate the following:

  • Integration of voting devices from different manufacturers;
  • Non-restrictive, publicly available formats for data export, interchange, and reports among each manufacturer’s products; and
  • Non-restrictive, publicly available formats for data export, interchange, and reports across all manufacturers’ products.
 

2 Comments

Comment by George Gilbert (Local Election Official)

I do not have the technical knowledge to evaluate the specific requirements of this section; however, in my view, establishing common IDE standards is among the most important things the VVSG could accomplish. Without common data exchange formats and interfaces, the vendors will continue to design their systems to minimize the opportunity for outside innovation (and competition) and will continue to restrict the opportunities for innovation at the state and local level.

Comment by Chris Garvey (Voter)

All voting systems should support input, output and exchange of data using a single, public, standard, self-describing format that is easy for humans to read but also easily readable by other computer software without transcription -- for example, the Election Markup Language (EML).
6.6-A Integratability of systems and devices

Systems SHALL maximize integratability with other systems and/or devices of other systems.

Applies To: Voting system

Test Reference: Part 3: 3.5 "Interoperability Testing", 4.3 "Verification of Design Requirements"

DISCUSSION

This is a goal-oriented requirement to promote interoperability of voting system devices among and across manufacturers.

Source: Generalized from database design requirements in [VSS2002] I.2.2.6 and some state RFP(s)

 

3 Comments

Comment by Frank Padilla (Voting System Test Laboratory)

Very subjective. Not testable, and what standards are to be used?

Comment by Premier Election Solutions (Manufacturer)

Integratability is a good goal; however, with the VVSG requiring that only systems tested end-to-end be considered in conformance with the guidelines, it is not likely that other systems or devices of other systems will be certified within an end-to-end system unless there is an economic incentive to do so. Proposed Change: Change this requirement from a SHALL to a SHOULD.

Comment by Premier Election Solutions (Manufacturer)

This requirement is vague and very subjective. It has a SHALL mandate, yet it sets an unknown threshold of MAXIMIZE. A system might strive toward the goal, but it would be up to the reviewer to use their subjective opinion as to whether the system passes. The VVSG introduction indicates that vague requirements from older guidelines were not used in this set of guidelines in favor of having performance-based requirements that are testable. This requirement doesn't meet that intent. The goal should remain in the guidelines, but unless there is a more defined threshold to which to test, it should be changed from a SHALL to a SHOULD. Proposed Change: Change this requirement from a SHALL to a SHOULD.
6.6-A.1 Standard device interfaces

Standard, common hardware interfaces and protocols SHALL be used to connect devices.

Applies To: Voting system

Test Reference: Part 3: 3.5 "Interoperability Testing", 4.3 "Verification of Design Requirements"

DISCUSSION

Standard hardware interfaces must be used to connect devices.

Source: VVSG 2005 Section 7.9.4

 
6.6-B Data export and exchange format

Data that is exported and exchanged between systems and devices SHALL use a non-restrictive, publicly-available format.

Applies To: Voting system

Test Reference: Part 3: 3.5 "Interoperability Testing", 4.3 "Verification of Design Requirements"

DISCUSSION

This is a goal-oriented requirement to promote interoperability of exported data and data exchanged between devices. For example, CVRs exported from different devices should use the same common format so that they can be easily aggregated for use in random audits. Reports from ballot activation devices or other devices that produce reports should use common formats that, if necessary, can be mapped to locality-specific formats.

Source: VVSG 2005 Section 7.9.3

 

17 Comments

Comment by Gail Audette (Voting System Test Laboratory)

Publicly available format needs to be defined, as this will evolve over time. As there is currently no "common consensus based format," it is inappropriate to set a requirement for one.

Comment by Richard Carback (Academic)

This document should specify the acceptable format(s) for this data.

Comment by Frank Padilla (Voting System Test Laboratory)

Very Subjective. "Publicly-available format" and "non-restrictive" need to be defined.

Comment by Robert Ferraro, SAVEourVotes.org (Advocacy Group)

The current requirement in Part 1:6.6, "non-restrictive, publicly-available format" is not specific enough. To set up a post-election audit, accurate election results data need to be exported from all voting devices in a format easy to read and easy to use in calculations; the data must be available in a usable form without transcription. Formats that cannot be easily exported and then manipulated create substantial barriers to audits and other analysis of post-election data. For instance, PDF files that are used for reporting results in several voting systems now are not ideal for quickly and accurately exporting and using data. The format must be in OASIS Election Markup Language (EML) or some other structured, machine readable format easy to export and then manipulate. It should be the same for all makes of equipment.

Comment by Mr. D. Narveson (Voter)

Yes! All voting systems should support input, output and exchange of data using a single, public, standard, self-describing format easy for people to read AND easily readable by other computer software without transcription -- for example, the Election Markup Language (EML).

Comment by Barry G. D'Orazio (General Public)

As a Computer Systems professional for over 50 years I am very much aware that all voting systems should support input, output and exchange of data using a single, public, standard, self-describing format that is easy for humans to read but also easily readable by other computer software without transcription -- for example, the Election Markup Language (EML).

Comment by Mathew Goldstein (Voter)

This provision should be strengthened. Not having data in a single, standard format is a significant barrier to election auditing. Voting systems should utilize a single, common XML-based data format for data import, export and exchange that is the same for all types and makes of equipment, such as Election Markup Language (EML).

Comment by Coleen Christensen (Voter)

To prevent problems and avoid complaints from conspiracy nuts, it will be most efficient to use a code that is commonly understood by professionals, and can be clearly explained to the interested observer. Something XML-based is ideal--and because there are many professionals who know it, competitive bids will be feasible, which will keep costs low. In addition, the completed code can be publicized for review--and the government will then have the benefit of many interested experts who will analyze the code for weak spots and make corrective suggestions at no cost to the government. Anything that might permit a private entity to have particular access to election code must be avoided--and anyone who suggests otherwise should be viewed with suspicion--they are either dupes, or selling snake-oil!

Comment by David Kirschner (Voter)

We need to settle on a nationwide format that is easily human-readable and can also be read by machine without undue transcription. This could be XML based, as there is a standard and accepted syntax.

Comment by Joshua Berk Knox (Voter)

All voting systems should support input, output and exchange of data using a single, public, standard, self-describing format that is easy for humans to read but also easily readable by other computer software without transcription -- for example, the Election Markup Language (EML). Thank You

Comment by d.s. kiefer (Voter)

A standard data-exchange format should be REQUIRED, not just "encouraged"! All voting systems, regardless of the type or make of equipment, should use a single, common, public format that is easy to read by humans and by other computer software. For example, for auditing it is extremely important that necessary data can be gotten quickly and easily, from a variety of different local jurisdictions that use different types of election equipment. Also: THROUGHOUT THESE DOCUMENTS, PLEASE REMEMBER THAT THE WORD "DATA" IS PLURAL (i.e., "data are" is correct). Thank you.

Comment by d.s. kiefer (Voter)

This language is not strong enough. Adoption of a standard data-exchange format should be a requirement, in order to make possible interoperability between different hardware components and auditing. All voting systems should utilize a single, common data format for import, export, and exchange of data; it should be the same for all types and makes of equipment.

Comment by Verified Voting Foundation (Advocacy Group)

The phrase "non-restrictive, publicly-available format" is too vague and insufficient for election needs. Good post-election audits and reporting (like the EAC's biannual Election Day Survey) require getting necessary data quickly and easily -- often from a variety of different local jurisdictions that use different types of voting equipment (e.g., for statewide and Congressional elections). Election results data thus need to be exported from all voting devices in a STANDARD format that is easy to read and easy to use in calculations. Data must be available in a usable form without transcription. Formats that cannot be easily exported, combined with data from other jurisdictions, and then manipulated create substantial barriers to audits and other analysis of post-election data. For instance, PDF and CSV (comma-separated values) files that are used for reporting results in several voting systems now are not ideal for quickly and accurately exporting and using data. Other comments on this section say that it is too subjective, uses terms that are not well defined, and even that "As there is currently no 'common consensus based format' it is inappropriate to set a requirement for one." In fact there is already a comprehensive international standard for election data that is being adopted in other parts of the world, namely the Election Markup Language (EML). Several vendors already support EML or a closely related variant that is also based on XML, and some have said they would prefer requiring EML over the current, more generic proposed requirement for a "common consensus based format." We strongly urge the EAC to strengthen that aspect of the draft 2007 VVSG by requiring voting systems to support input and output using the OASIS Election Markup Language (version 5.0 at minimum).
For a more detailed discussion of why the 2007 VVSG should require support for EML 5.0 (or higher) for all data exchange and export, see https://vvf.jot.com/EMLforVVSG and http://www.oasis-open.org/committees/download.php/26747/The%20Case%20for%20EML%20v2.pdf

Comment by Christopher Lish (General Public)

All voting systems should support input, output and exchange of data using a single, public, standard, self-describing format that is easy for humans to read but also easily readable by other computer software without transcription -- for example, the Election Markup Language (EML).

Comment by Mary Batcher (Voter)

I am a statistician working on election audits and the analysis of election results. The American Statistical Association has a Special Interest Group on Volunteering. One of its subgroups, which I am leading, is made up of statisticians throughout the US who are interested in using their analytic skills to help ensure fair and accurate elections. It is very important that election results be in a common format that is easy for both humans and computers to read without other transcription or conversion, to allow analysis of results to happen immediately after elections. To that end, all voting systems should support input, output and exchange of data using the Election Markup Language (EML) version 5.0 or higher.

Comment by katharine cartwright (Academic)

All voting systems should support input, output and exchange of data using a single, public, standard, self-describing format that is easy for humans to read but also easily readable by other computer software without transcription -- for example, the Election Markup Language (EML).

Comment by ACCURATE (Aaron Burstein) (Academic)

This requirement is critical to supporting software independence and should be included in the final guidelines.
6.6-B.1 Exchange of election programming data and report data

EMSs SHALL use a non-restrictive, publicly-available format with respect to election programming data and report data (the content of vote data reports, audit reports, etc.).

Applies To: EMS

Test Reference: Part 3: 3.5 "Interoperability Testing", 4.3 "Verification of Design Requirements"

DISCUSSION

The purpose of this requirement is to further the use of common formats for (a) the specification of election definition files and other election programming, and (b) the report data produced by the EMS, such as status and audit-related reports.

Source: Generalized from database design requirements in [VSS2002] I.2.2.6 and some state RFP(s)

 

4 Comments

Comment by Frank Padilla (Voting System Test Laboratory)

Very Subjective. "Publicly-available format" and "non-restrictive" need to be defined.

Comment by Premier Election Solutions (Manufacturer)

If an EMS database is encrypted, then the data is only accessible if it is exported to a file in a publicly available format. Although data exports to publicly available formats are common practice, using a publicly available format for storing the election programming data and report data (such as in a JET database format) has been viewed as a security risk. It would seem that this requirement was intended to apply the publicly available format to the export of the data and not to the database itself. Proposed Change: Change this requirement to the following: "EMSs SHALL use a non-restrictive, publicly-available format with respect to exports of election programming data and report data (the content of vote data reports, audit reports, etc.)."

Comment by Robert Ferraro, SAVEourVotes.org (Advocacy Group)

The draft guidelines state that the purpose of this requirement is "…for the report data produced by the EMS such as for status and audit-related reports." 1. "Non-restrictive, publicly-available format" is not specific enough. EML or equivalent should be required. 2. The data required for audit reports doesn’t just come from the EMS but also comes from pollbooks and vote capture machines so these types of machines should be included also. To reduce the cost of an audit, some data may come directly from individual vote capture machines. The cost of an audit depends on several factors, including the number of samples needed, so the cost can be reduced by having smaller audit units. For example, if a precinct has ten machines, it is more cost effective to treat each machine as a separate audit unit (whereby the machine count is compared to the paper based count associated with that machine) than to treat the entire precinct as an audit unit.

Comment by Verified Voting Foundation (Advocacy Group)

The phrase "non-restrictive, publicly-available format" is not sufficient. See further discussion in our comments on 6.6 above. We strongly urge requiring EML 5.0 or higher. For a more detailed discussion of why the 2007 VVSG should require support for EML 5.0 (or higher) for all data exchange and export, see https://vvf.jot.com/EMLforVVSG and http://www.oasis-open.org/committees/download.php/26747/The%20Case%20for%20EML%20v2.pdf. Data required for audit reports may not just come from a single EMS but also from pollbooks, vote capture machines, and EMSs in multiple local jurisdictions, so these types of machines should be included also. To reduce the cost of an audit, some data may come directly from individual vote capture machines. The cost of an audit depends on several factors, including the number of samples needed, so the cost can be reduced by having smaller audit units. For example, if a precinct has ten machines, it is more cost effective to treat each machine as a separate audit unit (whereby the machine count is compared to the paper based count associated with that machine) than to treat the entire precinct as an audit unit. Hence it is important to capture as much information as possible in a standard format that can be used quickly and easily for auditing.
6.6-B.2 Exchange of CVRs

DREs and optical scanners SHALL use a non-restrictive, publicly-available format with respect to export of CVRs.

Applies To: DRE, Optical Scanner

Test Reference: Part 3: 3.5 "Interoperability Testing", 4.3 "Verification of Design Requirements"

DISCUSSION

The purpose of this requirement is to further the use of common formats for exported CVRs produced by vote-capture devices.

Source: Generalized from database design requirements in [VSS2002] I.2.2.6, VVSG 2005 Section 7.9.3, and some state RFP(s)
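If every device exports CVRs in the same common format, aggregating results across devices for a random audit reduces to a simple merge. The sketch below assumes a simplified per-device export of contest totals (a stand-in for illustration, not any standardized structure) and combines them with collections.Counter:

```python
from collections import Counter

# Hypothetical per-device exports of contest totals; in practice each would
# be parsed from the device's common-format CVR export file.
device_exports = [
    {"Governor": Counter({"Candidate A": 120, "Candidate B": 95})},   # DRE
    {"Governor": Counter({"Candidate A": 210, "Candidate B": 188})},  # optical scanner
]

def aggregate(exports):
    """Merge per-device totals into jurisdiction-wide totals per contest."""
    combined = {}
    for export in exports:
        for contest, totals in export.items():
            combined.setdefault(contest, Counter()).update(totals)
    return combined

totals = aggregate(device_exports)
print(totals["Governor"])
# → Counter({'Candidate A': 330, 'Candidate B': 283})
```

With a common export format, the same merge works whether the exports come from one manufacturer's devices or several, which is what makes cross-manufacturer audits practical.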

 

1 Comment

Comment by Premier Election Solutions (Manufacturer)

If the electronic records being exchanged between systems are encrypted, it would seem that the data would not be useful for public auditing as it couldn't be accessed external to the system. It would seem that the intent of this requirement is to provide a function to export the contents of the CVRs to files that can be used for auditing, as opposed to using the encrypted files as they are stored on the devices. Please clarify how encrypted CVRs exchanged between systems are to be exported and made available for auditing purposes.
6.6-B.3 Exchange of report data

The voting system SHALL use a non-restrictive, publicly-available format with respect to export of report data.

Applies To: Voting system

Test Reference: Part 3: 3.5 "Interoperability Testing", 4.3 "Verification of Design Requirements"

DISCUSSION

The purpose of this requirement is to further the use of common formats for reports produced by voting devices.

Source: New requirement

 

2 Comments

Comment by Robert Ferraro, SAVEourVotes.org (Advocacy Group)

"Non-restrictive, publicly-available format" is not specific enough. EML or equivalent must be required.

Comment by Verified Voting Foundation (Advocacy Group)

The phrase "non-restrictive, publicly-available format" is not sufficient. See further discussion in our comments on 6.6 above. We strongly urge requiring EML 5.0 or higher in order to minimize mistakes, extra cost and extra time required to combine data from multiple vendor formats. For a more detailed discussion of why the 2007 VVSG should require support for EML 5.0 (or higher) for all data exchange and export, see https://vvf.jot.com/EMLforVVSG and http://www.oasis-open.org/committees/download.php/26747/The%20Case%20for%20EML%20v2.pdf
6.6-B.4 Specification of common format usage

The voting system manufacturer SHALL provide a specification describing how the manufacturer has implemented the format with respect to the manufacturer’s specific voting devices and data, including such items as descriptions of elements, attributes, constraints, extensions, syntax and semantics of the format, and definitions for data fields and schemas.

Applies To: Voting system

Test Reference: Part 3: 4.1 "Initial Review of Documentation"

DISCUSSION

Conformance to a common format does not guarantee interoperability. The manufacturer must document fully how it has interpreted and implemented the common format for its voting devices and the types of data exchanged/exported.

Source: VVSG 2005 Section 7.9.3

 
6.6-B.5 Source code specification of common format

The voting system manufacturer SHALL provide a software program with source code to show how the manufacturer has programmatically implemented the format.

Applies To: Voting system

Test Reference: Part 3: 4.1 "Initial Review of Documentation"

Source: VVSG 2005 Section 7.9.3
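The deliverable this requirement calls for might take a shape like the following: a short, self-contained reader that documents in code how each field of the manufacturer's export is interpreted. The column layout here is invented for illustration; an actual submission would implement the manufacturer's real, documented format.

```python
import csv
import io

# Hypothetical report export for illustration. An actual manufacturer
# submission would document its real format; assumed columns here are:
#   precinct_id, contest, selection, vote_count
SAMPLE_EXPORT = """precinct_id,contest,selection,vote_count
12,Governor,Candidate A,330
12,Governor,Candidate B,283
"""

def read_report(text):
    """Parse the export into typed records, making the interpretation of
    each field explicit (e.g., vote_count is parsed as an integer)."""
    records = []
    for row in csv.DictReader(io.StringIO(text)):
        row["vote_count"] = int(row["vote_count"])
        records.append(row)
    return records

for rec in read_report(SAMPLE_EXPORT):
    print(rec["precinct_id"], rec["contest"], rec["selection"], rec["vote_count"])
```

Sample code of this kind lets a test lab or auditor verify the manufacturer's stated field semantics against actual exports, which is the point of requiring source alongside the written specification.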

 

1 Comment

Comment by Premier Election Solutions (Manufacturer)

Providing source code would not show how the format has been implemented. If the desire is for manufacturers to provide sample source code on how to parse the data that is possible but of limited use. Proposed Change: Remove this requirement.
6.6-B.6 Common format across manufacturer

The voting system manufacturer SHOULD use a common format for export and interchange of data and reports across its major device categories.

Applies To: Voting system

Test Reference: Part 3: 3.5 "Interoperability Testing", 4.3 "Verification of Design Requirements"

DISCUSSION

Different equipment from the same manufacturer should be interoperable with respect to data format. For example, a common ballot definition should apply to all of the manufacturer's vote-capture devices and should not be specific to each device. Export of data (e.g., reports and CVRs) should use a common format across all devices.

Source: New requirement

3 Comments

Comment by Frank Padilla (Voting System Test Laboratory)

Very subjective. Who is going to decide the common format?

Comment by Robert Ferraro, SAVEourVotes.org (Advocacy Group)

The "should" should be "shall". Currently, many states have heterogeneous voting systems with components from several vendors and with a variety of vote capture devices. Even within a given precinct, there are often several different types of vote capture devices; there might be precinct-based optical-scan machines for the majority of the voters, touch-screen machines for some handicapped voters, and central optical-scan machines for absentee and provisional ballots. In order to have an effective and cost efficient audit, the election results from all these different devices need to be quickly and accurately collected, stored, and used in calculations (such as number of votes and margin of victory) to prepare for the audit.

Comment by Verified Voting Foundation (Advocacy Group)

The "should" should be "shall". Currently, many states have heterogeneous voting systems with components from several vendors and with a variety of vote capture devices. Even within a given precinct, there are often several different types of vote capture devices; there might be precinct-based optical-scan machines for the majority of the voters, touch-screen machines for some handicapped voters, and central optical-scan machines for absentee and provisional ballots. In order to have an effective and cost-efficient audit, the election results from all these different devices need to be quickly and accurately collected, stored, and used in calculations (such as number of votes and margin of victory) to prepare for the audit. Here again, we strongly urge requiring EML 5.0 or higher in order to minimize mistakes, extra cost, and extra time required to combine data from multiple formats. For a more detailed discussion of why the 2007 VVSG should require support for EML 5.0 (or higher) for all data exchange and export, see https://vvf.jot.com/EMLforVVSG and http://www.oasis-open.org/committees/download.php/26747/The%20Case%20for%20EML%20v2.pdf
6.6-B.7 Consensus-based format

Voting systems SHOULD use a common, consensus-based format for export and interchange of data and reports.

Applies To: Voting system

Test Reference: Part 3: 3.5 "Interoperability Testing", 4.3 "Verification of Design Requirements"

DISCUSSION

Manufacturers should use a consensus-based format that is common to all manufacturers. The OASIS Election Markup Language (EML) standard [OASIS07] is currently being considered as one possible common format. The IEEE P-1622 working group [P1622] is studying several formats for eventual standardization.
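A consensus-based XML interchange format has the practical benefit that any party can parse records with standard tooling. The following sketch parses an EML-like record; the element names are simplified stand-ins invented for illustration, not the actual OASIS EML 5.0 schema, which is considerably richer and uses XML namespaces.

```python
# Sketch: parsing an EML-like cast vote record with the standard library.
# Element names (CastVoteRecord, Precinct, Contest, Selection) are
# hypothetical simplifications, not the real EML 5.0 vocabulary.
import xml.etree.ElementTree as ET

SAMPLE = """
<CastVoteRecord>
  <Precinct>P-12</Precinct>
  <Contest id="mayor">
    <Selection>cand-3</Selection>
  </Contest>
</CastVoteRecord>
"""


def parse_cvr(xml_text: str) -> dict:
    """Extract the precinct and contest selections from an EML-like record."""
    root = ET.fromstring(xml_text)
    return {
        "precinct": root.findtext("Precinct"),
        "selections": {
            contest.get("id"): contest.findtext("Selection")
            for contest in root.findall("Contest")
        },
    }


record = parse_cvr(SAMPLE)
```

Because the format is published and schema-governed, the same parser works on records from any conforming manufacturer, which is the interoperability property this requirement is after.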

Source: VVSG 2005 Section 7.9.3

3 Comments

Comment by Frank Padilla (Voting System Test Laboratory)

Very subjective. Who is going to decide the common format?

Comment by Robert Ferraro, SAVEourVotes.org (Advocacy Group)

The "should" should be "shall". Currently, many states have heterogeneous voting systems with components from several vendors and with a variety of vote capture devices. Even within a given precinct, there are often several different types of vote capture devices; there might be precinct-based optical-scan machines for the majority of the voters, touch-screen machines for some handicapped voters, and central optical-scan machines for absentee and provisional ballots. In order to have an effective and cost efficient audit, the election results from all these different devices need to be quickly and accurately collected, stored, and used in calculations (such as number of votes and margin of victory) to prepare for the audit.

Comment by David Marker (General Public)

The Scientific and Public Affairs Advisory Committee of the American Statistical Association urges that the VVSG should require that all voting systems support input, output, and exchange of data using a specified consensus-based standard format that is easy for humans to read but also easily readable by other computer software without transcription. This will allow for portability of data across manufacturers and allow for expedited and more accurate statistical analyses of election results. David Marker, Ph.D. Chair, SPA Advisory Committee Senior Statistician and Associate Director, Westat DavidMarker@Westat.com 301-251-4398

6.7 Procedures required for correct system functioning

The requirements for voting systems are written on the assumption that the following procedures will be observed.

Follow instructions: The voting system must be deployed, calibrated, and tested in accordance with the voting equipment user documentation provided by the manufacturer.