United States Election Assistance Commission


Chapter 4: Documentation and Design Reviews

An inspection or review is logically reported as one or more tests with a verdict of Pass or Fail. The number of tests reported corresponds to how the test lab chooses to structure the inspection.

To the extent possible, these VVSG provide guidance on the criteria to be applied. However, the nature of some of these inspections is to rely on the professional judgment of an expert reviewer to assess conformity with general guidelines.

1 Comment

Comment by Leathea Vanadore (None)

Because computers can break down, are subject to viruses and manipulation, and can have faulty programs, I believe that it is imperative in a democracy to have a verifiable paper trail, so that results can be indisputably and openly verified.

4.1 Initial Review of Documentation

The accredited test lab reviews the documentation submitted by the manufacturer for its completeness and satisfaction of requirements.

1 Comment

Comment by Irene Radke (Voter)

Voting system certification should be conditional based on following certain procedures necessary to ensure the security and accuracy of election outcomes, such as routine post-election audits.
4.1-A Initial review of documentation

At the beginning of inspection, the test lab SHALL verify that the documentation submitted by the manufacturer in the TDP meets all requirements applicable to the TDP, is sufficient to enable the inspections specified in this chapter, and is sufficient to enable the tests specified in Part 3: Chapter 5: "Test Methods".

Applies to: Voting system

DISCUSSION

This includes verifying that source code has been supplied in compliance with Requirement Part 2: 3.4.7.2-E.

Source: [VSS2002]/[VVSG2005] II.5.3, generalized

4.1-B Review of COTS suppliers' specifications

For COTS components, such as printers and touchscreens, that were integrated into a voting device by the manufacturer, the test lab SHALL review the COTS manufacturers' specifications to verify that those manufacturers approve of their products' use under the conditions specified by these VVSG for voting systems.

Applies to: Voting system

DISCUSSION

For example, if the operating and/or storage environmental conditions specified by the manufacturer of a printer do not meet or exceed the requirements of these VVSG, a system that includes that printer cannot be found conforming.

Source: New requirement

4.2 Physical Configuration Audit

The Physical Configuration Audit (PCA) is the formal examination of the as-built version of a voting system against its design documentation in order to establish the product baseline. After successful completion of the audit, subsequent changes are subject to test lab review and reexamination.

1 Comment

Comment by Irene Radke (Voter)

This section should include requirements that the post-election audits described here be performed as a condition of voting system certification. This VVSG statement seems to be false: "Audits are considered part of election procedures and cannot be mandated by the VVSG." Because:

  1. VVSG are voluntary and are not mandated. Any State or jurisdiction may choose to purchase new voting machines which meet the new VVSG or not.
  2. Lacking routine procedures to check the accuracy of machine counts, election integrity cannot be assured, no matter how stringent the VVSG.
  3. Audits could be required as part of the condition for voting machine certification, as required by the California Secretary of State.
4.2-A As-built configuration reflected by records

The test lab SHALL audit the system's documentation and quality assurance records to verify that the as-built configuration is reflected by the documentation and records.

Applies to: Voting system

DISCUSSION

This includes both hardware and logic (e.g., software, firmware, etc.).

Source: [MIL85] 80.1, [VVSG2005] II.6.6

4.2-B Check identity of previously tested devices

If a limited scope of testing is planned for a system containing previously tested devices or subsystems, the test lab SHALL verify that the affected devices or subsystems are identical to those previously tested.

Applies to: Voting system

Source: [VSS2002] II.6.3.a / [VVSG2005] II.6.3

4.2-C Accuracy of system and device classification

The test lab SHALL verify that the classes claimed in the implementation statement accurately characterize the system and devices submitted for testing.

Applies to: Voting system

DISCUSSION

Any electronic device that includes software or firmware installed or commissioned by the voting system manufacturer is a programmed device. Manufacturers claiming that an electronic device is not programmed must demonstrate to the satisfaction of the test lab and any authorities approving the test plan that the device contains no software or firmware that should be subject to the requirements indicated for programmed devices.

Source: New requirement

4.2-D Validate configuration

The test lab SHALL confirm the propriety and correctness of the configuration choices described in Part 2: 3.8 "Configuration for Testing".

Applies to: Voting system

Source: [VSS2002] I.4.1.1

4.3 Verification of Design Requirements

Many design requirements state simply that the system SHALL have some physical feature without any additional constraints. Such requirements are easily verified by inspection. Other requirements that state that the system SHALL prevent something from occurring are not verifiable through operational testing, so inspection (with expert judgment) is the only effective testing strategy.

4.3-A Verify design requirements

For each requirement of Part 1 that is not amenable to operational testing, the test lab SHALL review the application logic, border logic, third-party logic, configuration data, and/or design of the voting system as needed to verify that the requirement is satisfied.

4.3-B Identification of security control inconsistencies

The test lab SHALL determine whether all security controls are properly implemented, have no obvious inconsistencies with the voting system's functional requirements or with the overall objectives of the voting device's security strategy, and contain no obvious internal errors.

Applies to: Voting system

Source: [NIST05]

1 Comment

Comment by Gail Audette (Voting System Test Laboratory)

The requirement has the term "obvious inconsistencies" which is not a testable requirement. Please provide testable criteria.

4.4 Manufacturer Practices for Quality Assurance and Configuration Management

4.4.1 Examination of quality assurance and configuration management data package

4.4.1-A Quality and Configuration Management Manual

The Quality and Configuration Management Manual SHALL be reviewed for its fulfillment of Requirement Part 1: 6.4.2.1-A, and the requirements specified in Part 2: 2.1 "Quality and Configuration Management Manual".

Source: New requirement

4.4.2 Examination of voting systems submitted for testing

These requirements deal with the quality assurance and configuration examination of voting systems submitted for testing to a test lab.

1 Comment

Comment by Brian V. Jarvis (Local Election Official)

This section description (in 4.4.2) indicates that this section deals with the quality assurance and configuration examination of voting systems. However, the sole subsection (4.4.2.1) only deals with configuration management. Recommend that an additional section (4.4.2.2) be defined dealing with the quality assurance aspect of voting systems submitted for testing to a test lab.

4.4.2.1 Configuration management

4.4.2.1-A Identification of systems

The test lab SHALL verify that the voting system has an identification tag attached to the main body as described in Requirements Part 1: 6.4.2.2-A.1 and Part 1: 6.4.2.2-A.2.

Applies to: Voting system

Source: New requirement

4.4.2.1-B Configuration log

The test lab SHALL verify that the voting system has associated with it a Configuration Log, as described in Requirements Part 1: 6.4.2.2-B.1 and Part 1: 6.4.2.2-B.2.

Applies to: Voting system

Source: New requirement

4.5 Source Code Review

In the source code review, the accredited test lab will look at programming completeness, consistency, correctness, modifiability, structure, modularity and construction.

1 Comment

Comment by Kevin Baas (None)

I think this section, or the third document in general, has neglected an important vulnerability. Let's say the software all checks out and everything is good. Great. Now that is only meaningful assuming that the software is actually EXECUTED. The machine could, during testing or election time, be actually executing a different program than it is given, through some hidden mechanism (it could be as small as a logic gate in a microchip). So one needs to make sure that a "programmable device" actually IS programmable, and that that logic is integrated with the rest of the machine, such that it is executed, and, given a set of inputs, gives the predicted outputs. To do this, one needs to write a couple of programs that DO NOT count votes, but simply perform some arbitrary calculations; one needs to test for Turing completeness (or near Turing completeness), to see that the hardware actually runs the program it is given.

One also should be able to roll forward (or back) all internal clock devices to election time, and set all other devices, so that they can recreate the logical state of the system as it would be at election time, and perform this and other tests on it when the system is in this state, to make sure that there are no logic loopholes, so to speak, that make it pass when it's being tested and then operate differently at election time. The system should pass the aforementioned Turing completeness test when the clocks and internal state are rolled forward to election time.

And finally, when all is said and done, if the software tested isn't actually used at election time (that is, some other program(s) replace it), then all the software testing is moot. There needs to be a system put in place to ensure that the software running on the machine at election time is actually the software that was tested.

4.5.1 Workmanship

Although these requirements are scoped to application logic, in some cases the test lab may need to inspect border logic and third-party logic to assess conformity. Per Requirement Part 2: 3.4.7.2-E, the source code for all of these must be provided.

1 Comment

Comment by U.S. Public Policy Committee of the Association for Computing Machinery (USACM) (None)

USACM Comment #24. Section 4.5.1. Software Review, Security [incomplete]

USACM recommends that the following subsection be inserted before the current subsection 4.5.1-D (which would be renumbered 4.5.1-E):

4.5.1-D Required Use of Automated Static Analysis Tools

The test lab SHALL exercise automated static analysis tools against all software under review. At least one run SHALL be made as a baseline with the strongest security setting available and this report SHALL be produced as part of the review work product. Thereafter, tool settings will be optimized by the OEVT team and team members must rigorously assess the appropriately tuned reports to determine if flagged events indicate dangerous security faults or not.

Applies to: Voting system

DISCUSSION: The software security review section is woefully lacking in description of the tools available to conduct security reviews. Static analysis tools have advanced dramatically in the past two years and can be important aids to assist reviewers in identifying vulnerabilities, particularly the most widespread vulnerability that includes known weaknesses and exploits through buffer overflow faults. Automated static analysis tools such as FLAWFINDER, Rough Auditing Tool for Security (RATS), and numerous commercial tools offer a unique opportunity for establishing an objective, consistent software metric. These tools can detect common developer errors including dead code segments, memory leaks, memory overruns, race conditions, and several other common maladies. These tools are not comprehensive and cannot replace other testing. However, they can establish an objective baseline for absence of known vulnerabilities that cannot be duplicated by standards, rules, or open-ended tests. There is now a plethora of reasonably priced static analysis tools on the market, embedded in development environments, and in open source products. Their use should be mandated in the OEVT process.
4.5.1-A Review source versus manufacturer specifications

The test lab SHALL assess the extent to which the application logic adheres to the specifications made in its design documentation.

Applies to: Voting system

DISCUSSION

Since the nature of the requirements specified by the manufacturer is unknown, conformity may be subject to interpretation. Nevertheless, egregious disagreements between the application logic and its design documentation should lead to a defensible adverse finding.

Source: [VSS2002] II.5.4

1 Comment

Comment by Brian V. Jarvis (Local Election Official)

The nature of the requirements specified by the manufacturer should not be an unknown; they should be open to the test lab representatives, for the very reason that conformity should not be subject to interpretation. Perhaps non-disclosure agreements should be in place. Nevertheless, it should be verifiable whether or not the design documentation is consistent with requirements specified by the customer (which are derived from customer requirements). Recommend that it be mandatory that customer and/or regulatory authorities and/or test lab representatives have access to all manufacturer documentation related to voting machine development. This is a best practice, common in the industry to ensure that all requirements have been properly defined and implemented.
4.5.1-B Review source versus coding conventions

The test lab SHALL assess the extent to which the application logic adheres to the published, credible coding conventions chosen by the manufacturer.

Applies to: Voting system

DISCUSSION

See Requirement Part 1: 6.4.1.3-A.

Since the nature of the requirements specified by the coding conventions is unknown, conformity may be subject to interpretation. Nevertheless, egregious disagreements between the application logic and the coding conventions should lead to a defensible adverse finding.

Source: [VSS2002] II.5.4, II.5.4.2

1 Comment

Comment by Brian V. Jarvis (Local Election Official)

Similar to my comments in 4.5.1-A, the requirements specified by the coding conventions should not be an unknown; the customer/test lab should be well aware of any and all requirements/constraints that the manufacturer is under that potentially impact the development of the voting machine software. Recommend that it be required that all application logic code adhere to applicable coding convention standards. (This is becoming more and more disconcerting the more I read. It's beginning to sound like the manufacturers are driving this activity rather than the EAC.)
4.5.1-C Review source versus workmanship requirements

The test lab SHALL assess the extent to which the application logic adheres to the requirements of Part 1: 6.4.1 "Software engineering practices".

Applies to: Voting system

DISCUSSION

With respect to Requirement Part 1: 6.4.1.4-B, see Requirement Part 2: 3.4.7.2-I. The reviewer should consider the functional organization of each module or callable unit and the use of formatting, such as blocking into readable units, that supports the intent of Requirement Part 1: 6.4.1.4-B.

Source: [VSS2002] II.5.4

 
4.5.1-D Efficacy of built-in self-tests

The test lab SHALL verify the efficacy of built-in measurement, self-test, and diagnostic capabilities described in Part 1: 7.3.1 "Logic and accuracy testing".

Applies to: Voting system

Source: [VSS2002] I.2.3.4.1.a2 (the second a)

 

4.5.2 Security

4.5.2-A Security control source code review

The test lab SHALL analyze the source code of the security controls to assess whether they function correctly and cannot be bypassed.

Applies to: Voting system
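As a hedged illustration (not from the VVSG), the following Python sketch shows the kind of flaw this source review looks for: a credential check with a leftover maintenance shortcut that bypasses the control entirely, contrasted with a check that always evaluates the credential. The names `check_bypassable`, `check_sound`, and `STORED` are hypothetical.

```python
# Hypothetical sketch of a bypassable vs. a sound security control.
# All names are illustrative; a real control would compare against a
# hashed, securely stored credential.
import hmac

STORED = "correct-horse"  # stand-in for a securely stored credential


def check_bypassable(supplied: str) -> bool:
    """Flawed control: a maintenance shortcut skips the check entirely."""
    if supplied == "":      # source review should flag this alternate path
        return True
    return supplied == STORED


def check_sound(supplied: str) -> bool:
    """Control with no path around the comparison (constant-time)."""
    return hmac.compare_digest(supplied, STORED)
```

A reviewer reading `check_bypassable` would report that the control can be bypassed with an empty credential even though it "functions correctly" on ordinary inputs; operational testing with valid and invalid passwords alone might never exercise the shortcut.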

 

4.6 Logic Verification

This inspection is to assess conformity with Requirement Part 1: 6.3.2-A and related requirements.

Because of its high complexity, the scope of logic verification is pragmatically limited to core logic. Software modules that are solely devoted to interacting with election officials or voters or formatting reports are not subject to logic verification. However, they are required to conform with Requirement Part 1: 6.1-A, the testing of which is described in Part 3: 4.3 "Verification of Design Requirements" and Part 3: 4.5.2 "Security".

Although these requirements are scoped to core logic, in some cases the test lab may need to inspect other application logic, border logic and third-party logic to assess conformity. Per Requirement Part 2: 3.4.7.2-E, the source code for all of these must be provided.

[Redmill88] provides the following description of logic verification, therein known as "program proving":

Assertions are made at various locations in the program, which are used as pre-, and post-conditions to various paths through the program. The proof consists of two parts. The first involves showing that the program transfers the pre-conditions into the post-conditions according to a set of logical rules defining the semantics of the programming language, provided that the program actually terminates (i.e., reaches its proper conclusion). The second part is to demonstrate that the program does indeed terminate (e.g., does not go into an infinite loop). Both parts may need inductive arguments.

The inspection specified here does not assume that the programming language has formally specified semantics. Consequently, a formal proof at any level cannot be mandated. Instead, a combination of informal arguments (see Requirement Part 2: 3.4.7.2-F.b) and limitations on complexity (see Requirement Part 1: 6.4.1.4-B.1) seeks to make the correctness of callable units at the lowest level intuitively obvious and to enable the verification of higher level units using the correctness of invoked units as theorems. The resulting inspection is not as rigorous as a formal proof, but still provides greater assurance than is provided by operational testing alone.
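For concreteness, the following Python sketch shows what a lowest-level callable unit annotated with such pre- and postconditions might look like. The unit name `add_vote`, the constant `MAX_COUNT`, and the capacity value are illustrative assumptions, not VVSG content; the informal argument for correctness is recorded in the docstring contract.

```python
# Illustrative lowest-level callable unit with its contract stated as
# pre- and postconditions. Names and the capacity limit are hypothetical.

MAX_COUNT = 10_000  # assumed device capacity from the implementation statement


def add_vote(tally: int) -> int:
    """Increment one candidate's tally.

    Precondition:  0 <= tally < MAX_COUNT
    Postcondition: result == tally + 1 and result <= MAX_COUNT
    """
    assert 0 <= tally < MAX_COUNT, "precondition violated"
    result = tally + 1
    assert result == tally + 1 and result <= MAX_COUNT, "postcondition violated"
    return result
```

At this level the argument is intuitively checkable by code reading: if the precondition holds on entry, the single increment makes the postcondition hold on exit within the stated limit.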

Inasmuch as the following behaviors would almost certainly preclude a demonstration of the correctness of the logic, logic verification will almost certainly involve a demonstration that they cannot occur:

  • Numeric errors such as overflow and divide-by-zero;
  • Buffer overruns / out-of-bounds accesses of arrays or strings;
  • Null pointer dereferences;
  • Stack overflows;
  • Invocations of undefined or implementation-dependent behaviors;
  • Race conditions or other nondeterministic execution;
  • Abrupt termination.

It is acceptable, even expected, that logic verification will show that some or most exception handlers in the source code cannot logically be invoked. These exception handlers are not redundant—they provide defense-in-depth against faults that escape detection during logic verification and unpredictable failures that compromise the system.
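A minimal Python sketch of this pattern (all names hypothetical): the handler below is logically unreachable when callers honor the stated precondition, yet it is retained so that an unpredicted run-time fault fails safely rather than silently corrupting a tally.

```python
# Illustrative defense-in-depth handler. Logic verification would show the
# except branch cannot fire when the precondition holds; it stays in the
# code to catch faults that escape verification. Names are hypothetical.

def record_mark(tallies: list, index: int) -> list:
    # Precondition established by verified callers: 0 <= index < len(tallies)
    try:
        tallies[index] += 1
    except IndexError:
        # Unreachable under the precondition; fail safely if it fires anyway.
        raise RuntimeError("tally index out of range; halt and log the fault")
    return tallies
```

Showing the `except` branch unreachable is part of the verification; deleting it on that basis would remove the safety net against hardware faults or memory corruption that the proof cannot anticipate.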

2 Comments

Comment by Brian V. Jarvis (Local Election Official)

Extremely surprised regarding any expectation that all branches of source code cannot be logically invoked. (This is certainly a design flaw.) If some or most exception handlers in the source code cannot be logically invoked, recommend that it must be determined whether any of this code is "deactivated code" or whether it is "dead code." If it is "deactivated code," evidence should be made available by the manufacturer that the deactivated code is disabled for the environments where its use is not intended. Unintended activation of deactivated code due to abnormal system conditions is the same as unintended activation of activated code. A combination of analysis and testing should show that the means by which such code could be inadvertently executed are prevented, isolated, or eliminated. "Dead code" is executable code which, as a result of a design error, cannot be executed or used in an operational configuration of the target computer environment and is not traceable to a system or software requirement. The "dead code" should be removed and an analysis performed to assess the effect and the need for reverification.

Comment by Gail Audette (Voting System Test Laboratory)

Is the test lab solely responsible for assessment of what is or is not core logic for the logic verification inspection? The only guidance provided within this VVSG is that core logic is not software modules solely devoted to interacting with election officials, voters, or formatting of reports. Is ballot definition software considered interacting with election officials? Is all voting software considered interacting with the voter? Does this mean that only tallying software is "core"? In order for the labs to consistently identify core logic a finite definition is required, otherwise the range of testing will be impossible to define.
4.6-A Check inductive assertions

For each callable unit (function, method, operation, subroutine, procedure, etc.) in core logic, the test lab SHALL check that the preconditions and postconditions correctly describe the behavior of the unit in all cases.

Applies to: Voting system

DISCUSSION

See Requirement Part 2: 3.4.7.2-F. For a callable unit at the lowest level, this should be achievable through code reading. For a higher level unit, the correctness of the pre- and postconditions of the units that it invokes is assumed as a premise in the argument that the pre- and postconditions of the higher level unit are correct.
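As a hedged sketch of the layering described here (all names hypothetical), a higher-level unit's argument cites the contract of the unit it invokes as a premise, rather than re-reading that unit's body:

```python
# Lowest-level unit: its contract is verified once, by code reading.
def increment(n: int) -> int:
    # Precondition: n >= 0; Postcondition: result == n + 1
    assert n >= 0
    return n + 1


# Higher-level unit: the argument for its postcondition uses increment's
# contract as a theorem, not increment's implementation details.
def count_marks(marks: list) -> int:
    # Precondition: every entry of marks is 0 or 1
    # Postcondition: result == number of 1 entries in marks
    total = 0
    for m in marks:
        assert m in (0, 1), "precondition violated"
        if m == 1:
            total = increment(total)  # premise: returns total + 1
    return total
```

Checking `count_marks` then reduces to an inductive argument over the loop: each iteration preserves "total equals the number of 1s seen so far," given `increment`'s postcondition.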

4.6-B Check limits

The test lab SHALL check that the assumptions about capacities and limits that appear in the preconditions, postconditions, and proofs are consistent with the capacities and limits that the devices are claimed in the implementation statement to be capable of processing correctly.

Applies to: Voting system

DISCUSSION

See Requirement Part 2: 3.4.7.2-F.a and Requirement Part 1: 2.4-A.e.

4.6-C Check constraints

For the core logic as a whole, and for each constraint indicated in Part 1: 8.3 "Logic Model (normative)", the test lab SHALL check that the constraint is satisfied in all cases within the aforementioned capacities and limits.

Applies to: Voting system

DISCUSSION

See Requirement Part 2: 3.4.7.2-G.

4.6-D Burden of proof

If the test lab finds that the preconditions, postconditions, and proofs provided by the manufacturer are insufficient or incorrect, the responsibility for completing or correcting them SHALL be the manufacturer's.

Applies to: Voting system

DISCUSSION

Although test labs will doubtless provide advice and assistance to their clients, they are not required to fill in gaps in the manufacturer's submission.

1 Comment

Comment by Harris Glasser (Voter)

Computers can be played with... even hackers can do that... we MUST HAVE an accountable record of every single vote!