United States Election Assistance Commission


Chapter 2: Conformity Assessment Process

2.1 Overview

Conformity assessment encompasses the examination and testing of software and firmware; tests of hardware under conditions simulating the intended storage, operation, transportation, and maintenance environments; inspection and evaluation of system documentation; and operational tests to validate system performance and functioning under normal and abnormal conditions. Conformity assessment also evaluates the completeness of the manufacturer's developmental test program, including the sufficiency of manufacturer tests conducted to demonstrate compliance with stated system design and performance specifications, and the manufacturer's documented quality assurance and configuration management practices. The assessment addresses individual system components or elements as well as the integrated system as a whole.

In 1994, the National Association of State Election Directors (NASED) began accrediting Independent Test Authorities to conduct qualification testing of voting systems. The qualification testing process was originally based on the 1990 voting system standards and evolved to encompass the new requirements contained in the 2002 version of the standards.

The Help America Vote Act (HAVA) directs the U.S. Election Assistance Commission (EAC) to provide for the testing, certification, decertification, and recertification of voting system hardware and software by accredited laboratories. HAVA also introduces different terminology for these functions. Under the EAC process, test labs are "accredited" and voting systems are "certified." The term "standards" has been replaced with the term "VVSG."

Conformity assessment may be performed by one or more accredited test labs that together perform the full scope of tests required. Assessment may be coordinated across accredited test labs so that equipment and materials tested by one accredited test lab can be used in the tests performed by another accredited test lab.

When multiple accredited test labs are being used, the development of the test plan (see Part 2: Chapter 5: "Test Plan (test lab)") and the test report (see Part 2: Chapter 6: "Test Report (test lab)") must be coordinated by a lead accredited test lab. The lead test lab is responsible for ensuring that all testing has been performed and documented in accordance with the VVSG and is ultimately responsible for the summary finding of conformance (see Requirement Part 2: 6.1-F).

Whether one or more accredited test labs are used, the testing generally consists of three phases (Chapter 3 provides an overview of general testing approaches):

  • Pre-test activities;
  • Testing; and
  • Post-test activities.

4 Comments

Comment by Christopher (None)

Accredited testing facilities are fine as long as the source code being compiled into end-point machines is released to the PUBLIC for review at some point. Take the difference between Windows and Linux operating systems. Why is Linux so secure and stable? Because it's constantly being reviewed by thousands of users at the source code level. Open source is the ONLY WAY to effectively review ANY code base where security and reliability are an issue.

Comment by Gail Audette (Voting System Test Laboratory)

"The accredited test lab may determine that a modified system is subjected only to limited testing…" The criteria provided are for functional changes only. Please provide guidance for form and fit changes.

Comment by Cem Kaner (Academic)

Independent test labs play an important role in the qualification of critical products. VVSG should continue to require conformity assessment by independent test labs. However, it should also enable testing by the general public.

Test lab testing is stunningly expensive. Wise manufacturers do the vast majority of their testing in-house, long before submitting software to a regulatory-function independent lab. This is one of the key reasons that the labs normally focus on confirmatory tests, rather than open-ended, harsh investigations. (This is not a universally accepted characterization of the relationship, but I speak from knowledge of several test lab executives who have been friends, coworkers, legal clients, or technical clients, and this is what I have heard repeatedly from the sharpest of them, and from several technical staff who are experienced, as vendors of non-voting software products, with taking products through independent lab assessments.)

In the case of voting equipment: (1) there is reason to believe that voting equipment vendors' internal quality control processes to date have not been sufficiently thorough, and (2) potential conflicts of interest could motivate vendor staff to embed code that could be used to change the results of an election. This is a difference in kind from other regulated industries, in which the vendor and its staff have much to lose and nothing to gain by embedding defects into their code.

VVSG attempts to compensate for the (deserved or undeserved) lack of trust in vendor quality control and integrity by expanding the role of the independent test lab to include thorough source code review and open-ended vulnerability testing (i.e., exploratory testing). However, the time cost to thoroughly review this much code is enormous and is almost certainly beyond the expectations of the authors of VVSG, the vendors, or the labs. Similarly, the time cost to build the skills needed to do a thorough job of exploratory testing is well beyond the 12 person-week scope of VVSG. Independent test labs offer strong skills in creating and executing thoroughly documented tests that trace well to unambiguous documents, but these are very different skill sets from those needed for exploratory testing. It is not clear that labs whose core competencies support traditional regulated-industry testing would even know how to assess the competence of the exploratory testing services they would subcontract.

Rather than stretching the role, capability, and cost of lab testing beyond sensible limits, VVSG should require voting equipment vendors to enable public testing with the following requirements:

  • (a) non-COTS components of voting equipment software should be publicly reviewable;
  • (b) all test-related artifacts (including test plans and test results) for voting equipment software should be public records;
  • (c) all voting equipment specifications other than proprietary specifications for COTS components should be public records;
  • (d) voting equipment tendered for sale to any government in the United States should be made available to the public at a similar cost, so that researchers can obtain and test the voting system;
  • (e) the license agreement for the voting equipment software must not prohibit publication of any test results, benchmarks, or other results or opinions stemming from evaluation of the equipment; and
  • (f) the license agreement for the voting equipment software must not prohibit reverse engineering any of the equipment's software (including COTS embedded software) to the extent necessary to discover implementation or security flaws.

Public testing offers several benefits, mitigating several inherent weaknesses in traditional conformance testing.

The underlying problem is that software defects are not like manufacturing defects. There is relatively little value in running the same test over and over. There is great value in testing the software in different ways. Two tests are distinct if the program can pass one and fail the other. For any nontrivial program, the number of distinct tests is virtually infinite. The test design challenge boils down to a selection problem: which tiny subset of the pool of distinct tests should we select?

To a degree, the testing problem can be reduced by thorough code review. However, many potential errors cannot be caught in code review.

Let me start with an example from one of the most successful computer peripherals manufacturers in the United States. Suppose that we have a pool of N automated regression tests, and in a given build, the software passes M of them. Restrict further work in this build to the M tests that the software passes individually. Do not change the software in any way. Now run random sequences of tests sampled from the M until either the software fails or the software has not failed for a criterion period of time. Under all code coverage models and under most or all failure models used to estimate software reliability, the expectation would be that the system should not fail random-sequence testing because it has passed each and every test in the sequence. However, this technique exposes timing problems, conflicts involving multiple processors, memory leaks, memory corruption that builds over time, stack corruption that builds over time, and several other serious problems. Early in development, this manufacturer's code does well to survive a test like this for more than a minute of execution. Release to the public, in this company, requires survival (no failures in a long sequence) times of at least 72 hours. Variants of long-sequence automated testing have been used to qualify telephone systems and other embedded software for at least 25 years.

Creating well-documented scripts for this kind of testing would be appallingly expensive. I have never seen a test plan for regulatory independent testing that includes testing of this type.

The long-sequence example illustrates the fact that tests can be distinct in subtle ways. Here are more examples:

  • Features tested separately can show no problem, but tested together can yield a memory leak or a corrupted stack.
  • Features tested with most data values can pass, but can fail on special cases that are hard to predict a priori. In a famous example from the testing of the MASPAR computer's integer square root function, all 2^32 possible tests (32-bit integer inputs to the function) were run and only two cases failed. Neither case had an input value near any obvious boundary. The underlying cause was a rounding error that had an impact on the final result only twice (the error was rounded out before having an impact in several other cases). Boundary testing provides a heuristic, but one that is far from infallible. A more famous but less easily summarized problem of this class was the Pentium FDIV bug.
  • As another illustration, on the software-testing mailing list (2/18/2008), Ross Collard (a well-known practitioner and teacher) said, "I have been sifting through archived data on bugs found by extensive date testing (Y2K testing), and correlating bugs with the conditions tested. No matter how I choose to define the boundaries, so far I have not found statistically valid evidence for the assertion that 'errors lurk at boundaries.'"
  • Features that pass on one configuration can fail under a subtly different one. As a classic example of this, Intuit released a version of Quicken that, to their surprise, crashed during a database search if and only if the computer involved was running Microsoft's new IntelliSense keyboard driver. Other early Windows programs failed on configurations that included both high-resolution (1600x1200) video and high-resolution (600 dpi) printing. As one example (this problem affected several software developers), in a product that I worked on, no failures occurred with high-resolution video but lower-resolution printing, or high-resolution printing but lower-resolution video, but in combination, some tasks (such as a print preview) corrupted system memory and caused a system crash.

None of these problems stand out in source code reviews. Long-sequence bugs that I personally worked with in telephone systems showed up in code that had been thoroughly reviewed and had been subject to glass-box testing that involved (for all of the code that was eventually implicated in the failure) 100% statement coverage, 100% branch coverage, and testing of every obvious boundary of every obvious (to a person reading the source code OR operating the program black box) variable.

Testing is a complex problem. Our current state of the art does not allow us to fully solve it, and so we are well served by creating assessment processes that view the code and test the code from widely differing perspectives, in richly different ways.

Independent test lab testing is not well suited to this constant variation. Decisions about what to vary in what ways are often intuitive and hard to justify. Auditing the skill and thoroughness of this style of testing is difficult even for a test manager working with a skilled staff who are willing to freely share their private thoughts about their strategies and choices. Auditing under more adversarial circumstances would be much more difficult--obviously bad work could be exposed, but the range from not-awful through excellent would be very hard to assess in the face of testing staff who were responding cautiously (defensively), as is not uncommon in adversarial audits. The labs could spend millions of dollars on testing, and vendors and regulators could spend hundreds or thousands of hours arguing about whether a pool of tests (and failure rates) was sufficient, representative, fair, etc.--and at the end, we would still have significant uncertainty at high cost.

Rather than push labs beyond their zone of excellence, the public testing approach relies on the public to generate a wide variety of approaches that supplement the conformance testing done by the labs.

One objection that has been raised to this proposal is that it has been very difficult, in open source projects, to attract sustained testing at this level of skill. Voting systems, though, are special:

  • National Science Foundation merit criteria require every proposal to include discussion of the impact of the research for the public benefit. Some researchers will choose to improve the perceived merit associated with their NSF proposals by using voting equipment software as target platforms for the test technique, code review technique, reliability model, or whatever new technology they want to study. As the principal investigator for three NSF projects totaling over a million dollars in funding, and having served on several NSF panels, I would certainly do this whenever possible in my grant proposals, and I would expect to see a lot of it in new proposals. These proposals would not be primarily targeting voting systems; they would be using the voting systems as test beds for the ideas that they wanted to explore. However, as they found problems in the voting systems, they would report them.
  • Many doctoral students would find it convenient to use voting equipment software as a test bed for their work because the software is thoroughly documented and fully available. Additionally, work on this type of product can neither be dismissed as work on "toy" applications nor as work on low-grade software chosen to misleadingly highlight the strengths of one particular technical approach or unfairly denigrate another. The intimidating oral exam question, "But why did you choose THIS for your test bed?" would be easy to handle--this would be very motivating for several doctoral students, at least many of the ones I know.
  • It is likely that other nongovernmental organizations would raise money to fund testing efforts for this software. Given the enormous public mistrust, there is an enormous opportunity for fundraising for activities that could be perceived as mitigating the risks that lead to that mistrust.
  • With the publication of test plans and results, some test labs will be motivated to prove their capability by demonstrating that their approach to testing exposes bugs missed by the NIST-certified, prestigious labs that tested a given voting system. Such demonstrations would make for useful advertising copy. Live demonstrations of flaws in other well-tested software (e.g., Microsoft Office) have been the core of some keynote addresses at software testing conferences--very powerful marketing for the test group (and test techniques) involved.

Another objection raised to this proposal is that the results go nowhere. That is, if a research group does find a defect in a voting system, there is nothing in VVSG that closes the loop, requiring immediate repair by the voting equipment vendor. This might be true today, and it might stay true in all subsequent versions of VVSG, but if members of the public find and publish defects in a voting system, this can affect buying decisions by subsequent potential purchasers, and it can also affect the reputation of the test lab that signed off on the system. Over time, systematic weaknesses in the assessment of voting systems will be understood and mitigated.

(Affiliation Note: IEEE representative to TGDC)
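The long-sequence technique described in the comment above can be sketched in a few lines. Everything here is hypothetical and illustrative only: `run_test` stands in for any individually-passing automated regression test, and the toy `Device` simulates a slow resource leak that no single test exposes.

```python
import random

class Device:
    """Toy system under test: it leaks one handle per operation and
    fails once more than 50 handles are open. No single test reaches
    that threshold, so every test passes in isolation."""
    def __init__(self):
        self.open_handles = 0

    def operate(self):
        self.open_handles += 1            # the leak: handles are never released
        if self.open_handles > 50:
            raise RuntimeError("out of handles")

def run_test(device, test_id):
    """Stand-in for one automated regression test from the passing pool M."""
    device.operate()
    return True

def long_sequence_test(passing_tests, max_steps=1000, seed=1):
    """Run random sequences of individually-passing tests until the system
    fails or the step budget (a stand-in for '72 hours') is exhausted.
    Returns the step number of the first failure, or None if it survives."""
    rng = random.Random(seed)
    device = Device()
    for step in range(1, max_steps + 1):
        test_id = rng.choice(passing_tests)
        try:
            run_test(device, test_id)
        except RuntimeError:
            return step   # failure caused by cumulative state, not any one test
    return None

failure_step = long_sequence_test(list(range(20)))
print(failure_step)  # fails at step 51, after 50 individually-passing runs
```

Because the failure depends only on accumulated state, the sequence fails even though every test in it passes on a fresh system, which is exactly the behavior the comment says coverage-based models do not predict.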

Comment by Cem Kaner (Academic)

VVSG specifies the testing that an independent test lab will perform.

The first problem with this process is that the equipment vendor picks the test lab. This creates a strong incentive for the test lab to please the vendor, in order to obtain the vendor's repeat business and the business of this vendor's competitors. The labs therefore have a disincentive against creating tests that are more harsh or more extensive (more time consuming and more expensive) than the bare minimum specified in VVSG. This is not a genuinely independent set of tests, and it is a poor way to engender public trust in the test results.

VVSG must address and eliminate the problem of test lab conflict of interest.

(Affiliation Note: IEEE representative to TGDC)

2.2 Scope of Assessment

The conformity assessment process is intended to discover vulnerabilities that, should they appear in actual election use, could result in failure to complete election operations in a satisfactory manner. This involves

  • Operational accuracy in the recording and processing of voting data, as measured by report total error rate;
  • Operational failures or the number of unrecoverable failures under conditions simulating the intended storage, operation, transportation, and maintenance environments for voting systems;
  • System performance and function under normal and abnormal conditions; and
  • Completeness and accuracy of the system documentation and configuration management records to enable purchasing jurisdictions to effectively install, test, and operate the system.

Conformity assessment involves several different kinds of testing, including

  • Inspections, where the conformity of the voting system and manufacturer practices for configuration management and quality assurance are evaluated via expert review;
  • Hardware testing, where the ability of the system to tolerate the physical conditions of its operation, transportation and storage is evaluated;
  • Functional testing, where the conformity of the voting system's observable behaviors is evaluated;
  • Performance testing, where the satisfaction of specified benchmarks is either evaluated in specific tests or monitored concurrent with other testing;
  • Usability testing, where the performance is evaluated with human test subjects; and
  • Vulnerability testing, where the system's resistance to attack is evaluated.

Voting system hardware, software, communications and documentation are examined and tested to determine suitability for elections use. Examination and testing address the broad range of system functionality and components, including system functionality for pre-voting, voting, and post-voting functions. All products for election use are tested in accordance with the applicable procedures.

Tests are conducted for new systems seeking initial testing as well as for modified versions of systems that have been previously tested.

Not all systems are required to complete every category of testing. Consistent with Requirement Part 2: 5.1-D, the test lab may find that proven performance of COTS hardware, software and communications components in commercial applications other than elections obviates the need for certain specific evaluations. However, as most functional testing exercises the complete system, COTS components are always tested together with other components of the voting system. Similarly, if a previous version of the same system has been tested, the test lab may find that complete retesting would be redundant, but some tests that exercise the entire system are always conducted. The background and rationale for these decisions regarding the scope of testing must be documented in the test plan.

The accredited test lab determines which tests are necessary to reassess a modified system based on a review of the nature and scope of changes and other submitted information including the system documentation, manufacturer test documentation, configuration management records, and quality assurance information. The accredited test lab may determine that a modified system is subject only to limited retesting if the manufacturer demonstrates that the change does not affect demonstrated compliance with these VVSG for:

  • Performance of voting system functions;
  • Voting system security and privacy;
  • Overall flow of system control; and
  • The manner in which ballots are defined and interpreted, or voting data are processed.

Limited testing is intended to facilitate the correction of defects, the incorporation of improvements, the enhancement of portability and flexibility, and the integration of vote counting software with other systems and election software.

In all cases, the system documentation and configuration management records are examined to confirm that they completely and accurately reflect the components and component versions that comprise the voting system.
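The final cross-check described above amounts to a set comparison between the documented configuration and the components actually present in the submitted system. A minimal sketch (the component names, versions, and record format are invented for illustration; the VVSG does not prescribe them):

```python
def check_configuration(documented, installed):
    """Compare a documented component/version record against the components
    actually present. Both arguments map component name -> version string.
    Returns the discrepancies an examiner would need to resolve."""
    missing = sorted(set(documented) - set(installed))
    undocumented = sorted(set(installed) - set(documented))
    mismatched = sorted(
        name for name in set(documented) & set(installed)
        if documented[name] != installed[name]
    )
    return {"missing": missing, "undocumented": undocumented,
            "version_mismatch": mismatched}

# Hypothetical records, for illustration only.
documented = {"ballot-scanner-fw": "2.1.0", "tabulator": "5.4"}
installed = {"ballot-scanner-fw": "2.1.1", "tabulator": "5.4",
             "report-writer": "1.0"}
result = check_configuration(documented, installed)
print(result)  # flags the undocumented component and the version mismatch
```

An empty report in all three categories is what "completely and accurately reflect the components and component versions" would look like under this sketch.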

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

"The accredited test lab may determine that a modified system is subject only to limited retesting if the manufacturer demonstrates that the change does not affect demonstrated compliance with these VVSG for: Performance of voting system functions; Voting system security and privacy; Overall flow of system control; and The manner in which ballots are defined and interpreted, or voting data are processed." The EAC needs to ensure that not only functional changes are looked at but also system changes. For example, a new microprocessor could meet all the above requirements, but have flaws elsewhere if not thoroughly tested.

2.3 Testing Sequence

Tests and inspections required by these VVSG need not be conducted in any particular order. Test labs should organize the test campaign to maximize overall testing effectiveness, to test in as efficient a manner as possible, and to minimize the amount of regression testing that is incurred when nonconformities are found and corrected. Test anomalies and errors are communicated to the system manufacturer throughout the process.

1 Comment

Comment by Brian V. Jarvis (Local Election Official)

Since there are likely to be tests dependent upon the results of prior tests (e.g., progressive or cumulative tests), recommend that in those instances, the test lab shall document in advance the planned sequence or progression of tests.

2.4 Pre-Test Activities

Pre-test activities include the request for initiation of testing and the pre-test preparation.

 

2.4.1 Initiation of testing

Conformity assessment is conducted at the request of the manufacturer. The manufacturer must:

  • Request the performance of conformity assessment from among the accredited testing laboratories;
  • Enter into formal agreement with the accredited test lab for the performance of testing; and
  • Prepare and submit materials required for testing consistent with the requirements of the VVSG.

Conformity assessment is conducted for the initial version of a voting system as well as for all subsequent revisions to the system that are to be used in elections. As described in Part 3: 2.2 "Scope of Assessment", the nature and scope of testing for system changes or new versions is determined by the accredited test lab based on the nature and scope of the modifications to the system and on the quality of system documentation and configuration management records submitted by the manufacturer.

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

The EAC process should be referenced in this documentation. When is a lab the lead VSTL under the EAC program?

2.4.2 Pre-test preparation

Pre-test preparation encompasses the following activities:

  • The manufacturer and accredited test lab enter into an agreement for the testing to be performed by the accredited test lab;
  • The manufacturer prepares and submits a TDP to the accredited test lab. The TDP consists of the materials described in Part 2: Chapter 3: "Technical Data Package (manufacturer)";
  • The accredited test lab performs an initial review of the TDP for completeness and clarity and requests additional information as required;
  • The manufacturer provides additional information if requested by the accredited test lab;
  • The test lab witnesses the production of the implementation for testing;
  • The manufacturer delivers to the accredited test lab all hardware and software needed to perform testing.

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

The test lab witnesses the production of the implementation for testing; What is meant by this statement? It is very vague.

2.4.2.1 Documentation submitted by manufacturer

2.4.2.1-A Submit Technical Data Package

The manufacturer SHALL submit to the test lab a Technical Data Package conforming to the requirements of Part 2: Chapter 3: "Technical Data Package (manufacturer)".

Applies to: Voting system

DISCUSSION

The manufacturer must submit all the documentation necessary for the identification of the full system configuration submitted for evaluation and for the development of an appropriate test plan by the accredited test lab for conducting conformity assessment. This documentation collectively is referred to as the Technical Data Package (TDP). The TDP provides information that defines the voting system's design, method of operation, and related resources. It provides a system overview and documents the system's functionality, hardware, software, security, test specifications, operations procedures, maintenance procedures, and personnel deployment and training requirements. It also documents the manufacturer's configuration management plan and quality assurance program. If another version of the system was previously tested, the TDP would also include appropriate system change notes.

Source: [VVSG2005] II.1.5

2.4.2.2 Voting equipment submitted by manufacturer

Manufacturers may seek to market a complete voting system or an interoperable component of a voting system. In all instances, manufacturers must submit for testing the specific system configuration that will be offered to jurisdictions or that comprises the component to be marketed plus the other components with which the component is to be used. Under no circumstances will a component be assessed except as part of a complete voting system, and that assessment is valid only when that component is used with that same system (see Part 1: 2.3 "Conformance Designations").

2 Comments

Comment by Gail Audette (State Election Official)

Is this VVSG allowing for certification of individual components? If so, where is the guidance for component testing within end-to-end testing? What is the lab reporting for certification testing? How will this be reconciled with the NIST 150-22 definition of a voting system?

Comment by Frank Padilla (Voting System Test Laboratory)

Certification of individual components is not consistent with the current EAC guidelines or NIST guidelines on what a voting system is. Careful attention should be given to this subject as how this would affect cross market utilization and testing requirements.
2.4.2.2-A Submit system without COTS

If needed for compliance with Part 3: 2.4.3.4 "Unmodified COTS verification", the manufacturer SHALL supply the system with the COTS components omitted, for subsequent integration performed by or witnessed by the test lab.

Applies to: Voting system

DISCUSSION

See Part 3: 2.4.3.4 "Unmodified COTS verification".

Source: New requirement.

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

The implications of the VSTLs procuring all COTS equipment for testing could adversely affect the cost of testing, as we will not be procuring the quantities or have the deals that manufacturers have with vendors for supplying these items. VSTLs should verify that COTS are available and are the same as the manufacturers provide.
2.4.2.2-B Hardware equivalent to production version

The hardware submitted for conformity assessment SHALL be equivalent, in form and function, to the actual production version of the hardware units specified for use in the TDP.

Applies to: Voting system

Source: [VVSG2005] II.1.6.a

2 Comments

Comment by Brian V. Jarvis (Local Election Official)

Recommend enhancing the stated criteria to be "quality, form, fit, function, durability, performance, and reliability." The intent is for the hardware submitted for conformity assessment to be as identical as possible to the actual production version and not to just be "basically the same."

Comment by Frank Padilla (Voting System Test Laboratory)

Hardware should be THE same as the production version. VSTLs are supposed to test the actual version that is being sold to the customers, not prototypes. See 2.4.2.2-D.
2.4.2.2-C Logic equivalent to production version

The firmware and software submitted for conformity assessment SHALL be the exact firmware and software that will be used in production units.

Applies to: Voting system

Source: [VVSG2005] II.1.6.b

2.4.2.2-D No prototypes

Developmental prototypes SHALL NOT be submitted unless the manufacturer can show that the equipment to be tested is equivalent to standard production units both in performance and construction.

Applies to: Voting system

Source: [VVSG2005] II.1.6.c

3 Comments

Comment by Brian V. Jarvis (Local Election Official)

Recommend that this requirement be revised to never allow developmental prototypes to be submitted for conformity assessment. Best practices would never allow for this. However, if (for some reason) this must be allowed to occur, then recommend enhancing the stated criteria to be "quality, form, fit, function, durability, performance, reliability, and construction" (instead of just performance and construction). The intent is for the equipment submitted for conformity assessment to be as identical as possible to the actual production version and not to just be "basically the same." (Preference would be to never allow this practice at all.)

Comment by Frank Padilla (Voting System Test Laboratory)

Contradicts what is said in 2.4.2.2-B

Comment by Harry VanSickle (State Election Official)

Please provide the rationale for allowing developmental prototypes to be accepted for testing as opposed to only the standard production units that will be placed in counties. Does allowing the developmental prototypes in some way contravene the stepped-up requirements and security measures in this iteration of the guidelines?
2.4.2.2-E Benchmark directory listings

Benchmark directory listings SHALL be submitted for all software/firmware elements (and associated documentation) included in the manufacturer's release as they would normally be installed upon setup and installation.

Applies to: Voting system

Source: [VVSG2005] II.1.6.d
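
As a sketch of what such a benchmark directory listing might contain (the function name and fields here are illustrative, not mandated by the requirement), a listing can record each file's relative path, size, and a cryptographic digest so the listing from the manufacturer's release can be compared directly against a later installation:

```python
import hashlib
import os

def benchmark_listing(root):
    """Produce a benchmark directory listing: one (path, size, sha256)
    entry per file, sorted so listings from two installs can be compared
    directly. Paths are recorded relative to the installation root."""
    entries = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            rel = os.path.relpath(path, root)
            entries.append((rel, os.path.getsize(path), digest))
    return sorted(entries)
```

Comparing two such listings (reference versus installed) immediately surfaces missing, extra, or altered files.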

2.4.3 Initial system build by test lab

The following requirements describe how test labs are to perform the build of voting system software.

Previously built voting system software being updated may be able to use the requirements found in Part 3: 2.4.3.3 "Updating previously built voting system software executable code" to create the updated executable code including application logic, border logic, and third party logic.

1 Comment

Comment by Kevin Wilson (Voting System Test Laboratory)

The document does not mention that a copy of the public key of the "digital signature" must be maintained with the software and trusted build, or else the "digital signature" is effectively useless outside of the originating agency (and possibly even there, if the public key is lost). Perhaps a better and clearer way to do so is to specify that an X.509 certificate be included on the CD, thus more clearly defining the intent of this section.

2.4.3.1 Build environment establishment

2.4.3.1-A Test lab build environment assembly

The test lab SHALL assemble the build environment(s) used to create executable code including application logic, border logic, and third party logic.

Applies to: Voting system

Source: [EAC06] Section 5.6.1.2 and [VVSG2005] II.1.8.2.4

2.4.3.1-A.1 Witness of build environment assembly

At least one representative from the manufacturer SHALL witness the assembly of the build environment.

Applies to: Voting system

Source: [EAC06] Section 5.6 and [VVSG2005] II.1.8.2.4

1 Comment

Comment by Brian V. Jarvis (Local Election Official)

Recommend the requirement be established that this manufacturer representative must have demonstrated technical expertise and extensive, first-hand experience and knowledge of the build environment assembly.
2.4.3.1-A.2 Build environment establishment record

A representative from the test lab SHALL create a build environment establishment record that includes at a minimum: a unique identifier (such as a serial number) for the record; a list of unique identifiers of unalterable storage media associated with the record; the time, date, and location the build environment was established; names, affiliations, and signatures of all people present; copies of the procedures used to assemble the build environment; list of software and hardware used to establish the build environment; and the voting system associated with the build environment.

Applies to: Voting system

Source: [EAC06] Section 5.9
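
A minimal sketch of such an establishment record as structured data follows; the field names and values are illustrative assumptions for this example, not part of the requirement:

```python
import json
import time

def establishment_record(record_id, media_ids, location, attendees,
                         procedures, tools, voting_system):
    """Collect the minimum fields of a build environment establishment
    record into one structure that can be archived alongside the build."""
    return {
        "record_id": record_id,          # unique identifier, e.g. a serial number
        "media_ids": media_ids,          # identifiers of unalterable storage media
        "established": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "location": location,
        "attendees": attendees,          # names and affiliations; signatures kept on the paper copy
        "procedures": procedures,        # copies of the assembly procedures used
        "software_and_hardware": tools,  # tools used to establish the environment
        "voting_system": voting_system,  # system associated with the environment
    }

record = establishment_record(
    record_id="BE-0001",
    media_ids=["CD-R 0001"],
    location="Test lab",
    attendees=[{"name": "A. Tester", "affiliation": "Test lab"}],
    procedures=["TDP assembly procedure per Part 2: 3.5.4-D"],
    tools=["compiler 1.2", "build host SN 42"],
    voting_system="Example voting system 1.0",
)
print(json.dumps(record, indent=2))
```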

2.4.3.1-A.3 Build environment software and hardware procurement

The test lab SHALL obtain the software and hardware required to establish the build environment.

Applies to: Voting system

DISCUSSION

Requirement Part 2: 3.5.4-C documents the software and hardware required to assemble the build environment.

2 Comments

Comment by Brian V. Jarvis (Local Election Official)

This requirement needs to define where and how the test lab will obtain the software and hardware required to establish the build environment. (Requirement Part 2: 3.5.4-C only indicates that "...the manufacturer SHALL provide a list of all software and hardware required to assemble the build environment...")

Comment by Frank Padilla (Voting System Test Laboratory)

Should state that the manufacturer will provide the test lab with the official version of the software and hardware required to establish the build environment.
2.4.3.1-A.4 Open market procurement of COTS software and hardware

The test lab SHALL obtain COTS software and hardware required to assemble the build environment from the open market.

Applies to: Voting system

DISCUSSION

Note: manufacturers are required to supply non-COTS hardware and software as part of Requirement Part 3: 2.4.2.2-A.

4 Comments

Comment by Brian V. Jarvis (Local Election Official)

Recommend that any suppliers used for procurement of COTS software and hardware for the build environment meet the same quality requirements as were used to procure the voting machine hardware and software. This just makes sense, since the build environment must be (at least) as good as the product being tested (hopefully better, in order to detect defects). Also, recommend that the test lab maintain a register of these approved suppliers that includes the scope of the approval (i.e., the quality requirements used to determine the suppliers' approval status). This is the best method for developing a supply base that supports material requirements for quality and other testing objectives for the test lab.

Comment by Kevin Wilson (Voting System Test Laboratory)

It is common for current voting systems to rely upon COTS products that are no longer available from the open market. Is it the intention of this requirement to effectively remove those products from the market? Is there an alternative allowable chain-of-evidence for such situations?

Comment by Frank Padilla (Voting System Test Laboratory)

The implications of the VSTLs procuring all COTS equipment for testing could adversely affect the cost of testing as we will not be procuring the quantities or have discounts that the manufacturers offer to vendors for supplying these items. VSTLs should verify that COTS is available and the same as what the manufacturers provide.

Comment by Premier Election Solutions (Manufacturer)

Due to the fast changing COTS market and the slow pace of Voting System Certification and Development there is a very real possibility that the COTS software required to build the product will no longer be available on the open market.
2.4.3.1-A.5 Erasable storage media preparation

The test lab SHALL remove any previously stored information on erasable storage media in preparation for using the media to assemble the build environment.

Applies to: Voting system

DISCUSSION

The purpose of this requirement is to prepare erasable storage media for use by the build environment. The requirement does not require the prevention of previously stored information leakage or recovery. Simply deleting files from file systems and flash memory cards, and removing electrical power from volatile memory, satisfies this requirement.

Source: [EAC06] Section 5.6.1.1

2 Comments

Comment by Brian V. Jarvis (Local Election Official)

EAC06, Section 5.6.1.1 states that "The device that will hold the build environment shall be completely erased by the VSTL to ensure a total and complete cleaning of it. The VSTL shall use commercial off-the-shelf software, purchased by the laboratory, for cleaning the device." It would not appear that "simply deleting files" meets the criteria defined in EAC06 5.6.1.1 to "...ensure a total and complete cleaning...." (In fact, EAC06 5.6.1.1 would not require the use of "COTS, purchased by the laboratory, for cleaning the device" if a simple deleting of files would suffice!) Data remanence plays a major role when storage media is erased for the purposes of reuse or release. (Data remanence is the residual physical representation of data that has been in some way erased.) After storage media is erased there may be some physical characteristics that allow data to be reconstructed. The integrity of the conformity assessment process will be compromised if this possibility is not accounted for.

Comment by Diane Gray (Voting System Test Laboratory)

In the discussion it is stated that simply deleting files...satisfies the requirement. The source cited is the EAC Testing and Certification Program Manual. The section cited states that the device holding the build environment shall use commercial off-the-shelf software to clean the device. This seems to contradict the VVSG requirement. Also, any other upper-level procedures for deleting files should be referenced.
2.4.3.1-A.6 Build environment assembly

The test lab SHALL use the procedures found in the TDP to assemble the build environment.

Applies to: Voting system

DISCUSSION

Requirement Part 2: 3.5.4-D documents the procedures to assemble the build environment. Test lab personnel can have manufacturers provide guidance during the assembly of the build environment, but test lab personnel must perform the actual assembly.

Source: [EAC06] Section 5.6.1.2

2.4.3.1-A.7 Build environment assembly deviation record requirement

The test lab SHALL document as part of the build environment establishment record the reason for any deviation from assembly procedures found in the TDP.

Applies to: Voting system

DISCUSSION

Requirement Part 2: 3.5.4-D documents the procedures used to assemble the build environment.

Source: [EAC06] Section 5.9

2 Comments

Comment by Brian V. Jarvis (Local Election Official)

Note that if deviation from the assembly procedures results in a different build environment than what was used by the voting machine manufacturer, then this should not be allowed. Need to ensure that the manufacturer can demonstrate that validation activities were performed successfully. (Validation: to demonstrate that a product or product component fulfills its intended use when placed in its intended environment.) Need to ensure that identical assembly procedures occur for an identical build environment.

Comment by Diane Gray (Voting System Test Laboratory)

In the discussion it is stated that simply deleting files...satisfies the requirement. The source cited is the EAC Testing and Certification Program Manual. The section cited states that the device holding the build environment shall use commercial off-the-shelf software to clean the device. This seems to contradict the VVSG requirement. Also, any other upper-level procedures for deleting files should be referenced.
2.4.3.1-A.8 Build environment digital signature verification

When digital signatures are associated with software, the test lab SHALL verify digital signatures before using the software for the build environment.

Applies to: Voting system

DISCUSSION

The digital signatures associated with the build environment may be from the manufacturer of the software, National Software Reference Library (NSRL), or other authoritative sources.

Source: [EAC06] Section 5.6.2.1

2 Comments

Comment by Brian V. Jarvis (Local Election Official)

Note that if deviation from the assembly procedures results in a different build environment than what was used by the voting machine manufacturer, then this should not be allowed. Need to ensure that the manufacturer can demonstrate that validation activities were performed successfully. (Validation: to demonstrate that a product or product component fulfills its intended use when placed in its intended environment.) Need to ensure that identical assembly procedures occur for an identical build environment.

Comment by Frank Padilla (Voting System Test Laboratory)

The Test Lab should determine if changes should be incorporated into the TDP and state why not, if not included.
2.4.3.1-A.9 Build environment digital signature verification record

The test lab SHALL record as part of the build environment establishment record the results of digital signature verification including who generated the signature.

Applies to: Voting system

Source: [EAC06] Section 5.9

2.4.3.1-A.10 Build environment pre-build binary image copy

The test lab SHALL copy the binary image of the assembled build environment to unalterable storage media.

Applies to: Voting system

DISCUSSION

This requirement creates a snapshot of the build environment before it is used to build the voting system software executable code. Unalterable storage media includes technology such as a CD-R, but not CD-RW.

2.4.3.1-A.11 Build environment pre-build binary image digital signature

The test lab SHALL create a digital signature for the binary image of the build environment, and include the digital signature on the unalterable storage media with the binary image.

Applies to: Voting system

Source: [EAC06] Section 5.6.1.3
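
The snapshot-and-sign step above can be sketched as follows. This is a simplified illustration: it writes a detached SHA-256 digest file to accompany the image on the unalterable media, whereas the requirement calls for a digital signature, which the test lab would in practice generate over this digest with its signing key:

```python
import hashlib

def image_digest_file(image_path):
    """Compute the SHA-256 of a build environment binary image and write
    it next to the image as a detached digest file (sha256sum format), to
    be burned with the image onto unalterable media such as CD-R."""
    h = hashlib.sha256()
    with open(image_path, "rb") as f:
        # Stream in chunks; binary images are typically too large to slurp.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    digest_path = image_path + ".sha256"
    with open(digest_path, "w") as out:
        out.write(h.hexdigest() + "  " + image_path + "\n")
    return digest_path
```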

2.4.3.2 Build of voting system software executable code

Previously built voting system software being updated may be able to use Requirement Part 3: 2.4.3.3 to create the updated executable code including application logic, border logic, and third party logic.

2.4.3.2-A Use of established build environment

The test lab SHALL build the executable code including application logic, border logic, and third party logic of the voting system using the established build environment.

Applies to: Voting system

DISCUSSION

The build environment is established using the requirements in Part 3: 2.4.3.1 "Build environment establishment".

Source: [EAC06] and [VVSG2005] II.1.8.2.4

2.4.3.2-A.1 Witness of voting system software build

At least one representative from the manufacturer SHALL witness the build of executable code including application logic, border logic, and third party logic of the voting system.

Applies to: Voting system

Source: [EAC06] Section 5.6

2.4.3.2-A.2 Voting system software build record

A representative from the test lab SHALL create an executable code build record that includes at a minimum: a unique identifier (such as a serial number) for the record; a list of unique identifiers of unalterable storage media associated with the record; the time, date, and location of the build; names, affiliations, and signatures of all people present; filenames of the source code and resulting executable code; voting system software version; name and version of the voting system (including certification number, if possible); and copies of the procedures used to build the voting system software executable code.

Applies to: Voting system

Source: [EAC06] Section 5.9

2.4.3.2-A.3 Voting system software digital signature verification

The test lab SHALL validate manufacturer digital signatures on voting system software source code before placing source code on the build environment.

Applies to: Voting system

DISCUSSION

Requirement Part 3: 2.6.2.4-D requires manufacturers to provide voting system software source code with digital signatures as part of the TDP.

Source: [EAC06] Section 5.6.2.1
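
A sketch of this validation step, under the assumption that the manufacturer publishes a SHA-256 digest for each source file. This shows only the digest half of the check; verifying an actual digital signature additionally requires the manufacturer's public key (for example, distributed in an X.509 certificate):

```python
import hashlib

def verify_source_digest(path, expected_sha256):
    """Recompute a source file's SHA-256 and compare it with the value
    the manufacturer published for that file. Raises on mismatch so the
    file is never placed on the build environment unverified."""
    with open(path, "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()
    if actual != expected_sha256.lower():
        raise ValueError(
            f"{path}: digest mismatch; do not place on build environment")
    return actual
```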

2.4.3.2-A.4 Voting system software digital signature verification result record

The results of digital signature validation including who generated the signature SHALL be part of the executable code build record for voting system software.

Applies to: Voting system

Source: [EAC06] Section 5.9

2.4.3.2-A.5 Voting system software build

The test lab SHALL use the procedures found in the TDP to build the voting system software executable code including application logic, border logic, and third party logic.

Applies to: Voting system

DISCUSSION

Requirement Part 2: 3.5.4-E documents the procedures to build voting system software executable code. Test lab personnel can have manufacturers provide guidance during the build of the voting system executable code, but test lab personnel must perform the actual build.

Source: [EAC06] Section 5.6.3

2.4.3.2-A.6 Voting system software executable code build deviation record

The test lab SHALL document as part of the executable code build record the reason for any deviation from build procedures found in the TDP.

Applies to: Voting system

DISCUSSION

Requirement Part 2: 3.5.4-E documents the procedures to build voting system software executable code.

Source: [EAC06] Section 5.9

2 Comments

Comment by Brian V. Jarvis (Local Election Official)

Recommend that no deviation from the software executable code build procedures be permitted. This goes to the heart of the integrity of the voting machine software. If the software build process is not repeatable, then nothing can be assured. If the code build procedures are not stable, then this puts into question all verification and validation activities performed by the manufacturer. (Verification: to ensure that the selected work products meet their specified requirements.) (Validation: to demonstrate that a product or product component fulfills its intended use when placed in its intended environment.) Need to ensure that build procedures from the manufacturer are accurately described and repeatable.

Comment by Frank Padilla (Voting System Test Laboratory)

The Test Lab should determine if changes should be incorporated into the TDP and state why not, if not included.
2.4.3.2-A.7 Build environment post build binary image

After voting system software executable code including application logic, border logic, and third party logic has been built, the test lab SHALL copy the binary image of the build environment (including source and executable code) to unalterable storage media.

Applies to: Voting system

DISCUSSION

This requirement creates a snapshot of the build environment after it has been used to build voting system software executable code. Unalterable storage media includes technology such as a CD-R, but not CD-RW.

Source: [EAC06] Section 5.6.2.3

2.4.3.2-A.8 Build environment post build binary image digital signature

After voting system software executable code including application logic, border logic, and third party logic has been built, the test lab SHALL create a digital signature for the binary image of the build environment (including source and executable code), and include the digital signature on the unalterable storage media with the binary image.

Applies to: Voting system

Source: [EAC06] Section 5.6.2.2

2.4.3.3 Updating previously built voting system software executable code

The following voting system software build requirements apply when updates to previously built voting system software have occurred. These requirements assume the original build environment can be used to create the updated software and that a significant portion of the original software is not being updated. If the original build environment cannot be used, or a significant portion of the original software is being updated, then the requirements of Part 3: 2.4.3.1 "Build environment establishment" and Part 3: 2.4.3.2 "Build of voting system software executable code" apply.


3 Comments

Comment by Brian V. Jarvis (Local Election Official)

The last sentence in this paragraph needs to be completed.

Comment by k (Voting System Test Laboratory)

This section has inverted the 2005 requirements by requiring a trusted build prior to a witness build. It is not clear how these two terms are being used and what the difference is between a trusted and a witnessed build. This needs to be clarified.

Comment by Harry VanSickle (State Election Official)

Recommend avoiding the use of the term "significant portion" that can be subject to different interpretations. At the very least, this term should be more clearly defined or quantified.
2.4.3.3-A Witness of build for previously built voting system software

At least one representative from the manufacturer SHALL witness the establishment of the post build environment associated with the previously built voting system software, and the build of the updated voting system software executable code including application logic, border logic, and third party logic.

Applies to: Voting system

DISCUSSION

This requirement does not modify the requirement found in Section 5.6 of the EAC Testing and Certification Program Manual [EAC06] requiring a representative from both the manufacturer and test lab to be present during the build.

Source: [EAC06] Section 5.6

2.4.3.3-B Original post build environment re-establishment

The test lab SHALL establish the build environment using the original post build environment binary image associated with the previously built voting system software.

Applies to: Voting system

DISCUSSION

Requirements Part 3: 2.4.3.2-A.7 and Part 3: 2.4.3.2-A.8 create the post build binary image of the original built voting system software developed by the manufacturer. If the test lab does not possess the required hardware and software to create the build environment, then Requirements Part 3: 2.4.3.2-A.7 and Part 3: 2.4.3.2-A.8 apply. This requirement extends the requirement found in [EAC06] Sections 5.6.4.1 and 5.6.4.2 by explicitly stating that the original build environment needs to be established.

Source: [EAC06] Section 5.6.4.1 and 5.6.4.2

1 Comment

Comment by Gail Audette (Voting System Test Laboratory)

The final statement "If the test lab does not possess the required hardware and software to create the build" conflicts with requirement of 2.4.3.1-A.4.
2.4.3.3-B.1 Erasable storage media preparation

The test lab SHALL remove previously stored information on erasable storage media in preparation for using the media to establish the build environment.

Applies to: Voting system

DISCUSSION

The purpose of this requirement is to prepare the erasable storage media for use by the original post build environment. The requirement does not require the prevention of previously stored information leakage or recovery. Simply deleting files from the file system and flash memory cards, and removing electrical power from volatile memory, satisfies this requirement.

Source: [EAC06] Section 5.6.1.1

2 Comments

Comment by Brian V. Jarvis (Local Election Official)

EAC06, Section 5.6.1.1 states that "The device that will hold the build environment shall be completely erased by the VSTL to ensure a total and complete cleaning of it. The VSTL shall use commercial off-the-shelf software, purchased by the laboratory, for cleaning the device." It would not appear that "simply deleting files" meets the criteria defined in EAC06 5.6.1.1 to "...ensure a total and complete cleaning...." (In fact, EAC06 5.6.1.1 would not require the use of "COTS, purchased by the laboratory, for cleaning the device" if a simple deleting of files would suffice!) Data remanence plays a major role when storage media is erased for the purposes of reuse or release. (Data remanence is the residual physical representation of data that has been in some way erased.) After storage media is erased there may be some physical characteristics that allow data to be reconstructed. The integrity of the conformity assessment process will be compromised if this possibility is not accounted for.

Comment by Diane Gray (Voting System Test Laboratory)

As commented in Part 3, Chapter 2.4.3.1-A.5: In the discussion it is stated that simply deleting files...satisfies the requirement. The source cited is the EAC Testing and Certification Program Manual. The section cited states that the device holding the build environment shall use commercial off-the-shelf software to clean the device. This seems to contradict the VVSG requirement. Also, any other upper-level procedures for deleting files should be referenced.
2.4.3.3-B.2 Original post build environment re-establishment digital signature verification

The test lab SHALL verify the digital signature of the original post build binary image associated with the previously built voting system software before using the binary image to establish the build environment.

Applies to: Voting system

DISCUSSION

This requirement does not modify the requirement found in Section 5.6.4.1 of the EAC Testing and Certification Program Manual [EAC06] that states the file signature of the build environment needs to be verified before use.

Source: [EAC06] Section 5.6.4.1

2.4.3.3-B.3 Original post build environment re-establishment digital signature verification record

The result of digital signature verification including who generated the signature SHALL be part of the original post build environment establishment record.

Applies to: Voting system

Source: [EAC06] Section 5.9

2.4.3.3-B.4 Original post build environment re-establishment record

A representative from the test lab SHALL create an original post build environment establishment record that includes at a minimum: a unique identifier (such as a serial number) for the record; a list of unique identifiers of unalterable storage media associated with the record; the time, date, and location the original post build environment was established; names, affiliations, and signatures of all people present; copies of the procedures used to assemble the original post build environment; list of software and hardware used to establish the original post build environment; and the voting system associated with the original post build environment.

Applies to: Voting system

DISCUSSION

This requirement updates the requirement found in Section 5.9 of the EAC Testing and Certification Program Manual [EAC06] by specifying the information needed to be documented when establishing the build environment.

Source: [EAC06] Section 5.9

2.4.3.3-C Build of updated voting system software executable code

The test lab SHALL build the executable code including application logic, border logic, and third party logic of the updated voting system software.

Applies to: Voting system

DISCUSSION

This requirement does not modify the requirement found in Section 5.6.4.2 of the EAC Testing and Certification Program Manual [EAC06] that states the executable files are created, and extends the requirement found in Section 1.8.2.4 of [VVSG2005] Volume II by requiring the use of the build environment established in Part 3: 2.4.3.1 "Build environment establishment".

Source: [EAC06] Section 5.6.4.2 and [VVSG2005] II.1.8.2.4

2.4.3.3-C.1 Updated voting system software source code digital signature verification

The test lab SHALL validate manufacturer digital signatures on updated voting system software source code before placing the updated source code on the build environment.

Applies to: Voting system

DISCUSSION

This requirement modifies the requirement found in Section 5.6.4.2 of the EAC Testing and Certification Program Manual [EAC06] by constraining the verification to a digital signature rather than a "file signature" (which could be a hash value or a digital signature); extends Section 5.6.2.1 by specifying that the verification happen before software is installed on the build environment; and does not call for the digital signature of the build environment to be verified before installing the source code.

Source: [EAC06] Section 5.6.4.2

1 Comment

Comment by Kevin Wilson (Voting System Test Laboratory)

Here the concept of a "digital signature" is declared to be a constraint to a "file signature". It would be better to make this declaration in section 2.4.3.2-A.8 addressing the Trusted Build and not here where the Witness build is addressed.
2.4.3.3-C.2 Updated voting system software source code digital signature verification record

The result of digital signature verification including who generated the signature SHALL be part of the updated voting system software build record.

Applies to: Voting system

DISCUSSION

Requirement Part 3: 2.6.2.4-D requires manufacturers to provide voting system software source code with digital signatures as part of the TDP. This requirement updates the requirement found in Section 5.9 of the EAC Testing and Certification Program Manual [EAC06] by specifying the results of digital signature verification needs to be documented as part of the record when building the executable code.

Source: [EAC06] Section 5.9

1 Comment

Comment by Kevin Wilson (Voting System Test Laboratory)

This section describes greater definition over a witness build than the trusted build from which it owes its trust. Why isn't this information in the preceding trusted build section? (see comment submitted to 2.4.3.3)
2.4.3.3-C.3 Updated voting system software build procedures

The test lab SHALL use the procedures found in the TDP to build the updated voting system software executable code including application logic, border logic, and third party logic.

Applies to: Voting system

DISCUSSION

Requirement Part 2: 3.5.4-G documents the procedures to build the updated voting system software executable code. Test labs can have manufacturers assist in the build of the updated voting system software executable code. This requirement extends the requirement found in Section 5.6.4.2 of the [EAC06] by specifying the use of the manufacturer-supplied procedures to build the updated voting system software.

Source: [EAC06] Section 5.6.4.2

2.4.3.3-C.4 Updated voting system software build record

A representative from the test lab SHALL create an executable code build record that includes at a minimum: a unique identifier (such as a serial number) for the record; a list of unique identifiers of unalterable storage media associated with the record; the time, date, and location of the build; names, affiliations, and signatures of all people present; filenames of the source code and resulting executable code; voting system software version; name and version of the voting system (including certification number, if possible); and copies of the procedures used to build the updated voting system software executable code.

Applies to: Voting system

DISCUSSION

This requirement updates the requirement found in Section 5.9 of the [EAC06] by specifying the information needed to be documented when creating updated executable code.

Source: [EAC06] Section 5.9

2.4.3.3-C.5 Updated build environment post build binary image

After updated voting system software executable code including application logic, border logic, and third party logic has been built, the test lab SHALL copy the binary image of the updated build environment (including source and executable code) to unalterable storage media.

Applies to: Voting system

DISCUSSION

This requirement creates a snapshot of the updated build environment after it has been used to build the updated voting system software executable code. Unalterable storage media includes technology such as a CD-R, but not CD-RW. This requirement differs from the requirement found in Section 5.6.2.3 of the [EAC06] by creating the binary image after, instead of before, the updated software executable code has been built.

Source: [EAC06] Section 5.6.2.3

2.4.3.3-C.6 Updated build environment post build binary image digital signature

After updated voting system software executable code including application logic, border logic, and third party logic has been built, the test lab SHALL create a digital signature for the binary image of the updated build environment (including source and executable code), and include the digital signature on the unalterable storage media with the binary image.

Applies to: Voting system

DISCUSSION

This requirement differs from the requirement found in Section 5.6.2.2 of the [EAC06] by creating a digital signature on the binary image after the software executable code has been built, as opposed to a "file signature" (which could be a hash value or a digital signature) created before the software executable code is built; note, however, that requirement 5.6.3.1 of the EAC Testing and Certification Program Manual requires "file signatures" for updated executable code.

Source: [EAC06] Section 5.6.2.2

1 Comment

Comment by Kevin Wilson (Voting System Test Laboratory)

Yet again the requirements fail to mention a copy of the public key of the digital signature without which the digital signature is so many useless bits. Worse even than a "file signature" since without the public key it is undecipherable to the hash (file signature) it contains. (see comment 2.4.3)

2.4.3.4 Unmodified COTS verification

The following requirements describe how test labs are to verify that products identified as COTS are unmodified when used by the voting system.

2.4.3.4-A COTS assembly and configuration documentation

The manufacturer SHALL document the procedures used to assemble and configure unmodified COTS components into the system supplied in Requirement Part 3: 2.4.2.2-A.

Applies to: Voting system

DISCUSSION

Test labs will assemble and configure unmodified COTS components into the voting system using the documentation provided by this requirement. Requirement Part 2: 4.4.1-A subitem e identifies all COTS components in the voting system, and Requirement Part 2: 3.8-D requires configuration data for unmodified COTS to be documented.

Source: COTS verification process per STS and CRT consensus, June 2006

2.4.3.4-B Obtain COTS Off the shelf

Test labs SHALL obtain COTS components identified in Requirement Part 2: 4.4.1-A subitem 5 from open market suppliers of COTS components.

Applies to: Voting system

DISCUSSION

Test labs will procure the COTS components "off-the-shelf" from suppliers of the COTS components.

 

3 Comments

Comment by Brian V. Jarvis (Local Election Official)

Recommend that since the test labs are ultimately responsible for the quality of all products purchased from suppliers, the test labs should evaluate and select suppliers based on their ability to supply products in accordance with the test lab’s requirements. A register of these "approved suppliers" should be maintained. (The criteria for selection, evaluation and re-evaluation of these suppliers should be established.)

Comment by Kevin Wilson (Voting System Test Laboratory)

Does the NIST envision that voting systems will provide such a large market for COTS as to keep the COTS software available on the open market for as long as the voting system itself is marketable? The standard should address acceptable contingent methods.

Comment by Frank Padilla (Voting System Test Laboratory)

The implications of the VSTLs procuring all COTS equipment for testing could adversely affect the cost of testing, as we will not be procuring the quantities or have the deals that manufacturers have with vendors for supplying these items. VSTLs should verify that COTS are available and the same as the manufacturers provide.
2.4.3.4-C COTS assembly and configuration witness

At least one representative from the test lab and manufacturer SHALL witness the assembly and configuration of the COTS components into the voting system.

Applies to: Voting system

 
2.4.3.4-C.1 Test lab assembly and configuration of COTS

The test lab SHALL assemble and configure the COTS components into the voting system.

Applies to: Voting system

 
2.4.3.4-C.2 Test lab record of COTS assembly and configuration

The test lab SHALL document and maintain a record of the COTS assembly and configuration that includes, at a minimum: a unique identifier for each record; the time and date and location of the voting system build; names, affiliations, and signatures of all people present; copies of the procedures used to assemble and configure the COTS components; and identification of the voting system.

Applies to: Voting system
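A minimal sketch of such a record might look like the following. The field names and rendering are invented for illustration; the requirement fixes only the content that must be captured, not any particular data structure.

```python
from dataclasses import dataclass

@dataclass
class AssemblyRecord:
    """One COTS assembly/configuration record per 2.4.3.4-C.2.
    Field names are illustrative only."""
    record_id: str                    # unique identifier for the record
    built_at: str                     # time, date, and location of the build
    people: list[tuple[str, str]]     # (name, affiliation) of all present;
                                      # signatures are collected on the paper copy
    procedures: list[str]             # procedures used to assemble/configure
    system_id: str                    # identification of the voting system

    def render(self) -> str:
        # Flatten the record into plain text suitable for a signed hard copy.
        lines = [f"Record: {self.record_id}",
                 f"Build: {self.built_at}",
                 f"System: {self.system_id}"]
        lines += [f"Present: {name} ({affiliation})"
                  for name, affiliation in self.people]
        lines += [f"Procedure: {proc}" for proc in self.procedures]
        return "\n".join(lines)
```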

 
2.4.3.4-C.3 Document deviations from COTS assembly and configuration documentation

The test lab SHALL document deviations from the manufacturer documentation submitted for assembly and configuration of the COTS components.

Applies to: Voting system

 

2 Comments

Comment by Brian V. Jarvis (Local Election Official)

If the assembly and configuration of the COTS components is different from that submitted by the manufacturer, then this should not be allowed. Need to ensure that the manufacturer can demonstrate (and document) that validation activities were performed successfully. (Validation: to demonstrate that a product or product component fulfills its intended use when placed in its intended environment.) Need to ensure that identical assembly and configuration procedures occur for identical COTS components.

Comment by Frank Padilla (Voting System Test Laboratory)

Test Lab should determine if changes should be incorporated into the TDP and state why not, if not included.

2.5 Testing

Testing encompasses the preparation of a test plan, the establishment of the appropriate test conditions, the use of appropriate test fixtures, the witness of the system build and installation, the maintenance of test data, and the evaluation of the data resulting from tests and examinations.

 

2.5.1 Test plan

2.5.1-A Prepare test plan

The accredited test lab SHALL prepare a test plan to define all tests and procedures required to assess conformity to the VVSG, including:

  1. Verifying or checking equipment operational status by means of manufacturer operating procedures;
  2. Establishing the test environment or the special environment required to perform each test;
  3. Initiating and completing operating modes or conditions necessary to evaluate the specific performance characteristics under test;
  4. Measuring and recording the value or range of values for the characteristics to be tested, demonstrating expected performance levels;
  5. Verifying, as above, that the equipment is still in normal condition and status after all required measurements have been obtained;
  6. Confirming that documentation submitted by the manufacturer corresponds to the actual configuration and operation of the system; and
  7. Confirming that documented manufacturer practices for quality assurance and configuration management comply with the VVSG.

Applies to: Voting system

DISCUSSION

Requirements on the content of the test plan are contained in Part 2: Chapter 5: "Test Plan (test lab)".

Source: [VVSG2005] II.1.8.2.1

 

2.5.2 Test conditions

The accredited test lab may perform the tests in any facility capable of supporting the test environment.

 

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

The accredited test lab may perform the tests in any facility capable of supporting the test environment. Directly contradicts EAC Lab accreditation manual currently under review, Section 2.11.6.
2.5.2-A Witness test preparation

Preparations for testing, arrangement of equipment, verification of equipment status, and the execution of procedures SHALL be witnessed by at least one independent, qualified observer, who SHALL attest that all test and data acquisition requirements have been satisfied.

Applies to: Voting system

Source: [VSS2002] II.9.6.2.2.a

 

4 Comments

Comment by Brian V. Jarvis (Local Election Official)

For this requirement, need to define "independent" and "qualified observer." Also, need to describe how this individual is selected, how the observer will "attest", how this will be documented, etc.

Comment by Frank Padilla (Voting System Test Laboratory)

What is meant by tests "shall be witnessed by at least one independent qualified observer"? Does this mean the labs must now hire another person or agency to oversee testing?

Comment by Harry VanSickle (State Election Official)

Recommend more clearly defining "independent, qualified observer."

Comment by E Smith/J Homewood (Manufacturer)

There is a requirement for "at least one independent, qualified observer" to observe the setup. Who determines the person is independent and qualified? How does this observer get paid and stay independent? Will this be a representative of the EAC permanently available on site?
2.5.2-B Ambient conditions

When a test is to be performed at "standard" or "ambient" conditions, this SHALL refer to a nominal laboratory or office environment with a temperature in the range of 20.0 °C to 23.9 °C (68 °F to 75 °F) and prevailing atmospheric pressure and relative humidity.

Applies to: Voting system

Source: [VVSG2005] II.1.8.2.2.b

2.5.2-C Tolerances for specified temperatures and voltages

When a test is to be performed at conditions other than "standard" or "ambient," the test SHALL be performed at the required temperature and electrical supply voltage, regulated within the following tolerances:

  1. Temperature ± 2.2 °C (± 4 °F)
  2. AC electrical supply voltage ± 2 V

Applies to: Voting system

Source: [VVSG2005] II.1.8.2.2.c
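Taken together, 2.5.2-B and 2.5.2-C define simple numeric acceptance checks for the test environment. They can be sketched as follows; the function names are invented for the example.

```python
AMBIENT_RANGE_C = (20.0, 23.9)  # 2.5.2-B: 68 degrees F to 75 degrees F
TEMP_TOLERANCE_C = 2.2          # 2.5.2-C: temperature regulated within +/- 4 degrees F
VOLTAGE_TOLERANCE_V = 2.0       # 2.5.2-C: AC supply voltage within +/- 2 V

def ambient_ok(temp_c: float) -> bool:
    """True when a 'standard'/'ambient' test condition is met."""
    low, high = AMBIENT_RANGE_C
    return low <= temp_c <= high

def regulated_ok(temp_c: float, target_temp_c: float,
                 supply_v: float, target_v: float) -> bool:
    # Non-ambient tests must hold both the chamber temperature and the
    # AC supply voltage within the regulated tolerances around the
    # values the test specifies.
    return (abs(temp_c - target_temp_c) <= TEMP_TOLERANCE_C
            and abs(supply_v - target_v) <= VOLTAGE_TOLERANCE_V)
```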

 

2.5.3 Test fixtures

2.5.3-A Complete system testing

Except as provided in Requirement Part 3: 2.5.3-B, the test lab SHALL NOT use simulation devices or software that bypass portions of the voting system that would be exercised in an actual election.

Applies to: Voting system

DISCUSSION

Devices or software that closely and validly simulate actual election use of the system are permissible. If a tabulator is specified to count paper ballots that are manually-marked with a specific writing utensil, it is not valid to substitute ballots that were mechanically marked by a printer. However, ballots that were marked according to manufacturer instructions can sometimes be recycled through a tabulator without invalidating the test. Limitations on this practice are provided in Requirement Part 3: 5.2.3-D.

2.5.3-B Exceptions to complete system testing

The test lab may bypass the user interface of an interactive device in the case of environmental tests that:

  1. Would require subjecting test "voters" to unsafe or unhealthy conditions; or
  2. Would be invalidated by the presence of a test "voter."

Applies to: Voting system

2.5.4 Test data requirements

2.5.4-A Test log

A test log of the procedure SHALL be maintained. This log SHALL identify the system and equipment by model and serial number.

Applies to: Voting system

Source: [VVSG2005] II.1.8.2.5.a

2.5.4-B Test environment conditions

Test environment conditions SHALL be recorded.

Applies to: Voting system

Source: [VVSG2005] II.1.8.2.5.b

2.5.4-C Items to be logged

All operating steps, the identity and quantity of simulated ballots, annotations of output reports, the elapsed time for each procedure step, observations of equipment performance, and, in the case of non-operating hardware tests, the condition of the equipment SHALL be recorded.

Applies to: Voting system

Source: [VVSG2005] II.1.8.2.5.c

2.5.5 Test practices

2.5.5-A Conduct all tests

The accredited test lab SHALL conduct the examinations and tests defined in the test plan to determine compliance with the voting system requirements described in Part 1 and Part 2.

Applies to: Voting system

Source: [VVSG2005] II.1.8.2.6

2.5.5-B Log all anomalies

If any failure, malfunction or data error is detected, its occurrence and the duration of operating time preceding it SHALL be recorded for inclusion in the analysis of data obtained from the test.

Applies to: Voting system

Source: [VVSG2005] II.1.8.2.6.a

1 Comment

Comment by Frank Padilla (Voting System Test Laboratory)

Anomalies should also be reported to EAC per guidelines.
2.5.5-C Critical software defects are unacceptable

If a logic defect is responsible for the incorrect recording, tabulation, or reporting of a vote, the test campaign SHALL be terminated and the system SHALL be rejected.

Applies to: Voting system

DISCUSSION

Conformity assessment is not quality assurance. If a critical software defect is found, the system cannot be considered trustworthy even after the known fault is corrected, because the cases that the test lab does not have the opportunity to test can be expected to conceal similar faults. Any subsequent testing of a system based on or derived from the rejected system requires a new application and starting over.

Source: [GPO90] 7.1.1, [VSS2002] Overview, [VVSG2005] II.1.8.2.6.b

 

2 Comments

Comment by Gail Audette (Voting System Test Laboratory)

The term "a logic defect" is so broad that this requirement is unworkable. If a recording, reporting or tabulation defect is found during functional testing, how are the labs to interpret whether the defect is a "critical software defect"? How can it be ensured that all the labs can apply such nebulous criteria consistently? The VVSG definition of a logic defect is a "fault in software, firmware or hard wired logic". If the test campaign is terminated and the system rejected for recording, reporting or tabulation "logic defect", is the new application a software application or an EAC application?

Comment by Cem Kaner (Academic)

I agree with the VVSG comment that "the system cannot be considered trustworthy even after the known fault is corrected, because the cases that the test lab does not have the opportunity to test can be expected to conceal similar faults."

However, mitigation of this risk does not merely require "a new application and starting [the testing] over" (the VVSG requirement). Genuine mitigation of this risk requires new tests that look for those "similar faults" in other parts of the code that have not been tested this way for this particular type of error.

(Affiliation Note: IEEE representative to TGDC)
2.5.5-D Software defects are not field-serviceable

If a logic defect is found that is not responsible for the incorrect recording, tabulation, or reporting of a vote, the test campaign SHALL be suspended and the system returned to the manufacturer for correction and quality assurance.

Applies to: Voting system

DISCUSSION

Rejection may be a foregone conclusion if sufficient evidence has been collected to show that the reliability benchmark is not satisfied (see Part 3: 5.3.3 "Reliability"). Notwithstanding that, the manufacturer will be given the opportunity to correct noncritical software defects. Revisions to the software must be performed within the manufacturer's quality assurance and configuration management processes and must undergo manufacturer regression testing before the conformity assessment process is resumed. When it is resumed, the test plan should be revised to include regression testing for the change that was made.

Source: [VVSG2005] II.1.8.2.6.b, clarified and strengthened

2.5.5-E Hardware failures are field-serviceable

If the anomaly is other than a logic defect, and if corrective action is taken to restore the equipment to a fully operational condition within eight hours, then the test campaign may be resumed at the point of suspension.

Applies to: Voting system

DISCUSSION

Rejection may be a foregone conclusion if sufficient evidence has been collected to show that the reliability benchmark is not satisfied (see Part 3: 5.3.3 "Reliability"). Notwithstanding that, the manufacturer may replace a component that has suffered a random failure, or the manufacturer may opt to suspend the test campaign in order to correct a hardware design defect that caused a nonrandom failure.

Source: [VVSG2005] II.1.8.2.6.c

3 Comments

Comment by Carolyn Coggins (Voting System Test Laboratory)

In the discussion it indicates manufacturers may replace a component that has suffered a random failure. Please provide explicit instruction on when a component may be replaced. Providing a discussion that says it may be replaced and referencing a section that says all testing is pertinent to reliability except when you force the system to fail or bypass functionality is confusing. The issue of replacement of a random failure is important and the VVSG needs to provide unambiguous direction.

Comment by Gail Audette (Voting System Test Laboratory)

Please clarify if the allowable correction time for an anomaly other than a logic defect is clock time or work hour time? At what point is the eight hour timer started, meaning does this include or exclude the time to troubleshoot the anomaly?

Comment by Frank Padilla (Voting System Test Laboratory)

Please clarify the timeline: does this include travel, troubleshooting or actual work time? What is the determination on what can and can not be replaced? This is not very clear.
2.5.5-F Pauses in test campaign

If the test campaign is suspended for an extended period of time, the accredited test lab SHALL maintain a record of the procedures that have been satisfactorily completed. When testing is resumed at a later date, repetition of the successfully completed procedures may be waived provided that no design or manufacturing change has been made that would invalidate the earlier test results.

Applies to: Voting system

DISCUSSION

The considerations for resumption of testing are similar to those of Requirement Part 2: 5.1-D.

Source: [VVSG2005] II.1.8.2.6.d

4 Comments

Comment by Brian V. Jarvis (Local Election Official)

Need to define the phrase "extended period of time" (is it 8 hours, 16 hours, 4 hours?). How much later can a test be resumed? Is "a later date" defined as 3 days, 1 week, etc.? With the complexity of hardware and software components nowadays and with no good way of knowing how one code module might impact another code module (strange things happen with software), recommend that if any design or manufacturing change has been made that all procedures must be repeated (whether or not it is assumed that earlier test results would not be invalidated). Again, we're trying to assure everyone of the quality of the product. As much as is possible, no room should be left for doubt.

Comment by Frank Padilla (Voting System Test Laboratory)

Define extended period of time.

Comment by Harry VanSickle (State Election Official)

We take issue with the phrase "extended period of time." States that require a voting system to undergo federal testing prior to certification for use in the state should not have to wait for some indefinite period of time for the federal testing to be completed. Please see our suggestion for language below. If the test campaign is suspended for an extended period of time, the accredited test lab SHALL maintain a record of the procedures that have been satisfactorily completed. When testing is resumed at a later date, repetition of the successfully completed procedures SHALL be waived provided that no design or manufacturing change has been made that would invalidate the earlier test results and the test campaign was suspended for a period no greater than six (6) months. No waiver SHALL be available where a suspension is greater than six (6) months. Any subsequent testing of a system after the six-month period requires a new application and starting over.

Comment by Harry VanSickle (State Election Official)

How will it be determined that no design or manufacturing change has been made once testing is resumed? Please further explain.
2.5.5-G Resumption after deficiency

The test campaign may resume after a deficiency is found if:

  1. The manufacturer submits a design, manufacturing, or packaging change notice to correct the deficiency, together with test data to verify the adequacy of the change;
  2. The examiner of the equipment agrees that the proposed change is responsive to the full scope of the deficiency;
  3. Any previously failed tests are passed by the revised system; and
  4. The manufacturer attests that the change will be incorporated into all existing and future production units.

Applies to: Voting system

DISCUSSION

Consistent with configuration management, the corrected system is formally a different system from the one that failed. The failure of the previous version is never "purged" entirely; rather, a new revision of the system is found not to suffer the same defect.

Source: [VVSG2005] II.1.8.2.6.e, clarified

2 Comments

Comment by Brian V. Jarvis (Local Election Official)

Regarding 2.5.5-G(d), this will need to be verified. Verification, testing, and validation activities are not about taking someone's word. The incorporation of the change into all existing and further production units must be independently (of the manufacturer) verified.

Comment by Harry VanSickle (State Election Official)

What constitutes a "deficiency?" Please more fully explain.

2.6 Post-Test Activities

2.6.1 Voting system software version recommended for certification

The following requirements specify which version of the voting system software executable code, including application logic, border logic, and third party logic, test labs include as part of a specific voting system recommended for certification.

 

2.6.1.1 Voting system software version

1 Comment

Comment by Harry VanSickle (State Election Official)

The hyperlink for "third party logic" is not activated anywhere in this section.
2.6.1.1-A Version of voting system software executable code

The test lab SHALL include voting system software executable code including application logic, border logic, and third party logic resulting from either an initial or final test lab build as part of the specific voting system recommended for certification.

Applies to: Voting system

DISCUSSION

The term "test lab build" refers to the voting system software executable code (including application logic, border logic, and third party logic) resulting from the test lab creating the executable code using (a) test lab procured equipment and build tools (such as compilers, linkers, etc.) and (b) source code and build procedures provided by the manufacturers. Note the test lab build is the result of using the requirements found in Part 3: 2.4.3 "Initial system build by test lab".

Source: [VVSG2005] II.1.8.4.2

2.6.1.1-A.1 Initial test lab build version

When no updates or modifications to the voting system software executable code including application logic, border logic, and third party logic have occurred since the initial test lab build, the test lab SHALL submit the executable code from the initial test lab build as part of the specific voting system recommended for certification.

Applies to: Voting system

2.6.1.1-A.2 Final test lab build version

When updates or modifications to the voting system software executable code including application logic, border logic, and third party logic have occurred since the initial test lab build, the test lab SHALL submit the executable code from a final test lab build as part of the specific voting system recommended for certification.

Applies to: Voting system

2.6.1.1-A.3 Final voting system software executable code build

When required by Requirement Part 3: 2.6.1.1-A.2, the test lab SHALL use the requirements found in Part 3: 2.4.3 "Initial system build by test lab" to create a final test lab build of voting system software executable code including application logic, border logic, and third party logic.

Applies to: Voting system

Source: [VVSG2005] II.1.8.4.2

2.6.2 Software distribution requirements for repositories, test labs, and manufacturers

The following requirements describe how voting system software must be distributed by test labs, voting system software manufacturers, and repositories such as the National Software Reference Library (NSRL) to support traceability back to a reference version of the voting system software from a test lab, manufacturer, or repository. This traceability provides the basis for verifying that software installed on programmed devices of the voting system is certified voting system software. Although these requirements apply only to test labs, manufacturers, and repositories, other organizations that distribute voting system software, such as jurisdictions, may apply these requirements to support traceability back to reference versions of the voting system software they distribute.

2.6.2.1 Software distribution package requirements

Software distribution packages are used to distribute software between different parties. Software distribution packages contain software from voting system manufacturers, third party manufacturers, test labs, repositories, and jurisdictions. The software contained in software distribution packages includes voting application software, election specific software, installation software, third party software, and software integrity information.

2.6.2.1-A Software distribution package master copy establishment

Test labs, manufacturers, and repositories SHALL establish software distribution package master copies from which copies are created and distributed.

Applies to: Voting system

DISCUSSION

Software is traceable back to a software distribution package master copy containing the software. Copies of software distribution packages can be distributed via modifiable media (physically on CD-RWs, memory cards, and hard drives; or electronically via email, FTP, and Websites) since digital signatures are created as part of software distribution packages. (See Requirement Part 3: 2.6.2.1-F)

2.6.2.1-A.1 Master copy creation record

A master copy creation record SHALL be created that includes at a minimum: the unique identifier of the record; the unique identifier of the master copy; the type of unalterable storage media containing the master copy; time, date, and location the master copy was created; name(s), affiliation(s), and signature(s) of the people present during the creation of the master copy; name and version of the software distribution package; the name, version and certification number (if certified) of the voting system; identifiers of the software components (such as filename(s)) in the software distribution package; location of software components in the software distribution package; and the digital signature algorithm used to sign the contents of the software distribution package.

Applies to: Voting system

1 Comment

Comment by Kevin Wilson (Voting System Test Laboratory)

Although the document specifies that the master copy contains the digital signature algorithm, it does not specify that the master document contains the public key or public key certificate of the digital signature nor does it specify a repository for such a public key. (See comment 2.4.3) A digital signature without a public key has less value than a file signature alone in terms of the possibility of making any type of verification of that signature.
2.6.2.1-A.2 Master copy storage media

A software distribution package master copy SHALL be stored on unalterable storage media.

Applies to: Voting system

DISCUSSION

Unalterable storage media includes technology such as a CD-R, but not CD-RW.

2.6.2.1-A.3 Copy creation record

A copy creation record SHALL be created that includes at a minimum: the unique identifier of the master copy; the distribution mechanism for the copy; time, date, and location the copy was created; name(s), affiliation(s) and signature(s) of the people present during the creation of the copy; and the contact information (title, organization, address, phone number, email address, etc.) for the organizations or people to whom copies were distributed.

Applies to: Voting system

DISCUSSION

Copies of software distribution packages can be distributed via modifiable media (physically on CD-RWs, memory cards, and hard drives; or electronically via email, FTP, and Websites) since digital signatures are created as part of software distribution packages. (See Requirement Part 3: 2.6.2.1-F)

2.6.2.1-A.4 Master copy and copy creation record storage media

The master copy and copy creation records SHALL be made on unalterable storage media.

Applies to: Voting system

DISCUSSION

Unalterable storage media includes technology such as a CD-R, but not CD-RW.

2.6.2.1-A.5 Master copy retention

Test labs, manufacturers, and repositories, including the National Software Reference Library (NSRL), SHALL retain the master copy of software distribution packages and associated records until notified by the national certification authority that they can be archived.

Applies to: Voting system

2.6.2.1-B Human readable software distribution package identification file

Software distribution packages SHALL contain a separate human readable file that provides at a minimum: the name and version of the software distribution package; the unique identifier of the master copy; the name, version, certification number (if certified) of the voting system; and the algorithm used to create digital signatures for the contents of the software distribution package. (See Requirement Part 3: 2.6.2.1-F).

Applies to: Voting system

DISCUSSION

Binary document formats and text containing markup tags are not considered human-readable. Applications may generate such documents, but they must also provide the functionality to render those documents in human-readable form (e.g., by including the necessary reader application).
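As an illustration of what such a file might look like, plain "key: value" lines satisfy the discussion's notion of human readability without markup tags or a binary container. The function below is a sketch, not a mandated format.

```python
def identification_file(info: dict[str, str]) -> str:
    # One "key: value" line per item of identification data, in the
    # order supplied; readable as-is in any text viewer.
    return "".join(f"{key}: {value}\n" for key, value in info.items())
```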

2.6.2.1-C Human readable software distribution package content file

Software distribution packages SHALL contain a separate human readable file that provides at a minimum the following information for each component within the software distribution package: software component identifier (such as filename), software manufacturer name, software product name, software version, and component location within the software distribution package (such as the full directory path to the file or archive containing the file or memory addresses).

Applies to: Voting system

DISCUSSION

Binary document formats and text containing markup tags are not considered human-readable. Applications may generate such documents, but they must also provide the functionality to render those documents in human-readable form (e.g., by including the necessary reader application).

2.6.2.1-D Software distribution archive files format

When software distribution packages use archive files to hold multiple software components, the archive files SHALL be generated using algorithms and file formats in common usage.

Applies to: Voting system

DISCUSSION

Some commonly used archive formats include, but are not limited to, zip, gzip (gz), and bzip2-compressed tar (tar.bz2).

2.6.2.1-E Full directory path for files within an archive file

The full directory path and filename of archive files SHALL be used as the full directory path for the files within the archive.

Applies to: Voting system
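Assuming the packaging tooling is written in Python, both the use of a common archive format (2.6.2.1-D) and the path convention for members of an archive (2.6.2.1-E) can be sketched with the standard library's tarfile module; the function names are invented for the example.

```python
import os
import tarfile

def build_archive(archive_path: str, component_paths: list[str]) -> None:
    """Collect software components into a bzip2-compressed tar archive,
    one of the commonly used formats the discussion lists."""
    with tarfile.open(archive_path, "w:bz2") as tar:
        for path in component_paths:
            tar.add(path, arcname=os.path.basename(path))

def member_locations(archive_path: str) -> list[str]:
    # Per the path convention above, the full directory path of a file
    # inside an archive is the archive's own path and filename joined
    # with the member's path within the archive.
    with tarfile.open(archive_path, "r:bz2") as tar:
        return [f"{archive_path}/{member.name}" for member in tar.getmembers()]
```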

2.6.2.1-F Software distribution package digital signature

Software distribution packages SHALL contain digital signatures for each software component contained within the software distribution package.

Applies to: Voting system

DISCUSSION

Digital signatures are generated for the un-archived forms of each of the software files as well as archive files.

2.6.2.1-F.1 Software distribution package digital signature generation

Software distribution packages SHALL contain, at a minimum, digital signatures generated by the test lab, manufacturer, or repository that created the software distribution package.

Applies to: Voting system
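The shape of this requirement, one signature per component, keyed by component identifier, can be sketched as follows. Python's standard library offers no asymmetric signing, so an HMAC-SHA256 tag stands in for the digital signature purely to keep the sketch self-contained; a real software distribution package would carry asymmetric signatures created with the originating test lab's, manufacturer's, or repository's private key. The function names are invented for the example.

```python
import hashlib
import hmac

def sign_component(content: bytes, key: bytes) -> str:
    # Stand-in for a real digital signature: an HMAC-SHA256 tag over the
    # component's bytes. An actual package would use an asymmetric
    # signature scheme in a standard format instead.
    return hmac.new(key, content, hashlib.sha256).hexdigest()

def package_signatures(components: dict[str, bytes],
                       key: bytes) -> dict[str, str]:
    """One signature per software component, keyed by its identifier."""
    return {name: sign_component(content, key)
            for name, content in components.items()}
```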

2.6.2.1-F.2 Software distribution package digital signature format

Digital signatures SHALL be stored in a non-proprietary standard data format as part of the software distribution package.

Applies to: Voting system

DISCUSSION

Some non-proprietary standard data formats for digital signatures include IETF RFC 3852: Cryptographic Message Syntax (CMS); RSA PKCS #7: Cryptographic Message Syntax Standard; and W3C XML-Signature Syntax and Processing.

2.6.2.1-G Software distribution package physical media labeling requirement

Each piece of physical media used for software distribution packages SHALL be labeled on an external surface of the media including at a minimum: the test lab, manufacturer, or repository that created the media; the creation date of the media; unique identifier of the media (such as a serial number); software distribution package name and version; whether the software has been certified or not; and the name, version, and certification number (if certified) of the voting system.

Applies to: Voting system

DISCUSSION

Each piece of media needs to be uniquely identifiable even if the pieces contain the same information in order to support traceability. These requirements apply to master copies of software distribution packages since they are required to be stored on unalterable media. (See Requirement Part 3: 2.6.2.1-A.2).

2.6.2.1-H Physical media digital signature

Each piece of physical media used for software distribution packages SHALL contain a digital signature generated by the creating test lab, manufacturer, or repository covering the entire contents of the media.

Applies to: Voting system

DISCUSSION

The binary image refers to the complete contents of the physical media as a whole. A binary image of physical media may contain multiple files.

 

1 Comment

Comment by Gail Audette (Voting System Test Laboratory)

"..the archive files SHALL be generated using algorithms and file formats in common usage." Some commonly used archive file formats are listed in the discussion; however, how does a lab determine whether the archive algorithm and/or file format used by the vendor is in "common usage"? Please provide a list of acceptable formats and/or clarification of the steps to assess whether a format is in "common usage".

2.6.2.2 Repository software distribution requirements

Repositories receive certified voting system software (source and executable code) from test labs or the national certification authority. Repositories may also receive non-voting-specific software from third-party manufacturers and election-specific software, such as ballot definition files, from jurisdictions. Repositories must handle software properly to ensure that the software in their possession is not modified or released to parties without appropriate approvals. However, repositories may be compelled to release software they possess to comply with court orders.

Repositories can be described based on the type of service they provide: escrow, notary, and distribution. Escrow repositories hold software they receive until formal requests for the software are received and approved. Notary repositories use software they receive to generate software integrity information (such as digital signatures or hash values), which can be used to verify the integrity of the piece of software. Notary repositories distribute software integrity information, but they do not distribute the voting software or the software used to generate the software integrity information. Distribution repositories provide software they receive to parties approved by the owner of the software. Note that a single repository may provide one or more of the repository services (escrow, notary, and distribution).

The National Software Reference Library (NSRL) is an example of a notary repository that currently generates software integrity information in the form of hash values. Since source code is not provided to the NSRL, the NSRL only generates software integrity information for executable code.

2.6.2.2-A Repository software distribution package request process documentation

The repository SHALL publicly document the process used to request copies of the software distribution packages (including associated documentation) from the repository.

Applies to: Voting system

DISCUSSION

Manufacturer approval may be required for release of software considered intellectual property and needs to be reflected in the request process. Copies of software distribution packages can be distributed via modifiable media (physically on CD-RWs, memory cards, and hard drives; or electronically via email, FTP, and websites) since digital signatures are created as part of software distribution packages (see Requirement Part 3: 2.6.2.1-F). When copies of a software distribution package are created, Requirement Part 3: 2.6.2.1-A.3 requires a record to be produced.

1 Comment

Comment by Kevin Wilson (Voting System Test Laboratory)

This requirement is a bit confusing since a "binary image" of something that contains the generated signature will necessarily be changed when the generated signature is added to the "binary image".

2.6.2.2-B Repository digital signature verification

The repository SHALL verify that the digital signatures associated with software are valid before creating a software distribution package master copy containing the software.

Applies to: Voting system

DISCUSSION

In general, the digital signatures verified by repositories will be generated by test labs, the national certification authority, and possibly jurisdictions.

2.6.2.2-B.1 Repository digital signature verification result record

Results of digital signature verifications including the source of the signature SHALL be part of the creation record of software distribution package master copies created by the repository.

Applies to: Voting system

2.6.2.2-C Repository software distribution package

Distribution, escrow, and notary repositories SHALL create software distribution package master copies containing software received from test labs, the national certification authority, and jurisdictions.

Applies to: Voting system

DISCUSSION

Distribution, escrow, and notary repositories receive software distribution packages created by test labs, the national certification authority, and possibly jurisdictions. This requirement establishes software distribution package master copies that support traceability of voting system software back to the repository. Requirement Part 3: 2.6.2.1-A.2 requires software distribution package master copies to be on unalterable media. Requirement Part 3: 2.6.2.1-F requires digital signatures for each software component contained in the software distribution package. Requirement Part 3: 2.6.2.1-A.5 requires repositories to retain software distribution package master copies until notified by the national certification authority.

2.6.2.2-D Notary repositories software integrity information software distribution package

Notary repositories SHALL create software distribution package master copies containing software integrity information generated by the repository for software received from test labs, the national certification authority, and jurisdictions.

Applies to: Voting system

DISCUSSION

This requirement establishes software distribution package master copies that support traceability of software integrity information for voting system software back to the notary repository. Requirement Part 3: 2.6.2.1-A.2 requires software distribution package master copies to be on unalterable media. Requirement Part 3: 2.6.2.1-F requires digital signatures for each software component contained in the software distribution package. Requirement Part 3: 2.6.2.1-A.5 requires repositories to retain software distribution package master copies until notified by the national certification authority.

2.6.2.2-E Distribution and escrow repository software distribution package copy

A distribution or escrow repository SHALL provide copies of the software distribution packages they create to parties that follow the repository's request process (see Requirement Part 3: 2.6.2.2-A).

Applies to: Voting system

DISCUSSION

This requirement allows distribution and escrow repositories to provide the software distribution packages they create to parties that follow the request process documented by Requirement Part 3: 2.6.2.2-A. Manufacturer approval may be required for release of software considered intellectual property and needs to be reflected in the request process of the distribution and escrow repository. Copies of software distribution packages can be distributed via modifiable media (physically on CD-RWs, memory cards, and hard drives; or electronically via email, FTP, and websites) since digital signatures are created as part of software distribution packages (see Requirement Part 3: 2.6.2.1-F). When copies of a software distribution package are created, Requirement Part 3: 2.6.2.1-A.3 requires a record to be produced.

2.6.2.2-F Notary repository software distribution package copy

A notary repository SHALL provide copies of software distribution packages containing software integrity information generated by the repository to parties that follow the repository’s request process (see Requirement Part 3: 2.6.2.2-A).

Applies to: Voting system

DISCUSSION

This requirement allows notary repositories to provide the software integrity information they create for voting system software to parties that follow the request process documented by Requirement Part 3: 2.6.2.2-A.

2.6.2.3 Test labs software distribution requirements

2.6.2.3-A Software distribution package containing voting system software source and executables

The test lab SHALL create a software distribution package master copy containing the source and executable code from the test lab build of the voting system software.

Applies to: Voting system

DISCUSSION

This requirement establishes the software distribution package master copy that supports traceability of voting system software source and executable code back to the test lab.

Source: [EAC06] Section 5.6.3.1

2 Comments

Comment by Gail Audette (Voting System Test Laboratory)

What organization is contracting with the repository?

Comment by Harry VanSickle (State Election Official)

Typographical errors in the discussion section – remove "in" in the first sentence, and use either "on" or "via" in the second sentence, but not both.

1 Comment

Comment by Gail Audette (Voting System Test Laboratory)

Sections 2.6.2.3 & 2.6.2.4: The requirements for both the test lab and manufacturer are duplicated. What is the reconciliation process, validation process, or verification step that justifies this duplication of effort?

2.6.2.3-B Software distribution package containing configuration files, installation programs, and third party developed software

The test lab SHALL create a software distribution package master copy containing configuration files, installation programs, and third party software to be installed on programmed devices of the voting system.

Applies to: Voting system

DISCUSSION

This requirement establishes the software distribution package master copy that supports traceability of configuration files, installation programs, and third party software to be installed on programmed devices of the voting system back to the test lab.

Source: [EAC06] Section 5.6.3.1 and 5.6.3.3

 
2.6.2.3-C Software distribution packages for manufacturers, National Software Reference Library (NSRL), and designated national repository

The test lab SHALL provide copies of the software distribution packages containing the source and executable code from the test lab build, build environment pre- and post-build binary images, and other software to be installed on programmed devices of the voting system (configuration files, installation programs, and third party software) to the manufacturer, National Software Reference Library (NSRL), and a designated national repository.

Applies to: Voting system

DISCUSSION

This requirement obligates test labs to provide a complete copy of the voting system software to the manufacturer, the National Software Reference Library (NSRL), and a designated national repository.

1 Comment

Comment by Gail Audette (Voting System Test Laboratory)

What organization is contracting with the repository?

2.6.2.3-D Software distribution packages for other parties

The test lab SHALL provide copies of the software distribution packages containing a complete set or subset of the source and executable code from the test lab build, build environment pre- and post-build binary images, and other software to be installed on programmed devices of the voting system (configuration files, installation programs, and third party software) to parties approved by the manufacturer.

Applies to: Voting system

DISCUSSION

This requirement allows test labs to provide complete or partial copies of the voting system software to parties approved by the manufacturer.

Source: [EAC06] Section 5.6.2.4, 5.6.3.2, 5.7.1-5

2.6.2.4 Manufacturer software distribution requirements

2.6.2.4-A Manufacturer usage of software distribution packages

The manufacturer SHALL use software distribution packages for all voting system software that the manufacturer distributes.

Applies to: Voting system

2.6.2.4-B Software distribution package containing voting system software source code

The manufacturer SHALL create a software distribution package master copy containing source code of voting system software including application logic, border logic, and third party logic.

Applies to: Voting system

DISCUSSION

This requirement establishes the software distribution package master copy that supports traceability of voting system software source code back to the manufacturer. Manufacturers will include a copy of this software distribution package as part of their TDP as required by Requirement Part 3: 2.6.2.4-D.

2.6.2.4-C Software distribution package containing configuration files, installation programs, and third party developed software

The manufacturer SHALL create a software distribution package master copy containing configuration files, installation programs, and third party software to be installed on programmed devices of the voting system.

Applies to: Voting system

DISCUSSION

This requirement establishes the software distribution package master copy that supports traceability of configuration files, installation programs, and third party software to be installed on programmed devices of the voting system back to the manufacturer. Manufacturers will include a copy of this software distribution package as part of their TDP as required by Requirement Part 3: 2.6.2.4-D.

2.6.2.4-D Manufacturer TDP software distribution packages

As part of the TDP, the manufacturer SHALL provide a copy of the software distribution packages required by Requirement Part 3: 2.6.2.4-A.

Applies to: Voting system

2.6.3 Final test report

The accredited test lab may issue interim reports to the manufacturer, informing the manufacturer of the testing status, findings to date, and other information.

2.6.3-A Prepare test report

The accredited test lab SHALL prepare a test report conforming to the requirements of Part 2: Chapter 5: "Test Plan (test lab)".

Applies to: Voting system

Source: [VVSG2005] II.1.8.3.b

2.6.3-B Consolidated test report

Where a system is tested by multiple accredited test labs, the lead accredited test lab SHALL prepare a consolidated test report.

Applies to: Voting system

Source: [VVSG2005] II.1.8.3.c