November 1, 2006
Do Your Own Election Verification Exit Poll!
By Stephanie Frank Singer
In response to the requests we've received, Election Integrity and the Vote Count Protection Project have published do-it-yourself exit polling instructions that citizens can use to verify whether votes are counted as cast. Until US institutions demonstrate a commitment to democracy at home, this is one of the most useful tools citizens can use to fight for our most fundamental right, the right which is, in the words of Chief Justice John Roberts, "preservative of all other rights." Please distribute these instructions far and wide.

Instructions for Conducting an Election Verification Exit Poll

By Kenneth F. Warren, Ph.D., President, The Warren Poll, and Steve Freeman, Ph.D., Director, Election Integrity

Posted: October 29, 2006

Disclaimer: These instructions will help with the planning of an election verification exit poll (EVEP). However, any exit poll will be challenging to conduct in practice. Competent polling of any kind requires knowledge, practical experience, adequate resources, and common sense. A few pages of instructions cannot possibly answer all of the questions that might arise in the implementation of an exit poll, nor can they guarantee a successful poll. Therefore, we can neither take responsibility for, nor guarantee the validity of results from, any poll that we do not directly supervise.

Why we make these instructions publicly available: There is no substitute for professional experience in polling. The Vote Count Protection Project conducts rigorous, professional election verification exit polling, but at present our limited funds allow us to poll only a few districts. We therefore publish these instructions in response to the requests we've received from so many of you who are fighting for democratic rights, in recognition that we have little choice but to take matters into our own hands. In jurisdictions throughout the US, election processes provide little reason for confidence.
Bev Harris has aptly characterized US voting processes as "black box voting": a citizen presses a button, something happens within the million or so lines of proprietary code inside the machine, and perhaps the vote is recorded as it was cast. Or not. We have no way of knowing. That's because the counts produced are inadequately verified, and the machines are inadequately audited and inspected. In many cases the systems produce counts which are inherently unverifiable. And all of it is beyond the public purview. Even when confronted with evidence of miscount, election officials have denied public access to voting machines, claiming that contractual obligations prohibit outside examination of their machines.

Unfortunately, the single national exit poll conducted by a consortium of six major national media outlets (AP, CNN, CBS, NBC, ABC, and FOX) is little better. Just as your votes and the processes by which they are tabulated are the fiercely guarded private property of voting machine companies, so the exit poll questionnaires and the processes by which they are tabulated and "corrected" are the consortium's fiercely guarded private property. This was made clear by the pollsters' response to the nationwide seven percentage point discrepancy between the exit polls and the official count in the 2004 presidential election: the pollsters refused to release precinct-level data, even to qualified independent researchers. Indeed, we only know of that discrepancy, one which represents nine million votes nationwide, because a technical glitch prevented the pollsters from promptly uploading "corrected" results, that is, results adjusted to conform to the official count. For 2006 and future elections, the exit pollsters have stated that they will not release "early" exit poll data, by which they mean data not yet adjusted to conform with the official count, even to their media clients.
Of course, once data are "corrected," they are no longer true exit poll results; rather, they confer an unwarranted legitimacy upon the official numbers. Until US institutions demonstrate a commitment to democracy at home, we do what we can to give Americans a tool with which to fight for our most fundamental right, the right which is, in the words of Chief Justice John Roberts, "preservative of all other rights."

What is an Election Verification Exit Poll?

An EVEP differs sharply from a media exit poll in purpose, design, and methodological rigor. The purpose of a media exit poll is to strategically poll many precincts to get a representative sample for an entire district (e.g., a state) so election outcomes can be predicted. Demographic data are obtained in media exit polls so pollsters can tell how Blacks voted, how women voted, how rural residents voted, and so on. Most of the polling is done before 2:00 pm so the exit poll results can be tabulated and presented by the news outlets immediately after the polls close.

The purpose of an EVEP is not to predict election results, but rather to audit or verify the accuracy of vote counts in selected precincts. Thus, the objective is to concentrate on targeted precincts and to poll comprehensively in those precincts so election results (in these precincts only) can be verified. Media exit polls sample many fewer voters in a particular precinct than an EVEP does. Consequently, an EVEP should be considerably more accurate for the targeted precincts than a media exit poll would be. If EVEP results differ significantly from the official results in the targeted precincts, it is a strong indicator that the vote count may not be correct.

How to Conduct an Election Verification Exit Poll

Here are the steps that will enable you to conduct a meaningful EVEP.

1. Set your objectives. The first step in planning an EVEP is to determine what you want to test. What precincts would satisfy your research objectives?
Are you testing for accuracy in precincts that have a reputation for "funny business" or vote fraud? Or are you testing for accuracy in precincts that use different kinds of voting machines, in order to compare machine performance? You may simply want some confidence that votes in your home precinct are being counted as cast.

2. Obtain Basic Information. This includes: a map of the district; a current listing of all the polling places within the district, including the addresses of the polling places, the precincts at each polling place, and the number of registered voters in each precinct; a history of voting at these precincts going back at least six years; and the laws governing what exit pollsters are allowed to do at the polling places (e.g., that exit pollsters must not poll any closer than 50 feet from a polling place). Begin your search for information with your state's Secretary of State office, which can give you contact numbers for the county election boards within the state; you can get specific election data from the county election boards.

3. Select your Polling Places. Keep your research objectives in mind and learn the demographics and the relevant political history. For example, if you want to compare the accuracy of the vote counts in polling places using paper ballots versus touch screens, make sure that you know which precincts are using these two ways of voting. Voting technology information can be obtained from the county election boards. The Verified Voting "Verifier" also provides a county-by-county listing of voting systems in use, with extensive information about these systems. If you want to test for vote fraud, you may want to talk to people about which polling places have a history of suspected fraud or suspicious results. You can also research logs of voting problems and Election Day complaints.
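When weighing candidate polling places, a quick back-of-envelope calculation of expected interview volume can save a wasted day. A minimal sketch follows; the turnout and response rates are illustrative assumptions for planning purposes, not figures from any real precinct:

```python
# Rough estimate of completed interviews for a candidate precinct.
# All rate inputs are illustrative planning assumptions.

def expected_interviews(registered, turnout_rate, response_rate, poll_hours=13):
    """Estimate total and per-hour completed questionnaires.

    poll_hours defaults to 13, since in most states polls are open
    from 6:00 am to 7:00 pm.
    """
    voters = registered * turnout_rate       # expected Election Day voters
    completed = voters * response_rate       # those who agree to fill out a form
    return completed, completed / poll_hours

# The small precinct from the example below: 67 registered voters, 39% turnout,
# optimistically assuming everyone approached responds.
total, per_hour = expected_interviews(67, 0.39, 1.0)
print(round(total), round(per_hour, 1))   # about 26 voters all day, 2 per hour

# A precinct meeting the 1000-registered-voter guideline, with an assumed
# 50% turnout and 60% response rate.
total, per_hour = expected_interviews(1000, 0.50, 0.60)
print(round(total), round(per_hour, 1))
```

Even generous assumptions cannot rescue a tiny precinct, which is why the size guideline below matters.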
General selection criteria to keep in mind include:

Precinct size. Ideally, each targeted polling place should have at least 1000 registered voters. This is one of the reasons why you need a listing of the polling places showing the number of registered voters. Inexperienced exit pollsters have been known to send interviewers to polling places that may have only 67 registered voters with only 39% of them voting, which would leave only about 26 voters to interview in an entire day, or about two per hour.

Co-located precincts. It may be preferable to sample polling places that contain only a single precinct. If you do sample polling places that contain multiple precincts, do not try to sample a single precinct, as it may be difficult to distinguish which precinct the voter-respondents are from. Rather, sample all voters using the procedures given below, and compare the results with the sum of the official numbers of all the co-located precincts.

4. Recruit Interviewers. Once you have selected the polling places where you are going to conduct interviews, make sure that you have enough interviewers to do the job competently. You will need to conduct interviews all day long, from right after the polls open to the time they close (in most states polls open at 6:00 am and close at 7:00 pm). Ideally, you will have two pollsters at each polling place; one person can handle the polling for part of the day, but two should be on site for most of it.

5. Prepare a Questionnaire. Questionnaires must be designed to meet the research objectives. Each will also be different, depending upon the various races to be covered in the questionnaire. Naturally, explaining here how to develop a professional questionnaire would be impossible; whole college courses are taught in survey research design.
It would be preferable to find a professional to help design your questionnaire, but for purposes of this poll you may want to replicate the sample ballot you can find at www.electionintegrity.org. The questionnaire should be very short (one that can be completed in less than two minutes), asking questions only on targeted races (e.g., U.S. Senate, U.S. Representative, Governor) and perhaps a major ballot issue, if appropriate. A few demographic questions should also be asked so voters can be profiled and you will know the percentage of, say, women interviewed, blacks interviewed, etc. The core demographic questions would be: gender, race, age range (e.g., 18-30; 31-45; 46-65; over 65), party preference, and any others that may be relevant to your objectives in the EVEP.

Also indicated on the questionnaire should be the U.S. Congressional District, county, and precinct(s) in which the poll is being administered. Sometimes, to avoid confusion and error when coding the results, the major areas being polled are color-coded (e.g., if a Congressional District spans four counties, the questionnaires for each county are assigned different colors). Make sure that each questionnaire is also assigned an ID number to identify each respondent in the poll. The first four columns of the data entry for a respondent should be reserved for the ID number, followed by a code number for the district, then the county, then the precinct, then the answers to the questions. The first respondent's ID number should be 0001, the second 0002, and so on.

6. Train Interviewers. Before they are sent out into the field, interviewers should know what is expected of them. They should know how to administer the questionnaires, how to make sure the questionnaires are deposited in the questionnaire box, and how to safeguard the box.
They should know what polling place to go to, when to be there, how long they must stay, and how to recruit a substitute when they need to take a bathroom break, especially if there is no other pollster at the site with them. They should do simulated interviews to build competence and confidence.

7. Set up an Interviewer Schedule for each precinct so that the entire day is covered. Alternates should be on hand to take over in emergencies, such as no one showing up at a scheduled time or an interviewer getting sick. If a single pollster covers a site (this is acceptable), that pollster should be relieved after no more than four hours; plan on shifts of roughly three to four hours.

8. Needed Supplies.

a. Questionnaire boxes. Purchase questionnaire boxes from an office supply store such as Office Max, Office Depot, or Mail Boxes Etc. They should be standard size (about 24" x 12" x 18"), big enough to accommodate all the questionnaires. Cut a slit in the top of each box wide enough that the questionnaires can easily be placed inside. Reinforce the boxes with duct tape. Mark clearly on both sides, "PLEASE PLACE COMPLETED QUESTIONNAIRES IN THIS EXIT POLL BOX." Each interviewer should bring two questionnaire boxes; the extra one is for unanticipated problems, so keep it in the car. Ideally, all interviewers should have a car so they can reach the polling places and have a place to keep supplies.

b. Plenty of questionnaires. If there are 300 registered voters in the precinct, make sure that you have at least 150 questionnaires. You will probably not use that many, but have extras, and do not keep them all in one place in case someone decides to steal them. Keep plenty of extras in your car.

c. A good supply of pencils, because many people will walk off with them.

d. Several clipboards, so respondents can use them to fill out their questionnaires.
You can also use clipboards to hold the questionnaires you will be giving out, and they make you look like a real pollster. You can buy clipboards for about $1.29 apiece at Office Max, etc.

e. Any personal supplies that you may need (e.g., bottled water, snacks, medicine, glasses, warm clothes or rain protection if there is a threat of rain).

f. A cell phone, if you have one. You may need it if problems of any kind develop. Make sure that you have essential contact numbers and that your contacts have yours.

Make sure that you have enough polling supplies with you. Never run out of supplies; the supplies are cheap.

9. What to Wear. Dress and behave in a politically neutral manner. Give no indication whatsoever of political preferences or partisanship. Wear a tag that identifies you as an EVEP interviewer for the organization noted on the tag. Dress well, as you would for work, but comfortably for the weather. No matter what the forecast, be prepared for cold and rainy weather.

10. Set-up. Interviewers should stand as close to the polling place as the law permits; typically this will be next to partisans handing out campaign literature. Put the questionnaire box on the ground or a nearby ledge. You should have a "hot line" number so voters can call a person designated to answer questions or respond to problems that voters, poll workers, or election officials may have with the poll. Obey all election laws and be courteous to all voters and election officials.

11. The Interview. Ask each person if they could fill out a very brief exit poll questionnaire that will take less than two minutes of their time, and then ask them to put it in the questionnaire box. Do not worry about being refused; just approach the next person. Use a pleasant personality to obtain as many interviews as possible. Always guard the box.
You may answer questions about the EVEP as common sense dictates (e.g., "What is the poll going to be used for?"); just don't answer any of the questionnaire's questions for them.

12. Selecting Subjects. Pollsters should interview as many voters as reasonable. Random selection is achieved by following a protocol that minimizes selection bias and does not break down. What is not defensible is a system that is good in theory but breaks down in practice (e.g., trying to interview every voter, or choosing every nth voter). Our approach is to wait 10 seconds (i.e., count to ten) upon completing the interview instructions, and then approach the next person who exits the polling booths. This averts potential interviewer bias, because the interviewer will interview the next person who walks out of the polling place after the count of 10 regardless of that person's race, age, gender, etc. The 10 seconds also gives the interviewer time to gather his or her thoughts and check on the progress of anyone filling out a questionnaire. Note that while the interviewer is counting to 10 and approaching the next voter, the previous subject will be completing the questionnaire on their own and depositing it in the box.

13. Taking Breaks and Dealing with the Unexpected. Somebody from the organization should be on call to deal with any unexpected problems. A representative ought to be available, for example, if someone questions the pollster's right to be there or asks questions the pollster cannot answer. If only one person is at a polling site and needs to take a break, another person will have to cover. If another pollster is not there, it is usually not difficult to ask a responsible person at the polling place to take over for a few minutes while you go to the bathroom. Most people are willing to help with this. Just ask them to guard your supplies, especially the box the questionnaires are dropped in.
If you briefly explain the process, the substitute could even hand out a few questionnaires to voters and ask them to fill them out. Since you will only be gone a few minutes, nothing can really go wrong as long as you recruit a responsible person.

14. Data Entry. The boxes with the questionnaires in them should be taken to "central headquarters" after the polling places close. It may be desirable for completed questionnaires to be delivered to "central headquarters" whenever one interviewer replaces another; that permits an early start on data entry. The pre-coded questionnaire data (i.e., the responses) should then be entered into a simple spreadsheet, such as Microsoft Excel, and then exported to a statistical package such as SPSS for processing. "Pre-coded" means that the questionnaire's answers are entered as numeric codes in designated columns. For example, if the question is "Please indicate your race. ___ White ___ Black ___ Other," the entry would be "0" for no answer, "1" for White, "2" for Black, and "3" for Other. Entering and processing the data are too technical to explain fully here; it is assumed that someone associated with the EVEP knows enough about data processing to enter and process the poll data.

15. Collect Vote Totals. At the close of the polls, interviewers should also obtain the official counts. It is important to get the counts before early and absentee ballots are mixed in. Ideally, you will get the individual count from every voting machine in the precinct. We want to confirm that the numbers are properly tabulated at every step of the process; we also want to confirm that the numbers do not vary unnaturally from machine to machine within a single precinct.

16. For questions, e-mail Ken Warren, Warrenkf@slu.edu.
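The tallying and comparison in steps 14 and 15 can be sketched in a few lines. This is only an illustration under stated assumptions: the row layout and answer codes follow the scheme described in step 5, but the specific columns, candidate codes, official share, and the simple-random-sample margin-of-error screen are our own hypothetical choices, not the authors' processing procedure (they recommend a statistical package such as SPSS):

```python
import math

# Each record follows the coding scheme from step 5:
# (ID, district code, county code, precinct code, answer codes...).
# Hypothetically, code 1 = Candidate A, 2 = Candidate B, 0 = no answer.
responses = [
    ("0001", 3, 12, 7, 1),
    ("0002", 3, 12, 7, 2),
    ("0003", 3, 12, 7, 1),
    # ... one row per completed questionnaire
]

def poll_share(rows, answer_col, code):
    """Share of valid (non-zero) answers matching a given code."""
    valid = [r[answer_col] for r in rows if r[answer_col] != 0]
    return sum(1 for v in valid if v == code) / len(valid), len(valid)

def sampling_margin(p, n, z=1.96):
    """Approximate 95% margin of error for a simple random sample.
    Real exit polls are not simple random samples, so treat this only
    as a rough screen for whether results 'differ significantly'."""
    return z * math.sqrt(p * (1 - p) / n)

share_a, n = poll_share(responses, answer_col=4, code=1)
print(f"Poll share for code 1: {share_a:.1%} from {n} valid answers")

official_share_a = 0.48   # from the official precinct count (assumed figure)
if abs(share_a - official_share_a) > sampling_margin(share_a, n):
    print("Discrepancy exceeds rough sampling margin; investigate further.")
```

With a realistic sample of several hundred respondents the margin shrinks considerably, which is why comprehensive polling of the targeted precinct matters; a discrepancy flagged by a crude screen like this warrants professional statistical review, not a conclusion.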
As time permits, he will be happy to answer simple questions, but he will not be able to provide elaborate answers to complex methodological questions about survey research, since whole courses are offered on the subject.

These instructions were prepared by members of the Vote Count Protection Project. Further information is available at www.electionintegrity.org. The Vote Count Protection Project is conducting a pilot-test Election Verification Exit Poll for the November 7, 2006 election, and will be developing a national Election Verification Exit Poll for 2008. Methodology and data from the Vote Count Protection Project will be made freely available to the public. The Project is administered with funds from our generous donors. Please consider becoming a donor at www.electionintegrity.org.

------------------------------------------------------------------------

Beverly Harris, Black Box Voting: Ballot Tampering in the 21st Century (Renton, WA: Talion Publishing, 2003).

Freeman, Steven F. and Joel Bleifuss (2006) Was the 2004 Presidential Election Stolen? Exit Polls, Election Fraud, and the Official Count (New York: Seven Stories Press): In response to indications that in Snohomish County, Washington, Sequoia DRE machine counts differed significantly from paper ballot counts in the same precincts, Paul Lehto requested a complete audit, one that would include computer forensics of the Sequoia machines. This request was met by a strong response from Sequoia. "Let me reiterate that under no circumstances, without Sequoia's approval, will you be allowed to run any kind of testing on the machines," said Bob Terwilliger, the Snohomish County auditor. Sequoia's contract with the county prohibits outside examination of Sequoia's electronic voting system. (pp. 77-79) In March 2002, Palm Beach County, Florida held municipal elections.
In Boca Raton, Mayor Emil Danciu came in third in a race for a seat on the city council, even though a pre-election opinion poll had him the frontrunner by 17%. He went to court and sued to gain access to the Sequoia source code in order to see if it was flawed. Election supervisor Theresa LePore told the court that the code was a trade secret, and his suit was thrown out. In the city of Wellington, a runoff election was held that involved one race with only two candidates. The final vote tally was 1,263 to 1,259, but 78 ballots for some reason recorded no vote. LePore explained the discrepancy this way: "They knew there was an election. Everybody said, 'Go vote.' They got there and decided there was nobody they wanted to vote for and cast their ballot anyway [for nobody], for whatever reason, maybe to maintain voting history." In a legal challenge to that election, LePore testified that under the county's purchase contract with Sequoia it was a third-degree felony to disclose specifications of how the machines operate. (pp. 70-71)

Freeman, S. F., "Who Really Won and Lost the 2004 US Presidential Election?" Presentation to the American Association for Public Opinion Research (Montreal, May 19, 2006). www.appliedresearch.us/sf/Documents/AAPOR060519.pdf

Freeman and Bleifuss (2006), pages 100-104. See also Richard Morin, "Surveying the Damage: Exit Polls Can't Always Predict Winners, So Don't Expect Them To," Washington Post, Sunday, November 21, 2004; and Freeman, Steven F. (2004) "The Unexplained Exit Poll Discrepancy," University of Pennsylvania Graduate School of Arts and Sciences, Center for Organizational Dynamics Research Report, December 29, 2004. http://center.grad.upenn.edu/center/get.cgi?item=exitpollp (Originally released as Center for Organizational Dynamics working paper #04-09, November 9, 2004.)

See Presentation to the American Statistical Association, Philadelphia, October 14, 2005.
Warren Mitofsky and Steve Freeman were the guest speakers at the American Statistical Association's Philadelphia Fall Meeting. http://www.appliedresearch.us/sf/ASA-P.htm

www.verifiedvoting.org/verifier/
Author's Website: www.campaignscientific.com
Stephanie Frank Singer is a mathematician, an author, and an entrepreneur. She is Managing Partner of Campaign Scientific LLC.