Interactive Voice Response Polling in Election Campaigns

Date: 2015-01-30
Publisher: Virginia Tech
Abstract

Since the early 2000s, Interactive Voice Response (IVR) has become a widely popular method of conducting public opinion surveys in the United States. IVR surveys use an automated computer voice to ask survey questions and elicit responses in place of a live interviewer. Previous studies have shown that IVR polls conducted immediately before elections are generally accurate, but have raised questions as to their validity in other contexts.

This study examines whether IVR polls generate measurably different levels of candidate support than live interviewer polls, as a result of non-response bias stemming from the lower response rates of IVR surveys. It does so by comparing polls for the 2010 U.S. gubernatorial and U.S. Senate elections that were conducted using both live interviewers and IVR. The findings suggest that in general elections, IVR polls find fewer undecided voters than surveys conducted with live interviewers. In primary elections, IVR polls can show greater support than live interviewer polls for a more ideologically extreme candidate who draws high levels of support from more opinionated and engaged voters.

The implication is that journalists and other consumers of polling data should take into account whether a poll was conducted using IVR or live interviewers when interpreting its results. IVR polls may tend to over-sample more engaged and opinionated voters, often producing smaller percentages of undecided respondents and, in certain contexts, higher levels of support for specific candidates.

Keywords
polling, public opinion, Interactive Voice Response, IVR, U.S. elections, political campaigns, survey research