Many Unlikely Voters Still Vote

by Dan Hopkins on September 11, 2012

What is the current state of the Presidential race?  The answer depends on who you think is likely to vote, a point Mark Blumenthal makes over at Pollster today.  And the gap between registered voters and likely voters can be sizable: in the latest ABC News/Washington Post survey, President Obama’s six-point lead among registered voters shrinks to a single percentage point when shifting to likely voters.  Similarly, if we look at the last two polls conducted by CNN, we see that President Obama’s post-convention bump was primarily a change in the composition of likely voters, not in the preferences of registered voters—as the Times’ Nate Silver was quick to point out.

But imagine that you reach a voter on the phone, and she tells you she is unlikely to vote.  What is the chance that she still will do so in the end?  A working paper by Masa Aida and Todd Rogers reports the answer from three different studies, including one which compares survey responses and actual turnout for more than 10,000 voters in 39 states during the 2008 general election.  How people answer a survey question about their vote intention does predict whether they are likely to vote, with 87% of those who say they are almost certain to vote actually doing so.  But one striking finding is that 55% of those who say they will not vote still do so in the end.  Now, that figure did drop to 29% in New Jersey’s 2009 gubernatorial election—but even 29% is far from zero.  So don’t be surprised when you see a lot of “unlikely voters” at the polls in November.  Much more is in the paper.


Brian Silver September 11, 2012 at 5:07 pm

In analyses of the ANES vote validation studies from the 1970s and 1980s (in my work with Barbara Anderson and Paul Abramson), we found that those who said in the pre-election survey that they did not “expect to vote” almost never actually voted according to the post-election vote validation study. So the finding in the Aida and Rogers study that this may have changed since then has to be looked at carefully.

Though the Aida and Rogers paper cites our earlier research, it does not note this difference in the findings. I would propose that their finding that many “do not expect to vote” respondents actually voted could be due to the fact that their survey data came from telephone interviews, whereas in the years for which we used the ANES data the surveys were conducted face-to-face. If this explains the difference, then while voter psychology may be a relevant factor in why people vote despite a stated intention not to, this may be as much a matter of the method of data collection as of any underlying psychology.

J September 11, 2012 at 9:06 pm

I think the main difference is that they use a voter file to test whether people actually voted, rather than a post-election survey.

Brian Silver September 11, 2012 at 9:21 pm

@J: Who is the “they” you’re referring to? The ANES vote validation studies used voter files/records to check whether people actually voted.

Pre-election survey: Do you expect to vote?
Post-election survey: Did you vote?
Post-election validation using precinct-level records: Did the individual actually vote?

