Public Opinion

Partisan Bias and Political Facts

Sep 16 '10

Brendan Nyhan has “an excellent post”: on political misperceptions, documenting how partisanship and ideologies bias our perceptions of facts. I want to bring some additional research to bear.

Robert Shapiro and Yaeli Bloch-Elkon have a new paper (“ungated published version”:; earlier version with figures) that documents further evidence of partisan bias:

bq. Public opinion in the United States has become increasingly partisan and ideological along liberal-conservative lines in ways that we can even see in the normally more bipartisan area of foreign policy. While this is not troublesome in and of itself, what is disturbing is that there is evidence that political polarization can affect how the public learns from new information that should affect its opinions.

A particularly noteworthy finding in this paper — one that dovetails with my “brief examination”: of the apparent belief that Obama is a Muslim — is how little formal education or attention to politics (of which education is a rough proxy) weakens the pattern of partisan bias. Regarding beliefs about the Iraq War (e.g., the presence of WMDs, Saddam Hussein’s connection to al-Qaeda), they write:

bq. …we would still expect to find that at least the better educated segments of the public would learn over time, and we would expect to find this among the better educated Republicans and Democrats alike. Republicans’ opinions about the war or the Bush administration might not necessarily be expected to change as they learn the facts about the start of the war, but they should at least show noticeable learning.

bq. In this regard the trend data are not encouraging.

In some cases, the best educated learn the least. Here is a graph of the percentage of Republicans who believe that Saddam had WMDs, broken down by education levels:


Gary Jacobson documents similar tendencies in his analysis of opinion about the Iraq War (“gated”:; “earlier ungated”:):

bq. Republicans tended to misperceive, ignore, or consciously reject information undermining the war’s initial justifications; those with the strongest commitment to Bush were most likely to continue to accept the war’s original justifications, that Iraq possessed weapons of mass destruction and that Saddam Hussein was involved in 9/11.

Democrats are not immune from cognitive pathologies. In this case, they are not so much refusing to learn as forgetting that they had to learn in the first place:

bq. On the other side, a large proportion of Democrats have reconstructed memories to match their current disillusionment with the war and the president, forgetting that they had once believed Iraq possessed WMD and had supported the Iraq invasion.

The obvious response to this is “what can we do?” But perhaps the more salient question is “what difference would it make?” There are very good reasons to imagine that changing perceptions of facts may not change people’s attitudes. A Republican who came to believe that Saddam Hussein did not have WMDs could easily have supported the war for other reasons. Indeed, politicians are good at supplying reasons, and we are good at rationalizing.

Ben Lauderdale has an interesting “new paper”: called, “Why Are Political Facts So Unpersuasive?” Based on a formal model of citizen learning, he arrives at three reasons why facts are unpersuasive and why we typically construct perceptions of fact around our opinions rather than the other way around. He refers to this as “inferring justifications”:

bq. The model considered in this paper identifies three key ingredients for inferred justification. First, one needs a direct signal about the policies, absent the supporting facts. These kinds of cues are prevalent in mass politics. Everyone has an opinion and they are usually happier to reveal that opinion than the underlying facts that may (or may not) have contributed to its formation. Second, one needs to believe that the fact is more likely to be pivotal to the policy in one direction than the other. This is often true, though typical citizens will be quite unsure about whether a fact actually is pivotal or not, even given that it could only be pivotal in one direction. Third, one needs to be unsure about the facts. The confluence of these conditions is far from rare. We should expect to see inferred justification frequently in politics, as indeed we do.
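The logic of those three ingredients can be sketched with a toy Bayesian calculation. To be clear, the numbers and function below are invented for illustration; this is not Lauderdale’s actual model, only a minimal sketch of how an opinion cue alone can shift belief in a fact:

```python
# Hypothetical toy illustration of "inferred justification" via Bayes' rule.
# All numbers are invented for illustration, not drawn from Lauderdale's paper.

def posterior_fact_given_cue(prior_fact, p_cue_if_fact, p_cue_if_not):
    """Belief in a fact after observing a trusted source endorse the policy.

    prior_fact:     the citizen's prior belief that the fact is true
    p_cue_if_fact:  chance the source endorses the policy if the fact holds
    p_cue_if_not:   chance of the same endorsement if the fact does not hold
    """
    num = p_cue_if_fact * prior_fact
    den = num + p_cue_if_not * (1.0 - prior_fact)
    return num / den

# Ingredient 1: a direct policy signal (the endorsement itself, no facts attached).
# Ingredient 2: asymmetric pivotality (0.8 vs. 0.4 -- the fact makes endorsement
#               more likely, though endorsement can occur without it).
# Ingredient 3: uncertainty about the fact itself (a prior of 0.5).
belief = posterior_fact_given_cue(prior_fact=0.5,
                                  p_cue_if_fact=0.8,
                                  p_cue_if_not=0.4)
print(round(belief, 2))  # belief in the fact rises from 0.50 to 0.67
```

Even with no evidence about the fact itself, the citizen’s confidence in it rises simply because a trusted source endorsed the policy — the fact is inferred from the opinion, not the other way around.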

Lauderdale also discusses the alternative process of rationalization: explaining away inconvenient facts. This process may actually be more prevalent among the better educated and politically attentive:

bq. It is important to note that rationalization is not going to occur among the voters we typically think of as low-information: such voters have not collected enough information to even need to engage in rationalization.

All of this leads to a variety of research questions, many of which are currently under investigation:

* Why do some partisans come to correct perceptions of fact, while others do not?

* What will correct misperceptions of fact?

* What difference does it make if these misperceptions are corrected?