What’s the effect of false political information? For example, if you knew a negative story about a candidate was completely fabricated, would that change your view of the candidate? John Bullock conducts some very interesting experiments and finds:
Much work on political persuasion maintains that people are influenced by information that they believe and not by information that they don’t. By this view, false beliefs have no power if they are known to be false. This helps to explain frequent efforts to change voters’ attitudes by exposing them to relevant facts. But findings from social psychology suggest that this view requires modification: sometimes, false beliefs influence people’s attitudes even after they are understood to be false. In a trio of experiments, I demonstrate that the effect is present in people’s thinking about politics and amplified by party identification. I conclude by elaborating the consequences for theories of belief updating and strategic political communication.
By “amplified by party identification,” Bullock means that whether your view changes depends in part on your party identification as well as the candidate’s. For example, if a Republican hears a negative story about a Democratic candidate, his impression of that candidate worsens. When the story is then revealed to be false, however, his opinion does not return to its initial state; instead it settles somewhere between the initial state and the post-story state (and the same holds for Democrats evaluating Republican candidates). In contrast, if a Republican hears a negative story about a Republican candidate, his impression of the candidate also worsens, but after he learns the story was false, it returns to its initial state (again, the same for Democrats and Democratic candidates).