Ezra Klein wants to know how.
“Epistemic closure,” Julian Sanchez writes, is the toxic result of “confirmation bias plus a sufficiently large array of multimedia conservative outlets to constitute a complete media counterculture, plus an overbroad ideological justification for treating mainstream output as intrinsically suspect.” … we’d all agree that it’s certainly theoretically possible for partisans of one party to embed themselves inside an echo chamber and become systematically more hostile to outside evidence than partisans of the other party. And given that this country has only two serious political parties, that would clearly be a troubling state of affairs. So the relevance of this discussion and the potential need to have it are not, I imagine, in doubt. The question is how do you measure epistemic closure? The easy answer is you test for its product: Misinformation. What you’d want to do, I guess, is continuously poll a standard set of questions based on empirical facts. “Has GDP grown since President X’s inauguration?” “Have global temperatures been rising or falling in recent decades?” “Does the United States have longer life expectancy than other developed nations?” “Do a majority of Americans approve of the president’s job performance?”
An alternative approach strikes me as more promising. Take levels of political awareness (i.e. respondents’ ability to answer questions about which party has more members in the House, etc.) as a proxy for exposure to political information (both biased and unbiased). Divide the population according to the most appropriate metric for capturing the putative cocooning effect you are interested in (liberals v. conservatives, Democrats v. Republicans, or whatever). Then test to see whether increased exposure to political information makes respondents more or less likely to give the right answer to politically salient questions where you know the right answers.
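To make the logic of this test concrete, here is a minimal sketch in Python. Everything in it is illustrative: the respondents are simulated, and the numbers are chosen to encode the hypothesized pattern, not drawn from any real survey. The idea is simply that you compare, within each ideological group, the share of correct answers at each level of political awareness.

```python
# Illustrative sketch of the awareness-as-proxy test described above.
# All data here are SIMULATED for exposition; no real survey is used.

from collections import defaultdict

# Each tuple is one fake respondent: (group, awareness_level, answered_correctly).
# awareness_level: 0 = low, 1 = medium, 2 = high.
respondents = (
    [("liberal", level, correct)
     for level, n_right, n_wrong in [(0, 5, 5), (1, 7, 3), (2, 9, 1)]
     for correct in [True] * n_right + [False] * n_wrong]
    + [("conservative", level, correct)
       for level, n_right, n_wrong in [(0, 5, 5), (1, 4, 6), (2, 2, 8)]
       for correct in [True] * n_right + [False] * n_wrong]
)

def correct_rate_by_awareness(data, group):
    """Share of respondents in `group` answering correctly at each awareness level."""
    totals, rights = defaultdict(int), defaultdict(int)
    for g, level, correct in data:
        if g == group:
            totals[level] += 1
            rights[level] += int(correct)
    return {level: rights[level] / totals[level] for level in sorted(totals)}

lib = correct_rate_by_awareness(respondents, "liberal")
con = correct_rate_by_awareness(respondents, "conservative")

# The Bartels-style signature: the slope of correctness in awareness
# runs in opposite directions for the two groups.
lib_slope = lib[2] - lib[0]   # positive in this simulated data
con_slope = con[2] - con[0]   # negative in this simulated data
```

In a real analysis one would of course use actual survey responses and a proper model (e.g. a logistic regression with an awareness-by-ideology interaction) rather than raw proportions, but the diagnostic quantity is the same: whether more awareness moves each group toward or away from the correct answer.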
Best of all, Larry Bartels has done this already. In Unequal Democracy, Bartels examines how better informed and worse informed liberals and conservatives respond to a question asking whether economic inequality (as measured by income differences) had increased or decreased over time. The differences (see the graph below) between liberals and conservatives are striking. The better informed liberals are about politics in general, the more likely they are to answer (correctly) that income inequalities have increased over time. The better informed conservatives are about politics in general, the less likely they are to give the correct answer. In other words, greater exposure to political information makes conservatives less likely to be right. This strongly suggests that conservatives face epistemic closure, at least on this issue. The more conservatives ‘know,’ the more likely they are to be wrong.
To be clear – one indicator on its own is insufficient evidence that one side of the ideological divide faces worse problems of epistemic closure than the other in any general sense. One could plausibly argue that liberals face their own epistemic closure on this and other questions – the fact that their policy elites are right on this one may be purely accidental. That would not mean conservatives lack a problem of epistemic closure on this question – they almost certainly have one – but it would mean we do not know whether liberals are better informed or merely lucky in this particular instance. One would like to see evidence across a variety of politically controversial questions (including some questions where liberals’ policy preferences potentially conflict with what we know to be correct or incorrect) before drawing any substantive conclusions. Still – at the least – this offers one way of actually figuring out the extent of epistemic closure – and some initial evidence that it is, indeed, a problem.