15 September 2010

It is Not About Science, but Values

A really interesting new study published this week in the Journal of Risk Research seeks to explain why, on highly politicized issues, the public does not uniformly defer to the views of scientific experts, even when those experts are largely in consensus.  The answer is not that one group in society is "anti-science," but rather that people tend to weight evidence and experts differently based on cultural considerations.  This is a line of argument that I and various colleagues (such as Dan Sarewitz, Mike Hulme, Steve Rayner and others) have advanced for a while, so it is exciting to see empirical evidence that backs up these claims.

Here is how a pre-publication version of the paper explained its hypothesis:
Kahan, Dan M., Jenkins-Smith, Hank and Braman, Donald, Cultural Cognition of Scientific Consensus (February 7, 2010). Journal of Risk Research, Forthcoming. Available at SSRN: http://ssrn.com/abstract=1549444

The goal of the study was to examine a distinctive explanation for the failure of members of the public to form beliefs consistent with apparent scientific consensus on climate change and other issues of risk. We hypothesized that scientific opinion fails to quiet societal dispute on such issues not because members of the public are unwilling to defer to experts but because culturally diverse persons tend to form opposing perceptions of what experts believe. Individuals systematically overestimate the degree of scientific support for positions they are culturally predisposed to accept as a result of a cultural availability effect that influences how readily they can recall instances of expert endorsement of those positions.
The paper used an experimental methodology in the sense that it looked directly at how individuals characterized various experts.  It explains its findings:
The study furnished two forms of evidence in support of this basic hypothesis. The first was the existence of a strong correlation between individuals’ cultural values and their perceptions of scientific consensus on risks known to divide persons of opposing worldviews. Subjects holding hierarchical and individualistic outlooks, on the one hand, and ones holding egalitarian and communitarian outlooks, on the other, significantly disagreed about the state of expert opinion on climate change, nuclear waste disposal, and handgun regulation. It is possible, of course, that one or the other of these groups is better at discerning scientific consensus than the other. But because the impressions of both groups converged and diverged from positions endorsed in NAS “expert consensus” in a pattern reflective of their respective predispositions, it seems more likely that both hierarchical individualists and egalitarian communitarians are fitting their perceptions of scientific consensus to their values.

The second finding identified a mechanism that could explain this effect. When asked to evaluate whether an individual of elite academic credentials, including membership in the NAS, was a “knowledgeable and trustworthy expert,” subjects’ answers proved conditional on the fit between the position the putative expert was depicted as adopting (on climate change, on nuclear waste disposal, or on handgun regulation) and the position associated with the subjects’ cultural outlooks.
A press release from NSF that accompanied the paper explains:
. . . the study also found that the American public in general is culturally divided on what "scientific consensus" is on climate change, nuclear waste disposal, and concealed-handgun laws.

"The problem isn't that one side 'believes' science and another side 'distrusts' it," said [lead author Dan] Kahan referring to an alternate theory of why there is political conflict on matters that have been extensively researched by scientists.

He said the more likely reason for the disparity, as supported by the research results, "is that people tend to keep a biased score of what experts believe, counting a scientist as an 'expert' only when that scientist agrees with the position they find culturally congenial."
These empirical findings help to explain why there are obvious contradictions, with no apparent systematic explanation, in which areas of science different groups tend to accept and which they reject.  For instance, many more Europeans than Americans think that GMOs are unsafe, yet many more Europeans than Americans are worried about climate change.  Similarly, US conservatives tend to oppose stem cell research while the left does not, and opposition to geoengineering is generally found on the political left while its support comes largely from the right.

The paper explains the significance of its findings for the communication of scientific information:
This conclusion does not imply, however, that there is no prospect for rational public deliberations informed by the best scientific evidence on global warming, nuclear waste disposal, handguns, and like issues. But because the source of the enfeebled power of scientific opinion is different from what is normally thought, the treatment must be something other than what is normally prescribed. It is not enough to assure that scientifically sound information—including evidence of what scientists themselves believe—is widely disseminated: cultural cognition strongly motivates individuals—of all worldviews— to recognize such information as sound in a selective pattern that reinforces their cultural predispositions. To overcome this effect, communicators must attend to the cultural meaning as well as the scientific content of information.
The authors suggest that attending to the cultural meaning of science entails three tasks, first:
When shown risk information (e.g., global temperatures are increasing) that they associate with a conclusion threatening to their cultural values (commerce must be constrained), individuals tend to react dismissively toward that information; however, when shown that the information in fact supports or is consistent with a conclusion that affirms their cultural values (society should rely more on nuclear power), such individuals are more likely to consider the information open-mindedly . . .
This is why expanding the scope of policy options in highly politicized contexts can be politically important, as it gives people an opportunity to interpret science in a manner consistent with their cultural values.  Efforts to focus on green jobs or the security implications of climate policies reflect such an awareness.

Second:
Individuals reflexively reject information inconsistent with their predispositions when they perceive that it is being advocated by experts whose values they reject and opposed by ones whose values they share. In contrast, they attend more open-mindedly to such information, and are much more likely to accept it, if they perceive that there are experts of diverse values on both sides of the debate . . .
This helps to explain why efforts to enforce a rigid consensus of views in climate policy have backfired so strongly among many in the so-called skeptical community.  The more that a consensus is invoked and the more narrowly it is defined, the more it puts off the very people whom those seeking to share scientific knowledge should be trying to reach: the unconvinced.  Denigrating one's cultural or political opponents may feel satisfying, but it is not a good strategy for getting them to accept that your views are sound.  Thus, open, transparent and diverse expert advisory processes are more likely to be generally viewed as legitimate and robust.

Third:
Individuals tend to assimilate information by fitting it to pre-existing narrative templates or schemes that invest the information with meaning. The elements of these narrative templates—the identity of the stock heroes and villains, the nature of their dramatic struggles, and the moral stakes of their engagement with one another—vary in identifiable and recurring ways across cultural groups. By crafting messages to evoke narrative templates that are culturally congenial to target audiences, risk communicators can help to assure that the content of the information they are imparting receives considered attention across diverse cultural groups . . .
Again ironically, efforts to identify or label those who are skeptical of certain expert views as "anti-science" or "deniers" are likely to become self-fulfilling, in the sense that they reinforce the rejection of expert views by playing directly to a narrative conditioned on cultural considerations.  Consequently, breaking down, rather than reinforcing, differences across cultural groups would thus seem key to broader acceptance of certain scientific findings.  Building bridges is harder work than tearing them down.

In many respects, the advice given here is exactly the opposite of that offered by some of the more ardent advocates for action on climate change in the scientific community (and their allies).  (The same could be said as well of nuclear power and gun control, the two other issues that the paper looked at.)  Much of this just sounds like common sense, but given the state of the debate over climate, apparently it is not.

But there is a final irony here.  The advice of this paper is likely to be dismissed by those practicing such strategies for exactly the reasons described in the paper.  Experts in climate science are not experts in the science of judgment and decision making.  Thus, they will interpret the findings of this research through their own cultural lenses.  And more likely than not, that means giving far less weight to these findings than they deserve.