
Cultural cognition and climate change

Mar 7, 2012

This week we have a guest blog by Professor Dan Kahan of Yale University (and visiting Professor at Harvard University). Dan is a key member of the ‘cultural cognition’ research group, and one of the leading experts on the social and psychological processes that shape public attitudes about major societal issues like climate change. Here Dan explains what the cultural cognition research programme is, what it has to do with climate change, and why he thinks that the standard way of understanding scepticism about climate change is misguided.

I’m going to resist the academic’s instinct to start with a long, abstract discussion of what cultural cognition is and the theory behind it. Instead, I’m going to launch straight into a practical argument based on this line of research. My hope is that the argument will give you a glimpse of the essentials—and an appetite for delving further.

The argument has to do with the contribution that misinformation makes to the dispute over climate change. I want to suggest that the normal account of this is wrong.

The normal account envisions, in effect, that the dispute is fueled by an external force—economic interest groups, say—inundating a credulous public with inaccurate claims about risk.

I would turn this account more or less on its head: the climate change dispute, I want to argue, is fueled by a motivated public whose (unconscious) desire to form certain perceptions of risk makes it possible (and profitable) to misinform them.

As evidence, consider an experiment that my colleagues at the Cultural Cognition Project and I did. In it, we asked the participants (a representative sample of 1,500 U.S. adults) to examine the credentials of three scientists and tell us whether they were “knowledgeable and credible experts” about one or another risk—including climate change, disposal of nuclear wastes, and laws allowing citizens to carry concealed weapons in public. Each of the scientists (they were fictional; we told subjects that after the study) had a Ph.D. in a seemingly relevant field, was on the faculty of an elite university, and was identified as a member of the National Academy of Sciences.

Whether study subjects deemed the featured scientists to be “experts,” it turned out, was strongly predicted by two things: the position we attributed to the scientists (in short book excerpts); and the cultural group membership of the subject making the determination.

Where the featured scientist was depicted as taking what we called the “high risk” position on climate change (it’s happening, is caused by humans, will have bad consequences, etc.), he was readily credited as an “expert” by subjects with egalitarian and communitarian cultural values, a group that generally sees environmental risks as high, but not by subjects with hierarchical and individualistic values, a group that generally sees environmental risks as low. However, the positions of these groups flipped—hierarchical individualists more readily saw the same scientist as an “expert,” while egalitarian communitarians did not—when he was depicted as taking a “low risk” position (climate change is uncertain, models are unreliable, more research is necessary).

The same thing, moreover, happened with respect to the scientists who had written books about nuclear power and about gun control: subjects were much more likely to deem the scientist an “expert” when he advanced the risk position that predominated in the subjects’ respective cultural groups than when he took the contrary position.
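To make the crossover pattern concrete, here is a minimal illustrative sketch in Python. The shares below are invented placeholders, not the published figures; they encode only the qualitative finding that each group credits the scientist far more readily when his depicted position fits the group’s prevailing view of the risk.

    # Illustrative sketch of the crossover pattern described above.
    # The shares are invented placeholders, NOT the published results;
    # they encode only the qualitative finding that each cultural group
    # credits the scientist as an "expert" far more readily when his
    # depicted position fits the group's prevailing view of the risk.
    deemed_expert = {
        # (subject's cultural group, scientist's depicted position): share
        ("egalitarian communitarian", "high risk"): 0.80,   # fits group view
        ("egalitarian communitarian", "low risk"): 0.25,    # defies it
        ("hierarchical individualist", "high risk"): 0.25,  # defies it
        ("hierarchical individualist", "low risk"): 0.80,   # fits group view
    }

    for (group, position), share in deemed_expert.items():
        print(f"{group:27s} | scientist takes {position:9s} position: "
              f"{share:.0%} deem him an expert")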

This result reflects a phenomenon known as “motivated cognition.” People are said to be displaying this bias when they unconsciously fit their understandings of information (whether scientific data, arguments, or even sense impressions) to some goal or end extrinsic to forming an accurate answer.

The interest or goal here was the stake study subjects had in maintaining a sense of connection and solidarity with their cultural groups. Hence the label cultural cognition, which refers to the tendency of individuals to form perceptions of risk that promote the status of their groups and their own standing within them.

Cultural cognition generates my unconventional “motivated public” model of misinformation. The subjects in our study weren’t pushed around by any external misinformation provider. Furnished the same information, they sorted themselves into the patterns that characterize the public divisions we see on climate change.

This kind of self-generated biased sampling—the tendency to count a scientist as an “expert” when he takes the position that fits one’s group values but not otherwise—would over time be capable all by itself of generating a state of radical cultural polarization over what “expert scientific consensus” is on issues like climate change, nuclear power, and gun control. Of course, outside the lab, deliberate misinformation almost certainly makes things worse.
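To see how this biased crediting could, all by itself, drive perceptions of expert consensus apart, here is a minimal simulation sketch in Python. It is not a model from the study: the group sizes, crediting probabilities, and the assumption that every subject encounters an evenly split stream of experts are all illustrative choices.

    # Minimal simulation sketch of self-generated biased sampling.
    # All parameters are illustrative assumptions, not study values.
    import random

    random.seed(1)

    N_PER_GROUP = 500      # agents per cultural group (assumed)
    ENCOUNTERS = 50        # expert encounters per agent (assumed)
    P_CREDIT_FIT = 0.9     # chance of crediting an expert whose position fits group values
    P_CREDIT_MISFIT = 0.2  # chance of crediting one whose position defies them

    def perceived_consensus(favors_high_risk: bool) -> float:
        """Fraction of the experts an agent actually credits who take the
        high-risk position, given biased crediting of an evenly split
        stream of experts."""
        credited = {"high": 0, "low": 0}
        for _ in range(ENCOUNTERS):
            position = random.choice(["high", "low"])  # 50/50 expert stream
            fits = (position == "high") == favors_high_risk
            p = P_CREDIT_FIT if fits else P_CREDIT_MISFIT
            if random.random() < p:  # agent counts this scientist as an "expert"
                credited[position] += 1
        total = credited["high"] + credited["low"]
        return credited["high"] / total if total else 0.5

    egalitarians = [perceived_consensus(True) for _ in range(N_PER_GROUP)]
    individualists = [perceived_consensus(False) for _ in range(N_PER_GROUP)]

    print(f"egalitarian communitarians:  {sum(egalitarians) / N_PER_GROUP:.0%} "
          "of credited experts take the high-risk position")
    print(f"hierarchical individualists: {sum(individualists) / N_PER_GROUP:.0%} "
          "of credited experts take the high-risk position")

Although both simulated groups see exactly the same evenly split stream of experts, each ends up perceiving “expert consensus” as leaning its own way.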

But the desire of the public to form culturally congenial beliefs supplies one of the main incentives for furnishing them with misleading information. To protect their cultural identities, individuals more readily seek out information that supports, rather than challenges, the beliefs that predominate in their group. The motivated public’s desire for misinformation thus makes it profitable to become a professional misinformer—whether in the media or in the world of public advocacy.

Other actors will have their own economic interest in furnishing misinformation. How effective their efforts will be, however, will still depend largely on how culturally motivated people are to accept their message. If this weren’t so, the prodigious efforts of commercial entities to convince people that climate change is a hoax, that nuclear power is safe, and that concealed-carry laws reduce crime would wear away the cultural divisions on these issues.

The reason that individuals with different values are motivated to form opposing positions on these issues is the symbolic association of those issues with competing groups. But that association can be created just as readily by accurate information as by misinformation if authority figures identified with only one group end up playing a disproportionate role in communicating it.

One can’t expect to win an “information war of attrition” in an environment like this. Accurate information will simply bounce off the side that is motivated to resist it.

So am I saying, then, that things are hopeless?

No, far from it.

But the only way to devise remedies for these pathologies is to start with an accurate understanding of why they occur.

The study of cultural cognition shows that the conventional view of misinformation (external source, credulous public) is inaccurate because it fails to appreciate how much more likely misinformation is to occur and to matter when scientific knowledge becomes entangled in antagonistic cultural meanings.

How to free science from such entanglements is something that the study of cultural cognition can help us to figure out too.

Now, would you like me to tell you something about that?


References

Kahan, D.M. Cultural Cognition as a Conception of the Cultural Theory of Risk. In Handbook of Risk Theory: Epistemology, Decision Theory, Ethics and Social Implications of Risk (eds. Hillerbrand, R., Sandin, P., Roeser, S. & Peterson, M.) 725–760 (Springer, 2012).

Kahan, D.M. Fixing the Communications Failure. Nature 463, 296–297 (2010).

Kahan, D.M., Braman, D., Cohen, G., Gastil, J. & Slovic, P. Who Fears the HPV Vaccine, Who Doesn’t, and Why? An Experimental Study of the Mechanisms of Cultural Cognition. Law & Human Behavior 34, 501–516 (2010).

Kahan, D.M. & Braman, D. Cultural Cognition of Public Policy. Yale J. L. & Pub. Pol’y 24, 147–170 (2006).

Kahan, D.M., Braman, D., Slovic, P., Gastil, J. & Cohen, G. Cultural Cognition of the Risks and Benefits of Nanotechnology. Nature Nanotechnology 4, 87–91 (2009).

Kahan, D.M., Jenkins-Smith, H. & Braman, D. Cultural Cognition of Scientific Consensus. J. Risk Res. 14, 147–174 (2011).

1 Comment

  • Fascinating and plausible hypothesis. Yes, I’d love you to describe what science might do to free itself from such entanglements. Or at least start to loosen them.

