• Rational group-think

    In unsurprising news, party ideology is associated with views on Obamacare. Are these evidence-based views? Most likely not.

    Thus, at one level—a very individualistic one—it will make perfect sense in this situation for individuals to attend to information, including evidence of what is known to science, that promotes the formation of identity-congruent beliefs. Again, even citizens of modest science literacy and critical reasoning skills will likely be able to form such beliefs without difficulty—because figuring out what view prevails among those with whom one shares one’s most important connections depends on a basic kind of cultural competence, not on an understanding of or a facility with empirical evidence. But those citizens who enjoy above-average science comprehension will not face any less incentive to form such beliefs; indeed, they will face pressure to use their intelligence and reasoning skills to find evidentiary support for identity-congruent beliefs the comprehension of which would likely exceed the capacity of most of their peers (Kahan, Peters, Wittlin, Slovic, Ouellette, Braman & Mandel 2012).

    At a collective level, of course, this style of engaging decision-relevant science can be disastrous. If all individuals follow it at the same time, it will impede a democratic society from converging, or at least converging as quickly as it otherwise would, on understandings of fact consistent with the best available evidence on matters that affect their common welfare. This outcome, however, will not change the incentive of any individual—who despite the harm he or she suffers as a result of unaddressed risks or ill-considered policies cannot change the course of public policymaking by changing his or her personal stances, which, if contrary to the ones that prevail in that person’s group, will continue to expose him or her to considerable social disadvantage. […]

    We submit that a form of information processing cannot reliably be identified as “irrational,” “subrational,” “boundedly rational” or the like independent of what an individual’s aims are in making use of information. It is perfectly rational, from an individual-welfare perspective, for individuals to engage decision-relevant science in a manner that promotes culturally or politically congenial beliefs. Making a mistake about the best-available evidence on an issue like climate change, nuclear waste disposal, or gun control will not increase the risk an ordinary member of the public faces, while forming a belief at odds with the one that predominates on it within important affinity groups of which he or she is a member could expose him or her to an array of highly unpleasant consequences (Kahan 2012).

    That’s from a very interesting paper by Dan Kahan on “motivated numeracy”. It is, believe it or not, about an interesting randomized experiment. I’d tell you about it, but you could just as well read the paper. It’s ungated. Recognizing that it’s long and you’re busy, I recommend you at least read Chris Mooney’s summary. If even that’s too long for you, try Kevin Drum. The real reason I’m not writing a longer post is that Kevin and Chris have already done better jobs than I could do.


    • Kahan’s work is so important.

      I wonder if he has studied possible ways to teach people how to be self-critical? That is, to be aware of the biasing effects of ideology and to look for ways to attack one’s own beliefs?

      My belief is that this kind of self-criticism is essential to real scientific achievement. (But maybe I am naive about that.)

      • It may be important but I don’t think the study is that great. It seems to have some of the same problems as the trolley problem. In an effort to simplify complex problems and force a choice, they leave out important information. Maybe that doesn’t make a difference, but I don’t think they have made the case.

        I don’t see an obviously correct answer based on the information presented. I know the “correct” answer that they want me to pick. In real life (or even in their study) I’d be asking for more information before answering. I’d really like them to run the study with a third option (need more information/don’t know/etc.). I suspect that might really change their data.

    • In these questions, the first-glance answer is wrong. That’s why 59% of all participants got it wrong. It would be very interesting to repeat the experiment with questions that don’t initially lead you to the wrong (or the right) answer. That would help distinguish between “not paying attention when the initial answer confirms pre-existing beliefs” and “getting the right answer but convincing yourself it is wrong when it goes against pre-existing beliefs.”

      One nit to pick — I am not at all sure that most liberals would expect a concealed carry ban to reduce crime. I would think that it might reduce gun-related violence, including accidental or self-inflicted injury. Not at all clear to me that it would reduce (or increase) crime in general.
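      For readers who haven’t seen the paper, the trap in this kind of covariance question can be sketched in a few lines of code. This is an illustrative reconstruction with made-up cell counts, not the paper’s actual numbers: the first-glance reading compares raw counts within one row of the 2×2 table, while the correct reading compares rates across the two conditions, and the numbers here are chosen so the two approaches disagree.

      ```python
      # Hypothetical 2x2 results table in the style of Kahan's concealed-carry
      # version of the covariance problem. Cell counts are illustrative only.
      results = {
          "ban":    {"crime_down": 223, "crime_up": 75},
          "no_ban": {"crime_down": 107, "crime_up": 21},
      }

      def naive_answer(tbl):
          """First-glance heuristic: look only at the 'ban' row. Crime went
          down in far more ban cities than it went up, so the intuitive
          answer is that the ban decreased crime."""
          row = tbl["ban"]
          return "decrease" if row["crime_down"] > row["crime_up"] else "increase"

      def ratio_answer(tbl):
          """Correct approach: compare the *rate* of crime decrease between
          the ban and no-ban conditions, not raw counts."""
          rate_ban = tbl["ban"]["crime_down"] / sum(tbl["ban"].values())
          rate_no_ban = tbl["no_ban"]["crime_down"] / sum(tbl["no_ban"].values())
          return "decrease" if rate_ban > rate_no_ban else "increase"

      print(naive_answer(results))  # "decrease" -- the tempting first-glance answer
      print(ratio_answer(results))  # "increase" -- the rates cut the other way
      ```

      With these numbers, crime fell in about 75% of ban cities but in about 84% of no-ban cities, so the row that “looks” favorable to the ban actually points the other way once you compare rates. That divergence between the salient cell and the correct ratio comparison is what the experiment exploits.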

      • Interesting anecdote about concealed carry… In Michigan a few days ago, two motorists got into a road rage incident (tailgating). Both of them had legal concealed carry permits. They ended up killing each other.
        So, in this case, concealed carry turned a shouting match into two deaths… not a good outcome or endorsement for concealed carry.

    • There’s a really interesting back-and-forth with Kahan about interpretation of these results and the role of priors:


    • I have found this to be true not only of how people evaluate public policy but also of how risky they believe a particular behavior or decision to be. Their decisions have more to do with their beliefs than with the actual risk. For example, my neighbor who refuses to vaccinate her child but is putting in a pool. Or another friend who asks you every time you invite her child over if you have any guns in the house but has two very old, large, aggressive dogs that have to be locked behind a gate in the kitchen whenever a child comes to her house. The fact that a visiting child is just one loose bolt away from danger seems lost on her. Just as we underestimate the dangers we live with daily and overestimate the dangers we aren’t familiar with, I think we do the same with evidence that either reinforces or contradicts our belief system.