Today, Yale law professor Dan Kahan presents evidence that we Americans really suck at math. “Correctly interpreting the data was expected to be difficult,” he says of the test subjects from his latest study, and he turned out to be right. But all he was asking them to do was calculate a simple percentage. If 200 out of 300 people in one group get better by taking a pill and 100 out of 125 people in a different group get better by doing nothing, which is better? Taking a pill or doing nothing? You’d pass Kahan’s test if you understood that the raw numbers aren’t enough to get the right answer. You have to calculate the percentage of each group that got better.
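The arithmetic behind Kahan's question can be made explicit. A minimal sketch (the variable names are my own, not from the study) showing why the raw counts mislead: the pill group has more recoveries in absolute terms, but a lower recovery rate.

```python
# Hypothetical pill study from Kahan's test, as described above.
pill_improved, pill_total = 200, 300
control_improved, control_total = 100, 125

# The comparison that matters is the rate, not the raw count.
pill_rate = pill_improved / pill_total          # 200/300 ≈ 66.7%
control_rate = control_improved / control_total  # 100/125 = 80.0%

print(f"Pill group:    {pill_rate:.1%} improved")
print(f"Control group: {control_rate:.1%} improved")
print("Pill is better" if pill_rate > control_rate else "Doing nothing is better")
```

Run it and the pill group's 200 recoveries shrink to a 66.7% rate, against 80% for doing nothing, so the correct answer is "doing nothing."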
It’s disheartening that most people couldn’t figure that out, though hardly unexpected. But what came next is… well, not unexpected, maybe. But certainly discouraging. Kahan ran the exact same test with the exact same data, except this time the question was about gun bans and crime levels. Half of the time, he presented data suggesting that a gun ban increased crime, while the other half of the time the data suggested that a gun ban decreased crime. And guess what? The subset of test subjects who were very good at math suddenly got really stupid if they didn’t like the answer they got. Here’s the chart:
This comes via Chris Mooney, who describes the whole thing in much more detail here. However, there’s one big caveat: If I’m interpreting the dataset correctly, the sample size of highly numerate subjects is very small. Roughly speaking, there were about 30 liberals and 30 conservatives who were highly numerate and were given the gun ban version of the test. That’s not a lot.
On the other hand, the effect size is pretty stunning. There’s a huge difference in the rate at which people did the math correctly depending on whether they liked the answer they got. I’d like to see some follow-ups with more subjects and different questions, but it sure looks as if we’d see the same dismal effect.
How big a deal is this? In one sense, it’s even worse than it looks. Aside from being able to tell that one number is bigger than another, this is literally about the easiest possible data analysis problem you can pose. If ideologues actively turn off their minds even for something this simple, there’s really no chance of changing their minds with anything even modestly more sophisticated. This is something that most of us pretty much knew already, but it’s a little chilling to see it so glaringly confirmed.
But in another sense, it really doesn’t matter at all. These days, even relatively simple public policy issues can only be properly analyzed using statistical techniques that are beyond the understanding of virtually all of us. So the fact that ideology destroys our personal ability to do math hardly matters. In practice, nearly all of us have to rely on the word of experts when it comes to this kind of stuff, and there’s never any shortage of experts to crunch the numbers and produce whatever results our respective tribes demand.
We believe what we want to believe, and neither facts nor evidence ever changes that much. Welcome to planet Earth.