12 July 2007

can scientists make up their minds?

"How should unproven findings be publicized?" at Statistical Modeling, Causal Inference, and Social Science, via Notional Slurry.

A year or so ago, Satoshi Kanazawa claimed that, roughly speaking, attractive people are more likely to have daughters than sons. I've also heard recently that successful people are more likely to have sons; for example, something like sixty percent of all the children of U.S. presidents have been male. The mass media have recently picked up this story. Andrew at Statistical Modeling, Causal Inference, and Social Science criticized these findings; you can read his commentary to find out why, but he believes they are statistical artifacts, though he thinks there may be some truth to the conjectures.
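Gelman's actual argument is in his post; purely for flavor, here's a toy simulation (mine, not his, with a made-up group size) of how easily a lopsided sex ratio shows up by chance in a small group, given a baseline probability of a male birth of roughly 51%.

```python
# Toy simulation (my illustration, not Gelman's analysis): how often does a
# small group of children show a 60/40 split, one way or the other,
# purely by chance?
import random

random.seed(0)
P_MALE = 0.512       # approximate baseline probability of a male birth
GROUP_SIZE = 30      # hypothetical group of children (made-up number)
TRIALS = 100_000

lopsided = 0
for _ in range(TRIALS):
    sons = sum(random.random() < P_MALE for _ in range(GROUP_SIZE))
    if sons >= 0.6 * GROUP_SIZE or sons <= 0.4 * GROUP_SIZE:
        lopsided += 1

print(f"Groups of {GROUP_SIZE} with at least a 60/40 split: {lopsided / TRIALS:.0%}")
```

The point isn't the exact number, just that lopsided splits in small samples are routine; for the real critique, read his post.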

Of course, any given mass media outlet isn't going to report "maybe attractive people have more daughters than sons". They'll report one of the following two things:

  • Scientists say that attractive people have more daughters than sons.

  • Scientists say that there is no link between attractiveness and how many daughters you have.


Then in the first case they'd go and find an attractive couple that had, say, three sons and no daughters, to "disprove" this.

This reminds me of the way that the mass media treats, say, dark chocolate. Chocolate's supposed to be bad for you, because it is full of fat and sugar. But it's also supposed to be good for you because it contains certain antioxidants. One day they'll report one thing, one day they'll report the other. You know what I do? I ignore all the studies and eat chocolate, because I like it. (I happen to prefer dark chocolate to milk chocolate, which is probably a good thing in terms of health, but my preference is motivated purely by taste.)

This probably gives laypeople the idea that scientists are constantly changing their minds (which is true -- good scientists change their minds as new evidence comes in, or as new interpretations of old evidence become clear). I fear, though, that this may also lead laypeople to distrust science -- if scientists can't even decide whether chocolate is "good for you" or "bad for you", what good are they?

But science is more complicated than that -- no food is entirely "good" or "bad", and so on.

What I'd like to see -- and perhaps it's already out there -- is more reporting of the meta-literature. Not "one group of scientists said today that chocolate is good" or "another group of scientists said today that chocolate is bad", but "some scientist looked at what all the other scientists said, and most of them think chocolate is bad". You're not going to hear that on the evening news, though, because the chocolate makers advertise there. But what about aggregating all that stuff online?

But I think most people will want a one-bit answer -- "yes" or "no". And it's more complicated than that. Some studies don't even give you a whole bit of information -- they tell you "probably yes". And a bunch of these "probably yes" answers can add up to a "yes". But not if everybody's working in isolation. This applies to ordinary life as well -- and I believe that it would be useful if there were some way that all the anecdotal experiences of people with, say, a particular company could be aggregated into something statistically significant. But that's a matter for another post.
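To make the "adding up" a little more concrete, here's a rough sketch (my own, with made-up numbers) of how several independent "probably yes" answers combine under Bayes' rule, and how little of a bit each one carries by itself. It treats each study's number as an honest, independent probability, which real studies are not, but it shows the arithmetic.

```python
# Rough sketch (made-up numbers): combining independent "probably yes" answers
# with Bayes' rule, and measuring how much of a bit each one is worth.
from math import log2

def combine(prior, study_probs):
    """Combine independent studies, each reporting P(hypothesis is true)."""
    odds = prior / (1 - prior)
    for p in study_probs:
        odds *= p / (1 - p)            # each study contributes its odds ratio
    return odds / (1 + odds)

prior = 0.5                            # start out undecided
studies = [0.7, 0.7, 0.7, 0.7]         # four "probably yes" answers

print(f"After {len(studies)} weak studies: P(yes) = {combine(prior, studies):.2f}")

# In information terms, a single 70% answer resolves less than a whole bit of
# the original yes/no uncertainty: 1 - H(0.7) bits, where H is binary entropy.
bits = 1 + 0.7 * log2(0.7) + 0.3 * log2(0.3)
print(f"One 'probably yes' at 70%: about {bits:.2f} bits")
```

Four 70% answers push you from a coin flip to about 97% -- but only if they're genuinely independent and somebody actually combines them, which is the "working in isolation" problem.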
