Saturday, November 6, 2010

Statistics in Journalism: Handle with Care


There are many ways to get statistics wrong, and we learned about some of them from the speakers at the 'Get the numbers right: A workshop on reporting statistics' session.

They also offered advice on how journalists could avoid the common mistakes and treat statistics right. Clearly this is a topic that many people care about, judging by how packed the room was.

Odds Ratios vs. Relative Risk
Stephen Ornes started off by drawing on his experience as a fact-checker, and talked about his pet peeve: the inaccurate treatment of Odds Ratios.

Odds Ratios and Relative Risk are two common ways to report results from scientific studies. An Odds Ratio is a ratio of two odds, where the odds of an event are the probability that it happens divided by the probability that it doesn't. Ornes's bottom line was that journalists shouldn't report an Odds Ratio as a percentage increase in risk, because that's not what it measures.

Relative Risk is a lot more intuitive: it's simply the risk in one group divided by the risk in another. But that means you always have to say which groups are being compared for the number to make sense.
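A tiny worked example (with made-up numbers, not any real study) shows how the two measures differ, and why reading an odds ratio as a percentage increase in risk overstates a result:

    # Hypothetical study: 20 of 100 exposed people get sick vs. 10 of 100 controls.
    risk_exposed = 20 / 100   # 0.20
    risk_control = 10 / 100   # 0.10

    # Relative risk: one group's risk divided by the other's.
    relative_risk = risk_exposed / risk_control        # 2.0 -- risk is doubled

    # Odds ratio: ratio of the odds, where odds = p / (1 - p).
    odds_exposed = risk_exposed / (1 - risk_exposed)   # 0.25
    odds_control = risk_control / (1 - risk_control)   # 0.111...
    odds_ratio = odds_exposed / odds_control           # 2.25

    print(f"Relative risk: {relative_risk:.2f}")  # 2.00 -> "twice the risk"
    print(f"Odds ratio:    {odds_ratio:.2f}")     # 2.25 -> NOT "125% more risk"

The two numbers only stay close when the outcome is rare, and the gap grows as the outcome gets more common, which is why an odds ratio quietly reported as a risk increase can badly overstate a finding.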

Ornes said journalists often get statistics like Odds Ratios and Relative Risks wrong with the best of intentions--when they're trying to simplify them so they're easier to understand. His advice? "Befriend a statistician."

How to properly interpret statistics in a paper
Andrew Gelman, professor of statistics and political science at Columbia University, was next up. Gelman has also published some popular books and runs a blog that all the speakers recommended as a good place to learn about basic statistical concepts.

Gelman used the example of a recent paper that concluded that 'beautiful parents have more daughters' to show some common problems with the paper's statistics. He talked about the questions that skeptical journalists should ask when writing about a paper like that.

His conclusion was that given the sample size and the lack of statistical power, there really wasn't much one could conclude from that paper. And this was backed up by an independent study he did looking at whether People magazine's 50 sexiest people had more boys or girls.

Gelman admitted another reason he liked doing that study was "I loved being able to refer to Brad Pitt in my statistics paper."
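To make the sample-size point concrete, here's a sketch with invented counts (not the paper's or Gelman's actual data): with only a few dozen children, even a lopsided girl/boy split comes with a confidence interval far too wide to distinguish from the population baseline of roughly 48.5% girls.

    import math

    # Hypothetical counts, not the actual data: 30 girls among 56 children.
    girls, total = 30, 56
    baseline = 0.485  # approximate population share of girls at birth

    p_hat = girls / total                          # 0.536
    se = math.sqrt(p_hat * (1 - p_hat) / total)    # normal-approximation SE
    ci_low, ci_high = p_hat - 1.96 * se, p_hat + 1.96 * se

    print(f"Observed share of girls: {p_hat:.3f}")
    print(f"95% CI: ({ci_low:.3f}, {ci_high:.3f})")   # about (0.41, 0.67)
    print(f"Interval covers the {baseline:.1%} baseline: "
          f"{ci_low <= baseline <= ci_high}")          # True: no evidence either way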

Gelman also pointed out an important reason why journalists need to be skeptical about the statistics in scientific studies: to get published in the peer-reviewed literature, a study generally needs statistically significant results. As a result, published studies are biased toward overestimating effects.
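This "significance filter" is easy to demonstrate with a simulation (a sketch with made-up parameters, not any particular study): generate many small studies of a modest true effect, keep only the statistically significant ones, and look at the average surviving estimate.

    import math
    import random
    import statistics

    random.seed(0)

    TRUE_EFFECT = 0.2   # small true difference between groups (made up)
    N = 25              # participants per group
    SIGMA = 1.0         # known standard deviation, so a simple z-test works
    STUDIES = 10_000

    published = []  # estimates from studies that cleared p < 0.05
    for _ in range(STUDIES):
        treated = [random.gauss(TRUE_EFFECT, SIGMA) for _ in range(N)]
        control = [random.gauss(0.0, SIGMA) for _ in range(N)]
        diff = statistics.mean(treated) - statistics.mean(control)
        se = math.sqrt(2 * SIGMA ** 2 / N)
        if abs(diff / se) > 1.96:  # the significance filter
            published.append(diff)

    print(f"True effect:                  {TRUE_EFFECT}")
    print(f"Average 'published' estimate: {statistics.mean(published):.2f}")
    # The estimates that survive the filter average roughly three times
    # the true effect -- the filter inflates what gets reported.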

Things journalists often get wrong about statistics
Tom Siegfried, Editor-in-Chief of Science News, talked about the common mistakes that journalists make when writing about statistics.

- He explained what statistical significance actually means, emphasizing that it isn't the same as 'significant' in the sense of important. With a large enough sample, you can almost always find something that's statistically significant.
- Conversely, lack of statistical significance doesn't mean there is no effect. It may just mean the sample size was too small, and that more studies are needed. (The sketch after this list illustrates both points.)
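
Both failure modes are easy to see in a quick simulation (a sketch with invented effect sizes, using a plain two-sample z-test):

    import math
    import random
    import statistics

    random.seed(1)

    def z_test_p(a, b):
        """Two-sided p-value for a two-sample z-test (normal approximation)."""
        diff = statistics.mean(a) - statistics.mean(b)
        se = math.sqrt(statistics.variance(a) / len(a)
                       + statistics.variance(b) / len(b))
        z = abs(diff) / se
        return 1 - math.erf(z / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))

    # A trivially small difference (0.02 standard deviations) reaches
    # "statistical significance" once the sample is huge.
    big_a = [random.gauss(0.02, 1) for _ in range(200_000)]
    big_b = [random.gauss(0.00, 1) for _ in range(200_000)]
    print(f"n = 200,000 per group, tiny effect: p = {z_test_p(big_a, big_b):.2g}")

    # A genuinely large difference (0.5 standard deviations) often fails to
    # reach p < 0.05 when each group has only 15 people. Count how often it
    # succeeds across 2,000 simulated studies.
    RUNS = 2_000
    hits = sum(
        z_test_p([random.gauss(0.5, 1) for _ in range(15)],
                 [random.gauss(0.0, 1) for _ in range(15)]) < 0.05
        for _ in range(RUNS)
    )
    print(f"n = 15 per group, real effect: significant in {hits / RUNS:.0%} of runs")

The exact percentages aren't the point; the point is that 'statistically significant' tracks sample size as much as it tracks importance.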

He suggested writing about a statistically significant effect as something that "was seen at a level unlikely to be explained by chance," which I thought was a very clear way to express it.

He also pointed out that the recipe for wrong science is the same as the recipe for wrong science news. So here are the things to watch out for:
- Is it the first report of something?
- Is it an advance in a hot research field?
- Is it contrary to previous belief?

Any of these could mean it's a really good study, or it could mean it's wrong, and it's up to us as journalists to weigh that when writing about it, the speakers said.

Practical Advice
All three speakers weighed in on how journalists can avoid common mistakes in interpreting statistics:

1. Read the actual methods section of the paper, and use your own intuition. Does the result make sense based on what you expect? Look at other studies - does this agree with past results?

2. Then contact an expert to confirm. "Befriend a statistician," was Ornes' advice. He mentioned STATS.org, a site where journalists can ask statisticians specific questions. Gelman pointed out that not all statisticians are the same, so it's important to find one who's an expert in the specific field you're writing about.

3. Judge how important the study actually is, and decide how to write about it. Explain the statistics in plain language, but be careful to not oversimplify or draw inaccurate conclusions.

And if a study draws a lot of attention but your analysis finds it doesn't hold up, that might be an opportunity to explain your take on it.

The bottom line was that statistics can be invaluable in interpreting complex information, but the onus is on journalists to do their homework and know enough about statistics to get it right.
