Science Journalism and How To Present Probabilities [Link]
post by gimpf · 2011-03-14T18:30:51.721Z · LW · GW · Legacy · 6 comments
I just stumbled across Language Log: Thou shalt not report odds ratios (2007-07-30), HT reddit/statistics:
(…) this finding was widely reported in the media:(…)
“Doctors are only 60% as likely to order cardiac catheterization for women and blacks as for men and whites.”
Now let's try a little test of reading comprehension. The study found that the referral rate for white men was 90.6%. What was the referral rate for blacks and women?
If you're like most literate and numerate people, you'll calculate 60% of 90.6%, and come up with .6*.906 = .5436. So, you'll reason, the referral rate for blacks and women was about 54.4%.
But in fact, what the study found was a referral rate for blacks and women of 84.7%.
This is a failure mode of pop-sci journalism I was not aware of (if I happened to know enough to understand the actual papers, I'd definitely value pop-sci at minus-whatever in the meantime…)
On a related note, this article reminded me of Understanding Uncertainty: 2845 ways to spin the Risk, which argues that certain presentations bias the understanding of probabilities:
Similarly people confronted with the statement “Cancer kills 2,414 people out of 10,000” rated cancer as more risky than those told “Cancer kills 24.14 people out of 100”. The potential influence of the size of the numerator and denominator is known as the 'ratio bias'.
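The two statements describe exactly the same mortality rate; a throwaway check (mine, not from the article) makes that obvious:

```python
# Both framings of the cancer statistic reduce to the same rate.
rate_large = 2414 / 10_000   # "2,414 people out of 10,000"
rate_small = 24.14 / 100     # "24.14 people out of 100"

print(f"{rate_large:.4f} {rate_small:.4f}")  # 0.2414 0.2414
```

Identical numbers, yet the big-numerator framing reads as riskier.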
I’d be quite interested if anybody could point me to further resources on good presentation of statistical facts (beside the normalization on one type of presentation), or on further pop-sci journalism failure modes.
6 comments
comment by DavidAgain · 2011-03-14T19:28:13.442Z · LW(p) · GW(p)
A similar problem is presented on this (generally excellent) site
http://www.straightstatistics.org/article/stumped-claims-accurate-diagnosis
Ben Goldacre also has some good stuff
comment by benelliott · 2011-03-14T18:55:39.489Z · LW(p) · GW(p)
But in fact, what the study found was a referral rate for blacks and women of 84.7%
I might just be being stupid, but how was this figure derived at all? I understand that the point of the article is that statistics can be presented in non-intuitive and confusing ways, but whenever I've seen such examples in the past there has always been some justification, however shaky.
Did the media just outright lie this time, or am I missing something?
comment by janos · 2011-03-14T19:15:58.537Z · LW(p) · GW(p)
Nope: the odds ratio was (.847/(1-.847))/(.906/(1-.906)), which is indeed 57.5%, which could be rounded to 60%. If the starting probability was, say, 1%, rather than 90.6%, then translating the odds ratio statement to "60% as likely" would be legitimate, and approximately correct; probably the journalist learned to interpret odds ratios via examples like that. But when the probabilities are close to 1, it's more correct to say that the women/blacks were 60% more likely to not be referred.
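janos's arithmetic is easy to verify with a short script (my sketch, using the referral rates from the post, not code from the thread):

```python
# Compare the odds ratio (what the study reported) with the
# risk ratio (what readers intuitively compute).
def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

p_white_men = 0.906  # referral rate for white men
p_others = 0.847     # referral rate for blacks and women

odds_ratio = odds(p_others) / odds(p_white_men)
risk_ratio = p_others / p_white_men

print(f"odds ratio: {odds_ratio:.3f}")  # ~0.574, reported as "60% as likely"
print(f"risk ratio: {risk_ratio:.3f}")  # ~0.935, the intuitive reading
```

The gap between 0.57 and 0.94 is exactly what tripped up the journalists: for probabilities near 1, the odds ratio and the risk ratio diverge sharply.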
comment by Perplexed · 2011-03-15T05:22:08.918Z · LW(p) · GW(p)
it's more correct to say that the women/blacks were 60% more likely to not be referred.
Hmmm. I would have said that white men were 60% as likely to not be referred. (This is the first time I've seen the golden ratio show up in a discussion of probability!)
comment by jimrandomh · 2011-03-14T19:04:07.676Z · LW(p) · GW(p)
Science journalists are expected to read papers, pick out the important parts, and rewrite them in their own words. Unfortunately, it's impossible to reliably rewrite a mathematical statement in different words without understanding what it means, so they sometimes fail and misrepresent the research they report on. But this is a problem with the journalism and its editing, not with the original research. While it's good to avoid being misconstrued, papers should be written for experts first and journalists second or lower.