Really, this post is about statistics, but you wouldn't have come if I had said that.
The thing is that there are reports published every day on every possible subject, many of them linked to food and health. My next post, which prompted this pre-post, is on prostate cancer and red wine. I am writing this post separately as I think I will be referring to it time and time again.
Newspapers happily publish these 'shock-horror' health reports, edited, cherry-picked, sensationalised and often unverified.
Readers, knowing no better, accept them at face value.
To help me explain, let me now call on some sharks and ice-creams.
Sharks & Ice Creams - the pursuit of causality.
A strong correlation has been shown between shark attacks and ice-cream consumption in Australia.
Does that mean eating ice-creams makes you more prone to being eaten by a shark?
Does it mean that, having been attacked by a shark, you start craving dairy food?
Or could it possibly be that a third thing, maybe the season, is controlling both things?
Indeed, it is in summer that both shark attacks and ice-cream consumption increase.
Causality is fundamental to interpreting any report. What caused what? Whenever you see a report in the papers linking two things, ask yourself the following three questions:
- Is A really causing B, as claimed?
- Could B actually be causing A?
- Could something else, C, be causing both A and B?
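The third possibility is the one the sharks and ice-creams illustrate, and it is easy to demonstrate with a toy simulation. In the sketch below (all numbers are made up, and Python is assumed purely for illustration), a lurking variable "season" drives both ice-cream sales and shark attacks; neither causes the other, yet the two end up strongly correlated.

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical "season" signal over ten years of monthly data:
# high in summer, low in winter.
months = np.arange(120)
season = np.sin(2 * np.pi * months / 12)

# Neither quantity depends on the other -- both depend only on season,
# plus a little independent noise.
ice_cream_sales = season + rng.normal(0, 0.2, size=months.size)
shark_attacks = season + rng.normal(0, 0.2, size=months.size)

# Pearson correlation between the two: strongly positive,
# despite there being no causal link in either direction.
r = np.corrcoef(ice_cream_sales, shark_attacks)[0, 1]
print(f"correlation between ice cream and shark attacks: {r:.2f}")
```

Take out the shared season signal (leave only the noise) and the correlation collapses towards zero, which is exactly the "something else, C" case.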
The Crud Factor
The crud factor in statistics is an acknowledgement that everything is correlated.
If I were to take the data for tin production in Bolivia over the last ten years, together with the number of deaths in bicycle accidents in Belgium over the same period, and plot them on an X-Y graph, there would almost certainly be a correlation between the two.
A chance correlation.
It may be negative, it may be positive, it may be big or small, but it is most unlikely to be zero.
That is why all scientific research needs to be replicated: scientists test the same hypothesis (that tin production in Bolivia affects bicycle fatalities in Belgium) using different data. If they can replicate the results, their confidence in the hypothesis increases.
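Both points, the crud factor and the value of replication, can be sketched in a few lines. Here random noise stands in for the Bolivian tin and Belgian bicycle figures (the data are invented for illustration): a single ten-point sample almost never gives a correlation of exactly zero, yet when the "study" is replicated on fresh data the effect averages away to nothing.

```python
import numpy as np

rng = np.random.default_rng(42)

# Ten years of completely unrelated "measurements" (made-up numbers).
tin_production = rng.normal(size=10)   # stand-in for Bolivian tin output
bicycle_deaths = rng.normal(size=10)   # stand-in for Belgian bicycle fatalities

# A one-off study: the correlation will be nonzero purely by chance.
r = np.corrcoef(tin_production, bicycle_deaths)[0, 1]
print(f"one-off correlation: {r:.2f}")

# "Replicate" the study 1000 times with fresh random data.
# Individual runs wander positive and negative, but the average
# correlation sits near zero -- the chance finding does not hold up.
replications = [
    np.corrcoef(rng.normal(size=10), rng.normal(size=10))[0, 1]
    for _ in range(1000)
]
print(f"mean correlation over 1000 replications: {np.mean(replications):.3f}")
```

With only ten data points, quite large chance correlations are common, which is why a single small study linking two things should never be taken as the last word.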
It is not uncommon for replicate testing to fail to reproduce the original work.
Sadly, journal editors are not as keen to publish a negative result as they are to publish a positive, interesting one, so such work often never gets beyond the waste bin. And should they publish an article negating previous work, the news media are far less likely to run it, because it is not "news" and not shocking enough.
So, when a news item showcases some horror relationship between a food and health, or even some wonder cure, be ready to ask yourself whether the implied causality makes sense and whether it could just be a chance correlation.
There you go. That didn't hurt, did it?