One of the things that I believe very firmly is that the scientific method is not just for scientists; it's for everybody. Because science is just the systematic evaluation of what is and is not likely to be true, it's the right way to go in all fields where we're interested in making effective decisions or obtaining new knowledge. These fields include engineering, economics, law and criminal justice, politics and social policy, history, and everyday life. Science, after all, is just the systematization of common sense. In this post, I'll discuss how to employ a scientific approach to evaluating the stories we read in the news.
With training and a clear idea of what common sense actually is, we can often do a surprisingly good job of evaluating issues where we have little or no expertise and very limited information. A good starting point when modeling common sense is Bayes' theorem. It's not that I think our brains necessarily employ Bayes' theorem strictly, but to me it is perfectly obvious that natural selection of chance variations has equipped us with intellectual apparatus capable of mimicking, to a high degree, the results of Bayes' theorem in a broad range of commonly encountered circumstances.
Here's an example that shows how easy it can be to apply formal Bayesian reasoning to the news. A few months ago, a story came out that scientists had apparently observed neutrinos traveling faster than the speed of light in vacuum. Physicist and blogger Ted Bunn outlined a breathtakingly simple estimation of the probability that this finding was accurate. Really, anybody who understands Bayes' theorem and has moderately well-trained judgement could have performed this calculation. It juxtaposes two competing hypotheses: (1) physics as we know it is severely wrong, and (2) somebody got one of their measurements wrong. The result comes out overwhelmingly in favor of neutrinos that cannot exceed the speed of light, and it holds even if the assigned prior probabilities are adjusted by orders of magnitude. Of course, we now know that the neutrinos in question were behaving perfectly in concert with known physics, and the measurement was indeed in error.
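To make the structure of that calculation concrete, here is a minimal sketch in Python. The numbers are my own illustrative assumptions, not Bunn's actual figures; the point is only that the conclusion survives large changes to them.

```python
# Sketch of a Bunn-style calculation with made-up, illustrative numbers.
# Hypothesis 1: physics as we know it is severely wrong (genuinely superluminal neutrinos).
# Hypothesis 2: somebody got one of their measurements wrong.

prior_new_physics = 1e-6        # assumed prior for hypothesis 1: extraordinary claims start tiny
prior_error = 1 - prior_new_physics

p_result_if_new_physics = 0.5   # assumed: if relativity were wrong, such a result is quite plausible
p_result_if_error = 1e-3        # assumed: careful experiments do occasionally go wrong

# Bayes' theorem: probability that physics is severely wrong, given the reported result.
numerator = p_result_if_new_physics * prior_new_physics
posterior_new_physics = numerator / (numerator + p_result_if_error * prior_error)

print(f"P(new physics | reported result) ~ {posterior_new_physics:.1e}")  # ~ 5e-4 with these numbers
```

Shift either prior up or down by a couple of orders of magnitude and the verdict barely moves, which is exactly the point of the exercise.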
We can do a lot to train our intuition and improve our ability to discern merit, or its absence, in the news by learning something of the way news journalism functions. For example, from examination of Bayes' theorem, it is clear that the probability P(T | R) that a story is true, T, given that it has been reported, R, depends on both P(R | T) and P(R | F), where F means the story is false. Anything that helps you estimate these and similar quantities therefore enhances the performance of your inner Bayesian reasoner, and has got to be a good thing.
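Written out in full, Bayes' theorem for this case reads P(T | R) = P(R | T) P(T) / [P(R | T) P(T) + P(R | F) P(F)]. Here is a small Python sketch with assumed numbers, just to show how the pieces fit together:

```python
def p_true_given_reported(p_true, p_report_if_true, p_report_if_false):
    """P(T | R) via Bayes' theorem; all three inputs are your own estimates."""
    p_false = 1 - p_true
    reported_and_true = p_report_if_true * p_true
    return reported_and_true / (reported_and_true + p_report_if_false * p_false)

# Assumed, illustrative numbers: a claim you'd give a 30% prior of being true,
# which is twice as likely to be reported if true as if false.
print(p_true_given_reported(p_true=0.3, p_report_if_true=0.8, p_report_if_false=0.4))
# ~ 0.46: being reported raises the probability, but hardly settles the matter.
```

The better you understand how newsrooms operate, the less wild your estimates of P(R | T) and P(R | F) will be, which is where the rest of this post comes in.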
With particular regard to stories about science in the media, a gold mine of insight is to be found by browsing the archive of Ben Goldacre's Bad Science blog. Here, you can develop a healthy skepticism by reading, among other things, dozens of examples of how journalists distort stories to make them more sexy, how they misunderstand technical details, and how they play into the hands of PR companies.
Looking beyond science stories, an excellent book, 'Flat Earth News,' by seasoned newspaper journalist Nick Davies goes into terrifying detail about the extent to which news journalism in general is broken. Davies outlines ten rules of production, which he argues have important impacts on the quality and content of reported news. Examining them offers a stark vision, but also provides invaluable data in the quest to estimate what is true, what is important, how details are likely to be distorted, and what kinds of things are possibly being kept from you. Very briefly, these rules of production described by Davies are:
(1) Run cheap stories
- no long investigations; use information that's readily available. Most news stories are written in minutes.
(2) Select safe facts
- prefer official sources
(3) Don't upset the wrong people
- the more powerful somebody is, the more likely they are to sue, for example
(4) Select safe ideas
- don't print anything that goes against the widely held consensus
(5) Give both sides of the story
- if you never actually say anything definite, then nobody can accuse you of being wrong
(6) Give them what they want
- if it increases readership, then tell it. Tell it in a way that increases readership.
(7) Bias against truth
- the story can't be too complex, so suppress as many details as possible
(8) Give them what they want to believe in
- again, don't upset the readers
(9) Go with the moral panic
- in times of crisis, gauge public opinion and make sure to amplify it
(10) Give them what everybody else is giving them
- don't lose customers just because somebody else stoops lower than you are willing to
Understanding these things enables a skeptical mind to penetrate somewhat beyond what is actually presented in the news. A story can be judged, for example, on where the information probably comes from, what alternative sources were probably ignored, how much effort went into producing the story, why this story is in the news anyway - what is its real importance? This is skepticism. Science is healthy skepticism: scrutinizing everything, trying to find the flaws in every piece of evidence. Obvious, I know, but one or two people out there don't seem to have embraced it fully yet.