Having been something of a second-tier Mad Magazine for decades, Cracked reinvented itself for the Internet as a source of fascinating, funny and well-researched lists on a wide variety of subjects. Often its lists challenge common myths and urban legends, like "5 Things You Won't Believe Aren't In the Bible" or "6 Ridiculous Lies You Believe About the Founding of America."
Among the lists that appeared in early April was Soren Bowie's "5 Guilty Pleasures That Are Secretly Saving the World," which discussed how credit card debt was eradicating illness by reducing the transmission of disease via paper currency, how the death of print media was slowing global warming, how marijuana production helps the environment and how texting was preventing brain tumors.
Topping the list was how misinformation on the Internet encourages people to be smarter and more skeptical, and it wrapped up with this sentence: "Above all, the Internet has taught us how to discriminate between a well-informed piece with reliable sources and, say, an elaborate April Fools prank filled with bold-faced lies that sort of feel true."
That's right: The entire article was an elaborate and brilliant April Fools joke. It worked so well because it carefully exploited the same mental weaknesses it was criticizing. As such, the article is a fantastic case study in the propagation of misinformation.
For starters, as Soren noted, all of the items feel as if they could or should be true. They start with a commonsense observation that's indisputable (people do charge/text/read online more), and then draw what would seem to be a commonsense conclusion.
But common sense is unreliable when it comes to seeking out truth. People are subject to all manner of logical fallacies and faulty observations; we have to rely on things like the scientific method to strip away those biases.
Confirmation bias makes people seek out and prefer information that validates their pre-existing views; the self-serving bias causes people to interpret information in ways that benefit them. Soren exploits both, starting with the title itself: "5 Guilty Pleasures That Are Secretly Saving the World." People want to believe that texting is good for their health and that pot is good for the planet, and they can see how some information might support those conclusions, so they're predisposed to be less critical of an article telling them exactly that.
One thing confirmation bias can cause a person to do is to cherry-pick which evidence to believe and argue. Pseudoscience uses this tactic regularly, preferring anecdotes to scientific tests, or relying on poorly designed or unblinded studies rather than well-designed, well-controlled, double-blinded studies that are better at eliminating possible bias by the researchers conducting them. Evidence is weighed not on its quality, but on how helpful it is to the preferred conclusion. This plays a major role in Soren's item about cellphones and cancer.
Such cherry-picking is also at the heart of claims that vaccines are dangerous and cause autism in children. This was once a seriously considered hypothesis, worthy of careful study. But years of studies have repeatedly found no link; moreover, the researcher whose paper kicked off the whole controversy was found to have committed fraud in his research. The weight of the scientific evidence is heavily on the side of there being no link, and yet the debate nonetheless drags on, because certain vaccine denialists cherry-pick which evidence to believe.
Moreover, one can create false confidence in cherry-picked sources simply by how they're used and presented. The Cracked article includes multiple links to purported sources, creating the illusion that the article is well-sourced. But here, the sources don't always back up the article's claims. For instance, the article suggests that several infectious diseases have nearly disappeared in North America because of credit card spending, and it includes a link to a paper in a science journal that documented currency-borne diseases. But the research concerned paper money in Nepal, and reported only the existence of the pathogens, not any reduction in disease. The reader's failure to scrutinize the source material allows the reader to be deceived.
Of course, we're bombarded with so much new information every day that it's impractical to check every source. Mental shortcuts exist because of the need to process large amounts of information in short amounts of time. Falling victim to misinformation is thus probably inevitable, but there are still two basic tools to fight back.
The first is to simply familiarize oneself with those mental shortcuts, and to be aware of the logical fallacies and biases that our minds are subject to. By knowing where your mental weaknesses are, you can better identify what information might be the result of others' biases, and thus avoid falling victim to suspect information yourself.
But no system is perfect, and even the most skeptical person will fall victim to bad information sometimes. What good critical thinking demands of you at that point is to keep your mind open to new, and potentially contrary, evidence; to weigh the quality of that evidence against what you already believe to be true; and to recognize that your existing beliefs may not stand on as firm a foundation as you once thought.
It can be a very humbling attitude to take. But in the long run, the best way to ensure that what you believe is right, is to always be open to the possibility that what you believe is wrong.
Collins is author of 'Bullspotting: Finding Facts in the Age of Misinformation.' Email: email@example.com.
THE FREE LANCE-STAR (FREDERICKSBURG, VA.)