Science denialism may be a deeply entrenched and enormously damaging political phenomenon. But it is rarely practiced consistently, or we would see many more people abandon medical science, air travel, computer technology, and so on. Most of us tacitly agree that we know certain truths about the world: gravitational force, navigational technology, the germ theory of disease, for example. How do we acquire such knowledge, and how do we use the same method to test and evaluate the many new claims we’re bombarded with daily?
The problem, many professional skeptics would say, is that we’re largely unaware of the epistemic criteria behind our thinking. We believe some ideas and doubt others for a host of reasons, many of which have nothing to do with the standards of reason and evidence that scientists strive toward. Many even have the humility to admit that skeptics themselves can be as prone to irrationality and cognitive bias as anyone else.
Carl Sagan had a good deal of patience with unreason, at least in his writing and television work, which exhibit so much rhetorical brilliance and depth of feeling that he might have been a poet in another life. His style and personality made him a very effective science communicator. But what he called his “Baloney Detection Kit,” a set of “tools for skeptical thinking,” is not unique to him: Sagan’s principles agree with those of every proponent of logic and the scientific method. You can read a few of his prescriptions below and find the full list here.
Wherever possible there must be independent confirmation of the “facts.”
Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives.
Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
Another skeptic, Michael Shermer, founder and editor of Skeptic magazine, frames his epistemology in sympathetic neuroscientific terms. We’re all prone to “believing weird things,” as he puts it in his book Why People Believe Weird Things and in the short video above, where he introduces, following Sagan, his own “Baloney Detection Kit.” The human brain, he explains, evolved to see patterns everywhere as a matter of survival. All of our brains do it, and we all get a lot of false positives.
Many of those false positives become widespread cultural beliefs. Shermer himself has been accused of insensitive cultural bias (evident in the beginning of his video), intellectual arrogance, and worse. But he admits up front that scientific thinking should transcend individual personalities, including his own. “You shouldn’t believe anybody based on authority or whatever position they might have,” he says. “You should check it out yourself.”
Some of the ways to do so when we encounter new ideas involve asking “How reliable is the source of the claim?” and “Have the claims been verified by somebody else?” Returning to Sagan’s work, Shermer offers an example of contrasting scientific and pseudoscientific approaches: the SETI (Search for Extraterrestrial Intelligence) Institute and UFO believers. The latter, he says, uncritically seek out confirmation of their beliefs, whereas the scientists at SETI rigorously try to disprove hypotheses in order to rule out false claims.
Yet many people (and not all of them in good faith) think they’re using science when they aren’t. Another popular science communicator, physicist Richard Feynman, recommended a method for testing whether we really understand a concept or are just repeating something that sounds smart but makes no logical sense, what he called “a mystic formula for answering questions.” Can a concept be explained in plain English, without technical jargon? Can we ask questions about it and make direct observations that confirm or disconfirm its claims?
Feynman was especially sensitive to what he called “intellectual tyranny in the name of science.” And he recognized that turning forms of knowing into empty rituals resulted in pseudoscientific thinking. In a wonderfully rambling, informal, and autobiographical speech he gave in 1966 to a meeting of the National Science Teachers Association, Feynman concluded that thinking scientifically as a practice requires skepticism of science as an institution.
“Science is the belief in the ignorance of experts,” says Feynman. “If they say to you, ‘Science has shown such and such,’ you might ask, ‘How does science show it? How did the scientists find out? How? What? Where?’” Asking such questions does not mean we should reject scientific conclusions because they conflict with cherished beliefs, but rather that we shouldn’t take even scientific claims on faith.
For more on Shermer’s, Sagan’s, and Feynman’s approaches to telling good scientific thinking from bad, read these articles in our archive:
Carl Sagan Presents His “Baloney Detection Kit”: 8 Tools for Skeptical Thinking
Richard Feynman Creates a Simple Method for Telling Science From Pseudoscience (1966)
Richard Feynman’s “Notebook Technique” Will Help You Learn Any Subject–at School, at Work, or in Life
Michael Shermer’s Baloney Detection Kit: What to Ask Before Believing
Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness