Carl Sagan’s “Baloney Detection Kit”: A Toolkit That Can Help You Scientifically Separate Sense from Nonsense

It’s probably no stretch to say that mass disinformation campaigns and rampant anti-intellectualism will constitute an increasing share of our political reality for the foreseeable future. As Hannah Arendt wrote, the political lie has always been with us. But its global reach, particular vehemence, and blatant contempt for verifiable reality seem like innovations of the present.

Given our embarrassing wealth of access to information and educational tools, maybe it’s fair to say that the first and last line of defense should be our own critical reasoning. When we fail to verify news using resources we all have in hand (I assume, since you’re reading this), the fault for believing bad information may lie with us.

But we so often don’t know what it is that we don’t know. Individuals can’t be blamed for an inadequate educational system, and one should not underestimate the near-impossibility of conducting time-consuming inquiries into the truth of every single claim that comes our way. It can feel like trying to identify individual droplets while getting hit in the face with a pressurized blast of targeted, contradictory info, sometimes coming from shadowy, unreliable sources.

Carl Sagan understood the difficulty, and he also understood that a lack of critical thinking did not make people totally irrational and deserving of contempt. “It’s not hard to understand,” for example, why people would think their relatives are still alive in some other form after death. As he writes of this common phenomenon in “The Fine Art of Baloney Detection,” most supernatural beliefs are just “humans being human.”

In the essay, a chapter from his 1995 book The Demon-Haunted World, Sagan proposes a rigorous but comprehensible “baloney detection kit” to separate sense from nonsense.

  • Wherever possible there must be independent confirmation of the “facts.”
  • Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
  • Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
  • Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives.
  • Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
  • If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations.
  • If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.
  • Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler. Always ask whether the hypothesis can be, at least in principle, falsified…. You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.
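A couple of these tools, spinning more than one hypothesis and applying Occam’s razor, lend themselves to a toy sketch in code. Everything below (the measurements, the two candidate models, the 0.5 threshold) is invented for illustration and comes from this post, not from Sagan’s essay:

```python
# Toy illustration of two tools from the kit: "spin more than one
# hypothesis" and Occam's razor. The data, models, and threshold are
# all made up for the example.

def sse(data, predict):
    """Sum of squared errors between observations and a model's predictions."""
    return sum((y - predict(x)) ** 2 for x, y in data)

# Hypothetical measurements: (x, y) pairs that look roughly constant.
data = [(0, 5.1), (1, 4.9), (2, 5.0), (3, 5.2), (4, 4.8)]
n = len(data)

# Hypothesis A (1 parameter): y is a constant, the mean of the observations.
mean_y = sum(y for _, y in data) / n

def h_constant(x):
    return mean_y

# Hypothesis B (2 parameters): y is a straight line, fit by least squares.
mean_x = sum(x for x, _ in data) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in data)
         / sum((x - mean_x) ** 2 for x, _ in data))

def h_linear(x):
    return mean_y + slope * (x - mean_x)

# Occam's razor as a rule of thumb: keep the extra parameter only if it
# buys a substantially better fit (the 0.5 threshold is arbitrary).
improvement = sse(data, h_constant) - sse(data, h_linear)
chosen = "linear" if improvement > 0.5 else "constant"
print(chosen)  # the constant model wins: the line barely improves the fit
```

The arbitrary threshold stands in for the formal model-selection criteria (AIC, BIC, likelihood-ratio tests) a working scientist would use, but the shape of the reasoning is the same: the more complicated hypothesis has to earn its extra moving parts.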

Calling his recommendations “tools for skeptical thinking,” he lays out a means of compensating for the strong emotional pulls that “promise something like old-time religion” and recognizing “a fallacious or fraudulent argument.” At the top of the post, in a video produced by Big Think, you can hear science writer and educator Michael Shermer explain the “baloney detection kit” that he himself adapted from Sagan, and just above, read Sagan’s own version, abridged into a short list (read it in full at Brain Pickings).

Like many a science communicator after him, Sagan was very much concerned with the influence of superstitious religious beliefs. He also foresaw a time in the near future much like our own. Elsewhere in The Demon-Haunted World, Sagan writes of “America in my children’s or grandchildren’s time…. when awesome technological powers are in the hands of a very few.” The loss of control over media and education renders people “unable to distinguish between what feels good and what’s true.”

This state involves, he says, a “slide… back into superstition” of the religious variety and also a general “celebration of ignorance,” such that well-supported scientific theories carry the same weight or less than explanations made up on the spot by authorities whom people have lost the ability to “knowledgeably question.” It’s a scary scenario that may not have completely come to pass… just yet, but Sagan knew as well as or better than anyone of his time how to address such a potential social epidemic.

Related Content:

Carl Sagan Predicts the Decline of America: Unable to Know “What’s True,” We Will Slide, “Without Noticing, Back into Superstition & Darkness” (1995)

Carl Sagan’s Syllabus & Final Exam for His Course on Critical Thinking (Cornell, 1986)

Carl Sagan’s Last Interview

Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness





Comments (6)
  • Paul Tatara says:

    If you can’t trust authorities, how do you get independent confirmation of the facts?? I’m not being a wise-guy. I honestly don’t know how to work that out!

  • Nigel J Watson says:

    I really enjoyed his shows, esp. Cosmos. I still have his epitaphial goodbye as he was making his exit via cancer.

    Too bad, then, that he failed to bring his ‘detector’ (I constantly employ many of the tools on his list) to bear in his own realm of expertise.

  • Josh Jones says:

    Definitely a valid concern, Paul. I think there are questions we can ask about supposed authorities that help us determine whether or not they are reliable sources of information. What are their educational backgrounds and areas of expertise? Do they cite other experts or seem to just refer to themselves as sole authorities? Do they have a good reputation in a particular field? Do they have obvious biases or conflicts of interest that might cause them to misrepresent facts? I don’t think it’s true that we can’t trust all authorities, only that we can’t trust appeals to authority as guarantees of accuracy.

  • craig moreau says:

    Hubris, bias, greed and lust tend to convolute the truth, and the truth becomes subjective and even a paper dragon. With this in mind, tread carefully when authorities and experts come together or oppose each other. Sagan points out that experts and authorities will take their baggage and fail to provide their version of the truth with enough truth to sustain it as history and science go on. So we are left with shifting ideas that may survive the unrelenting bias of the deplorable. God save the experts… oh, that’s not right.

  • JS says:

    You must first admit that if you cannot discern what is probably trustworthy information, that’s a problem all by itself that you need to solve.

    A super-high-IQ, highly educated, unbiased, experienced person has the skills to discern trustworthiness in their areas of understanding. Are you that? If not, you are something somewhat less, like most of us, and that’s what you have with which to determine “trustworthiness.” Conversely, an illiterate, uneducated, low-IQ person who has never been outside a box is a poor judge of the trustworthiness of data.

    Socrates said that the basis of wisdom is having a decent idea of what things you don’t know. You don’t know how to confirm data and you admit it; that is a good first step.

  • Tim Jones says:

    The currency of Reason is in two forms: Objective Philosophical Argumentation (or “O.P.A.” for short) and Hard Scientific Evidence (or “H.S.E.” for short). A theory must by those merits establish a prima facie case for itself. It is then rigorously tested in order to merit widespread acceptance. And even then, it is always vulnerable to being refuted by either countervailing data or the rise of a better theory. An “expert” is just someone who’s done this a lot, in their area of expertise.
