A vague sense of disquiet settled over Europe in the period between World War I and World War II. As the slow burn of militant ultranationalism mingled with jingoist populism, authoritarian leaders and fascist factions found mounting support among a citizenry hungry for certainty. Europe’s growing trepidation fostered some of the 20th century’s most striking painterly, literary, and cinematic depictions of the totalitarianism that would soon follow. It was almost inevitable that this period would see the birth of the first deeply philosophical animated film, known as The Idea.
The Idea first emerged as a wordless novel in 1920, drawn by Frans Masereel. Masereel, a close friend of Dadaist and New Objectivist artist George Grosz, had created a stark, black-and-white story about the indomitable nature of ideas. Employing thick, aggressive lines obtained through woodcut printing, Masereel depicted a conservative political order’s fight against the birth of a new idea, which eventually flourished in spite of the establishment’s relentless attempts to suppress it.
Setting to work in 1930, a Czech filmmaker named Berthold Bartosch spent two years animating The Idea. Bartosch’s visual style remained true to Masereel’s harsh, vivid lines. His version of the story, however, took a decidedly bleaker turn—one that was more reminiscent of the writings of his compatriot, Franz Kafka. Whereas Masereel believed that the purity of good ideas would overwhelm their opposition, Bartosch, working a decade closer to the Nazis’ ascendancy, was wary of such idealism.
Above, you can watch what film historian William Moritz has called “the first animated film created as an artwork with serious, even tragic, social and philosophical themes.” Paired with a haunting score composed by Arthur Honegger, the 25-minute animation is a powerfully moving meditation on art, struggle, purity of thought, and populist savagery that remains untarnished after eight decades.
How can we know whether a claim someone makes is scientific or not? The question is of the utmost consequence, as we are surrounded on all sides by claims that sound credible, that use the language of science—and often do so in attempts to refute scientific consensus. As we’ve seen in the case of the anti-vaccine crusade, falling victim to pseudoscientific arguments can have dire effects. So how can ordinary people, ordinary parents, and ordinary citizens evaluate such arguments?
The problem of demarcation, or what is and what is not science, has occupied philosophers for some time, and the most famous answer comes from philosopher of science Karl Popper, who proposed his theory of “falsifiability” in 1963. According to Popper, an idea is scientific if it can conceivably be proven wrong. Although Popper’s strict definition of science has had its uses over the years, it has also come in for its share of criticism, since so much accepted science was falsified in its day (Newton’s gravitational theory, Bohr’s theory of the atom), and so much current theoretical science cannot be falsified (string theory, for example). Whatever the case, the problem for lay people remains. If a scientific theory is beyond our comprehension, it’s unlikely we’ll be able to see how it might be disproven.
Physicist and science communicator Richard Feynman came up with another criterion, one that applies directly to the non-scientist likely to be bamboozled by fancy terminology that sounds scientific. Simon Oxenham at Big Think points to the example of Deepak Chopra, who is “infamous for making profound sounding yet entirely meaningless statements by abusing scientific language.” (What Daniel Dennett called “deepities.”) As a balm against such statements, Oxenham refers us to a speech Feynman gave in 1966 to a meeting of the National Science Teachers Association. Rather than asking lay people to confront scientific-sounding claims on their own terms, Feynman would have us translate them into ordinary language, thereby assuring that what the claim asserts is a logical concept, rather than just a collection of jargon.
The example Feynman gives comes from the most rudimentary source, a “first grade science textbook” which “begins in an unfortunate manner to teach science”: it shows its student a picture of a “windable toy dog,” then a picture of a real dog, then a motorbike. In each case the student is asked “What makes it move?” The answer, Feynman tells us, “was in the teacher’s edition of the book… ‘energy makes it move.’” Few students would have intuited such an abstract concept, unless they had previously learned the word, which is all the lesson teaches them. The answer, Feynman points out, might as well have been “‘God makes it move,’ or ‘Spirit makes it move,’ or ‘Movability makes it move.’”
Instead, a good science lesson “should think about what an ordinary human being would answer.” Engaging with the concept of energy in ordinary language enables the student to explain it, and this, Feynman says, constitutes a test for “whether you have taught an idea or you have only taught a definition. Test it this way”:
Without using the new word which you have just learned, try to rephrase what you have just learned in your own language. Without using the word “energy,” tell me what you know now about the dog’s motion.
Feynman’s insistence on ordinary language recalls the statement attributed to Einstein about not really understanding something unless you can explain it to your grandmother. The method, Feynman says, guards against learning “a mystic formula for answering questions,” and Oxenham describes it as “a valuable way of testing ourselves on whether we have really learned something, or whether we just think we have learned something.”
It is equally useful for testing the claims of others. If someone cannot explain something in plain English, then we should question whether they themselves really understand what they profess…. In the words of Feynman, “It is possible to follow form and call it science, but that is pseudoscience.”
Does Feynman’s ordinary language test solve the demarcation problem? No, but if we use it as a guide when confronted with plausible-sounding claims couched in scientific-sounding verbiage, it can help us either get clarity or suss out total nonsense. And if anyone would know how scientists can explain complicated ideas in plainly accessible ways, Feynman would.
Note: An earlier version of this post appeared on our site in 2016.
In a 2013 blog post, the great Ursula K. Le Guin quotes a London Times Literary Supplement column by a “J.C.,” who satirically proposes the “Jean-Paul Sartre Prize for Prize Refusal.” “Writers all over Europe and America are turning down awards in the hope of being nominated for a Sartre,” writes J.C. “The Sartre Prize itself has never been refused.” Sartre earned the honor of his own prize for prize refusal by turning down the Nobel Prize in Literature in 1964, an act Le Guin calls “characteristic of the gnarly and counter-suggestible Existentialist.” As you can see in the short clip above, Sartre fully believed the committee used the award to whitewash his Communist political views and activism.
But the refusal was not a theatrical or “impulsive gesture,” Sartre wrote in a statement to the Swedish press, which was later published in Le Monde. It was consistent with his longstanding principles. “I have always declined official honors,” he said, and referred to his rejection of the Legion of Honor in 1945 for similar reasons. Elaborating, he cited first the “personal” reason for his refusal:
This attitude is based on my conception of the writer’s enterprise. A writer who adopts political, social, or literary positions must act only with the means that are his own—that is, the written word. All the honors he may receive expose his readers to a pressure I do not consider desirable. If I sign myself Jean-Paul Sartre it is not the same thing as if I sign myself Jean-Paul Sartre, Nobel Prize winner.
The writer must therefore refuse to let himself be transformed into an institution, even if this occurs under the most honorable circumstances, as in the present case.
There was another reason as well, an “objective” one, Sartre wrote. In serving the cause of socialism, he hoped to bring about “the peaceful coexistence of the two cultures, that of the East and the West.” (He refers not only to Asia as “the East,” but also to “the Eastern bloc.”)
Therefore, he felt he must remain independent of institutions on either side: “I should thus be quite as unable to accept, for example, the Lenin Prize, if someone wanted to give it to me.”
As a flattering New York Times article noted at the time, this was not the first time a writer had refused the Nobel. In 1926, George Bernard Shaw turned down the prize money, offended by the extravagant cash award, which he felt was unnecessary since he already had “sufficient money for my needs.” Shaw later relented, donating the money for English translations of Swedish literature. Boris Pasternak also refused the award, in 1958, but this was under extreme duress. “If he’d tried to go accept it,” Le Guin writes, “the Soviet Government would have promptly, enthusiastically arrested him and sent him to eternal silence in a gulag in Siberia.”
These qualifications make Sartre the only author ever to have voluntarily and outright rejected both the Nobel Prize in Literature and its sizable cash award. While his statement to the Swedish press is filled with polite explanations and gracious demurrals, his filmed statement above, excerpted from the 1976 documentary Sartre by Himself, minces no words.
Because I was politically involved the bourgeois establishment wanted to cover up my “past errors.” Now there’s an admission! And so they gave me the Nobel Prize. They “pardoned” me and said I deserved it. It was monstrous!
Four years after his Nobel rejection, Sartre was in fact pardoned by De Gaulle for his participation in the 1968 uprisings. “You don’t arrest Voltaire,” the French President supposedly said. The writer and philosopher, Le Guin points out, “was, of course, already an ‘institution’” at the time of the Nobel award. Nonetheless, she says, the gesture had real meaning. Literary awards, writes Le Guin—who herself refused a Nebula Award in 1976 (she’s won several more since)—can “honor a writer,” in which case they have “genuine value.” Yet prizes are also awarded “as a marketing ploy by corporate capitalism, and sometimes as a political gimmick by the awarders [….] And the more prestigious and valued the prize the more compromised it is.” Sartre, of course, felt the same—the greater the honor, the more likely his work would be coopted and sanitized.
Perhaps proving his point, a short, nasty 1965 Harvard Crimson letter had far less flattering things than Le Guin to say about Sartre’s motivations, calling him “an ugly toad” and a “poor loser” envious of his former friend Camus, who won in 1957. The letter writer calls Sartre’s rejection of the prize “an act of pretension” and a “rather ineffectual and stupid gesture.” And yet it did have an effect. It seems clear at least to me that the Harvard Crimson writer could not stand the fact that, offered the “most coveted award” the West can bestow, and a heaping sum of money besides, “Sartre’s big line was, ‘Je refuse.’”
Karl Marx was a German philosopher-historian (with a few other pursuits besides) who sought to understand industrial society as he knew it in the nineteenth century, and to anticipate what its future evolution held in store. There are good reasons to read his work still today, especially if you have an interest in the history of economic and sociological theory, or in the time and places he lived. But in the almost century-and-a-half since his death — and more so during the twentieth century, during which the ostensibly Marxist project of the Soviet Union rose and fell — he’s turned from a historical figure into an iconic specter, representing either penetrating insight into or catastrophic delusion about the organization of human society.
It was surely Marx’s tendency to inflame strong opinions that got him placed at the center of a debate between the psychologist/cultural commentator Jordan Peterson and the philosopher/cultural theorist Slavoj Žižek. The event took place in 2019, at Toronto’s Sony Center, billed as a clash of the titans on the subject of “Happiness: Capitalism vs. Marxism.”
In fact, it ended up covering a wide range of twenty-first-century issues, with each of the two unorthodox, highly recognizable public intellectuals giving characteristic performances on the economic and political ideologies of the day. Yet they aren’t as opposed as one might have imagined: “I cannot but notice the irony of how Peterson and I, the participants in this duel of the century, are both marginalized by the official academic community,” Žižek remarks early on.
Indeed, writes the Guardian’s Stephen Marche, “the great surprise of this debate turned out to be how much in common the old-school Marxist and the Canadian identity politics refusenik had. One hated communism. The other hated communism but thought that capitalism possessed inherent contradictions. The first one agreed that capitalism possessed inherent contradictions.” Nevertheless, as in many a debate, the surprising common ground is more interesting than the predictable points of conflict, especially on themes broader than any set of ‑isms. “My basic dogma is, happiness should be treated as a necessary by-product,” says Žižek. “If you focus on it, you are lost.” To this proposition Peterson later gives his hearty assent. As for what, exactly, to focus on instead of happiness… well, that’s a matter of debate.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on Twitter at @colinmarshall or on Facebook.
In Simone de Beauvoir’s 1945 novel The Blood of Others, the narrator, Jean Blomart, reports on his childhood friend Marcel’s reaction to the word “revolution”:
It was senseless to try to change anything in the world or in life; things were bad enough even if one did not meddle with them. Everything that her heart and her mind condemned she rabidly defended—my father, marriage, capitalism. Because the wrong lay not in the institutions, but in the depths of our being. We must huddle in a corner and make ourselves as small as possible. Better to accept everything than to make an abortive effort, doomed in advance to failure.
Marcel’s fearful fatalism represents everything De Beauvoir condemned in her writing, most notably her groundbreaking 1949 study, The Second Sex, often credited as the foundational text of second-wave feminism. De Beauvoir rejected the idea that women’s historical subjection was in any way natural—“in the depths of our being.” Instead, her analysis faulted the very institutions Marcel defends: patriarchy, marriage, capitalist exploitation.
In the 1975 interview above with French journalist Jean-Louis Servan-Schreiber—“Why I’m a Feminist”—De Beauvoir picks up the ideas of The Second Sex, which Servan-Schreiber calls as important an “ideological reference” for feminists as Marx’s Capital is for communists. He asks De Beauvoir about one of her most quoted lines: “One is not born a woman, one becomes one.” Her reply shows how far ahead she was of post-modern anti-essentialism, and how much of a debt later feminist thinkers owe to her ideas:
Yes, that formula is the basis of all my theories…. Its meaning is very simple, that being a woman is not a natural fact. It’s the result of a certain history. There is no biological or psychological destiny that defines a woman as such…. Baby girls are manufactured to become women.
Without denying the fact of biological difference, De Beauvoir debunks the notion that sex differences are sufficient to justify gender-based hierarchies of status and social power. Women’s second-class status, she argues, results from a long historical process; even if institutions no longer intentionally deprive women of power, they still work to preserve the power men have historically accrued.
Almost 50 years after this interview—and 75 years since The Second Sex—the debates De Beauvoir helped initiate rage on, with no sign of abating anytime soon. Although Servan-Schreiber calls feminism a “rising force” that promises “profound changes,” one wonders whether De Beauvoir, who died in 1986, would be dismayed by the plight of women in much of the world today. But then again, unlike her character Marcel, De Beauvoir was a fighter, not likely to “huddle in a corner” and give in. Servan-Schreiber states above that De Beauvoir “has always refused, until this year, to appear on TV,” but he is mistaken. In 1967, she appeared with her partner Jean-Paul Sartre on a French-Canadian program called Dossiers.
If you would like to sign up for Open Culture’s free email newsletter, please find it here. It’s a great way to see our new posts, all bundled in one email, each day.
If you would like to support the mission of Open Culture, consider making a donation to our site. It’s hard to rely 100% on ads, and your contributions will help us continue providing the best free cultural and educational materials to learners everywhere. You can contribute through PayPal, Patreon, and Venmo (@openculture). Thanks!
Back in 2016, New York City staged a month-long festival celebrating Albert Camus’ historic visit to NYC in 1946. One event in the festival featured actor Viggo Mortensen giving a reading of Camus’ lecture, “La Crise de l’homme” (“The Human Crisis”), at Columbia University–the very same place where Camus delivered the lecture 70 years earlier–down to the very day (March 28, 1946). The reading was initially captured on a cell phone and broadcast using Facebook Live. But then came a more polished recording, courtesy of Columbia’s Maison Française. Note that Mortensen takes the stage around the 11:45 mark. You can read a transcript of “The Human Crisis” here.
Note: An earlier version of this post appeared on our site in April, 2016.
If we ask which philosophy professor has made the greatest impact in this decade, there’s a solid case to be made for the late Michael Sugrue. Yet in the nearly four-decade-long career that followed his studies at the University of Chicago under Allan Bloom (author of The Closing of the American Mind, later immortalized in Saul Bellow’s Ravelstein), he never published a book, nor took a tenured position. His last place of employment as a lecturer was Ave Maria University, a small Catholic institution founded by the man behind Domino’s Pizza. After his death earlier this year, his work might have lived on only in the memories of the students with whom he shared classrooms.
That would have been the case, at least, if Sugrue’s daughter hadn’t uploaded his lectures to YouTube during the COVID pandemic, when viewers the world over were more than ready for a dose of philosophical wisdom. “The lectures were recorded as part of the Great Minds of the Western Intellectual Tradition series,” writes John Hirschauer in a 2021 American Conservative profile, “a collection of talks on the West’s greatest authors and thinkers” published by The Teaching Company in 1992. “Sugrue’s first lecture in the series is on Plato, the last on critical theory. His remarkable oratory skill is on display throughout.” What’s more, “he does not carry a note card or read from a prompter. There is hardly a stutter in 37 hours of footage.”
Sugrue was diagnosed with cancer in the early twenty-tens, and “doctors at the time gave him five years to live. He said the thought of Marcus Aurelius had taken on new meaning since his diagnosis.” Indeed, Sugrue’s lecture on the Roman emperor and Stoic icon is the most popular of his videos, with over one and a half million views at the time of this writing. Over the years, we’ve featured different introductions to Stoicism here on Open Culture, as well as the work of other Stoics like the statesman-dramatist Seneca the Younger. But Sugrue’s 42-minute exegesis on Marcus Aurelius — not just “the most interesting of the Stoics,” but also “the one example of an absolute ruler who behaves himself in such a way as not to disgrace himself” — has resonated unusually far and wide.
Then, as now, Marcus Aurelius serves as “a standing reproach to our self-indulgence, a standing reproach to the idea that we are unable to deal with the circumstances of human life.” He fully internalized the central Stoic insight that there are “only two kinds of things: there are the things you can control and the things you can’t.” Everything falls into the latter group except “your intentions, your behavior, your actions.” And indeed, just as Sugrue kept looking to the example of Marcus Aurelius — returning to his text Meditations as recently as a webinar he gave two years ago — students of philosophy yet unborn will no doubt find their way to the philosophical guidance that he himself has left behind.
“Adolf Eichmann went to the gallows with great dignity,” wrote the political philosopher Hannah Arendt, describing the scene leading up to the prominent Holocaust organizer’s execution. After drinking half a bottle of wine, turning down the offer of religious assistance, and even refusing the black hood offered him at the gallows, he gave a brief, strangely high-spirited speech before the hanging. “It was as though in those last minutes he was summing up the lesson that this long course in human wickedness had taught us — the lesson of the fearsome word-and-thought-defying banality of evil.”
These lines come from Eichmann in Jerusalem: A Report on the Banality of Evil, originally published in 1963 as a five-part series in the New Yorker. Eichmann “was popularly described as an evil mastermind who orchestrated atrocities from a cushy German office, and many were eager to see the so-called ‘desk murderer’ tried for his crimes,” explains the narrator of the animated TED-Ed lesson above, written by University College Dublin political theory professor Joseph Lacey. “But the squeamish man who took the stand seemed more like a dull bureaucrat than a sadistic killer,” and this “disparity between Eichmann’s nature and his actions” inspired Arendt’s famous summation.
A German Jew who fled her homeland in 1933, as Hitler rose to power, Arendt “dedicated herself to understanding how the Nazi regime came to power.” Against the common notion that “the Third Reich was a historical oddity, a perfect storm of uniquely evil leaders, supported by German citizens, looking for revenge after their defeat in World War I,” she argued that “the true conditions behind this unprecedented rise of totalitarianism weren’t specific to Germany.” Rather, in modernity, “individuals mainly appear in the social world to produce and consume goods and services,” which fosters ideologies “in which individuals were seen only for their economic value, rather than their moral and political capacities.”
In such isolating conditions, she thought, “participating in the regime becomes the only way to recover a sense of identity and community.” While condemning Eichmann’s “monstrous actions,” Arendt saw no evidence that Eichmann himself was uniquely evil. She saw him as “a distinctly ordinary man who considered obedience the highest form of civic duty — and for Arendt, it was exactly this ordinariness that was most terrifying.” According to her theory, there was nothing particularly German about all of this: any sufficiently modernized culture could produce an Eichmann, a citizen who defines himself by participation in his society regardless of that society’s larger aims. This led her to the conclusion that “thinking is our greatest weapon against the threats of modernity,” some of which have become only more threatening over the past six decades.