“It’s been six months since agents from Saudi Arabia killed the Washington Post columnist. What has been done in the aftermath?” In this documentary, The Assassination of Jamal Khashoggi, The Washington Post examines Khashoggi’s writings, his killing inside the Saudi Consulate in Istanbul and the Trump administration’s response.
If you would like to support the mission of Open Culture, consider making a donation to our site. It’s hard to rely 100% on ads, and your contributions will help us continue providing the best free cultural and educational materials to learners everywhere. You can contribute through PayPal, Patreon, and Venmo (@openculture). Thanks!
We’ve all heard the old saying that a lawyer who represents himself has a fool for a client. The proverb does not mean that lawyers are especially scrupulous, only that the intricacies of the law are best left to professionals, and that a personal interest in a case muddies the waters. That may go double or triple for doctoring, though doctors don’t have to bear the lawyer’s social stigma.
But can we reasonably expect doctors to live healthier lives than the general population? What about other professions that seem to entail a rigorous code of conduct? Many people have lately been disabused of the idea that clergy or police have any special claim to moral upstandingness (on the contrary)….
What about ethicists? Should we have high expectations of scholars in this subset of philosophy? There are no clever sayings, no genre of jokes at their expense, but there are a few academic studies asking some version of the question: does studying ethics make a person more ethical?
You might suspect that it does not, if you’re a cynic—or the answer might surprise you. Put more precisely, in a recent study—“The Moral Behavior of Ethics Professors,” published in Philosophical Psychology this year—the “open but highly relevant question” under consideration is “the relation between ethical reflection and moral action.”
The paper’s authors, professor Johannes Wagner of Austria’s University of Graz and graduate student Philipp Schönegger of the University of St Andrews in Scotland, surveyed 417 professors in three categories, reports Olivia Goldhill at Quartz: “ethicists (philosophers focused on ethics), philosophers focused on non-ethical subjects, and other professors.” The paper surveyed only German-speaking scholars, replicating the methods of a 2013 study focused on English-speaking professors.
The questions asked touched on “a range of moral topics, including organ donation, charitable giving, and even how often they called their mother.” After assessing general views on the subjects, the authors “then asked the professors about their own behavior in each category.” We must assume a base level of honesty among the respondents in their self-reported answers.
The results: “the researchers found no significant difference in moral behavior” between those who make it their business to study ethics and those who study other things. For example, the majority of the academics surveyed agreed that you should call your mother: 75% of non-philosophers, 70% of non-ethicist philosophers, and 65% of ethicists (whose numbers might be lower here because other issues could seem weightier to them by comparison).
When it comes to actually picking up the phone to call mom at least twice a month, the numbers were consistently high, but ethicists did not stand out: 87%, versus 81% of non-ethicist philosophers and 89% of other professors. The subject of charitable giving may warrant more scrutiny. Ethicists recommended donating an average of 6.9% of one’s annual salary, where non-ethicist philosophers said 4.6% was enough and others said 5.1%. The amounts all three groups reported actually giving, however, hover around four and a half percent.
One notable exception to this trend is vegetarianism: “Ethicists were both more likely to say that it was immoral to eat meat, and more likely to be vegetarians themselves.” But on average, scholars of ethical behavior do not seem to behave better than their peers. Should we be surprised at this? Eric Schwitzgebel, a philosophy professor at the University of California, Riverside, and one of the authors of the original 2013 study, finds the results upsetting.
Using the example of a hypothetical professor who makes the case for vegetarianism, then heads to the cafeteria for a burger, Schwitzgebel refers to modern-day philosophical ethics as “cheeseburger ethics.” Of his work on the behavior of ethicists with Stetson University’s Joshua Rust, he writes, “never once have we found ethicists as a whole behaving better than our comparison groups of other professors…. Nonetheless, ethicists do embrace more stringent moral norms on some issues.”
Should philosophers who hold such views aspire to be better? Can they be? Schönegger and Wagner frame the issue upfront in their recent version of the study (which you can read in full here), with a quote from the German philosopher Max Scheler: “signposts do not walk in the direction they point to.” Ethicists draw conclusions about ideals of human behavior using the tools of philosophy. They show the way but should not personally set themselves up as exemplars or role models. As one high-profile case of a very badly behaved ethicist suggests, this might not do the profession any favors.
Schwitzgebel is not content with this answer. The problem, he writes at Aeon, may be professionalization itself, imposing an unnatural distance between word and deed. “I’d be suspicious of any 21st-century philosopher who offered up her- or himself as a model of wise living,” he writes. “This is no longer what it is to be a philosopher—and those who regard themselves as wise are in any case almost always mistaken. Still, I think, the ancient philosophers got something right that the cheeseburger ethicist gets wrong.”
The “something wrong” is a laissez-faire comfort with things as they are. Leaving ethics in the realm of theory takes away any sense of moral urgency. “A full-bodied understanding of ethics requires some living,” Schwitzgebel writes. It is easier for philosophers to avoid aiming for better behavior, he implies, when they are required, and professionally rewarded, only to think about it.
American children, a study found a few years ago, recognize over 1,000 corporate logos but almost no plants. To some it was a damning indictment of the modern world; to others it was nothing more than a description of the modern world (in the 21st century, after all, which skill is more help in finding food?); and to a few it was an opportunity to proclaim that, for the sake of the children, the modern world could use some better corporate logos.
Image by dellfi
The artists, architects, and designers of the Bauhaus, the modernist art-school-turned-movement with its origins in Weimar Germany, might well have agreed. Right from the Bauhaus’ foundation in 1919, its members worked on shaping the aesthetics of the future.
Now, for the school’s 100th anniversary (today!), 99designs has commissioned revisions of current corporate logos in the Bauhaus style. “It outlasted a century’s worth of competing styles,” writes 99designs’ Matt Ellis, “survived the initial criticisms from traditionalists, and although the Nazis shut down the institution in 1933, the Bauhaus movement itself lives on to this day.”
Image by ArsDesigns
Ellis goes on to quote the still-inspiring words of Bauhaus founder Walter Gropius: “The artist is a heightened manifestation of the craftsman. Let us form… a new guild of craftsmen without the class divisions that set out to raise an arrogant barrier between craftsmen and artists! Let us together create the new building of the future which will be all in one: architecture and sculpture and painting.” The project identifies five pillars of the Bauhaus style: “form follows function,” “minimalism,” “revolutionary typography,” “passion for geometry,” and “primary colors.”
Image by dnk
The reimagined corporate logos made for the centenary of the Bauhaus stand on all those pillars, turning the emblems of products and services that many of us consume and use every day — or perhaps, as we scroll through Instagram on our iPhones or Android devices at Starbucks in our Adidases, all at the same time — into designs that merge the cutting-edge aesthetics of interwar Europe with those of the thoroughly globalized 2010s.
Image by PonomarevDmitry
Whether a pure Bauhaus revival will result in the actual adoption of logos like these remains to be seen, but in a way, the exercise simply doubles down on an influence that already runs deep. As Artsy’s Kelsey Ables puts it, “It is a testament to the longstanding influence of Bauhausian minimalist ideals that the selected logos were already streamlined to begin with; many of the designers who reimagined ‘Bauhaus style’ logos had to add visual elements. Perhaps Google and its brethren are more Bauhaus than the Bauhaus itself.”
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
They’re the ones who spur us to study hard, so we can make something of ourselves, in order to better our communities.
They name our babies, choose our clothes, decide what we’re hungry for.
They make and break laws, organize protests, fritter away hours on social media, and give us the green light to binge watch a bunch of dumb shows when we could be reading War and Peace.
They also plant the seeds for Fitzcarraldo-like creative endeavors that take over our lives and generate little to no income.
We may describe such endeavors as a labor of love, into which we’ve poured our entire heart and soul, but think for a second.
Who’s really responsible here?
The heart, that muscular fist-sized Valentine, content to just pump-pump-pump its way through life, lub-dub, lub-dub, from cradle to grave?
On a lighter note, it also told her to devote nine months to knitting an anatomically correct replica of the human brain.
(Twelve, if you count three months of research before casting on.)
How did her brain convince her to embark on this madcap assignment?
Easy. It arranged for her to be in the middle of a more prosaic knitting project, then goosed her into noticing how the ruffles of that project resembled the wrinkles of the cerebral cortex.
Coincidence?
Not likely. Especially when one of the cerebral cortex’s most important duties is decision making.
As she explained in an interview with The Telegraph, brain development is not unlike the growth of a knitted piece:
You can see very naturally how the ‘rippling’ effect of the cerebral cortex emerges from properties that probably have to do with nerve cell growth. In the case of knitting, the effect is created by increasing the number of stitches in each row.
Dr. Norberg—who, yes, has on occasion referred to her project as a labor of love—told Scientific American that such a massive crafty undertaking appealed to her sense of humor because “it seemed so ridiculous and would be an enormously complicated, absurdly ambitious thing to do.”
That’s the point at which many people’s brains would give them permission to stop, but Dr. Norberg and her brain persisted, pushing past the hypothetical, creating colorful individual structures that were eventually sewn into two cuddly hemispheres that can be joined with a zipper.
(She also let slip that her brain—by which she means the knitted one, though the observation certainly holds true for the one in her head—is female, due to its robust corpus callosum, the “tough body” whose millions of fibers promote communication and connection.)
In the past few years, whenever far-right nationalists have been banned from social media, violent extremists have faced boycotts, or institutions have refused to give a platform to racists, a faux-outraged moan has gone up: “So much for the tolerant left!” “So much for liberal tolerance!” The complaint became so hackneyed it turned into an already-hackneyed meme. It’s a wonder anyone thinks this line has any rhetorical force. The equation of tolerance with acquiescence, passivity, or a total lack of boundaries is a reductio ad absurdum that denudes the word of meaning. One can only laugh at unserious characterizations that do such violence to reason.
The concept of toleration has a long and complicated history in moral and political philosophy precisely because of the many problems that arise when the word is used without critical context. In some absurd, 21st century usages, tolerance is even conflated with acceptance, approval, and love. But it has historically meant the opposite—noninterference with something one dislikes or despises. Such noninterference must have limits. As Goethe wrote in 1829, “tolerance should be a temporary attitude only; it must lead to recognition. To tolerate means to insult.” Tolerance by nature exists in a state of social tension.
According to virtually every conception of liberal democracy, a free and open society requires tense debate and verbal conflict. Society, the argument goes, is only strengthened by the oft-contentious interplay of differing, even intolerant, points of view. So, when do such views approach the limits of toleration? One of the most well-known paradoxes of tolerance was outlined by Austrian philosopher Karl Popper in his 1945 book The Open Society and Its Enemies.
Popper was a non-religious Jew who witnessed the rise of Nazism in the 1920s in his hometown of Vienna and fled to England, then, in 1937, to Christchurch, New Zealand, where he was appointed lecturer at Canterbury College (now the University of Canterbury). There, he wrote The Open Society, in which the famous passage appears in a footnote:
Unlimited tolerance must lead to the disappearance of tolerance. If we extend unlimited tolerance even to those who are intolerant, if we are not prepared to defend a tolerant society against the onslaught of the intolerant, then the tolerant will be destroyed, and tolerance with them. — In this formulation, I do not imply, for instance, that we should always suppress the utterance of intolerant philosophies; as long as we can counter them by rational argument and keep them in check by public opinion, suppression would certainly be unwise. But we should claim the right to suppress them if necessary even by force; for it may easily turn out that they are not prepared to meet us on the level of rational argument, but begin by denouncing all argument; they may forbid their followers to listen to rational argument, because it is deceptive, and teach them to answer arguments by the use of their fists or pistols. We should therefore claim, in the name of tolerance, the right not to tolerate the intolerant.
This last sentence has “been printed on thousands of bumper stickers and fridge magnets,” writes Will Harvie at Stuff. The quote might become almost as ubiquitous as Voltaire’s line about “defending to the death” the right of free speech (words actually penned by English writer Beatrice Evelyn Hall). Popper saw how fascism cynically exploited liberal toleration to gain a foothold and incite persecution, violent attacks, and eventually genocide. As he writes in his autobiography, he had seen how “competing parties of the Right were outbidding each other in their hostility towards the Jews.”
Popper’s formulation has been used across the political spectrum, and sometimes applied in arguments against civil protections for some religious sects who hold intolerant views—a category that includes practitioners of nearly every major faith. But this is misleading. The line for Popper is not the mere existence of exclusionary or intolerant beliefs or philosophies, however reactionary or contemptible, but the open incitement to persecution and violence against others, which should be treated as criminal, he argued, and suppressed, “if necessary,” he continues in the footnote, “even by force” if public disapproval is not enough.
By this line of reasoning, vigorous resistance to those who call for and enact racial violence and ethnic cleansing is a necessary defense of a tolerant society. Ignoring or allowing such acts to continue in the name of tolerance leads to the nightmare events Popper escaped in Europe, or to the horrific mass killings at two mosques in Christchurch this month that deliberately echoed Nazi atrocities. There are too many such echoes, from mass murders at synagogues to concentration camps for kidnapped children, all surrounded by an echo chamber of wildly unchecked incitement by state and non-state actors alike.
Popper recognized the inevitability and healthy necessity of social conflict, but he also affirmed the values of cooperation and mutual recognition, without which a liberal democracy cannot survive. Since the publication of The Open Society and its Enemies, his paradox of tolerance has weathered decades of criticism and revision. As John Horgan wrote in an introduction to a 1992 interview with the thinker, two years before his death, “an old joke about Popper” retitles the book “The Open Society by One of its Enemies.”
With less than good humor, critics have derided Popper’s liberalism as dogmatic and itself a fascist ideology that inevitably tends to intolerance against minorities. Questions about who gets to decide which views should be suppressed, and how, are not easy to answer. Popper liked to say he welcomed the criticism, but he refused to tolerate views that reject reason, fact, and argument in order to incite and perpetrate violence and persecution. It’s difficult to imagine any democratic society surviving for long if it decides that such intolerance, while objectionable, is tolerable. The question, “these days,” writes Harvie, is “can a tolerant society survive the internet?”
In all the kingdom of nature, does any creature threaten us less than the gentle rabbit? Though the question may sound entirely rhetorical today, our medieval ancestors took it more seriously — especially if they could read illuminated manuscripts, and even more so if they drew in the margins of those manuscripts themselves. “Often, in medieval manuscripts’ marginalia we find odd images with all sorts of monsters, half man-beasts, monkeys, and more,” writes Sexy Codicology’s Marjolein de Vos. “Even in religious books the margins sometimes have drawings that simply are making fun of monks, nuns and bishops.” And then there are the killer bunnies.
Hunting scenes, de Vos adds, also commonly appear in medieval marginalia, and “this usually means that the bunny is the hunted; however, as we discovered, often the illuminators decided to change the roles around.”
Jon Kaneko-James explains further: “The usual imagery of the rabbit in Medieval art is that of purity and helplessness – that’s why some Medieval portrayals of Christ have marginal art portraying a veritable petting zoo of innocent, nonviolent, little white and brown bunnies going about their business in a field.” But the creators of this particular type of humorous marginalia, known as drollery, saw things differently.
“Drolleries sometimes also depicted comedic scenes, like a barber with a wooden leg (which, for reasons that escape me, was the height of medieval comedy) or a man sawing a branch out from under himself,” writes Kaneko-James.
This enjoyment of the “world turned upside down” produced the drollery genre of “the rabbit’s revenge,” one “often used to show the cowardice or stupidity of the person illustrated. We see this in the Middle English nickname Stickhare, a name for cowards” — and in all the drawings of “tough hunters cowering in the face of rabbits with big sticks.”
Then, of course, we have the bunnies making their attacks while mounted on snails, snail combats being “another popular staple of Drolleries, with groups of peasants seen fighting snails with sticks, or saddling them and attempting to ride them.”
Given how often we denizens of the 21st century have trouble getting humor from less than a century ago, it feels satisfying indeed to laugh just as hard at these drolleries as our medieval forebears must have — though many more of us surely get to see them today, circulating on social media far more rapidly than they ever could when confined to the pages of illuminated manuscripts owned only by wealthy individuals and institutions.
Everyone should read the Bible, and—I’d argue—should read it with a sharply critical eye and the guidance of reputable critics and historians, though this may be too much to ask for those steeped in literal belief. Yet fewer and fewer people do read it, including those who profess faith in a sect of Christianity. Even famous atheists like Christopher Hitchens, Richard Dawkins, and Melvyn Bragg have argued for teaching the Bible in schools—not in a faith-based context, obviously, but as an essential historical document, much of whose language, in the King James, at least, has made major contributions to literary culture. (Curiously—or not—atheists and agnostics tend to score far higher than believers on surveys of religious knowledge.)
There is a practical problem of separating teaching from preaching in secular schools, but the fact remains that so-called “biblical illiteracy” is a serious problem educators have sought to remedy for decades. Prominent Shakespeare scholar G.B. Harrison lamented it in the introduction to his 1964 edited edition, The Bible for Students of Literature and Art. “Today most students of literature lack this kind of education,” he wrote, “and have only the haziest knowledge of the book or of its contents, with the result that they inevitably miss much of the meaning and significance of many works of past generations. Similarly, students of art will miss some of the meaning of the pictures and sculptures of the past.”
Though a devout Catholic himself, Harrison did not aim to proselytize but to do right by his students. His edited Bible is an excellent resource, but it’s not the only book of its kind out there. In fact, no less a luminary, and no less a critic of religion, than scientist and sci-fi giant Isaac Asimov published his own guide to the Bible, writing in his introduction:
The most influential, the most published, the most widely read book in the history of the world is the Bible. No other book has been so studied and so analyzed and it is a tribute to the complexity of the Bible and eagerness of its students that after thousands of years of study there are still endless books that can be written about it.
Of those books, the vast majority are devotional or theological in nature. “Most people who read the Bible,” Asimov writes, “do so in order to get the benefit of its ethical and spiritual teachings.” But the ancient collection of texts “has a secular side, too,” he says. It is a “history book,” though not in the sense that we think of the term, since history as an evidence-based academic discipline did not exist until relatively modern times. Ancient history included all sorts of myths, wonders, and marvels, side-by-side with legendary and apocryphal events as well as the mundane and verifiable.
Asimov’s Guide to the Bible, originally published in two volumes in 1968–69, then reprinted as one in 1981, seeks to demystify the text. It also assumes a level of familiarity that Harrison did not expect from his readers (and did not find among his students). The Bible may not be as widely read as Asimov thought, even if sales suggest otherwise. Yet he does not expect that his readers will know “ancient history outside the Bible,” the sort of critical context necessary for understanding what its writings meant to contemporary readers, for whom the “places and people” mentioned “were well known.”
“I am trying,” Asimov writes in his introduction, “to bring in the outside world, illuminate it in terms of the Biblical story and, in return, illuminate the events of the Bible by adding to it the non-Biblical aspects of history, biography, and geography.” This describes the general methodology of critical Biblical scholars. Yet Asimov’s book has a distinct advantage over most of those written by, and for, academics. Its tone, as one reader comments, is “quick and fun, chatty, non-academic.” It’s approachable and highly readable, that is, yet still serious and erudite.
Asimov’s approach in his guide is not hostile or “anti-religious,” as another reader observes, but he was not himself friendly to religious beliefs, or superstitions, or irrational what-have-yous. In the interview above from 1988, he explains that while humans are inherently irrational creatures, he nonetheless felt a duty “to be a skeptic, to insist on evidence, to want things to make sense.” It is, he says, akin to the calling believers feel to “spread God’s word.” Part of that duty, for Asimov, included making the Bible make sense for those who appreciate how deeply embedded it is in world culture and history, but who may not be interested in just taking it on faith. Find an old copy of Asimov’s Guide to the Bible at Amazon.
Open Culture scours the web for the best educational media. We find the free courses and audio books you need, the language lessons & educational videos you want, and plenty of enlightenment in between.