Amir joins your hosts Mark Linsenmayer, Erica Spyres, and Brian Hirt to consider this common act that can stretch from the mundane to the sublime. How have our various purposes for photography changed with the advent of digital technology, the introduction of social media, and the ready access to video? What determines what we choose to take pictures of, and how does taking photography more seriously change the way we experience the world? We touch on iconic and idealized images, capturing the specific vs. the universal, witnessing vs. intervening via photography, and more.
See more of Amir’s work at amirzaki.net.
A few of the articles we looked at to prepare included:
When next you meet an existentialist, ask what kind of existentialist they are. There are at least as many varieties of existentialism as there have been high-profile thinkers propounding it. Several major strains ran through postwar France alone, most famously those championed by Jean-Paul Sartre, Simone de Beauvoir, and Albert Camus — who explicitly rejected existentialism, in part due to a philosophical split with Sartre, but who nevertheless gets categorized among the existentialists today. We could, perhaps, more accurately describe Camus as an absurdist, a thinker who starts with the inherent meaninglessness and futility of life and proceeds, not necessarily in an obvious direction, from there.
The animated TED-Ed lesson above sheds light on the historical events and personal experiences that brought Camus to this worldview. Beginning with the troubled colonial Algeria of the early 20th century in which Camus was born and raised, educator Nina Medvinskaya goes on to tell of his periods as a resistance journalist in France and as a novelist, in which capacity he would write such enduring works as The Stranger and The Plague. Medvinskaya illuminates Camus’ central insight with a well-known image from his earlier essay “The Myth of Sisyphus,” on the Greek king condemned by the gods to roll a boulder up a hill for all eternity.
“Camus argues that all of humanity is in the same position,” says Medvinskaya, “and only when we accept the meaninglessness of our lives can we face the absurd with our heads held high.” But “Camus’ contemporaries weren’t so accepting of futility.” (Here the Quentin Blake-style illustrations portray a couple of figures bearing a strong resemblance to Sartre and de Beauvoir.) Many existentialists “advocated for violent revolution to upend systems they believed were depriving people of agency and purpose.” Such calls haven’t gone silent in 2020, just as The Plague — one of Camus’ writings in response to revolutionary existentialism — has only gained relevance in a time of global pandemic.
Last month the Boston Review’s Carmen Lea Dege considered the recent comeback of the school of thought, exemplified in different ways by Camus, Sartre, and others, that “rejected religious and political dogma, expressed scorn for academic abstraction, and focused on the finitude and absurdity of human existence.” This resurgence of interest “is not entirely surprising. The body of work we now think of as existentialist emerged during the first half of the twentieth century in conflict-ridden Germany and France, where uncertainty permeated every dimension of society.” As much as our societies have changed since then, uncertainty has a way of returning.
Today “we define ourselves and others on the basis of class, religion, race, and nationality, or even childhood influences and subconscious drives, to gain control over the contingencies of the world and insert ourselves in the myriad ways people have failed and succeeded in human history.” But the existentialists argued that “this control is illusory and deceptive,” an “alluring distraction from our own fragility” that ultimately “corrodes our ability to live well.” For the existentialists, the pursuit of the good life first demands an acceptance of not just fragility but futility, meaninglessness, absurdity, and ambiguity, among other conditions that strike us as deeply unacceptable. As Camus put it, we must imagine Sisyphus happy. But can we?
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall, on Facebook, or on Instagram.
“Correlation does not equal causation” isn’t always a fun thing to say at parties, but it is always a good phrase to keep in mind when approaching survey data. Does the study really show that? Might it show the opposite? Does it confirm pre-existing biases or fail to acknowledge valid counterevidence? A little bit of critical thinking can turn away a lot of trouble.
I’ll admit, a new study, “The Role of Education in Taming Authoritarian Attitudes,” confirms many of my own biases, suggesting that higher education, especially the liberal arts, reduces authoritarian attitudes around the world. The claim comes from Georgetown University’s Center on Education and the Workforce, which analyzed and aggregated data from World Values Surveys conducted between 1994 and 2016. The study takes it for granted that rising authoritarianism is not a social good, or at least that it poses a distinct threat to democratic republics, and it aims to show how “higher education can protect democracy.”
Authoritarianism—defined as enforcing “group conformity and strict allegiance to authority at the expense of personal freedoms”—seems vastly more prevalent among those with only a high school education. “Among college graduates,” Elizabeth Redden writes at Inside Higher Ed, “holders of liberal arts degrees are less inclined to express authoritarian attitudes and preferences compared to individuals who hold degrees in business or science, technology, engineering and mathematics fields.”
The “valuable bulwark” of the liberal arts seems more effective in the U.S. than in Europe, perhaps because “American higher education places a strong emphasis on a combination of specific and general education,” the full report speculates. “Such general education includes exposure to the liberal arts.” The U.S. ranks at a moderate level of authoritarianism compared to 51 other countries, on par with Chile and Uruguay, with Germany ranking the least authoritarian and India the most—a 6 on a scale of 0–6.
Higher education also correlates with higher economic status, suggesting to the study authors that economic security reduces authoritarianism, which is expressed in attitudes about parenting and in a “fundamental orientation” toward control over autonomy.
The full report does go into greater depth, but perhaps it raises more questions than it answers, leaving the intellectually curious to work through a dense bibliography of popular and academic sources. There is a significant amount of data and evidence to suggest that studying the liberal arts does help people to imagine other perspectives and to appreciate, rather than fear, different cultures, religions, etc. Liberal arts education encourages critical thinking, reading, and writing, and can equip students with tools they need to distinguish reportage from pure propaganda.
But we might ask whether these findings consistently obtain under actually existing authoritarianism, which “tends to arise under conditions of threat to social norms or personal security.” In the 2016 U.S. election, for example, the candidate espousing openly authoritarian attitudes and preferences, now the current U.S. president, was elected with the support of voters who were, subsequent research discovered, largely well-educated and economically secure rather than the stereotypically “working class” voters with low levels of education. How do such findings fit with the data Georgetown interprets in their report? Is it possible that those with higher education and social status simply learn to hide controlling, intolerant attitudes in mixed company?
Let’s say you go home for the holidays. Anything’s possible, who knows. It’s a wild world. Let’s say you get there and someone starts laying on you that trip about how Q Continuum said mail-in voting was orchestrated by satanic cables from Anarchist HQ. Let’s say you overhear something more down-to-earth, like how if mail-in voting happens, billions of people will vote illegally… even more people than live in the country, which is how you’ll know….
Maybe you’ll want to speak up and say, hey I know something about this topic, except then maybe you realize you don’t actually know much, but you know something ain’t right with this talk and maybe it’s probably good to have a functioning Postal Service and maybe people should be able to vote. In such situations (who can say how often these things happen), you might wish to have a little information at the ready, to educate yourself and share with others.
You might share information about how mail-in voting has been around since 1775. It has worked pretty well at scale since “about 150,000 of the 1 million Union soldiers were able to vote absentee in the 1864 presidential election in what became the first widespread use of non-in person voting in American history,” Alex Seitz-Wald explains at NBC News. Since the federal government has managed to make mail-in voting work for soldiers serving away from home for over 150 years, “it’s now easier in some ways for a Marine in Afghanistan to vote than it is for an American stuck at home during the COVID-19 lockdown.”
“Some part of the military has been voting absentee since the American Revolution,” Donald Inbody, former Navy Captain turned political science professor at Texas State University, tells NBC News. Inbody refers to one of the first documented instances, when Continental Army soldiers voted in a town meeting by proxy in New Hampshire. But history is complicated, and “mail-in voting has worked just fine so shut up” needs some nuance.
In the very same election in which 150,000 Union soldiers mailed their ballots, Lincoln urged Sherman to send soldiers from Democratic-controlled Indiana—which had banned absentee voting—back to their home state so that they could vote. The practice has always had its vocal critics and suffered accusations of fraud from all sides, though little evidence seems to have emerged. Absentee voting helped win the Civil War, Blake Stilwell argues at Military.com, in spite of a conspiracy theory alleging fraud that might have unseated Lincoln.
Several remnants survive from that era of careful record-keeping, like the pre-printed envelope above that “contained a tally sheet of votes from the soldiers of Highland County the Field Hospital 2nd Division 23rd Army Corps,” notes the Smithsonian National Postal Museum. (The drawing at the top shows Pennsylvania soldiers voting in 1864.) And this is all fascinating stuff. But soldiers are actually absent, which is why they vote absentee, right? I mean, if you’re at home, why can’t you just go to the polling place, in the middle of a global pandemic, in a city that has closed all its polling places?
It’s true that civilian mail-in voting often works differently from military absentee voting. While every state offers some version, some restrict it to voters temporarily out of state or suffering an illness. Currently, only “30 states have adopted ‘no-excuse absentee balloting,’ which allows anyone to request an absentee ballot,” Nina Strochlic reports at National Geographic. State laws vary further among those 30.
“In 2000,” for example, “Oregon became the first state to switch to fully vote-by-mail elections.” Things have rapidly changed, however. “In the face of the coronavirus pandemic, voters in every state but Mississippi and Texas were allowed to vote by mail or by absentee ballot in this year’s primaries.” If you live in the U.S. (or outside it) and don’t know what happened next… bless you. It involves defunding the post office instead of the police.
Voting by mail has expanded to meet major crises throughout history, says Alex Keyssar, history professor at the Kennedy School of Government at Harvard. “That’s the logical trajectory” and “we are not in normal times.” If a highly infectious disease that has killed at least 200,000 Americans, on top of ongoing voter suppression, an election security crisis, massive civil unrest, and economic turmoil, isn’t reason enough to expand the vote-by-mail franchise to every state, I couldn’t say what is.
Should only soldiers have the ability to vote easily? I imagine someone might say YES, loudly over the centerpiece, because voting is a privilege not a right!
You, empowered purveyor of accurate information, understander of absentee voting history, change-maker, will pull out your pocket Constitution and ask someone to find the word “privilege” in amendments that start with “The right of citizens of the United States to vote shall not be denied or abridged by the United States or by any State,” etc. That’ll show ’em. But if the gambit fails to impress, you’ve still got a better understanding of why voting by mail may not be one of the signs of the end times.
The greatest of the silent clowns is Buster Keaton, not only because of what he did, but because of how he did it. —Roger Ebert
In 1987, Video magazine published a story titled “Where’s Buster?” lamenting the lack of Buster Keaton films available on videotape, “despite renewed interest” in a legend who was “about to regain his rightful place next to Chaplin in silent comedy’s pantheon.” How things have changed for Keaton fans and admirers. Not only are most of the stone-faced comic genius’ films available online, but he may even have eclipsed Chaplin as the most popularly revered silent film star of the 1920s.
Keaton has always been held in the highest esteem by his fellow artists. He was dubbed “the greatest of all the clowns in the history of the cinema” by Orson Welles, and served as a significant inspiration for Samuel Beckett. (He was the playwright’s first choice to play Waiting for Godot’s Lucky, though he was too perplexed by the script to take the role). In Peter Bogdanovich’s new documentary, The Great Buster: A Celebration, Mel Brooks and Carl Reiner discuss his foundational influence on their comedy, and Werner Herzog calls him “the essence of movies.”
For many years, however, the state of Keaton’s filmography made it hard for the general public to fully appraise his work. “The General, with Buster as a train engineer in the Civil War, has always been available,” Roger Ebert wrote in 2002, and has been “hailed as one of the supreme masterpieces of silent filmmaking. But other features and shorts existed in shabby, incomplete prints, if at all, and it was only in the 1960s that film historians began to assemble and restore Keaton’s lifework. Now almost everything has been recovered, restored, and is available on DVDs and tapes that range from watchable to sparkling.”
Access to Keaton’s films has further expanded as a dozen or so entered the public domain in recent years, including two features, Sherlock, Jr. and The Navigator, this year and three more to come in 2021. You can watch thirty-one of Keaton’s restored, recovered films on YouTube, at the links below, shared by MetaFilter user Going to Maine, who writes, “where, oh where, in this modern world, can we find the gems of his golden era? The obvious place.”
Keaton starred in his first feature-length film, The Saphead, in 1920. For the next decade, until the end of the silent era, he dominated the box office, alongside Chaplin and Harold Lloyd, with his canny blend of daredevil slapstick and everyman pathos. After the twenties, his career floundered, then rebounded. Among his final pictures was a return to silent film in Beckett’s short “Film” (1965), released the year before his death. Since then, Keaton appreciation has become almost a form of worship.
The General came in at number 34 on Sight & Sound’s 2012 Greatest Films of All Time poll. But the BFI’s Geoff Andrew argued that it deserved the top spot, and Keaton deserves recognition as “not merely the greatest of the silent comedians,” but “the greatest of all comic actors to have appeared on the silver screen… not only a great American filmmaker of the silent era,” but “one of the greatest filmmakers of all time, anywhere.” Andrew likens him to a god, but “unlike gods… Buster has the advantage of being able to make us laugh. And laugh. And laugh.”
In Seoul, where I live, the success of Bong Joon-ho’s Parasite at this year’s Academy Awards — unprecedented for a non-American film, let alone a Korean one — did not go unnoticed. But even then, the celebration had already been underway at least since the movie won the Palme d’Or at Cannes. Something of a homecoming for Bong after Snowpiercer and Okja, two projects made wholly or partially abroad, Parasite takes place entirely in Seoul, staging a socioeconomic grudge match between three families occupying starkly disparate places in the human hierarchy. The denouement is chaotic, but arrived at through the precision filmmaking with which Bong has made his name over the past two decades.
When Parasite’s storyboards were published in graphic-novel form here a few months ago, I noticed ads in the subway promising a look into the mind of “Bongtail.” Though Bong has publicly declared his contempt for that nickname, it has nevertheless stuck as a reflection of his meticulous way of working.
The son of a graphic designer, he grew up not just watching movies but drawing comics, a practice that would later serve him well in creating his own storyboards. In so doing he assembles an entire film in his mind before shooting its first frame (a working process not dissimilar to that of Western filmmakers like the Coen brothers), which enables him and his collaborators to execute complex sequences such as what the Nerdwriter calls Parasite’s “perfect montage.”
With the English translation of Parasite: A Graphic Novel in Storyboards now available, video essayists like Thomas Flight have made comparisons between Bong’s drawings and the film. Starting with that celebrated montage, Flight shows that, where the final product departs from its plan, it usually does so to simplify the hand-drawn action, making it more legible and elegant. In the short video just above, you can watch one minute of Parasite lined up with its corresponding storyboard panels, one of which incorporates a photograph of the real Seoul neighborhood in which Bong located the main characters’ home. This is rich storyboarding indeed, but in his introduction to the book, Bong explains that he doesn’t consider it essential to filmmaking, just essential to him: “I actually storyboard to quell my own anxiety.” Would that we could all draw worldwide acclaim from doing the same.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall, on Facebook, or on Instagram.
Open access publishing has, indeed, made academic research more accessible, but in “the move from physical academic journals to digitally-accessible papers,” Samantha Cole writes at Vice, it has also become “more precarious to preserve…. If an institution stops paying for web hosting or changes servers, the research within could disappear.” At least a couple hundred open access journals vanished in this way between 2000 and 2019, a new study published on arXiv found. Another 900 journals are in danger of meeting the same fate.
The journals in peril include scholarship in the humanities and sciences, though many publications may only be of interest to historians, given the speed at which scientific research tends to move. In any case, “there shouldn’t really be any decay or loss in scientific publications, particularly those that have been open on the web,” says study co-author Mikael Laakso, an information scientist at the Hanken School of Economics in Helsinki. Yet, in digital publishing, there are no printed copies in university libraries, catalogued and maintained by librarians.
To fill the need, the Internet Archive has created its own scholarly search platform, a “fulltext search index” that includes “over 25 million research articles and other scholarly documents” preserved on its servers. These collections span digitized and original digital articles published from the 18th century to “the latest Open Access conference proceedings and pre-prints crawled from the World Wide Web.” Content in this search index comes in one of three forms:
public web content in the Wayback Machine web archives (web.archive.org), either identified from historic collecting, crawled specifically to ensure long-term access to scholarly materials, or crawled at the direction of Archive-It partners
digitized print material from paper and microform collections purchased and scanned by Internet Archive or its partners
general materials on the archive.org collections, including content from partner organizations, uploads from the general public, and mirrors of other projects
The project is still in “alpha” and “has several bugs,” the site cautions, but it could, when it’s fully up and running, become part of a much-needed revolution in academic research—that is if the major academic publishers don’t find some legal pretext to shut it down.
Academic publishing boasts one of the most rapacious and exploitative legal business models on the global market: a double standard in which scholars freely publish and review research for the public benefit (ostensibly) and very often on the public dime, while private intermediaries rake in astronomical sums for themselves with paywalls. The open access model has changed things, but the only way to truly serve the “best interests of researchers and the public,” neuroscientist Shaun Khoo argues, is through public infrastructure and fully non-profit publication.
Maybe Internet Archive Scholar can go some way toward bridging the gap, as a publicly accessible, non-profit search engine, digital catalogue, and library for research that is worth preserving, reading, and building upon even if it doesn’t generate shareholder revenue. For a deeper dive into how the Archive built its formidable, still-developing new database, see the video presentation above from Jefferson Bailey, Director of Web Archiving & Data Services. And have a look at Internet Archive Scholar here. It currently lacks advanced search functions, but plug in any search term and prepare to be amazed by the incredible volume of archived full-text articles you turn up.
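If you want to do that kind of searching programmatically, note that the Scholar fulltext index itself is still in alpha and its own API isn’t described here; the Internet Archive’s general item-search endpoint, archive.org/advancedsearch.php, is public, though. What follows is only a minimal sketch in Python (standard library only) of pulling text items from the broader archive.org collections mentioned above; the query fields (mediatype, identifier, title, year) are ordinary archive.org metadata, and the function name is my own.

import json
import urllib.parse
import urllib.request

def search_archive_texts(query, rows=5):
    # Build a query against the Internet Archive's public item-search endpoint,
    # restricted to items with mediatype "texts" (books, papers, documents).
    params = urllib.parse.urlencode([
        ("q", f"{query} AND mediatype:texts"),
        ("fl[]", "identifier"),   # fields to return
        ("fl[]", "title"),
        ("fl[]", "year"),
        ("rows", str(rows)),
        ("output", "json"),
    ])
    url = "https://archive.org/advancedsearch.php?" + params
    with urllib.request.urlopen(url) as resp:
        docs = json.load(resp)["response"]["docs"]
    # Each doc is a metadata record; turn it into a readable (year, title, link) tuple.
    return [
        (d.get("year"), d.get("title"), f"https://archive.org/details/{d['identifier']}")
        for d in docs
    ]

if __name__ == "__main__":
    for year, title, link in search_archive_texts("open access publishing"):
        print(year, "-", title, "-", link)

This only touches item metadata, not the Scholar index’s full text of 25 million articles, but it gives a feel for how the Archive exposes its collections to anyone who asks.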
There were a lot of moments during my first viewing of The Wire when I realized I wasn’t watching the usual cop procedural. But the one that sticks in my head was when an obviously blitzed and blasted McNulty, the Irish-American detective that you *might* think is the hero of the show, leaves a bar, gets into his car and promptly totals it. In any other show this would have been the turning point for the character, either as a wake-up call, a reason for his boss to throw him off the case, or to gin up some suspense. But no. McNulty walks away from the accident and…it’s never really spoken about. The cops took care of their own.
Life does not follow the contours of a television drama, and neither did David Simon’s groundbreaking HBO series. Beloved characters get killed, or not, or they just transfer out of the show as in life. Nobody really gets what they want. Neither good nor evil wins.
As Simon told an audience at Loyola University, Baltimore in 2007: “What we were trying to do was take the notion of Greek tragedy, of fated and doomed people, and instead of these Olympian gods, indifferent, venal, selfish, hurling lightning bolts and hitting people in the ass for no reason—instead of those guys whipping it on Oedipus or Achilles, it’s the postmodern institutions … those are the indifferent gods.”
The Wire still feels recent despite premiering in 2002 and in 4:3 ratio, no widescreen HD here. It feels recent because the problems depicted in the show still exist: corruption at all levels of city government and governance, institutionalized racism, failed schools, a collapsing fourth estate, a gutted economy, weakened unions, and a general nihilism and despondency. Simon may not have seen the Black Lives Matter movement coming, but the recipe for it, the warning of it, is there in the show.
So there’s definitely a reason to give it a re-watch to see how we’ve changed. The video essay above, from 2019, makes the case for The Wire as a subversion of the usual cop show, with Thomas Flight noting it “doesn’t try to grab and keep your attention. It requires it. And if you give it your attention it will reward you.”
It also reminds us of the literary giants in the writers’ room: crime novelists Dennis Lehane, George Pelecanos, and Richard Price were on the team, as were journalists Rafael Alvarez and William F. Zorzi. That, combined with David Simon’s years covering Baltimore as a journalist and Ed Burns’ experience on the police force, means the show feels right; the writers did their research, and actual Baltimore extras were encouraged to speak up if something didn’t ring true.
If that video essay intrigues you, there’s more in the series, though with many more spoilers, such as this one on Character and Theme.
Not long after The Wire finished its fifth and final season, plenty of books were published on the show. And now that we’re nearly two decades on from its premiere, The Atlantic’s Jemele Hill and The Ringer’s Van Lathan have decided to spend quarantine kicking off a podcast in which the two black cultural critics give the show a spirited re-watch. Does the show feature too much “copaganda,” as some leftist critics now contend? Does it hold up the way white liberals (its biggest fans, let’s be honest, despite President Obama’s shout-out) think it does? The hosts just wrapped up Season Three, but if you’re ready to start the show again with commentary, here’s their first episode:
Ted Mills is a freelance writer on the arts who currently hosts the Notes from the Shed podcast and is the producer of KCRW’s Curious Coast. You can also follow him on Twitter at @tedmills, and/or watch his films here.