Perhaps one of the most criminally overlooked voices from World War I, Siegfried Sassoon, was, in his time, enormously popular with the British reading public. His war poems, as Margaret B. McDowell writes in the Dictionary of Literary Biography, are “harshly realistic laments or satires” that detail the grisly horrors of trench warfare with unsparingly vivid images and commentary. In the absence of the mass medium of television, and with film still emerging from its infancy, poets like Sassoon and Wilfred Owen served an important function not only as artists but as moving, firsthand documentarians of the war’s physical and emotional ravages.
It is unfortunate that poetry no longer serves this public function. These days, video threatens to eclipse even journalistic writing as a primary means of communication, a development made especially troubling by how easily digital video can be faked or manipulated by the same technologies used to produce blockbuster Hollywood spectacles and video games. But a fascinating new use of that technology, as Peter Jackson shows us above, will soon bring the grainy, indistinct film of the past into new life, giving footage of WWI the kind of startling immediacy still conveyed by Sassoon’s poetry.
Jackson is currently at work on what he describes as “not the usual film that you would expect on the First World War,” and as part of that documentary work, he has digitally enhanced footage from the period, “incredible footage in which the faces of the men just jump out at you. It’s the faces, it’s the people that come to life in this film. It’s the human beings that were actually there, that were thrust into this extraordinary situation that defined their lives in many cases.” In addition to restoring old film, Jackson and his team have combed through about 600 hours of audio interviews with WWI veterans, in order to further communicate “the experience of what it was like to fight in this war” from the point of view of the people who fought it.
The project, commissioned by the Imperial War Museums, “will debut at the BFI London Film Festival later this year,” reports The Independent, “later airing on BBC One. A copy of the film will also be given to every secondary school in the country for the 2018 autumn term.” No word yet on where the film can be seen outside the UK, but you can check the site 1418now.org.uk for release details. In the meantime, consider picking up some of the work of Siegfried Sassoon, whom critic Peter Levi once described as “one of the few poets of his generation we are really unable to do without.”
Learn more about the war at the free course offerings below.
The novel medium of social media—and the novel use of Twitter as the official PR platform for public figures—not only allows endless amounts of noise and disinformation to permeate our newsfeeds; it also gives readers the opportunity to refute statements in real time. Whether corrections register or simply get drowned in the sea of information is perhaps a question for a 21st century Marshall McLuhan to ponder.
Another prominent theorist of older forms of media, Noam Chomsky, might also have an opinion on the matter. In his 1988 book Manufacturing Consent, written with Edward Herman, Chomsky details the ways in which governments and media collude to deliberately mislead the public and socially engineer support for wars that kill millions and enrich a handful of profiteers.
Moreover, in mass media communications, those wars, invasions, “police actions,” regime changes, etc. get conveniently erased from historical memory by public intellectuals who serve the interests of state power. In one recent example on the social medium of record, Twitter, Richard N. Haass, President of the Council on Foreign Relations, expressed dismay about the disturbingly cozy state of affairs between the U.S. Administration and Putin’s Russia by claiming that “International order for 4 centuries has been based on non-interference in the internal affairs of others and respect for sovereignty.”
One recent critique of foreign policy bodies like CFR would beg to differ, as would the history of hundreds of years of colonialism. In a very Chomsky-like rejoinder to Haass, journalist Nick Turse wrote, “This might be news to Iraqis and Afghans and Libyans and Yemenis and Vietnamese and Cambodians and Laotians and Koreans and Iranians and Guatemalans and Chileans and Nicaraguans and Mexicans and Cubans and Dominicans and Haitians and Filipinos and Congolese and Russians and….”
Genuine concerns about Russian election tampering notwithstanding, the list of U.S. interventions in the “affairs of others” could go on and on. Haass’s initial statement offers an almost perfect example of what Chomsky identified in another essay, “The Responsibility of Intellectuals,” as not only a “lack of concern for truth” but also “a real or feigned naiveté about American actions that reaches startling proportions.”
“It is the responsibility of intellectuals to speak the truth and to expose lies,” wrote Chomsky in his 1967 essay. “This, at least, may seem enough of a truism to pass over without comment. Not so, however. For the modern intellectual, it is not at all obvious.” Chomsky proceeds from the pro-Nazi statements of Martin Heidegger to the distortions and outright falsehoods issued routinely by such thinkers and shapers of foreign policy as Arthur Schlesinger, economist Walt Rostow, and Henry Kissinger in their defense of the disastrous Vietnam War.
The background for all of these figures’ distortions of fact, Chomsky argues, is the perpetual presumption of innocence on the part of the U.S., a feature of the doctrine of exceptionalism under which “it is an article of faith that American motives are pure, and not subject to analysis.” We have seen this article of faith invoked in hagiographies of past Administrations whose domestic and international crimes are conveniently forgotten in order to turn them into foils, stock figures for an order to which many would like to return. (As one former Presidential candidate put it, “America is great, because America is good.”)
Chomsky would include the rhetorical appeal to a nobler past in the category of “imperialist apologia”—a presumption of innocence that “becomes increasingly distasteful as the power it serves grows more dominant in world affairs, and more capable, therefore, of the unconstrained viciousness that the mass media present to us each day.”
We are hardly the first power in history to combine material interests, great technological capacity, and an utter disregard for the suffering and misery of the lower orders. The long tradition of naiveté and self-righteousness that disfigures our intellectual history, however, must serve as a warning to the third world, if such a warning is needed, as to how our protestations of sincerity and benign intent are to be interpreted.
For those who well recall the events of even fifteen years ago, when the U.S. government, with the aid of a compliant press, lied its way into the second Iraq war, condoning torture and the “extraordinary rendition” of supposed hostiles to black sites in the name of liberating the Iraqi people, Chomsky’s Vietnam-era critiques may sound just as fresh as they did in the mid-sixties. Are we already in danger of misremembering that recent history? “When we consider the responsibility of intellectuals,” Chomsky writes, the issue at hand is not solely individual morality; “our basic concern must be their role in the creation and analysis of ideology.”
What are the ideological features of U.S. self-understanding that allow it to recreate past errors again and again, then deny that history and sink again into complacency, perpetuating crimes against humanity from the Cambodian bombings and My Lai massacre, to the grotesque scenes at Abu Ghraib and the drone bombings of hospitals and weddings, to supporting mass killings in Yemen and the murder of unarmed Palestinian protesters, to the kidnapping and caging of children at the Mexican border?
The current ruling party in the U.S. presents an existential threat, Chomsky recently opined, on a world historical scale, displaying “a level of criminality that is almost hard to find words to describe.” It is the responsibility of intellectuals, Chomsky argues in his essay—including journalists, academics, and policy makers and shapers—to tell the truth about events past and present, no matter how inconvenient those truths may be.
We remember Stanley Kubrick as the archetypal cinematic auteur. Though filmmaking is a hugely collaborative effort, could any of his films have been made without his presiding authorial intelligence? Certainly none could have been made without his eye for literary material. Kubrick usually began his projects not with his own original ideas but with books, famously adapting the likes of Vladimir Nabokov’s Lolita and Anthony Burgess’ A Clockwork Orange, continuing the practice right up until his final picture Eyes Wide Shut, an adaptation of Austrian writer Arthur Schnitzler’s 1926 novella Traumnovelle, or Dream Story.
But Traumnovelle, it turns out, wasn’t the only Austrian novella of the early 20th century Kubrick worked on adapting for the screen. A recently discovered “lost” Kubrick screenplay, writes the Guardian’s Dalya Alberge, “is so close to completion that it could be developed by filmmakers. Entitled Burning Secret, the script is an adaptation of the 1913 novella by the Viennese writer Stefan Zweig. In Kubrick’s adaptation of the story of adultery and passion set in a spa resort, a suave and predatory man befriends a 10-year-old boy, using him to seduce the child’s married mother.” Kubrick wrote the script in 1956 in collaboration with Calder Willingham, with whom he also wrote Paths of Glory, which would become his fourth feature the following year.
The studio MGM, Alberge writes, “is thought to have cancelled the commissioned project after learning that Kubrick was also working on Paths of Glory, putting him in breach of contract. Another account suggests that MGM told Kubrick’s producing partner James B. Harris that it did not see the screenplay’s potential as a movie.” She also quotes Nathan Abrams, the film professor at Wales’ Bangor University who recently found the Burning Secret script, as saying that “ ‘the adultery storyline’ involving a child as a go-between might have been considered too risqué” back in the 1950s. Since Kubrick could “only just” get Lolita through in 1961, this “inverse of Lolita” may not have had much chance half a decade earlier.
Zweig, one of the most popular writers in the world in the 1920s and 1930s, has already inspired one film by an American auteur: Wes Anderson’s The Grand Budapest Hotel, which came out in 2014. Not only are several of its characters modeled on Zweig himself, it has the same structure of stories nested within stories that Zweig used in his writing. “It’s a device that maybe is a bit old-fashioned,” Anderson said in a Telegraph interview, “where somebody meets an interesting, mysterious person and there’s a bit of a scene that unfolds with them before they eventually settle down to tell their whole tale, which then becomes the larger book or story we’re reading.” Usually, heightening the confessional mood further still, the teller has never told the tale to anyone else. Hence the burning nature of secrets in Zweig — and hence the fascination of Kubrick’s cool, controlled cinematic sensibility interpreting them.
Based in Seoul, Colin Marshall writes and broadcasts on cities and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
Tucked away in the style section of yesterday’s Washington Post—after the President of the United States basically declared allegiance to a hostile dictator, again, after issuing yet more denunciations of the U.S. press as “enemies of the people”—was an admonition from Margaret Sullivan to the “reality-based press.” “The job will require clarity and moral force,” writes Sullivan, “in ways we’re not always all that comfortable with.”
Many have exhausted themselves in asking, what makes it so hard for journalists to tell the truth with “clarity and moral force”? Answers range from the conspiratorial—journalists and editors are bought off or coerced—to the mundane: they normalize aberrant behavior in order to relieve cognitive dissonance and maintain a comfortable status quo. While the former explanation can’t be dismissed out of hand in the sense that most journalists ultimately work for media megaconglomerates with their own vested interests, the latter is just as often offered by critics like NYU’s Jay Rosen.
Established journalists “want things to be normal,” writes Rosen, which includes preserving access to high-level sources. The press maintains a pretense to objectivity and even-handedness, even when doing so avoids obvious truths about the mendacity of their subjects. Mainstream journalists place “protecting themselves against criticism,” Rosen wrote in 2016, “before serving their readers. This is troubling because that kind of self-protection has far less legitimacy than the duties of journalism, especially when the criticism itself is barely valid.”
As is far too often the case these days, the questions we grapple with now are the same that vexed George Orwell over fifty years ago in his many literary confrontations with totalitarianism in its varying forms. Orwell faced what he construed as a kind of censorship when he finished his satirical novel Animal Farm. The manuscript was rejected by four publishers, Orwell noted in a preface, called “The Freedom of the Press,” that was intended to accompany the book. The preface was “not included in the first edition of the work,” the British Library points out, “and it remained undiscovered until 1971.”
“Only one of these” publishers “had any ideological motive,” writes Orwell. “Two had been publishing anti-Russian books for years, and the other had no noticeable political colour. One publisher actually started by accepting the book, but after making preliminary arrangements he decided to consult the Ministry of Information, who appear to have warned him, or at any rate strongly advised him, against publishing it.” While Orwell finds this development troubling, “the chief danger to freedom of thought and speech,” he writes, was not government censorship.
If publishers and editors exert themselves to keep certain topics out of print, it is not because they are frightened of prosecution but because they are frightened of public opinion. In this country intellectual cowardice is the worst enemy a writer or journalist has to face, and that fact does not seem to me to have had the discussion it deserves.
The “discomfort” of intellectual honesty, Orwell writes, meant that even during wartime, with the Ministry of Information’s often ham-fisted attempts at press censorship, “the sinister fact about literary censorship in England is that it is largely voluntary.” Self-censorship came down to matters of decorum, Orwell argues—or as we would put it today, “civility.” Obedience to “an orthodoxy” meant that while “it is not exactly forbidden to say this, that or the other… it is ‘not done’ to say it, just as in mid-Victorian times it was ‘not done’ to mention trousers in the presence of a lady. Anyone who challenges the prevailing orthodoxy finds himself silenced with surprising effectiveness,” not by government agents, but by a critical backlash aimed at preserving a sense of “normalcy” at all costs.
At stake for Orwell is no less than the fundamental liberal principle of free speech, in defense of which he invokes the famous quote from Voltaire as well as Rosa Luxemburg’s definition of freedom as “freedom for the other fellow.” “Liberty of speech and of the press,” he writes, does not demand “absolute liberty”—though he stops short of defining its limits. But it does demand the courage to tell uncomfortable truths, even such truths as are, perhaps, politically inexpedient or detrimental to the prospects of a lucrative career. “If liberty means anything at all,” Orwell concludes, “it means the right to tell people what they do not want to hear.”
You can’t talk about American literature in the second half of the 20th century without talking about Kurt Vonnegut. And since so many well-known writers today imbibed his influence at one point or another, you’d have to mention him when talking about 21st-century literature as well. Despite so fully inhabiting his time, not least by wickedly lampooning it, the author of Slaughterhouse-Five, Cat’s Cradle, and Breakfast of Champions also had a few tendencies that put him ahead of his time. He worked wonders with the short story, a form in whose heyday he began his writing career, but he also had a knack for what would become the most social media-friendly of all forms, the list.
In the video above, those abilities converge to produce Vonnegut’s eight bullet points for good short-story writing:
Use the time of a total stranger in such a way that he or she will not feel the time was wasted.
Give the reader at least one character he or she can root for.
Every character should want something, even if it is only a glass of water.
Every sentence must do one of two things — reveal character or advance the action.
Start as close to the end as possible.
Be a sadist. No matter how sweet and innocent your leading characters, make awful things happen to them — in order that the reader may see what they are made of.
Write to please just one person. If you open a window and make love to the world, so to speak, your story will get pneumonia.
Give your readers as much information as possible as soon as possible. To heck with suspense. Readers should have such complete understanding of what is going on, where and why, that they could finish the story themselves, should cockroaches eat the last few pages.
In the short lecture above Vonnegut gets more technical, sketching out the shapes that stories, short or long, can take. On his chalkboard he draws two axes, the horizontal representing time and the vertical representing the protagonist’s happiness. In one possible story the protagonist begins slightly happier than average, gets into trouble (a downward plunge in the story’s curve), and then gets out of it again (returning the curve to a higher point of happiness than where it began). “People love that story,” Vonnegut says. “They never get sick of it.” Another story starts on an “average day” with an “average person not expecting anything to happen.” Then that average person “finds something wonderful” (with a concurrent upward curve), then loses it (back down), then finds it again (back up).
The third and most complicated curve represents “the most popular story in Western civilization.” It begins down toward the bottom of the happiness axis, with a motherless young girl whose father has “remarried a vile-tempered ugly woman with two nasty daughters.” But a fairy godmother visits and bestows a variety of gifts upon the girl, each one causing a stepwise rise in her happiness curve. That night she attends a ball where she dances with a prince, bringing the curve to its peak before it plunges back to the bottom at the stroke of midnight, when the fairy godmother’s magical gifts expire. In order to bring the curve back up, the prince must use the glass slipper she accidentally left behind at the ball to — oh, you’ve heard this one before?
Vonnegut first explored the idea of story shapes in his master’s thesis, rejected by the University of Chicago “because it was so simple and looked like too much fun.” Clearly that didn’t stop him from continuing to think about and experiment with those shapes all throughout his career. He would also keep clarifying his other ideas about writing and literature by explaining them in a variety of settings. He assigned term papers that can still teach you how to read like a writer, he appeared on television dispensing advice to aspirants to the craft, and he even published articles on how to write with style (in publications like the Institute of Electrical and Electronics Engineers’ journal at that). Nobody could, or should try to, write just like Kurt Vonnegut, but all of us who write at all could do well to give our craft the kind of thought he did.
At one time, the name Sarah Bernhardt was synonymous with melodramatic self-presentation. In her heyday, the actress created a category all her own—impossible to judge by the usual standards of the dramatic arts. Or as Mark Twain put it, “there are five kinds of actresses: bad actresses, fair actresses, good actresses, great actresses—and then there is Sarah Bernhardt.”
Admired and beloved by Victor Hugo and playwright Edmond Rostand, who called her “the queen of the pose and the princess of the gesture,” Bernhardt commanded attention in every role, and became infamous as “a canny self-promoter,” as Hannah Manktelow writes. Bernhardt “cultivated her image as a mysterious, exotic outsider. She claimed to sleep in a coffin and encouraged the circulation of outlandish rumors about her eccentric behavior.”
Bernhardt’s worldwide fame rested not only on her public relations skill, but also on her willingness to take dramatic risks most actresses of the time would never dare. In one notable example, she played Hamlet in 1899, at age 55, in a French adaptation of Shakespeare’s play. What’s more, she boldly undertook the role in London, then again in Stratford at the Shakespeare Memorial Theatre. Finally, she became the first woman to portray Hamlet on film (see a short clip above).
Reactions to her stage performance by contemporaries were mixed. In her review, actress and writer Elizabeth Robins praised Bernhardt’s “amazing skill” in playing “a spirited boy… with impetuosity, a youthfulness, almost childish.” But Robins issued a qualification at the outset: “for a woman to play at being a man is, surely, a tremendous handicap,” she writes, a criticism echoed by English essayist Max Beerbohm, who went so far as to deny women the power to create art.
“Creative power,” wrote Beerbohm, “the power to conceive ideas and execute them, is an attribute of virility; women are denied it, in so far as they practice art at all, they are aping virility, exceeding their natural sphere. Never does one understand so well the failure of women in art as when one sees them deliberately impersonating men upon the stage.” Setting Beerbohm’s categorically sexist assertions aside (for the moment), we must mark the irony that both he and Robins are troubled by a woman playing a man, given that all of Shakespeare’s female characters were once played by men, a fact both critics somehow fail to mention.
Where Beerbohm saw in Bernhardt’s performance a mere “aping of virility,” Robins, unhampered by Beerbohm’s ugly misogyny, observed the great actress in vivid detail, in an essay that brings Bernhardt’s Hamlet to life with descriptions of her, for example, “appealing dumbly for another sign” after seeing her father’s ghost (on painted gauze), “and passing pathetic fluttering hands over the unresponsive surface, groping piteously like a child in the dark.”
The pathos of Bernhardt’s performance was undercut, Robins felt, by some clumsy moments, such as her mistreatment of poor Yorick’s skull (a real human skull, by the way, given to her by Victor Hugo). “It was not pleasant,” writes Robins, “to see the grinning object handled so callously…. Indeed, I feel sure that Madame Bernhardt treats her lap-dog more considerately.” On the whole, however, Robins judged the performance a true dramatic achievement, thanks to Bernhardt’s “mastery of sheer poise… of sparing, clean-cut gesture… the effect that the artist in her wanted to produce.”
Some filmmakers start in commercials, honing their chops in anticipation of making personal projects later. A select few go in the other direction, realizing their distinctive vision before fielding offers from companies who want a piece of that vision’s cultural currency. Anyone who’s seen David Lynch’s most acclaimed work will suspect, correctly, that Lynch belongs in the latter group. With 1977’s cult hit Eraserhead, he showed cinema what it means to be Lynchian. This brought him the attention of Hollywood, leading to the respectable success of The Elephant Man and the disaster that was Dune. Only in 1986, with Blue Velvet, could Lynch make a truly, even troublingly personal film that hit the zeitgeist at just the right moment.
Naturally, Madison Avenue came calling soon thereafter. “With the smash Blue Velvet, a Palme d’Or at Cannes for Wild at Heart, and then the national phenomenon of Twin Peaks’ first season, David Lynch clearly established himself as the U.S.A.’s foremost commercially viable avant-garde-‘offbeat’ director,” wrote David Foster Wallace in a 1997 piece on the filmmaker.
“For a while there it looked like he might be able to single-handedly broker a new marriage between art and commerce in U.S. movies, opening formula-frozen Hollywood to some of the eccentricity and vigor of art film.” Lynch’s fans in television advertising must have imagined that he could do the same for their industry, and you can watch the fruits of that hunch in the half-hour compilation of Lynch-directed commercials above.
Lynch has worked for some startlingly big brands, beginning with Calvin Klein: his trio of spots for the fragrance Obsession take as their basis the writing of F. Scott Fitzgerald, Ernest Hemingway, and D.H. Lawrence. A few years later he directed a humorous mini-season of Twin Peaks to promote Georgia Coffee, one of the top brands of canned coffee in the Lynch-loving country of Japan. The New York City Department of Sanitation engaged Lynch’s services to imbue their anti-littering campaign with his signature high-contrast ominousness, a mood also sought by fashion-industry titans like Armani, Yves Saint Laurent, Gucci, and Dior. The marketers of humbler goods like Alka-Seltzer, Barilla Pasta (a seemingly auteur-aware brand that has also hired Wim Wenders and Fellini), and Clear Blue Easy home pregnancy tests have also gone in for a touch of the Lynchian.
Quite a few of these commercials originally aired only outside America, which may reflect the supposedly more enduring appreciation of Lynch’s work that exists in Europe and Asia. But for all Lynch’s artistic daring, the man himself has always come off as an enthusiast of unreconstructed American pleasures. To this day he remains a steadfast smoker, and in 1998 brought that personal credibility to the Swiss cigarette brand Parisienne. The resulting spot features men in ties, showers of sparks, dead fish, backwards talking, a forbiddingly illuminated shack, and apocalyptic flames: Parisienne, in other words, must have got exactly what they paid for.
Historian John Hope Franklin once described the decades from the end of slavery through the advent of Jim Crow as “The Long Dark Night” because of the legislative chicanery and extreme violence used to disenfranchise and dispossess African Americans after the failure of Reconstruction. It is during these years that the blues emerged from the rural South into the cities, and the age of the “race record” brought black music into popular culture in ways that irrevocably defined what the country sounded like.
The source of the blues, writ broadly, is the sufferings and strivings of those anonymous rural folk who transmitted their experiences through song, “whether in the cotton fields or in lumber camps, on the levees or in the shacks of field hands or housemaids,” as Dave Oliphant writes in Texan Jazz. But when it comes to naming early sources, the waters get murky. Jazz writer Ted Gioia refers to the period before the mid-1920s as “the Dark Age of myth and legend” in blues history for its paucity of written detail.
We do know that blues songs gained much popularity throughout the first two decades of the 20th century, many of them penned and published by Memphis composer and “father of the blues,” W.C. Handy. These blues were first commodified and recorded in the 1910s for white audiences by white vaudeville singers like Nora Bayes and Marion Harris. It wasn’t until 1920 that a blues record by a black singer was recorded and released, “and in a sense it was happenstance,” says Angela Davis in the NPR segment below.
“Earlier in the year,” Davis explains, “[Ukrainian-born singer] Sophie Tucker had been scheduled for a recording session but became ill and [blues songwriter] Perry Bradford managed to persuade Okeh Records to allow Mamie Smith to do the recording session instead.” And so we have at the top what Gioia calls the “breakthrough event” of Smith’s “Crazy Blues,” recorded on August 10, 1920, significant because “the first recording companies were reluctant to promote black music of any sort,” and then only when it was performed by white entertainers.
In the decade of “Crazy Blues,” that changed dramatically, as record companies realized a huge untapped market of talent and potential buyers in the working-class black community. “Crazy Blues” was a hit, selling 75,000 copies in its first month. This release and subsequent recordings by Mamie Smith eventually “led the way,” says Davis, “for the professionalization of black music for the black entertainment industry and indeed for the immense popularity of black music today.” Though not strictly a traditional blues, as Oliphant and Gioia both note, the song, and Smith, established an enduring template.
Mamie Smith had been a vaudeville performer, working since childhood as “an all around entertainer,” as the Library of Congress’s Michael Taft remarks on NPR. The Blues Encyclopedia points out that her theatrical background and flamboyant personality lent much to the “the archetypal ‘Queen of the Blues’ persona” inhabited by so many later singers. She was, we might say, the first in a long, distinguished line of songstresses, from Bessie Smith to Beyoncé, who delivered music of hardship and struggle with glamor, glitz, and swagger.