This is a very quick FYI for anyone who happens to be an Audible subscriber. If you’re not, you can start a free trial here.
This month, all Audible members can get free access to James Taylor’s new short memoir called Break Shot: My First 21 Years. Read by James Taylor himself, the book revisits the musician’s turbulent childhood and his emergence as an artist. It also features recorded music by the singer-songwriter.
In addition, Michael Pollan has released a new short audiobook, Caffeine: How Caffeine Created the Modern World. Read by Pollan, the book (only available in audio format) “takes us on a journey through the history of the drug, which was first discovered in a small part of East Africa and within a century became an addiction affecting most of the human species.”
Both books are part of the Audible Originals program. So if you download them, you won’t be using any of your monthly credits. They are free bonus material.
And now for an extra bonus: You can listen to Annette Bening, Jon Hamm, Matthew Rhys, Maura Tierney and others read “The Senate Intelligence Committee Report on Torture.” It’s free for all, whether you’re an Audible subscriber or not.
To sign up for an Audible free trial, click here.
Why, in the course of two extraordinary films by Ridley Scott and Denis Villeneuve, do we never learn what the term Blade Runner actually means? Perhaps the mystery only deepens the sense of “super-realism” with which the film leaves audiences, including—and especially—Philip K. Dick, who only lived long enough to see excerpts. “The impact of Blade Runner is simply going to be overwhelming, both on the public and on creative people,” he wrote. As usual, Dick saw beyond his contemporaries, who mostly panned or ignored the film.
Dick seemed to have “had no beef with the fact Blade Runner was not a faithful adaptation of his novel,” writes David Barnett at the Independent. Not only did he not write a book called Blade Runner—the film was loosely adapted from his 1968 book Do Androids Dream of Electric Sheep?—but he also never used those words, “Blade Runner,” to describe his characters. “It’s not a phrase used in the book and it doesn’t really make much sense in the context of the movie…. It’s simply a throwaway slang for cops who hunt replicants.”
The phrase, as Keele University professor Oliver Harris tells The Quietus, is so much more than that. It brings along with it “a weird backstory that tells us something about how the Burroughs virus spreads around,” infecting nearly everything science fictional and countercultural over the past half-century or so. That’s William S. Burroughs, of course, author of—among a few other things—a 1979 novelistic film treatment called Blade Runner: A Movie.
If Scott and screenwriter Hampton Fancher had adapted Burroughs’ nightmarish 21st century to the cinema, we would have seen a much different film—though one no less resonant with our current dystopia. The story imagines “a medical-care apocalypse,” in which medical supplies like scalpels become smuggled contraband—hence “blade runners.” Burroughs’ book is itself an adaptation—or a re-writing and re-editing—of sci-fi writer Alan Nourse’s 1974 pulp novel The Bladerunner.
It is Nourse who introduced the scenario of a “medical apocalypse” and who coined the term “blade runner,” though we owe its separation into two words to Burroughs. “Reading one text against the other is fascinating,” says Harris. “Nourse writes pedestrian, realist prose with two-dimensional characters who all talk in the same colourless style.” Burroughs, on the other hand, writes with “extraordinary economy, mastery of idiom, and wildly unbound imagination.”
In the crumbling New York (not L.A.) of Burroughs’ future world, the government controls its citizens “through the ability to withhold essential services including work, credit, housing, retirement benefits and medical care through computerization.” Granted, this might not seem to lend itself to a very cinematic treatment, but Burroughs was attracted to the central concept of Nourse’s book, one inherently rich in human tragedy: “medical pandemics appealed to his vision of a species in peril, a planet heading for terminal disaster.”
Dick imagined a species in peril from a different kind of infection, as Burroughs would have it—artificial intelligence. Was the most cinematically adapted sci-fi novelist aware that he had indirectly helped reintroduce a strain of the Burroughs virus—a paranoid, if justified, suspicion of authority—back into popular culture through Blade Runner? Given his status in the science fiction community at the time of his death, three months before the film debuted, we might expect that he would have been aware of the connection. But he gave no hint of it, leaving us to ponder what Burroughs’ Blade Runner: A Movie would have been like as an actual film, made with the skill and sensibility of a Scott or Villeneuve.
Adobe has announced that the Flash Player will come to the official end of its life on the last day of this year, December 31, 2020. News of the demise of an obsolete internet multimedia platform presumably bothers few of today’s web-surfers, but those of us belonging to a certain generation feel in it the end of an era. First introduced by Macromedia in 1996, Flash made possible the kind of animation and sound we’d seldom seen or heard on the internet before — assuming we could manage to load it at all through our sluggish connections. By the early 2000s, Flash seemed to power most everything fun on the internet, especially everything fun to the kids then in middle and high school who’d grown up alongside the World Wide Web.
Though now deep into adulthood, we all remember the hours of the early 21st century we happily whiled away on Flash games, racing cars, solving puzzles, shooting zombies, dodging comets, firing cannons, and piloting helicopters on classroom computers. We could, in theory, find many of these games and play them still today, but that may become impossible next year, when all major web browsers discontinue their support for Flash.
On the download page of Flashpoint, a Flash game preservation project, you’ll find its full 290-gigabyte collection of Flash games, as well as a smaller version that only downloads games as you play them. “While Flash games might not be as impressive today, they are still an important part of gaming history,” writes Zwiezen. “These small web games can be directly linked to the later rise of mobile and indie games and helped many creators get their feet wet with building and creating video games.” In other words, the simple Flash amusements of our schooldays gave rise to the graphically and sonically intense games that we play so compulsively today. Now we have kids who play those sorts of games too, but who among us will initiate the next generation into the ways of Crush the Castle, Age of War, and Bubble Trouble?
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
When I hear the word robot, I like to imagine Isaac Asimov’s delightfully Yiddish-inflected Brooklynese pronunciation of the word: “ro-butt,” with heavy stress on the first syllable. (A quirk shared by Futurama’s crustacean Doctor Zoidberg.) Asimov warned us that robots could be dangerous and impossible to control. But he also showed young readers—in his Norby series of kids’ books written with his wife Janet—that robots could be heroic companions, saving the solar system from cosmic supervillains.
The word robot conjures all of these associations in science fiction: from Blade Runner’s replicants to Star Trek’s Data. We might refer to these particular examples as androids rather than robots, but this confusion is precisely to the point. Our language has forgotten that robots started in sci-fi as more human than human, before they became Asimov-like machines. Like the sci-fi writer’s pronunciation of robot, the word originated in Eastern Europe in 1921, the year after Asimov’s birth, in a play by Czech intellectual Karel Čapek called R.U.R., or “Rossum’s Universal Robots.”
The title refers to the creations of Mr. Rossum, a Frankenstein-like inventor and possible inspiration for Metropolis’s Rotwang (who was himself an inspiration for Dr. Strangelove). Čapek told the London Saturday Review after the play premiered that Rossum was a “typical representative of the scientific materialism of the last [nineteenth] century,” with a “desire to create an artificial man—in the chemical and biological, not mechanical sense.”
Rossum did not wish to play God so much as “to prove God to be unnecessary and absurd.” This was but one stop on “the road to industrial production.” As technology analyst and Penn State professor John M. Jordan writes at the MIT Press Reader, Čapek’s robots were not appliances become sentient, nor trusty, superpowered sidekicks. They were, in fact, invented to be slaves.
The robot… was a critique of mechanization and the ways it can dehumanize people. The word itself derives from the Czech word “robota,” or forced labor, as done by serfs. Its Slavic linguistic root, “rab,” means “slave.” The original word for robots more accurately defines androids, then, in that they were neither metallic nor mechanical.
Jordan describes this history in an excerpt from his book Robots, part of the MIT Press Essential Knowledge Series, and a timelier than ever intervention in the cultural and technological history of robots, who walk (and moonwalk) among us in all sorts of machine forms, if not quite yet in the sense Čapek imagined. But a Blade Runner-like scenario seemed inevitable to him in a society ruled by “utopian notions of science and technology.”
In the time he imagines, he says, “the product of the human brain has escaped the control of human hands.” Čapek has one character, the robot Radius, make the point plainly:
The power of man has fallen. By gaining possession of the factory we have become masters of everything. The period of mankind has passed away. A new world has arisen. … Mankind is no more. Mankind gave us too little life. We wanted more life.
Sound familiar? While R.U.R. owes a “substantial” debt to Mary Shelley’s Frankenstein, it’s also clear that Čapek contributed something original to the critique, a vision of a world in which “humans become more like their machines,” writes Jordan. “Humans and robots… are essentially one and the same.” Beyond the surface fears of science and technology, the play that introduced the word robot to the cultural lexicon also introduced the darker social critique in most stories about them: We have reason to fear robots because in creating them, we’ve recreated ourselves; then we’ve treated them the way we treat each other.
Recorded at Abbey Road Studios by Alan Parsons, who had previously worked on The Beatles’ Abbey Road and Let It Be, Pink Floyd’s The Dark Side of the Moon broke almost as much sonic ground as those albums. “The band chose the world-renowned studio, as it was home to, at the time, some of the most advanced recording technology ever produced – including the EMI TG12345 mixing console,” writes Anthony Sfirse at Enmore Audio.
Parsons made tasteful yet totally spaced-out use, as the Polyphonic video above shows, of synthesizers, stereo multitrack recording, and tape loops. Then there’s David Gilmour’s legendary guitar tone—so essential to a certain kind of music (and to Pink Floyd cover bands) that guitar pedal designer Robert Keeley has built an entire “workstation” around the guitar sounds on the album, even though most players, including Gilmour, will tell you that tone lives in the fingers.
The album is a perfect synthesis of the band’s strengths: epic songwriting meets epic experimentation meets epic musicianship—three musical directions that don’t always play well together. The late sixties and seventies brought increasing complexity and theatricality to rock and roll, but Pink Floyd did something extraordinary with Dark Side. They wrote accessible, riff-heavy, blues-based tunes that also set the bar for philosophically existential, wistful, melancholy, sardonic, funky, soulful psychedelia, without sacrificing one quality for another.
How the band went from cultivating a cult underground to spending 741 weeks—more than 14 years—on Billboard’s albums chart after the release of their “high concept lyrical masterpiece” in 1973 is the subject of a series of eight videos produced by Polyphonic. See the first, which covers “Speak to Me/Breathe,” at the top, and others below. New videos will be released on the Polyphonic YouTube channel soon.
The approach is an admirable one. Too often the greatness of classic albums like Dark Side of the Moon is taken for granted and glossed over too quickly. The album’s massive commercial and critical success seems proof enough. We may not know much about Pink Floyd ourselves, but we trust that they’ve been thoroughly vetted by the experts.
But if we want to know ourselves why critics, musicians, and fans alike have heaped so much praise on the 1973 album—and shelled out hard-earned cash by the millions for records, concerts, and merchandise—we might learn quite a lot from this series.
There may be no instrument in the classical repertoire more multidimensional than the cello. Its deep silky voice modulates from moans to exaltations in a single phrase—conveying dignified melancholy and a profound sense of awe. Hearing a skilled cellist interpret great solo music for cello can approach the feeling of a religious experience. And no piece of solo cello music is greater, or more popularly known, than the Prelude to Johann Sebastian Bach’s Cello Suite No. 1 in G Major. The opening movement of the first of six Baroque suites Bach composed between 1717 and 1723, the piece has appeared, notes the Vox Earworm video above, “in hundreds of TV shows and films.”
You’ve heard it at weddings and funerals, in restaurants, in the lobbies of hotels. It’s so famous that if you don’t remember its title, “you can just google ‘that famous cello song’ and it will invariably pop up.” What is it about this piece that so appeals? Its constant, rhythmic movement conceals “what’s most compelling about it”—its simplicity. “The whole thing just takes up two pages of music, and it’s composed for an instrument with only four strings.” The Earworm video goes on to explain why this enormously popular, deceptively simple piece is “considered a masterpiece that world-class cellists… have revered for years.”
Bach’s cello suites “are the Everest of [the cello’s] repertory,” writes Zachary Woolfe at The New York Times, “offering a guide to nearly everything a cello can do—as well as, many believe, charting a remarkably complete anatomy of emotion and aspiration.” World-class cellist Yo-Yo Ma has in fact been traveling the world playing these pieces to bring people together in his “Days of Action.” He recently released the video below of the Prelude, demonstrating the outcome of a lifetime of engagement with Bach’s cello music.
Ma plays this piece as “the musician of our civic life,” writes Woolfe, appearing at collective moments of both grief and celebration, “to make us cry and then soothe us.” What we learn in the Vox video is that the cello suites come from music designed to literally move its listeners. “Within each suite are various movements named for dances.” Cellist Alisa Weilerstein demonstrates the Prelude’s beautiful simplicity and helps “deconstruct” the piece’s ideal suitability for the instrument closest “in range and ability to express” to the human voice.
What’s interesting about Bach’s six cello suites is that they were written by “the first non-cellist composer to give the cello its first big break as a lead actor,” writes musicologist Ann Wittstruck. He drew on Baroque social dances for the form of the pieces, which increase in complexity as they go. The Prelude is looser, with arpeggios circling around an open bass note that gives the first half “gravitas.”
As the piece shifts to the dominant, D major, then to “cloudy” diminished and minor chords, its mood shifts too; within its simple harmonies plays a complex of emotional tensions. Its second half wanders through an improvisatory, dissonant passage on its way back to D major. Weilerstein walks through each technique, including a disorienting run down the cello’s neck called “bariolage,” which, she says, is meant to create a “feeling of disorder.”
Perhaps that’s only one of the reasons Bach’s Prelude resonates with us so deeply in a fragmented world, and fits Ma’s harmonious intentions so well. It’s a piece that acknowledges dissonance and disorder even as it surrounds them with the joyful, stylized movements of social dances. Music critic Wilfrid Mellers described Bach’s cello suites as “monophonic music wherein a man has created a dance of God.” But Bach’s contemporaries did not greet them with such high praise.
Composed “just before Bach moved to Leipzig,” Woolfe writes, “the cello suites, now musical and emotional touchstones, were little known until the 1900s. It was thought, even by some who knew of them, that they were merely études, nothing you’d want to perform in public.” Now, the most famous cellist—and perhaps the most famous classical icon—in the world is traveling to six continents, playing Bach’s cello suites in 36 very public concerts. Learn more about his Bach project here.
Walter Murch, perhaps the most famed film editor alive, is acclaimed for the work he’s done for directors like Francis Ford Coppola, George Lucas, and Anthony Minghella. As innovative and influential as his ways for putting images together have been, Murch has done just as much for cinema as a sound designer. In the video above Evan Puschak, better known as the Nerdwriter, examines Murch’s soundcraft through what Murch calls “worldizing,” which Filmsound.org describes as “manipulating sound until it seemed to be something that existed in real space.” This involves “playing back existing recordings through a speaker or speakers in real-world acoustic situations,” recording it, and using that recording on the film’s soundtrack.
In other words, Murch pioneered the technique of not just inserting music into a movie in the editing room, but re-recording that music in the actual spaces in which the characters hear it. Mixing the original, “clean” recording of a song with that song as re-recorded in the movie’s space — a dance hall, an outdoor wedding, a dystopian underground warren — has given Murch a greater degree of control over the viewer’s listening experience. In some shots he could let the viewer hear more of the song itself by prioritizing the original song; in others he could prioritize the re-recorded song and let the viewer hear the song as the characters do, with all the sonic characteristics contributed by the space — or, if you like, the world — around them.
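The blend described above maps neatly onto a simple digital approximation. The sketch below (Python, using NumPy and SciPy) stands in for the physical process: rather than re-recording a song through a speaker in a real room, it convolves the “clean” track with a room impulse response, then blends the dry and “worldized” signals with a per-shot balance control. The file names, the impulse response, and the balance values here are hypothetical; this is only an illustration of the dry/wet mixing idea, not a reconstruction of Murch’s analog workflow.

```python
# Illustrative sketch only: convolution reverb as a digital stand-in for
# the physical "worldizing" described above. File names and parameters
# are hypothetical.
import numpy as np
from scipy.io import wavfile
from scipy.signal import fftconvolve

def to_mono(signal):
    """Collapse a stereo WAV array to mono so the 1-D convolution below applies."""
    return signal.mean(axis=1) if signal.ndim > 1 else signal

def worldize(dry, impulse_response):
    """Approximate a re-recorded track by convolving the dry signal with a
    room impulse response, then matching its peak level to the dry track."""
    wet = fftconvolve(dry, impulse_response)[: len(dry)]
    return wet * (np.max(np.abs(dry)) / np.max(np.abs(wet)))

def blend(dry, wet, balance):
    """balance = 0.0 keeps only the clean song; 1.0 keeps only the
    room-colored version the characters would hear in the scene."""
    return (1.0 - balance) * dry + balance * wet

if __name__ == "__main__":
    rate, song = wavfile.read("song_dry.wav")        # hypothetical dry recording
    _, room_ir = wavfile.read("dance_hall_ir.wav")   # hypothetical impulse response
    song = to_mono(song.astype(np.float32))
    room_ir = to_mono(room_ir.astype(np.float32))

    wet = worldize(song, room_ir)
    # Foreground the music in one shot; push it back into the room in another.
    shot_a = blend(song, wet, balance=0.2)
    shot_b = blend(song, wet, balance=0.9)
    wavfile.write("shot_a.wav", rate, shot_a.astype(np.float32))
    wavfile.write("shot_b.wav", rate, shot_b.astype(np.float32))
```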
Puschak uses examples of Murch’s worldizing from American Graffiti and The Godfather, and notes that he first used it in Lucas’ debut feature THX 1138. But he also discovered an earlier attempt by Orson Welles to accomplish the same effect in Touch of Evil, a film Murch re-edited in 1998. What Welles had not done, says Murch in an interview with Film Quarterly, “was combine the original recording and the atmospheric recording. He simply positioned a microphone, static in an alleyway outside Universal Sound Studios, re-recording from a speaker to the microphone through the alleyway. He didn’t have control over the balance of dry sound versus reflected sound, and he didn’t have the sense of motion that we got from moving the speaker and moving the microphone relative to one another.”
Doing this, Murch says, “creates the sonic equivalent of depth of field in photography. We can still have the music in the background, but because it’s so diffuse, you can’t find edges to focus on and, therefore, focus on the dialogue which is in the foreground.” In all earlier films besides Welles’, “music was just filtered and played low, but it still had its edges,” making it hard to separate from the dialogue. These days, as Puschak points out, anyone with the right sound-editing software can perform these manipulations with the click of a mouse. No such ease in the 1970s, when Murch had to not only execute these thoroughly analog, labor-intensive processes, but also invent them in the first place. As anyone who’s looked and listened closely to his work knows, that audiovisual struggle made Murch experience and work with cinema in a richly physical way — one that, as generations of editors and sound designers come up in wholly digital environments, may not exist much longer.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
Only recently has “actor” become an acceptable gender-neutral term for performers of stage and screen.
Prior to that, we had “actor” and “actress,” and while there may have been some problematic assumptions concerning the type of woman who might be drawn to the profession, there was arguably linguistic parity between the two words.
Not so for artists.
In the not-so-distant past, female artists invariably found themselves referred to as “female artists.”
Not great, when male artists were referred to as (say it with me) “artists.”
The new season of the Getty’s podcast Recording Artists pays tribute to six significant post-war artists—two Abstract Expressionists, a portraitist, a performance artist and experimental musician, and a printmaker who progressed to assemblage and collage works with an overtly social message.
Hopefully you won’t need to reach for your smelling salts upon discovering that all six artists are female.
Host Helen Molesworth, who is joined by two art world guests per episode—some of them (gasp!) non-female—is the perfect choice to consider the impact of the Radical Women who give this season its subtitle.
We also hear from the artists themselves, in excerpts from interviews taped in the ’60s and ’70s with historians Cindy Nemser and Barbara Rose.
Their candid remarks give Molesworth and her guests a lot to consider, from the difficulties of maintaining a consistent artistic practice after one becomes a mother to racial discrimination. A lot of attention is paid to historical context, warts and all.
The late Alice Neel, a white artist best remembered for her portraits of her black and brown East Harlem neighbors and friends, cracks wise about butch lesbians in Greenwich Village, prompting Molesworth to remark that she thinks she—or any artist of her acquaintance—could have “easily” swayed Neel to can the homophobic remarks.
It’s also possible that Neel, who died in 1984, would have kept step with the times and made the necessary correction unprompted, were she still with us today.
A couple of the subjects, Yoko Ono and Betye Saar, are alive and actively creating art, though it’s their past work that seems to be the source of greatest fascination.
This is a far cry from New York Times critic Hilton Kramer’s dismissal of Neel’s 1974 retrospective at the Whitney, when the artist was 74 years old:
… the Whitney, which can usually be counted on to do the wrong thing, devoted a solo exhibition to Alice Neel, whose paintings (we can be reasonably certain) would never have been accorded that honor had they been produced by a man. The politics of the situation required that a woman be given an exhibition, and Alice Neel’s painting was no doubt judged to be sufficiently bizarre, not to say inept, to qualify as something ‘far out.’
Twenty-six years later, his opinion of Neel’s talent had not mellowed, though he had the political sense to dial down the misogyny in his scathing Observer review of Neel’s third show at the Whitney… or did he? In citing curator Ann Temkin’s observation that Neel painted “with the eye of a caricaturist,” he makes sure to note that Neel’s subject Annie Sprinkle, “the porn star who became a performance artist, is herself a caricature, no mockery was needed.”
One has to wonder if he would have described the artist’s nude self-portrait at the age of 80 as that of “a geriatric ruin” had the artist been a man.
And while neither Saar nor Ono added any current commentary to the podcast, we encourage you to check out the interviews below in which they discuss their recent work in addition to reflecting on their long artistic careers: