The above video from Playing for Change imagines a world where people from all four corners of the earth play and sing a song together, and makes it real through the power of technology and interconnectivity.
It started in 2005, when Mark Johnson heard street musician Roger Ridley singing Ben E. King’s “Stand By Me” in Santa Monica. Struck by Ridley’s emotive voice, he returned with recording equipment and set about bringing musicians from around the world in on the song. Johnson recorded Grandpa Elliott in New Orleans sharing a verse, Washboard Chaz providing washboard rhythm, then Clarence Bekker in Amsterdam taking a verse, the Twin Eagle Drum Group providing a Native American rhythm, and so on. By the end of the video, Johnson had racked up plenty of frequent flier miles and stitched together a cohesive track.
One takeaway is this: the world agrees on Bob Marley. Whether he’s being political or spiritual, everybody seems to get it. Here’s “War” featuring Bono. Also see “Redemption Song” here:
Other stars have done guest spots to bring awareness to the project. Bunny Wailer, Manu Chao and Bushman singing “Soul Rebel”:
Most recently, they recorded “The Weight” with Robbie Robertson and Ringo Starr:
And we always enjoy this version of the Dead’s “Ripple.”
The videos are heartwarming, but the music stands by itself without the globetrotting. For those who need a good vibe injection to kick off 2020, start here.
Ted Mills is a freelance writer on the arts who currently hosts the artist interview-based FunkZone Podcast and is the producer of KCRW’s Curious Coast. You can also follow him on Twitter at @tedmills, read his other arts writing at tedmills.com and/or watch his films here.
Depending on how you feel about cats, the feline situation on the island of Cyprus is either the stuff of a delightful children’s story or a horror film to be avoided at all cost.
Though the island is surrounded on all sides by water, its cat population—an estimated 1.5 million—currently outnumbers its human residents. The overwhelming majority are feral, though as we learn in the above episode of PBS’ EONS, they, too, can be considered domesticated. Like the other 600,000,000-some living members of Felis catus on planet Earth—which is to say the type of beast we associate with litterboxes, laser pointers, and Tender Vittles—they are descended from a single subspecies of African wildcat, Felis silvestris lybica.
While there’s no single narrative explaining how cats came to dominate Cyprus, the story of their global domestication is not an uncommon one:
An ancient efficiency expert realized that herding cats was a much better use of time than hunting them, and the idea quickly spread to neighboring communities.
Kidding. There’s no such thing as herding cats (though there is a Chicago-based cat circus, whose founder motivates her skateboard-riding, barrel-rolling, high-wire-walking stars with positive reinforcement…)
Instead, cats took a commensal path to domestication, lured by their bellies and celebrated curiosity.
Ol’ Felis (Felix!) Silvestris (Sufferin’ Succotash!) Lybica couldn’t help noticing how human settlements boasted generous supplies of food, including large numbers of tasty mice and other rodents attracted by the grain stores.
Her inadvertent human hosts grew to value her pest control capabilities, and cultivated the relationship… or at the very least, refrained from devouring every cat that wandered into camp.
Eventually, things got to the point where one 5,600-year-old specimen from northwestern China was found to have died with more millet than mouse meat in its system—a pet in all but name.
Chow chow chow.
Interestingly, while today’s house cats’ gene pool leads back to that one sub-species of wild mackerel-tabby, it’s impossible to isolate domestication to a single time and place.
Archaeological evidence and genome analysis both support the idea that cats were domesticated 10,000 years ago in Southwest Asia… and then again in Egypt some 6,500 years later.
At some point, a human and a cat traveled together to Cyprus, and the rest is history, an Internet sensation, and an if-you-can’t-beat-’em-join-’em tourist attraction.
Such high-end island hotels as Pissouri’s Columbia Beach Resort and TUI Sensatori Resort Atlantica Aphrodite Hills in Paphos have started catering to the ever-swelling numbers of uninvited, four-legged locals with a robust regimen of healthcare, shelter, and food, served in feline-specific tavernas.
An island charity known as Cat P.A.W.S. (Protecting Animals Without Shelter) appeals to visitors for donations to defray the cost of neutering the massive feral population.
Monty Python’s surreal, slapstick parodies of history, religion, medicine, philosophy, and law depended on a competent grasp of these subjects, and most of the troupe’s members, four of whom met at Oxford and Cambridge, went on to demonstrate their scholarly acumen outside of comedy, with books, guest lectures, professorships, and serious television shows.
Michael Palin even became president of the Royal Geographical Society for a few years. And Palin’s onetime Oxford pal and early writing partner Terry Jones—who passed away at 77 on January 21 after a long struggle with degenerative aphasia—didn’t do so badly for himself either, becoming a respected scholar of medieval history and an authoritative popular writer on dozens of other subjects.
Indeed, as the Pythons did throughout their academic and comedic careers, Jones combined his interests as often as he could, either bringing historical knowledge to absurdist comedy or bringing humor to the study of history. Jones wrote and directed the pseudo-historical spoofs Monty Python and the Holy Grail and Life of Brian, and in 2004 he won an Emmy for his television program Terry Jones’ Medieval Lives, an entertaining, informative series that incorporates sketch comedy-style reenactments and Terry Gilliam-like animations.
In the program, Jones debunks popular ideas about several stock medieval European characters familiar to us all, while he visits historical sites and sits down to chat with experts. These characters include The Peasant, The Damsel, The Minstrel, The Monk, and The Knight. The series became a popular book in 2007, itself the culmination of decades of work. Jones’ first book, Chaucer’s Knight: The Portrait of a Medieval Mercenary, came out in 1980. There, notes Matthew Rozsa at Salon:
[Jones] argued that the concept of Geoffrey Chaucer’s knight as the epitome of Christian chivalry ignored an uglier truth: That the Knight was a mercenary who worked for authoritarians that brutally oppressed ordinary people (an argument not dissimilar to the scene in which a peasant argues for democracy in The Holy Grail).
In 2003, Jones collaborated with several historians on Who Murdered Chaucer?, a speculative study of the period in which many of the figures he later surveyed in his show and book emerged as distinctive types. As in his work with Monty Python, he didn’t confine his contrarianism to medieval history: he also called the Renaissance “overrated” and “conservative,” and in his 2006 BBC One series Terry Jones’ Barbarians, he described the period we think of as the fall of Rome in positive terms, calling the city’s so-called “Sack” in 410 an invention of propaganda.
Jones’ work as a popular historian, political writer, and comedian “is not the full extent of [his] oeuvre,” writes Rozsa, “but it is enough to help us fathom the magnitude of the loss suffered on Tuesday night.” His legacy “was to try to make us more intelligent, more well-educated, more thoughtful. He also strove, of course, to make us have fun.” Python fans know this side of Jones well. Get to know him as a passionate interpreter of history in Terry Jones’ Medieval Lives, which you can watch on YouTube here.
At least since H.G. Wells’ 1895 novel The Time Machine, time travel has been a promising storytelling concept. Alas, it has seldom delivered on that promise: whether their characters jump forward into the future, backward into the past, or both, the past 125 years of time-travel stories have too often suffered from inelegance, inconsistency, and implausibility. Well, of course they’re implausible, everyone but Ronald Mallett might say — they’re stories about time travel. But fiction only has to work on its own terms, not reality’s. The trouble is that the fiction of time travel can all too easily stumble over the potentially infinite convolutions and paradoxes inherent in the subject matter.
In the MinutePhysics video above, Henry Reich sorts out how time-travel stories work (and fail to work) using nothing but markers and paper. For the time-travel enthusiast, the core interest of such fictions isn’t so much the spectacle of characters hurtling into the future or past but “the different ways time travel can influence causality, and thus the plot, within the universe of each story.” As an example of “100 percent realistic travel” Reich points to Orson Scott Card’s Ender’s Game, in which space travelers at light speed experience only days or months while years pass back on Earth. The same thing happens in Planet of the Apes, whose astronauts return from space thinking they’ve landed on the wrong planet when they’ve actually landed in the distant future.
But when we think of time travel per se, we more often think of stories about how actively traveling to the past, say, can change its future — and thus the story’s “present.” Reich poses two major questions to ask about such stories. The first is “whether or not the time traveler is there when history happens the first time around.” Was “the time-traveling version of you always there to begin with?” Or “does the very act of time traveling to the past change what happened and force the universe onto a different trajectory of history from the one you experienced prior to traveling?” The second question is “who has free will when somebody is time traveling” — that is, “whose actions are allowed to move history onto a different trajectory, and whose aren’t?”
We can all look into our own pasts for examples of how our favorite time-travel stories have dealt with those questions. Reich cites such well-known time-travelers’ tales as A Christmas Carol, Groundhog Day, and Bill & Ted’s Excellent Adventure, as well, of course, as Back to the Future, the most popular dramatization of the theoretical changing of historical timelines caused by travel into the past. Rian Johnson’s Looper treats that phenomenon more complexly, allowing for more free will and taking into account more of the effects a character in one time period would have on that same character in another. Consulting on that film was Shane Carruth, whose Primer — my own personal favorite time-travel fiction — had already taken time travel “to the extreme, with time travel within time travel within time travel.”
Harry Potter and the Prisoner of Azkaban, Reich’s personal favorite time-travel fiction, exhibits a clarity and consistency uncommon in the genre. J.K. Rowling accomplishes this by following the rule that “while you’re experiencing your initial pre-time travel passage through a particular point in history, your time-traveling clone is also already there, doing everything you’ll eventually do when you time-travel yourself.” This single-time-line version of time travel, in which “you can’t change the past because the past already happened,” gets around problems that have long bedeviled other time-travel fictions. But it also demonstrates the importance of self-consistency in fiction of all kinds: “In order to care about the characters in a story,” Reich says, “we have to believe that actions have consequences.” Stories, in other words, must obey their own rules — even, and perhaps especially, stories involving time-traveling child wizards.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
For every Nets fan cheering their team on in Brooklyn’s Barclays Center and every tourist gamboling about the post-punk, upscale East Village, there are dozens of local residents who remember what—and who—was displaced to pave the way for this progress.
It’s no great leap to assume that something had to be plowed under to make way for the city’s myriad gleaming skyscrapers, but it’s harder to conceive of Central Park, the 840-acre oasis in the middle of Manhattan, as a symbol of ruthless gentrification.
Plans for a peaceful green expanse to rival the great parks of Great Britain and Europe began taking shape in the 1850s, driven by well-to-do white merchants, bankers, and landowners looking for temporary escape from the urban pressures of densely populated Lower Manhattan.
It took 20,000 workers—none black, none female—over three years to realize architects Frederick Law Olmsted and Calvert Vaux’s sweeping pastoral design.
A hundred and fifty years later, Central Park is still a vital part of daily life for visitors and residents alike.
But what of the vibrant neighborhoods that were doomed by the park’s construction?
The best established of these was Seneca Village, which ran from approximately 82nd to 89th Street along what is known today as Central Park West. Some 260 residents were evicted under eminent domain, and their homes, churches, and school were razed.
This physical erasure quickly translated to mass public amnesia, abetted, no doubt, by the way Seneca Village was framed in the press, not as a community of predominantly African-American middle class and working class homeowners, but rather a squalid shantytown inhabited by squatters.
As Brent Staples recalls in a New York Times op-ed, in the summer of 1871, when park workers dislodged two coffins in the vicinity of the West 85th Street entrance, The New York Herald treated the discovery as a baffling mystery, despite the presence of an engraved plate on one of the coffins identifying its occupant, an Irish teenager, who’d been a parishioner of Seneca Village’s All Angels Episcopal Church.
Copeland and her colleagues kept Alexander’s work in mind when they began excavating Seneca Village in 2011, focusing on the households of two African-American residents, Nancy Moore and William G. Wilson, a father of eight who served as sexton at All Angels and lived in a three-story wood-frame house. The dig yielded 250 bags of material, including a piece of a bone-handled toothbrush, an iron tea kettle, and fragments of clay pipes and blue-and-white Chinese porcelain:
Archaeologists have begun to consider the lives of middle class African Americans, focusing on the ways their consumption of material culture expressed class and racial identities. Historian Leslie Alexander believes that Seneca Village not only provided a respite from discrimination in the city, but also embodied ideas about African pride and racial consciousness.
Owning a home in Seneca Village also bestowed voting rights on African American male heads of household.
Two years before it was torn down, the community was home to 20 percent of the city’s African American property owners and 15 percent of its African American voters.
Thanks to the efforts of historians like Copeland and Alexander, Seneca Village is once again on the public’s radar, though the origin of its name remains mysterious, unlike that of Pigtown, a smaller, predominantly agricultural community toward the southern end of the park.
Was the village named in tribute to the Seneca people of Western New York or might it, as Alexander suggests, have been a nod to the Roman philosopher, whose thoughts on individual liberty would have been taught as part of Seneca Village’s African Free Schools’ curriculum?
At a time when much of animation was consumed with little anthropomorphized animals sporting white gloves, Oskar Fischinger went in a completely different direction. His work is all about dancing geometric shapes and abstract forms spinning around a flat, featureless background. Think of a Mondrian or Malevich painting that moves, often in time to the music. Fischinger’s movies have a mesmerizing elegance to them. Check out his 1938 short An Optical Poem above. Circles pop, sway and dart across the screen, all in time to Franz Liszt’s 2nd Hungarian Rhapsody. This is, of course, well before the days of digital. While it might be relatively simple to manipulate a shape in a computer, Fischinger’s technique was decidedly more low-tech. Using bits of paper and fishing line, he individually photographed each frame, somehow doing it all in sync with Liszt’s composition. Think of the hours of mind-numbing work this must have entailed.
Born in 1900 near Frankfurt, Fischinger trained as a musician and an architect before discovering film. In the 1930s, he moved to Berlin and started producing more and more abstract animations that ran before feature films. They proved to be popular too, at least until the National Socialists came to power. The Nazis were some of the most fanatical art critics of the 20th century, and they hated anything non-representational. The likes of Paul Klee, Oskar Kokoschka and Wassily Kandinsky were written off as “degenerate.” (By stark contrast, the CIA reportedly loved Abstract Expressionism, but that’s a different story.) Fischinger fled Germany in 1936 for the sun and glamour of Hollywood.
The problem was that Hollywood was really not ready for Fischinger. Producers saw the obvious talent in his work, but they feared it was too far ahead of its time for broad audiences. “[Fischinger] was going in a completely different direction than any other animator at the time,” said famed graphic designer Chip Kidd in an interview with NPR. “He was really exploring abstract patterns, but with a purpose to them — pioneering what technically is the music video.”
Fischinger’s most widely seen American work was the section in Walt Disney’s Fantasia set to Bach’s Toccata and Fugue in D Minor. Disney turned his geometric forms into mountain peaks and violin bows. Fischinger was apoplectic. “The film is not really my work,” Fischinger later reflected. “Rather, it is the most inartistic product of a factory. …One thing I definitely found out: that no true work of art can be made with that procedure used in the Disney studio.” Fischinger didn’t work with Disney again and instead retreated into the art world.
There he found admirers who were receptive to his vision. John Cage, for one, considered the German animator’s experiments a major influence on his own work. Cage recalled his first meeting with Fischinger in a 1968 interview with Daniel Charles:
One day I was introduced to Oscar Fischinger who made abstract films quite precisely articulated on pieces of traditional music. When I was introduced to him, he began to talk with me about the spirit, which is inside each of the objects of this world. So, he told me, all we need to do to liberate that spirit is to brush past the object, and to draw forth its sound. That’s the idea which led me to percussion.
Jonathan Crow is a Los Angeles-based writer and filmmaker whose work has appeared in Yahoo!, The Hollywood Reporter, and other publications. You can follow him at @jonccrow. And check out his blog Veeptopus, featuring one new drawing of a vice president with an octopus on his head daily. The Veeptopus store is here.
For as long as humanity has had music, we’ve also had bad music. And bad music can come from only one source: bad musicians. Despite such personal technologies of relatively recent invention as noise-canceling headphones, bad music remains nigh unavoidable in the modern world, issuing as it constantly does from the sound systems installed in grocery stores, gyms, passing automobiles, and so on. And against the bad musicians responsible we have less recourse than ever, or at least less than medieval Europeans did, as shown by the Ripley’s Believe It or Not video above on the “shame flute,” a non-musical instrument used to punish crimes against the art.
“The contraption, which is essentially a heavy iron flute – although you probably wouldn’t want to play it – was shackled to the musician’s neck,” writes Maddy Shaw Roberts at Classic FM. “The musician’s fingers were then clamped to the keys, to give the impression they were playing the instrument. Finally, just to further their humiliation, they were forced to wear the flute while being paraded around town, so the public could throw rotten food and vegetables at them.” Surely the mere prospect of such a fate made many music-minded children of the olden days think twice about slacking on their practice sessions.
The sight of this flute of shame, which you can take in at either the equally stimulating-sounding Medieval Crime Museum in Rothenburg or the Torture Museum in Amsterdam, would get any of us moderns thinking about which musicians of our own day deserve to be shackled to it. The Guardian’s Dave Simpson suggests, among others, “all bands with silly names,” “any musician called Sir who is over 60,” and “anyone who has ever appeared on The X Factor, ever.” In this day and age they would all probably complain of cruel and unusual punishment, but as music-related torture devices go, the shame flute certainly seems preferable to ancient Greece’s “brazen bull.”
Though still a little-known historical artifact, the shame flute has regained some cultural currency in recent years. It even inspired the name of a Finnish rock group, Flute of Shame. As the band members put it in an interview with Vice’s Josh Schneider, “We were having a night out in Amsterdam and found ourselves in a torture museum whilst looking for the Banana Bar,” a well-known spot in the city’s red-light district. “We saw the device and the rest is history.” Of course, any rock group that names itself after a torture device will draw comparisons to Iron Maiden, and journalistic diligence compels Schneider to ask Flute of Shame which band would win in a shredding contest. “Probably Iron Maiden,” the Finns respond, “but are they happy?”
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
What’s the deal with images of powerful women in media? The trope of the tough-as-nails boss-lady who may or may not have a heart of gold has evolved a lot over the years, but it’s difficult to portray such a character unobjectionably, probably due to those all-too-familiar double standards about wanting women in authority (or, say, running for office) to be assertive but not astringent.
Margaret Colin was the female lead in major films including Independence Day and The Devil’s Own, is a mainstay on Broadway, and has appeared on TV in many roles, including the mother of the Gossip Girl and an unscrupulous newscaster on the final seasons of VEEP. Her height and voice have made her a good fit for dominant-lady roles, and she leads Mark, Erica, and Brian on a quick, instructive tour of her work: with male directors (e.g. in a pre-Murphy Brown Diane English sitcom), playing the lead in three Lifetime Network movies, on Broadway as Jackie, and opposite Harrison Ford, Al Pacino, Melanie Griffith, Michael Shannon, Wallace Shawn, and others.
Given the limitations of short-form storytelling in film, maybe some use of stereotypes is just necessary to get the gist of a character out quickly, but actors can load their performances with unseen backstory. We hear about the actor’s role in establishing a character vs. the vision of the filmmakers or show-runners, as well as the relative conservatism of film vs. stage vs. TV in granting women creative control, the “feminine voice,” why women apparently always have to trip in movies when chased, and more.
A few resources to get you thinking about this topic: