The above video from Playing for Change imagines a world where people from all four corners of the earth play and sing a song together, and makes it real through the power of technology and interconnectivity.
It started in 2005, when Mark Johnson heard street musician Roger Ridley singing Ben E. King’s “Stand By Me” in Santa Monica. Struck by Ridley’s emotive voice, he returned with recording equipment and set about bringing the rest of the world into the song. Johnson recorded Grandpa Elliott in New Orleans sharing a verse, Washboard Chaz providing washboard rhythm, then Clarence Bekker in Amsterdam taking a verse, the Twin Eagle Drum Group adding a Native American rhythm, and so on. By the end of the video, Johnson had racked up frequent flier miles and stitched together a cohesive track.
One takeaway is this: the world agrees on Bob Marley. Whether he’s being political or spiritual, everybody seems to get it. Here’s “War” featuring Bono. Also see “Redemption Song” here:
Other stars have done guest spots to bring awareness to the project. Bunny Wailer, Manu Chao and Bushman singing “Soul Rebel”:
Most recently, they recorded “The Weight” with Robbie Robertson and Ringo Starr:
And we always enjoy this version of the Dead’s “Ripple.”
The videos are heartwarming, but the music stands by itself without the globetrotting. For those who need a good vibe injection to start 2020, start here.
Ted Mills is a freelance writer on the arts who currently hosts the artist interview-based FunkZone Podcast and is the producer of KCRW’s Curious Coast. You can also follow him on Twitter at @tedmills, read his other arts writing at tedmills.com and/or watch his films here.
Depending on how you feel about cats, the feline situation on the island of Cyprus is either the stuff of a delightful children’s story or a horror film to be avoided at all cost.
Though the island is surrounded on all sides by water, its cat population—an estimated 1.5 million—currently outnumbers its human residents. The overwhelming majority are feral, though as we learn in the above episode of PBS’ EONS, they, too, can be considered domesticated. Like the other 600,000,000-some living members of Felis catus on planet Earth—which is to say the type of beast we associate with litterboxes, laser pointers, and Tender Vittles—they are descended from a single subspecies of African wildcat, Felis silvestris lybica.
While there’s no single narrative explaining how cats came to dominate Cyprus, the story of their global domestication is not an uncommon one:
An ancient efficiency expert realized that herding cats was a much better use of time than hunting them, and the idea quickly spread to neighboring communities.
Kidding. There’s no such thing as herding cats (though there is a Chicago-based cat circus, whose founder motivates her skateboard-riding, barrel-rolling, high-wire-walking stars with positive reinforcement…)
Instead, cats took a commensal path to domestication, lured by their bellies and celebrated curiosity.
Ol’ Felis (Felix!) Silvestris (Sufferin’ Succotash!) Lybica couldn’t help noticing how human settlements boasted generous supplies of food, including large numbers of tasty mice and other rodents attracted by the grain stores.
Her inadvertent human hosts grew to value her pest control capabilities, and cultivated the relationship… or at the very least, refrained from devouring every cat that wandered into camp.
Eventually, things got to the point where one 5600-year-old specimen from northwestern China was revealed to have died with more millet than mouse meat in its system—a pet in both name and popular sentiment.
Chow chow chow.
Interestingly, while today’s house cats’ gene pool leads back to that one sub-species of wild mackerel-tabby, it’s impossible to isolate domestication to a single time and place.
Both archeological evidence and genome analysis support the idea that cats were domesticated both 10,000 years ago in Southwest Asia… and then again in Egypt 6500 years later.
At some point, a human and a cat traveled together to Cyprus, and the rest is history, an Internet sensation, and an “if you can’t beat ’em, join ’em” tourist attraction.
Such high end island hotels as Pissouri’s Columbia Beach Resort and TUI Sensatori Resort Atlantica Aphrodite Hills in Paphos have started catering to the ever-swelling numbers of uninvited, four-legged locals with a robust regimen of healthcare, shelter, and food, served in feline-specific tavernas.
An island charity known as Cat P.A.W.S. (Protecting Animals Without Shelter) appeals to visitors for donations to defray the cost of neutering the massive feral population.
Monty Python’s surreal, slapstick parodies of history, religion, medicine, philosophy, and law depended on a competent grasp of these subjects, and most of the troupe’s members, five of whom met at Oxford or Cambridge, went on to demonstrate their scholarly acumen outside of comedy, with books, guest lectures, professorships, and serious television shows.
Michael Palin even became president of the Royal Geographical Society for a few years. And Palin’s onetime Oxford pal and early writing partner Terry Jones—who passed away at 77 on January 21 after a long struggle with degenerative aphasia—didn’t do so badly for himself either, becoming a respected scholar of Medieval history and an authoritative popular writer on dozens of other subjects.
Indeed, as the Pythons did throughout their academic and comedic careers, Jones combined his interests as often as he could, either bringing historical knowledge to absurdist comedy or bringing humor to the study of history. Jones wrote and directed the pseudo-historical spoofs Monty Python and the Holy Grail and Life of Brian, and in 2004 he won an Emmy for his television program Terry Jones’ Medieval Lives, an entertaining, informative series that incorporates sketch comedy-style reenactments and Terry Gilliam-like animations.
In the program, Jones debunks popular ideas about several stock medieval European characters familiar to us all, while he visits historical sites and sits down to chat with experts. These characters include The Peasant, The Damsel, The Minstrel, The Monk, and The Knight. The series became a popular book in 2007, itself a culmination of decades of work. Jones’ first book, Chaucer’s Knight: The Portrait of a Medieval Mercenary, came out in 1980. There, notes Matthew Rozsa at Salon:
[Jones] argued that the concept of Geoffrey Chaucer’s knight as the epitome of Christian chivalry ignored an uglier truth: That the Knight was a mercenary who worked for authoritarians that brutally oppressed ordinary people (an argument not dissimilar to the scene in which a peasant argues for democracy in The Holy Grail).
In 2003, Jones collaborated with several historians on Who Murdered Chaucer?, a speculative study of the period in which many of the figures he later surveyed in his show and book emerged as distinctive types. As in his work with Monty Python, he didn’t apply his contrarianism only to medieval history. He also called the Renaissance “overrated” and “conservative,” and in his 2006 BBC One series Terry Jones’ Barbarians, he described the period we think of as the fall of Rome in positive terms, calling the city’s so-called “Sack” in 410 an invention of propaganda.
Jones’ work as a popular historian, political writer, and comedian “is not the full extent of [his] oeuvre,” writes Rozsa, “but it is enough to help us fathom the magnitude of the loss suffered on Tuesday night.” His legacy “was to try to make us more intelligent, more well-educated, more thoughtful. He also strove, of course, to make us have fun.” Python fans know this side of Jones well. Get to know him as a passionate interpreter of history in Terry Jones’ Medieval Lives, which you can watch on YouTube here.
At least since H.G. Wells’ 1895 novel The Time Machine, time travel has been a promising storytelling concept. Alas, it has seldom delivered on that promise: whether their characters jump forward into the future, backward into the past, or both, the past 125 years of time-travel stories have too often suffered from inelegance, inconsistency, and implausibility. Well, of course they’re implausible, everyone but Ronald Mallett might say — they’re stories about time travel. But fiction only has to work on its own terms, not reality’s. The trouble is that the fiction of time travel can all too easily stumble over the potentially infinite convolutions and paradoxes inherent in the subject matter.
In the MinutePhysics video above, Henry Reich sorts out how time-travel stories work (and fail to work) using nothing but markers and paper. For the time-travel enthusiast, the core interest of such fictions isn’t so much the spectacle of characters hurtling into the future or past but “the different ways time travel can influence causality, and thus the plot, within the universe of each story.” As an example of “100 percent realistic time travel,” Reich points to Orson Scott Card’s Ender’s Game, in which space travelers at light speed experience only days or months while years pass back on Earth. The same thing happens in Planet of the Apes, whose astronauts return from space thinking they’ve landed on the wrong planet when they’ve actually landed in the distant future.
But when we think of time travel per se, we more often think of stories about how actively traveling to the past, say, can change its future — and thus the story’s “present.” Reich poses two major questions to ask about such stories. The first is “whether or not the time traveler is there when history happens the first time around.” Was “the time-traveling version of you always there to begin with?” Or “does the very act of time traveling to the past change what happened and force the universe onto a different trajectory of history from the one you experienced prior to traveling?” The second question is “who has free will when somebody is time traveling” — that is, “whose actions are allowed to move history onto a different trajectory, and whose aren’t?”
We can all look into our own pasts for examples of how our favorite time-travel stories have dealt with those questions. Reich cites such well-known time-travelers’ tales as A Christmas Carol, Groundhog Day, and Bill & Ted’s Excellent Adventure, as well, of course, as Back to the Future, the most popular dramatization of the theoretical changing of historical timelines caused by travel into the past. Rian Johnson’s Looper treats that phenomenon more complexly, allowing for more free will and taking into account more of the effects a character in one time period would have on that same character in another. Consulting on that film was Shane Carruth, whose Primer — my own personal favorite time-travel fiction — had already taken time travel “to the extreme, with time travel within time travel within time travel.”
Harry Potter and the Prisoner of Azkaban, Reich’s personal favorite time-travel fiction, exhibits a clarity and consistency uncommon in the genre. J.K. Rowling accomplishes this by following the rule that “while you’re experiencing your initial pre-time travel passage through a particular point in history, your time-traveling clone is also already there, doing everything you’ll eventually do when you time-travel yourself.” This single-time-line version of time travel, in which “you can’t change the past because the past already happened,” gets around problems that have long bedeviled other time-travel fictions. But it also demonstrates the importance of self-consistency in fiction of all kinds: “In order to care about the characters in a story,” Reich says, “we have to believe that actions have consequences.” Stories, in other words, must obey their own rules — even, and perhaps especially, stories involving time-traveling child wizards.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
For every Nets fan cheering their team on in Brooklyn’s Barclays Center and every tourist gamboling about the post-punk, upscale East Village, there are dozens of local residents who remember what—and who—was displaced to pave the way for this progress.
It’s no great leap to assume that something had to be plowed under to make way for the city’s myriad gleaming skyscrapers, but harder to conceive of Central Park, the 840-acre oasis in the middle of Manhattan, as a symbol of ruthless gentrification.
Plans for a peaceful green expanse to rival the great parks of Great Britain and Europe began taking shape in the 1850s, driven by well-to-do white merchants, bankers, and landowners looking for temporary escape from the urban pressures of densely populated Lower Manhattan.
It took 20,000 workers—none black, none female—over three years to realize architects Frederick Law Olmsted and Calvert Vaux’s sweeping pastoral design.
A hundred and fifty years later, Central Park is still a vital part of daily life for visitors and residents alike.
But what of the vibrant neighborhoods that were doomed by the park’s construction?
The best established of these was Seneca Village, which ran from approximately 82nd to 89th Street, along what is known today as Central Park West. Some 260 residents were evicted under eminent domain, and their homes, churches, and school were razed.
This physical erasure quickly translated to mass public amnesia, abetted, no doubt, by the way Seneca Village was framed in the press, not as a community of predominantly African-American middle class and working class homeowners, but rather a squalid shantytown inhabited by squatters.
As Brent Staples recalls in a New York Times op-ed, in the summer of 1871, when park workers dislodged two coffins in the vicinity of the West 85th Street entrance, The New York Herald treated the discovery as a baffling mystery, despite the presence of an engraved plate on one of the coffins identifying its occupant, an Irish teenager, who’d been a parishioner of Seneca Village’s All Angels Episcopal Church.
Copeland and her colleagues kept Alexander’s work in mind when they began excavating Seneca Village in 2011, focusing on the households of two African-American residents, Nancy Moore and William G. Wilson, a father of eight who served as sexton at All Angels and lived in a three-story wood-frame house. The dig yielded 250 bags of material, including a piece of a bone-handled toothbrush, an iron tea kettle, and fragments of clay pipes and blue-and-white Chinese porcelain:
Archaeologists have begun to consider the lives of middle class African Americans, focusing on the ways their consumption of material culture expressed class and racial identities. Historian Leslie Alexander believes that Seneca Village not only provided a respite from discrimination in the city, but also embodied ideas about African pride and racial consciousness.
Owning a home in Seneca Village also bestowed voting rights on African American male heads of household.
Two years before it was torn down, the community was home to 20 percent of the city’s African American property owners and 15 percent of its African American voters.
Thanks to the efforts of historians like Copeland and Alexander, Seneca Village is once again on the public’s radar, though, unlike that of Pigtown—a smaller, predominantly agricultural community toward the southern end of the park—the origin of its name remains mysterious.
Was the village named in tribute to the Seneca people of Western New York or might it, as Alexander suggests, have been a nod to the Roman philosopher, whose thoughts on individual liberty would have been taught as part of Seneca Village’s African Free Schools’ curriculum?
At a time when much of animation was consumed with little anthropomorphized animals sporting white gloves, Oskar Fischinger went in a completely different direction. His work is all about dancing geometric shapes and abstract forms spinning around a flat, featureless background. Think of a Mondrian or Malevich painting that moves, often in time to music. Fischinger’s movies have a mesmerizing elegance to them. Check out his 1938 short An Optical Poem above. Circles pop, sway, and dart across the screen, all in time to Franz Liszt’s 2nd Hungarian Rhapsody. This was, of course, well before the days of digital. While it might be relatively simple to manipulate a shape in a computer, Fischinger’s technique was decidedly more low-tech. Using bits of paper and fishing line, he individually photographed each frame, somehow keeping it all in sync with Liszt’s composition. Think of the hours of mind-numbing work that must have entailed.
Born in 1900 near Frankfurt, Fischinger trained as a musician and an architect before discovering film. In the 1930s, he moved to Berlin and started producing increasingly abstract animations that ran before feature films. They proved popular, too, at least until the National Socialists came to power. The Nazis were some of the most fanatical art critics of the 20th century, and they hated anything non-representational. The likes of Paul Klee, Oskar Kokoschka, and Wassily Kandinsky were written off as “degenerate.” (By stark contrast, the CIA reportedly loved Abstract Expressionism, but that’s a different story.) Fischinger fled Germany in 1936 for the sun and glamour of Hollywood.
The problem was that Hollywood was really not ready for Fischinger. Producers saw the obvious talent in his work, but they feared it was too far ahead of its time for broad audiences. “[Fischinger] was going in a completely different direction than any other animator at the time,” said famed graphic designer Chip Kidd in an interview with NPR. “He was really exploring abstract patterns, but with a purpose to them — pioneering what technically is the music video.”
Fischinger’s most widely seen American work was the section in Walt Disney’s Fantasia set to Bach’s Toccata and Fugue in D Minor. Disney turned his geometric forms into mountain peaks and violin bows. Fischinger was apoplectic. “The film is not really my work,” Fischinger later reflected. “Rather, it is the most inartistic product of a factory. …One thing I definitely found out: that no true work of art can be made with that procedure used in the Disney studio.” Fischinger didn’t work with Disney again and instead retreated into the art world.
There he found admirers who were receptive to his vision. John Cage, for one, considered the German animator’s experiments a major influence on his own work. Cage recalled their first meeting in a 1968 interview with Daniel Charles.
One day I was introduced to Oscar Fischinger who made abstract films quite precisely articulated on pieces of traditional music. When I was introduced to him, he began to talk with me about the spirit, which is inside each of the objects of this world. So, he told me, all we need to do to liberate that spirit is to brush past the object, and to draw forth its sound. That’s the idea which led me to percussion.
Jonathan Crow is a Los Angeles-based writer and filmmaker whose work has appeared in Yahoo!, The Hollywood Reporter, and other publications. You can follow him at @jonccrow. And check out his blog Veeptopus, featuring one new drawing of a vice president with an octopus on his head daily. The Veeptopus store is here.
Since humanity has had music, we’ve also had bad music. And bad music can come from only one source: bad musicians. Despite such personal technologies of relatively recent invention as noise-canceling headphones, bad music remains nigh unavoidable in the modern world, issuing as it constantly does from the sound systems installed in grocery stores, gyms, passing automobiles, and so on. And against the bad musicians responsible we have less recourse than ever, or at least less than medieval Europeans did, as shown by the Ripley’s Believe It or Not video above on the “shame flute,” a non-musical instrument used to punish crimes against the art.
“The contraption, which is essentially a heavy iron flute – although you probably wouldn’t want to play it – was shackled to the musician’s neck,” writes Maddy Shaw Roberts at Classic FM. “The musician’s fingers were then clamped to the keys, to give the impression they were playing the instrument. Finally, just to further their humiliation, they were forced to wear the flute while being paraded around town, so the public could throw rotten food and vegetables at them.” Surely the mere prospect of such a fate made many music-minded children of the olden days think twice about slacking on their practice sessions.
The sight of this flute of shame, which you can take in at either the equally stimulating-sounding Medieval Crime Museum in Rothenburg or the Torture Museum in Amsterdam, would get any of us moderns thinking about which musicians of our own day deserve to be shackled to it. The Guardian’s Dave Simpson suggests, among others, “all bands with silly names,” “any musician called Sir who is over 60,” and “anyone who has ever appeared on The X Factor, ever.” In this day and age they would all probably complain of cruel and unusual punishment, but as music-related torture devices go, the shame flute certainly seems preferable to ancient Greece’s “brazen bull.”
Though still a little-known historical artifact, the shame flute has regained some cultural currency in recent years. It even inspired the name of a Finnish rock group, Flute of Shame. As the band members put it in an interview with Vice’s Josh Schneider, “We were having a night out in Amsterdam and found ourselves in a torture museum whilst looking for the Banana Bar,” a well-known spot in the city’s red-light district. “We saw the device and the rest is history.” Of course, any rock group that names itself after a torture device will draw comparisons to Iron Maiden, and journalistic diligence compels Schneider to ask Flute of Shame which band would win in a shredding contest. “Probably Iron Maiden,” the Finns respond, “but are they happy?”
What’s the deal with images of powerful women in media? The trope of the tough-as-nails boss-lady who may or may not have a heart of gold has evolved a lot over the years, but it’s difficult to portray such a character unobjectionably, probably due to those all-too-familiar double standards about wanting women in authority (or, say, running for office) to be assertive but not astringent.
Margaret was the female lead in major films including Independence Day and The Devil’s Own, is a mainstay on Broadway, and has appeared on TV in many roles, including the mother of the Gossip Girl and as an unscrupulous newscaster on the final seasons of VEEP. Her height and voice have made her a good fit for dominant-lady roles, and she leads Mark, Erica, and Brian through a quick, instructive tour of her work with male directors (e.g. in a pre-Murphy-Brown Diane English sitcom), playing the lead in three Lifetime Network movies, on Broadway as Jackie, and opposite Harrison Ford, Al Pacino, Melanie Griffith, Michael Shannon, Wallace Shawn, and others.
Given the limitations of short-form storytelling in film, maybe some use of stereotypes is just necessary to get the gist of a character out quickly, but actors can load their performances with unseen backstory. We hear about the actor’s role in establishing a character vs. the vision of the filmmakers or show-runners. Also, the relative conservatism of film vs. stage vs. TV in granting women creative control, the “feminine voice,” why women always apparently have to trip in movies when chased, and more.
A few resources to get you thinking about this topic:
In the early 19th century, Aristotle’s Meteorologica still guided scientific ideas about the climate. The model “sprang from the ancient Greek concept of klima,” as Ian Beacock writes at The Atlantic, a static scheme that “divided the hemispheres into three fixed climatic bands: polar cold, equatorial heat, and a zone of moderation in the middle.” It wasn’t until the 1850s that the study of climate developed into what historian Deborah Cohen describes as “dynamic climatology.”
Indeed, 120 years before Exxon Mobil learned about—and then seemingly covered up—global warming, pioneering researchers discovered the greenhouse effect, the tendency for a closed environment like our atmosphere to heat up when carbon dioxide levels rise. The first person on record to link CO2 and global warming, amateur scientist Eunice Newton Foote, presented her research to the Eighth Annual Meeting of the American Association for the Advancement of Science in 1856.
Foote’s paper, “Circumstances affecting the heat of the sun’s rays,” was reviewed the following month in the pages of Scientific American, in a column that approved of her “practical experiments” and noted, “this we are happy to say has been done by a lady.” She used an air pump, glass cylinders, and thermometers to compare the effects of sunlight on “carbonic acid gas” (or carbon dioxide) and “common air.” From her rudimentary but effective demonstrations, she concluded:
An atmosphere of that gas [CO2] would give to our earth a high temperature; and if as some suppose, at one period of its history the air had mixed with it a larger proportion than at present, an increased temperature…must have necessarily resulted.
Unfortunately, her achievement would be eclipsed three years later when Irish physicist John Tyndall, who likely knew nothing of Foote, made the same discovery. With his superior resources and privileges, Tyndall was able to take his research further. “In retrospect,” one climate science database writes, Tyndall has emerged as the founder of climate science, though that view “hides a complex, and in many ways more interesting story.”
Neither Tyndall nor Foote wrote about the effect of human activity on the contemporary climate. It would take until the 1890s for Swedish scientist Svante Arrhenius to predict human-caused warming from industrial CO2 emissions. But subsequent developments depended upon their insights. Foote, who was born 200 years ago this past July, was marginalized almost from the start. “Entirely because she was a woman,” the Public Domain Review points out, “Foote was barred from reading the paper describing her findings.”
Furthermore, Foote “was passed over for publication in the Association’s annual Proceedings.” Her paper was published in The American Journal of Science, but was mostly remarked upon, as in the Scientific American review, for the marvel of such homespun ingenuity from “a lady.” The review, titled “Scientific Ladies—Experiments with Condensed Gas,” opened with the sentence “Some have not only entertained, but expressed the mean idea, that women do not possess the strength of mind necessary for scientific investigation.”
The praise of Foote credits her as a paragon of her gender, while failing to convey the universal importance of her discovery. At the AAAS conference, the Smithsonian’s Joseph Henry praised Foote by declaring that science was “of no country and of no sex,” a statement that has proven time and again to be untrue in practice. The condescension and discrimination Foote endured points to the multiple ways in which she was excluded as a woman—not only from the scientific establishment but from the educational institutions and funding sources that supported it.
Her disappearance, until recently, from the history of science “plays into the Matilda Effect,” Leila McNeill argues at Smithsonian, “the trend of men getting credit for female scientists’ achievements.” In this case, there’s no reason not to credit both scientists, who made original discoveries independently. But Foote got there first. Had she been given the credit she was due at the time—and the institutional support to match—there’s no telling how far her work would have taken her.
Just as Foote’s discovery places her firmly within climate science history, retrospectively, her “place in the scientific community, or lack thereof,” writes Amara Huddleston at Climate.gov, “weaves into the broader story of women’s rights.” Foote attended the first Women’s Rights Convention in Seneca Falls, NY in 1848, and her name is fifth down on the list of signatories to the “Declaration of Sentiments,” a document demanding full equality in social status, legal rights, and educational, economic, and, Foote would have added, scientific opportunities.
The close associations between Surrealism and Freudian psychoanalysis were liberally encouraged by the most famous proponent of the movement, Salvador Dalí, who considered himself a devoted follower of Freud. We don’t have to wonder what the founder of psychoanalysis would have thought of his self-appointed protégé.
We have both men’s accounts, in their own words, of their one and only meeting—which took place in July of 1938, at Freud’s home in London. Freud was 82, Dali 34. We also have sketches Dali made of Freud while the two sat together. Their memories of events, shall we say, differ considerably, or at least they seemed totally bewildered by each other. (Freud pronounced Dali a “fanatic.”)
In any case, there’s absolutely no way the encounter could have lived up to Dali’s expectations, as the Freud Museum London notes:
[Dalí] had already travelled to Vienna several times but failed to make an introduction. Instead, he wrote in his autobiography, he spent his time having “long and exhaustive imaginary conversations” with his hero, at one point fantasizing that he “came home with me and stayed all night clinging to the curtains of my room in the Hotel Sacher.”
Freud was certainly not going to indulge Dalí’s peculiar fantasies, but what the artist really wanted was validation of his work—and maybe his very being. “Dali had spent his teens and early twenties reading Freud’s works on the unconscious,” writes Paul Gallagher at Dangerous Minds, “on sexuality and The Interpretation of Dreams.” He was obsessed. Finally meeting Freud in ’38, he must have felt “like a believer might feel when coming face-to-face with God.”
He brought with him his latest painting, The Metamorphosis of Narcissus, and an article he had published on paranoia. The article, especially, Dali hoped would earn him the respect of the elderly Freud.
Trying to interest him, I explained that it was not a surrealist diversion, but was really an ambitiously scientific article, and I repeated the title, pointing to it at the same time with my finger. Before his imperturbable indifference, my voice became involuntarily sharper and more insistent.
On being shown the painting, Freud supposedly said, “in classic paintings I look for the unconscious, but in your paintings I look for the conscious.” The comment stung, though Dali wasn’t entirely sure what it meant. But he took it as further evidence that the meeting was a bust. Sketching Freud in the drawing below, he wrote, “Freud’s cranium is a snail! His brain is in the form of a spiral—to be extracted with a needle!”
One might see why Freud was suspicious of Surrealists, “who have apparently chosen me as their patron saint,” he wrote to Stefan Zweig, the mutual friend who introduced him to Dali. In 1921, poet and Surrealist manifesto writer André Breton “had shown up uninvited on [Freud’s] doorstep.” Unhappy with his reception, Breton published a “bitter attack,” calling Freud an “old man without elegance” and later accused Freud of plagiarizing him.
Despite the memory of this nastiness, and Freud’s general distaste for modern art, he couldn’t help but be impressed with Dali. “Until then,” he wrote to Zweig, “I was inclined to look upon the surrealists… as absolute (let us say 95 percent, like alcohol), cranks. That young Spaniard, however, with his candid and fanatical eyes, and his undeniable technical mastery, has made me reconsider my opinion.”
You’ve almost certainly been to more art museums than you can remember, and more than likely to a few museums of natural history, science, and technology as well. But think hard: have you ever set foot inside a museum of philosophy? Not just an exhibition dealing with philosophers or philosophical concepts, but a single institution dedicated wholly to putting the practice of philosophy itself on display. Your answer can approach a yes only if you spent time in Milan last November, and more specifically at the University of Milan, in whose halls the Museo della Filosofia set up shop and proved its surprisingly untested — and surprisingly successful — concept.
“What we had in mind was not an historically-minded museum collecting relics about the lives and works of important philosophers, but something more dynamic and interactive,” writes University of Milan postdoctoral research fellow Anna Ichino at Daily Nous, “where philosophical problems and theories become intuitively accessible through a variety of games, activities, experiments, aesthetic experiences, and other such things.”
In the first hall, “we used images like Mary Midgley’s ‘conceptual plumbing’ or Wittgenstein’s ‘fly bottle’ to convey the idea according to which philosophical problems are in important respects conceptual problems, which amount to analyzing concepts that we commonly use in unreflective ways.”
In the second hall, visitors to the Museo della Filosofia “could literally play with paradoxes and thought experiments in order to appreciate their heuristic role in philosophical inquiry.” The experiences available there ranged from using an oversized deck of cards to “solve” paradoxes, to the perhaps inevitable demonstration of the well-known “trolley problem” using a model railroad set, to — most harrowing of all — the chance to “eat chocolates shaped as cat excrement” straight from the litter box. Then came the “School of Athens” game, “in which visitors had to decide whether to back Plato or Aristotle; then they could also take a souvenir picture portraying themselves in the shoes (and face!) of one or the other.”
In the third, “programmatic” hall, the museum’s organizers “presented the plan for what still needs to be done,” a to-do list that includes finding a permanent home. Before it does so, you can have a look at the project’s web site as well as its pages on Facebook and Instagram. At the top of the post appears a short video introducing the Museo della Filosofia which, like the rest of the materials, is for the moment in Italian only, but it nevertheless gets across even to non-Italian-speakers a certain idea of the experience a philosophical museum can deliver. Philosophical thinking, after all, occurs prior to language. Or maybe it’s inextricably tied up with language; different philosophers have approached the problem differently. And when the Museo della Filosofia opens for good, you’ll be able to visit and approach a few philosophical problems yourself. Read more about the museum at Daily Nous.