Ever wonder what it was like to really fight while wearing a full suit of armor? We’ve featured a few historical reconstructions here on Open Culture, including a demonstration of the various ways combatants would vanquish their foe—a sword right between the eyes among them. We’ve also shown you how long it took to create a suit of armor and the clever flexibility built into it. But really, don’t we want to see what it would be like in a full melee? In the above Vice documentary, you can finally sate your bloodlust.
Not that anyone dies in the MMA-like sword-and-chainmail brawls. In these public competitions, the weapons are blunted and contestants fight “not to the death, just until they fall over,” as the narrator somewhat sadly explains. It is just as legit a sport as any other fighting challenge, and the injuries are real. There’s no fooling around with these people. They are serious, and a nation’s honor is still at stake.
This mini-doc follows the American team to the International Medieval Combat Federation World Championships in Montemor-o-Velho in Portugal. What looks like a regular Renaissance faire is only the decorations around the main, incredibly violent event. We see battles with longswords, short axes, shields used offensively and defensively, and a lot of pushing and shoving. Contestants go head-to-head, or five against five, or twelve against twelve.
Twenty-six countries take part, and I have to say, for all the jingoistic hoo-hah I try to ignore, the American team’s very nicely designed stars-and-stripes battle gear looked pretty damn cool. The Vice team also discovers an interesting cast of characters, like the Texan who wears his cowboy hat when he’s not wearing his combat helmet; the man who describes his fighting style as “nerd rage”; and the couple on their honeymoon who met while brutally beating each other in an earlier competition. (No, the knights here are not all men.)
There are injuries: sprains, broken bones. There’s also the madness of inhaling too much of your own CO2 inside the helmet, and of smelling ozone when a spark from metal-upon-metal contact flies into the helmet.
Thankfully nobody is fighting to the death or for King/Queen and Country. Just for the fun of adrenalin-based competition and bragging rights.
Ted Mills is a freelance writer on the arts who currently hosts the Notes from the Shed podcast and is the producer of KCRW’s Curious Coast. You can also follow him on Twitter at @tedmills, and/or watch his films here.
The practice and privilege of academic science has been slow in trickling down from its origins as a pursuit of leisured gentlemen. While many a leisured lady may have taken an interest in science, math, or philosophy, most women were denied participation in academic institutions and scholarly societies during the scientific revolution of the 1700s. Only a handful of women — seven known in total — were granted doctoral degrees before the year 1800. It wasn’t until 1678 that a female scholar was given the distinction, some four centuries or so after the doctorate came into being. While several intellectuals and even clerics of the time held progressive attitudes about gender and education, they were a decided minority.
Curiously, four of the first seven women to earn doctoral degrees were from Italy, beginning with Elena Cornaro Piscopia at the University of Padua. Next came Laura Bassi, who earned her degree from the University of Bologna in 1732. There she distinguished herself in physics, mathematics, and natural philosophy and became the first salaried woman to teach at a university (she was at one time the university’s highest paid employee). Bassi was the chief popularizer of Newtonian physics in Italy in the 18th century and enjoyed significant support from the Archbishop of Bologna, Prospero Lambertini, who — when he became Pope Benedict XIV — appointed her the 25th member of an elite scientific society called the Benedettini.
“Bassi was widely admired as an excellent experimenter and one of the best teachers of Newtonian physics of her generation,” says Paula Findlen, Stanford professor of history. “She inspired some of the most important male scientists of the next generation while also serving as a public example of a woman shaping the nature of knowledge in an era in which few women could imagine playing such a role.” She also played the one role available to most women of her time, as a mother of eight and the wife of Giuseppe Veratti, also a scientist.
Bassi was not allowed to teach classes of men at the university — only special lectures open to the public. But in 1740, she was granted permission to lecture at her home, and her fame spread, as Findlen writes at Physics World:
Bassi was widely known throughout Europe, and as far away as America, as the woman who understood Newton. The institutional recognition that she received, however, made her the emblematic female scientist of her generation. A university graduate, salaried professor and academician (a member of a prestigious academy), Bassi may well have been the first woman to have embarked upon a full-fledged scientific career.
Poems were written about Bassi’s successes in demonstrating Newtonian optics; “news of her accomplishments traveled far and wide,” reaching the ear of Benjamin Franklin, whose work with electricity Bassi followed keenly. In Bologna, surprise at Bassi’s achievements was tempered by a culture known for “celebrating female success.” Indeed, the city was “jokingly known as a ‘paradise for women,’” writes Findlen. Bassi’s father was determined that she have an education equal to any of her class, and her family inherited money that had been equally divided between daughters and sons for generations; her sons “found themselves heirs to the property that came to the family through Laura’s maternal line,” notes the Stanford University collection of Bassi’s personal papers.
Bassi’s academic work is held at the Academy of Sciences in Bologna. Of the papers that survive, “thirteen are on physics, eleven are on hydraulics, two are on mathematics, one is on mechanics, one is on technology, and one is on chemistry,” writes a University of St Andrews biography. In 1776, a year usually remembered for the formation of a government of leisured men across the Atlantic, Bassi was appointed to the Chair of Experimental Physics at Bologna, an appointment that not only meant her husband became her assistant, but also that she became the “first woman appointed to a chair of physics at any university in the world.”
Bologna was proud of its distinguished daughter, but perhaps still thought of her as an oddity and a token. As Dr. Eleonora Adami notes in a charming biography at Sci-Illustrate Stories, the city once struck a medal in her honor, “commemorating her first lecture series with the phrase ‘Soli cui fas vidisse Minervam,’” which translates roughly to “the only one allowed to see Minerva.” But her example inspired other women, like Cristina Roccati, who earned a doctorate from Bologna in 1750, and Dorothea Erxleben, who became the first woman to earn a Doctorate in Medicine four years later at the University of Halle. Such singular successes did not change the patriarchal culture of academia, but they started the trickle that would in time become several branching streams of women succeeding in the sciences.
Alice’s Adventures in Wonderland isn’t just a beloved children’s story: it’s also a neuropsychological syndrome. Or rather the words “Alice in Wonderland,” as Lewis Carroll’s book is commonly known, have also become attached to a condition that, though not harmful in itself, causes distortions in the sufferer’s perception of reality. Other names include dysmetropsia or Todd’s syndrome, the latter of which pays tribute to the consultant psychiatrist John Todd, who defined the disorder in 1955. He described his patients as seeing some objects as much larger than they really were and other objects as much smaller, resulting in challenges not entirely unlike those faced by Alice when put by Carroll through her growing-and-shrinking paces.
Todd also suggested that Carroll had written from experience, drawing inspiration from the hallucinations he experienced when afflicted with what he called “bilious headache.” The transformations Alice feels herself undergoing after she drinks from the “DRINK ME” bottle and eats the “EAT ME” cake are now known, in the neuropsychological literature, as macropsia and micropsia.
“I was in the kitchen talking to my wife,” writes novelist Craig Russell of one of his own bouts of the latter. “I was hugely animated and full of energy, having just put three days’ worth of writing on the page in one morning and was bursting with ideas for new books. Then, quite calmly, I explained to my wife that half her face had disappeared. As I looked around me, bits of the world were missing too.”
Though “many have speculated that Lewis Carroll took some kind of mind-altering drug and based the Alice books on his hallucinatory experiences,” writes Russell, “the truth is that he too suffered from the condition, but in a more severe and protracted way,” combined with ocular migraine. Russell also notes that the sci-fi visionary Philip K. Dick, though “never diagnosed as suffering from migrainous aura or temporal lobe epilepsy,” left behind a body of work that has given rise to “a growing belief that the experiences he described were attributable to the latter, particularly.” Suitably, classic Alice in Wonderland syndrome “tends to be much more common in childhood” and to disappear in maturity. One sufferer documented in the scientific literature is just six years old, younger even than Carroll’s eternal little girl — presumably, an eternal seer of reality in her own way.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
Poor Polyphonic. He was just about to deliver another perfectly mixed treatise on a classic rock magnum opus when the YouTube algorithm and the Jimi Hendrix Estate stepped in to stop him before publishing. So while you can watch this real-time explication of Hendrix’s more-than-just-a-jam “Voodoo Chile” with just the graphics and the narration, you should cue up the 15-minute track however you can (for example on Spotify), and then press play when the video gives the signal. (This might be the first YouTube explainer video to ask for copyright-skirting help.)
And anyway, you should have a copy of Electric Ladyland, right? It’s the one where Hendrix and the Experience really push all the boundaries, taking rock, blues, jazz, psychedelia, sci-fi, everything…all out as far as possible in the studio. It’s the one that introduced future members of the Band of Gypsys. And it’s the one that hints at everything that might have been, if Hendrix hadn’t passed away soon after.
Now, classic rock radio usually plays the much shorter and less laid-back “Voodoo Child (Slight Return)” that closes the album. But this essay is about the longest track on Electric Ladyland, the one that ends side one. This is the track that Hendrix wanted to sound like a late-night jam at the New York club The Scene—and which he recorded after one particular night doing just that. He taped the audience effects soon after. Steve Winwood is on keyboards. Jack Casady from Jefferson Airplane plays bass. And Mitch Mitchell turns in one of his greatest performances and solos.
In the lyrics, Polyphonic notes, Hendrix connects the blues to his Cherokee heritage and to voodoo, to sex, and then beyond into science fiction landscapes. The song is a self-portrait, showing the past, the influence, the training, and then the potential that music, magic, and (let’s face it) LSD could bring. The band is vibing. Winwood drops riffs that are more British folk than Chicago blues. Hendrix strays far beyond the orbit of blues, swings past it one more time on his own slight return, and then explodes into stardust.
Polyphonic’s video also looks beautiful and perfectly intersperses his critique with the song’s main sections. It may have sounded like a jam, but Hendrix carefully designed it to flow the way it does. And Polyphonic follows suit. It is a highly enjoyable walk through a track (again find it on Spotify here) many already know, reawakening a sense of wonder about all its inherent, strange genius.
Ted Mills is a freelance writer on the arts who currently hosts the Notes from the Shed podcast and is the producer of KCRW’s Curious Coast. You can also follow him on Twitter at @tedmills, and/or watch his films here.
No one living has experienced a viral event the size and scope of COVID-19. Maybe the unprecedented nature of the pandemic explains some of the vaccine resistance. Diseases of such virulence became rare in places with ready access to vaccines, and thus, ironically, over time, have come to seem less dangerous. But there are still many people in wealthy nations who remember polio, an epidemic that dragged on through the first half of the 20th century before Jonas Salk perfected his vaccine in the mid-fifties.
Polio’s devastation has been summed up visually in textbooks and documentaries by the terrifying iron lung, an early ventilator. “At the height of the outbreaks in the late 1940s,” Meilan Solly writes at Smithsonian, “polio paralyzed an average of more than 35,000 people each year,” particularly affecting children, with 3,000 deaths in 1952 alone. “Spread virally, it proved fatal for two out of ten victims afflicted with paralysis. Though millions of parents rushed to inoculate their children following the introduction of Jonas Salk’s vaccine in 1955, teenagers and young adults had proven more reluctant to get the shot.”
At the time, there were no violent, organized protests against the vaccine, nor was resistance framed as a patriotic act of political loyalty. But “cost, apathy and ignorance became serious setbacks to the eradication effort,” says historian Stephen Mawdsley. And, then as now, irresponsible media personalities with large platforms and little knowledge could do a lot of harm to the public’s confidence in life-saving public health measures, as when influential gossip columnist Walter Winchell wrote that the vaccine “may be a killer,” discouraging countless readers from getting a shot.
When Elvis Presley made his first appearance on Ed Sullivan’s show in 1956, “immunization levels among American teens were at an abysmal 0.6 percent,” note Hal Hershfield and Ilana Brody at Scientific American. To counter impressions that the polio vaccine was dangerous, public health officials did not solely rely on getting more and better information to the public; they also took seriously what Hershfield and Brody call the “crucial ingredients inherent to many of the most effective behavioral change campaigns: social influence, social norms and vivid examples.” Satisfying all three, Elvis stepped up and agreed to get vaccinated “in front of millions” backstage before his second appearance on the Sullivan show.
Elvis could not have been more famous, and the campaign was a success for its target audience, establishing a new social norm through influence and example: “Vaccination rates among American youth skyrocketed to 80 percent after just six months.” Despite the threat he supposedly posed to the establishment, Elvis himself was ready to serve the public. “I certainly never wanna do anything,” he said, “that would be a wrong influence.” See in the short video at the top how American public health officials stopped millions of preventable deaths and disabilities by admitting a fact propagandists and advertisers never shy from — humans, on the whole, are easily persuaded by celebrities. Sometimes they can even be persuaded for the good.
The British have a number of sayings that strike listeners of other English-speaking nationalities as odd. “Safe as houses” has always had a curious ring to my American ear, but it turns out to be quite ironic as well: the expression grew popular in the Victorian era, a time when Londoners were as likely to be killed by their own houses as anything else. That, at least, is the impression given by “The Bizarre Ways Victorians Sabotaged Their Own Health & Lives,” the documentary investigation starring historian Suzannah Lipscomb above.
Throughout the second half of the 19th century, many an Englishman would have regarded himself as living at the apex of civilization. He wouldn’t have been wrong, exactly, since that place and time witnessed an unprecedented number of large-scale innovations industrial, scientific, and domestic.
But a little knowledge can be a dangerous thing, and the Victorians’ understanding of their favorite new technologies’ benefits ran considerably ahead of their understanding of the attendant threats. The hazards of the dark satanic mills were comparatively obvious, but even the heights of domestic bliss, as that era conceived of it, could turn deadly.
Speaking with a variety of experts, Lipscomb investigates the dark side of a variety of accoutrements of the Victorian high (or at least comfortably middle-class) life. These harmed not just men but women and children as well: take the breeding-ground of disease that was the infant feeding bottle, or the organ-compressing corset — one of which, adhering to the experiential sensibility of British television, Lipscomb tries on and struggles with herself. Members of the eventual anti-corset revolt included Constance Lloyd, wife of Oscar Wilde, and it is Wilde’s apocryphal final words that come to mind when the video gets into the arsenic content of Victorian wallpaper. “Either that wallpaper goes, or I do,” Wilde is imagined to have said — and as modern science now proves, it could have been more than a matter of taste.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
When Rome conquered Carthage in the Third Punic War (149–146 BC), the Republic renamed the region Africa, for Afri, a word the Berbers used for local people in present-day Tunisia. (The Arabic word for the region was Ifriqiya.) Thereafter the Roman Empire had a stronghold in North Africa: Carthage, the capital of the African Province under Julius and Augustus Caesar and their successors. The province thrived. Second only to the city of Carthage in the region, the city of Thysdrus was an important center of olive oil production and the hometown of Roman Emperor Septimius Severus, who bestowed imperial favor upon it, granting partial Roman citizenship to its inhabitants.
In 238 AD, construction began in Thysdrus on an amphitheater that would rival its largest cousins in Rome: the famed Amphitheater of El Jem. “Designed to seat a whopping crowd of 35,000 people,” writes Atlas Obscura, El Jem was listed as a UNESCO World Heritage site in 1979. Built entirely of stone blocks, the massive theater was “modeled on the Coliseum of Rome,” notes UNESCO, “without being an exact copy of the Flavian construction…. Its facade comprises three levels of arcades of Corinthian or composite style. Inside, the monument has conserved most of the supporting infrastructure for the tiered seating. The wall of the podium, the arena and the underground passages are practically intact.”
Although the small city of El Jem hardly features on tours of the classical past, it was, in the time of the Amphitheater’s construction, a prominent site of struggle for control over the Empire. The year 238 “was particularly tumultuous,” Atlas Obscura explains, due to a “revolt by the population of Thysdrus (El Jem), who opposed the enormous taxation amounts being levied by the Emperor Maximinus’s local procurator.” A riot of 50,000 people led to the accession of Gordian I, who ruled for 21 days during the “Year of the Six Emperors,” when “in just one year, six different people were proclaimed Emperors of Rome.”
From such fraught beginnings, the massive stone structure of the El Jem Amphitheater went on to serve as a fortress during invasions of Vandals and Arabs in the 5th-7th centuries. A thousand years after the Islamic conquest, El Jem became a fortress during the Revolutions of Tunis. Later centuries saw the amphitheater used for saltpetre manufacture, grain storage, and market stalls.
Despite hundreds of years of human activity, in violent upheavals and everyday business, El Jem remains one of the best preserved Roman ruins in the world and one of the largest outdoor theaters ever constructed. More importantly, it marks the site of one of North Africa’s first imperial occupations, one that would designate a region — and eventually a continent with a dizzyingly diverse mix of peoples — as “African.”
With the recent theatrical release of The Green Knight, your Pretty Much Pop host Mark Linsenmayer, returning host Brian Hirt, plus Den of Geek’s David Crow and the very British Al Baker consider the range of cinematic Arthuriana, including Excalibur (1981), Camelot (1967), King Arthur (2004), King Arthur: Legend of the Sword (2017), First Knight (1995), Sword of the Valiant (1983), Sir Gawain and the Green Knight (1973), and Monty Python and the Holy Grail (1975).
Arthuriana encompasses numerous (sometimes contradictory) stories that accrued and evolved for nearly 1,000 years after the probable existence of the unknown figure who was the historical source for the character. Those tales include the 14th-century poem Sir Gawain and the Green Knight (author unknown); in the 15th century, Sir Thomas Malory wrote Le Morte d’Arthur, which provided the template for well-known modern retellings like T.H. White’s The Once and Future King (1958).
The length and complexity of this mythology makes any single film adaptation problematic, with most settling on the love triangle between Arthur, Lancelot, and Guinevere that leads to Camelot’s downfall. Multiple TV treatments have tried to do it justice, and if Guy Ritchie’s King Arthur: Legend of the Sword had been a box office success, we’d currently be seeing multiple films in an Arthurian cinematic universe. By picking a smaller story and not trying too hard to tie it to King Arthur (who appears but is not named), The Green Knight is able to be more creative in painting and updating the strange story of Sir Gawain, whose previous cinematic outings (including Sword of the Valiant, where Sean Connery played the Green Knight) saw him caught up in a series of nonsensical adventures far removed from the events of the original poem.
We talk through characterization in a mythic story, stylizing the epic (how much violence? how weird?), its status as public domain material (like Robin Hood and Sherlock Holmes), and the moral lesson of the original Gawain poem and what director David Lowery did with that for the new film. Is the new film actually enjoyable, or just carefully thought through and artfully shot? Note that we don’t spoil anything significant about The Green Knight until the last ten minutes, so it’s fine if you haven’t seen it (Al hadn’t either).
Here are some articles by David Crow on our topic: