Asked to name our favorite concrete building, many of us would struggle to hold back a sneer. Though the copious use of that material by the mid-twentieth-century style known as Brutalism has lately gained new generations of enthusiasts, we still more commonly hear it lamented as a source of architectural “monstrosities.” But as a building material, concrete goes back much further in history than the decades following World War II. To find a universally beloved example, we need merely look back to second-century Rome. There we find the Pantheon, which looks much the same in twenty-first-century Rome today as it did then.
The best-preserved monument of ancient Rome, the Pantheon (not to be confused with the Greek Parthenon) has remained in continuous use, first as “a temple to the gods, then sanctified and made into a church. Now, of course, it’s a major tourist attraction.” So says scholar Steven Zucker in the Khan Academy video above, a brief photographic tour he leads alongside his colleague Beth Harris.
“As soon as you walk in, you notice that there’s a kind of obsession with circles, with rectangles, with squares, with those kinds of perfect geometrical shapes,” says Harris. “Because of the Roman use of concrete, the idea [obtained] that architecture could be something that shaped space and that could have a different kind of relationship to the viewer.”
You can go deeper into the Pantheon (built circa 125 AD) through the tour video by YouTuber Garrett Ryan, creator of the ancient-history channel Told in Stone. Calling the Pantheon “arguably the most influential building of all time,” he goes on to support that bold claim by examining a host of structural and aesthetic elements (not least its sublimely spherical rotunda) that would inspire architects in the Renaissance, a time dedicated to making use of ancient Greek and Roman knowledge, and in some sense ever after. This may come as a surprise to viewers with only a casual interest in architecture — more than it would to the Emperor Hadrian, commissioner of the Pantheon, who seems not to have been given to great doubts about the durability of his legacy.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall, on Facebook, or on Instagram.
Anyone who’s ever walked the red carpet or posed for a high fashion shoot would count themselves lucky to create the sort of impression made by John Singer Sargent’s iconic portrait of Madame X.
Though not if we’re talking about the sort of impression the painting made in 1884, when the model’s haughty demeanor, plunging bodice, and unapologetic use of skin-lightening, possibly arsenic-based cosmetics got the Paris Salon all riled up.
Most scandalously, one of her gown’s jeweled straps had slipped from her shoulder, a costume malfunction this cool beauty apparently couldn’t be bothered to fix, or even turn her head to acknowledge.
Virginie Amélie Avegno Gautreau, the New Orleans-born Paris socialite (social climber, some would have sniffed) so strikingly depicted by Sargent, was horrified by her likeness’s reception at the Salon. Although Sargent had coyly replaced her name with an ellipsis in the painting’s title, there was no doubt in viewers’ minds as to her identity.
John Sargent, Evan Charteris’ 1927 biography, shows Madame Gautreau very little mercy when recounting her attempts at damage control:
A demand was made that the picture should be withdrawn. It is not among the least of the curiosities of human nature, that while an individual will confess and even draw attention to his own failings, he will deeply resent the same office being undertaken by someone else. So it was with the dress of Madame Gautreau. Here a distinguished artist was proclaiming to the public in paint a fact about herself she had hitherto never made any attempt to conceal, one which had, indeed, formed one of her many social assets. Her resentment was profound.
Sargent, distraught that his portrait of the celebrated scenemaker had yielded the opposite of the hoped-for positive splash, nevertheless refused to indulge her request to remove the painting from exhibition.
His friend, painter Ralph Wormeley Curtis, wrote to his parents of the scene he witnessed in Sargent’s studio when Madame Gautreau’s mother rolled up, “bathed in tears”, primed to defend her daughter:
(She) made a fearful scene saying “Ma fille est perdu — tout Paris se moque d’elle. Mon genre sera forcé de se battre. Elle mourira de chagrin” etc.
(My daughter is lost — all of Paris mocks her. My kind will be forced to fight. She will die of sorrow.)
John replied it was against all laws to retire a picture. He painted her exactly as she was dressed, that nothing could be said of the canvas than had been said of her appearance dans le monde etc. etc.
Defending his cause made him feel much better. Still we talked it all over till 1 o’clock here last night and I fear he has never had such a blow. He says he wants to get out of Paris for a time. He goes to Eng. in 3 weeks. I fear là bas he will fall into Pre‑R. Influence wh. has got a strange hold of him, he says since Siena.
As Charlotte, creator of the Art Deco YouTube channel, points out in a frenetic overview of the scandal, below, Sargent came out of this fiasco a bit better than Madame Gautreau, whose damaged reputation cost her friends as well as her queen bee status.
(In her essay, Virginie Amélie Avegno Gautreau: Living Statue, art historian Elizabeth L. Block corrects Charlotte’s assertion that the painting “destroyed Madame Gautreau’s life”. Contrary to popular opinion, within three years she was making her theatrical debut, hosting parties, and being hailed by the New York Times as a “piece of plastic perfection.”)
Sargent did indeed decamp for England, where he found both creative and critical success. By century’s end, he was widely recognized as the most successful portrait painter of his day.
The portrait of Madame Gautreau remained enough of a sore spot that he kept it out of the public eye for more than twenty years, though shortly after its disastrous debut at the Salon, he did take another pass at it, repositioning the suggestive shoulder strap to a more conventionally acceptable location, as the photo below, taken in his studio in 1885, confirms.
In 1905, he finally allowed it to see the light of day in a London exhibition, with subsequent engagements in Berlin, Rome and San Francisco.
In 1916, when the portrait was still on display in San Francisco, he wrote his friend Edward “Ned” Robinson, Director of The Metropolitan Museum of Art, offering to sell it for £1,000, saying, “I suppose it is the best thing I have done.”
“By the way,” he added, “I should prefer, on account of the row I had with the lady years ago, that the picture should not be called by her name.”
Even though Madame Gautreau had died the previous year, Robinson obliged, retitling the painting Portrait of Madame X, the name by which it and its glamorous model are famously known today.
Read about the discoveries Metropolitan Museum of Art conservationists made during X‑radiography and infrared reflectography of the portrait here.
The first subway train, as we know such things today, entered service in 1890. Its path is now part of the Northern line of the London Underground, itself the first urban metro system. The success of the Tube, as it’s commonly known, didn’t come right away; the whole thing was, in fact, on the brink of failure before creations like 1914’s Wonderground Map of London Town aided the public’s understanding of it and bolstered its image.
At the time, Britain still commanded a great empire with London as its capital; the Wonderground Map placed the London Underground in the context of the city, making legible the still fairly novel concept of an underground train system with copious whimsical detail.
Nor was the Roman Empire anything to sneeze at, even during the fourth and fifth centuries, after its decline had set in. Though the Romans came up with some still-impressive inventions, including long-lasting concrete and monumental aqueducts, the technology to build and operate a subway system still lay some way off.
But that didn’t stop Marcus Vipsanius Agrippa, a general, architect, and friend of the emperor Augustus, from commissioning a map of the empire that read more or less like Massimo Vignelli’s 1972 map of the New York subway. That ambitious work of cartography, historians now believe, inspired the Tabula Peutingeriana, which survives today as the only large world map from antiquity. The video above from YouTuber Jeremy Shuback approaches the Tabula Peutingeriana as “the first transit map,” despite its dating from the thirteenth century, and even then probably being a copy of a fourth- or fifth-century original.
While the Roman Empire didn’t have electric trains or payment cards, the Romans did, of course, have transit: the word descends from the Latin transire, “go across.” Many a Roman had to go across, if not the whole empire, then at least large stretches of it. In theory, they would have found a map like the Tabula useful, with its simplification of geography in order to emphasize city-to-city connections. But that wasn’t its primary purpose: as Shuback puts it, this oversized map of all lands dominated by the Romans was “made to brag.” Whoever owned it surely wanted to imply that they possessed not just a map, but the world itself.
“Her Majesty’s a pretty nice girl, but she doesn’t have a lot to say,” sings Paul McCartney on the Beatles’ “Her Majesty.” That comic song closes Abbey Road, the last album the band ever recorded, and thus puts a cap on their brief but wondrous cultural reign. In 2002 McCartney played the song again, in front of Queen Elizabeth II herself, as part of her Golden Jubilee celebrations. Earlier this year her Platinum Jubilee marked a full 70 years on the throne, but now — 53 years after that cheeky tribute on Abbey Road — Her Majesty’s own reign has drawn to a close with her death at the age of 96. She’d been Queen since 1952, but she’d been a British icon since at least the Second World War.
In October 1940, at the height of the Blitz, Prime Minister Winston Churchill asked King George VI to allow his daughter, the fourteen-year-old Princess Elizabeth, to make a morale-boosting speech on the radio. Recorded in Windsor Castle after intense preparation and then broadcast on the BBC’s Children’s Hour, it was ostensibly addressed to the young people of Britain and its empire.
“Evacuation of children in Britain from the cities to the countryside started in September 1939,” says BBC.com, with ultimate destinations as far away as Canada. “It is not difficult for us to picture the sort of life you are all leading, and to think of all the new sights you must be seeing and the adventures you must be having,” Princess Elizabeth tells them. “But I am sure that you, too, are often thinking of the old country.”
In the event, millions of young and old around the world heard the broadcast, which arguably served Churchill’s own goal of encouraging American participation in the war. But it also gave Britons a preview of the dignity and forthrightness of the woman who would become their Queen, and remain so for an unprecedented seven decades. As Paul McCartney implied, Queen Elizabeth II turned out not to be given to prolonged flights of rhetoric. But though she may not have had a lot to say, she invariably spoke in public at the proper moment, in the proper words, and with the proper manner. Today one wonders whether this admirable personal quality, already in short supply among modern rulers, hasn’t vanished entirely.
Home-baked sourdough had its moment during the early days of the pandemic, but otherwise bread has been much maligned throughout the 21st century, at least in the Western world, where carbs are vilified by body-conscious consumers.
This was hardly the case on January 18, 1943, when Americans woke up to the news that the War Foods Administration, headed by Secretary of Agriculture Claude R. Wickard, had banned the sale of sliced bread.
The reasons driving the ban were a bit murky, though by this point, Americans were well acquainted with rationing, which had already limited access to such high-demand items as sugar, coffee, gasoline, and tires.
But why sliced bread, of all things?
Might depriving the public of their beloved pre-sliced bread help the war effort, by freeing up some critical resource, like steel?
War production regulations prohibited the sale of industrial bread-slicing equipment for the duration, though presumably, existing commercial bakeries wouldn’t have been in the market for more machines, just the odd repair part here and there.
Waxed paper, then? It kept sliced bread fresh prior to the invention of plastic bags. Perhaps the Allies had need of it?
No. Unlike nylon, there was no shortage of waxed paper.
Flour had been strictly regulated in Great Britain during the First World War, but this wasn’t a problem stateside in WWII, where it remained relatively cheap and easy to procure, with plenty left over to supply overseas troops. The 1942 wheat crop had been the second largest on record.
There were other rationales having to do with eliminating food waste and relieving economic pressure on bakers, but none of them held up upon examination. This left the War Production Board, the Office of Price Administration, and the Department of Agriculture vying to place blame for the ban on each other, and in some cases, on the American baking industry itself!
While the ill-considered ban lasted just two months, the public uproar was considerable.
Although pre-sliced bread hadn’t been around all that long, in the thirteen-and-a-half years since its introduction, consumers had grown quite dependent on its convenience, and on how nicely those uniform slices fit into the slots of their pop-up toasters, another recently patented invention.
A great pleasure of the History Guy’s coverage is the name-checking of local newspapers covering the sliced bread ban:
An absence of data did not prevent a reporter for the Wilmington News Journal from speculating that “it is believed that the majority of American housewives are not proficient bread slicers.”
One such housewife, having spent a hectic morning hacking a loaf into toast and sandwiches for her husband and children, wrote a letter to the New York Times, passionately declaring “how important sliced bread is to the morale and saneness of a household.”
The more stiff-upper-lipped patriotism of Vermont home economics instructor Doris H. Steele found a platform in the Barre Times:
In Grandmother’s day, the loaf of bread had a regular place at the family table. Grandmother had an attractive board for the bread to stand on and a good sharp knife alongside. Grandmother knew that a steady hand and a sharp knife were the secrets of slicing bread. She sliced as the family asked for bread and in this way, she didn’t waste any bread by cutting more than the family could eat. Let’s all contribute to the war effort by slicing our own bread.
Then, as now, celebrities felt compelled to weigh in.
First the Titanic was claimed by the ocean; now it’s being eaten by the ocean. “The iconic ocean liner that was sunk by an iceberg is now slowly succumbing to metal-eating bacteria,” the Associated Press’ Ben Finley reported last year. “Holes pervade the wreckage, the crow’s nest is already gone and the railing of the ship’s iconic bow could collapse at any time.” Given the loss to bacteria of “hundreds of pounds of iron a day,” some predictions indicate that “the ship could vanish in a matter of decades as holes yawn in the hull and sections disintegrate.”
This makes the documentation of this best-known of all shipwrecks a more pressing matter than ever — and, incidentally, provides a convenient reason for enterprising ocean-explorers to promote and sell the experience of Titanic tourism.
“OceanGate, a privately owned underwater exploration company founded in 2009, began offering annual journeys to the wreck of the Titanic in 2021,” writes Smithsonian.com’s Michelle Harris. “This year, civilian ‘mission specialists’ paid $250,000 each for the privilege of joining diving experts, historians and scientists on the expedition.”
OceanGate’s latest expedition produced the video above. It features a brief clip of the Titanic in 8K resolution, the highest-quality footage yet shot of the ship in its final resting place two and a half miles beneath the North Atlantic. (Stephen Low’s 1992 documentary Titanica used IMAX film, an extremely high-resolution medium but one difficult to compare with modern digital video.) That level of detail captures aspects of the Titanic previously only suggested in photographs, or indeed never before seen — at least not in this ruinous and eerily majestic suboceanic state. The survivors of the sinking are all long gone, but how long will the ship itself be able to reveal its secrets to us?
Mikhail Gorbachev, the 8th and final leader of the Soviet Union, died last month at age 91, a news event that triggered responses ranging from “Who?” to “Wow, was he still alive?” The first response reflects poorly on the teaching of history: journalists reporting on Gorbachev’s death have been obliged to explain his significance to many American readers just a few decades after his name filled U.S. headlines. But it’s also true that Gorbachev left a thoroughly ambiguous legacy that seems to grow only more muddled with time.
As historian Richard Sakwa wrote on the 20th anniversary of the short-lived Soviet empire’s collapse, Gorbachev is remembered in the U.S. — depending on who’s remembering — as either a “magnificent failure” or a “tragic success.” Some former Soviets, especially those more partial to the authoritarianism of a Stalin or Putin, omit any positive descriptions of Gorbachev’s major achievement – to wit, reforming the U.S.S.R. out of existence in the late 1980s with little need, really, for Reagan’s extravagant nuclear posturing.
Putin himself calls the fall of the U.S.S.R. “the greatest geopolitical catastrophe” of the previous century, an assessment shared by many who agree with him on nothing else. At the end of the 80s, however, an emerging generation of Russians had no clear sense of what was happening as their country fell apart. “I was 6 when the Soviet Union broke up,” Anatoly Kurmanaev writes at The New York Times. “I had no idea at the time that the person most responsible for the overwhelming changes transforming my hometown in Siberia was a man called Mikhail Gorbachev. I remember standing in line for bread in the dying days of Communism, but I don’t remember much discussion of his ‘perestroika.’ ”
Mixed admiration and contempt for Gorbachev trickled down to a younger generation a few years later. “The snatches of conversation I could hear were about people being fed up,” writes Kurmanaev, “not about the man with a distinctive birthmark sitting in the Kremlin…. Ironically, my first distinct, independent memory of Mr. Gorbachev, as perhaps for many of my generation, dates to a 1998 commercial for Pizza Hut,” an ad made by the U.S. fast-food company to celebrate the opening of a restaurant near Red Square, and made by Gorbachev because… well, also ironic, given the ad’s premise… he needed the money.
Written by Tom Darbyshire of ad agency BBDO, the commercial stages a debate between patrons at the restaurant before Gorbachev’s arrival calms things down. “Meant to be tongue-in-cheek,” Maria Luisa Paul writes at The Washington Post, the ad was intended to show that “pizza is one of those foods that brings people together and bridges their differences,” says Darbyshire. In yet another irony, Gorbachev himself — who negotiated for a year before agreeing to the spot — refused to eat pizza on camera, allowing his granddaughter the honor instead.
Though he wouldn’t touch the stuff, Gorbachev defended himself against critics, including his own wife, Raisa, by saying “pizza is for everyone. It’s not only consumption. It’s also socializing.” What was the talk at Gorbachev’s local Pizza Hut on the day he popped in with his grandchild to socialize? Why, it was talk of Gorbachev.
“Because of him, we have economic confusion!” one diner alleges.
“Because of him, we have opportunity!” retorts another.
“Because of him, we have political instability,” the first responds.
An older woman breaks the impasse by stating their obvious mutual affinities for pizza, to which all reply, “Hail to Gorbachev!”
Try as they might, not even Pizza Hut could heal the wounds caused by the country’s economic confusion and political instability.
The ad has circulated on social media, and in history classes, before and after Gorbachev’s death as an example of mass media that “still reflects his legacy,” writes Paul. Gorbachev may be largely forgotten — at least in the U.S. — decades after the Pizza Hut ad aired, but it wouldn’t be his last attempt to leave his mark in advertising, as we see in the 2007 Louis Vuitton ad above, featuring a product much less accessible than pizza to the average Russian.
This fall, historian Timothy Snyder is teaching a course at Yale University called The Making of Modern Ukraine. And he’s generously making the lectures available on YouTube, so that you can follow along too. All of the currently available lectures appear above (or on this playlist), and we will keep adding new ones as they come online. A syllabus for the course can be found here. Key questions covered by the course include:
What brought about the Ukrainian nation? Ukraine must have existed as a society and polity on 23 February 2022, or else Ukrainians would not have collectively resisted the Russian invasion the next day. Why has the existence of Ukraine occasioned such controversy? In what ways are Polish, Russian, and Jewish self-understanding dependent upon experiences in Ukraine? Just how and when did a modern Ukrainian nation emerge? Just how, for that matter, does any modern nation emerge? And why some nations and not others? What is the balance between structure and agency in history? Can nations be chosen, and does it matter? Can the choices of individuals influence the rise of much larger social organizations? If so, how? Ukraine was the country most touched by Soviet and Nazi terror: what can we learn about those systems, then, from Ukraine? Is the post-colonial, multilingual Ukrainian nation a holdover from the past, or does it hold some promise for the future?