How Japanese Masters Turn Sand Into Swords

We made sand think: the phrase is used from time to time to evoke the particular technological wonders of our age, especially now that artificial intelligence seems to be back on the slate of possibilities. While there would be no Silicon Valley without silica sand, semiconductors are hardly the first marvel humanity has forged out of that kind of material. Consider the three millennia of history behind the traditional Japanese sword, long known even outside the Japanese language as the katana (literally “one-sided blade”) — or, more to the point of the Veritasium video above, the 1,200 years in which such weapons have been made out of steel.
In explaining the science of the katana, Veritasium host Derek Muller begins more than two and a half billion years ago, when Earth’s oceans were “rich with dissolved iron.” Then cyanobacteria began photosynthesizing, releasing oxygen as a by-product; that oxygen bound with the dissolved iron, dropping layers of iron oxide onto the sea floor, which eventually hardened into layers of sedimentary rock.
With few such formations of its own, volcanic Japan came late to steel, importing it long before it could manage domestic production using the iron oxide that accumulated in its rivers, recovered as “iron sand.”
By that time, iron swords would no longer cut it, as it were, but the addition of charcoal during heating could produce the “incredibly strong alloy” of steel. Certain Japanese swordsmiths still use steel made with the more or less traditional smelting process you can see performed in rural Shimane prefecture in the video. To the mild disappointment of its producer, Petr Lebedev, who participates in the whole process, the foot-operated bellows of yore have been electrified; still, he seems anything but let down by the chance to take up a katana himself. He may have yet to attain the skill of a master swordsman, but understanding every scientific detail of the weapon he wields must make slicing bamboo clean in half that much more satisfying.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities and the book The Stateless City: a Walk through 21st-Century Los Angeles. Follow him on the social network formerly known as Twitter at @colinmarshall.
There have been many theories of how human history works. Some thinkers, like the German philosopher G.W.F. Hegel, have seen progress as inevitable. Others have embraced a more static view, full of “Great Men” and an immutable natural order. Then we have the counter-Enlightenment thinker Giambattista Vico. The eighteenth-century Neapolitan philosopher took human irrationalism seriously and wrote about our tendency to rely on myth and metaphor rather than reason or nature. Vico’s most “revolutionary move,” wrote Isaiah Berlin, “is to have denied the doctrine of a timeless natural law” that could be “known in principle to any man, at any time, anywhere.”
Vico’s theory of history included inevitable periods of decline (and heavily influenced the historical thinking of James Joyce and Friedrich Nietzsche). He describes his concept “most colorfully,” writes Alexander Bertland at the Internet Encyclopedia of Philosophy, “when he gives this axiom”:
Men first feel necessity, then look for utility, next attend to comfort, still later amuse themselves with pleasure, thence grow dissolute in luxury, and finally go mad and waste their substance.
The description may remind us of Shakespeare’s “Seven Ages of Man.” But for Vico, Bertland notes, every decline heralds a new beginning. History is “presented clearly as a circular motion in which nations rise and fall… over and over again.”
Just over 250 years after Vico’s death in 1744, Carl Sagan—another thinker who took human irrationalism seriously—published his 1995 book The Demon-Haunted World, showing how much our everyday thinking derives from metaphor, mythology, and superstition. He also foresaw a future in which his nation, the U.S., would fall into a period of terrible decline:
I have a foreboding of an America in my children’s or grandchildren’s time — when the United States is a service and information economy; when nearly all the manufacturing industries have slipped away to other countries; when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues; when the people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what’s true, we slide, almost without noticing, back into superstition and darkness…
Sagan believed in progress and, unlike Vico, thought that “timeless natural law” is discoverable with the tools of science. And yet, he feared “the candle in the dark” of science would be snuffed out by “the dumbing down of America…”
…most evident in the slow decay of substantive content in the enormously influential media, the 30 second sound bites (now down to 10 seconds or less), lowest common denominator programming, credulous presentations on pseudoscience and superstition, but especially a kind of celebration of ignorance…
Sagan died in 1996, a year after he wrote these words. No doubt he would have seen the fine art of distracting and misinforming people through social media as a late, perhaps terminal, sign of the demise of scientific thinking. His passionate advocacy for science education stemmed from his conviction that we must and can reverse the downward trend.
As he says in the poetic excerpt from Cosmos above, “I believe our future depends powerfully on how well we understand this cosmos in which we float like a mote of dust in the morning sky.”
When Sagan refers to “our” understanding of science, he does not mean, as he says above, a “very few” technocrats, academics, and research scientists. Sagan invested so much effort in popular books and television because he believed that all of us needed to use the tools of science: “a way of thinking,” not just “a body of knowledge.” Without scientific thinking, we cannot grasp the most important issues we all jointly face.
We’ve arranged a civilization in which most crucial elements profoundly depend on science and technology. We have also arranged things so that almost no one understands science and technology. This is a prescription for disaster. We might get away with it for a while, but sooner or later this combustible mixture of ignorance and power is going to blow up in our faces.
Sagan’s 1995 predictions are now being heralded as prophetic. As Charles Bergquist, director of Public Radio International’s Science Friday, tweeted, “Carl Sagan had either a time machine or a crystal ball.” Matt Novak cautions against falling back into superstitious thinking in our praise of The Demon-Haunted World. After all, he says, “the ‘accuracy’ of predictions is often a Rorschach test” and “some of Sagan’s concerns” in other parts of the book “sound rather quaint.”
Of course Sagan couldn’t predict the future, but he did have a very informed, rigorous understanding of the issues of thirty years ago, and his prediction extrapolates from trends that have only continued to deepen. If the tools of science education—like most of the country’s wealth—end up the sole property of an elite, the rest of us will fall back into a state of gross ignorance, “superstition and darkness.” Whether we might come back around again to progress, as Giambattista Vico thought, is a matter of sheer conjecture. But perhaps there’s still time to reverse the trend before the worst arrives. As Novak writes, “here’s hoping Sagan, one of the smartest people of the 20th century, was wrong.”
Note: An earlier version of this post appeared on our site in 2017.
One would count neither Elon Musk nor Neil deGrasse Tyson among the most reserved public figures of the twenty-first century. Given the efforts Musk has been making to push into the business of outer space, which has long been Tyson’s intellectual domain, it’s only natural that the two would come into conflict. Not long ago, the media eagerly latched on to signs of a “feud” that seemed to erupt between them over Tyson’s remark that Musk — or rather, his company SpaceX — “hasn’t done anything that NASA hasn’t already done. The actual space frontier is still held by NASA.”
What this means is that SpaceX has yet to take humanity anywhere in outer space we haven’t been before. That’s not a condemnation, but in fact a description of business as usual. “The history of really expensive things ever happening in civilization has, in essentially every case, been led, geopolitically, by nations,” Tyson says in the StarTalk video above. “Nations lead expensive projects, and when the costs of these projects are understood, the risks are quantified, and the time frames are established, then private enterprise comes in later, to see if they can make a buck off of it.”
To go, boldly or otherwise, where no one has gone before “often involves risk that a company that has investors will not take, unless there’s a very clear return on investment. Governments don’t need a financial return on investment if they can get a geopolitical return on investment.” Though private enterprise may be doing more or less what NASA has been doing for 60 years, Tyson hastens to add, private enterprise does do it cheaper. In that sense, “SpaceX has been advancing the engineering frontier of space exploration,” not least by its development of reusable rockets. Still, that’s not exactly the Final Frontier.
Musk has made no secret of his aspirations to get to Mars, but Tyson doesn’t see that eventuality as being led by SpaceX per se. “The United States decides, ‘We need to send astronauts to Mars,’ ” he imagines. “Then NASA looks around and says, ‘We don’t have a rocket to do that.’ And then Elon says ‘I have a rocket!’ and rolls out his rocket to Mars. Then we ride in the SpaceX rocket to Mars.” That scenario will look even more possible if the unmanned Mars missions SpaceX has announced go according to plan. Whatever their differences, Tyson and Musk — and every true space enthusiast — surely agree that it doesn’t matter where the money comes from, just as long as we get out there one day soon.
Charles Darwin’s work on heredity was partly driven by tragic losses in his own family. Darwin had married his first cousin, Emma, and “wondered if his close genetic relation to his wife had had an ill impact on his children’s health, three (of 10) of whom died before the age of 11,” Katherine Harmon writes at Scientific American. (His suspicions, researchers surmise, may have been correct.) He was so concerned about the issue that, in 1870, he pressured the government to include questions about inbreeding on the census (they refused).
Darwin’s children would serve as subjects of scientific observation. His notebooks, says Alison Pearn of the Darwin Correspondence Project at Cambridge University Library, show a curious father “prodding and poking his young infant,” William Erasmus, his first child, “like he’s another ape.” Comparisons of his children’s development with that of orangutans helped him refine ideas in On the Origin of Species, which he completed as he raised his family at their house in rural Kent, and inspired later ideas in The Descent of Man.
But as they grew, the Darwin children became far more than scientific curiosities. They became their father’s assistants and apprentices. “It’s really an enviable family life,” Pearn tells the BBC. “The science was everywhere. Darwin just used anything that came to hand, all the way from his children right through to anything in his household, the plants in the kitchen garden.” Steeped in scientific investigation from birth, it’s little wonder so many of the Darwins became accomplished scientists themselves.
Down House was “by all accounts a boisterous place,” writes McKenna Staynor at The New Yorker, “with a wooden slide on the stairs and a rope swing on the first-floor landing.” Another archive of Darwin’s prodigious writing, Cambridge’s Darwin Manuscripts Project, gives us even more insight into his family life, with graphic evidence of the Darwin brood’s curiosity in the dozens of doodles and drawings they made in their father’s notebooks, including the original manuscript copy of his magnum opus.
The project’s director, David Kohn, “doesn’t know for certain which kids were the artists,” notes Staynor, “but he guesses that at least three were involved: Francis, who became a botanist; George, who became an astronomer and mathematician; and Horace, who became an engineer.” One imagines competition among the Darwin children must have been fierce, but the drawings, “though exacting, are also playful.” One depicts “The Battle of Fruits and Vegetables.” Others show anthropomorphic animals and illustrate military figures.
There are short stories, like “The Fairies of the Mountain,” which “tells the tale of Polytax and Short Shanks, whose wings have been cut off by a ‘naughty fairy.’” Imagination and creativity clearly had a place in the Darwin home. The man himself, Maria Popova notes, felt significant ambivalence about fatherhood. “Children are one’s greatest happiness,” he once wrote, “but often & often a still greater misery. A man of science ought to have none.”
It was an attitude born of grief, but one, it seems, that did not breed aloofness. The Darwin kids “were used as volunteers,” says Kohn, “to collect butterflies, insects, and moths, and to make observations on plants in the fields around town.” Francis followed his father’s path into botany and was the only one of the children to co-author a book with him. Darwin’s daughter Henrietta became his editor, and he relied on her, he wrote, for “deep criticism” and “corrections of style.”
Despite his early fears for their genetic fitness, Darwin’s professional life became intimately bound to the successes of his children. The Darwin Manuscripts Project, which aims to digitize and make public around 90,000 pages from the Cambridge University Library’s Darwin collection, will have a profound effect on how historians of science understand his impact. “The scope of the enterprise, of what we call evolutionary biology,” says Kohn, “is defined in these papers. He’s got his foot in the twentieth century.”
The archive also shows the development of Darwin’s equally important legacy as a parent who inspired a boundless scientific curiosity in his kids. See many more of the digitized Darwin children’s drawings at The Marginalian.
Note: An earlier version of this post appeared on our site in 2020.
“It’s interesting that some people find science so easy, and others find it kind of dull and difficult,” says Richard Feynman at the beginning of his 1983 BBC series Fun to Imagine. “One of the things that makes it very difficult is that it takes a lot of imagination. It’s very hard to imagine all the crazy things that things really are like.” A true scientist accepts that nothing is as it seems, in that nothing, when you zoom in close enough or zoom out far enough, behaves in a way that accords with our everyday experience. Even the necessary scales — in which, for example, an atom is to an apple as an apple is to Earth itself — are difficult to conceive.
Despite his much-celebrated brilliance as a physicist, Feynman also admitted to finding the quantities with which he had to work unfathomable, at least when examined outside their particular contexts. At the atomic level, he explains, “you’re just thinking of small balls, but you don’t try to think of exactly how small they are too often, or you get kind of a bit nutty.”
In astronomy, “you have the same thing in reverse, because the distance to these stars is so enormous.” We all have an idea of what the term “light year” means — assuming we don’t misunderstand it as a unit of time — but who among us can really envision a galaxy 100,000 light years away, let alone a million?
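The scales in question can at least be checked with a bit of arithmetic. Here is a rough back-of-the-envelope sketch in Python, using approximate textbook sizes of my own choosing (not figures from the program), showing that the atom-apple-Earth ratios land within the same order of magnitude, and what a light year actually amounts to as a distance:

```python
# Approximate sizes, chosen only to illustrate the ratios.
atom_diameter_m = 3e-10    # a typical atom, ~3 angstroms
apple_diameter_m = 0.07    # an ordinary apple
earth_diameter_m = 1.27e7  # Earth's mean diameter

# "An atom is to an apple as an apple is to Earth":
# both ratios come out around 10**8.
apple_to_atom = apple_diameter_m / atom_diameter_m
earth_to_apple = earth_diameter_m / apple_diameter_m
print(f"apple/atom  ~ {apple_to_atom:.1e}")   # ~2.3e+08
print(f"earth/apple ~ {earth_to_apple:.1e}")  # ~1.8e+08

# A light year is a distance, not a time: the span light covers in one year.
SPEED_OF_LIGHT_M_S = 299_792_458
SECONDS_PER_YEAR = 365.25 * 24 * 3600
light_year_m = SPEED_OF_LIGHT_M_S * SECONDS_PER_YEAR
print(f"1 light year ~ {light_year_m:.2e} m")  # ~9.46e+15 m
```

Multiply that last figure by 100,000 and the difficulty Feynman describes becomes obvious: the numbers remain easy to write down and impossible to picture.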
Feynman discusses these matters with characteristic understanding and humor across Fun to Imagine’s nine segments, which cover physical phenomena from fire and magnets to rubber bands and train wheels. Those who know their physics will appreciate the vividness and concision with which he explains this material, apparently right off the top of his head, and anyone can sense the delight he feels in merely putting his mind to the behavior of matter and energy and their relationship to the world as we know it. And however much pleasure he derived from understanding, he also got a kick out of how much mystery remains: “Nature’s imagination is so much greater than man’s,” he says toward the end. “She’s never going to let us relax.”
I mean, the idea that you would give a psychedelic—in this case, magic mushrooms or the chemical called psilocybin that’s derived from magic mushrooms—to people dying of cancer, people with terminal diagnoses, to help them deal with their — what’s called existential distress. And this seemed like such a crazy idea that I began looking into it. Why should a drug from a mushroom help people deal with their mortality?
Around the same time Albert Hofmann synthesized LSD in the early 1940s, a pioneering ethnobotanist, writer, and photographer named Richard Evans Schultes set out “on a mission to study how indigenous peoples” in the Amazon rainforest “used plants for medicinal, ritual and practical purposes,” as an extensive history of Schultes’ travels notes. “He went on to spend over a decade immersed in near-continuous fieldwork, collecting more than 24,000 species of plants including some 300 species new to science.”
Described by Jonathan Kandell as “swashbuckling” in a 2001 New York Times obituary, Schultes was “the last of the great plant explorers in the Victorian tradition.” Or so his student Wade Davis called him in his 1996 bestseller One River. He was also “a pioneering conservationist,” writes Kandell, “who raised alarms in the 1960’s—long before environmentalism became a worldwide concern.” Schultes defied the stereotype of the colonial adventurer, once saying, “I do not believe in hostile Indians. All that is required to bring out their gentlemanliness is reciprocal gentlemanliness.”
Schultes returned to teach at Harvard, where he reminded his students “that more than 90 tribes had become extinct in Brazil alone over the first three-quarters of the 20th century.” While his research would have significant influence on figures like Aldous Huxley, William Burroughs, and Carlos Castaneda, “writers who considered hallucinogens as the gateways to self-discovery,” Schultes was dismissive of the counterculture and “disdained these self-appointed prophets of an inner reality.”
Described on Amazon as “a nontechnical examination of the physiological effects and cultural significance of hallucinogenic plants used in ancient and modern societies,” Schultes’ Golden Guide to Hallucinogenic Plants covers peyote, ayahuasca, cannabis, various psychoactive mushrooms and other fungi, and much more. In his introduction, Schultes is careful to separate his research from its appropriation, dismissing the term “psychedelic” as etymologically incorrect and “biologically unsound.” Furthermore, he writes, it “has acquired popular meanings beyond the drugs or their effects.”
Schultes’ interests are scientific—and anthropological. “In the history of mankind,” he writes, “hallucinogens have probably been the most important of all the narcotics. Their fantastic effects made them sacred to primitive man and may even have been responsible for suggesting to him the idea of deity.” He does not exaggerate. Schultes’ research into the religious and medicinal uses of natural hallucinogens led him to dub them “plants of the gods” in a book he wrote with Albert Hofmann, discoverer of LSD.
Neither scientist sought to start a psychedelic revolution, but it happened nonetheless. Now, another revolution is underway—one that is finally revisiting the science of ethnobotany and taking seriously the healing powers of hallucinogenic plants. It is hardly a new science among scholars in the West, but the renewed legitimacy of research into hallucinogens has given Schultes’ research new authority. Learn from him in his Golden Guide to Hallucinogenic Plants online here.
Sir Isaac Newton, arguably the most important and influential scientist in history, discovered the laws of motion and the universal force of gravity. For the first time ever, the rules of the universe could be described with the supremely rational language of mathematics. Newton’s elegant equations proved to be one of the inspirations for the Enlightenment, a shift away from the God-centered dogma of the Church in favor of a worldview that placed reason at its center. The many leaders of the Enlightenment turned to deism if not outright atheism. But not Newton.
In 1936, a document of Newton’s dating from around 1662 was sold at a Sotheby’s auction and eventually wound up at the Fitzwilliam Museum in Cambridge, England. The Fitzwilliam Manuscript has long been a source of fascination for Newton scholars. Not only does the notebook feature a series of increasingly difficult mathematical problems, but it also contains a cryptic string of letters reading:
Nabed Efyhik
Wfnzo Cpmfke
If you can solve this, there are some people in Cambridge who would like to talk to you.
But what makes the document really interesting is how incredibly personal it is. Newton rattles off a laundry list of the sins he had committed up to that point in his young life – he was around 20 when he wrote it, still a student at Cambridge. He splits the list into two categories: before Whitsunday 1662 and after. (Whitsunday is, by the way, the Sunday of the feast of Whitsun, celebrated seven weeks after Easter.) Why he chose that particular date to bifurcate his timeline isn’t immediately clear.
Some of the sins are rather opaque. I’m not sure what, for instance, “Making a feather while on Thy day” means exactly, but it sure sounds like a long-lost euphemism. Other sins, like “Peevishness with my mother,” are immediately relatable as good old-fashioned teenage churlishness. You can see the full list below. And you can read the full document over at the Newton Project here.
Before Whitsunday 1662
1. Vsing the word (God) openly
2. Eating an apple at Thy house
3. Making a feather while on Thy day
4. Denying that I made it.
5. Making a mousetrap on Thy day
6. Contriving of the chimes on Thy day
7. Squirting water on Thy day
8. Making pies on Sunday night
9. Swimming in a kimnel on Thy day
10. Putting a pin in Iohn Keys hat on Thy day to pick him.
11. Carelessly hearing and committing many sermons
12. Refusing to go to the close at my mothers command.
13. Threatning my father and mother Smith to burne them and the house over them
14. Wishing death and hoping it to some
15. Striking many
16. Having uncleane thoughts words and actions and dreamese.
17. Stealing cherry cobs from Eduard Storer
18. Denying that I did so
19. Denying a crossbow to my mother and grandmother though I knew of it
20. Setting my heart on money learning pleasure more than Thee
21. A relapse
22. A relapse
23. A breaking again of my covenant renued in the Lords Supper.
24. Punching my sister
25. Robbing my mothers box of plums and sugar
26. Calling Dorothy Rose a jade
27. Glutiny in my sickness.
28. Peevishness with my mother.
29. With my sister.
30. Falling out with the servants
31. Divers commissions of alle my duties
32. Idle discourse on Thy day and at other times
33. Not turning nearer to Thee for my affections
34. Not living according to my belief
35. Not loving Thee for Thy self.
36. Not loving Thee for Thy goodness to us
37. Not desiring Thy ordinances
38. Not long {longing} for Thee in {illeg}
39. Fearing man above Thee
40. Vsing unlawful means to bring us out of distresses
41. Caring for worldly things more than God
42. Not craving a blessing from God on our honest endeavors.
43. Missing chapel.
44. Beating Arthur Storer.
45. Peevishness at Master Clarks for a piece of bread and butter.
46. Striving to cheat with a brass halfe crowne.
47. Twisting a cord on Sunday morning
48. Reading the history of the Christian champions on Sunday
Since Whitsunday 1662
49. Glutony
50. Glutony
51. Vsing Wilfords towel to spare my own
52. Negligence at the chapel.
53. Sermons at Saint Marys (4)
54. Lying about a louse
55. Denying my chamberfellow of the knowledge of him that took him for a sot.
56. Neglecting to pray 3
57. Helping Pettit to make his water watch at 12 of the clock on Saturday night
The 80-second clip above captures a rocket launch, something of which we’ve all seen footage at one time or another. What makes its viewers call it “the greatest shot in television” still today, 45 years after it first aired, may take more than one viewing to notice. In it, science historian James Burke speaks about how “certain gases ignite, and that the thermos flask permits you to store vast quantities of those gases safely, in their frozen liquid form, until you want to ignite them.” Use a sufficiently large flask filled with hydrogen and oxygen, design it to mix the gases and set light to them, and “you get that” — that is, you get the rocket that launches behind Burke just as soon as he points to it.
One can only admire Burke’s composure in discussing such technical matters in a shot that had to be perfectly timed on the first and only take. What you wouldn’t know unless you saw it in context is that it also comes as the final, culminating moment of a 50-minute explanatory journey that begins with credit cards, then makes its way through the invention of everything from a knight’s armor to canned food to air conditioning to the Saturn V rocket, which put man on the moon.
Formally speaking, this was a typical episode of Connections, Burke’s 1978 television series that traces the most important and surprising moves in the evolution of science and technology throughout human history.
Though not as widely remembered as Carl Sagan’s slightly later Cosmos, Connections bears repeat viewing here in the twenty-first century, not least for the intellectual and visual bravado typified by this “greatest shot in television,” now viewed nearly 18 million times on YouTube. Watch it enough times yourself, and you’ll notice that it also pulls off some minor sleight of hand by having Burke walk from a non-time-sensitive shot into another with the already-framed rocket ready for liftoff. But that hardly lessens the feeling of achievement when the launch comes off. “Destination: the moon, or Moscow,” says Burke, “the planets, or Peking” — a closing line that sounded considerably more dated a few years ago than it does today.