In his latest animation, physicist and science writer Dominic Walliman maps out the entire field of engineering and all of its subdisciplines. Civil engineering, chemical engineering, bioengineering, biomedical engineering, mechanical engineering, aerospace engineering, marine engineering, electrical engineering, computer engineering — they’re all covered here.
The exalted status of Isaac Newton’s Philosophiæ Naturalis Principia Mathematica is reflected by the fact that everybody knows it as, simply, the Principia. Very few of us, by contrast, speak of the Historia when we mean to refer to John Ray and Francis Willughby’s De Historia Piscium, which came out in 1686, the year before the Principia. Both books were published by the Royal Society, and as it happens, the formidable cost of Willughby and Ray’s lavish work of ichthyology nearly kept Newton’s groundbreaking treatise on motion and gravitation from the printing press.
According to the Royal Society’s web site, “Ray and Willughby’s Historia did not prove to be the publishing sensation that the Fellows had hoped and the book nearly bankrupted the Society. This meant that the Society was unable to meet its promise to support the publication of Isaac Newton’s masterpiece.”
Fortunately, “it was saved from obscurity by Edmund Halley, then Clerk at the Royal Society” — and now better known for his eponymous comet — “who raised the funds to publish the work, providing much of the money from his own pocket.”
Halley’s great reward, in lieu of the salary the Royal Society could no longer pay, was a pile of unsold copies of De Historia Piscium. That may not have been quite the insult it sounds like, given that the book represented a triumph of production and design in its day. You can see a copy in the episode of Adam Savage’s Tested at the top of the post, and you can closely examine its imagery at your leisure in the digital archive of the Royal Society. In the words of Jonathan Ashmore, Chair of the Royal Society’s Library Committee, a browsing session should help us “appreciate why early Fellows of the Royal Society were so impressed by Willughby’s stunning illustrations of piscine natural history.”
Though Savage duly marvels at the Royal Society’s copy of the Historia — a reconstruction made up of pages long ago cut out and sold separately, as was once common practice with books with pictures suitable for framing — it’s clear that much of the motivation for his visit came from the prospect of close proximity to Newtoniana, up to and including the man’s death mask. But then, Newton lays fair claim to being the most important scientist who ever lived, and the Principia to being the most important science book ever written. Almost three and a half centuries later, physics still holds mysteries for generations of Newton’s successors to solve. But then, so do the depths of the ocean.
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
It’s bittersweet whenever a pioneering, long overlooked female scientist is finally given the recognition she deserves, especially so when the scientist in question is a person of color.
Chemist Alice Ball’s youth and drive — just 23 in 1915, when she discovered a gentle but effective method for treating leprosy — make her an excellent role model for students with an interest in STEM.
But in a move that’s only shocking for its familiarity, an opportunistic male colleague, Arthur Dean, finagled a way to claim credit for her work.
We’ve all heard the tales of female scientists who were integral team players on important projects, who ultimately saw their role vastly downplayed upon publication or their names left off of a prestigious award.
But Dean’s claim that he was the one who had discovered an injectable water-soluble method for treating leprosy with oil from the seeds of the chaulmoogra fruit is all the more galling, given that he did so after Alice Ball’s tragically early death at the age of 24, suspected to be the result of accidental poisoning during a classroom lab demonstration.
Not everyone believed him.
Ball, the University of Hawaii chemistry department’s first Black female graduate student, and, subsequently, its first Black female chemistry instructor, had come to the attention of Harry T. Hollmann, a U.S. Public Health Officer who shared her conviction that chaulmoogra oil might hold the key to treating leprosy.
After her death in 1916, Hollmann reviewed Dean’s publications regarding the highly successful new leprosy treatment then referred to as the Dean Method and wrote that he could not see “any improvement whatsoever over the original [method] as worked out by Miss Ball:”
After a great amount of experimental work, Miss Ball solved the problem for me by making the ethyl esters of the fatty acids found in chaulmoogra oil.
Type “the Dean Method leprosy” into a search engine and you’ll be rewarded with a satisfying wealth of Alice Ball profiles, all of which go into detail regarding her discovery of what became known as the Ball Method, in use until the 1940s.
Kathleen M. Wong’s article on this trailblazing scientist in the Smithsonian Magazine delves into why Hollmann’s professional efforts to posthumously confer credit where credit was due were insufficient to secure Ball her rightful place in science history.
That began to change in the 1990s when Stan Ali, a retiree researching Black people in Hawaii, found his interest piqued by a reference to a “young Negro chemist” working on leprosy in The Samaritans of Molokai.
Ali teamed up with Paul Wermager, a retired University of Hawaii librarian, and Kathryn Waddell Takara, a poet and professor in the Ethnic Studies Department. Together, they began combing over old sources for any passing reference to Ball and her work. They came to believe that her absence from the scientific record owed to sexism and racism:
During and just after her lifetime, she was believed to be part Hawaiian, not Black. (Her birth and death certificates list both Ball and her parents as white, perhaps to “make travel, business and life in general easier,” according to the Honolulu Star-Bulletin.) In 1910, Black people made up just 0.4 percent of Hawaiʻi’s population.
“When [the newspapers] realized she was not part Hawaiian, but [Black], they felt they had made an embarrassing mistake, forgetting about it and hoping it would go away,” Ali said. “It did for 75 years.”
Their combined efforts spurred the state of Hawaii to declare February 28 Alice Ball Day. The University of Hawaii installed a commemorative plaque near a chaulmoogra tree on campus. Her portrait hangs in the university’s Hamilton Library, alongside a posthumously awarded Medal of Distinction.
(“Meanwhile,” as Carlyn L. Tani dryly observes in Honolulu Magazine, “Dean Hall on the University of Hawai‘i Mānoa campus stands as an enduring monument to Arthur L. Dean.”)
Further afield, the London School of Hygiene and Tropical Medicine celebrated its 120th anniversary by adding Ball’s, Marie Sklodowska-Curie’s and Florence Nightingale’s names to a frieze that had previously honored 23 eminent men.
And now the Godmother of Punk, Patti Smith, has taken it upon herself to introduce Ball to an even wider audience, after running across a reference to her while conducting research for her just-released A Book of Days.
Things have really changed. I think we are living in a very beautiful period of time because there are so many female artists, poets, scientists, and activists. Through books especially, we are rediscovering and valuing the women who have been unjustly forgotten in our history. During my research, I came across a young black scientist who lived in Hawaii in the 1920s. At that time, there was a big leper colony in Hawaii. She had discovered a treatment using the oil from the seeds of a tree to relieve the pain and allow patients to see their friends and family. Her name was Alice Ball, and she died at just 24 after a terrible chemical accident during an experiment. Her research was taken up by a professor who removed her name from the study to take full credit. It is only recently that people have discovered that she was the one who did the work.
You’d think Wikipedia would have kept pace in this climate.
And it has…thanks almost entirely to the efforts of Dr. Jess Wade, a 33-year-old Imperial College Research Fellow who spends her days investigating spin-selective charge transport through chiral systems in the Department of Materials.
Her evenings, however, belong to Wikipedia.
That’s when she drafts entries for under-recognized female scientists and scientists of color.
“I had a target for doing one a day, but sometimes I get too excited and do three,” she told The Guardian in 2018.
To date she’s added more than 1,600 names, striving to make their biographies as fully fleshed out as any of the write-ups for the white male scientists who flourish on the site.
This requires some forensic digging. Discovering a subject’s maiden name is often the critical step to finding her PhD thesis and early influences.
A handful of Wade’s entries have been stricken for the truly maddening reason that their subjects are too obscure to warrant inclusion.
When you make a page and it is disputed for deletion, it is not only annoying because your work is being deleted. It’s also incredibly intrusive and degrading to have someone discuss whether someone’s notable enough to be on Wikipedia – a website that has pages about almost every pop song, people who are extras in films no one has ever heard of and people who were in sports teams that never scored.
Below are just a few of the 1,600+ female scientists she’s introduced to a wider audience. While history abounds with nearly invisible names whose discoveries and contributions have been inadequately recognized, or all too frequently attributed to male colleagues, these women are all contemporary.
Nuclear chemist Clarice Phelps was part of the team that helped discover tennessine, the second-heaviest known element.
Mathematician Gladys Mae West was one of the developers of GPS.
Physical chemist June Lindsey played a key role in the discovery of the DNA double helix.
Oceanographer and climate scientist Kim Cobb uses corals and cave stalagmites to inform projections of future climate change.
Vaccinologist Sarah Gilbert led the team that developed the Oxford/AstraZeneca vaccine (and inspired a Barbie created in her image, though you can be assured that the Wikipedia entry Wade researched and wrote for her came first.)
Wade’s hope is that a higher representation of female scientists and scientists of color on a crowdsourced, easily-accessed platform like Wikipedia will deal a blow to ingrained gender bias, expanding public perception of who can participate in these sorts of careers and encouraging young girls to pursue these courses of study. As she told the New York Times:
I’ve always done a lot of work to try to get young people — particularly girls and children from lower socioeconomic backgrounds and people of color — to think about studying physics at high school, because physics is still very much that kind of elitist, white boy subject.
Our science can only benefit the whole of society if it’s done by the whole of society. And that’s not currently the case.
Unsurprisingly, Wade is often asked how to foster and support girls with an interest in science, beyond upping the number of role models available to them on Wikipedia.
The way forward, she told NBC, is not attention-getting “whiz bang” one-off events and assemblies, but rather paying skilled teachers as well as we pay bankers, so that they can mentor students on their course of study and help them apply for grants, fellowships and other opportunities. As students prepare to enter the workforce, clearly communicated sexual harassment policies and assistance with childcare and eldercare become crucial:
Ultimately, we don’t only need to increase the number of girls choosing science, we need to increase the proportion of women who stay in science.
Listen to Jess Wade talk about her Wikipedia project on NPR’s science program Short Wave here.
Never was there a more exhilarating time and place to be interested in atheism than the internet of ten or fifteen years ago. “People compiled endless lists of arguments and counterarguments for or against atheism,” remembers blogger Scott Alexander. One atheist newsgroup “created a Dewey-Decimal-system-esque index of almost a thousand creationist arguments” and “painstakingly debunked all of them.” In turn, their creationist arch-enemies “went through and debunked all of their debunkings.” Readers could enjoy a host of atheism-themed web comics and “the now-infamous r/atheism subreddit, which at the time was one of Reddit’s highest-ranked, beating topics like ‘news,’ ‘humor,’ and — somehow — ‘sex.’ At the time, this seemed perfectly normal.”
This was the culture in which Richard Dawkins published The God Delusion, in 2006, and Christopher Hitchens published his God Is Not Great: How Religion Poisons Everything in 2007. “I’m not just doing what publishers like and coming up with a provocative subtitle,” Alexander quotes Hitchens as saying. “I mean to say it infects us in our most basic integrity. It says we can’t be moral without ‘Big Brother,’ without a totalitarian permission, means we can’t be good to one another without this, we must be afraid, we must also be forced to love someone whom we fear — the essence of sadomasochism, the essence of abjection, the essence of the master-slave relationship and that knows that death is coming and can’t wait to bring it on.”
Dawkins and Hitchens became known as two of the “Four Horsemen of the Non-Apocalypse,” a group of public intellectuals that also included Sam Harris and Daniel Dennett. The label stuck after all of them sat down for a two-hour conversation on video in the fall of 2007, during which each man laid out his critique of the religious worldview. Four years later, Dawkins and Hitchens sat down for another recorded conversation, this time one-on-one and with a much different tone. Having suffered from cancer for more than a year, Hitchens seemed not to be long for this world, and indeed, he would be dead in just two months. But his condition hardly stopped him from speaking with his usual incisiveness on topics of great interest, and especially his and Dawkins’ shared bête noire of fundamentalist religion.
Dawkins, a biologist, sees in the power granted to religion a threat to hard-won scientific knowledge about the nature of reality; Hitchens, a writer and thinker in the tradition of George Orwell, saw it as one of the many forms of totalitarianism that have ever threatened the intellectual and bodily freedom of humankind. In this, Hitchens’ final interview (which was printed in Hitchens’ Last Interview book and whose uncut audio recording became available only this year), Dawkins expresses some concern that he’s become a “bore” with his usual anti-religious defense of science. Nonsense, Hitchens says: an honest scientist risks being called a bore just as an honest journalist risks being called strident, but nevertheless, “you’ve got to bang on.”
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall, on Facebook, or on Instagram.
Perhaps the 143 colors showcased in The Bayer Company’s early 20th-century sample book, Shades on Feathers, could be collected in the field, but it would involve a lot of travel and patience, and the stalking of several endangered if not downright extinct avian species.
Far easier, and much less expensive, for milliners, designers and decorators to dye plain white feathers exotic shades, following the instructions in the sample book.
Such artificially obtained rainbows owe a lot to William Henry Perkin, a teenage student of German chemist August Wilhelm von Hofmann, who spent Easter vacation of 1856 experimenting with aniline, an organic base his teacher had earlier discovered in coal tar. Hoping to hit on a synthetic form of quinine, he accidentally hit on a solution that colored silk a lovely purple shade — an inadvertent eureka moment that ranks right up there with penicillin and the pretzel.
Perkin named the colour mauve and the dye mauveine. He decided to try to market his discovery instead of returning to college.
On 26 August 1856, the Patent Office granted Perkin a patent for ‘a new colouring matter for dyeing with a lilac or purple colour stuffs of silk, cotton, wool, or other materials’.
Perkin’s next step was to interest cloth dyers and printers in his discovery. He had no experience of the textile trade and little knowledge of large-scale chemical manufacture. He corresponded with Robert and John Pullar in Glasgow, who offered him support. Perkin’s luck changed towards the end of 1857 when the Empress Eugénie, wife of Napoleon III, decided that mauve was the colour to wear. In January 1858, Queen Victoria followed suit, wearing mauve to her daughter’s wedding.
(The sample book recommends cleaning the feathers prior to dyeing in a lukewarm solution of small amounts of olive oil soap and ammonia.)
The Science History Institute, owner of this unusual object, estimates that the undated book was produced between 1913 and 1918, the year the Migratory Bird Treaty Act outlawed the hunting of birds whose feathers humans deemed particularly fashionable.
Peruse the Science History Institute of Philadelphia’s digitized copy of the Shades on Feathers sample book here.
We should probably not look to science to have cherished beliefs confirmed. As scientific understanding of the world has progressed over the centuries, it has brought on a loss of humans’ status as privileged beings at the center of the universe whose task is to subdue and conquer nature. (The stubborn persistence of those attitudes among the powerful has not served the species well.) We are not special, but we are still responsible, we have learned — maybe totally responsible for our lives on this planet. The methods of science do not lend themselves to soothing existential anxiety.
But what about the most cherished, and likely ancient, of human beliefs: faith in an afterlife? Ideas of an underworld, or heaven, or hell have animated human culture since its earliest origins. There is no society in the world where we will not find some belief in an afterlife existing comfortably alongside life’s most mundane events. Is it a harmful idea? Is there any real evidence to support it? And which version of an afterlife — if such a thing existed — should we believe?
Such questions stack up. Answers in forms science can reconcile seem vanishingly few. Nonetheless, as we see in the Big Think video above, scientists, science communicators, and science enthusiasts are willing to discuss the possibility, or impossibility, of continuing after death. We begin with NASA astronomer Michelle Thaller, who references Einstein’s theory of the universe as fully complete, “so every point in the past and every point in the future are just as real as the point of time you feel yourself in right now.” Time spreads out in a landscape, each moment already mapped and surveyed.
When a close friend died, Einstein wrote a letter to his friend’s wife explaining, “Your husband, my friend, is just over the next hill. He’s still there” — in a theoretical sense. It may not have been the comfort she was looking for. The hope of an afterlife is that we’ll see our loved ones again, something Einstein’s solution does not allow. Sam Harris — who has leaned into the mystical practice of meditation while pulling it from its religious context — admits that death is a “dark mystery.” When people die, “there’s just the sheer not knowing what happened to them. And into this void, religion comes rushing with a very consoling story, saying nothing happened to them; they’re in a better place and you’re going to meet up with them after.”
The story isn’t always so consoling, depending on how punitive the religion, but it does offer an explanation and sense of certainty in the face of “sheer not knowing.” The human mind does not tolerate uncertainty particularly well. Death feels like the greatest unknown of all. (Harris’ argument parallels that of anthropologist Pascal Boyer on the origin of all religions.) But the phenomenon of death is not unknown to us. We are surrounded by it daily, from the plants and animals we consume to the pets we sadly let go when their lifespans end. Do we keep ourselves up wondering what happened to these beings? Maybe our spiritual or religious beliefs aren’t always about death.…
“In the Old Testament there isn’t really any sort of view of the afterlife,” says Rob Bell, a spiritual teacher (and the only talking head here not aligned with a scientific institution or rationalist movement). “This idea that the whole thing is about when you die is not really the way that lots of people have thought about it.” For many religious practitioners, the idea of eternal life means “living in harmony with the divine right now.” For many, this “right now” — this very moment and each one we experience after it — is eternal. See more views of the afterlife above from science educators like Bill Nye and scientists like Michio Kaku, who says the kind of afterlives we’ve only seen in science fiction — “digital and genetic immortality” — “are within reach.”
Are you feeling confident about the future? No? We understand. Would you like to know what it was like to feel a deep certainty that the decades to come were going to be filled with wonder and the fantastic? Well then, gaze upon this clip from the BBC Archive YouTube channel of sci-fi author Arthur C. Clarke predicting the future in 1964.
Although we best know him for writing 2001: A Space Odyssey, the 1964 television viewing public would have known him for his futurism and his talent for calmly explaining all the great things to come. In the late 1940s, he had already predicted telecommunication satellites. In 1962 he published his collected essays, Profiles of the Future, which contains many of the ideas in this clip.
Here he correctly predicts the ease with which we can be contacted wherever in the world we choose to be, and contact our friends “anywhere on earth even if we don’t know their location.” What Clarke doesn’t predict here is how “location” isn’t a thing when we’re on the internet. He imagines people working just as well from Tahiti or Bali as they do from London. Clarke sees this advancement as the downfall of the modern city, since we would no longer need to commute into the city to work. Now, as so many of us are doing our jobs from home post-COVID, we’ve also discovered the dystopia in that fantasy. (It certainly hasn’t dropped the cost of rent.)
Next, he predicts advances in biotechnology that would allow us to, say, train monkeys to work as servants and workers. (Until, he jokes, they form a union and “we’d be back right where we started.”) Perhaps, he says, humans have stopped evolving—what comes next is artificial intelligence (although that phrase had yet to be used) and machine evolution, where we’d be honored to be the “stepping stone” towards that destiny. Make of that what you will. I know you might think it would be cool to have a monkey butler, but c’mon, think of the ethics, not to mention the cost of bananas.
Pointing out where Clarke gets it wrong is too easy—nobody gets it right all of the time. However, it is fascinating that some things that have never come to pass—being able to learn a language overnight, or erasing your memories—have managed to resurface over the years as fiction, in films like Eternal Sunshine of the Spotless Mind. His ideas of cryogenic suspension are staples of numerous hard sci-fi films.
And we are still waiting for the “Replicator” machine, which would make exact duplicates of objects (and by so doing cause a collapse into “gluttonous barbarism,” because we’d want unlimited amounts of everything). Some commenters call this a precursor to 3‑D printing. I’d say otherwise, but something very close to it might be around the corner. Who knows? Clarke himself concedes that all such conjecture is doomed to fail.
“That is why the future is so endlessly fascinating. Try as we can, we’ll never outguess it.”
Ted Mills is a freelance writer on the arts who currently hosts the Notes from the Shed podcast and is the producer of KCRW’s Curious Coast. You can also follow him on Twitter at @tedmills, and/or watch his films here.