In her view, color has the power to close the gap between the subjects of musty public domain photos and their modern viewers. The most fulfilling moment for this artist, aka Klimblim, comes when “suddenly the person looks back at you as if he’s alive.”
A before-and-after comparison of her digital makeover of Nadezhda Kolesnikova, one of many female Soviet snipers whose vintage likenesses she has colorized, bears this out. The color version could be a fashion spread in a current magazine, except there’s nothing artificial-seeming about this 1943 pose.
“The world was never monochrome even during the war,” Shirnina told the Daily Mail.
Military subjects pose a particular challenge:
When I colorize uniforms I have to search for info about the colours or ask experts. So I’m not free in choosing colors. When I colorize a dress on a 1890s photo, I look at what colors were fashionable at that time. When I have no limitations I play with colours looking for the best combination. It’s really quite arbitrary but a couple of years ago I translated a book about colours and hope that something from it is left in my head.
She also puts herself on a short leash where famous subjects are concerned. Eyewitness accounts of Vladimir Lenin’s eye color ensured that the revolutionary’s colorized irises would remain true to life.
And while there may be a market for representations of punked out Russian literary heroes, Shirnina plays it straight there too, eschewing the digital Manic Panic where Chekhov, Tolstoy, and Bulgakov are concerned.
Her hand with Photoshop CS6 may restore celebrity to those whose stars have faded with time, like Vera Komissarzhevskaya, the original ingenue in Chekhov’s much-performed play The Seagull, and wrestler Karl Pospischil, who showed off his physique sans culotte in a photo from 1912.
Even the unsung proletariat are given a chance to shine from the fields and factory floors.
We think of Johannes Gutenberg’s printing press (circa 1440) as having begun the era of the printed book, since his invention allowed for the mass production of books on a scale unheard of before. But we must date the invention of printing itself much earlier—nearly 600 years earlier—to the Chinese method of xylography, a form of woodblock printing. Also used in Japan and Korea, this elegant method allowed for the reproduction of hundreds of books from the 9th century to the time of Gutenberg, most of them Buddhist texts created by monks. In the 11th century, writes Elizabeth Palermo at Live Science, a Chinese peasant named Bi Sheng (Pi Sheng) “developed the world’s first movable type.” The technology may have also arisen independently in the 14th-century Yuan Dynasty and in Korea around the same time.
Despite these innovations, xylography remained the primary method of printing in Asia. The “daunting task” of casting the thousands of characters in Chinese, Japanese, and Korean “may have made woodblocks seem like a more efficient option for printing these languages.” This still-labor-intensive process produced books and illustrations for several centuries, a good many of them incredible works of art in their own right.
Published by Hu Zhengyan’s Ten Bamboo Studio in Nanjing, this manual for teachers contains 138 pages of multicolor prints by fifty different artists and calligraphers and 250 pages of accompanying text. “The method” that produced the stunning artifact “involves the use of multiple printing blocks which successively apply different coloured inks to the paper to reproduce the effect of watercolour painting.” Kept untouched in Cambridge’s “most secure vaults,” the book was unsealed for the first time just a couple of years ago. “What surprised us,” remarked Charles Aylmer, head of the Library’s Chinese Department, “was the amazing freshness of the images, as if they had never been looked at for over 300 years.”
The 17th century copy is “unique in being complete, in perfect condition and in its original binding.” (Another, incomplete, copy was acquired in 2014 by the Huntington Library in San Marino, CA.) The book contains many “detailed instructions on brush techniques,” writes CNN, “but its phenomenal beauty has meant from the outset that it has held a greater position” than other such manuals. Like another gorgeous multicolor painting textbook, the Manual of the Mustard Seed Garden, made in 1679, this text had a significant impact on the arts in both China and Japan, “where it inspired a whole new branch of printing.”
In 2014, Google acquired DeepMind, a company which soon made news when its artificial intelligence software defeated the world’s best player of the Chinese strategy game, Go. What’s DeepMind up to these days? More elemental things–like teaching itself to walk. Above, watch what happens when, on the fly, DeepMind’s AI learns to walk, run, jump, and climb. Sure, it all seems a little kooky–until you realize that if DeepMind’s AI can learn to walk in hours, it can take your job in a matter of years.
What’s director Michel Gondry up to these days? Apparently, trying to show that you can do smart things–like make serious movies–with that smartphone in your pocket. The director of Eternal Sunshine of the Spotless Mind and the Noam Chomsky animated documentary Is the Man Who Is Tall Happy? has just released “Détour,” a short film shot entirely on his iPhone 7 Plus. Subtitled in English, “Détour” runs about 12 minutes and follows “the adventures of a small tricycle as it sets off along French roads in search of its young owner.” Watch it, then ask yourself, was this really not made with a traditional camera? And then ask yourself, what’s my excuse for not getting out there and making movies?
If you would like to support the mission of Open Culture, consider making a donation to our site. It’s hard to rely 100% on ads, and your contributions will help us continue providing the best free cultural and educational materials to learners everywhere. You can contribute through PayPal, Patreon, and Venmo (@openculture). Thanks!
In popular conceptions, we take the computer to be the natural outcome of empirical science, an inheritance of the Enlightenment and subsequent scientific revolutions in the 19th and 20th centuries. Of course, modern computers have their ancient precursors, like the Antikythera Mechanism, a 2,200-year-old bronze and wood machine capable of predicting the positions of the planets, eclipses, and phases of the moon. But even this fascinating artifact fits into the narrative of computer science as “a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II.” Much less do we invoke the names of “philosopher-mathematicians,” writes Chris Dixon at The Atlantic, like George Boole and Gottlob Frege, “who were themselves inspired by Leibniz’s dream of a universal ‘concept language,’ and the ancient logical system of Aristotle.” But these thinkers are as essential, if not more so, to computer science, especially, Dixon argues, Aristotle.
The ancient Greek thinker did not invent a calculating machine, though such devices may have existed in his lifetime. Instead, as Dixon writes in his recent piece, “How Aristotle Created the Computer,” Aristotle laid the foundations of mathematical logic, “a field that would have more impact on the modern world than any other.”
The claim may strike historians of philosophy as somewhat ironic, given that Enlightenment philosophers like Francis Bacon and John Locke announced their modern projects by thoroughly repudiating the medieval scholastics, whom they alleged were guilty of a slavish devotion to Aristotle. Their criticisms of medieval thought were varied, and in many ways warranted; yet, like many an empiricist since, they often overlooked the critical importance of Aristotelian logic to scientific thought.
At the turn of the 20th century, almost three hundred years after Bacon sought to transcend Aristotle’s Organon with his form of natural philosophy, the formal logic of Aristotle could still be “considered a hopelessly abstract subject with no conceivable applications.” But Dixon traces the “evolution of computer science from mathematical logic” and Aristotelian thought, beginning in the 1930s with Claude Shannon, author of the groundbreaking essay “A Symbolic Analysis of Switching and Relay Circuits.” Shannon drew on the work of George Boole, whose name is now known to every computer scientist and engineer but who, in 1938, “was rarely read outside of philosophy departments.” And Boole owed his principal intellectual debt, as he acknowledged in his 1854 treatise The Laws of Thought, to Aristotle’s syllogistic reasoning.
Boole derived his operations by replacing the terms in a syllogism with variables, “and the logical words ‘all’ and ‘are’ with arithmetical operators.” Shannon discovered that “Boole’s system could be mapped directly onto electrical circuits,” which hitherto “had no systematic theory governing their design.” The insight “allowed computer scientists to import decades of work in logic and mathematics by Boole and subsequent logicians.” Shannon, Dixon writes, “was the first to distinguish between the logical and the physical layer of computers,” a distinction now “so fundamental to computer science that it might seem surprising to modern readers how insightful it was at the time.” And yet, the field could not move forward without it—without, that is, a return to ancient categories of thought.
Since the 1940s, computer programming has become significantly more sophisticated. One thing that hasn’t changed is that it still primarily consists of programmers specifying rules for computers to follow. In philosophical terms, we’d say that computer programming has followed in the tradition of deductive logic, the branch of logic discussed above, which deals with the manipulation of symbols according to formal rules.
Dixon’s argument for the centrality of Aristotle to modern computer science takes many turns—through the quasi-mystical thought of 13th-century Ramon Llull and, later, his admirer Gottfried Leibniz. Through Descartes, and later Frege and Bertrand Russell. Through Alan Turing’s work at Bletchley Park. Nowhere do we see Aristotle, wrapped in a toga, building a circuit board in his garage, but his modes of reasoning are everywhere in evidence as the scaffolding upon which all modern computer science has been built. Aristotle’s attempts to understand the laws of the human mind “helped create machines that could reason according to the rules of deductive logic.” The application of ancient philosophical principles may, Dixon concludes, “result in the creation of new minds—artificial minds—that might someday match or even exceed our own.” Read Dixon’s essay at The Atlantic, or hear it read in its entirety in the audio above.
If you want to see where art began, go to a cave. Not just any cave, but not just one cave either. You’ll find the best-known cave paintings at Lascaux, an area of southwestern France with a cave complex whose walls feature over 600 images of animals, humans, and symbols, all of them more than 17,000 years old, but other caves elsewhere in the world reveal other chapters of art’s early history. Some of those chapters have only just come into legibility, as in the case of the cave near the Ethiopian city of Dire Dawa recently determined to be the world’s oldest “art studio.”
“The Porc-Epic cave was discovered by Pierre Teilhard de Chardin and Henry de Monfreid in 1929 and thought to date to about 43,000 to 42,000 years ago, during the Middle Stone Age,” writes Sarah Cascone at Artnet.
There, archaeologists have found “a stash of 4213 pieces, or nearly 90 pounds, of ochre, the largest such collection ever discovered at a prehistoric site in East Africa.” The “ancient visitors to the site processed the iron-rich ochre stones there by flaking and grinding the raw materials to produce a fine-grained and bright red powder,” a substance useful for “symbolic activities, such as body painting, the production of patterns on different media, or for signalling.”
In other words, those who used this ochre-rich cave over its 4,500 years of service used it to produce their tools, which functioned like proto-stamps and crayons. You can read about these findings in much more detail in the paper “Patterns of change and continuity in ochre use during the late Middle Stone Age (MSA) of the Horn of Africa: The Porc-Epic Cave record” by Daniela Eugenia Rosso of the University of Barcelona and Francesco d’Errico and Alain Queffelec of the University of Bordeaux. In it, the authors “identify patterns of continuity in ochre acquisition, treatment and use reflecting both persistent use of the same geological resources and similar uses of iron-rich rocks by late MSA Porc-Epic inhabitants.”
The Ethiopian site contains so much ochre, in fact, that “this continuity can be interpreted as the expression of a cohesive cultural adaptation, largely shared by all community members and consistently transmitted through time.” The more evidence sites like the Porc-Epic cave provide, the greater the level of detail in which we’ll be able to piece together the story of not just art, but culture itself. Culture, as Brian Eno so neatly defined it, is everything you don’t have to do, and though drawing in ochre might well have proven useful for the prehistoric inhabitants of modern-day Ethiopia, one of them had to give it a try before it had any acknowledged purpose. Little could they have imagined what that action would lead to over the next few tens of thousands of years.
Asked to list their favorite films of all time, most directors tend toward the canon. And why not? 8 1/2–loved by Scorsese, Lynch, and many others–is an indisputable masterpiece, for example. So are The Godfather, Rashomon, Vertigo, and any number of movies that make top-film lists over and over. The point is, most of the time, these lists are samey.
That’s why this list from Wes Anderson is a hoot. Here he’s not asked to list his favorites of all time, but rather to create a Top 10 list of Criterion titles. Yet here’s his M.O.: “I thought my take on a top-ten list might be to simply quote myself from the brief fan letters I periodically write to the Criterion Collection team,” he says.
A lot of these films are rarities, and Anderson admits he’s only just seen some of them for the first time. Martin Ritt’s The Spy Who Came in from the Cold is one. Roberto Rossellini’s The Taking of Power by Louis XIV is another. Of the latter, he says, “This is a wonderful and very strange movie. I had never heard of it. The man who plays Louis cannot give a convincing line reading, even to the ears of someone who can’t speak French—and yet he is fascinating.”
Anderson’s comments are often questions, not definitive statements. Like us, he is often mystified by a film, and that sense of mystery is probably why he likes these movies in the first place.
Of that Rossellini film he wonders “What does good acting actually mean?” And of Claude Sautet’s Classe tous risques he asks, “Who is our Lino Ventura?” referring to the Italian-born French actor who was once described as “The French John Wayne.” (So, the real question is this: who is our modern day John Wayne?)
We’ll leave the rest for you to read, but for a director so invested in artifice and nostalgia, it was a surprise to hear how much he loves the surrealist Luis Buñuel:
“He is my hero. Mike Nichols said in the newspaper he thinks of Buñuel every day, which I believe I do, too, or at least every other.”
Ted Mills is a freelance writer on the arts who currently hosts the artist interview-based FunkZone Podcast and is the producer of KCRW’s Curious Coast. You can also follow him on Twitter at @tedmills, read his other arts writing at tedmills.com and/or watch his films here.
Whatever else we take from it, Franz Kafka’s nightmarish fable The Metamorphosis offers readers an especially anguished allegory of troubled sleep. Filled with references to sleep, dreams, and beds, the story begins when Gregor Samsa awakens to find himself (in David Wyllie’s translation) “transformed in his bed into a horrible vermin.” After several desperate attempts to roll off his back, Gregor begins to agonize, of all things, over his stressful working hours: “‘Getting up early all the time,’ he thought, ‘it makes you stupid. You’ve got to get enough sleep.’” Realizing that he has overslept and missed his five o’clock train, he agonizes anew over the frantic workday ahead, and we can hear in his thoughts the complaints of his author. “Sleep and lack thereof,” writes The Independent’s Christopher Hooton, “is of course a central theme in Kafka’s best known work…. It seems there was a strong dose of autobiography at play.”
Chronically insomniac, Kafka wrote at night, then rose early each morning for his hated job at an insurance office. Though he made good use of restlessness, Kafka characterized his insomnia as much more than an inconvenient physical ailment. He thought of it in metaphysical terms, as a kind of soul-sickness. “Sleep,” he wrote in his diaries, “is the most innocent creature there is and sleepless man the most guilty.”
Insomnia transformed Kafka into an unclean thing, quivering in fear of death. “Perhaps I am afraid that the soul, which in sleep leaves me, will not be able to return,” he confessed in a letter to German writer Milena Jesenská. Anxious expressions like this, writes Theresa Fisher, have led researchers to “speculate that Kafka’s pathological traits… indicate borderline personality disorder.” This posthumous diagnosis may be a leap too far. “Unearthing his insomnia, however,” and its effects on his life and work, “requires less speculation.”
Kafka’s descriptions of his anxious insomniac writing habits have led Italian doctor Antonio Perciaccante and his wife and co-author Alessia Coralli to argue in a recent paper published in The Lancet that the writer composed much of his fiction in a state of something like lucid dreaming. In one diary entry, Kafka writes, “it was the power of my dreams, shining forth into wakefulness even before I fall asleep, which did not let me sleep.” Perciaccante and Coralli note that “this seems to be a clear description of a hypnagogic hallucination, a vivid visual hallucination experienced just before the sleep onset.” It’s something we’ve all experienced. Kafka, fearing sleep, stayed there as long as he could. Lest we think of his writing as therapeutic in some way, he gives no indication that it was so. Indeed, it seems that writing introduced more pain: “When I don’t write,” he told Jesenská, “I am merely tired, sad, heavy; when I do write, I am torn by fear and anxiety.”
Kafka made many similar statements about sleep deprivation bringing him to “a depth almost inaccessible at normal conditions.” The visions he encountered, he wrote, “shape themselves into literature.” Through surveying the literature, biographies, interpretations, and the author’s diaries and letters to Jesenská and Felice Bauer, Perciaccante and Coralli pieced together a “psychophysiological” account of Kafka’s dream logic. As Perciaccante told ResearchGate in an interview, his study concerned itself less with the causes of Kafka’s sleeplessness. He admits “it’s difficult to classify Kafka’s insomnia.” Instead the authors concerned themselves with the effects of remaining in a hypnagogic state (a word, notes Drake Baer, that etymologically means “being abducted into sleep”), as well as Kafka’s awareness of his insomnia’s magical and debilitating power.
The Metamorphosis, says Perciaccante, in addition to being a work about social and familial alienation, “may also represent a metaphor for the negative effects that poor quality sleep, short sleep duration, and insomnia may have on mental and physical health.” Had Kafka overcome his malady, he may never have written his best-known work. Indeed, he may not have written at all. “Perhaps there are other forms of writing,” he told Max Brod in 1922, “but I know only this kind, when fear keeps me from sleeping, I know only this kind.” Perciaccante and Coralli see Kafka’s insomniac torment as a primary theme in his work, but two dissenting voices, writer Saudamini Deo and forensic doctor and anthropologist Philippe Charlier, disagree. Writing in The Lancet to express their view, they assert that despite Kafka’s persistent laments and the squirmy fate of the autobiographical Gregor Samsa, the writer’s “insomnia was not at all dehumanizing… but the exact opposite—ie, humanizing the self by bringing to surface elements of unconscious that guide most actions of our waking life.”