In popular conceptions, we take the computer to be the natural outcome of empirical science, an inheritance of the Enlightenment and subsequent scientific revolutions in the 19th and 20th centuries. Of course, modern computers have their ancient precursors, like the Antikythera Mechanism, a 2,200-year-old bronze and wood machine capable of predicting the positions of the planets, eclipses, and phases of the moon. But even this fascinating artifact fits into the narrative of computer science as “a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II.” Much less do we invoke the names of “philosopher-mathematicians,” writes Chris Dixon at The Atlantic, like George Boole and Gottlob Frege, “who were themselves inspired by Leibniz’s dream of a universal ‘concept language,’ and the ancient logical system of Aristotle.” But these thinkers are at least as essential to computer science as the machines, and none more so, Dixon argues, than Aristotle.
The ancient Greek thinker did not invent a calculating machine, though such devices may have existed in his lifetime. Instead, as Dixon writes in his recent piece, “How Aristotle Created the Computer,” Aristotle laid the foundations of mathematical logic, “a field that would have more impact on the modern world than any other.”
The claim may strike historians of philosophy as somewhat ironic, given that Enlightenment philosophers like Francis Bacon and John Locke announced their modern projects by thoroughly repudiating the medieval scholastics, who, they alleged, were guilty of a slavish devotion to Aristotle. Their criticisms of medieval thought were varied and, in many ways, well warranted, and yet, like many an empiricist since, they often overlooked the critical importance of Aristotelian logic to scientific thought.
At the turn of the 20th century, almost three hundred years after Bacon sought to transcend Aristotle’s Organon with his form of natural philosophy, the formal logic of Aristotle could still be “considered a hopelessly abstract subject with no conceivable applications.” But Dixon traces the “evolution of computer science from mathematical logic” and Aristotelian thought, beginning in the 1930s with Claude Shannon, author of the groundbreaking essay “A Symbolic Analysis of Relay and Switching Circuits.” Shannon drew on the work of George Boole, whose name is now known to every computer scientist and engineer but who, in 1938, “was rarely read outside of philosophy departments.” And Boole owed his principal intellectual debt, as he acknowledged in his 1854 book The Laws of Thought, to Aristotle’s syllogistic reasoning.
Boole derived his operations by replacing the terms in a syllogism with variables, “and the logical words ‘all’ and ‘are’ with arithmetical operators.” Shannon discovered that “Boole’s system could be mapped directly onto electrical circuits,” which hitherto “had no systematic theory governing their design.” The insight “allowed computer scientists to import decades of work in logic and mathematics by Boole and subsequent logicians.” Shannon, Dixon writes, “was the first to distinguish between the logical and the physical layer of computers,” a distinction now “so fundamental to computer science that it might seem surprising to modern readers how insightful it was at the time.” And yet, the field could not move forward without it—without, that is, a return to ancient categories of thought.
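To make those two mappings concrete, here is a minimal sketch in Python (an illustration of the idea, not code from Dixon’s essay or Shannon’s paper): Boole’s encoding of a syllogistic premise as an equation over the values 0 and 1, and Shannon’s observation that switches wired in series and in parallel compute AND and OR.

```python
from itertools import product

# Boole: "All men are mortal" becomes the constraint x * (1 - y) == 0,
# where x = 1 means "is a man" and y = 1 means "is mortal".
def all_x_are_y(x: int, y: int) -> bool:
    return x * (1 - y) == 0

# Shannon: switches in series behave like AND (multiplication);
# switches in parallel behave like OR (addition, capped at 1).
def series(a: int, b: int) -> int:
    return a * b

def parallel(a: int, b: int) -> int:
    return min(a + b, 1)

# The series and parallel circuits reproduce the truth tables of AND and OR.
for a, b in product((0, 1), repeat=2):
    assert series(a, b) == int(a and b)
    assert parallel(a, b) == int(a or b)

# The syllogism, run mechanically: given "all men are mortal" and x = 1
# ("Socrates is a man"), the only value of y consistent with the premise is 1.
print([y for y in (0, 1) if all_x_are_y(1, y)])  # prints [1]
```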
Since the 1940s, computer programming has become significantly more sophisticated. One thing that hasn’t changed is that it still primarily consists of programmers specifying rules for computers to follow. In philosophical terms, we’d say that computer programming has followed in the tradition of deductive logic, the branch of logic discussed above, which deals with the manipulation of symbols according to formal rules.
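A toy example makes the point about rule-following plain. The sketch below (an illustration with invented fact and rule names, not drawn from the essay) treats deduction as pure symbol manipulation: the program knows nothing about Socrates, it simply applies “if the premises are present, add the conclusion” until nothing new can be derived.

```python
# Deduction as mechanical rule-following: rules are (premises, conclusion) pairs,
# and the loop applies them until no new facts appear (repeated modus ponens).
facts = {"Socrates is a man"}
rules = [
    ({"Socrates is a man"}, "Socrates is mortal"),          # all men are mortal
    ({"Socrates is mortal"}, "Socrates will die someday"),
]

changed = True
while changed:
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(sorted(facts))
# ['Socrates is a man', 'Socrates is mortal', 'Socrates will die someday']
```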
Dixon’s argument for the centrality of Aristotle to modern computer science takes many turns—through the quasi-mystical thought of the 13th-century philosopher Ramon Llull and, later, his admirer Gottfried Leibniz. Through Descartes, and later Frege and Bertrand Russell. Through Alan Turing’s work at Bletchley Park. Nowhere do we see Aristotle, wrapped in a toga, building a circuit board in his garage, but his modes of reasoning are everywhere in evidence as the scaffolding upon which all modern computer science has been built. Aristotle’s attempts to understand the laws of the human mind “helped create machines that could reason according to the rules of deductive logic.” The application of ancient philosophical principles may, Dixon concludes, “result in the creation of new minds—artificial minds—that might someday match or even exceed our own.” Read Dixon’s essay at The Atlantic, or hear it read in its entirety in the audio above.
Related Content:
Free Online Computer Science Courses
The Books on Young Alan Turing’s Reading List: From Lewis Carroll to Modern Chromatics
How Arabic Translators Helped Preserve Greek Philosophy … and the Classical Tradition
Josh Jones is a writer and musician based in Durham, NC. Follow him at @jdmagness
Robert Recorde deserves some credit for using symbols to make math more visible. He popularized the equals sign.
His Whetstone of Witte was written in English and contributed to that language, enabling its speakers to jump ahead in tech.
In computer programming, the LET keyword has been replaced by the equals sign.
Finally, in Genesis, all of creation begins with a Let.