The Evolution of Chains

“Progress in society can be thought of as the evolution of chains….”

 Phil Auerswald, The Code Economy

“I would prefer not to.”

Herman Melville, Bartleby the Scrivener

The 21st century has proven to be all too much like living in a William Gibson novel. It may be sad, but at least it’s interesting. What gives the times this cyberpunk feel isn’t so much the ambiance (instead of the dark cool of noir we’ve got frog memes and LOLcats) but rather that absolutely every major happening, and all of our interaction with the wider world, has something to do with the issue of computation, of code.

Software has not only eaten the economy, as Marc Andreessen predicted back in 2011, but also culture, politics, and war. To understand both the present and the future, then, it is essential to get a handle on what exactly this code that now dominates our lives is, and to see it in its broader historical perspective.

Phil Auerswald managed to do something of the sort with his book The Code Economy: A Forty-Thousand-Year History. It’s a book that tries to take us to the very beginning, to chart the long march of code to sovereignty over human life. The book defines code (this will prove part of its shortcomings) broadly as a “recipe”: a standardized way of achieving some end. Looked at this way, human beings have been a species dominated by code since our beginnings, that is, since the emergence of language, but there have been clear leaps closer to the reign of code along the way, with Auerswald seeing the invention of writing as the first. Written language, it seems, was the invention of bureaucrats, a way for the temple scribes of ancient Sumer to keep track of tributes and debts. And though it soon broke out of these chains and proved a tool as much for building worlds and wonders like the Epic of Gilgamesh as a radical new means of power and control, code has remained linked to the domination of one human group over another ever since.

Auerswald is mostly silent on the mathematical history of code before the modern age. I wish he had spent time discussing predecessors of the computer such as the abacus, the Antikythera mechanism, clocks, and especially the astrolabe. Where exactly did this idea of mechanizing mathematics actually emerge, and what are the continuities and discontinuities between different forms of such mechanization?

Instead, Gottfried Leibniz, who envisioned the binary arithmetic that underlies all of our modern-day computers, is presented as if he had just fallen out of the Matrix. Though Leibniz’s genius does seem almost inexplicable and sui generis: a philosopher who arguably beat Isaac Newton to the invention of calculus, he was also an inventor of ingenious calculating machines and an almost real-life version of Nostradamus. In a letter to Christiaan Huygens in 1670, Leibniz predicted that with machines using his new binary arithmetic: “The human race will have a new kind of instrument which will increase the power of the mind much more than optical lenses strengthen the eyes.” He was right, though it would be nearly 300 years until these instruments were up and running.

Something I found particularly fascinating is that Leibniz found a premonition of his binary number system in the Chinese system of divination, the I Ching, which had been brought to his attention by Father Bouvet, a Jesuit priest then living in China. It seems that from the beginning the idea of computers, and the code underlying them, has been wrapped up with the human desire to know the future in advance.
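Leibniz’s observation is easy to make concrete: a hexagram is six stacked lines, each either broken (yin) or unbroken (yang), so reading the lines as bits names one of exactly 64 numbers. A minimal sketch of the correspondence, where the bit convention and line ordering are illustrative assumptions rather than Leibniz’s own notation:

```python
# Read an I Ching hexagram as a 6-bit binary number, in the spirit of
# Leibniz's observation. Convention (an assumption for illustration):
# 1 = unbroken yang line, 0 = broken yin line, bottom line given first.

def hexagram_to_int(lines):
    """lines: sequence of six 0/1 values, bottom line first."""
    value = 0
    for bit in reversed(lines):  # top line becomes the most significant bit
        value = value * 2 + bit
    return value

# Six unbroken lines (the hexagram "Qian", the Creative) read as 111111.
print(hexagram_to_int([1, 1, 1, 1, 1, 1]))  # 63
print(hexagram_to_int([0, 0, 0, 0, 0, 0]))  # 0
print(2 ** 6)  # 64 distinct hexagrams, exactly the I Ching's count
```

Nothing here depends on the divinatory meanings, of course; the point is simply that a six-place yin/yang pattern and a six-digit binary numeral carry the same information.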

Leibniz believed that the widespread adoption of binary would make calculation more efficient and thus mechanical calculation easier, but it wouldn’t be until the industrial revolution that we actually had the engineering to make his dreams a reality. Two developments especially aided the emergence of what we might call proto-computers in the 19th century: the first was the division of labor identified by Adam Smith, the second was Jacquard’s loom. Much like the first lurch toward code that came with the invention of writing, this move was seen as a tool that would extend the reach and powers of bureaucracy.

Smith’s idea that efficiency was to be gained by breaking down complex tasks into simple, easily repeatable steps inspired the application of the same methods to computation itself. Faced with the prospect of not having enough trained mathematicians to create the extensive logarithmic and trigonometric tables upon which modern states and commerce were coming to depend, innovators such as Gaspard de Prony in France hit upon the idea of breaking complex computation down into simple tasks. The human “computer” was born.

It was the mechanical loom of Joseph Marie Jacquard that in the early 1800s proved that humans themselves weren’t needed once the creation of a complex pattern had been reduced to routine tasks. It was merely a matter of drawing the two together, rote human computation and machine-enabled pattern making, to reveal the route to Leibniz’s new instrument for thought. And it was Charles Babbage, along with his young assistant Ada Lovelace, who seemed to grasp the implications beyond data crunching of building machines that could “think,” who would do precisely this.

Babbage’s Difference and Analytical Engines, however, remained purely things of the mind. Early industrial-age engineering had yet to catch up with Leibniz’s daydreams. Society’s increasing need for data instead came to be served by a growing army of clerks along with simple adding machines and the like.
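The principle the Difference Engine was designed around can still be sketched in a few lines. Because the n-th differences of a degree-n polynomial are constant, a whole table of values can be produced using nothing but addition, the one operation that de Prony’s least-trained human computers, and later Babbage’s gears, could reliably perform. A minimal sketch of the method (the function name and interface are my own, not Babbage’s):

```python
# Tabulate a polynomial by the method of finite differences, the scheme
# Babbage's Difference Engine was meant to mechanize: once the initial
# differences are set, every further value requires only addition.

def difference_table(initial_diffs, steps):
    """initial_diffs: [f(0), delta f(0), delta^2 f(0), ...]; returns f(0..steps-1)."""
    diffs = list(initial_diffs)
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        # Each difference absorbs the one below it -- pure addition.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# Example: f(x) = x^2 has f(0) = 0, first difference 1, second difference 2.
print(difference_table([0, 1, 2], 6))  # [0, 1, 4, 9, 16, 25]
```

The division of labor is visible in the code itself: setting the initial differences takes a mathematician, but turning the crank thereafter takes none.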

In an echo of the Luddites who rebelled against the fact that their craft was being supplanted by the demands of the machine, at least some of these clerks must have found knowledge reduced to information processing dehumanizing. Melville’s Bartleby the Scrivener gives us a portrait of a man who has become like a computer stuck in a glitchy loop that eventually leads to his being scrapped.

As an interesting aside, Auerswald doesn’t really explore the philosophical assumptions behind the drive to make the world computable. Melville’s Bartleby, for example, might have served as a jumping-off point for a discussion of how both the computer and the view of human beings as a sort of automata meant to perform a specific task emerged out of a thoroughly deterministic worldview. This view, after all, was the main target of Melville’s short story, where a living person has come to resemble a sort of flawed windup toy, and which makes direct references to religious and philosophical tracts arguing in favor of determinism; namely, the firebrand preacher Jonathan Edwards’s treatise Freedom of the Will and the chemist and utopian Joseph Priestley’s book The Doctrine of Philosophical Necessity.

It is the effect of the development of machine-based code on these masses of 19th and early 20th century white-collar workers, rather than the more common ground of automation’s effect on the working class, that most interests Auerswald. And he has his reasons, given that the current surge of AI seems to most threaten low-level clerical workers rather than blue-collar workers, whose jobs have already been automated or outsourced, and given that the menial occupations which have replaced jobs in manufacturing require too much dexterity for even the most advanced robots.

As Auerswald points out, fear of white-collar unemployment driven by the automation of cognitive tasks is at least as old as the invention of the Burroughs calculating machine in the 1880s. Yet rather than leading to a mass of unemployed clerks, the adoption of adding machines, typewriters, and dictaphones only increased the number of people needed to manage and process information. That is, the deeper penetration of code into society, and the desire for society to conform to the demands of code, placed even higher demands on both humans and machines. Perhaps that historical analogy will hold for us as well. Auerswald doesn’t seem to think that this is a problem.

By the start of the 20th century we still weren’t sure how to automate computation, but we were getting close. In an updated version of de Prony’s scheme, the meteorologist Lewis Fry Richardson showed how we could predict the weather armed with a factory of human computers. Echoes of both can be seen in the low-paid laborers working for platforms such as Amazon’s Mechanical Turk, who still provide the computation behind the magic act that is much of contemporary AI.

Yet it was World War II and the Cold War that followed that would finally bootstrap Leibniz’s dream of the computer into reality. The names behind this computational revolution are now legendary: Vannevar Bush, Norbert Wiener, Claude Shannon, John von Neumann, and especially Alan Turing.

It was these figures who laid the cornerstone for what George Dyson has called “Turing’s Cathedral,” for, like the great medieval cathedrals, the computational megastructure in which all of us are now embedded was the product of generations of people dedicated to giving substance to the vision of its founders. Unlike the builders of Notre Dame or Chartres, who labored ad majorem Dei gloriam, those who constructed Turing’s Cathedral were driven by ideas regarding the nature of thought and the power and necessity of computation.

Surprisingly, this model of computation created in the early 20th century by Turing and his fellow travelers continues to underlie almost all of our digital technology today. It’s a model that may be reaching its limits, and that needs to be replaced with a more environmentally sustainable form of computation, one actually capable of helping us navigate and find the real structure of information in a world whose complexity we are finding so deep as to be intractable, and in which our own efforts at mastery only result in a yet harder maze. It’s a story Auerswald doesn’t tell, and one that must wait for another time.

Rather than explore what computation itself means, or what its future might be, Auerswald throws wide open the definition of what constitutes a code. As Erwin Schrödinger observed in his prescient essay What Is Life?, and as we later learned with the discovery of DNA, a code does indeed stand at the root of every living thing. Yet in pushing the definition of code ever farther from its origins in the exchange of information and detailed instructions on how to perform a task, something is surely lost. Thus Auerswald sees not only computer software and DNA as forms of code, but also the work of the late celebrity chef Julia Child, the McDonald’s franchise, Walmart, and even the economy itself.

As I suggested at the beginning of this essay, there might be something to be gained by viewing the world through the lens of code. After all, code of one form or another is now the way in which most of the largely private, and remaining government, bureaucracies that run our lives function. The sixties radicals, terrified that computers would be the ultimate tool of bureaucrats, guessed better than most where we were headed. “Do not fold, spindle, or mutilate.” Right on. Code has also become the primary tool through which the system is hacked, forced to buckle and glitch in ways that expose its underlying artificiality, freeing us for a moment from its claustrophobic embrace. Unfortunately, most of its attackers aren’t rebels fighting to free us, but barbarians at the gates.

Here is where Auerswald really lets us down. Seeing progress as tied to our loss of autonomy and the development of constraints, he seems to think we should just accept our fetters, now made out of 1s and 0s. Yet within his broad definition of what constitutes code, it would appear that at least the creation of laws remains in our power. At least in democracies, the law is something that the people collectively make. Indeed, if any social phenomenon can be said to be code-like, it would be law. It is surprising, then, that Auerswald gives it no mention in his book. Meaning it’s not really surprising at all.

You see, there’s an implicit political agenda underneath Auerswald’s whole understanding of code, or at least a whole set of political and economic assumptions. These are assumptions regarding the naturalness of capitalism, the beneficial nature of behemoths built on code such as Walmart, the legitimacy of global standard-setting institutions and protocols, and the dominance of platforms over the other “trophic layers” of the information economy that provide the actual technological infrastructure on which those platforms run. Perhaps not even realizing that these are assumptions rather than natural facts, Auerswald never develops or defends them. Certainly, code has its own political economy, but to see it we will need to look elsewhere. Next time.