“The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age.”
H.P. Lovecraft, The Call of Cthulhu
“All stable processes we shall predict. All unstable processes we shall control.”
John von Neumann
For at least as long as there has been a form of written language to record such a thing, human beings have lusted after divination. The classical Greeks had their trippy Oracle at Delphi, while the Romans scanned entrails for hidden patterns, or more beautifully, sought out the shape of the future in the murmurations of birds. All ancient cultures, it seems, looked for the signs of fate in the movement of the heavens. The ancient Chinese script may have even originated in a praxis of prophecy, a search for meaning in the branching patterns of “oracle bones” and tortoise shells, signaling that perhaps written language itself originated not with accountants but with prophets seeking to overcome the temporal confines of the second law, in whose measure we are forever condemned.
The promise of computation was that this power of divination was at our fingertips at last. Computers would allow us to outrun time, and thus in seeing the future we’d finally be able to change it or deftly avoid its blows- the goal of all seers in the first place.
Indeed, the binary language underlying computation sprang from the fecund imagination of Gottfried Leibniz, who got the idea after he encountered the most famous form of Chinese divination, the I-Ching. The desire to create computing machines emerged with the Newtonian worldview and instantiated its premise; namely, that the world could be fully described in terms of equations whose outcome was preordained. What computers promised was the ability to calculate these equations, offering us a power born of asymmetric information- a kind of leverage via time travel.
Perhaps we should have known that time would not be so easily subdued. Outlining exactly how our recent efforts to know and therefore control the future have failed is the point of James Bridle’s wonderful book New Dark Age: Technology and the End of the Future.
With the kind of clarity demanded of an effective manifesto, Bridle neatly breaks his book up into ten “C’s”: Chasm, Computation, Climate, Calculation, Complexity, Cognition, Complicity, Conspiracy, Concurrency, and Cloud.
Chasm defines what Bridle believes to be our problem. Our language is no longer up to the task of navigating the world wrought by the complex systems upon which we depend. This failure of language, which amounts to a failure of thought, Bridle traces to the origins of computation itself.
To vastly oversimplify his argument, the problem with computation is that the detailed models it provides too often tempt us into confusing our map with the territory. Sometimes this leads us to mistrust our own judgement and defer to the “intelligence” of machines- a situation that in the most tragic of circumstances has resulted in what those in the National Parks Service call “death by GPS”. In other cases our confusion of the model with reality results in the surrender of power to the minority of individuals capable of creating such models, and to the corporations which own and run them.
Computation was invented under the aforementioned premise born with the success of calculus, that everything, including history itself, could be contained in an equation. It was also seen as a solution to the growing complexity of society. Echoing Stanislaw Lem, one of the founders of modern computing, Vannevar Bush, with his “memex”, foresaw something like the internet in the 1940s: the mechanization of knowledge as the only possible lifeboat in the deluge of information modernity had brought.
One of the first projected purposes of computers was not just to predict, but to actually control, the weather. And while we’ve certainly gotten better at the former, the best we’ve gotten from the latter is Kurt Vonnegut’s humorous takedown of the premise in his novel Cat’s Cradle, which was actually based on his chemist brother’s work on weather control for General Electric. It is somewhat ironic, then, that the very fossil fuel based civilization upon which our computational infrastructure depends is not only making the weather less “controllable” and predictable, but is undermining the climatic stability of the Holocene, which facilitated the rise of a species capable of imagining and building something as sophisticated as computers in the first place.
Our new dark age is not just a product of our misplaced faith in computation, but of the growing unpredictability of the world itself, a reality whose existential importance is most apparent in the case of climate change. Our rapidly morphing climate threatens the very communications infrastructure that allows us to see and respond to its challenges. Essential servers and power sources will likely be drowned under the rising seas, cooling-dependent processors taxed by increasing temperatures. Most disturbingly, rising CO2 levels are likely to make human beings dumber. As Bridle writes:
“At 1,000 ppm, human cognitive ability drops by 21 per cent. At higher atmospheric concentrations, CO2 stops us from thinking clearly. Outdoor CO2 already reaches 500 ppm”
An unstable climate undermines the bedrock of predictable natural cycles from which civilization itself emerged, that is, those of agriculture. In a way our very success at controlling nature, by making it predictable, is destabilizing the regularity of nature that made its predictability possible in the first place.
It is here that computation reveals its double edged nature, for while computation is the essential tool we need to see and respond to the “hyperobject” that is climate change, it is also one of the sources of this growing natural instability itself. Much of the energy of modern computation is directly traceable to fossil fuels, a fact the demon coal lobby has eagerly pointed out.
What the explosion of computation has allowed, of course, is an exponential expansion of the power and range of calculation. While one can quibble over whether or not the driving force behind the fact that everything is now software, that is Moore’s Law, has finally proved Ray Kurzweil and his ilk wrong by bending towards its asymptote, the fact is that nothing else in our era has followed the semiconductor’s exponential curve. Indeed, as Bridle shows, precisely the opposite.
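The semiconductor curve in question is easy to sketch as a toy model. The anchor figures below (the Intel 4004’s roughly 2,300 transistors in 1971, and a two-year doubling period) are the conventional approximations, not figures taken from Bridle’s text:

```python
# Moore's Law as an idealized toy model: transistor counts per chip
# doubling every ~2 years. The 1971 anchor (Intel 4004, ~2,300
# transistors) and the doubling period are conventional approximations.
def moore(year, base_year=1971, base_count=2300, doubling_years=2.0):
    """Projected transistor count per chip under idealized Moore's Law."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{moore(year):,.0f} transistors")
```

Run forward fifty years, the toy model lands in the tens of billions of transistors, which is roughly where flagship chips actually sit- the point being that no other industry’s output has compounded anything like this.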
For all their astounding benefits, machine learning and big data have not, as Chris Anderson predicted, resulted in the “End of Theory”. Science still needs theory, experiment, and dare I say, humans to make progress, and what is clear is that in many areas outside ICT itself, progress has not merely slowed but stalled.
Over the past sixty years, rather than experiencing Moore’s Law type increases, the pharmaceutical industry has suffered the opposite: the so-called Eroom’s Law, where “The number of new drugs approved per billion US dollars spent on research and development has halved every nine years since 1950.”
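Eroom’s Law is simply exponential decay with a nine-year half-life. In the sketch below, only the halving rate comes from the claim quoted above; the 1950 baseline of 30 approvals per billion dollars is a made-up illustrative figure:

```python
# Eroom's Law: drug approvals per (inflation-adjusted) billion dollars of
# R&D halve every nine years. The halving rate is from the quoted claim;
# the 1950 baseline of 30.0 is a purely illustrative, invented figure.
def eroom(year, base_year=1950, base_rate=30.0, halving_years=9.0):
    """Approvals per $1B of R&D under the nine-year halving rule."""
    return base_rate * 0.5 ** ((year - base_year) / halving_years)

for year in (1950, 1968, 1986, 2004):
    print(f"{year}: {eroom(year):.2f} drugs per $1B")
```

Whatever the starting value, six halvings over the period 1950-2004 leaves under 2 per cent of the original productivity- Moore’s Law run in reverse.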
Part of this stems from the fact that the low-hanging fruit of discovery, not just in pharmaceuticals but elsewhere, has already been picked, along with the fact that the problems we’re dealing with are becoming exponentially harder to solve. Yet some portion of the slowdown in research progress is surely a consequence of technology itself, or at least the ways in which computers are being relied upon and deployed. Ease of sharing, when combined with status hunting, inevitably leads to widespread gaming. Scientists are little different from the rest of us, seeking ways to top Google’s PageRank, YouTube recommendations, Instagram and Twitter feeds, or the sorting algorithms of Amazon, though for scientists the summit of recognition consists of prestigious journals, where publication can make or break a career.
Data being easy to come by, while experimentation and theory remain difficult, has meant that “proof” is often conflated with what turn out to be spurious p-values, or “p-hacking”. The age of big data has also been the age of science’s “replication crisis”, where seemingly ever more findings disappear upon scrutiny.
What all this calculation has resulted in is an almost suffocating level of complexity, which is the source of much of our inegalitarian turn. Connectivity and transparency were supposed to level the playing field; instead, in areas such as financial markets, where the sheer amount of information to be processed has ended up barring new entrants, calculation has provided the ultimate asymmetric advantage to those with the highest capacity to identify and respond within nanoseconds to changing market conditions.
Asymmetries of information lie behind both our largest corporate successes and the rising inequality that they have brought in their wake. Companies such as WalMart and Amazon are in essence logistics operations built on the unique ability of these entities to see and respond first or most vigorously to consumer needs. As Bridle points out, this rule of logistics has resulted in a bizarre scrambling of politics, the irony that:
“The complaint of the Right against communism – that we’d all have to buy our goods from a single state supplier – has been supplanted by the necessity of buying everything from Amazon.”
Yet unlike the corporate giants of old such as General Motors, our 21st century behemoths don’t actually employ all that many people, and in their lust after automation seem determined to employ even fewer. The workplace, and the larger society, these companies are building seem even more designed around the logic of machines than factories in the heyday of heavy industry. The ‘chaotic storage’ deployed in Amazon’s warehouses is like something dreamed up by Kafka, but that’s because it emerges out of the alien “mind” of an algorithm, a real world analog to Google’s Deep Dream.
The world in this way becomes less and less sensible, except to the tiny number of human engineers who, for the moment, retain control over its systems. This is a problem that is only going to get worse with the spread of the Internet of Things, an extension of computation driven not by necessity, but by a capitalism that in its current form seems hell-bent on turning all products into services so as to procure a permanent revenue stream. It’s not a good idea. As Bridle puts it:
“We are inserting opaque and poorly understood computation at the very bottom of Maslow’s hierarchy of needs – respiration, food, sleep, and homeostasis – at the precise point, that is, where we are most vulnerable.”
What we should know by now, if anything, is that the more connected things are, the more hackable they become, and the more susceptible to rapid and often unexplainable crashes. Turning reality into a type of computer simulation comes with the danger that the world at large might experience the kind of “flash crash” now limited to the stock market. Bridle wonders if we’ve already experienced just such a flash crash of reality:
“Or perhaps the flash crash in reality looks exactly like everything we are experiencing right now: rising economic inequality, the breakdown of the nation-state and the militarisation of borders, totalising global surveillance and the curtailment of individual freedoms, the triumph of transnational corporations and neurocognitive capitalism, the rise of far-right groups and nativist ideologies, and the utter degradation of the natural environment. None of these are the direct result of novel technologies, but all of them are the product of a general inability to perceive the wider, networked effects of individual and corporate actions accelerated by opaque, technologically augmented complexity.”
It’s perhaps somewhat startling that even as we place ourselves in greater and greater dependence on artificial intelligence we’re still not really certain how or even if machines can think. Of course, we’re far from understanding how human beings exhibit intelligence, but we’ve never been confronted with this issue of inscrutability when it comes to our machines. Indeed, almost the whole point of machines is to replace the “herding cats” element of the organic world with the deterministic reliability of classical physics. Machines are supposed to be precise, predictable, legible, and above all, under the complete control of the human beings who use them.
The question of legibility today hasn’t gone far beyond the traditional debate between the two schools of AI that have rivaled each other since the field’s birth. There are those who believe intelligence is merely the product of connections between different parts of the brain and those who think intelligence has more to do with the mind’s capacity to manipulate symbols. In our era of deep learning the Connectionists hold sway, but it’s not clear if we are getting any closer to machines actually understanding anything. This lack of comprehension has spawned a new comedy genre of silly or disturbing mistakes made by computers, but it also presents real-world dangers as we place more and more of our decision making upon machines that have no idea what any of the data they are processing actually means, even as these programs discover dangerous hacks, such as deception, that are as old as life itself.
Of course, no techno-social system can exist unless it serves the interest of at least some group in a position of power. Bridle draws an analogy between society in ancient Egypt and our own. There, the power of the priests was premised on their ability to predict the rise and fall of the Nile. To the public this predictive power was shrouded in the language of the religious castes’ ability to commune with the gods, all the while the priests were secretly using the much more prosaic technology of nilometers hidden underground.
Who are the priests of the current moment? Bridle makes a good case that it’s the “three letter agencies”, the NSA, MI5 and their ilk, that are the priests of our age. It’s in the interest of these agencies, born in the militarized atmosphere of the Cold War and the War on Terrorism, that the logic of radical transparency continues to unfold- where the goal is to see all and to know all.
Who knows how vulnerable these agencies have made our communications architecture in trying to see through it? Who can say, Bridle wonders, if the strongest encryption tools available haven’t already been made useless by some genius mathematician working for the security state? And here is the cruel irony of it all: the agencies whose quest is to see into everything are completely opaque to the publics they supposedly serve. There really is a “deep state”, though given our bizarro-land media landscape our grappling with it quickly gives way to conspiracy theories and lunatic cults like Q-Anon.
The hunt for conspiracy stems from the understandable need of the human mind to simplify. It is the search for clear agency where all we can see are blurred lines. Ironically, believers in conspiracy hold more expansive ideas of power and freedom than those who explain the world in terms of “social forces” or other necessary abstractions. For the conspiracist the world is indeed controllable; it’s just that those doing the controlling happen to be terrifying. None of this makes conspiracy anything but an unhinged way of dealing with reality, just a likely one whenever a large number of individuals feel the world is spinning out of control.
The internet ends up being a double boon for conspiracist politics because it both fragments the shared notion of reality that existed in the age of print and mass media and allows individuals who fall under some particular conspiracy’s spell to find one another and validate their own worldview. Yet it’s not just a matter of fragmented media and the rise of filter bubbles that plagues us, but a kind of shattering of our sense of reality itself.
It is certainly a terrible thing that our current communications and media landscape has fractured into digital tribes with the gap of language and assumptions between us seemingly unbridgeable, and emotion-driven political frictions resulting in what some have called “a cold civil war.” It’s perhaps even more terrifying that this same landscape has spontaneously given way to a kind of disturbed collective unconscious that is amplified, and sometimes created, by AI into what amounts to the lucid dreams of a madman that millions of people, many of them children, experience at once.
YouTube isn’t so much a television channel as it is a portal to the twilight zone, where one can move from videos of strangers compulsively wrapping and unwrapping products to cartoons of Peppa Pig murdering her parents. Like its sister “tubes” in the porn industry, YouTube has seemingly found a way to jack straight into the human lizard brain. As is the case with slot machines, the system has been designed with addiction in mind, only the trick here is to hook into whatever tangle of weirdness or depravity exists in the individual human soul- and pull.
The even crazier thing about these sites is that the majority of viewers, and perhaps soon creators, are not humans but bots. As Bridle writes:
“It’s not just trolls, or just automation; it’s not just human actors playing out an algorithmic logic, or algorithms mindlessly responding to recommendation engines. It’s a vast and almost completely hidden matrix of interactions between desires and rewards, technologies and audiences, tropes and masks.”
Bridle thinks one thing is certain: we will never again return to the feeling of being completely in control, and the very illusion that we can be, if only we had the right technical tweak or the right political leader, is perhaps the greatest danger of our new dark age.
In a sense we’re stuck with complexity, and it’s this complex human/machine artifice, which has emerged without anyone having deliberately built it, that is the source of all the ills he has detailed.
The historian George Dyson recently offered a very similar diagnosis. In his essay Childhood’s End, Dyson argued that we are leaving the age of the digital and returning to the era of the analog. He didn’t mean that we’d shortly be cancelling our subscriptions to Spotify and rediscovering the beauty of LPs (though plenty of us are doing that), but something much deeper. Rather than being a model of the world’s knowledge, Google, in some sense, now was the world’s knowledge. Rather than representing the world’s social graph, Facebook now was the world’s social graph.
The problem with analog systems when compared to digital ones is that they are hard to precisely control, and thus are full of surprises, some of which are far from good. Our quest to assert control over nature and society hasn’t worked out as planned. According to Dyson:
“Nature’s answer to those who sought to control nature through programmable machines is to allow us to build machines whose nature is beyond programmable control.”
Bridle’s answer to our situation is to urge us to think, precisely the kind of meditation on the present he has provided with his book. It’s not as wanting a solution as one might suppose, and for me it had clear echoes of the perspective put forward by Roy Scranton in his book Learning to Die in the Anthropocene, where he wrote:
“We must practice suspending stress-semantic chains of social exhaustion through critical thought, contemplation, philosophical debate, and posing impertinent questions…
We must inculcate ruminative frequencies in the human animal by teaching slowness, attention to detail, argumentative rigor, careful reading, and meditative reflection.”
I’m down with that. Yet the problem I had with Scranton is ultimately the same one I had with Bridle. Where is the politics? Where is human agency? For it is one thing to say that we live in a complex world roamed by “hyperobjects” we at best partly control, but it is quite another to discount our capacity for continuous intervention, especially our ability to “act in concert”, that is, politically, to steer the world towards desirable ends.
Perhaps what the arrival of a new dark age means is that we’re regaining a sense of humility. Starting about two centuries ago, human beings got it into their heads that they had finally gotten nature under their thumb. What we are learning in the 21st century is that not only was this view incorrect, but that the human-made world itself seems to have a mind of its own. What this means is that we’re likely barred forever from the promised land, condemned to a state of adaptation and response to nature’s cruelty and human injustice, which will only end with our personal death or the extinction of the species, and yet still contains all the joy and wonder of what it means to be a human being cast into a living world.