The Flash Crash of Reality

“The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents. We live on a placid island of ignorance in the midst of black seas of infinity, and it was not meant that we should voyage far. The sciences, each straining in its own direction, have hitherto harmed us little; but some day the piecing together of dissociated knowledge will open up such terrifying vistas of reality, and of our frightful position therein, that we shall either go mad from the revelation or flee from the deadly light into the peace and safety of a new dark age.”

                                                                                                 H.P. Lovecraft, The Call of Cthulhu  

“All stable processes we shall predict. All unstable processes we shall control.”

John von Neumann

For at least as long as there has been a form of written language to record such a thing, human beings have lusted after divination. The classical Greeks had their trippy Oracle at Delphi, while the Romans scanned entrails for hidden patterns, or more beautifully, sought out the shape of the future in the murmurations of birds. All ancient cultures, it seems, looked for the signs of fate in the movement of the heavens. The ancient Chinese script may have even originated in a praxis of prophecy, a search for meaning in the branching patterns of “oracle bones” and tortoise shells, signaling that perhaps written language itself originated not with accountants but with prophets seeking to overcome the temporal confines of the second law, in whose measure we are forever condemned.

The promise of computation was that this power of divination was at our fingertips at last. Computers would allow us to outrun time, and thus in seeing the future we’d finally be able to change it or deftly avoid its blows- the goal of all seers in the first place.

Indeed, the binary language underlying computation sprang from the fecund imagination of Gottfried Leibniz, who got the idea after he encountered the most famous form of Chinese divination, the I-Ching. The desire to create computing machines emerged with the Newtonian worldview and instantiated its premise; namely, that the world could be fully described in terms of equations whose outcome was preordained. What computers promised was the ability to calculate these equations, offering us a power born of asymmetric information- a kind of leverage via time travel.

Perhaps we should have known that time would not be so easily subdued. Outlining exactly how our recent efforts to know and therefore control the future have failed is the point of James Bridle’s wonderful book New Dark Age: Technology and the End of the Future.

With the kind of clarity demanded of an effective manifesto, Bridle neatly breaks his book up into ten “C’s”: Chasm, Computation, Climate, Calculation, Complexity, Cognition, Complicity, Conspiracy, Concurrency, and Cloud.

Chasm defines what Bridle believes to be our problem. Our language is no longer up to the task of navigating the world wrought by the complex systems upon which we have become dependent. This failure of language, which amounts to a failure of thought, Bridle traces to the origins of computation itself.

Computation

To vastly oversimplify his argument, the problem with computation is that the detailed models it provides too often tempt us into confusing our map with the territory. Sometimes this leads us to mistrust our own judgement and defer to the “intelligence” of machines- a situation that in the most tragic of circumstances has resulted in what those in the National Park Service call “death by GPS”. In other cases, our confusion of the model with reality results in the surrender of power to the minority of individuals capable of creating such models and to the corporations which own and run them.

Computation was invented under the aforementioned premise born with the success of calculus: that everything, including history itself, could be contained in an equation. It was also seen as a solution to the growing complexity of society. Anticipating Stanislaw Lem, Vannevar Bush, one of the founders of modern computing, foresaw something like the internet in the 1940s with his “memex”- the mechanization of knowledge as the only possible lifeboat in the deluge of information modernity had brought.

Climate

One of the first projected purposes of computers was not just to predict, but to actually control the weather. And while we’ve certainly gotten better at the former, the best we’ve gotten from the latter is Kurt Vonnegut’s humorous takedown of the premise in his novel Cat’s Cradle, which was actually based on his chemist brother’s work on weather control for General Electric. It is somewhat ironic, then, that the very fossil fuel based civilization upon which our computational infrastructure depends is not only making the weather less “controllable” and predictable, but is undermining the climatic stability of the Holocene, which facilitated the rise of a species capable of imagining and building something as sophisticated as computers in the first place.

Our new dark age is not just a product of our misplaced faith in computation, but of the growing unpredictability of the world itself- a reality whose existential importance is most apparent in the case of climate change. Our rapidly morphing climate threatens the very communications infrastructure that allows us to see and respond to its challenges. Essential servers and power sources will likely be drowned under the rising seas, cooling-dependent processors taxed by increasing temperatures. Most disturbingly, rising CO2 levels are likely to make human beings dumber. As Bridle writes:

“At 1,000 ppm, human cognitive ability drops by 21 per cent. At higher atmospheric concentrations, CO2 stops us from thinking clearly. Outdoor CO2 already reaches 500 ppm”

An unstable climate undermines the bedrock of predictable natural cycles from which civilization itself emerged- those of agriculture. In a way, our very success at controlling nature by making it predictable is destabilizing the regularity of nature that made such predictability possible in the first place.

It is here that computation reveals its double-edged nature, for while computation is the essential tool we need to see and respond to the “hyperobject” that is climate change, it is also one of the sources of this growing natural instability itself. Much of the energy of modern computation is directly traceable to fossil fuels, a fact the demon coal lobby has eagerly pointed out.

Calculation

What the explosion of computation has allowed, of course, is an exponential expansion of the power and range of calculation. One can quibble over whether Moore’s Law, the driving force behind the fact that everything is now software, has finally bent towards its asymptote and proved Ray Kurzweil and his ilk wrong; the fact remains that nothing else in our era has followed the semiconductor’s exponential curve. Indeed, as Bridle shows, precisely the opposite.

For all their astounding benefits, machine learning and big data have not, as Chris Anderson predicted, resulted in the “End of Theory”. Science still needs theory, experiment, and dare I say, humans to make progress, and what is clear is that in many areas outside ICT itself progress has not merely slowed but stalled.

Over the past sixty years, rather than experiencing Moore’s Law type increases, the pharmaceutical industry has suffered the opposite: the so-called Eroom’s Law, whereby “The number of new drugs approved per billion US dollars spent on research and development has halved every nine years since 1950.”
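
Taken at face value, the halving-every-nine-years claim implies roughly a hundredfold decline in R&D productivity over six decades. Here is a back-of-the-envelope sketch of that arithmetic; the function name and the sixty-year span are mine, purely illustrative:

```python
# Illustrative only: Eroom's Law as quoted, "halved every nine years since 1950".
def erooms_law_factor(years_elapsed, halving_period=9.0):
    """Fraction of 1950-era drug approvals per R&D dollar remaining after `years_elapsed`."""
    return 0.5 ** (years_elapsed / halving_period)

# Roughly sixty years on (circa 2010), productivity per R&D dollar would be:
print(erooms_law_factor(60))  # ~0.0098, i.e. about a 100-fold decline
```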

Part of this stems from the fact that the low-hanging fruit of discovery, not just in pharmaceuticals but elsewhere, has already been picked, along with the fact that the problems we’re dealing with are becoming exponentially harder to solve. Yet some portion of the slowdown in research progress is surely a consequence of technology itself, or at least the ways in which computers are being relied upon and deployed. Ease of sharing, when combined with status hunting, inevitably leads to widespread gaming. Scientists are little different from the rest of us, seeking ways to top Google’s PageRank, YouTube recommendations, Instagram and Twitter feeds, or the sorting algorithms of Amazon, though for scientists the summit of recognition consists of prestigious journals where publication can make or break a career.

With data easy to come by while experimentation and theory remain difficult, “proof” is often conflated with what turn out to be spurious p-values- the practice known as “p-hacking”. The age of big data has also been the age of science’s “replication crisis”, in which seemingly ever more findings disappear upon scrutiny.
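
To make the p-hacking point concrete, here is a minimal simulation (assuming NumPy and SciPy are available): run enough comparisons on pure noise and a handful will clear the conventional p < 0.05 bar by chance alone.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# 100 "studies" comparing two groups drawn from the SAME distribution:
# there is no real effect anywhere, yet some p-values will dip below 0.05.
false_positives = 0
for _ in range(100):
    a = rng.normal(size=30)
    b = rng.normal(size=30)
    _, p = stats.ttest_ind(a, b)
    if p < 0.05:
        false_positives += 1

print(f"{false_positives} 'significant' findings out of 100 null experiments")
# Expect roughly 5; publish only those and you have manufactured a result.
```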

What all this calculation has resulted in is an almost suffocating level of complexity, which is the source of much of our inegalitarian turn. Connectivity and transparency were supposed to level the playing field. Instead, in areas such as financial markets, where the sheer amount of information to be processed has ended up barring new entrants, calculation has provided the ultimate asymmetric advantage to those with the highest capacity to identify and respond within nanoseconds to changing market conditions.

Asymmetries of information lie behind both our largest corporate successes and the rising inequality they have brought in their wake. Companies such as Walmart and Amazon are in essence logistics operations built on their unique ability to see and respond first, or most vigorously, to consumer needs. As Bridle points out, this rule of logistics has resulted in a bizarre scrambling of politics, the irony that:

“The complaint of the Right against communism – that we’d all have to buy our goods from a single state supplier – has been supplanted by the necessity of buying everything from Amazon.”

Yet unlike the corporate giants of old such as General Motors, our 21st century behemoths don’t actually employ all that many people, and in their lust after automation seem determined to employ even fewer. The workplace, and the larger society, these companies are building seem even more designed around the logic of machines than factories in the heyday of heavy industry. The “chaotic storage” deployed in Amazon’s warehouses is like something dreamed up by Kafka, but that’s because it emerges out of the alien “mind” of an algorithm, a real-world analog to Google’s Deep Dream.
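
A toy sketch of the idea behind “chaotic storage” (my own illustration, not Amazon’s actual system): items go into whatever bin happens to be free, and only the index knows where anything is, so the layout looks senseless to a human walking the aisles but is perfectly legible to the algorithm.

```python
import random
from collections import defaultdict

class ChaoticWarehouse:
    """Toy model: items are slotted into arbitrary bins; an index does the remembering."""
    def __init__(self, bins):
        self.bins = list(bins)
        self.index = defaultdict(list)   # item name -> list of bins holding it

    def store(self, item):
        bin_id = random.choice(self.bins)   # no 'sensible' shelf for like items
        self.index[item].append(bin_id)
        return bin_id

    def locate(self, item):
        # Without the index, the item is effectively lost to any human picker.
        return self.index.get(item, [])

wh = ChaoticWarehouse(bins=[f"B{i:03d}" for i in range(500)])
wh.store("toothpaste"); wh.store("toothpaste"); wh.store("lawnmower blade")
print(wh.locate("toothpaste"))
```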

The world in this way becomes less and less sensible, except to the tiny number of human engineers who, for the moment, retain control over its systems. This is a problem that is only going to get worse with the spread of the Internet of Things- an extension of computation undertaken not out of necessity, but because capitalism in its current form seems hell-bent on turning all products into services so as to procure a permanent revenue stream. It’s not a good idea. As Bridle puts it:

“We are inserting opaque and poorly understood computation at the very bottom of Maslow’s hierarchy of needs – respiration, food, sleep, and homeostasis – at the precise point, that is, where we are most vulnerable.”

What we should know by now, if anything, is that the more connected things are, the more hackable they become, and the more susceptible to rapid and often unexplainable crashes. Turning reality into a type of computer simulation comes with the danger that the world at large might experience the kind of “flash crash” now limited to the stock market. Bridle wonders if we’ve already experienced just such a flash crash of reality:

“Or perhaps the flash crash in reality looks exactly like everything we are experiencing right now: rising economic inequality, the breakdown of the nation-state and the militarisation of borders, totalising global surveillance and the curtailment of individual freedoms, the triumph of transnational corporations and neurocognitive capitalism, the rise of far-right groups and nativist ideologies, and the utter degradation of the natural environment. None of these are the direct result of novel technologies, but all of them are the product of a general inability to perceive the wider, networked effects of individual and corporate actions accelerated by opaque, technologically augmented complexity.”

Cognition

It’s perhaps somewhat startling that even as we place ourselves in greater and greater dependence on artificial intelligence, we’re still not really certain how, or even if, machines can think. Of course, we’re far from understanding how human beings exhibit intelligence, but we’ve never before been confronted with this issue of inscrutability when it comes to our machines. Indeed, almost the whole point of machines is to replace the “herding cats” element of the organic world with the deterministic reliability of classical physics. Machines are supposed to be precise, predictable, legible, and above all, under the complete control of the human beings who use them.

The question of legibility today hasn’t gone far beyond the traditional debate between the two schools of AI that have rivaled each other since the field’s birth: those who believe intelligence is merely the product of connections between different parts of the brain, and those who think intelligence has more to do with the mind’s capacity to manipulate symbols. In our era of deep learning the Connectionists hold sway, but it’s not clear if we are getting any closer to machines actually understanding anything. This lack of comprehension has given rise to a new comedy genre of silly or disturbing mistakes made by computers, but it also presents real world dangers as we place more and more of our decision making upon machines that have no idea what any of the data they are processing actually means, even as these programs discover dangerous hacks, such as deception, that are as old as life itself.

Complicity

Of course, no techno-social system can exist unless it serves the interest of at least some group in a position of power. Bridle draws an analogy between society in ancient Egypt and our own. There, the power of the priests was premised on their ability to predict the rise and fall of the Nile. To the public this predictive power was shrouded in the language of the religious caste’s ability to commune with the gods, all the while the priests were secretly using the much more prosaic technology of nilometers hidden underground.

Who are the priests of the current moment? Bridle makes a good case that it’s the “three letter agencies”, the NSA, MI5 and their ilk, that are the priests of our age. It’s in the interest of these agencies, born in the militarized atmosphere of the Cold War and the War on Terrorism, that the logic of radical transparency continues to unfold- where the goal is to see all and to know all.

Who knows how vulnerable these agencies have made our communications architecture in trying to see through it? Who can say, Bridle wonders, if the strongest encryption tools available haven’t already been made useless by some genius mathematician working for the security state? And here is the cruel irony of it all: the agencies whose quest is to see into everything are completely opaque to the publics they supposedly serve. There really is a “deep state”, though given our bizarro-land media landscape our grappling with it quickly gives way to conspiracy theories and lunatic cults like QAnon.

Conspiracy

The hunt for conspiracy stems from the understandable need of the human mind to simplify. It is the search for clear agency where all we can see are blurred lines. Ironically, believers in conspiracy hold more expansive ideas of power and freedom than those who explain the world in terms of “social forces” or other necessary abstractions. For the conspiracist the world is indeed controllable; it’s just that those doing the controlling happen to be terrifying. None of this makes conspiracy anything but an unhinged way of dealing with reality, just a likely one whenever a large number of individuals feel the world is spinning out of control.

The internet ends up being a double boon for conspiracist politics because it both fragments the shared notion of reality that existed in the age of print and mass media and allows individuals who fall under some particular conspiracy’s spell to find one another and validate their own worldview. Yet it’s not just a matter of fragmented media and the rise of filter bubbles that plagues us, but a kind of shattering of our sense of reality itself.

Concurrency

It is certainly a terrible thing that our current communications and media landscape has fractured into digital tribes, with the gaps of language and assumption between us seemingly unbridgeable and emotion-driven political frictions resulting in what some have called “a cold civil war.” It’s perhaps even more terrifying that this same landscape has spontaneously given way to a kind of disturbed collective unconscious that is amplified, and sometimes created, by AI into what amounts to the lucid dreams of a madman that millions of people, many of them children, experience at once.

YouTube isn’t so much a television channel as it is a portal to the twilight zone, where one can move from videos of strangers compulsively wrapping and unwrapping products to cartoons of Peppa Pig murdering her parents. Like its sister “tubes” in the porn industry, YouTube has seemingly found a way to jack straight into the human lizard brain. As is the case with slot machines, the system has been designed with addiction in mind, only the trick here is to hook into whatever tangle of weirdness or depravity exists in the individual human soul- and pull.

The even crazier thing about these sites is that the majority of viewers, and perhaps soon creators, are not humans but bots. As Bridle writes:

“It’s not just trolls, or just automation; it’s not just human actors playing out an algorithmic logic, or algorithms mindlessly responding to recommendation engines. It’s a vast and almost completely hidden matrix of interactions between desires and rewards, technologies and audiences, tropes and masks.”

Cloud

Bridle thinks one thing is certain: we will never again return to the feeling of being completely in control, and the very illusion that we can be, if only we had the right technical tweak or the right political leader, is perhaps the greatest danger of our new dark age.

In a sense we’re stuck with complexity, and it’s this complex human/machine artifice, which has emerged without anyone having deliberately built it, that is the source of all the ills he has detailed.

The historian of technology George Dyson recently offered a very similar diagnosis. In his essay Childhood’s End, Dyson argued that we are leaving the digital age and returning to the era of the analog. He didn’t mean that we’d shortly be cancelling our subscriptions to Spotify and rediscovering the beauty of LPs (though plenty of us are doing that), but something much deeper. Rather than being a model of the world’s knowledge, Google now, in some sense, was the world’s knowledge. Rather than representing the world’s social graph, Facebook now was the world’s social graph.

The problem with analog systems, when compared to digital ones, is that they are hard to precisely control, and thus are full of surprises, some of which are far from good. Our quest to assert control over nature and society hasn’t worked out as planned. According to Dyson:

“Nature’s answer to those who sought to control nature through programmable machines is to allow us to build machines whose nature is beyond programmable control.”

Bridle’s answer to our situation is to urge us to think- precisely the kind of meditation on the present he has provided with his book. It’s not as wanting a solution as one might suppose, and for me it had clear echoes of the perspective put forward by Roy Scranton in his book Learning to Die in the Anthropocene, where he wrote:

“We must practice suspending stress-semantic chains of social exhaustion through critical thought, contemplation, philosophical debate, and posing impertinent questions…

We must inculcate ruminative frequencies in the human animal by teaching slowness, attention to detail, argumentative rigor, careful reading, and meditative reflection.”

I’m down with that. Yet the problem I had with Scranton is ultimately the same one I had with Bridle. Where is the politics? Where is human agency? For it is one thing to say that we live in a complex world roamed by “hyperobjects” we at best partly control, but it is quite another to discount our capacity for continuous intervention, especially our ability to “act in concert”, that is politically, to steer the world towards desirable ends.

Perhaps what the arrival of a new dark age means is that we’re regaining a sense of humility. Starting about two centuries ago human beings got it into their heads that they had finally gotten nature under their thumb. What we are learning in the 21st century is that not only was this view incorrect, but that the human-made world itself seems to have a mind of its own. What this means is that we’re likely barred forever from the promised land, condemned to a state of adaptation and response to nature’s cruelty and human injustice, which will only end with our personal death or the extinction of the species, and yet which still contains all the joy and wonder of what it means to be a human being cast into a living world.

 

Crushing the Stack

In The Code Economy, Philip Auerswald managed to give us a succinct history of the algorithm while leaving us with code that floats like a ghost in the ether, lacking any anchor in our very much material, economic, and political world. Benjamin Bratton tries to bring us back to earth. Bratton’s recent book, The Stack: On Software and Sovereignty, provides us with a sort of schematic with which we can grasp the political economy of code and thus anchor it to the wider world.

The problem is that Bratton, unlike Auerswald, has given us this schematic in the almost impenetrable language of postmodern theory, beyond the grasp of even educated readers. This matters, for as Ian Bogost pointed out in his review of The Stack: “The book risks becoming a tome to own and display, rather than a tool to use.” That is a shame, because the public certainly is in need of maps through which it can understand and seek to control the computational infrastructure that is now embedded in every aspect of our lives, including, and perhaps especially, in our politics. The failure to understand and democratically regulate such technology leaves society subject to the whims of the often egomaniacal and anti-democratic nerds who design and run such systems.

In that spirit, I’ll try my best below to simplify The Stack into a map we can actually understand and therefore might be inclined to use.

In The Stack Bratton observes that we have entered the era of what he calls “planetary scale computation.” Our whole global system of processing and exchanging information, from undersea fiber-optic cables, satellites, cell-phone towers, and server farms to corporate and personal computers and our ubiquitous smartphones, he sees as “an accidental megastructure” that we have cobbled together without really understanding what we are building. Bratton’s goal, in a sense, is to map this structure by treating it as a “stack”, dissecting it into what he hopes are clearly discernible “layers.” There are six of these: Earth, Cloud, City, Address, Interface and User.

It is the Earth layer that I find both the most important and the most often missed when it comes to discussions of the political economy of code. Far too often the Stack is represented as something that is literally virtual, disconnected from the biosphere in a way that the other complex artificial systems upon which we have come to depend, such as the food system or the energy system, could never be as a matter of simple common sense. And yet the Stack, just like everything else human beings do, depends upon and affects the earth. As Bratton puts it in his Lovecraftian prose:

The Stack terraforms the host planet by drinking and vomiting its elemental juices and spitting up mobile phones. After its short career as a little computing brick within a larger megamachine, its fate at the dying end of the electronics component life cycle is just as sad. What is called “electronic waste” inverts the process that pulls entropic reserves of metal and oil from the ground and given form, and instead partially disassembles them and reburies them, sometimes a continent away and sometimes right next door. (p.83)

The rare earth minerals upon which much of modern technology depends come at the cost of environmental degradation and even civil war, as seen in the Democratic Republic of Congo. Huge areas of the earth are now wastelands festooned with the obsolescent silicon of our discarded computers and cell phones picked over by the world’s poorest for whatever wealth might be salvaged.

The Stack consumes upwards of 10 percent of the world’s energy, an amount that is growing despite the major tech players’ efforts to diminish its footprint by relocating servers to the Arctic and, perhaps soon, under the sea, although gains in efficiency have, at least temporarily, slowed the rate of growth in energy use.

The threat to the earth from the Stack, as Bratton sees it, is that its ever growing energy and material requirements will end up destroying the carbon based life that created it. It’s an apocalyptic scenario that is less fanciful than it sounds, for the Stack is something like the nervous system of the fossil fuel based civilization we have built. Absent our abandonment of that form of civilization, we really will create a world that is only inhabitable by machines and machine-like life forms such as bacteria. WALL-E might have been a prophecy and not just a cartoon.

Yet Bratton also sees the Stack as our potential savior, or at least the only possible way, short of a massive die-off of human beings, to get out of this jam. A company like Exxon Mobil, with its dependence on satellites and super-computers, is only possible with the leverage of the Stack, but then again so is the IPCC.

For the Stack allows us to see nature, to have the tools to monitor, respond to, and perhaps even interfere with the processes of nature, many of which the Stack itself is throwing out of kilter. The Stack might even give us the possibility of finding an alternative source of power and construction for itself, one that is compatible with our own survival along with that of the rest of life on earth.

After the Earth layer comes the Cloud layer. It is here that Bratton expands upon the ideas of Carl Schmitt. A jurist under the Nazi regime, Schmitt has seen his ideas about the international order become popular among many on the left, at least since the US invasion of Iraq in 2003, not as a prescription but as a disturbingly prescient description of American politics and foreign policy in the wake of 9/11.

In his work The Nomos of the Earth, Schmitt critiqued the American-dominated international order that had begun with the US entry into WWI and reigned supreme during the Cold War as a type of order that had, by slipping free of the anchor of national sovereignty bound to clearly defined territories, set the world on the course of continuous interventions by states into each other’s domestic politics, leading to a condition of permanent instability and the threat of total war.

Bratton updates Schmitt’s ideas for our era, in which the control of infrastructure has superseded the occupation of territory as the route to power. Control over the nodes of global networks, where assets are no longer measured in square miles but in underwater cables, wireless towers, and satellites, demands a distributed form of power, and hence helps explain the rise of multinational corporations to their current state of importance.

In terms of the Stack, these are the corporations that make up Bratton’s Cloud layer, which include not only platforms such as Google and Facebook, but the ISPs controlling much of the infrastructure upon which these companies (despite their best efforts to build such infrastructure themselves) continue to depend.

Bratton appears to see current geopolitics as a contest between two very different ideas regarding the future of the Cloud: the globalist vision found in Silicon Valley companies, which aims to abandon the territorial limits of the nation-state, and the Chinese model, which seeks to align the Cloud with the interests of the state. The first skirmish of this war, Bratton notes, was what he calls the Sino-Google War of 2009, in which Google, under pressure from the Chinese government to censor its search results, eventually withdrew from the country.

Unfortunately for Silicon Valley, along with those hoping we were witnessing the last gasp of the nation-state, not only did Google lose this war, it has recently moved to codify the terms of its surrender, while at the same time we have witnessed both a global resurgence of nationalism and the continuing role of the “deep state” in forcing the Cloud to conform to its interests.

Bratton also sees in the platform capitalism enabled by the Cloud the shape of a possible socialist future- a fulfillment of the dreams of rational, society-wide economic planning that were anticipated by the USSR’s Gosplan and by Project Cybersyn in pre-Pinochet Chile. The Stack isn’t the only book covering this increasingly important and interesting beat.

After the Cloud layer comes the City layer. It is in cities where the density of human population allows the technologies of the Stack to be most apparent. Cities, after all, are thick agglomerations of people and goods in motion, all of which are looking for the most efficient path from point A to point B. Cities are composed of privatized spaces made of innumerable walls that dictate entry and exit. They are the perfect laboratory for the logic and tools of the Stack. As Bratton puts it:

We recognize the city he describes as filled with suspicious responsive environments, from ATM PINs, to key cards and parking permits, e-tickets to branded entertainment, personalized recommendations from others who have purchased similar items, mobile social network transparencies, GPS-enabled monitoring of parolees, and customer phone tracking for retail layout optimization.  (p. 157)

Following the City layer we find the Address layer. In the Stack (or at least in the version of it dreamed up by salesmen for the Internet of Things), everything must have a location in the network, a link through which it can be connected to other persons and things. Something that lacks an address in some sense doesn’t exist for the Stack. An unconnected object or person fails to be a repository for the information on which the Stack itself feeds.

We’ve only just entered the era in which our everyday objects speak to one another and in the process can reveal information we might have otherwise hidden about ourselves. What Bratton finds astounding is that in the Address layer we can see that the purpose of our communications infrastructure has become not for humans to communicate with other humans via machines, but for machines to communicate with other machines.

The next layer is that of the Interface. This is the world of programs and apps, which for most of us is the closest we get to code. Bratton says it better:

What are Apps? On the one hand, Apps are software applications and so operate within something like an application layer of a specific device-to-Cloud economy. However, because most of the real information processing is going on in the Cloud, and not in the device in your hand, the App is really more an interface to the real applications hidden away in data centers. As an interface, the App connects the remote device to oceans of data and brings those data to bear on the User’s immediate interests; as a data-gathering tool, the App sends data back to the central horde in response to how the User makes use of it. The App is also an interface between the User and his environment and the things within it, by aiding in looking, writing, subtitling, capturing, sorting, hearing, and linking things and events. (p.142)

The problem with apps is that they offer up an extremely narrow window on the world. Bratton is concerned about the political and social effects of such reality compression, a much darker version of Eli Pariser’s “filter bubble”, where the world itself is refracted into a shape that conforms to the individual’s particular fetishes, shattering a once shared social world.

The rise of filter bubbles is the first sign of a reality crisis Bratton thinks will only get worse with the perfection of augmented reality- there are already AR tours of the Grand Canyon that seek to prove creationism is true.

The Stack’s final layer is that of the User. Bratton here seems mainly concerned with expanding the definition of who or what constitutes one. There’s been a lot of hand-wringing about the use of bots since the 2016 election; California has even passed legislation to limit their use. Admittedly, these short, relatively easy to make programs that allow automated posts or calls are a major problem. Hell, over 90% of the phone calls I receive are now unsolicited robocalls, and given that I know I am not alone in this, such spam might just kill the phone call as a means of human communication- ironically, the very reason we have cellphones in the first place.

Yet bots have also become the source of what many of us would consider not merely permissible, but desirable speech. It might upset me that countries like Russia and Saudi Arabia are prolific users of bots to foster their interests among English speaking publics, or that scammers use bots to pick people’s pockets, but I actually like the increasing use of bots by NGOs whose missions I support.

Bratton thus isn’t crazy for suggesting we give the bots some space in the form of “rights”. Things might move even further in this direction as bots become increasingly sophisticated and personalized. Few would go so far as Jamie Susskind, who in his recent book Future Politics suggests we might replace representative government with a system of liquid democracy mediated by bots, one in which bots make political decisions for individuals based on the citizen’s preferences. But, here again, the proposal isn’t as ridiculous or reactionary as it might sound.

Given some issue to decide upon, my bot could scan the positions taken on it by organizations and individuals I trust in regard to that issue. “My” votes on environmental policy could reflect some weighted measure of the views of the World Wildlife Fund, Bill McKibben and the like, meaning I’d be more likely to cast an informed vote than if I had pulled the lever on my own. This is not to say that I agree with this form of politics, or even believe it to be workable. Rather, I merely think that Bratton might be on to something here: that a key question in the User layer will be the place of bots- for good and ill.
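
A minimal sketch of what such bot-mediated delegation might look like, with hypothetical names and trust weights, and with “positions” reduced to a simple for/against score for illustration:

```python
# Hypothetical liquid-democracy helper: my "vote" is a weighted sum of the
# stances taken by sources I trust on a given issue (+1 = for, -1 = against).
def delegated_vote(stances, trust_weights):
    """Return 'yes' or 'no' from trusted sources' stances, weighted by my trust in each."""
    total = sum(trust_weights[source] * stance for source, stance in stances.items())
    return "yes" if total > 0 else "no"

trust = {"World Wildlife Fund": 0.6, "Bill McKibben": 0.4}
stances_on_carbon_tax = {"World Wildlife Fund": +1, "Bill McKibben": +1}
print(delegated_vote(stances_on_carbon_tax, trust))  # -> "yes"
```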

The Stack, as Bratton has described it, is not without its problems, and thus he ends his book with proposals for how we might build a better Stack. We could turn the Stack into a tool for the observation and management of the global environment. We could give Users design control over the interfaces that now dictate their lives, including the choice to enter and exit when we choose- a right that should be extended to movement between states as well. We could use the power of platforms to revive something like centrally planned economies and their dream of eliminating waste and scarcity. We could harness the capacity of the Interface layer to build a world of plural utopias, and extend and articulate the rights and responsibilities of users in a world full of bots.

Is Bratton right? Is this the world we are in, or at least headed towards? For my money, I think he gets some things spectacularly right, such as his explanation of the view of climate change within the political right:

“For those who would prefer neo-Feudalism and/or tooth-and-nail libertarianism, inaction on climate change is not denialism, rather it is action on behalf of a different strategic conclusion.” (p.306)

Yet, elsewhere I think his views are not only wrong, but sometimes contradictory. I think he largely misses how the Stack is in large part a product of American empire. He, therefore, misinterprets the 2009 spat between Google and China as a battle between two models of future politics, rather than seeing the current splintering of the internet for what it is: the emergence of peer competitors in the arena of information over which the US has for so long been a hegemon.

Bratton is also dismissive of privacy and enraptured by the Internet of Things in a way that can sometimes appear Pollyannaish. After all, privacy isn’t just some antiquated right, but one of the few ways to keep hackable systems secure. That he views the IoT as something inevitable and almost metaphysical, rather than the mere marketing it so often is, leads me to believe he really hasn’t thought through what it means to surround ourselves with computers- that is, to make everything in our environment hackable. Rather than being destined to plug everything into everything else, we may someday discover that this is not only unnecessary and dangerous, but denotes a serious misunderstanding of what computation is actually for.

Herein lies my main problem with The Stack: though radically different from Yuval Harari, Bratton too seems to have drunk the Silicon Valley Kool-Aid. The Stack takes as its assumption that the apps flowing out of the likes of Facebook and Google, and the infrastructure behind them, are not merely of world-historical but of cosmic import. Matter is rearranging itself into a globe spanning intelligence from unlikely seeds like a Harvard nerd who wanted a website to rate hot chicks. I just don’t buy it.

What I do buy is that the Stack as a concept, or something like it, will be a necessary tool for negotiating our era, in which the borders between politics and technology have become completely blurred. One can imagine a much less loquacious and more reality-based version of Bratton’s book that used his layers to give us a better grasp of this situation. In the Earth layer we’d see the imperialism behind the rare-earth minerals underlying our technology, the massive Chinese factories like those of Foxconn, and the way in which earth-destroying coal continues to be the primary energy source for the Stack.

In the Cloud layer we’d gain insight into server farms and monopolistic ISPs such as Comcast, and come to understand the fight over Net Neutrality. We’d be shown the contours of the global communications infrastructure and the ways in which it is plugged into and policed by government actors such as the NSA.

In the City layer we’d interrogate the idea of smart cities, the automation of inequality, and the digitization of citizenship, along with exploring the role of computation in global finance. In the Address layer we’d uncover the scope of logistics and find out how platforms such as Amazon work their magic, ask whether it really is magic or just parasitism, and consider how we might use these insights for the public good, whether that meant nationalizing the platforms or breaking them into pieces.

In the User layer we’d take a hard look at the addictive psychology behind software, and at the owners and logic behind well-known companies such as Facebook along with less well known ones such as MindGeek. Such an alternative version of The Stack would not only better inform us as to what the Stack is, but suggest what we might actually do to build ourselves a better one.