Reflect for a moment on what for many of us has become the average day. You are awoken by your phone, whose clock is set via a wireless connection to a cell phone tower, connected to a satellite, all ultimately ending in the ultimate precision machine: a clock that will not lose even a second after 15 billion years of ticking. Upon waking you connect to the world through the “siren server” of your pleasure, which “decides” for you, based on an intimate profile built up over years, the very world you will see, ranging from Kazakhstan to Kardashian, and connects you to your intimates, those whom you “follow”, and, if you’re lucky enough, your streams of “followers”.
Perhaps you use a health app to count your breakfast calories or the fat you’ve burned on your morning run, or perhaps you’ve just spent the morning playing Bejeweled and will need to pay for your sins of omission later. On your mindless drive to work you make the mistake of answering a text from the office in front of a cop who, unbeknownst to you, has instantly run your license plate to find out whether you are a weirdo or a terrorist. Thank heavens his algorithm confirms you’re neither.
When you pull into the Burger King drive-through to buy your morning coffee, you thoughtlessly end up buying yet another bacon, egg, and cheese with a side of hash browns, in spite of your best self nagging you from inside your smartphone. Because you have done this one too many times this month, your fried food preference has now been sold to the highest bidders: two databanks through which you’ll now be receiving annoying coupons in the mail, along with even more annoying and intrusive adware while you surf the web. The first comes from all the fast food restaurants along the path of your morning commute, the other offers friendly, and sometimes frightening, suggestions that you ask your doctor about the new cholesterol drug evolocumab.
You did not, of course, pay for your meal with cash but with plastic, your money swirling somewhere out there in the ether in the form of ones and zeroes, stored and exchanged on computers, to magically re-coalesce and fill your stomach with potatoes and grease. Your purchases are correlated and crunched to define you for all the machines and their cold souls of software, which gauge your value as you go about your squishy biological existence.
The bizarre thing about this common scenario is that all of this happens before you arrive at the office, or store, or factory, or wherever it is you earn your life’s bread. Not only that, almost all of these constraints on how one views and interacts with the world have been self-imposed. The medium through which much of the world now reaches us, and through which we respond to it, is apps and algorithms of one sort or another. It’s gotten to the point that we now need apps and algorithms to experience what it’s like to be lost, which seems to, well… misunderstand the definition of being lost.
I have no idea where future historians, whatever their minds are made of, will date the start of this trend of tracking and constraining ourselves so as to maintain “productivity” and “wellness”. Perhaps it began with all those 7-habits-of-highly-effective books that started taking up shelf space in now ancient bookstores sometime in the 1980s, but it has certainly gotten more personal and intimate with the rise of the smartphone. In a way we’ve brought the logic of the machine out of the factory and into our lives and even our bodies: the idea of the super-efficient man-machine merger invented by Frederick Taylor and never captured better than in Charlie Chaplin’s brilliant 1936 film Modern Times.
The film is for silent pictures what The Wizard of Oz was for color: it bridges the two worlds, with almost all of the spoken parts delivered through the medium of machines, including a giant flat screen that seemed entirely natural in a world that has been gone for eighty years. It portrays the Tramp asserting his humanity in the dehumanizing world of automation found in a factory where even eating lunch had been mechanized and made maximally efficient. Chaplin no doubt would have been pleasantly surprised at how well much of the world turned out, given the bleakness of the economic depression and the world war he was soon facing, but I think he also would have been shocked at how much of the Tramp in us all we have given up, without reason and largely of our own volition.
Still, the fact of the matter is that this new rule of apps and algorithms, much of which comes packaged in the spiritualized wrapping of “mindfulness” and “happiness”, would be much less troubling did it not smack of a new form of Marx’s “opium of the people”, diverting us from trying to understand and challenge the structural inadequacies of society.
For there is nothing inherently wrong with measuring performance as a means to pursue excellence, or with attending to one’s health and mental tranquility. There’s a sort of postmodern cynicism that kicks in whenever some cultural trend becomes too popular, and while it protects us from groupthink, it also tends to lead to intellectual and cultural paralysis. It’s only when performance measures find their way into aspects of our lives that are trivialized by quantification, such as love or family life, that I think we should earnestly worry, along, perhaps, with the atrophy of our skills to engage with the world absent these algorithmic tools.
My deeper concern lies with the way apps and algorithms now play the role of invisible instruments of power. Again, this is nothing new, to the extent that in the pre-digital age such instruments came in the form of bureaucracy and rule by decree rather than law, as Hannah Arendt laid out in her Origins of Totalitarianism back in the 1950s:
In governments by bureaucracy decrees appear in their naked purity as though they were no longer issued by powerful men, but were the incarnation of power itself and the administrator only its accidental agent. There are no general principles behind the decree, but ever changing circumstances which only an expert can know in detail. People ruled by decree never know what rules them because of the impossibility of understanding decrees in themselves and the carefully organized ignorance of specific circumstances and their practical significance in which all administrators keep their subjects. (244)
It’s quite easy to read the rule of apps and algorithms into that quote, especially the parts about “only an expert can know in detail” and “carefully organized ignorance”, a fact that became clear to me after I read what is perhaps the best book yet on our new algorithmically ruled lives, Frank Pasquale’s The Black Box Society: The Secret Algorithms That Control Money and Information.
I have often wondered what exactly was being financially gained by gathering up all this data on individuals, given how obvious and ineffective the so-called targeted advertisements that follow me around the internet seem to be, and Pasquale manages to explain this clearly. What is being “traded” is my “digital reputation”, whether as a debtor, or an insurance risk (medical or otherwise), or a customer with a certain depth of pocket and identity (“father, 40s, etc.”), or even the degree to which I can be considered a “sucker” for scam and con artists of one sort or another.
This reputation matrix is much different from earlier arrangements based on personal knowledge, or later impersonal systems such as credit reporting (though both had their abuses) or health records under HIPAA, in the sense that the new digital form of reputation is largely invisible to me, its methodology inscrutable, its declarations of my “identity” immune to challenge and immutable. It is, as Pasquale so aptly terms it, a “black box” in the strongest sense of that word: unintelligible and opaque to the individual within it, like the rules Kafka’s characters suffer under in his novels about the absurdity of hyper-bureaucracy (and of course more), The Castle and The Trial.
Much more troubling, however, is how such corporate surveillance interacts with the blurring of the line between intelligence and police functions, the distinction between the foreign and domestic spheres that has been one of the defining features of our constitutional democracy. As Pasquale reminds us:
Traditionally, a critical distinction has been made between intelligence and investigation. Once reserved primarily for overseas spy operations, “intelligence” work is anticipatory, it is the job of agencies like the CIA, which gather potentially useful information on external enemies that pose a threat to national security. “Investigation” is what police do once they have evidence of a crime. (47)
It isn’t only that such moves toward a model of “predictive policing” mean the undoing of constitutionally guaranteed protections and legal due process (presumptions of innocence, and 5th amendment protections); it is also that they have far too often turned the police into a political instrument which, as Pasquale documents, has monitored groups ranging from peaceful protesters to supporters of Ron Paul, all in the name of preventing a “terrorist act” by members of these groups. (48)
The kinds of illegal domestic spying performed by the NSA and its acronymic companions were built on the back of an already existing infrastructure of commercial surveillance. The same could be said for the blurring of the line between intelligence and investigation exemplified by the creation of “fusion centers” after 9/11, which repurposed espionage tools once confined to the intelligence services and turned them toward domestic targets for the purpose of controlling crime.
Both domestic spying by federal intelligence agencies and new forms of invasive surveillance by state and local law enforcement have been enabled by the commercial surveillance architecture established by corporate behemoths such as Facebook and Google, to whom citizens have surrendered their right to privacy seemingly willingly.
Given the degree to which these companies now hold near monopolies over the information citizens receive, Pasquale thinks it would be wise to revisit the breakup of the “trusts” in the early part of the last century. It’s not only that the power of these companies is already enormous; it’s that were they ever turned into overt political tools they would undermine or upend democracy itself, given that citizen action requires the free exchange of information to achieve anything at all.
The black box features of our current information environment have not just managed to colonize the worlds of security, crime, and advertisement; they have become the defining feature of late capitalism itself. A great deal of the 2008 financial crisis can be traced to the computerization of finance over the 1980s. Computers were an important feature of the pre-crisis argument that we had entered a period of “The Great Moderation”. We had become smart enough, and our markets sophisticated enough (so the argument went), that there would be no replay of something like the 1929 Wall Street crash and the Great Depression. Unlike in the prior era, markets without debilitating crashes were to result not from government regulation containing the madness of crowds and their bubbles and busts, but in part from the new computer modeling, which would exorcise from the markets the demon of “animal spirits” and allow human beings to do what they had always dreamed of doing: to know the future. Pasquale describes it this way:
As information technology improved, lobbyists could tell a seductive story: regulators were no longer necessary. Sophisticated investors could vet their purchases. Computer models could identify and mitigate risk. But the replacement of regulation by automation turned out to be as fanciful as flying cars or space colonization. (105)
Computerization gave rise to ever more sophisticated financial products, such as mortgage-backed securities, based on ever more sophisticated statistical models that, by bundling investments, gave the illusion of stability. Even had there been more prophets crying in the wilderness that the system was unstable, they would not have been able to prove it, for the models being used were “a black box, programmed in proprietary software with the details left to the quants and the computers”. (106)
It seems there is a strange dynamic at work throughout the digital economy, not just in finance but certainly exhibited in full force there, where the whole game is in essence a contest of asymmetric information. You either have the data someone else lacks to make a trade, or you process that data faster, or both. Keeping your algorithms secret becomes a matter of survival, for as soon as they are out there they can be exploited by rivals or cracked by hackers; or at least this is the argument companies make. One might doubt it, though, once you see how nearly ubiquitous this corporate secrecy and patent hoarding have become in areas radically different from software, such as pharmaceuticals, or biotech corporations like Monsanto, which hold patents on life itself and whose logic leads to something like Paolo Bacigalupi’s dystopian novel The Windup Girl.
For Pasquale, complexity itself becomes a tool of obfuscation in which corruption and skimming can’t help but become commonplace. The contest of asymmetric information means companies are engaged in what amounts to an information war, where the goal is as much to obscure real value from rivals and clients as to discover it, so as to profit from the distortion. In such an atmosphere markets stop being able to perform the informative role Friedrich Hayek thought was their very purpose. Here’s Pasquale himself:
…financialization has created enormous uncertainty about the value of companies, homes, and even (thanks to the pressing need for bailouts) the once rock solid promises of governments themselves.
Finance thrives in this environment of radical uncertainty, taking commissions in cash as investors (or, more likely, their poorly monitored agents) race to speculate on or hedge against an ever less knowable future. (138)
Okay, if Pasquale has clearly laid out the problem, what is his solution? I could go through a list of his suggestions, but I will stick to the general principle. Pasquale’s goal, I think, is to restore our faith in our ability to publicly shape digital technology in ways that better reflect our democratic values. The argument that software is unregulable is an assumption, not a truth, and the tools and models for regulation and public input developed over the last century for the physical world are equally applicable to the digital one.
We have already developed a complex, effective system of privacy protections in the form of HIPAA; there are already examples of mandating fair, understandable contracts (as opposed to indecipherable “terms of service” agreements) in the form of various consumer protection provisions; and up until the 1980s we were capable of regulating the boom and bust cycles of markets without crashing the economy. Lastly, the world did not collapse when earlier corporations that had gotten so large they threatened not only the free competition of markets but, more importantly, democracy itself were broken up, and it would not collapse were the likes of Facebook, Google, or the big banks broken up either.
Above all, Pasquale urges us to seek out some way to make the algorithmization of the world intelligible and open to the political, social, and ethical influence of a much broader segment of society than the current group of programmers and their paymasters, who have so far been the only ones running the show. For if we do not assert such influence, and algorithms continue to structure more and more of our relationship with the world and each other, then algorithmization and democracy would seem to be on a collision course. Or, as Taylor Owen pointed out in a recent issue of Foreign Affairs:
If algorithms represent a new ungoverned space, a hidden and potentially ever-evolving unknowable public good, then they are an affront to our democratic system, one that requires transparency and accountability in order to function. A node of power that exists outside of these bounds is a threat to the notion of collective governance itself. This, at its core, is a profoundly undemocratic notion—one that states will have to engage with seriously if they are going to remain relevant and legitimate to their digital citizenry who give them their power.
Pasquale has given us an excellent start to answering the question of how democracy, and freedom, can survive in the age of algorithms.