Opinion: Data isn’t the new oil — it’s the new nuclear power

Jul 17, 2018

Data is a valuable, powerful commodity — but unlike oil, it is unlimited in quantity and in its capacity for harm, says technology thinker James Bridle.

The phrase “data is the new oil” was apparently coined in 2006 by Clive Humby, the British mathematician and architect of the Tesco Clubcard, a supermarket reward program. Since then, it has been repeated and amplified, first by marketers, then by entrepreneurs, and ultimately by business leaders and policy makers.

In 2017, the president and CEO of Mastercard told an audience in Saudi Arabia, the world’s largest producer of actual oil, that data could be as effective a means of generating wealth as crude. (He also said it was a “public good.”) Around the same time, in British parliamentary debates on leaving the EU, data’s oily qualities were cited by members of Parliament on both sides. Yet few of these citations address the implications of long-term, systemic and global reliance on such a poisonous material — or the dubious circumstances of its extraction.

In Humby’s formulation, data resembled oil because “it’s valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals, etc. to create a valuable entity that drives profitable activity; so must data be broken down, analyzed for it to have value.” The emphasis on the work that is required to make information useful has been lost over the years, aided by processing power and machine intelligence, to be replaced by pure speculation. In the process of simplification, the analogy’s historical ramifications — as well as its present dangers and its long-term repercussions — have been forgotten.

Our thirst for data, like our thirst for oil, is historically imperialist and colonialist, and it’s tightly tied to capitalist networks of exploitation. The same empires first occupied, then exploited, the natural reserves of their possessions, and the networks they created live on in the digital infrastructures of the present day: the information superhighway follows the networks of telegraph cables laid down to control old empires.

Just as the fastest data routes from West Africa to the world still run through London, so the British-Dutch multinational Shell continues to exploit the oil of the Nigerian delta. The subsea cables girding South America are owned by corporations based in Madrid, even as countries there struggle to control their own oil profits. Fiberoptic connections funnel financial transactions by way of offshore territories quietly retained through periods of decolonization. Empire has mostly relinquished territory, only to continue its operations and maintain its power in the form of networks. Data-driven regimes repeat the racist, sexist and oppressive policies of their antecedents because these biases and attitudes have been encoded into them at the root.

In the present, the extraction, refinement and use of data/oil pollutes the ground and air. It spills. It leaches into everything. It gets into the groundwater of our social relationships, and it poisons them. It makes us think like computers, driving the deep divisions in society caused by misbegotten classification, fundamentalism and populism and accelerating inequality. It sustains and nourishes unequal power relationships. In most of our interactions with power, data is not something that is freely given but is forcibly extracted. Or it’s expelled in moments of panic, like a stressed cuttlefish attempting to cloak itself from a predator.

The ability of politicians, policy makers and technocrats to talk approvingly of data/oil today should be shocking, given what we know about climate change, if we were not already so numb to their hypocrisy. This data/oil will remain hazardous well beyond our own lifetimes. The debt we’ve already accrued will take centuries to dissipate, and we have not yet come close to experiencing its worst, inevitable effects.

Data/oil is insufficient as an analogy because it might give us false hope of a peaceful transition to an information-free economy. Oil is, despite everything, defined by its exhaustibility. We’re already approaching peak oil, and while every oil shock prompts us to engage and exploit some new territory or destructive technology — further endangering the planet and ourselves — the wells will eventually run dry.

The same is not true of information, despite the desperate fracking that appears to be occurring when intelligence agencies record every email, every mouse click, and the movements of every cell phone. While peak knowledge may be closer than we think, the exploitation of raw information can continue infinitely, along with the damage it does to us and our ability to reckon with the world.

In this way, information more closely resembles atomic power than oil — an effectively unlimited resource that still contains immense destructive power and that’s even more explicitly connected to histories of violence. Viewing information as akin to atomic power might, however, force us to confront existential questions of time and contamination in ways that petroculture has mostly managed to avoid.

Computational thinking — thinking like a machine — evolved through early computers and was supercharged to build the atomic bomb; the architecture of contemporary processing and networking was forged in the crucible of the Manhattan Project. And like nuclear waste, data leaks and breaches: critical excursions and chain reactions lead to privacy meltdowns and the collapse of governments. These analogies are not mere speculations: they are the inherent and totalizing effects of our social and engineering choices. Just as the once-secret mesa town of Los Alamos finds its contemporary equivalent in the NSA data centers under construction in the Utah desert, so Enrico Fermi’s black graphite reactor is reified today both in the opaque glass and steel of NSA’s headquarters at Fort Meade, Maryland, and in the endless, inscrutable server racks of Google, Facebook, Amazon, Palantir, Lawrence Livermore, Sunway TaihuLight, and the National Defense Management Center.

Just as we spent 45 years locked in a Cold War perpetuated by the specter of mutually assured destruction, we find ourselves in an intellectual, ontological dead end today. The primary method we have for evaluating the world — more data — is faltering. It’s failing to account for complex, human-driven systems, and its failure is becoming obvious — not least because we’ve built a vast, planet-spanning, information-sharing system for making it obvious to us. The mutually assured privacy meltdown of state surveillance and leak-driven countersurveillance activism is one example of this failure, as is the confusion caused by real-time information overload from surveillance itself. So is the discovery crisis in the pharmacological industry, where billions of dollars in computation are returning exponentially fewer drug breakthroughs.

Perhaps the most obvious failure is that despite the sheer volume of information that exists online — the plurality of moderating views and alternative explanations — conspiracy theories and fundamentalism aren’t merely surviving; they are proliferating. As in the nuclear age, we learn the wrong lesson over and over again. We stare at the mushroom cloud and see all of this power, and we enter into an arms race all over again.

But what we should be seeing instead is the network itself, in all of its complexity. Our current ways of thinking about the world can no more survive exposure to this totality of raw information than we can survive exposure to an atomic core. An atomic understanding of information presents such a cataclysmic conception of the future that it forces us to insist upon the present as the only domain for action.

In contrast to dys/utopian imaginings of the future, one strand of environmental and atomic activism posits the notion of guardianship. Guardianship takes full responsibility for the toxic products of atomic culture, even and especially when they have been created for our ostensible benefit. It’s based on the principles of doing the least harm in the present and of our responsibility to future generations — but it does not presume that we can know or control them.

Guardianship calls for change while taking on the responsibility for what we’ve already created — insisting, for instance, on keeping radioactive materials where they can be monitored, because deep burial precludes such oversight and risks widespread contamination. In this, it aligns itself with the new dark age: a place where the future is radically uncertain and the past irrevocably contested, but where we are still capable of speaking directly to what is in front of us, of thinking clearly and acting with justice. Guardianship insists that these principles require a moral commitment that is beyond the abilities of pure computational thinking, but well within, and utterly appropriate to, our darkening reality.

Ultimately, any strategy for living in the new dark age depends upon attention to the here and now — and not to the illusory promises of computational prediction, surveillance, ideology and representation. The present is always where we live and think, poised between an oppressive history and an unknowable future. The technologies that so inform and shape our contemporary perceptions of reality are not going to go away, and in many cases we should not wish them to. Our current life-support systems, on a planet of 7.5 billion and rising, utterly depend upon them.

Our understanding of those systems and their ramifications, and of the conscious choices we make in their design, remains entirely within our capabilities. We are not powerless, not without agency, and not limited by darkness. We only have to think, and think again, and keep thinking. The network — us and our machines and the things we think and discover together — demands it.

Excerpted from New Dark Age: Technology and the End of the Future by James Bridle. Copyright © 2018 by James Bridle. Used with the permission of Verso Books.