The Cloud Is Material: On the Environmental Impacts of Computation and Data Storage

In the age of machine learning, cryptocurrency mining, and seemingly infinite data storage capacity enabled by cloud computing, the environmental costs of ubiquitous computing in modern life are obscured by the sheer complexity of infrastructures and supply chains involved in even the simplest of digital transactions. How does computation contribute to the warming of the planet? As information technology (IT) capacity demands continue to trend upward, what are some of the ecological obstacles that must be overcome to accommodate an ever-expanding, carbon-hungry Cloud? How do these material impacts play out in everyday life, behind the scenes, where servers, fiber optic cables, and technicians facilitate cloud services? This case study draws on firsthand ethnographic research in data centers—sprawling libraries of computer servers that facilitate everything from email to commerce—to identify some of the far-reaching and tangled environmental impacts of computation and data-storage infrastructures. It surveys a range of empirical accounts of server technicians to illustrate on-the-ground examples of material and ecological factors that permeate everyday life in the Cloud. These examples include air conditioning and thermal management, water cycling, and the disposal of e-waste. By attending to the culture of workplace practice and the behaviors and training of technicians in data centers, this case study reveals that the Cloud is not fully automated, nor is it hyperrational; emotion, instinct, and human judgment are enlisted to keep servers running. This case study closes with a speculative vignette that scales up from various local impacts to a planetary framework, sketching some of the particular ways that computation contributes to climate change and the Anthropocene.

…with more notches to improve airflow to this particular cluster of dense computing equipment. That is when I hear the alarms go off. Amid a sea of blinking green and blue lights, an entire rack of computers suddenly scintillates yellow, and then, after a few seconds, a foreboding red. In that instant, panic sweeps over Tom's face, and he too is flushed and crimson as he scrambles to contain the calamity unfolding around us.
"They're overheating," Tom says, upon inspecting the thermal sensors, sweat dripping from his brow.
I feel the heat swarming the air. The flood of warmth seeps into the servers faster than the heat sinks mounted on their circuit boards can dissipate it, faster than the fans can expel the hot air recirculating in a runaway feedback loop of warming. The automatic shutdown sequence begins, and Tom curses, reminding me that every minute of downtime, of service interruption, can cost the company many thousands of dollars.
Within two minutes, however, the three massive air conditioning units that had been idling in a standby state activate to full power, flooding the room with an arctic chill and restoring order to the chaotic scene.
In the vignette above, which draws on my ethnographic fieldnotes, I recount an episode that data center technicians refer to as a "thermal runaway event," a cascading failure of cooling systems that interrupts the functioning of the servers that process, store, and facilitate everything online (Figure 1). The molecular frictions of digital industry, as this example shows, proliferate as unruly heat. The flotsam and jetsam of our digital queries and transactions, the flurry of electrons flitting about, warm the medium of air. 6 Heat is the waste product of computation, and if left unchecked, it becomes a foil to the workings of digital civilization. 7 Heat must therefore be relentlessly abated to keep the engine of the digital thrumming in a constant state, twenty-four hours a day, every day. To quell this thermodynamic threat, data centers overwhelmingly rely on air conditioning, a mechanical process that refrigerates the gaseous medium of air so that it can displace or lift perilous heat away from computers. 8 Today, power-hungry computer room air conditioners (CRACs) and computer room air handlers (CRAHs) are staples of even the most advanced data centers. In North America, most data centers draw power from "dirty" electricity grids, especially in Virginia's "data center alley," the site of 70 percent of the world's internet traffic in 2019. 9 To cool, the Cloud burns carbon, what Jeffrey Moro calls an "elemental irony." 10 In most data centers today, cooling accounts for more than 40 percent of electricity usage. 11 While some of the most advanced, "hyperscale" data centers, like those maintained by Google, Facebook, and Amazon, have pledged to transition their sites to carbon neutrality via carbon offsetting and investment in renewable energy infrastructures like wind and solar, many of the smaller-scale data centers that I observed lack the resources and capital to pursue similar sustainability initiatives. 12 Smaller-scale, traditional data centers have often been set up within older buildings that are not optimized for ever-changing power, cooling, and data storage capacity needs. Since the emergence of hyperscale facilities, many companies, universities, and others who operate their own small-scale data centers have begun to transfer their data to hyperscalers or cloud providers.

Why so much energy?

Beyond cooling, the energy requirements of data centers are vast. To meet the pledge to customers that their data and cloud services will be available anytime, anywhere, data centers are designed to be hyper-redundant: if one system fails, another is ready to take its place at a moment's notice, to prevent a disruption in user experience. Like Tom's air conditioners idling in a low-power state, ready to rev up when things get too hot, the data center is a Russian doll of redundancies: redundant power systems like diesel generators, redundant servers ready to take over computational processes should others become unexpectedly unavailable, and so forth. In some cases, only 6-12 percent of the energy consumed is devoted to active computational processes. 19 The remainder is allocated to cooling and to maintaining chains upon chains of redundant fail-safes to prevent costly downtime.
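To make these proportions concrete, the split can be sketched with illustrative numbers. Only the "more than 40 percent to cooling" and "6-12 percent to computation" figures come from this case study; the totals and the function name below are hypothetical.

```python
# Sketch of the energy arithmetic described above (illustrative numbers,
# not measurements from any specific facility).

def compute_share(total_kwh, cooling_kwh, redundancy_kwh):
    """Fraction of a facility's energy left for active computation."""
    return (total_kwh - cooling_kwh - redundancy_kwh) / total_kwh

total = 1000.0       # total monthly draw, kWh (hypothetical)
cooling = 420.0      # just over 40 percent to cooling, per the text
redundancy = 470.0   # idling fail-safes, standby servers, generators
active = compute_share(total, cooling, redundancy)
print(f"{active:.0%} of energy reaches active computation")  # prints "11% ..."
```

With these assumed inputs, the active share lands at 11 percent, within the 6-12 percent range reported in the text; the point of the sketch is that redundancy, not computation, dominates the ledger.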
That being said, there are two computational processes performed by servers that are particularly energy-intensive and are of increasing concern to scholars, activists, and data center industry professionals: 1) machine learning and 2) cryptocurrency mining.

Precipitations
It is late July in Arizona. The sun is white and hot on this cloudless day. I feel it scorch the back of my neck as I follow Jeremy, a junior technician, to the backlot behind a data center, where dozens of shipping containers are arrayed in rows. Amid this 117-degree heat wave, our task is to repair an evaporative cooling system that is failing.
We unfasten the screws on one of the exterior panels before entering the shipping container, which I am surprised to learn is actually a modular server cluster. Pipes snake up from tiny channels in the lot, where potable water is pumped up from the ground, to seep into a spongy filter medium. To my eyes, this foamy material resembles a honeycomb or a wasp's nest (Figure 2). The sediment-rich waters of the Colorado River have congealed into an oozy soot on the porous surface that is not unlike honey. Water in the wet tray of material evaporates quickly in the arid desert air, the roiling cloud of moisture gently cooling the loudly buzzing servers around us, Jeremy explains. This, I learn, is why the shipping container has earned the nickname "The Mouth."

The Cloud may be a carbonivore, but as the example of "The Mouth" shows, the Cloud is also quite thirsty. Like a pasture, server farms are irrigated. In many data centers today, chilled water is piped through the latticework of server racks to cool the facility more efficiently, liquid being a superior convective agent to air. This shift from cooling with air to cooling with water is an attempt to reduce carbon footprints, but it comes at a cost. Weathering historic drought and heat domes, communities in the western United States are increasingly strained for water resources. In Mesa, Arizona, where I spent six months researching the emergence of a desert data center hub, some politicians are now openly opposing the construction of data centers, framing the centers' water usage as inessential and irresponsible given resource constraints. 30 In Bluffdale, Utah, residents are suffering from water shortages and power outages as a result of the nearby Utah Data Center, a facility of the US National Security Agency (NSA) that guzzles seven million gallons of water daily to operate. 31
In response to increasing awareness of data centers' impact on water-stressed communities like Mesa and Bluffdale, companies like Google are pledging to go "water-positive" by 2030, committing to "replenish" 120 percent of the water they consume in their facilities and offices. 32 By implementing costly "closed-loop" water cooling systems, companies like Google and CyrusOne are able to recycle some of the wastewater used in evaporative cooling, though much of the water escapes into the atmosphere during the evaporative process. 33 In addition to optimizing water utilization and minimizing "waste," Google and others pledge to invest in water infrastructure and community resources to enhance "water stewardship" and "water security." 34 Corporate pledges such as these, while laudable, are not enforceable, nor do they appear to be feasible given the explosive growth expected in data storage infrastructures over the next decade, a tripling by some estimates. 35

While the efforts of these communities to minimize the noise pollution harming them are ongoing, they are resigned to modest goals to improve rather than solve the problem. 41 Unlike other industries, data centers are largely self-regulating: there is no sweeping federal agency to govern the siting and operation of new and existing facilities.
Because data center noise is unregulated by political authorities, facilities can be built in close proximity to residential communities. Given the subjective nature of hearing, the history of noise regulation might best be characterized as a series of contests over expertise and the "right" to quiet, as codified in liberal legal regimes. Over the course of my fieldwork with the communities of Chandler and Printer's Row, I learned that the "noise" of the Cloud uniquely eludes regulatory schemes. 42 In many cases, the loudness of the data centers, as measured in decibels (dB), falls below the threshold of intolerance prescribed by local ordinances. 43 For this reason, when residents contacted the authorities to intervene, to attenuate or quiet the noise, no action was taken: the data centers had not technically violated the law, and their properties were zoned for industrial purposes. 44 However, upon closer interrogation of the sound, some residents reported that the monotonal drone, a frequency hovering within the range of human speech, is particularly disturbing, given the attuned sensitivity of human ears to such frequencies above others. Even so, there were days when the data centers, running diesel generators, vastly exceeded permissible decibel thresholds for noise. As with water and carbon, local companies like CyrusOne pledged in community meetings to take steps to attenuate their sound, though these were unenforceable promises that, to date, they have failed to keep.
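Part of this regulatory bind comes down to how decibels combine: sound levels add logarithmically, so a steady hum that sits under a legal limit can be pushed well past it when diesel generators switch on. A minimal sketch, using hypothetical levels (the 65 dB hum and 78 dB generator are illustrative, not measurements from the fieldwork described here):

```python
import math

def combined_db(levels):
    """Combine incoherent sound sources: 10 * log10(sum of 10^(L/10))."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels))

# Two identical sources add only about 3 dB...
print(round(combined_db([65, 65]), 1))  # 68.0

# ...but a loud intermittent source dominates the total: a 65 dB hum
# plus a 78 dB generator is effectively the generator alone.
hum, generator = 65, 78  # hypothetical dB levels
print(round(combined_db([hum, generator]), 1))  # 78.2
```

The asymmetry is the point: a facility measured on a quiet day can be compliant, while the same site running generators lands far above any residential threshold.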
Discussion Question: What are some technological, legal, political, and societal pathways to ending the noise pollution caused by cloud storage facilities? What are some of the challenges to implementing or realizing these proposed solutions?

Immortal Waste
With both hands, I haul a pushcart across the server room floor and down six storeys on service elevators before depositing its contents, obsolete and decommissioned servers and equipment, into a haphazard pile labeled "decommissioned assets." Under the supervision of senior technician Ricardo, I cut the elastic asset tags off every piece of discarded equipment, and then scan them into a terminal to finalize the rendering of these valuable computing components as null and void.
"What happens to them, now?" I ask innocently.
"We recycle what we can, but most of it gets dumped to god knows where," the technician grumbles, helping me lift the last of the heavy servers onto the mountain of rubble before my feet. Since

The Cloud Is Cultural
It has been a week since the overheating incident. Tom leads me down a corridor of server racks to the site of the thermal anomaly, rack C9. I notice that some of the racks do not contain servers. Instead, the empty sockets are bracketed off with blanking panels, which keep cool air from escaping the pressurized aisles. Without consulting his instruments or tablet, Tom starts ripping out the blanking panels, handing them to me to stack neatly (Figure 6).
"What are you doing?" I ask.
"I have a hunch this aisle is starved for air," Tom says. "Can't you hear it, the way the fans are groaning?"

I hear nothing unusual in the mechanical whir of the fans. "Not really."

"Look, the fluid dynamics model says these racks are just fine," Tom says, gesturing to the flashing green and yellow lights, "but you saw what happened last week. These models don't capture everything. Sometimes, you just have to trust your gut."

I put my hands in the empty sockets of the rack, feeling cold air tickle my fingertips.

"How do you know how many you need to take out? Do you have to measure the airflow or do some math or calculations?"

Tom shoots me a glare, eyebrows furrowed. "When you've been doing this as long as I have, you get a feel for things. Kinda hard to explain. It's not all numbers and curves and ratios here. We're caretakers, not robots."

I set down the blanking panels. "Caretakers?"

"The servers need to breathe, they need fuel just like we do, and our job is to keep them running at whatever cost, because ultimately the responsibility falls on us if they shut down. You see, our asses are on the line. If they go down, we go down."

"That sounds stressful."

"Fear is a constant part of the job. So much can go wrong here. Mechanical failure. Power failure. Human error. Our goal is to prevent all downtime, but that's not humanly possible, so we do the best we can. We try to be as reliable as possible. 99.9 percent uptime or whatever we can manage, but we're only human, after all."

When I first started working in data centers as an ethnographic observer, I expected that the experience might teach me how to think like an engineer, how to see the world through quantitative logics and thermodynamic principles. As an anthropologist, I have been trained to be sensitive to my own biases, to do my best to leave stereotypes at the door when researching a new community. Even so, I expected people like Tom to be more like "robots" than "caretakers." I mistakenly assumed that the culture of the Cloud would mirror the technologies and artifacts that make it up. I expected to be immersed in cold mechanical rationality, not to spend my days with fearful "caretakers" making decisions based on "gut feelings" rather than equations.
Over the last five years of ethnographic research and fieldwork, I have come to learn the various ways that the story of the data center is as much about humans as it is about air conditioners, servers, and the messy cables I spent hours untangling as part of my initiation into this niche world. The Cloud's stewards have a culture of their own.
As Tom puts it, he is not an automaton that can be easily replaced; he is, rather, a "caretaker." Given the enormous financial cost of every minute of downtime in data centers, incidents attributable to human error are often "career-ending" for those involved.

At an industry meetup in Boston, one data center manager shared with me that the overwhelming responsibility and constant fear of downtime in his data center had led his doctor to diagnose him with hypertension. Another, a senior technician in an Arizona data center, reported that he met with a therapist weekly to manage his "server stress." Wracked with fear, the data center managers I observed tended to rely on their own senses and instincts rather than cede too much ground to computational models or instruments. They preferred the "tried and true" over the abstracted readings and provisional models afforded by instruments and labyrinthine dashboards that display everything from network latency to power consumption at the rack level.
In the early 2000s, this same fear led many technicians to resort to the wasteful practice of "flood cooling" to eliminate hotspots and stop thermal runaway events from causing downtime. Flood cooling was about "making the room cold at whatever cost," as one technician characterized it to me, like "using a flamethrower to light a candle." 51 These tales of what Tom calls the "Wild West era" in data center management demonstrate the role of culture and behavior as we strive to understand the material impacts of data storage more holistically. "Flood cooling" is an example of a practice or norm, not a limitation of design, that led to large amounts of unnecessary energy waste in the earlier days of the data center industry.
While more reliable technologies and heightened awareness of climate change have made "flood cooling" practices largely defunct today, I still observed behaviors in the data center that were similarly motivated by fear or "hunches," like the one Tom had about aisle C9. The behaviors and practices of technicians I observed in data centers suggest that accounts of the ecological impacts of the Cloud must extend beyond the realm of design to consider practice. A holistic approach to environmental reform in the data center industry requires consideration of the workplace culture and norms within data centers. How does culture impact energy-efficiency outcomes? How might attention to training and workplace operations shift the focus of sustainability efforts?

"I can't believe they are letting such large groups assemble today," Tony says, his reptilian jowls synced perfectly to his speech.
"Maybe some of them are just bots," Ines sighs, "though it feels real enough."

"I've missed seeing you guys," Esmeralda says, using the best buds emote to wrap her arms around her pals. "Next run, I want to go bodysurfing on the shores of Titan."

"Sounds like a blast!" Ines exclaims, her avatar jumping to embody her glee.

"Or you can just take off your headset and go outside," Tony scowls. "There's plenty of extra shoreline these days."

"That's not funny!" Esmeralda says, but in that instant, an alarm blares, and the simulated masses and pixelated fireworks freeze.

Esmeralda tosses her headset into her lap with a loud sigh. "We are sorry to interrupt your experience. As you know, bandwidth rationing is in effect. We hope to see you again when your data allowance refills. Thank you for choosing Simphony™." Esmeralda sets the device aside and creeps over to the window, where she looks out at the Manhattan skyway, a new system of smart roads built to replace the many submerged and permanently flooded sections south of 51st Street. She wanted to return to the lush worlds of Simphony™, to escape the ruin that her world was becoming, but the Cloud was finite. Network activities had to be rationed, per government regulations, to minimize environmental impacts.
Sociologist Ruha Benjamin, whose research focuses on how science and technology can both entrench and help to address longstanding problems like systemic racism, […] its servers and redundant chains of idling equipment designed to make the digital available anytime, anywhere. Instead, the scenario I envision above depicts a world in which tech companies continue to expand, seeking profits and remaining largely unregulated, until the cumulative effects of their industry so greatly disrupt a warming world that governments and publics have no other choice but to intervene, albeit too late.
In this warming epoch that earth scientists have named the "Anthropocene," in which climate models anticipate cataclysmic climate futures, speculation is no longer the sole province of fiction writers. 54 The survival of civilization now hinges on our collective capacity to envision and realize a sustainable future. 55 Speculation requires the imagination of worlds otherwise: worlds that might be or might have been. 56 In this case study, I have introduced some of the material impacts of the Cloud, from carbon emissions to water usage, noise, and toxic e-waste. The case study illustrates that the ecological dynamics we find ourselves in are not entirely a consequence of design limits but of human practices and choices, among individuals, communities, corporations, and governments, combined with a deficit of will and imagination to bring about a sustainable Cloud. The Cloud is both cultural and technological. Like any aspect of culture, the Cloud's trajectory and its ecological impacts are neither predetermined nor unchangeable; they are mutable.