Inside the physical footprint of the Cloud
Anthropologist Steven Gonzalez Monserrate draws on five years of research and fieldwork in server farms to illustrate some of the diverse environmental impacts of data storage.
The following article is excerpted from anthropologist Steven Gonzalez Monserrate’s case study “The Cloud Is Material: On the Environmental Impacts of Computation and Data Storage.” It originally appeared on The MIT Press Reader.
Screens brighten with the flow of words. Perhaps they are emails, hastily scrawled on smart devices, or emoji-laden messages exchanged between friends or families. On this same river of the digital, millions flock to binge their favorite television programming, to stream pornography, or enter the sprawling worlds of massively multiplayer online roleplaying games, or simply to look up the meaning of an obscure word or the location of the nearest COVID-19 testing center.
Whatever your query, desire, or purpose, the internet provides, and all of the complexity of everything from unboxing videos to do-it-yourself blogs is contained within infinitely complex strings of bits. As they travel across time and space at the speed of light, beneath our oceans in fiber optic cables thinner than human hairs, these dense packets of information, instructions for pixels or characters or frames encoded in ones and zeros, unravel to create the digital veneer before you now. The words you are reading are a point of entry into an ethereal realm that many call the “Cloud.”
While in technical parlance the “Cloud” might refer to the pooling of computing resources over a network, in popular culture, “Cloud” has come to signify and encompass the full gamut of infrastructures that make online activity possible, everything from Instagram to Hulu to Google Drive. Like a puffy cumulus drifting across a clear blue sky, refusing to maintain a solid shape or form, the Cloud of the digital is elusive, its inner workings largely mysterious to the wider public, an example of what MIT cybernetician Norbert Wiener once called a “black box.” But just as the clouds above us, however formless or ethereal they may appear to be, are in fact made of matter, the Cloud of the digital is also relentlessly material.
To get at the matter of the Cloud we must unravel the coils of coaxial cables, fiber optic tubes, cellular towers, air conditioners, power distribution units, transformers, water pipes, computer servers, and more. We must attend to its material flows of electricity, water, air, heat, metals, minerals, and rare earth elements that undergird our digital lives. In this way, the Cloud is not only material, but is also an ecological force. As it continues to expand, its environmental impact increases, even as the engineers, technicians, and executives behind its infrastructures strive to balance profitability with sustainability. Nowhere is this dilemma more visible than in the walls of the infrastructures where the content of the Cloud lives: the factory-libraries where data is stored and computational power is pooled to keep our cloud applications afloat.
Cloud the carbonivore
It is four in the morning when the incident occurs. At that moment, I am crouched on the floor of one of the containment aisles of the data center, computers arrayed like book stacks in a library on either side of me. The clamor of server fans makes it nearly impossible for me to hear Tom, the senior technician I am shadowing, explain to me how to pry open a faulty floor tile. With a specialized tool, I remove the white square tile from its hinges, noticing tiny perforations etched on its surface, points of ingress designed to help cool air rush up from a vast, pressurized cavity beneath us called a “plenum.” I set the tile aside, feeling a rush of cold tickle my nose as a gust of chill whips up from the exposed underfloor plenum. I go about replacing the tile, using one with more notches to improve airflow to this particular cluster of dense computing equipment. That is when I hear the alarms go off. Amid a sea of blinking green and blue lights, an entire rack of computers suddenly scintillates yellow, and then, after a few seconds, a foreboding red. In that instant, panic sweeps over Tom’s face, which flushes crimson as he scrambles to contain the calamity unfolding around us.
“They’re overheating,” Tom says, upon inspecting the thermal sensors, sweat dripping from his brow.
I feel the heat swarming the air. The flood of warmth seeps into the servers faster than the heat sinks printed onto their circuit boards can abate it, faster than the fans can expel the hot air recycling in a runaway feedback loop of warming. The automatic shutdown sequence begins, and Tom curses, reminding me that every minute of downtime, of service interruption, may cost the company many thousands of dollars. Within two minutes, however, the three massive air conditioning units that had been idling in a standby state activate to full power, flooding the room with an arctic chill and restoring order to the chaotic scene.
In the vignette above, which draws on my ethnographic fieldnotes, I recount an episode that data center technicians refer to as a “thermal runaway event,” a cascading failure of cooling systems that interrupts the functioning of the servers that process, store, and facilitate everything online. The molecular frictions of digital industry, as this example shows, proliferate as unruly heat. The flotsam and jetsam of our digital queries and transactions, the flurry of electrons flitting about, warm the medium of air. Heat is the waste product of computation, and if left unchecked, it becomes a foil to the workings of digital civilization. Heat must therefore be relentlessly abated to keep the engine of the digital thrumming in a constant state, 24 hours a day, every day.
To quell this thermodynamic threat, data centers overwhelmingly rely on air conditioning, a mechanical process that refrigerates the gaseous medium of air, so that it can displace or lift perilous heat away from computers. Today, power-hungry computer room air conditioners (CRACs) or computer room air handlers (CRAHs) are staples of even the most advanced data centers. In North America, most data centers draw power from “dirty” electricity grids, especially in Virginia’s “data center alley,” the site of 70 percent of the world’s internet traffic in 2019. To cool, the Cloud burns carbon, what Jeffrey Moro calls an “elemental irony.” In most data centers today, cooling accounts for more than 40 percent of electricity usage.
While some of the most advanced, “hyperscale” data centers, like those maintained by Google, Facebook, and Amazon, have pledged to make their sites carbon-neutral via carbon offsetting and investment in renewable energy infrastructures like wind and solar, many of the smaller-scale data centers that I observed lack the resources and capital to pursue similar sustainability initiatives. Smaller-scale, traditional data centers have often been set up within older buildings that are not optimized for ever-changing power, cooling, and data storage capacity needs. Since the emergence of hyperscale facilities, many companies, universities, and others who operate their own small-scale data centers have begun to transfer their data to hyperscalers or cloud colocation facilities, citing energy cost reductions.
According to a Lawrence Berkeley National Laboratory report, if the entire Cloud shifted to hyperscale facilities, energy usage might drop as much as 25 percent. Absent any regulatory body or agency to incentivize or enforce such a shift in our infrastructural configuration, other solutions have been proposed to curb the Cloud’s carbon problem. Some have proposed relocating data centers to Nordic countries like Iceland or Sweden, in a bid to utilize ambient, cool air to minimize carbon footprint, a technique called “free cooling.” However, network signal latency makes this dream of a haven for green data centers largely untenable for meeting the computing and data storage demands of the wider world.
As a result, the Cloud now has a greater carbon footprint than the airline industry. A single data center can consume the equivalent electricity of 50,000 homes. At 200 terawatt hours (TWh) annually, data centers collectively devour more energy than some nation-states. Today, the electricity utilized by data centers accounts for 0.3 percent of overall carbon emissions, and if we extend our accounting to include networked devices like laptops, smartphones, and tablets, the total shifts to 2 percent of global carbon emissions.
Why so much energy? Beyond cooling, the energy requirements of data centers are vast. To meet the pledge to customers that their data and cloud services will be available anytime, anywhere, data centers are designed to be hyper-redundant: If one system fails, another is ready to take its place at a moment’s notice, to prevent a disruption in user experiences. Like Tom’s air conditioners idling in a low-power state, ready to rev up when things get too hot, the data center is a Russian doll of redundancies: redundant power systems like diesel generators, redundant servers ready to take over computational processes should others become unexpectedly unavailable, and so forth. In some cases, only 6 to 12 percent of energy consumed is devoted to active computational processes. The remainder is allocated to cooling and maintaining chains upon chains of redundant fail-safes to prevent costly downtime.
It is late July in Arizona. The sun is white and hot on this cloudless day. I feel it scorch the back of my neck as I follow Jeremy, a junior technician, to the backlot behind a data center, where dozens of shipping containers are arrayed in rows. Amid this 117-degree heat wave, our task is to repair an evaporative cooling system that is failing. We unfasten the screws on one of the exterior panels before entering the shipping container, which I am surprised to learn is actually a modular server cluster. Pipes snake up from tiny channels in the lot, where potable water is pumped from the ground to seep into a spongy filter media. To my eyes, this foamy material resembles a honeycomb or a wasp’s nest (figure 2). The sediment-rich waters of the Colorado River have congealed to form an oozy soot on the porous surface that is not unlike honey. The wet tray of material evaporates quickly in the arid desert air, the roiling cloud of moisture gently cooling the loudly buzzing servers around us, Jeremy explains. This, I learn, is why the shipping container has earned the nickname “The Mouth.”
The Cloud may be a carbonivore, but as the example of “The Mouth” shows, the Cloud is also quite thirsty. Like a pasture, server farms are irrigated. In many data centers today, chilled water is piped through the latticework of server racks to more efficiently cool the facility, liquid being a superior convective agent than air. This shift from cooling air to cooling water is an attempt to reduce carbon footprint, but it comes at a cost. Weathering historic drought and heat domes, communities in the western United States are increasingly strained for water resources. In Mesa, Arizona, where I spent six months researching the emergence of a desert data center hub, some politicians are now openly opposing the construction of data centers, framing the centers’ water usage as inessential and irresponsible given resource constraints. In Bluffdale, Utah, residents are suffering from water shortages and power outages, as a result of the nearby Utah Data Center, a facility of the U.S. National Security Agency (NSA) that guzzles seven million gallons of water daily to operate.
In response to increasing awareness of data centers’ impact on water-stressed communities like Mesa and Bluffdale, companies like Google are pledging to go “water-positive” by 2030, committing to “replenish” 120 percent of the water they consume in their facilities and offices. By implementing costly “closed-loop” water cooling systems, companies like Google and CyrusOne are able to recycle some of the wastewater used in evaporative cooling, though much of the water escapes into the atmosphere during the evaporative process. In addition to optimizing water utilization and minimizing “waste,” Google and others pledge to invest in water infrastructure and community resources to enhance “water stewardship” and “water security.”
Corporate pledges such as these, while laudable, are not enforceable, nor do they appear to be feasible given the explosive growth expected in data storage infrastructures over the next decade, a tripling by some estimates. Media scholar Mél Hogan warns against entrusting “Big Tech” with its own regulation, given the companies’ financial ties to the fossil fuel industry and failure to meet the deadlines of previous pledges to reduce carbon emissions or other kinds of waste.
Per the 2021 Emissions Gap Report authored by the United Nations Environment Programme, global temperatures are projected to rise by 2.7°C by the end of the century. Planetary heating will melt glaciers and raise sea levels. The result will be the salinization of freshwater supplies, proliferation of pathogen growth in stagnant water reservoirs, and the intensification of ongoing processes of desertification, creating near-ubiquitous conditions of water scarcity by 2040 if governments and companies fail to intensify their efforts to curb emissions. While corporate pledges offer no guarantee that data centers will regulate themselves, larger mechanisms of accountability like the recent Climate Neutral Data Centre Pact, a consortium of European data center companies and infrastructure providers promising to become “climate neutral” by 2050, provide a model for larger-scale regulatory initiatives that could make a more substantial impact.
The Cloud is not silent
2019. Brenda Hayward takes a stroll through her sunny neighborhood, past the lovely, green lawn of Chuparosa park in Chandler, Arizona, when she hears it—the noise that haunts her every night as she attempts to sleep. It is there every morning when she wakes up. It is there in the park where her children played when they were young, riffling through the boughs of the palo verde trees, stalking her as she tries to live her life quietly. It began as a dull boom, not unlike the racket of bass-frenzied teenagers partying late into the night. Later, it evolved into a continuous, mechanical whine. She tries not to notice it, she tries to unhear it, but it is there, behind everything, a hellish background track to her life. As a nurse, she knows that the sound is more than mere annoyance. She sees the signs of its toll—hypertension, elevated cortisol—but she cannot stop it. No one can, because it does not sleep.
2020. Lockdown has forced urban residents to remain in their homes to minimize the transmission of COVID-19. For David Gray, cabin fever is the least of his worries. Instead, he and his neighbors at Printer’s Row in downtown Chicago must weather a scourge of a sonic variety. As he mills about his home, as he works, and eats, and bathes, it is there, a monotonal drone, a clatter unceasing, a constant, undesired companion to his life. It festers in his mind, clawing at his thoughts, probing his sanity, poisoning him with a constant spell of dread and anxiety. He cannot leave; he is not allowed to. He cannot escape. He is there, with it, a prisoner to its bewitching monotone.
2021. At Chuparosa Park, I hear it, too. Above the cries of children playing, dogs barking, cars racing by, it soars. My ears prick up with the music of the Cloud, a discordant symphony of text messages, emails, cat videos, and fake news, pulsing, thrumming in my ears. Just past the basketball courts, the picnic tables, and the prickly pears, the source is visible for all to see: a CyrusOne data center.
Over vast distances, the sonic exhaust of our digital lives reverberates: the minute vibrations of hard disks, the rumbling of air chillers, the cranking of diesel generators, the mechanical spinning of fans. Data centers emit acoustic waste, what environmentalists call “noise pollution.” For communities like Brenda’s and David’s, the computational whir of data centers is not merely an annoyance, but a source of mental and physical harm. Brenda, a nurse by training, reported an uptick in her blood pressure and cortisol levels with the onset of the noise. David, a twenty-something software engineer, was diagnosed with hypertension, and meets frequently with a clinical therapist to manage the anxiety caused by the data center’s hum.
Their stories are cautionary tales; they are neither uncommon nor exceptional. The acute and longitudinal physiological effects of industrial noise pollution are well-documented to include hearing loss, elevated stress hormones like cortisol, hypertension, and insomnia. Brenda and David met with other disaffected residents in their respective communities to organize for change. Brenda soon joined the Dobson Noise Coalition, helping to organize a community meeting with her neighbors, city officials, state and federal representatives, and employees of CyrusOne, the offending data center. David took a stand with others in his building, mobilizing the Chicago Department of Public Health to file a noise complaint on their behalf and successfully obtaining a hearing for a noise pollution violation. While the efforts of these communities to minimize the noise pollution harming them are ongoing, they are resigned to modest goals to improve rather than solve the problem. Unlike other industries, data centers are largely self-regulating: There is no sweeping federal agency to govern the siting and operation of new and existing facilities.
Because data center noise is unregulated by political authorities, facilities can be built in close proximity to residential communities. Given the subjective nature of hearing, the history of noise regulation might best be characterized by a series of contests over expertise and the “right” to quiet, as codified in liberal legal regimes. Over the course of my fieldwork with the communities of Chandler and Printer’s Row, I learned that the “noise” of the Cloud uniquely eludes regulatory schemes. In many cases, the loudness of the data centers, as measured in decibels (dB), falls below the threshold of intolerance as prescribed by local ordinances. For this reason, when residents contacted the authorities to intervene, to attenuate or quiet the noise, no action was taken, because the data centers had not technically violated the law, and their properties were zoned for industrial purposes. However, upon closer interrogation of the sound, some residents reported that the monotonal drone, a frequency hovering within the range of human speech, is particularly disturbing, given the attuned sensitivity of human ears to discern such frequencies above others. Even so, there were days when the data centers, running diesel generators, vastly exceeded permissible decibel thresholds for noise. As with water and carbon, local companies like CyrusOne pledged in community meetings to take steps to attenuate their sound, though these were unenforceable promises that, to date, they have failed to keep.
Since 2007, when the first smartphone debuted on the marketplace, over seven billion such devices have been manufactured. Their lifespans average less than two years, a consequence of designed obsolescence and a thirst to profit from flashy new features and capabilities. Meanwhile, the material and political conditions of their manufacture, and the resources required for their production, remain obscured. Under grueling conditions, miners tirelessly plumb the earth for the rare metals required to make information and communications technology (ICT) devices. Then, in vast factories like those of Foxconn in the Global South, where labor can be procured cheaply and legal protections for workers are scant, smartphones are assembled and shipped out to consumers, only to be discarded in a matter of months and end up in e-waste graveyards like those of Agbogbloshie, Ghana. These metals, many of which are toxic and contain radioactive elements, take millennia to decay. The refuse of the digital is ecologically transformative.
Historian Nathan Ensmenger writes that a single desktop computer requires 240 kilograms of fossil fuels, 22 kilograms of chemicals, and 1,500 kilograms of water to manufacture. The servers that fill the halls of data centers are dense, specialized assets, with some units valued in the tens of thousands of U.S. dollars. Cables, batteries, uninterruptible power supplies (UPS), air conditioners (CRACs and CRAHs), power distribution units (PDUs), and transformers are also periodically decommissioned and disposed of, when warranties expire and units fail to perform to the high standards of reliability and redundancy set by entities like the Uptime Institute. Some of these components contain toxic polychlorinated biphenyls (PCBs) and must be disposed of rather than reused. Efforts are underway in Europe and elsewhere to augment facility and equipment designs to extend the lifespan of units, more easily accommodate repair, and formalize a system of exchange to recycle old equipment using “materials passports” that precisely document unit histories, not unlike CARFAX. Even with these sustainability initiatives in place, environmental organizations like Greenpeace estimate that less than 16 percent of the tons of e-waste generated annually is recycled.
The ecological dynamics we find ourselves in are not entirely a consequence of design limits, but of human practices and choices—among individuals, communities, corporations, and governments—combined with a deficit of will and imagination to bring about a sustainable Cloud. The Cloud is both cultural and technological. Like any aspect of culture, the Cloud’s trajectory—and its ecological impacts—are not predetermined or unchangeable. Like any aspect of culture, they are mutable.