For about as long as the internet has existed, there have been questions about how much energy our online surfing actually takes up. With climate change now pounding on our front doors, these questions have spiraled into studies finding that our Netflix habits and increased Zoom time might add to our carbon footprints. But several experts have argued that the greenhouse gas ramifications of a person’s internet use are more complicated than that.
“Most of the time these claims are exaggerated,” says Jonathan Koomey, a researcher focused on the intersection of climate and technology, who wrote a recent commentary in Cell about the common mistakes in these reports. “There are some [researchers] who are engaging in good faith who make some common mistakes, and then there are serial disinformers who keep saying the same wrong thing.”
For folks who are spending a lot more time in front of a screen and feel guilty about it, climate-wise, all of this information can feel contradictory and confusing. But you don’t have to be a computer scientist to understand the relationship between the internet and energy efficiency.
Most computing data is outdated
As early as the 1960s, businessman and engineer Gordon Moore predicted that the density of transistors on a microchip would double about every year, making computer chips cheaper, faster, physically smaller, and more efficient. By 1975, he had adjusted that prediction to a doubling every two years, and up until 2000 that’s pretty much what was happening, Koomey says. Even as people started using more computing power, energy use stayed somewhat level because more transistors could get squished onto a computer chip.
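To get a feel for how quickly that kind of doubling compounds, here’s a rough back-of-the-envelope sketch in Python using Moore’s revised two-year figure; it’s illustrative math, not a measurement of any real chip:

```python
# Rough back-of-the-envelope math: how much transistor density grows
# if it doubles every two years, per Moore's revised 1975 prediction.
# Purely illustrative; real chips didn't track the curve exactly.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Multiplicative growth after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

# From 1975 to 2000, doubling every two years:
print(round(growth_factor(25, 2)))  # ~5,793 -- thousands of times denser
```

Compounding like that is why efficiency gains could keep pace with our growing appetite for computing for so long.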
In the past two decades, that doubling rate has stretched to around 2.6 years, which is still fast in the grand scheme of things. But the strategy for wringing more performance out of every watt has shifted from transistor density to other creative means, like innovations in software and parallel processing, Koomey explains. This May, IBM made a teeny 2-nanometer computer chip that it claimed could be “45 percent faster than the mainstream 7-nanometer chips in many of today’s laptops and phones and up to 75 percent more power efficient.” Just six years ago, those 7-nanometer chips were all over the news. In 1965, chips were 1/100 of an inch in size, or around 254,000 nanometers.
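For scale, here’s a quick unit check on the sizes quoted above; keep in mind that modern “2-nanometer” and “7-nanometer” labels are process names rather than literal measurements, so the ratios are only a rough illustration:

```python
# Quick unit check on the figures quoted above. Modern "nanometer"
# labels are process names, so treat these ratios as rough illustration.

NM_PER_INCH = 25_400_000          # 1 inch = 25.4 mm = 25,400,000 nm

chip_1965_nm = NM_PER_INCH / 100  # 1/100 of an inch
print(chip_1965_nm)               # 254,000 nm, matching the figure above

print(chip_1965_nm / 7)           # ~36,000 times the "7 nm" label
print(chip_1965_nm / 2)           # 127,000 times the "2 nm" label
```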
With that in mind, let’s say a study looks at how much carbon a data center emitted in 2020, but the only measurements available are from back in 2015. Considering how fast the technology moves, using five-year-old stats could cause researchers to overestimate the energy use by four times or more, just by holding those retro efficiency numbers against today’s computing power, says George Kamiya, an analyst at the International Energy Agency. At the same time, not having accurate data on what devices people are watching their shows on could cause some serious underestimation; after all, a big TV sucks up a lot more energy than a phone screen.
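Kamiya’s point can be sketched with the doubling rate mentioned earlier. If we assume efficiency doubles roughly every 2.6 years (a simplification; real data centers improve unevenly), then five-year-old numbers overstate today’s energy per unit of computing by close to a factor of four:

```python
# Simplified sketch: how much a study overshoots if it applies
# five-year-old energy-per-computation figures to today's workloads.
# Assumes efficiency doubles every ~2.6 years -- an illustrative rate,
# not a measured one for any specific data center.

def overestimate_factor(years_stale: float, doubling_period_years: float = 2.6) -> float:
    """How many times too high the stale estimate comes out."""
    return 2 ** (years_stale / doubling_period_years)

print(round(overestimate_factor(5), 1))  # ~3.8x, in line with "four times or more"
```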
One major example of these discrepancies is a 2019 study on the environmental impact of video streaming published by French think tank the Shift Project. The authors found that an hour of online video produced a whopping 3.2 kilograms of carbon dioxide; in reality, the figure was later corrected to around 0.4 kg of CO₂ per hour, roughly an eightfold overestimate. Kamiya wrote an in-depth take on these mistakes last winter.
That’s not to say that the climate impacts of people’s digital habits should be completely disregarded. While the computing world still has plenty more room to improve energy efficiency, Koomey says, engineers and scientists are nearing “some fundamental limits in the physics of semiconductors.” Keeping energy efficiency growing steadily in the sector while feeding the beast that is the internet will call for a redesign from the ground up within the next handful of decades.
One day down the road, efficiency may not be improving as quickly, making concerns over growing internet use more legitimate. But that worry is at least a few years off.
Why it’s crucial to set the record straight
What worries Kamiya most with these data center and energy misconceptions is that people think they’re taking a big step against climate change by toggling from high to standard definition on their favorite Netflix show.
“It might give them this false sense of ‘oh, if I didn’t stream this video, I’m going to have a huge impact on the environment. Now I’m going to go and eat a steak or take that flight,’” he says.
Not to mention, scientists need digital technologies to fix some of the biggest issues in the climate crisis—along with policies that check the carbon emissions generated by those technologies as they grow.
Of course, this doesn’t mean you should just spend all day in front of a computer without considering the larger impacts. There are plenty of problems associated with the ever-increasing use of technology—but data center energy use isn’t the biggest one of them. Companies that use loads of data centers also tend to be significant users of renewables, which are absolutely necessary for a sustainable future. For example, this summer Amazon announced plans for more than a dozen new wind and solar projects that would provide around 1.5 gigawatts of new capacity (about the same amount as the total installed renewable capacity in the state of Maine in 2020).
“If the main argument is carbon emissions, these companies can very easily try to get that to zero,” Kamiya says. So enjoy your Netflix binges—holding onto that subscription may mean you can use your influence as a customer for even more powerful climate solutions.