
If we do nothing to reduce our carbon dioxide (CO2) emissions, by the end of this century the Earth will be as hot as it was 50 million years ago in the early Eocene, according to a new study out today in the journal Nature Communications. This period—roughly 15 million years after dinosaurs went extinct and 49.8 million years before modern humans appeared on the scene—was 16°F to 25°F warmer than the modern norm.

Climate change doubters often point to these earlier temperature shifts as a way of rebutting the scientific evidence that climate change is caused by human activity. And yes, less than a million years ago parts of the Midwest were covered in glaciers, while 56 million years ago the Arctic was warm enough that crocodiles roamed Greenland. All of this is true.

But greenhouse gases like CO2 are so named for their ability to trap the sun’s energy, and 50 million years ago the sun wasn’t as hot, because our star grows brighter with age. During the Eocene, it took more atmospheric CO2 to produce a given amount of warming than it does today. In fact, if we don’t change our behavior, 2100 will be as hot as the Eocene with much less atmospheric CO2 than was present at the time. A hotter sun means we get more bang for our CO2 buck.
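Just how much does a slightly dimmer sun change the CO2 math? Here is a rough back-of-envelope sketch, not a calculation from the study itself, using a widely cited simplified formula for CO2’s radiative forcing and assuming the early Eocene sun was roughly half a percent fainter than today’s:

```python
import math

# Illustrative back-of-envelope numbers only: the dimming fraction, albedo,
# and forcing coefficient below are standard textbook values, not figures
# taken from the study itself.

CO2_COEFF = 5.35           # widely used approximation: forcing = 5.35 * ln(C/C0), in W/m^2
SOLAR_CONSTANT = 1361.0    # sunlight reaching Earth today, W/m^2
DIMMING_FRACTION = 0.0045  # assume the sun was ~0.45% fainter 50 million years ago
ALBEDO = 0.3               # fraction of sunlight reflected straight back to space

# Forcing lost to the fainter Eocene sun, averaged over the whole globe
# (divide by 4 for the sphere) and corrected for reflected sunlight.
solar_deficit = SOLAR_CONSTANT * DIMMING_FRACTION / 4.0 * (1.0 - ALBEDO)

# Invert the CO2 forcing formula: how much higher would CO2 have had to be
# back then just to make up for the dimmer sun?
extra_co2_factor = math.exp(solar_deficit / CO2_COEFF)

print(f"Forcing lost to the dimmer sun: {solar_deficit:.2f} W/m^2")      # roughly 1.1
print(f"CO2 ratio needed to compensate: about {extra_co2_factor:.2f}x")  # roughly 1.2
```

By this crude estimate, the Eocene atmosphere needed something like 20 percent more CO2 just to match the push that today’s brighter sun provides for free, which is the “more bang for our CO2 buck” point above.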

“Climate change denialists often mention that CO2 was high in the past, that it was warm in the past, so this means there’s nothing to worry about,” said lead study author Gavin Foster, a researcher in isotope geochemistry and paleoceanography at the United Kingdom’s University of Southampton. “It’s certainly true that the CO2 was high in the past and that it was warm in the past. But because the sun was dimmer, the climate wasn’t being forced as much [as it will be] in the future if we carry on as we are.”

Just because crocodiles once lived near the Arctic doesn’t mean that modern climate change isn’t terrible or caused by humans. Emily Greenlagh via NOAA

If we keep going and exhaust our supplies of fossil fuels like gas and coal, the amount of CO2 in the atmosphere could rise to 2000 ppm by 2250, a level not seen for 200 million years. And because the sun is brighter now than it was then, that concentration of CO2 would translate into temperatures not seen in the last 420 million years, since long before the time of the dinosaurs.

Roughly 400 million years ago, CO2 levels began a long decline, in part because the brightening sun sped up the chemical reactions that pull CO2 out of the atmosphere. “The main [CO2] controller is silicate weathering, which is the term for breaking rock down into soil,” said Foster. “That’s the natural way in which CO2 is removed from the atmosphere. And that process is temperature and runoff dependent. It depends on how wet it is, and how warm it is.”

But silicate weathering didn’t reduce atmospheric CO2 to modern levels on its own. The increased rate of soil formation fueled the rise of land plants over the last 400 million years, and those plants in turn sped up silicate weathering and drew down atmospheric CO2 further.

And yes, these processes will speed up in the presence of more CO2, but they’re incredibly slow—it’s like speeding up a sloth. Even if you double its speed, the sloth will still get run over. RIP sloth, RIP us.

“CO2 gets locked away out of the atmosphere when it’s formed as part of the skeleton of an organism in the ocean, and that organism then sinks to the seabed when it’s dead,” said Foster. “It’s the weathering, but it’s the weathering that then transports the ions that were in the rock to the seawater. The organisms take it up in the sea, then they sink into the seabed. The process of one molecule of CO2 from the atmosphere getting locked away in the seabed takes a long time.”
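In textbook shorthand (a standard simplification, not a set of equations from the article or the study), the chain Foster describes looks like this, with the calcium silicate mineral wollastonite standing in for silicate rock in general:

```latex
% Standard textbook simplification of the silicate-weathering carbon sink;
% wollastonite (CaSiO3) stands in for silicate rocks in general (assumes amsmath).
\begin{align*}
&\underbrace{\mathrm{CaSiO_3 + 2\,CO_2 + H_2O \;\longrightarrow\; Ca^{2+} + 2\,HCO_3^- + SiO_2}}_{\text{weathering on land: rock breaks down, rivers carry the ions to the sea}}\\[4pt]
&\underbrace{\mathrm{Ca^{2+} + 2\,HCO_3^- \;\longrightarrow\; CaCO_3 + CO_2 + H_2O}}_{\text{in the ocean: organisms build carbonate skeletons that sink when they die}}\\[4pt]
&\underbrace{\mathrm{CaSiO_3 + CO_2 \;\longrightarrow\; CaCO_3 + SiO_2}}_{\text{net result: one CO$_2$ buried in the seabed for each unit of silicate weathered}}
\end{align*}
```

Each run through that loop buries one molecule of CO2 in the seabed, but, as Foster notes, only on geological timescales.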

If we stopped emitting CO2 today, traces of our emissions would still be in the atmosphere in a million years’ time.

“We are completely swamping the natural processes,” said Foster.