When our sun first got going, some 4.5 billion years ago, it wasn't the same blazing star we know today. Its warmth and brightness have grown gradually as the hydrogen fusing in its core has made that core steadily denser and hotter. So, for Earth's first two billion years, our planet was bathed in light roughly 25 percent dimmer than it receives today.
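For readers who want the numbers behind that figure, a widely used fit for the sun's brightening over its main-sequence lifetime comes from Gough (1981); it's a simplified approximation, not the only model, but plugging in ages gives deficits in line with the one cited above:

```latex
% Gough (1981) approximation for solar luminosity over time:
% t is the sun's age, t_sun (about 4.57 Gyr) its present age,
% L_sun its present luminosity.
\[
  \frac{L(t)}{L_{\odot}} = \left[\, 1 + \frac{2}{5}\left(1 - \frac{t}{t_{\odot}}\right) \right]^{-1}
\]
% At t = 0 this gives L/L_sun = 1/1.4, or about 29 percent dimmer than
% today; by t = 2 Gyr the deficit has shrunk to roughly 18 percent.
```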

If the sun dimmed back down to that level today, our planet would plunge into an ice age dramatic enough to bury the continents in miles-thick ice sheets and freeze the oceans solid. But according to the geological evidence, ancient Earth was not frozen: it was covered in vast liquid oceans and dotted with arcs of islands that sprouted up from undersea volcanoes and then eroded back down again in the rain.

Scientists have spent decades working to resolve this puzzle, known as the faint young sun paradox: how, they have asked, could a faint young sun have kept Earth out of an ice age for two billion years, when several ice ages have come and gone in more recent times, under a much brighter star?
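To see why the paradox bites, it helps to run the simplest possible energy-balance estimate. The sketch below is a back-of-the-envelope calculation, not the researchers' model, and the albedo value is an assumed modern-day input:

```python
# Toy equilibrium temperature for Earth, ignoring any greenhouse effect.
# Assumes today's albedo of 0.3 for both scenarios.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # present-day solar constant, W m^-2
ALBEDO = 0.3

def equilibrium_temp(solar_constant, albedo=ALBEDO):
    """Temperature (K) at which outgoing thermal radiation balances
    absorbed sunlight: S * (1 - a) / 4 = sigma * T^4."""
    return (solar_constant * (1 - albedo) / (4 * SIGMA)) ** 0.25

print(f"Today's sun:    {equilibrium_temp(S0):.0f} K")          # ~255 K
print(f"25% dimmer sun: {equilibrium_temp(0.75 * S0):.0f} K")   # ~237 K
# Even the modern value sits well below freezing; greenhouse gases make
# up the difference. A 25 percent dimmer sun digs the hole ~18 K deeper.
```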

The answer, they reasoned, must lie in the planet's early atmosphere: the air must have been packed with enough heat-trapping greenhouse gas to compensate for the lack of sunlight. But which greenhouse gas was it? Evidence from ancient soils suggested that carbon dioxide levels weren't high enough to do the job alone, and theories pointing to methane as ancient Earth's chief atmospheric insulator fell apart under close scientific scrutiny. (Water vapor, today's biggest greenhouse gas, was out from the beginning, because air needs to be warm in the first place to hold large amounts of the stuff.)
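That last point about water vapor follows from the Clausius-Clapeyron relation: the amount of vapor air can hold falls off steeply as temperature drops, so on a cold planet water vapor can amplify warmth that's already there but can't supply it on its own. A standard form of the relation:

```latex
% Clausius-Clapeyron: saturation vapor pressure e_s as a function of
% temperature T, with L_v the latent heat of vaporization and R_v the
% specific gas constant for water vapor.
\[
  \frac{d e_s}{d T} = \frac{L_v \, e_s}{R_v T^2}
  \quad\Longrightarrow\quad
  e_s(T) \approx e_0 \exp\!\left[ \frac{L_v}{R_v}\left(\frac{1}{T_0} - \frac{1}{T}\right) \right]
\]
% Near Earth-like temperatures this works out to roughly a 7 percent rise
% in capacity per kelvin of warming, and a matching collapse on cooling.
```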

Now, researchers at the University of Chicago have come up with a new theory: the greenhouse gases that provided Earth with extra warmth weren't CO2 or methane or any of the usual suspects. They were nitrogen and hydrogen. On their own, H2 and N2 molecules are symmetric and can't absorb infrared radiation, but collisions between them induce fleeting electric dipoles, letting the colliding pairs soak up infrared energy.
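This mechanism, known as collision-induced absorption, has a telltale signature: because it depends on pairs of molecules meeting, its strength scales with the product of the two gases' densities rather than with a single gas's abundance. Schematically:

```latex
% Collision-induced absorption: the absorption coefficient at frequency nu
% scales with the product of the number densities of the colliding species
% (here H2 and N2), with k_nu(T) a temperature-dependent binary coefficient.
\[
  \alpha_{\nu} = k_{\nu}(T)\, n_{\mathrm{H_2}}\, n_{\mathrm{N_2}}
\]
% Doubling both densities therefore quadruples the absorption, which is
% why a thick N2 atmosphere laced with H2 can trap heat so effectively.
```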

Based on computer simulations, the researchers found that, if the early atmosphere were composed of 10 percent hydrogen, the warming effect from those molecular collisions could have been enough to raise the planet's temperature by as much as 60 degrees Fahrenheit, enough to keep rain falling on the young, dimly lit planet during the first part of its life.
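As a rough sanity check, using the same toy energy balance sketched earlier rather than the team's simulations, 60 degrees Fahrenheit of warming is about 33 kelvin, comfortably more than the roughly 18-kelvin hole a 25 percent dimmer sun digs:

```python
# Rough consistency check: compare the claimed hydrogen-nitrogen warming
# against the cooling from a 25 percent dimmer sun (toy numbers, not the
# University of Chicago model).
def fahrenheit_delta_to_kelvin(df):
    """Convert a temperature *difference* in degrees F to kelvin."""
    return df * 5.0 / 9.0

warming = fahrenheit_delta_to_kelvin(60)   # ~33.3 K
dim_sun_cooling = 255 - 237                # ~18 K, from the sketch above

print(f"H2-N2 warming:   {warming:.1f} K")
print(f"Dim-sun cooling: {dim_sun_cooling} K")
print(f"Net headroom:    {warming - dim_sun_cooling:.1f} K")
```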