Our sun’s adolescence was stormy—and new evidence shows that these tempests may have been key to seeding life as we know it.
Some 4 billion years ago, the sun shone with only about three-quarters the brightness we see today, but its surface roiled with giant eruptions spewing enormous amounts of solar material and radiation out into space. These powerful solar explosions may have provided the crucial energy needed to warm Earth, despite the sun’s faintness. The eruptions also may have furnished the energy needed to turn simple molecules into the complex molecules such as RNA and DNA that were necessary for life. The research was published in Nature Geoscience on May 23, 2016, by a team of scientists from NASA.
Understanding what conditions were necessary for life on our planet helps us both trace the origins of life on Earth and guide the search for life on other planets. Until now, however, fully mapping Earth’s evolution has been hindered by the simple fact that the young sun wasn’t luminous enough to warm Earth.
“Back then, Earth received only about 70 percent of the energy from the sun that it does today,” said Vladimir Airapetian, lead author of the paper and a solar scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “That means Earth should have been an icy ball. Instead, geological evidence says it was a warm globe with liquid water. We call this the Faint Young Sun Paradox. Our new research shows that solar storms could have been central to warming Earth.”
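The size of the paradox can be illustrated with the standard equilibrium-temperature formula for a planet, T = [S(1 − A)/4σ]^¼. The sketch below is a rough illustration, not a calculation from the paper; it assumes the modern solar constant (about 1,361 watts per square meter) and Earth’s present-day albedo of 0.3 for both eras, and it ignores all greenhouse warming.

```python
# Back-of-envelope illustration of the Faint Young Sun Paradox.
# Equilibrium temperature: T = (S * (1 - A) / (4 * sigma)) ** 0.25
# Assumed values (not from the paper): modern solar constant and albedo.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S_MODERN = 1361.0  # modern solar constant, W m^-2
ALBEDO = 0.3       # present-day Earth albedo, assumed for both eras

def equilibrium_temp(solar_constant, albedo=ALBEDO):
    """Blackbody equilibrium temperature, ignoring greenhouse gases."""
    return (solar_constant * (1 - albedo) / (4 * SIGMA)) ** 0.25

t_today = equilibrium_temp(S_MODERN)        # modern sun
t_young = equilibrium_temp(0.7 * S_MODERN)  # sun at 70 percent output

print(f"Equilibrium temperature today:      {t_today:.0f} K")
print(f"Equilibrium temperature, young sun: {t_young:.0f} K")
```

Even the modern figure (about 255 K) sits below freezing without greenhouse gases; at 70 percent solar output the estimate drops a further ~22 K, which is the gap that greenhouse chemistry, possibly driven by solar storms, would have had to close.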
Scientists are able to piece together the history of the sun by searching for similar stars in our galaxy. Placed in order according to their age, these sun-like stars form a functional timeline of how our own sun evolved. It is from this kind of data that scientists know the sun was fainter 4 billion years ago. Such studies also show that young stars frequently produce powerful flares – giant bursts of light and radiation – similar to the flares we see on our own sun today. These flares are often accompanied by huge clouds of solar material, called coronal mass ejections, or CMEs, which erupt out into space.
NASA’s Kepler mission found stars that resemble our sun as it was a few million years after its birth. The Kepler data showed many examples of what are called “superflares” – enormous explosions so rare today that our own sun experiences them only once every 100 years or so. Yet the Kepler data also show these youngsters producing as many as ten superflares a day.
While our sun still produces flares and CMEs, they are less frequent and less intense. What’s more, Earth today has a strong magnetic field that keeps the bulk of the energy from such space weather from reaching the planet. Space weather can, however, significantly disturb the magnetic bubble around our planet, the magnetosphere, in events known as geomagnetic storms, which can affect radio communications and our satellites in space. It also creates auroras – most often in a narrow region near the poles, where Earth’s magnetic field lines dip down to touch the planet.
Our young Earth, however, had a weaker magnetic field, with a much wider footprint near the poles.
“Our calculations show that you would have regularly seen auroras all the way down in South Carolina,” said Airapetian. “And as the particles from the space weather traveled down the magnetic field lines, they would have slammed into abundant nitrogen molecules in the atmosphere. Changing the atmosphere’s chemistry turns out to have made all the difference for life on Earth.”
The atmosphere of early Earth was also different from what it is now: Molecular nitrogen – that is, two nitrogen atoms bound together into a molecule – made up 90 percent of the atmosphere, compared to only 78 percent today. As energetic particles slammed into these nitrogen molecules, the impact broke them up into individual nitrogen atoms. They, in turn, collided with carbon dioxide, separating those molecules into carbon monoxide and oxygen.
The free-floating nitrogen and oxygen combined into nitrous oxide, which is a powerful greenhouse gas. When it comes to warming the atmosphere, nitrous oxide is some 300 times more powerful than carbon dioxide. The team’s calculations show that if the early atmosphere housed less than one percent as much nitrous oxide as it did carbon dioxide, it would have warmed the planet enough for liquid water to exist.
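The arithmetic behind that threshold can be sketched in a few lines. This is a crude linear approximation built only from the two numbers quoted above – the 300-times potency and the one-percent abundance – not the paper’s radiative-transfer calculation.

```python
# Rough check of the nitrous oxide warming claim, treating greenhouse
# contribution as linear in (abundance x per-molecule potency).
# This linearity is a simplification, not the paper's method.

N2O_POTENCY = 300    # warming power of N2O relative to CO2 (per the article)
N2O_FRACTION = 0.01  # N2O abundance as a fraction of the CO2 abundance

# Warming contribution of the N2O, in units of the warming that the
# same atmosphere's CO2 provides on its own:
relative_warming = N2O_FRACTION * N2O_POTENCY

print(f"N2O at 1% of the CO2 abundance contributes roughly "
      f"{relative_warming:.0f}x the CO2 greenhouse effect")
```

Under this back-of-envelope linearity assumption, even a one-percent trace of nitrous oxide would roughly triple the greenhouse contribution of the carbon dioxide itself, which is why such a small amount could plausibly keep the planet above freezing.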
This newly discovered constant influx of solar particles to early Earth may have done more than just warm the atmosphere; it may also have provided the energy needed to make complex chemicals. On a planet scattered evenly with simple molecules, it takes a huge amount of incoming energy to create the complex molecules, such as RNA and DNA, that eventually seeded life.
While enough energy appears to be hugely important for a growing planet, too much would also be a problem: a constant chain of solar eruptions producing showers of particle radiation can be quite detrimental. Such an onslaught of magnetic clouds can rip off a planet’s atmosphere if the magnetosphere is too weak. Understanding these kinds of balances helps scientists determine what kinds of stars and what kinds of planets could be hospitable for life.
“We want to gather all this information together, how close a planet is to the star, how energetic the star is, how strong the planet’s magnetosphere is in order to help search for habitable planets around stars near our own and throughout the galaxy,” said William Danchi, principal investigator of the project at Goddard and a co-author on the paper. “This work includes scientists from many fields — those who study the sun, the stars, the planets, chemistry and biology. Working together we can create a robust description of what the early days of our home planet looked like – and where life might exist elsewhere.”