In cosmology, destruction is not arbitrary chaos, but constructive and totally necessary, clearing the way for future growth. We, and the Russian-nesting-doll of living systems that support us, are evidence of this fact. Steven Benner, an origin-of-life researcher at the Foundation for Applied Molecular Evolution in Florida, would agree.
In 2018, Benner brought geologists, chemists, biologists, and planetary scientists together to share their latest thinking on how life began. Using evidence gathered from each of their fields, the experts began piecing together a vast universal puzzle, revealing new theories on the emergence of complex molecules such as RNA (ribonucleic acid). Distinct from double-stranded DNA, RNA is single-stranded and responsible for the synthesis of proteins in all living cells.
One scenario in particular gained the most traction among the gathered scientists. It features an enormous space rock that collided with our planet about 4.5 billion years ago, exploding into a maelstrom of molten iron, which may account for iron oxide and other heavy-metal deposits present on the earth’s surface today. The event disrupted compounds and stripped away molecules, wrapping the earth in a layer of hydrogen that took another 200 million years to burn off. This thick hydrogen insulation could theoretically have set the stage for RNA to form. How and by what chemical pathway it formed is still very much up for heated debate, but a consensus has formed around the idea of an “RNA world.”
This theory suggests that the building blocks of life were present on our planet much earlier—several hundred million years earlier—than formerly believed. This timeline shift squares with evidence from geology, biology, and chemistry that has left scientists scratching their heads in the past. Complex cellular lifeforms would have had ample time to evolve in this scenario, demystifying their presence in fossils dating back 3.5 billion years.
I won’t pretend to catalog the raft of complex experiments that have since been conducted to test the chemical evolution of various reactive molecules under such conditions, but two facts are clear: ribose could not have sustained itself in small doses, and liquid water would have been required both to host the first organic chemical reactions and to distribute them. And for water to exist, a post-cataclysmic planet would need to have cooled below 100 degrees Celsius.
Enter Steven Benner’s latest theory of early earth’s wet–dry cycle, indicated by the presence of ancient mineral fragments called zircons. In concert with a team of researchers from the U.S. and Japan, Benner has shown that when sulfur dioxide (likely off-gassed from volcanoes) reacts with formaldehyde, the result is a compound called hydroxymethanesulfonate, or HMS for short. During dry spells, huge caches of HMS would have accumulated on the surface. When it rained, water would have distributed the compound into lakes and pools, where it could bathe in a primordial soup of organic molecules, the forebears of RNA.
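In simplified form, the chemistry works in two steps: volcanic sulfur dioxide dissolves in water to give bisulfite, and bisulfite then adds to formaldehyde to form the HMS adduct. This is a textbook sketch of the general reaction, not a reproduction of the team’s exact reaction scheme:

$$
\mathrm{SO_2 + H_2O \rightarrow HSO_3^- + H^+}
$$

$$
\mathrm{HCHO + HSO_3^- \rightarrow HOCH_2SO_3^-\ (HMS)}
$$

Because the adduct is far more stable than free formaldehyde, HMS could plausibly stockpile during dry spells and release its reactive cargo when dissolved by rain.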
Benner’s cataclysm theory is reinforced by University of Colorado geologist Stephen Mojzsis, whose research showed that the Earth’s precious-metal-rich “veneer” could have resulted from a rain of debris from a massive collision, or collisions. The scars from such a collision, says Mojzsis, are recorded in uranium and lead isotopes.
Despite this new supporting evidence, there are still gaps in the theory—a major one being its inability to explain how early RNA might have copied itself. Such unanswered questions, however, don’t discredit the evidence supporting the new theory; instead, they have created the intellectual momentum and fascination needed to answer them.