Together, We Can Avoid the ‘Great Filter’ That Ends Advanced Civilizations

[Image: "end of the world" artwork. Caption: Humans Can Avoid Extinction—If We Work Together. Credit: Victor Habbick Visions/Getty Images]
  • The “Great Filter” is a hypothetical disaster event that stops growing civilizations from reaching the stars and contacting other advanced civilizations.

  • A NASA scientist writes about the biggest risks to humanity—like climate change, pandemics, and artificial intelligence—in a new paper.

  • These are serious, frightening possibilities—but we can work together to prevent many of them.


In a new, non-peer-reviewed paper, a scientist from NASA’s Jet Propulsion Laboratory (JPL) brings together an eclectic team of researchers to examine the biggest existential threats to humanity and how any of them might become the “Great Filter” event. Like the mass extinctions of the past, these scenarios posit a catastrophe that filters out life on Earth until very little, or even none, remains. And while previous mass extinctions were caused by naturally occurring climate change or freak asteroid impacts, today we have a much larger portfolio of self-created potential disasters.


Jonathan H. Jiang is an astrophysicist and atmospheric physicist at NASA’s JPL in the Los Angeles area. He studies aerosols and the atmosphere: things like cloud cover, the reflectivity of aerosol particles such as atmospheric black carbon, and climate and weather systems.

In a 2020 oral history interview, Jiang described how he peered at the sky from his childhood home in Beijing: “I was born in the middle of the 1960s—there was a cultural revolution, so everything was quiet. At that time in Beijing, at night we didn’t have a lot of city lights, no skyscrapers, nothing, so there were a lot of stars. After dark we saw the sky. So I was wondering about that. I think, after I became ten years old, I wanted to study the sky.”

At JPL, he’s done just that for decades. And his interests aren’t limited to climate or astronomy. He has also collaborated on a half-dozen or more papers about the Great Filter, bringing together an eclectic mix of scientists and other contributors to discuss humanity’s progress into space, our status as a potential Kardashev Type I civilization, and more. (The Kardashev Scale classifies civilizations, like ours, by how much usable energy they can access.)
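The article doesn’t give the math behind the Kardashev Scale, but Carl Sagan’s well-known interpolation expresses a civilization’s type as a continuous number from its total power use: K = (log10(P) − 6) / 10, with P in watts. As a rough sketch (the example power figures are approximations, not values from the paper):

```python
import math

def kardashev_type(power_watts: float) -> float:
    """Carl Sagan's continuous interpolation of the Kardashev Scale:
    K = (log10(P) - 6) / 10, where P is total power use in watts.
    Type I corresponds to ~1e16 W (planetary scale),
    Type II to ~1e26 W (stellar scale)."""
    return (math.log10(power_watts) - 6) / 10

# Humanity currently uses very roughly 2e13 W (~20 terawatts),
# which lands us a bit below Type I on this scale.
print(round(kardashev_type(2e13), 2))  # ~0.73
```

On this measure, humanity is often quoted at around 0.7, which is why the paper can describe us as a *potential* Type I civilization rather than an actual one.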

You may know about the Fermi Paradox, which basically asks: if our universe holds a vast number of planets, how can it be that we’ve never heard from another civilization? Where is everyone? Theorists wonder whether the reason we don’t see anyone anywhere else is that every civilization must pass through an event that eliminates virtually all candidates before they can advance to the next level. What if just 1 percent, or even only 0.01 percent or 0.00001 percent or 0.00000001 percent, make it through this Great Filter? No wonder there are no neighbors.
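The arithmetic behind that intuition is simple expected-value multiplication. A toy calculation, with entirely made-up candidate counts chosen for illustration, shows how quickly even a billion candidate civilizations collapses to almost none:

```python
def expected_survivors(candidates: int, pass_rate: float) -> float:
    """Expected number of civilizations that make it through a
    Great Filter, given a candidate count and a filter pass rate."""
    return candidates * pass_rate

# Hypothetical: one billion candidate civilizations in our galaxy.
for rate in (1e-2, 1e-5, 1e-8):
    n = expected_survivors(1_000_000_000, rate)
    print(f"pass rate {rate:g} -> ~{n:,.0f} survivors")
```

At a pass rate of 0.00000001 percent-scale odds, a billion candidates leaves only a handful of survivors scattered across the galaxy, which would make our silence unsurprising.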

“The idea of being alone in a universe vaster than our creativity can touch is terrifying to fathom: a feeling of cosmic isolation,” the paper states. “And the postulation of a phenotypically unique organism having the intelligence to communicate, or at least leaving evidence of substance, is fascinating. If an octopus opening a jar or an elephant brushing some paint strokes is enough to catch the eye of billions, discovery of sentience beyond our biosphere would send global shockwaves.”


But in order to find and contact fellow Great Filter survivors in the universe, we can’t destroy ourselves in the interim. This is how Great Filter discussion folds into the general conversation about existential risks, which are the potential events that could risk the existence of humanity or of Earth itself. Our survival depends on living long enough to either preserve and care for our planet into the far future or to successfully move onto other planets and thrive there.

The development of nuclear weapons made the entire world aware that it could destroy itself at any time. In that instant, the risk of destruction in war turned into a risk of total extinction. And for human civilization, that would mean the loss of everything we ever worked for and the possibility of contacting any other living things in our universe. Pandemics, climate change, and the other Great Filter risks could do the same thing over similar time frames. (For much, much more on these risks, check out Stuff You Should Know’s Josh Clark in his 2018 podcast miniseries End of the World.)

Jiang and his collaborators focus on five major Great Filter candidates in this short, explanatory (rather than experimental) paper. These are: nuclear war, pandemics or pathogens, artificial intelligence, asteroid or comet impacts, and climate change. Keen-eyed observers will note that these are almost all human-caused events, or, like the COVID-19 pandemic, events whose circumstances are greatly worsened by human error and corruption. Even asteroids and comets can, with attention and technology, likely be “redirected,” thankfully.

Jiang tells Popular Mechanics the emphasis on human nature and human causes is not a coincidence. “Our responsibilities as humans have matured into global-scale challenges, and in some cases full-blown disaster, as we’ve let them marinate in exponential technological advancements,” he explains. “The first step is to acknowledge the issue, which is what we have aimed to accomplish [in the paper], and then to discuss [how] to change.”

Los Angeles-area high school student Kelly Lu is one of the paper’s coauthors. “I think I speak for many when I mention that this paper reveals an underlying dread all humans feel as they grow cognizant of the workings of our world,” Lu tells Popular Mechanics.

Coauthor Philip Rosen, a retired engineer, says humanity’s survival in the nuclear age “attests that we, as a species, aim to remain ‘in the game’ for the long term.” Lu, Rosen, and Jiang all support a view they describe as optimistic, because understanding and working against these risks is a material investment in survival and in our ability to save ourselves.

“We certainly have the means to work towards a robust and permanent society,” the team concludes. “We must consider further measures, especially in these precarious times. It begins with collaboration.”
