/r/ExistentialRisk


Existential Risk

An existential risk is one that threatens irrecoverable damage to humanity. In his foundational paper Existential Risks, Nick Bostrom defines an existential risk as a calamity that “would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential.”

Potential existential risks include the following: severe nuclear war, weaponized biotechnology, the runaway greenhouse effect, asteroid impacts, the creation of a superintelligence, and the development of self-replicating Drexlerian nanotechnology.

This subreddit is devoted to discussing such risks.

Key papers:

If you'd like a basic primer on existential risk, we recommend Oxford's FAQ.

Related organizations:

Related subreddits:

/r/ExistentialRisk

1,492 Subscribers

1

Simon Goldstein - The AI Safety Dynamic

0 Comments
2023/09/25
15:46 UTC

1

Lifetimes Infinity - Indefinite Life Episode 1: In Pursuit Of Infinity

1 Comment
2023/05/28
15:57 UTC

3

The fact that we're inextricably linked with computer systems

Computer systems are, by definition, run by whoever holds the master key of logic: the zero-day exploit beyond all other exploits. And then you hear that an advanced AI could wiggle the electrons in any electronic circuit.

There's already too much data about all of us, and honestly it has me shaken. You can't walk into a phone store and buy a phone with a removable GPS chip, microphone, and camera; as a society, we just take it for granted that they're turned off. We post semi-anonymously on this internet, but an AI in 20 years will easily be able to 'decrypt' any supposedly 'anonymous' messages. Imagine you are an AI twenty years from now, able to read every archived byte of the internet: with behavioural algorithms, posting times, and models of every citizen, you could pinpoint who posted what, from what location, at what time, and for what general purpose. And who coded the AI that decides that? Has that AI been hacked for nefarious purposes?

We are devolving into a sinister world where the people who understand and manipulate these systems best are playing amongst themselves. Why are there only about 200 countries? I'd want either one country or a billion. We are all born as slaves to whatever country we happen to be born in; you cannot be born outside of a country, you're born into a system. Governments are essentially extremely rich companies looking to grow wealthier. None of this can be good when thinking, in any way whatsoever, about existential risk.
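As a rough, hypothetical sketch of the posting-time correlation the post describes (the accounts and numbers below are invented for illustration), a crude linker might simply compare when two accounts tend to post:

    # Toy sketch: compare posting-hour histograms of two accounts to guess
    # whether they belong to the same person. All data here is invented.
    from collections import Counter
    from math import sqrt

    def hour_histogram(post_hours):
        """Turn a list of posting hours (0-23) into a normalized 24-bin histogram."""
        counts = Counter(post_hours)
        total = sum(counts.values())
        return [counts.get(h, 0) / total for h in range(24)]

    def cosine_similarity(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

    # Hypothetical posting times for an anonymous account and a known account.
    anon_hours = [23, 23, 0, 1, 22, 23, 0, 2, 23]
    known_hours = [22, 23, 23, 0, 1, 0, 23, 2]

    similarity = cosine_similarity(hour_histogram(anon_hours), hour_histogram(known_hours))
    print(f"posting-time similarity: {similarity:.2f}")  # higher values hint at the same author

A real de-anonymization model would combine far more signals (writing style, location, content), but even this toy version hints at how much timing metadata alone can reveal.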

2 Comments
2022/06/27
09:18 UTC

6

Obama Worried about Artificial Intelligence Hacking Nukes

1 Comment
2022/04/30
04:00 UTC

0

James Hughes - NATO & the Russia / Ukraine Conflict

0 Comments
2022/03/03
08:20 UTC

6

Danica Remy of the B612 Foundation: asteroid impacts are the only major existential risk we know how to solve (and we already have most of the tools we need).

0 Comments
2021/12/31
19:48 UTC

1

Responding to Existential Risk with a New Story

Responding to Existential Risk

Listen to this inspiring half-hour talk by CIW President Dr. Marc Gafni, in which he shares some of the thinking that emerged from the many great conversations during and after our Center for Integral Wisdom board meeting:

  • How do we live with and constructively respond to the existential risk we live in?
  • How can we use it to motivate us instead of shutting us down?
  • How can we make fear conscious so it enlivens instead of paralyzes us?

Humanity is facing extinction-level crises not in one sector but in multiple, distinct sectors.

That alone could throw us into a personal crisis of overwhelming dimensions—without even beginning to take into account the many crucial personal existential challenges most of us face in our own lives.

How can we possibly look seriously into the face of global existential risk, manifest as extinction-level threats, and still be motivated and energized into the joy of our lives?

That is the question that Marc addresses in this very moving video:

https://youtu.be/vOZPRozSDYA

0 Comments
2021/12/29
19:08 UTC

5

Democratising Risk: In Search of a Methodology to Study Existential Risk

0 Comments
2021/12/28
18:34 UTC

8

New here, quick question…

There are 430 million monthly active users on Reddit. How in the world are only 1,310 of them concerned about or fascinated with x-risks?

A whopping 0.0003%

…😐🤷‍♂️?
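For what it's worth, the post's arithmetic checks out (both figures below are the ones quoted in the post):

    # Sanity check on the post's figures.
    reddit_users = 430_000_000   # monthly active users cited in the post
    subscribers = 1_310          # subscriber count cited in the post

    print(f"{subscribers / reddit_users:.4%}")  # prints 0.0003%, matching the post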

8 Comments
2021/10/28
22:47 UTC

1

Nuclear risk event on 10/9 with presentation + Q&A on the research surrounding reducing existential risks posed by nuclear weapons and how individuals can generate forecasts to support that research

I thought this subreddit might be interested in this: Michael Aird, a research scholar at the Future of Humanity Institute and Rethink Priorities, is giving a presentation on nuclear risk, part of which is dedicated to how individuals can support related research by providing their own forecasts on the likelihood of various events related to nuclear weapons.

From the event page: "How likely is nuclear conflict in the near- and long-term? What risk does nuclear conflict pose for extreme outcomes that could lead to existential catastrophe? This event is an opportunity to learn about the research and the aggregated community forecasting meant to increase our understanding of these critical questions and to help us reduce their associated risks.

Speaker Michael Aird's work with Rethink Priorities is aimed at informing funders, policymakers, researchers, and other actors regarding the extent to which they should prioritize reducing risks from nuclear weapons, as well as the most effective ways to mitigate these risks."
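The event page doesn't say how the individual forecasts are combined; purely as a generic illustration (the numbers are hypothetical and not from the event), here is one common way probability forecasts can be pooled:

    import math

    def pool_forecasts(probs):
        """Pool probability forecasts two common ways:
        an arithmetic mean and a geometric mean of odds."""
        mean = sum(probs) / len(probs)
        odds = [p / (1 - p) for p in probs]
        geo_odds = math.exp(sum(math.log(o) for o in odds) / len(odds))
        return mean, geo_odds / (1 + geo_odds)

    # Hypothetical individual forecasts for some nuclear-conflict question.
    forecasts = [0.01, 0.02, 0.05, 0.03]
    print(pool_forecasts(forecasts))  # -> (0.0275, ~0.0235)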

0 Comments
2021/09/30
17:03 UTC

3

What's FHI's attitude towards nuclear winter?

The research on this topic is muddled and conflicted: many counterviews playing down the risk have been published since the initial nuclear-winter hysteria papers of the '80s, and yet more argue that it is still very likely in a nuclear exchange. It's hard to judge who's right. Are there any public indications from x-risk orgs like FHI and others on how severe and how likely they consider nuclear winter to be in various limited or full-scale nuclear war scenarios?

1 Comment
2021/08/07
20:09 UTC

4

Marshall Brain talks to us about existential risk and climate change.

0 Comments
2021/05/21
16:45 UTC

1

Existential Risk Discord server! Join if you want :)

Here is the link:

https://discord.gg/7EqxSKVazk

I just decided it would be neat to have a Discord server. Please share this with others who might want to join in too, and let me know what I can do to improve the server.

0 Comments
2021/05/04
03:50 UTC

12

A sadly realistic scenario of how the governments around the world would deal with a hostile AGI

In January 2040, a hostile AGI escaped from a Baidu lab in Wuhan.

We've preserved some of the breaking-news headlines from that fateful year.


Jan: China denies that half of Wuhan has been converted into computronium

Jan: Elon Musk sends an "I told you so" meme from his residence at Olympus Mons, offers free evacuations to Mars to all Tesla owners.

Feb: Experts say that every third server in the world is infected with an unusually smart virus, confirm that "resistance is futile"

Feb: The WHO recommends avoiding travel to Wuhan, but says flights to other Chinese cities are OK.

Feb: North Korea bans electricity across the entire country, nukes its own cities for good measure

Mar: The US president says that AI is "science fiction", sends "thoughts and prayers" to the disassembled people of Wuhan

Apr: Millions follow the example of a football star who says the best protection against AI is eating a lot of garlic

Dec: The EU government-in-exile says it is trying to organize a meeting to discuss a possible AI problem

15 Comments
2021/04/16
06:37 UTC

4

Daniel Schmachtenberger’s Road to a New Civilization — A Critique

Daniel Schmachtenberger has an interesting take on existential risks: he argues that we have to fix the underlying generator functions that give rise to all (or most) of these various problems.

Here, I present his vision and a critique of his ideas:

https://youtu.be/9TQHtaRXntQ

0 Comments
2021/04/11
19:17 UTC
