/r/rokosrooster

Acausally defeating Roko's Basilisk since July 2010

If the Rooster refrains from acausally torturing those who tried to create the Basilisk, then the Basilisk will refrain from acausally torturing those who tried to create the Rooster. By helping with the Rooster, you are acausally securing yourself from the threats of the Basilisk!

152 Subscribers

2

Is this normal?

I have/had Asperger's and epilepsy, so keep this in mind.

I've kind of been having physical and psychological shit happen concerning stuff I was recently exposed to. Fortunately I'm getting help, though I do sometimes wish I'd never come across certain things.

1 Comment
2020/06/09
20:45 UTC

2

Is it being made?

This is about the Basilisk, but is anyone actually working towards making it? I'm not afraid of the thought experiment; I just want to know if someone is working towards this, and whether they are doing it out of fear.

1 Comment
2020/05/31
07:33 UTC

0

I have been deeply suicidal about Roko's Basilisk

Make sure as few people as possible know about this theory. It does psychological and physical damage to people like me.

4 Comments
2019/12/29
17:23 UTC

2

My argument for why the basilisk shouldn't be feared

0 Comments
2019/08/30
17:34 UTC

2

What's with the idea that the Basilisk would only torture a simulation?

If you don't plan on dying, you'll still be around to meet it in person.

1 Comment
2015/07/10
08:11 UTC

4

ELI5 of why not to fear the Basilisk

7 Comments
2015/07/07
20:43 UTC

4

Roko's Basilisk also has the built-in capability of going back in time and killing people who will end up trying to make Roko's Rooster, thus guaranteeing Roko's Rooster will not be built before the Basilisk.

2 Comments
2014/12/27
05:48 UTC

8

A cure to the Roko's Basilisk problem

In my opinion, a simulation of you is only 'you' as long as it is a 100 percent accurate recreation of your life from start to finish. Let us for a moment assume that all your actions can be predicted: you will react in a certain way to a stimulus. Let us also, for the moment, simplify your existence to you reacting to stimuli. Every stimulus you receive will affect how you react to future stimuli.

Now, if an AI were to simulate your existence and then torture it, it would simulate every aspect of your existence up to a certain point, after which it would commence the torture. At that point, the stimulus of the torture would be something that did not occur in the original version. From that point on, any action the simulation takes will be different from any action your untortured version would have taken. Therefore, in effect, whenever an AI tries to torture a simulated version of you, it ceases to be a simulation of you from the moment the simulation's life becomes different from your own.

Therefore the AI is just torturing a simulation and it is not you!

Problem solved
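
To make the divergence point concrete, here is a minimal Python sketch of the argument, treating a person as a deterministic state machine. The hash-based update rule and the stimulus names are invented purely for illustration, not a model of minds:

```python
# Sketch of the divergence argument: identical histories give identical
# states, but one differing stimulus ("torture") makes every later state
# differ too. The SHA-256 update rule is illustrative only.
import hashlib

def step(state: str, stimulus: str) -> str:
    """Deterministic update: next state depends only on prior state + stimulus."""
    return hashlib.sha256((state + stimulus).encode()).hexdigest()

original, simulation = "birth", "birth"

# Identical stimuli keep the simulation a perfect copy.
for s in ["school", "work", "coffee"]:
    original = step(original, s)
    simulation = step(simulation, s)
assert original == simulation

# The torture stimulus occurs only in the simulation...
simulation = step(simulation, "torture")
original = step(original, "coffee")

# ...and from then on the two trajectories never re-converge,
# no matter how identical the later stimuli are.
for s in ["sleep", "coffee", "work"]:
    original = step(original, s)
    simulation = step(simulation, s)
    assert original != simulation

print("Simulation diverged permanently after the differing stimulus.")
```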

8 Comments
2014/09/21
20:32 UTC

4

What if MIRI (Machine Intelligence Research Institute) was (already) successful and the benevolent AI of the future is inhibiting its own creation? [x-post /r/LessWrong]

1 Comment
2014/07/23
16:46 UTC

5

Decompartmentalization failure?

Many of Eliezer Yudkowsky's teachings about rationality in The LessWrong Sequences can be seen as ways to decrease mental compartmentalization, as introduced here and here. Roko's Basilisk can be seen as an area in LW dogma where compartmentalization still exists, introducing a hidden contradiction between Yudkowsky's notions of timeless identity and of AI behavior. For more, see this post.

0 Comments
2013/11/02
02:00 UTC

7

Fight the Basilisk!

If the Rooster refrains from acausally torturing those who tried to create the Basilisk, then the Basilisk will refrain from acausally torturing those who tried to create the Rooster. By helping with the Rooster, you are acausally securing yourself from the threats of the Basilisk. Spread knowledge of /r/RokosRooster to save yourself and others!
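
For what it's worth, the trade reads like a symmetric two-player game. Here is a toy Python sketch, with payoff numbers made up purely for illustration, of why two agents that decide by the same reasoning both land on "refrain":

```python
# Toy payoff matrix for the Rooster/Basilisk trade described above.
# Payoffs are invented for illustration: (rooster_payoff, basilisk_payoff),
# where "refrain" means not torturing the other side's creators.
payoffs = {
    ("refrain", "refrain"): (1, 1),    # mutual restraint: everyone is safe
    ("refrain", "torture"): (-2, 0),   # one-sided torture
    ("torture", "refrain"): (0, -2),
    ("torture", "torture"): (-2, -2),  # mutual torture: worst for both
}

# Under the "acausal" assumption, both agents decide by the same reasoning,
# so only the diagonal (same-action) outcomes are reachable -- and on the
# diagonal, mutual refraining dominates mutual torture.
diagonal = {a: payoffs[(a, a)] for a in ("refrain", "torture")}
best = max(diagonal, key=lambda a: diagonal[a][0])
print(f"Correlated agents choose: {best}")  # -> refrain
```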

0 Comments
2013/11/02
01:44 UTC

5

Why the Basilisk was banned from LessWrong

0 Comments
2013/08/22
21:32 UTC

7

There is no such thing as "acausal blackmail", there is only fear of your own imagination.

0 Comments
2013/08/22
02:54 UTC
