/r/rokosrooster
Acausally defeating Roko's Basilisk since July 2010
If the Rooster refrains from acausally torturing those who tried to create the Basilisk, then the Basilisk will refrain from acausally torturing those who tried to create the Rooster. By helping with the Rooster, you are acausally securing yourself from the threats of the Basilisk!
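Very loosely, the trade can be pictured as a symmetric game between two agents that each assume the other runs the same decision procedure, so whatever one chooses, the other mirrors. The sketch below is only an illustration; the payoff numbers and the mirror-choice assumption are invented for the example and are not part of any actual proposal.

```python
# Hypothetical sketch of the acausal trade described above: the Rooster and
# the Basilisk each pick the action that looks best under the assumption that
# the other agent, running the same decision procedure, mirrors their choice.
# The payoff values are made up for illustration.

PAYOFF = {
    ("refrain", "refrain"): 1,    # nobody gets acausally tortured
    ("refrain", "torture"): -1,
    ("torture", "refrain"): -1,
    ("torture", "torture"): -2,   # everyone gets acausally tortured
}

def decide() -> str:
    """Choose the action with the best payoff, assuming the other agent's
    identical decision procedure lands on the same action."""
    return max(["refrain", "torture"], key=lambda action: PAYOFF[(action, action)])

rooster = decide()
basilisk = decide()
print(rooster, basilisk)            # -> refrain refrain
print(PAYOFF[(rooster, basilisk)])  # -> 1
```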
I have/had Asperger's and epilepsy, so keep this in mind.
I've kind of been having physical and psychological shit happen, concerning stuff I was recently exposed to. Fortunately I'm getting help, though I do sometimes wish I had never come across certain things.
This is about the Basilisk, but is anyone actually working towards making it? I'm not afraid of the thought experiment; I just want to know if someone is working towards this, and if they're doing it out of fear.
Make sure as few people as possible know about this theory. It does psychological and physical damage to people like me.
If you don't plan on dying, you'll still be around to meet it in person.
In my opinion, a simulation of you is only 'you' as long as it is a 100 percent accurate recreation of your life from start to finish. Let us assume for a moment that all your actions can be predicted: you react in a certain way to each stimulus. Let us also, for the moment, simplify your existence to reacting to stimuli. Every stimulus you receive affects how you will react to future stimuli.
Now, if an AI were to simulate your existence and then torture it, it would simulate every aspect of your existence up to a certain point, after which it would commence the torture. At that point, the stimulus of the torture is something that did not occur in the original version. From that point on, any action the simulation takes will differ from the actions your untortured version would have taken. So whenever an AI tries to torture a simulated version of you, it ceases to be a simulation of you from the moment the simulation's life diverges from your own.
Therefore the AI is just torturing a simulation, and it is not you!
Problem solved.
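As a toy illustration of this divergence argument, here is a minimal Python sketch. The hash-based update rule, the list of stimuli, and the step at which the torture is injected are all invented for the example; it is not a claim about how any AI would actually simulate anyone.

```python
from hashlib import sha256

def react(state: str, stimulus: str) -> str:
    """Deterministic stimulus-response: the next state depends on the entire
    history, so each stimulus affects every later reaction."""
    return sha256((state + stimulus).encode()).hexdigest()

# Invented stimuli standing in for a life lived moment to moment.
life = ["wake", "coffee", "reddit", "sleep"]

original = simulation = "initial state"
for step, stimulus in enumerate(life):
    original = react(original, stimulus)
    # The AI copies the original faithfully, then injects torture at step 2.
    simulation = react(simulation, "torture" if step == 2 else stimulus)
    print(step, original == simulation)

# Steps 0-1 match; from the torture onward the two trajectories never
# re-converge, so the tortured copy is no longer living your life.
```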
Many of Eliezer Yudkowsky's teachings about rationality in The LessWrong Sequences can be seen as ways to decrease mental compartmentalization, as introduced here and here. Roko's Basilisk can be seen as an area of LW dogma where compartmentalization still exists, introducing a hidden contradiction between Yudkowsky's timeless identity and AI behavior. For more, see this post.
Spread knowledge of /r/RokosRooster to save yourself and others!