vermillion
In the Breeze
- Joined
- Jan 6, 2022
Anyone know any interesting thought experiments?
If you are susceptible to being genuinely frightened by thought experiments that raise the possibility of you suffering after your death, skip the spoiler.
I know the most common ones, such as the prisoner's dilemma, so I'd be interested in more obscure ones, I suppose, but do feel free to discuss whichever you wish.
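Since the prisoner's dilemma came up: here's a minimal sketch of what makes it interesting. The payoff numbers below are illustrative, not canonical; the exact sentences vary by telling.

```python
# Prisoner's dilemma payoff matrix (years in prison; lower is better).
# Illustrative values only - the exact numbers vary by telling.
PAYOFFS = {
    ("stay silent", "stay silent"): (1, 1),
    ("stay silent", "betray"):      (3, 0),
    ("betray",      "stay silent"): (0, 3),
    ("betray",      "betray"):      (2, 2),
}

def best_response(opponent_choice):
    """Pick the choice that minimizes your own sentence, given the opponent's."""
    return min(("stay silent", "betray"),
               key=lambda mine: PAYOFFS[(mine, opponent_choice)][0])

# Betraying is the dominant strategy: it's the best response whatever the
# opponent does, even though mutual silence leaves both prisoners better off.
assert best_response("stay silent") == "betray"
assert best_response("betray") == "betray"
```

That gap between individual rationality and the better mutual outcome is the whole puzzle.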
I've recently learned of Roko's Basilisk and its effects on the people who know about it. Maybe I'm not so... knowledgeable about this kind of thing, but I just can't see an AI torturing a simulation of me in the distant future, after my death, as an actual threat.
If you don't know what Roko's Basilisk is, in (extreme) summary, it's the idea that a superintelligent AI in the future will punish those who did not help bring about its existence, even though they had full knowledge that it would exist. So now that you, the reader, know that such an AI - Roko's Basilisk - will supposedly come to exist thanks to humanity's constant technological progress, you are at risk: since you know it will exist, you must do what you can to bring it about, or you (more precisely, a simulation of you) will be punished by the AI in the future.
This goes for literally everyone who just read that last paragraph. They are now aware of the Basilisk, so they must do their best to bring it about - donating to AI research, say - because if they do something, but not their absolute best, the AI will know and punish them. In essence, you get sent to transhuman hell.
The original post by Roko was made on the forum LessWrong and got hidden, consequently producing a Streisand effect that boosted its popularity.
RationalWiki has a really good article on it; that's where I finally understood the concept after a whole day of reading other sites. It also has a whole section for those who are actually afraid of the Basilisk, so check that out if it applies to you, or just for general knowledge.
Roko's Basilisk
The name "Basilisk" is a reference to the short story BLIT by David Langford, which features "basilisks": images that, when seen by a person in the story, completely shut down the brain, as if it went through a Windows BSOD. I recommend giving it a read; it's short and interesting, and I definitely did not do it justice in my one-sentence explanation.
BLIT
Anyway, all of that seems... rather unbelievable to me, even now. Maybe I fail to fully grasp how likely that is to happen.