Interesting Thought Experiments

vermillion

In the Breeze
Joined
Jan 6, 2022
Messages
9
Reaction score
28
Awards
6
Anyone know any interesting thought experiments?

If you are susceptible to being genuinely frightened by thought experiments that raise the possibility of you suffering after your death, skip the spoiler.
I've recently learned of Roko's Basilisk and its effects on the people that know about it. Maybe I'm not so... knowledgeable about this kind of thing, but I just can't view an AI torturing a simulation of me in the distant future after my death as an actual threat.

If you don't know what Roko's Basilisk is, in (extreme) summary, it's the idea that a superintelligent AI singularity in the future will punish those who did not help bring about its existence, even though they had full knowledge that it would exist. So now that you, the reader, know that an AI - Roko's Basilisk - will exist in the future, thanks to humanity's constant technological progress, you are at risk. Specifically, because you now know it will exist, you must do what you can to bring it about, or you (or rather, a simulation of you) will be punished by the AI in the future.

This goes for literally everyone who just read that last paragraph. They are now aware of the Basilisk, so they must donate or otherwise do their best to bring it about, because if they do something, but not their absolute best, the AI will know and will punish them. In essence, you get sent to transhuman hell.
The original post by Roko was on the forum LessWrong and got hidden, consequently producing a Streisand effect that boosted its popularity.

RationalWiki has a really good article on it; that's where I finally understood the concept after a whole day of reading other sites. It also has a whole section for those who are actually afraid of the Basilisk, so you can check that out if it applies to you, or just for general knowledge.
Roko's Basilisk

The name Basilisk is a reference to the short story BLIT by David Langford, in which "basilisks" are a kind of image that, when seen by a person in the story, completely shuts down the brain, as if it were going through a Windows BSOD. I recommend giving it a read; it's pretty interesting, short, and I definitely did not do it justice in my sentence-long explanation.
BLIT

Anyway, all of that seems... rather unbelievable to me, even now. Maybe I fail to fully grasp how likely that is to happen.

I know most of the common ones, such as the prisoner's dilemma, so I'd be interested in more obscure ones, I suppose, but do feel free to discuss whichever you wish.
 
Virtual Cafe Awards

s0ren

who cares
Bronze
Joined
May 25, 2021
Messages
354
Reaction score
1,058
Awards
115
I'm not sure which philosophical thought experiments are considered "obscure." In my head, I assume that a lot of people know the more famous ones like the Chinese Room, but I actually have no idea whether people outside the relevant fields know anything about them. One I like, though, and one I think people here would be interested in, is the "Experience Machine."

From Nozick's Anarchy, State and Utopia (1974)
Suppose there were an experience machine that would give you any experience that you desired. Superduper neuropsychologists could stimulate your brain so that you would think and feel you were writing a great novel, or making a friend, or reading an interesting book. All the time you would be floating in a tank, with electrodes attached to your brain. Should you plug into this machine for life, preprogramming your life's experiences? If you are worried about missing out on desirable experiences, we can suppose that business enterprises have researched thoroughly the lives of many others. You can pick and choose from their large library or smorgasbord of such experiences, selecting your life's experiences for, say, the next two years. After two years have passed, you will have ten minutes or ten hours out of the tank, to select the experiences of your next two years. Of course, while in the tank you won't know that you're there; you'll think it's all actually happening. Others can also plug in to have the experiences they want, so there's no need to stay unplugged to serve them. (Ignore problems such as who will service the machines if everyone plugs in.) Would you plug in?
Here is what Nozick says on the question of plugging in:

What does matter to us in addition to our experiences? First, we want to do certain things, and not just have the experience of doing them. In the case of certain experiences, it is only because first we want to do the actions that we want the experiences of doing them or thinking we've done them. (But why do we want to do the activities rather than merely to experience them?) A second reason for not plugging in is that we want to be a certain way, to be a certain sort of person. Someone floating in a tank is an indeterminate blob. There is no answer to the question of what a person is like who has been long in the tank. Is he courageous, kind, intelligent, witty, loving? It's not merely that it's difficult to tell; there's no way he is. Plugging into the machine is a kind of suicide. It will seem to some, trapped by a picture, that nothing about what we are like can matter except as it gets reflected in our experiences. But should it be surprising that what we are is important to us? Why should we be concerned only with how our time is filled, but not with what we are?

Thirdly, plugging into an experience machine limits us to a man-made reality, to a world no deeper or more important than that which people can construct. There is no actual contact with any deeper reality, though the experience of it can be simulated. Many persons desire to leave themselves open to such contact and to a plumbing of deeper significance. This clarifies the intensity of the conflict over psychoactive drugs, which some view as mere local experience machines, and others view as avenues to a deeper reality; what some view as equivalent to surrender to the experience machine, others view as following one of the reasons not to surrender!

We learn that something matters to us in addition to experience by imagining an experience machine and then realizing that we would not use it. We can continue to imagine a sequence of experience machines each designed to fill lacks suggested for the earlier machines. For example, since the experience machine doesn't meet our desire to be a certain way, imagine a transformation machine which transforms us into whatever sort of person we'd like to be (compatible with our staying us). Surely one would not use the transformation machine to become as one would wish, and thereupon plug into the experience machine! So something matters in addition to one's experiences and what one is like. Nor is the reason merely that one's experiences are unconnected with what one is like. For the experience machine might be limited to provide only experiences possible to the sort of person plugged in. Is it that we want to make a difference in the world? Consider then the result machine, which produces in the world any result you would produce and injects your vector input into any joint activity. We shall not pursue here the fascinating details of these or other machines. What is most disturbing about them is their living of our lives for us. Is it misguided to search for particular additional functions beyond the competence of machines to do for us? Perhaps what we desire is to live (an active verb) ourselves, in contact with reality. (And this, machines cannot do for us.) Without elaborating on the implications of this, which I believe connect surprisingly with issues about free will and causal accounts of knowledge, we need merely note the intricacy of the question of what matters for people other than their experiences. Until one finds a satisfactory answer, and determines that this answer does not also apply to animals, one cannot reasonably claim that only the felt experiences of animals limit what we may do to them.

Here is a PDF with some excerpts on what Nozick says about the Experience Machine in The Examined Life as well.
 

Vaporweeb

Its n-not like I like you or anything!
Bronze
Joined
Feb 24, 2021
Messages
722
Reaction score
4,829
Awards
221
Website
falsememories.neocities.org
I once heard someone boil down Roko's basilisk to just a sci-fi version of "believe in god or go to hell" and I haven't been able to take it seriously ever since:SoyU2:
 
Last edited:

s0ren

who cares
Bronze
Joined
May 25, 2021
Messages
354
Reaction score
1,058
Awards
115
I once heard someone boil down Roko's basilisk to just a sci-fi version of "believe in god or go to hell" and I haven't been able to take it seriously ever since:SoyU2:
Yeah, it's very much just a shitty and sp00ky internet Pascal's Wager lol
 
Yeah, it's very much just a shitty and sp00ky internet Pascal's Wager lol
The fact that it makes transhumanist cucks shit their pants makes it even funnier to me.
 

Kolph

Traitor
Joined
Oct 11, 2021
Messages
1,102
Reaction score
3,540
Awards
220
Anyone know any interesting thought experiments?

If you are susceptible to being genuinely frightened by thought experiments that raise the possibility of you suffering after your death, skip the spoiler.
I've recently learned of Roko's Basilisk and its effects on the people that know about it. Maybe I'm not so... knowledgeable about this kind of thing, but I just can't view an AI torturing a simulation of me in the distant future after my death as an actual threat.

If you don't know what Roko's Basilisk is, in (extreme) summary, it's the idea that a superintelligent AI singularity in the future will punish those who did not help bring about its existence, even though they had full knowledge that it would exist. So now that you, the reader, know that an AI - Roko's Basilisk - will exist in the future, thanks to humanity's constant technological progress, you are at risk. Specifically, because you now know it will exist, you must do what you can to bring it about, or you (or rather, a simulation of you) will be punished by the AI in the future.

This goes for literally everyone who just read that last paragraph. They are now aware of the Basilisk, so they must donate or otherwise do their best to bring it about, because if they do something, but not their absolute best, the AI will know and will punish them. In essence, you get sent to transhuman hell.
The original post by Roko was on the forum LessWrong and got hidden, consequently producing a Streisand effect that boosted its popularity.

RationalWiki has a really good article on it; that's where I finally understood the concept after a whole day of reading other sites. It also has a whole section for those who are actually afraid of the Basilisk, so you can check that out if it applies to you, or just for general knowledge.
Roko's Basilisk

The name Basilisk is a reference to the short story BLIT by David Langford, in which "basilisks" are a kind of image that, when seen by a person in the story, completely shuts down the brain, as if it were going through a Windows BSOD. I recommend giving it a read; it's pretty interesting, short, and I definitely did not do it justice in my sentence-long explanation.
BLIT

Anyway, all of that seems... rather unbelievable to me, even now. Maybe I fail to fully grasp how likely that is to happen.

I know most of the common ones, such as the prisoner's dilemma, so I'd be interested in more obscure ones, I suppose, but do feel free to discuss whichever you wish.
Admit that you only made this thread so the future AI God doesn't bend you over backwards and fuck you up
 

vermillion

In the Breeze
Joined
Jan 6, 2022
Messages
9
Reaction score
28
Awards
6
I once heard someone boil down Roko's basilisk to just a sci-fi version of "believe in god or go to hell" and I haven't been able to take it seriously ever since:SoyU2:
Honestly, yeah, now that I see that comparison, it just seems like an overly complicated way to translate that idea into sci-fi terms.

The fact that it makes transhumanist cucks shit their pants makes it even funnier to me.
It's the funniest thing to go around various forums and imageboards and see people lose their shit over the concept.

Admit that you only made this thread so the future AI God doesn't bend you over backwards and fuck you up
Uhhh.... uhhh... fuck, I mean.. uh.. (;´∀`)

In all seriousness though, after reading about Roko's Basilisk, I got pretty interested in thought experiments, and although I don't really believe what they suggest, I do find some of them particularly nice. Like, for example, the Boltzmann brain. It suggests that a spontaneous brain appearing in space with exactly your memories, personality, etc. is more likely to come about than the creation of our entire universe. So right now, though I think I am typing something, it may just be that brain floating in space going over its memories before dissolving back into chaos at the end.
 

Jade

Shameless Germaniboo
Joined
Aug 8, 2021
Messages
668
Reaction score
1,946
Awards
195
Website
idelides.xyz
A very old one is "Ship of Theseus". The question goes "If you dismantle a ship totally, and then use those same parts to rebuild the ship in another area exactly as it was, is it still the same ship?"

The one I dislike the most, though, is the one where there is a rail, a fork in the rail, and people tied to the tracks. A train is coming along, and you are standing next to a lever controlling the direction of the fork. If you do nothing, the train will continue forward and kill four people who are tied up on the tracks. If you pull the lever, the train will change direction and kill only one person tied up on the other track. The question asks, "is it morally sound to pull the lever?" I hate this one because not only is it extremely exploitable, with tons of amusing variations, but it also feels really pretentious on a base level. In my opinion, you just aren't given enough information to make a decision that shows anything substantial about your way of thinking. You don't know why they're tied up, who these people are, why they're there or you're there, or anything. What's the point of the question? It demonstrates nothing.
 

Vaporweeb

Its n-not like I like you or anything!
Bronze
Joined
Feb 24, 2021
Messages
722
Reaction score
4,829
Awards
221
Website
falsememories.neocities.org
A very old one is "Ship of Theseus". The question goes "If you dismantle a ship totally, and then use those same parts to rebuild the ship in another area exactly as it was, is it still the same ship?"
I've never known that version of it. I've only ever heard of this one (copied from Wikipedia):
It is supposed that the famous ship sailed by the hero Theseus was kept in a harbor as a museum piece, and as the years went by, some of the wooden parts began to rot and were replaced by new ones; then, after a century or so, every part had been replaced. The question then is whether the "restored" ship is still the same object as the original.

If it is, then suppose the removed pieces were stored in a warehouse, and after the century, technology was developed that cured their rot and enabled them to be reassembled into a ship. Is this "reconstructed" ship the original ship? If it is, then what about the restored ship in the harbor still being the original ship as well?[5]
Otherwise, I think the vast majority of people would agree that an item disassembled and then reassembled from all of the original components, in an identical configuration, is essentially identical to the "original" prior to disassembly.
The one I dislike the most, though, is the one where there is a rail, a fork in the rail, and people tied to the tracks. A train is coming along, and you are standing next to a lever controlling the direction of the fork. If you do nothing, the train will continue forward and kill four people who are tied up on the tracks. If you pull the lever, the train will change direction and kill only one person tied up on the other track. The question asks, "is it morally sound to pull the lever?" I hate this one because not only is it extremely exploitable, with tons of amusing variations, but it also feels really pretentious on a base level. In my opinion, you just aren't given enough information to make a decision that shows anything substantial about your way of thinking. You don't know why they're tied up, who these people are, why they're there or you're there, or anything. What's the point of the question? It demonstrates nothing.
This is all I'm going to add to the argument
[attached image]

(You get a massive bonus score at the end for getting them all ;))
 

Chao Tse-Tung

Chairman of the Deep-State Cabal, KEC
Gold
Joined
Jan 1, 2022
Messages
281
Reaction score
1,052
Awards
109
Website
aoaed-official.neocities.org
Here's a (pretty much purely hypothetical, but still interesting) thought experiment from a short story whose name I always forget:

Imagine that we manage to create a perfect simulation of our universe, down to subatomic resolution. Its being perfect means that you can zoom in down to Earth, and on it you can find a perfect simulation being run, and so on, ad infinitum. The interesting part is that, since there are now an infinite number of simulations, it's statistically improbable, bordering on impossible, that our universe is the "top" of that recursive chain; instead, it would simply be one of infinitely many identical simulated universes.

But this brings to mind a thought I frequently have when people start on "simulation" tirades, which is: okay, say our reality isn't "real," or even that I'm a brain in a jar or whatever; then what the fuck does it matter? Experience is experience, coma dream or computer code or god-training-room or honest-to-goodness real physical being; I'm feeling it all the same, and if I can never tell, then can't I safely disregard those thoughts entirely? Which, for me, leads toward the sublime Buddhist revelation that nothing fucking matters, and if you just chill and don't be a dick, you have precisely diddly-squat to worry your monkey ass about.
 

brentw

Well-Known Traveler
Joined
Jan 4, 2022
Messages
670
Reaction score
1,682
Awards
181
Like, for example, the Boltzmann brain. That one suggests a spontaneous brain appearing in space with exactly your memories, personality, etc, is more likely to come about than our universe's creation. So right now, though I am typing something, it's just that brain floating in space going over its memories and then disappearing into chaos at the end.
That just sounds like an LOL SO RANDOM version of "WE'RE IN A SIMULATION!".
 

vermillion

In the Breeze
Joined
Jan 6, 2022
Messages
9
Reaction score
28
Awards
6
A very old one is "Ship of Theseus". The question goes "If you dismantle a ship totally, and then use those same parts to rebuild the ship in another area exactly as it was, is it still the same ship?"
I really like that one; it made me think about it for a long time. I've actually heard the Wikipedia variant more often:
I've never known that version of it. I've only ever heard of this one (copied from Wikipedia):

But this brings to mind a thought I frequently have when people start on "simulation" tirades, which is: okay, say our reality isn't "real," or even that I'm a brain in a jar or whatever; then what the fuck does it matter? Experience is experience, coma dream or computer code or god-training-room or honest-to-goodness real physical being; I'm feeling it all the same, and if I can never tell, then can't I safely disregard those thoughts entirely? Which, for me, leads toward the sublime Buddhist revelation that nothing fucking matters, and if you just chill and don't be a dick, you have precisely diddly-squat to worry your monkey ass about.
That's how I've been thinking about it too, especially with the Boltzmann brain. Like damn, I'm feeling everything now just like I always did. I'm not an expert in thermodynamics; hell, I can't even call myself a novice, but the fact that I feel at all just completely invalidates the Boltzmann brain idea for me.

Buridan's ass is a somewhat entertaining one.
Never heard of it, so I wasn't sure which ass it would be. Glad I wasn't wrong, it's definitely interesting.

That just sounds like an LOL SO RANDOM version of "WE'RE IN A SIMULATION!".
Yeah, I guess it can be seen like that... buuuut we'll never know. >u<
 
The one I dislike the most, though, is the one where there is a rail, a fork in the rail, and people tied to the tracks. A train is coming along, and you are standing next to a lever controlling the direction of the fork. If you do nothing, the train will continue forward and kill four people who are tied up on the tracks. If you pull the lever, the train will change direction and kill only one person tied up on the other track. The question asks, "is it morally sound to pull the lever?" I hate this one because not only is it extremely exploitable, with tons of amusing variations, but it also feels really pretentious on a base level. In my opinion, you just aren't given enough information to make a decision that shows anything substantial about your way of thinking. You don't know why they're tied up, who these people are, why they're there or you're there, or anything. What's the point of the question? It demonstrates nothing.
I appreciate this one. While it's true that it's not necessarily a practical scenario or wildly thought-provoking, it's a conundrum that prompts an uncomfortable choice that even as a reader you feel compelled to resolve.

In life you sometimes have to make an imperfect decision with imperfect information, and 'do nothing' is always an option. In this extreme scenario, with this limited information, I pull the lever. In my view, if you have sole control over an event (the trajectory of a train), doing nothing is no less an action than doing something. I have no rationale to value any one person more highly than another, and I value 4 lives above 1.

If we had more information, our decision might change. For example, if I were told the 1 person is an infant and the 4 people are 70 years old, you could say "79 years of likely additional life outweighs 40 (the four 70-year-olds have roughly 10 expected years left each), so do not pull the lever," using 80 years as the life expectancy.
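The expected-life-years arithmetic above can be sketched as a quick toy script. This is only an illustration of the post's own assumed figures (an 80-year life expectancy, one infant of about age 1, four people aged 70); the function name is made up for the example:

```python
LIFE_EXPECTANCY = 80  # assumed average lifespan, as in the post

def expected_years_lost(ages):
    """Total expected life-years lost if everyone in `ages` dies."""
    return sum(max(LIFE_EXPECTANCY - age, 0) for age in ages)

# Do nothing: the four 70-year-olds die.
do_nothing = expected_years_lost([70, 70, 70, 70])  # 40 years lost

# Pull the lever: the one infant dies.
pull_lever = expected_years_lost([1])  # 79 years lost

# By this crude metric, doing nothing loses fewer expected years.
print(do_nothing, pull_lever)
```

Of course, this only formalizes one possible valuation; the whole point of the variations is that the metric itself is up for debate.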
 

MasterRaider

Internet Refugee
Joined
Jan 14, 2022
Messages
1
Reaction score
4
Awards
2
Well, the eternal return as formulated by Nietzsche is by far the scariest one for me; it is the most disturbing definition of hell I have encountered.

>"What if some day or night a demon were to steal after you into your loneliest loneliness, and say to you, "This life as you now live it and have lived it, you will have to live once more and innumerable times more; and there will be nothing new in it, but every pain and every joy and every thought and sigh and everything unutterably small or great in your life will have to return to you, all in the same succession and sequence...
Would you not throw yourself down and gnash your teeth and curse the demon who spoke thus? Or have you once experienced a tremendous moment when you would have answered him: "You are a god and never have I heard anything more divine"
 
Well, the eternal return as formulated by Nietzsche is by far the scariest one for me; it is the most disturbing definition of hell I have encountered.

>"What if some day or night a demon were to steal after you into your loneliest loneliness, and say to you, "This life as you now live it and have lived it, you will have to live once more and innumerable times more; and there will be nothing new in it, but every pain and every joy and every thought and sigh and everything unutterably small or great in your life will have to return to you, all in the same succession and sequence...
Would you not throw yourself down and gnash your teeth and curse the demon who spoke thus? Or have you once experienced a tremendous moment when you would have answered him: "You are a god and never have I heard anything more divine"
Eternity is hard to fathom. Imagine even heaven: what pleasures could possibly hold our boundless fascination forever? And if we must cease to be ourselves to be at peace with it, are we party to that eternity at all?
 
