This is really the only answer. The only thing that makes it “hard” is having to face the brutality of moral calculus.
Now, what if you’re not the first person on the chain? What if you’re the second one, or the nth one? What then? Would you kill two, or n, knowing that the person before you spared them?
The thing to do is kill now even if it’s thousands. Because it’s only going to get worse.
The best time to kill was the first trolley. The second best time to kill is now.
Yes, but it also kinda depends on what happens at and after junction 34, from which point on more than the entire population of earth is at stake.
If anything, this shows how ludicrously fast exponentials grow. At the start of the chain it seems like there will be so many decisions made down the line that there must be a psycho in there somewhere, right? But (assuming the game just ends after junction 34) you’re actually just one of 34 people, and the chances of getting a psycho are virtually zero.
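A minimal sketch of the arithmetic behind that junction-34 figure, assuming one victim at the first junction, doubling at each one, and a world population of roughly 8 billion (a ~7 billion figure gives the same junction):

```python
# Junction n puts 2**(n - 1) people on the tracks: 1, 2, 4, ... doubling each time.
# Find the first junction where that count exceeds the world population.
WORLD_POPULATION = 8_000_000_000  # assumed round figure; ~7e9 gives the same result

n = 1
while 2 ** (n - 1) <= WORLD_POPULATION:
    n += 1

print(n, 2 ** (n - 1))  # 34 8589934592 -- more people than are alive on Earth
```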
Very interesting one!
It’s not that interesting. If you rephrase the question, it’s a choice between a good option and a less good one, and barely even a choice at that:
“Would you rather have only one (or, say, trillions) die now, or would you like to allow *at a minimum* twice that many people to die the second we talk to a sadist?”
If you can’t choose the smaller number, all it means is that you lack moral strength - or the test proctor has put someone you know on the tracks, which is cheating. A highly principled person might struggle if choosing between their daughter and one other person. If it’s my kid versus a billion? That’s not a choice, that’s just needless torture. Any good person would sacrifice their kid to save a billion lives. I take that as an axiom, because anything else is patently insane.
Kill fewer people now is obviously the right answer, and not very interesting.
What is interesting is that the game already breaks at junction 34, which is unexpectedly low.
So a more interesting dilemma would have been: “Would you kill n people now, or double it and pass it on, knowing the next person faces the same dilemma, but that once all humanity is at stake and the lever is not pulled, the game ends?” Because that would involve first figuring out that the game actually only involves 34 decisions, and then the dilemma becomes “do I trust the next 33-n people not to be psychos, or do I limit the damage now?” Even more interestingly, “limiting the damage now” makes you the “psycho” in that sense…
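Purely as an illustration of that trade-off, here is a toy expected-value sketch. The modelling choices are assumptions, not something the comment specifies: each later person is independently a “psycho” who pulls with probability p, and the game ends with zero deaths if nobody ever pulls.

```python
# Toy model of the modified game: 34 junctions, junction k kills 2**(k - 1)
# people if its lever is pulled, each later person pulls independently with
# probability p, and nobody dies if the lever is never pulled.

def expected_deaths_if_you_pass(n: int, p: float, junctions: int = 34) -> float:
    """Expected deaths when person n passes: sum over the first later person k
    to pull of P(everyone in between passes) * P(k pulls) * 2**(k - 1)."""
    return sum(
        (1 - p) ** (k - n - 1) * p * 2 ** (k - 1)
        for k in range(n + 1, junctions + 1)
    )

p = 0.01  # assumed "psycho" rate, matching the 1-2% figure cited below
print(expected_deaths_if_you_pass(1, p))   # ~1.3e8 expected deaths vs. 1 killed for sure
print(expected_deaths_if_you_pass(30, p))  # ~1.6e8 expected deaths vs. 2**29 ≈ 5.4e8 for sure
```

Under those toy assumptions, passing only beats killing now in expectation in the last few junctions, so the early choices stay easy and only the late ones get genuinely uncomfortable.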
The fact that the game never ends is what made the choice too easy, you’re right.
For this study you want sociopathy, not psychopathy. I can report from my wasted psych degree that sociopathy occurs in 1-2% of the population.
Binomial probability tells us that if you repeat a 1% chance test 32 times, you have a 95% chance of never seeing it.
Don’t pull the lever. Sorry for the ninja edit, I misread something.
I’m confused: 0.99^32 ≈ 0.72, not 0.95. And even if you know that everyone except the last guy won’t pull the lever, that’s still a 1% chance of killing everyone on earth (expected deaths: 70 million), which is way worse than definitely killing one person!
(Edit: unless “don’t pull the lever” means killing that one person, because it isn’t clear which is the default “no action” outcome. In which case, never mind.)
(Edit 2: if you know the 34th and last person might be a sociopath, you’re best off if the first 27 people might also be sociopaths.)
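The corrected figures check out (the ~7 billion population behind the 70 million number is an assumption about what the commenter used):

```python
# Chance that 32 independent 1% events all fail to occur, and the expected
# deaths from a single 1% chance of killing ~7 billion people (assumed figure).
print(0.99 ** 32)            # ~0.725, i.e. about 72%, not 95%
print(0.01 * 7_000_000_000)  # 70,000,000 expected deaths, vs. 1 certain death
```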
How could you know someone else is going to do it though? And how is their decision your responsibility?
If you kill someone you are a killer. It’s that simple.