Erase button
post by Astor · 2021-11-09T09:39:39.439Z · LW · GW · 6 comments
I try to model ethical concepts with thought experiments in order to investigate my own uneasiness. I do not wish for anybody to be harmed by these, but I want to share them nonetheless in the hope of learning from the discussion.
There is a room with a completely white interior. In the center of the room stands a small white table with a button. If you press this button, you instantly cease to exist. The button only works for those who press it with a conscious understanding of its consequences and with the intention to end their existence. There is no delay in its effect: you immediately lose all perception. You do not feel anything after pressing the button; the perception simply ends. Considering all of the suffering in the world: if possible, should such a room be provided?
Supplementary question: Which conditions prevent suffering?
6 comments
Comments sorted by top scores.
comment by tivelen · 2021-11-10T02:17:15.906Z · LW(p) · GW(p)
The only difference between this and currently available methods of quick, painless suicide is how "easy" it is for such an intention and understanding to turn into actual non-existence.
Building the rooms everywhere and recommending their use to anyone with such an intention ("providing" them) makes suicide maximally "easy" in this sense. On a surface level, this increases freedom, and allows people to better achieve their current goals.
But what causes such grounded intentions? Does providing such rooms make such conclusions easier to come to? If someone says they are analyzing the consequences and might intend to kill themselves soon, what do we do? Currently, as a society, we force people to stay alive: we tell them how important their life is, how their family would suffer, that suicide is a sin, and so on, and we direct this at everyone who is part of society.
None of these classic generic arguments will make sense anymore. As soon as you acknowledge that some people ought to push the button, and that anyone might need to consider doing so at any time, you have to explain specifically why this particular person shouldn't push it right now, if you want to reduce their suicidal intentions. The fact that someone considering suicide happens to think of their family as a counterargument is due to the universal societal meme, not to its status as a rational reason (which it may very well happen to be).
We can designate certain groups (e.g. the terminally ill) as special and restrict the rooms to them, creating new memes for everyone else based on their health, but the old memes remain broken, and the new ones may not be as strong.
I suspect that the main impact of providing the rooms will be socially encouraging suicide, regardless of what else we try to do, even if we tell ourselves we are only providing a choice for those who want it.
↑ comment by Astor · 2021-11-10T19:17:21.587Z · LW(p) · GW(p)
This is a thoughtful analysis of the possible effects; thank you for it. I do not want such rooms to exist, because I do not want to lose anybody, ever. But humans sometimes have a tendency toward quick decisions, and such an invention would support that tendency. I suppose this thought experiment shows me that blocking access to easy, irreversible decisions has potential value.
comment by TLW · 2021-11-11T08:34:59.826Z · LW(p) · GW(p)
if possible, should such a room be provided?
Informal argument: no. People's perceptions are noisy; in the short term they fluctuate far more than is strictly rational. I believe that most people's views of existence are, on average, (weakly) positive. With the button, what happens if and when there is a momentary negative fluctuation? A positive mean does not imply that all data points are positive.
↑ comment by Astor · 2021-11-11T10:27:18.137Z · LW(p) · GW(p)
This is the same conclusion and argument I arrived at after reading tivelen's comment. But my objection would be that a "momentary fluctuation" is generally not a good moral argument: you could doubt any decision on these grounds, because the length of time a view must persist before it no longer counts as a fluctuation is arbitrary.
comment by JBlack · 2021-11-11T02:22:51.593Z · LW(p) · GW(p)
I would consider the existence of such rooms to be very dangerous.
Not for their stated purpose, which doesn't interest me much from an ethical point of view one way or the other. Their existence implies the prevalence of technology that can instantly read minds well enough to reliably determine both intention and understanding of potentially far-reaching consequences of carrying out that intention. While such technology could be employed very usefully, it also seems very easy to misuse.
↑ comment by Astor · 2021-11-11T08:21:55.726Z · LW(p) · GW(p)
I thought about that and agree with you. But I wanted the room to be considered as an investigation of personal choice, rather than of a choice made by others for you, so I opted to include this mechanism; it would be appropriate not to overemphasize this aspect. Yours is of course an understandable objection. Thank you for bringing it to the foreground.