For what X would you be indifferent between living X days, but forgetting your day at the end of every day, and living 10 days? (terminally)
post by Mati_Roy (MathieuRoy) · 2020-09-18T04:05:59.078Z · LW · GW · 7 comments
This is a question post.
Terminally meaning in and of itself, as opposed to instrumentally meaning as a means to other ends.
Answers
10 days of connected experience vs X days of disconnected experience? Honestly, I can't compound experiences/values very much in 10 days, so the amnesia doesn't cost that much - somewhere between 11 and 20 days seems reasonable to me.
I know people with severe memory problems, and they enjoy life a significant fraction (at least 10%, perhaps 80%, some days over 100%) of what they might if they remembered yesterday.
This question gets much harder for 2, 10, or 50 years. The amount of joy/satisfaction/impact that can be had in those timeframes by building on previous days is perhaps orders of magnitude higher if an agent has continuity than if not.
I think 10,000 or so. Assuming the days are meaningfully different.
I'm not sure whether X keeps getting more valuable-to-me as it becomes larger than 1.
7 comments
comment by Mati_Roy (MathieuRoy) · 2020-09-18T04:56:25.261Z · LW(p) · GW(p)
I encourage you to answer the question before reading this comment.
.
.
.
.
.
I think answering this question has an interesting implication:
- either you say a small number, in which case (at least under some theories of personal identity) you would value short-lived copies of yourself a lot
- or you say a large number, in which case you place almost no value on the moments you completely forget, presumably including at least some dreams and episodes of drug intoxication
↑ comment by Richard_Kennaway · 2020-09-21T18:12:40.426Z · LW(p) · GW(p)
Indeed. I place almost no value on the moments I completely forget (and which leave no other residue in the present).
↑ comment by Dagon · 2020-09-22T15:03:45.462Z · LW(p) · GW(p)
There are TONS of moments I forget, but they _do_ leave residue. Either in income, effect on other people, or environmental improvements (the lightbulb I changed continues to work). Not sure if this scenario removes or carries forward unconscious changes in habits or mental pathways, but for real memory loss, victims tend to retain some amount of such changes, even if they don't consciously remember doing so.
I also value human joy in the abstract. Whether some other person or some un-remembered version of me experiences it, there is value.
If you give a very very large value, do you also believe that all mortal lives are very-low-value, as they won't have any memory once they die?
↑ comment by Richard_Kennaway · 2020-09-22T17:42:22.818Z · LW(p) · GW(p)
If you give a very very large value, do you also believe that all mortal lives are very-low-value, as they won't have any memory once they die?
They are of no value to them, because they're dead. They may be of great value to others.
↑ comment by Dagon · 2020-09-22T20:55:24.396Z · LW(p) · GW(p)
I recognize that time-value-of-utility is unsolved, and generally ignored for this kind of question. But I'm not sure I follow the reasoning that current-you must value future experiences based on what farther-future-you values.
Specifically, why would you require a very large X? Shouldn't you value both possibilities at 0, because you're dead either way?
↑ comment by Richard_Kennaway · 2020-09-22T21:35:37.136Z · LW(p) · GW(p)
Specifically, why would you require a very large X? Shouldn't you value both possibilities at 0, because you're dead either way?
No, because I'm alive now, and will be until I'm dead. Until then, I have the preferences and values that I have.
↑ comment by Mati_Roy (MathieuRoy) · 2020-09-22T22:48:06.205Z · LW(p) · GW(p)
Either in income, effect on other people, or environmental improvements
Those are instrumental. They are important to consider, but for the purpose of this post I'm mostly interested in fundamental values.
Not sure if this scenario removes or carries forward unconscious changes in habits or mental pathways
It does, for the purpose of making the thought experiment clear-cut. But yeah, that's something I wonder about as well in practice.
If you give a very very large value, do you also believe that all mortal lives are very-low-value, as they won't have any memory once they die?
Two (mutually exclusive) moral theories I find plausible are:
- (All else equal) someone's life is as valuable as its longest instantiation (all shorter instantiations are irrelevant)
- Finite lives are valueless