Musings on non-stability of mind-states, etc.

post by Dorikka · 2014-11-02T02:43:39.423Z · LW · GW · Legacy · 4 comments

I thought about some stuff and wrote down notes for my future reference. I'm pasting them below in case others find them useful or thought-provoking. Apologies for inferential distance problems caused by my language/notation, etc. Posting my notes seemed superior to not doing so, especially given that the voting system allows for filtering content if enough people don't want to see it.

---

Mind-states are non-stable with respect to attributes valued by some agents. This is true not only with respect to death, etc., but also to biological/chemical changes that occur perpetually, causing behaviors in the presence of identical stimuli/provocations to differ substantially. The English language (and many other human languages) seems to hide this by its use of pronouns and names (handles) for humans and other objects deemed sentient, which do not change from the moment the human/animal/etc. is born or otherwise appears to come into existence.

As a result, efforts to preserve mind-states are unsuccessful insofar as they allow mind-states to change (replacing one state with another, without retaining the pre-change state). Even given life-extension technology such that biological death is prevented, this phenomenon would likely continue - technology to preserve all mind-states as they came into existence would likely be more difficult to engineer than that required to attain mere immortality. Yet agents may also value the existence (and continued existence) of mind-states which have never existed, necessitating a division of resources between preserving existing mind-states and causing new ones to exist (perhaps variants of existing ones after they "have a (certain) experience"). Agents with such values face an engineering/(resource allocation) problem, not just a "value realization" problem.

Also consider that humans do not appear to exist in a state of perpetual optimization/strategizing; they execute, and the balance between varying methods of search and execution does not appear to be the result of such a process - to the extent such a process occurs, its recursive depth is likely minimal. Mental processes are often triggered by sensory cues or provocations (c/p). The vast majority of the c/p encountered by humans consistently trigger a small subset of the mental processes implementable by human brains, even once the large space of processes which do not optimize along held values is excluded. Human brains are limited in the number of simultaneous processes run, so c/p-triggered processes reduce the extent to which current processes continue to be run - furthermore, there appears to be an upper limit on the total number of simultaneous processes run (regardless of the allocation of resources to each), so c/p sometimes trigger processes which extinguish existing processes. Thus encountering certain c/p may significantly impact a human agent's values, as the c/p encountered shape the mental/thought processes run. If the processes likely to maximize the human's values are not those triggered by the majority of c/p encountered, efforts to optimize the c/p encountered may have significantly positive expected value.
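
As a toy illustration of the triggering/capacity claim, here is a minimal sketch in Python - every cue, process name, and limit below is invented for illustration, not taken from the notes:

```python
# Toy model: cues trigger mental processes, the brain runs only a few
# at once, and a newly triggered process can extinguish a running one.
# All names and numbers are invented.

MAX_PROCESSES = 2  # assumed upper limit on simultaneous processes

# Each cue reliably triggers one process from a small triggered subset.
CUE_TABLE = {
    "email_ping": "check_inbox",
    "deadline_reminder": "plan_project",
    "snack_smell": "seek_food",
}

running = []  # currently running processes, oldest first

def encounter(cue):
    process = CUE_TABLE.get(cue)
    if process is None or process in running:
        return
    if len(running) >= MAX_PROCESSES:
        evicted = running.pop(0)  # new process extinguishes an existing one
        print(f"{cue!r} extinguished {evicted!r}")
    running.append(process)

for cue in ["email_ping", "snack_smell", "deadline_reminder"]:
    encounter(cue)
print(running)  # which processes survive depends entirely on the cues encountered
```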

Related to the above, humans fail to manage the search/execute decision with significant recursive depth. Behaviors (actions following, or in anticipation of, certain c/p) are often not the result of conscious strategy/optimization - the formation of such behaviors is often driven by subconscious/emotional (s/e) processes which do not necessarily optimize effectively along the human's values, so the human may exhibit behaviors "inconsistent" with its values. This may be the case even if the s/e processes hold a perfect representation of the human's values, given that the success of behaviors in optimizing along values involves prediction of physical phenomena (including the mental phenomena of other agents) - inaccuracies in the model used by s/e processes may result in such "inconsistent" behavior, even if that model is not consciously accessible.
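
A minimal sketch of that last point - even given the agent's true values, an s/e process with an inaccurate world-model can select a behavior that scores poorly under the true model. All behaviors, outcomes, and probabilities here are invented:

```python
# Toy illustration: an s/e process holds the agent's true values but an
# inaccurate model of how behaviors play out, so it picks a behavior
# "inconsistent" with those values. All numbers are invented.

# True value function over outcomes (shared by both processes).
values = {"friend_reassured": 1.0, "friend_offended": -2.0}

# How likely each behavior is to produce each outcome...
true_model = {
    "blunt_joke": {"friend_reassured": 0.2, "friend_offended": 0.8},
    "kind_words": {"friend_reassured": 0.9, "friend_offended": 0.1},
}
# ...versus the s/e process's miscalibrated predictions.
se_model = {
    "blunt_joke": {"friend_reassured": 0.8, "friend_offended": 0.2},
    "kind_words": {"friend_reassured": 0.7, "friend_offended": 0.3},
}

def expected_value(behavior, model):
    return sum(p * values[outcome] for outcome, p in model[behavior].items())

# The s/e process optimizes correctly against its own (wrong) model:
chosen = max(true_model, key=lambda b: expected_value(b, se_model))
print(chosen)                                    # 'blunt_joke'
print(expected_value(chosen, true_model))        # -1.4 under the true model
print(expected_value("kind_words", true_model))  # 0.7, the better choice
```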

4 comments

comment by Fluttershy · 2014-11-02T03:57:53.331Z · LW(p) · GW(p)

technology to preserve all mind-states as they came into existence would likely be more difficult to engineer than that required to attain mere immortality.

I do not think that most people have a problem with the fact that they will, five years from now, have values, habits, and particular ways of responding to situations which are different from those they have now. I wouldn't want my personality to change drastically overnight, but I certainly wouldn't want to magically make myself unable to change my habits and values in the future, in order to make future versions of myself value exactly the same things as me, either.

There are definitely ways in which our values, habits, and reactions to experiences could change which would be very bad - Alzheimer's and other age-related diseases obviously change people for the worse. Still, I accept and embrace the fact that I will have different habits and values five years from now.

People's values change all of the time. Changes in one's religious beliefs can cause changes in values. Becoming a parent seems to change values, habits, and the exact way in which one tends to respond to situations quite strongly, yet many people report that they enjoy becoming parents. The fact that people's personalities change over time isn't always a bad thing.

Replies from: torekp
comment by torekp · 2014-11-04T01:31:35.768Z · LW(p) · GW(p)

In light of the above, I suggest that the OP be recast from talk of "preserving mind-states" to talk of "instantiating valued mind-processes" or something like that. Then see where that leaves you.

comment by coyotespike · 2014-11-03T20:33:57.202Z · LW(p) · GW(p)

As Dorikka acknowledges above, personal notes often use a hard-to-read shorthand (I know mine do). I have roughly translated this note to a form I can more easily understand, below.

"Mind-states are non-stable with respect to attributes valued by some agents." People change their minds and selves, in important ways.

"This is true not only with respect to death, etc but also biological/chemical changes that occur perpetually, causing behaviors in the presence of identical stimuli/provocations to differ substantially." People's minds change, not only when they die and hence cease to exist, but also when they just, you know, change their minds and behaviours, so they act differently at different times, even though everything else is the same.

"The English language (and many other human languages) seem to hide this by their use of "pronouns" and names (handles) for humans and other objects deemed sentient which do not change from the moment the human/animal/etc is born or otherwise appears to come into existence." If you change your mind, you are a little bit different than before. But we still call you by the same name. Something has been lost - some unique constellation of thought, some pattern of behaviour, personality attribute - but language "hides" the loss by pretending you are the same person.

"As a result of this, efforts to preserve mind-states are unsuccessful while they allow mind-states to change (replacing one state with another, without retaining the pre-change state). Even given life-extension technology such that biological death is prevented, this phenomenon would likely continue - technology to preserve all mind-states as they came into existence would likely be more difficult to engineer than such required to attain mere immortality." What we need is GitHub for minds! Version control would let us change who we are, without losing the old selves. (For those who don't know, Git (among other version control systems) allows you to save (and preserve) file changes as you go. You can easily see the difference between old and new code, merge two different codes, and other cool stuff I don't know about.) But making Git for minds will probably be much harder than immortality.

"Yet agents may also value the existence (and continued existance) of mind-states which have never existed, necessitating a division of resources between preserving existing mind-states and causing new ones to exist (perhaps variants of existing ones after they "have a (certain) experience")). Agents with such values face an engineering/(resource allocation) problem, not just a "value realization" problem." And if we did make GitBrains, we'd have to decide whether to forge new paths ahead, mentally speaking, or spend time and resources backing up our current/past minds. Tough choices.

"Also consider that humans do not appear to exist in a state of perpetual optimization/strategizing; they execute, and the balance between varying methods of search and execution does not appear to be the result of such a process..." As we know, we're not always very strategic about how we change ourselves. We just sort of act - or rather, we are acted on as we encounter behavioural cues which cause us to change. If your mind and behaviour are being changed by environmental cues in ways that don't further your values, you could get a lot of benefit by changing your environment.

"Behaviors (actions following, or in anticipation of, certain c/p) are often not the result of conscious strategy/optimization..." Finally, you don't always act in a strategic and well-planned manner. Even if you have such plans, your methods of searching and exploiting good opportunities for yourself may not take into account your irrational subconscious and emotions, which don't have such strategic plans. Bugger.

All of this adds up to: no good ways of preserving your self at any one time, and great difficulty in changing your self in ways you want to change yourself. So you can't easily go back to a better previous version of yourself, and you can't be sure you will successfully make a better future version of yourself. This formulation of the problem is clear, though not original - but I agree that version control for my self could come in handy. "Boy, am I irritating...I'm backdating to Self -2 years. Let me try that again."

Replies from: Error
comment by Error · 2014-11-04T19:57:22.342Z · LW(p) · GW(p)

What we need is GitHub for minds! Version control would let us change who we are, without losing the old selves.

Now what will be really interesting is third party pull requests for minds...