I have an issue with this framing. 'Ghosts' are exactly as physically instantiated into reality as 'you' are: both run on your brain's hardware. If your brain goes into an unrecoverable state, then the 'you' part and any 'ghost' part are equally lost. What is the actual distinction you are trying to make here?
"I define a Philosophical Ghost to be something that has an experience but is not physically instantiated into reality, although it may experience the belief that it is physically instantiated into reality. Examples include story characters, simulacra inside of hypothetical or counterfactual predictions, my mental model of the LessWrong audience that I am bouncing thoughts off of as I write this post, your mental model of me that you bounce thoughts off of as you try to read it, and so on."
Correct me if I'm wrong, but aren't those all ethical frameworks rather than meta-ethical frameworks? My post was an attempt to create a framework within which to discuss those, not to offer an alternative to them.
You disagree with MPS/LPS in what way? In that there are cyclical states, in that it is impossible to rate states against each other, or something else?
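To make the cyclical-states case concrete, here is a minimal sketch with invented toy states and preference pairs (nothing from the actual MPS/LPS definitions): as soon as the "preferred to" relation contains a cycle, no ranking of states exists at all.

```python
# Toy illustration: a cyclic preference relation cannot be collapsed into a
# single ranking. Topological sorting fails exactly when there is a cycle.
from graphlib import TopologicalSorter, CycleError

# prefers[s] = {t, ...} means state s is preferred to each state t.
acyclic = {"A": {"B"}, "B": {"C"}}               # A > B > C: consistent
cyclic = {"A": {"B"}, "B": {"C"}, "C": {"A"}}    # A > B > C > A: cyclic

for name, prefers in [("acyclic", acyclic), ("cyclic", cyclic)]:
    try:
        ranking = list(TopologicalSorter(prefers).static_order())
        print(f"{name}: states can be ranked worst-to-best: {ranking}")
    except CycleError:
        print(f"{name}: preferences are cyclic, so no ranking of states exists")
```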
Completely agree that preferences are not consistent over time, but I'm not sure about the relevance of that here.
Agent actions definitely do not maximize over reachable-state preferences. My only point there was that agents make some attempt to improve states, as the sketch below illustrates. If you disagree with that, what would be a counterexample? Totally agree with your point that it can get very messy.
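Here is what I mean by "some attempt to improve states" without maximizing, as a toy sketch (the value function is invented purely for illustration): a greedy hill-climber improves its state at every step, yet stalls at a local optimum rather than the best reachable state.

```python
# Toy illustration: an agent that improves its state step by step without
# maximizing over all reachable states. Greedy hill climbing stalls at a
# local optimum.

def value(state: int) -> int:
    # Invented landscape: a local peak at 2 (value 5), the global peak at 7 (value 9).
    return {0: 1, 1: 3, 2: 5, 3: 2, 4: 4, 5: 6, 6: 8, 7: 9}[state]

def improve(state: int) -> int:
    """Move to the better adjacent state, but only if it is an improvement."""
    neighbors = [s for s in (state - 1, state + 1) if 0 <= s <= 7]
    best = max(neighbors, key=value)
    return best if value(best) > value(state) else state

state = 0
while (next_state := improve(state)) != state:
    state = next_state

print(state, value(state))  # -> 2 5: improved from state 0, but not the global max at 7
```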