The utility-maximizing complexity of conscious beings
post by snowball · 2012-09-19T18:44:39.338Z · LW · GW · Legacy · 2 comments
An AI that runs conscious beings has finite resources. Complex conscious beings consume more of these resources than do simpler conscious beings. Suppose, for purposes of discussion, that a friendly AI would maximize the aggregate utility of the beings it runs by modulating their level of complexity.
The utility-maximizing level of complexity would depend on the natural laws that govern the relation between the complexity of a being and the magnitude of the utility that it can experience.
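To make this concrete, here is a minimal toy model. It assumes, purely for illustration, a power-law utility function u(c) = c^α and a fixed resource budget divided among identical beings of complexity c; both assumptions are mine, not part of the post. Splitting a budget R into R/c beings gives aggregate utility (R/c) · c^α = R · c^(α−1), so the simplest possible beings win when α < 1, a single superbeing wins when α > 1, and α = 1 is indifferent.

```python
# Toy model of the allocation problem. The power-law utility
# u(c) = c**alpha and the identical-beings assumption are
# illustrative guesses, not claims made in the post.

def aggregate_utility(budget: float, complexity: float, alpha: float) -> float:
    """Total utility when `budget` units of resources are divided into
    identical beings, each consuming `complexity` units.

    Number of beings: budget / complexity
    Utility per being: complexity ** alpha
    Aggregate:         budget * complexity ** (alpha - 1)
    """
    return (budget / complexity) * complexity ** alpha

budget = 1000.0
for alpha in (0.5, 1.0, 2.0):       # sublinear, linear, superlinear utility
    for c in (1.0, 100.0, 1000.0):  # a swarm, human-scale minds, one superbeing
        print(f"alpha={alpha}: complexity {c:>6} -> "
              f"aggregate utility {aggregate_utility(budget, c, alpha):,.1f}")
```

The third scenario below corresponds to a curve whose per-resource return u(c)/c peaks at some intermediate complexity above the human level (a sigmoid, say), rather than a pure power law.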
When you are uploaded to the AI, then, one of three things will happen, depending on the shape of that relation:
1. You may wake up to discover that most of your mind, memory, and identity have disappeared, because most of your computational resources have been stolen to create a set of new, simple beings who, in total, enjoy more utility than you thereby lost. You may even be much less happy than you are now.
2. You may not wake up at all, because the totality of your computational resources has been stolen from you and given to a superbeing that, owing to some economy of scale in converting resources into utility, can extract more utility from them than you could.
3. You may wake up to discover that you are much smarter and happier than you are now, because the utility-maximizing level of complexity is greater than your own, but not so great that, if everyone were to upload, it would exhaust the AI's resources and make rationing necessary.
2 comments
comment by RichardHughes · 2012-09-19T19:51:34.259Z · LW(p) · GW(p)
I don't see a point or thesis in your statement for me to react to beyond the situation itself. What are you getting at? What argument are you seeking to make?
↑ comment by Mitchell_Porter · 2012-09-19T23:53:20.561Z · LW(p) · GW(p)
It might be an unfinished draft that was accidentally posted.