'Complex Value Systems are Required to Realize Valuable Futures' (Yudkowsky, 2011)

post by lukeprog · 2011-08-08T11:32:35.903Z · LW · GW · Legacy · 9 comments


Most of the papers from the AGI-11 conference are now available online, including Yudkowsky's new paper: 'Complex Value Systems are Required to Realize Valuable Futures.'

Enjoy.

9 comments


comment by Raemon · 2011-08-09T03:42:15.345Z · LW(p) · GW(p)

Mostly stuff we've heard before, although I'm sure it's useful to see it all in one place.

comment by nazgulnarsil · 2011-08-10T10:49:39.487Z · LW(p) · GW(p)

Eliezer presented this at the Winter Intelligence Conference.

http://www.vimeo.com/26914859

Replies from: timtyler
comment by timtyler · 2011-08-13T08:33:43.594Z · LW(p) · GW(p)

My response from last month is here.

Replies from: lessdazed, nazgulnarsil
comment by lessdazed · 2011-08-13T23:53:23.749Z · LW(p) · GW(p)

it doesn't mean that efficient optimisers don't exhibit boredom. They do exhibit boredom - when exploring mostly-unexplored search spaces.

Isn't that like saying efficient optimizers experience hunger if/when they need to eat to live?

Also, consider rewriting the last paragraph. I think I don't understand it.

Replies from: timtyler
comment by timtyler · 2011-08-14T07:03:11.432Z · LW(p) · GW(p)

The word "exhibit" was intended as a reference to the associated behavioural manifestations - to things you can observe - such as stopping repetitious behaviour. Feel free to substitute "boredom-behaviours", for "boredom" if you find the original wording difficult to digest.

Replies from: lessdazed
comment by lessdazed · 2011-08-14T14:09:34.158Z · LW(p) · GW(p)

I'm surprised you linked to your old comment. For various reasons it looks like personal notes for writing a good post, not like something to inform or persuade. Consider redoing it.

Also, I suspect that positions generally resembling the one I think you're taking involve an equivocation or similar imprecision: for example, between boredom as an experience and boredom as behavior, or over whether boredom being partly non-instrumental (by your admission) is enough to save Yudkowsky from the problems you raise about boredom's instrumentality.

Such positions face a high standard of clarity in argument, because otherwise it's difficult for me to keep track.

Replies from: timtyler
comment by timtyler · 2011-08-14T14:34:10.771Z · LW(p) · GW(p)

Also, I suspect that positions generally resembling the one I think you're taking involve an equivocation or similar imprecision: for example, between boredom as an experience and boredom as behavior [...]

The issue is whether creatures with purely instrumental boredom will create a "boring, worthless, valueless future".

Whether they have subjective boredom or just behavioural boredom seems like an irrelevant side-issue to me.

I have purely-instrumental boredom (to the best of my ability, of course).

Suggesting that purely-instrumental boredom will create a "boring, worthless, valueless future" is just a baseless criticism.

Yudkowsky's idea that such creatures will explore first and then get on with tedious exploiting makes little sense. For example, "exploring" the task of expanding through the galaxy (which is a universal instrumental value) is inevitably going to take a very long time - due to the scale of the experiments required.

comment by nazgulnarsil · 2011-08-13T22:42:01.340Z · LW(p) · GW(p)

Strange that this was downvoted with no explanation.

comment by [deleted] · 2011-08-15T17:38:01.588Z · LW(p) · GW(p)

Comment on Accelerating Future (This is a claim I haven't encountered before):

Is anybody seriously arguing at this point that simple (trivial) goal systems will suffice for an AGI to work the way we want it to? Yet this is the straw man that EY keeps attacking. Even Hibbard had complex goals in mind when he meant to keep humans "happy", although he did not communicate this well.