Second Life creators to attempt to create AI
post by nick012000 · 2011-01-09T13:50:59.337Z
http://nwn.blogs.com/nwn/2010/02/philip-rosedale-ai.html
http://www.lovemachineinc.com/
Should I feel bad for hoping they'll fail? I do not want to see the sort of unFriendly AI that would be created after being raised on social interactions with pedophiles, Goreans, and furries. Seriously, those are some of the more prominent groups still on Second Life, and an AI that spends its formative period interacting with them (especially the first two) could develop a very twisted morality.
13 comments
Comments sorted by top scores.
comment by Vladimir_Nesov · 2011-01-09T17:16:46.686Z
I do not want to see the sort of unFriendly AI that would be created after being raised on social interactions with pedophiles, Goreans, and furries.
Bad parenting is not even on the list of reasons you don't get a FAI.
↑ comment by nick012000 · 2011-01-09T17:21:21.386Z
Oh, they'd almost certainly get an unFriendly AI regardless of how they parented it, but bad parenting could very easily make an unFriendly AI worse. Especially if it interacts a lot with the Goreans, and comes to the conclusion that women want to be enslaved, or something similar.
↑ comment by benelliott · 2011-01-09T18:32:13.823Z
That probably won't make much of a difference, since there's no reason it should care what anyone wants.
↑ comment by katydee · 2011-01-10T17:56:49.615Z
If an AI imposed Goreanism on mankind, that would constitute a Friendly AI Critical Failure (specifically, failure 13), not a UFAI.
comment by Oscar_Cunningham · 2011-01-09T18:40:52.622Z
If it were to FOOM, any social norms it absorbed at all would probably make it better, not worse. In a kind of "∞ minus 1" way.
↑ comment by Normal_Anomaly · 2011-01-09T19:11:41.574Z
Good point, but I'm not entirely sure. Being turned into a Gorean could be worse than being turned into paperclips.
↑ comment by khafra · 2011-01-09T21:45:44.816Z
"Being turned into a Gorean" is within the range of enough human desires to make it a significant subculture, although it's repugnant to a much larger section of the culture. I have never heard of anyone with a fantasy of being turned into paperclips. So which is better seems to depend on how you sum utility over all the involved actors.
comment by Normal_Anomaly · 2011-01-09T18:59:06.229Z
Should I feel bad for hoping they'll fail?
Not at all. I certainly hope so, and this (from their site) makes it sound very likely that they will:
The Brain. Can 10,000 computers become a person?
comment by ThomasR · 2011-01-09T14:18:20.467Z
A (real) AI should find the universe pretty boring, too boring to be (un)friendly. A talk touching on that was given at the IAS by Wilczek: "Quantum theory radically transforms our fundamental understanding of physical reality. It reveals that the world contains a hidden richness of structure that we have barely begun to control and exploit. In this lecture, Frank Wilczek indicates the extraordinary potential of quantum engineering (the size and nature of Hilbert space); reviews one important ongoing effort to harness it (topological quantum computing); and speculates on its ultimate prospects (quantum minds).": http://video.ias.edu/wilczek I just watched it and found his remark, that supersmart aliens may simply have lost interest in the universe, a nice confirmation of my idea, which arrived at the same conclusion by another route: http://ideafoundlings.blogspot.com/2009/09/seti.html