FAI-relevant XKCD

post by Dr_Manhattan · 2011-11-22T13:28:44.070Z · score: -4 (13 votes) · LW · GW · Legacy · 9 comments

http://xkcd.com/962/

9 comments

Comments sorted by top scores.

comment by MichaelHoward · 2011-11-22T14:51:26.105Z · score: 4 (4 votes) · LW · GW

If you link to the page instead of the image, it'll save everyone from googling it to see the mouseover text.

Or, put the mouseover text in quotes after the http bit of the link like I did above.

comment by Dr_Manhattan · 2011-11-22T14:59:48.285Z · score: 0 (0 votes) · LW · GW

Thanks - I did not have the original link.

comment by endoself · 2011-11-22T22:42:36.059Z · score: 0 (0 votes) · LW · GW

The xkcd website has a search function (scroll down just past the comic).

comment by JoshuaZ · 2011-11-22T17:05:50.912Z · score: 2 (4 votes) · LW · GW

Sorry, maybe I'm dense. How is this FAI relevant?

comment by Dr_Manhattan · 2011-11-22T17:15:50.167Z · score: 0 (0 votes) · LW · GW

"It takes longer to develop value preserving AI technologies than to develop stuff that's cool but dangerous ("more fun than survival")"

comment by dlthomas · 2011-11-22T17:12:47.234Z · score: 0 (0 votes) · LW · GW

Via the notion of a Great Filter, and via existential risk more generally. It's a bit of a stretch, for sure, but the link is there.

comment by JoshuaZ · 2011-11-22T17:16:18.883Z · score: 2 (4 votes) · LW · GW

The Great Filter aspect is explicit. But that seems extremely tenuous. Rationalists should worry about the Great Filter whether or not it has anything to do with FAI.

comment by dlthomas · 2011-11-22T17:30:27.932Z · score: 0 (0 votes) · LW · GW

Agreed. I was just saying that there is a link, and it's even reasonably salient within the context of this site. I make no claim that it is the most appropriate link to draw - and, indeed, would have recommended a different title.

comment by MichaelHoward · 2011-11-23T00:15:28.274Z · score: 1 (1 votes) · LW · GW

I think today's xkcd is more relevant. Does anyone have good figures on what's spent on things like cryonics research, rational attempts to improve humanity's rationality, or existential risk reduction (efforts aimed at saving the future of the next several billion years and 100 million galaxies), to compare with the spending on this chart?