[SEQ RERUN] Fighting a Rearguard Action Against the Truth

post by MinibearRex · 2012-09-06T04:45:11.925Z · LW · GW · Legacy · 7 comments

Today's post, Fighting a Rearguard Action Against the Truth, was originally published on 24 September 2008. A summary (taken from the LW wiki):

When Eliezer started to consider the possibility of Friendly AI as a contingency plan, he permitted himself a line of retreat. He was then able to begin slowly reconsidering positions in his metaethics and to move gradually towards better ideas.


Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was That Tiny Note of Discord, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.

7 comments

Comments sorted by top scores.

comment by Oscar_Cunningham · 2012-09-06T11:53:19.281Z · LW(p) · GW(p)

This post says "Comments (2)" and yet no comments are showing up. Why? EDIT: Now "Comments (3)" but I can only see my own comment.

Replies from: Vladimir_Nesov, MinibearRex
comment by Vladimir_Nesov · 2012-09-07T08:19:00.744Z · LW(p) · GW(p)

There are two banned nonsense comments which are still counted. See this bug report.

comment by MinibearRex · 2012-09-07T03:52:51.505Z · LW(p) · GW(p)

For me, it says "Comments (5)", but I see 3. This will (theoretically) be the sixth/fourth.

Replies from: Oscar_Cunningham
comment by Oscar_Cunningham · 2012-09-07T08:10:07.842Z · LW(p) · GW(p)

Yeah, so the number of comments visible is two less than the number claimed. Where are the missing comments?

comment by Oscar_Cunningham · 2012-09-07T08:10:17.049Z · LW(p) · GW(p)

Testing.

comment by [deleted] · 2012-09-06T15:53:40.045Z · LW(p) · GW(p)

Eliezer2000 is starting to think inside the black box. His reasons for pursuing this course of action—those don't matter at all. link

When we last left Eliezer2000, he was just beginning to investigate the question of how to inscribe a morality into an AI. His reasons for doing this don't matter at all, except insofar as they happen to historically demonstrate the importance of perfectionism. link

That's two instances of Eliezer placing no moral value "at all" on his own motives in his pursuit of AI morals. Not necessarily a contradiction, but less elegant than it might be.

Replies from: KPier
comment by KPier · 2012-09-07T01:48:53.265Z · LW(p) · GW(p)

I don't think he's saying that motives are morally irrelevant; I think he's saying that they are irrelevant to the point he is trying to make with that blog post.