[SEQ RERUN] Causality and Moral Responsibility

post by MinibearRex · 2012-06-01T05:29:23.750Z · LW · GW · Legacy · 2 comments

Today's post, Causality and Moral Responsibility, was originally published on 13 June 2008. A summary (taken from the LW wiki):


Knowing that you are a deterministic system does not make you any less responsible for the consequences of your actions. You still make your decisions; you do have psychological traits, and experiences, and goals. Determinism doesn't change any of that.

Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Against Devil's Advocacy, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.

2 comments

Comments sorted by top scores.

comment by ComoKate · 2012-06-05T04:18:56.153Z · LW(p) · GW(p)

I just discovered this site, and the "Singularity", this evening. Something I've contemplated for quite a few years is the idea that we have quite enough intelligence and technology in the world already; our collective problem is that we lack the wisdom, ethics and morality to make the best use of it. We already have the means to feed and clothe every person in this world. We have the means to control population growth, provide adequate housing for all, etc. The fact that we do not do these things is not because of a lack of intelligence but rather a lack of humanity. Greed, jealousy, and all the rest of the "seven deadly sins" continue to plague us. Without an end to them, it will surely, eventually, be the end of us. The goal of simply surviving isn't enough. Wasps and crocodiles have survived for millions of years... how would AI be any different if the goal is simply greater intelligence? What good is greater intelligence if we ignore improving the less "rational" components of being human? I can't cite the studies offhand, but I have read many articles claiming there appears to be an inverse relationship between intelligence and empathy for others. From a purely rational standpoint, why would any AI entity view mankind in its current condition as anything other than a hindrance? Be careful what you wish for.

comment by shminux · 2012-06-01T06:12:39.672Z · LW(p) · GW(p)

I'm somewhat confused about EY's point. I've always thought that, ironically, we have no choice but to act as if we have free will. This post seems to emphasize the same point, but how does it address what was promised: "why do we think we have free will?" Presumably, the idea is that determinism feels like free will "from the inside". Maybe I missed a relevant post.