Holy Ghost in the Cloud (review article about Christian transhumanism)
post by Gordon Seidoh Worley (gworley) · 2017-04-19T18:09:39.943Z · LW · GW · Legacy · 23 comments
This is a link post for https://turingchurch.net/holy-ghost-in-the-cloud-christian-transhumanism-and-simulation-theology-695e83fa1c7b
23 comments
Comments sorted by top scores.
comment by entirelyuseless · 2017-04-21T14:11:03.375Z · LW(p) · GW(p)
I've mentioned before that one of the things that bothers me about virtually all communities is that they consistently wish to say, "Our community is good and yours is bad," and this leads to interpreting as opposed things that should be interpreted as being along the same lines. It seems to me that the irritation expressed in some of the comments here with the idea of associating religion and transhumanism is an expression of this tendency, and I therefore find myself irritated by that irritation.
The post links to an essay by a former Christian who says that she found herself attracted to transhumanism for reasons similar to those for which she was originally attracted to Christianity. Saying that there could not possibly be any similarities of motives or conclusions or consequences or character, because my community must be as distant as possible from yours, does not respect that person's experience, which is just as real even if it is not universal.
↑ comment by Viliam · 2017-04-24T11:33:30.186Z · LW(p) · GW(p)
Yeah, as if different groups of people couldn't independently have an honest desire to improve their living conditions, even if they have wildly different models of the world, and consequently different strategies to achieve that outcome.
Religious people are irrational, but not evil.
comment by Oscar_Cunningham · 2017-04-19T18:55:23.763Z · LW(p) · GW(p)
It always annoys me when people try to evaluate ideas from their social context rather than their content. It may or may not be true that transhumanism is a "secular outgrowth of Christian eschatology" or "essentially an argument for intelligent design", but whether it is or not, you should still be able to evaluate it as a prediction about the future based on our knowledge of today. It's not like AIs, which should work according to the laws of physics, are suddenly going to crumble to dust if they're made by people of the wrong religion.
↑ comment by Han · 2017-04-20T04:48:39.188Z · LW(p) · GW(p)
I think there's a rule-of-thumby reading of this that makes a little bit more sense. It's still prejudiced, though.
A lot of religions have a narrative that ends in true believers being saved from death and pain, and after that people aren't going to struggle over petty issues like scarcity of goods. I run into transhumanists every so often who have bolted these ideas onto their narratives. According to some of these people, the robots are going to try hard to end suffering and poverty, and they're going to make sure most of the humans will live forever. In practice, that goal is dubious from a thermodynamics perspective, and even if it weren't, some of our smarter robots are currently doing high-frequency trading and winning ad revenue for Google employees. That alone has probably increased net human suffering -- and they're not even superintelligent.
I imagine some transhumanism fans must have good reasons to put these things in the narrative, but I think it's extremely worth pointing out that these are ideas humans love aesthetically. If it's true, great for us, but it's a very pretty version of the truth. Even if I'm wrong, I'm skeptical of people who try to make definite assertions about what superintelligences will do, because if we knew what superintelligences would do then we wouldn't need superintelligences. It would really surprise me if it looked just like one of our salvation narratives.
(obligatory nitpick disclaimer: a superintelligence can be surprising in some domains and predictable in others, but I don't think this defeats my point, because for the conditions of these people's narrative to be met, we need the superintelligence to do things we wouldn't have thought of in most of the domains relevant to creating a utopia)
↑ comment by username2 · 2017-04-20T22:07:27.052Z · LW(p) · GW(p)
This argument notably holds true of FAI / control theory efforts. Proponents of FAI assert that heaven-on-Earth utopian futures are not inevitable outcomes, but rather low-probability possibilities they must work towards. It still seems overtly religious and weird to those of us who are not convinced that utopian outcomes are even possible / logically consistent.
↑ comment by Kaj_Sotala · 2017-04-21T17:07:27.193Z · LW(p) · GW(p)
If you're not convinced that utopian outcomes are even possible, isn't that completely compatible with the claim that utopian futures are not inevitable and low-probability?
↑ comment by Viliam · 2017-04-21T10:09:01.168Z · LW(p) · GW(p)
This reminds me of how some people notice that superintelligent AI is just another version of Golem... but the same people fail to notice that the ordinary computers around us are already just another version of Golem.
Which further reminds me of Chesterton writing:
Students of popular science [...] are always insisting that Christianity and Buddhism are very much alike [...] The reasons were of two kinds: resemblances that meant nothing because they were common to all humanity, and resemblances which were not resemblances at all. The author solemnly explained that the two creeds were alike in things in which all creeds are alike, or else he described them as alike in some point in which they are quite obviously different. [...] it was gravely urged that [Christ and Buddha], by a singular coincidence, both had to do with the washing of feet. You might as well say that it was a remarkable coincidence that they both had feet to wash.
↑ comment by Oscar_Cunningham · 2017-04-21T12:20:31.377Z · LW(p) · GW(p)
Is there actually a version of the Golem tale where AI-risk is a theme? I had a look once and I couldn't actually find a version where the Golem fastidiously follows its instructions beyond their intended meaning. Perhaps people are just confusing it with The Sorcerer's Apprentice?
↑ comment by Viliam · 2017-04-21T13:11:59.322Z · LW(p) · GW(p)
Quite possibly; in which case I would also belong to the set of confused people.
↑ comment by Oscar_Cunningham · 2017-04-21T14:56:17.499Z · LW(p) · GW(p)
Until I actually looked into this, so was I. In my case I think it's Terry Pratchett's fault. In Feet of Clay he describes Golems as being prone to continue with tasks forever unless told to stop.
↑ comment by Kaj_Sotala · 2017-04-21T17:10:41.786Z · LW(p) · GW(p)
From the MIRI paper "Intelligence Explosion and Machine Ethics":
Let us call this precise, instruction-following genie a Golem Genie. (A golem is a creature from Jewish folklore that would in some stories do exactly as told [Idel 1990], often with unintended consequences, for example polishing a dish until it is as thin as paper [Pratchett 1996].)
(The "Idel" reference goes to Idel, Moshe. 1990. Golem: Jewish Magical and Mystical Traditions on the Artificial Anthropoid. SUNY Series in Judaica. Albany: State University of New York Press.)
↑ comment by Gordon Seidoh Worley (gworley) · 2017-04-20T18:49:14.345Z · LW(p) · GW(p)
It always annoys me when people try to evaluate ideas from their social context rather than their content.
But are you really evaluating the content of transhumanism from outside your social context? Most transhumanists are humanists, and thus can trace their philosophical lineage back through the Enlightenment, the Protestant Reformation, Catholic monks translating Aramaic texts into Latin, Zoroastrians and Muslims translating Greek texts into Aramaic, and Hellenistic post-Socratic philosophers writing their ideas down in reaction to pre-Socratic ideas (and this is just where the paper trail ends). All of that context has helped shape modern humanism, and through that context humanists have notions of what they consider epistemologically sound and what values they support. These influence how humanists evaluate the content of transhumanism.
At best we might say that because transhumanism was developed by humanists, the humanist interpretation of transhumanism is privileged because it gives perspective on the origins of the ideas, yet that doesn't mean we can't find other contexts in which to make sense of transhumanism. To deny them, or even just be annoyed by them, is to exert pressure against the very process that generated transhumanism in the first place: successive reinterpretation and expansion of ideas that have their origins in pre-Socratic Hellenism.
There is no way to consider transhumanism, or any idea, outside of a context; to do so is to blind oneself to the lens through which one sees the world.
↑ comment by Viliam · 2017-04-21T10:22:10.204Z · LW(p) · GW(p)
and this is just where the paper trail ends
If I remember this correctly, the writing itself -- without which, there could be no paper trail -- was invented by Phoenicians.
Phoenicians also invented money. Peter Thiel has a lot of money, and he supports transhumanism. He also supports Donald Trump.
...just adding more context...
↑ comment by Oscar_Cunningham · 2017-04-20T19:26:55.438Z · LW(p) · GW(p)
My sentence
It always annoys me when people try to evaluate ideas from their social context rather than their content.
contains a grammatical ambiguity; the first "their" could refer to the people or the ideas. I meant it to refer to the ideas. I'm not asking people to stop using their own social norms when they judge ideas. I am saying that the society from which an idea originated is irrelevant to judging the truth of that idea. (At least once you've fully understood what the idea is. Before that you might need to understand its context in order to resolve ambiguities in the description of the idea.)
So I'm not claiming that I'm not biased by my cultural heritage (although of course I aspire to be unbiased), I'm just saying that transhumanism shouldn't be attacked or defended based on its heritage.
comment by Kallandras · 2017-04-20T06:27:02.099Z · LW(p) · GW(p)
Will the machine deity require you to accept Christ as your savior before letting you become a transhuman? No? Then why the hell is that written in the bronze age book that you claim knowingly predicted this outcome?
The classic idea of heaven looks like a post-scarcity, post-death society because that's what we've always imagined would be nice. It's not divine prophecy, just something common to humanity, and we've done a lot of ignoring religious "answers" to get there. I resent that religious people would try to co-opt all this work and at this late date contemplate the idea of a digital entity with a "soul."
↑ comment by Gordon Seidoh Worley (gworley) · 2017-04-20T18:23:10.004Z · LW(p) · GW(p)
I resent that religious people would try to co-opt all this work and at this late date contemplate the idea of a digital entity with a "soul."
At the risk of being rude, this sounds more like your problems than theirs. I'm not sure religious transhumanists are even that late to the party: we happen to be part of a community that got there very early and has been slowly prepping the party so it'll be ready when folks arrive. Maybe religious folks want to dance to different music than we do and you might find that annoying, but is that better than no one showing up to the party at all? And if we don't like it we can always go hang out in a room upstairs for a while without leaving, because the music will eventually change. It always does.
↑ comment by Kallandras · 2017-04-22T22:45:57.298Z · LW(p) · GW(p)
My perspective is that religious folk have not been prepping the party. Scientists have been trying to get some instruments together to make some music, but the religious people keep grabbing guitars, smashing them, and calling it music. Then, when the music finally starts up despite all the smashed instruments, religious folks say "oh hey, that's what we were trying to do, you're welcome everybody."
As soon as something conveniently fits the religious narrative (appropriately tortured beyond its original construction), it gets incorporated. I find that frustrating, as it should instead shatter the narrative and reveal it for the useless pile of dogma that it is.
↑ comment by eternal_neophyte · 2017-04-24T13:09:22.171Z · LW(p) · GW(p)
Most scientists are not extropian in any sense - so if they have been "prepping the party" it was not deliberate. Are you considering scientists and religious folk as disjoint sets?
↑ comment by Oscar_Cunningham · 2017-04-20T19:29:22.030Z · LW(p) · GW(p)
This is a good point. It's hardly surprising that the utopia we fantasise about is the same as the one we try to create.
↑ comment by g_pepper · 2017-04-20T13:21:08.197Z · LW(p) · GW(p)
Then why the hell is that written in the bronze age book that you claim knowingly predicted this outcome?
The New Testament is not really a bronze age book. Wikipedia states that the Bronze Age ended in the Near East region around 1200 BC.
comment by fortyeridania · 2017-04-20T04:59:50.257Z · LW(p) · GW(p)
For comparison, here are Robin Hanson's thoughts on some Mormon transhumanists: http://www.overcomingbias.com/2017/04/mormon-transhumanists.html