Jinnetic Engineering, by Richard Stallman
post by MBlume · 2010-04-28T01:24:28.927Z · LW · GW · Legacy · 10 comments
Thought the community might enjoy this:
10 comments
Comments sorted by top scores.
comment by Vladimir_Nesov · 2010-04-28T03:41:24.878Z · LW(p) · GW(p)
A top-level post should include at least a description or representative quote.
Replies from: None
↑ comment by [deleted] · 2010-05-02T18:29:06.220Z · LW(p) · GW(p)
I agree, but I upvoted it anyway because I thought it was interesting and funny.
I read it as a commentary on how, when we daydream about "breaking the rules" (or discovering a fundamental rule that changes the way we live), all the myths have trained us to think selfishly. She wants to use her three wishes to end disease for everyone, and it's like she asked to accept an Academy Award in a clown suit.
EDIT: grammar
comment by mattnewport · 2010-04-28T01:59:02.217Z · LW(p) · GW(p)
I like this link, but I'm afraid I don't feel it is worth 10 points, so I have upvoted a random comment instead.
Replies from: RobinZ
comment by humpolec · 2010-04-28T10:02:51.279Z · LW(p) · GW(p)
What, no bad ending?
Replies from: RolfAndreassen, None
↑ comment by RolfAndreassen · 2010-04-28T18:30:52.191Z · LW(p) · GW(p)
Stallman apparently believes that all intelligent people are also ethical and altruistic. We can fill in the bad ending for ourselves. Although, to be fair about it, since practically everyone would catch the superintelligence virus, things would probably even out pretty shortly. Let's just hope nobody with suicidal depression and a mean desire to make others feel as bad as they do has their intelligence increased to the point where they can build a depression-vibe broadcaster, or some other destroy-the-Earth machinery.
↑ comment by [deleted] · 2010-05-01T18:45:24.767Z · LW(p) · GW(p)
"First, I want to become much smarter. I want to be far better at solving any kind of puzzle or problem than anyone who has ever lived. Of course, I should not lose any of my other mental abilities (or physical abilities) when I gain this one."
To be perfectly honest, being far smarter than anyone who has ever lived may not be that smart. I mean, what's the competition? Francis Galton, Isaac Newton, perhaps von Neumann? They didn't build superweapons on their own.
I suppose, and this will be a controversial statement, that the ability to manipulate people due to high intelligence should be our real concern. Analyzing and replicating the speaking abilities and charisma of Hitler, or the transferable skills and tactics of Napoleon/Julius Caesar/Alexander, doesn't seem that hard and is a more realistic threat.