Well-done documentary on the singularity: 'Transcendent Man'
post by lukeprog · 2011-03-04T23:07:28.670Z · LW · GW · Legacy · 35 comments
I just watched Transcendent Man about the singularity and Ray Kurzweil in particular. It's well-made, full-length, and includes the most popular criticisms of Kurzweil: that his prediction timeframes are driven by his own hope for immortality, that the timescales of his other predictions are too optimistic, that his predictions about the social outcomes of revolutionary technology are naively optimistic, and so on. Ben Goertzel and others get plenty of face time.
You can rent or buy it on iTunes.
35 comments
Comments sorted by top scores.
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2011-03-05T00:49:25.301Z · LW(p) · GW(p)
that his prediction timeframes are driven by his own hope for immortality
This is not an important criticism; it is ad hominem in its purest form.
Replies from: Dorikka, katydee, Pavitra, wedrifid, JoshuaZ, lukeprog, Normal_Anomaly, MartinB
↑ comment by Dorikka · 2011-03-05T04:11:42.530Z · LW(p) · GW(p)
Overall, specific errors in reasoning should generally be highlighted instead of arguing that the other person is biased. One reason is that such an accusation is an ad hominem attack -- I think that such indirect methods of analyzing the rationality of an argument have an alarming potential to provoke mind-killing.
The more obvious and important reason is that citing a logical error/fallacy/bad interpretation of data is much more reliable than trying to read emotional cues to judge whether someone is biased; this is especially true considering how little insight we have into each other's minds.
↑ comment by wedrifid · 2011-03-06T08:26:14.654Z · LW(p) · GW(p)
This is not an important criticism; it is ad hominem in its purest form.
Prediction: Given the right incentive and five minutes to think, Eliezer would be able to give an example of a criticism that is a purer form of fallacious ad hominem. I am only slightly less confident that a randomly selected 15-year-old student could do the same, allowing the 'five minutes' to include an explanation of what ad hominem means if necessary.
↑ comment by JoshuaZ · 2011-03-05T04:48:11.274Z · LW(p) · GW(p)
This is not an important criticism; it is ad hominem in its purest form.
In certain contexts, where someone is relying on another person's expertise and lacks the resources to evaluate the details of a claim, relying on experts makes sense. If a given potential expert has a reason to be biased, that's a reason to rely on that expert less.
↑ comment by Normal_Anomaly · 2011-03-05T01:31:04.796Z · LW(p) · GW(p)
Perhaps a better criticism is that his prediction timeframes are the opposite of conservative estimates.
↑ comment by MartinB · 2011-03-07T16:43:04.583Z · LW(p) · GW(p)
How so?
You mean when criticizing his timeframes one should actually point out real flaws instead of just pointing out how they nicely align with his life expectancy?
At first glance I totally fail to see the ad hominem; maybe a second glance will help.
comment by timtyler · 2011-03-05T12:03:12.518Z · LW(p) · GW(p)
That his prediction timeframes are driven by his own hope for immortality
Probably Kevin Kelly's point.
comment by XiXiDu · 2011-03-05T10:33:46.915Z · LW(p) · GW(p)
...the most popular criticisms of Kurzweil: that his prediction timeframes are driven by his own hope for immortality...
Interestingly, one could argue the same about Yudkowsky, and even more so. While Kurzweil is rich enough to sustain himself, Yudkowsky lives off the money of people who have to believe in his predictions and in him as the one who deserves the money.
Replies from: orthonormal, childofbaud, SimonF
↑ comment by orthonormal · 2011-03-09T07:30:16.336Z · LW(p) · GW(p)
This has been pointed out a couple times, but Eliezer and the other leaders of SIAI don't have nearly as high a standard of living as they could easily have working for a big tech company, which doesn't require any extraordinary level of skill. Given that established programmers have been impressed with EY's general intelligence and intuition, I find it highly likely that he could have gone this route if he'd wanted.
Now, you could allege that he instead became the poorly paid head of an x-risk charity in order to feel more self-important. But suggesting the motive of greed is nonsensical.
(edited to delete snark)
Replies from: XiXiDu
↑ comment by XiXiDu · 2011-03-09T09:48:16.850Z · LW(p) · GW(p)
- I have no problem with how much he earns.
- If he needs to spend contributed money on his personal fun, that is completely justifiable.
- If he needed a big car to impress donors, that would be justifiable.
- I have called him the most intelligent person I know a few times; it is all on record.
- I have said a few times that I'm less worried about dying permanently because there is someone like Yudkowsky who contains all my values and much more.
That you people constantly try to accuse me of base motives only reinforces my perception that there is not enough doubt and criticism here. All I'm trying to argue is that if you people take low-probability, high-risk possibilities seriously, then I'm surprised nobody ever talks about the possibility that Yudkowsky or the SIAI might be a risk themselves, and that one could take simple measures to reduce this possibility. Given your set of beliefs, those people are going to code and implement the goal system of a fooming AI, yet everyone only talks about the friendliness of the AI and not the humans who are paid to create it with your money.
I'm not the person you should worry about, although I have no particular problem with musing about the possibility that I work for some Cthulhu institute. That doesn't change much about what I am arguing, though.
Replies from: orthonormal
↑ comment by orthonormal · 2011-03-24T14:25:39.327Z · LW(p) · GW(p)
By the way, I was out of line with my last sentence in the grandparent. Sorry about that.
↑ comment by childofbaud · 2011-03-07T10:52:38.429Z · LW(p) · GW(p)
Kurzweil's money does not materialize out of thin air.
I don't know if he is rich enough to sustain himself, but he is certainly not giving away his futurism books, his longevity supplements, or his lecture talks for free. The people paying for Kurzweil's products and services also have to believe in his statements and predictions, and that Kurzweil is the one who deserves their money.
If I were to single out one of these two parties for having a greater financial incentive in the perpetuation of their ideas, my money would be on the businessman/entrepreneur, not on the research fellow working for a charity.
Replies from: XiXiDu
↑ comment by XiXiDu · 2011-03-07T12:39:52.558Z · LW(p) · GW(p)
Kurzweil's money does not materialize out of thin air.
Well, no. He actually accomplished a lot early on:
- When he was 20, he sold his company to Harcourt, Brace & World for $100,000 (roughly $500,000 in 2006 dollars) plus royalties.
- In 1974, Kurzweil started the company Kurzweil Computer Products, Inc. and led development of the first omni-font optical character recognition system.
- The Kurzweil Reading Machine.
- ...
Yudkowsky read a lot of books, rewrote existing knowledge, and put it into the context of an already existing idea. What else did he accomplish? There is no evidence that he could get a lot of money differently. That's why I said that Kurzweil might be biased towards the Singularity because he doesn't want to die, but for Yudkowsky his income depends on the credibility of the whole idea.
Replies from: wedrifid, childofbaud
↑ comment by wedrifid · 2011-03-07T14:57:20.147Z · LW(p) · GW(p)
That is a good point regarding relative biases. Mind you...
There is no evidence that he could get a lot of money differently.
... That is blatantly false. There is overwhelming evidence that Eliezer could make a lot of money in other ways. Getting to the stage where a lot of people independently decide to say your achievements are insignificant or low status in all sorts of ways is actually an impressive achievement in itself.
The combination of strategic competence, the ability to get a lot of people excited about your work, and the ability to persuade others to give you money is a recipe for success just about anywhere.
(I have to down-vote the misuse of 'there is no evidence'. If you took that section out I could upvote.)
Replies from: XiXiDu, XiXiDu
↑ comment by XiXiDu · 2011-03-07T16:59:32.745Z · LW(p) · GW(p)
... That is blatantly false. There is overwhelming evidence that Eliezer could make a lot of money in other ways.
You vastly underestimate what it takes to make a lot of money. That he was lucky to get everything right in this one case doesn't mean he would be able to do it in any other case.
The combination of strategic competence, the ability to make a lot of people to get excited about your work and the ability to persuade others to give you money are a recipe for success just about anywhere.
I don't see that. Lady Gaga wouldn't have been able to make a lot of money if she couldn't sing and looked really ugly. The success of Kurzweil is due to a broad spectrum of skills that Yudkowsky or Lady Gaga lack.
Replies from: wedrifid↑ comment by wedrifid · 2011-03-08T04:48:10.263Z · LW(p) · GW(p)
You vastly underestimate what it takes to make a lot of money. That he was lucky to get everything right in this one case doesn't mean he would be able to to do it in any other case.
Getting everything right in the actual observed case certainly constitutes evidence of capability.
Making *the amount of money you are talking about here* isn't hard. It is hard to make a 'lot' of money only when 'lot' corresponds to a whole heap more than this kind of small change. Intelligence, determination, hard work and persuasiveness are more than enough to make the kind of figures under consideration without having to enter high-risk/high-reward avenues.
I am not persuaded by your 'Lady Gaga' reference class tennis. Partly because Kurzweil fits it a whole lot better than Yudkowsky. Mostly because available evidence indicates that Lady Gaga would actually be able to achieve the moderately high level of success that we are talking about if she had selected a different economically viable social arena.
↑ comment by XiXiDu · 2011-03-07T17:42:53.638Z · LW(p) · GW(p)
(I have to down-vote the misuse of 'there is no evidence'. If you took that section out I could upvote.)
Upvoted for that. There are just too many lame ducks here who can merely vote but rarely explain themselves. Karma is the true mind killer here and I recently decided to stop caring and just write what I actually think, not the political correct version that cares about Karma score.
I ask everyone to vote this comment down to -10000 to support this ambition. Thank you for your time.
Replies from: Alicorn, Costanza
↑ comment by Alicorn · 2011-03-07T17:52:20.608Z · LW(p) · GW(p)
I don't like this comment, and now I'm conflicted about how to deal with that.
Replies from: byrnema, XiXiDu
↑ comment by byrnema · 2011-03-07T18:43:23.662Z · LW(p) · GW(p)
Upvoted for pointing out the interesting conundrum.
Taking your comment at face value... XiXiDu is saying he doesn't want feedback through karma at all, but through comments.
I don't agree with it either and so I'm not downvoting his comment. I don't agree because his karma is a signal that, to some extent, means something. A person or people manipulating their karma dilutes this signal even further.
I also would learn more if people explained upvotes and downvotes, or even just reactions to comments that didn't result in voting. It would be interesting if that was a norm for a while -- say two weeks -- to see if that would really be useful or too noisy and tiresome.
Replies from: XiXiDu
↑ comment by XiXiDu · 2011-03-07T19:03:08.872Z · LW(p) · GW(p)
Taking your comment at face value... XiXiDu is saying he doesn't want feedback through karma at all, but through comments.
That is only part of what I have been saying. I want people to be honest and free. Karma creates strong positive and negative incentives to agree with the currently popular opinion or to stay quiet.
↑ comment by XiXiDu · 2011-03-07T18:48:28.970Z · LW(p) · GW(p)
I don't like this comment, and now I'm conflicted about how to deal with that.
Arguments? Tell me why Karma isn't a mind killer. Your comment makes me conclude that you would usually have just downvoted my comment to signal that you don't agree, but since I specifically asked for as many downvotes as possible, that option is gone.
Karma just doesn't work in a lot of cases. People are not superhuman AGIs who can infer what is wrong with their submission from a mere vote. Voting will just cause them to shut up about what they really think. In a few cases someone might conclude that the more intelligent and rational collective intelligence of LW decided that their submission is wrong and subsequently think about what is wrong. But in the long run, and for everyone who doesn't think like that, Karma will just create an echo chamber or at the very least a place where being political correct is much more important than being honest.
Replies from: Alicorn, timtyler
↑ comment by Alicorn · 2011-03-07T19:05:36.906Z · LW(p) · GW(p)
Well, first of all, it's "politically correct", not "political correct" and "karma" isn't a proper noun. Second, even if you were typing the correct phrase, that's not what it means; political correctness has a meaning that doesn't involve any reference to toeing the Less Wrong party line.
Ignoring the surface characteristics of the comment, karma doesn't control ideology. (If it did, I couldn't be a loud deontologist and come in third on the entire blog, as one salient example. In fact, most of my comments professing deontology have been upvoted.) Well-presented and polite nontrollish comments arguing for nearly any position can score positively. Karma is a control mechanism for topicality, tone, tidiness, visible good faith, etc. People sometimes vote for agreement, but there are enough people here that it usually comes out in the wash, and the phenomenon is dramatically less likely to bury a comment that is written kindly than one that's aggressive or snide.
...And since the benefits of karma have already received attention ad nauseam all over the site, I didn't want to spend those minutes on this redundant comment, but you rendered valueless my other mechanism of disapproval.
Replies from: XiXiDu
↑ comment by XiXiDu · 2011-03-07T19:49:20.612Z · LW(p) · GW(p)
Thanks for the English lesson.
And since the benefits of karma have already received attention ad nauseam all over the site...
Reputation systems work, no doubt. The effects might be desirable, or feel desirable. But I don't see that being the case for communities where honesty, skepticism, and general openness should play a large role. There is a problem in judging the effectiveness of reputation systems from the inside, especially if you are one of the trend-setters. You might perceive everyone to be on topic and polite, and their submissions to be well thought out. The reason for that perception might be the reputation system, but that doesn't mean it also caused people to be honest and less wrong.
If it did, I couldn't be a loud deontologist and come in third on the entire blog, as one salient example. In fact, most of my comments professing deontology have been upvoted.
First of all, you are a little star here due to your fiction. People follow your submissions directly, mostly people who already agree with you or like your fiction. A lot of comments will receive upvotes from special-interest groups, while many who would disagree won't even read them or don't care to downvote them. What you said is no evidence of the success of the karma system.
What I am trying to say is that reputation systems can alter the territory to fit the map. Reputation systems are augmented reality layers for beliefs.
Replies from: Alicorn
↑ comment by Alicorn · 2011-03-07T19:54:53.316Z · LW(p) · GW(p)
First of all, you are a little star here due to your fiction. People follow your submissions directly, mostly people who already agree with you or like your fiction.
Um, what? Setting aside the cutesy pseudocompliment "little star", I started Luminosity last summer. I was already third on the top contributors list before that, if I recall correctly, and the fiction I was doing before Luminosity was and remains obscure and unpopular, received and continues to receive approximately no attention on LW, etc.
↑ comment by timtyler · 2011-03-07T19:35:53.772Z · LW(p) · GW(p)
Tell me why Karma isn't a mind killer.
I have a video on the virtues of reputation systems: Universal karma.
↑ comment by Costanza · 2011-03-07T18:38:37.853Z · LW(p) · GW(p)
Not sure where you're going with this, but downvoted in accordance with your request. I'm thinking there aren't enough registered and active participants in this forum to get all the way to -10,000, and if there were, most of them won't see this comment now that it's below the default threshold. You may be stuck with a relatively high karma score for a while yet.
↑ comment by childofbaud · 2011-03-07T17:33:43.661Z · LW(p) · GW(p)
There is no evidence that he could get a lot of money differently.
I'm not all that familiar with Yudkowsky's accomplishments, but let's see... He can read, he can write, he can compute. And from what I can tell, he can do all of these things rather well. They may seem like basic skills, but it's no coincidence that they make up the three constituent parts of most modern academic standardized tests (GRE, SAT, ACT, etc.). And very few people bother to actually master those skills, or to keep them sharp.
He can bring people together and shape communities (e.g. sl4.org, SIAI, lesswrong). He can do original research. He can synthesize information. He has highly developed skills of elocution and is very good at methodically picking apart people's flawed arguments, while defending his own comparatively sound ones (look him up on bloggingheads.tv). He can popularize esoteric and implausible-sounding ideas.
This is likely not an exhaustive list, but it wouldn't be out of the question to monetize even a lesser subset of these skills, if he was so inclined. And if he was really desperate, he could peruse the Optimal Employment thread for inspiration.
Replies from: XiXiDu
↑ comment by XiXiDu · 2011-03-07T17:51:54.268Z · LW(p) · GW(p)
As to evidence that he could be making a lot of money, I'm not all that familiar with Yudkowsky's accomplishments, but let's see... He can read, he can write, he can compute.
Many people can do that; making a lot of money is mostly luck. Do you think IQ correlates strongly with making a lot of money? Maybe Mark Zuckerberg should be solving friendly AI then? Yudkowsky had this luck by finding the right mixture to get a lot of non-conformists to act like a cult and donate a lot of money. I'm not saying this is a cult or that the idea of risks from AI is wrong. But I am saying that the argument that the credibility of the idea is in and of itself important to a person is stronger in the case of Yudkowsky than in the case of Kurzweil. Kurzweil wants to live forever but has a broad spectrum of skills to make money. I don't see that being the case for Yudkowsky. He is not particularly charismatic or a good entrepreneur as far as I know.
Replies from: childofbaud
↑ comment by childofbaud · 2011-03-07T18:04:46.562Z · LW(p) · GW(p)
Many people can do that, making a lot of money is mostly luck.
"Luck favors the prepared mind." -Louis Pasteur
Perhaps many people can do that, but for some reason many people don't. There aren't that many students with SAT scores competitive enough to get into competitive colleges, for example. And those that do have the scores (or other factors in their favour) often end up reasonably well off in the finance department compared to their peers, statistically speaking.
And yes, there is some luck (aka factors we don't yet understand or aren't able to track) involved in attaining financial stability. But anyone who really wants to be a millionaire no matter what the cost, is reasonably intelligent (or has other factors in their favour), and doesn't die too young has a decent chance of ending up reasonably well off (not everyone may be able to pull off financial independence by 25-30, however, and not everyone may make it to billionaire status).
I think a union of IQ and many other factors (e.g. materialism, tenacity, long-term planning, ambition, competitive drive, risk-taking behavior, opportunism, lack of scruples, etc.) correlates strongly with making a lot of money. Too high an IQ, however, may negate some of those additional factors.
And by the way, I did not limit Yudkowsky's "money-making potential" to just those three factors you quoted, but they're an excellent foundation. Mark Zuckerberg and Raymond Kurzweil both had them.
↑ comment by Simon Fischer (SimonF) · 2011-03-06T19:24:44.349Z · LW(p) · GW(p)
I find your phrasing to be dishonest, especially because you do provide arguments.
Replies from: Pavitra