Posts

A question about Eliezer 2012-04-19T17:27:38.572Z

Comments

Comment by perpetualpeace1 on A question about Eliezer · 2012-04-19T19:55:10.773Z · LW · GW

For example, does it allow you to deal with your own problems better?

I don't agree with this at all. I could become a Christian, and then believe that all of my problems are gone because I have an eternity in heaven waiting for me simply because I accepted Jesus Christ as my savior. Christianity makes few falsifiable predictions. I want to hold EY up to a higher standard.

Comment by perpetualpeace1 on A question about Eliezer · 2012-04-19T19:51:42.364Z · LW · GW

OK, I don't want to get off-topic. EY doesn't practice the Dark Arts (at least, I hope not).

A lot of what EY writes makes sense to me. And I'd like to believe that we'll be sipping champagne on the other side of the galaxy when the last star in the Milky Way burns out (and note that I'm not saying that he's predicting that will happen). But I'm not a physicist or AI researcher - I want some way to know how much to trust what he writes. Is anything that he's said or done falsifiable? Has he ever publicly made his beliefs pay rent? I want to believe in a friendly AI future... but I'm not going to believe for the sake of believing.

Comment by perpetualpeace1 on A question about Eliezer · 2012-04-19T19:09:56.588Z · LW · GW

I would guess EY sees himself as more of a researcher than a forecaster, so you shouldn't be surprised if he doesn't make as many predictions as Paul Krugman.

OK. If that is the case, then I think a fair question to ask is: what have his major research achievements been?

But secondly, a lot of the discussion on LW and most of EY's research presupposes certain things happening in the future. If AI is actually impossible, then trying to design a friendly AI is a waste of time (or, alternately, if AI won't be developed for 10,000 years, then developing a friendly AI is not an urgent matter). What evidence can EY offer that he's not wasting his time, to put it bluntly?

Comment by perpetualpeace1 on A question about Eliezer · 2012-04-19T19:02:25.198Z · LW · GW

Imagine a game that you can either win or lose. One person follows, as far as he or she is able, the tenets of timeless decision theory. Another person makes each decision by flipping a coin. The coin-flipper outperforms the TDTer.
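A test like this could in principle be run as a simulation. Below is a minimal sketch in Python, assuming (my choice, not anything from the thread) that the game is Newcomb's problem with an imperfect predictor: a "TDT-like" agent that always one-boxes is compared against a coin-flipper over many trials. The payoff numbers and the 90% predictor accuracy are illustrative assumptions.

```python
import random

def newcomb_payoff(one_box, predictor_accuracy, rng):
    # Toy Newcomb setup (illustrative numbers): the predictor guesses the
    # agent's choice with the given accuracy, and fills the opaque box with
    # $1,000,000 only if it predicted one-boxing. The transparent box
    # always holds $1,000.
    predicted_one_box = one_box if rng.random() < predictor_accuracy else not one_box
    big_box = 1_000_000 if predicted_one_box else 0
    small_box = 1_000
    return big_box if one_box else big_box + small_box

def average_winnings(strategy, trials=10_000, accuracy=0.9, seed=0):
    # Run many independent rounds and report the mean payoff.
    rng = random.Random(seed)
    total = sum(newcomb_payoff(strategy(rng), accuracy, rng) for _ in range(trials))
    return total / trials

tdt_like  = lambda rng: True               # always one-box, as TDT prescribes here
coin_flip = lambda rng: rng.random() < 0.5  # one-box on heads, two-box on tails

print(average_winnings(tdt_like))
print(average_winnings(coin_flip))
```

In this particular setup the one-boxer comes out ahead (expected roughly $900,000 per round vs. roughly $500,000 for the flipper), but the point of the exercise is the method: pick a game, run both strategies, and see whether the theory's prescriptions actually win.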

Comment by perpetualpeace1 on A question about Eliezer · 2012-04-19T18:39:15.682Z · LW · GW

If his decision theory had a solid theoretical background, but turned out to be terrible when actually implemented, how would we know? Has there been any empirical testing of his theory?

Comment by perpetualpeace1 on A question about Eliezer · 2012-04-19T18:25:42.587Z · LW · GW

For the record, what were those predictions? What are your sources?

Comment by perpetualpeace1 on Discussion: Yudkowsky's actual accomplishments besides divulgation · 2012-04-19T15:00:16.095Z · LW · GW

On a related note... has Eliezer successfully predicted anything? I'd like to see his beliefs pay rent, so to speak. Has his interpretation of quantum mechanics predicted any phenomena which have since been observed? Has his understanding of computer science and AI led him to accurately predict milestones in the field before they have happened?