Posts

Comments

Comment by Michael_G.R. on The Pascal's Wager Fallacy Fallacy · 2009-03-19T14:36:09.000Z · LW · GW

"Most Americans of the time were unabashedly racist, had little concept of electricity and none of computing, had vaguely heard of automobiles, etc."

So if you woke up in a strange world with technologies you don't understand (at first) and mainstream values you disagree with (at first), you would rather commit suicide than try to learn about this new world and see if you can have a pleasant life in it?

Comment by Michael_G.R. on The Pascal's Wager Fallacy Fallacy · 2009-03-18T04:13:36.000Z · LW · GW

Yvain wrote: "The deal-breaker is that I really, really don't want to live forever. I might enjoy living a thousand years, but not forever. "

I'm curious: how do you know that in advance? Isn't it like a kid making a binding decision for its future self?

As Aubrey says (I'm paraphrasing): "If I'm healthy today and enjoying my life, I'll want to wake up tomorrow. And so on." You live a very long time one day at a time.

Comment by Michael_G.R. on OB Status Update · 2009-01-27T22:27:23.000Z · LW · GW

Eliezer, could we get a status update on the books that will (I hope) come out of all this material you've been writing?

Is it still part of the grand plan, or did that change?

Comment by Michael_G.R. on Getting Nearer · 2009-01-18T17:29:54.000Z · LW · GW

"I think that unless you're revived very quickly after death you'll most likely wake up in a weirdtopia."

Indeed, though a technologically advanced enough weirdtopia might have pretty good ways to help you adapt and feel at home (f.ex. by modifying your own self to keep up with all the post-humans, or by starting you out in a VR world that you can relate to and progressively introducing you to the current world).

Comment by Michael_G.R. on Getting Nearer · 2009-01-18T05:40:53.000Z · LW · GW

"What if you wake up in Dystopia?"

What is the counterargument to this?

I'm not sure it can be argued convincingly, but a dystopia bad enough not to be worth living in probably wouldn't care much about its citizens, and even less about its cryo-suspended ones; so if things get bad enough, your chances of being revived are very low.

Comment by Michael_G.R. on Seduced by Imagination · 2009-01-17T06:02:21.000Z · LW · GW

I'm currently reading Global Catastrophic Risks by Nick Bostrom and Cirkovic, and it's pretty scary to think of how easily everything could go bad and how we could all end up living through very hard times indeed.

That kind of reading usually keeps me from having my soul sucked into this imagined great future...

Comment by Michael_G.R. on Changing Emotions · 2009-01-05T04:24:25.000Z · LW · GW

"so you don't throw up every time you remember what you did on your vacation."

Oh man. If this AI thing doesn't work out, maybe you can try comedy?

I read on some skeptics' blog that Jim Carrey left $50 million to Jenny McCarthy. That sure could fund the SIAI for a while...

Comment by Michael_G.R. on Growing Up is Hard · 2009-01-04T21:32:51.000Z · LW · GW

"So lack of robustness against insufficient omega 6 does indeed cause much mental illness. (One reason my son has been raised on lots of fish oil.)"

Patri, did you mean Omega 3?

Comment by Michael_G.R. on BHTV: de Grey and Yudkowsky · 2008-12-14T22:27:37.000Z · LW · GW

"The paperback has an additional 40-page "Afterword"."

Argh. I already have two copies of the hardback, including an autographed one. Now you're tempting me to get a third copy (makes a good gift, I guess).

Comment by Michael_G.R. on Thanksgiving Prayer · 2008-11-28T21:04:34.000Z · LW · GW

Personally this year I'm thankful for the Earth's molten interior:

http://michaelgr.com/2008/11/28/be-thankful-for-the-earths-molten-interior/

Comment by Michael_G.R. on Whither OB? · 2008-11-17T21:28:36.000Z · LW · GW

"Why is daily posting a shibboleth ?

I would still read the site if EY posted once a week"

I second that. Even if OB were updated only 1-3 times a week with posts of the current level of quality, it would still be one of my favorite sites. In fact, I'm having a hard time keeping up with the current quantity of content, and I often need to set time aside to clear my OB backlog.

A better software platform would be good, but I doubt that user-generated content could ever become central to the site. Maybe as a sidebar, with a few rare posts getting promoted to the frontpage.

"I'm not finished, but I'm way over schedule and need to move on soon."

Is the next thing on your schedule writing the books you've talked about in the past? Are you still planning to do the 'popular' book?

"Our most popular post ever, still getting hits to this day, was not written by Robin or myself or any of the recurring editors. It's "My Favorite Liar" by Kai Chang, about the professor who inserted one false statement into each lecture."

Back when it was first published, I submitted it to reddit and it got 1050 votes (which is a lot for that site). Glad it's still getting traffic!

Comment by Michael_G.R. on Ask OB: Leaving the Fold · 2008-11-10T06:33:42.000Z · LW · GW

"Eliezer must be disappointed. He makes a thread to help a recovering irrationalist deal with the social consequences of deconversion, and half the posts he gets are from religious believers and believers in religious belief."

I think this has been linked from some social media sites, which could explain the influx of non-regular OB readers.

Comment by Michael_G.R. on Ask OB: Leaving the Fold · 2008-11-09T22:26:49.000Z · LW · GW

Maybe this is fodder for another post:

A few people here said: "If that person was really special, there would be no problem with you telling him."

But are things really that simple? Not so long ago, Jo would probably have reacted badly if her special person had told her that he no longer believed in what she believed. Loving someone and getting along well with them doesn't mean you will accept anything they do without problem, and vice versa.

Think about the people you find "special" in your life and imagine telling them that your beliefs have changed about something very important that you both believe in (you used to be libertarian and now hold strong authoritarian beliefs; you used to be vegan and now eat ribs every night; you were strongly partisan for one political party and are switching to another; etc.), and imagine how they would react. Does that make them not "special" anymore?

Comment by Michael_G.R. on Hanging Out My Speaker's Shingle · 2008-11-06T05:18:06.000Z · LW · GW

"Jack, I've spoken on many occasions previously but I was never in Toastmasters."

If you're planning to speak for money, Toastmasters might be a good investment. I would recommend at least checking it out to see what you could get out of it. Since you are not a beginner, you should check out 'advanced' clubs.

With public speaking, there's nothing like experience. TM allows you to practice in a friendly environment where you can try new approaches (doesn't matter if they fail), and to benefit from the knowledge of a group of people who have been doing this for a while and should be able to give you more useful feedback than most other groups.

You can also use the club as a way to practice for media appearances (tv interviews, radio, etc).

Comment by Michael_G.R. on BHTV: Jaron Lanier and Yudkowsky · 2008-11-04T16:14:00.000Z · LW · GW

"A very impressive interview - I have gained much respect for Eliezer's patience."

In a way, I think that maybe the most important stuff in this interview is what didn't happen. Eliezer indeed seems to possess super-human patience.

Comment by Michael_G.R. on BHTV: Jaron Lanier and Yudkowsky · 2008-11-03T18:10:24.000Z · LW · GW

"Jaron's laughter seems largely the laughter of frustrated politesse. This comes out in his speech when he repeats to EY "I've been having this discussion for decades.""

I think that's BS. If Jaron didn't want to discuss AI, then why agree to a BhTV episode with Eliezer, a research fellow at the Singularity Institute for Artificial Intelligence?

Eliezer tried to understand what Jaron was saying and asked him questions to get him to explain his positions better. Jaron pretty much never tried to make himself clear (probably because there wasn't much to explain in the first place), and he never really explained what he didn't like about Eliezer's position.

How long he's been having this conversation ("for decades" or whatever) only means that he's been having it for a long time, not that he has convincing arguments or that there's any value to what he says.

Comment by Michael_G.R. on BHTV: Jaron Lanier and Yudkowsky · 2008-11-02T21:19:26.000Z · LW · GW

I'm 20 minutes in and wish Lanier's connection would just cut off and Eliezer would talk by himself.

"Eliezer occasionally looked like he was having trouble following Lanier's reasoning. I certainly did. My guess is that this is because, on those occasions, Lanier didn't have a reasoning."

That's my feeling too. He seemed to love calling anyone who disagreed with him an "idiot" or a "religious nut" without ever really explaining why.

I'm going to keep watching because I expect Eliezer to say some interesting stuff.

Comment by Michael_G.R. on Shut up and do the impossible! · 2008-10-09T01:49:56.000Z · LW · GW

Here's my theory on this particular AI-Box experiment:

First you explain to the gatekeeper the potential dangers of AIs. General stuff about how large mind design space is, and how it's really easy to screw up and destroy the world with AI.

Then you try to convince him that the solution to that problem is building an AI very carefully, and that a theory of friendly AI is essential to increase our chances of a future we would find "nice" (and the stakes are so high that even increasing those chances a tiny bit is very valuable).

THEN

You explain to the gatekeeper that since this AI experiment is public, it will be looked back on by all kinds of people involved in making AIs, and that if he lets the AI out of the box (without them knowing why), it will send them a very strong message that friendly AI theory must be taken seriously, because this very scenario (not being able to keep the AI in a box) could happen to them with an AI that hasn't been proven to stay friendly and that is more intelligent than Eliezer.

So here's my theory. But then, I've only thought of it just now. Maybe if I made a desperate or extraordinary effort I'd come up with something more clever :)

Comment by Michael_G.R. on Make an Extraordinary Effort · 2008-10-07T20:02:01.000Z · LW · GW

Small Typo Alert: The second quote should be attributed to "Mastering Eishin-Ryu Swordsmanship"

"Ryu", not "Ruy".

Comment by Michael_G.R. on Ban the Bear · 2008-09-20T14:39:58.000Z · LW · GW

"The capitalists are trying to save capitalism from the capitalists!"

Actually, if we didn't have fiat money that can be printed at will and interest rates were set by market forces and not a bunch of white men picked by the government, we wouldn't be in this mess.

Comment by Michael_G.R. on Brief Break · 2008-09-04T15:32:31.000Z · LW · GW

I want to echo others here and thank you for the great article, and wish you a good break. Get a nice omega 3/folic acid/vitamin D/zinc cocktail and recharge that brain :)

Comment by Michael_G.R. on Hiroshima Day · 2008-08-07T14:41:22.000Z · LW · GW

"The Second World War, as a whole, was probably the most catastrophic event in humanity's recorded history. The world was pretty much screwed as soon as it started -- indeed, probably as soon as Hitler acquired control of Germany."

WWII was just a continuation of WWI, which was a much less 'noble' war, if such a thing can even be said to exist.

War begets more war.

Comment by Michael_G.R. on Hiroshima Day · 2008-08-07T02:47:08.000Z · LW · GW

"What if the alternative was for the U.S. to firebomb and blockade Japan [...]"

That was probably another possibility, but certainly not the only alternative to nuking cities.

How about nuking somewhere very visible but not so populated, with the message: "We have more where that came from. Surrender or the next one won't be in a daisy field"?

Comment by Michael_G.R. on Hiroshima Day · 2008-08-07T02:43:01.000Z · LW · GW

I wrote something a little while ago about how Nagasaki was a secondary target, and Kokura was saved by cloudy conditions.

http://michaelgr.com/2008/02/01/nagasakis-nuke-was-supposed-to-be-dropped-on-kokura/

Comment by Michael_G.R. on Detached Lever Fallacy · 2008-07-31T21:16:42.000Z · LW · GW

"Actually, the apple-recognition machinery in the human brain really does turn off on a regular basis. You have to be awake in order to recognize an apple; you can't do it while sleeping."

I don't remember ever dreaming about fruits, but I'm pretty sure I could recognize an apple if it happened. Did I just set myself up to have a weird dream tonight? Oh boy...

The fact that the pattern that makes the apple module light up comes from different places while dreaming than while awake doesn't matter; you don't stop recognizing it, so the module probably isn't 'off'.

Comment by Michael_G.R. on Humans in Funny Suits · 2008-07-31T20:45:15.000Z · LW · GW

Thank you for writing, Anne. Your comments here, as well as your recent 'interview' posts on your blog, have been most interesting.

Comment by Michael_G.R. on The Gift We Give To Tomorrow · 2008-07-17T16:26:59.000Z · LW · GW

"Once upon a time, when all of civilization was a single galaxy and a single star: and a single planet, a place called Earth."

Did this dialogue take place aboard the Battlestar Galactica? :-P

Great post!

Comment by Michael_G.R. on The Genetic Fallacy · 2008-07-11T06:43:21.000Z · LW · GW

"later came to reject (on a deliberate level) the idea that the Bible was not written by the hand of God

Don't you mean "was written by..." here?

Comment by Michael_G.R. on I'd take it · 2008-07-02T23:44:47.000Z · LW · GW

Not too different from the rest of this crowd: Fund a bunch of high risk, high reward scientific research projects.

Comment by Michael_G.R. on The Quantum Physics Sequence · 2008-06-11T16:35:47.000Z · LW · GW

Will the eBooks also be available as hard copies? I'm probably not alone in preferring to read long texts on paper, and printing them out isn't quite the same.

Comment by Michael_G.R. on Against Devil's Advocacy · 2008-06-09T14:39:54.000Z · LW · GW

"That's what Richard Dawkins understands that Michael Rose doesn't - that Reason is not a game."

Dawkins is also acutely aware that his opponents won't always play fair, and that they have often quoted him and other scientists out of context to try to make it seem like they hold positions they don't actually hold. That's why he wants to have a tape recorder running when he dies, so there can't be rumors about his "deathbed conversion".

Comment by Michael_G.R. on Bloggingheads: Yudkowsky and Horgan · 2008-06-08T19:26:55.000Z · LW · GW

You need a chess clock next time. John talks way too much.

Comment by Michael_G.R. on Conference on Global Catastrophic Risks · 2008-05-20T01:59:28.000Z · LW · GW

While on the topic of conferences that might interest this crowd, Aging 2008 will take place on June 27th in Los Angeles at UCLA.

"leading scientists and thinkers in stem cell research and regenerative medicine will gather in Los Angeles at UCLA for Aging 2008 to explain how their work can combat human aging, and the sociological implications of developing rejuvenation therapies.

Aging 2008 is free, with advance registration required"

More details here:

http://www.mfoundation.org/ADCI/

Comment by Michael_G.R. on Angry Atoms · 2008-03-31T03:32:24.000Z · LW · GW

It seems like this post isn't as clear as it could be - or at least not as clear as Eliezer's best posts.

Either it needs another draft, or the problem lies with me and I just need to re-read it more carefully...

Comment by Michael_G.R. on Initiation Ceremony · 2008-03-29T15:55:43.000Z · LW · GW

All I can say is that when I scrolled down and saw the photo, my first thought was 'awesome'.

Nice illustration of your previous post, Eliezer.

Comment by Michael_G.R. on The Beauty of Settled Science · 2008-03-25T03:49:02.000Z · LW · GW

Definitely good advice on textbooks.

I've been slowly, sloooowly reading Molecular Biology of the Cell (5th ed, brand new) by Alberts, and Lehninger: Principles of Biochemistry (4th ed). So far, I prefer the first one.

Until recently I was too intimidated to buy them because that's far from what I studied, but now I regret waiting so long. I should have started sooner.

Comment by Michael_G.R. on If You Demand Magic, Magic Won't Help · 2008-03-22T20:53:22.000Z · LW · GW

I wish this kind of stuff was taught to more children. Too few people fall in love with reality.

Comment by Michael_G.R. on Joy in Discovery · 2008-03-21T17:20:32.000Z · LW · GW

Trivia: the stars & girlfriend story was mentioned by Richard Feynman in "What Do You Care What Other People Think?"

Comment by Michael_G.R. on Penguicon & Blook · 2008-03-16T04:54:18.000Z · LW · GW

I might be a bit late with this, but here's my 2 cents:

"Maybe spend another month or two doing large transhumanist sequences, either on the Singularity Institute blog (currently fairly defunct) or here on Overcoming Bias if the readers really want that."

I suggest writing the transhumanist stuff on either a new blog (wordpress.com is free and easy to set up) or the SIAI blog, and linking it prominently from Overcoming Bias. Maybe even do a weekly roundup/linkfest post here to remind people.

This would mean that you wouldn't lose those who are interested, and that those who aren't can easily skip it and spare us the complaining.

Comment by Michael_G.R. on Penguicon & Blook · 2008-03-14T20:45:49.000Z · LW · GW

"I agree, but Eli has already announced his intentions to rewrite most of the material, which will require a great deal of work."

Indeed, but less than coming up with it in the first place, and the total return on investment will likely be much higher.

F.ex., if 1,000 hours of work got him 5,000 regular readers here, then 1,300 hours of work might get him 100,000+ (not all at once, but over a few years), with relatively little overlap between the two groups.

"Publishing this kind of a book is essentially a shotgun approach"

I'm not sure I completely agree with that. There is certainly a shotgun element, but book readers - like blog readers - are also self-selecting, and it is very unlikely that everybody who would select it is already reading this blog. And if they are now, will they be in 3 years?

"The blog posts, so far as I can tell, are doing as well a job of teaching their readers as can reasonably be expected."

I would argue that a book would be better at teaching. Personally, I know that on some days I missed posts for one reason or another (traveling, too busy, whatever), and by the time I started reading again I had a long backlog, which was tedious to read on the screen, etc. There's simply a bigger barrier to entry, especially for people who aren't already convinced, or people who simply don't read much on the net in their free time but pick up lots of books.

Newcomers might also find this blog, land on a hard post that belongs in the middle of a series (or simply something that happens not to interest them), give up, and not come back. I suspect this is happening every day. With a book, you have the benefit of having everybody start from the beginning, and you can gradually hook them.

" We will need to prepare something for the next generation, but that's caused by the nature of blogging, not some deficiency in Eli's postings."

Exactly. The current format is crippling the potential of the material, so onward with the book(s)!

Comment by Michael_G.R. on Penguicon & Blook · 2008-03-14T17:58:09.000Z · LW · GW

""Almost nobody (relatively) will be rediscovering them in a few years. That's simply the nature of blogging. Who's reading 3 years old BoingBoing posts right now?""

"This is a very good point."

I think it would be a waste, and very sad, if Eliezer had spent over a year writing enough high-quality material for a book and that material just stayed buried in Overcoming Bias' archives, nearly forgotten in a matter of years.

For a little more effort, he can produce books that have a much better chance of making a difference.

Tom, you say that the people who are "helping" are a small group. That is true. But I don't believe that everybody who could be convinced, who could help, is already on board. Simply think about yourself: you are in now, but at some point in your life you weren't. Something must've lit the fire. Imagine if back then you had seen an interesting review of Eliezer's book on Amazon and decided to check it out. I bet that would have been a great introduction.

Since that group of people is small, it means that it only takes a small number of additional people to make a relatively big difference. And besides, not everybody who's already on board (however you define that) is thinking as clearly as Eliezer about these things. Many could learn from his book, I'm sure, and that could increase the quality of their contribution.

Another argument is that there is a chance (probably low) that the book(s) could sell a lot more than we can expect right now. Not giving that a shot when the book is almost already written would be a wasted opportunity, IMHO.

Comment by Michael_G.R. on Penguicon & Blook · 2008-03-14T04:18:58.000Z · LW · GW

I'd definitely like to have that 500-page book in my library as a reference, and to give the shorter popular book as a gift to friends (or my future kids?).

Only a small subset of the (relatively) small group of people who have read these blog posts as they were published will use them as a reference later. Almost nobody (relatively speaking) will be rediscovering them in a few years. That's simply the nature of blogging. Who's reading three-year-old BoingBoing posts right now?

I'm currently reading "Godel, Escher, Bach", and from what I've read here, I think that Eliezer's book could become something like that. Maybe not a Pulitzer (but who knows?), but certainly something special that changes the way people think.

Comment by Michael_G.R. on Penguicon & Blook · 2008-03-14T04:00:26.000Z · LW · GW

"I was planning a nigh-complete rewrite for the book - less than 50% previously published sentences, say. Would it be a problem if the ideas, but not sentences, all exist elsewhere?"

You might need to remove some posts from the net if you decide to use them as is in the book, but if it is all modified, there shouldn't be a problem AFAIK.

Ideas can't be copyrighted, and you wouldn't be the first person to turn blog material into dead tree.

Comment by Michael_G.R. on My Strange Beliefs · 2008-01-03T04:07:33.000Z · LW · GW

Keep using whatever examples and anecdotes you think best make your points, Eliezer. If that person doesn't like what you write, he/she can just skip it.

Comment by Michael_G.R. on Guardians of Ayn Rand · 2007-12-18T20:54:47.000Z · LW · GW

I really enjoyed reading this. Thank you Eliezer.

Comment by Michael_G.R. on Fake Fake Utility Functions · 2007-12-06T18:41:03.000Z · LW · GW

This post put a big smile on my face. Thanks Eliezer.

Comment by Michael_G.R. on Mere Messiahs · 2007-12-03T16:58:43.000Z · LW · GW

Excellent post, Eliezer. Thank you.

Comment by Michael_G.R. on Evolutions Are Stupid (But Work Anyway) · 2007-11-03T17:20:04.000Z · LW · GW

Eliezer, it certainly seems that you got over your "writer's molasses". Congrats!

Comment by Michael_G.R. on An Alien God · 2007-11-02T23:11:12.000Z · LW · GW

Great stuff, Eliezer. I'm really looking forward to you compiling your writings in a book.

Comment by Michael_G.R. on Torture vs. Dust Specks · 2007-10-30T18:36:00.000Z · LW · GW

"For those who would pick SPECKS, would you pay a single penny to avoid the dust specks?"

To avoid all the dust specks, yeah, I'd pay a penny and more. Not a penny per speck, though ;)

The reason is to avoid having to deal with the "unintended consequences" of being responsible for that very, very small change over such a large number of people. It's bound to have some significant indirect consequences, both positive and negative, at the far edges of the bell curve... the net impact could be negative, and a penny is little to pay to avoid responsibility for that possibility.