Posts

Trying to track down a quote about evolution 2011-02-05T00:37:29.974Z

Comments

Comment by erniebornheimer on What risks concern you which don't seem to have been seriously considered by the community? · 2020-10-28T18:55:53.604Z · LW · GW

Sorry, I'm not too familiar with the community, so I'm not sure if this question is about AI alignment in particular or risks more broadly. Assuming the latter: I think the most overlooked problem is politics. I worry about rich and powerful sociopaths being able to do evil without consequences, or even without being detected (except by the victims, of course). We probably can't do much about the existence of sociopaths themselves, but I think we can and should think about the best ways to increase transparency and reduce inequality. For what it's worth, I'm a negative utilitarian.

Comment by erniebornheimer on Parapsychology: the control group for science · 2011-11-30T21:12:31.583Z · LW · GW

I agree with HiddenTruth and prase. The original post is flawed: it starts with a perfectly good idea, "if there were a group that 'did science' but was always wrong, it would be a good control group to compare to 'real science'," but then blows it by assuming parapsychologists are indeed always wrong.

FWIW, I too believe parapsychologists are probably almost always wrong, but so what? Who cares what I believe? No one does, and no one should (without evidence), and that's the point.

Comment by erniebornheimer on Rationality Quotes September 2011 · 2011-09-02T22:37:43.234Z · LW · GW

Yeah. This was put very well by Fyodor Urnov, in an MCB140 lecture:

"What is blindingly obvious to us was not obvious to geniuses of ages past."

I think the lecture series is available on iTunes.

Comment by erniebornheimer on Rationality Quotes September 2011 · 2011-09-02T22:28:01.553Z · LW · GW

Sounds implausible to me, so I'm very interested in a citation (or pointers to similar material). If true, I'm going to have to do a lot of re-thinking.

Comment by erniebornheimer on Rationality Quotes August 2011 · 2011-08-25T21:17:21.850Z · LW · GW

In Soviet Russia...

Comment by erniebornheimer on Rationality Quotes August 2011 · 2011-08-25T21:12:33.060Z · LW · GW

Of course prejudices can be changed, at which point they become postjudices.

Comment by erniebornheimer on Efficient Charity: Do Unto Others... · 2011-07-13T18:23:28.149Z · LW · GW

From the article: "CARE is a noble organization that fights starvation. It would like your support. The American Cancer Society is a noble organization that fights disease. It would like your support, too. Here's my advice: If you're feeling very charitable, give generously—but don't give to both of them. ... Giving to either agency is a choice attached to a clear moral judgment. When you give $100 to CARE, you assert that CARE is worthier than the cancer society. Having made that judgment, you are morally bound to apply it to your next $100 donation."

Landsburg is wrong, and here's why: the world is shades of gray, not black and white. It's not clear what the best charity is, even by one's own standards (partly because those standards are not clear, and they sometimes conflict with each other). We know ourselves well enough to know we're not smart enough to make those judgments perfectly, so we don't aim for perfection; we aim to make sure we do at least some good. We hedge our bets, knowing that some of the money is going to the "wrong" charity (we're just not sure which one is "wrong").

Comment by erniebornheimer on Efficient Charity: Do Unto Others... · 2011-07-13T17:52:27.991Z · LW · GW

Utter crap.

I'm reading this much later than it was written, but feel I must respond.

  1. "Africa receives many billions of dollars in donations, there's clearly something wrong with the way it works, and you're not going to fix it by adding a million dollars..." Even if it were 100% true that there's something wrong with the way it works, it DOES NOT FOLLOW that the answer is to give less (or none). It may be the case that we have not given enough.

  2. "It's like a car that leaks fuel, you can keep adding more and more fuel, or you should try and fix it..." Even if the analogy were 100% true and applicable, you offer a false dichotomy (add fuel OR fix it). It may be that we need to work on both simultaneously. To further the analogy, it may be that you have to add fuel to the car in the short term (knowing some will be lost), in order to keep it running long enough to get it to the mechanic who can do the long term fix.

  3. "If you prolong an African life you're probably prolonging suffering, which is a waste. A life of suffering and misery is not worth saving." This is at best, horribly wrong-headed, and at worst, disgustingly elitist. I suspect it's a troll. Only a fool thinks a life that contains suffering and misery is not worth saving. Some people indeed are suffering so much that they may prefer death. But that is THEIR choice, not yours or mine. If you're really serious about triage, and how best to spend one's money to relieve suffering, a little humility and compassion (qualities I see very little of in this comment) might go a long way toward achieving humane (and yet rational) solutions. I suspect JonatasMueller belongs to a set of people about whom no one will ever have to ask these hard questions, and I suspect his answer is influenced by that fact. And: there's a subtle mistake here: the idea that reducing suffering does not include preserving or prolonging life. Says who?

  4. "...the goal as I see it is to increase the intelligence (or cure the lack of it) to make the agents of this world able to willingly solve their problems, and thereby reach a state of technological advancement that allows them to get rid of all problems for good." That's a crap goal, if it crowds out other worthy goals. I believe we're already as smart as we need to be, so any efforts to try to increase intelligence are a waste of resources. I agree advancing technology is important, but I believe sharing what we have is much much more important. Our problems are mainly political, not technological. What good will it do, it technology improves the lives of some, but the fruits of that technology are not shared?

Comment by erniebornheimer on Trying to track down a quote about evolution · 2011-02-16T01:58:48.443Z · LW · GW

Yes, that's it! Thank you!

Comment by erniebornheimer on That Magical Click · 2010-01-20T21:44:36.643Z · LW · GW

At the risk of revealing my stupidity...

In my experience, people who don't compartmentalize tend to be cranks.

Because the world appears to contradict itself, most people act as if it does. Evolution has created many, many algorithms and hacks to help us navigate the physical and social worlds, to survive, and to reproduce. Even if we know the world doesn't really contradict itself, most of us don't have good enough meta-judgement about how to resolve the apparent inconsistencies (and don't care).

Most people who try to make all their beliefs fit with all their other beliefs, end up forcing some of the puzzle pieces into wrong-shaped holes. Their favorite part of their mental map of the world is locally consistent, but the farther-out parts are now WAY off, thus the crank-ism.

And that's just the physical world. When we get to human values, some of them REALLY ARE in conflict with others, so not only is it impossible to try to force them all to agree, but we shouldn't try (too hard). Value systems are not axiomatic. Violence to important parts of our value system can have repercussions even worse than violence to parts of our world view.

FWIW, I'm not interested in cryonics. I think it's not possible, but even if it were, I think I would not bother. Introspecting now, I'm not sure I can explain why. But natural death seems like a good point to say "enough is enough." In other words, letting what's been given be enough. And I am guessing that something similar will keep most of us uninterested in cryonics forever.

Now that I think of it, I see interest in cryonics as a kind of crankish pastime. It takes the mostly correct idea "life is good, death is bad" to such an extreme that it does violence to other valuable parts of our humanity (sorry, but I can't be more specific).

To try to head off some objections:

  • I would certainly never dream of curtailing anyone else's freedom to be cryo-preserved, and I recognize I might change my mind (I just don't think it's likely, nor worth much thought).
  • Yes, I recognize how wonderful medical science is, but I see a qualitative difference between living longer and living forever.
  • No, I don't think I will change my mind about this as my own death approaches (but I'll probably find out). Nor do I think I would change my mind if/when the death of a loved one becomes a reality.

I offer this comment, not in an attempt to change anyone's mind, but to go a little way to answer the question "Why are some people not interested in cryonics?"

Thanks!