Magical Healing Powers

post by jefftk (jkaufman) · 2012-08-12T03:19:50.062Z


Imagine you had magical healing powers. Sitting quietly with someone and holding their hand you could restore them to health. While this would be a wonderful ability to have, it would also be a hard one: any time you spent on something other than healing people would mean unnecessary suffering. How could you justify a pleasant dinner with your family or a relaxing weekend at the beach when that meant more people living in pain?

But you already have these powers. Through the magic of effective charity you can donate money to help people right now. The tradeoff remains: time you give yourself when you could be working means money you don't earn, which then can't go to help the people who would most benefit from it.

(I don't think this means you should try for complete selflessness; you need to balance your needs against others'. But the balance should probably be a lot further towards others' than it currently is.)

Update 2012-08-12: this is a response to hearing people offline say that if they had magical "help other people" powers they should spend lots of time using them, without having considered that they already have non-magical "help other people" powers.

I also posted this on my blog.

26 comments

Comments sorted by top scores.

comment by CarlShulman · 2012-08-12T04:03:25.717Z

This post is too low on interesting or useful new content, relative to reiteration of a standard ideological view about altruism. It would be different if the post described an interesting new argument, or a new more efficient means of helping people.

How could you justify a pleasant dinner with your family or a relaxing weekend at the beach when that meant more people living in pain?

Really, this is just a particular form of "what if you had super-high productivity, orders of magnitude higher than almost anyone else on Earth." If I had these powers, and knew they couldn't be duplicated through research and study, I would sell them at high prices. To extract as much of the surplus as possible, I would use the "financial aid" price-discrimination system employed by elite universities, demanding tax returns and other information to determine ability to pay, and then extracting a large portion of that potential in exchange for healing.

If one has to "lay on hands" only briefly, the expected annual revenue (provided one didn't get kidnapped or imprisoned) would be in the trillions of dollars. Even if a healing took a couple of hours, revenue would be in the tens of billions (driven primarily by the super-rich).
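For concreteness, here is a minimal back-of-envelope sketch of how figures in these ranges could arise. Every number in it (session length, average extracted price, hours worked) is an assumption chosen for illustration, not something stated in the comment:

```python
# Back-of-envelope for the revenue estimates above.
# All figures are illustrative assumptions, not claims from the comment.

HOURS_PER_WEEK = 85   # midpoint of the 70-100 hour weeks mentioned below
WEEKS_PER_YEAR = 50

def annual_revenue(minutes_per_healing: float, avg_price: float) -> float:
    """Healings performed per year times an assumed average price extracted."""
    healings_per_year = (HOURS_PER_WEEK * 60 / minutes_per_healing) * WEEKS_PER_YEAR
    return healings_per_year * avg_price

# Brief laying-on of hands (~5 minutes each), with financial-aid-style
# price discrimination extracting a large average price per healing:
print(f"brief:  ${annual_revenue(5, 25_000_000):,.0f}")    # ~$1.3 trillion/year

# Two-hour healings: far fewer slots, sold almost entirely to the super-rich:
print(f"2-hour: ${annual_revenue(120, 10_000_000):,.0f}")  # ~$21 billion/year
```

Under these assumed figures, the brief-healing case comes to roughly $1.3 trillion a year and the two-hour case to roughly $21 billion, consistent with the orders of magnitude claimed.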

In this situation I would probably work 70-100 hour work weeks, and use a budget of billions of dollars to make quality of life while working as high as feasible, e.g. healing on the beach while getting massages, eating gourmet meals, hearing reports of scientific research projects I had commissioned, and so forth. And then a majority of the revenues would go towards improving global prospects.

Replies from: NancyLebovitz, jkaufman, buybuydandavis
comment by NancyLebovitz · 2012-08-12T15:46:05.159Z

One of the research projects should definitely be finding out how more people can have healing powers.

Replies from: CarlShulman
comment by CarlShulman · 2012-08-12T19:21:35.350Z

and knew they couldn't be duplicated through research and study

comment by jefftk (jkaufman) · 2012-08-12T04:22:34.099Z

It would be different if the post described an interesting new argument

Hmm. I left out context that is more important than I thought it was: this post is a response to two different people, in different conversations, speculating in person about what they would do if they had powers like this. They both thought they would have an obligation to help people, but didn't think they had such an obligation in normal life.

I don't think it was a scale issue so much as being about magic or specialness.

comment by buybuydandavis · 2012-08-13T00:56:19.227Z

This post is too low on interesting or useful new content, relative to reiteration of a standard ideological view about altruism.

Neither altruists nor egoists take advantage of the opportunity to help others as much as their own values would call for. When people continue to act contrary to their values, it's good to keep reminding them.

Replies from: drethelin
comment by drethelin · 2012-08-13T07:04:03.798Z

In general, people don't act contrary to their values so much as their stated values are a simplified approximation of their actual values.

comment by Nic_Smith · 2012-08-12T04:08:47.272Z

I am reminded of a particular SMBC involving Superman.

Somewhat more seriously, I can't help but think that this post starts with far-mode idealization: would it really be difficult to turn people away for personal reasons if you had magical healing powers? Or is it merely uncomfortable, and perhaps usually bad signalling, to admit you would?

comment by Richard_Kennaway · 2012-08-12T09:48:44.508Z

I'm coming to think that "people are dying, every second you aren't stopping it you're murdering them" is a basilisk. And not too far removed from Roko's. Instead of fear of the future FAI torturing you, this basilisk does it by screaming at you inside your head.

Replies from: jkaufman
comment by jefftk (jkaufman) · 2012-08-12T13:28:53.206Z

How are you using "basilisk"?

My understanding of Roko's is that being exposed to it decreases global utility while exposure to "you should help other people and it's very important" increases it. But I don't know if that's relevant to basilisk status.

Replies from: NancyLebovitz, Richard_Kennaway
comment by NancyLebovitz · 2012-08-12T15:43:13.409Z

There's a difference between "you should help other people and it's very important" and "helping other people is so important that you should treat your quality of life as irrelevant". The latter leads to some combination of burnout, despair, and apathy, though possibly with some useful work done on the way to burnout.

Replies from: jkaufman, DanArmak
comment by jefftk (jkaufman) · 2012-08-12T17:57:38.379Z

I don't believe "helping other people is so important that you should treat your quality of life as irrelevant", because of the negative consequences you describe.

(I still don't see a basilisk here.)

Replies from: NancyLebovitz
comment by NancyLebovitz · 2012-08-12T18:32:59.225Z

You don't believe that, but

How could you justify a pleasant dinner with your family or a relaxing weekend at the beach when that meant more people living in pain?

that's how some people see the utilitarian calculation.

comment by DanArmak · 2012-08-22T19:33:52.710Z

The latter leads to some combination of burnout, despair, and apathy, though possibly with some useful work done on the way to burnout.

The problem is precisely that people are reluctant to admit that they choose not burning out over helping others.

comment by Richard_Kennaway · 2012-08-12T21:28:08.630Z

A basilisk, in this context, is a thought that kills you if you think it, which is excessive for this, and for Roko's. I mean a thought that breaks your cognitive processes in some way if you think it. Which I think is a fair way to describe someone who, on contact with the "you're murdering everyone you don't try to save" idea, is consumed with guilt that their every moment is not devoted with maximum effort to saving the world.

comment by Desrtopa · 2012-08-12T03:52:11.788Z

If I had magic healing powers, I'm pretty sure that rather than healing people all the time, my time would be better spent marketing myself and using my powers to become rich and famous, at which point I could help more people with my money than I could with a hands-on approach.

In fact, this is the most effective use of most superpowers.

Replies from: army1987
comment by A1987dM (army1987) · 2012-08-12T20:49:25.257Z

Even after you factor in how long it takes to cure someone with money vs with magic? CarlShulman's idea sounds even more effective to me.

Replies from: Desrtopa
comment by Desrtopa · 2012-08-12T22:32:08.582Z

I would say that it's effectively the same idea, in more detail.

Replies from: army1987
comment by A1987dM (army1987) · 2012-08-12T22:46:00.331Z

I had taken “at which point I could help more people with my money than I could with a hands-on approach” to imply ‘... and therefore no longer use the hands-on approach’; if you actually meant ‘... than I could with a hands-on approach alone’...

comment by Shmi (shminux) · 2012-08-12T03:32:50.659Z

But the balance should probably be a lot further towards others' than it currently is.

Why? My should is different from your should. Who is to say that your should is better for me than mine?

And no, I don't accept your "idea that we have some obligation to try to help other people". I hate obligations. They piss me off. Whatever I do, I do because I want to, not because I owe it to others.

Replies from: Vladimir_Nesov, AnotherIdiot
comment by Vladimir_Nesov · 2012-08-13T05:08:58.674Z

Who is to say that your should is better for me than mine?

This seems like a bad general heuristic that should be more restricted in its application. Who is to say that following your understanding of your goals is better for you than following someone else's? You have to consider specific arguments, not just origins of statements or beliefs. Think it possible that you may be mistaken, etc.

Replies from: shminux
comment by Shmi (shminux) · 2012-08-13T05:24:03.342Z

No disagreement there, specific arguments ought to be considered. However, in my experience, if someone tells you that you have an obligation to do something (pray to $god, donate to $cause, enlist in $military, vote for $candidate, ...), they are not to be trusted with putting forth arguments, or even estimating prior[itie]s. So, ignore people like that entirely and do your own research from scratch.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2012-08-13T05:38:36.091Z

So the heuristic is to only consider arguments that don't claim to be leading to any (novel/actionable) conclusions? This rule decides at the bottom line, stopping consideration of arguments that don't conclude with uncertainty or close match to intuitively natural desires, which would be bad if conclusions not of that form turn out to be knowably correct.

If you remain specific, you may get rid of "pray to $god", but not other similar things, "donate to $cause that's known to be worthless", but not other similar things, etc. That should lift most of the load without as many false negatives.

comment by AnotherIdiot · 2012-08-12T16:43:02.468Z

Wait, so you're saying that your right to freedom is more important than making this world as good as possible? By all moral systems I know of, that's morally wrong (though I'll admit I don't know many). Do you have a well-defined moral system you could point me to?

Replies from: saturn, shminux
comment by saturn · 2012-08-12T18:33:43.704Z

http://en.wikipedia.org/wiki/Ethical_egoism

Replies from: army1987
comment by A1987dM (army1987) · 2012-08-12T20:55:04.382Z

Even an ethical egoist would cooperate with a copy of herself in the prisoner's dilemma, if she's using the ‘right’ decision theory.

comment by Shmi (shminux) · 2012-08-13T04:51:15.933Z

I am saying nothing of the sort. My point is that I distrust anyone who tells me that I am obligated to do stuff that they think is "right".