Ineffective Altruism

post by lsusr · 2022-04-23T22:07:09.946Z · LW · GW · 17 comments


There are many things I can put my time toward.

These do not have the same impact. My technology work helps more people per hour invested than my volunteering at Robotics Club. Does that mean I should put all of my time into inventing technology and none of it into Robotics Club?

No. That would be premature optimization. The root of all evil is premature optimization.

Occasionally someone who has used my inventions will message me and tell me it changed their life. Occasionally. Occasionally a reader of my blog will tell me they found it useful. Occasionally. When I help out at Robotics Club, the teenagers are happy to see me. Every. Single. Time.

I recently flew down to San Francisco to teach a class and give a speech. The organizers offered to put me in a fancy hotel. Instead, I found the cheapest AirBnb in the area and walked for an hour to the location. Partially I did this to save money, but I also did it to stay in touch with reality.

Evil happens when you are separated from the pain you inflict upon other people.

The host of my AirBnb was an elderly Black man. He grew up poor. His mother would clean white folks' houses all day to earn money and then clean his home to save money. His parents worked so hard to survive they had little time to raise their children. Two of his sisters became pregnant at age thirteen.

When I arrived at my host's home, he gave me a white towel with lots of visible stains. But it was clean. I immediately used it to wipe my face.

My AirBnb host is really into Black Power, but he never pushed his political beliefs on me. He wanted to know what it was like to be rich. I wanted to know what it was like to be poor. Were I to go to a fancy hotel, the system would make sure I never had to interact with a man like him.

Not that AirBnb didn't try. We communicated via SMS instead of AirBnb's website because AirBnb's website has an algorithm that scans our messages for keywords and punishes hosts it thinks did a poor job—regardless of the star rating a customer like me provides.

This man who was born before the Civil Rights Act of 1964 already lives in a dystopia run by an AI.

17 comments

Comments sorted by top scores.

comment by Wei Dai (Wei_Dai) · 2022-04-24T18:05:36.515Z · LW(p) · GW(p)

Evil happens when you are separated from the pain you inflict upon other people.

If only someone would invent a time machine so we can see what effects our actions have on the far future...

We communicated via SMS instead of AirBnb’s website because AirBnb’s website has an algorithm that scans our messages for keywords and punishes hosts it thinks did a poor job—regardless of the star rating a customer like me provides.

I was skeptical of this after reading (from one of your comment replies) that you only heard about this from the host, but some searching turned up a report in the NYT (confirming an original report in the WSJ) that's even worse:

Most recently, in April, The Journal’s Christopher Mims looked at a company called Sift, whose proprietary scoring system tracks 16,000 factors for companies like Airbnb and OkCupid. “Sift judges whether or not you can be trusted,” he wrote, “yet there’s no file with your name that it can produce upon request.”

As of this summer, though, Sift does have a file on you, which it can produce upon request. I got mine, and I found it shocking: More than 400 pages long, it contained all the messages I’d ever sent to hosts on Airbnb; years of Yelp delivery orders; a log of every time I’d opened the Coinbase app on my iPhone. Many entries included detailed information about the device I used to do these things, including my IP address at the time.

Replies from: yitz
comment by Yitz (yitz) · 2022-04-25T00:45:24.273Z · LW(p) · GW(p)

welp, that's horrifying and also honestly quite expected...

comment by Eli Tyre (elityre) · 2022-04-24T17:17:18.972Z · LW(p) · GW(p)

Evil happens when you are separated from the pain you inflict upon other people.

I think this is true, and close to my deepest concern about the EA movement. I would like someone to articulate this clearly and with concrete detail, in EA's own terms.

I intend to write that post myself, but if someone else beat me to it, or wrote a better one, I would be grateful.

Replies from: pktechgirl, yitz
comment by Elizabeth (pktechgirl) · 2022-04-25T23:41:44.395Z · LW(p) · GW(p)

I find this sentence really aesthetic, but the more I think about it the more it seems incorrect. I see at least as much evil caused by people being too reactive to pain as by people being too distant from it. That's not exactly the opposite of the original sentence, but it's closer to the opposite than to the original.

Replies from: GWS
comment by Stephen Bennett (GWS) · 2022-04-26T13:52:42.303Z · LW(p) · GW(p)

I interpret lsusr as saying something closer to "What type of evil should you look out for in systems wherein people are separated from the pain they inflict on others? Evil that is manifested ex nihilo. If you have distance from the pain you cause, then the evil you cause will appear to you to have come from nowhere at all."

I didn't see him as claiming "The root cause of all evil is the distance between pain inflicter and pain receiver", although I think there is a case there if you take an expansive view of "distance" (such as emotional distance in the form of a lack of empathy qualifying as "distance").

comment by Yitz (yitz) · 2022-04-25T00:44:27.848Z · LW(p) · GW(p)

I agree with you, and encourage you to write this post.

comment by devansh (dpandey) · 2022-04-24T14:29:27.323Z · LW(p) · GW(p)

Most people are disconnected from reality, most of the time. This is most noticeable to me when it manifests itself in scope insensitivity, but it appears in other ways too. In this case, your choosing to spend two hours walking to save costs is not a "keep in touch with reality" measure; it is a "lsusr is wasting his time" measure. Two hours of your time could be spent on things that really matter to you. Don't quit Robotics Club if you like Robotics Club, but recognize that you do it for fuzzies and not for utils.

The average person in a developed country is probably net-neutral or even slightly net-positive to humans as a whole. I agree with you that evil happens when you are separated from the pain you inflict on other people. But your opportunity costs are real actual costs too. If you make decisions (like quitting a project) that affect lots of people because you're constrained by not having enough hours in a day, and then waste some of the hours in a day that you do have on a misguided idea of "staying in touch with reality," you have failed to stay in touch with reality.

Still, I think parts of your core message are really important. Evil does happen when you separate yourself from the pain you inflict, because it’s very easy to abstract it away. This is how child slavery and other moral atrocities continue. Also, it’s actually important to stay in touch with reality and not become the “longtermist Chad” or something. You stay in touch with reality by being careful about the decisions you make, being cognizant of what you’re giving up and trading off against, and yes, by being willing to be the boots on the ground whenever it’s needed. But you gain no points by doing it when it’s not, when it’s actively harmful, when your time is limited and you have more valuable things to do.

Replies from: Kaj_Sotala
comment by Kaj_Sotala · 2022-04-25T08:38:24.662Z · LW(p) · GW(p)

Two hours of your time could be spent on things that really matter to you. 

It seemed like walking for two hours was time spent on something that really mattered to him?

then waste some of the hours in a day that you do have on a misguided idea of “staying in touch with reality,” you have failed to stay in touch with reality.

You use pretty strong language here, but AFAICT don't seem to really justify it. It's one thing to disagree with how useful it is for lsusr to spend time walking, quite another to call it a misguided idea that shows he's failing to stay in touch with reality.

comment by JohnBuridan · 2022-04-24T22:53:00.452Z · LW(p) · GW(p)

I think the OP's overarching concern is something like a narrow utilitarianism whose decision algorithm takes EV over only a limited number of horizons and decision sizes. There is unknown EV in exploring the world more personally and in reproducing knowledge and skills. My hunch is that such optimization of human life must combine these different aspects at least multiplicatively.

Expected value calculations have limits for decisions which will affect your worldview (i.e. exploration), or for decisions along the axis of goods you don't have a good model for (i.e. education).

comment by mathenjoyer · 2022-04-23T23:41:18.551Z · LW(p) · GW(p)

This is one of the most morally powerful things you have ever written. Thanks.

comment by Chris_Leong · 2022-04-24T10:24:11.634Z · LW(p) · GW(p)

Not that AirBnb didn't try. We communicated via SMS instead of AirBnb's website because AirBnb's website has an algorithm that scans our messages for keywords and punishes hosts it thinks did a poor job—regardless of the star rating a customer like me provides.


Where'd you find that out?

Replies from: lsusr
comment by lsusr · 2022-04-24T14:26:58.226Z · LW(p) · GW(p)

The host told me.

Replies from: Chris_Leong
comment by Chris_Leong · 2022-04-24T15:16:01.054Z · LW(p) · GW(p)

I wonder whether that's correct or an unfounded rumor.

comment by Jiro · 2022-04-25T18:00:13.690Z · LW(p) · GW(p)

Evil happens when you are separated from the pain you inflict upon other people.

No, no it doesn't.

Consider the trolley problem, where you have to hurt 1 person to save 5. Does it work better if you feel all the pain of the one person being run over by a trolley? You might argue that feeling their pain still serves the purpose of making sure you think carefully before deciding that sacrificing the 1 person really is necessary, but the problem with that reasoning is that pain is not well calibrated for getting people to make subtle, situational decisions. It's just "I can't stand this much pain, run from it".

You might further try to save the idea by suggesting that this only fails because we can't feel pain caused by inaction, but I can't believe that it would be good to feel pain caused by inaction--everyone who doesn't donate as much as he can afford to save people (and not just 10%, either) would be feeling horrible pain all the time.

You also get problems with the pain equivalent of utility monsters (in this case, beings who feel exceptionally pained at slight injuries) and people who feel pain at good things (like a religious person who feels pain because heretics exist).

Replies from: gbear605
comment by gbear605 · 2022-04-25T18:45:52.502Z · LW(p) · GW(p)

In most situations, even when there is a morally correct option, not choosing it does not make you evil. Perhaps it makes you a bad person (though then everyone is a bad person), but it doesn’t make you evil. Evil has a higher bar, where the effects are quite bad without an acceptable reason for doing them.

For those true acts of evil - things like murder or rape or genocide - I imagine that very few would happen if the actors really felt the pain that they were inflicting.

Pain monsters are a theoretical problem here, but I think the concept is still helpful.

Replies from: Jiro
comment by Jiro · 2022-04-26T03:02:39.316Z · LW(p) · GW(p)

Evil has a higher bar, where the effects are quite bad without an acceptable reason for doing them.

But the idea isn't selective. You don't get to say "selecting the one person in the trolley problem inflicts not-evil pain, so you don't feel it"--you feel the pain you inflict, whether it's evil-pain or not-evil pain.

Pain monsters are a theoretical problem here, but I think the concept is still helpful.

It's more than a theoretical problem. It's basically the same problem as standard utilitarianism has, except for "disutility" you substitute "pain". Assuming it includes emotional pain, pretty much every real-life utility monster is a pain monster. If someone works themselves up into a frenzy such that they feel real pain by having to be around Trump supporters, you have to make sure that the Trump supporters are all gone (unless Trump supporters can work themselves up into a frenzy too, and then you just feel horrible pain whichever side you take).

It also has the blissful ignorance problem, only worse. Someone might want to know unpleasant truths rather than be lied to, but if telling them the unpleasant truth inflicts pain, you're stuck lying to them.