[LINK] EA Has A Lying Problem
post by Benquo · 2017-01-11T22:31:01.597Z · LW · GW · Legacy · 34 comments
This is a link post for https://srconstantin.wordpress.com/2017/01/11/ea-has-a-lying-problem/
34 comments
Comments sorted by top scores.
comment by Raemon · 2017-01-12T18:48:36.712Z · LW(p) · GW(p)
Note: there are also discussions of this taking place on the Effective Altruism Forum, which I think makes more sense as the central repository for discourse.
http://effective-altruism.com/ea/169/a_response_to_ea_has_a_lying_problem/#comments
comment by siIver · 2017-01-12T01:46:47.459Z · LW(p) · GW(p)
I'm pretty split on this. I found the quotes from Ben Todd and Robert Wiblin to be quite harmless, but the quotes from Jacy Reese to be quite bad. I don't think it's possible to judge the scope of the problem discussed here based on the post alone. In either case, I think the effort to hold EA to high standards is good.
comment by Gurkenglas · 2017-01-13T21:52:36.190Z · LW(p) · GW(p)
Relevant sequence: Ends Don't Justify Means (Among Humans)
comment by Davidmanheim · 2017-01-12T21:01:54.344Z · LW(p) · GW(p)
This seems reasonable, but the interpretations of the quotations seem, in some cases, to be extremely uncharitable.
Ben's statement could easily be construed to mean: run your criticism by us first, and we'll point you to where it's been discussed - because otherwise we'd probably not bother responding independently to all the various public criticisms, as it wastes our time with a repetitive task.
The Robert Wiblin quote is taken out of context, and the comparison to marriage makes it clear that he's not advocating "you can quit any time you feel like it".
The Reese issue is more complex, and potentially worse, and I won't address it.
And Gleb's missteps are mostly indefensible, but this article seems to question his motives unfairly. He seems to be trying (not always well) to go down a path that he may still think is ultimately helpful. (I'd advise him to reread How to Actually Change Your Mind, and hope he can make a major change.) Will and others have done enough to disavow him; this isn't helpful. So I'm not defending him, but the attack here seems simply mean-spirited, and ultimately is exactly the type of time-wasting circular firing squad that Ben's earlier quotes were trying to get people to avoid.
Edit: All that said, the post pointed out an important issue, and at the very least, the critique of using the number of GWWC pledges as a metric which is distorting behavior is something I'm fully in agreement with Sarah on.
comment by hairyfigment · 2017-01-12T01:11:46.905Z · LW(p) · GW(p)
Another note I forgot to add: the first quote, about criticism, sounds like Ben Todd being extremely open and honest regarding his motives.
Replies from: Benquo
↑ comment by Benquo · 2017-01-12T01:53:18.073Z · LW(p) · GW(p)
Well, yes.
I think it's a bad motive and one that leads towards less openness and honesty, but Ben Todd personally is being very open and honest about it, which is right and virtuous and says good things about him as a human being, and his intentions. I think this gives things like EA a chance at avoiding various traps that we'd otherwise fall into for sure - but it's not a get-out-of-jail-free card.
Replies from: hairyfigment
↑ comment by hairyfigment · 2017-01-12T06:15:52.437Z · LW(p) · GW(p)
I'm talking here about the linked post. The author's first example shows the exact opposite of what she said she would show. She only gives one example of something that she called a pattern, so that's one person saying they should consider dishonesty and another person doing the opposite.
If you think there's a version of her argument that is not total crap, I suggest you write it or at least sketch it out.
Replies from: Benquo
↑ comment by Benquo · 2017-01-12T09:12:22.736Z · LW(p) · GW(p)
Holding criticism to a higher standard than praise discourages people from calling out misrepresentations, which lowers the costs to liars of lying. I'd be surprised if Ben Todd were deliberately trying to clear a path for lies, but that's the direction things like that point.
Replies from: ChristianKl
↑ comment by ChristianKl · 2017-01-13T15:21:20.913Z · LW(p) · GW(p)
Low-quality praise is easily ignored without much effect. Low-quality criticism, on the other hand, is more likely to have effects.
It's worthwhile to go for quantity of praise while focusing on quality of criticism.
comment by ingive · 2017-01-11T22:44:03.396Z · LW(p) · GW(p)
The writer and those in the article seem to hint that lying is ineffective (in the long run). Then it is a matter of efficiency, not of lying or integrity. But it appears that it is not, and this is simply a rationalization: cognitive dissonance between "not lying/integrity" and "efficiency".
comment by bogus · 2017-01-12T23:30:07.472Z · LW(p) · GW(p)
Nice article. This sure puts the earlier criticism of Tyro Gfvchefxl in perspective - I think on the whole, Tyro's attitude has been a lot more honest than what's now surfacing from these other orgs. The ideas that "internal critique is bad for the organization, and who cares about critiques when we're getting pledges anyway" and "thinking that pledges should be taken seriously is just autism (but only when we say so!)" are just terrible and could easily backfire politically on the movement as a whole. Needless to say, that would lead to a significant loss of utility.
comment by Gleb_Tsipursky · 2017-01-13T12:25:07.015Z · LW(p) · GW(p)
Sarah's post highlights some of the essential tensions at the heart of Effective Altruism.
Do we care about "doing the most good that we can" or "being as transparent and honest as we can"? These are two different value sets. They will sometimes overlap, and in other cases will not.
And please don't say that "we do the most good that we can by being as transparent and honest as we can" or that "being as transparent and honest as we can" is best in the long term. Just don't. You're simply lying to yourself and to everyone else if you say that. If you can't imagine a scenario where "doing the most good that we can" or "being as transparent and honest as we can" are opposed, you've just suffered from a failure mode by flinching away from the truth.
So when push comes to shove, which one do we prioritize? When we have to throw the switch and have the trolley crush either "doing the most good" or "being as transparent and honest as we can," which do we choose?
For a toy example, say you are talking to your billionaire uncle on his deathbed and trying to convince him to leave money to AMF instead of his current favorite charity, the local art museum. You know he would respond better if you exaggerate the impact of AMF. Would you do so, whether lying by omission or in any other way, in order to get much more money for AMF, given that no one else would find out about this situation? What about if you know that other family members are standing in the wings and ready to use all sorts of lies to advocate for their favorite charities?
If you do not lie, that's fine, but don't pretend that you care about doing the most good, please. Just don't. You care about being as transparent and honest as possible over doing the most good.
If you do lie to your uncle, then you do care about doing the most good. However, you should consider at what price point you will not lie - at this point, we're just haggling.
The people quoted in Sarah's post (including myself) all highlight how doing the most good sometimes involves not being as transparent and honest as we can. Different people have different price points, that's all. We're all willing to bite the bullet and sometimes send that trolley over transparency and honesty - whether by questioning the value of public criticism, as Ben does, or appealing to emotions, as Rob does, or using intuition as evidence, as Jacy does - for the sake of what we believe is the most good.
As a movement, EA has a big problem with believing that ends never justify the means. Yes, sometimes ends do justify the means - at least if we care about doing the most good. We can debate whether we are mistaken about the ends not justifying the means, but using insufficient means to accomplish the ends is just as bad as using excessive means to get to the ends. If we are truly serious about doing as much good as possible, we should let our end goal be the North Star, and work backward from there, as opposed to hobbling ourselves by preconceived notions of "intellectual rigor" at the cost of doing the most good.
Replies from: fubarobfusco, entirelyuseless, bogus
↑ comment by fubarobfusco · 2017-01-13T16:40:27.085Z · LW(p) · GW(p)
"I got caught lying — again — so now I'm going to tell you why lying is actually better than telling the truth."
Seriously ... just stop already.
Replies from: Gleb_Tsipursky
↑ comment by Gleb_Tsipursky · 2017-01-13T18:54:58.212Z · LW(p) · GW(p)
You seem to be suggesting that I had previously advocated being as transparent as possible. On the contrary - I have long advocated for the most effective communication techniques to achieve EA ends.
Replies from: Lumifer
↑ comment by Lumifer · 2017-01-13T20:13:43.411Z · LW(p) · GW(p)
Why should anyone believe you?
Since to you the ends justify the means, why should we accept that your ends are EA ends? You might well be lying about it and by your set of criteria that's fine.
Let's consider the hypothesis that what you want is money and social status. These ends would justify the means of setting up an "EA" charity and collecting donations from gullible people, wouldn't they? It's just what you believe to be an effective method of reaching your goals. Since things like integrity and honesty are subservient to reaching your goals, there is no problem here, is there?
Replies from: Gurkenglas, Gleb_Tsipursky
↑ comment by Gurkenglas · 2017-01-13T21:43:10.452Z · LW(p) · GW(p)
To figure out the truth, we must not punish people for advocating a position, or we might end up in a situation where everyone sees a taboo truth and is afraid to speak it.
That someone advocates lying is evidence that they would lie and should be excluded. Now take that evidence and throw it out the window, because we need to figure out whether lying is actually the right thing to do, and for that we need to listen to all the sides. In fact, Gleb should be rewarded as compensation for the s̶u̶b̶c̶o̶n̶s̶c̶i̶o̶u̶s̶ trust of his peers that he sacrificed to help this discussion.
Replies from: hairyfigment, Lumifer
↑ comment by hairyfigment · 2017-01-14T19:39:04.432Z · LW(p) · GW(p)
This is wholly irrelevant, because we've already caught Gleb lying many times. His comment sacrifices nothing, and in fact he's likely posting it to excuse his crimes (the smart money says he's lying about something in the process).
Your point does apply to the OP trying to smear her first example for practicing radical honesty. This is one of the points I tried to make earlier.
↑ comment by Lumifer · 2017-01-14T17:15:31.887Z · LW(p) · GW(p)
I don't think anyone here is in a position to "punish" Gleb.
However speech has consequences. In particular, consequences with respect to reputation, credibility, and trust. This is as it should be.
Replies from: ingive
↑ comment by ingive · 2017-01-14T17:45:11.299Z · LW(p) · GW(p)
Then it's a question of whether this speech is efficient or not, not of whether lying is. Everyone involved should be focused on effective actions with a positive expected value. I'm not very well read in this area, but if I understand correctly, the ends justify the means, yet it might be an inefficient action to say so publicly? Thus some other people might react to it, because it is an effective action to do so.
↑ comment by Gleb_Tsipursky · 2017-01-14T19:11:58.963Z · LW(p) · GW(p)
I have plenty of social status, and sufficient money, as a professor. I don't need any more personally. In fact, I've donated about $38K to charity over the last 2 years. My goal is EA ends. You can choose to believe me or not :-)
↑ comment by entirelyuseless · 2017-01-13T14:53:23.746Z · LW(p) · GW(p)
"Would you do so, whether lying by omission or in any other way, in order to get much more money for AMF, given that no one else would find out about this situation?"
No, I would not. Because if I would, they would find out about the situation, not by investigating those facts, but by checking my comments on Less Wrong when I said I would do that. Or in other words, if you ever are talking to a billionaire uncle in real life, they may well have read your comments, and so there will be no chance of persuading them to do what you want even if you refrain from lying.
You are very, very wrong here, and it should be evident that by your own standards, if you're right, you should keep your opinions to yourself and pretend to be in favor of transparency and honesty.
Replies from: gjm, Gleb_Tsipursky
↑ comment by gjm · 2017-01-13T16:53:36.105Z · LW(p) · GW(p)
by your own standards, if you're right, you should keep your opinions to yourself and pretend to be in favor of transparency and honesty
I think in Gleb's case it may be rather too late for him to get anyone to believe that he is in favour of transparency and honesty.
Replies from: Gleb_Tsipursky
↑ comment by Gleb_Tsipursky · 2017-01-13T18:57:54.676Z · LW(p) · GW(p)
Never claimed to be - I have long argued for the most effective communication techniques to promote EA ends.
↑ comment by Gleb_Tsipursky · 2017-01-13T18:57:12.069Z · LW(p) · GW(p)
I don't believe I am wrong here. My rich uncle doesn't read Less Wrong. However, those who have rich uncles do read Less Wrong. If I can sway even a single individual to communicate effectively, as opposed to maximizing transparency, in swaying people to give money effectively, I'll be glad to have done so.
↑ comment by bogus · 2017-01-13T16:26:44.880Z · LW(p) · GW(p)
Do we care about "doing the most good that we can" or "being as transparent and honest as we can"? These are two different value sets. They will sometimes overlap, and in other cases will not.
The EA movement does not really have to be "as transparent and honest as we can" - that's an unrealistic standard for any real-world organization, for reasons that have very little to do with any sort of 'lying' or 'dishonesty'. It only has to be markedly better than the bulk of the charitable-aid industry, which is not a very high bar at all. That still does not justify many of the things reported in Sarah's article (I've tried to explain my view about these in a different comment on this post). It may be true that "we're just haggling over the price" at this point, but I think I can tell when something is a bad deal.
For a toy example, say you are talking to your billionaire uncle on his deathbed and trying to convince him to leave money to AMF instead of his current favorite charity, the local art museum. You know he would respond better if you exaggerate the impact of AMF. Would you do so, whether lying by omission or in any other way, in order to get much more money for AMF
I would lie, because my billionaire uncle is smart enough to discount some things I say as exaggerations, and to do anything else might just be too confusing given the short timeframe. :-P If you think the EA movement is in a similar position, by all means feel free to advocate for the same choice!
comment by hairyfigment · 2017-01-12T01:04:45.312Z · LW(p) · GW(p)
She does eventually give an example of what she says she's talking about - one example from Facebook, when she claimed to be seeing a pattern in many statements. Before that she objects to the standard use of the English word "promise," in exactly the way we would expect from an autistic person who has no ability to understand normal humans. Of course this is also consistent with a dishonest writer trying to manipulate autistic readers for some reason. I assume she will welcome this criticism.
(Seriously, I objected to her Ra post because the last thing humanity needs is more demonology; but even I didn't expect her to urge "mistrusting Something that speaks through them," like they're actually the pawns of demons. "Something" is very wrong with this post.)
The presence of a charlatan like Gleb around EA is indeed disturbing. I seem to recall people suggesting they were slow to condemn him because EA people need data to believe anything, and lack any central authority who could declare him anathema.
Replies from: Benquo, sarahconstantin
↑ comment by Benquo · 2017-01-12T07:24:12.871Z · LW(p) · GW(p)
I think that if you look at the actual epistemic content of this kind of demonology, it just cashes out to not committing the fundamental attribution error:
There are bad systems of behavior and thought that don't reflect the intrinsic badness of the individuals who express them, but rather broader social dynamics. There's selection pressure for social dynamics that justify, defend, and propagate themselves, so sometimes it can be intuitive to anthropomorphize them. A powerful agent for evil that can control otherwise good people's actions sounds like a demon.
↑ comment by sarahconstantin · 2017-01-12T07:01:10.668Z · LW(p) · GW(p)
Hi, I wrote the post.
I think that it's actually fine for me to use spooky/mystical language to describe human behavior. I'm trying to hint very broadly at subjective impressions, and provoke readers to see the same things I do. I have the rough sense of something spooky going on in the zeitgeist, and I want to evoke that spooky feeling in my readers, so that some of them might say "I see it too." That's exactly the right use case for magical thinking.
There are degrees of certainty in making accusations. If you have hard evidence that somebody did a seriously bad thing, then that's one kind of situation. I'm not making any of those kinds of claims. (There was hard evidence that Gleb did a seriously bad thing, but that's not original to me, and that was dealt with before.)
What I'm doing is more like the sort of thing that goes on when, say, a journalist/blogger might accuse EA of being a movement full of "nerdy white males" and insinuating that this is associated with certain stereotypical biases, and maybe pulling a quote or two to support the claim. It's a "hey, this smells funny" kind of deal. It's about pattern-matching and raising suspicion and smearing it around a bit.
Replies from: hairyfigment
↑ comment by hairyfigment · 2017-01-12T07:43:44.508Z · LW(p) · GW(p)
I do not think it's fine. I think you're poisoning the discourse and should stop doing it, as indeed should the blogger in your example if there isn't more to go on. Is your last sentence some kind of parody, or an actual defense of the reason our country is broken?
Replies from: Davidmanheim, sarahconstantin, sarahconstantin, sarahconstantin, sarahconstantin
↑ comment by Davidmanheim · 2017-01-12T21:03:49.900Z · LW(p) · GW(p)
This seems to be mean-spirited, and poisons the discourse in exactly the way it attacks Sarah for - is that irony intentional?
↑ comment by sarahconstantin · 2017-01-12T08:21:38.487Z · LW(p) · GW(p)
"A Spectre is haunting Europe" is a perfectly good way to begin a manifesto. Spooks are, and always have been, part of the business of changing people's behavior.
↑ comment by sarahconstantin · 2017-01-12T08:23:10.338Z · LW(p) · GW(p)
I do not have to fit your standards of discourse when I am doing battle against spooks.
↑ comment by sarahconstantin · 2017-01-12T09:00:04.815Z · LW(p) · GW(p)
In the interests of honesty: I have difficulties with a lot of the premises of EA as well as tactics. Accusing them of bad tactics is a method that has a lot more purchase than challenging the premises. Criticism of tactics can reach a wide variety of people; criticism of premises involves making me look like a fool or a villain.
↑ comment by sarahconstantin · 2017-01-12T07:59:43.264Z · LW(p) · GW(p)
I am not making a parody. I am doing a thing. I want to do this thing.