Women and Effective Altruism
post by P. G. Keerthana Gopalakrishnan (p-g-keerthana-gopalakrishnan) · 2022-11-12T20:57:42.092Z · LW · GW · 15 comments
This is a link post for https://keerthanapg.com/random/ea-women/
A lot has been said about SBF/FTX/EA, but this coverage reminds me that it is time to talk about the toxicity of the culture within EA communities, especially as it relates to women.
EA circles, much like the group house in the Bahamas, are widely incestuous: people mix their work life (in EA cause areas), their often polyamorous love life, and their social life into one amalgamated mix without strict separations. This is the default status quo. It means that if you’re a reasonably attractive woman entering an EA community, you get a ton of sexual requests to join polycules, often from poly and partnered men. Some of these men control funding for projects and enjoy high status in EA communities, which means there are real downsides to refusing their sexual advances and pressure to say yes, especially if your career is in an EA cause area or is funded by them. There are also upsides, as reported by CoinDesk on Caroline Ellison. From experience it appears that a ‘no’, once said, is not enough for many men in EA. Having to keep replenishing that ‘no’ becomes annoying very fast, and it becomes harder to give informed consent when socializing in the presence of alcohol/psychedelics. It puts your safety at risk. From experience, EA as a community has very little respect for monogamy, and many men, often competing with each other, will persuade you to join polyamory using LessWrong style jedi mindtricks while they stand to benefit from the erosion of your boundaries.
So how do these men maintain polycules and find sexual novelty? EA meetups, of course. Several EA communities are grounds for predatory men in search of their nth polycule partner, looking to fill their “dance cards”. I have seen this in NYC EA circles; I have seen this in SF. I decided to stop socializing in EA circles a couple of months ago because of this toxicity; the benefits are not worth the uncovered downside risk. The power enjoyed by men who are predatory, the rate of occurrence, and the lack of visible pushback amount to tacit and somewhat widespread backing for this behaviour. My experience resonates with that of a few other women in SF I have spoken to. They have also met red-pilled, exploitative men in EA/rationalist circles. EA/rationalism and redpill fit like yin and yang. Akin to how EA is an optimization of altruism with “suboptimal” human tendencies like morality and empathy stripped from it, red pill is an optimized sexual strategy with the humanity of women stripped from it. You’ll also, surprisingly, encounter many women in EA who are red-pilled and manifest internalized misogyny. How to check if you’re one: if terms like SMV, hypergamy, etc. are part of your everyday vocabulary and thought processes, you might be affected. You’ll also encounter many women who are unhappy participants in polyamorous relationships; some of them really smart women who agree to be unhappy (dump him, sis).
Despite this culture, EA as a philosophy has a lot of good in it, and the community should fix this bug with some introspection. Now mind you, this is not a criticism of polyamory itself. If polyamorous love happens between consenting adults without favoritism in professional settings, all is well and good. But EA is an organization and community focused on a mission of altruism; it enjoys huge swathes of donor money and exerts socio-political influence. There are EA-aligned AI research labs, companies, and, until recently, crypto exchanges and trading firms. Tolerating predatory behaviour makes EA suboptimal in pursuit of its stated mission. It puts EA at risk of alienating women and others for reasons that have nothing to do with ideological differences. Retaining women is hard enough as it is for many objective-oriented but male-dominated circles in the Bay, and among them intelligent women have the most optionality and a lot of places to be where they’re wanted.
EA needs to implement a stricter code of conduct, in line with Title IX, and align on this code across EA group houses, social events, and communities. It also needs better processes for resolving and reporting sexual misconduct incidents - Julia Wise’s work is a beginning but is not nearly enough. Libertarian-style community mediation of sexual assault cases, common in EA communities, needs to be dumped in favour of police intervention, because it leads to gross mismanagement such as this or this.
Hoping for a correction and a better future.
15 comments
Comments sorted by top scores.
comment by mingyuan · 2022-11-13T18:05:24.637Z · LW(p) · GW(p)
One of my objections is similar to benjamincosman's — people not taking no for an answer in romantic/sexual contexts is a problem I've seen in people of all ages, races, cultural backgrounds, socioeconomic status, social status, and points on the autism spectrum. It was a big problem at both my urban public high school and my elite private college.
Yes power differentials make it worse, yes it's more of a problem in an environment as gender-imbalanced as EA or the wider Bay Area tech scene, and yes people who are striving to be moral should hold themselves to a higher standard. But trying to use the existence of these problems as an indictment of the community proves too much — I don't know of any community of any kind that successfully avoids them.
I am not opposed to an honest discussion of the gender issues in EA or rationality or the Bay Area as a whole or whatever other scene. I'm a woman and I care about this. But this post completely fails at "Aim to explain, not persuade". It uses inflammatory rhetoric, makes sweeping generalizations like "EA/rationalism and redpill fit like yin and yang", and lumps polyamory in with the problems in an apparent attempt to score points, when polyamory is something that many people practice in a way that's healthy, happy, and consensual for all parties involved.
I also take issue with the specific line
There are also upsides [to accepting the sexual advances of men in power], as reported by CoinDesk on Caroline Ellison.
This reads to me like you're implying that Caroline only got into a position of power because she was sleeping with Sam? That's internalized misogyny if I've ever heard it. Whatever her recent actions may have been, Caroline is an extremely smart person who was a successful trader at Jane Street before ever joining Alameda. When I heard she had become CEO, that made sense to me based on her experience, intelligence, and seniority at the company.
Maybe I am just feeling frustrated about everything lately and am taking it out on you, but come on. If you want people on LessWrong to respect what you're saying, at least try to write with epistemic honesty, instead of whatever this is.
comment by benjamincosman · 2022-11-13T02:35:35.233Z · LW(p) · GW(p)
EA is an optimization of altruism with “suboptimal” human tendencies like morality and empathy stripped from it
I so utterly disagree with this statement. Indeed I think one can almost summarize EA as the version of altruism that's been optimized for morality, instead of solely empathy (though it includes that too!).
comment by Raemon · 2022-11-12T21:25:51.674Z · LW(p) · GW(p)
I downvoted, not because I necessarily thought the topic was bad, but because I'd kinda prefer "EA community stuff" to live on the EA Forum (where I see you also posted). I realize the line between rationalist community and EA community is blurry sometimes, and you maybe could have written a similar post about rationalist communities. But I think there are some subtle distinctions in norms and culture that are better to not blur together more.
Replies from: Davidmanheim
↑ comment by Davidmanheim · 2022-11-16T10:10:05.703Z · LW(p) · GW(p)
...but the problems being highlighted are, if anything, much more specifically applicable to the rationalist side of the divide. So this seems wrong.
comment by benjamincosman · 2022-11-13T02:49:10.077Z · LW(p) · GW(p)
I am not at all doubting your unpleasant experiences with some EA men. But when I read this, one thing I wonder is how much this is actually correlated with EA, vs just being a 'some men everywhere are jerks' thing? (I am not at all condoning being a jerk - saying it might be common does not mean I'm also saying it's ok!) But e.g. I would not at all be surprised to find there are (to pick some arbitrary groups) some female doctors writing on their doctor forum that
From experience it appears that a ‘no’, once said, is not enough for many male doctors.
and that there are female chess players writing on their chess forum that
So how do these men find sexual novelty? Chess meetups, of course.
I do agree that the polyamory thing is definitely correlated with EA, but I'm not actually sure it's very relevant here? E.g. would it be any better if the jerks aggressively propositioning you were serial monogamist jerks rather than poly jerks?
To be clear though, I certainly don't have any evidence that it's not an EA-specific problem; I am legitimately asking for your (and others') thoughts on this.
Replies from: lahwran
↑ comment by the gears to ascension (lahwran) · 2022-11-13T10:22:03.139Z · LW(p) · GW(p)
I don't think it's an ea-specific problem, but ea is one of many groups that could improve percolation of coprotection patterns that can provide peer security against unsafe behaviors by getting better at secondhand amogus. I'm personally a fan of ...[... copies comment so far into metaphor.systems ...]
partial results of metaphor.systems query:
- instagov.com/wiki/Peer-to-Peer_Accountability_Enforcement <- this is a stub, perhaps could use some of these refs on it?
- valence-utilitarianism.com/posts/the-almighty-hive-will <- I would actually submit this as a counterexample of what not to do
- overcomingbias.com/2020/01/nickname-court.html <- this is a solid take I think, but meh
- contextualscience.org/heat_honorably_experiencing_anger_and_threat_protocol <- interesting
- slatestarcodex.com/2013/05/11/raikoth-symbolic-beads/ <- nope not this
- gordonbrander.com/pattern/soft-security <- interesting
- makopool.com/better_space_with_wots.html <- this one is one I already knew to be cool but not what I sought
- socialpatterns.adl.org/patterns/personal-bubble-setting/ <- huh straightforward description
- lesswrong.com/posts/dfszfhtkEDhoQ5nH6/trustworthy-computing <- nnnnmmaybe?
- openai.com/blog/amplifying-ai-training/
- theanarchistlibrary.org/library/crimethinc-what-is-security-culture <- hmmm this seems worth debating with, but dang, anarchists sure are a certain way; I am very similar to anarchism but a lot more construction-focused than tit-for-tat-destruction focused. as such, I tend to disagree with this article, I would instead argue more for radical transparency, and the problem then is how to make that actually work.
- neighborhoodanarchists.org/security-culture/ <- this is pretty much the same article.
- localcircles.org/ <- verrry interesting!
- intelligence.org/ <- nope heh
- squadbox.org/ <- dead link
as followup, I tried only using the end part of the query - similar results, I skipped dupes:
- interesting, but not quite what I'm looking for: en.wikipedia.org/wiki/Collective_efficacy
- kinda relevant i guess: en.wikipedia.org/wiki/Security_culture
- not amazed, but i guess it's not the worst: nnscommunities.org/strategies/group-violence-intervention-v3/
- huh, very interesting take on community inclusion currencies: groupcurrency.org/
- huh interesting: medium.com/capabul/grassroots-insurance-8b353a1670f6
- cool if it works: betterangels.github.io/buoy/
- hmm interesting but not what I was looking for: en.wikipedia.org/wiki/Credible_messenger_program
- interesting! very cool! agentofuser.com/groupdag/
- oh hi vitalik arxiv.org/abs/1809.06421
- verrryyy interesting en.wikipedia.org/wiki/Social_immunity
- this is literally the exact topic we were on, in fact! what an interesting result en.wikipedia.org/wiki/Circles_of_Support_and_Accountability
- uh this is several kinds of galaxy brain and probably relevant to ai safety in several possible ways, potentially both very positive and very negative depending on details of what universe we're in: www.grassland.network/
- oooh, this is very promising: www.protectivebehaviours.org/
- huh, interesting concept to know about en.wikipedia.org/wiki/Complex_contagion
- very interesting point, which is that random peer selection allows advantage of human natural coprotection to protect against agentic unfriendly behavior: www.schneier.com/blog/archives/2009/03/the_kindness_of.html
- very interesting concept, idk that its quite what I was looking for en.wikipedia.org/wiki/Civil_inattention
- cool as hell https://friendlysocieties.org/en/index.html
comment by Stuart_Armstrong · 2022-11-18T14:16:40.094Z · LW(p) · GW(p)
It was good that this post was written and seen.
I also agree with some of the comments that it wasn't up to usual EA/LessWrong standards. But those standards could be used as excuses to downvote uncomfortable topics. I'd like to see a well-crafted women-in-EA post, and see whether it gets downvoted or not.
comment by Viliam · 2022-11-13T21:07:31.975Z · LW(p) · GW(p)
EA needs to implement a stricter code of conduct, in line with Title IX, and align on this code across EA group houses, social events, and communities. It also needs better processes for resolving and reporting sexual misconduct incidents - Julia Wise’s work is a beginning but is not nearly enough. Libertarian-style community mediation of sexual assault cases, common in EA communities, needs to be dumped in favour of police intervention, because it leads to gross mismanagement such as this or this.
I am confused about the meaning of this paragraph. The last sentence sounds to me like a suggestion to use professional intervention (with proper training and legal power), instead of an ad-hoc solution designed by the community. Yet the first sentence sounds like a call to design such ad-hoc solution.
comment by MondSemmel · 2022-11-16T11:39:28.190Z · LW(p) · GW(p)
The epistemic standard for LW posts is higher than this. The post doesn't adhere to any of the default comment guidelines:
- Aim to explain, not persuade
- Try to offer concrete models and predictions
- If you disagree, try getting curious about what your partner is thinking
- Don't be afraid to say 'oops' and change your mind
Separately, to repost my comment [LW(p) · GW(p)] from another thread which was cross-posted from the EA forum:
I think the main issue here is that Less Wrong is not Effective Altruism, and that many (at a guess, most) LW members are not affiliated with EA or don't consider themselves EAs. So from that perspective, while this post makes sense in the EA forum, it makes relatively little sense on LW, and to me looks roughly like being asked to endorse or disavow some politician X. (And if I extend the analogy, it's inevitably about a US politician even though I live in another country.)
So this specific EA forum post is just a poor fit for reposting on LW without a complete rewrite.
In this particular case, the post does mention "LessWrong style jedi mindtricks", but as it's fundamentally confused about what EA is ("Akin to how EA is an optimization of altruism with “suboptimal” human tendencies like morality and empathy stripped from it" - what is it even supposed to mean to have altruism without morality?), I'm very skeptical that it accurately attributes whatever harm was done to LW content.
And separately, I'm just tired of the pattern of posts accusing community X of having a sexism/racism/whatever problem, without making the slightest effort to argue that community X in fact fares worse on those dimensions than whatever an appropriate reference class would be.
Once again, if there's a version of this post that's epistemically sound and doesn't require me to trust the author's accusations on blind faith, I'd be interested to read that.
comment by the gears to ascension (lahwran) · 2022-11-12T22:27:32.837Z · LW(p) · GW(p)
strong upvote: this seems like a great post on a difficult topic that a lot of people are going to feel a little bit touchy about and therefore downvote unnecessarily. I am personally enthusiastic about the idea of poly being a thing near my social circles, but I also agree that there are a lot of men and women who are not great at boundaries and who push on boundaries too much [edit: convinced by replies that this clause is wrong: and who assume that dating is ok in professional contexts]. I suspect that a fair portion of them are intentionally agentic about pushing on boundaries and another, larger portion are unaware of how boundaries normally work; there have been previous reports of people engaging in agentically bad sexual behavior.
however, I would contest the claim that police are the best solution to this; they have a bad track record of being helpful to communities. while I would not completely write them off as useless, I would say that they have a track record of simply not doing much, and of requiring a lot of effort even to get them to do that little. in this respect I would not say the police are that different from any other form of community safety group, other than the fact that they also carry weapons, and so invoking them means inviting people who are used to being on the wrong end of weapons to try to deal with a sensitive situation.
Replies from: vanessa-kosoy
↑ comment by Vanessa Kosoy (vanessa-kosoy) · 2022-11-13T07:49:23.994Z · LW(p) · GW(p)
I want to push back on the implication that dating is not okay in a professional context. I agree that when there's a significant power differential (e.g. between a person and their boss), this is a serious opening for abuse and must be avoided. And ofc if someone says "no" then don't try again. However, a professional context doesn't automatically imply a power differential. Moreover, professional contexts are the most convenient way to run into like-minded people, and just to meet people in general. IME, when you have few professional interactions with people in your geographic area, it's very easy to become isolated and have few opportunities for romantic relationships or even just platonic friendships. So, a blanket ban on dating in professional contexts[1] really seems like throwing the baby out with the bathwater.
(As an aside, the OP sounds to me like pure propaganda: "EA is an optimization of altruism with “suboptimal” human tendencies like morality and empathy stripped from it" really?!)
[1] Btw, AFAICT it's an American invention, and not some kind of global universal. Which ofc doesn't mean it's wrong; just be aware of possible bias.
↑ comment by Viliam · 2022-11-13T19:28:51.527Z · LW(p) · GW(p)
I agree that the article is exaggerated. It makes FTX and EA sound like synonyms. I think the common-sense response to "socializing in the presence of psychedelics" with strangers is to say no. Etc.
That said, it seems to me that you trivialize the issue of power differentials. Mere "do not date your boss" is not sufficient, because:
1) Sometimes people get promoted, and what started as dating or flirting with your equal may suddenly become a relationship with your boss.
2) In situations with complex networks of relationships (people dating each other, polycules, group houses), the only person in a formal position of power may be your boss, but there are people with various kinds of informal power, such as "a lover of your boss", "a roommate of your boss", etc.
Also, the dynamics of dating at the workplace are different if you have e.g. 7 people, where A and B are a couple, C and D are a couple, and E, F, G are dating outside of the workplace (or not at all), compared to a situation where A, B, C, D, E, and F are a polycule, and G is not interested in dating any of them but keeps getting all kinds of hints.
It seems to me that many people like the idea of having a great relationship at the workplace, but hate getting unwanted attention or getting involved in other people's drama. Also, sex may become a factor in office politics, which some people enjoy, but others hate with a burning passion.
(That said, I believe you can be an EA without living in a group house and having group sex with other EAs.)
Replies from: vanessa-kosoy
↑ comment by Vanessa Kosoy (vanessa-kosoy) · 2022-11-14T07:51:24.339Z · LW(p) · GW(p)
If Alice is dating Bob and Alice is promoted to become Bob's boss, then Alice should refuse the promotion, or everyone involved should agree that Bob moves to a different department, or some other solution along these lines. And, yes, informal positions of power are a thing that should be watched out for. I don't think I'm trivializing; I just feel there's a reasonable trade-off point, and the norms associated with the American left (which is obviously very influential in this community) went too far to one end of the spectrum.
Now, when most people in a workplace are dating each other... I don't know, this is too far outside my experience for me to have an informed opinion. I can believe this is a terrible idea, or that it's somehow manageable if done right. I think this usually doesn't happen organically, but I have no experience with working in heavily-poly / heavily-EA orgs; maybe that's different.
Notice that "the gears to ascension" wrote "in professional contexts", not just "while working at the same place". That might be interpreted to mean things like "don't date people who you sometimes see at professional conferences", and AFAICT some people actually endorse norms along those lines. And I think that's going way too far.
Replies from: Viliam
↑ comment by Viliam · 2022-11-14T23:00:04.251Z · LW(p) · GW(p)
I agree. If it's not the same workplace, it seems generally okay to me.
There might be some special cases where I would say otherwise. But I believe most people would not object to two plumbers dating, or two software developers, or two teachers...
comment by M. Y. Zuo · 2022-11-14T23:15:22.528Z · LW(p) · GW(p)
EA needs to implement a stricter code of conduct, in line with Title IX, and align on this code across EA group houses, social events, and communities. It also needs better processes for resolving and reporting sexual misconduct incidents - Julia Wise’s work is a beginning but is not nearly enough. Libertarian-style community mediation of sexual assault cases, common in EA communities, needs to be dumped in favour of police intervention, because it leads to gross mismanagement such as this or this.
The post was mostly plausible until this part.
This seems like advocating for establishing an entire bureaucratic system to regulate behaviour?
I’m not part of the EA scene, and I do agree there are significant numbers of associated shady folks - an estimate that has likely increased for everyone, given recent events.
Yet this proposal seems to be inviting even more moral hazards?