Wasn't doing that.
Thank you so much for understanding, and for sticking with my lengthy comment long enough to understand it.
I reject a norm that I ought to be epistemically brave and criticise the piece in any detail.
Fine. It's fine to say nothing and just downvote, sure. Also fine to have said, "this looks like it should go on the EA Forum too".
It... did find a better reception on the EA Forum. I do think there is a difference between what is appropriate to post here and what is appropriate to post on the EA Forum.
Sigh. I suspect you have missed my point, or you misunderstand what the EA Forum really is. The EA Forum is where people can ask questions and discuss effective causes, interventions, and ways of being. The latter is similar to LW rationality but with a broader scope that includes moral behavior. If something does not have the expected effect of increasing good done and solving problems, it should not be posted on the EA Forum. Yes, the EA Forum has community news, but so does LW; LW also has community meetups and posts discussing community-building tactics. And LW is where some of the original discussion about the cases in the Bloomberg piece occurred (re: Vassar, Brent Dill, CFAR).
So what is not fine is to imply this clearly rationalist-focused piece goes there and not here. If it is appropriate for the EA Forum, it is appropriate for here. Period. Fine, it is getting more interest there. But just because it is getting a better reception on the EA Forum doesn't mean it should be. That is still not an argument that it should go there, if you think it doesn't belong on LW (having factored in the paragraph above this one) and you care about the epistemics of the EA Forum (and it would be very rude not to!).
Yes, the EA Forum has a problem with being more drama-hungry and shaming. But this is a problem, not something LW users should lean into by throwing the forum under the bus. That a piece is one the forum might "go for" doesn't mean it belongs there: that the audience might go for it does not imply it is in line with the actual purpose of the EA Forum, which is to increase effectiveness. For example: theoretically, I could post a salacious three-page story full of misquotes and false claims, lacking context and scale, and theoretically the EA Forum audience might make that salacious post of mine the most upvoted and most discussed post in the forum's history. But that doesn't mean my three-page story should have gone on the EA Forum. Even if people might drink it up, it would not belong there, by definition of being a bad, irrelevant piece.
And again, if this post is true and relevant, it is more relevant to the rationality community. Flat out. It might belong in both places, but it definitely does not belong only on the EA Forum.
I apologize for my bluntness, but come on. Some of us are trying to keep the EA Forum from falling into chaos and undue self-flagellation, and to make it the best forum it can be. Maybe you never hung out there, but it used to be epic, and community drama has significantly worsened it. It's honestly hard to convey how frustrating it is to see an LW user try to throw the EA Forum under the bus and typecast it, when EA Forum users and mods are working hard to undo the community overfocus that never should have happened there. And it honestly reads like you don't respect the EA Forum and are essentially treating it with snobbery: that the LW audience is either too epistemically sound for this piece, or just too good for it... but not the EA Forum audience, who should handle all the community gruntwork you don't like, I guess. (Newsflash: we don't like it either.)
A final, different point: have you considered that perhaps some of the reason the EA Forum can be typecast as so "community focused" is that users here on LW happily throw all the drama they aren't willing to handle head-on over to the EA Forum? To the extent this Bloomberg piece has to do with the rationality community (almost totally), the rationality community should 100% own it as part of their past and acknowledge it (which it looks like other LW users are somewhat doing), so that EAs are not unfairly forced to host and handle rationalist-caused problems (either the resulting valid concerns or the resulting invalid PR disasters) again.
As an EA, please don't try and pin this on us. The claims are more relevant to the rationality community than the EA community. On the other hand, if you think the piece is irrelevant or poorly written and doesn't belong anywhere, then be epistemically brave and say that. You are totally allowed to think and express that, but don't try and push it off on the EA Forum. If this piece doesn't belong here, it doesn't belong there.
(fwiw as things stand it's already crossposted there!)
Maybe "troubling"
I don't think it does invoke a political frame if you use it right, but perhaps I have too much confidence in how I've used the term.
I use problematic
Because it is inappropriate to intentionally be doing things that bring you sexual arousal at work:
- To be in a sexually aroused state is very distracting, and she intentionally chose to boost that state at least somewhat. Not good for workplace productivity.
- It is also a bit threatening, given he can statistically assume she has sexual interest toward men and is sitting right behind him in an aroused state which she, in some sense, intentionally chose to be in. Whether or not she would plausibly do anything sexual, it's totally normal that it would raise the hairs on the back of your neck somewhat, because a person who watches porn at work has already sorted themself into the cohort of "people who do weird and unpredictable sexual things like it's normal, likely on impulse".
- Even if there is still no chance of them doing something sexual, and they are able to get back on task with zero distraction, it doesn't bode well for your choice of hire that you chose someone so impulsive and socially strange. Like, "What else is in store today..? I feel kinda unsettled" is a totally normal gut instinct to have after that.
- It kept the burden on him to follow social convention, and forced him to navigate her defection socially, with no help, while she herself defected from convention. In a workplace, if you want to break social convention, you at least give a heads-up. While watching porn at work was too egregious for a heads-up to excuse, I'm just saying this more neutral kind of workplace rudeness still applies. A more neutral example: an employee shows up to an office job in Burner clothing, which is highly abnormal. Now the boss has to wonder, "Should I address this? Or is this a one-time thing? How do I address it if I want to? Is it actually morally okay, and I'm being a prick and out of touch? Damn, IDK, but why did [employee] even put me in this weird position? We are both supposed to be here for the company's productivity, after all..."
How do you think it would compare to being on a constant screensharing call with a friend/remote assistant, so they can always check your screen? Say you know for a fact that your screen is up at all times on their extra monitor. Been considering doing this.
(I don't know how screensharing works from an ultra-wide display like yours, though, which I also want, to whatever possibly-crappy extra monitor a friend might have. Fingers crossed it would display the whole thing rather than cropping it. I suppose you could always buy a big screen for the remote assistant, though.)
Sure. Maybe it is cuz I am more in EA than LW that this all seems normal to me. There are frequent retreats and workshops for different career niches and groups, including student groups. Plus there are the EAG(x) conferences, which, as tidy weekend events people fly in and get lodging reimbursed for, I'd say are comparable to this; and they happen at a scale 10-50x this one, which has probably shaped my perception that this one is well within the bounds of normal.
Examples: I am checking this on mobile, so sorry for the formatting and the lack of precise examples. But you can use this link to search for the terms "workshop", "retreat", "bootcamp", and "weekend" to get an idea of how popular these weekend retreats have become. I think sharing this looks like a cop-out on my end, but it is almost better than giving a few concrete examples, because a few concrete examples still wouldn't really prove the 30x ratio I mentioned above:
https://forum.effectivealtruism.org/search
Also, financially: the norm these days seems to be that weekend workshops are either free, or the organizers may request some amount while making it overt that people are not turned away for lack of ability to pay. This standard is kinda set by EAGs now, I guess.
Also, just some personal thoughts as someone who plans EA events/workshops myself (relatively new to it): I think you just want more applicants, so you don't want to create friction or waste an ask on finances. You want absolute freedom and discretion to filter on the things you deem important, and only those things, which in this case will not be willingness to pay. This holds true in many different situations. E.g., if you are offering something for beginners/students, you'd lose a substantial number due to lack of ability to pay; while if you are offering something to prestigious or experienced people, they have a lot competing for their time that probably fits their standard career path better. You can signal respect for this struggle, and also signal that you are real (without having to waste a bunch of ops time on a beautiful website and all the other stuff that normies usually use to signal), by covering all expenses. And if you are offering to the average Joe, who would be happy to pay something and has time, a mandatory charge still filters some people out, because attending is laborious enough already, and putting in your credit card info is always gonna be the straw that breaks the camel's back for somebody.
I don't think the FTX funding situation changes this. When workshops do happen, they will probably keep being mostly all-expenses-paid. EA still has billions in funding available via Open Phil, and the EA Infrastructure Fund still has private donors.
I just want to say that this didn't raise alarm bells as expensive or weird for me. It is last minute, but arranging things around the holidays sucks: basically, they can either rush to get it done or wait until mid-January at the earliest. And once organizers already know they want to do it, doing it earlier means the value of information becomes usable sooner too (e.g., they can repeat it sooner, or put people in contact with opportunities that crop up in January), so it is most likely worth a good portion of dough to get it done ASAP.
Also, the referrals are basically no expense in the grand scheme. Assume they have 40 attendees and 80% of the attendees are found via referral: that comes to $3,200[1]. If the rest of the event is worth doing this last minute, it is DEFINITELY worth that much to expand their short-notice applicant pool and let readers consider who is well suited. It could, say, double the quality of the average accepted person. It's basically the best and most affordable last-minute marketing you can get.
Nothing raised a scammy flag for me. It's what I expect a nontrivial fraction of last-minute workshops to look like (10%?), and I expect last-minute workshops to be much more common than scams in this community (maybe 30x as common?). I also think only a rare scam would use a post like this one (maybe 2% of scammy/net-negative/risky community things would), as there are better and cheaper ways to scam or brainwash people. Crunch those numbers and this post looks to me about 150x as likely to be tied to a good/legit workshop as a bad one (rough arithmetic sketched in code after the footnotes). [2]
That said, I guess it is good to realize how nervous people might be about risk? I'm just surprised if people did find this scammy.
[1] In a comment, they've clarified they expect about 20 people, so it would actually be even less, $1,600; 40 was just my initial estimate.
[2] Rough numbers, but yeah, I think you have to be really uncharitable to the LW community to get odds worse than 49:1 here, so it's definitely worth applying, and hardly even worth refining the ass-numbers more carefully given that initial gap between options.
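A minimal sketch of that arithmetic in Python, for anyone who wants to poke at the numbers. The $100-per-referral bounty is my assumption, back-inferred from the $3,200 figure (40 × 80% × $100); everything else is the same ass-numbers as above.

```python
# Back-of-the-envelope numbers from the comment above.
# ASSUMPTION: a $100-per-referral bounty, inferred from 40 * 0.8 * 100 = 3,200;
# the actual post may use a different figure.

def referral_cost(attendees: int, referred_frac: float, bounty: float = 100.0) -> float:
    """Total referral payout for a given attendee count."""
    return attendees * referred_frac * bounty

print(referral_cost(40, 0.8))  # 3200.0 -- my initial estimate
print(referral_cost(20, 0.8))  # 1600.0 -- with the ~20 attendees organizers expect

# Odds that a post like this comes from a legit workshop rather than a scam:
#   0.10 -- fraction of last-minute workshops that would look like this post
#   30   -- prior odds: last-minute workshops vs. scams in this community
#   0.02 -- fraction of scammy/net-negative things that would look like this post
odds = (0.10 * 30) / (1 * 0.02)
print(f"{odds:.0f}:1")  # 150:1
```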
Any chance you will record this? I think the section on getting stuff done would be especially helpful. Makes it semi-easily replicable by people in other places too, like local EA or LW groups.
Wait actually, this is interesting. Because I bet GPT-4 could probably convince many (most?) people to brush their own teeth.
Even with actuators, you need a compliant human subject, e.g., someone who has been convinced to have their teeth brushed by a robot. So "convincingness" is always a determining factor in the result. Convincing the person to do it themselves is then basically the same thing. Y'know, like an AI convincing its way out of the box.
Except in this case, unlike the box hypothetical, people universally already want their teeth to be brushed (they just don't always want to do the brushing), and it is a quick, easy, routine task. GPT could probably dig up incentives and have a good response for each of the person's protests ("I'm tired", "I just did it 2 hours ago", etc.). It would be especially easy to be responsible for a counterfactual tooth-brushing, given how often people skip it.
This is a measurable, non-harmful metric for how convincing LLMs are, and it is making me think about LLM productivity and coaching benefits (and some more sinister things).
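If someone did want to measure it, here is a minimal sketch of the metric under an assumed two-arm design (LLM nudge vs. no nudge); all names and numbers are hypothetical, not from anything above:

```python
# Hypothetical sketch: score LLM "convincingness" as the uplift in
# tooth-brushing rate over a no-nudge control arm. Numbers are made up.

def brushing_uplift(brushed_llm: int, n_llm: int,
                    brushed_ctrl: int, n_ctrl: int) -> float:
    """Counterfactual brushing rate attributable to the LLM nudge."""
    return brushed_llm / n_llm - brushed_ctrl / n_ctrl

# e.g. 70/100 sessions end in brushing after a GPT-4 nudge vs. 45/100 without:
print(brushing_uplift(70, 100, 45, 100))  # 0.25 -> 25 percentage points of uplift
```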