Posts
Comments
Hello and goodbye.
I'm a 30-year-old software engineer with a "traditional rationalist" science background, a lot of prior exposure to Singularitarian ideas like Kurzweil's, and, as a Caltech alum, a big network of scientist friends. It would be fair to describe me as a cryocrastinator. I was already an atheist and utilitarian. I found the Sequences through Harry Potter and the Methods of Rationality.
I thought it would be polite, and perhaps helpful to Less Wrong, to explain why I, despite being pretty squarely in the target demographic, have decided not to join the community, and why I would recommend the same to my friends or to anyone else when I hear it discussed elsewhere on the net.
I read the Sequences in their entirety and was informed and entertained; there are definitely things I took from them that will be valuable ("taboo" this word; the concept of updating your probability estimates instead of waiting for absolute proof; etc.).
However, there were serious sexist attitudes that hit me like a bucket of cold water to the face: assertions, for example, that understanding anyone of the other gender is like trying to understand an alien.
When I came here to Less Wrong, I posted a little about that, but in the "sequence rerun" I was immediately struck by people talking about what a great utopia the gender-segregated "Failed Utopia 4-2" would be.
Looking further around the site, I find that it is over 90% male as of the last survey, and that a lot of gender-essentialist, women-are-objects-not-people-like-us crap gets plenty of upvotes.
I'm not really willing to put up with that, and still less am I enthused about identifying myself as part of a community where it's so widespread.
So, despite what could be a lot of interesting stuff going on here, this will be my last comment, and I will recommend that my friends not join Less Wrong. The site has fallen very squarely into the "nothing more than sexism, the especially virulent type espoused by male techies who sincerely believe that they are too smart to be sexists" cognitive failure mode.
If you're interested in one problem that is causing at least one rationalist to bounce off your site (and the odds are not unreasonable that, where one person writes a long heartfelt post, several others have simply clicked away), here you go. If not, go ahead and downvote this into oblivion.
Perhaps I'll see you folks in some years, if this problem gets solved, or some more years after that, when we're all unfrozen and immortal and so forth.
Sincerely,
Sam
OK, look, literally a five-year-old would say, "But what about my friends who are girls?" That the author writes a "superintelligence" who does not address this objection, and a main character who never mentions, say, coworkers, board-game-playing rivals, or recreational hockey teammates who are women, gives an overwhelming, and overwhelmingly unpleasant, impression that women are solely romance and sex objects. That's not only gross; it's also a very common failure mode of "we're too smart to be sexist" male tech geeks. And, indeed, downthread you can see other commenters talking about how great a utopia this sounds like.
This story, along with other gender-related issues in the Sequences, means that despite their containing what seems to me to be a lot of value, I definitely would not recommend them to anyone without large disclaimers, in a similar fashion to how Eliezer refers to Aumann.
This story irresistibly reads to me as the author endorsing or implicitly assuming:
1) There are exactly two genders, and everyone is a member of exactly one.
2) Everyone is heterosexual.
3) Humans have literally zero use for members of the other gender outside of romance.
Registered to post this.
I was linked to the Sequences and was going through them, mostly impressed, when I hit this post.
Eliezer's assessment that the human species can be cleanly divided into exactly two sexes, and that dealing with the one you are not a member of is like dealing with an alien species, struck me much the way Robert Aumann's Orthodox Judaism struck Eliezer: a usually intelligent person buying wholeheartedly into a local cultural construct that, by my fairly simple observation and deduction, should be assigned very negative log odds.
I've assigned considerably lower weight to my Bayesian updates from the Sequences since then. I very nearly stopped reading, but realized that wasn't the right plan (since, after all, Aumann's agreement theorem is still true).