comment by Nectanebo ·
2012-03-17T17:37:43.327Z · LW(p) · GW(p)
People have irrational beliefs. When people come to LessWrong and talk about them, many say "oops" and change their minds. Often, however, they keep their decidedly irrational beliefs despite conversation with other LessWrongers, who point out where they went wrong and how, perhaps with a link to the Sequences post where the specific mistake is discussed in more detail.
This guy was told he was being mindkilled. Many people explained to him what was wrong with his thinking, why it was wrong, how it was wrong, and what he could do about it, along with all manner of helpful advice and discussion.
He rejected it all, left the site, and hasn't been seen since.
Not much to say. Eliezer the Wise and Always Correct himself declared him a troll.
Generally pretty irrational dude. Asked to leave lesswrong by the powerful and great Eliezer because his comments were so bad.
Another, different example:
MrHen had great insights into rationality and seemed to be a well upvoted member of lesswrong. He also believed in God. He was around in 2010 and again in 2011 for a bit, and hasn't posted in a while now.
Perhaps a more controversial example:
Around Feb 3rd, Mitchell Porter started a debate about colour, the mind, dualism, and related topics. I'm actually not sure if this was ever resolved, but there seemed to be some small consensus (kinda) that he was taking the Crackpot Offer. This was suggested to him.
Mitchell Porter is still around, an active user who seems to have useful insights into many things. He is very well upvoted.
To any of these people, I am sorry for mentioning you guys like this if you are offended or anything like that.
So why am I bringing this up?
Well, people fail at being rational all the time. But there are countless examples like these, both from people who turned up, got heavily downvoted, then left, and from regular users who otherwise earn lots of karma and are very rational.
The main thing I wanted to do was just POINT IT OUT and see if anyone wants to comment on the fact that this happens on LessWrong, surely the place where people are MOST likely to see why and how they are wrong.
What does it mean that so many people do not? What does it mean that such failures happen so often that I could pick examples off the top of my head? Some of what it means is obvious, but this pains me, and I need it to be discussed somewhere, because I find it important and I think more people should be aware that this happens and should make more concerned, perhaps vapid, comments about it.
Also, I'm thinking of upgrading this to a discussion post. Tell me if that's a bad idea.
If you have read this, please tell me what you think.
Replies from: ArisKatsaris, Grognor, GLaDOS
↑ comment by ArisKatsaris ·
2012-03-19T01:44:06.597Z · LW(p) · GW(p)
Half the people you listed were insanely rude in pretty much every single comment they posted.
Jake Witmer was pretty much accusing everyone who downvoted him of communism.
911truther deliberately chose a provocative name and kept wailing in every single post about the downvotes he received (which of course caused him to get more downvotes).
sam0345's main problem wasn't that he was irrational, it was that he was an ass all the time.
But I don't even know why you chose to list the above as belonging to the same category as decent people like Mitchell_Porter and MrHen, people who don't follow assholish tactics and are therefore generally well received and treated as proper members of the community, even if occasionally downvoted (whether rightly or wrongly). As you yourself saw.
The main thing I wanted to do was just POINT IT OUT and see if anyone wants to comment on the fact that this happens, in LessWrong, surely the place where they are MOST likely to see why and how they are wrong. What does this mean that so many people do not?
The main problem with half the people you listed was that they were assholes, not that they were wrong. If people enjoy being assholes, if their utility function doesn't include a factor for being nice to people, how do you change that with mere unbiasing? Whether you treat others nicely or nastily has to do with empathy, not with intellectual power.
Replies from: Nectanebo
↑ comment by Nectanebo ·
2012-03-19T10:50:29.228Z · LW(p) · GW(p)
The rudeness wouldn't have helped with the downvotes, I can understand that.
But the factor I was pointing out, and the common factor in my grouping them together, was the inability to say "oops". I'm sorry, I didn't make that very clear. That's why I listed the assholes alongside the nice people.
MrHen left LessWrong believing in a God, and Mitchell_Porter (as far as I can tell) still believes dualism needs to be true if colour exists (or whatever his argument was; I'm embarrassing myself by trying to simplify it when I had a poor understanding of what he was trying to say).
They were/are also great rationalists apart from that, and they both make sure to be very humble in general while on the site.
The other three were often rude, but the main reason I pointed them out was their inability to say "oops" when their rational failings were pointed out to them. Unlike the other two, they proceeded to act very douchey until driven from the site, but their first posts were much less abrasive and rude.
In general though, if they aren't going to work out that they are wrong at LessWrong, where are they going to?
Some of these people may work it out with time, and it may be unreasonable to expect them to change their mind straight away.
But this should show at least how difficult it is for an irrational person to attempt to become more rational; it's like having to know the rules to play the rules.
What does it take to commit to wanting rationality from a beginning of irrationality?
These examples show the existence of people on LessWrong who aren't rational, and while that isn't a surprise, I feel like the LessWrong community should perhaps learn from the failings of some of these people, in order to better react to situations like this in the future, or something. I don't know.
In any case, thank you for replying.
Replies from: GLaDOS, Mitchell_Porter, TheOtherDave, Viliam_Bur
↑ comment by GLaDOS ·
2012-03-24T10:31:25.074Z · LW(p) · GW(p)
MrHen left LessWrong believing in a God, and Mitchell_Porter (as far as I can tell) still believes dualism needs to be true if colour exists (or whatever his argument was; I'm embarrassing myself by trying to simplify it when I had a poor understanding of what he was trying to say). They were/are also great rationalists apart from that, and they both make sure to be very humble in general while on the site.
Bold statement that somehow still seems true: most LessWrongers probably have a belief of comparable wrongness. MrHen is just unlucky.
↑ comment by Mitchell_Porter ·
2012-03-24T10:47:11.123Z · LW(p) · GW(p)
Mitchell_Porter (as far as I can tell) still believes dualism needs to be true if colour exists (or whatever his argument was; I'm embarrassing myself by trying to simplify it when I had a poor understanding of what he was trying to say)
The argument is that for dualism not to be true, we need a new ontology of fundamental quantum monads that no one else quite gets. :-) My Chalmers-like conclusion that the standard computational theory of mind implies dualism is an argument against the standard theory.
↑ comment by TheOtherDave ·
2012-03-19T15:32:28.172Z · LW(p) · GW(p)
What does it take to commit to wanting rationality from a beginning of irrationality?
Deciding that being less wrong than I am now is valuable, realizing that doing what I've been doing all along is unlikely to get me there, and being willing to give up familiar habits in exchange for alternatives that seem more likely to get me there. These are independently fairly rare and the intersection of them is still more so.
This doesn't get me to wanting "rationality" per se (let alone to endorsing any specific collection of techniques, assumptions, etc., still less to the specific collection that is most popular on this site), it just gets me looking for some set of tools that is more reliable than the tools I have.
I've always understood the initial purpose of LW to be to present a specific collection of tools such that someone who has already decided to look can more easily settle on that specific collection (which, of course, is endorsed by the site founder as particularly useful), at-least-ostensibly in the hope that some of them will subsequently build on it and improve it.
Getting someone who isn't looking to start looking is a whole different problem, and more difficult on multiple levels (practical, ethical, etc.).
↑ comment by Viliam_Bur ·
2012-03-19T13:38:35.124Z · LW(p) · GW(p)
But this should show at least how difficult it is for an irrational person to attempt to become more rational; it's like having to know the rules to play the rules. What does it take to commit to wanting rationality from a beginning of irrationality?
You need some initial luck. It's like the human mind is a self-modifying system, where the rules can change the rules, and again, and again. Thus the human mind floats around in a mindset space. The original setting is rather fluid, for evolutionary reasons -- you should be able to join a different tribe if that becomes essential for your survival. On the other hand, the mindset space contains some attractors; if you happen to have a certain set of rules, those rules keep preserving themselves. Rationality could be one of these attractors.
Is the inability to update one's mind really so exceptional on LW? One way of not updating is "blah, blah, blah, I don't listen to you". This happens a lot everywhere on the internet, but LW is probably not attractive to such people. The more interesting case is "I listen to you, and I value our discussion, but I don't update". This seems paradoxical. But I think it's actually not unusual... the only unusual thing is the naked form -- people who refuse to update, and recognize that they refuse to update. The usual form is that people pretend to update... except that their updates don't fully propagate. In other words, there is no update, only belief in update. Things like: yeah, I agree about the Singularity and stuff, but somehow I haven't signed up for cryopreservation; and I agree human lives are valuable and there are charities which can save a hundred human lives for every dollar sent to them, but somehow I haven't sent a single dollar yet; and I agree that rationality is very important and being strategic can increase one's utility, and then I procrastinate on LW and other websites and my everyday life goes on without any changes.
We are so irrational that even our attempts to become rational are horribly irrational, and that's why they often fail.
↑ comment by Grognor ·
2012-03-17T23:06:49.192Z · LW(p) · GW(p)
What does this mean that so many people do not? What does it mean that such failures happen so often that I could choose random examples off the top of my head?
Absolutely nothing. Your sample is biased: it's all the worst examples you could think of. Please don't make a discussion post about this.