Social incentives affecting beliefs
post by John_Maxwell (John_Maxwell_IV) · 2013-10-28T06:08:35.679Z · LW · GW · Legacy · 47 comments
Having weird ideas relative to your friends and associates means paying social costs. If you share your weird ideas, you'll have more arguments, your associates will see you as weird and you'll experience some degree of rejection and decreased status. If you keep your weird ideas to yourself, you'll have to lead a double life of secret constructed knowledge on the one hand and public facade on the other.
For people reading this site, the most vivid analogy here might be being forced to live in a town full of religious hicks in the south of the USA, with minimal contact with the outside world. (I've heard from reliable sources that the stereotypes about the South are accurate.) Not many of us would choose to do this voluntarily.
The weirder your beliefs get relative to your peer group, the greater the social costs you'll have to pay. Imagine we plot the beliefs of your associates on a multidimensional plot and put a hook at the center of mass of this plot. Picture yourself attached with an elastic band to this hook. The farther you stray from the center of mass, the greater the force pulling you towards conventional beliefs.
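To make the analogy slightly more concrete, here is a minimal sketch, assuming (purely for illustration) that beliefs can be scored on numeric axes and that the pull grows linearly with distance from the group's center; the constant `k` and both function names are made up:

```python
# Toy formalization of the "elastic band" picture above (illustrative only).
# Each person's beliefs are a tuple of numbers, one per belief "axis".

def center_of_mass(associates_beliefs):
    """Average the group's position on each belief axis."""
    n = len(associates_beliefs)
    dims = len(associates_beliefs[0])
    return tuple(sum(b[d] for b in associates_beliefs) / n for d in range(dims))

def conformity_pull(your_beliefs, associates_beliefs, k=1.0):
    """Hooke's-law-style pull toward the group's center of mass.

    k is a hypothetical "social spring constant": how tightly the group
    polices deviation. The farther you stray, the stronger the pull.
    """
    center = center_of_mass(associates_beliefs)
    distance = sum((y - c) ** 2 for y, c in zip(your_beliefs, center)) ** 0.5
    return k * distance

group = [(0.1, 0.2), (0.0, 0.3), (0.2, 0.1)]
print(conformity_pull((0.1, 0.9), group))  # straying far on one axis -> bigger pull
```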
This theorizing has a few straightforward implications:
- If you notice yourself paying a high social or psychological cost for your current set of beliefs, and you have reasons to not abandon the beliefs (e.g. you think they're correct), consider trying to find a new set of associates where those psychosocial costs are lower (either people who agree with you more, people who are less judgmental, or some combination) so you can stop paying the costs. If you can't find any such associates, create some: convince a close friend or two of your beliefs, so you have a new center of mass to anchor yourself on. Also cultivate psychological health through improving your relationships, meditation, self-love and acceptance, etc.
- If you're trying to help a group have accurate beliefs on aggregate, stay nonjudgmental so that the forces pulling people towards conventional wisdom will be lower, and they'll be more influenced by the evidence they encounter as opposed to the incentives they encounter. You may say "well, I'm only judgmental towards people's beliefs when they're incorrect." But even if you happen to be perfect at figuring out which beliefs are incorrect, this is still a bad idea. If I'm trying to figure out whether to officially adopt some belief as part of my thinking, I'll calculate my expected social cost of holding the belief using the probability that it's incorrect times the penalty in the case where it's incorrect. So even punishing only the incorrect beliefs will counterfactually decrease the rate of people holding unusual beliefs.
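A rough sketch of the expected-cost reasoning in the last bullet above; the numbers, the benefit term, and the function names are hypothetical, only there to make the trade-off concrete:

```python
# Illustrative only: why punishing just the *incorrect* beliefs still deters sharing.

def expected_social_cost(p_wrong, penalty_if_wrong):
    """Cost I anticipate if only beliefs that turn out incorrect get punished."""
    return p_wrong * penalty_if_wrong

def worth_voicing(p_wrong, penalty_if_wrong, benefit_of_sharing):
    """Share the belief only when the expected benefit beats the expected cost."""
    return benefit_of_sharing > expected_social_cost(p_wrong, penalty_if_wrong)

# A belief I think is probably right (only a 20% chance of being wrong)
# still goes unsaid when the punishment for being wrong is harsh enough.
print(worth_voicing(p_wrong=0.2, penalty_if_wrong=100, benefit_of_sharing=10))  # False
```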
Some more bizarre ideas:
- Deliberately habituate yourself to/adapt to the social costs associated with having weird ideas. Practice coming across as "eccentric" rather than "kooky" when explaining your weird ideas, and state them confidently as if they're naturally and obviously true, to decrease status loss effects. Consider adopting a posture of aloofness or mystery. Or, for a completely different approach, deliberately adopt a few beliefs that you suspect are true but your social group rejects, and keep them secret to practice having your own model of the world independent of that of your social group.
- If you notice that you're not adopting a weird idea because of the social costs, or that a weird idea you hold is charging you "rent" in the form of social costs you have to pay to maintain it, do a cost-benefit analysis and deliberately either keep the belief and pay the upkeep costs or discard it from your everyday mental life (preferably making a note at the time you discard it). You have to pick your battles.
- Start being kinda proud of the weird things you think you've figured out, in order to cancel out the psychosocial punishment for weird ideas with a dose of psychosocial reward. Keep your pride to yourself to avoid being humiliated if your beliefs turn out to be wrong. The point is to be guided by the evidence you have, even if that evidence is biased or incomplete, rather than solely by the opinion of the herd. (Of course, the herd's opinion should be counted as evidence. But if you're doing it right, you'll err on the side of agreeing with the herd too much about as often as you err on the side of agreeing with it too little... unfortunately, agreeing with the herd too little and being wrong generally hurts you much more than agreeing with it too much and being wrong.)
47 comments
Comments sorted by top scores.
comment by gattsuru · 2013-10-28T16:06:33.761Z · LW(p) · GW(p)
For people reading this site, the most vivid analogy here might be being forced to live in a town full of religious hicks in the south of the USA, with minimal contact with the outside world. (I've heard from reliable sources that the stereotypes about the South are accurate.)
I'd caution against using this as an example. Not just because it's a stereotype, or just because there's wide variation within that stereotype, but because it's neither the most effective nor the most common level of social pressure that inspires conformity. When you say "Bible-Thumping Redneck", readers will jump to some level of coercion between Inherit the Wind and a literal torch-and-pitchfork-wielding mob. That's meaningful, but it's also an iceberg situation: the most obvious answer is not the full answer, and can distract you from the full answer. It tempts folk to think about social incentives that affect the expression of belief, rather than social incentives that alter beliefs themselves.
A good deal of effective social conformity is far more subtle. Countering it takes much more introspection than you'd expect, and it's far more pervasive than your example would suggest.
↑ comment by John_Maxwell (John_Maxwell_IV) · 2013-10-29T19:23:58.908Z · LW(p) · GW(p)
So, in the Wikipedia article about the Asch Conformity Experiment, it says that 25% of study participants never gave an incorrect answer. I'd expect readers of a blog about how to think rationally to plausibly be in the top 10% of the population when it comes to thinking rationally, so I doubt many of us would give incorrect answers in the Asch Conformity Experiment (except as a deliberate choice to tell a lie in order to fit in better).
I agree that a town full of religious hicks is maybe a bad example to anchor from. My thought was that some people are going to pretend that social pressure to conform with their beliefs doesn't exist, so if I convince them that they'd feel uncomfortable in the extreme case, then maybe I can convince them that something subtler happens in less extreme cases. Speaking personally, even though people who read Less Wrong agree with me on much more than a typical person does, I still notice substantial social incentives to conform to them more closely... and I had a reputation as a contented misfit in high school. So yes, I agree pressure to conform is very pervasive. My thought was that by deconstructing the psychological and social pressures involved, they'd lose some of their power and we could consciously decide how best to deal with each pressure. I'd love to hear if you've detected additional subtle pressures in your own thinking that I didn't mention in my post.
comment by RomeoStevens · 2013-10-28T09:22:15.534Z · LW(p) · GW(p)
My strategy is to associate my weird beliefs with the high status people who hold them and use it to curtail non-productive arguments. e.g. "I don't have much expertise in this area myself, but I trust X who studies this problem professionally."
↑ comment by Error · 2013-10-29T20:18:28.452Z · LW(p) · GW(p)
Odd. I avoid doing exactly that, because I subconsciously expect people to call me on it, along the lines of "what, you believe X just because so-and-so says so?" It's fairly rare for me to express weird beliefs unless I'm either prepared to defend them in detail, or on such good terms with whoever I'm speaking with that I know they'll cut me some slack.
LW itself is exhibit A, I suppose. I've adopted a fair number of ideas from the Sequences based mostly on Eliezer being extremely convincing -- but I hesitate to put myself in a position where I'd have to admit that, because I can't replicate the argument spontaneously.
↑ comment by RomeoStevens · 2013-10-30T00:25:28.388Z · LW(p) · GW(p)
"what, you believe X just because so-and-so says so?"
"No, I don't believe X because so-and-so says so, I put a strong weight on X being true based on so-and-so's track record. If you'd like to discuss the evidence for or against this position in more rigor we should do so online so we can link to citations."
This acts as a great litmus test for people who will actually provide me with high-quality evidence as well.
↑ comment by TheOtherDave · 2013-10-29T20:48:01.557Z · LW(p) · GW(p)
The irony here, of course, is that Eliezer has written at length about the importance of being able to reconstruct the argument that convinces one that X is true, not just recite "X is true."
↑ comment by Error · 2013-10-30T02:12:04.448Z · LW(p) · GW(p)
Reconstructing such, given time, is something I can do. But I can't do it in real time for non-trivial arguments. Does that make me a minority here? I have never been able to do that for any abstract argument that I can think of, except maybe in my area of professional expertise where all the relevant information is perpetually in cache.
↑ comment by TheOtherDave · 2013-10-30T02:32:55.652Z · LW(p) · GW(p)
I doubt it makes you a minority anywhere.
comment by bramflakes · 2013-10-28T10:42:59.070Z · LW(p) · GW(p)
Practice coming across as "eccentric" rather than "kooky" when explaining your weird ideas, and state them confidently as if they're naturally and obviously true, to decrease status loss effects.
This only works if you're already high status. If you're low status you come across as attention-seeking.
↑ comment by hyporational · 2013-11-01T16:32:56.246Z · LW(p) · GW(p)
If you're low status you come across as attention-seeking.
I'd be more worried about it coming across as mental illness. Being eccentric in the good way isn't easy.
↑ comment by jimmy · 2013-10-28T18:21:45.482Z · LW(p) · GW(p)
That sounds wrong to me. Why do you think it comes off as attention-seeking if you're not already high status?
↑ comment by bramflakes · 2013-10-28T18:44:52.641Z · LW(p) · GW(p)
1) Given that you exhibit attention-seeking behavior, displaying eccentric beliefs is a reliable way of getting more attention. 2) Attention-seeking behavior can be a pathological response to perceived low status, and egregious attention-seeking may cause you to lose status as well. 3) So seeing a low-status person with eccentric beliefs, you should update more in favor of them doing it for attention.
Of course I now realize I'm privileging the hypothesis. If you're low status with eccentric beliefs you might just come across as a creepy weirdo.
↑ comment by jimmy · 2013-10-29T18:23:42.449Z · LW(p) · GW(p)
The reason I disagree is that when your goal is attention, you also show other signs (like trying to argue others into agreeing with you). If you aren't seeking attention, then you don't push your weird views and simply state them matter-of-factly when asked. I don't think someone who keeps their mouth shut until asked and then talks matter-of-factly about weird beliefs comes off as attention-seeking or loses status in most cases.
comment by chaosmage · 2013-10-28T14:10:34.780Z · LW(p) · GW(p)
Being wrong is correlated with using an authority heuristic. If you encounter someone who you're sure is wrong, consider using an authoritative source who agrees with you. Attempt to establish that source's authoritativeness before you reveal that source agrees with you.
Worst case scenario, that other person still disagrees, but you only pay for "listening to the wrong guy", which I expect (confidence 95%) is less expensive in the median case.
comment by buybuydandavis · 2013-10-29T02:58:49.108Z · LW(p) · GW(p)
For people reading this site, the most vivid analogy here might be being forced to live in a town full of religious hicks in the south of the USA, with minimal contact with the outside world.
A scenario much more available to the Libertarian readers would be living in an area and in broad social circles dominated by Progressives.
comment by Viliam_Bur · 2013-10-28T11:33:29.529Z · LW(p) · GW(p)
Nonjudgemental people may help you socially with trying new ideas, but will not help you epistemically with finding the correct ones. You will have to reinvent every wheel alone. If you have unlimited time, go ahead. Otherwise, it is better to find people who are a bit judgemental -- who have a preference for correct beliefs.
Of course, replacing judgemental people who prefer incorrect beliefs with nonjudgemental people is an improvement. But sometimes you can do much better than this.
Maybe this is what happens to many people: they replace the judgemental people who hold incorrect beliefs with nonjudgemental people, they notice the improvement... but then they can't improve further because their heuristic says that "nonjudgemental is best". Noise can be better than active misinformation, but signal can be even better than noise. But when most of your experience is with active misinformation, all signals seem dangerous.
↑ comment by [deleted] · 2013-10-28T13:52:44.779Z · LW(p) · GW(p)
There is some confusion that pops up whenever there's a discussion of 'being judgmental'. Some people distinguish between disagreement and condemnation and believe that you can strongly disagree with someone in a non-judgmental manner, while others think of it as a package deal, where being non-judgmental is a trade-off between niceness and the ability to form correct beliefs.
When I hear people talking about being nonjudgmental I tend to assume the first interpretation (which I also agree with). But being non-judgmental in that way might itself be an example of a weird, costly attitude. If others don't share it, they will think that you are judging them and there's no way of convincing them otherwise.
↑ comment by Ishaan · 2013-10-29T01:16:19.605Z · LW(p) · GW(p)
The two contexts in which I see "judgemental":
1) Making assumptions about people based on incomplete knowledge: judging character from the way someone dresses. Or thinking badly of someone for, say, alcoholism without knowing them and the circumstances they live in. Or judging a lifestyle without understanding it.
I'd define this as: "the sort of person who tends to make moral judgements based on insufficient evidence". Seems like a reasonable accusation.
2) Treating oneself as the final arbiter of right and wrong, deciding morality for others: "Who are you to say what is right?"
I'd code this as "anyone who makes moral judgements". That doesn't seem like a negative trait to me at all...rather, the usage of "don't judge' in this way seems like the moral equivalent of anti-epistemology. What gives?
The amalgam of the two definitions is "one who judges too much", where a judgement is a high-confidence statement about the moral status of a thing. So "you're being judgmental" should be coded as "You are far too confident in your morality-related claim. Shame on you!" This makes sense to me...though it seems less like an actual argument and more like a statement of belief.
The seeming double meaning arises because some individuals believe that no one can make any moral claim with any confidence (especially when it comes to other people), while others believe in absolute God-given morality or absolute self-created morality. In fact, there is only one definition of the word, but the usage varies depending on the moral philosophy of the user.
Unfortunately, in my experience, the majority of people who use "don't judge" are using it as a rhetorical device to put a stop to moral conversations that they'd rather not have. It's shorthand for "Oh, you are trying to make a moral judgement? But morality is relative anyway!" from a person who has no strong opinions and/or is largely naive to concepts in moral philosophy and is thus able to implicitly switch moral philosophies as it suits them in rhetoric, without even realizing that they are doing it.
It's a lot like "faith = trust without evidence" vs "faith = justified trust as a result of evidence" in this regard. The simple definition is "the belief that you can trust someone", but one's epistemology as to how one ought to form beliefs alters one's usage of the word, and most people will use the "trust without evidence" version to implicitly switch epistemological philosophies when it suits them in rhetoric.
↑ comment by Lumifer · 2013-10-28T16:11:43.805Z · LW(p) · GW(p)
Both interpretations are viable and can co-exist -- depending on the matter under discussion.
It's pretty easy for people to strongly disagree about e.g. the merits of a sports team without condemnation being involved.
It's very hard for people to strongly disagree about e.g. slavery without condemnation being involved.
I think the relevant attribute is "seriousness" or importance. If you imagine a spectrum of importance with "I don't really care" on one end and "I will die for this" on the other, then the closer you are to the don't-care end, the easier it is to disagree without judging. But the closer you get to the will-die-for-it end, the harder passionless disagreement becomes.
comment by knb · 2013-10-29T23:56:31.742Z · LW(p) · GW(p)
For people reading this site, the most vivid analogy here might be being forced to live in a town full of religious hicks in the south of the USA, with minimal contact with the outside world. (I've heard from reliable sources that the stereotypes about the South are accurate.)
They're such vapid hicks, I bet they believe negative stereotypes about large groups of people just because someone who shares their worldview assures them they are accurate.
Ugh they're such evil mutants! Good thing nothing like that happens on Less Wrong.
comment by Shmi (shminux) · 2013-10-28T15:32:12.406Z · LW(p) · GW(p)
Fight the restoring force with leverage. Make friends with the local priest and affect the congregation through him. Most instrumental rationality techniques are compatible with religion. You can consider your options once you gain status.
↑ comment by CAE_Jones · 2013-10-28T16:09:51.232Z · LW(p) · GW(p)
I read this as "become the king's most trusted adviser; it's easy."
↑ comment by Shmi (shminux) · 2013-10-28T16:14:33.616Z · LW(p) · GW(p)
You can misread it any way you like, that's not what I wrote.
↑ comment by CAE_Jones · 2013-10-28T17:39:03.197Z · LW(p) · GW(p)
Was I overly hyperbolic, or completely orthogonal to your point? Was I reading in an implication that such things would be (somewhat) trivial that isn't there? (I've noticed that quite a few comments that leave me wanting to reply this way can be read as trivializing something decidedly non-trivial, and I suspect that if I called out the commenters, the response would be that they meant otherwise.)
↑ comment by Shmi (shminux) · 2013-10-28T17:54:54.907Z · LW(p) · GW(p)
My point was that you can befriend the person(s) with leverage in the community without having to become their "trusted advisor". Here is one scenario. Befriending a priest is indeed easy: if you come across as a friendly and curious non-believer, it is in his job description to attempt to win you over. If you cheerfully volunteer in the (non-religious) community events organized by the church, you win people's respect and eventually their sympathetic ear. If you know some useful stuff about instrumental rationality people can benefit from, that might be the time to share. Eventually you can start discussing religious beliefs and tenets in a respectful and friendly manner.
comment by blacktrance · 2013-10-30T03:31:48.835Z · LW(p) · GW(p)
For people reading this site, the most vivid analogy here might be being forced to live in a town full of religious hicks in the south of the USA, with minimal contact with the outside world. (I've heard from reliable sources that the stereotypes about the South are accurate.) Not many of us would choose to do this voluntarily.
I have lived in a town full of religious hicks in the South, and I have many weird beliefs. My experience was that while it did cost me social status, it wasn't that important to me, and it would have been significantly more unpleasant to conceal my beliefs.
comment by Eugine_Nier · 2013-10-30T01:34:00.047Z · LW(p) · GW(p)
If you're trying to help a group have accurate beliefs on aggregate, stay nonjudgmental so that the forces pulling people towards conventional wisdom will be lower, and they'll be more influenced by the evidence they encounter as opposed to the incentives they encounter. You may say "well, I'm only judgmental towards people's beliefs when they're incorrect."
The problem with giving rationalists this kind of advice is that it lowers the average sanity of the people defining conventional wisdom.
↑ comment by John_Maxwell (John_Maxwell_IV) · 2013-10-30T03:47:45.799Z · LW(p) · GW(p)
I don't follow.
↑ comment by Eugine_Nier · 2013-10-31T01:45:50.932Z · LW(p) · GW(p)
If the more rational people in a group are the least judgmental, the only source of peer pressure will be from the less rational people.
↑ comment by TheOtherDave · 2013-10-31T02:16:01.053Z · LW(p) · GW(p)
That's true only if judgmentalness is the only source of peer pressure.
If, for example, being successful allows one to successfully exert peer pressure and rational people are more successful, then even if the more rational people in a group are less judgmental they might still exert significant peer pressure.
comment by Larks · 2013-10-30T00:04:12.879Z · LW(p) · GW(p)
But even if you happen to be perfect at figuring out which beliefs are incorrect, this is still a bad idea. If I'm trying to figure out whether to officially adopt some belief as part of my thinking, I'll calculate my expected social cost of holding the belief using the probability that it's incorrect times the penalty in the case where it's incorrect. So even punishing only the incorrect beliefs will counterfactually decrease the rate of people holding unusual beliefs.
This argument seems to assume, contra the first sentence, that I am coming down on the side of the received opinion each time, rather than on the side of the correct opinion.
↑ comment by John_Maxwell (John_Maxwell_IV) · 2013-10-30T03:50:44.977Z · LW(p) · GW(p)
Let's say I have an idea X which happens to be correct, although I only think it's correct with 80% probability. I believe that if idea X is incorrect, and I share it, I will be burned at the stake. So I keep idea X, a correct idea, to myself in order to avoid censure. It so happens that if I had shared idea X, the person in charge of handing out punishments would not have punished me because X happens to be a correct idea. But I had no way of knowing that for certain in advance.
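Putting made-up numbers on this example (only the 80% figure comes from the comment; the penalty and benefit values are invented for illustration):

```python
p_wrong = 0.20            # my subjective chance that idea X is incorrect
penalty_if_wrong = 10000  # stand-in for "burned at the stake"
benefit_of_sharing = 5    # modest upside of voicing X

expected_cost = p_wrong * penalty_if_wrong  # 2000.0
share = benefit_of_sharing > expected_cost  # False: I stay silent,
print(share)                                # even though X is in fact correct
```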
Does that answer your question? I'm not sure I understand your objection.
↑ comment by Larks · 2013-10-31T01:09:48.030Z · LW(p) · GW(p)
Ahh. That works if you allow 'not holding a belief either way' as an option. If you punish failing to believe the true thing as well, then that problem is avoided.
↑ comment by John_Maxwell (John_Maxwell_IV) · 2013-10-31T05:35:20.903Z · LW(p) · GW(p)
Yep. People typically don't receive social punishments for agreeing with the group even when the group ends up being wrong.
comment by JQuinton · 2013-10-29T20:01:49.403Z · LW(p) · GW(p)
If you're trying to help a group have accurate beliefs on aggregate, stay nonjudgmental so that the forces pulling people towards conventional wisdom will be lower, and they'll be more influenced by the evidence they encounter as opposed to the incentives they encounter. You may say "well, I'm only judgmental towards people's beliefs when they're incorrect." But even if you happen to be perfect at figuring out which beliefs are incorrect, this is still a bad idea. If I'm trying to figure out whether to officially adopt some belief as part of my thinking, I'll calculate my expected social cost of holding the belief using the probability that it's incorrect times the penalty in the case where it's incorrect. So even punishing only the incorrect beliefs will counterfactually decrease the rate of people holding unusual beliefs.
There are already some heuristics that allow you to nudge people in a direction where they are more likely to accept your arguments. However, these techniques are all about getting people to like you (in effect, taking advantage of their cognitive biases), so they might seem to straddle the line into Dark Arts. One was actually posted pretty recently: Have the person tell a self-affirming thing about themselves before trying to convince them of your point of view. Or ask them for a favor or their opinion on something. Another way to get people to like you is to uncover likable things about the person; the Dark Arts version of that would be something like Barnum statements.
If anything, these persuasion techniques will increase your social capital so that you have more to spend on having beliefs that don't quite mesh with the group's center.
↑ comment by John_Maxwell (John_Maxwell_IV) · 2013-10-30T03:52:52.021Z · LW(p) · GW(p)
One was actually posted pretty recently: Have the person tell a self-affirming thing about themselves before trying to convince them of your point of view.
Hm, complimenting people right before telling them they're wrong about something seems like a good idea and not very dark arts-ish.
comment by JDelta · 2013-10-29T13:11:31.104Z · LW(p) · GW(p)
In my experience, the 'social cost' tends to be paid by people trying to push (hard) concepts and ideas that are, often at least, very true and useful, but not necessarily relevant to the conversation or of interest to the other person.
No price is due (if anything, the opposite) as long as you follow a few basic rules and are able to explain your ideas eloquently and succinctly (and if you can't, should you be talking about them at all?).
A curse of intelligent people seems to be wanting to 'show' everyone just how smart they are by talking about subjects they don't totally understand, or else subjects that are irrelevant to whatever situation they are in.
A discussion on Bayesian Reasoning may be very appropriate in certain college or high school academic situations, but repeatedly bringing it up in all sorts of casual conversations will cause social harm to the individual refusing to follow social expectations in this area.
comment by hyporational · 2013-10-29T04:55:35.233Z · LW(p) · GW(p)
If you keep your weird ideas to yourself, you'll have to lead a double life of secret constructed knowledge on the one hand and public facade on the other.
I kinda think most people have to hide some of their beliefs. It doesn't feel like a facade to me, just a fact of life you've got to live with.
↑ comment by John_Maxwell (John_Maxwell_IV) · 2013-10-29T05:45:07.806Z · LW(p) · GW(p)
For sure, but you don't feel like there's any kind of psychological cost you're paying there?
comment by passive_fist · 2013-10-28T06:43:12.115Z · LW(p) · GW(p)
This is where the divide between epistemic and instrumental rationality is so important. If you truly practice instrumental rationality, none of this even enters into the discussion. You should not have to pay any social cost for being rational vs. not being so.
↑ comment by savageorange · 2013-10-28T07:57:13.728Z · LW(p) · GW(p)
It sounds as if you think either that an instrumentally rational person will not disclose beliefs that have social cost, or that they will change their social situation so that their group doesn't assign social cost to those beliefs. The former is insufficient if you value improving the rationality of people you know; the latter is extremely slow and highly susceptible to external influences.
To me your claim just defines 'instrumentally rational' so narrowly that there would be no instrumentally rational people in existence. I don't find that useful.
↑ comment by CAE_Jones · 2013-10-28T13:19:35.827Z · LW(p) · GW(p)
He mentioned the example of being trapped in the southern US, surrounded by a bunch of Bible-thumping hicks. Trying to increase the rationality of the people around you in that environment through conventional means is dangerous. (This is where I currently live, with too little power to escape. I'm fortunate in that, whenever I did something weird, it'd get attributed to being somehow related to my visual impairment, so instead of getting the treatment given to more acceptable targets, I just got isolation and being treated more like a goofy pet. People without disability as a shield have been subject to open scorn, newspaper campaigns, and occasionally open protests pushing people out of town (this is more for opposing religious practices than anything rational). This town is a bit more progressive than some surrounding areas (it's in that spot where it wants to stay a small rural town even though it's huge locally and gradually urbanizing), so you don't get literal torches and pitchforks or lynch mobs, but it's still costly to do something as simple as, say, be an Earth science teacher.)
(There was actually a local skeptics/atheists meetup group at one time, when I first discovered meetup.com. Before I could get over the combination of signalling costs and the anxiety of needing to ask for a ride, it disappeared, and I have been unable to find it since. IIRC, there might have been as many as three people involved.)
All of which is to say, the article knows what it's talking about.
↑ comment by savageorange · 2013-10-28T20:49:44.517Z · LW(p) · GW(p)
I understand that there are situations in which you definitely do not want to show how relatively rational you are.
But there are also situations where bad outcomes are unlikely. At some point you've gotta say "the risk is low enough and the potential gain is great enough that I'll do this thing," because it's hard to get more rational on your own.
Have you interpreted my comment as a comment on the article rather than passive_fist's comment? Personally I think the OP is competently written and reasonably accurate.
The problem was with passive_fist's excessively simplified representation of what it means to be instrumentally rational (as a human being with complex values, rather than a paperclip optimizer with simple values).
↑ comment by CAE_Jones · 2013-10-29T01:57:22.535Z · LW(p) · GW(p)
Have you interpreted my comment as a comment on the article rather than passive_fist's comment?
Yes. And I embarrass myself again. At least this time I can blame it on not being able to see the formatting? On rereading, your comment makes sense as a reply to Passive_fist's, and I don't know how I managed to miss that the first time through.
↑ comment by buybuydandavis · 2013-10-28T07:55:41.897Z · LW(p) · GW(p)
No, because social costs aren't the only costs, and instrumental rationality should be about all your values, and the most important ones first.
The OP hit the real instrumentally rational move; find more compatible companions who value and reinforce your fundamental values. One of the compliments I've cherished over the years was "You're weird, but in the right way."