i did not say it established she was better off as a theist than as an atheist. i was merely pointing out that, as far as i know, being a theist does not make anyone more or less likely to believe false things about their local environment, beyond those beliefs which follow directly from theism itself (e.g., "this priest sure is wise in the ways of the Lord! he must be wise about other things, too!").
do we have any data suggesting atheists hold more accurate beliefs than theists about phenomena that they experience firsthand?
watching life-sized talking heads in the morning is roberts' way of lifting his spirits, not his cure for insomnia.
human beings are capable of having domain- and context-specific cognitive algorithms. preferring comforting but false metaphysical beliefs does not mean she will be more prone than others to adopt reassuring but maladaptive beliefs about her local environment. her incentives to believe in some fanciful anthropomorphized abstraction are of an entirely different type than her incentives to believe true or false things about, say, the intentions and motives of those she will interact with professionally.
are theists more or less likely to demonstrate competence on card-selection tasks (e.g., the Wason selection task) or other tests of rational belief formation?
as robin has pointed out on numerous occasions, in many situations it is in our best interest to believe, or profess to believe, things that are false. because we cannot deceive others very well, and because we are penalized for lying about our beliefs, it is often in our best interest not to know how to form more accurate beliefs. refusing to believe popular lies forces you either to lie continually or to constantly risk your relative status within a potentially useful affiliative network by professing contrarian beliefs or, almost as bad, no beliefs at all. you're better off if you only apply "epistemic rationality techniques" within domains where true beliefs are more frequently or more richly rewarded, i.e., where they lead to winning strategies.
trying to suppress or correct your unconscious judgments (often) requires willpower. indiscriminately applying "epistemic rationality techniques" may have the unintended consequence of draining your willpower more quickly (and needlessly).
when you volunteer your own time and energy to a cause, and experience the "charity process" firsthand, you increase your emotional investment and thus your future commitment to it. sending a cheque is easy to forget; spending an afternoon with like-minded Cause Enthusiasts doing whatever it is volunteers do is not so easily forgotten, and the feel-good, warm-fuzzy memories may even become conflated with the cause itself.
you want supporters who will stick around and proselytize. you will not succeed by having them just give money. you will succeed by having them invest an *experience* directly in the cause and the institution supporting it.
everything in the post is true, but it could easily lead unthinking activists to a long-term losing strategy. *you must combat "care decay" and foster commitment or you will lose*.
2), as stated, demonstrates a persistent problem i see here and elsewhere: just because a behavior signals something to observers does not mean the behavior was chosen because it signals something to observers.
we use the same evaluative criteria to assess ourselves as we use to determine the relative value of our peers. for example, if i evaluate the relative worth of members of my peer group within the context of "athletics" using a criterion like their vertical leap, i will likely apply the same evaluative criterion to myself when assessing *my* value. when i spend hours alone at the gym doing plyometric training to improve my vertical leap i am not signalling, or improving myself with the intention of signalling something later on: i am just doing something that will let me score higher on the metric i use to evaluate my worth. it will make me more estimable in my own eyes and i will get a kick from internal self-approval.
what method are you using to "correct for such a bias"? how do you "correct" your associational networks or the preferences that define who you are?
the only method that comes to mind is perspective-shifting or play-acting. trying to imitate the thoughts and (verbal) behaviors of someone who's attracted to spiritual ideas like "nirvana" and "enlightenment" might give you an appreciation for values that you do not typically use to define yourself.
at the same time, if you're a lawyer defending someone likely to be innocent and your goal is to have him exonerated, the most rational strategy is to use whatever lawyerly wiles you have at your disposal to convince an irrational jury of his innocence. an airtight bayesian argument may not be understood or it may be understood but disregarded, whereas a persuasive story vividly told can convince a jury of almost anything.
you cannot win the game if you refuse to accept the rules, and one of the implicit rules in almost every social game is that almost all of the participants are irrational almost all of the time.
you're missing the essential ingredient:
- winner-takes-all
in any situation where the spoils of victory are shared, it's best to align with the most competent. conversely, when the winner gets everything, like life or the girl or the title, it's almost always best to team up with your fellow incompetents to take down the likely victor.
the game show Survivor strikes me as especially illustrative. players routinely gang up on those perceived to be the most competent in order to increase everyone's chances of winning. once a player's usefulness as a workhorse or "challenge winner" has been exhausted, or at least no longer outweighs worries about the million bucks (i.e., once their perceived probability of winning exceeds some threshold), the "strongest" or most apparently cunning player is often ousted.
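a minimal monte-carlo sketch of that claim, assuming a toy Survivor-style game (the player count, competence scores, and proportional-to-competence final challenge are all illustrative assumptions, not show data): five players, one clearly stronger than the rest; one player is eliminated per round; the last two settle it in a final challenge. comparing random eliminations against a gang-up-on-the-strongest voting rule:

```python
import random

# toy winner-takes-all elimination game (illustrative assumptions only):
# each player has a "competence" score; one player is voted out per round;
# the final two face a challenge won with probability proportional to competence.

def play(competence, gang_up):
    players = list(range(len(competence)))
    while len(players) > 2:
        if gang_up:
            # everyone votes out a most-competent remaining player (ties broken randomly)
            top = max(competence[p] for p in players)
            out = random.choice([p for p in players if competence[p] == top])
        else:
            # eliminations fall uniformly at random
            out = random.choice(players)
        players.remove(out)
    a, b = players  # final challenge, odds proportional to competence
    return a if random.random() < competence[a] / (competence[a] + competence[b]) else b

def win_rate(target, competence, gang_up, trials=50_000):
    return sum(play(competence, gang_up) == target for _ in range(trials)) / trials

if __name__ == "__main__":
    competence = [1.0, 1.0, 1.0, 1.0, 3.0]  # four equals and one standout
    for gang_up in (False, True):
        label = "gang up on strongest" if gang_up else "random eliminations"
        print(f"{label}: weak player wins {win_rate(0, competence, gang_up):.1%}, "
              f"standout wins {win_rate(4, competence, gang_up):.1%}")
```

under these toy assumptions the standout wins about 30% of random-elimination games but 0% once the others coordinate, while each weak player's win rate climbs from under 18% to 25%: the survivor logic in miniature.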