Marketing rationalism

post by PhilGoetz · 2009-04-12T21:41:26.537Z · LW · GW · Legacy · 65 comments

Suppose you're a Protestant, and you want to convince other people to do what the Bible says to do.  Would you persuade them by showing them that the Bible says that they should?

Now suppose you're a rationalist, and you want to convince other people to be rational.  Would you persuade them with a rational argument?

If not, how?

ADDED:  I'm not talking about persuading others who already accept reason as final arbiter to adopt Bayesian principles, or anything like that.  I mean persuading Joe on the street who does whatever feels good, and feels pretty good about that.  Or a doctor of philosophy who believes that truth is relative and reason is a social construct.  Or a Christian who believes that the Bible is God's Word, and that things that contradict the Bible must be false.

Christians don't place a whole segment of the population off-limits and say, "These people are unreachable; their paradigms are too different."  They go after everyone.  There is no class of people with whom they are unsuccessful.

Saying that we have to play by a set of self-imposed rules in the competition for the minds of humanity, while our competitors don't, means we will lose.  And isn't rationality about winning?

ADDED:  People are missing the point that the situation is symmetrical for religious evangelists.  For them to step outside their worldview and use reason to gain converts is as epistemically dangerous for them as it is for us to gain converts using something other than reason.  Contemporary Christians consider themselves on good terms with reason; but if you look back in history, you'll find that many of the famous and influential Christian theologians (starting with Paul) warned explicitly against the temptation of reason.  The proceedings from Galileo's trial contain some choice bits on the relation between reason and faith.

Using all sorts of persuasive techniques that are not grounded in religious truth, and hence are epistemically repulsive to them and corrosive to their belief system, has proven a winning strategy for all religions. These are compromises; but they did not weaken those religions. They made them stronger.

65 comments

comment by gjm · 2009-04-12T21:56:48.010Z · LW(p) · GW(p)

Most people (even those who aren't rationalists) consider rationality to be usually a good thing. Most people who aren't Christians don't consider following the Bible to be particularly useful or meritorious. So I find your analogy unconvincing. But:

You should convince them by appealing to whatever they understand, in a way you can do with integrity. If there is no such way, then you're probably both wasting your time. I'd have thought that saying more than this would require consideration of the special characteristics of particular situations.

Replies from: orthonormal, David_Gerard, PhilGoetz
comment by orthonormal · 2009-04-12T23:52:48.001Z · LW(p) · GW(p)

To expand on this: one major arrow in rationality's quiver is that practically everyone (a few genuine postmodernists excepted) values some basic concept of rationality. If this weren't so, then political actors wouldn't get such mileage out of showing inconsistencies, biases and (purported) fallacies in their opponents.

Furthermore, the vast majority of people believe themselves to be epistemically rational, now excepting some fideists of various types as well (but even these usually have arguments for doing so that appeal to some sense of second-order rationality).

comment by David_Gerard · 2011-04-11T21:46:04.079Z · LW(p) · GW(p)

I'm not sure sweet reason will work. I do remember a sophisticated theist friend coming back from visiting our local Church of England, which is low enough church to be about one notch above the local Pentecostal churches, and includes recruits from the higher end of there. People who quite literally believe in a simplistic karmic model of the world, where good things happen to good people, so if bad things or good things happen to you it's because you're bad or good. He was horrified. I then ran EY's concentric series of retcons theory of religion past him and he had to concur to some degree. I've since given him The God Delusion to read, after I caught him bitching about it without having read it. I am now biding my time. Muwahahaha.

comment by PhilGoetz · 2009-04-12T22:01:15.393Z · LW(p) · GW(p)

You should convince them by appealing to whatever they understand, in a way you can do with integrity. If there is no such way, then you're probably both wasting your time.

Yet Christians manage the same trick, on a large scale.

There is a Mahayana Buddhist doctrine - it might have to do with the "doctrine of the lesser vehicle", but I forget - that says (paraphrased), "No one can be persuaded of the truth of Buddhism unless they already understand the truth of Buddhism. Therefore for their own good you may deceive them, and tell them that the study of this doctrine will give them the lesser things that they in their ignorance desire, to persuade them to follow it unto understanding."

Replies from: gjm, JulianMorrison
comment by gjm · 2009-04-12T22:46:50.597Z · LW(p) · GW(p)

I think a much-too-large fraction of how Christians manage it is by means that most people here would deplore: not merely because they appeal to something other than reason, but because they're actually anti-rational.

If you wish to proceed in that way, go ahead. My guess is that (1) rationalists in general will not do well using techniques that go so badly against the grain, (2) rationalists who do what it takes to use such techniques will tend to corrupt their own rationalism in so doing (because, e.g., the most effective way to fool others is to fool yourself first), and (3) the loss -- e.g., from people noticing that they've been tricked and deciding not to trust anything you've ever told them -- might well turn out to be greater than the gain anyway.

I remain unconvinced of the need, anyway: most people agree, at least in theory and in general, that rationality is good and useful. Convincing someone to be a rationalist might be harder; so focus instead on showing them how to be rational more effectively in particular cases where they agree that being rational is good. The principles generalize, after all.

comment by JulianMorrison · 2009-04-12T22:10:10.643Z · LW(p) · GW(p)

They cheat. Persuasion per se is not involved.

Replies from: loqi, PhilGoetz
comment by loqi · 2009-04-12T22:31:19.015Z · LW(p) · GW(p)

Is it cheating to suggest to a theist that the tools of rational thought can help them more fully understand God?

Replies from: JulianMorrison
comment by JulianMorrison · 2009-04-12T22:34:35.007Z · LW(p) · GW(p)

It would be a truth in the denotation and a dirty trick in the connotation, but it isn't what I meant by "cheating".

comment by PhilGoetz · 2009-04-12T22:18:51.701Z · LW(p) · GW(p)

How do they cheat? Can/should we cheat in a similar fashion?

Replies from: JulianMorrison
comment by JulianMorrison · 2009-04-12T22:26:10.886Z · LW(p) · GW(p)

They exploit brain hacks. Teaching kids. Guilt or shame and the promise of absolution. Peer pressure. Tribal cohesiveness. Force, fear, and pain-reinforcement. Mere exposure effect. Etc etc.

Basically you're asking "is the dark side stronger", and I refer you to Yoda.

Replies from: infotropism, PhilGoetz
comment by infotropism · 2009-04-12T22:36:15.224Z · LW(p) · GW(p)

Appealing to fictional evidence is dangerous too. Invoking the dark side of Star Wars will elicit cached thoughts. The Force is a fictional contraption devised by people for a story, and it doesn't work the same way rationality does.

That said, is it still OK to rob a bank to give to charity? We must be damn sure of our truth, and of the nobleness of our purposes, to lie others into the same understanding as ours.

Replies from: JulianMorrison
comment by JulianMorrison · 2009-04-12T22:52:53.249Z · LW(p) · GW(p)

I wonder if there's a bit of Aumann agreement in there. We might disagree with other people, but to just hack their brains cancels any useful updates we might have got from their unique knowledge.

Replies from: infotropism
comment by infotropism · 2009-04-12T23:08:01.048Z · LW(p) · GW(p)

That is much too complicated to be solved in one sentence. Ultimately, however, we'll be making a bet on the assurance that we must be right. If we indeed are, it makes sense to convert other people to our worldview, provided their objectives are similar to ours, since that will help them.

Historically, though, people who believed they were right have often turned out to be nowhere close. Would we be repeating that mistake if we said that what we advocate is the truth?

What do we advocate, anyway? It seems our vision of truth is much more flexible than any other seen so far. We don't even have a fixed vision; anything we believe at this point is liable to be rewritten.

It seems to me that to be a good rationalist, you should ideally not need someone else to show you unique knowledge that might change your mind. You should be able to do it yourself. But that idea can potentially be abused too.

comment by PhilGoetz · 2009-04-12T22:32:22.319Z · LW(p) · GW(p)

Yoda is an unabashedly religious moral realist. In his world, you can measure someone's goodness by the color of their lightsaber.

It is irrational to label a set of tools "dark arts" and place them off limits to us. EY has a justification for not using the "dark arts", but it's (my interpretation) supposed to be a lot more sophisticated than just calling them evil - and hence has many more possible exceptions or failure points.

Replies from: JulianMorrison
comment by JulianMorrison · 2009-04-12T22:42:39.330Z · LW(p) · GW(p)

I'm sure a rationalist society would teach its kids. That hack is hardly avoidable - people have to start from somewhere.

The other stuff has an obvious downside: it makes the victim dumber. Zombies are useful to theists but not to us. Also, it tangles the dark-sider in nonsense that they must subsequently defend. It makes them a practicing anti-rationalist in order to shore up their gains. In the end and with a sufficiently smart victim, it's simply fated to collapse, leaving bad odor all around.

Replies from: Strange7, PhilGoetz
comment by Strange7 · 2010-03-25T07:57:22.269Z · LW(p) · GW(p)

Also, it tangles the dark-sider in nonsense that they must subsequently defend.

Actually, I think that might be the best part: somebody starts to notice that it's nonsense, you take them aside and say, "Congratulations! Most of what I taught you was lies, and, of course, you can't trust me to say which is which. You'll just have to look at the evidence and figure it out for yourself."

comment by PhilGoetz · 2009-04-12T22:46:23.379Z · LW(p) · GW(p)

I think the same argument could be made against using anything other than Biblical principles to win converts to Christianity. A Christian church that believed those arguments would lose.

And aren't rationalists supposed to win?

Replies from: infotropism, JulianMorrison
comment by infotropism · 2009-04-12T22:51:09.751Z · LW(p) · GW(p)

Rationality is supposed to score a win (whenever that is possible). Rationalists only try to use rationality, to the best of their capability, to win. They may or may not succeed.

Replies from: PhilGoetz
comment by PhilGoetz · 2009-04-12T22:57:29.785Z · LW(p) · GW(p)

Looks to me like Christianity has the more winning strategy (where winning = gaining converts).

Replies from: gjm, infotropism, JulianMorrison
comment by gjm · 2009-04-12T23:26:13.639Z · LW(p) · GW(p)

It may well be that Christianity is winning (in that sense). That doesn't mean that it has a winning strategy: it might (and clearly does) have other advantages which rationalism doesn't have and either couldn't or shouldn't get.

Replies from: gjm
comment by gjm · 2009-04-13T01:47:21.683Z · LW(p) · GW(p)

I'm going to take the downvote I got for that as indicating that I wasn't clear enough and explain a bit further.

Suppose A beats B at some game. (Here A is Christianity, B is rationalism, the game is having as many people as possible onside.) It could be that this is because A is playing the game better than B. But A could also be winning for reasons that have nothing to do with how A and B are playing.

Example 1: two people are trying to outdo one another in getting many sexual partners. (I make no comment on the wisdom or morals of playing this game.) A might be winning by being physically more attractive, or by having a pile of inherited money and therefore more scope for generous gestures.

(... Perhaps Christianity just is more appealing to most people than rationalism; see, e.g., Pascal Boyer's theories about what sorts of belief tend to lodge in people's minds and form religious doctrines. Perhaps Christianity benefits from having been officially adopted by the Roman Empire and plenty of other empires since then, and spread by the sword or by economic intimidation.)

Example 2: two people are playing the game of making as much money as possible. A might be winning by virtue of getting lucky early on and therefore having more resources for the rest of the game.

(... Perhaps Christianity has many adherents now simply because it had many in the past, and people tend to pass on their religion to their children and to others around them.)

Example 3: two people are playing a game of tennis. A might be winning because she's friends with the referee, who calls balls in or out dishonestly to favour A.

(... Perhaps Christianity has many adherents because powerful people and institutions are Christian and others are intimidated or impressed by their status. Roman Empire, again -- or the US today.)

It's not hard to come up with further examples, but I'll leave it there. Rationalism doesn't have the option of being something different and more appealing, or changing history so as to have the advantage of lots of existing members; perhaps rationalists could somehow contrive to gain enough power to intimidate, or enough influence in schools etc. to brainwash, but it might not be possible to do that without becoming corrupted and ceasing to be rationalist.

These are all ways in which Christianity could "win" whether or not it employs a "winning strategy".

Replies from: PhilGoetz
comment by PhilGoetz · 2009-04-13T03:27:32.083Z · LW(p) · GW(p)

I didn't say that it was winning. I said it looked to me like it had a more winning strategy. Their strategy is to win converts by any means, as opposed to the rationalist strategy several people here are endorsing, which says that we can't use irrational persuasive methods. Comparing those two strategies, I predict the first will win.

comment by infotropism · 2009-04-12T23:15:17.277Z · LW(p) · GW(p)

Yes, where winning equates to gaining converts. But gaining converts, for us, ought to be only instrumental to a greater purpose. Many strategies may win in the short or medium term, being more explosive or efficient, but still lead to a dead end.

So what religion uses to gain converts, may not work for us, as it destroys our long term purposes. Though I find it difficult to disentangle what in those methods we could use, and what we couldn't.

Replies from: PhilGoetz
comment by PhilGoetz · 2009-04-13T03:30:54.763Z · LW(p) · GW(p)

I would call 2000 years long term. (In the set of strategy histories observed so far.)

Part of my point is that the methods they use to gain converts are also against their long term purpose. The fact that thoroughly-evolved religions do this indicates it is adaptive, despite the short-term hit to their worldview.

comment by JulianMorrison · 2009-04-12T22:58:57.142Z · LW(p) · GW(p)

What use is a dumbed down, brain-hacked convert? Are you using them to keep score, or something?

comment by JulianMorrison · 2009-04-12T22:49:44.126Z · LW(p) · GW(p)

What same argument? I don't follow.

Replies from: PhilGoetz
comment by PhilGoetz · 2009-04-12T23:02:14.356Z · LW(p) · GW(p)

Rationality is to a Christian somewhat as the Dark Arts are to us. Christians have often made conversions based on reason, even though giving reason legitimacy makes their converts "dumber" and less able to resist the temptation of reason.

They haven't said "these practices are off-limits to us". They strive for an optimal tradeoff between winning converts and corrupting their religion. We can consider their policies to have been selected by evolution. So we should be suspicious of claims that we, using reason, can find tradeoffs better than 2000 years of cultural evolution can. Particularly when our tradeoff ax + by involves suspicious numbers like a=0 and b=1.

Replies from: infotropism
comment by infotropism · 2009-04-12T23:56:02.376Z · LW(p) · GW(p)

Actually, quite a few Christians are very rational people. It is possible to use only some of the tools of rationality to dig your own grave even deeper than you could if you knew nothing of it.

Becoming a more sophisticated debater, for instance.

Those people don't consider "rationality" something negative, far from it. They have their own idea of what rationality is, of course, but that idea overlaps ours enough that the two concepts can be considered similar.

Replies from: PhilGoetz
comment by PhilGoetz · 2009-04-13T00:47:55.453Z · LW(p) · GW(p)

I'm oversimplifying; but if you go back into church history, especially pre-Enlightenment, you'll find that most of the major church fathers made statements explicitly condemning rationality.

comment by CannibalSmith · 2009-05-01T07:59:11.639Z · LW(p) · GW(p)

I would muster all their biases in my favor. I would try every trick in the book to influence the unconscious part of their mind because that's where it all happens.

For the conscious part, I'd tell them useful stuff and try to demonstrate it on the spot.

For new age people, I'd try to get them to try to attract rationality.

Replies from: David_Gerard
comment by David_Gerard · 2011-04-11T21:39:17.237Z · LW(p) · GW(p)

For new age people, I'd try to get them to try to attract rationality.

I must admit, I laughed out loud there.

Dark Side Epistemology exercise: go to Google, find a really painfully written web ad for a Pick-Up-Artist manual or something comparably cheesy and dark-side-susceptible. Rewrite the ad to be an ad for rationality techniques.

(Don't feel you have to show it to anyone, this is just an exercise.)

comment by robzahra · 2009-04-12T23:46:00.002Z · LW(p) · GW(p)

Some examples of what I think you're looking for:

  1. Vassar's proposed shift from saying "this is the best thing you can do" to "this is a cool thing you can do" because people's psychologies respond better to this
  2. Operant conditioning in general
  3. Generally, create a model of the other person, then use standard rationality to explore how to most efficiently change them. Obviously, the Less Wrong and Overcoming Bias knowledge base is very relevant for this.
Replies from: cousin_it, PhilGoetz
comment by cousin_it · 2009-04-13T09:13:35.490Z · LW(p) · GW(p)

create a model of the other person, then use standard rationality to explore how to most efficiently change them

Standard rationality tells me it's most efficient to lie to them from a young age.

Replies from: PhilGoetz
comment by PhilGoetz · 2009-04-13T12:56:01.601Z · LW(p) · GW(p)

(In this case, to tell the truth to them, we hope.)

comment by PhilGoetz · 2009-04-13T01:37:02.044Z · LW(p) · GW(p)

Thanks! Those are good examples. Although the fact that they wouldn't make me feel dirty makes me suspect we should go farther.

comment by ChrisHibbert · 2009-04-12T22:33:39.531Z · LW(p) · GW(p)

I think WW Bartley was striving to achieve just this goal in "The Retreat to Commitment". His approach (after much discussion of the Protestants' approach to philosophy as background) is to ask what the goal of thinking about thinking should be. He concludes that anyone who thinks the question is interesting must be looking for techniques to help find out what is true about the world we live in. If you and your interlocutor can agree on that, then you're well on your way to establishing correspondence with reality as the metric for better and worse choices of how to think.

Bartley's success in the book is in arguing well that you don't have to take any particular theory, approach, or metric as primary in the struggle to decide what works best. At any particular moment you have to have a place to stand, but if there are reasons to doubt the foundations you are using, you can stand somewhere else for a while and inspect those foundations.

At the end you want to come back to contemplation of the question as to which approaches seem to lead to the best understanding of the truth of the world.

comment by mwengler · 2016-02-10T15:56:29.936Z · LW(p) · GW(p)

This comment is in reply to some ideas in the comments below.

In my opinion, my rationality is as faith-based as is a religious person's religious belief.

Among my highest values is "being right" in the sense of being able to instrumentally effect or predict the world. I want to be able to communicate across long distances, to turn combustible fuel into safe transportation, to correctly predict what an interstellar probe will find and to be able to build an interstellar probe that will work. Looking at the world, I see much more success in endeavors like these from science and rationality than from religiosity or appeals to god. And so I adopt rationality as it supports my values.

I also want to raise healthy, happy, "good" children. I am pretty sure I could "help" my one child, who dabbles in alcohol, drugs, and petty theft, by going to church with him. I've known many people who are effective at doing things I see as good because, it seems, of their religious beliefs and participation in churches and religious communities. I liked being a Lutheran for a few years. One night I told our pastor that I just didn't believe in god. He told me he thought half the church had that happening. Even so, I couldn't stay engaged.

I feel the loss of religious faith as a sorrow, or a pain, or a burr under my saddle, or something. But I can't justify it. More importantly, I can only pretend to believe; actual belief does not seem to me to be a real option anymore.

And it turns out I have enough "faith" in scientific rationalism that I won't even pretend I believe in god. I choose to believe that staying consistent with rational principles will pay off more for me and those I care about than falling back to the more accessible morality of religious faith would. It is a leap of faith, especially in light of "rationalists win." If my son were to become a heroin addict and devote his life to petty theft, jail, and shooting up, AND I could have prevented that by bringing him to church, I will have paid a price for my faith, as much as any Christian martyr who was harmed, or whose family was harmed, because he did not deny his Christian belief.

People who think their rationality does not come from a faith they possess remind me of religious people who think their belief in god is just right, that it does not come from a faith that they possess or have chosen.

Replies from: DSimon
comment by DSimon · 2016-02-10T17:44:28.088Z · LW(p) · GW(p)

Taboo "faith", what do you mean specifically by that term?

Replies from: mwengler
comment by mwengler · 2016-02-11T15:38:29.250Z · LW(p) · GW(p)

Taboo "faith", what do you mean specifically by that term?

Good idea. I mean that EVERYBODY, rationalist atheist and Christian alike, starts with an axiom or assumption.

In the case of rationalist atheists (or at least some, such as myself), the axioms started with are things like 1) truth is inferred with semi-quantifiable confidence from evidence supporting hypotheses, and 2) explanations like "god did it" or "alpha did it" or "a benevolent force of the universe did it" are disallowed. I think some people are willing to go circular, allow the axioms to remain implicit, and then "prove" them along the way: I see no evidence for a conscious personality with supernatural powers. But I do claim that is circular: you can't prove anything without knowing how you prove things, and so you can't prove how you prove things by applying how you prove things without being circular.

So for me, I support my rationalist atheist point of view by appealing to the great success it has in advancing engineering and science. By pointing to the richness of the connections to data, the "obvious" consistency of geology with a 4 billion year old earth, the "obvious" consistency of evolution from common ancestors of similar structures across species right down to the ADP-ATP cycle and DNA.

But a theist is doing the same thing. They START with the assumption that there is a powerful conscious being running both the physical and the human worlds. They marvel at the brilliance of the design of life to support their claim even though it can't prove their axioms. They marvel at the richness of the human moral and emotional world as more support for the richness and beauty of conscious and good creation.

Logically, there is no logic without assumptions. Deduction needs something to deduce from. I like Occam's razor and naturalism because my long exposure to them leaves me feeling very satisfied with their ability to describe many things I think are important. Other people like theism because their long exposure to it leaves them feeling very satisfied with its ability to describe, and even prescribe, the things they think are important.

I am not aware of a definitive way to challenge axioms, and I don't think there is one at the level I think of it.

comment by imaxwell · 2009-04-15T17:08:02.011Z · LW(p) · GW(p)

It took me a long time to respond to this because I found the question resistant to analysis. My immediate impulse is to shout, "But, dammit, my rational argument is really truly actually valid and your bible quotation isn't!" This is obviously irrelevant since, by hypothesis, my goal is to be convincing rather than correct.

After thinking about it, I've decided that the reason the question was hard to analyze is that the hypothesis is so false for me. You haven't placed any constraints at all; in particular, you haven't said that my goal is

  • to convince others to be more rational via a correct argument, or
  • to convince others to be more rational, provided that this is true, or
  • to convince others to be more rational, as long as I don't have to harm anyone to do it, or
  • to convince others to be more rational, as long as I can maintain my integrity in the process.

If I take "convince others to be more rational" as a supergoal, then of course I should lie and brain-hack and do whatever is in my power to turn others into rationalists. But in reality, many of my highest values have less to do with the brain-states of others than with what comes out of my own mouth. Turning others into rationalists at the price of becoming a charlatan myself would not be such a great tradeoff.

I regularly "lose" debates because I'm not willing to use rhetoric I personally find unconvincing. (Though I'm probably flattering myself to suppose that I would "win" otherwise.) To give a specific example, I am deeply opposed to drug prohibition, while openly predicting that more people will be addicted to drugs if they are legally available. This is a very difficult position to quickly relay to someone who doesn't already agree with me, but any simplification would be basically dishonest. I could invent instrumental reasons why I shouldn't use a basically dishonest argument in this case, but the truth is that I just hate lying to people, even more than I hate letting them walk around with false, deadly ideas.

I imagine Eliezer and Robin run into this themselves, when they say that a certain unusual medical procedure only has a small probability of success, but should be undergone anyway because the payoff is so high. Many people will hear "low probability of success" and stop listening, and many of those people will therefore die unnecessarily. Does this mean Eliezer and Robin should start saying that there is a high probability of success after all, in order to better save lives?

Now maybe your point here is that yes, we all should be lying for the sake of our goals---that we should throw out our rules of engagement and wage total war on epistemic wrongness. I have considered this myself, and honestly I don't have a good rebuttal. I can only say that I'm not ready to be that kind of warrior.

comment by AndySimpson · 2009-04-13T09:51:31.380Z · LW(p) · GW(p)

I think coming to agreement on terms through a dialectic is something most everyone can agree to engage in, and I don't think it's offensive to or beyond the scope of rationality. Socrates' way is the sort of meta-winning way, the way that, if fully pursued, will arrive at the conclusion of rationality.

For instance, in any one of those cases, I could start with a dialectic about problem-solving in everyday life, or at least general cases, and proceed to the principle that rationality is the best way. I'd try to come to agreement about the methods we use to diagnose a car problem, calculate how much we owe in taxes, or decide to enter an intersection, and extrapolate to epistemology from there. The philosopher, the Christian, and the hedonist all use reason, not will-to-power, faith, or desire, to fix and drive their cars and pay their taxes, and this gives the evangelist of reason a method of proving the epistemological assertion that there is such a thing as truth, which we encounter in passing, and that rationality is the optimal way to approach it.

comment by AlexU · 2009-04-13T04:11:35.462Z · LW(p) · GW(p)

People are irrational largely because they're stupid. I have yet to be convinced that "rationality" is something entirely distinct from intelligence itself, such that you can appeal to someone to become significantly more "rational" without simultaneously effecting the seemingly tougher feat of boosting IQ a standard deviation or so.

Replies from: conchis
comment by conchis · 2009-04-13T11:40:56.109Z · LW(p) · GW(p)

For some evidence to the contrary (and the beginnings of a theory about when cognitive ability will correlate with rationality and when it won't), try this:

  • Stanovich, K. E., & West, R. F. (2008). On the relative independence of thinking biases and cognitive ability. Journal of Personality and Social Psychology, 94, 672-695.

(More here.)

comment by badger · 2009-04-12T22:27:01.229Z · LW(p) · GW(p)

The approach Jaynes takes in the opening chapters of Probability: The Logic of Science, based on Cox's theorem, was very persuasive to me and the few others I've mentioned it to. The basic idea is to start with a few criteria that just seem like common sense, which everyone should agree are desirable in a reasoning system. Then Jaynes shows that probability theory is the only possible system that fulfills these criteria.

Is there a similar approach that can be used to argue for rationality in general? I like to appeal to universality: different subjects should be governed by the same rules, and science and rationality have been enormously successful, so why shouldn't they be applied universally? Unfortunately, this approach can easily be abused. Can we formulate a good approach of this sort that isn't just leading people to say what we want?

Replies from: smoofra, PhilGoetz
comment by smoofra · 2009-04-12T23:49:19.029Z · LW(p) · GW(p)

I hate everything Jaynes has written about Cox's theorem. He glosses over assumptions and pretends the assumptions he admits to are weaker than they are.

Go back and read it again. Cox's theorem isn't anywhere near as strong as Jaynes makes it out to be.

Replies from: badger, Psy-Kosh
comment by badger · 2009-04-13T00:35:06.325Z · LW(p) · GW(p)

Will do. I just obtained a hard copy of P:tLoS to study a little more seriously.

Are there any specific issues to watch out for? Is there a better source for understanding Cox's theorem you could point me to?

Replies from: smoofra
comment by smoofra · 2009-04-13T04:09:01.317Z · LW(p) · GW(p)

http://www.stats.org.uk/cox-theorems/Halpern1999a.pdf

Halpern gives a correct proof of one of the rigorous variations on Cox's theorem, and gives a counterexample to Cox's theorem for a set of propositions that's too small to satisfy the density requirement.

comment by Psy-Kosh · 2009-04-13T01:09:38.462Z · LW(p) · GW(p)

There're a couple things he seemed to gloss over, but those seemed more like "boilerplate that was 'more of the same' for certain bits, IIRC" rather than "significant things that we're missing significant bits of"... but then, I guess "glossing over" is a problem because it makes things seem like that. :)

Anyways, I happen to be a fan of vulnerability/coherence/dutch book style arguments. I mean, for cleanliness/simplicity, those just win hands down. (a touch of, at most, linear algebra vs the functional analysis of Cox's Theorem? :)) And in some forms build up decision theory right at the same time!
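(For anyone who hasn't seen one, here is the standard toy Dutch book; the 0.6/0.6 numbers are purely illustrative, my own and not from the thread:)

```latex
% Standard toy Dutch book against incoherent betting prices.
\documentclass{article}
\begin{document}
Suppose an agent prices a bet paying $1$ if $A$ at $0.6$, and a bet
paying $1$ if $\lnot A$ also at $0.6$, so its ``probabilities'' sum to
$0.6 + 0.6 = 1.2 > 1$. A bookie sells the agent both bets. Exactly one
of them pays out, so the agent's net is
\[
  1 - (0.6 + 0.6) = -0.2
\]
whichever way $A$ turns out: a guaranteed loss. Prices satisfying the
probability axioms admit no such sure-loss book.
\end{document}
```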

Although, now I'm wondering... just how much weaker is Cox's theorem than Jaynes makes it sound?

Replies from: smoofra
comment by smoofra · 2009-04-13T04:04:07.312Z · LW(p) · GW(p)

The proof in Jaynes shows that if you want to assign plausibilities to propositions, and those plausibilities are going to be real numbers, and P(a∧b|c) is a function of P(a|b∧c) and P(b|c), and P(¬a|c) is a function of P(a|c), and all those functions are twice-differentiable, and P satisfies a certain density requirement, then P has to be isomorphic to probability.

It just doesn't have the same philosophical punch as "a few criteria that just seem like common sense that everyone should agree are desirable in a reasoning system." when you actually spell out the assumptions and they contain seemingly unjustified technical things like differentiability and density.
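(Written out, as I read Jaynes; the notation and LaTeX here are my own sketch, not his exact statement:)

```latex
% A sketch of the assumptions listed above; notation mine, not Jaynes's.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Plausibilities are real numbers, and there exist functions $F$ and $S$
such that
\begin{align*}
  P(a \wedge b \mid c) &= F\bigl(P(a \mid b \wedge c),\, P(b \mid c)\bigr),\\
  P(\lnot a \mid c)    &= S\bigl(P(a \mid c)\bigr),
\end{align*}
with $F$ and $S$ twice differentiable and the attainable values of $P$
dense in an interval. The conclusion is that some monotone rescaling of
$P$ obeys the product and sum rules of probability.
\end{document}
```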

There are a bunch of rigorous variations on Cox's theorem, but as far as i know there is nothing that lives up to the hype.

Replies from: Psy-Kosh
comment by Psy-Kosh · 2009-04-13T16:55:49.987Z · LW(p) · GW(p)

Well, some of those criteria at least seem perfectly reasonable.

As for which thing was a function of what: IIRC, he kinda went through that, discussing some examples and basically outlining an argument for what sort of things could depend on what versus what would lead to absurdities, so the "this is a function of this and that" wasn't pulled out of thin air.

comment by PhilGoetz · 2009-04-12T22:39:22.267Z · LW(p) · GW(p)

I'm not talking about convincing people who believe in reason to use probability theory. I meant people who don't accept reason as the final arbiter in arguments. Which may still be most people.

Replies from: badger
comment by badger · 2009-04-12T23:03:53.390Z · LW(p) · GW(p)

I may have been unclear. I only meant Jaynes's approach as an analogy. I was speculating whether an approach based on common-sense desiderata would work as well for rationality in general as it does for probability.

comment by [deleted] · 2015-12-13T06:38:34.239Z · LW(p) · GW(p)

I acknowledge the symmetry and appreciate the conceptual novelty of your argument to my idea set. It's a very powerful interpretation of some behaviour I have observed.

Has it been experimentally tested?

comment by lavalamp · 2009-04-17T19:43:32.683Z · LW(p) · GW(p)

ADDED: People are missing the point that the situation is symmetrical for religious evangelists. For them to step outside of their worldview, and use reason to gain converts, is as epistemically dangerous for them, as it is for us to gain converts using something other than reason.

Disagree. Most people know very little of logic and reason, and will not bother to do their homework upon being approached with Christian arguments based on reason.

comment by JulianMorrison · 2009-04-12T21:59:53.392Z · LW(p) · GW(p)

A moral argument is often a good start.

Replies from: PhilGoetz
comment by PhilGoetz · 2009-04-12T22:37:24.804Z · LW(p) · GW(p)

Do you mean a utilitarian argument? Rational behavior helps us not kill each other and build more cool stuff, something like that?

A purely moral argument seems difficult; it would go something like, "Rationalism is good. Period."

Replies from: JulianMorrison
comment by JulianMorrison · 2009-04-12T22:43:52.494Z · LW(p) · GW(p)

Instrumental rationality as a tool for moral ends. Epistemological rationality as a tool for instrumental ends. Oh look, now we have to revise our worldview.

Replies from: infotropism, PhilGoetz
comment by infotropism · 2009-04-13T00:55:05.282Z · LW(p) · GW(p)

In fact, I've never seen the relationship between epistemic rationality, instrumental rationality, and real-world objectives so clearly stated. Would it be OK to collect such material in the wiki? Like all those short, concise, illuminating quotes we stumble upon in here?

Replies from: JulianMorrison
comment by JulianMorrison · 2009-04-13T01:03:44.014Z · LW(p) · GW(p)

No objection here.

comment by PhilGoetz · 2009-04-13T00:44:42.422Z · LW(p) · GW(p)

Sneaky. I like it.

Replies from: JulianMorrison
comment by JulianMorrison · 2009-04-13T01:02:50.131Z · LW(p) · GW(p)

That is, after all, how Christianity got sucker-punched the first time around.