Comments

Comment by Johnny_Logic on Professing and Cheering · 2007-08-02T15:58:16.000Z · LW · GW

More and more, I get the sense that the metaphor-loving religious are promoting something like their right to willingly suspend disbelief, as a gamer does when immersed in a 'verse like WoW. It has the same virtues: community, immersion, the thrill of exercising imagination and participating in grand narratives. Only, World of Warcraft buffs don't let their fantasy life impinge on the public sphere as often. I'm aware that I will likely receive flak for drawing this analogy, as it seems terribly dismissive.

Comment by Johnny_Logic on Bayesian Judo · 2007-08-02T03:23:47.000Z · LW · GW

A less personal response to the second bit I quoted from Mark D: Yes, changing our beliefs in the face of good evidence and argument is desirable, and to the extent that we are able to do this we can be called critical thinkers.

Comment by Johnny_Logic on Bayesian Judo · 2007-08-02T03:09:59.000Z · LW · GW

Mark D,

"JL, I’ve programmed in several languages, but you have me correctly pegged as someone who is more familiar with databases. And since I’ve never designed anything on the scale we’re discussing I’m happy to defer to your experience. It sounds like an enormously fun exercise though."

There are programs (good ol' chatterbots) that use methods like the ones you supposed, but they are far from promising. No need to defer to me-- I am familiar with machine learning methods, some notable programs, and the philosophical debate, but I am far from an expert on AI, and I would listen to counterarguments.

"Would you put aside your convictions and adopt religion if a skilful debater put forward an argument more compelling than yours? If you were to still say “no” in the face of overwhelming logic, you wouldn’t justifiably be able to identify yourself as a critical thinker. And THAT’S what I was driving at."

It is not the skillfulness of the debater that is the issue, but the quality of the reasoning and evidence given the magnitude of the claims. I have sought good arguments and found them all to be seriously lacking. If I were presented with a very good argument (overwhelming evidence would be better), though, I would like to think I would be able to change my beliefs. Of course, such a new belief would not be immune to revision in the future. Also, knowing what I do about the many ways we fool ourselves and our reasoning fails, I may be wrong about my ability to change cherished unbeliefs, but I do try. Keeping an open, curious, yet appropriately critical attitude toward everything is not easy even when we are at our best, and may not even be possible.

"I don’t really have any passion for debating so I’ll leave it there. I’m sure EY can pass along the email address I entered on this site if you’re determined to talk me out of my wayward Christianity."

I trust that you are serious about signing off, so I will leave you with a few questions that I do not expect to be answered but that are, in my opinion, worth considering: Are there any conditions under which you would reject Christianity? Why do you believe in your flavor of Christianity, rather than anything else? Are these good reasons? Is your belief proportionate to the evidence, or total? Would you accept this kind of reasoning in other domains (buying a car, convicting a criminal), or if it led to different conclusions than yours (Islam, Mormonism)? Why or why not?

"Best of luck to you all"

Cheers.

Comment by Johnny_Logic on Bayesian Judo · 2007-08-01T22:58:33.000Z · LW · GW

Mark M.,

"His beliefs have great personal value to him, and it costs us nothing to let him keep them (as long as he doesn’t initiate theological debates). Why not respect that?"

Values may be misplaced, and they have consequences. This particular issue doesn't have much riding on it (on the face of it, anyway), but many do. Moreover, how we think is in many ways as important as what we think. The fellow's ad hoc moves are problematic. Ad hoc adjustments to our theories/beliefs to avoid disconfirmation are like confirmation bias and other fallacies and biases-- they are hurdles to creativity, to making better decisions, and to increasing our understanding of ourselves and the world. This all sounds more hard-nosed than I really am, but you get the point.

"By definition, wouldn’t our AI friend have clearly defined rules that tell us what it believes?"

You seem to envision AI as a massive database of scripts chosen according to circumstance, but this is not feasible. The number of scripts needed to cover intelligent behavior would be astronomically large. No, an AI need not have "clearly defined rules" in the sense of being intelligible to humans. I suspect anything robust enough to pass the Turing Test in any meaningful (non-domain-restricted) sense would either be too complicated to decode or predict upon inspection, or would be the result of some artificial evolutionary process that would be no more decodable than a brain. Have you ever looked at complex code? It can be difficult if not impossible for a person to understand as code, let alone to anticipate all the ways it may behave when run (hence bugs, infinite loops, etc.). As Turing said, "Machines take me by surprise with great frequency."

"You’ll just have to take my word for it that I had other unquantifiable impulses."

But you would not take the word of an AI that exhibited human-level robustness in its actions? Why?

"I think you might be misapplying the Turing test. Let’s frame this as a statistical problem. When you perform analysis, you separate factors into those that have predictive power and those that don’t. A successful Turing test would tell us that a perfect predictive formula is possible, and that we might be able to ignore some factors that don’t help us anticipate behaviour. It wouldn’t tell us that those factors don’t exist however."

Funny, I'm afraid that you might be misapplying the Turing Test. The Turing Test is not supposed to provide a maximally predictive "formula" for a putative intelligence. Rather, passing it is arguably supposed to demonstrate that the subject is, in some substantive sense of the word, intelligent.

Comment by Johnny_Logic on Bayesian Judo · 2007-08-01T18:20:29.000Z · LW · GW

Where do people get the impression that we all have the right not to be challenged in our beliefs? Tolerance is not about letting every person's ideas go unchallenged; it's about refraining from other measures (enforced conformity, violence) when faced with intractable personal differences.

As for politeness, it is an overrated virtue. We cannot have free and open discussions if we are chained to the notion that we should not challenge those who cannot countenance dissent, or that we should be free from the dissent of others. Some people should be challenged often and publicly. Of course, the civility of these exchanges matters, but, as presented by Eliezer, no serious conversational fouls or fallacies were committed in this case (contemptuous tone, ad hominems, tu quoque or other Latinate no-nos, etc.).

Mark D,

How do you know what the putative AI "believes" about what is advantageous or logical? How do you know that other humans are feeling compassion? In other words, how do you feel about the Turing test, and how, other than by their behavior, would you be able to know what people or AIs believe and feel?