Persuasiveness vs Soundness
post by Patrick · 2009-04-13T08:43:12.255Z · LW · GW · Legacy · 19 comments
Compare the following two arguments.
- E is described by the following axioms.
- Therefore, under E, the square of the longest side of a right angle triangle is equal to the sum of the squares of the remaining two sides.
- All men are mortal.
- Socrates is a man.
- Therefore Socrates is mortal.
Naively, the second argument seems tautological, whereas with the first it's much harder to tell. Of course, in reality the first argument is the tautology and the second argument is the more dubious one. The phrase "immortal man" doesn't seem contradictory, and how do we know Socrates is a man? He could be an android doppelganger. And the first argument's conclusion completely follows from Euclid's axioms; Euclid and hundreds of other mathematicians have proved that it does.
So, why does the average human think the opposite?
The arguments that change what we think and the arguments that would change what a logically omniscient Bayesian superman wielding Solomonoff's Lightsaber thinks are not very tightly correlated. In fact, there's a whole catalogue dedicated to the types of arguments that persuade us but wouldn't persuade a Bayesian superman: the logical fallacies. But it's not just argument structure that causes us to lose our way; another factor is how well the argument is written.
People are more persuaded by essays from Eliezer Yudkowsky, Bertrand Russell, Paul Graham or George Orwell than they would be by a forum post from an average 13-year-old atheist, even if both make the exact same point. This threw me for a bit of a loop, until I realized that Eliezer was pitching to Bayesian supermen as well as to us mortals. The degree to which stylish writing correlates with truth, compared to unstylish writing, is nowhere near the degree to which stylish writing persuades us, compared to unstylish writing.
There's also the dreaded intuition pump. I think the reason it's so maligned is that it makes things much more persuasive without making them any more sound. A well-chosen metaphor can do more to the human mind than a thousand pages of logic. Of course, we *want* intuition pumps for things that are actually true, because we want people to be persuaded of things that are true and, more importantly, we want them to be able to reason about things that are true. A good metaphor can enable this reasoning far more effectively than a list of axioms.
The problem lies in both directions: we aren't always persuaded by cogent arguments, and we are sometimes persuaded by crappy arguments that are delivered well. I put it to Less Wrong readers: how can we reduce the gap between what we are persuaded by and what a Bayesian superman would be persuaded by?
19 comments
comment by mcook10128 · 2009-04-13T20:31:25.295Z · LW(p) · GW(p)
The syllogism's premises are not proven, they are assumed. It is really not relevant whether Socrates is an alien android. IF the premises are true, the conclusion follows. We go along with it easily because the premises happen to "seem" true. Tell someone "All A is B, C is A, therefore C is B" and they might not view it as tautological. So I think part of the answer to your question is that verbal/linguistic operations take place more automatically than formal symbolic manipulations in our minds, having been learned at an earlier age. It's also a valid syllogism to say:
All alien androids are mortal, Socrates is an alien android, therefore Socrates is mortal.
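A minimal sketch of that point in Lean 4 (my addition, not part of the original comment): the derivation never inspects what A, B, or c mean, which is exactly why the syllogism is valid regardless of whether its premises are true.

```lean
-- Validity is purely structural: the proof works for any
-- predicates A, B and any individual c.
example {α : Type} (A B : α → Prop) (c : α)
    (h1 : ∀ x, A x → B x)  -- "All A is B"
    (h2 : A c)             -- "C is A"
    : B c :=               -- "therefore C is B"
  h1 c h2
```

Substituting Man, Mortal, and Socrates — or AlienAndroid, Mortal, and Socrates — changes nothing about the proof.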
comment by Jack · 2009-04-14T01:30:20.506Z · LW(p) · GW(p)
You're comparing the soundness of the second argument with the validity of the first. Yes, Euclid and lots of mathematicians have proved that the "argument's conclusion completely follows from Euclid's axioms", but that just means the argument is valid. Similarly, the second argument's conclusion follows from its premises.
The first premise of the first argument is an assumption. I have no idea how one would show that to be true. E is described by Euclid's axioms because we've decided that's a helpful way to describe it. Usually, the first two premises of the second argument are taken to be assumptions also, but we can come up with very good empirical reasons for believing them to be true if we have to. I don't really think either argument is more sound, except that questioning the truth of the first premise of the first argument is probably a category error.
In any case, 13-year-old atheists usually screw up the arguments. Do you think a 13-year-old atheist would win a debate with, say, Alvin Plantinga?
comment by CronoDAS · 2009-04-13T19:07:30.155Z · LW(p) · GW(p)
Hmmm...
There is a definite sense in which the second "theorem" is much simpler than the first. To get from "All men are mortal" and "Socrates is a man" to "Therefore Socrates is mortal" requires only a very short proof in first-order predicate logic.
- ∀x(Man(x) → Mortal(x)) [For all x, if x is a man, then x is mortal] (premise 1)
- Man(Socrates) → Mortal(Socrates) [If Socrates is a man, then Socrates is mortal] (universal instantiation, applied to 1)
- Man(Socrates) [Socrates is a man] (premise 2)
- Mortal(Socrates) [Socrates is mortal] (modus ponens, applied to 2 and 3)
In most formal proof systems, the proof of the Pythagorean Theorem from the axioms of Euclidean geometry would be much, much longer.
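As a rough illustration (my addition, assuming Lean 4 syntax), the same four-step derivation can be written out in a few lines, with one step per line:

```lean
-- A sketch mirroring the four numbered proof steps above.
example {Person : Type} (Man Mortal : Person → Prop) (socrates : Person)
    (premise1 : ∀ x, Man x → Mortal x)  -- 1. all men are mortal
    (premise2 : Man socrates)           -- 3. Socrates is a man
    : Mortal socrates := by
  -- 2. universal instantiation, applied to premise 1
  have inst : Man socrates → Mortal socrates := premise1 socrates
  -- 4. modus ponens, applied to steps 2 and 3
  exact inst premise2
```

A comparable formal derivation of the Pythagorean theorem from geometric axioms would run far longer.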
comment by dclayh · 2009-04-13T17:36:35.634Z · LW(p) · GW(p)
I believe you mean, "The square of the longest side of a right triangle is equal to the sum of the squares of the remaining two sides."
Or alternatively, "The square of the longest side of a triangle is equal to the sum of the squares of the remaining two sides, minus twice the product of those two sides and the cosine of the angle between them."
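In symbols (my rendering, not part of the original comment), with a, b, c the side lengths and C the angle opposite side c, the two statements are the Pythagorean theorem and the law of cosines:

```latex
c^2 = a^2 + b^2              % right triangle, c the hypotenuse
c^2 = a^2 + b^2 - 2ab\cos C  % any triangle, C the angle opposite c
```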
comment by infotropism · 2009-04-13T11:16:39.114Z · LW(p) · GW(p)
I'd also link this to the recent marketing rationalism post. Using "dark art" means, an argument can be made as compelling as, or more compelling than, it otherwise would be, all things being equal.
The question there seems to be: is it possible to use those means without negative side effects on the reader and writer, when we know how easily the human mind can be displaced out of the (fairly artificial) state of rationality?
Maybe we should cautiously train ourselves in both the defense against, and the use of, "dark arts", at the very least to be able to spot them when they're used, and maybe to know when it can be right to use them. This should follow the lesson in It's okay to be (at least a little) irrational. We are running on a human brain. If we use, or even encounter, the dark arts, we run the risk of warping our own judgement; but if we shy away from them, pretending not to notice, we run the risk of using them without even realizing it.
We need a safe, error-recoverable way to delve into these. How could we go about doing that?
Replies from: AlanCrowe
↑ comment by AlanCrowe · 2009-04-13T13:59:06.816Z · LW(p) · GW(p)
I don't think it is enough to split the "dark arts" along the true/false axis. We also need to split along internal/external.
Consider the case that we have tried very hard to avoid being taken in by false arguments and have at long last reached a true and useful conclusion. Now what? We still have to remember our conclusion and refresh it so that it doesn't slowly fade from view. Harder still, if we want our life to change, we need to find emotional equivalents for our intellectual understanding.
So I think that there are good internal uses for the "dark arts". Once you have made your rational decision, find a slogan that sticks in the memory and motivational techniques with personal resonance, even if they are not entirely honest. Of course, if one has made a mistake with one's initial assessment, one is now digging oneself a very deep hole; they aren't called dark arts for nothing.
Replies from: AlexU, PhilGoetz
↑ comment by AlexU · 2009-04-13T14:08:41.312Z · LW(p) · GW(p)
What the hell are the "dark arts"? Could we quit playing super-secret dress-up society around here for one day and just speak in plain English, using terms with known meanings?
Replies from: robzahra, Annoyance
↑ comment by robzahra · 2009-04-13T14:23:21.518Z · LW(p) · GW(p)
This is the Dark Side root link. In my opinion it's a useful chunked concept, though maybe people should be hyperlinking here when they use the term, to be more accessible to people who haven't read every post. At the very least, the FAQ builders should add this, if it's not there already.
Replies from: Eliezer_Yudkowsky, AlexU
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-04-13T15:38:29.855Z · LW(p) · GW(p)
Actually, the term "Dark Side Epistemology" seems to be tending towards over-generalization (being used to describe any persuasive art, say, rather than explicitly defended systematized bad rules of reasoning). "Dark Arts" isn't even a term of my own invention; someone else imported that from Harry Potter. It seems to be trending towards synonymy with "Dark Side". I may have to deprecate both terms as overly poetic and come up with something else - I'm thinking of Anti-Epistemology for systematically bad epistemology.
Replies from: komponisto, JulianMorrison, infotropism
↑ comment by komponisto · 2009-04-13T23:23:08.527Z · LW(p) · GW(p)
I have to say that I like the term "Dark Arts". It's kind of... cute.
I enjoy the sort of warm-and-fuzzy atmosphere that poetic vocabulary like this tends to foster.
↑ comment by JulianMorrison · 2009-04-13T19:49:26.026Z · LW(p) · GW(p)
We do actually need a term for "persuasion by cold-blooded dirty tricks, even though possibly for noble ends".
↑ comment by infotropism · 2009-04-13T16:14:35.710Z · LW(p) · GW(p)
Okay, in this case, how about this:
We have persuasive arguments in general: those which may be used to efficiently change someone else's opinion in a predictable way (for instance, to bring their opinion or beliefs closer to yours).
Those persuasive arguments may or may not be intellectually honest or epistemically correct ones. The subset of persuasive arguments which are thus wrong overlaps with the set of arguments and techniques used in "anti-epistemology", which is more general and contains methods and arguments that are not only wrong and deceptive, but also of no use to us.
As for the subset of arguments and methods which are epistemically wrong or dishonest, what distinguishes them is that they are instrumentally effective; whether the end result of their use is instrumentally rational therefore hinges on the epistemic rationality, and honesty, of the one who uses them.
↑ comment by AlexU · 2009-04-13T14:36:18.670Z · LW(p) · GW(p)
I'm certainly not against using chunked concepts on here per se. But I think associating this community too closely with sci-fi/fantasy tropes could have deleterious consequences in the long run, as far as attracting diverse viewpoints and selling the ideas to people who aren't already predisposed to buying them. If Eliezer really wanted to proselytize by poeticizing, he should turn LW into the most hyper-rational, successful PUA community on the Internet, rather than the Star Wars-esque roleplaying game it seems to want to become.
Replies from: robzahra
↑ comment by PhilGoetz · 2009-04-13T15:43:30.868Z · LW(p) · GW(p)
Anders Sandberg is (or was, 10 years ago) a technopagan. As near as I can tell, this means using the dark arts on yourself in a controlled and planned manner for self-improvement.
Replies from: timtyler
↑ comment by timtyler · 2009-04-13T16:06:03.141Z · LW(p) · GW(p)
I see no mention of "dark arts" on: http://en.wikipedia.org/wiki/Pagan
comment by AllanCrossman · 2009-04-13T08:57:37.124Z · LW(p) · GW(p)
Links are broken. You need to select the text you want to make a link, then click on the chain thing.
Replies from: Patrick