Dark Arts: Schopenhauer wrote The Book on How To Troll

post by Raw_Power · 2011-07-05T13:13:59.722Z · LW · GW · Legacy · 57 comments

AKA "The Art Of Controversy" AKA "The Art Of Always Being Right" AKA "Eristic Dialectic". Here's a pretty fun, illustrated version of the text, in actual Troll terms. Here's an audiobook.

EDIT: In this article I adopt a bit of a Devil's Advocate attitude. I'm not entirely convinced of what I'm suggesting, but I'll try to give it my all to make it look at least worth considering. I might get carried away at some points and overtly relish the villainy like a mad Britannian prince, which is unsightly, and, more importantly, unwarranted, so please forgive that. I'll leave those elements in, so that this is a Self Demonstrating Article.

So, the rationale is as follows: sometimes you get into an argument with someone. You're not quite sure you're right. You're not quite sure he's right. Even if you play fair, there's no guarantee it's the truth that will come out. A few hours later, you might think up an argument that would have saved your cause; you just failed to think of it during the discussion itself. And usually it's not just a matter of finding the truth.

First, it's a matter of "being right". If you want to clash intellects, there's no more violent, crude, intimate way than this. When you're proven wrong in a discussion, especially in public and in a way that makes you look like an idiot, your ego could get hit hard. Not to mention your status. Back when this book was written, people killed themselves, and each other, over this stuff.

Second, beside your own pride and life, there might be stuff bigger than yourself riding on this. You just can't afford to stick to the truth, or to give up just because the other side has better arguments. You gotta win, in the eyes of the public, no matter what.

This book does a fairly good job of singling out different tricks to bullshit your way into winning an argument. Or at least to stall for time, take your opponent off-balance, and distract them while you think of something legitimate to say. Let's review a non-comprehensive list of the tricks he proposes (the cartoon site and the full text are much more thorough, having one or more examples per case and being very eloquently phrased by the writer himself).

Let's classify them by blocks:

I'm just surprised Schopenhauer isn't an Internet idol by this point. I'm also pleasantly surprised at how our discussions avert most of this stuff, most of the time. Then again, our motivations are different from the usual, are they not? But what about our relation to the general public? Suppose one of us accepted an invitation to O'Reilly's show? What about convincing people to donate when there just isn't time to convince them of how important our cause is or how we are the right people to carry it out (not to mention we're not quite in consensus, or certain, of either point ourselves)?

Spartans were famed for their laconic way of communicating. In fact, the term "laconic" derives from Laconia, their homeland. It was an actual course in their education: teachers would mock them and provoke them, and the kids would be punished harshly unless they could respond quickly, forcefully and wittily. I think we should train ourselves in this. There is a time and a place for careful deconstruction of the opponent's arguments, and careful weighing of what is right and wrong. There's another for trusting the heuristics you're following and acting on them now. Sometimes you just have to win, and worry about the truth later. So we should learn to identify when exactly the gloves should come off, and learn how to take them off quickly, so that we are never caught off-guard. If the very existence of humanity is riding on this project, I think a little verbal swashbuckling is the *least* we can allow ourselves in terms of consequentially moral leeway.

Not that just sticking to the truth is entirely ineffective, but opponents aren't always as malleable as the one in that example, we're not all as smart and witty as Eliezer, and sometimes the inferential distances are just too huge not to resort to Dan Browned, Conviction By Counterfactual Clue, or Lies To Children for the sake of expediency (there's an entire rule in Schopenhauer's book dedicated to the case of debating technical matters before an untrained public, and he provides a really good example, to boot).


This article suggests that learning about, and perhaps embracing, the Dark Arts may be a useful, if not outright necessary, means to achieve our goals. The author, on the other hand, isn't so sure. However, at the very least, I think we should know about this stuff, if only in a Defense Against The Dark Arts way, and make and study a list of similar, more contemporary works that would give us a better results-to-time-investment ratio in learning these tricks and others, and, more importantly, their counters.


BTW, Robert Greene's books, despite being rather unscientific, are very promising in that regard. Their advice is fairly useless if you want to apply it, but once you've gone through all the contents (and there's a lot of stories there) you'll be on guard against practically anything: it's really hard to beat the Epic Fails he lists there, which are all the more epic because they usually involve smart, perceptive, strong, powerful people, and they all still fall for the exact same tricks, over and over again. They're worth reading if only because they are fascinating narrative anthologies, and a very fun intellectual read, and we are very much in favor of fun and intellectualism, right?

Also, for those that have followed this article from the start, notice how the successive rewrites make it a self-demonstration of the "defense by subtle distinction" rule. Whether its use here was legitimate or not is left to the reader.


EDIT: As usual, TVTropes never fails to pleasantly surprise. Here is their wittily written, fairly comprehensive list of fallacies: they called it You Fail Logic Forever. Remember that fallacies are just part of the Dark Arts of Winning Debates, and a very dangerous bluff if your opponent calls you out on them, second only to counterfactual arguments.


Comments sorted by top scores.

comment by TimFreeman · 2011-07-08T23:13:35.947Z · LW(p) · GW(p)

I have a fear that becoming skilled at bullshitting others will increase my ability to bullshit myself. This is based on my informal observation that the people who bullshit me tend to be a bit confused even when manipulating me isn't their immediate goal.

However, I do find it very useful to be able to authoritatively call out someone who is using a well-known rhetorical technique, and for that reason I have found reading "The Art of Controversy" worthwhile. The obviously useful skill is to be able to recognize each rhetorical technique and find a suitable retort in real time; the default retort is to name the rhetorical technique.

Replies from: deepthoughtlife, Raw_Power
comment by deepthoughtlife · 2011-07-09T07:32:54.165Z · LW(p) · GW(p)

Why shouldn't you want to bullshit yourself? You'll get to believe you are the most successful man on earth, even after getting evicted. Your children will all be geniuses who will change the world, even after flunking out of high school. Your arguments will be untouchable, even after everyone else agrees you lost. Obviously, I believe said fear is highly legitimate, if the premise is true.

People who are talking bullshit do generally seem to be confused in my experience as well, but BS being caused at least in part by that confusion seems to be a highly likely scenario. Some things done in an external setting do affect similar internal processes, but not all.

A (quick and dirty) inductive argument follows:

Premise 1: It is far easier to BS than to logically analyze and respond.
Premise 2: It is far faster to BS than to logically analyze and respond.
Premise 3: People prefer to do things that are easier, ceteris paribus.
Premise 4: People prefer to do things that are faster, ceteris paribus.
Premise 5: People very strongly do not want to be wrong.
Premise 6: Losing the argument is a significant proxy for being wrong.
Premise 7: Winning the argument is a significant proxy for being right.

Intermediate Conclusion 1: If BS wins you the argument, you will prefer BS to logical analysis and response.
Intermediate Conclusion 2: If BS loses you the argument, you will regard BS far more poorly as an option.
Intermediate Conclusion 3: Being good enough at BS to consistently win (necessarily, avoid losing) arguments drastically increases the chance you will not resort to logical analysis and response at all.
Final Conclusion: If you BS to others, you will BS to yourself.

On the idea that it is useful to know when another is using one of these devices for blowing smoke, you are obviously correct, but it can be very tempting to misuse such knowledge simply to browbeat your opponent when they haven't actually done it. In a similar vein (though not directly on topic), sometimes a fallacy isn't really a fallacy in the precise context it appears in (i.e., sometimes the appeal to authority is legitimate in an argument, especially to settle a minor point).

I must say one thing on the idea behind all this. While the ends occasionally justify the means, the idea that rational ends are best served via irrational means is extraordinarily likely to be incorrect. More likely, an inability to properly argue your point should have you questioning your point instead.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-09T11:51:55.206Z · LW(p) · GW(p)

an inability to properly argue your point should have you questioning your point instead.

When dealing with trolls, whether on the Internet or in Real Life, no matter how absolutely damn sure you are of your point, you have no time to unravel their bullshit for what it is, and if you try it you will only bore your audience and exhaust their patience. Debates aren't battles of truth: there's publishing papers and articles for that. Debates are battles of status. If you manage to come off as the one with higher status, people will listen more to what you said during the debate, and, more importantly, to what you said afterwards.

A very interesting way of taking advantage of this and neutralizing the effects of the dirty fighting would be to immediately afterwards publish a play-by-play analysis of the discussion, using the opportunity as an occasion to teach those who were impressed by you and went to see your work how debate really works. You could even go so far as actually listing the arguments you and your opponents use, and openly admit it if your opponent's arguments are good enough that they have caused you to actually undertake a Bayesian update. That way, you show that:

*You're smart, witty, and charismatic enough to win the debate.

*You're rational, honest, and moral enough to admit to the truth afterwards.

Replies from: TimFreeman
comment by TimFreeman · 2011-07-10T18:13:00.609Z · LW(p) · GW(p)

When dealing with trolls, whether on the Internet or in Real Life, no matter how absolutely damn sure you are of your point, you have no time to unravel their bullshit for what it is, and if you try it you will only bore your audience and exhaust their patience. Debates aren't battles of truth: there's publishing papers and articles for that. Debates are battles of status.

I agree. There's also the scenario where you're talking to a reasonable person for the purpose of figuring out the truth better than either of you could do alone. That's useful, and it's important to be able to distinguish that from debating with trolls for the purpose of gaining status. Trolls can be recognized by how often they use rhetoric that obviously isn't truth-seeking, and Schopenhauer is very good for that.

Well, actually, on the Internet you never gain status by debating with trolls. Even if I win an argument, I lose status to the extent my behavior justifies the conclusion "Tim wastes time posting to (LessWrong|SlashDot|whatever) instead of doing anything useful."

My ability to identify and stonewall trolls varies. Sometimes I catch them saying something silly and refuse to continue unless they correct themselves, and that stops the time-waste pretty quickly. Sometimes I do three-strikes-and-you're-out, and the time-waste stops reasonably soon. Sometimes it takes me a long time to figure out if they're a troll, especially if they're hinting that they know something worthwhile. I wish I had a more stable rule of thumb for doing this right. Any suggestions?

Replies from: Raw_Power
comment by Raw_Power · 2011-07-10T19:20:52.931Z · LW(p) · GW(p)

That's okay for Internet trolls, but sometimes you'll have to confront people in Real Life. These people won't be aiming to make a point, they'll be aiming to discredit you, by whatever means necessary.

When I wrote this article, one of the scenarios I had in mind was "What if I were forced to confront Bill O'Reilly (or some similarly hostile, dirty opponent) on the topic of Less Wrong, and how do I not only not lose points but actually come out making us look even cooler than before?" Bonus points if he loses status, not among those who already despise him, but among his own fans. Ideally destroying his career, but that's a pretty big dream.

comment by Raw_Power · 2011-07-09T11:41:34.103Z · LW(p) · GW(p)

my informal observation that the people who bullshit me tend to be a bit confused even when manipulating me isn't their immediate goal.

True story. I know a girl who has completely lost the ability to distinguish between her lies and reality. For example, if for some reason she says she doesn't like an item of food that she is known to like, just to piss off her parents, she will henceforth always act as if she hates it. If you slip it into the food and she asks what's making the food so delicious, and you tell her what's in it, she will immediately stop liking it even though she was relishing it a minute ago.

That's just one of the examples I can summon. She believes in her bullshit very strongly on a conscious level, but subconsciously, what is true remains so, and this leads to some very amusing behavior (amusing because she insists she is fine the way she is and is generally a very obnoxious person).

comment by khafra · 2011-07-05T14:30:48.535Z · LW(p) · GW(p)

I feel the text you wrote would have worked better with, say, 4 paragraphs arguing in favor of learning to BS an argument, and 5 paragraphs reviewing or summarizing the work; instead of all 9 paragraphs defending the necessity of learning these skills.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-05T16:01:24.645Z · LW(p) · GW(p)

You're absolutely right. Upvote. However, note that the first three paragraphs were actually a summary of the parts of the work that were "about" the work itself. I thought the links I provided at first (especially the one with the cartoons) would serve fine, but I notice this is just plain lazy. I'll remedy this soon.

Edit: remedied, and it was a great opportunity to order my thoughts here, especially since many of the rules in the book were fairly redundant and could easily be lumped together.

comment by JoshuaZ · 2011-07-05T18:37:22.228Z · LW(p) · GW(p)

Some of these which are labeled as bad aren't necessarily so. For example, getting someone to accept a bunch of premises and then pointing out the conclusion is one way you can actually get people to change their opinions. I know I've had my opinion changed that way. Similarly, the use of the Black Swan is questionable: one person might see something as a Black Swan where another will see it as a basic counterexample that needs to be dealt with. The dividing line is not at all clear.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-05T21:10:16.195Z · LW(p) · GW(p)

I did say the "checkmate" methods were the most legitimate. This is "How To Be Always Right", not "How To Always Be A Demagogical Sophist": the parts that involve manipulating the opponent without actually resorting to lies and fallacies are the most satisfying, since those are the parts where it's most clearly "intellect vs intellect", and the parts where you actually know yourself to be right. A clean fight is a good fight. As for Black Swans, they are obviously subjective in that they depend on how well-informed the receiving party is. Usually we talk about Black Swans when it's the entire scientific community that's taken aback by a paradigm breaker.

comment by endoself · 2011-07-06T01:45:19.447Z · LW(p) · GW(p)

I personally don't intend to use these techniques because I already have a hard enough time getting called on my mistakes. On multiple occasions I have convinced people of unintuitive but correct contrarian ideas while forgetting to mention a crucial and nonobvious premise. If anything, I need to lose arguments more often.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-06T01:50:46.247Z · LW(p) · GW(p)

In an ideal world, I'd agree with you, but sometimes, especially in live conversation, let alone public debate, you just don't have the time to go through all the premises and the syllogisms. Plus, it's unfair to expect everyone to be able to properly counter your arguing and point out its flaws. That's something you should reserve to your Worthy Opponent of choice. Wasn't there a figure like that in (Hassidic?) Judaism? A "comrade of studies" or something like that?

Replies from: endoself, JoshuaZ
comment by endoself · 2011-07-06T03:14:44.327Z · LW(p) · GW(p)

I don't think my advice necessarily applies to everyone, I just thought that this is an important con to consider. For me, the cons currently outweigh the pros.

That's something you should reserve to your Worthy Opponent of choice. Wasn't there a figure like that in (Hassidic?) Judaism? A "comrade of studies" or something like that?

I want one! This could be a useful rationalist institution too.

comment by JoshuaZ · 2011-07-06T02:04:42.345Z · LW(p) · GW(p)

That's something you should reserve to your Worthy Opponent of choice. Wasn't there a figure like that in (Hassidic?) Judaism? A "comrade of studies" or something like that?

In many forms of Judaism one often studies with a chavruta, with whom one will debate and engage the same texts. Such individuals are generally chosen to be of about the same background level and intelligence, often for precisely the sort of reason you touch upon (as well as because it helps encourage them to each try their hardest). In modern times, as levels of interest have become much more divorced from levels of actual knowledge (due to the baal tshuvah movement as well as some other modern social effects), this last aspect has broken down somewhat.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-06T10:07:02.368Z · LW(p) · GW(p)

Care to elaborate? A cursory reading of the article doesn't reveal any mentions of the topic's effect on the chavruta institution. I'm not sure if you mean that highly intellectual Jews are more enthusiastic about "returning to their roots" or the inverse, that Jews with very little academic level have invaded the Synagogues in a religious version of Eternal September.

Replies from: JoshuaZ
comment by JoshuaZ · 2011-07-06T13:38:05.649Z · LW(p) · GW(p)

Something closer to the Eternal September, but a little more complicated than that. (Disclaimer: I don't have any sources for what I'm about to say. I'm more generalizing based on my own experience when I was Orthodox and the general impression of the community.)

Among those who have become Orthodox one finds a large number of very different people. Some of them are very intelligent but have little to no background knowledge. Others are not so bright and have no background. Others are not so bright and have a little background, etc. Moreover, the general lack of background means that most of them can't form chavrutas on their own, since they didn't grow up with the large amount of basic experience about how the system works, what sorts of approaches work and which don't. Much of that knowledge is procedural and not stated explicitly. So, as a result, a lot of these people are pairing with people who have much more background knowledge than they do but might not be as bright. There are other complicating factors; for example, some Orthodox Jews form chavrutas with less religious, less educated Jews, deliberately trying to rope them in further.

The whole situation is really quite complicated, and there's an unfortunate lack of serious anthropological or sociological work on what is happening at a broad level, so I don't have anything to rely on other than my own impressions.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-06T13:50:51.699Z · LW(p) · GW(p)

Would you care to repost this on the chavruta thread? I think this system could pique our interest, and if we're going to emulate it we might as well learn more about what works and what doesn't, what fits us and what doesn't, and how we can improve on it in our own special way and make it ours.

comment by pr00thmatic · 2019-10-05T17:31:51.221Z · LW(p) · GW(p)

logicen.fr no longer has the image... it says 404 D:

comment by Friendly-HI · 2011-07-05T15:51:21.153Z · LW(p) · GW(p)

Indeed, the ends sometimes justify the means - but I'm not sure how bashing irrationalists with cheap (or expensive) rhetoric is helping our cause. Where lies the value in convincing people who afterwards won't be useful for "our cause" anyway (we're talking fAGI I assume)?

The "intellectual high-rollers" should be our major target group, and while they surely aren't completely immune to rhetoric themselves, remember that bad arguments will reflect badly on us in the eyes of those people we should care the most about. I'm not sure which role the public perception of this issue will play in the future, but I'd just stick to simple but true arguments. You don't have to present your whole convoluted line of reasoning to make a simple, quick point like "dying sucks". Personally I'd just avoid black holes of retardation like Fox News - we'll probably be much better off by being invisible to Fox News' target-group. I don't see any value in engaging such people directly, but only a sizable potential for downfall. Keep a low profile towards the enemies of reason and don't engage unless necessary - that would be my preferred tactic. Pandering to the religious and the irrational will accomplish nothing, or at the very least I'm convinced the costs will outweigh the marginal benefits.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-05T16:29:24.657Z · LW(p) · GW(p)

Where lies the value in convincing people who afterwards won't be useful for "our cause" anyway (we're talking fAGI I assume)?

It's called "getting the votes and then passing the bills". We get people to vote for us for what they thought they heard us say. We'll pass the bills that are about what we actually said. Doing away with metaphors: we get them to actually listen to what we have to say by giving a great first impression; then, once we have them captivated, we start showing them the fine print, most notably the bits about how much they suck and need to change, and how, if they listen to us, everything in their lives will be better and they'll be spiritually and emotionally more fulfilled and awesome. Parties do this. Religions do this. Universities do this. Parents do this. Lovers do this. Why should we be any different? Once we get to the point where we can persuade them that talking about rationality in clown suits is a perfectly reasonable idea, the rest is pretty much done.

Replies from: nerzhin, Friendly-HI
comment by nerzhin · 2011-07-05T20:28:36.485Z · LW(p) · GW(p)

Parties do this. Religions do this. Universities do this. Parents do this. Lovers do this. Why should we be any different?

Um. Because we want to be different from political parties and religions?

Replies from: Raw_Power
comment by Raw_Power · 2011-07-05T21:18:07.571Z · LW(p) · GW(p)

How about the other two? And we don't just want to be different, we want to change them. Rationalists should win, even if our winning conditions might not be what our adversaries expect.

comment by Friendly-HI · 2011-07-05T18:26:18.796Z · LW(p) · GW(p)

I suspect we have a very different conception of how the future is going to pan out in terms of what role the public perception and acceptance of AGI will play.

I understand your point: lure 'em in with happytalk, then bash 'em with a rationality course. ("Excuse me Miss, how would you like a free rationality test?") However, I simply don't think that we can positively prepare a sizable proportion of the public (let alone the GLOBAL public) for the arrival of AGI simply by teaching rationality skills. I believe our idea of the future will just continue to compete with countless other worldviews in the public memesphere, without ever becoming truly mainstream until it is "too late" and we face something akin to a hard takeoff.

I don't really think that we can (or need to) reach a consensus within the public for the successful takeoff of fAGI. Quite to the contrary, I actually worry that carrying our view to the mainstream will have adverse effects, especially once they realize that we aren't some kind of technophile crackpot religion, but that the futuristic picture we try to paint is actually possible and not at all unlikely to happen. I prefer to face apathy over antagonism when push comes to shove - and since AGI could spring into existence very rapidly and take everyone apart from "those in the know" by surprise, I would hate to lose that element of surprise over our potentially numerous "enemies".

Now of course I don't know which path will yield the best result: confronting the public hard and fast, or keeping a low profile? I suspect this may become one of the few hot-button topics where our community will have widely diverging opinions, because we simply lack a way to accurately model how people will behave upon encountering the potential threat of AGI (especially so far in advance). Just remember that the world doesn't consist entirely of the US and that fAGI will impact everyone. I think it is likely that we may face serious violence once our vision of the future becomes more known and gains additional credibility from exponential improvements in advanced technologies. There are players on this planet who will not be happy to see an AGI come out of America, or for that matter out of Eliezer's or whoever's garage. (Which is why I'd strongly advocate a semi-covert international effort when it comes to the development of friendly AGI.)

It is incredibly hard to predict the future behavior of people, but on a gut-level I absolutely favor an international semi-stealthy approach. It seems to be by far the safest course to take. Once the concept of the singularity and fAGI gains traction in the spheres of science and maybe even politics (perhaps in a decade or two), I would hope that minds in AI and AGI from all over the world join an international initiative to develop this sucker together. (Think CERN). To be honest, I can't think of any other approach to develop the later stages of AGI that doesn't look doomed from the start (not doomed in terms of being technically unfeasible, but doomed in terms of significant others thinking: "we're not letting this suspicious organization/country take over the world with their dubious AI". Remember that AGI is potentially much more destructive than any nuclear warhead and powers not involved in its development may blow a gasket upon realizing the potential danger.)

So from my point of view the public perception and acceptance of AGI is a comparatively negligible factor in the overall bigger picture. "People" don't get a say in weapons development, and I predict they won't get a say when it comes to AGI. (And we should be glad they don't if you ask me.)

PS: When you're just talking about teaching rationality to people however, the way to go is to lobby it into the school curriculum as a full-blown subject. Every other plan to educate the public on "all things rational" completely pales in terms of effectiveness. Teaming up with the skeptics and the "new" atheists may be very helpful for this purpose, but of course we should never let ourselves be associated with such "ungodly" worldviews while advertising our rationalistic concepts.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-05T21:16:58.882Z · LW(p) · GW(p)

Very interesting post overall. Could you refer me to an article about this particular problem? I feel humans should be allowed to choose their collective destiny together, but I don't know whether it's such a bad idea to hide it from them if revealing it would result in this. Are we on the way to becoming the new Manhattan Project?

And yes, getting it into the curriculum is great, but first we need to train teachers, and the teachers' teachers, etc., and develop a pedagogy that works with kids, who are infamous for not being able to make the distinctions we make or assimilate the concepts we assimilate at certain ages, so it'd have to be really fine-tuned to be optimal.

Replies from: Friendly-HI
comment by Friendly-HI · 2011-07-05T21:48:28.965Z · LW(p) · GW(p)

I've rewritten my comment and posted it as a standalone article. I've somewhat improved it, so you may want to read the other one.

I am not aware of any articles concerning the problem of how we should approach self-improving AGI, I was just hacking my personal thoughts on this matter into the keyboard. If you are talking about the potentially disastrous effects of public rage over the matter of AGI, then Hugo de Garis comes to mind - but personally I find his downright apocalyptic scenarios of societal upheaval and wars over AI a bit ridiculous and hyper-pessimistic, given that as far as I know he lacks any really substantial arguments to support such a disastrous scenario.

EDIT: I have revised my opinion of Hugo's views after watching all parts of his youtube interview on the following YTchannel: http://www.youtube.com/user/TheRationalFuture. He does make a lot of valid points and I would advise everyone to take a look in order to broaden one's perspective.

comment by fortyeridania · 2011-07-05T15:24:38.946Z · LW(p) · GW(p)

The author seems to be saying that shady means can be used to achieve noble ends. I agree. However, consider these three possibilities: (1) Being too honest is (on average) worse than being too dishonest. (2) Being too dishonest is worse than being too honest. (3) Each error is equally harmful. (We could say that the first two involve asymmetric loss functions.)

I think the author wants us to consider the first possibility. If honesty hurts more than dishonesty, then let's aim for more dishonesty.

But even if this possibility is true in the near-term, there is a clear benefit to committing to honesty. People trust those who have a record of honesty. Thus there is long-term eristic value to be gained by sacrificing short-term eristic success.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-05T16:20:27.726Z · LW(p) · GW(p)

People trust those who have a record of honesty.

Counterexample: this. Voting. And any and every Politics Is The Mindkiller phenomena. Lord Byron and all the other vamps of both sexes who are reputedly "mad, bad, and dangerous to know" (they ain't just fictional, any PUA, nay, any historian will tell you that). Among many other things.

Actual honesty is worth little to the general public, especially when their hearts are captured and their minds seduced by the possible materialization of fantasies that might not even be their own, planted by the seducer. Our program involves changing this. That is the problem with using the Dark Arts to achieve a world of mental hygiene and clear thinking: in the long term, the means detract from the goal.

On the other hand, the closer we are to the goal, the less necessary the means will become, and the more they can be potentially hurtful to the goal if people, having gained enlightenment thanks to our effort, look back on what we did and think "they manipulated us" rather than "they told us what we needed to hear", and develop a Romanticism sort of backlash against our Enlightenment work.

Replies from: fortyeridania
comment by fortyeridania · 2011-07-08T15:03:04.726Z · LW(p) · GW(p)

Your counterexamples are valid; they show that dishonesty doesn't always breed distrust among everyone. Specifically, they fail to breed distrust among those who (on some level) want beliefs that oppose reality. I suppose we all fit into this group at times.

The possibility of a Romanticism-like backlash against rationalism is one disadvantage to using deceptive rhetoric, but that assumes the happy situation that rationalism will one day become widespread. I fear that deceptive rhetoric would help prevent that happy situation from obtaining. The use and endorsement of Dark Arts could pose a PR problem for LW even before the "enlightenment" got around.

LW might not be a cult, but deceptive rhetoric is a stereotype of cults. Why make it easier for others to peg LW as one?

Replies from: Raw_Power
comment by Raw_Power · 2011-07-08T16:28:51.461Z · LW(p) · GW(p)

People already dismiss us as a cult out of hand. On the other hand, Fox News and the GOP seem to have a policy of using the Dark Arts, and while their opponents call them out on it, it doesn't deter their supporters in the least (something that amazes me to no end, I really just can't comprehend it). Ayn Rand and her Objectivism are among the darkest mentalities there is (it's basically Slytherin Frat, USA), and some have gone as far as calling them Randroids, yet nobody seems to care and the movement is as healthy as ever!

Additionally, there's an order in the darkness of the techniques Schopenhauer describes. The checkmates, for example, are entirely legitimate; what makes them dark is that they aren't straightforward, and that they assume bad faith in the opponent. Declaring victory when you're actually being defeated ("“I’ll tell you why [religion is] not a scam, in my opinion. Tide goes in, tide goes out. Never a miscommunication. You can’t explain that. You can’t explain why the tide goes in.”") or using an ad personam ("Your argument is invalid because you're an immoral person") are much darker.

Ad hominem, on the other hand ("You say you're a God-fearing good Christian, so how can you support Rand's ideas about how weak people should be treated?") is kind of on the fence: surely your opponent's supporting opposing views simultaneously argues against their own sanity and their credibility, but it doesn't by itself make the specific position they're arguing any less valid. On the other hand it's also a subtle case of appeal to consequences, in that you are forcing them against the wall as regards their reputation, which may motivate them to abandon the debate. That still doesn't prove you're right, though.

Replies from: jsalvatier, arundelo
comment by jsalvatier · 2011-07-08T18:39:06.517Z · LW(p) · GW(p)

Ayn Rand and her Objectivism are among the darkest mentalities there is (it's basically Slytherin Frat, USA)

This is flatly wrong. Their language might be that of pure selfishness, but their actions are clearly ideological and not about maximizing personal wealth/power/sex. You'll notice that Objectivists talk all the time about how Objectivists shouldn't take handouts for one reason or another.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-08T18:44:36.556Z · LW(p) · GW(p)

I refer you to this. You wanna defend the negative? Fine, start a discussion or share some links. [But one-liners like "this is flatly wrong" do not advance the debate one iota.] But I don't see how the ideology itself isn't entirely built on strawmen, bias, privilege blindness and just plain bad logic. You'll also notice that Objectivists, like people professing other ideologies and religions, are very much known for not living up to their own brutally inhumane, lofty ideals, and simply summon them at their convenience, to justify acts of simple greed and selfishness. At least Rationalism is an ideal that by design acknowledges the difficulties in living up to it, and attempts to answer them with methods more subtle than "you're just weak". Again, if we want to expand on this, I suggest we open a discussion thread.

Replies from: jsalvatier
comment by jsalvatier · 2011-07-08T19:04:55.949Z · LW(p) · GW(p)

Perhaps I came across as too adversarial. I totally agree that Objectivism is wrong and useless and if implemented widely would be very bad. The point I was trying to convey is that in practice, neither Ayn Rand nor Objectivists have acted especially Slytherin.

Unfortunately, your link is broken.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-08T19:11:47.301Z · LW(p) · GW(p)

Oh. I apologize and admit my mistake: unlike the better sort of Slytherin, their mentality isn't exactly Machiavellian, though it is take-no-prisoners in its own, different way. I'd still like to hear a more detailed study of the difference between the two, for the sake of moral clarity (heh), since in my dislike for them I have unwisely lumped them together, a mistake that needs to be corrected.

As a desperate save for the sake of fanfictionish silliness, I'd add that the same way conservatism has turned out very different in the USA and the UK, it's fairly plausible that the USA offshoots of the House Of Slytherin may have developed differently from their continental peers. You wouldn't know of any fanfic that properly explored Potterverse Magical America, would you?

...Now I'm really far off-topic.

Replies from: jsalvatier, jsalvatier
comment by jsalvatier · 2011-07-08T22:06:05.804Z · LW(p) · GW(p)

Heh, I somehow only read your first sentence when I responded. I was basing my comment off of my own experience with Rand as a teenager and friends who were influenced by Objectivism. I don't know of an analysis of the two, and I don't read much fiction.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-09T00:16:59.906Z · LW(p) · GW(p)

Please tell me you've at least read Methods Of Rationality and Shinji and Warhammer 40K.

Replies from: jsalvatier, TimFreeman
comment by jsalvatier · 2011-07-09T00:18:41.310Z · LW(p) · GW(p)

I've read MoR and started Shinji.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-09T14:31:54.620Z · LW(p) · GW(p)

I hope you enjoy the ride.

comment by TimFreeman · 2011-07-09T00:35:26.763Z · LW(p) · GW(p)

Please tell me you've at least read Methods Of Rationality and Shinji and Warhammer40k.

I read the presently existing part of MoR. I could read Shinji 40K. Why do you think it's worthwhile? Should I read or watch Neon Genesis Evangelion first?

Replies from: Raw_Power
comment by Raw_Power · 2011-07-09T11:12:26.370Z · LW(p) · GW(p)

Hm, reading the original EVA is not compulsory, the story stands very well on its own... but since you ask, I heartily recommend you watch EVA and Gurren Lagann. They are both flawed, but they are still very good, and very memorable.

Replies from: TimFreeman
comment by TimFreeman · 2011-07-13T16:53:45.084Z · LW(p) · GW(p)

The story isn't working for me. A boy or novice soldier, depending on how you define it, is inexplicably given the job of running a huge and difficult-to-use robot to fight with a sequence of powerful similarly huge aliens while trying not to do too much collateral damage to Tokyo in the process. In the original, I gather he was an unhappy boy. In this story, he's a relatively well-adjusted boy who hallucinates conversations with his Warhammer figurines. I don't see why I should care about this scenario or any similar scenarios, but maybe I'm missing something.

Can someone who read this or watched the original say something interesting that happens in it? Wikipedia mentions profound philosophical questions about the nature of reality, but it also mentions that the ending is widely regarded as incomprehensible. The quote about how every possible statement sounds profound if you get the rhetoric right seems to apply here. I don't want to invest multiple hours to end up reading (or watching) some pseudo-profound nonsense.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-13T20:43:28.175Z · LW(p) · GW(p)

It's not pseudo-profound, but, like The Matrix, it has a lot of window-dressing and pompous wanking around absolutely legitimate questions. It's also frustrating in that many of the questions are asked, but few are resolved. And they're mainly a framework for the character arcs to develop. Eva has a very simple plot, which it doses very carefully in order to make it more interesting, so that it comes off as a jigsaw puzzle. The most interesting thing is how the characters evolve and... really, I don't want to spoil anything, but you should definitely give it a try: it's a character story where the characters are extremely human, layered, and rich, and their stories are extremely poignant.

If you don't want to watch the original, all I can tell you is, the "inexplicable" turns out to be "not explained yet". Everything will be revealed in due time. As for why it is interesting... well, if you watch EVA, and especially the final movie, The End Of Evangelion, you might identify a lot with Shinji, put a lot of yourself into him. This is especially true if you watch it as a teenager of the same age. And then... well, stuff happens to him, and to you, by proxy. Seeing him well-adjusted, happy, strong, while still having the same fundamental character traits... it's a very intense feeling.

Replies from: TimFreeman
comment by TimFreeman · 2011-08-04T04:59:29.162Z · LW(p) · GW(p)

Okay, I watched End of Evangelion and a variety of the materials leading up to it. I want my time back. I don't recommend it.

Replies from: Raw_Power
comment by Raw_Power · 2011-08-04T12:48:42.425Z · LW(p) · GW(p)

You watched EoE without watching the series first? Instead you watched "Death And Rebirth"?

That's probably the wrongest possible way to do it. It's like watching 2001: A Space Odyssey starting from when Dave gets on the pod and into the Jupiter monolith. Like, there's no point to EoE if you aren't already very familiar with the characters AND very very invested in them and the plot.

comment by jsalvatier · 2011-07-08T19:30:22.156Z · LW(p) · GW(p)


comment by arundelo · 2011-07-08T18:00:58.308Z · LW(p) · GW(p)

Ayn Rand and her Objectivism are among the darkest mentalities there is (it's basically Slytherin Frat, USA)

What do you mean by this?

Replies from: Raw_Power
comment by Raw_Power · 2011-07-08T18:37:52.974Z · LW(p) · GW(p)

What do you mean by this?

If you want to open a dedicated discussion thread, I'm your man. But at least read this first. This might also be interesting reading. There were also several discussions around that topic on TV Tropes (I'm not suggesting you read them all, but TV Tropes discussions are usually interesting, so you might actually enjoy them, and TV Tropes Useful Notes pages are usually quite enlightened, balanced and well-written: see the one for atheism).

If you're going to start that discussion, then we might as well make another for someone to explain libertarianism to me, because it seems fairly popular on this site, and the superficial understanding of it I've had through the media has given me the first impression that it's the dumbest, most dangerously stupid idea on how to run a country since, and even more impractical than, pure communism. The same way the first impression one could take of LW is "The Cult of the Leizer" (and damn I'm tired of being called a cultist, but no matter how much I disagree, I see where they're coming from).

As a rationalist, when I have a prejudice, the first thing I need to do is kill it: I must ask for a little guidance. If you decide to help me, first make me go through the beginner stuff. Same as here, my guess is that some of the more advanced writings of libertarians can be unintentionally infuriating to outsiders, the same way some of our posts here can be. If you come out of the blue and tell someone "if your children aren't signed up for cryogenics, you're a lousy parent", how do you expect them to react?

Replies from: arundelo
comment by arundelo · 2011-07-08T23:56:45.204Z · LW(p) · GW(p)

read this first

I read it when Eliezer wrote it and remember more-or-less agreeing with it. I am not an Objectivist, though I think I'm more sympathetic to Rand than the typical LWer.

This might also be interesting reading.

I just scanned it; it looks like a good summary. Note in particular the bit about Comte and altruism. (You may not think this, but some people incorrectly think that Rand advocated a "might makes right" or "I'm better than you therefore I get to do whatever I want" ethic.)

"The Cult of the Leizer"

Is this a pun on Eliezer's name?

Replies from: Raw_Power, Raw_Power
comment by Raw_Power · 2011-07-09T15:04:37.158Z · LW(p) · GW(p)

Oh, and about Rand... the problem with her philosophy is that she made a mess of it and of presenting it, up to and including extensive use of the Dark Arts in her rhetoric. And many who present themselves as Randians are as bad at it as many alleged Christians are at Christianity. That there are many who attempt to claim for themselves the prestige of Randianism and of being Christian is a clear sign of lack of intellectual and moral health (Randianisms are great for bashing people!), and there's probably a correlation between this and the oxymoron that is the Republican party's existence as an unholy alliance between conservatives and classical liberals. Finally, some of the stuff she presents, such as coercion being immoral and making one "subhuman", is just plain fallacious. Her notion of "choice" is very suspect, and sounds to me like Free Will, which has been debunked by quantum physics of all things, if I remember right.

Now, look, I can respect someone reading her work and extracting the stuff that makes sense and presenting that as Objectivism, then stating that dismissing the package because of how it was handled by its creator is fallacious via ad hominem and ad personam. But Rand defined Objectivism as basically "whatever I say it is, and, whatever I say, it is". If you want to salvage bits of it that have some merit, that's fine, but you're not an objectivist anymore. Many of those bits are areas of overlap with other ideologies that developed them more soundly and consistently.

I still think you could call the more ethically thorough users of moral Objectivism "Slytherins" in the sense of the purer form of the ideology: the pursuit of Enlightened Self Interest and the rejection of Rules and Sacrificial Altruism. The difference being that Pure Slytherism doesn't have anything against initiation of coercion, accepting it as part of the mechanics of real human interaction, nor does it assume that in a Free Market system of consenting and informed adults all exchanges will necessarily be by mutual agreement and benefit: it instead supports the administration of information as carefully and sparsely as any other element that may give the user an edge. As a whole, I'd argue that Slytherism is the more consistent and practical of the two, and therefore the more morally valuable.

comment by Raw_Power · 2011-07-09T00:13:17.782Z · LW(p) · GW(p)

To the last question: definitely! Is it at all funny? After I wrote it my mind wandered without my permission and coughed up something even worse: "Imma chargin Eleizer" and "Allstar Leizer and Robin the Man Handsom" but I told it those just sounded dumb and the latter would only amuse Frank Miller haters, while the former wouldn't amuse anyone... I have a habit of doing that: the nickname I gave JoshuaZ was (based on JoshuaZ's Judaism, Biblical!Joshua's "badass" status... and the Z at the end...) "The Super Scion of Zion".

He liked it, or so he told me.

Replies from: arundelo
comment by arundelo · 2011-07-09T00:19:29.998Z · LW(p) · GW(p)

Check the spelling, though, and note that he pronounces it "el-ee-EHZ-er".

Replies from: Raw_Power
comment by Raw_Power · 2011-07-09T11:08:07.855Z · LW(p) · GW(p)

Yes, I know, but since jokes, no matter how affectionate, are about irrationality in the first place, I think those sacrifices are necessary. Plus, the only puns I can think up on "Eliezer" pronounced right are in French and Spanish.

comment by Raw_Power · 2011-07-08T16:34:14.823Z · LW(p) · GW(p)

Okay, I'm going to be blunt here. While I'm in a Devil's Advocate position, I'll have you notice that Devil's Advocacy is a social process, and an important, non-trivial one at that, as Yudkowsky pointed out at the end of his article on it.

Now, please, from a consequentialist POV, prove to me why an intelligent, precise, and carefully thought-out use of the Dark Arts wouldn't work for us, and how it would be counterproductive to our cause.

Replies from: Peterdjones
comment by Peterdjones · 2011-07-08T16:38:06.507Z · LW(p) · GW(p)

Rationality is something you do, not something you are.

If you use DA instead of rationality, you are not setting any kind of teaching example to the not particularly rational section of mankind; and the already-fairly-rational segment are going to detect what you are doing and be put off by it.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-08T18:09:32.347Z · LW(p) · GW(p)

You mean the same way such works as Michael Moore's films or The Story of Stuff are off-putting to those who can detect all the bad faith and the manipulative style they employ (even though I don't know of any instances of actual lying)? Those are still a minority in an advanced stage of knowledge and savviness, and even then they can appreciate the message and agree with it; they just kind of look down on the creators of such works as a little crass and unsubtle. It's because of attitudes like this that progressive movements have such trouble advancing in places like the USA: there's a fairly large demographic of people who are almost instinctively repulsed by intellectual hipsters.

Additionally, I am fairly sure there are public debaters who are appreciated for their talent on the debate floor without having to resort to any lies, half-truths or distortions of the truth, but who nevertheless ruthlessly beat their opponents to a pulp, usually because they do have truth on their side, but also often because they're just that good at debate. Others have this reputation but fail to live up to it, though you'll only notice if you are an extremely attentive observer: the subtle tricks they use completely fly over the heads of 99% of the audience.

Furthermore, there's no reason the halfway-rational/rational-in-progress would be put off by Dark Arts if they are used right, especially if the contrast between the rationalist debater and the other is very stark in terms of truthfulness and in terms of shade of grey. Confusing "dark" and "pitch black" is as bad as saying everything is the same tone of grey. This contrast, if stark enough, still allows us to teach a much better example to the irrational than what they are used to. At the very least, it doesn't take a freaking saint to expose a crook or an idiot for what they are.

comment by rwallace · 2011-07-05T21:02:28.230Z · LW(p) · GW(p)

This is a hard one to judge.

Halfway through, I was about to reach for the upvote button: a Defense Against the Dark Arts post with an excellent summary of dishonest rhetorical tricks everyone needs to know how to guard against.

Then I was about to reach for the downvote button once it started advocating embracing the Dark Arts and employing any dishonest rhetorical tricks that look in the short term like they might further the author's favorite cause.

But I find myself bookmarking the illustrated version of the list for use next time I want to refer someone to a clear introduction to this stuff. And it doesn't feel quite right to downvote a post that provided me with something worth bookmarking. So I'll abstain from voting on this one.

Replies from: Raw_Power
comment by Raw_Power · 2011-07-05T21:38:58.748Z · LW(p) · GW(p)

Every group needs a token "evil" teammate, if only so that Dark methods are at least given consideration rather than rejected out of hand. I think it's a role we should all endorse from time to time, our little inner Slytherin.