The Goal of the Bayesian Conspiracy

post by Arandur · 2011-08-16T18:40:55.939Z · LW · GW · Legacy · 80 comments

Suppose that there were to exist such an entity as the Bayesian Conspiracy.

I speak not of the social group of that name, the banner under which rationalists meet at various conventions – though I do not intend to disparage that group! Indeed, it is my fervent hope that they may in due time grow into the entity which I am setting out to describe. No, I speak of something more like the “shadowy group of scientists” which Yudkowsky describes, tongue (one might assume) firmly in cheek. I speak of the kind of organization described in Yudkowsky's various fictional works, the secret and sacred cabal of mathematicians and empiricists who seek unwaveringly for truth... but set in the modern-day world, perhaps merely the seed of such a school: an organization which can survive and thrive in the midst of, yet isolated from, our worldwide sociopolitical mess. I ask you, if such an organization existed, right now, what would – indeed, what should – be its primary mid-term (say, 50-100 yrs.) goal?

I submit that the primary mid-term goal of the Bayesian Conspiracy, at this stage of its existence, is and/or ought to be nothing less than world domination.

Before the rotten fruit begins to fly, let me make a brief clarification.

The term “world domination” is, unfortunately, rather socially charged, bringing to mind an image of the archetypal mad scientist with marching robot armies. That's not what I'm talking about. My usage of the phrase is intended to evoke something slightly less dramatic, and far less sinister. “World domination”, to me, actually describes a rather loosely defined set of possible world-states. One example would be what I term “One World Government”, wherein the Conspiracy (either openly or in secret) is in charge of all nations via an explicit central meta-government. Another would be a simple infiltration of the world's extant political systems, followed by policy-making and cooperation which would ensure the general welfare of the world's entire population – control de facto, but without changing too much outwardly. The common thread is simply that the Conspiracy becomes the only major influence in world politics.

(Forgive my less-than-rigorous definition, but a thorough examination of the exact definition of the word “influence” is far, far outside the scope of this article.)

So there is my claim. Let me tell you why I believe this is the morally correct course of action.

Let us examine, for a moment, the numerous major good works which are currently being openly done by rationalists, or by those who may not self-identify as rationalists but whose dogmas and goals accord with ours. We have the Singularity Institute, which is concerned with ensuring that our technological, transhumanistic advent happens smoothly and with a minimum of carnage. We have various institutions worldwide advocating and practicing cryonics, which offers a non-zero probability of recovery from death. We also have various institutions working on life extension technologies and procedures, which offer to one day remove the threat of death entirely from our world.

All good things, I say. I also say: too slow!

Imagine what more could be accomplished if the United States, for example, granted to the Life Extension Foundation or to Alcor the amount of money and social prominence currently reserved for military purposes. Imagine what would happen if every scientist around the world were able to contribute, under a unified institution, to this vitally important problem of overcoming death, with all the money and time the world's governments could offer at their disposal.

Imagine, also, how many lives are lost every day due to governmental negligence, and war, and poverty, and hunger. What does it profit the world, if we offer to freeze the heads of those who can afford it, while all around us there are people who can't even afford their bread and water?

I have a proposition which may, to some who are particularly invested, seem appalling and frightening: for the moment, we should devote fewer of our resources to cryonics and life extension, and focus on saving the lives of those to whom these technologies are currently beyond even a fevered dream. This means holding the reins of the world, that we might fix the problems inherent in our society. Only when significant steps have been taken in the direction of saving life can we turn our focus toward extending life.

What should the Bayesian Conspiracy do, once it comes to power? It should stop war. It should depose murderous despots, and feed the hungry and wretched who suffered under them. Again: before we work on extending the lives of the healthy and affluent beyond what we've so far achieved, we should, for example, bring the average life expectancy in Africa above the 50-year mark, where it currently sits (according to a 2006 study in the BMJ). This is what will bring about the maximum level of happiness in the world – not cryonics for those who can afford it.

Does this mean that we should stop researching these anti-death technologies? No! Of course not! Consider: even if cryonics drops to, say, priority 3 or 4 under this system, once the Conspiracy comes to power, that will still be far more support than it's currently receiving from world governments. The work will end up progressing at a far faster rate than it currently does.

Some of you may have qualms about this plan of action. You may ask, what about individual choice? What about the people's right to choose who leads them? Well, for those of us who live in the United States, at least, this is already a bit of a naïve question: due to color politics, you already do not have much of a choice in who leads you. But that's a matter for another time. Even if you think that dictatorship – even benevolent, rationalist dictatorship – would be inherently morally worse than even the flawed democratic system we enjoy here – a notion that may not even necessarily be the case! – do not worry: there's no reason why world domination need entail dictatorships. In countries where there are democratic systems in place, we will work within the system, placing Conspirators into positions where they can convince the people, via legitimate means, to give them public office. Once we have attained a sufficient level of power over this democratic system, we will effect change, and thence the work will go forth until this victory of rationalist dogma covers all the earth. Where there are dictators, they will be removed and replaced with democratic systems... under the initial control of Conspirators, of course, and ideally under their continued control as time passes – but legitimately obtained control.

It is demonstrable that one's level of strength as a rationalist has a direct correlation to the probability that one will make correct decisions. Therefore, the people who make decisions that affect large numbers of people ought to be those who have the highest level of rationality. In this way we can seek to avoid the many, many, many pitfalls of politics, including the inefficiency which Yudkowsky has again and again railed against. If all the politicians are on the same side, who's to argue?

In fact, even if two rationalists disagree on a particular point (which they shouldn't, but hey, even the best rationalists aren't perfect yet), they'll be able to operate more efficiently than two non-rationalists in the same position. Can the disagreement be settled by experiment? If it's important, throw funds at a lab to conduct that experiment! After all, we're in charge of the money and the scientists. Can it not? Then find a compromise that has the maximum expected utility for the constituents. We can do that with a high degree of accuracy; we have access to the pollsters and sociologists, and know about reliable versus unreliable polling methods!
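
As a minimal sketch of what “maximum expected utility for the constituents” could look like in practice – the policy options, utility estimates, and population weights below are entirely hypothetical placeholders, not anything proposed here:

    # Pick the policy option with the highest population-weighted expected utility.
    # All options, utilities, and weights are made-up illustrative numbers.
    options = ["option_a", "option_b", "option_c"]
    # Estimated utility of each option for each constituent group (e.g. from polling).
    utilities = {
        "group_1": {"option_a": 0.2, "option_b": 0.7, "option_c": 0.5},
        "group_2": {"option_a": 0.9, "option_b": 0.4, "option_c": 0.6},
    }
    weights = {"group_1": 0.6, "group_2": 0.4}  # fraction of the population in each group

    def expected_utility(option):
        return sum(weights[g] * utilities[g][option] for g in weights)

    best = max(options, key=expected_utility)
    print(best, expected_utility(best))  # option_b, 0.58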

What about non-rationalist aspiring politicians? Well, under an ideal Conspiracy takeover, there would be no such thing. Lessons on politics would include rationality as a basis; graduation from law school would entail induction into the Conspiracy, and access to the truths held therein.

I suppose the biggest question is, is all this realistic? Or is it just an idealist's dream? Well, there's a non-zero probability that the Conspiracy already exists, in which case, I hope that they will consider my proposal... or, even better, I hope that I've correctly deduced and adequately explained the master plan. If the Conspiracy does not currently exist, then if my position is correct, we have a moral obligation to work our hardest on this project.

“But I don't want to be a politician,” you exclaim! “I have no skill with people, and I'd much rather tinker with the Collatz Conjecture at my desk for a few years!” I'm inclined to say that that's just too bad; sacrifices must be made for the common good, and after all, it's often said that anyone who actually wants a political office is by that fact unfit for the position. But in all realism, I'm quite sure that there will be enough room in the Conspiracy for non-politicians. We're all scientists and mathematicians at heart, anyway.

So! Here is our order of business. We must draw up a charter for the Bayesian Conspiracy. We must invent a testing system able to distinguish between those who are and are not ready for the Truths the Conspiracy will hold. We must find our strongest Rationalists – via a testing procedure we have not yet come up with – and put them in charge, and subordinate ourselves to them (not blindly, of course! The strength of community, even rationalist community, is in debate!). We must establish schools and structured lesson plans for the purpose of training fresh students; we must also take advantage of those systems which are already in place, and utilize them for (or turn them to) our purposes. I expect to have the infrastructure set up in no more than five years.

At that point, our real work will begin.

80 comments

Comments sorted by top scores.

comment by Mitchell_Porter · 2011-08-16T05:16:56.671Z · LW(p) · GW(p)

The more time that passes, the likelier it becomes that transhumanism and Singularity futurism will eventually find political expression. It's also likely that the various forms of rationalistic utilitarian altruism existing in certain corners of the Internet will eventually give rise to a distinctive ideology that will take its place in the spectrum of political views that count. It is even possible that some intersection of these two currents - the futurological rationalism on display at this site - will give rise to a politically minded movement or organization. This post, the earlier "Altruist Support" sequence by Giles, and a few others show that there's some desire to do this. However, as things stand, this desire is still too weak and formless for anyone to actually do anything, and if anyone did become worked-up and fanatical enough to organize seriously, the result would most likely be an irrelevant farce, a psychodrama only meaningful to half a dozen people.

The current post combines: complete blindness with respect to what's involved in acquiring power at a national or international level; no sense of how embattled and precarious is the situation of futurist causes like cryonics and Friendly AI; and misplaced confidence in the correctness of the local belief system.

Let's start with the political naivete. Rather than taking over openly, it's proposed that the Conspiracy could settle for

a simple infiltration of the world's extant political systems

I love the word "simple". Look, politics isn't a game of hide and seek. Ideological groups have the cohesion that they do because membership in the group depends on openly espousing the ideology. If you get to be head of the politburo of the Tragic Soulfulness League after years of dutifully endorsing the party line, and then, once you're in charge, you announce to your colleagues that you actually believe in Maximum Happiness, what happens is that the next day, the media carry the tragically soulful news of the unfortunate accident which cut you down just at the beginning of your term in office, and your successor, the former deputy head, wiping away a tear, vows to uphold the principles of the tragic soul, just as you would have wanted.

the Conspiracy becomes the only major influence in world politics

A perfect picture of fanaticism... Apparently you think of political influence only in terms of belief systems. The perfect end state is that the one true belief system is triumphant! But political influence is also an expression just of the existence of a group of people; it means that the system knows about them, listens to them, contains their representatives. If the world still contains a billion Indians or three hundred million Americans, then India and America will continue to be major "influences" in world politics.

Now let's turn to the author's innocence regarding the situation of cryonics, etc, in the world.

we should devote fewer of our resources to cryonics and life extension, and focus on saving the lives of those to whom these technologies are currently beyond even a fevered dream

In other words, the microscopic number of highly embattled people who are currently working on these matters should instead take on the causes which are already ubiquitously signposted as Good, and which already receive billions of dollars per year. The rationale proposed for this perspective is that when the Conspiracy is in charge, it will own all the resources of the world, so it will be able to afford to do both things at once.

Arandur, if you take this line of thought, you end up working neither on life extension nor on poverty alleviation, but simply on assuming power, with the plan of doing those promised good works at some unknown time in the future.

In passing, let's consider what specific proposals are offered here regarding the solution of recognized problems like war and starvation (as opposed to unrecognized problems like ageing or unfriendly AI). The answers I see are (1) spend even more money on them, and (2) trust us to think of a better approach – we're rationalists, and that means we're better at problem-solving.

At least an explicitly transhumanist agenda would bring something concrete and new to politics. With respect to the existing concerns of politics, this proposal offers no-one any reason to offer you a share of power or to support your aspirations.

Finally, fanatical faith in the correctness of the local philosophy and the way that it is just destined to empower the true believer:

It is demonstrable that one's level of strength as a rationalist has a direct correlation to the probability that one will make correct decisions.

It is even more demonstrable that one's level of self-identification as a rationalist has a direct correlation to the probability that one is irrelevant to anything of any significance, especially the sort of worldly affairs that you are talking about.

Replies from: Arandur, lessdazed, casebash
comment by Arandur · 2011-08-16T06:35:34.848Z · LW(p) · GW(p)

I'm being pulled off to bed, but from my skimming this looks like a very, very helpful critique. Thank you for posting it; I'll peruse it as soon as I'm able. One note: I did note after posting this, but too late to make a meaningful change, that "we should support cryonics less" is rather a ridiculous notion, considering the people I'm talking to are probably not the same people who are working hardest on cryonics. So: oops.

comment by lessdazed · 2011-08-16T07:06:21.397Z · LW(p) · GW(p)

The more time that passes, the likelier it becomes that transhumanism and Singularity futurism will eventually find political expression.

What does this mean, exactly? It's something that without thinking about it I seem to intuitively understand, but my thinking falls apart when I try to examine it closely. It's like zooming in on a picture and finding that no further pixels are part of the data.

Replies from: Mitchell_Porter
comment by Mitchell_Porter · 2011-08-16T08:07:08.192Z · LW(p) · GW(p)

Originally I wrote "It is inevitable that" there will be a politics of the Singularity. But it's possible (e.g. AI hard takeoff) that a singularity could happen before the political culture digests the concept. So there are two ways in which more time equals more futurism in politics. First, the further into the human future we go, the more futurist becomes the general sensibility. Second, the longer the posthuman future holds off, the more time there is for this evolution of human culture to occur.

comment by casebash · 2016-07-03T12:51:38.161Z · LW(p) · GW(p)

It's interesting reading this old comment in light of Effective Altruism.

comment by katydee · 2011-08-16T10:17:36.996Z · LW(p) · GW(p)

If it were that simple to take over the world, someone would have already done it. Whether this should update you in the direction of things not being so simple or in the direction of other conspiracies already controlling the world has been left as an exercise to the reader.

comment by orthonormal · 2011-08-16T02:06:26.135Z · LW(p) · GW(p)

If there is such a Conspiracy, you've already failed the most obvious membership test.

Replies from: beoShaffer, Arandur
comment by beoShaffer · 2011-08-16T05:09:06.711Z · LW(p) · GW(p)
  • First Rule: p(A|X) = p(X|A)p(A) / (p(X|A)p(A) + p(X|~A)p(~A))
  • Second Rule: You DO NOT talk about Bayes Club.
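
For concreteness, a minimal sketch of the First Rule in code – the prior and likelihoods below are made-up numbers chosen purely for illustration:

    # Bayes' theorem for a binary hypothesis A given evidence X:
    # p(A|X) = p(X|A)p(A) / (p(X|A)p(A) + p(X|~A)p(~A))
    def posterior(p_a, p_x_given_a, p_x_given_not_a):
        numerator = p_x_given_a * p_a
        denominator = numerator + p_x_given_not_a * (1.0 - p_a)
        return numerator / denominator

    # Example: prior p(A) = 0.01, p(X|A) = 0.9, p(X|~A) = 0.05  ->  p(A|X) ~ 0.154
    print(posterior(0.01, 0.9, 0.05))
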
comment by Arandur · 2011-08-16T06:23:14.386Z · LW(p) · GW(p)

Ha! Yes, I had this thought as well. I actually messaged Yudkowsky, warning him that I was considering posting this, on the off chance that a) the Conspiracy existed, b) he was among their ranks, and c) he wanted me to not stir up this possibility. I waited for a response for a period of time consistent with my estimation of the probability of the Conspiracy existing in an incarnation that would meaningfully object.

Replies from: orthonormal
comment by orthonormal · 2011-08-16T13:50:04.854Z · LW(p) · GW(p)

Conditional on a Conspiracy existing, the probability that they'd reveal themselves to an unknown person asking via e-mail has to be pretty low. What you obviously should have done instead is to brainstorm for five minutes on how you would really recruit new members if you were the Conspiracy, or alternately on what courses of action you could take to benefit the Conspiracy if it existed. But, like I said, it's too late now – instead, you've signaled that you're clever enough to come up with an idea but not disciplined enough to think it through properly, and that's precisely the type of member a Bayesian Conspiracy would wish to avoid.

Replies from: Nisan, Arandur
comment by Nisan · 2011-08-17T01:11:30.544Z · LW(p) · GW(p)

Indeed, to the extent that members of the Conspiracy reason similarly, they do not need to communicate at all.

comment by Arandur · 2011-08-16T16:04:51.646Z · LW(p) · GW(p)

Your chastisement is well taken. Thank you.

comment by dspeyer · 2011-08-16T03:46:09.988Z · LW(p) · GW(p)

This needs a safety hatch.

It is a recurring pattern in history for determined, well-intentioned people to seize power and then do damage. Certainly we're different because we're rational, but they were different because they were ${virtueTheyValueMost}. See also The Outside View and The Sorting Hat's Warning.

A conspiracy of rationalists is even more disturbing because of how closely it resembles an AI. As individuals, we balance moral logic based on our admittedly underspecified terminal values against moral intuition. But our intuitions do not match, nor do we communicate them easily. So collectively moral logic dominates. Pure moral logic without really good terminal values... we've been over this.

Replies from: None, Arandur
comment by [deleted] · 2011-08-16T04:06:09.267Z · LW(p) · GW(p)

Don't worry. This is exactly what the Contrarian Conspiracy was designed to prevent.

Everything is going according to plan.

comment by Arandur · 2011-08-16T06:24:40.345Z · LW(p) · GW(p)

Huh. An interesting point, and one that I should have considered. So what would you suggest as a safety hatch?

Replies from: dspeyer
comment by dspeyer · 2011-08-16T15:14:46.536Z · LW(p) · GW(p)

I don't know, but I'll throw some ideas up. These aren't all the possibilities and probably don't include the best possibility.

Each step must be moral taken in isolation. No it'll-be-worth-it-in-ten-years reasoning, since that can go especially horribly wrong.

Work honestly within the existing systems. This allows existing safeguards to apply. On the other hand, it assumes it's possible to get anything done within existing systems by being honest.

Establish some mechanism to keep moral intuition. Secret-ballot mandatory does-this-feel-right votes.

Divide into several conspiracies, which are forbidden to discuss issues with each other, preventing groupthink.

Have an oversight conspiracy, with the power to shut us down if they believe we've gone evil.

comment by lessdazed · 2011-08-16T02:51:07.511Z · LW(p) · GW(p)

Imagine, also, how many lives are lost every day due to governmental negligence, and war, and poverty, and hunger

I was watching a hockey game with my ex-girlfriend when a fight broke out (on the ice, not between us). "That shouldn't be allowed!" she said. "It isn't," I responded. "It's a five minute penalty." "But the referees are just watching them fight. They should stop them from fighting!" "That's not an action. They can move their bodies and arms, and step between them, or pull them from behind. But 'making them stop' isn't something that a person can just decide to do. If they step between them now, someone could get hurt."

"Ending negligence" unfortunately isn't an action, unlike, say, typing. It's more like "stopping fighting".

Replies from: Document, Arandur
comment by Arandur · 2011-08-16T06:27:05.219Z · LW(p) · GW(p)

That's quite true. But I have a hunch (warning: bare assertion) that much governmental negligence is due to a) self-interest and b) corruption (see: corrupt African dictatorships).

Replies from: lessdazed, peter_hurford
comment by lessdazed · 2011-08-16T16:45:21.405Z · LW(p) · GW(p)

Somehow I missed seeing your comment (I think), and said what amounts to basically the same thing a few hours later elsewhere. The way I put it was more hopeless and forgiving though, implying that a lot of corruption is inevitable and we should judge actual governments against the ideal government that would also have a lot of negligence, just less.

(Warning: political comment ahead.) I had an insight recently about why I approved of the conclusions of certain conservative or libertarian arguments less often than one would think given my agreement with the premises. (I'm not giving the percentages or my aggregate leanings here; I think it works regardless of what they are.) Namely, I realized that many valid anti-government arguments are mostly anti-bureaucracy arguments. Bureaucracy is still a cost of privatization, just less of one, and it is roughly inversely proportional to the number of businesses that would fill the economic function if the government didn't. So my intuitions (far view, compartmentalizations) were correct this time, and were accounting for the hidden cost of the options that lessened or minimized bureaucracy. Baselines are very important, and it's also important to note victories of the compartmentalization heuristic for those like me who are inclined the other way.

Now I will indulge in a few words about the role of fighting in professional hockey.

It would be easy for me to say that all of the anti-fighting arguments I've heard are either foolish, naive, dismissive of obvious unintended consequences, contemptuous towards evidence, deontological, and/or unaware of human nature. Some genuinely militate against fighting, but are weak, so I don't believe I'm seeing arguments as soldiers too much. However, one connotation of the above hypothetical statement would be false, for I have generated an argument from the wreckage of what I have heard, in an attempt to have sound conclusions. This doesn't happen very often, so it's worth noticing and mentioning even when it happens in such a mundane context as this.

It is very possible that the rituals surrounding fighting in the modern NHL, which are approximately the best way to ensure safety at that level short of dramatically slashing at the entertainment value of the sport itself (for example, by reducing the speed of everything), are so safe because by the time people make it to the NHL level, they have experience fighting at lower levels, levels during which bad injuries occurred because they were fighting as inexperienced fighters.

I don't know what exactly would be the best policy. Having two linesmen step between punching men whose primary or secondary priorities are self-defense and simultaneously try to restrain them as they struggle is not a good idea. Having penalties for being the third man to enter a fight that dwarf those for participating in a fair fight is a good idea; this necessitates having relatively small penalties for participating in a fair fight, etc.

That reminds me, Scott Adams recently advocated the death penalty for some forms of rape, which obviously removes the incentive perpetrators have to not kill those victims, unless one tortures as a penalty for murder. I bring this up largely to discuss his role as a thinker and how it relates to others'. He is good at generating creative ideas, but hits upon a lot of false negatives. He isn't very bright, I think, but I greatly admire his ability to not feel the need to justify half-formed ideas while holding off on proposing solutions, as well as to chuck ideas without becoming attached to them as part of his identity, and simply generate new ones.

I have found it helpful to think of people as lying along the false-positive-to-false-negative idea-bearing axis, and think it is something to bear in mind during disagreements.

Replies from: cata
comment by cata · 2011-08-17T22:05:32.559Z · LW(p) · GW(p)

Not to go too far off-topic here, but it would be trivial for the league to prevent fighting; just impose real penalties, like ejection from the game and/or suspension from future games. That's how most other professional sports work, and, not surprisingly, there aren't typically fights during the game in those sports (even in physically aggressive ones like football and basketball.) I don't see why one would expect the implementation of such a rule in hockey to result in anything different.

Whether or not you think that ice hockey without fighting would have a "dramatically slash[ed]" entertainment value, is, I suppose, a matter of opinion.

Replies from: lessdazed
comment by lessdazed · 2011-08-18T06:40:49.371Z · LW(p) · GW(p)

Whether or not you think that ice hockey without fighting would have a "dramatically slash[ed]" entertainment value, is, I suppose, a matter of opinion.

I didn't say that fighting is entertaining, but that fighting maintains safety, and many unrelated safety measures would reduce entertainment.

it would be trivial for the league to prevent fighting

Less fighting is probably a means, yes? The end is well-being and safety?

That's how most other professional sports work

It's how most hockey leagues and tournaments work, allowing for even better comparisons.

Replies from: cata
comment by cata · 2011-08-18T17:04:33.355Z · LW(p) · GW(p)

Got it, I think I misunderstood your position about fighting and safety. I get your point now. Thanks!

Replies from: lessdazed
comment by lessdazed · 2011-08-18T23:44:01.202Z · LW(p) · GW(p)

In the news...it's not often that a link to icanhascheezburger.com is appropriate and on topic at LW. So let's savor the moment!

There was no norm to stop the conflict with a one-on-one fight ended by referees after the parties were tired, nor a secondary one for the conflict to be between all members on the court paired off 5v5, so it went straight to a bench clearing brawl.

Assuming nuclear arsenals were universal and impossible to disarm, I would be wary of extremist conventional arms control.

comment by Peter Wildeford (peter_hurford) · 2011-08-16T06:53:59.327Z · LW(p) · GW(p)

Don't underestimate the concept of people just not thinking through their actions. People who are guilty of negligence are the ones who simply failed to properly secure their beliefs, not the ones who deliberately decided they benefited from killing others.

comment by Alexei · 2011-08-16T01:57:27.291Z · LW(p) · GW(p)

Wow, this post shot LW's "politics is a mind-killer" policy in the head and jumped up and down on its corpse. That said, I'm at a loss about how I feel. This seems to me at once dangerously naive and blissfully idealistic. I do feel, though, that having a government/system like this in place will increase the chances of a positive singularity by a good margin, and that's nothing to scoff at.

comment by lessdazed · 2011-08-16T01:29:46.758Z · LW(p) · GW(p)

The term “world domination” is, unfortunately, rather socially charged, bringing to mind an image of the archetypal mad scientist with marching robot armies.

Or a pair of laboratory mice, whose genes have been spliced.

Replies from: Alexei
comment by Alexei · 2011-08-16T01:59:08.539Z · LW(p) · GW(p)

Downvoted. This is a serious post and this comment adds absolutely nothing to the discussion. Funny references belong on reddit.

Replies from: lessdazed, wedrifid
comment by lessdazed · 2011-08-16T02:13:56.445Z · LW(p) · GW(p)

Because every university's resources are limited, an educational institution must routinely make decisions concerning the use of the time and space that is available for extracurricular activities. In my judgment, it is both necessary and appropriate for those decisions to evaluate the content of a proposed student activity. I should think it obvious, for example, that if two groups of 25 students requested the use of a room at a particular time -- one to view Mickey Mouse cartoons and the other to rehearse an amateur performance of Hamlet -- the First Amendment would not require that the room be reserved for the group that submitted its application first. Nor do I see why a university should have to establish a "compelling state interest" to defend its decision to permit one group to use the facility and not the other. In my opinion, a university should be allowed to decide for itself whether a program that illuminates the genius of Walt Disney should be given precedence over one that may duplicate material adequately covered in the classroom.

--Supreme Court Justice Stevens, WIDMAR v. VINCENT, concurring opinion.

Some people believe that humor belongs even in serious things, like supreme court legal opinions. Some people believe there is value in cartoons.

Replies from: Alexei
comment by Alexei · 2011-08-16T02:54:56.393Z · LW(p) · GW(p)

I have nothing against humor. In fact, if you look at my reply to OP, you'll see I even have a joke in there. The point I was making is that your post didn't have any substance aside from humor.

Replies from: lessdazed
comment by lessdazed · 2011-08-16T03:05:30.662Z · LW(p) · GW(p)

1) I reject the implication that there is no amount of humor that could justify a comment regardless of its other substance (given its length and the context). I accept for consideration the criticism that my comment wasn't funny enough, but not that it was categorically wrong to have a comment that is nothing but humorous.

2) To say that the comment had no substance aside from humor is a fine enough thing to say, because and only because the reader will interpret it as meaning that you didn't see any other substance. It is a fine enough thing to say if one thinks the probability of other substance is sufficiently low...but how close to zero did you think it was? "World domination" really did make me think of Pinky and the Brain, FWIW.

3) The value of a comment with no substance aside from humor here was to somewhat mitigate what I saw as an impending avalanche of critical comments and downvotes.

Replies from: Arandur
comment by Arandur · 2011-08-16T06:30:17.218Z · LW(p) · GW(p)

Heh, I appreciate the mitigation.

comment by wedrifid · 2011-08-16T03:52:43.813Z · LW(p) · GW(p)

Downvoted. This is a serious post and this comment adds absolutely nothing to the discussion. Funny references belong on reddit.

Reversed. I liked the comment. You underestimate the relevance.

Replies from: Arandur, None
comment by Arandur · 2011-08-16T06:30:47.439Z · LW(p) · GW(p)

Seconded. I actually found this very relevant, and quite a good point.

comment by [deleted] · 2012-04-27T10:55:32.223Z · LW(p) · GW(p)

Well I don't get it... What's Pinkie and the Brain got to do with this?

comment by calcsam · 2011-08-16T00:59:12.706Z · LW(p) · GW(p)

Not feasible. Let's aim for a more modest goal, say, better PR and functional communities.

Moreover, not this community's comparative advantage. Why do we think we'd be any better than anyone else at running the world? And why wouldn't we be subject to free-riders, power-seekers, and rationalists-of-fortune if we started winning?

Replies from: PlacidPlatypus, Arandur, MatthewBaker
comment by PlacidPlatypus · 2011-08-16T02:45:07.440Z · LW(p) · GW(p)

We think we'd be better at running the world because we think rationalists should be better at pretty much everything that benefits from knowing the truth. If we didn't believe that we wouldn't be (aspiring) rationalists. And just because we couldn't do it perfectly doesn't mean we're not better than the alternatives.

Replies from: Vaniver, lessdazed
comment by Vaniver · 2011-08-16T03:03:29.416Z · LW(p) · GW(p)

We think we'd be better at running the world because we think rationalists should be better at pretty much everything

Overconfidence seems like a poor qualification.

Replies from: Arandur
comment by Arandur · 2011-08-16T06:25:42.213Z · LW(p) · GW(p)

And yet confidence seems a good one. The question is how much is too much, which can really only be verified after the fact.

comment by lessdazed · 2011-08-16T03:22:04.019Z · LW(p) · GW(p)

And just because we couldn't do it perfectly doesn't mean we're not better than the alternatives.

I wonder how well a group whose members didn't study how to think and instead devoted themselves to not letting emotions interfere with their decisions would do. All its work would be advances, I think - there would be no analog to the "valley of rationality" in which people lost touch with their intuitions and made poor decisions.

Replies from: Strange7
comment by Strange7 · 2011-08-16T03:36:46.790Z · LW(p) · GW(p)

I dispute your claim.

In fact, I would assert the exact opposite: that attempting to remove emotions from decisionmaking is what causes the "valley of rationality." Furthermore, I suspect it is a necessary transitional phase, comparable in its horrific necessity to the process of re-breaking a mangled shinbone so that it can heal straight.

Replies from: lessdazed
comment by lessdazed · 2011-08-16T03:49:15.961Z · LW(p) · GW(p)

All its work would be advances, I think

I dispute your claim.

I'm well disposed towards your viewpoint on that.

attempting to remove emotions from decisionmaking is what causes the "valley of rationality."

I disagree with the implication of this. I think the main causes are misusing tools like Bayesian updating and considering what a rationalist would do, and trying to do that.

Insofar as poorly calibrated emotions are part of the problem, one must subtract the problems that would have been caused by them under non-aspiring rationalist conditions from those under aspiring-rationalist conditions to calculate what the aspiring rationalism is responsible for. I don't think this usually leaves much left over, positive or negative.

comment by Arandur · 2011-08-16T16:13:42.410Z · LW(p) · GW(p)

Functional communities would be nice. I'm not so sure that better PR is the way to go. Why not no PR? Why not subtle induction via existing infrastructure? Let the people who most deserve to be here be the ones who will find us. Let us not go out with blaring trumpet, but with fishing lure.

comment by MatthewBaker · 2011-08-16T01:43:10.660Z · LW(p) · GW(p)

What specific concerns make you disagree with its feasibility?

Replies from: calcsam
comment by calcsam · 2011-08-16T05:03:53.812Z · LW(p) · GW(p)

We have neither the numbers, the organizational skill, nor the social skills to be good at this. There is a joke that organizing libertarians is like herding cats, and the same principle seems to be partly true here for the same reason: LW draws a lot of smart, contrarian people. Unless there is a technological way to conquer the world, say the Singularity – but that demands an entirely different organizational strategy, namely channeling all efforts into FAI.

comment by Jack · 2011-08-17T03:26:36.359Z · LW(p) · GW(p)

In addition to everything that's already been said: when the median rationalist is still struggling to get a date, the idea of winning popularity contests and infiltrating the domain of charismatic, glad-handing networkers is preposterous.

Replies from: lucidfox, None
comment by lucidfox · 2011-08-19T04:40:56.272Z · LW(p) · GW(p)

the median rationalist is still struggling to get a date

First, [citation needed].

Second, if it's true, perhaps one should look at oneself and ask why.

Replies from: Jack
comment by Jack · 2011-08-19T12:27:11.733Z · LW(p) · GW(p)

First, it really isn't. I'm making a generalization about a group I'm familiar with. Second, I don't struggle to get dates.

comment by [deleted] · 2012-04-27T01:48:36.531Z · LW(p) · GW(p)

More importantly, if works such as "The Thick Of It" are to be believed, politicians actually don't get much tail at all, on average, what with being married to the job.

comment by MBlume · 2011-08-16T01:16:54.923Z · LW(p) · GW(p)

I submit that the primary mid-term goal of the Bayesian Conspiracy, at this stage of its existence, is and/or ought to be nothing less than world domination.

Before the rotten fruit begins to fly, let me make a brief clarification.

Is it odd that I laughed out loud at the idea that this should even be controversial?

comment by Peter Wildeford (peter_hurford) · 2011-08-16T03:08:53.069Z · LW(p) · GW(p)

I suppose the biggest question is, is all this realistic? Or is it just an idealist's dream?

While beautifully written, it does all sound like an idealist's dream. Or at least you have said very little to suggest otherwise.

More downvotes would send you to negative karma if there is such a place, and that's a harsh punishment for someone so eloquent. In sparing you a downvote, I encourage you to figure out what went wrong with this post and learn from it.

If there are three things I've found in my little time here, it is that the community strongly admires posts with novelty (the post discusses new material or adds to material in a way that is not covered in other posts), specifics (the post explains a plan for action or a set of facts to be learned rather than more vague philosophic generalities), and balance (the post examines the pros and cons of making a change, rather than appearing one-sided). Your post seems light on all three.

Replies from: Arandur, lessdazed, ScottMessick
comment by Arandur · 2011-08-16T06:32:24.583Z · LW(p) · GW(p)

..... I will meditate on this constructive criticism. Thank you very much; I think this is the most useful response I've seen.

Replies from: peter_hurford
comment by Peter Wildeford (peter_hurford) · 2011-08-16T07:04:25.831Z · LW(p) · GW(p)

I feel sorry you had to learn this by being taken for every karma point you own. I strongly suggest you make use of the Discussion Section for your future posts; that's a great place to learn what does work and what doesn't. My first two posts got downvoted, but I didn't lose out because votes are only -1 karma there. Read the LW About Page if you haven't already.

And remember that karma is not the end-all be-all of LW. I think you benefitted a lot by trading your karma for knowledge of how the LW community works. Karma itself is not a terminal value, but a means to fitting in with LW, which is also not a terminal value, but a means to furthering your rationality, which is also likely not a terminal value, but a means to getting better at satisfying your goals.

To further clarify my criticism just to make sure your karma freefall was worth it, this post would have benefitted by, among other things, being fifty times more practical -- what do you think is the first step toward gaining control of all the world's institutions? If you don't even know that, why are you writing about world domination? Maybe you can talk a lot more about how to sell rationality to the public so that they react to the conspiracy favorably rather than negatively, for instance.

I think you have a gift for writing in a very eloquent, enjoyable manner, so I would hate for you to leave just because of one fiasco. I implore you to reflect, refocus, and give it another shot.

Replies from: peter_hurford
comment by Peter Wildeford (peter_hurford) · 2011-08-16T07:07:11.490Z · LW(p) · GW(p)

By the way, sorry that this comment treats you like you're new to LW -- I can see from going through your comment and post history that you're not. My mistake.

Replies from: Arandur, Arandur
comment by Arandur · 2011-08-16T16:10:15.467Z · LW(p) · GW(p)

That's quite all right; I'm sure the naivete blossoming forth from the OP makes that an easy mistake to make. :P

I'm well aware of the Discussion Section... which only compounds my error. Yes, this should have been posted there. Losing some eighty Karma (by the way, apparently negative Karma does not exist per se, but perhaps it does de facto...) is as good a wakeup call as any for the sin of overconfidence.

I would have traded my karma simply for the advice you've given here. Thank you. And thank you for the compliment on my writing style; nice to see not everything about this experience was negative. I assure you that I will not be leaving any time soon. When I first saw that this post was getting a negative response, I made a split-second decision: should I flee, or should I learn? I choose to learn.

Replies from: saturn
comment by saturn · 2011-08-17T00:34:17.786Z · LW(p) · GW(p)

I think even though the karma counter never goes below zero, downvotes still count and it won't go above zero until you get enough upvotes to cancel them out.

Replies from: Arandur
comment by Arandur · 2011-08-17T01:07:07.823Z · LW(p) · GW(p)

I can confirm that hypothesis; I'm still at zero, even though the grandfather to this post has received 4 points, given after I lost all my karma. Actually, this is a bit of an annoyance; I have no way to gauge how far I have to go to get into the positives...

Replies from: falenas108
comment by falenas108 · 2011-08-17T13:04:35.875Z · LW(p) · GW(p)

As long as you didn't delete any other comments/posts, you can figure out what your karma is by adding up everything else.

comment by Arandur · 2011-08-16T20:20:40.373Z · LW(p) · GW(p)

I am still relatively new to LW, though - or else I'm just not very good at picking up on social values - so I'll ask this question of you: What stigma would be attached to my decision to delete this post? I don't want to do it just to get my Karma back; I'm willing to accept the consequences of my mistake. On the pro side, this would no longer come up under my posts, and so people who have not already seen it would fail to judge me by it. This is only a positive because I have in fact learned much from the response, and plan to act upon those lessons. On the con side, it might be viewed as... I almost want to say cowardly? Failing to take responsibility for my actions? Running away?

I'm not sure, though, what the implications of that action would be to the rest of the community, so I need an outside opinion.

EDIT: I recognize that it is good to acknowledge that I have made stupid decisions for bad reasons. I do not know if it is a virtue to keep your mistakes around and visible.

Replies from: Alicorn, shminux, peter_hurford
comment by Alicorn · 2011-08-16T20:42:36.150Z · LW(p) · GW(p)

I don't want to do it just to get my Karma back

Good, because it wouldn't do that.

Replies from: Arandur
comment by Arandur · 2011-08-16T20:45:14.280Z · LW(p) · GW(p)

Oh, good. :3 I was worried that doing so would give that false implication.

comment by Shmi (shminux) · 2011-08-16T21:17:26.576Z · LW(p) · GW(p)

If this potential confusion is your real reason and not a convenient rationalization, I would suggest an EDIT along the lines of " convinced me that was not a good one to hold, and I no longer think that Bayesian conspiracy is a good idea outside of the HPMoR fanfiction". If you still hold that it is, then bear it like a rationalist you aspire to be, since you presumably examined this model of action with an utmost care, to avoid any biases.

EDIT: I certainly do not plan to delete my discussion post with negative karma, though I did retract (not delete) one rather poorly thought out comment previously.

Replies from: Arandur, Arandur
comment by Arandur · 2011-08-17T11:57:18.906Z · LW(p) · GW(p)

Ha! Now I feel like a noob. How do I edit a top-level post? :3

Replies from: shminux
comment by Shmi (shminux) · 2011-08-17T15:35:42.332Z · LW(p) · GW(p)

If you click on your nick, you will see it among your other posts, and you can edit it there, I suppose.

comment by Arandur · 2011-08-16T22:26:17.646Z · LW(p) · GW(p)

Thank you.

comment by Peter Wildeford (peter_hurford) · 2011-08-17T07:21:06.008Z · LW(p) · GW(p)

I'm newer than you and have not yet braved into the "Main" section, so I don't really know. I didn't know deleting a post could get you the karma back, that seems like a bad policy and counterproductive to what karma is supposed to do.

Still, I think you've "learned your lesson", so to speak, so I personally wouldn't mind at all.

Replies from: Arandur
comment by Arandur · 2011-08-17T11:51:08.526Z · LW(p) · GW(p)

Apparently it can't, which is a good thing, upon reflection.

comment by lessdazed · 2011-08-16T03:14:29.313Z · LW(p) · GW(p)

Your list is good. I would also add that references to relevant studies are valued.

The OP was novel enough. On the A-F scale I give it a B in the novelty category. No mercy, a cold-blooded judgement, neither a B- nor a B+.

Replies from: peter_hurford
comment by Peter Wildeford (peter_hurford) · 2011-08-16T03:55:05.533Z · LW(p) · GW(p)

I would put references to relevant studies under "specifics" but it is definitely something I should have highlighted.

comment by ScottMessick · 2011-08-16T23:02:48.992Z · LW(p) · GW(p)

While beautifully written, it does all sound like an idealist's dream. Or at least you have said very little to suggest otherwise.

More downvotes would send you to negative karma if there is such a place, and that's a harsh punishment for someone so eloquent. In sparing you a downvote, I encourage you to figure out what went wrong with this post and learn from it.

I downvoted the OP. A major turn-off for me was the amount of rhetorical flourish. While well-written posts should include some embellishment for clarity and engagement, when there's this much of it, the alarm bells go off...what is this person trying to convince me of by means other than reasoned argument?

See also: the dark arts.

Replies from: peter_hurford
comment by Peter Wildeford (peter_hurford) · 2011-08-17T07:22:53.188Z · LW(p) · GW(p)

Maybe, but I think that's just because the post was also low on specifics. If Arandur brought the flourish and the specifics, I think it would be great, and would balance out the other stuff that can appear boring, dry, and overly technical.

Though it could just be a difference in preferences.

comment by Kaj_Sotala · 2011-08-16T21:08:42.451Z · LW(p) · GW(p)

I agree that more rationality in politics would be a good thing, but I think this post is making too big of a deal out of it. Eliezer said essentially the same thing, "rationalists ought to be more active in politics", much more succinctly here.

Are you a rationalist who feels like you could go into politics? Well then, go into politics if you think that's where your comparative advantage lies. See if you can get your local friends to support you. Getting the support of fellow rationalists is good, but the main thing is getting emotional support from someone - you'll need a lot of it. If you can get other rationalists to also become candidates, great. If not, you'll just have to work with smart people who aren't explicitly rationalist.

If you feel like you could manage politics (emotionally as well as intellectually), then go into politics. That's all there is to it. You don't need to discuss an elaborate Bayesian Conspiracy for that. Rationalists who are into politics and are in a position to assist each other will find each other regardless.

(I've dabbled in politics, but become demotivated because a large part of it involves repeating the same message over and over and over, which I'm not good at. I'd rather just write up one killer argument for issue X and then move on to Y since X has been settled, barring any devastating critiques to my argument for X. Doesn't work. Also, I'm bad at thinking up answers to interview questions on the spot, and developed a bit of an ugh field for doing interviews.)

comment by Vaniver · 2011-08-16T02:05:04.678Z · LW(p) · GW(p)

It seems pretty obvious that Eliezer's view is that FAI is the quick ticket to world domination (in the sense of world states that you talk about), and he seems to be structuring his conspiracy accordingly.

It is demonstrable that one's level of strength as a rationalist has a direct correlation to the probability that one will make correct decisions.

Really? How would one demonstrate this? What does it mean for a definition to be "correct"? If something is true by definition, is it really demonstrable?

we have a moral obligation to work our hardest on this project

Really? Your plan is to get people interested in world domination by guilting them?

Replies from: Arandur
comment by Arandur · 2011-08-16T06:29:09.732Z · LW(p) · GW(p)

It seems pretty obvious that Eliezer's view is that FAI is the quick ticket to world domination...

I hadn't considered that, but now I see it clearly. How interesting.

Really? Your plan is to get people interested in world domination by guilting them?

Ha! If that would work, maybe it'd be a good idea. But no, pointing out a moral obligation is not the same as guilting. Guilting would be me messaging you, saying "See that poor starving African woman? If you had listened to my plan, she'd be happier." But I won't be doing that.

comment by MatthewBaker · 2011-08-16T02:03:48.712Z · LW(p) · GW(p)

I read it and I thought it was amazingly similar to a lot of the thoughts and feelings I've had going through my head recently. Maybe this is just the emotion and folly of youth, but I feel like the world as a whole is very apathetic towards the suffering that exists outside of the bubble of the First World that LW exists in. How can you honestly choose cryonics over the utility of an organization built to protect human life until the singularity, along with Eliezer's group which works to ensure a positive singularity?

I recently saw a movie about government corruption, and the UN dealing with it in Europe when it comes to fighting the sex trafficking industry. The courage it takes to fight oppression around the world is rare and expensive to come by, but it's definitely something we need more of. Once I master the art of willpower I intend to devote even more time to this pursuit, and I hope others will do the same.

Replies from: lessdazed
comment by lessdazed · 2011-08-16T02:51:57.299Z · LW(p) · GW(p)

Why was this downvoted?

comment by JosephMcMahon · 2014-07-21T04:29:01.800Z · LW(p) · GW(p)

What if we had the infiltration of non-rationalists who, through their cunning, gained control, and we ended up with what we've got now? lol Or what if the road to being a rationalist is paved by the mistakes of the non-rationalists – e.g. the original thought to live forever may not have come from a rationalist, but can only be determined by a rationalist? lolx2 Or worse, what if the richness of life is felt, or only made real, by the exquisite angst that is caused by the tension between rationality and mendacity?

comment by XFrequentist · 2011-08-16T14:40:39.613Z · LW(p) · GW(p)

(glances left and right)

SHHHHH!!!

comment by Kevin · 2011-08-16T03:10:07.749Z · LW(p) · GW(p)

Bayesian Conspiracy @ Burning Man 2011, a social group? Ha.

Replies from: Arandur
comment by Arandur · 2011-08-16T06:31:32.547Z · LW(p) · GW(p)

I do apologize if I've given offense; not having had the opportunity yet to attend, I used the broadest term I could conjure while maintaining applicability.