The Goal of the Bayesian Conspiracy 2011-08-16T18:40:55.939Z · score: -9 (46 votes)
Book of Mormon Discussion 2011-08-01T18:25:50.438Z · score: -1 (19 votes)


Comment by arandur on Mushrooms · 2017-11-10T20:53:04.363Z · score: 13 (5 votes) · LW · GW

That's where randomized controlled trials come in. Rigor! Scholarship! Risks to one's health! That's the scientific method!

Comment by arandur on Mushrooms · 2017-11-10T18:53:12.822Z · score: 5 (2 votes) · LW · GW

That's where the virtue of experimentation comes in. Let us know what you find! :D

(Purely incidentally, I love what you're doing on We've Got Worm. Didn't know you ran in these circles, though I might have guessed.)

Comment by arandur on The Journal of High Standards · 2017-11-10T15:30:48.889Z · score: 2 (1 votes) · LW · GW
If you can convince people that the standards of the new journal are actually better than the existing ones, that further helps with making the decision to publish in the journal seem virtuous.

I concur with the implication, but that's a very big "if". It's possible that many scientists know that e.g. the CONSORT standards are good, but how many do you think would be able to differentiate between two sets of standards and determine which one is "better"? In addition, I'm not sure that "virtue" really is much of a factor when deciding in which journal to publish one's research; otherwise we wouldn't see people following the incentive gradients they do.

Finally, I don't see how taking money for publishing instead of taking money from a grant seems more like selling out.

One is status quo, the other is novel. Sometimes that's all it takes. I can easily imagine a conversation like the following:

"Hey Kit, in which journal did you publish your recent research paper?"
"Oh, I published it in The Journal of High Standards."
"Huh, I've never heard of them. Why didn't you submit it to The Prestigious Yet Unvirtuous Journal?"
"Well, The Journal of High Standards paid me a few thousand dollars."
"Really? That sounds suspicious. You sure it isn't a scam?"

The idea of money coming from the government to fund scientific research is already well-established (since it's what we do), it naturally appeals to our democratic ideals, and everyone understands the incentive structure involved. The idea of money coming as a reward from a publisher is novel (and therefore weird), and the incentive structure is murkier (and therefore suspicious).

This all said, I'm speaking solely from my intuition regarding how people would react to this situation, and my intuition seems to differ substantially from yours. I'm not trying to convince you that you're wrong and I'm right; rather, I'm trying to signal that there is a wide possibility space here, and I'm not sure why you've picked "offering money will lead to greater prestige" out of it when other possibilities seem to be just as likely, if not more so.

Comment by arandur on On Inconvenient Truth · 2017-11-10T15:05:51.225Z · score: 3 (3 votes) · LW · GW

You could try to bite bullets and believe the inconvenient facts.
You could try to find the facts and change your politics to fit.

You mention that you "feel committed to the last". If you had used the word "beliefs" instead of "politics," I would endorse and agree with your commitment. Given that you used the word "politics," though, I'm inclined to believe that the better path is somewhere between the two positions quoted above.

I agree that "[for] almost any political position, there is at least one inconvenient fact." (Or, at least, I think I agree; I think that you are using the terms "political position" and "inconvenient" in the way that is intuitive to me.) But political positions are only powerful enough to enact change when multiple people believe in them, or believe in something close enough that they can work together. There are inconvenient facts that I'm aware of which cast doubt on some of my political positions, but the doubts are small enough that I believe it better to hold onto the imperfect position than to abandon it. Politics is not a game of finding optimum solutions; it is a game of coalition-building, of incremental change, of pushing for policies that are better than what existed before. (I would imagine that this fact is part of the source of the frustration many aspiring rationalists feel toward politics. I know that I feel this frustration.)

So let's set aside politics for the moment, because you seem to use the term roughly interchangeably with "belief", e.g. "whatever you wish to believe, there is, somewhere, a fact that will cast doubt on it."

But my beliefs are probabilistic in nature. Seemingly contradictory facts are not enemies -- to the contrary, they are expected. If I believe that a die is weighted such that the number six will show up 25% of the time, I will bet on each roll coming up on six. 75% of the time, the facts will appear to be against me -- and yet, if I bet at the right odds, I'll still expect to win in the long run. It is my intuition that this stance lies closer to "[trying] to bite the bullets and believe the inconvenient facts" than "[trying] to find the facts and [changing] my [beliefs] to fit."
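The weighted-die bet above can be made concrete with a toy sketch. The 25% figure is from the comment; the 4-to-1 payout odds are my own illustrative assumption, chosen so that the bias makes the bet favorable:

```python
import random

# Illustrative assumptions: the die shows a six 25% of the time, and the
# bet pays 4-to-1 on a six (which would be a fair price if sixes came up
# 20% of the time, so the biased die gives us an edge).
P_SIX = 0.25
PAYOUT = 4  # win 4 units on a six, lose 1 unit otherwise

# Analytic expected value per unit staked: positive, even though
# the bet loses 75% of the time.
ev = P_SIX * PAYOUT - (1 - P_SIX) * 1
print(ev)  # 0.25

# A quick Monte Carlo check of the same quantity.
random.seed(0)
n = 100_000
winnings = sum(PAYOUT if random.random() < P_SIX else -1 for _ in range(n))
print(round(winnings / n, 2))
```

The point of the sketch is exactly the one in the comment: the "facts" (individual rolls) go against the bettor three times out of four, yet the long-run expectation is still a gain.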

Of course, I should still update the probability based on each roll I see, so maybe that counts as "[changing] my [beliefs]"? I'm not sure. Maybe my point is just that I'm not clear on what difference you're making between those two options. Really, your first option, "[taking] a stance of strong epistemic and moral modesty, and never [taking] a position with confidence," could also describe the situations I touched on above.
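The "update on each roll" idea can likewise be sketched as a two-hypothesis Bayesian update. The hypotheses and the 50/50 prior here are illustrative assumptions of mine, not anything from the original exchange:

```python
# Two candidate hypotheses about the die (illustrative): it is fair
# (a six shows 1/6 of the time) or weighted (a six shows 25% of the time).
p_six = {"fair": 1 / 6, "weighted": 0.25}
prior = {"fair": 0.5, "weighted": 0.5}

def update(belief, rolled_six):
    """One Bayesian update on the outcome of a single roll."""
    posterior = {}
    for hypothesis, p in belief.items():
        likelihood = p_six[hypothesis] if rolled_six else 1 - p_six[hypothesis]
        posterior[hypothesis] = p * likelihood
    total = sum(posterior.values())
    return {h: v / total for h, v in posterior.items()}

# Seeing a six shifts belief toward "weighted"; a non-six shifts it back,
# but only slightly, since a miss is expected under both hypotheses.
belief = update(prior, rolled_six=True)
print(round(belief["weighted"], 3))
belief = update(belief, rolled_six=False)
print(round(belief["weighted"], 3))
```

This is why a seemingly contradictory observation is not an "enemy" of the belief: a non-six nudges the posterior down a little, but never forces abandoning the hypothesis outright.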

Comment by arandur on Some suggestions (desperate please, even) · 2017-11-10T14:41:25.231Z · score: 1 (1 votes) · LW · GW

Adding to the Markdown parsing comment: if we're going to type in Markdown anyway instead of having a proper WYSIWYG editor (make no mistake; I prefer the former, though the WYSIWYG panel that opens when you highlight text is excellent), I think it makes sense to separate the raw input from the formatting. I would prefer a system like Reddit's or Stack Exchange's, where the text box shows the raw Markdown and the resulting formatted text is displayed elsewhere for review. Combining the two into one area makes fine editing more difficult.

I'm inclined to believe that this would also go some way toward fixing Jiro's underscore issue, since "[struggling] with a malicious parser" is somewhat easier when you have easy access to the parser's input and output. It might not help as much as I think, though, and it would indeed be easier not to have to struggle. (Jiro, as a workaround I would recommend surrounding the offending text with backticks, e.g. `expand_less`.)

Comment by arandur on The Journal of High Standards · 2017-11-10T14:32:37.150Z · score: 11 (4 votes) · LW · GW
Even if the money alone isn't enough to warrant the scientist publishing in a no-name journal, the journal would soon stop being a no-name journal, because scientists would expect that their colleagues want to publish in the journal to get the money. That expectation makes the journal more prestigious. The expectation that other people expect the journal to get more prestigious will in turn increase its prestige.

I'm inclined to dispute this point. Setting quite aside the difficulty of setting up such a project, supposing that the money came ex nihilo and we magically caught the ear of prestigious scientists... it is my intuition that our journal would nevertheless fail to gain prestige. I believe that scientists who published with us would be seen as having been "bought", and I expect that this scorn would overpower any demonstrable merit the research or our journal as a whole possessed. "I want to publish in this journal to get the prize money" is a different motivation than "I want to publish in this journal because it has prestige," and I don't think that gap is as easily crossed as you seem to think.

Comment by arandur on 10/19/2017: Development Update (new vote backend, revamped user pages and advanced editor) · 2017-11-01T16:18:32.511Z · score: 3 (2 votes) · LW · GW

The Navbar is transparent on the About page on Android -- when I scroll down, the content and the navbar text overlap each other. Not sure if that's intentional, but it seems a bit awkward to me.

Actually, much of my experience on Android has been buggy -- is mobile performance not a high development priority right now?

Comment by arandur on High Challenge · 2012-11-09T18:41:24.566Z · score: 0 (0 votes) · LW · GW

... huh. I wonder if Neal Stephenson is a LW reader. See his (most recent?) book, REAMDE, for an implementation of this idea.

Comment by arandur on High Challenge · 2012-11-09T18:36:58.210Z · score: 0 (0 votes) · LW · GW

I'm not sure that the difference between 4D states and 3D states is meaningful, with respect to eudaimoniac valuations. Doesn't this overlook the fact that human memories are encoded physically, and are therefore part of the 3D state being looked at? I don't see any meaningful difference between a valuation over a 4D state, and a valuation over a 3D state including memories of the past.

In other words, I can think of no 3D state whose eudaimoniac valuation is worse than that of the 4D state having it as its endpoint.

(In fact, I can think of quite a few which may in fact be better, for pathological choices of 4D state, e.g. ones extending all the way back to the Dark Ages or before.)

P.S. Is there a standardized spelling for the term which I have chosen to spell as "eudaimoniac"? A quick Google search suggested this one as the best candidate.

Comment by arandur on Is Morality Given? · 2011-08-22T01:05:30.503Z · score: 1 (3 votes) · LW · GW

Oh dear; how embarrassing. Let me try my argument again from the top, then.

Comment by arandur on Is Morality Given? · 2011-08-22T00:46:04.788Z · score: 5 (9 votes) · LW · GW

... Just to check: we're talking about Microsoft Office's Clippy, right?

Comment by arandur on Is Morality Given? · 2011-08-21T17:35:12.110Z · score: 0 (4 votes) · LW · GW

Ha! No. I guess I'm using a stricter definition of a "mind" than is used in that post: one that is able to model itself. I recognize the utility of such a generalized definition of intelligence, but I'm talking about a subclass of said intelligences.

Comment by arandur on Is Morality Given? · 2011-08-20T04:27:35.708Z · score: 1 (5 votes) · LW · GW

Which sounds like that fuzzily-defined "conscience" thing. So suppose I say that this "Stone tablet" is not a literal tablet, but is rather a set of rules that sufficiently advanced lifeforms will tend to accord to? Is this fundamentally different than the opposite side of the argument?

Comment by arandur on Pascal's Mugging: Tiny Probabilities of Vast Utilities · 2011-08-20T04:23:35.124Z · score: -1 (3 votes) · LW · GW

I'm not sure that was ever a question. :3

Comment by arandur on Pascal's Mugging: Tiny Probabilities of Vast Utilities · 2011-08-19T22:16:56.721Z · score: 1 (3 votes) · LW · GW

... which doesn't solve the problem, but at least that AI won't be giving anyone... five dollars? Your point is valid, but it doesn't expand on anything.

Comment by arandur on Pascal's Mugging: Tiny Probabilities of Vast Utilities · 2011-08-19T22:12:55.722Z · score: 2 (6 votes) · LW · GW

I think the problem might lie in the almost laughable disparity between the price and the possible risk. A human mind is not capable of instinctively providing a reason why it would be worth killing 3^^^^3 people - or even, I think, a million people - as punishment for not getting $5. A mind that would value $5 as much as or more than the lives of 3^^^^3 people is utterly alien to us, and so we leap to the much more likely assumption that the guy is crazy.

Is this a bias? I'd call it a heuristic. It calls to my mind the discussion in Neal Stephenson's Anathem about pink nerve-gas-farting dragons. (Mandatory warning: fictional example.) The crux of it is, our minds only bother to anticipate situations that we can conceive of as logical. Therefore, the manifest illogicality of the mugging (why are 3^^^^3 lives worth $5; if you're a Matrix Lord, why can't you just generate $5 or, better yet, modify my mind so that I'm inclined to give you $5; etc.) causes us to anti-anticipate its truth. Otherwise, what's to stop you from imagining, as stated by Tom_McCabe2 (and mitchell_porter2, &c.), that typing the string "QWERTYUIOP" leads to, for example, 3^^^^3 deaths? If you imagine it, and conceive of it as a logically possible outcome, then regardless of its improbability, by your argument (as I see it), a "mind that worked strictly by Solomonoff induction" should cease to type that string of letters ever again. By induction, such a mind could cause itself to cease to take any action, which would lead to... well, if the AI had access to itself, likely self-deletion.
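The disparity the comment gestures at can be put in bare expected-utility terms. All numbers below are toy assumptions of mine: 10^100 stands in for 3^^^^3 (which no float can hold), and the stake-proportional discount is just one way of formalizing the "anti-anticipation" described above:

```python
# Naive expected-value comparison for the mugging. p_threat_real is an
# (assumed) probability that the mugger's threat is genuine.
p_threat_real = 1e-50
lives_claimed = 1e100  # stand-in for 3^^^^3

naive_ev = p_threat_real * lives_claimed  # expected lives saved per $5 paid
print(naive_ev > 1)  # True: the naive calculation says to pay up

# One way to formalize "anti-anticipating" absurd claims: discount the
# probability in proportion to the claimed stakes, so that ever-larger
# threats gain no expected-value leverage.
def discounted_ev(base_p, stakes):
    return (base_p / stakes) * stakes  # collapses back to base_p

print(discounted_ev(p_threat_real, lives_claimed))  # ~1e-50: no longer dominant
```

The sketch shows why the mugger "wins" any naive calculation: the claimed payoff can always be inflated faster than our skepticism grows, unless the probability assignment itself scales down with the size of the claim.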

That's my top-of-the-head theory. It doesn't really answer the question at hand, but maybe I'm on the right track...?

Comment by arandur on Is Morality Given? · 2011-08-18T15:41:16.383Z · score: 2 (4 votes) · LW · GW

"If morality exists independently of human nature, then isn't it a remarkable coincidence that, say, love is good?"

I'm going to play Devil's Advocate for a moment here. Anyone, please feel free to answer, but do not interpret the below arguments as correlating with my set of beliefs.

"A remarkable coincidence? Of course not! If we're supposing that this 'stone tablet' has some influence on the universe - and if it exists, it must exert influence, otherwise we wouldn't have any evidence wherewith to be arguing over whether or not it exists - then it had influence on our 'creation', whether (in order to cover all bases) we got here purely through evolution, or via some external manipulation as well. I should think it would be yet stranger if we had human natures that did not accord with such a 'stone tablet'."

Comment by arandur on Are Deontological Moral Judgments Rationalizations? · 2011-08-17T15:11:09.706Z · score: 2 (4 votes) · LW · GW

Yes, I've read through Yudkowsky's post on metaethics. I'm sorry if I made the point of this post insufficiently clear; please see the... cousin... to this comment.

Comment by arandur on Are Deontological Moral Judgments Rationalizations? · 2011-08-17T15:08:24.582Z · score: 6 (8 votes) · LW · GW

Reckon it's atop some mystical unassailable mountain on a windswept planet. That, or it doesn't exist. :P I'm well aware of the arguments against stone tablet morality. I had thought I'd made it clear above that this was an epiphany about my flawed mind-state, not about Actual Morality. Judging by the downvotes, I did not make this sufficiently clear.

Comment by arandur on Are Deontological Moral Judgments Rationalizations? · 2011-08-17T12:20:35.880Z · score: 10 (24 votes) · LW · GW

Wow. I've been guilty of this for a while, and not realized it. That "is this action morally wrong" question really struck me.

Myself, I believe that there is an objective morality outside humanity, one that is, as Eliezer would deride the idea, "written on a stone tablet somewhere". This may be an unpopular hypothesis, but accepting it is not a prerequisite for my point. When asked about why certain actions were immoral, I, too, have reached for the "because it harms someone" explanation... an explanation which I just now see as the sin of Avoiding Your Belief's Real Weak Points.

What I really believe, upon much reflection, is that there are two overlapping, yet distinct, classes of "wrong" actions: one we might term "sins", and the other we might term "social transgressions". Social transgressions are the class of acts which are punishable by society, usually those that are harmful. Sins are the class of acts which go against this Immutable Moral Law. Examples are given below, being (in the spirit of full disclosure) the first examples I thought of, and neither the purest examples nor the most defensible, non-controversial ones.

  • Spitting on the floor of an office building is a social transgression, but not a sin.
  • Homosexuality is a sin, but not a social transgression (insofar as it is accepted by society, which is more and more every day).
  • Murder is both a sin and a social transgression.

I do not know if this is a defensible position, but I now recognize it as a clearer form of what I believe than what I had previously claimed to believe.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-17T11:57:18.906Z · score: 1 (3 votes) · LW · GW

Ha! Now I feel like a noob. How do I edit a top-level post? :3

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-17T11:51:08.526Z · score: 1 (3 votes) · LW · GW

Apparently it can't, which is a good thing, upon reflection.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-17T01:07:07.823Z · score: 2 (4 votes) · LW · GW

I can confirm that hypothesis; I'm still at zero, even though the grandfather to this post has received 4 points, given after I lost all my karma. Actually, this is a bit of an annoyance; I have no way to gauge how far I have to go to get into the positives...

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T22:26:17.646Z · score: 0 (2 votes) · LW · GW

Thank you.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T20:45:14.280Z · score: 0 (2 votes) · LW · GW

Oh, good. :3 I was worried that doing so would give that false implication.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T20:20:40.373Z · score: 1 (3 votes) · LW · GW

I am still relatively new to LW, though - or else I'm just not very good at picking up on social values - so I'll ask this question of you: What stigma would be attached to my decision to delete this post? I don't want to do it just to get my Karma back; I'm willing to accept the consequences of my mistake. On the pro side, this would no longer come up under my posts, and so people who have not already seen it would fail to judge me by it. This is only a positive because I have in fact learned much from the response, and plan to act upon those lessons. On the con side, it might be viewed as... I almost want to say cowardly? Failing to take responsibility for my actions? Running away?

I'm not sure, though, what the implications of that action would be to the rest of the community, so I need an outside opinion.

EDIT: I recognize that it is good to recognize that I have made stupid decisions for bad reasons. I do not know if it is a virtue to keep your mistakes around and visible.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T16:13:42.410Z · score: 0 (4 votes) · LW · GW

Functional communities would be nice. I'm not so sure that better PR is the way to go. Why not no PR? Why not subtle induction via existing infrastructure? Let the people who most deserve to be here be the ones who will find us. Let us not go out with blaring trumpet, but with fishing lure.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T16:10:15.467Z · score: 9 (11 votes) · LW · GW

That's quite all right; I'm sure the naivete blossoming forth from the OP makes that an easy mistake to make. :P

I'm well aware of the Discussion Section... which only compounds my error. Yes, this should have been posted there. Losing some eighty Karma (by the way, apparently negative Karma does not exist per se, but perhaps it does de facto) is as good a wakeup call as any for the sin of overconfidence.

I would have traded my karma simply for the advice you've given here. Thank you. And thank you for the compliment on my writing style; nice to see not everything about this experience was negative. I assure you that I will not be leaving any time soon. When I first saw that this post was getting a negative response, I made a split-second decision: should I flee, or should I learn? I choose to learn.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T16:04:51.646Z · score: 6 (8 votes) · LW · GW

Your chastisement is well taken. Thank you.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T06:35:34.848Z · score: 4 (6 votes) · LW · GW

I'm being pulled off to bed, but from my skimming this looks like a very, very helpful critique. Thank you for posting it; I'll peruse it as soon as I'm able. One note: I did note after posting this, but too late to make a meaningful change, that "we should support cryonics less" is rather a ridiculous notion, considering the people I'm talking to are probably not the same people who are working hardest on cryonics. So: oops.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T06:32:24.583Z · score: 2 (4 votes) · LW · GW

..... I will meditate on this constructive criticism. Thank you very much; I think this is the most useful response I've seen.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T06:31:32.547Z · score: 2 (4 votes) · LW · GW

I do apologize if I've given offense; not having had the opportunity yet to attend, I used the broadest term I could conjure while maintaining applicability.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T06:30:47.439Z · score: 3 (5 votes) · LW · GW

Seconded. I actually found this very relevant, and quite a good point.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T06:30:17.218Z · score: 1 (3 votes) · LW · GW

Heh, I appreciate the mitigation.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T06:29:09.732Z · score: 2 (4 votes) · LW · GW

It seems pretty obvious that Eliezer's view is that FAI is the quick ticket to world domination...

I hadn't considered that, but now I see it clearly. How interesting.

Really? Your plan is to get people interested in world domination by guilting them?

Ha! If that would work, maybe it'd be a good idea. But no, pointing out a moral obligation is not the same as guilting. Guilting would be me messaging you, saying "See that poor starving African woman? If you had listened to my plan, she'd be happier." But I won't be doing that.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T06:27:05.219Z · score: 1 (5 votes) · LW · GW

That's quite true. But I have a hunch (warning: bare assertion) that much governmental negligence is due to a) self-interest and b) corruption (see: corrupt African dictatorships).

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T06:25:42.213Z · score: 1 (5 votes) · LW · GW

And yet confidence seems a good one. The question is how much is too much, which can really only be verified after the fact.

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T06:24:40.345Z · score: 4 (6 votes) · LW · GW

Huh. An interesting point, and one that I should have considered. So what would you suggest as a safety hatch?

Comment by arandur on The Goal of the Bayesian Conspiracy · 2011-08-16T06:23:14.386Z · score: 0 (8 votes) · LW · GW

Ha! Yes, I had this thought as well. I actually messaged Yudkowsky, warning him that I was considering posting this, on the off chance that a) the Conspiracy existed, b) he was among their ranks, and c) he wanted me to not stir up this possibility. I waited for a response for a period of time consistent with my estimation of the probability of the Conspiracy existing in an incarnation that would meaningfully object.

Comment by arandur on Theory of Knowledge (rationality outreach) · 2011-08-14T19:58:26.632Z · score: 2 (4 votes) · LW · GW

A link to said guide would be helpful, if such is available online.

EDIT: Is this what you're talking about? If so, I'm actually very excited. I see a great deal of consonance between the objectives of ToK and the objectives of this site. Perhaps we should turn ToK into a feeding pool for the Bayesian Conspiracy... >:3

Comment by arandur on Theory of Knowledge (rationality outreach) · 2011-08-14T16:19:13.461Z · score: 3 (5 votes) · LW · GW

I can't speak for ToK students specifically, since I myself did not take the IB course in high school, but I'll note that the greatest skills I've learned from this community have been a) how to state arguments clearly and effectively, e.g. not getting confused by words, and b) understanding how politics mindkills. I would love to present the Blue/Green sky dilemma, see what came of it... but only after introducing the meaning of truth and perhaps even the Litanies.

Comment by arandur on The Proper Use of Humility · 2011-08-13T20:13:27.161Z · score: 2 (4 votes) · LW · GW

Then perhaps I was incorrect in my accusation. I apologize that I'm not able to present my side more clearly; this happened a while ago, and the data is muddled.

Comment by arandur on The Proper Use of Humility · 2011-08-13T19:27:34.606Z · score: 1 (3 votes) · LW · GW

That is a good point, but the error comes in my statement of the problem, not in the argument. Otherwise, why would we ever give to charity, unless explicitly asked to? What would constitute "asking", anyway? Could we pass by a homeless man on the street and, as long as he didn't actually say anything to us, safely ignore his sign?

Comment by arandur on Exercises to do while alone · 2011-08-13T09:51:27.982Z · score: 1 (3 votes) · LW · GW

I'm in a similar boat; I work overnight at a well-known US gym for 8 hours, and the shift is so slow I'm allowed to pull out my laptop, books, phone, whatever I want. (No internet, though, except on my phone.)

That said, I recommend worldbuilding. I do it for my tabletop games, but you could just as easily do it in the modality of a novel. Of particular use would be the creation of the histories of countries and political systems; in this way, you could experiment with social conventions... perhaps some outside your own culture. You are, of course, limited by your imagination... but only by your imagination.

Comment by arandur on The Proper Use of Humility · 2011-08-13T06:50:56.313Z · score: 2 (4 votes) · LW · GW

I thank you for your caution, but my argument was actually non-Biblical in nature, and it was a proof by contradiction. Ran something like this:

So, you think that I should give away everything to those who ask for it, without exception?
Every resource I consume is a resource that is then unavailable for others who ask for it.
Therefore, in order to give away every resource I might have otherwise consumed, I must not consume any resources, and therefore die.
Your moral system prohibits suicide.
Therefore, your original proposition is inconsistent with your professed morality, QED.
Also therefore, get out of my house before I call the cops.

I apologize for the ambiguity; I did not mean to explicitly ascribe any moral valuation to committing suicide, though I should hope it could be inferred that I do not, in fact, advocate suicide. :P

As for "the homeless giving it back", why, to even ask would be selfish!

Comment by arandur on A Fable of Science and Politics · 2011-08-13T02:13:54.536Z · score: 27 (29 votes) · LW · GW

I once told a friend, "I think I'm a Daria, but I know the correct answer is Ferris". Then I realized the absurdity of that statement, and had much pondering to do.

Comment by arandur on The Modesty Argument · 2011-08-13T02:00:41.845Z · score: -3 (5 votes) · LW · GW

Sometimes arrogance is the mark of truth. History is filled with the blood of people who died for beliefs they held against all counterargument, and who were later vindicated.

Of course, history is also filled with the blood of people who died for erroneous beliefs.

Obviously, you should utilize the Modesty Argument iff your viewpoint is incorrect.

Comment by arandur on The Proper Use of Humility · 2011-08-13T01:52:19.688Z · score: 1 (5 votes) · LW · GW

Matthew 6:16-18:

16 Moreover when ye fast, be not, as the hypocrites, of a sad countenance: for they disfigure their faces, that they may appear unto men to fast. Verily I say unto you, They have their reward.
17 But thou, when thou fastest, anoint thine head, and wash thy face;
18 That thou appear not unto men to fast, but unto thy Father which is in secret: and thy Father, which seeth in secret, shall reward thee openly.

Sorry to spread my Christian-flavored ideas around, but it reminded me. :3 The old joke among me and my siblings, when I was growing up, was that we would proclaim ourselves to be "the humblest one" of us all. I thought it was a joke, until I grew up and interacted with people who actually adhered to a similar philosophy...

Very well-written post, sir. I greatly appreciate the ones where you take a common word or phrase, and reduce it to its proper and true state.

Actually, what this really reminds me of is a recent altercation between my roommate and me. The word at the heart of it was "selfishness"... my erstwhile roommate (subleaser, really) said that my wife's and my decision not to renew their lease was "selfish", because, apparently, in our religion we are supposed to give everything we have to anyone who asks for it. Logically, it can be well demonstrated that this does not follow; if we were to be "charitable" under this definition, we would give all our shelter and money to the starving and homeless, and die of starvation and exposure.

How strange the unflinching hypocrisy of mankind.

Comment by arandur on ...What's a bias, again? · 2011-08-13T01:36:44.768Z · score: 6 (8 votes) · LW · GW

Oh, how curious. I've been reading on here a while, and I think I had previously misunderstood the adopted meaning of the word "bias"... using the term as it's socially used, that is to say, a prior reason for holding a certain belief over another due to convenience. A judge might be biased because one side is paying him; a jury member might be biased because their sister is the one on trial. Are these "mistakes"? Or do they fall under a certain type of cognitive bias that is similar among all humans? *ponder*

Comment by arandur on Why truth? And... · 2011-08-13T01:20:41.417Z · score: 2 (6 votes) · LW · GW

Here's an interesting take on the "morality" side: It may be morally incumbent on some to look behind the curtain, and not for others. Since knowing about biases can hurt people, it may well be that those who are "fit" to look behind the curtain are in fact required to be the guardians of said curtain, forbidding anyone without the proper light and knowledge from looking behind it, but acting upon the knowledge gained for the benefit of society.

..... Hence, the Conspiracy.