Why Truth?

post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2006-11-27T01:49:28.000Z · LW · GW · Legacy · 53 comments

The goal of instrumental rationality mostly speaks for itself. Some commenters have wondered, on the other hand, why rationalists care about truth. Which invites a few different answers, depending on who you ask; and these different answers have differing characters, which can shape the search for truth in different ways.

You might hold the view that pursuing truth is inherently noble, important, and worthwhile. In which case your priorities will be determined by your ideals about which truths are most important, or about when truthseeking is most virtuous.

This motivation tends to have a moral character to it. If you think it your duty to look behind the curtain, you are a lot more likely to believe that someone else should look behind the curtain too, or to castigate them if they deliberately close their eyes.

I tend to be suspicious of morality as a motivation for rationality, not because I reject the moral ideal, but because it invites certain kinds of trouble. It is too easy to acquire, as learned moral duties, modes of thinking that are dreadful missteps in the dance.

Consider Spock, the naive archetype of rationality. Spock's affect is always set to “calm,” even when wildly inappropriate. He often gives many significant digits for probabilities that are grossly uncalibrated.1 Yet this popular image is how many people conceive of the duty to be “rational”—small wonder that they do not embrace it wholeheartedly.

To make rationality into a moral duty is to give it all the dreadful degrees of freedom of an arbitrary tribal custom. People arrive at the wrong answer, and then indignantly protest that they acted with propriety, rather than learning from their mistake.

What other motives are there?

Well, you might want to accomplish some specific real-world goal, like building an airplane, and therefore you need to know some specific truth about aerodynamics. Or more mundanely, you want chocolate milk, and therefore you want to know whether the local grocery has chocolate milk, so you can choose whether to walk there or somewhere else.

If this is the reason you want truth, then the priority you assign to your questions will reflect the expected utility of their information—how much the possible answers influence your choices, how much your choices matter, and how much you expect to find an answer that changes your choice from its default.
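
The expected-utility framing above can be made concrete with a toy calculation. A minimal sketch in Python, with purely illustrative numbers (the utilities, costs, and the 0.6 prior below are assumptions, not from the post):

```python
# Toy value-of-information calculation (illustrative numbers only).
# Decision: walk to the local store, or to a farther store that always has
# chocolate milk. Checking the local store's stock first is only worth doing
# if the answer could change which store you walk to.

p_has_milk = 0.6             # prior: chance the local store stocks chocolate milk
u_milk = 10                  # utility of getting chocolate milk
cost_near, cost_far = 1, 3   # walking costs

# Expected utility of each action without any information:
eu_near = p_has_milk * u_milk - cost_near   # gamble on the local store
eu_far = u_milk - cost_far                  # guaranteed milk, longer walk
eu_default = max(eu_near, eu_far)

# With perfect information you pick the best action in each world:
eu_informed = (p_has_milk * (u_milk - cost_near)          # milk there: go near
               + (1 - p_has_milk) * (u_milk - cost_far))  # not there: go far

voi = eu_informed - eu_default
print(f"default EU={eu_default}, informed EU={eu_informed:.1f}, VOI={voi:.1f}")
```

With these numbers, finding out first is worth about 1.2 utility points, so paying any information cost smaller than that (a phone call, say) would be rational.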

To seek truth merely for its instrumental value may seem impure—should we not desire the truth for its own sake?—but such investigations are extremely important because they create an outside criterion of verification: if your airplane drops out of the sky, or if you get to the store and find no chocolate milk, it's a hint that you did something wrong. You get back feedback on which modes of thinking work, and which don't.

Another possibility: you might care about what's true because, damn it, you're curious.

As a reason to seek truth, curiosity has a special and admirable purity. If your motive is curiosity, you will assign priority to questions according to how the questions, themselves, tickle your aesthetic sense. A trickier challenge, with a greater probability of failure, may be worth more effort than a simpler one, just because it's more fun.

Curiosity and morality can both attach an intrinsic value to truth. Yet being curious about what's behind the curtain is a very different state of mind from believing that you have a moral duty to look there. If you're curious, your priorities will be determined by which truths you find most intriguing, not most important or most useful.

Although pure curiosity is a wonderful thing, it may not linger too long on verifying its answers, once the attractive mystery is gone. Curiosity, as a human emotion, has been around since long before the ancient Greeks. But what set humanity firmly on the path of Science was noticing that certain modes of thinking uncovered beliefs that let us manipulate the world—truth as an instrument. As far as sheer curiosity goes, spinning campfire tales of gods and heroes satisfied that desire just as well, and no one realized that anything was wrong with that.

At the same time, if we're going to improve our skills of rationality, go beyond the standards of performance set by hunter-gatherers, we'll need deliberate beliefs about how to think—things that look like norms of rationalist “propriety.” When we write new mental programs for ourselves, they start out as explicit injunctions, and are only slowly (if ever) trained into the neural circuitry that underlies our core motivations and habits.

Curiosity, pragmatism, and quasi-moral injunctions are all key to the rationalist project. Yet if you were to ask me which of these is most foundational, I would say: “curiosity.” I have my principles, and I have my plans, which may well tell me to look behind the curtain. But then, I also just really want to know. What will I see? The world has handed me a puzzle, and a solution feels tantalizingly close.

1 E.g., “Captain, if you steer the Enterprise directly into that black hole, our probability of surviving is only 2.234%.” Yet nine times out of ten the Enterprise is not destroyed. What kind of tragic fool gives four significant digits for a figure that is off by two orders of magnitude?


Comments sorted by oldest first, as this post is from before comment nesting was available (around 2009-02-27).

comment by Doug_S. · 2008-01-30T21:18:27.000Z · LW(p) · GW(p)

Yet nine times out of ten the Enterprise is not destroyed. What kind of tragic fool gives four significant digits for a figure that is off by two orders of magnitude?

One who doesn't understand the Million To One Chance principle that operates in fictional universes. If the Star Trek universe didn't follow the laws of fiction, the Enterprise would have been blown up long ago. ;)

See also: Straw Vulcan

Replies from: None, tlhonmey
comment by [deleted] · 2011-01-31T03:49:20.658Z · LW(p) · GW(p)

Maybe in ninety-eight universes out of 100 it does blow up and we just see the one that's left; and he's actually giving an accurate number. :P

Replies from: shokwave
comment by shokwave · 2011-01-31T13:00:02.625Z · LW(p) · GW(p)

The TV show version of the anthropic principle: all the episodes where the Enterprise does blow up aren't made.

Replies from: Raemon
comment by Raemon · 2011-11-23T06:52:14.118Z · LW(p) · GW(p)

Except one.

comment by tlhonmey · 2020-10-14T21:20:39.092Z · LW(p) · GW(p)

In the "Star Trek: Judgement Rites" game there's a spot where Spock gives ridiculously precise odds, and Kirk comments that they seem "better than usual."  Spock then clarifies that he has begun factoring Kirk's history of prevailing when the odds are against him into the calculations.

And do keep in mind that the audience doesn't necessarily see all the times that low-odds plans don't work out.

comment by Alex2 · 2008-09-12T17:59:03.000Z · LW(p) · GW(p)

Does this sentence contain a typo?

"If the iron approaches your face, and you believe it is hot, and it is cool, the Way opposes your fear. If the iron approaches your face, and you believe it is cool, the Way opposes your calm."

Replies from: happyseaurchin
comment by happyseaurchin · 2009-12-28T16:42:57.399Z · LW(p) · GW(p)

I like this, but have no idea what it means, since the determiner "this" has a split reference: either the sentence it is embedded in or the following one...

I can't spot the mistake in either :s

comment by RickJS · 2010-05-24T01:24:32.374Z · LW(p) · GW(p)

Thanks, Eliezer!

“Are there motives for seeking truth besides curiosity and pragmatism?”

I can think of several that have shown up in my life. I’m offering these for consideration, but not claiming these are good or bad, pure or impure etc. Some will doubtless overlap somewhat with each other and the ones stated.

  1. As a weapon. Use it to win arguments (sometimes the point of an argument is to WIN, never mind learning the truth. I've got automatic competitiveness I need to keep on a short leash). Use it to win bar room bets. Acquire knowledge about the “buttons” people have, and use it to manipulate them. Use it to thwart opposition to my plans, however sleazy. (“What are we going to do tonight, Brain?” ... )
  2. As evidence that I deserve an A in school. Even if I never have a pragmatic use for the knowledge, there is (briefly) value in demonstrably having the knowledge.
  3. As culture. I don’t think I have ever found a practical use for the facts of history ( of science, of politics, or of art ), but they participated in shaping my whole world view. Out of that, I came out of retirement and dedicated myself to saving humanity. Go figure.
  4. As a contact, as in, “I know Nick Bostrom.” (OK, that’s a bit of a stretch, but it is partly informational.)
  5. As pleasure & procreation, as in, “Cain knew his wife.” ;-)

“To make rationality into a moral duty is to give it all the dreadful degrees of freedom of an arbitrary tribal custom. People arrive at the wrong answer, and then indignantly protest that they acted with propriety, rather than learning from their mistake.” Yes. I say, “Morality is for agents that can’t figure out the probable consequences of their actions.” Which includes me, of course. However, whenever I can make a good estimate, I pretty much become a consequentialist.

Seeking knowledge has, for me, an indirect but huge value. I say: Humanity needs help to survive this century, needs a LOT of help. I think Friendly AI is our best shot at getting it. And we’re missing pieces of knowledge. There may be whole fields of knowledge that we’re missing and we don’t know what they are.

I would not recommend avoiding lines of research that might enable making terribly powerful weapons. We’ve already got that problem, there’s no avoiding it. But there’s no telling what investigations will produce bits of information that will trigger some human mind into a century-class breakthrough that we had no idea we needed.

comment by langej · 2010-07-14T11:56:21.625Z · LW(p) · GW(p)

The significant digit anecdote reminds me: why does the Dow Jones give its average to two decimal places?

I do have a couple of problems, though

1) It is written: "The first virtue is curiosity." - Written by whom?
2) …curiosity is an emotion… - says who?
3) To seek truth merely for its instrumental value may seem impure… – Why? To whom?
4) If we want the truth, we can most effectively obtain it by thinking in certain ways – and if you think the way I tell you to think, you’ll wind up with my truth

Replies from: RichardKennaway
comment by RichardKennaway · 2010-07-14T12:39:09.736Z · LW(p) · GW(p)

It is written: "The first virtue is curiosity." - Written by whom?

By Eliezer.

comment by simplicio · 2010-08-16T00:30:12.136Z · LW(p) · GW(p)

From TVtropes:

Star Trek The Original Series episode "This Side of Paradise". Mr. Spock has been affected by spores that release his emotional side. He and his love interest Leila Kalomi are looking at clouds.

Spock: That one looks like a dragon. You see the tail and the dorsal spines?

Leila: I've never seen a dragon.

Spock: I have. On Berengaria 7. But I've never stopped to look at clouds before. Or rainbows. I can tell you exactly why one appears in the sky, but considering its beauty has always been out of the question.


Replies from: jwoodward48
comment by jwoodward48 · 2017-03-05T18:53:56.507Z · LW(p) · GW(p)

I know! Is the world not more beautiful when one can understand how it works?

comment by JohnDavidBustard · 2010-09-01T12:34:45.633Z · LW(p) · GW(p)

Let's not forget arguably the most important reason.

Because it makes us feel good.

We can feel superior to others, because we can do something that few other people can. We can collect instances where our approach is beneficial and use that to validate our self worth. And we can form a community that validates our strengths and ignores our weaknesses. All perfectly reasonable motivations (provided our satisfaction is a reasonable goal).

In my own field (Computer Vision), there are those who pursue it rationally (with rigorous mathematical analysis) and those who pursue it heuristically (creating a variety of systems and testing them on small samples). These approaches seem to mirror the determined search for truth and the pragmatic "go with what feels like it works" approaches. Without rigorously analysing them (although this may be possible) both approaches seem to deliver benefit with no clear winner in terms of delivering approaches that are practically applied or used as the basis for further work. I think it is interesting to apply this meta analysis to reason, i.e. can we scientifically determine whether approaching problems reasonably conveys advantage? Is there an optimal balance?

Replies from: RobinZ, Nisan
comment by RobinZ · 2010-09-01T16:21:31.566Z · LW(p) · GW(p)

By "most important reason" do you mean "most compelling justification" or "predominant cause"?

Replies from: JohnDavidBustard
comment by JohnDavidBustard · 2010-09-01T16:53:13.112Z · LW(p) · GW(p)

I would suggest both, and I would add that I don't think this inherently diminishes the value of pursuing truth. I am increasingly of the belief that in order to be content it is necessary to pick one's community and embrace its values. What I love about this community is its willingness to question itself as much as the views of others. I think it's useful to acknowledge what we really enjoy and be hesitant of explanations that attribute objective value to enjoyable activities. Doing so risks erasing self doubt and can lead to the adoption of strong moral values that distort our lives to such an extent that they ultimately make us miserable.

comment by Nisan · 2010-09-01T17:13:15.129Z · LW(p) · GW(p)

there are those who pursue it rationally (with rigorous mathematical analysis) and those who pursue it heuristically (creating a variety of systems and testing them on small samples). [...] both approaches seem to deliver benefit with no clear winner

"Rationality" is what I would call the meta-analysis which concludes that both approaches are equally valid in this field.

comment by [deleted] · 2011-01-02T13:03:08.467Z · LW(p) · GW(p)

"For this reason, I would also label as "morality" the belief that truthseeking is pragmatically important to society, and therefore is incumbent as a duty upon all."

Morality doesn't need to have anything to do with society or duty. Consider the case of a rational ethical egoist, to whom acting in one's self-interest and for one's own values is virtuous.

Replies from: thomblake
comment by thomblake · 2011-05-12T22:26:42.481Z · LW(p) · GW(p)

Morality doesn't need to have anything to do with society or duty. Consider the case of a rational ethical egoist, to whom acting in one's self-interest and for one's own values is virtuous.

If that person is a human, and thinks that ethical egoism does not have anything to do with society or duty, then they are mistaken.

Replies from: None
comment by [deleted] · 2011-05-13T17:34:50.016Z · LW(p) · GW(p)


Replies from: CuSithBell, thomblake
comment by CuSithBell · 2011-05-13T17:37:21.317Z · LW(p) · GW(p)

I'd guess because humans often contain concepts of duty and the like, and have experiences vastly contingent on social / societal contexts.

comment by thomblake · 2011-05-13T21:06:34.779Z · LW(p) · GW(p)

Maintaining interpersonal relationships is vital to the human condition. As Aristotle put it, "The solitary life is perhaps suitable for a god or a beast, but not for a man". Friendships are a necessary part of flourishing for humans, and aside from that we are almost always in a context where our success depends upon our interactions with others.

comment by Alexandros · 2011-04-20T19:22:50.758Z · LW(p) · GW(p)

There is more discussion of this post here as part of the Rerunning the Sequences series.

comment by omeganaut · 2011-05-12T15:24:46.928Z · LW(p) · GW(p)

I'll be honest, I have a serious problem with hypocrites, and so I warn everyone I know if they start heading down that path. In your article, you say that morality is perhaps the most suspect motivation for rationality. Yet, you yourself, by putting up these articles and arguing that everyone should use rational thought, seem to have a moral motivation for rationality. I am not saying that this is your only motivation, but it seems to be the motivation behind these posts. However, I do appreciate that you respect morality by mentioning how important it is in pursuing paths that will not result in horrible consequences. I think that maybe you should allow yourself to admit that morality is a good motivator if used with other good types of motivations to seek truth.

comment by Arandur · 2011-08-13T01:20:41.417Z · LW(p) · GW(p)

Here's an interesting take on the "morality" side: It may be morally incumbent on some to look behind the curtain, and not for others. Since knowing about biases can hurt people, it may well be that those who are "fit" to look behind the curtain are in fact required to be the guardians of said curtain, forbidding anyone without the proper light and knowledge from looking behind it, but acting upon the knowledge gained for the benefit of society.

..... Hence, the Conspiracy.

comment by lowasser · 2012-05-05T21:54:53.294Z · LW(p) · GW(p)

I am trying to win an argument, and I am having trouble defeating the following claim:

It can, under certain scenarios, be instrumental (in the sense of achieving values) to believe in something which is false -- usually by virtue of a placebo effect. For example: believing you are more likely to get a job offer than you really are, so you are confident at the job interview.

The counterargument I want to make, in my head, is that if you have the ability to deceive yourself to that extent -- to make yourself believe something that is false -- then you have the ability to believe that you won't get the job, but pretend that you think you will. I don't feel like that's a very solid or reassuring argument, though.

Replies from: TimS
comment by TimS · 2012-05-05T23:38:56.169Z · LW(p) · GW(p)

I think the best response to the argument for instrumentally useful false beliefs is to think a little about the causal mechanism. Surely it is not the case that Omega reads your minds, sees your false confidence, and orders you hired.

As you noted, a more plausible mechanism is that the false confidence causes changes in affect (i.e. appearing confident) that are beneficial for the task. Or perhaps false over-confidence cancels false under-confidence that would have caused anxiety that would be detrimental for the task.

Once the causal chain is examined, the next thing to ask is whether the beneficial intermediate effects can be caused by something other than false belief. If so, you have answered the claim you are responding to. If not, the belief that this is impossible itself needs examining.

Replies from: AspiringRationalist
comment by AspiringRationalist · 2012-06-21T06:20:57.216Z · LW(p) · GW(p)

You should also examine the costs of each method of achieving the intermediate effects. Even if there are other ways available, maybe self-deception is the easiest, and the costs of that particular incorrect belief are small.

comment by royf · 2012-05-28T04:46:45.702Z · LW(p) · GW(p)

"If the iron approaches your face, and you believe it is hot, and it is cool, the Way opposes your fear. If the iron approaches your face, and you believe it is cool, and it is hot, the Way opposes your calm."

This quote conflates "true beliefs" and what we may call "correct beliefs". True beliefs are ones which assign high probability to the truth, i.e. the actual state of things. Correct beliefs are ones which follow from an agent's priors and observations. The former are objective, the latter subjective but not irrational. If the iron has been cool the last 107 times it has approached your face, but hot this 108th time, your belief that it is cool is correct but false (perhaps better terms are needed).

Also, a belief is not binary. You may be 99.8% sure that the iron is cool and still rationally fear it. A hot iron on your face is far more costly than a needless avoidance.
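
Reading the example as being 99.8% sure the iron is cool, the expected-cost asymmetry can be sketched in a few lines (all of the cost numbers are illustrative assumptions):

```python
# Asymmetric costs make avoidance rational even at high confidence.
# Illustrative numbers only.
p_hot = 0.002          # 99.8% sure the iron is cool
cost_burn = 10_000     # badly burned face
cost_avoid = 1         # mild inconvenience of flinching away

expected_cost_touch = p_hot * cost_burn  # 0.2% chance of a huge loss
if expected_cost_touch > cost_avoid:
    print("flinch")    # rational fear despite 99.8% confidence
```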

comment by royf · 2012-05-28T04:53:28.104Z · LW(p) · GW(p)

There's an interesting duality between morality as "the belief that truthseeking is pragmatically important to society" and morality as the result of social truthseeking, which is closer to the usual sense, or rather what the usual sense would ideally be. I'd like to see this explored further if anyone has a link in mind.

Replies from: MarsColony_in10years
comment by MarsColony_in10years · 2015-02-19T04:05:06.451Z · LW(p) · GW(p)

The LessWrong FAQ indicated that there is value in replying to old content, so I'm posting anyway. Context might be in order, so here's what we are talking about:

I tend to be suspicious of morality as a motivation for rationality

You and I had a similar take on this bit of Yudkowsky's post. Maybe you would call my stance "truthseeking as the result of morality" instead of your "morality as the result of social truthseeking".

The problem Yudkowsky is describing sounds like it comes from entangling the "logical" archetype with "morality". This means any behavior which differs from this archetype becomes "immoral", regardless of whether it is actually Bayesian reasoning or not. Personally, I would phrase this as "declaring rationality to be (a) moral value". This specifically excludes cases where people place intrinsic value on some specific result, and then place instrumental moral value on rationality, as a tool to achieve the desired results. This is much like what effective altruism is doing, after all.

Replies from: viktor-riabtsev-1
comment by Viktor Riabtsev (viktor-riabtsev-1) · 2018-10-10T22:23:21.161Z · LW(p) · GW(p)

LessWrong FAQ

Hmm, couldn't find a link directly on this site. Figured someone else might want it too (although a google search did kind of solve it instantly).

comment by aceofspades · 2012-06-27T19:05:09.754Z · LW(p) · GW(p)

I'm not convinced that this post actually says anything. If seeking the truth is useful for any specific reason, then people who see some benefit from it will do so and if it isn't useful then they won't. Actually writing this out has made me think both this post and my comment haven't really said much, but I think that's because this discussion is too abstract to have any real use/meaning. Ideas which are true/work will work, ideas that aren't won't, and that's all that needs to be said, never mind this business about rationality and truth and curiosity.

Replies from: thomblake, CephasAtheos
comment by thomblake · 2012-06-27T19:25:04.401Z · LW(p) · GW(p)

that's all that needs to be said

Would that this were true.

Indeed, if that were all there was to it, nothing would need to be said at all, as that's a tautology. But people manage to fail at noticing when things do / don't work anyway, and false ideas stick around a very long time.

Replies from: aceofspades
comment by aceofspades · 2012-07-05T18:31:46.753Z · LW(p) · GW(p)

I just find it very unlikely that the specifics of how this post is constructed have much of an effect on correcting this issue.

comment by CephasAtheos · 2012-12-22T23:09:35.932Z · LW(p) · GW(p)

Ah, but the seeker needs to find out if the answer - the truth - is beneficial. If you don't know the truth, you can't really make a decision at all. That's just guessing.

My friend argues that believing in an afterlife (i.e. religion) is beneficial for some people because it gives them a (patently false!) sense of "security". So why tell them it's wrong to believe such a thing?

My answer is a) the fact that there's no afterlife is the truth, as far as humans know (i.e. as far as the evidence - or lack of evidence - shows); and b) it's wrong to believe in such a falsehood - in the sense that most people with such a belief tend to be either less ethical/moral (because they'll fix up the imbalance 'later'), or irrationally over-moral or hyper-ethical because they don't want to risk their slot in eternity's gravy train. Either way, they act irrationally and abnormally, and for the wrong reasons!

I can't think of much in life that could be worse than that. What a horrible life!

comment by [deleted] · 2014-06-22T23:32:11.783Z · LW(p) · GW(p)

It is instructive to review this essay after reading the sequence regarding metaethics, morality, and planning algorithms. It gives a deeper insight into how "morality as a motivation" might have come about and what its flaws are.

comment by MarsColony_in10years · 2015-02-19T04:17:31.034Z · LW(p) · GW(p)

"our probability of surviving is only 2.234%" Yet nine times out of ten the Enterprise is not destroyed.

Perhaps this is a minor nitpick or technicality, but that's probably not the best example, because keeping the same probability estimate actually makes sense in this instance. To alter it would be a form of survivorship bias. This is because there is no way he could have observed the opposite during the previous 10 attempts, since he would no longer be alive to have those memories if he had.

comment by Eddie_T · 2017-07-11T17:17:03.529Z · LW(p) · GW(p)

"For this reason, I would also label as "morality" the belief that truthseeking is pragmatically important to society..."

This seems like a naive understanding of what morality is. It seems like you are referring to a certain subset of ethics, in this case utilitarianism (do what promotes the greatest good among the greatest number). But this is just one part of a class of normative ethical theories. The class I'm referring to is consequentialism, where, essentially, the end justifies the means. I'd rather not get off topic here and simply state that a morality-driven pursuit of truth does not necessarily mean that the person is motivated by the "greater good".

Also, Spock's calculation is off by one order of magnitude, not two. He predicts, roughly, a 98% chance of destruction yet you say in practice, the Enterprise is destroyed 10% of the time. That's just about one order of magnitude off.

Replies from: Jiro, TheWakalix
comment by Jiro · 2017-07-11T20:15:18.496Z · LW(p) · GW(p)

Remember that that's an 11-year-old post you're replying to.

Replies from: Duncan_Sabien
comment by Duncan_Sabien · 2017-07-15T20:12:04.337Z · LW(p) · GW(p)

Hey, eleven-year-old posts are just posts that lack life experience.

comment by TheWakalix · 2019-02-19T18:04:46.713Z · LW(p) · GW(p)

I think you're misinterpreting Yudkowsky. He's not saying that all ethics is pragmatic. He's saying that pragmatics is ethics. Previously in the paragraph, he listed other, non-pragmatic ethical reasons to seek truth.

As for the orders of magnitude, it's log(.9) - log(.02234) = 1.6 orders of magnitude. That's closer to 2 than to 1.
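
The arithmetic checks out. A quick sketch:

```python
import math

# Spock's stated survival probability vs. the observed survival rate.
p_predicted = 0.02234   # "our probability of surviving is only 2.234%"
p_observed = 0.9        # "nine times out of ten the Enterprise is not destroyed"

orders = math.log10(p_observed) - math.log10(p_predicted)
print(f"{orders:.2f} orders of magnitude")  # ≈ 1.61
```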

comment by Aerium · 2018-05-01T07:36:51.508Z · LW(p) · GW(p)

"Curiosity, as a human emotion, has been around since long before the ancient Greeks."

Is that a reference to Pandora's Box or am I off base?

Replies from: Elo
comment by Elo · 2018-05-01T18:27:03.572Z · LW(p) · GW(p)

Yes it is.

comment by Viktor Riabtsev (viktor-riabtsev-1) · 2018-10-03T20:28:17.734Z · LW(p) · GW(p)

I am guessing that the link "what truth is." is meant to be http://yudkowsky.net/rational/the-simple-truth

Replies from: habryka4
comment by habryka (habryka4) · 2018-10-03T21:08:18.859Z · LW(p) · GW(p)

Thanks, fixed as well!

comment by Дмитрий Зеленский (dmitrii-zelenskii) · 2019-08-16T12:19:02.894Z · LW(p) · GW(p)

Please restore apostrophes...

"our probability of surviving" - probably extrapolated from other similar objects going through black holes. Enterprise, because fictional laws, eschews the odds, but it may only mean that some other ships get destroyed even somewhat more frequently and Enterprise has "five points... for sheer dumb luck!"

Replies from: masha
comment by masha · 2020-01-25T18:41:13.416Z · LW(p) · GW(p)

I think its great that the apostrophes were left out. Apart from possessive apostrophes, which I think should be used, apostrophes are an extra effort (especially when texting) that add no extra meaning or clarification.

Replies from: dmitrii-zelenskii
comment by Дмитрий Зеленский (dmitrii-zelenskii) · 2020-02-15T18:14:27.916Z · LW(p) · GW(p)

I mean, there are minimal pairs (mostly in cases where possessive apostrophes are for some reason not used, like its - it's, who's - whose). But overall it just helps readability (speaking as a non-native).

comment by Nikolaus Hansen (nikolaus-hansen) · 2019-11-24T14:25:19.056Z · LW(p) · GW(p)

I am not sure, but there seem to be a couple of apostrophes missing in the sentence

[...] if were going to improve our skills of rationality, go beyond the standards of performance set by hunter-gatherers, well need deliberate beliefs [...]

comment by masha · 2020-01-25T19:24:08.817Z · LW(p) · GW(p)

Truth is important because it is instrumental to all areas of life. By increasing our overall epistemic rationality, we will understand the world better, and so be able to act (or withhold action) in ways that increase our quality of life. Without epistemic rationality, instrumental rationality may be incoherent and misdirected, seeking goals that are counterproductive to the agent's and/or common wellbeing. For example, a person might highly value outcome X, and practice instrumental rationality to achieve that outcome. However, if they had a better understanding of epistemic rationality, they might no longer value outcome X and instead more highly value different outcomes. Epistemic rationality allows us to "optimize" our values.

Optimizing our values and behaviour increases common wellbeing, therefore I think truth seeking and epistemic rationality is a moral imperative for everyone. I believe that the desire for increased wellbeing is actually the most important reason for truth seeking, and since it affects everyone, it is a moral/civil duty.

comment by michael_dello · 2020-05-06T22:22:14.452Z · LW(p) · GW(p)

Regarding the Spock probability reference, I've always imagined that TV shows and movies either take place in the parallel universe where very specific events happen to take place (e.g. the universe where the 'bad guys' miss the 'good guys' with all of their bullets despite being trained soldiers), or in the case of the Enterprise, the camera follows the adventures of the one ship that is super lucky. Perhaps the probability of survival really is 2.234 %, the Enterprise is just the 1 in 1,000 ship that keeps surviving (because who wants the camera to follow those other ships?).
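
This selection effect is easy to simulate. A minimal sketch (the per-ship survival probability is taken from Spock's figure; the fleet size and single-encounter setup are illustrative assumptions):

```python
import random

# Survivorship sketch: each ship independently survives one dangerous
# encounter with Spock's stated probability. The "show" only ever follows
# a ship that survived, so the on-screen survival rate is 100% no matter
# how small p is. Illustrative simulation, not from the comment.
random.seed(0)
p_survive = 0.02234
ships = 100_000

survivors = sum(random.random() < p_survive for _ in range(ships))
print(f"{survivors} of {ships} ships survive (~1 in {ships // survivors})")
# The camera follows one of the survivors; its observed record is all wins.
```

Whatever p is, every ship the camera could possibly follow has a perfect survival record, so the filmed ship's history tells us nothing against Spock's estimate.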

comment by toothpaste · 2021-01-29T01:53:53.675Z · LW(p) · GW(p)

Most apostrophe removals didnt cause any problems, but the "were" in the paragraph before the last one had me confused for a split second.