Issues with the Litany of Gendlin

post by Raemon · 2011-12-10T05:25:42.810Z · LW · GW · Legacy · 81 comments


I think I have problems with this:

Litany of Gendlin

What is true is already so.
Owning up to it doesn't make it worse.
Not being open about it doesn't make it go away.

And because it's true, it is what is there to be interacted with.
Anything untrue isn't there to be lived.
People can stand what is true,
for they are already enduring it.

 

Do you actually think that's true?

I honestly don't think I do. I think there are horrible truths that can wreck your life if you're not prepared to deal with them. I think it may *usually* be best if you self-modify to be able to handle them, so that you don't run into trouble later. But to say that owning up makes NO difference ignores the fact that your emotional reaction to things is ALSO part of reality.

I like the idea behind it but I don't think I can really endorse it. I'm struggling because I'd like to incorporate it into my project, but it feels too wrong. And while I'm okay with chopping up lengthy sequence posts so they can be read out loud, rewriting this to match my beliefs... well, it's not exactly a crime against humanity but it's technically not the Litany of Gendlin anymore, which ruins some ritual-oomph. (And the part that I'd most want to change is the last two lines, which are the most powerful part)

Ideally it would communicate: "Lying to yourself will eventually screw you up worse than getting hurt by a truth," instead of "learning new truths has no negative consequences."

This distinction is particularly important when the truth at hand is "the world is a fundamentally unfair place that will kill you without a second thought if you mess up, and possibly even if you don't."

EDIT TO CLARIFY: The person who goes about their life ignoring the universe's Absolute Neutrality is very fundamentally NOT already enduring this truth. They're enduring part of it (arguably most of it), but not all. Thinking about that truth is depressing for many people. That is not a meaningless cost. Telling people they should get over that depression and make good changes to fix the world is important. But saying that they are already enduring everything there is to endure seems to me a patently false statement, and makes your argument weaker, not stronger.

Potential change I can think of that doesn't wreck it too much and keeps it similar enough that I don't feel too bad: "Not owning up to it will only make things worse." Artistically I think it might be better to change the wording to something like "Refusing to admit it will only make things worse," but then the change becomes big enough that I feel kinda wrong again.

Maybe refer to it as Litany of Gendlin', to distinguish it while staying classy.

SECOND EDIT: It's become pretty clear, looking at the collection of comments, that Typical Mind Fallacy is at work here. Some people value truth and emotional response differently. My problem is that a) *I* value emotional response as the end, and my preference for truth, while extremely useful, is only there to facilitate emotional response in myself and others. b) I know there will be other people at the event in question who share my position.

In any case, I'd like advice from the people who believe the Litany is inaccurate (or at least are able to model people who believe that) on how to handle the situation.

81 comments

Comments sorted by top scores.

comment by [deleted] · 2012-01-23T01:52:45.815Z · LW(p) · GW(p)

Disclaimer: This comment contains a personal story.

I feel like I can weigh in on this since I've recently used Gendlin's Litany to cope with a rough event in my life. Without it, I think I would have been considerably worse-prepared and consequently would have had a very strongly negative emotional response for far longer than I did.

In October of 2011, my partner of six years broke up with me. The main reason she wanted to break up was because she had decided that she was a lesbian, and wasn't interested in having a romantic or sexual relationship with me (a man) any more.

It was incredibly tempting to discount her statements. There are plenty of rationalizations (which any LWer should know are ways of bludgeoning reality into your broken hypothesis) out there for why a lesbian woman of our age group isn't actually lesbian. I actually did this for a few weeks to a month.

When I started being able to want to get back in touch with reality (to even use these litanies, one has to want to use them; I learned during this time of my life that this is a non-trivial precondition), I started thinking of Gendlin's Litany. By remembering it, I was able to realize that: 1.) my ex-partner is indeed gay, 2.) not believing that doesn't make her any less gay, and 3.) she has always been gay and I had, in fact, been enduring it.

Owning up to it did not make it worse. My partner remained exactly as gay as she was before I updated. Owning up to it did change my strategy to one I think is much better in the long term. I am fairly certain I became more happy after I successfully updated, if anything.

I know this is an old post, but I hope you see this comment. For what it's worth, I think you're fairly correct in your analysis, but maybe you should question the source of this particular cached thought.

comment by Crux · 2011-12-10T08:06:02.714Z · LW(p) · GW(p)

As always, it simply depends on your utility function. If you consider avoiding short-term emotional pain as an end in itself, it would of course be in your best interest to engage in various self-deceptive strategies etc.

The users on Less Wrong may well be drastically less likely to have that sort of utility function than the general population, but that doesn't in any way detract from the obvious fact that a utility function can in fact include an ultimate aversion to short-term emotional pain, and there are in fact an absolute ton of people like that.

So can people stand what's true because they're already enduring it? Wait, already enduring what? For somebody like the one I described above (one whose goal set contains an ultimate aversion to short-term emotional pain), the emotional pain itself is something to endure.

In other words, avoiding thinking about fact A doesn't allow you to not endure A (because of course A will be present whether or not you think about it), but there is in fact something that not thinking about it will do, and that's let you not endure the emotional pain, which may well be extremely important for your utility function.

Tetronian said that the Litany basically says it's silly to refuse to update your map because you're afraid of what you may find, for what's in the territory is already there whether or not you know it. Sure, but for some people the emotions themselves are part of the territory. It's not that they're afraid to update their map; it's that they're afraid to change one section of the territory (their belief structure) because it may make another section undesirable (their emotional landscape or whatever).

The map/territory distinction has proven useful in a lot of ways, but in this conversation it can only distract, for it has a utility function built right into its core--one incompatible with the kind of utility function that is itself incompatible with the Litany. It breaks down when it encounters a utility function that values what's in one's head not necessarily only as an indicator for what's outside, but also simply for its own sake.

Replies from: Vladimir_Nesov, Raemon
comment by Vladimir_Nesov · 2011-12-10T13:32:24.987Z · LW(p) · GW(p)

As always, it simply depends on your utility function.

Please don't use "utility function" in this context. What you believe you want is different from what you actually want or what you should want or what you would like if it happened, or what you should want to happen irrespective of your own experience (and none of these is a utility function in the technical sense), so conflating all these senses into a single rhetorical pseudomath buzzword is bad mental hygiene.

Replies from: Crux, XiXiDu
comment by Crux · 2011-12-11T03:58:27.401Z · LW(p) · GW(p)

I'm completely baffled by your reply. I have no idea what the "technical sense" of the term "utility function" is, but I thought I was using it the normal, LW way: to refer to an agent's terminal values.

What term should I use instead? I was under the impression that "utility function" was pretty safe, but apparently it carries some pretty heavy baggage. I'll gladly switch to using whatever term would prevent this sort of reply in the future. Just let me know.

Or perhaps I simply repeated "utility function" way too many times in that response? I probably should have switched it up a lot more and alternated it with "terminal values", "goal set", etc. Using it like 6 times in such a short comment may have been careless and brought it undue attention and scrutiny.

Or... is there something you disagree with in my assessment? I understand that it's controversial to state that people even have coherent utility functions, or even have terminal values, or whatever, so perhaps my comment takes something for granted that shouldn't be?

Two more things:

  • Can you explain how exactly I conflated all those senses into that single word? I thought I used the term to refer to the same exact thing over and over, and I haven't heard anything to convince me otherwise.

  • And what exactly does it mean for it to be a "rhetorical pseudomath buzzword"? That sounds like an eloquent attack, but I honestly can't pinpoint how to interpret it at any higher level of detail than you simply reacting to my usage in a disapproving way.

Anyway, do you disagree that somebody could from one moment to the next have a terminal value (or whatever) for avoiding emotional pain at all costs? Or is that wrong or incoherent? Or what?

Replies from: wedrifid
comment by wedrifid · 2011-12-11T09:46:47.515Z · LW(p) · GW(p)

I'm completely baffled by your reply. I have no idea what the "technical sense" of the term "utility function" is, but I thought I was using it the normal, LW way: to refer to an agent's terminal values.

Your usage was fine. Some people will try to go all 'deep' on you and challenge even the use of the term "terminal values" because "humans aren't that simple etc". But that is their baggage not yours and can be safely ignored.

comment by XiXiDu · 2011-12-10T14:31:57.003Z · LW(p) · GW(p)

Please don't use "utility function" in this context.

I probably blatantly reveal my ignorance by asking this, but do only agents who know what they want have a utility-function? An AGI undergoing recursive self-improvement can't possibly know what exactly it is going to "want" later on (some (sub)goals may turn out to be impossible while world states previously believed to be impossible might turn out to be possible), yet it is implied by its given utility-function and the "nature of reality" (environmental circumstances).

What you believe you want is different from what you actually want or what you should want or what you would like if it happened, or what you should want to happen irrespective of your own experience...

You believe that what you want is actually different from what you want. You appear to know that what you believe you want is different from what you actually want. Proof by contradiction that what you believe you want is what you actually want?

Your utility-function seems to assign high utility to world states where it is optimized according to new information. In other words, you believe that your utility-function should be undergoing recursive self-improvement.

Replies from: khafra, timtyler
comment by khafra · 2011-12-11T02:21:58.620Z · LW(p) · GW(p)

I think Nesov's saying that you have a utility function, but you don't explicitly know it to the degree that you can make statements about its content. Or at least, it would be more accurate to use the best colloquial term, and leave the term of art "utility function" to its technical meaning.

Also, your penultimate paragraph sounds confused, while the paragraph it's responding to is confusing but coherent. Nesov's explicitly listing a variety of related but different categories that "utility function" gets misinterpreted into. He doesn't claim to believe that what he wants is different from what he wants.

comment by timtyler · 2011-12-14T18:18:43.000Z · LW(p) · GW(p)

I probably blatantly reveal my ignorance by asking this, but do only agents who know what they want have a utility-function?

Nope - in theory, all agents have a utility-function - though it might not necessarily be the neatest way of expressing what they value.

comment by Raemon · 2011-12-10T08:17:48.649Z · LW(p) · GW(p)

Well put.

Still leaves the question: Change the Litany (if so, how)? Or just don't use it in this particular context?

I suppose I should probably reveal a little bit of the context: There will be a Litany following a spoken presentation of Beyond the Reach of God. That Litany can either be the Litany of Gendlin, or the Litany of Tarski with a phrasing similar to:

If the world will be destroyed during my lifetime,
I desire to believe that the world will be destroyed during my lifetime.
If the world will not be destroyed during my lifetime,
I desire to not believe that the world will be destroyed during my lifetime.
Let me not become attached to beliefs I may not want.

(Litany of Tarski's already getting used in multiple other places during the night, so there's no advantage to using it purely for the sake of using it. I believe there is a [slight] advantage to using Gendlin at least once to create a sense of completeness)

Replies from: JenniferRM, Richard_Kennaway, Crux
comment by JenniferRM · 2011-12-11T02:44:43.036Z · LW(p) · GW(p)

I think the point of such litanies is to help restructure the listener's emotional attachments in a more productive and safe-feeling way. You are exhorting them to adopt an instrumental meta-preference for truth-conducive object-preferences, using heroic virtue as the emotional cover for the desired modification of meta-preferences.

In this light, the litany exists specifically to be deployed precisely when it is a false statement about the actual psychological state of a person (because they may in fact be attached to their beliefs) but in saying it you hope that it becomes more accurate of the person. It implicitly relies on a "fake it till you make it" personal growth strategy, which is probably a useful personal growth and coping strategy in many real human situations, but is certainly not universally useful given the plausibility of various pathological circumstances.

A useful thread for the general issue of "self soothing" might be I'm Scared.

The litany is probably best understood as something to use in cases where the person saying it believes that (1) it is kind of psychologically false just now (because someone hearing it really would feel bad if their nose was rubbed in a particular truth), but where (2) truth-seeking "meta-preference modification" is feasible and would be helpful at that time. The saying of it in particular circumstances could thus be construed as a particular claim that the circumstances merit this approach.

Perhaps it might be helpful to adjust the words to help in precisely such circumstances? Perhaps change it to focus on the first person (I or we, as the case may be), the future, and an internal locus of control, and add a few hooks for later cognitive behavioral therapy exercises, and a non-judgmental but negative framing of the alternative. Maybe something like:

Let me not become attached to beliefs that are not true.
What is true is already so, whether or not I acknowledge it.
And because it's true, it is what is there to be interacted with.
If I'm flinching, then I am already influenced by fearful suspicions.
But what is true is probably better than the worst I can imagine.
I should be able to face what is true, for I am already enduring it.
Relaxed, active, and thoughtful attention is usually helpful.
Let me not multiply my woes through poverty of knowledge.

This may not be the literal Litany of Gendlin, but it retains some of the words, the cadences, and most of the basic message, minus the explicit typical mind fallacy of the original :-P

Replies from: Raemon, Raemon, Crux
comment by Raemon · 2011-12-14T16:30:24.270Z · LW(p) · GW(p)

For the purposes of the event I'm planning, I went with something close to the original Litany, but did switch to first person.

What is true is already so.
Not owning up to it only makes it worse.
Not being open about it doesn't make it go away.
And because it's true, it is what is there to be interacted with.

Anything untrue isn't there to be lived.
I can face what’s true,
for I am already enduring it.

comment by Raemon · 2011-12-11T04:05:24.232Z · LW(p) · GW(p)

Yes, this assessment is spot on. I'll take a day or so to mull it over before deciding how to incorporate it. But I like your example.

comment by Crux · 2011-12-11T04:02:23.191Z · LW(p) · GW(p)

Absolutely excellent assessment. Thank you.

comment by Richard_Kennaway · 2011-12-10T20:49:38.636Z · LW(p) · GW(p)

Your objection to the Litany of Gendlin applies equally to the Litany of Tarski. Both tell you to desire the truth above your attachment to beliefs.

Replies from: Raemon
comment by Raemon · 2011-12-10T21:38:45.728Z · LW(p) · GW(p)

I don't feel that Tarski says anything untrue the way that Gendlin does. It doesn't say that believing the unfair world won't hurt, or that you're already enduring the knowledge. It just says that, all things together, it is more important to believe the truth than to cling to the comforting falsehood. Which I fully endorse.

comment by Crux · 2011-12-11T04:01:46.820Z · LW(p) · GW(p)

JenniferRM answered much better than I could have.

comment by [deleted] · 2011-12-10T06:09:06.859Z · LW(p) · GW(p)

Ideally it would communicate: "Lying to yourself will eventually screw you up worse than getting hurt by a truth," instead of "learning new truths has no negative consequences."

There is a subtle but important difference between "Owning up to it doesn't make it worse" and "learning new truths has no negative consequences." The former does not imply a negation of the latter; the Litany is simply saying that refusing to update your map because you are afraid of what you might find is silly, because what's in the territory is already there whether you know it or not.

Replies from: Raemon
comment by Raemon · 2011-12-10T06:12:57.430Z · LW(p) · GW(p)

I get the point of the litany, but there's a reason people go into denial about things. Yes, if your girlfriend actually hates you, you are going to need to face that reality and you might as well do it sooner. But owning up to it absolutely makes it worse in the near term.

Replies from: Vladimir_Nesov, TheOtherDave, Morendil, shokwave
comment by Vladimir_Nesov · 2011-12-10T13:38:33.872Z · LW(p) · GW(p)

But owning up to it absolutely makes it worse in the near term.

I believe I don't experience this effect at all, and sometimes the opposite is true: it's nice to finally notice a problem (related to myself or what I'm doing or what affects me) that I didn't before.

Replies from: wedrifid
comment by wedrifid · 2011-12-10T14:22:31.096Z · LW(p) · GW(p)

I believe I don't experience this effect at all, and sometimes the opposite is true: it's nice to finally notice a problem (related to myself or what I'm doing or what affects me) that I didn't before.

Matches my experience.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2011-12-10T15:44:23.675Z · LW(p) · GW(p)

Perhaps this property causes us in particular to be less hesitant to harshly criticize? This motivates collecting more data...

Replies from: Raemon
comment by Raemon · 2011-12-10T16:44:49.409Z · LW(p) · GW(p)

This sounds extremely plausible to me. While I suspect you're further along this spectrum than most here, in general, people on Less Wrong are more likely to appreciate being able to solve a problem in this way.

This can make the Litany of Gendlin useful for us. But it makes it less useful as a tool to share with others. Especially if we are (as I strongly suspect) an outlier among humans in this regard.

(Meta Note: I just found myself editing the above paragraph to sound less like an attack, because normally I need to do that to avoid degenerating into flamewar. Originally it said something like "I think you're at the extreme end of this spectrum, but people on Less Wrong in general tend to prefer criticism more than most." Which was already a slightly toned down version of the fleeting thought that triggered the typing. I think even the fully uncensored version wasn't actually wrong or mean, but through a text-only medium, most forum-goers I've interacted with would interpret it that way).

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2011-12-10T17:04:37.849Z · LW(p) · GW(p)

(By "us" I meant specifically me and wedrifid, since I've been rebuked for being insensitive sufficiently often to recognize this property in my own writing, and wedrifid seems to me an outlier in this regard as well. I actually delete some comments I write (usually before publishing) to correct this tendency, after I notice that there is no easy "civilized" way of expressing my concerns and the point isn't sufficiently important, and downvote instead. Just today I published such a comment and then deleted it.)

Replies from: Raemon
comment by Raemon · 2011-12-10T17:13:02.943Z · LW(p) · GW(p)

Hadn't noticed the tendency in wedrifid, but I recognized what you're talking about (hence the original 'extreme end of the spectrum'). I think Less Wrongians are still more likely to prefer criticism than the average person.

comment by TheOtherDave · 2011-12-10T06:52:45.021Z · LW(p) · GW(p)

I'm not sure about "absolutely." I'm often surprised by how much energy I end up spending on not-thinking about things I don't want to face, and consequently how much better my life gets immediately upon giving up that not-thinking, completely independent of the long-term strategic benefits of thinking.

Replies from: Raemon
comment by Raemon · 2011-12-10T06:57:31.051Z · LW(p) · GW(p)

Fair statement. I think it depends a lot on how much evidence has accumulated in favor of the Hard Truth in question.

comment by Morendil · 2011-12-10T09:57:37.077Z · LW(p) · GW(p)

owning up to it absolutely makes it worse

Owning up to the fact that your girlfriend actually hates you doesn't make her hate you more.

Owning up to the fact would make you feel bad, and so would stubbing your toe.

Distinguish between the two levels: what the information means, and what it feels like from the inside to process the information.

Replies from: wedrifid
comment by wedrifid · 2011-12-10T14:41:40.488Z · LW(p) · GW(p)

Owning up to the fact that your girlfriend actually hates you doesn't make her hate you more.

Owning up to the fact would make you feel bad, and so would stubbing your toe.

Even here that wouldn't be my anticipated experience. Owning up to it would entail recategorizing a whole bunch of experiences from "unexpected unpleasant actions by someone who loves me" to "some girl who hates me just being a bitch". I can almost hear the gate to that emotional vulnerability slamming shut.

Replies from: NancyLebovitz
comment by NancyLebovitz · 2011-12-10T17:04:10.702Z · LW(p) · GW(p)

Evidence is a hard problem, especially about emotions, and there are people who want to believe the worst just as there are people who want to ignore the bad news.

Maybe your girlfriend hates you. Maybe she loves you some of the time, but not enough to take the ill effects of her bad temper seriously. The latter is a more complicated situation.

Replies from: wedrifid
comment by wedrifid · 2011-12-10T18:57:52.608Z · LW(p) · GW(p)

Maybe your girlfriend hates you. Maybe she loves you some of the time, but not enough to take the ill effects of her bad temper seriously. The latter is a more complicated situation.

More complicated for her at least - because in that case she'll actually be sad when I dump her. I find it simplifies things greatly if I concern myself with actual relevant behavior and not speculative models of internal state.

comment by shokwave · 2011-12-10T06:18:59.943Z · LW(p) · GW(p)

But owning up to it absolutely makes it worse in the near term.

I think I understand how to dissolve this problem - could you describe what things in particular make it worse?

Replies from: Raemon
comment by Raemon · 2011-12-10T06:25:29.066Z · LW(p) · GW(p)

You feel like crap.

There's a lot of related scenarios. I'm sure for each one you can explain why it's ultimately beneficial to have faced the Hard Truth, but that doesn't change the fact that you will feel like crap for a while, and for most people "not feeling like crap" is a fairly strong component to their utility function.

Replies from: shokwave
comment by shokwave · 2011-12-10T11:28:18.316Z · LW(p) · GW(p)

You feel like crap.

This doesn't unpack it enough for me. I think you're just getting at the problems of bounded rationality - it's really easy to claim that someone who refuses to believe the bad things about the world is happier. Scott Aaronson is pertinent.

comment by [deleted] · 2011-12-11T10:11:27.623Z · LW(p) · GW(p)

Most people here prefer global maxima rather than local maxima. The Litany of Gendlin is something that helps us take the hit when we transition away from a local maximum. Most people are too averse to unpleasant truths, and are imprisoned by their illusions rather than comforted by them to an extent they couldn't achieve by fixing the map. On average, this is an admonition our community needs, and it needs to be something that carries a punch.

I think it is fine as it is.

comment by Morendil · 2011-12-10T08:52:01.468Z · LW(p) · GW(p)

I think there are horrible truths that can wreck your life if you're not prepared to deal with them.

Name two?

Replies from: Raemon
comment by Raemon · 2011-12-10T09:00:43.564Z · LW(p) · GW(p)

Friends secretly haven't liked you all these years.

God doesn't exist. Everything you've built your worldview upon is worthless.

You have a deadly disease for which there is no cure, and you're not the sort of person who would try to "live their life to the fullest" or anything if you knew you only had a year to live.

Your child isn't yours - your wife had an affair, but that was long ago, it won't be relevant, and your wife has been faithful to you ever since and will continue to be.

Your child/friend/lover died, and it was your fault, but you are unlikely to repeat that mistake.

And the aforementioned: the world is a harsh, unfair place. No, harsher than that. Harsher still. Keep going. Billions of humans have suffered and died for no reason.

Replies from: Morendil
comment by Morendil · 2011-12-10T09:51:32.741Z · LW(p) · GW(p)

Your child isn't yours - your wife had an affair, but that was long ago, it won't be relevant, and your wife has been faithful to you ever since and will continue to be.

OK, pick this one as a test case, since it's among the easier ones. We can work up to the hard ones later.

How does owning up to it make it worse?

How does not being open about it make it go away?

If it is in fact true, in what way are you not already enduring it and interacting with it?

The idea, ISTM, is to carefully separate two things: the awfulness of the situation - I can entertain the idea that you have a preference for raising children which are genetically "yours", as opposed to other people's - and the consequences of being well or poorly informed about the situation.

Being in the dark about an awful situation (i.e. one contrary to your preferences) does not make it any less contrary to your preferences; all that it grants you is the inability to make informed decisions about the situation.

We may feel like knowing the truth would make us worse off in some circumstances, and I don't dispute the feeling. But is that feeling the result of truth, or is it a cognitive illusion?

Replies from: Raemon
comment by Raemon · 2011-12-10T17:01:42.201Z · LW(p) · GW(p)

cognitive illusion

Calling pain a cognitive illusion doesn't make it go away. (I'm about to post about how Typical Mind Fallacy seems to be influencing this discussion, where I'll reply to this in more detail)

How does owning up to it make it worse?

Before, you trusted your wife, and your love for your child was untainted. Now it's not. Immediately after understanding the situation (the affair was long ago, your child is still yours for all intents and purposes), you would (at least I would) want to forgive your wife and accept the child as your own. I would want everything to continue exactly as it would have continued in ignorance.

Except now doing those things is HARDER, because evolutionary adaptations that I assign low value to (primal desires to father my own children, etc.) are causing me to feel distress, and possibly make bad decisions. I may find myself noticing traits of my child that remind me of the affair and cause me flickers of jealousy that compel me to reprimand the child when I should have given a gentle reminder. If my wife needs to be out of town for legitimate reasons, I'll be more quick to wonder if she's having another affair, and even if I can rationally remind myself that the affair was long ago and she is still worthy of trust, I will have to make that mental effort every single time.

It will be, at the very least, annoying, if not painful, with no benefits other than aesthetic preference for truth, and in this case, for me, that aesthetic preference is vastly outweighed by the emotional consequences.

Replies from: Morendil
comment by Morendil · 2011-12-10T17:43:20.455Z · LW(p) · GW(p)

Let me suggest that you're overweighing the long-term effects on your happiness of learning something painful (see Dan Gilbert's Stumbling on Happiness for research on that), and underweighing (in fact neglecting) the benefits that would result from knowing the truth.

For instance, learning the truth has placed you in a situation of greater autonomy with respect to your child: you have a greater degree of control over the moment when he/she will learn that truth.

With respect to your spouse, you are no longer being a victim of deception with each passing moment, but actively in control over whether it's appropriate to penalize her, or forgive her, or whichever choice steers the future in the direction you prefer.

She, on the other hand, is no longer calling all the shots - maybe she has been raising that child altogether the wrong way all that time, under the influence of guilt and the cognitive effort of deception. This is now something you can be aware of and correct for if necessary; as you can compensate for your jealousy-derived impulses, which anyway (speaking as a father of three) are only one of the many emotion-driven ways you regularly fail to be the parent you ideally would prefer to be.

More generally, "aesthetic preference" my left foot - the truth here makes the difference between steering or being steered. To be content with not knowing is also to be content with being manipulated, and that's something which I'd rank as strictly less acceptable than enduring the pain generated by my jealousy modules.

Replies from: Raemon, RomeoStevens
comment by Raemon · 2011-12-10T17:57:39.784Z · LW(p) · GW(p)

Yeah, Typical Mind Fallacy is definitely at work here. My issue with the Gendlin is not that it's false for all people, but that it's false for some people. (I think I actually did update during this thread about how many people on Less Wrong respond emotionally to certain situations, or at least how they rank emotional distress compared to other negative things).

I can't make very good predictions about how either of us would actually respond to this situation (I haven't had a long term romantic partner, let alone a child). But I assume we would react very differently. In this situation, I don't consider myself to be being manipulated. I WAS being manipulated a long time ago. In this scenario, which I devised specifically to test the issue, the wife went through a period of her own distress and subsequent self-evaluation, and has been faithful ever since. (I realize our definitions of "faithful" are different.)

"Steering or being steered" is not something I care much about.

It would be different if the wife was still occasionally cheating or not respecting me in other ways. And I think in most real scenarios, people aren't actually perfect and it's safer for couples planning a long term commitment to be fully honest about things. (You can't know whether you're violating someone's preferences about being manipulated unless you've had a conversation about what constitutes manipulation, at the very least, and DURING that conversation it's rather dangerous to say "You know, if you cheat on me and then are sufficiently mopey about it and then you are faithful for 10 years, you don't have to tell me." Because I'd still rather her tell me RIGHT AWAY, so we can be mopey and deal with it together.)

But in the specific hypothetical, I would probably prefer not to know. At the very least, there would be a cost to knowing, and it would require years of work before it became worth it.

comment by RomeoStevens · 2011-12-14T05:49:06.986Z · LW(p) · GW(p)

I doubt it. Men who discover this particular fact are diagnosed with PTSD at about the same rate as rape victims IIRC. Irrespective of any normative statements about it, it is safe to say that it is quite traumatic emotionally.

Replies from: Morendil
comment by Morendil · 2011-12-14T12:40:31.220Z · LW(p) · GW(p)

Citation needed. Pardon my being blunt, but I think you're merely recalling some Hansonisms that are not backed by actual fact.

The current (DSM-IV) diagnosis criteria of PTSD specifically require triggers that include threats to physical integrity, and events such as divorce or the ending of a romantic relationship are considered "sub-threshold"; based on this I strongly doubt that any study of the kind you refer to exists.

Replies from: RomeoStevens
comment by RomeoStevens · 2011-12-14T23:47:39.151Z · LW(p) · GW(p)

good call. I was under the mistaken impression that Hanson had cited actual research.

comment by shokwave · 2011-12-10T06:04:06.586Z · LW(p) · GW(p)

the world is a fundamentally unfair place that will kill you without a second thought if you mess up, and possibly even if you don't.

How does owning up to that make it worse?

Replies from: Raemon
comment by Raemon · 2011-12-10T06:08:23.875Z · LW(p) · GW(p)

Because it's horribly depressing for a lot of people?

I'm the sort of person who's okay with that, but really comprehending that in a non-compartmentalized way is difficult for many people. (It also took me a while to become the kind of person who IS okay with it). Partly because of the sheer scope of it, and partly because it's a weird outlier belief that isn't socially acceptable.

There's a reason "God has a mysterious plan that ultimately makes everything okay somehow" is a popular meme.

Replies from: shokwave
comment by shokwave · 2011-12-10T06:10:32.270Z · LW(p) · GW(p)

But ... if they don't own up to it ... the world is still going to kill them without a second thought, except now they don't even know they need to be careful!

Replies from: Raemon
comment by Raemon · 2011-12-10T06:23:25.494Z · LW(p) · GW(p)

In practice, I don't think it dramatically affects how careful they are. People who believe in God may still look both ways when crossing the street and buy health insurance. The primary difference is that when someone gets sick or loses their legs or dies, they have a comforting lie to tell themselves.

No, this still isn't the best scenario, because they also aren't making good decisions about politics or charity that might actually improve the world. But deciding to do THOSE requires a lot of additional new beliefs that will all take time to integrate, and in the meanwhile there'll be depressing existential angst that reduces their quality of life.

Even people I know who I consider pretty good, rational people who make good decisions avoid talking to me about death and immortality because they find my views really depressing.

Replies from: None
comment by [deleted] · 2011-12-10T06:34:05.784Z · LW(p) · GW(p)

The Litany isn't going to be much help to people who are so uncommitted to rationality that they will shy away from any depressing idea. To quote Beyond the Reach of God:

But this post was written for those who have something to protect.

If you really need rationality, then following the Litany is indeed necessary.

Replies from: Raemon
comment by Raemon · 2011-12-10T06:47:57.513Z · LW(p) · GW(p)

people who are so uncommitted to rationality that they will shy away from any depressing idea

I don't think that's a fair characterization at all.

There are lots of things worth protecting. You can want to protect some things and not care so much about others.

Beyond the Reach of God is part of a sequence designed to inspire people to care about a particular thing in a particular way. And I heartily endorse that goal. But that line "this post was written for those who have something to protect" is powerful specifically because it acknowledges that this is coming with a huge cost. Wrapping your brain around the sheer horror of the world is hard. Lonely dissent is hard. Radically changing your worldview is hard. Translating all of this into a meaningful course of action that actually benefits the thing you care about is hard.

That line of that post comes right after Eliezer has acknowledged that it will make you less happy. And whether or not happiness is the only thing you care about, most people are going to care about it significantly. And even if you make that sacrifice, you might fail to translate your new beliefs into meaningful actions. (Edit: Actually, I think a big reason the conversation about death was depressing was because there was no corresponding action to take to fix it, and for me to explain such a course of action would have required a huge bridging of inferential distance which would have sounded condescending and made them tune out. Engaging the depressing fact will only seem to be a rational choice if that inferential distance has already been crossed).

I think people should want to protect the future, more than they want to ensure their own happiness (and possibly that of their children). But most people don't. It's not that they don't have anything to protect, they just don't have something they consider worth sacrificing their happiness to protect.

Replies from: DanArmak
comment by DanArmak · 2011-12-10T10:22:48.646Z · LW(p) · GW(p)

I think people should want to protect the future, more than they want to ensure their own happiness (and possibly that of their children). But most people don't.

I don't want to protect the future much more than I want to protect my future. I prefer to put all my resources into increasing the likelihood that I, personally, will be there to enjoy that future - even before putting resources into ensuring it will be a good one.

I'm not sure if I understand you correctly; do you mean that I "should" want to protect The Future (of humanity/etc) significantly beyond my own future? If so, what exactly do you mean by should, and why?

Replies from: Raemon
comment by Raemon · 2011-12-10T19:23:43.121Z · LW(p) · GW(p)

I think the way I phrased that was wrong.

The present is better than it might have been because some people cared about it. The future will be better if some people care about it. I think of cooperating on behalf of the future as part of a high-level prisoner's dilemma. Yes, you can technically get away with only caring about your own personal happiness and future. Your cooperation does not actually improve your lot. But if everyone operates on that algorithm in the name of self-interested-rationality, everybody suffers.

I don't think most people should dedicate their entire lives to THE FUTURE™ (I do not intend to). That's a hard job that only some people are cut out for. But I do think people should spend some amount of time thinking about where, on the margins, they can work to make the future (and present) better WITHOUT sacrificing their own happiness, because most people are basically bleeding utility that doesn't benefit anyone.

(i.e. not even bothering to write that existential-risk-mitigation-agency a check every now and then, or whatever form of philanthropy they're most concerned with)

But I also think that, in doing so, some percentage of the population would realize that they DO care about the future in the abstract, not just for their own benefit, and that they can self-modify into the sort of person who derives pride and joy from working on the problem, even if taking it seriously requires them to embrace truths that are not just uncomfortable but genuinely depressing.

While I don't plan on dedicating all my life to philanthropic purposes, I think I'm the sort of person who will end up falling in the middle - I'm working on improving my philanthropy-on-the-margins, and I think that I will probably do at least one major, challenging project in my life that I wouldn't have done if I hadn't started down this path. (Not sure, just a guess).

Replies from: DanArmak
comment by DanArmak · 2011-12-10T20:28:18.919Z · LW(p) · GW(p)

Yes, you can technically get away with only caring about your own personal happiness and future. Your cooperation does not actually improve your lot. But if everyone operates on that algorithm in the name of self-interested-rationality, everybody suffers.

You misunderstand. I do cooperate where appropriate, because it is in my self-interest, and if everyone else did the same the world would be much better for everyone!

I cooperate because that's a winning strategy in the real-world, iterated PD. My cooperation does improve my lot because others can reciprocate and because we can mutually precommit to cooperating in the future. (There are also second-order effects such as using cooperation for social signalling, which also promote cooperation and altruism, although in nonoptimal forms.)

If it wasn't a winning strategy, I expect people in general would cooperate a lot less. Just because we can subvert or ignore Azatoth some of the time doesn't mean we can expect to do so regularly. Cooperation is a specific, evolved behavior that persists for good game-theoretical reasons.

If the only chance for the future lay in people cooperating against their personal interests, then I would have much less hope for a good future. But luckily for us all cooperation is rewarded, even when one's marginal contribution is insignificant or might be better spent on personal projects. Most people do not contribute towards e.g. X-risk reduction, not because they are selfishly reserving resources for personal gain, but because they are misinformed, irrational, biased, and so on. When I say that I place supreme value on personal survival, I must include X-risks in that calculation as well as driving accidents.

Replies from: Raemon
comment by Raemon · 2011-12-10T21:55:05.549Z · LW(p) · GW(p)

I think we're basically in agreement.

The last section of my comment was indicating that I value humanity/the-future for its own sake, in addition to cooperating in iterated PD. I estimate that The-Rest-Of-The-World's welfare makes up around 5-10% of my utility function. In order for me to be maximally satisfied with life, I need to believe that about 5-10% of my efforts contribute to that.

(This is a guess. Right now I'm NOT maximally happy, I do not currently put that much effort in, but based on my introspection so far, it seems about right. I know that I care about the world independent of my welfare to SOME extent, but I know that realistically I value my own happiness more, and am glad that rational choices about my happiness also coincide with making the world better in many ways).

I would take a pill that made me less happy but a better philanthropist, but not a pill that would make me unhappy, even if it made me a much better philanthropist.

Edit: This is my personal feelings, which I'd LIKE other people to share but I don't expect to convince them to.

comment by [deleted] · 2011-12-11T03:35:39.983Z · LW(p) · GW(p)

In any case, I'd like advice from the people who believe the Litany is inaccurate (or at least are able to model people who believe that) on how to handle the situation.

If you don't believe the Litany, then to hell with the Litany. Seriously. There is, first, a place for doubt: a place for internal debate where you go through a phase of thinking "well, I disagree with this sentiment, but I respect the author, so I should give it the benefit of the doubt." So you doubt. And then you decide, one way or the other.

And if you decide, after the doubt, that the lesson is wrong, then you should not continue to hold up that lesson as an ideal unless you value Rationality more than clear-headed thinking. Do not try to fix it. If you have arrived at a more general principle as a result of your doubts, then explain that general principle. Trying to modify the original lesson to teach the opposite of what it was written to teach will benefit no one whether you are right or wrong.

And if the ritual oomph of an argument is preventing you from judging it right or wrong, then maybe ritual oomph isn't such a good thing after all?

Replies from: Raemon
comment by Raemon · 2011-12-11T03:56:55.775Z · LW(p) · GW(p)

I'm planning a ritual ceremony because I believe ritual oomph is important. I'll be debating that later, but not now.

I do think your overall point here is pretty solid though. The only thing that gives me pause is that I don't think the lesson behind the Litany is wrong. I think one of the arguments the Litany uses to express its point is untrue.

Part of rationality is discarding untrue beliefs and letting go of old teachers, but another part is being willing to look at something you believe, identify individual bad arguments for it, and then removing those arguments without feeling like you're stabbing yourself in the back.

comment by DanielLC · 2011-12-10T20:48:55.602Z · LW(p) · GW(p)

I think it's supposed to be countering the idea that, since it doesn't feel like it's true until you believe it, it feels like believing it makes it true. Perhaps you should say something about how, while there are problems caused by believing something is true, they are not the problems caused by it being true. For example, believing in global warming may make you depressed, but it won't increase global temperatures. Realizing you were stupid makes you feel stupid, but it doesn't retroactively make you stupid.

It also might be helpful to mention that, since all the problems caused by believing things are bad are just you feeling bad about them, they can be solved by emotional acceptance, rather than denial.

Edit: How about something like "Through denial I can change my map, but with the help of acceptance I can change the territory"?

I think it would be interesting to say something like "It is dangerous to meddle with things I do not understand. I hope to meddle, but I first must understand".

Replies from: fubarobfusco, Raemon
comment by fubarobfusco · 2011-12-11T01:38:16.426Z · LW(p) · GW(p)

It also might be helpful to mention that, since all the problems caused by believing things are bad are just you feeling bad about them, they can be solved by emotional acceptance, rather than denial.

"Pretending there is not a lematya in your bed will not make it go away if there is one. You must first admit to yourself the fact that there is a lematya — you must first accept its presence. Then you can call the animal control people and have them come and take it away. But until you first admit that it is there, you are going to have a lematya in your bed every night. It may save your pride not to admit that it is there, but your bed will be increasingly crowded."
— The Vulcan prophet Surak, in Diane Duane's Spock's World

(As many young nerds do, I went through a period of Star Trek fandom; this was the form in which I first encountered the same principle known here as the Litany of Gendlin. A lematya is a creature resembling a mix of a tiger and a komodo dragon. Surak's parable of the lematya is on the subject of acknowledging negative emotions as a step toward moving past them; but it applies as well to acknowledging unpleasant truths of any sort.)

Replies from: Raemon
comment by Raemon · 2011-12-11T02:23:57.797Z · LW(p) · GW(p)

I like this metaphor.

comment by Raemon · 2011-12-11T02:21:28.828Z · LW(p) · GW(p)

I like both of your lines. I'm wary of changing this particular Litany too much, but they might be useful as standalone phrases. I particularly like:

It is dangerous to meddle with things I do not understand.
If I must meddle, I must first understand.

I think that stands on its own just fine.

If I were to alter Gendlin with your other statement, I think it would look like:

"What is true is already so.
Owning up to it doesn't make it worse.
Not being open about it doesn't make it go away.

And because it's true, it is what is there to be interacted with.
Anything untrue isn't there to be lived.
Through denial I can change my map,
But only through acceptance can I change the territory."

But I think it might be better to craft a new Litany (perhaps just call it "Litany of the Map and Territory"). Something with that name should probably exist, regardless, if it doesn't already.

Replies from: DanielLC
comment by DanielLC · 2011-12-11T04:16:13.711Z · LW(p) · GW(p)

If I must meddle, I must first understand.

You never must meddle. Perhaps:

It is dangerous to meddle with things I do not understand.
If I wish to meddle, I must first understand.

How do you separate stuff by one line like that?

Edit: lines correctly separated.

Replies from: Raemon, Raemon
comment by Raemon · 2011-12-11T04:43:55.987Z · LW(p) · GW(p)

Put two spaces after your line.
Like this.

I just learned this, like, 2 days ago, and it has changed my world.

comment by Raemon · 2011-12-14T16:24:19.894Z · LW(p) · GW(p)

I'm going to use this, so now we need to name it.

Obvious option is "Litany of Daniel".

Second option is "Litany of Velma."

(This is going to be in a not so serious section of the evening, and if we use "Litany of Velma" then we can follow it up with the Litany of Joseph: 'Knowing is Half the Battle')

Replies from: DanielLC
comment by DanielLC · 2011-12-15T01:44:09.873Z · LW(p) · GW(p)

Who is Velma?

Replies from: TimS
comment by TimS · 2011-12-15T01:57:46.879Z · LW(p) · GW(p)

Scooby-Doo character? She was the most rational of the group.

Edited to add this.

Replies from: Raemon
comment by Raemon · 2011-12-15T02:07:23.621Z · LW(p) · GW(p)

And since it's probably still unclear, Scooby Doo is relevant because of the oft-mentioned phrase "You meddling kids!"

It's actually NOT the best example of the word "meddling" in this context. "Do not meddle" usually refers to powerful forces that will destroy you if you mess up, whereas Scooby and Co. are just meddling with con-artists, and they usually LEARN stuff by meddling in the first place. But I couldn't think of a better reference offhand. Did a search for "Meddle with forces" and similar things and couldn't find a source that seemed suitably ancient that it would have been the original line or anything.

comment by Raythen · 2014-04-30T07:21:03.280Z · LW(p) · GW(p)

I agree with Raemon.

Mostly posting to express my agreement; for group dynamic purposes.

Incidentally, some years before discovering LW I was facing a problem similar to the ones that the Tarski and Gendlin litanies strive to address. The affirmation I came up with was

"You are not choosing whether to have a problem or not. Sure it would be nice not to have a problem, but that's not a choice that is offered. What are you doing is choosing whether to become AWARE of an already existing problem/situation"

comment by Nectanebo · 2011-12-10T11:04:08.923Z · LW(p) · GW(p)

I always felt that a major message it was trying to convey was to avoid the reaction you were outlining of feeling bad about depressing truths, by telling you that nothing is different after you learn it. It's telling you to try not to feel bad when you learn the truth, to specifically avoid making learning the truth potentially a bad thing.

So the litany is actually trying to fight against truths from being horrible by trying to prepare you to deal with them.

Or am I wrong here?

Replies from: Raemon
comment by Raemon · 2011-12-10T16:37:26.600Z · LW(p) · GW(p)

This is the message I feel like I should be getting out of the Litany. I'm pretty sure it's what Gendlin actually intended, it just doesn't come across to me.

Replies from: Nectanebo
comment by Nectanebo · 2011-12-10T20:29:59.215Z · LW(p) · GW(p)

Yeah, it isn't all that explicit with that message although to me it really comes across as heavily implied.

If there are those who don't see that implication (whether it's there or not) then it wouldn't be too bad to set up another version where it's clearer, or another verse on top, perhaps?

comment by XiXiDu · 2011-12-10T10:59:12.002Z · LW(p) · GW(p)

Some of your beliefs can influence the territory while others can't.

If everyone suddenly stopped believing that the president of the USA is allowed to command, then the president would cease to be powerful.

The map is part of the territory. If you change the map you also change the territory.

For example, scribbling on the map does change the territory if we are talking about the interaction of agents. If you change your strategy then you will also change the strategy of some interacting agents in the territory with respect to yourself.

But the shape of the Earth wouldn't change if everyone suddenly stopped believing that it isn't flat (with a very high probability at least (as long as this isn't a simulation whose parameters are somehow dependent on what some of us believe ;-)).

Yet if there exists a powerful agent whose actions are dependent on our belief about the shape of Earth, then we could influence it by deliberately causing ourselves to believe a falsehood. If doing so would be beneficial, then that truth would trump the other.

In conclusion, the 'Litany of Gendlin' is too simplistic. A set of beliefs is rational as long as it is in accordance with our utility-function. It is not rational to believe everything that is true; it is rational only if doing so maximizes our expected utility.

Replies from: torekp
comment by torekp · 2011-12-11T15:09:11.079Z · LW(p) · GW(p)

The map is part of the territory. If you change the map you also change the territory.

Upvoted for this.

A set of beliefs is rational as long as it is in accordance with our utility-function.

No: there's such a thing as epistemic rationality, and it's the default referent when the phrase "rational belief" is used.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2011-12-12T01:47:49.921Z · LW(p) · GW(p)

(Still, there are tricky cases where you could believe one of two incompatible things, in such a way that picking one of them makes it true and the other false. In such cases, you should pick one that, if true, is more preferable than the alternative. Epistemic decisions given by the criterion of correctness are under-determined, in which case one should turn to the overall decision problem, and in some cases it might be better to believe even incorrect things.)

comment by Giskard (tiago-macedo) · 2021-10-05T00:07:41.024Z · LW(p) · GW(p)

So, I'm 10 years late. Nevertheless I'm throwing my two cents into this comment, even if it's just for peace of mind.

Mostly agree with the litany, as I interpret it as saying not that "there are no negative consequences to handling the truth", but saying instead that "the negative consequences of not handling the truth are always worse than the consequences of handling it". However, upon serious inspection I also feel unsure about it, on the corner cases of truths which could have an emotional impact on people (or on me) greater than their concrete impact.

With that said, my suggestion 10 years ago would have been to include the Litany of Gendlin verbatim, accompanied by "yeah, this one might be wrong".

Performatic Rationality should make a healthy effort to ritualize the idea of questioning its rituals. Also, it should make a healthy effort not to hide arguments that some think are wrong, but about which there isn't (approximate) unanimity yet. What better way to hit both checkboxes than literally including a famous litany you disagree with and then pointing out that it might be wrong?

comment by TimS · 2011-12-10T14:07:16.671Z · LW(p) · GW(p)

Telling people they should get over that depression and make good changes to fix the world is important.

The major problem with depression is not lack of belief that one should overcome it, but lack of belief that one can overcome it. The message that there is light at the end of the tunnel is only helpful for showing possibility, not for creating effort.

Negative self-talk doesn't just go away. Managing it so that it doesn't interfere is really difficult.

Replies from: NancyLebovitz
comment by NancyLebovitz · 2011-12-10T17:13:28.035Z · LW(p) · GW(p)

The major problem with depression is not lack of belief that one should overcome it, but lack of belief that one can overcome it.

Another problem is that sometimes it isn't seen as depression. There's a version (sometimes I call it an attack of the bleaks) where it seems as though it's acceptance of hard truths. The problem is that a highly negative take on the situation is so emotionally attractive that it blanks out optimistic outcomes. Or maybe there's something biochemical going on which makes positive futures not be emotionally salient.

comment by TimS · 2011-12-10T10:29:05.530Z · LW(p) · GW(p)

The Litany of Gendlin does not say that uncomfortable truths are not uncomfortable. It says that this discomfort, standing alone, is not a reason to behave differently.

It is depressing to learn that your significant other cheated on you. But that depression, without more, does not justify any change of behavior (The fact of cheating, on the other hand . . .). Despite the short-term costs of knowing, you will make better decisions knowing the truth. To believe the contrary requires believing that your decisions made with false knowledge are better for you than decisions made with true beliefs.

In short, complying with the Litany of Gendlin is not free. Learning new truths might cause painful emotional reactions. The message of the Litany is that those reactions should not drive your behavior.

ETA: This isn't trying to say that you shouldn't feel. Just that regret over the failure of some false fact is not useful feeling.

Replies from: DanArmak
comment by DanArmak · 2011-12-10T10:54:22.520Z · LW(p) · GW(p)

But that depression, without more, does not justify any change of behavior

Or it might develop into major depression and have you spend the next few years taking mind-altering antidepressant medications and receiving psychological and behavioral help.

It's unwise to dismiss the emotional part of the mind as a mere Bayesian datum, because it influences your actions directly, regardless of whatever goals and utility function you may profess consciously. You say that "those [painful emotional] reactions should not drive your behavior", but your brain is such that they do drive part of your behavior. And by the Litany of Gendlin, you should admit to this, learn about it, and act in accordance - such as sometimes preferring not to admit to a truth because of its potential emotional impact.

Replies from: TimS
comment by TimS · 2011-12-10T11:11:19.850Z · LW(p) · GW(p)

If you find that your depression and/or anxiety is preventing you from achieving your goals and you can't solve this on your own, then seeking expert help is rational. That said, your depression and/or anxiety might be evidence that your articulated goals are not your actual goals (ideally, the expert assisting you would notice this fact).

Nonetheless, it is a more optimal state for your end goals to drive your decisions, even if your emotional state wants to drive in a different direction. Which doesn't mean that achieving this is simple or cost-free.

You say that "those [painful emotional] reactions should not drive your behavior", but your brain is such that they do drive part of your behavior. And by the Litany of Gendlin, you should admit to this, learn about it, and act in accordance - such as sometimes preferring not to admit to a truth because of its potential emotional impact.

The OP is asking whether the Litany is true, which requires deciding what it means. By contrast, your point appears to be developing one of the many implications of following the Litany.

Replies from: DanArmak
comment by DanArmak · 2011-12-10T13:10:55.067Z · LW(p) · GW(p)

Nonetheless, it is a more optimal state for your end goals to drive your decisions, even if your emotional state wants to drive in a different direction.

This is trivially true only if your end goals don't reference your emotional state - either your 'end' emotional state or your state along the way. Otherwise the burden of proof is on you. Most people's end goals include feeling emotionally well, and if that comes in conflict with some other end goal they have, it's not clear to me that relinquishing the emotional feel-well goal should be the common, correct, or default choice.

Replies from: TimS
comment by TimS · 2011-12-10T13:54:45.140Z · LW(p) · GW(p)

Let me clarify my unfortunately idiosyncratic use of "emotional state."

Quite frequently, people have emotional experiences of degree or kind that are not justified by the facts of their situations (e.g. manic-depression). Those extreme emotional states can easily conflict with achieving end goals (e.g. Bob is excessively anxious and therefore does not go to a job interview). It is possible to adjust these extreme emotional states so that they do not interfere with achieving end goals. That can be with or without expert assistance, and with or without chemical intervention. My point was that there is no reason not to self-improve in that way.

My overarching point is that the negative feelings that the OP was worried about are probably of that type. "People can stand what is true, for they are already enduring it" means (in part) that negative emotions caused by learning the truth can be overcome, and the phrase implicitly holds that it is worth the effort to find and implement techniques to overcome the negative emotions.

Replies from: DanArmak
comment by DanArmak · 2011-12-10T14:05:35.776Z · LW(p) · GW(p)

My point was that there is no reason not to self-improve in that way.

I agree. I only wish to add two points: first, such self-adjustment is frequently unsuccessful (e.g. curing severe depression often fails or takes many years) and second, we should self-adjust to remove destructive, negative emotions (where possible) regardless of whether the emotions are "extreme" or are "justified by the facts". (There may be cases where on reflection we should not remove them, or not completely, but extreme-ness etc. aren't valid reasons in themselves.)