Values Assimilation Premortem

post by wolverdude · 2019-12-26T00:30:06.496Z · LW · GW · 10 comments


In the past 3-4 years, I went through a prolonged and painful life crisis in which I systematically deconstructed my existing worldview and slowly moved away from Evangelical Christianity into something Rationalist or Rationalist-adjacent. In the past 4 months, I've started hanging around the Berkeley Rationality community and am now dating someone embedded therein. At this point my partner is still my main connection to the specific values and practices of the community, and given that my worldview is currently being fleshed out, she has an outsized influence on what my future beliefs and values may look like.

She has become concerned that this dynamic may be unhealthy, and to be honest, I have too. I think there is a nontrivial chance that in 10 years (or some other future timeframe), I will again find myself in some sort of existential crisis, and realize looking back that I adopted inauthentic values in this present season. So let's reach into the future and read a reflection my future self wrote about this to see if it might reveal something of what went wrong...

I was floating in philosophical space, without a firm orientation, when suddenly my partner appeared, offering me both a relationship and a philosophy to orient myself. As a result, I rather uncritically swallowed her values whole without giving myself time to crystallize my own values, values that reflect how I can sustainably live whole in the world.

At the time, I thought I was thinking through them, but in reality I did not have the tools or background to think through them; I had encountered no counter-thought that would equip me to vet them properly. They were well thought-out points to my eyes, and I wanted to believe them. I was tired of drifting. I really wanted to believe something, to have some basis on which to build my life. Even more, my feelings for my partner, and the prospect of the deep, long-term care and connection I had always dreamt of, combined with my natural people-pleasing nature and made me want to be on the same page with her.

So I pretended I was in total agreement. I ignored that little voice in the back of my head saying, "I'm not sure about this," and "Something feels off here." The voice was very weak and I didn't want to hear it. It would only complicate things. And it wasn't well calibrated anyway. I still felt torn about my apostasy, especially when interacting with my friends and family who still held the faith. How was I supposed to sort out from all of that what was legitimate and what wasn't? How was I supposed to register the still, small voice amidst the torrent?

Excuses, every one. Excuses for not doing the hard, slow work of coming to a consensus with myself. And now I'm paying the price for that. Once again drifting, estranged from my community and those I'm closest to. Now I must do what I failed to do last time and take the time and care to coalesce around values I can sustainably live out. But with so many more people depending on me, it will be even harder than before, and more costly. My kids, my partners, my community... I'm so much more enmeshed than I was last time. What if I can't come to terms with them? What if I can't come to terms with myself?

All this time, and it still plagues me. I am indeed accursed. I partook of the tree of knowledge all those years ago, and I have lived by the sweat of my brow ever since, perspiring in angst over the meaning of it all and my place in the cosmos.

Okay, I got a little carried away there at the end, but you get the point. :wink: All this is playing Devil's advocate. I do not think this scenario is likely, but it is definitely possible and plausible, so it's worth having a premortem. The point of this post is to explore some potential failure modes and identify tendencies that I need to be more careful about if I want my relationship with my partner to be truly sustainable in the long run and want to minimize the chance of future existential calamities.

I think the main lesson from this exercise is that I need more arguments and perspectives on whatever I'm discussing with my partner. I know that I tend to be pretty terrible at critiquing ideas and arguments I'm encountering for the first time (as I presume most people, even most rationalists, are). I have trained myself to instinctively try to see things from other sides, but when I have no idea what another side might even look like, that doesn't work very well. Unfortunately, I also tend to be pretty terrible at seeking that information out. That might be the best point to try and intervene and change my habits, but I'm a little skeptical whether that will work based on past behavior. (I would tend to write down that I need to research X, and then never actually do it.) I do have one thing going for me here: I am open to updating, so when I do encounter another perspective I tend to execute the critique then, and it becomes part of my own perspective. As long as I'm generally consuming new outside information relevant to my discussions with my partner, I should be fine here.

The other point I'm getting from this is that I should be patient. I need to be prepared to wait for my neural network to settle on a relatively stable state, which may take years yet. That doesn't mean I can't move forward. It just means that I should pay attention to uneasiness I feel and not let myself be rushed. This is easier said than done though. NRE (new relationship energy) don't wanna waste no time, but more importantly, neither does my partner's biological clock, which is getting to the point where we need to jump on this thing relatively soon if we want to have kids. This is a balancing act I will just have to do, but I think that personally spending more time reading and thinking about the issues will speed this along, and I suspect that developing a meditative practice will also help, though I haven't researched enough to know if that's true or not. (If anyone has knowledge here, please tell!)

There is also a legitimate argument for adopting my partner's position wholesale in some cases. If it's an issue that I don't have strong priors on and is not likely to significantly influence any major decisions I make with regard to her, I might as well just go with the flow and not complicate things unnecessarily.

There's also a larger meta-issue here. I have a lifelong wholeness project of fighting perfectionism. It's so ingrained in me that I'm pretty confident that fight will be lifelong for me. In that vein, this whole exercise could be seen as just another attempt to Do it Right The First Time™ and Never Make a Mistake®. So I do need to give myself a little freedom to screw this up, or I will really screw it up the way that I screwed up every relationship I never had before this. (Yes, I actually never dated anyone before this. I blame it on fear, shame & perfectionism + Evangelical sexual ethics taken a bit too far.)

Do you have thoughts on other failure modes I might be missing? Other strategies I might use to address this? Critiques, kudos, links, ruminations, etc? All non-trolling comments are welcome and appreciated!

(this was cross-posted from my blog)


comment by Hazard · 2019-12-26T18:46:35.316Z · LW(p) · GW(p)

Hi, welcome to LW! Fellow deconverted Christian here. I've both gone through some crisis mode deconverting from Christianity, and some crisis mode when exploring and undoing some of the faux-rational patches I had made during the first crisis. Can't wait for round three :)

I'm happy to give some more thoughts, though it might be useful for you to enumerate a few example beliefs / behaviors that you are adopting and now rethinking. "rationalist" is a pretty big space and there's many different strokes for many different folks.

As a very general thought, I'm currently exploring the idea that most of my problems aren't related to big picture philosophy / world-view stuff, and more matters of increasing personal agency (i.e "Do I feel stressed from not enough money?" "Am I worried about the security of my job?" "Can I reliably have fun conversations?" "Can I spend time with people who love me?" "Does my body feel good?" etc). Though admittedly, I had to arrive at this stance via big picture world-view style thinking. Might be useful to dwell on.

Replies from: wolverdude
comment by wolverdude · 2019-12-27T04:56:00.631Z · LW(p) · GW(p)

Thanks for the welcome!

This is super helpful. It sounds like you've lived the thing that I'm only hypothesizing about here. Hopefully "Can't wait for round three" isn't sarcastic. This first round for me was extremely painful, but it sounds like round 2 was possibly more pleasant for you.

I like the framework you're using now, and I'm gonna try to condense it into my own words to make sure I understand what you mean. Basically, you're trying to optimize around keeping the various and conflicting hopes, needs, fears, etc. within you at least relatively cool with your choices. It also seems like there might be an emphasis on choosing to pursue the things that you find most meaningful. Is that correct? I would actually love to hear more on this. Are there good posts / sequences on it?

Regarding examples: I'll need to spend some time brainstorming and collating, but I'll post some here when I get to it. I tend to do the lazy thing of using examples to derive a general principle and then discarding the examples. This is probably not good practice wrt: Rationality.

Replies from: Hazard
comment by Hazard · 2019-12-28T18:21:43.669Z · LW(p) · GW(p)

More or less. Here are some related pieces of content:

There's a twitter thread by Qiaochu that ostensibly is about addiction, but has the idea "It's more useful to examine what you're running from, than what you're running to." In the context of our conversation, the Christianity and Rationalism would be "what you've been running to" and "what you're running from" (for me) has been social needs not being met, not having a lot of personal agency, etc.

Meaningness is an epic tome by David Chapman on different attitudes towards meaning that one can take and their repercussions.

Regarding examples and generalizing, I've been finding that it's really hard to feel like I've changed my mind in any substantive way unless I can find the examples and memories of events that led me to believe a general claim in the first place, and address those examples. Matt Goldenberg has a sequence [? · GW] on a specific version of this idea.


comment by Viliam · 2019-12-27T23:28:25.290Z · LW(p) · GW(p)

Welcome!

Not sure how relevant my advice can be, because I was never in your position. I was never religious. I grew up in a communist country, which is kinda similar to growing up in a cult, but I wasn't a true believer of that either.

My prediction is that in the process of your change, you will fail to update on some points, and overcompensate on other points. Which is okay, because growing up happens in multiple iterations. What you do wrong in the first step, you can fix in the second one. As long as you keep some basic humility and admit that you still may be wrong, even after you got rid of your previous wrong ideas. Your current position is the next step in your personal evolution; it does not have to be the final step.

Here are some potential mistakes to avoid:

  • package fallacy: "either the Christianity I grew up in is 100% correct, or the rationalism as I understand it today is 100% correct", or "either everything I believed in the past was 100% correct, or everything I believed in the past was 100% wrong". Belief packages are collections of statements, some of them dependent on each other, but most of them independent. There is nothing wrong with choosing A and rejecting B from package 1, and choosing X and rejecting Y from package 2. Each statement is true or false individually. You can apply this to religious beliefs, political beliefs, beliefs of rationalists, etc. (This does not imply the fallacy of grey; some packages contain more true statements than others. You can still sometimes find a gem in a 90% wrong package, though.)
  • losing your cool. What is true is already true; and it all adds up to normality. Don't kill yourself after reading about quantum immortality, don't freak out after reading about basilisk, don't do anything crazy just because the latest article on LW or someone identifying as rationalist told you so. Don't burn bridges. Do reductionism properly: after learning that the apple is actually composed of atoms, you can still eat it and enjoy its taste. Evolution is a fact, but the goals of evolution are not your goals (for example, evolution doesn't give a fuck about your suffering).
  • identification and tribalism. "Rationalists" are a tribe; rationality is not. Rationality does not depend on what rationalists believe; the entire point of rationality is doing things the other way round: changing your beliefs to fit facts, not ignoring facts to belong better. What is true is true regardless of what rationalists believe.
There's also a larger meta-issue here. I have a lifelong wholeness project of fighting perfectionism. It's so ingrained in me that I'm pretty confident that fight will be lifelong for me. In that vein, this whole exercise could be seen as just another attempt to Do it Right The First Time™ and Never Make a Mistake®. So I do need to give myself a little freedom to screw this up, or I will really screw it up the way that I screwed up every relationship I never had before this. (Yes, I actually never dated anyone before this. I blame it on fear, shame & perfectionism + Evangelical sexual ethics taken a bit too far.)

Go one step more meta, and realize that perfectionism itself is imperfect (i.e. does not lead to optimal outcomes in life). Making conclusions before gathering data is a mistake. It is okay to do the right thing, as long as it is actually the right thing instead of something that merely feels right (such as following the perfectionist rituals even when they lead to suboptimal outcomes). Relax (relaxation improves your life, how dare you ignore that).

Copying your partner's opinions feels wrong, but hey, what can I do here? Offer you my opinion to copy instead? Heh.

If it's an issue that I don't have strong priors on and is not likely to significantly influence any major decisions I make with regard to her, I might as well just go with the flow and not complicate things unnecessarily.

You might also adopt the position "I don't know". It is a valid position if you really don't know. Also, the point of having opinions is that they may influence your decisions. If something is too abstract to have an impact on anything, ignoring it may be the smart choice.

Replies from: wolverdude
comment by wolverdude · 2019-12-29T04:24:59.719Z · LW(p) · GW(p)

Thanks for your thoughts; they're all good ones! I've actually already engaged with the Rationality literature enough to have encountered most of them (I'm about 2/3 through The Sequences at the moment).

I think after reading people's responses to this post, I realize that the scenario I outline here is even less likely than I originally thought. There are wrong ways to apply rationality, it's true. But those are the failure modes @LeBleu alluded to. For everyone else, Rationality isn't a destination, it's a path. The updating is continuous. What happened for me is that I came from a different epistemological tradition and jumped ship to this one. Bushwhacking across terrain to get from one path to another is no fun. But now that I'm on a path, I'm not going to get into that kind of trouble again unless I leave the Rationality path entirely. So then the only question I need to be this worried about is whether the Rationality path is correct, and I'm pretty well convinced of that... but still willing to update, I suppose.

Go one step more meta, and realize that perfectionism itself is imperfect

The point about perfectionism is a good one. I've already recognized that perfectionism is not rational though, and it's more of a compulsive behavior / default mental state to inherently assume that information is free and be down on myself for not already knowing it and executing perfectly on it. Perhaps I actually can fully overcome that, but I'm not expecting it (which would be the perfectionist thing to do anyway ;)

comment by LeBleu · 2019-12-26T14:38:48.674Z · LW(p) · GW(p)

I think your supposition that most people have trouble critiquing arguments they're encountering for the first time is incorrect. I don't find this hard myself. Learning how to critique arguments is a skill you can study. Even just googling "how to critique an argument you've never seen before" gives some reasonable starting points. I'm not surprised a background in Evangelical Christianity has left you lacking this skill, as unquestioning belief is favored there.

Seeking out and listening to podcasts from several distinct but not obviously incorrect philosophies might give you a better perspective on alternative values you might apply to the Rationalist approach. (My favored alternative happens to be ecological approaches.)

Singularity.FM has some good interviews with people with contrary views to mainstream singularitarian thought, and some of those offer useful alternative lenses to view the world through, even when you don't agree with their conclusions.

Reading about those who have taken Rationalist-style approaches to get to obviously crazy conclusions is also useful, for seeing where people are prone to going off the rails, so you can avoid the same mistakes, or recognize the signs when others do.

As for perfectionism... As an interview with a specialist in innovation pointed out, if you aren't failing, you aren't taking big enough risks to find something new.

"When I was a kid, I thought mistakes were simply bad, and to be avoided. As an adult I realized many problems are best solved by working in two phases, one in which you let yourself make mistakes, followed by a second in which you aggressively fix them."--Paul Graham

I hope some of this helps, and good luck in your journey!

Replies from: wolverdude
comment by wolverdude · 2019-12-27T04:55:32.856Z · LW(p) · GW(p)

Thanks for the tips!

Learning how to critique arguments is a skill you can study.

I suppose that large portions of The Sequences are devoted to precisely the task of critiquing arguments without requiring a contrary position. It's kind of an extension of a logical syntax check, but the question isn't just whether an argument is complete and deductively sound, but also whether it's empirically sound and sound in a Bayesian sense.

It's gonna take me a while to master those techniques, but it's a worthy goal. Not 100% sure I can do it on the timeline I need, but I can at least practice and start developing the habits.

Reading about those who have taken Rationalist-style approaches to get to obviously crazy conclusions is also useful, for seeing where people are prone to going off the rails, so you can avoid the same mistakes, or recognize the signs when others do.

I love reading about failure modes! Not sure why I find it so fascinating. Maybe it's connected to the perfectionism? Speaking of...

if you aren't failing, you aren't taking big enough risks to find something new.

I consider my greatest failure in life to be that I haven't failed enough. I have too few experiences of what works and what doesn't, I failed to make critical course-corrections because they lay outside my info bubble, and I missed out on many positive life experiences along with the negative ones.

Replies from: Pattern
comment by Pattern · 2019-12-29T06:17:53.579Z · LW(p) · GW(p)
[1] I consider my greatest failure in life to be that I haven't failed enough.
[2] I failed to make critical course-corrections because they lay outside my info bubble,

Some day you may find you've taken on too many tasks, and that trying to succeed at all of them means you will fail all of them. At that point if "giving up"** (enough of them) to succeed at the rest is outside your info bubble***/the realm of acts you consider taking*, things might not go well - and perfectionism can mean taking failure really hard.

It may be tricky, balancing this against trying more things - to move away from not doing enough things at once for there to be any "failure"** (to learn from). Or it might be easy - if you're doing things that other people have done before you might be able to get an estimate of difficulty, and useful advice.

(If someone knows X, Y and Z are crazy hard, then if you ran your plan to do all of them on the same day by them, maybe they'll hear "I'm going to run a 100 miles and climb a mountain and fight a bear." and say "don't do that, that's crazy hard and too many things. If you want all 3, start by trying to run 1 mile, climbing a tree/small hill, and learning about the right form for punching, and get started on a punching bag.")


*This theory is oversimplified - failure stems from multiple causes, present and absent. Perhaps this means success requires a bunch of things to go right. Perhaps it means the opposite - failure requires a bunch of things to go wrong.

Perhaps there are many good changes, any one of which can improve things radically, or outright lead to success directly (or via spiraling - like if adding a good habit led to getting better at adding good habits, etc. ). And likewise, many bad changes which can make things a lot worse (not getting enough sleep -> sleep deprivation -> not doing things as well + not making as good of decisions (~per unit of time) -> things get worse, etc.)

**Framing/mindset may affect perception and action. (Recognizing this/changing mindset might help.) If you see something as "giving up" you may be unlikely to do it.

"(Classes with lots of) Tests are amazing - you get to fail so much!" - no one says this.

"I can see how well my program is working right now by having it tell me what it's thinking." Sounds a bit more positive.

***I'm not sure this is what you meant by info bubble, and I'm curious about what it actually means.

Replies from: wolverdude, wolverdude
comment by wolverdude · 2019-12-30T17:54:34.608Z · LW(p) · GW(p)

"(Classes with lots of) Tests are amazing - you get to fail so much!" - no one says this.

Classes with lots of tests are amazing, thanks to the testing effect.

comment by wolverdude · 2019-12-30T17:54:06.436Z · LW(p) · GW(p)

What I meant by "info bubble" is just all the things I'm aware of at this point in time. Presumably there are actions outside of my info bubble which are more beneficial (or more harmful) than any inside, simply because the things I'm unaware of encompass a much larger expanse of possibility space. This is more true the more insular my life has been up to the present moment. The fact that I didn't "sow my wild oats", as the expression goes, did spare me from some harm, but it also stopped me from discovering things that could have set my life on a different, more optimal path.