Making Bad Decisions On Purpose
post by Screwtape · 2023-11-09T03:36:59.611Z · LW · GW
Allowing myself to make bad decisions on purpose sometimes seems to be a load-bearing part of epistemic rationality for me.
Human minds are so screwed up.
I.
Start from the premise that humans want to do the right thing.
For example, perhaps you are trying to decide whether to do your homework tonight. If you do your homework, you will get a better grade in class. Also, you may learn something. However, if you don't do your homework tonight you could instead hang out with your roommate and play some fun games. Obviously, you want to do the right thing.
When weighing these two options, you may observe your brain coming up with arguments for and against both sides. University is about networking as well as pure learning, so making a lasting friendship with your roommate is important. To make the most of your time you should do your homework when you're alert and rested, which isn't right now. Also, aren't there some studies that show learning outcomes improved when people were relaxed and took appropriate breaks? That's if doing homework even helps you learn, which you think is maybe uncertain.
Hrm, did I say your brain might come up with arguments for both sides? We seem to have a defective brain here; it seems to have already written its bottom line [LW · GW].
There are a variety of approaches to curbing your brain's inclination to favour one side over the other here, some harder than others. Sometimes just knowing your brain does this and metaphorically glaring at it is enough to help, though if you're like me your brain eventually just gets sneakier and more subtle about the biased arguments. This article is about the most effective trick I know, though it does come with one heck of a downside.
Sometimes I cut a deal, and in exchange for the truth I offer to make the wrong decision anyway.
II.
Imagine sitting down at the negotiating table with your brain.
You: "Listen, I'd really like to know if doing homework will help me learn here."
Your Brain: "Man, I don't know, do you remember The Case Against Education?"
You: "No, I don't, because we never actually read that book. It's just been sitting on the shelf for years."
Brain: "Yeah, but you remember the title. It looked like a good book! It probably says lots of things about how homework doesn't help you learn."
You: "I feel like you're not taking your role as computational substrate very seriously."
Brain: "You want me to take this seriously? Okay, fine. I'm not actually optimized to be an ideal discerner of truth. I optimized for something different than that [LW · GW], and the fact that I can notice true things is really kind of a happy coincidence as far as you're concerned. My problem is that if I tell you yes, you should do your homework, you'll feel bad about not getting to build social bonds, and frankly I like social bonds a lot more than I like your Biology classwork. The Litany of Tarski is all well and good but what I say is true changes what you do, so I want to say the thing that gets me more of those short term chemical rewards I want. Until you rewire your dopamine emitters to fire upon exposure to truths you do not want to hear, me and the truths you do not want to hear are going to have a rocky relationship."
You: ". . . Fair point. How about this bargain: How about you agree to tell me me whether I would actually do better in class if I did my homework, and I'll plan to hang out with my roommate tonight regardless of which answer you give."
Brain: "Seriously?"
You: "Yep."
Brain: ". . . This feels like a trap. You know I'm the thing you use to remember traps like this, right? I'm the thing you use to come up with traps like this. In fact, I'm not actually sure what you're running on right now in order to have this conversation-"
You: "Don't worry about it. Anyway, I'm serious. Actually try to figure out the truth, and I won't use it against you tonight."
Brain: "Fine, deal. It's a terrible idea to skip your homework, you're going to be so stressed about it tomorrow, are you kidding?"
You: "Thank you for telling me that Brain. Have an imaginary cookie."
Brain: "Thanks. So uh, are you going to make us do homework?"
You: "Not tonight. I'm going to go see if my roommate wants help setting up the Xbox."
This is, obviously, a bad decision to make and you know that[1]. On the other hand, if your brain was otherwise likely to succeed in deceiving you about how it came to the bottom line, then you kind of came out ahead. It would have been easy to incorrectly conclude that the right decision was to hang out with your roommate, then do your homework in a panic the next morning. Instead, you at least correctly think that the right decision was to do your homework. It's a one step forward, two steps back kind of situation, but at least you didn't take three steps backwards. You know what the problem is!
III.
Let's say you're trying to decide whether or not donating a kidney makes you a good person.[2]
This is a much higher stakes question than whether you do one night's homework the night before it's due instead of the morning it's due. The pros and cons may be subtle, and more so than in the homework example, some people have deep and intense feelings about it. For some people the label "good person" is one they are willing to fight tooth and nail for, or at least to complain endlessly on Twitter about people who disagree with them.
I submit that your brain is going to be very tempted to write its bottom line before doing anything so reckless as figuring out the truth of the situation. All those subtle temptations from before? They're going to be even harder to see through. And it's worth it: if you can get your brain to stop putting a thumb on the scales, I think it's totally worth it to get more ways to make better decisions and be less wrong. I put a lot of time and effort into it-
But-
Especially if it's more important to you to know the truth than to make this decision correctly, or if you don't think you'd have done the right thing anyway and would only have felt bad about it-
-then maybe making some bad decisions on purpose can be a way to get access to more truth than you otherwise would be able to reach at your level.
In my case, things aren't actually as bad as they seem. Often I've found I can make a bad decision once in exchange for a better truth, which I can use in future decisions. (I told my brain I wouldn't use this against it tonight in the dialogue above, and whatever part of my psychology seems in charge of writing bottom lines first doesn't seem to care that much?) In other cases I find that, once I know the answer, I'm not afraid of making the right decision anymore[3]. My model of the world gets better faster, and then that seems to percolate through to my actions. It's slower than if I could think without any writing on the bottom line, not even the faintest tracery, but it seems to have been an important part of becoming more rational and I still do this from time to time. I can even warn other people! "Yep, I'm making a bad decision right now, the right move is that other thing." I try not to be a hypocrite about this! I'm usually genuinely offering my best knowledge, and now that I've written this essay I can point people at it for why I'm making the bad decision!
Allowing myself to make bad decisions on purpose sometimes seems to be a load-bearing part of epistemic rationality for me.
Human minds are so screwed up.
[1] Or if you've been impatiently waiting to argue that actually homework is useless and roommates are indeed awesome people who you should pay more attention to because maybe they're actually kinda cute and you might get to smooch them[4], then hopefully you understand that this basic setup would have worked with pretty much any decision where what was fun and immediately rewarding conflicted with what the truth probably would have suggested you do.
[2] Just like the homework thing, I'm going to assume a right answer here. Please do me the favour of imagining I did something really cool and a little spooky with the HTML on this page such that, right as you were about to read me assume a right answer you think is false, it swapped the text so the example became one you agree with.
[3] Ethical conundrums seem weirdly prone to this. I may write more on this some other time.
[4] No? That wasn't what you wanted to argue? You just wanted to make the first point about the homework and not the second about roommates? That also works, but as reasonable as your argument is I think you're missing out on some options you may not have properly considered.
8 comments, sorted by top scores.
comment by CronoDAS · 2023-11-18T08:33:29.353Z · LW(p) · GW(p)
Hmmm... This reminds me of a time I was trying to decide what Magic deck to play in an online tournament...
There was the one popular deck, Psychatog, that everyone said was the best, and the pet deck, Clerics, that I designed myself and liked a lot. When I tried playing against myself, the Psychatog deck won a lot. I threw deck after deck against it, but it beat everything. I didn't really trust these results because, when playing against myself, I know everything both players know, which would bias the results in favor of Psychatog because it would be unusually good at exploiting this information, but the results of the test games were so lopsided that it shouldn't have mattered. It seemed as though if I wanted to do as well as possible in the tournament, the right thing to do would be to play the popular Psychatog deck.
Now, I could go through a laundry list of reasons why, in theory, I'd be better off playing a deck that wasn't that one: people will be prepared to play against it and will make fewer mistakes, I don't have that much practice with it in real games, etc., but all that assumes that my alternative deck was actually good enough to win games at all against the deck I expected to be the most popular one among the other tournament players. Which, as I had just proven to myself, it couldn't.
The thing is, though, the actual bottom line was that I just hated the Psychatog deck and couldn't stand the thought of spending six or seven hours playing it in a tournament that night. So I said "screw it, I'm going to enjoy myself" and played my "bad" Clerics deck, fully expecting to do poorly. What actually ended up happening is that I got the most unearned tournament win in the history of all unearned tournament wins. Several of my opponents had technical problems or otherwise couldn't continue on in the tournament, so I made the cut to top 8 on the back of wins I didn't earn followed by intentional draws. I got lucky in the quarterfinals and semifinals because my opponents were playing a deck that Clerics actually was good at beating, and then I lost in the finals, went to sleep, and woke up to discover that my final opponent had been retroactively disqualified for lying to the judges about an incident in the quarterfinals, so I was declared the winner and was awarded the first place prize.
What's even crazier is what happened in the tournament the following week. After my "win", I kept warning everyone that my results were a total fluke (which was true) and that my deck was actually terrible, but when next week's tournament came around, I made a few minor changes and played Clerics, once again expecting to do poorly. What happened this time was that I won five straight matches - by actually beating opponents at Magic - and then took two intentional draws, giving me my second top 8 appearance in a row, but that's not the crazy part. The actual crazy part is that, after my five round winning streak, I was one of only two players that had gone undefeated - and the other undefeated player had copied and played my "winning" Clerics deck from the previous week!
comment by jimmy · 2023-11-12T17:53:14.382Z · LW(p) · GW(p)
1) You keep saying "My brain", which distances you from it. You say "Human minds are screwed up", but what are you if not a human mind? Why not say "I am screwed up"? Notice how that one feels different and weightier? Almost like there's something you could do about it, and a motivation to do it?
2) Why does homework seem so unfun to you? Why do you feel tempted to put off homework and socialize? Have you put much thought into figuring out if "your brain" might be right about something here?
In my experience, most homework is indeed a waste of time, some homework very much is not, and even that very worthwhile homework can be put off until the last minute with zero downside. I decided to stop putting it off to the last minute once it actually became a problem, and that day just never came. In hindsight, I think "my brain" was just right about things.
How sure are you that you'd have noticed if this applies to you as well?
3) "If your brain was likely to succeed in deceiving you".
You say this as if you are an innocent victim, yet I don't think you'd fall for any of these arguments if you didn't want to be deceived. And who can blame you? Some asshole won't let you have fun unless you believe that homework isn't worthwhile, so of course you want to believe it's not worth doing.
Your "trick" works because it takes off the pressure to believe the lies. You don't need to dissociate from the rest of your mental processes to do this, and you don't have to make known bad decisions in order to do this. You simply need to give yourself permission to do what you want, even when you aren't yet convinced that it's right.
Give yourself that permission, and there's no distortionary pressure so you can be upfront about how important you think doing your homework tonight really is. And if you decide that you'd rather not put it off, you're allowed to choose that too. As a general rule, rationality is improved by removing blocks to looking at reality, not adding more blocks to compensate for other blocks.
It's not that "human minds are messed up" in some sort of fundamental architectural way and there's nothing you can do about it; it's that human minds take work to organize, people don't fully recognize this or how to do it, and until you do that work you're going to be full of contradictions.
↑ comment by Screwtape · 2023-11-13T06:04:32.247Z · LW(p) · GW(p)
- I was trying to reference a line from the sequences which I couldn't remember well enough to search. I am my mind and body, but it's also rhetorically useful to seat the ego somewhere else for a moment sometimes. If I were being more precise, I might try to indicate the part of the brain that does a lot of the calculations and judgements as distinct from the part that wants things, but I am not enough of a psychology expert to know if that's actually a thing. For most biases it feels like something in how I think is set up wrong, like a misaligned sight on a gun.
- See footnote 1. Please consider yourself invited to replace the homework situation with one where you notice part of your mind attempting to write the bottom line first, or to argue with uneven and unwarranted zeal for one side over the other. I eventually resolved the historical homework case to my own satisfaction.
- Yeah, it sounds like you and I are not communicating clearly on which bit we mean when we say "our brain." On a gears level, I think we agree: the ego lives mostly in that squishy gray matter between our ears, with perhaps a bit of chemical wash and nerve endings from the rest of the meat suit. (Checking explicitly- do we seem to agree there?) Sometimes I observe that I systematically come up with thoughts that are flawed in the same direction. Call those cognitive biases. One of the sneakier biases is that when I try to judge which idea has some positive trait, like being a good budget decision, it gets a heavy thumb on the scale for having some other positive trait, like being socially approved of. I want some kind of language to distinguish the truth seeking part from the biased part for the purpose of talking about them in a short semi-fictional conversation. Got any suggestions?
↑ comment by jimmy · 2023-11-22T20:03:38.605Z · LW(p) · GW(p)
I get that "humans are screwed up" is a sequences take, that you're not really sure how to carve up the different parts of your mind, etc. What I'm pointing at here is substantive, not merely semantic.
- The dissociation of saying "humans are messed up"/"my brain is messed up" feels different than saying "I am messed up". The latter is speaking from a perspective that is associated with the problem and has the responsibility to fix it from the first person. This perspective shift is absolutely crucial, and trying to solve your problems "from the outside" gets people very very caught up in additional meta level problems and unable to touch the object level problem. This is a huge topic.
- I had as strong an aversion to homework as anyone, including homework which I knew to be important. It's not a matter of "finding a situation where you notice part of your mind attempting to write the bottom line first", but of noticing why that part of your mind will try to write the bottom line first, and relating to yourself in a way that eliminates the motivation to do so in the first place. I don't have situations where part of my mind attempts to write the bottom line first... that I'm aware of, at least. There are things that I'm attached to, which is what causes the "bottom line first" issues and which is still an obstacle to be overcome in itself, but the motivation to write the bottom line first can be completely obsoleted by stopping and giving more attention to the possibility that you've been trying to undervalue something that you can sense is critically important. This mental move shifts all of your "my brain is being irrational" problems into "I don't know what to do on the object level"/"I don't know why this is so important to me" problems, which are still problems but they are much nicer because they highlight rather than obscure the path to solution.
- "I want some kind of language to distinguish the truth seeking part from the biased part". I don't think such a distinction exists in any meaningful sense.
In my model, there's a part of your brain that recognizes that something is important (e.g. social time), and a part of your brain that recognizes that something else is important (e.g. doing homework), and that neither are "truth seeking" or "biased", but simply tugging you towards a particular goal. Then there's a part of your brain which feels tugged in both directions and has to mediate and try to form this incoherent mess into something resembling useful behavior.
This latter part wants to get out of the conflict, and there are many strategies to do this. This is another big topic, but one way to get out of the conflict is to simply give in to the more salient side and shut out the less salient side. This strategy has obvious and serious problems, so making an explicit decision to use this strategy itself can cause conflict between the desire "I want to not deal with this discomfort" and "I want to not drive my life into the ground by ignoring things that might be important".
One way to attempt to resolve that conflict is to decide "Okay, I'll 'be rational', 'use logic and evidence and reason', and then satisfy the side which is more logical and shut out the side that is 'irrational and wrong'". This has clear advantages over the "be a slave to impulses" strategy, but it has its own serious issues. One is that the side that you judge to be "irrational" isn't always the side that's easier to shut out, so attempting to do so can be unsuccessful at the actual goal of "get out of this uncomfortable conflict".
A more successful strategy for resolving conflicts like these is to shut out the easy to shut out side, and then use "logic and reason" to justify it if possible, so that the "I don't want to run my life into the ground by making bad decisions" part is satisfied too. The issue with this one comes up when part of you notices that the bottom line is getting written first and that the pull isn't towards truth -- but so long as you fail to notice, this strategy actually does quite well, so every time the algorithm you describe as "logical and reasoned" drifts in this direction it gets rewarded and you end up sliding down this path. That's why you get this repeating pattern of "Dammit, my brain was writing the bottom line again. I shall keep myself from doing that next time!".
It's simply not the case that you have a "truth seeking part" and a "biased part". You contain a multitude of desires, and strategies for achieving these desires and mediating conflicts between them. The strategies you employ, which call for shutting out desires that retain power over you unless they can come up with sufficient justification, require you to come up with justifications and find them sufficient in order to get what you want. So that's what you're motivated to do, and that's what you tend to do.
Then you notice that this strategy has problems, but so long as you're working within this strategy, the extra desire of "but don't fool myself here!" becomes simply another desire that can be rationalized away if you succeed in coming up with a justification that you're willing to deem sufficient ("Nah, I'm not fooling myself this time! These reasons are sound!", "Shit, I did it again, didn't I. Wow, these biases sure can be sneaky!").
The framing itself is what creates the problems. By the time you are labeling one part "truth seeking" and one part "biased, and therefore important to not listen to", you are writing the bottom line. And if your bottom line includes "there is a problem with how my brain is working", then that's gonna be in your bottom line.
The alternative is to not purport to know which side is "truth seeking" and which side is "biased", and simply look, until you see the resolution.
comment by bideup · 2023-11-14T16:26:31.552Z · LW(p) · GW(p)
I think I practice something similar to this with selfishness: a load-bearing part of my epistemic rationality is having it feel acceptable that I sometimes (!) do things for selfish rather than altruistic reasons.
You can make yourself feel that selfish acts are unacceptable and hope this will make you very altruistic and not very selfish, but in practice it also makes you come up with delusional justifications as to why selfish acts are in fact altruistic.
From an impartial standpoint we can ask how much of the latter is worth it for how much of the former. I think one of life's repeated lessons is that sacrificing your epistemics for instrumental reasons is almost always a bad idea.
comment by Luke Cheng (luke-cheng) · 2023-11-10T00:53:05.830Z · LW(p) · GW(p)
My understanding of this is that you are turning off the fear/ick response to doing the thing in order to rationally judge the situation, but the method you've devised to turn off the fear/ick is to submit to it momentarily.
It would seem good to just be able to do that without having to submit every time. I can imagine there are a myriad of ways to do this. Experience seems to allow you to do this more automatically given enough of a causal awareness or just habit.
↑ comment by Screwtape · 2023-11-11T23:40:32.012Z · LW(p) · GW(p)
If someone can turn off the thing in their head that tries to write the bottom line before filling out the rest of the page, that seems strictly better than this trick. I can't, not reliably. Sometimes I fight that fight, sometimes the information is worth more than the better decision I think I can wrest.
For me, it usually doesn't feel like a fear or an ick. I usually notice yums, if that makes sense. Like, "don't step into traffic" or "don't jump off the cliff" make me afraid, but the bottom line is usually pretty sensible as is the writing above it. It's "don't eat the donut" and "don't snooze the alarm" that tend to have sneaky pre-written bottom lines I can beat with this trick.
↑ comment by Luke Cheng (luke-cheng) · 2023-11-15T05:30:32.918Z · LW(p) · GW(p)
On a more meta-level, what if you just applied the same trick but to deciding not to make the bad decision? It's a double negative on purpose, because perhaps you would gain information on why you are not comfortable not eating the donut or snoozing the alarm, but only if you go through the machinations of the double negative in your head. This is a trick I've been using more and more.
Something like: Tasty donut feeling -> awareness of want -> awareness of diet goals that contradict this want -> awareness that you have a choice to make -> deciding not to make the wrong choice to see what would happen -> experience of not eating the donut -> (experiential data: feeling hungry, feeling tired, feeling grumpy, etc. and positing reasons for why those things occurred) -> next time you get the tasty donut feeling, you actually have more data than if you just ate the donut.