Experience diets

post by KatjaGrace · 2016-12-04

Imagine you have a part of your mind that just keeps track of which visceral experiences you have how often, and then expects more experiences in that ratio. So if you look at pictures of crocodiles, it feels like crocodiles are a bigger part of what is going on in the world. And then if you watch ten YouTube videos of people slapping each other in the face, it feels like it is more normal for people to slap each other in the face. If you get up late in the day for a while, it tells you that the world is mostly dark. If you see starving people, it populates its simulated world with starving people (rather than just those magazine pictures of starving people it previously knew about).

‘Visceral’ is vague, but let’s say there are some kinds of experience it can understand, and some it can’t. Anecdotes and pictures and direct experience are intelligible, but it interprets more abstract datasets as ‘sometimes there are abstract datasets’. Like a reinforcement learner which can perceive a large subset of the stimuli that other parts of our minds can respond to, though not all of them.

And suppose that you can even intellectually notice that you are responding badly to seeing a few crocodile pictures, but the kinds of mental parts that can ‘intellectually notice’ things don’t speak any languages that the other part knows, so they can’t just directly fix the problem with explicit efforts. The best they can do is choose to look at a bunch of the most compelling non-crocodile stuff they can find until the other mental part gets the picture. And the whole time you would feel like you have an accurate account of the world.

My impression is that this is what humans are like to some extent, but I don’t know the extent or exact nature of the interaction between this and other ways that humans are. I also don’t know whether this is all a thing that experts have an excellent understanding of, because this is not currently the kind of blog where the blogger does a bunch of research before they write things.

Anyway, if this picture captured an important part of what was going on in the human mind, I might expect a key issue for humans would be strategizing around what kinds of experiences to consume for worldview-warping purposes. For instance, this might come up when you are deciding whether to watch ten videos of people slapping each other on YouTube.

People do strategize about this kind of thing a bit, though I think mostly about other people’s behavior, about really extreme cases, or in pursuit of happiness rather than truth. Here are examples I can think of:

These cases all either involve very extreme and immediate corrections, a desire to meddle with someone else’s behavior, or efforts to feel better about the world rather than to view it more accurately. The kinds of things I have in mind would be more like:

I think I occasionally hear these kinds of considerations raised, and maybe acted on, though it is hard to think of examples, other than people sometimes intentionally spending more time with people who nebulously seem like good influences, which might embody some such things.

In the wake of the recent US election, I have heard people talking about mingling more with people from different bubbles. Which also sounds maybe close, but I think they are mostly suggesting talking to political rivals about their explicit views and trying to understand where they are coming from and to empathize with them. I’m not talking about anything so intellectual or socially virtuous—I’m just talking about bumping into people who vote differently often enough that your intuitions register their existence. Which is arguably less of a big deal for characteristics that define political divides, because you are probably aware of your political rivals’ existence by the time you are rivaling them. And if not, the media will probably tell you about them. Whereas if you never see any truck drivers, you could easily forget to consistently imagine that there are three million of them around here somewhere. Even if you read a statistic about it once, and could maybe figure out a decent guess if someone directly asked you ‘how many truck drivers are there in America?’ And the existence of three million truck drivers probably comes up sometimes, like when speculating about the implications of self-driving cars.

So anyway, I claim that this kind of strategizing about experience consumption mostly comes up in fairly extreme or immediate cases, or cases where the costs are to someone else, or in order to improve the enjoyability of one’s worldview rather than its accuracy. I’m not very confident about this. But supposing this is true, it might be because the effects are too small to make it worth thinking about (and other people know that, while I don’t). It could also be that other people are thinking about these things a lot more than I think they are, and they don’t discuss them much, or I forget the good examples.

An interesting explanation is that such strategizing would indeed be very useful, but we mostly don’t do it because it is only strategic from the perspective of the more intellectual parts of our minds. Those parts would like to correct our crazy instinctive picture of the world in pursuit of their own abstract goals. However, the part we have been talking about—the ‘visceral picture of the world based on direct observations and stories’ part—doesn’t have any picture of a gap between an accurate abstract world model and its own model. That’s too abstract, for a start, and representing “model X is badly inaccurate” inside model X is at least a bit complicated. And our more intellectual mental parts have trouble finding any experience that would really hammer home the fact of this gap. So we don’t really feel like it is a big deal, though it seems like it might be intellectually. This also matches the way people discuss this kind of thing intellectually, but don’t seem to do much about it.

