Reification bias
post by adamShimi, Gabriel Alfour (gabriel-alfour-1) · 2023-01-09T12:22:15.460Z · LW · GW · 6 comments
“All right," said Susan. "I'm not stupid. You're saying humans need... fantasies to make life bearable."
REALLY? AS IF IT WAS SOME KIND OF PINK PILL? NO. HUMANS NEED FANTASY TO BE HUMAN. TO BE THE PLACE WHERE THE FALLING ANGEL MEETS THE RISING APE.
"Tooth fairies? Hogfathers? Little—"
YES. AS PRACTICE. YOU HAVE TO START OUT LEARNING TO BELIEVE THE LITTLE LIES.
"So we can believe the big ones?"
YES. JUSTICE. MERCY. DUTY. THAT SORT OF THING.
"They're not the same at all!"
YOU THINK SO? THEN TAKE THE UNIVERSE AND GRIND IT DOWN TO THE FINEST POWDER AND SIEVE IT THROUGH THE FINEST SIEVE AND THEN SHOW ME ONE ATOM OF JUSTICE, ONE MOLECULE OF MERCY. AND YET—Death waved a hand. AND YET YOU ACT AS IF THERE IS SOME IDEAL ORDER IN THE WORLD, AS IF THERE IS SOME...SOME RIGHTNESS IN THE UNIVERSE BY WHICH IT MAY BE JUDGED.
"Yes, but people have got to believe that, or what's the point—"
MY POINT EXACTLY.”
Terry Pratchett, Hogfather
Humans make up people, objects, and things all the time. We observe a system that we don’t fully control, and then jump to assuming some underlying real thing that makes it work and function.
Two obvious examples come to mind, one mostly a relic of the past, the other very much a part of the present:
- Polytheistic religions, where every phenomenon is literally explained by a hallucinated person: god, nymph, djinn…
- Hidden-entity theories in science, where phenomena are often clarified and explained through the assumption of unobservable entities, like the fields of field theories, the aether in pre-relativity physics, and the strings and branes of string theory.
Yet our predictive hallucinations are far more prevalent than that!
- When reading a novel or playing a game of D&D, we could directly infer the next event from the tropes and the psychology of the author, but we tend to naturally hallucinate a whole coherent world with imagined people, as if it really existed, and predict the future of the story based on that.
- In matters of politics, we could directly model the status games being played, like immoral mazes [? · GW] and simulacra levels [? · GW] attempt to, but we tend to naturally hallucinate issues we believe we are fighting for, like immigration and the economy, even though we are mostly playing the status game and often have different opinions about the real-world complex systems these words are supposed to point to.
- In matters of morals, we could directly model the reasons for coordination and the consequences of breaking trust, but we tend to naturally hallucinate moral concepts like Justice, Mercy, and Sin, and to act as moral realists even when we hold a more sophisticated perspective on morality.
The point is not that we always resort to predictive hallucinations, only that we have a reification bias: predictive hallucinations are our first instinct and our first intuition.
The point is not that predictive hallucinations are bad either. They have no moral value in themselves; they’re just tools. They give rise equally to the mess of politics and the progress of science.
Reification bias, like all biases, is an engine of human cognition [LW · GW], for better or worse. It’s both a constraint and a capability, like any affordance. It is a shortcut that lets us find compressed explanations of almost any phenomenon and communicate them straightforwardly to each other. When it works, it lets us beat the odds and move faster than we should reasonably expect; most of science’s biggest jumps and successes come from here. But like every shortcut, it can lead us astray. Even in science, some predictive hallucinations have proved mostly a hindrance, like the aether. And it’s so natural for us to believe in them, to ascribe reality to them in the most tangible way, that we often cannot let go.
Like all biases, it is best handled not by denying or fearing it, but by leveraging it, under a watchful eye, to reap its benefits without incurring its costs.
6 comments
comment by the gears to ascension (lahwran) · 2023-01-11T19:21:15.982Z · LW(p) · GW(p)
this feels like it proves too much; I'm always skeptical of status, in particular, as a first explanation. but I'm not sure what I'd suggest as an improvement at time of comment.
comment by Slider · 2023-01-09T17:16:01.975Z · LW(p) · GW(p)
Could you give an example of science where we do not lean on predictive hallucinations?
Funny thing, I was just researching amplituhedrons and ended up reading about the Poincaré disk model. I bet that learning about Hogfather's Discworld would be way more engaging.
Replies from: adamShimi
↑ comment by adamShimi · 2023-01-09T17:48:50.755Z · LW(p) · GW(p)
I agree that a lot of science relies on predictive hallucinations. But there are examples that come to mind, notably the sort of phenomenological compression pushed by Faraday and (early) Ampère in their initial exploration of electromagnetism. What they did amounted to varying a lot of the experimental conditions and relating outcomes and phenomena to each other, without directly assuming any hidden entity. (See this book for more details.)
More generally, I expect most phenomenological laws not to rely heavily on predictive hallucinations, even when they integrate theoretical terms in their formulation. That's because they are mostly strong experimental regularities (like the ideal gas law or the phenomenological laws of thermodynamics) which tend to be carried over to the next paradigm [LW · GW] with radically different hidden entities.
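To make the ideal gas law example concrete (in standard notation):

$$pV = nRT$$

Here $p$, $V$, and $T$ are directly measurable, $n$ is the amount of gas, and $R$ is an empirically calibrated constant. Kinetic theory later reinterpreted the same law in terms of colliding molecules, hidden entities of its own, yet the regularity itself carried over essentially unchanged.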
comment by Bezzi · 2023-01-10T08:50:53.217Z · LW(p) · GW(p)
When reading a novel or playing a game of D&D, we could directly infer the next event from the tropes and the psychology of the author, but we tend to naturally hallucinate a whole coherent world with imagined people, as if it really existed, and predict the future of the story based on that.
It's not just that we tend to hallucinate. The D&D rules themselves actively instruct players to do that. The Dungeon Master's Guide has this entire paragraph intended to explicitly discourage players from metagaming:
“I figure there’ll be a lever on the other side of the pit that deactivates the trap,” a player says to the others, “because the DM would never create a trap that we couldn’t deactivate somehow.” That’s an example of metagame thinking. Any time the players base their characters’ actions on logic that depends on the fact that they’re playing a game, they’re using metagame thinking. This behavior should always be discouraged, because it detracts from real role-playing and spoils the suspension of disbelief.
Surprise your players by foiling metagame thinking. Suppose the other side of the pit has a lever, for example, but it’s rusted and useless. Keep your players on their toes, and don’t let them second-guess you. Tell them to think in terms of the game world, not in terms of you as the DM. In the game world, someone made the trap in the dungeon for a purpose. You have figured out the reason why the trap exists, and the PCs will need to do the same. In short, when possible you should encourage the players to employ in-game logic. Confronted with the situation given above, an appropriate response from a clever character is “I figure there’ll be a lever on the other side of the pit that deactivates the trap, because the gnomes who constructed the trap must have a means to deactivate it.” In fact, this is wonderful—it shows smart thinking as well as respect for the verisimilitude of the game world.
comment by Mateusz Bagiński (mateusz-baginski) · 2023-01-09T14:30:24.272Z · LW(p) · GW(p)
So to clarify, reification, in the sense you use it here, refers to explaining some thing A by reference to a thing X, whose existence we need to postulate to explain A (or some other thing B, for which we have already postulated that X exists)?
Replies from: adamShimi
↑ comment by adamShimi · 2023-01-09T16:43:07.264Z · LW(p) · GW(p)
So reification means "the act of making real" in most English dictionaries (see here for example). That's the meaning we're trying to evoke here, where the reification bias amounts to first postulating some underlying entity that explains the phenomena (that's merely a modelling technique), and second ascribing reality to this entity and manipulating it as if it were real.