Beliefs: A Structural Change

post by Michaël Trazzi (mtrazzi) · 2018-05-06T13:40:30.262Z · score: 9 (5 votes) · LW · GW · 7 comments

Contents

  Changing Beliefs
  A Structural Shift
  Floating Beliefs
  Practical Philosophy
7 comments

(Note: I am overwhelmingly grateful to have received so much constructive feedback on my last post [LW · GW]. I read every single word, and I think the best way to thank you is to keep writing with all of your advice in mind. This is a personal blog post about what I experienced in the last 24 hours.)

Changing Beliefs

Splash. In front of me, in the swimming pool? The girl I like, with a smile on her face. If this is a one-player simulation [LW · GW], I'm enjoying the plot.

"I don't know why you would "want" to be in a simulation at all. Most people would find the idea disturbing" (comment [LW · GW] from TAG [LW · GW])

Do I truly believe I live in a simulation? What should I expect if I were, indeed, in a simulation?

Plop. The key to my locker seems to have sunk. Like a mermaid, she dove to the bottom of the pool, searching for that tiny piece of metal.

I believe that my key is at the bottom of the pool. Hence, I expect to spot a glint of metal if I keep looking.

What does this belief rule out?

Well, if I go out of the pool, walk toward the lifeguard, and ask him if someone gave him some ke... Yes, he has it! Instrumental rationality [LW · GW] helped me find my key.
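
(A minimal sketch of the update at work, with made-up numbers, purely for illustration: each fruitless look for a glint should lower my confidence that the key is still at the bottom, until asking the lifeguard becomes the better move.)

```python
# A toy Bayesian update (made-up numbers, purely illustrative) for the belief
# "my key is at the bottom of the pool", which pays rent by predicting a glint.

def update_key_at_bottom(prior, p_glint_if_bottom, p_glint_if_not, saw_glint):
    """Return P(key at bottom) after one look for a glint of metal."""
    like_bottom = p_glint_if_bottom if saw_glint else 1 - p_glint_if_bottom
    like_not = p_glint_if_not if saw_glint else 1 - p_glint_if_not
    # Bayes' rule: P(H|E) = P(E|H) P(H) / P(E)
    return prior * like_bottom / (prior * like_bottom + (1 - prior) * like_not)

p = 0.8                      # prior: I'm fairly sure the key sank
for _ in range(3):           # three dives, no glint spotted each time
    p = update_key_at_bottom(p, p_glint_if_bottom=0.7, p_glint_if_not=0.05,
                             saw_glint=False)
print(round(p, 2))           # ~0.11 -> time to go ask the lifeguard instead
```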

A Structural Shift

Obviously, these weren't exactly the words that came to mind while swimming on a sunny day.

However, I had those thoughts later the same day, in a crowded Parisian metro, as I started *Map and Territory*, Part B: "Fake Beliefs".

Aaah! This is why my friend Tiago (who introduced me to LW) used to have that app where he predicted a bunch of stuff; he went as far as to bet on the weight of whales' testes (525 kg, in case you're wondering).

Hmm, so that was why this guy (an LW reader) at the AI Safety Meetup quoted "All models are wrong, but some are useful".

"If you have a view on something before reading Eliezer's thoughts on it, this can help you integrate Eliezer's views into your own, without doing so blindly. It's easier to learn something if you already have some related beliefs for it to latch onto" (comment [LW · GW] from Ikaxas [LW · GW])

Looking at the sky, everything came together. Empiricism, predictions, Bayesian inference. Multiple labels, one goal: practical philosophy.

Floating Beliefs

"When you argue a seemingly factual question, always keep in mind which difference in anticipation you are arguing about. If you can't find the difference of anticipation, you're probably arguing about labels in your belief network-or even worse, floating beliefs, barnacles of your network" (EY, Making Beliefs Pay Rent (in Anticipated Experiences) [LW · GW])

For the last few days, I had been arguing a lot over theoretical questions where no difference of anticipation emerged.

If indeed I was living in a first-person simulation, how would it affect me?

I discussed the ethical implications of assuming that I am the only conscious being on Earth. Was that practical?

Practical Philosophy

At the end of the metro line, I arrived in a quiet suburb outside Paris.

No internet. I was faced with a concrete problem: finding my way to my friend's house with only screenshots from Google Maps.

There I was, with a concrete Map vs Territory challenge.

Obviously, I got lost, but in a very funny way: I was on the very street where my friend lived.

The problem? There was no street sign with the street's name, so I couldn't be sure that number 11 was indeed her house.

This was not a theoretical question, but it was an important one:

"If I ring at the door, whom should I expect to see?"

Fortunately, she opened the door, and we went on to chat about what the heck I had been doing since last week:

"You know, the usual stuff. Writing one article a day on a rationalist website with smart people debunking all my arguments", I answered.

What surprised me the most was that after I said this, she asked me:

"Can you please give me the name of this site?"

7 comments


comment by TAG · 2018-05-08T14:06:45.600Z · score: 6 (2 votes) · LW · GW

For the last few days, I argued a lot over some theoretical questions where no difference of anticipation emerged.

Your belief in solipsism had a consequence: refusing to attend EA. That illustrates a problem with EY's idea. A belief can still have an impact, through shaping behaviour, even if it does not constrain experience.

comment by Michaël Trazzi (mtrazzi) · 2018-05-08T14:29:18.307Z · score: 3 (1 votes) · LW · GW

Interesting comment.

I feel that what shapes the behaviour is not the belief in itself, but what this belief implies.

It's more of an empirical law, like "This guy believes in A, so it's improbable that he also believes in B, given that he is smart enough not to hold contradictory views" (e.g. Solipsism and Utilitarianism).

comment by TAG · 2018-05-09T08:57:57.996Z · score: 3 (1 votes) · LW · GW

I feel that what shapes the behaviour is not the belief in itself, but what this belief implies.

I am not seeing much of a distinction ... you would have to draw implications to anticipate experience, in most cases.

comment by Ikaxas · 2018-05-08T02:19:56.973Z · score: 4 (1 votes) · LW · GW

Data point: I found this post fascinating, but confusing. Not sure how to articulate my thoughts on it in more detail at the moment.

comment by Gyrodiot · 2018-05-07T09:07:38.659Z · score: 4 (2 votes) · LW · GW

I sometimes do a brief presentation of rationality to acquaintances, and I often stress the importance of being able to change your mind. Often, in the Sequences, this is illustrated by thought experiments, which sound a bit contrived when taken out of context, or by wide-ranging choices, which sound too remote and dramatic for explanation purposes.

I don't encounter enough examples of day-to-day application of instrumental rationality, the experience of changing your mind, rather than the knowledge of how to do it. Your post has short glimpses of it, and I would very much enjoy reading a more in-depth description of these experiences. You seem to notice them, which is a skill I find very valuable.

On a more personal note, your post nudges me towards "write more things down", as I should track when I do change my mind. In other words, follow more of the rationality checklist advice. I'm too often frustrated by my lack of noticing stuff. So, thanks for this nudge!

comment by Michaël Trazzi (mtrazzi) · 2018-05-08T14:31:36.775Z · score: 4 (2 votes) · LW · GW

Thank you. Will try to give more day-to-day rationality applications if I can.

EDIT: about "write more things down", I think writing LessWrong-specific things (like when your beliefs change) might prove useful. However, even just a lot of personal journaling or thought-crystallization-on-the-internet is enough.
