post by [deleted] · GW


Comments sorted by top scores.

comment by TAG · 2018-05-08T14:06:45.600Z · LW(p) · GW(p)

Over the last few days, I argued a lot over some theoretical questions where no difference in anticipation emerged.

Your belief in solipsism had a consequence: refusing to attend EA. That illustrates a problem with EY's idea: a belief can still have an impact, by shaping behaviour, even if it does not constrain experience.

Replies from: mtrazzi
comment by Michaël Trazzi (mtrazzi) · 2018-05-08T14:29:18.307Z · LW(p) · GW(p)

Interesting comment.

I feel that what shapes the behaviour is not the belief in itself, but what this belief implies.

It's more an empirical law, like: "This guy believes in A, so it's improbable that he also believes in B, given that he is smart enough not to hold contradictory views" (e.g. solipsism and utilitarianism).

Replies from: TAG
comment by TAG · 2018-05-09T08:57:57.996Z · LW(p) · GW(p)

"I feel that what shapes the behaviour is not the belief in itself, but what this belief implies."

I am not seeing much of a distinction ... in most cases, you would have to draw out the implications in order to anticipate experience.

comment by Gyrodiot · 2018-05-07T09:07:38.659Z · LW(p) · GW(p)

I sometimes give a brief presentation of rationality to acquaintances, and I often stress the importance of being able to change your mind. In the Sequences, this is often illustrated by thought experiments, which sound a bit contrived when taken out of context, or by wide-ranging choices, which sound too remote and dramatic for explanatory purposes.

I don't encounter enough examples of day-to-day application of instrumental rationality, the experience of changing your mind, rather than the knowledge of how to do it. Your post has short glimpses of it, and I would very much enjoy reading a more in-depth description of these experiences. You seem to notice them, which is a skill I find very valuable.

On a more personal note, your post nudges me towards "write more things down", as I should track when I do change my mind. In other words, follow more of the rationality checklist advice. I'm too often frustrated by my lack of noticing stuff. So, thanks for this nudge!

Replies from: mtrazzi
comment by Michaël Trazzi (mtrazzi) · 2018-05-08T14:31:36.775Z · LW(p) · GW(p)

Thank you. I'll try to give more day-to-day applications of rationality when I can.

EDIT: about "write more things down": I think writing LessWrong-specific things (like when your beliefs change) might prove useful. That said, plain personal journaling or thought-crystallization on the internet may be enough.

comment by TAG · 2018-05-08T13:41:34.973Z · LW(p) · GW(p)

---deleted---

comment by Vaughn Papenhausen (Ikaxas) · 2018-05-08T02:19:56.973Z · LW(p) · GW(p)

Data point: I found this post fascinating, but confusing. Not sure how to articulate my thoughts on it in more detail at the moment.