post by [deleted]

Comments sorted by top scores.

comment by Said Achmiz (SaidAchmiz) · 2019-07-28T23:35:48.065Z · LW(p) · GW(p)

What evidence can I show to a non-Rationalist that our particular movement (i.e. our particular techniques for overcoming biases, studying decision theory, applying Bayesianism, learning CBT techniques, etc.) is valuable for making their lives significantly better?

The question you need to answer first is, rather:

Why do you believe that “our particular movement (i.e. our particular techniques for overcoming biases, studying decision theory, applying Bayesianism, learning CBT techniques, etc.)” is valuable for making your life (or our lives) “significantly better”?

Before asking how to convince someone else, first ask why you are convinced. If you can answer that to your own satisfaction, that is a good first step; if you can answer that to the satisfaction of a third party, that is progress; and then the question of “how to convince others” should be easy.

Replies from: Senarin
comment by Bae's Theorem (Senarin) · 2019-07-29T00:14:25.555Z · LW(p) · GW(p)

I'm convinced mostly due to its effects on my own life, as stated in the opening paragraph. But I'm unsure how to test and demonstrate that claim. My question is for my benefit as well as others'.

Replies from: Alexei
comment by Alexei · 2019-07-29T05:28:44.279Z · LW(p) · GW(p)

Right, but how do you know? Are there specific stories of how you were going to make a decision X but then you used a rationality tool Y and it saved the day?

Replies from: Senarin
comment by Bae's Theorem (Senarin) · 2019-07-29T19:21:38.521Z · LW(p) · GW(p)

Yes, but they could all be explained by the fact that I just sat down and bothered to think about the problem, which wouldn't exactly be an amazing endorsement of rationality as a whole.

I also don't look at rationality as merely a set of tools; it's an entire worldview that emphasizes curiosity and a desire to know the truth. If it does improve lives, it might very well do so simply by making our thinking more robust and streamlined. If so, I wouldn't know how to falsify or quantify that.

Replies from: eigen
comment by eigen · 2019-07-29T19:56:20.208Z · LW(p) · GW(p)

I don't understand how you are getting so many questions about your post instead of sensible replies to it. Did someone really tell you to change the question? Why would you ever do that, if what you really want to know is how people have benefited from this way of thinking?

What if you say to that guy: "No, no... how about you tell me how you have benefited from Bayesian thinking, since that's what I'm interested in knowing?"

Replies from: Alexei
comment by Alexei · 2019-07-30T01:52:01.900Z · LW(p) · GW(p)

The questions are being asked (at least on my part) because I believe the best way to "convince" someone is to show them through the example of your own life.

Replies from: Senarin
comment by Bae's Theorem (Senarin) · 2019-07-30T03:16:00.377Z · LW(p) · GW(p)

No, the best way to convince me is to show me data. Evidence I can actually update on, instead of self-reports that may be poisoned by motivated reasoning or any number of other biases. Data I can show to people who know what they are talking about, that they will take seriously.

Replies from: Alexei
comment by Alexei · 2019-07-30T03:56:00.654Z · LW(p) · GW(p)

Your question was: “What evidence can I show to a non-Rationalist that our particular movement...”

I’m saying that for non-rationalists, that’s one of the better ways to do it. They don’t need the kind of data you seem to require. But if you talk about your life in a friendly, open way, that will get you far.

Additionally, the “example of your own life” is data. And some people know how to process that kind of data remarkably well.

comment by John_Maxwell (John_Maxwell_IV) · 2019-07-29T05:28:13.719Z · LW(p) · GW(p)

Here's some evidence that "Bayesian Rationality" doesn't work: the fact that you have written the bottom line first with this question, instead of asking something like "What evidence do we have about the impacts of being part of the rationality community?" or some similar question that doesn't get you filtered info :)

Replies from: Senarin
comment by Bae's Theorem (Senarin) · 2019-07-29T18:48:58.816Z · LW(p) · GW(p)

I ask the question this way to hopefully avoid stepping on toes. I'm fully open to the idea that the answer is "we have none". Also, I am primarily addressing the people who are making a claim. I am not necessarily making a claim myself.

Replies from: John_Maxwell_IV
comment by John_Maxwell (John_Maxwell_IV) · 2019-07-30T08:19:22.873Z · LW(p) · GW(p)

Fair enough.

CFAR has some data about participants in their workshops: https://rationality.org/studies/2015-longitudinal-study

BTW, I think the inventor of Cohen's d said 0.2 is a "small" effect size.
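For reference, Cohen's d is just the difference between two group means divided by the pooled standard deviation. A minimal sketch of the calculation (the scores are made up):

```python
import statistics

# Made-up "life outcome" scores for a workshop group and a control group.
treated = [6.1, 7.0, 6.5, 7.2, 6.8]
control = [6.0, 6.6, 6.3, 6.9, 6.4]

mean_diff = statistics.mean(treated) - statistics.mean(control)

# Pooled standard deviation (simple form for equal-sized groups).
pooled_sd = ((statistics.variance(treated) + statistics.variance(control)) / 2) ** 0.5

d = mean_diff / pooled_sd
print(f"Cohen's d = {d:.2f}")
```

Cohen's conventional labels are roughly 0.2 = small, 0.5 = medium, 0.8 = large.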

I think some LW surveys have collected data on the amount people have read LW and checked to see if that was predictive of e.g. being well-calibrated on things (IIRC it wasn't.) You could search for "survey [year]" on LW to find that data, and you could analyze it yourself if you want. Of course, it's hard to infer causality.
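If you did pull one of those survey files, the first-pass check could be as simple as this sketch (the file name and column names here are hypothetical; the real surveys label things differently):

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
df = pd.read_csv("lw_survey.csv")

# Does self-reported LW reading correlate with calibration-question accuracy?
print(df["lw_reading"].corr(df["calibration_score"]))
```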

I think LW is one of the best online communities. But if reading a great online community is like reading a great book, then, I would guess, even the best books are unlikely to produce consistent, measurable changes in most readers' life outcomes.

Supposedly, education research has shown that transfer of learning isn't really a thing, which could imply, for example, that reading about Bayesianism won't make you better calibrated. Specifically practicing the skill of calibration could make you better calibrated, but we don't spend a lot of time doing that.
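Practicing calibration is cheap, though: record probability estimates for concrete predictions, then score them later. A toy sketch (made-up forecasts) using the Brier score:

```python
# Each entry: (stated probability, whether it actually happened).
forecasts = [
    (0.9, True),
    (0.7, False),
    (0.6, True),
    (0.8, True),
]

# Brier score: mean squared error of the probabilities. Lower is better;
# always guessing 50% scores 0.25.
brier = sum((p - float(happened)) ** 2 for p, happened in forecasts) / len(forecasts)
print(f"Brier score: {brier:.3f}")
```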

I think Bryan Caplan discusses transfer learning in his book The Case Against Education, which also talks about the uselessness of education in general. LW could be better for your human capital than a university degree and still be pretty useless.

The usefulness of reading LW has long been a debate topic on LW. Here are some related posts:

https://www.lesswrong.com/posts/LgavAYtzFQZKg95WC/extreme-rationality-it-s-not-that-great [LW · GW]

https://www.lesswrong.com/posts/7dRGYDqA2z6Zt7Q4h/goals-for-which-less-wrong-does-and-doesn-t-help [LW · GW]

https://www.lesswrong.com/posts/qGEqpy7J78bZh3awf/what-i-ve-learned-from-less-wrong [LW · GW]

https://www.lesswrong.com/posts/PBRWb2Em5SNeWYwwB/humans-are-not-automatically-strategic [LW · GW]

http://www.overcomingbias.com/2012/06/the-smart-are-more-biased-to-think-they-are-less-biased.html

https://www.lesswrong.com/posts/AdYdLP2sRqPMoe8fb/knowing-about-biases-can-hurt-people [LW · GW]

You can also do keyword searches for replies people have made, e.g.

https://www.lesswrong.com/posts/B3b29FJboqnANJRDz/extreme-rationality-it-could-be-great [LW · GW]

comment by Shmi (shminux) · 2019-07-29T01:52:19.496Z · LW(p) · GW(p)

What evidence can I show to a non-Rationalist that our particular movement (i.e. our particular techniques for overcoming biases, studying decision theory, applying Bayesianism, learning CBT techniques, etc.) is valuable for making their lives significantly better?

Notice the typical mind fallacy / generalizing from one example: you assume that if your life got significantly better from learning the Gospel of Bayes, then so would everyone's. That is emphatically not so: there are many, many happy religious people who derive happiness from living their lives according to their understanding of their God's laws.

Maybe consider starting by identifying a subset of currently not very happy people who might benefit from learning about LW/CFAR-style rationality and focus on those.

Personally, I have enjoyed reading and learning from what Eliezer wrote between 2009 and 2015 or so (fiction and non-fiction), and what Scott A has been writing, and the occasional other post, but I would be hard pressed to say that any of that made my life significantly better. If anything, learning to understand people's feelings, including my own, has had a far larger impact on my life.

comment by Pattern · 2019-07-29T02:14:57.634Z · LW(p) · GW(p)

I think it's easier to test in advance, as an experiment. (The trick might be getting a control group.)


comment by TAG · 2019-07-29T07:58:03.063Z · LW(p) · GW(p)

Are you using calculations, or something more hand-wavey?

Replies from: Senarin
comment by Bae's Theorem (Senarin) · 2019-07-29T19:56:44.808Z · LW(p) · GW(p)

A strong correlation between adopting the virtues and established methods of rationality and an increased quality of life; but yeah, more hand-wavey. I don't even know what calculations could be made. That's sorta why I'm here.

Replies from: eigen
comment by eigen · 2019-07-29T19:59:10.132Z · LW(p) · GW(p)

If you're not doing calculations then you are not doing "Bayesian Rationality". Therefore, you very likely cannot explain to someone how "Bayesian Rationality" has worked out for you.

Replies from: Senarin
comment by Bae's Theorem (Senarin) · 2019-07-29T20:09:41.314Z · LW(p) · GW(p)

I see Bayesian Rationality as a methodology as much as it is a calculation. It's being aware of our own prior beliefs, the confidence intervals of those beliefs, keeping those priors as close to the base rates as possible, being cognizant of how our biases can influence our perception of all this, trying to mitigate the effects of those biases, and updating based on the strength of evidence.
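For example, the core update step is just multiplying prior odds by a likelihood ratio; a toy calculation (all numbers made up):

```python
prior = 0.30            # P(hypothesis) before seeing the evidence
p_e_given_h = 0.80      # P(evidence | hypothesis)
p_e_given_not_h = 0.20  # P(evidence | not hypothesis)

prior_odds = prior / (1 - prior)
likelihood_ratio = p_e_given_h / p_e_given_not_h
posterior_odds = prior_odds * likelihood_ratio

# Convert odds back to a probability.
posterior = posterior_odds / (1 + posterior_odds)
print(f"P(hypothesis | evidence) = {posterior:.2f}")  # ~0.63
```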

I'm trying to get better at math so I can do better calculations. It's a major flaw in my technique, one I acknowledge and am trying to change.

But as you noted earlier, none of this answers my question. If I am not currently practicing your art, and you believe your art is good, what evidence do you have to support that claim?