How is rationalism different from utilitarianism?
post by Wang Street Journal · 2021-02-15T02:04:25.102Z · LW · GW
This is a question post.
Contents
Answers: dynomight (13) · ChristianKl (7) · Vladimir_Nesov (2) · TAG (1) · Rad Hardman (1)
I'd appreciate either theoretical explanations or illustrations with examples. I am new to the forum, and it appears to me that both rationalists and utilitarians would use "utility maximization" as the key decision criterion. Maybe for rationalists, "utility" is defined more narrowly because certain values are upheld or preferred over others (e.g. upholding Truth and the scientific process).
Take an example from daily life: say your friend bought you a jacket as a gift. You find the jacket hideous. Your friend asks you how you like it. Should you tell the truth (e.g., that you don't think it suits you) or tell a white lie (e.g., that you love it)? Would the decision-making calculus of a rationalist differ from that of a utilitarian?
Answers
You could model the two as being totally orthogonal:
- Rationality is the art of figuring out how to get what you want.
- Utilitarianism is a calculus for figuring out what you should want.
In practice, I think the dividing lines are blurrier. The two also tend to come up together, because people attracted to one style of thinking tend to be attracted to the other as well. A toy sketch of the division of labor is below.
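To make the orthogonality concrete, here is a minimal Python sketch. The function names and welfare numbers are invented purely for illustration; the point is that the decision procedure ("pick whatever maximizes utility") stays fixed, while utilitarianism is just one of the utility functions you could hand it.

```python
from typing import Callable, Dict, List

# Rationality as a procedure: given some utility function, pick the
# option that maximizes it. The procedure itself says nothing about
# whose welfare counts -- that is the job of the value system you plug in.
def rational_choice(options: List[str],
                    utility: Callable[[str], float]) -> str:
    return max(options, key=utility)

# Hypothetical welfare numbers for two options and two affected parties.
welfare: Dict[str, Dict[str, float]] = {
    "option_a": {"you": -1.0, "others": +3.0},
    "option_b": {"you": +1.0, "others": -0.5},
}

# Utilitarianism supplies one particular utility function: total welfare.
def utilitarian_utility(option: str) -> float:
    return sum(welfare[option].values())

# Egoism supplies a different one, fed to the exact same procedure.
def egoist_utility(option: str) -> float:
    return welfare[option]["you"]

options = list(welfare)
print(rational_choice(options, utilitarian_utility))  # option_a (total +2.0)
print(rational_choice(options, egoist_utility))       # option_b (you at +1.0)
```

Same machinery, different verdicts: which option wins depends entirely on the utility function, which is where the "what should you want" question lives.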
Should you tell the truth (e.g., doesn't think it suits you) or should you tell a white lie (e.g., you love it)?
Neither. Rationalism isn't about thinking you should do certain things because you identify a certain way.
Rationality is perhaps thinking carefully about careful thinking itself: what it is, what it's for, what its value is, and how to channel it more clearly. Utilitarianism is about very different things.
It's not clear that rationalism boils down to the maximisation of personal utility, but even if it does, there is still a major difference between maximising personal utility and maximising collective utility.
I would say that comparing rationalism and utilitarianism is comparing apples to oranges. Rationalism is concerned with forming accurate models of the world; essentially, it's a set of tools for finding "truth", and it deals only with positive questions. Utilitarianism, by contrast, is an ethical system and deals only with normative questions. It just happens that many rationalists here are also utilitarians, so much of the writing about rationalism is couched in utilitarian terms.
The two are related in the sense that, since utilitarianism is a form of consequentialism, you need accurate models of the world to be confident that what you do will actually lead to the most utility. But you could be a rationalist and hold a different ethical system, such as virtue ethics.
With regard to your example, the scope of rationalism is assessing the effect each response would have on your friend, and that is where it ends. What you choose to do depends on your value system. If you're a utilitarian, maybe you tell the truth so they can pick better gifts for you in the future, or because you know they value honesty; maybe you don't, if you believe the hurt feelings outweigh the benefit of their knowing. If you're a Kantian, you definitely say it's ugly. (A rough sketch of this kind of calculation is below.)
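Since the utilitarian verdict here turns on predicted consequences, a tiny expected-utility calculation makes the role of world-models explicit. Everything below is hypothetical: the probabilities and welfare numbers are made up for illustration, and a real decision would hinge on your actual model of your friend.

```python
# Hypothetical outcomes of each response, as (probability, net welfare
# for you and your friend combined) pairs. All numbers are invented.
truth_outcomes = [
    (0.6, +2.0),  # friend appreciates honesty and picks better gifts later
    (0.4, -3.0),  # friend is hurt and resents the remark
]
lie_outcomes = [
    (0.9, +1.0),  # friend is pleased and nothing bad follows
    (0.1, -4.0),  # friend later learns you lied and trusts you less
]

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

print("truth:", expected_utility(truth_outcomes))  # 0.6*2.0 - 0.4*3.0 = 0.0
print("lie:  ", expected_utility(lie_outcomes))    # 0.9*1.0 - 0.1*4.0 = 0.5
```

With these made-up beliefs a utilitarian tells the white lie; better information about the friend (a more accurate world-model) could flip the verdict, which is exactly the part rationality is concerned with. A Kantian ignores the calculation entirely and tells the truth.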
↑ comment by ChristianKl · 2021-02-15T12:11:00.508Z · LW(p) · GW(p)
Rationalism is concerned with forming accurate models about the world.
That's not the way the term is primarily used in this community. We generally orient ourselves more towards decision science. From Jonathan Baron's textbook Thinking and Deciding:
The best kind of thinking, which we shall call rational thinking, is whatever kind of thinking best helps people achieve their goals. If it should turn out that following the rules of formal logic leads to eternal happiness, then it is “rational thinking” to follow the laws of logic (assuming that we all want eternal happiness). If it should turn out, on the other hand, that carefully violating the laws of logic at every turn leads to eternal happiness, then it is these violations that we shall call “rational.”
When I argue that certain kinds of thinking are “most rational,” I mean that these help people achieve their goals. Such arguments could be wrong. If so, some other sort of thinking is most rational.
↑ comment by Vladimir_Nesov · 2021-02-15T14:47:32.477Z · LW(p) · GW(p)
It's instrumentally useful for the world to be affected according to a decision theory, but it's not obviously a terminal value for people to act this way, especially in detail. Instrumentally useful things that people shouldn't be doing can instead be done by tools we build.