What does your philosophy maximize?
post by Antb (darustc4) · 2024-03-01T16:10:37.773Z · LW · GW
The universe is vast and complex, and we like to take mental refuge from this vastness by following some philosophical school. These philosophical schools come in many different shapes and sizes, and while listing them all is quite impractical, there are three that I often hear in discussions nowadays: theism, atheism, and empiricism (or rationalism).
These discussions are more often heated than not. The reason is that once you subscribe to any of these philosophies, it becomes all too easy to come up with reasons why someone taking refuge under any other school of thought is fundamentally misguided:
- Theists see non-believers as fundamentally lacking the love of (the) God(s), or as unfortunate souls doomed to destruction.
- Atheists see believers as sheep following old scriptures written by some megalomaniacs seeking control over the masses.
- Empiricists see others as idiots too blinded by their ideals to actually see reality as it is.
It is quite likely that, if you are reading this on LessWrong, you are an empiricist, an atheist, or both. Have you ever wondered why you chose these schools of thought over the others? Or why it is that you get such mixed feelings when thinking about people who follow ideas you don't believe in?
I see philosophical schools as maximization algorithms. You subscribe to a philosophy when its algorithm maximizes what you desire most. Some choose empiricism because they want to maximize predictive power. Some choose theism because they want to maximize mental integrity. Some choose atheism because they want to maximize freedom of thought.
The reason you see those outside your school of thought as misguided fools is that they are maximizing an entirely different metric from the one you are, perhaps unwittingly, maximizing yourself. When the metric they are maximizing correlates at least positively with yours, you can tolerate them. When it is negatively correlated with yours, you feel disdain, and perhaps even a desire to correct their lack of 'common sense'.
I've held these feelings all too many times. I've spent far too much energy foolishly trying to 'right' the world. But now I've chosen a different approach: before trying to convert someone to my school of thought, I first ask myself: does this person actually want to maximize what I am maximizing? Or are they better off maximizing whatever they are maximizing right now?
There is no universal rule saying that predictive power is any more valuable than internal peace, or vice versa. There is no real reason to convince your friend to become a rationalist if they are already living a happy life.
1 comment
Comments sorted by top scores.
comment by Ustice · 2024-03-02T13:45:38.861Z · LW(p) · GW(p)
My personal philosophy is a blended approach. In general, I'm a deontologist and Stoic, so I'm not really used to thinking in terms of maximizing much more than kindness. I like the heuristic of "what would Mr. Rogers do?"
The only thing I have any hope of changing in this world is myself. For all the rest, I can only offer my perspective. I'm much more interested in working with people within their current worldview than in getting them to change it. I'm sure that whatever arguments I could come up with wouldn't be novel or particularly persuasive.
Life is more peaceful this way.