Discussion: Was SBF a naive utilitarian, or a sociopath?
post by Nicholas / Heather Kross (NicholasKross) · 2022-11-17T02:52:09.756Z · LW · GW · 4 comments
comment by Dagon · 2022-11-17T18:57:33.951Z · LW(p) · GW(p)
Why not both?
↑ comment by Nicholas / Heather Kross (NicholasKross) · 2022-11-17T19:40:33.809Z · LW(p) · GW(p)
I have wondered if that's true, and if so, whether it's more like a merge or more like a Jekyll/Hyde thing.
↑ comment by Dagon · 2022-11-17T21:02:45.745Z · LW(p) · GW(p)
[epistemic status: amusing (to me) exploration. not perfectly serious, but I don't completely disavow it either.]
The naivety in simplistic Utilitarian assumptions is quite compatible with sociopathy. The whole point is that the future is so much larger than the present that small changes in future probabilities are worth large sacrifices in current experience. This makes it VERY easy to convince yourself that your instrumental power increase benefits trillions of future humans, justifying 'most any behavior you want.
comment by ChristianKl · 2022-11-17T14:36:37.793Z · LW(p) · GW(p)
Part of what utilitarianism is about is that you are not supposed to make decisions because you feel sad when others feel sad; instead of following emotions, you are supposed to make rational calculations.
If your point is "most humans who are utilitarians don't really follow through on rational calculation but let emotions drive them, so there's no problem", it would make sense to make that more explicit.
By your definition, there's no difference between applying the principles of utilitarianism by the book and being a sociopath.
Let's say you have the scenario: 'There's one person who's alive and suffering; you can either help them or help 1 trillion future people, and you don't feel any emotional empathy toward those 1 trillion future people.' Who should you help? Normal people decide to help the one person who's alive and suffering, because they have empathy for them. The longtermist pitch, on the other hand, suggests that this is wrong: future people should be valued just as much as present people, and people should not let themselves be blinded by that empathy.