Comments

Comment by Jagan on Circular Altruism · 2013-01-23T08:22:36.695Z

Well, I could qualify my example by stipulating that surveillance ensures only people who provide zero utility can be murdered, but as I said, the article makes my point much better, even if it doesn't mean to. A single speck of dust in the eye, even an annoying and slightly painful one, multiplied across X people NEVER adds up to 50 years of torture for one individual. It doesn't matter how large you make X: 7 billion, a googolplex, or 13^^^^^^^^41. It's irrelevant.
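The disagreement comes down to the aggregation rule. Here is a minimal sketch of the linear summation the article assumes versus the threshold view this comment takes; the disutility numbers are placeholders made up for illustration, not values from the discussion:

```python
# Sketch of the two aggregation rules at issue. All numbers are
# hypothetical placeholders, not values from the original discussion.

SPECK_DISUTILITY = 1e-9      # assumed tiny harm per dust speck
TORTURE_DISUTILITY = 1e9     # assumed harm of 50 years of torture

def linear_aggregate(x_people: int) -> float:
    """Simple utilitarian sum: total harm grows without bound in X."""
    return x_people * SPECK_DISUTILITY

def specks_outweigh_torture(x_people: int) -> bool:
    """Under linear aggregation, a large enough X always flips the answer."""
    return linear_aggregate(x_people) > TORTURE_DISUTILITY

# The comment's position is lexical: no number of specks, however large,
# ever crosses the torture threshold, so this comparison is rejected.
print(specks_outweigh_torture(10**19))  # True under linear aggregation
```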

Comment by Jagan on Circular Altruism · 2013-01-23T06:50:29.911Z

You've officially given me the best example of the inherent flaw in the utilitarian model of morality. Normally, I use the example of a man who is the sole provider for an arbitrarily large family murdering an old homeless man. Utilitarianism says he should go free. The murderer's family, of size X, will each experience disutility from his imprisonment; call that Y. The homeless man, literally no one will miss: there are no family members to gain utility from exacting justice. Therefore, since X*Y > 0, the murderer should go back to providing for his family. I do not believe any rational person would consider that just, moral, or even reasonable.
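To make that arithmetic explicit, here is a minimal sketch of the naive utilitarian tally being criticized; X and Y are the comment's own variables, and the concrete values are placeholders:

```python
# Sketch of the naive utilitarian tally the comment criticizes.
# X and Y come from the comment; the concrete values are placeholders.

X = 1_000   # hypothetical family size
Y = 5.0     # hypothetical disutility per family member if he is jailed

disutility_of_imprisonment = X * Y   # total harm of jailing the provider
disutility_of_acquittal = 0.0        # "literally no one will miss" the victim

# Since X * Y > 0, the naive tally says acquittal minimizes harm --
# exactly the conclusion the comment finds unacceptable.
print(disutility_of_imprisonment > disutility_of_acquittal)  # True
```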

I'm all for rational evaluations of problems, but rationality does not apply to moral arguments. Morality is an emotional response by its very nature. Rational arguments are fine when we're comparing large numbers of people: a plan that will save 400 lives versus a plan that has a 90% chance of saving 500. That's not morality; that's rationality. It doesn't truly become about morality until it's personal. If you could save the lives of 3 people you've never met, would you let yourself be tortured? Would you torture someone? Regardless of your answer, it is easier said than done...
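For the record, the expected-value arithmetic behind that comparison, using only the numbers above:

```python
# Expected lives saved under each plan, using the comment's numbers.
certain_plan = 400        # saves 400 lives for sure
risky_plan = 0.90 * 500   # 90% chance of saving 500, else none

print(certain_plan, risky_plan)  # 400 450.0
# Expected value favors the risky plan; the comment's point is that
# this kind of calculation feels like rationality, not morality.
```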

P.S. I'm not a psychologist, but I imagine that if your answers to torturing versus being tortured differ, that says something about you. Not sure what...