Comments
Utilitarianism is not supposed to be applied like this. It is only a perspective. If you apply it everywhere, then there is a much quicker shortcut: we should kill one healthy person and use that person's organs to save several other people who would otherwise be healthy if not for some organ dysfunction.
For a society to function, lives, especially human lives, cannot in general be compared by count. That is why the person who pulls the lever in the trolley problem commits a crime.
This is where intuition can go wrong. Intuitions are not necessarily consistent with one another, since most people avoid the trolley problem at all costs, so it is no wonder that an ethics built on intuition is futile.
There is no theoretical or empirical reason to believe we know how to build intelligent machines that are more powerful than humans in terms of planning, reasoning, and general executive abilities.
I agree. My main point is not that we are rational yet still disagree, but that even as we strive to be rational in the future, we can still disagree because of imperfections in language. Perfect communication doesn't entail complete revelation of brain states: even with perfect communication, humans can still be selective about what they communicate, so self-interest wouldn't be a major problem.
Hello there. This seems to be a quirky corner of the internet that I should have discovered and started using years ago. Looking forward to reading these productive conversations! I am particularly interested in information, computation, complex systems, and intelligence.