post by [deleted]

This is a link post.

Comments sorted by top scores.

comment by FeepingCreature · 2019-11-03T18:46:33.929Z · LW(p) · GW(p)

Most morality is cashed-out game theory. In humans this usually takes the form of rule utilitarianism: first, humans don't have the cognitive capability to evaluate a utilitarian criterion over their entire worldstate, and second, some if not all humans would defect if given the chance, so everyone agrees it's best we pick a system where they're not given the chance, so we can all benefit from the cooperate/cooperate outcomes.

Now, when evaluating the quality of a rule, we use something called the "Rawlsian veil", which prevents temporary imbalances in power from skewing the outcomes. For instance, if in any exchange only one player gets to decide the rules, but which player it is varies randomly, then both players will want the exchange to be judged as if both players have equal worth, because otherwise you can't get the cooperate/cooperate outcomes. (Power imbalances are common in human society.) So the way our morality works, or at least the way we'll claim it works when asked¹, is that whenever we face a moral challenge, we first set the moral value of all participants to equal, and then compute the rule that maximizes outcomes across the board. Understand that this is not arbitrary: everything so far is in the service of maximizing payoffs while being somewhat insensitive to the actual distribution of rewards, and we should expect this system - call it "rule of law" - to be somewhat convergent in alien species.
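
The veil argument above can be put in a toy calculation. This is a minimal sketch under assumed numbers (surplus of 10, outside option of 1, fifty-fifty role assignment - none of these come from the comment): a rule is scored by your expected payoff when you don't know which role you'll land in, and an exploitative rule loses because the exploited party walks away, destroying the cooperate/cooperate surplus.

```python
# Toy model of judging a rule from behind the Rawlsian veil.
# Two players can make an exchange that creates a surplus, but only if
# both take part. One player, chosen at random, gets to set the rule for
# dividing it. A player walks away if their share falls below an outside
# option, in which case each side just keeps the outside option - the
# lost cooperate/cooperate outcome. All numbers are illustrative.

SURPLUS = 10.0
OUTSIDE_OPTION = 1.0

RULES = {
    "rule-setter takes all": (SURPLUS, 0.0),      # (setter's share, other's share)
    "equal split":           (SURPLUS / 2, SURPLUS / 2),
}

def veiled_value(shares):
    """Expected payoff when you land in either role with probability 1/2."""
    setter, other = shares
    if other < OUTSIDE_OPTION:        # the weaker party refuses the exchange
        setter = other = OUTSIDE_OPTION
    return 0.5 * setter + 0.5 * other

for name, shares in RULES.items():
    print(f"{name}: {veiled_value(shares)}")
# rule-setter takes all: 1.0
# equal split: 5.0
```

Behind the veil, "equal split" dominates even though "takes all" looks better from the rule-setter's chair - which is the whole point of evaluating rules before you know which chair you get.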

Transitioning society to this shape was not an instant process. As such, we also have a bunch of metamorality bound up in it: we see people who defect against weaker partners as exploiters, and people who cooperate even when stronger as heroic and good. However, because we needed to transition into this morality (it wasn't inborn in our species), we also have standards for when defecting is acceptable: if defecting is ever acceptable, it's against people who have themselves defected, denying them the benefits of the social compact of rule of law. Again, we should expect this to be convergent.
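
The "defect only against defectors" standard is exactly what tit-for-tat implements in the iterated prisoner's dilemma. A minimal sketch, using the conventional payoff matrix (3/3 mutual cooperation, 5/0 temptation, 1/1 mutual defection - conventional values, not from the comment):

```python
# "Defecting is acceptable against defectors": tit-for-tat cooperates
# with cooperators but denies defectors the benefits of cooperation.

PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def always_cooperate(opponent_history):
    return "C"

def always_defect(opponent_history):
    return "D"

def tit_for_tat(opponent_history):
    # Cooperate first, then mirror the opponent's previous move.
    return opponent_history[-1] if opponent_history else "C"

def play(a, b, rounds=10):
    """Total scores for strategies a and b over an iterated game."""
    hist_a, hist_b = [], []     # each side sees the opponent's past moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = a(hist_a), b(hist_b)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pa, score_b + pb
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

print(play(tit_for_tat, always_cooperate))  # (30, 30): full cooperation
print(play(tit_for_tat, always_defect))     # (9, 14): punished after round one
```

Against a cooperator, tit-for-tat harvests the full cooperate/cooperate surplus; against a defector it concedes only the first round and then withholds cooperation, so persistent defectors end up near the mutual-defection payoff instead of free-riding.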

Now, it's true that if you're really on top of things you can defect to your heart's content. However, you just met aliens. So what you're actually espousing, behaviorally, is that more powerful aliens can eat you if it makes them really, really happy - you're placing yourself in the defector category. If you ever meet more powerful aliens, that may be risky; in fact, you should preemptively try to guess what kinds of rules these aliens would expect arbitrary aliens to live by, and live by them.

Probably "don't eat weaker participants" is not a rule you want to be exempt from...

¹ In our current society we don't always live by these principles, but we're also in the tail end of a transition away from belief in a higher power, so we're in an unusually defecty stage. It's unclear how representative that is.

Replies from: Bunthut, None
comment by Bunthut · 2019-11-04T11:31:18.998Z · LW(p) · GW(p)
we use something called the "Rawlsian veil", which prevents temporary imbalances in power from skewing the outcomes
Power imbalances are common in human society.

But many of the power imbalances found in human society are not at all temporary. For instance, if the player deciding didn't vary randomly, but instead triangles always got to decide over squares, then while rule of law might still develop internally, it's not clear what interest the triangles have in rectifying the inter-gonal situation. But we still (claim to) regard it as moral for that to happen. The babyeaters seem to be in exactly such a situation: any adult eating the babies will never be a baby again. Further, the adults are almost certain to succeed in eating them, after which the babies will never grow big and maybe become a threat some day.
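
This objection can be made concrete with a small expected-payoff sketch (illustrative numbers, not from the thread): the veil argument bites only when the role assignment is random. If one party sets the rule with probability 1, the exploitative rule strictly dominates from that party's point of view, so self-interest alone no longer pushes toward equal treatment.

```python
# Bunthut's point as a toy calculation: the veil relies on randomness.
# Expected share of the surplus for a "triangle", as a function of the
# rule it would set and its probability of being the one who sets it.

SURPLUS = 10.0

def triangle_value(setter_share, p_being_setter):
    """Expected payoff when you set the rule with probability p_being_setter."""
    other_share = SURPLUS - setter_share
    return p_being_setter * setter_share + (1 - p_being_setter) * other_share

# Random roles: grabbing everything gains the triangle nothing in
# expectation, since it might land on the other side of the rule.
print(triangle_value(10.0, p_being_setter=0.5))  # 5.0
print(triangle_value(5.0,  p_being_setter=0.5))  # 5.0

# Fixed roles: triangles always decide, so grabbing everything pays.
print(triangle_value(10.0, p_being_setter=1.0))  # 10.0
print(triangle_value(5.0,  p_being_setter=1.0))  # 5.0
```

With fixed roles, nothing internal to the payoffs makes the triangles prefer the fair rule - which is why the permanently stronger party (the adult babyeaters, here) needs some other reason to honor it.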

comment by [deleted] · 2019-11-03T19:06:17.172Z · LW(p) · GW(p)

.

Replies from: gilch
comment by gilch · 2019-11-03T20:52:07.331Z · LW(p) · GW(p)

You can't persuade a rock that cheesecake tastes better than dirt, no matter how clever your arguments. Not every possible mind might agree with us, even in principle; there are No Universally Compelling Arguments [LW · GW].

We are born already in motion [LW · GW] with godshatter [LW · GW] instincts suited to our stone-age environment of evolutionary adaptedness. This part is not simply arbitrary: it had the requisite survival value, or we would not be here talking about it. We are then acculturated into our surrounding society. This part is learned, but not entirely arbitrary either, because culture itself evolves and is subject to selection pressures.

To whatever extent we judge our society to be suboptimal, we must use our evolved/acculturated minds to do it. What else could we use? But there are reasons we are the way we are, principles that are not simply random.

comment by Pattern · 2021-06-27T19:52:20.665Z · LW(p) · GW(p)

They should probably start with "how did you evolve to not care about anyone at all? Your kids? Like how did that happen?"