The Involuntary Pacifists

post by Capybasilisk · 2023-01-06T00:28:49.109Z · LW · GW · 3 comments

This might be an idea better put to r/WritingPrompts (or ChatGPT), but a scenario has occurred to me which might count either as a near-utopia or an extreme dystopia, depending on your point of view.

Imagine a future Earth where a great power (this can be a superintelligent AI, a benevolent alien race, or just straight-up God, depending on your preferences) has imposed a version of the libertarian harm principle on humanity. Most actions a human can engage in today would still be available to you; you just wouldn't be able to commit an act of violence against another human. Sure, you might have an extremely strong desire to harm a human, but try as you might, your brain just can't send the signals to the rest of your body to pull that trigger/prepare that device/program that killbot.

We should also define "harm" very narrowly here. A person scamming an elderly lady out of her life savings is definitely harm, and so is hurling racial epithets at someone, but I think we should reserve the word here for the most serious of crimes: those that involve a violation of bodily integrity. Murder, assault, rape, and forcible confinement. What would this world where you can't do violence to a fellow human be like? Getting your head bashed in with a tire iron during a road rage incident would be something you read about in historical fiction. Same with war. You can gather all the men and materiel you possibly can, but you simply wouldn't be able to use them. Seems pretty utopian to me, though I'm aware important details are sometimes hidden from cursory glances at a possible world [LW · GW]. A possible problem: how do you punish the guy who scammed the grandma? Taking him into custody would necessarily involve acts of violence no longer available to humans.

We could even limit this to non-consensual violence. All you MMA types, having formally agreed to accept violence from an opponent, would be free to punch each other's faces in. You could even have voluntary wars, where all the people who really miss the sound and fury of battle can meet up in central Australia or somewhere and have at it, leaving the rest of humanity in peace.

It would be interesting to see how conflict among humans evolves under these conditions. Suppose two primitive tribes are competing for a scarce water source. One tribe comes to take it from the other by force, and then, I guess, they stand around with constipated expressions before giving up and going home. But that means the tribe already there gets the precious water just by having got there first. Is that fair?

I also don't see this as a violation of free will, as you would still have the desire to harm, just not the freedom to act on it, the same way a prisoner, though confined, retains the overarching desire to escape at the first opportunity.

So what detail would reveal this prospective utopia to be, in fact, a hellish dystopia?

3 comments


comment by nim · 2023-01-06T06:04:55.370Z · LW(p) · GW(p)

Have you ever needed to take a test with such strict anti-cheating measures that the anti-cheating measures themselves inspired you to want to get away with cheating, even though you have no need or desire to cheat on the actual test? That's an emotional reaction that some, but perhaps not all, people feel toward such constraints. Since the reaction doesn't directly do harm, it's still allowed.

First off, surgery is gonna be a problem. My mother and I are both only alive today because a surgeon violated her bodily integrity to get me out safely. If the prohibition is limited to non-consensual harm, we would have been okay as long as my mother had consented to the surgery and/or been conscious when it became clear the surgery was needed.

During some training recently, I got to work a few rotations in an emergency room, and one memorable moment was when some nurses let me tag along to watch an endoscopic surgery to physically remove a clot from a stroke patient's brain. The cath lab team was waiting anxiously at the emergency room doors for the ambulance containing the unconscious patient to arrive. One of the paramedics on the ambulance crew happened to know the patient, so he handed me the ventilator he was carrying and got a family member of the patient on the phone so that an administrator from the cath lab could get the appropriate consent for treatment from the patient's family. Patient, gurney, ventilator, IVs, medics, nurses, student, and a great noodliness of tubes rolled at almost a sprint down the hallway, into the elevator, along the next hall in a flurry of getting hairnets out of nurses' pockets and onto everyone's heads, and into the operating theater. The vent tech swapped the patient off the device I was holding onto a bigger and fancier machine to do a similar job, and all us field folks helped to transfer the electronically-animated body from the gurney to a slender padded table in the midst of the tangle of alien-looking machines. The ambulance crew got back to work, but those of us with time to stay and watch were shepherded into a radiation-safe viewing area as the system spun to life. I stayed out from underfoot and watched as they made an incision near the patient's groin and, guided by imaging every second or two, gently threaded what's essentially a miniature drain snake through the circulatory system into the brain. They found the clot and painstakingly removed it, piece by tiny little piece, to restore circulation. I don't know what became of that patient afterwards, but the procedure I witnessed gave her a chance at recovery that someone with the same ailment even a few decades ago could never have enjoyed.

Could your pacifists have performed that surgery?

The patient was unconscious, and thus couldn't consent to having her circulatory system violated by a meter of bendy metal rod. Would that have stayed the surgeon's hand?

Perhaps the surgeon's intent, a desire to save life and certainty that the procedure is in the patient's best interests, grants consent within the system you're proposing. If that's the case, a villain need only run their own medical "school": train surgeons who deeply and sincerely believe that certain procedures are the best thing for patients in certain circumstances. What if, on the very day I saw that surgery, a clinical trial somewhere had just completed research proving that the intervention I saw was ultimately worse for patients than another, so that some people in the world knew with certainty it was the wrong thing to do, while other people knew with certainty it was right? How would your system handle such disparities? You specified that lying is not forbidden, so our villain's school can lie to its students easily. Hone your understanding of exactly where the enforcement system draws the line, and soon you're churning out soldiers who are religiously certain that their bullets are tools of peace and kindness, rescuing recipients from the horrifying years of senescence, cancer, or dementia that they'd eventually have to suffer if someone were so cruel as to keep leaving them alive.

In the modern legal system, the patient's immediate family was able to consent to the procedure on her behalf. Is that how yours would work, too? If so, a mentally ill parent who hates their child, yet can consent on that child's behalf, could "formally accept violence from an opponent" for the child, allowing anyone to do that violence unto the child.

OK, let's patch that loophole by saying that all children count as individuals, and themselves can do no harm but must consent for themselves. A conscious child with a broken limb won't want to permit any adult to set that break. And at what age does the prohibition upon harm start? At the very genesis of reproduction, the placenta parasitizes the womb, violating the bodily integrity of the human who's carrying it. Could a willing mother consent to a pregnancy at all, if the prospective future child lacks an individual identity and the capacity to understand that consent-to-implant at the time it's needed? Would there be a particular moment in pregnancy, perhaps when the foetal brain is sufficiently formed for the system to begin acting upon it, at which the prospective future child suddenly starts counting as a human? If not -- if your system acts on the very atoms which will eventually become an adult -- we get into some profoundly quirky edge cases about how, say, a kernel of corn gets eaten by a chicken, and its molecules become an egg, and the egg gets eaten by a prospective future parent, and those molecules are used by the human body to make reproductive cells that eventually become a baby, but only molecules from certain parts of the egg, and so forth. Wild.

Back to the unconscious individual. Can some proxy consent to a violation of bodily integrity on her behalf, or nah? If the proxy can consent to violation, the proxy can behave maliciously. Consent is just speech, and you've explicitly stated that speech which leads to harm, like scamming Grandma out of house and home so that she freezes to death in the winter streets, is still permitted. This raises the question: is it forbidden to painlessly place someone into a state of unconsciousness from which everyone, at the time, fully expects them to arise unharmed? Actually, we don't even need that, because it isn't forbidden to trick people. If you poison the victim's favorite food and leave that food where you know they'll see it and expect they'll consume it, you're not really harming them, are you? They had the choice not to partake of it; they consented by eating it, after all... If poisoning is forbidden, where's the line at what constitutes poisoning? Does sufficiently bad cooking, such as using unsafe food handling practices, count? Does it count only if safer food handling practices are known and available? (If knowledge is relevant to the workings of the system, does suppression of relevant knowledge count as doing harm?) Introducing E. coli or salmonella to an individual's system via their food causes the recipient to at least feel harmed by that action, as anyone who's simulated an all-too-creative multi-part fountain during a bout of food poisoning can attest. If someone contaminates a piece of food without being certain it'll be eaten, and another person serves that food without being certain it was contaminated, is the prohibition on harm breached by one, both, or neither?

And if no proxy can consent to a necessary violation of integrity, that stroke patient's treatment would be halted. Congratulations: no human could have done it, but the pacifism enforcer's mandated inaction has ended a life that might otherwise have been saved.

editing to add -- P. S., if your system tries to work around this issue by prohibiting inaction in certain circumstances, that's basically slavery with extra steps.

comment by Viliam · 2023-01-06T21:34:14.276Z · LW(p) · GW(p)

Depending on the exact rules, people would find ways to hurt others that the rules do not see.

For example, a group of people can surround a victim and prevent him/her from leaving. The victim cannot push them away (that would be initiating violence, technically). If you have enough bullies, they can switch places regularly, so each bully gets a break. The victim starves to death, "peacefully".

It might be interesting to watch the wars, where the defending soldiers hold hands, forming a chain around the entire country, and the attacking soldiers try to climb over their heads or dig under the ground.

comment by Shmi (shminux) · 2023-01-06T04:39:07.351Z · LW(p) · GW(p)

Have you heard of emotional abuse?