Valuing Sentience: Can They Suffer?

post by jefftk (jkaufman) · 2013-07-29T12:39:04.481Z · LW · GW · Legacy · 29 comments

In the recent discussions here about the value of animals, several people have argued that what matters is "sentience", or the ability to feel. This goes back at least to Bentham: "The question is not, Can they reason? nor, Can they talk? but, Can they suffer?"

Is "can they feel pain" or "can they feel pleasure" really the right question, though? Let's say we research the biological correlates of pleasure until we understand how to make a compact and efficient network of neurons that constantly experiences maximum pleasure. Because we've thrown out nearly everything else a brain does, this has the potential for orders of magnitude more sentience per gram of neurons than anything currently existing. A group of altruists intend to create a "happy neuron farm" of these: is this valuable?  How valuable?

(Or say a supervillain is creating a "sad neuron farm". How important is it that we stop them? Does it matter at all?)

29 comments

Comments sorted by top scores.

comment by Eneasz · 2013-07-29T16:24:23.442Z · LW(p) · GW(p)

Can we taboo "Suffer"? Because at this point I'm not even sure what that means. Is it "a biological signal that identifies damage"? That seems too simple, because most sophisticated machines can detect damage and signal it, and we don't particularly worry ourselves about that.

Catch-22 re God & pain:

Oh, He was really being charitable to us when He gave us pain! Why couldn't He have used a doorbell instead to notify us, or one of his celestial choirs? Or a system of blue-and-red neon tubes right in the middle of each person's forehead. Any jukebox manufacturer worth his salt could have done that.

Listening to RadioLab, I heard them describe a wasp whose midsection had been accidentally crushed. As it was dying it began to eat its own viscera, likely because it detected a rich food source and began executing the standard action for being in the presence of a rich food source. It was at this point that I finally intuitively understood that insects are simply biological replicating machines. I can no longer think of them as feeling anything akin to suffering, merely damage-avoidance subroutines.

It seems we're concerned about the capacity of a mind to experience something it wants to avoid. Doesn't that imply that the complexity of the mind is a factor?

Replies from: DanielLC
comment by DanielLC · 2013-07-29T22:51:26.050Z · LW(p) · GW(p)

Can we taboo "Suffer"? Because at this point I'm not even sure what that means.

We cannot, for the same reason we can't taboo consciousness. None of us are sure what it means.

All I can say is that it's the sucky part of consciousness.

Replies from: aelephant
comment by aelephant · 2013-07-29T23:55:57.296Z · LW(p) · GW(p)

It sucks to experience it personally, but maybe it serves an evolutionary purpose that we don't yet fully understand & eliminating it completely would be a mistake?

Replies from: ThisSpaceAvailable, DanielLC, Desrtopa
comment by ThisSpaceAvailable · 2013-08-01T02:51:21.482Z · LW(p) · GW(p)

"It serves an evolutionary purpose" and "eliminating it completely would be a mistake" are two completely different claims. While there is correlation between evolutionary purposes and human purposes, the former has no value in and of itself.

comment by DanielLC · 2013-07-30T02:25:00.903Z · LW(p) · GW(p)

It serves an evolutionary purpose that's pretty obvious and eliminating it entirely would cause a lot of problems. We can still find a way to improve the status quo though. We didn't evolve to maximize net happiness, and we're going to have to do things we didn't evolve to do if we want to maximize it.

comment by Desrtopa · 2013-08-01T03:28:56.569Z · LW(p) · GW(p)

I think we already have more than an inkling of the usefulness of suffering over warning signs which are less burdensome to experience. It can be awfully tempting to override such warning signs when we can.

Imagine a group of hunters who're chasing down a valuable game animal. All the hunters know that the first one to spear it will get a lot of extra respect in the group. One hunter who's an exceptional runner pulls to the head of the group, bearing down on the animal... and breaks a bone in his leg.

In a world where he gets a signal of his body's state, but it's not as distressing as pain is, he's likely to try to power on and bring down the game animal. He might still be the first to get a spear in it, at the cost of serious long term disability, more costly to him than the status is valuable.

The hunter's evolutionary prospects are better in a world where the difficulty in overriding the signal is commensurate with the potential costs of doing so. If attempting to override such signals were not so viscerally unpleasant, we'd probably only be able to make remotely effective tradeoffs on them using System 2 reasoning, and we're very often not in a state to do that when making decisions regarding damage to our own bodies.

comment by Lumifer · 2013-07-29T19:20:49.270Z · LW(p) · GW(p)

Let me offer a similar scenario that has the advantage of reality: we can implement it right now without waiting for future research.

We know where the pleasure centers of rats are. We can implant electrodes into these centers and stimulate them, leading to the rats being in a more or less perpetual state of ecstasy.

We can right now create Happy Rat Farms where rats' brains are electrically stimulated to experience lots and lots of great pleasure.

Is it valuable to create Happy Rat Farms?

Replies from: ChristianKl, army1987
comment by ChristianKl · 2013-07-31T13:15:36.903Z · LW(p) · GW(p)

Or alternatively:

Should we wirehead those rats we use for toxicity testing of new medicaments?

comment by A1987dM (army1987) · 2013-07-30T12:58:40.562Z · LW(p) · GW(p)

We can right now create Happy Rat Farms where rats' brains are electrically stimulated to experience lots and lots of great pleasure. [emphasis in the original]

Sure we could, but how much would it cost us? Isn't there anything better we could do with the same amount of resources?

comment by Mestroyer · 2013-07-30T02:01:34.376Z · LW(p) · GW(p)

If Omega explained it was about to take out its super-scalpel and give me an incredibly precise lobotomy, which would take away some abilities of my mind, but not all, and there was nothing I could do to escape it, and afterwards Omega would poke the remnant of me with hot irons for a few days before killing me, but I could pay in advance to escape the hot irons, and the same offer was given to everyone, regardless of what Omega had predicted that they would choose...

If the lobotomy would take away my ability to appreciate complex forms of beauty, humor, and camaraderie, or my ability to form or comprehend English sentences, my ability to contribute to society, my ability to organize my experiences into narratives, my ability to write and be persuaded by arguments like this one, my sense of a morality and inclination to act upon it, or my ability to design tools, I would still pay not to be poked with hot irons.

But if I was told that the lobotomy would take away my ability to suffer (And Omega said that by "suffer," it meant whatever familiar yet unidentified processes in my brain I previously attached that word to), I wouldn't care about the hot irons.

comment by Pablo (Pablo_Stafforini) · 2013-07-29T13:19:55.059Z · LW(p) · GW(p)

Is "can they feel pain" or "can they feel pleasure" really the right question, though? Let's say we research the biological correlates of pleasure until we understand how to make a compact and efficient network of neurons that constantly experiences maximum pleasure. Because we've thrown out nearly everything else a brain does, this has the potential for orders of magnitude more sentience per gram of neurons than anything currently existing. A group of altruists intend to create a "happy neuron farm" of these: are they awesome and inspiring or misguided and creating nothing of value?

I think this is a false dilemma. I don't find that scenario "awesome", but I do believe it would be creating something of value. The reason I believe this is that, when I experience intense pleasure, I can apprehend that these experiences ought to exist in the universe, by virtue of what they feel like. Filling the universe (or a portion of it) with these experiences is therefore a great thing, regardless of how "awesome" or "inspiring" I happen to find it.

Replies from: jkaufman
comment by jefftk (jkaufman) · 2013-07-29T14:02:21.695Z · LW(p) · GW(p)

I've edited the post to just ask the simpler question of whether this is valuable.

comment by Eneasz · 2013-07-29T16:28:57.176Z · LW(p) · GW(p)

Only tangentially related, this reminds me of a flash-fiction about vast farms of neuron-less meat, and the conundrum a humane society faces when some of those slabs of meat develop neurons. Great story, only 1000 words. Neither Face Nor Feeling.

comment by beth · 2013-08-02T05:24:55.469Z · LW(p) · GW(p)

Suffering is an emotional state triggered by desire. Desire is the attachment of value to imagined experiences.

So there's a minimal level of consciousness required to experience suffering, and a neuron farm probably doesn't meet it; that's why it's not morally significant. What sorts of organisms do meet it is another matter.

comment by Locaha · 2013-07-29T18:21:18.844Z · LW(p) · GW(p)

Can you suffer? Can you prove it?

Replies from: Rukifellth, mwengler
comment by Rukifellth · 2013-07-30T21:43:30.373Z · LW(p) · GW(p)

I'd like to take a break from LW's tradition of simply downvoting and ignoring questions.

Asking to prove if we suffer is like asking us to prove that we ask questions. We point at the experience and say "this is suffering" or "this is the act of asking the question".

Replies from: asr, ThisSpaceAvailable, Locaha
comment by asr · 2013-07-30T23:06:51.900Z · LW(p) · GW(p)

Yes, and that works reasonably well for most humans, since we grow up assuming that other humans work similarly to us. But as soon as you ask whether an animal or a computer program or a fetus suffers, intuition stops being very reliable.

Consider:

A) A life-like humanoid robot with a relatively simple control program that makes pained faces and says "ouch" in response to certain stimuli.

B) An uploaded human who is experiencing simulated torture.

C) An actual biological human whose motor nerves have been paralyzed, being tortured. The subject's face and breathing are placid.

Which of these would a casual outside observer believe were experiencing suffering? How would an expert convince the casual observer otherwise?

Replies from: Rukifellth
comment by Rukifellth · 2013-07-30T23:47:11.908Z · LW(p) · GW(p)

But this wasn't the question he asked. He asked if

A) I could suffer

B) I could prove it

To answer your question, I'll just lazily say that it requires The Hard Problem to be solved first.

comment by ThisSpaceAvailable · 2013-08-01T02:57:23.575Z · LW(p) · GW(p)

Asking a question is an exterior act. It can be objectively verified. I don't see how that is analogous to suffering. Asking a question is not a qualia.

comment by Locaha · 2013-07-31T08:27:36.410Z · LW(p) · GW(p)

You can program a simple robot to point at something and say "this is suffering". Or teach a parrot to say it.

Replies from: Rukifellth
comment by Rukifellth · 2013-07-31T11:22:17.199Z · LW(p) · GW(p)

This is not the question you asked.

Replies from: Locaha
comment by Locaha · 2013-07-31T11:56:25.226Z · LW(p) · GW(p)

It was a rhetorical question. You can't prove you can suffer, because suffering is a qualia.

What you can prove is you have the means to retaliate against people who make you suffer. While a chicken can't.

And thus a social contract was eventually created between people, but not between people and chickens.

Replies from: ThisSpaceAvailable, Rukifellth
comment by ThisSpaceAvailable · 2013-08-01T03:27:55.040Z · LW(p) · GW(p)

Actually, chickens can retaliate against those that make them suffer. If they don't like how they're being treated, they can run away. So farmers have to either make sure the chickens like how they're being treated, or make sure to have good fencing. The reason we don't have contracts with chickens is because chickens don't have the intelligence to form contracts.

Replies from: Desrtopa
comment by Desrtopa · 2013-08-01T03:53:25.798Z · LW(p) · GW(p)

So farmers have to either make sure the chickens like how they're being treated, or make sure to have good fencing.

It has most likely been easier to accomplish some form of the latter than the former since we first domesticated them. Of course, being kept in fences would be a huge step up for modern farmed chickens.

comment by Rukifellth · 2013-07-31T12:16:57.410Z · LW(p) · GW(p)

You suggest that since it's possible to mimic pain, it's impossible to tell if pain is genuine by the signs it leaves, yes?

But that's giving up rather early. Mimicry is imperfect, in that there is no such thing as an entity which is the same in every way as every other entity except for having to mimic the feeling of suffering. We can look at people with antisocial personality disorder and notice that, while they do feel pain, they lack any feeling of grief, shame, or regret. And even though they mimic those feelings well, other traits betray them, such as impulsiveness, frustration, and boredom.

You say a parrot can be trained to say "That makes me sad", but parrots will also show physiological signs of suffering.

Replies from: Locaha
comment by Locaha · 2013-07-31T12:59:29.038Z · LW(p) · GW(p)

Suffering, not pain.

Replies from: Rukifellth
comment by Rukifellth · 2013-07-31T16:47:39.092Z · LW(p) · GW(p)

And?

comment by mwengler · 2013-07-31T18:21:45.792Z · LW(p) · GW(p)

Can you suffer? Can you prove it?

I can prove to myself that I suffer. I can prove to myself that other people suffer by a preponderance of the evidence, but not in a deductive fashion.

comment by [deleted] · 2013-07-29T14:15:05.602Z · LW(p) · GW(p)

Do I enjoy its taste and texture? Pig, yes; octopus, no.