An argument that animals don't really suffer
post by Solvent · 2012-01-07T09:07:53.775Z · LW · GW · Legacy · 86 comments
I ended up reading this article about animal suffering by the Christian apologist William Lane Craig. Forgive the source, please.
In his book Nature Red in Tooth and Claw, Michael Murray explains on the basis of neurological studies that there is an ascending three-fold hierarchy of pain awareness in nature:
Level 3: Awareness that one is oneself in pain
Level 2: Mental states of pain
Level 1: Aversive reaction to noxious stimuli
Organisms which are not sentient, that is, have no mental life, display at most Level 1 reactions. Insects, worms, and other invertebrates react to noxious stimuli but lack the neurological capacity to feel pain. Their avoidance behavior obviously has a selective advantage in the struggle for survival and so is built into them by natural selection. The experience of pain is thus not necessary for an organism to exhibit aversive behavior to contact that may be injurious. Thus when your friend asks, “If you beat an animal, wouldn't it try to avoid the source of pain so that way 'it' wouldn't suffer? Isn't that a form of 'self-awareness?'," you can see that such aversive behavior doesn’t even imply second order pain awareness, much less third order awareness. Avoidance behavior doesn’t require pain awareness, and the neurological capacities of primitive organisms aren’t sufficient to support Level 2 mental states.
Level 2 awareness arrives on the scene with the vertebrates. Their nervous systems are sufficiently developed to have associated with certain brain states mental states of pain. So when we see an animal like a dog, cat, or horse thrashing about or screaming when injured, it is irresistible to ascribe to them second order mental states of pain. It is this experience of animal pain that forms the basis of the objection to God’s goodness from animal suffering. But notice that an experience of Level 2 pain awareness does not imply a Level 3 awareness. Indeed, the biological evidence indicates that very few animals have an awareness that they are themselves in pain.
Level 3 is a higher-order awareness that one is oneself experiencing a Level 2 state. Your friend asks, “How could an animal not be aware of their suffering if they're yelping/screaming out of pain?" Brain studies supply the remarkable answer. Neurological research indicates that there are two independent neural pathways associated with the experience of pain. The one pathway is involved in producing Level 2 mental states of being in pain. But there is an independent neural pathway that is associated with being aware that one is oneself in a Level 2 state. And this second neural pathway is apparently a very late evolutionary development which only emerges in the higher primates, including man. Other animals lack the neural pathways for having the experience of Level 3 pain awareness. So even though animals like zebras and giraffes, for example, experience pain when attacked by a lion, they really aren’t aware of it.
To help understand this, consider an astonishing analogous phenomenon in human experience known as blind sight. The experience of sight is also associated biologically with two independent neural pathways in the brain. The one pathway conveys visual stimuli about what external objects are presented to the viewer. The other pathway is associated with an awareness of the visual states. Incredibly, certain persons, who have experienced impairment to the second neural pathway but whose first neural pathway is functioning normally, exhibit what is called blind sight. That is to say, these people are effectively blind because they are not aware that they can see anything. But in fact, they do “see” in the sense that they correctly register visual stimuli conveyed by the first neural pathway. If you toss a ball to such a person he will catch it because he does see it. But he isn’t aware that he sees it! Phenomenologically, he is like a person who is utterly blind, who doesn’t receive any visual stimuli. Obviously, as Michael Murray says, it would be a pointless undertaking to invite a blind sighted person to spend an afternoon at the art gallery. For even though he, in a sense, sees the paintings on the walls, he isn’t aware that he sees them and so has no experience of the paintings.
Now neurobiology indicates a similar situation with respect to animal pain awareness. All animals but the great apes and man lack the neural pathways associated with Level 3 pain awareness. Being a very late evolutionary development, this pathway is not present throughout the animal world. What that implies is that throughout almost the entirety of the long history of evolutionary development, no creature was ever aware of being in pain.
He continues the argument here.
How decent do you think this argument is? I don't know where to look to evaluate the core claim, as I know very little neuroscience myself. I'm quite concerned about animal suffering, and choose to be vegetarian largely on the basis of that concern. How much should my decision on that be affected by this argument?
EDIT: David_Gerard wins by doing the basic Google search that I neglected. It seems that the argument is flawed. In particular, animals other than primates also have prefrontal cortices.
Comments sorted by top scores.
comment by David_Gerard · 2012-01-07T09:28:02.954Z · LW(p) · GW(p)
Voted up for giving us an argument to chew on that's both important and terrible :-)
The punchline is, of course, "and therefore God exists." Craig is trying to solve theodicy here - he's trying to show that animal suffering doesn't exist, and therefore doesn't count as God allowing evil.
The obvious Google search turns up a string of refutations of Craig's argument and, indeed, his bogus neuroscience. This one and this one go over a pile of the obvious errors. PZ Myers, who, as well as being an obnoxious atheist sceptic, just happens to be a professor of developmental biology, gets stuck into both Craig's bad science and his odious ethics. (For the philosophy, Myers also points out that Craig has just made an argument in favour of freedom of abortion. Philosotroll notes that Craig's argument rejects dualism: "Does God then have a prefrontal cortex?")
Also, the mirror test is interesting.
In general, if William Lane Craig publicly says the sky is blue, he's going to follow it with "and therefore God exists."
Related: the Discovery Institute (the organisation formed to push Intelligent Design; Craig is a Fellow of the DI) has started a newsletter called The Human Exceptionalist. DI and Craig both have a religious requirement of humans being a different kind to any other animal, despite the ever-increasing mountains of data on ways in which this just isn't the case. "Human exceptionalism" is apparently the new marketing slogan. Like "theistic evolution", it's creationism with a funny hat on.
Replies from: DanielLC, David_Gerard, Solvent↑ comment by DanielLC · 2012-01-07T17:37:13.206Z · LW(p) · GW(p)
I notice both of the objections to this mention that they don't like the implications (animal "cruelty" is okay) as if it's part of their counter-argument. That's hardly relevant. You might as well argue that animals don't feel pain because that would imply there's no omnipotent, omnibenevolent god.
Also, they point out that other animals have prefrontal cortices. This would mean the argument applies to fewer animals than it claims, but it would still imply that many animals do not feel pain.
Replies from: bogus, David_Gerard↑ comment by bogus · 2012-01-07T17:50:47.419Z · LW(p) · GW(p)
I notice both of the objections to this mention that they don't like the implications (animal "cruelty" is okay) as if it's part of their counter-argument. That's hardly relevant.
Animal cruelty could be a case where our evolved intuitions mislead us: we feel as if animals are suffering and empathize with them (even though they aren't) because they are visibly in pain. In fact, I assume that most people would feel some mild aversion to animal "cruelty" even if they knew with certainty that the animals in question lack sensory awareness, because our evolved intuitions cannot be overridden without some effort.
Replies from: David_Gerard, NancyLebovitz↑ comment by David_Gerard · 2012-01-07T18:09:17.490Z · LW(p) · GW(p)
In fact, I assume that most people would feel some mild aversion to animal "cruelty" even if they knew with certainty that the animals in question lack sensory awareness, because our evolved intuitions cannot be overridden without some effort.
This is why cruelty to animals is useful as an indicator of sociopathy in humans.
Replies from: CronoDAS↑ comment by CronoDAS · 2012-01-08T05:30:39.300Z · LW(p) · GW(p)
I wonder if torturing Sims is also correlated with sociopathy?
Replies from: MixedNuts, wedrifid↑ comment by NancyLebovitz · 2012-01-07T22:09:31.159Z · LW(p) · GW(p)
In fact, I assume that most people would feel some mild aversion to animal "cruelty" even if they knew with certainty that the animals in question lack sensory awareness, because our evolved intuitions cannot be overridden without some effort.
There's a lot of cultural variation there -- the animal fights in the Roman Colosseum, bull-fighting, and bear-baiting are all examples of culturally supported use of animal suffering as part of entertainment.
Replies from: Solvent↑ comment by David_Gerard · 2012-01-07T18:11:19.355Z · LW(p) · GW(p)
Yes, they're more "this is bogus science and I am disgusted by his conclusions from the bad science" than they are robust philosophical argumentation. As I note above, this is philosophically tainted and not the strongest refutation to give to those who might be convinced by Craig's argumentation; I assume they're assuming their readers are familiar with Craig and his habit of starting with the bottom line.
The trouble with the question "does a given animal feel pain?" is which particular usage of the words "feel" and "pain" is being applied.
↑ comment by David_Gerard · 2012-01-07T09:59:08.475Z · LW(p) · GW(p)
One commenter on PZ Myers' post notes that the argument that animals don't feel pain as humans know it is not at all original with Craig:
In my theology classes in high school they tried REALLY HARD to drill it into us that animals do NOT feel real pain, do NOT feel real fear, and do NOT think in any way whatsoever. They were "one step above robots."
As someone who had grown up with animals, this upset and confused me, and I didn’t understand why people stared blankly at me like I was some kind of moron when I said of COURSE dogs can reason and learn, haven’t you ever seen a dog?!
I'm not aware of the history of the argument - anyone else familiar with it? Another commenter notes the similarity to the claim that humans, even severely brain-damaged ones, have souls, and smart animals just don't.
Replies from: JonathanLivengood↑ comment by JonathanLivengood · 2012-01-07T12:00:52.891Z · LW(p) · GW(p)
I don't know all of the ins and outs of the literature, but the basic problems here go back at least to Bentham and Mill, who had a dispute about kinds of pleasure and pain. Bentham took the view that all pains and pleasures were on the same footing. A human appreciating a work of art is no different from a pig appreciating a good roll in the mud. Mill took the view that pains and pleasures had more internal structure. Of course, for both Bentham and Mill, pain played a big part in the moral calculus. General concern about the moral standing of animals goes back a lot further: Descartes, for example, claimed that we have a moral certainty that animals have no souls -- otherwise, we couldn't eat them -- but it's not clear to me whether he connected this to pain.
More recently, the debate seems to be about the degree to which an analogical argument works that takes us from human pain to animal pain. See, for examples, an older article by Singer (excerpts only) and a newer article by Allen et al (pdf). But for most of these people, the issues are not theological.
Replies from: David_Gerard↑ comment by David_Gerard · 2012-01-07T15:39:36.098Z · LW(p) · GW(p)
Thank you :-) I meant in particular religious pedagogy really pushing the point, but I suppose that follows from the sort of backlash against the Enlightenment that inspired Fundamentalism.
↑ comment by Solvent · 2012-01-07T10:05:40.304Z · LW(p) · GW(p)
I've just glanced at these, I'll read them properly in a second.
My preliminary concern about those two rebuttals is that they seem to be arguing based on the punchline. I think it's obvious that whether or not animals feel pain is pretty irrelevant to arguments about God's existence, personally. So it icks me that both the posts mention this before mentioning actual factual arguments.
Replies from: David_Gerard↑ comment by David_Gerard · 2012-01-07T10:10:15.546Z · LW(p) · GW(p)
Fair enough. I will note that, although tainted in terms of pure philosophy, refutations of Craig that start at his punchline may well be quite reasonable given Craig's long history of always starting with the bottom line.
You stated a concern with the neuroscience, which Myers addresses pretty well.
Replies from: Solvent↑ comment by Solvent · 2012-01-07T10:14:02.711Z · LW(p) · GW(p)
I don't think that I agree. Jumping to the bottom line is always a problem, especially in cases like this, where the dispute doesn't even really affect any debate about God's existence.
Theism and atheism can both easily explain animals suffering and not suffering. I don't think that Craig even considers this to be a particularly strong argument in favor of Christianity. Both of those posts, particularly the second, used their (correct) disputation of the neuroscience as an argument against God. That's a sign of bad reasoning.
Like, for instance, the atheism.about.com page says Craig is "lying" about the prefrontal cortex thing, when it's far more likely he's mistaken.
I don't like either of those blog posts, even though they both raise a correct point.
Replies from: David_Gerard↑ comment by David_Gerard · 2012-01-07T10:24:41.154Z · LW(p) · GW(p)
Not "mistaken", but "doesn't care". Craig is starting with the bottom line; the presumption that he is not is useful philosophical hygiene when attempting a refutation, but is factually incorrect.
Replies from: JonathanLivengood↑ comment by JonathanLivengood · 2012-01-07T12:04:38.997Z · LW(p) · GW(p)
If you can get to the conclusion that God exists regardless of the facts, then of course, you will be indifferent to the facts. That is, I think, the big danger in reasoning to a foregone conclusion.
comment by XiXiDu · 2012-01-07T11:15:54.456Z · LW(p) · GW(p)
I don't feel like getting into a debate about preferences. But this comes up so often that I want to state my preferences for what it's worth.
I would deem it extremely tasteless if someone were to eat a raven, parrot, orca, or octopus when they could just as well survive by eating other, lower animals. There are other examples of animals that show a lot of the characteristics we normally associate only with being human. I would further deem it extremely tasteless if someone were to torture animals just for fun - animals that can feel pain but might not be aware of it in the same sense that humans are. And if people argue in favor of those acts by claiming that I am biased and that my preferences depend on anthropomorphizing those animals, then I can only say that I believe they are overcompensating, and that I won't revisit my preferences until they can show me that a raven or parrot is no more affected by torture than Microsoft Word.
This is about preferences, about what we want. That's why I signal mine. And if you share those preferences but fool yourself into believing that they don't apply to animals because that's "biased", then you might be confused about what it means to be rational. Maybe some animals feel pain but are not aware of it and others don't even feel pain but just act like they do. So what? What if your preferences are about the signaling of pain rather than some mysterious ethical property called "pain"? You still won't enjoy watching animated cartoon animals being tortured even though they obviously don't feel pain.
The question should be, how much fun does that person have torturing that animal, how much do I care about how much fun that person has, how much do I care about the possible disutility of the animal and how much does it bother me, how much disutility do I gain because of it happening at all? If the answer has negative utility then the person should (should according to you) fucking stop torturing that animal or fucking die.
This has nothing to do with some mysterious concept called ethics, it's just about preferences and expected utility calculations.
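For concreteness, here is a minimal sketch of the kind of weighted-sum calculation being described; all function names, weights, and utility values are invented purely for illustration.

```python
# Hypothetical illustration of the weighted-utility sum described above.
# Every weight and utility value here is made up for the example.

def net_utility(torturer_fun, weight_on_their_fun,
                animal_disutility, weight_on_animal,
                my_disutility_from_it_happening):
    """Sum the terms the comment lists; a negative total means
    (by these preferences) the act should not happen."""
    return (weight_on_their_fun * torturer_fun
            - weight_on_animal * animal_disutility
            - my_disutility_from_it_happening)

# Example with arbitrary numbers: the total is -34.5, i.e. "stop".
print(net_utility(torturer_fun=5, weight_on_their_fun=0.1,
                  animal_disutility=50, weight_on_animal=0.5,
                  my_disutility_from_it_happening=10))
```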
Replies from: TheOtherDave↑ comment by TheOtherDave · 2012-01-07T15:49:48.301Z · LW(p) · GW(p)
Well, it has to do with ethics insofar as ethics is about preferences and expected utility calculations.
comment by Peter Wildeford (peter_hurford) · 2012-01-07T19:38:35.476Z · LW(p) · GW(p)
Craig responds to some criticism of his argument here. Craig agrees the question is theologically neutral, and then defends the existence of God through the standard Moral Argument, by appealing to the claim that atheists do not have a basis for moral obligations to animals (including humans), which is, of course, false.
I think that should clear up some of the misconceptions about what Craig thinks his argument is really doing -- it's not proving God directly, but rather answering an objection to Christianity based in the Problem of Evil, and responding with the Moral Argument.
...
That being said, I'd like to mention something else: there was something in his second response that really interested me. Craig says:
I’ve been surprised by the emotional reactions I’ve received to last week’s Question! It almost seems as if some atheists would actually prefer that animals experience terrible suffering than have to give up the objection to theism based on the problem of animal pain!
To me, this seems to suggest that whether or not animals actually suffer is dependent upon our personal beliefs about the issue -- as if, were we to give up the objection to theism, animals wouldn't have to suffer! I was wondering if anyone else saw this same belief-creates-reality approach.
Which is oddly contrasted by this other quote:
but what we find comfortable can’t be allowed to be a ring in the nose of science pulling it in the direction we prefer.
Replies from: David_Gerard
↑ comment by David_Gerard · 2012-01-07T20:49:38.124Z · LW(p) · GW(p)
Yeah, he's trying to solve theodicy rather than go straight to "and therefore, God" - removing an objection. That his follow-up is nonsensical doesn't help, though. He appears to be rather better at real-time debate rhetoric than argument on paper, where his opponent can consider and respond.
comment by [deleted] · 2012-01-07T20:24:16.317Z · LW(p) · GW(p)
When a bee is stuck flying against the window desperately trying to get free, I help it.
When a spider is in some place where I know it will starve to death or get crushed, I put it outside.
When an injured bird needs some time to pull itself together and avoid being eaten by the cat, I'll spend hours babying it.
As a human, I feel empathy for other beings, and I project a conscious sentient being on them. Even though I know that there is no such conscious being, it still gets constructed and empathized with, whether I like it or not. Faced with this, I have a choice:
1. Act on my feelings of empathy, thereby practicing the habit of doing the right thing, and using a bit of time.
2. Put on my murder face and ignore the imaginary suffering, thereby practicing moral indifference, to save a bit of time.
From a purely instrumental perspective, I think choosing #1 is a good idea. Practicing morality seems much better than practicing indifference, even if the practice situation is imaginary.
That's how I like to think about animal suffering.
Replies from: latanius, tgb, jhuffman↑ comment by latanius · 2012-01-07T22:29:49.433Z · LW(p) · GW(p)
As a human, I feel empathy for other beings, and I project a conscious sentient being on them
Couldn't agree more. It's one of those "subjective objectivity" questions: there probably isn't any objective definition of animal pain that feels right except for "they are in pain if they look like they are", using humans::looks_like, of course, but the fact that their pain is not an objective bad thing about the world doesn't make its existence better or less of a thing to avoid.
By the way, is there anyone else here inclined to also help stuck electric motors and fans? OK, that's really not pain (you can't really empathize with them), but still... somehow feels bad to just leave them there. Otherwise they would be... just... sad.
Replies from: None↑ comment by [deleted] · 2012-01-18T19:32:42.268Z · LW(p) · GW(p)
I doubt there's any objective definition of pain if you simply assume the subject in question isn't a reliable narrator (they could be a p-zombie, or faking it, or it could be entirely programmed behavior...), so yeah, at some point you have to go with the affect -- they look like they're in pain, they act like they're in pain, and sure, my judgement of that is biased by my own perspective as a human with certain brainbits that make that call, but they're making that call and that's got a direct impact on my own perceptions of the situation.
I often sympathize with machines and objects as well, BTW. >> But I'm like that.
↑ comment by tgb · 2012-01-08T02:48:19.561Z · LW(p) · GW(p)
I feel obligated to point out that a lot of help that people give to animals isn't helpful. It's even illegal in many places to take in wild animals without a specific license, despite your best intentions. If you're really worried about them, call the local authorities on the subject. Or donate to a wildlife charity. (are there any GiveWell-esque meta-giving sites for wildlife funds?)
Replies from: None↑ comment by [deleted] · 2012-01-08T18:58:29.260Z · LW(p) · GW(p)
I'll keep an injured bird warm and away from the cat (it was the cat that injured it), but calling animal control for a terrorized sparrow seems like a waste of everyone's resources.
Everything I do is illegal in some way or another so I've stopped taking that into account.
Donating to a wildlife charity misses the point. It's not about producing utilons, it's about maintaining empathy. Putting on my murder face to watch a bird get torn apart by the cat while planning to donate to a charity falls in the second category (from parent) of things I could do, except that it doesn't even save me time.
If it was a larger animal like a coyote or raccoon, then calling the animal people would make sense, but not for little birds.
Replies from: wedrifid↑ comment by wedrifid · 2012-01-09T03:48:47.410Z · LW(p) · GW(p)
I'll keep an injured bird warm and away from the cat (it was the cat that injured it), but calling animal control for a terrorized sparrow seems like a waste of everyone's resources.
The bird is probably going to die anyway. It's probably better just to kill it.
If it was a larger animal like a coyote or raccoon, then calling the animal people would make sense, but not for little birds.
They'll probably just euthanize it anyway. But yes, it can make you feel good if you don't know or think about what'll end up happening.
Replies from: None, None↑ comment by [deleted] · 2012-01-09T04:04:51.491Z · LW(p) · GW(p)
The bird seemed happy enough to be safe and recovering, and happy enough to be out again once it was released, but I think it did end up being eaten by the cat once it was back in the real world.
It's probably better just to kill it.
No! That meme feels terribly wrong to me, tho I have not worked out entirely why. It's probably a combination of the implication that you should kill someone who is being tortured, even if you had a chance of rescuing them, and the effects on your personality of killing something you have empathy for.
I've heard that murder only gets easier. I don't think I want that.
EDIT:
They'll probably just euthanize it anyway. But yes, it can make you feel good if you don't know or think about what'll end up happening.
Good point. I actually don't know what I would do for a larger injured animal. Helping it may be better than calling the death-squad. If there were a chance that it could live.
Replies from: wedrifid↑ comment by wedrifid · 2012-01-09T04:07:03.854Z · LW(p) · GW(p)
I've heard that murder only gets easier.
"Euthanize" sounds slightly better.
Replies from: jhuffman, None↑ comment by [deleted] · 2012-01-09T04:18:03.082Z · LW(p) · GW(p)
It's worth putting an appropriately strong word on death.
Replies from: Rain, wedrifid↑ comment by Rain · 2012-01-10T02:49:23.288Z · LW(p) · GW(p)
When I received a briefing from an Air Force pilot, he talked about how he "applied kinetic force" to "prosecute the target" rather than "shot missiles" to "kill people". I immediately noticed how useful that sort of language would be for psychological health when performing such actions.
This was long before the use of "kinetic military action" to describe our little war in Libya.
↑ comment by [deleted] · 2012-01-18T19:36:18.084Z · LW(p) · GW(p)
The bird is probably going to die anyway. It's probably better just to kill it.
Depends on the injury, actually. A broken wing -- yeah, that bird's not gonna last in the wild. A bite from a cat that misses vital organs and doesn't bleed out, or some scratching? It may well survive that if it can recuperate, clot up, and retain the ability to fly after. Living systems have this weird ability to, y'know, heal from damage inflicted as long as it's not too severe.
Everything's going to die eventually. So are all the people you might ever help. Should you just not help because in the long run everything's doomed to be recycled?
They'll probably just euthanize it anyway. But yes, it can make you feel good if you don't know or think about what'll end up happening.
I take it you're not a wildlife rehabilitator and don't know anyone who is? Because that's not the standard response to injured animals...
Replies from: wedrifid↑ comment by wedrifid · 2012-01-19T01:21:21.041Z · LW(p) · GW(p)
I take it you're not a wildlife rehabilitator and don't know anyone who is?
If you must know, my cynicism comes from my veterinarian sister, who spends a surprising amount of her time killing wildlife that well-intentioned but naive individuals have brought in to her or to wildlife nuts. At times she even has to bite her tongue and not tell them that if they had left the poor creature alone it probably would have lived, but now that they have caught it, it is going to die!
Because that's not the standard response to injured animals...
Injured animals like sparrows? I beg to differ. (I'm sorry, they don't get sent 'to the farm' or 'go to sparrow heaven' either!)
Replies from: None↑ comment by [deleted] · 2012-01-19T06:51:54.107Z · LW(p) · GW(p)
At times she even has to bite her tongue and not tell them that if they had left the poor creature alone it probably would have lived but now that they caught it it is going to die!
Sure. And she's a veterinarian, not a wildlife rehabilitator (person whose job it is to, oddly enough, rehabilitate injured wildlife for re-release).
Injured animals like sparrows?
In the bit that's a response to, you were talking about coyotes and raccoons, not sparrows.
Replies from: wedrifid↑ comment by wedrifid · 2012-01-19T12:54:18.910Z · LW(p) · GW(p)
In the bit that's a response to, you were talking about coyotes and raccoons, not sparrows.
Not actually true.
Replies from: None↑ comment by [deleted] · 2012-01-20T04:09:02.532Z · LW(p) · GW(p)
Someone said:
If it was a larger animal like a coyote or raccoon, then calling the animal people would make sense, but not for little birds.
You said:
They'll probably just euthanize it anyway. But yes, it can make you feel good if you don't know or think about what'll end up happening.
I said:
I take it you're not a wildlife rehabilitator and don't know anyone who is? Because that's not the standard response to injured animals...
So yes, actually true.
Replies from: wedrifid↑ comment by wedrifid · 2012-01-20T05:23:56.994Z · LW(p) · GW(p)
So yes, actually true.
No it isn't. The context is ambiguous. Not that it matters either way since I do maintain a substantial disagreement regarding the most common outcome for larger-than-sparrow-but-still-not-important creatures that token do-gooders try to intervene to rescue.
It would not seem controversial to suggest that neither of us are likely to learn anything from this conversation so I'm going to leave it at that.
↑ comment by jhuffman · 2012-01-09T21:57:27.698Z · LW(p) · GW(p)
I've heard it said that animal cruelty should be avoided for what it does to us as the perpetrator more than for what it is actually doing to the animal.
Replies from: wedrifid↑ comment by wedrifid · 2012-01-19T01:25:36.015Z · LW(p) · GW(p)
I've heard it said that animal cruelty should be avoided for what it does to us as the perpetrator more than for what it is actually doing to the animal.
For instance it gives people around us strong evidence that we may be sociopaths!
comment by MixedNuts · 2012-01-07T11:52:10.621Z · LW(p) · GW(p)
Citation needed for most claims, but the core distinction between avoidance, pain, and awareness of pain works. Systems that have negative reinforcement pathways exist even if not all invertebrates are examples. David Gerard points out that neurological similarities make animals almost certainly aware of their pain, but there can be (we could create) exceptions.
But why on Earth would we care about awareness of pain rather than just plain pain?
Years after that, I was in a similar situation and eventually asked a friend why my body kept acting like it was in pain. She responded that my body was in fact in pain, and that the reason I didn’t understand my own reactions was the dissociation that goes with severe chronic pain. And that nobody who wasn’t in pain would ask that question.
Amanda Baggs, The Summer Thing
Craig's argument implies that if we partially relieved Amanda's pain, it would be bad, because she'd be aware of her pain. That doesn't sound right.
Replies from: bogus↑ comment by bogus · 2012-01-07T15:50:40.020Z · LW(p) · GW(p)
From reading Baggs' post, it seems that she is only complaining about the side-effects of her pain-like reflexes, not about the reflexes or the "pain" itself. To the extent that animals have no self-awareness of pain, this would in fact support Craig's argument. And yes, if we could only partially relieve these reactions by increasing her awareness of them, that would be bad. Of course, there are in fact ways to mitigate one's self-awareness of pain, such as meditation.
Replies from: NancyLebovitz↑ comment by NancyLebovitz · 2012-01-07T22:34:52.706Z · LW(p) · GW(p)
Would this imply that if meditation can bring enough Level 3 pain down to Level 2, then meditators should count as less human? That people should have less concern about causing pain to skilled meditators?
Replies from: bogus↑ comment by bogus · 2012-01-08T00:50:02.430Z · LW(p) · GW(p)
Not sure where the "less human" part is coming from here. Should folks who can buy Aspirin and Tylenol (or even strong narcotic painkillers) at the local drugstore count as "less human" than folks who can't? Perhaps we should care less about them feeling pain, but the effect seems quite negligible. Also, my guess is that some animals at least do have awareness of pain, contra Craig; so using "human" here is very misleading.
Replies from: NancyLebovitz↑ comment by NancyLebovitz · 2012-01-08T09:27:45.904Z · LW(p) · GW(p)
If an important distinction between people and animals is that animals only take pain up to 2, then a human who perceives pain only (mostly?) at the 2 level might be more like an animal.
Analgesics aren't relevant to this argument because they eliminate or blunt pain rather than changing the experience of it.
Replies from: Ghatanathoah↑ comment by Ghatanathoah · 2013-12-06T01:03:19.224Z · LW(p) · GW(p)
If an important distinction between people and animals is that animals only take pain up to 2, then a human who perceives pain only (mostly?) at the 2 level might be more like an animal.
In terms of experiencing pain, yes (although I do think there are more level 3 animals than Craig does). If I had to choose between torturing a level 2 human or a level 3 human I'd pick a level 2, providing the torture did no lasting damage to the body.
However, a far, far more morally significant distinction between a human and an animal is that humans can foresee the future and have preferences about how it turns out. I think an important part of morality is respecting these preferences, regardless of whether they involve pleasure or pain. So it is still wrong to kill or otherwise inconvenience a level 2 human, because they have many preferences about what they want to accomplish in life, and thwarting such preferences is just as bad, if not worse, than inflicting pain upon them.
It would even be fine to inflict small amounts of pain on a level 3 human if doing so will prevent a major life-goal of a level 2 human from being thwarted.
EDIT: Of course, I'm not saying that no other animals possess the ability to have preferences about the future. I'm sure a few do. But there are a great many that don't and I think that is an important distinction.
comment by NancyLebovitz · 2012-01-07T15:18:09.269Z · LW(p) · GW(p)
Is that argument related to saying that animals are p-zombies?
It isn't saying that animals have no qualia, but I think it's saying that some types of qualia matter more than others, and that animals can behave like people in terms of reacting to and avoiding pain, but the behavior means something very different from what it would if a person were doing it.
Replies from: David_Gerard, bogus↑ comment by David_Gerard · 2012-01-07T18:18:28.758Z · LW(p) · GW(p)
Is that argument related to saying that animals are p-zombies?
It's related in that it is intended to set the stage to show that dualism is true, if only for humans - and specifically that it is true only for humans. (In the case of p-zombies, the assertion that p-zombies are conceivable as an actually possible thing is an assertion that dualism is conceivable as an actually possible thing, so it's no surprise the argument concludes by proving dualism, i.e. something it assumed.)
↑ comment by bogus · 2012-01-07T16:20:08.003Z · LW(p) · GW(p)
I'd say that if qualia actually exist in some sense, and animals (at least those that are somewhat related to humans, e.g. mice and other mammals) have qualia, then awareness of pain is most likely among them. I'd also assume that all animals with qualia have some assessment of "pleasing" and "suffering", because conscious awareness would not be evolutionarily useful otherwise. Craig is probably assuming that animals do not have qualia, although he does not state this assumption clearly.
comment by Adriano_Mannino · 2012-08-18T16:45:45.674Z · LW(p) · GW(p)
It's been asserted here that "the core distinction between avoidance, pain, and awareness of pain works" or that "there is such a thing as bodily pain we're not consciously aware of". This, I think, blurs and confuses the most important distinction there is in the world - namely the one between what is a conscious/mental state and what is not. Talk of "sub-conscious/non-conscious mental states" confuses things too: If it's not conscious, then it's not a mental state. It might cause one or be caused by one, but it isn't a mental state.
Regarding the concept of "being aware of being in pain": I can understand it as referring to a second-order mental state, a thought with the content that there is an unpleasant mental state going on (pain). But in that sense, it often happens that I am not (second-order) aware of my stream of consciousness because "I" am totally immersed in it, so to speak. But the absence of second-order mental states does not change the fact that first-order mental states exist and that it feels like something (and feels good or bad) to be in them (or rather: to be them). The claim that "no creature was ever aware of being in pain" suggests that for most non-human animals, it doesn't feel like anything to be in pain and that, therefore, such pain-states are ethically insignificant. As I said, I reject the notion of "pain that doesn't consciously feel like anything" as confused: If it doesn't feel like anything, it's not a mental state and it can't be pain. And there is no reason for believing that first-order (possibly painful and thus ethically significant) mental states require second-order awareness. At the very least, we should give non-human animals the benefit of the doubt and assign a significant probability to their brain states being mental and possibly painful and thus ethically significant.
Last but not least, there is also an argument (advanced e.g. by Dawkins) to the effect that pain intensity and frequency might even be greater in less intelligent creatures: "Isn't it plausible that a clever species such as our own might need less pain, precisely because we are capable of intelligently working out what is good for us, and what damaging events we should avoid? Isn't it plausible that an unintelligent species might need a massive wallop of pain, to drive home a lesson that we can learn with less powerful inducement? At very least, I conclude that we have no general reason to think that non-human animals feel pain less acutely than we do, and we should in any case give them the benefit of the doubt."
comment by fortyeridania · 2012-01-08T04:59:30.052Z · LW(p) · GW(p)
Stating right at the beginning that the argument comes from Craig was probably a bad idea. Maybe others didn't have this problem, but I immediately discounted the argument's conclusion and rigor after reading who its author was.
It's hard both to document your sources and avoid framing effects, but maybe you could have put the author's name at the end?
comment by DuncanS · 2012-01-08T01:15:09.437Z · LW(p) · GW(p)
I think the argument is both true in some ways, and flawed. I agree that it takes a man (or perhaps the higher apes) to form the thought "I am in pain", and that most mammals don't bother with this type of reflexive thinking.
The flaw in the argument is that the "I am in pain" thought isn't the painful bit.
Replies from: fortyeridania, MixedNuts↑ comment by fortyeridania · 2012-01-08T04:50:17.716Z · LW(p) · GW(p)
According to the theory behind cognitive behavioral therapy, believing that you're suffering exacerbates the suffering (and is often the major component). They apply this to physical suffering, too.
comment by scientism · 2012-01-07T14:52:18.336Z · LW(p) · GW(p)
I'd probably identify three levels (or, at least, mark three areas on a continuum), but for different reasons. There's the class of organisms so far removed from us that analogies are difficult to make even if they do exhibit reactions to noxious stimuli (single-celled organisms, worms, insects, etc.). Then there's the continuum of animals from, say, simple vertebrates to mammals to primates, where their form of life is increasingly similar to our own and it becomes much easier to identify when they're in pain; all such pain-identifications are nevertheless attenuated. The final class is language-using beings such as ourselves, who can exhibit a far more nuanced range of suffering because of our developed linguistic skill. I think for most people the cut-off point will be somewhere along the continuum of the second class and is somewhat arbitrary.
I do think that when trying to identify the range of application of a psychological concept like pain, it has to be relative to the concept originator (human beings). Pain is not simply a mental state or a behaviour; it's part of a complex conceptual network, partly constituted by behaviour, sensations, etc., but dependent on the antecedent applicability of a range of other concepts. Pain applied to animals is always applied in an attenuated sense. The more distant these animals are in their lifestyles from our own, the less applicable the concept, since we have to find a home for an entire nexus of concepts relating to pain in order to apply the concept of pain. It's an isomorphism. You can fill in gaps, ignore dissimilarities, etc., but eventually there's enough distance that you just can't fit one to the other at all.
comment by Matt_Simpson · 2012-01-07T09:40:55.001Z · LW(p) · GW(p)
Let's take his argument in the quote as true (I don't know the relevant neuroscience here either). So we'll assume that all non-human animals only have level 1 or 2 awareness of pain. Now you need to figure out which sort of pain it is that you value preventing - level 1, 2, or 3 (presumably if you value preventing 1, you value preventing 2 and 3 as well). If you only value preventing level 3 pain, then eat away. If level 2, then don't eat vertebrates. If level 1, don't eat any organism that reacts to negative stimuli (all organisms?). This is ultimately a values question.
Note that if preventing animals from feeling some sort of pain is the only reason you're a vegetarian, then consider whether you would eat animals who were killed in a non-pain inducing way (in any/all of the three senses). If you don't think eating the animals is ok in that situation, then think about what your true rejection of eating animals is.
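For what it's worth, a tiny, purely hypothetical sketch of the decision rule described above; the categories are simplified, and where you set the threshold remains a values question.

```python
# Hypothetical mapping from "lowest pain level you value preventing"
# to what that implies you should avoid eating, per the comment above.

def what_to_avoid(lowest_level_valued: int) -> str:
    if lowest_level_valued == 3:
        # On Craig's argument, only humans and great apes reach Level 3.
        return "nothing - eat away"
    if lowest_level_valued == 2:
        return "vertebrates"
    if lowest_level_valued == 1:
        return "any organism that reacts to negative stimuli"
    raise ValueError("level must be 1, 2, or 3")

print(what_to_avoid(2))  # -> "vertebrates"
```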
Replies from: peter_hurford, Solvent↑ comment by Peter Wildeford (peter_hurford) · 2012-01-07T19:43:40.261Z · LW(p) · GW(p)
I would eat animals if they were killed in a non-pain inducing way, in so far as they don't have the same kind of interests in "continuing to live" as (nearly all) humans do. Unfortunately, animals are most definitely not kept or killed in non-pain inducing ways.
Replies from: MixedNuts↑ comment by MixedNuts · 2012-01-07T19:47:04.647Z · LW(p) · GW(p)
Optimal response to that (barring behavioral consistency) is to eat the best-treated meat, not no meat.
Replies from: peter_hurford↑ comment by Peter Wildeford (peter_hurford) · 2012-01-07T20:06:37.090Z · LW(p) · GW(p)
Why is that the optimal response?
Replies from: MixedNuts↑ comment by MixedNuts · 2012-01-07T20:26:16.005Z · LW(p) · GW(p)
Stronger incentive to treat animals better, by moving money along that gradient instead of just removing it from all companies.
Replies from: Matt_Simpson↑ comment by Matt_Simpson · 2012-01-07T21:57:47.577Z · LW(p) · GW(p)
Unless even the best treated meat is treated too badly according to peter_hurford's values.
Replies from: MixedNuts↑ comment by MixedNuts · 2012-01-07T22:06:20.497Z · LW(p) · GW(p)
No! (And given we all live on Earth in 2012 it's pretty obvious that it is and I knew that.) It gives companies an incentive to treat meat marginally better, instead of creating an impossible step between current and acceptable policies. It also gives the company you like most additional money to crush its competitors with, rather than giving a smaller relative contribution to all soy farmers against the meat industry.
Replies from: Matt_Simpson↑ comment by Matt_Simpson · 2012-01-07T22:53:34.815Z · LW(p) · GW(p)
Yes, it changes incentives in the meat industry, but giving money to soy farmers shrinks the meat industry. But which of these is better depends on a) your preferences and b) the economics of these industries. The short-term cost of contributing to the suffering of animals may not be worth the tiny marginal reduction in animal suffering.
Replies from: peter_hurford↑ comment by Peter Wildeford (peter_hurford) · 2012-01-08T05:14:11.152Z · LW(p) · GW(p)
I don't know the economics of it, but based on my current knowledge, I think the safest bet is refraining from consuming animal products as much as possible, and then buying the most humane products when necessary.
Replies from: Matt_Simpson↑ comment by Matt_Simpson · 2012-01-08T10:14:32.110Z · LW(p) · GW(p)
On the other hand, some people are making arguments that the production of plant-based foods actually harms more animals than the production of, say, beef. I just stumbled across this.
Replies from: peter_hurford↑ comment by Peter Wildeford (peter_hurford) · 2012-01-15T07:19:07.048Z · LW(p) · GW(p)
It's an interesting take on it, but I think that it ignores quite a lot of things:
This article is terribly sourced. I'm willing to trust it a bit because the author seems well-qualified, but I can't be confident that the article is accurate.
This article portrays cows grazing in really open fields, which is an inaccurate picture of how most beef is produced. The significant issue is factory farming, and this article does not address that. I have significantly less of a problem with feeding on animals that avoid feedlots. I wouldn't be too surprised if the ideal world included some degree of raising animals for food, for a variety of compelling reasons.
I'm not worried at all about the death of nonhuman animals, but rather the suffering of nonhuman animals. This article seems to conflate the two.
The normal grain argument is that even if the production of grain does damage, the grain still needs to be fed to the animals, so it's still a net benefit to consume as few animal products as possible. This article counters it by suggesting that cows eat special grains that humans cannot. Assuming that's accurate, if cows were not produced for consumption, all that inedible grain could be replaced with edible grain/vegetables.
At best, this merits a cow-based diet -- we still should not consume chickens, fish, eggs, pigs, etc.
Overall, I still think anyone interested in preventing as much suffering as possible should reduce his or her animal product consumption as much as possible.
↑ comment by Solvent · 2012-01-07T10:18:16.185Z · LW(p) · GW(p)
Hmmm. I'm vegetarian for other reasons than just the animals feeling pain: I've also got environmental concerns, and after being vegetarian for so long, the thought of eating meat squicks me. So it's not my true rejection, but it is a component of it.
comment by Desrtopa · 2012-01-11T18:00:18.529Z · LW(p) · GW(p)
And this second neural pathway is apparently a very late evolutionary development which only emerges in the higher primates, including man.
Only in primates? Not in any other animals, including ones capable of passing the mirror test? Given the amount of convergence that's apparent in the intelligence of humans and, say, elephants, I'm pretty skeptical of this.
comment by AspiringKnitter · 2012-01-08T02:56:54.207Z · LW(p) · GW(p)
Why doesn't he consider what he calls Level 2 pain to be suffering? It seems to me that it's the very definition of pain.
Replies from: khafra
comment by DanielLC · 2012-01-07T17:42:03.629Z · LW(p) · GW(p)
Animal behavior (including that of insects) changes in reaction to pain, and not just while they're feeling it. They can be trained. They can remember pain, and act to avoid it in the future. In whatever sense they have knowledge, they know to avoid that. Pain is an awareness that you don't want to do something anymore, not an awareness of that awareness.
Replies from: MixedNuts, bogus↑ comment by MixedNuts · 2012-01-07T18:19:13.136Z · LW(p) · GW(p)
Buh? We can write systems with negative reinforcement. Say, a robot that performs various movements then releases a ball, detects how far from a target the ball landed, and executes this movement sequence less often if it was too far. I think "the robot avoids missing the target" is a fair description, and "the robot feels pain when it misses the target" is a completely bogus one. Do you disagree?
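For concreteness, a minimal sketch (names and numbers invented for illustration) of the kind of negative-reinforcement system described above: the program samples movement sequences, measures how far the ball lands from the target, and executes sequences that miss badly less and less often.

```python
import random

# Toy version of the ball-throwing robot described above. It adjusts its
# behaviour in response to an error signal, yet nothing here plausibly
# "feels pain". All parameters are arbitrary.

TARGET = 10.0
landing_spot = {"seq_a": 4.0, "seq_b": 9.0, "seq_c": 14.0}  # where each sequence throws
weights = {name: 1.0 for name in landing_spot}              # how often each is chosen

def miss_distance(name):
    """How far the ball lands from the target, with a little noise."""
    return abs(landing_spot[name] + random.uniform(-1, 1) - TARGET)

for _ in range(1000):
    name = random.choices(list(weights), weights=list(weights.values()))[0]
    if miss_distance(name) > 2.0:            # negative reinforcement on a bad miss
        weights[name] = max(0.05, weights[name] * 0.9)

print(weights)  # seq_b keeps its weight; seq_a and seq_c end up rarely executed
```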
Replies from: DanielLC↑ comment by DanielLC · 2012-01-07T21:32:31.519Z · LW(p) · GW(p)
First off, I believe that consciousness is not discrete. That is, I can be more conscious than a dog. Given that consciousness isn't necessarily zero or one, it seems unlikely to ever be exactly zero. As such, all systems have consciousness. A simple robot has a simple mind, with little consciousness. You can cause it pain, but it won't be much pain.
Perhaps the robot simply isn't truly aware of anything. In that case, it's not aware that it should avoid missing the target, and it feels no pain. I just don't see why adding an extra level would make it aware. If I gave it two sets of RAM, and the second repeated whatever the first said, it would be aware of everything in its memory, in that it would put it in another piece of memory, but it's not going to make the robot become aware.
Replies from: MixedNuts↑ comment by MixedNuts · 2012-01-07T21:54:01.517Z · LW(p) · GW(p)
A simple robot has a simple mind, with little consciousness. You can cause it pain, but it won't be much pain.
The designer wrote the code for the robot and assembled the parts, proving she understands how the robot works. You and I can both inspect the robot. Can you explain the process that creates consciousness (and in particular pain), whatever those two things are, from the robot's source code, sensors, and actuators?
Perhaps the robot simply isn't truly aware of anything. In that case, it's not aware that it should avoid missing the target, and it feels no pain.
It has sensors that can measure the distance between ball and target. If I wish to, I can make it display a "I missed :(" message when the distance is too great. It then changes its actions in such a way that, in a stable enough environment, this distance is on average reduced. What extra work is "aware it should avoid missing the target" doing?
↑ comment by bogus · 2012-01-07T18:11:28.618Z · LW(p) · GW(p)
Unfortunately, this is non-responsive. There is such a thing as bodily pain we're not consciously aware of. If perceptual control theory is true, there is even such a thing as unconscious memory/aversive reaction/"won't do this anymore", as a matter of basic neurology. So insects could show all of these reactions while lacking anything like conscious awareness.
comment by geebee2 · 2012-01-07T15:20:57.568Z · LW(p) · GW(p)
I would agree with the basic idea that there are three levels of pain, and also that only great apes are aware that they are in pain.
In fact, humans may be in pain but not be aware of it. I recently had a moderately serious accident, and cut my thumb deeply (the tip of the bone was sliced off, to give you an idea). I then probably cycled home (I don't remember that well due to concussion, of which I was completely unaware), and was quite unaware that I was in pain. I did know that I had cut my thumb. You might argue that I wasn't even in pain; that's debatable.
I would also cite the example of young babies - they have very little self-awareness (I don't recall the age at which it develops, but it is I think after birth), but can you assert they do not suffer when experiencing pain?
Regardless, the big jump here is going from "animals (other than great apes) not being aware that they are in pain" to the title of your post which is "An argument that animals don't really suffer". Why is suffering related to awareness of being in pain? Isn't it enough just to be in pain?
Replies from: bogus↑ comment by bogus · 2012-01-07T16:08:26.403Z · LW(p) · GW(p)
It's quite likely that you weren't in level 2 pain at all - endorphins will do that to you. It would be very different if you had started feeling pain and then decided to use conscious meditation in order to mitigate it. Roughly stated, you would feel your assessment of the sensation change from "pain/suffering" to "meh... some weird stimulation I don't really care about".
Replies from: geebee2↑ comment by geebee2 · 2012-01-07T21:03:57.144Z · LW(p) · GW(p)
I'm aware of the theory of endorphins, but I'm a little doubtful that it is the correct explanation. I would instead attribute non-perception of pain mainly to the mind being able to shut out signals that are not the most important in a given situation. In fact, while I am out cycling, I am easily able to instantly switch between perceiving pain (what is hurting at the moment) and concentrating on something else (going faster, or navigating a difficult corner, say). So pain is often what we choose to perceive at a moment in time. In the case of my accident, if I had stopped and thought "what's hurting?", I'm fairly sure I would have felt pain, and been aware of it. But until I had cycled home, put my bike in the garage, and called the emergency services, I was concentrating on other things. Having done that, I certainly did immediately feel pain! I doubt that endorphins could explain such a rapid switch. Can endorphin production be consciously controlled? I doubt it.
I guess we agree here, except that I am attributing the lack of perception not to a conscious decision to meditate, but to an automatic stress response to concentrate fully on what needs to be done. That would suggest I was in level 2 pain (albeit it may have been reduced due to endorphins), but I was nevertheless not aware of pain.
comment by [deleted] · 2014-06-13T10:53:33.374Z · LW(p) · GW(p)
'"Empirical studies conducted by social psychologist Daniel Batson have demonstrated that empathic concern is felt when one adopts the perspective of another person in need. His work emphasizes the different emotions evoked when imagining another situation from a self-perspective or imagining from another perspective.[17] The former is often associated with personal distress (i.e., feelings of discomfort and anxiety), whereas the latter leads to empathic concern."
Perhaps people just rationalise their feelings till they're conceptualised and construed into a socially acceptable ethical position - even if that means a 'rationally' defendable one.