philosophy, morality, evolution, indoctrination
This question sounds easy, so bear with me here. As a hard-core materialist (and as a programmer who has written “machine learning” algorithms that are logical equivalents to pleasure and pain), this is the concept that I have the most trouble with.
From an atheist point of view, the main reason that religious people “know” that there is a God is that they were indoctrinated to believe in him when they were too young to subject that belief to rational scrutiny. Rationally, those people might say “there is no evidence.” They might even understand and accept that the capacity for belief in God may have had survival benefits in their ancestors, so, true or not, human brains evolved that capacity. Nevertheless, they can’t let go of something that became deeply embedded in their view of the world starting when they were toddlers. No matter how they try, they can’t rationalize it away, and they generally wouldn’t want to.
As atheists, we tend to see that as a weakness…
However, most of us – atheists included – “know” that it is wrong to torture a kitten, but that you can’t be similarly cruel to a man-made machine. And the reasons we know this seem (to me) almost identical to the above: i.e., right and wrong were taught to us when we were too young to question them, a capacity for empathy (specifically toward humans and animals) evolved in humans because it had survival benefits for our ancestors, etc. Rationally, we can acknowledge that the kitten’s brain is essentially a computer, that pain is not supernatural, and that there is no technical reason the kitten’s sensation of pain is fundamentally different from whatever happens in a robot or other man-made machine when it is forced into a situation that it is programmed to attempt to avoid.
But still… we just know that there is something different about the kitten. Even though there isn’t a shred of scientific evidence to support the idea that there is more to pain than a certain kind of computational process, we can’t, and don’t want to, rationalize away our aversion to being cruel to kittens, nor can we rationalize into existence a similar feeling toward certain machines.
What is the difference? Is empathy an irrational weakness, and if not, why not?
Edit: I was asked to edit this with some things that I clarified in the comments below. I won’t try to get everything, but for now, I’ll add my definition of “pain,” because I think it is important to this discussion.
Note that my definition is not circular (it doesn’t use words like “dislike”, “unpleasant”, “hurt”, “unhappy” or “bad”, nor their opposites) nor subjective (doesn’t use words like “experience”, “feeling”, “sensation”, “conscious” or “qualia”) nor specific to biology (nothing about neurotransmitters!). Due to those requirements, though, it tends to contrast with our intuitive notion of pain, in that it can apply as easily to software programs as it can to animals.
Pain: in a goal-oriented entity that can learn based upon the results of past behavior, the process of suppressing previously followed decision paths so that they are less likely to be followed in the future.
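To make the definition concrete, here is a minimal Python sketch of an entity that satisfies it: goal-oriented, learning from the results of past behavior, with “pain” implemented as the suppression of previously followed decision paths. Everything here is a hypothetical illustration, not a claim about how any real system works.

```python
import random

class Agent:
    """A goal-oriented entity that learns from the results of past behavior."""

    def __init__(self, actions):
        # Every decision path starts equally likely to be followed.
        self.weights = {action: 1.0 for action in actions}
        self.last_action = None

    def act(self):
        # Follow a decision path with probability proportional to its weight.
        actions = list(self.weights)
        self.last_action = random.choices(
            actions, weights=[self.weights[a] for a in actions])[0]
        return self.last_action

    def pain(self, intensity=0.5):
        # "Pain" per the definition: suppress the decision path just
        # followed so it is less likely to be followed in the future.
        self.weights[self.last_action] *= (1.0 - intensity)

agent = Agent(["approach", "retreat", "freeze"])
for _ in range(100):
    if agent.act() == "approach":  # touching the hot stove, say
        agent.pain()
print(agent.weights)  # the "approach" weight is driven toward zero
```

Note that nothing in the sketch uses subjective vocabulary; “pain” is just the name of the suppression step, which is the point of the definition.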
You might similarly ask if any emotional experience is irrational. It’s certainly possible to develop maladaptive emotional responses, such as exploding in a fit of jealous rage when your girlfriend smiles at another guy. That’s irrational because smiling at someone doesn’t mean much. It’s a weakness because it’s a reaction that would only make a bad situation worse, or a neutral or good situation bad.
Thinking about empathy in the same way shows that it’s neither a weakness nor irrational. If a kitten displays behaviours which clearly demonstrate that it’s experiencing pain, it’s rational to respond as if it is in fact in pain. For example, if you hurt the kitten it will attack you or run away. If you persist it will grow more aggressive or fearful, or both. It’s not a weakness to respond empathetically because that will end the pain, improving the situation for you and the kitten (i.e., it’s an adaptive response). It’s not irrational since the point of empathy is, by sharing the kitten’s experience, to get you to do what’s best for the kitten. The purpose of feeling empathy would have to be something different for such a response to be considered irrational. It’s how kittens themselves learn not to hurt each other (whether or not you’d call that empathy is another debate).
The difference, aside from the lack of a nervous system, is that computers are not programmed to learn that a particular signal (i.e., the experience of pain) indicates that whatever action they performed should be avoided. Of course, as you know, they can be programmed that way. But in general they’re not. They are simply programmed to avoid performing the action in the first place (or rather, not programmed to perform the action, thus obviating any need for avoidance). Alternatively, they are programmed to alert someone to their need for attention/repair, and to shut down or otherwise continue performing whatever functions are still available. Such programming is direct, specific, and limited; very different from the processes involved in experiencing and learning from pain.
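As a rough illustration of that direct, specific, limited style (all names and thresholds here are invented):

```python
def send_alert(message: str) -> None:
    # Stand-in for paging a technician.
    print(f"ALERT: {message}")

def step(motor_temp_c: float, limit_c: float = 90.0) -> str:
    # The response to damage is written in by the programmer, not
    # learned from consequences: check a condition, alert, shut down.
    if motor_temp_c > limit_c:
        send_alert("overheating, service required")
        return "shutdown"
    return "run"

print(step(95.0))  # ALERT: overheating, service required -> "shutdown"
```

Nothing here resembles an open-ended signal that the machine itself learns from.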
However, suppose a robot were programmed to learn from its interactions with the environment. Suppose further that it was most effective to program it to learn to generate a strong signal (i.e., pain) in response to damage (whether or not it caused the damage itself), and that a similar signal would also be generated to override any action which previously caused it damage. If it is damaged, it cringes, retreats, and lets out a piercing sound, all to let anything nearby know that it has been damaged, and that whatever caused the damage shouldn’t be repeated (even if whatever caused it was non-sentient). Generating the “pain” signal floods the robot with energy, preparing it for rapid action. However, that energy can’t be reclaimed, so if it’s not used, it’s lost. The robot releases particular fluids which also help it respond quickly and vigorously if necessary; however, those fluids are damaging if they’re allowed to build up. In such an unlikely scenario it would be rational to feel empathy for the robot, and in fact we probably would: there is plenty of evidence that people respond to robots as they do to humans, if the robots behave sufficiently like humans. But as far as I’m aware, no computers or robots or machines are programmed that way (and there is ample reason to think that would be far from the most effective implementation of damage avoidance). So it’s not irrational to lack empathy for computers, but it is irrational to equate computer programming or machine functions with the experience of animals, at least until their implementation goes beyond logical equivalence and attains full functional equivalence.
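A toy sketch of that hypothetical robot, with every name and number invented purely for illustration:

```python
import random

class Robot:
    def __init__(self, actions):
        self.override = {a: 0.0 for a in actions}  # learned "pain" overrides
        self.surge_energy = 0.0                    # lost if not used
        self.stress_fluid = 0.0                    # damaging if it builds up

    def choose(self):
        # Prefer actions whose learned override is still weak.
        candidates = [a for a, v in self.override.items() if v < 1.0]
        return random.choice(candidates or list(self.override))

    def on_damage(self, action):
        # The strong internal signal and its outward display.
        print("cringe, retreat, piercing sound")  # warns anything nearby
        self.override[action] += 0.5              # suppress the action
        self.surge_energy += 1.0                  # prepare for rapid action
        self.stress_fluid += 1.0                  # must be worked off later
```

The point is only that such a design is coherent; whether it would ever be the most effective one is a separate question.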
Try not to be distracted by scientific evidence, per se, about differentiation within a given person's mind; consider instead the function of empathy and how it operates.
In terms of comparative irrationality, believing in [god] and experiencing or being motivated by empathy both rely on the defensive and developmental mechanism of (projective) identification. In terms of the second concept you are addressing, psychopathy (lack of empathy) toward computers, you might look to discussions that have distinguished between empathy (as the more intuitive, emotional aspect) and perspective-taking (as the more cognitive aspect). Empathy is an important mechanism in that it promotes/provides for community and the long-term success of the species (as it is currently manifest); perspective-taking is important in that it allows us to view the world and the objects in it objectively and as means to specific ends.
Irrationality is an odd metric to introduce in comparing these three types of entities. The degree to which a given person identifies with [god] (supernatural, generally anthropomorphic), kittens (natural, anthropomorphic), and computer hardware (natural/synthetic, not anthropomorphic) might give you a better construct for evaluating how someone comes to empathize with one, as opposed to another, of those entities.
When you consider why a given person might detest pain inflicted on a kitten more than on a computer, it is likely because the person can project their own understanding of pain (gained by experiencing it) onto the kitten, whereas they may not identify with a laptop strongly enough to do the same. In this way, although empathy might not be a cognitive (rational) motivator, it can be rationalized functionally.
To add a moral question on top, just extend the degree of identification with a dose of value-judgment. The degree of value-judgment required by the act in question (e.g., torture calls for a higher degree of judgment than gossiping about someone) has a multiplicative relation to the degree of identification with the subject (e.g., a person might rate 10/10, a dog 9/10, an ant 4/10, a computer 2/10). In this way, you can create a metric for how a given person reacts and understand what motivates their rational layer of thought. In a sense, the moral reaction is the synergy between empathy (identification) and value-judgment.
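Read literally, that metric is just severity times identification. A quick tabulation in Python, using the identification weights above; the severity numbers are my own invention:

```python
# Identification weights from the answer (out of 10).
identification = {"person": 10, "dog": 9, "ant": 4, "computer": 2}

# Hypothetical severity of the act being judged (out of 10).
severity = {"gossip about": 3, "torture": 10}

# Predicted strength of the moral reaction: severity * identification.
for act, s in severity.items():
    for subject, i in identification.items():
        print(f"{act} {subject}: {s * i}")
```

The ordering it produces (torturing a person, 100, outrages us far more than torturing a computer, 20, or gossiping about either) matches the intuitions the question started from.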
“Rationally, we can acknowledge that the kitten’s brain is essentially a computer, that pain is not supernatural, and that there is no technical reason the kitten’s sensation of pain is fundamentally different than whatever happens in a robot or other man-made machine when it is forced into a situation that it is programmed to attempt to avoid.”
Ergo, it should be rationally fine to torture animals; that seems to be your conclusion. Of course, if it’s OK to torture kittens, then by only one small step of induction it should be fine to torture humans as well, since they’re ultimately nothing but biological machines too.
Did you ever consider that perhaps the more reasonable conclusion would be that it’s just as wrong to torture robots?
Of course, we both know that in the current age the comparison is absolutely ridiculous. The amount of work that has to be done in order to turn a simple avoidance algorithm into the kind of complex integration with other components that can comprise a form of consciousness is quite large. Teasing an avoidance algorithm is more like teasing an amoeba than a kitten. Thus the conclusion one naturally comes to when making this categorical error is to err on the side of “everything’s just a basic machine,” while if we really had machines that were anything like kittens, the picture we would have in our minds would be quite, quite different.
Might I suggest you go get yourself a copy of The Mind's I and have a read of the chapter titled “Soul of the Mark III Beast”? Or you can just read it online: http://junkerhq.net/MGS2/MarkIII.html
Taking a cold and clinical viewpoint of pain and empathy, I think it's fair to claim that empathy is a weakness, but I don't think you can defend the claim that it is irrational. Empathy plays a significant part in a species' ability to live long enough to procreate and produce subsequent generations of itself. For instance, a lion feels no empathy towards a gazelle when the lion needs to eat, but the lion makes sure that its cubs and other members of its pride have a chance at the dead gazelle to fulfill their need for sustenance. For humans, it could be foolish to show empathy to an enemy during war time because that enemy could kill you when you're not looking. However, it could be beneficial to show empathy towards an enemy if there is some chance that the empathic behavior would be returned--consider the story of Androcles and the Lion.
We are not born with empathy; it is something we learn through interaction with other things: humans, other animals, vegetation, etc. I encountered a theory some time ago, though I don't currently remember where, that during development children need to be exposed both to the nurturing, caring behavior stereotypically associated with mothers and to the rougher play stereotypically associated with fathers. For proper development, children have to have someone who plays with limited roughness so that they learn about limits: children learn that being pinned by an older sibling makes them feel bad, so they learn not to do the same thing to others. This is the meaning of empathy.
Empathy may be considered a weakness, but in light of what it means to the propagation of a species, it is definitely not irrational.