Morality
Sam Harris, for example, argues in The Moral Landscape, and I generally agree, that the measure of moral culpability for a death is tied to the complexity and capacity for suffering of the thing dying. This is why it’s OK for complex humans to eat less complex fish, or why killing a baby is much more egregious than terminating a pregnancy.
The sticking point for me with this definition is this: what happens if we encounter aliens who are more complex than we are? Or, from another angle, what if we create an intelligence that is smarter than we are? Wouldn’t that mean we would have to place our own lives’ worth below that of these more complex entities, especially if their capacity for suffering can be measured as greater than our own? Wouldn’t we then be compelled to put their lives ahead of ours?
I think that our “complexity” is mostly a convenient excuse that we use to justify our innate moral sense that we are more important than other animals. For example, we tend to be much more horrified when someone kills a baby than when someone kills an (adult) baboon, even though the baboon is arguably more cognitively complex and probably possesses a keener sense of suffering (simply because it is more aware). We also tend to be outraged even when the baby is killed in an entirely painless fashion; most fish almost certainly suffer more.
It makes much more sense to me to assume that our moral sense is honed to be useful on evolutionary scales to intelligent social creatures. Of course we are horrified when harm comes to our young, since without young we’ll go extinct! And, naturally, when dealing with other smart and motivated creatures with whom you can communicate, it’s almost always better to communicate and reach some agreement than act unilaterally and provoke a violent response. So feeling empathy etc. is a useful first step: it makes you want to understand, help, engage in reciprocally beneficial actions, and so on, instead of using violence to mediate between different goals. (Many animals do the same when trying to attract a mate: males have a variety of ways to judge standing and strength beyond actually testing it in a fight. We just have a lot more options because we’re much better at communicating and understanding complex concepts.)
But if our moral sense arises from very practical issues regarding our survival as a species, then it doesn’t automatically extend to some entirely unrelated entity. If we’re sympathetic to, say, a cute little baby monkey, that makes some sense: we share a lot of genes with monkeys, we’re probably better off with more diversity of life (it’s more robust), and it’s generally better to engage our sympathetic feelings widely than to err in the other direction and fail to extend them even where they’re critically important (“the enemy is pure evil!”).
But why be sympathetic in the same way to an organoelectronic thing that has arrived from some distant star? It still makes a lot of sense to communicate and cooperate, since said thing might be dangerous to us, and us to it; and it may know things that can help us, and we it. (Though one could argue that instant annihilation is another relatively safe strategy if you can pull it off, as long as you’re clearly the more powerful one and the other side doesn’t have powerful friends of which you’re unaware.) But I don’t think we’re in any way morally obligated to put its interests ahead of ours, because that’s not what morality is for. If we could be convinced, somehow, that said thing would guard us and assure our continued survival better than we could ourselves, then maybe there’s an argument to be made that we should place less value on our individual lives than on its. (In fact, you could argue that if this really were true, placing less value on its life than on ours would be an evolutionarily suicidal tendency, i.e. a pretty stupid thing for a living creature to do, especially one that can reason about what it’s doing.)
Complexity is another way of describing mind functions. If we spend time with anything that has a lower mind, we are less likely to kill and consume it: we identify with the mind functions of animals and so on.
Your question relates to the constant debate about whether we are superior to other animals; the answer to that question is also the answer to your question about higher-order beings.
If we propose that there is a maximum capacity for suffering, contingent upon self-awareness, then we would find equality in judging the value of life for sentient beings.
This question is based on a slight misunderstanding of Harris’s moral landscape and of utilitarianism. The moral hierarchy acts as a guide for our conduct towards others. Nobody would expect the lambs to bring themselves to the slaughter, even if they could understand such a request.
We can rationalise our killing of animals for food by noting that their capacity to suffer from being slaughtered is (probably) much less than ours would be if we didn’t get enough food.
So utilitarianism in general, and Harris in particular, wouldn’t demand that we sacrifice ourselves for the well-being of some alien overlords. But those aliens could legitimately use the same argument to subjugate us.
That almost sounds like we would have to worship these theoretical entities. Placing their wellbeing and care above our own would elevate them to the status of gods. As an atheist, I'm not going to put some "god's" wellbeing above my own, so I don't think the argument works in reverse, in the same way that I don't think dogs or fish are obligated to put our wellbeing above their own. Reminds me of a story where a fox "shot" his hunter, and I thought to myself, "good for you, fox".
I do suppose, however, that the entities might feel that harming a human is less morally wrong than hurting one of their own. I guess that's the basis of all those alien invasion flicks, like Battle: Los Angeles, where the aliens are here for our resources. Presumably the aliens feel they have a higher level of advancement and moral awareness, and are therefore justified in taking our resources to avoid their own suffering.
I would hope that the ability to communicate suffering to the oppressing entity would be the dividing line. Not just react to it, but preemptively communicate. If a fish could somehow send me a communiqué saying "stop, you're killing my family, have mercy," I would probably never eat fish again. Hopefully the supercomputers and aliens make the same decision. Let's just hope we can communicate our suffering before they wipe us out!
You might be able to make a case for it depending on context.
If, for example, these complex beings instantly suffer their equivalent of third-degree burns when exposed to ultraviolet light, and one of them needed something from the car on a sunny day, I would be willing to cross the parking lot to fetch it, because my “suffering” (being exposed to a bit of UVA and UVB) would be much less than the alien’s (third-degree burns). I would be inconsiderate and lacking in compassion if I insisted that my porphyric companion got his own damn iPod out of the car where he forgot it just because I didn’t want to get off my arse.
However, if they feed their young on flambéed cat, I’m still not handing mine over for Gort Jr’s breakfast.
You’ve taken Peter Singer’s argument in Animal Liberation for not being cruel to animals and turned it on its head. He argues that because animals show distress from pain, care for their young, and even mourn the loss of their children or mates, we should not kill and eat them. This is not a scale of complexity where being lower means you can be treated as “less than”; it is a line, above which certain actions are off limits.
Plants can be cut and plucked and they don’t wince, cry out, or attempt to avoid the knife, so until it can be shown otherwise, we can be safe in treating them as food. A human fetus does react to late-term abortion procedures, so there is some argument there for not doing that. I’m not interested in debating these points; I’m just painting the question in a different light. I’m not sure how Sam Harris would apply his method to these situations. His book is about morality in general, not specific political issues.
My suspicion is that Harris sees this as something other than the continuum you describe: that there is a place along the scale where killing becomes unacceptable, and it doesn’t matter how much more complex you are than the thing you are thinking of killing. It is not a matter of degree. Thinking about aliens who believe they are above considering us worthy of life should lead to buying free-range chickens and locally grown vegetables, not to giving yourself up to their superior intellect.