Sadly, we live in a world where armed conflict affects millions of people. And our ever-expanding technoscientific advances mean that we are constantly faced with ethical dilemmas no humans before us have ever faced. Questions on which I have been consulted recently include: What are the ethics of waging drone warfare? How should we deal with “dual use” research that is intended to help us develop biodefence strategies but could also be used to create bioweapons? Are certain kinds of research inherently wrong, such that they must never be undertaken, for instance, the genetic manipulation of human embryos so that, as young adults, they become fearless or hyper-aggressive soldiers?

It can take great courage to say “no” to something we believe to be inherently wrong, especially when those with power over us want us to say “yes.” Society has always looked to the military to display great physical courage, and has been inspired by its members doing so, even to the point of sacrificing their lives to protect the rest of us. But today we must also call on your moral courage, the courage to say “no” to what you believe to be ethically and morally wrong. Doing so can involve fear, just as the exercise of physical courage does…

Sometimes, for a variety of reasons, it’s very difficult to “do ethics.” Ethical conflict occurs when we disagree about what ethics requires; ethical distress arises when we know that unethical conduct is taking place but preventing it seems an overwhelming task, not least because doing so exposes us to harm or risk ourselves. But we shouldn’t give up in despair; good guys do win out, even if only in the long run. Let me tell you about a remarkable paper I heard at a conference on the history and philosophy of science.

We used to joke that philosophers spent their days counting how many angels could sit on the head of a pin. Today, they use computers to simulate long sequences of decision making, generating, for instance, five thousand or ten thousand consecutive decisions.

In one of these experiments, the philosophers started with two equal-sized groups of decision makers: one they called rats, the other lemmings. The rats (the bad guys) were represented by tiny red squares. They always decided purely in their own self-interest, without regard to the welfare of others. The lemmings (the good guys) were yellow squares. They did the opposite: they tried to protect others, their relationships and the community, as well as themselves. At first, the rats won hands down; the yellow squares disappeared very quickly, and the lemmings were losing badly. But eventually the lemmings started to come back, and yellow squares began to reappear among the red ones.

What was most interesting, and the most important message from this study, was that as long as a small, cohesive cluster of lemmings remained, they were not lost forever; they came back, and eventually ethics spread again throughout the society. But if that small group was lost, if its numbers fell below a critical mass, the whole graph turned red and the change could not be reversed. So one ethical person, plus a few ethical friends who all support each other, really matters ethically.
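The article does not name the paper or specify its model, but the dynamic described (self-interested agents spreading rapidly while a cohesive cluster of cooperators survives and later re-expands) is characteristic of spatial cooperation games. The sketch below is only an illustrative stand-in, not the simulation Somerville saw: it assumes a standard grid model in the spirit of Nowak and May’s spatial prisoner’s dilemma, and the grid size, number of rounds and temptation payoff B are all assumed values chosen for illustration.

```python
# Toy sketch of a spatial cooperation simulation (assumed model, not the
# paper's actual one): "rats" (defectors) exploit their neighbours, "lemmings"
# (cooperators) help theirs, and each round every cell copies the strategy of
# its best-scoring neighbour.
import numpy as np

rng = np.random.default_rng(0)
SIZE = 50          # 50 x 50 grid of decision makers (assumed size)
ROUNDS = 200       # consecutive decision rounds (assumed value)
B = 1.65           # temptation payoff for exploiting a cooperator (assumed)

# 1 = lemming (cooperator), 0 = rat (defector); start half and half at random
grid = rng.integers(0, 2, size=(SIZE, SIZE))

def payoffs(g):
    """Each cell plays with its 8 neighbours on a wrap-around grid.
    Cooperator meeting a cooperator earns 1; a defector exploiting a
    cooperator earns B; every other pairing earns 0."""
    score = np.zeros_like(g, dtype=float)
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue
            n = np.roll(np.roll(g, dx, axis=0), dy, axis=1)  # neighbour's move
            score += np.where(g == 1, n * 1.0,  # I cooperate: 1 if neighbour does
                              n * B)            # I defect: B if neighbour cooperates
    return score

for _ in range(ROUNDS):
    score = payoffs(grid)
    # Imitation step: each cell adopts the strategy of its highest-scoring
    # neighbour, keeping its own strategy if no neighbour did better.
    best_score = score.copy()
    best_strategy = grid.copy()
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue
            ns = np.roll(np.roll(score, dx, axis=0), dy, axis=1)
            ng = np.roll(np.roll(grid, dx, axis=0), dy, axis=1)
            better = ns > best_score
            best_score = np.where(better, ns, best_score)
            best_strategy = np.where(better, ng, best_strategy)
    grid = best_strategy

print("cooperators remaining:", grid.sum(), "of", SIZE * SIZE)
```

With these assumed parameters, cohesive clusters of cooperating cells can survive the initial rout and re-expand, while a grid whose cooperators fall below a critical cluster size locks into all-defector red, which is the “critical mass” point the passage draws out.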

It’s a message that is both hopeful and frightening. A few ethical voices crying in the wilderness do matter and can make a major difference. But losing those voices means losing ethics altogether. You must make sure that doesn’t happen, because military victories without ethics embedded in them would indeed be hollow victories.

Margaret Somerville
Margaret Somerville is Professor of Bioethics at University of Notre Dame Australia. She was previously Samuel Gale Professor of Law at McGill University.

You may reproduce this article from Options politiques online or in a print periodical, under a Creative Commons Attribution licence.
