Monday, February 05, 2007

A Code of Ethics for Scientists

There's an article on today's ScienceDaily website about a code of ethics for scientists [Scientists Should Adopt Codes Of Ethics, Scientist-bioethicist Says]. The ScienceDaily article is based on a press release from Wake Forest University Baptist Medical Center. The press release highlights a paper by Nancy L. Jones. Jones has some experience in "ethics" according to the press release.
Jones, an adjunct associate professor at Wake Forest University School of Medicine, is an American Association for the Advancement of Science (AAAS) science and technology policy fellow at the National Institutes of Health. She is a fellow at the Center for Bioethics and Human Dignity and is a recent member of the Secretary’s Advisory Committee on Human Research Protection of the U.S. Department of Health and Human Services.
With credentials like that, you'd think she would know something about science and ethics.

Jones appears to be concerned about issues such as cloning, stem cells, and gene transfer. It's not clear to me that there are real ethical issues associated with those topics but one thing is very clear—she's focusing on the uses of science (technology) and not on pure science.

Jones wants all scientists to sign a code of ethics to regulate and control their behavior. What kind of a code is she talking about? The only example in the press release is,
“A code of ethics should provide guidance for which knowledge should be sought, define the ethical means of acquiring knowledge, emphasize thoughtful examination of potential consequences, both good and bad, and help society prescribe responsible use of the knowledge,” writes Jones.

Her prototype code compares the norms of life sciences to the Hippocratic tradition. In part, it reads, “In granting the privilege of freedom of inquiry, society implicitly assumes that scientists act with integrity on behalf of the interests of all people. Scientists and the scientific community should accept the responsibility for the consequences of their work by guiding society in the developing of safeguards necessary to judiciously anticipate and minimize harm.”
I have a problem with this. Let's unpack the mix and address each of the four parts separately.
1. Provide guidance for which knowledge should be sought.

What does this mean? What kind of "guidance" would be part of a universal code of scientific ethics? Would I have to limit my search for knowledge to that which is acceptable to a researcher at a Baptist Medical School? I'm never going to sign a "code of ethics" that restricts my ability to pursue knowledge.
2. Define the ethical means of acquiring knowledge.

This sounds okay, although I wonder how it's going to work in practice. I doubt that anyone has a scientific ethical problem with most of the work done by astronomers, physicists, geologists, chemists, and botanists. Am I correct in assuming that Jones is worried about medical researchers and is transferring her specific concerns to all scientists? Is she talking about animal research or clinical trials? Would those be the only things that require defining or is there an ethical way of using a telescope?
3. Emphasize thoughtful examination of potential consequences, both good and bad.

This is the tough one. I know it seems reasonable for scientists to consider the consequences of their quest for knowledge but, in practice, it's not that easy. In my most pessimistic moods I can imagine all kinds of evil things that might be done with the knowledge that biochemists have gained over the past few decades. What should I do about that? Should we force our colleagues to stop doing research whenever we can imagine a dire consequence? Of course not.

Does that mean we should never consider the consequences? No, it doesn't. But keep in mind that scientists have been badly burned whenever they have publicly stepped into this morass. It was scientists who raised the issue of possible consequences of genetic engineering. Even though the scientists decided that the possible risks were minimal, the lawyers soon took over and we were stuck with silly laws that impeded research for a decade. Many of us remember that fiasco.

The responsibility for the misuse of scientific knowledge lies with those who misuse it and not with those who discovered the knowledge in the first place. You can't inhibit the search for knowledge on the grounds that it might be abused by someone in the future. That's why this part of the code of ethics is naive, irresponsible, and ultimately counter-productive. It attempts to put the blame on science when it's technology that's at fault.
4. Help society prescribe responsible use of the knowledge.

This is a legitimate role for scientists as long as they are explaining science. I don't have a problem with scientists describing stem cell research, for example. They can explain how it's done and explain the probabilities of success and the consequences of failure. They can describe how the new-found knowledge might help patients with various diseases and injuries. In other words, scientists can be a valuable source of knowledge.

But are scientists any better than the average citizen at "prescribing responsible use of knowledge" in the sense that Jones implies? I don't think so. Almost all American scientists would advocate funding stem cell research. Are they being ethical? What about those religious scientists who say that stem cell research is unethical? If both types of scientist signed the same code of ethics then what does it mean to say that scientists should "help society prescribe responsible use of knowledge"? What about those stem cell researchers who choose to stay out of the public limelight and get on with curing Alzheimer's? Are they unethical because they remain silent?
As you can see, science ethics is a complicated problem. Any attempt to regulate scientists based on some individual's definition of ethics is doomed to failure. I can't wait to see what Janet Stemwedel has to say about this.

4 comments:

  1. The referenced journal article itself does seem to be focused on the "life sciences," as its actual title indicates. The same issue of the journal also has an article about "ethics" for physics.

  2. I don't have a lot of trouble finding ethical issues associated with research outside the life sciences. For example, Bikini Atoll is still radioactive half a century after the rather dramatic physics research done there; people probably wouldn't support that kind of testing today. The mud volcano in Java is a current-events example of geology gone wrong, although that was a commercial drilling operation rather than research. I suspect most sciences hold the potential for a horrible outcome to an experiment.

  3. Micael says,

    I don't have a lot of trouble finding ethical issues associated with research outside the life sciences. For example, Bikini Atoll is still radioactive half a century after the rather dramatic physics research done there; people probably wouldn't support that kind of testing today.

    Bikini Atoll was the site of many nuclear weapons tests in the 1940s and 1950s. What does this have to do with science?

  4. I think this post has nailed most of the problems here, certainly more than I have thought of. As I said on the earlier post on ethics for science, I'm used to seeing ethics committees for framing work and ethical codes for guiding it. Scientists should be on the committees (and I see that the code supports this).

    To compare "to the Hippocratic tradition" means that the code confuses development (scientists) with application (users) as noted. The code should note personal ethical responsibility for applications (animal research or clinical trials) and suggest minimizing damage or pain, but the university ethical committees should in practice survey and control that the work is reasonable.

    I noted earlier that science is seldom large-scale industry, so responsibility for such things as the environment or neighbors isn't a concern for a code. I hope they don't go into that morass.

    I agree that the code should mention a responsibility to promote science in society, such as education, media presence, and membership on science ethics and budget committees. The example of the consequences of genetic engineering is excellent.

    It also shows that this process needs improvement beyond the current reach of scientists, but in risky cases it is better to err on the safe side. A small measure of comfort is that the public seems to recognize that the researchers were acting responsibly.

    "Define the ethical means of acquiring knowledge."

    It could also mean specifying that faking research, stealing research, acting against other researchers, suppressing knowledge, twisting research results for personal or political gain, squandering research money, and a lot of other activities involving nepotism, sexual harassment of dependents, et cetera, aren't ethical.

    Most of that is regulated elsewhere, and some of it isn't specific to science but to institutions in general, but it could be mentioned here anyway.
