On the Human Condition
Volume XXVIII Number 2
Photo © Tyagan Miller
The Science of Doing Right
Human experimentation. The term may call to mind Frankenstein or, far worse, the real-world horrors of concentration camp tests conducted by German doctors, the 40-year Tuskegee study of syphilis left untreated in African American men, or Cold War-era human radiation experiments sponsored by the U.S. government.
In 2006, the world of research using human subjects is arguably a very different place. Codes of ethical conduct and regulations regarding human subjects have proliferated--chief among them the Nuremberg Code of 1947, the Helsinki Declaration of 1964, the Belmont Report issued by the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research in 1979, and the federal government's so-called "Common Rule" established in 1991.
Since the mid-1970s, several national commissions have been formed to advise U.S. presidents on bioethical issues. A federal Office for Human Research Protections, operating within the Department of Health and Human Services, now exists to help researchers follow ethical principles and regulatory requirements involved in human subjects research. International declarations and resolutions regarding human rights and scientific research abound. At the local level, university policies and procedures detail the legal and ethical obligations of researchers using human subjects, while Institutional Review Boards (IRBs) scrutinize faculty research projects as they are proposed and carried out.
So why would scientists need an intensive two-month seminar on the ethics of research with human subjects? Because despite all the codes, commissions, laws, and resolutions, ethical education for today's scientists is lacking, says Kenneth Pimple, the director of teaching research ethics programs at Indiana University's Poynter Center for the Study of Ethics and American Institutions.
Pimple leads Scientists and Subjects (S & S), an ethics education course he started with National Institutes of Health funding seven years ago. The course and others like it around the country are crucial, says Pimple, because inside the lab, intuition is not enough.
Fast-forwarded by the sequencing of the human genome, scientific and biotechnological discoveries are raising all sorts of new questions about the effects and implications of life sciences research, especially when human lives are involved. But even as the study of moral issues in biological and medical research has mushroomed and ethical guidelines have accumulated, systematic training for scientists tackling such issues has not kept pace.
"It's not always obvious when something is wrong," he says. "You can be a good, honest, and honorable person and still not be able to figure out some of today's questions [about the responsible conduct of research]. Scientists need technical expertise, taught explicitly."
Why is it hard for today's working scientists to know how to do the right thing? Pimple offers one reason: decades ago, during the Cold War era, research teams were small, and laboratories were personal. "Everyone knew everyone else, so if someone did bad science, everyone in the field knew," he says. Now, there are many more scientists, much fiercer competition for scarce resources and funds, and an exponentially more complicated regulatory landscape. As the expense of research, the speed of new technologies, and the pressure for faster "technology transfer" have intensified, so have temptations to cut corners. As Pimple puts it, scientists now face far more, and more consequential, "occasions for sin." The international scandal involving research on stem cells and human cloning falsified by South Korean scientist Hwang Woo-suk is a recent, and egregious, case in point.
Pimple adds that moral conundrums can be especially confounding for scientists due to the nature of scientific training itself. Immersed in the empirical and the objective, scientists are schooled to pinpoint "the" right answer to a question. During his annual seminar, Pimple says, he often points scientists to an obvious parallel.
"Actually, there are no right answers in science, only approximations to right answers, the best answers to be found under current constraints," he says. "I tell scientists it's the same thing with moral action--no single right answer, but many possible answers. All of the possibilities may be uncomfortable and imperfect, so you ask, which one is the best I can do right now?"
Throughout the S & S seminars, which are conducted online, Pimple works with scientists (and research administrators such as those who serve on IRBs) to develop what he calls "robust moral imaginations"--an expanded ability to parse the moral questions inherent in research using human subjects. A key part of that ability is learning to recognize a moral problem in the first place, says Pimple.
"If you go through a course of study to become a scientist and no one ever talks about the types of problems that crop up, you may be taken by surprise when something does happen and be far more likely to do something that isn't right. If I can get people to listen to what makes a strong moral argument, then they'll know such arguments exist and may be able to use them later."
"Active moral deliberation" is the goal, he says. "I want participants to be able to recognize what they need to know, what they don't know yet, and when they have to make a decision, regardless of what they still don't know."
Participants in Pimple's seminar, who join from various academic levels and countries, spend much of their time spinning out "what if?" scenarios. "We talk about what parameters of a case would need to change in order to change the morality of it," he says, offering an example: Say you're a scientist with a risky but potentially groundbreaking experiment at hand. You need 100 patients to get valid data. That puts too many people at risk. But you could still learn something by conducting the experiment with just 20 people. How do you weigh the value of limited findings from 20 subjects against the value of definitive findings that would put 80 more people in danger?
Asked another way, the question might be, could it ever be right to do the wrong thing--for instance, to risk one person for the good of many?
"The good of science is entirely contingent on its relationship to human society," Pimple answers. "If a single bad action has tremendously good repercussions--if we have to sacrifice one life to cure cancer, for instance--we might all think that's a bargain we ought to take. But if we sacrifice one person to cure cancer, then who do we sacrifice to cure AIDS? And on and on. A bad action is suspended in a web of social relationships and human history, and it has the potential to form a habit for researchers and for society."
Although Pimple is firm when he cites respect for a person's rights as an absolute in bioethical decision making, he's the first to admit that risk-benefit analyses in human subjects research can be difficult and, in his word, "maddening." Partly this is because, unlike many human subjects studies conducted 30 or more years ago, today's human experimentation can in fact lead to good things. Pimple points to clinical trials for cancer and AIDS, for example, in which experimental treatments have extended, or saved, lives. Even clandestine radiation experiments of the Cold War era yielded improvements in nuclear medicine and radiation therapies.
"How do you balance the claims of the risk and benefit to the subject versus the risk and benefit to society, the risk of one person's life versus the potential for tremendous benefit to society?" Pimple asks.
One answer is, through talk and more talk, such as the ongoing critical discussions among scientists, IRB members, patients, and ethicists that occur in Pimple's seminars. Various ethical regulations and declarations help some too. The 1979 Belmont Report's statement of three foundational principles for human subjects research--respect for persons, the obligation of beneficence, and justice--is "lucid, persuasive, and absolutely still applicable," Pimple says. In fact, he's come up with something of a code himself, a Top 10 list of the most important things to know about research ethics (see http://mypage.iu.edu/~pimple). No. 1 is "Be honest."
Close behind honesty come candor and courage, he says. In today's scientific world, researchers need "the courage of their convictions, the moral courage to stand up and say no."
It's not that Pimple sees the world of human subjects research as a swamp of moral problems--"it's more like a golf course," he jokes, with rough patches here and there. Nor does he take a wholly dim view of human nature. On the contrary, he says, "my general view of humanity is that most of us want to do the right thing most of the time. Most of us have an honest desire to do the best we can. The trouble is, we're really awful at fulfilling that desire.
"We're really skilled at convincing ourselves that 'the right thing' and the expedient thing are identical. Our ability to rationalize our own behavior is incredible."
Which makes the broad participation in the Poynter Center's Scientists and Subjects seminars especially heartening, he says. "We live in a 'fallen world.' We're never going to eliminate sin or error or bad behavior. But a lot of scientists and others are willing to put extra effort into better understanding the ethical underpinnings of what they do.
"Acting ethically makes life--and research--more complicated and more difficult," Pimple says, "but I'm convinced it makes both better."
Lauren J. Bryant is editor of Research & Creative Activity magazine.