Robert J. Levine, M.D.
Professor of Medicine and
Co-chair, Yale Interdisciplinary Program in Bioethics
New Haven, CT
May 14, 2004
Belmont Oral History Project
Interviewer: Dr. Bernard A. Schwetz, D.V.M., Ph.D., Director, Office for Human Research Protections.
DR. LEVINE: I'm Robert Levine. I have an M.D. degree and an honorary master's degree from Yale. And those are my degrees. I am now a professor of medicine and lecturer in pharmacology at Yale University, and also co-chair of the Yale University Interdisciplinary Program on Bioethics. And that's about it.
INTERVIEWER: Thank you very much. The focus of our discussion is going to be primarily on the Belmont Report, and the things that surrounded making the Belmont Report at that time; and kind of how it has survived so well in these 25 years. So, to start out, maybe a question of what it was about your background, at that time--more than 25 years ago--that brought you into being part of writing the Belmont Report?
DR. LEVINE: In 1974--which is the year I was asked to join the staff of the Commission--I was chief of clinical pharmacology at Yale University School of Medicine. I had begun to write about the regulations that the federal government--in those days, the Department of Health, Education and Welfare--was proposing as regulations for the protection of human research subjects. I thought they were, in general, very bad proposals, and started writing editorials, polemics, position papers, and so on. And then when they finally decided to have a commission and I was asked to join the staff, I refused, because in those days--and perhaps to this day--if you agreed to accept your salary from the federal government you could not criticize the government. And so they made an arrangement with me through which they called me a "special consultant," and gave me 90--I think, 90 or 95 percent of my salary, which meant that I was still free to criticize things if I saw fit. Fortunately, I didn't have to exercise that very much.
I had no education, or took no courses, in ethics except some related courses as an undergraduate in college. But during my time with the Commission I had a marvelous experience. It was kind of like being a post-doctoral student in bioethics, with the best and the brightest of the country serving as your dissertation committee, because all the papers I wrote--what the Commission called their "background theoretical essays"--were sent out to philosophers, lawyers, surgeons, nurses--people all over the country who then criticized my work. And then I got to work with it again, and most of the very large appendix to the Belmont Report consists of the papers I wrote in those circumstances.
INTERVIEWER: The National Commission wrote many reports, but the one that probably has more prominence today than some of the others is the Belmont Report. Did you anticipate at that time that, decades later, people would be saying the Belmont Report is probably one of the best reports that's ever been written by a committee or a commission for the government?
DR. LEVINE: I did not anticipate that. The Belmont Report, after all, is just a very slim volume. Thanks, in good part, to me, its appendices are longer than those of any other report. [Laughs.] That's because I tended to write very long papers in those days.
But all it does is simply name and flesh out a little bit the meaning of the basic three ethical principles, and then provide some advice on distinguishing research from practice.
INTERVIEWER: Did you anticipate that the Belmont Report would, in effect, become the basis for the regulations that we now have from the FDA and HHS?
DR. LEVINE: Well, yes and no. I mean, the Belmont Report consists of formal principles. And as is typical of ethical principles--and, particularly, the way ethical principles were considered in the mid-1970s--they're written at a level of formality and abstraction so that they don't direct any particular action. And then part of the bioethics project of the '70s was that you would write ethical norms or rules--what the Congress called "recommend guidelines." And the purpose of these guidelines was to show how these abstract principles could be made applicable to the actual activity that they were being interpreted for; in this case, research involving human subjects.
So the Belmont Report, as a report, does not--none of it is in the regulations. What's in the regulations is exactly what Congress requested or required, and that is guidelines that the Commission recommended, which were converted by the Secretary of HEW--Health, Education and Welfare--into regulations.
In almost every case, the regulations came out almost exactly the same as the Commission's recommendation, the exceptions being primarily in the field of what the National Commission called "those institutionalized as mentally infirm," which, unfortunately, ended up with no regulations, I think because the Commission disbanded before the time came to promulgate those final regulations.
The Department of Health, Education and Welfare also tried to introduce drastic changes into the special regulations for research involving children, but the Commission was still in office then, and they said, "No, that's not what we recommended," and they straightened them out and then did go ahead and promulgate regulations that were based on the Commission's recommendations.
If I may say so, I think the full expression of the Commission's contribution to the field of research ethics is best realized in that report "Research Involving Children," because there they had fully come to terms with the meaning of their new definitions of research and practice; where, in the Report on Research Involving the Fetus, they still were using that grotesque dichotomy between therapeutic and non-therapeutic research. I say "grotesque" because it inevitably generates regulations that make no sense.
INTERVIEWER: But there are people who feel that some parts of the regulation need to be rewritten today; the one in particular about prisoners. How would you think it would be best if someone chose to rewrite a subpart of the Common Rule? How would we involve the people who had the background in ethics and philosophy as we tried to rewrite regulations today?
DR. LEVINE: I think a model for this was set when the regulations--the so-called Subpart B regulations for the fetus, in vitro fertilization, pregnant women--were rewritten, and they needed rewriting because, as I said just a few minutes ago, the Commission's recommendations were based upon the old dichotomy between therapeutic and non-therapeutic research. Everybody knew the regulations made no sense. But it took years for them to get around to rewriting them.
The people who rewrote Subpart B were based primarily in what was then called the Office for Protection from Research Risks, with consultations with various offices in Health and Human Services. And they--as I understand it, they were not permitted to formally engage outside consultants in this rewrite process. But I can tell you that they informally engaged them, and some of the outside consultants that they informally engaged were very much involved in the development of the Commission's recommendations on children, and the revision of the definitions of research and practice. I can tell you that because I was one such informal consultant. And it came out quite satisfactorily.
I think that the federal government has got to find a way to include outside experts in these rewrite processes. Whatever the rules might be--I mean, they can have some sort of role in this, whatever role would be permitted by the way the regulations are now written. I honestly don't know what the rules are for when you can engage consultants, and for what sort of thing.
But I think if they go through a process that is more open, and therefore less tediously cumbersome than this end-run of informal consultations, they should be able to do a good job.
Now, the prisoner recommendations--I said a little while ago that the children's report was the Commission's finest hour. The prisoner report was perhaps its worst hour. And I did use a bit of my 5 percent free time to criticize things at the time.
The big mistake the National Commission made is that it confused the agendas of prison reform and protection of human subjects. Their report was contrary to the evidence they received through hearings at places like Jackson State Prison, and site visits to Vacaville and Walla Walla, Washington. What they did is they set up a bunch of criteria for justification of doing certain types of research in prison, and they thought that the drug companies--the pharmaceutical firms--had such a powerful vested interest in keeping their prisoner research going that they would accomplish the agenda of prison reform in order to do so.
We had expert consultants who were themselves sociologists who had been prisoners, who said, "No. The only thing you will accomplish by this is to get the only people out of the--the only people who ever showed any concern for the well-being of the prisoner are the drug company researchers. You're going to force them out." And it came to pass.
In fact, when the FDA was on the eve of finalizing, or promulgating its final regulations, the prisoners at the Jackson State Prison filed a class action suit saying that the proposed FDA regulations were an unconstitutional deprivation of their liberty to serve as research subjects. And it was on the eve of putting out those regulations in final form that FDA backed off and withdrew the proposal.
But that expressed exactly the sentiments of the prisoners as we heard them during our visits to the prisons. I will never forget, when we met in Jackson State Prison, with the group called the trustees, and the chair of the trustees committee--these were all mandatory lifers. He welcomed us with these words: "Ladies and gentlemen, you are in a place where death at random is a way of life. We have noticed that the only place around here that people don't die is in the research unit. Now, just what is it you're here to protect us from?"
Wow, what an opener! That's the lead-in of the prisoner chapter in my book on ethics and regulation of clinical research. That's why I remember it so well. I wrote it down at the time. [Laughs.] It's very interesting.
INTERVIEWER: In general, how was the Belmont Report received at that time?
DR. LEVINE: In 1978? I think it was widely heralded--widely celebrated. It didn't say anything that anybody could experience as a threat. It said, "Respect persons." It said "Do good, don't do harm." It said, "Be fair." That's wonderful.
You don't begin to see where your problems might be--or what some people might call the "obstacles"--until you begin to see the regulations that interpret what it means "to respect persons," what it means "to be fair." Then, after they see that, they say, "Uh-oh, that's a little different than we would have hoped for."
INTERVIEWER: How much of the thinking of the team of you who wrote the Belmont Report was dominated, or impacted at least, by the Tuskegee experiments?
DR. LEVINE: I would say, in those days--well, first: there can be no doubt that in the past decade--give or take five years--that the Tuskegee experience has become the number one metaphor for evil in the name of research. When somebody wants to point at a project and say, "That is the worst thing I can imagine," they say, "That reminds me of Tuskegee."
That position was held in the 1970s by the Nazi experiments. And so almost everything else paled in comparison with the Nazi experience. I mean, all you have to do is spend a half-hour at the Holocaust Museum in Washington--I mean, it's one thing to deprive people of treatment, to tell them lies instead of informed consent, and to do that sort of thing. It's quite another thing to cut them up into four quarters and drop them in vats of the sort that's on display on the top floor--you know, to throw people in freezing water. So, the freshness of the Nazi experience meant that it was going to dominate any thoughts of what's evil in research.
I think now it has yielded to Tuskegee, because, one, Tuskegee is a very American experience. I mean, it happened right here. And, two, I think the people who lived with and through the time of the Nazi experiments are moving on; they're retired, they've passed away. But there can be no doubt that Tuskegee now holds the place that the Nazis once did.
INTERVIEWER: Do you think we could see another Tuskegee type of situation emerge today?
DR. LEVINE: Ahh--there are respectable authors who claim that they have seen it. Isn't that what Marcia Angell said when she reviewed the research, trying to validate the use of the short-duration AZT to prevent perinatal transmission of AIDS?
I disagree with her--but, yes, there are going to be people--that was, what, 1997 or 1998 when she wrote that?--it's going to happen again. Next year somebody will see something that looks as bad as Tuskegee. I think the odds that they actually will see such a thing are small.
Today we're not talking about no informed consent. We're not talking about endangering people on a grand scale. We're talking about--let's say--an individual like Jesse Gelsinger, a tragic situation, but mismanagement of a single case, rather than recruitment of 600 illiterate men in conditions of extreme prejudice. I mean, the Black man in Alabama in 1932 didn't have much by way of rights or concern for his welfare.
INTERVIEWER: Are there issues of human subject protection today that you wish the Commission had been able to anticipate?
DR. LEVINE: I wish they had anticipated all of the concern there is today about just what is the distinction between research and practice.
What the Commission was asked to do by the Congress was to consider the boundaries between research--biomedical and behavioral research--on the one hand, and on the other hand, the routine and accepted practice of medicine or behavioral therapy. I think the Commission responded to that charge very well. It said, "There is no boundary." You know? It's like being asked to consider the boundary between North Dakota and Nebraska--you see.
The boundary problem is that when you do research on practices, you start confusing what the overall goal of the project is. And their definition said, "Now you've got research here, you've got practices here. And with some new practices, which we're going to call 'non-validated practices,' or 'innovative practices,' you've got research on the practices." That's where the confusion is. But it gave us a way to sort it out.
By doing this, it sort of passively defined research as it applied to social and behavioral science as it was understood in the mid-1970s. But now the question is being asked, "What about in the field of public health? Outbreak investigations? Surveillance? These use the same methodologies as research, and yet they are routine and accepted practices for a place like CDC. Should they be submitting to IRB review?"
And so--you've got the same problem in the field of quality improvement--you see? We are manipulating everything every day, hoping that each day we're going to do a little better. And we want to throw in some evaluation so that we can make a statement about whether we're doing better or worse. Can we even put together a protocol that tells you what we're doing, when we're responsive, day by day, to what we saw yesterday? That tells us what we're going to do tomorrow. It's almost like anthropology on a grand scale, which is another place we have a problem. Anthropologists are saying they're not being evaluated properly because, you know, very often they walk into a situation with no hypothesis. They evaluate a good project as one that generates a good hypothesis.
Now, what I have seen--and let me issue a little disclaimer here--it was my job to write the definitions of research and practice. So when I say I think we did a good job in '78, that can be construed as self-congratulation--a little. But I had the help--a lot of what I wrote was formed in debate with the members of the Commission, so I can't claim that I did it all alone. In fact, if Joe Brady hadn't been on the Commission it would have based the whole definition on intent. But radical behaviorists don't concede that there is such a thing as intent. So there we were. I said, "It depends upon what you're intending to do."
Now, I think what we have to do is to concede that the distinction between research and practice--and therefore the definition of research--was not intended to cover some of the activities we're discussing today. And so what I've been advising people who are relentlessly trying to redefine "research," so that what will be reviewed is only what they want to have reviewed--I've been telling them, one, you're not the only people who want to redefine "research." All the other people want to redefine it in a different way that would be bad for your cause. So you're all fighting with each other, even though you don't even--you're not even aware of each others' existence.
My recommendation is to go the exemption route; the same way we have for all six of the exemptions, but particularly the public benefits exemption, to say that there's something so important about what we want to do that we should not have to go through the full panoply of regulation by the Common Rule and all of its implications. And I think that would be a much better approach to doing things. Of course, that will depend upon Health and Human Services and, to no small extent, your own office agreeing that that's a good way to go with this problem.
But that may not be the best way to go with the problem. The process of redefining research, though, will certainly cause more confusion than anything else.
That's why we continue to have so many discussions about what to do with it, without reaching conclusions.
INTERVIEWER: During the discussions that brought about the principles, how much of that was based on social and behavioral research, as opposed to biomedical?
DR. LEVINE: All of my writings, right from day one, included social and behavioral. But the mainstream of the Commission's deliberations were biomedical. The main stories that created the need for the Commission were biomedical stories. They were impressed with the ethical problems of the randomized clinical trial. They were impressed with Phase I drug studies. They did not spend a lot of time looking at the problems peculiar to social and behavioral science.
There's one piece in the regulations that's directly responsive to concerns raised by social scientists. And when I present it that way to many people--even experienced people--they can't guess what it is. But it's the one that restricts the IRB's deliberations so that they are not to take into account the long-range implications of doing this research. That was particularly in response to social scientists saying, "That can be a very political problem. IRB members can say, 'We don't like the long-range policy implications of this research, and therefore, cut it out.'" But, as a matter of fact, almost everything that the IRB does in reviewing, let's say, research in the field of therapeutic development, has definitely in mind the long-range implications, and no one is troubled by that--I mean, the long-range implication is that it's going to make it easier to treat tuberculosis or something.
INTERVIEWER: One of the issues that we're focusing on a lot these days is at least financial conflict of interest, if not institutional and professional conflicts of interest. To what extent was that discussed in the deliberations--because the Belmont Report is relatively silent in mentioning them by name.
DR. LEVINE: I think it's absolutely silent. It's generous of you to try to say we thought of it.
Conflict of interest was something we were all aware of, and yet we considered it outside of our scope. Not that we ever stood up one day and said, "Should we include conflict of interest?" "No, we consider that outside our scope." We passively left it out. And yet everyone was aware of conflict of interest. In the hearings that led to the writing and passage of the Kefauver-Harris amendments to the Food, Drug and Cosmetic Act--the hearings that were held in the aftermath of thalidomide--conflict of interest was a big story. Even Kefauver himself--after he died, they showed the conflicts he had, owning stock in several corporations that were being subjected to the hearings. Oh, people wrung their hands: "Isn't that terrible!" But they said, "That's another thing. That's not the same. We're concerned with protecting human subjects at the point of what might you do to an individual to injure that individual's welfare or to disregard his or her rights."
And we also left out plagiarism. We left out everything that's in the Office of Research Integrity.
But, you know, the year I went to NIH as a clinical associate, I got a big lesson about how we dealt with faked data. I went to hear a paper that was to be presented by a young man from Yale. And he didn't show up; his professor showed up and said, "We found that he faked his data." And we never heard that name again. Gone.
That's what the scene looked like in those days; grievance committees.
INTERVIEWER: Research involving humans today is much more international than it was in the '70s--
DR. LEVINE: Yes and no. We're getting more and more aware of it, primarily through HIV-AIDS. But, you know, 1975 is when a group called the CIOMS--the Council for International Organizations of Medical Sciences--first convened its task force to develop guidelines for international research. And it was primarily, in those days--it started as a project in the field of what they called "biologicals," which in Geneva meant vaccines. And they actually put out their first batch of guidelines in 1982. So, there was an awareness that there were special problems crossing national boundaries.
The National Commission did not pay attention to that aspect of its work. I don't think the National Commission at any part of any report makes reference to what are the special problems when an American goes to Malawi or Cote d'Ivoire or somewhere. So that's definitely one thing it left out.
Much is made of the fact that the international documents have all copied the Belmont principles, and many people have said, "You see, that shows how universal they are."
That's wrong. I can tell you how the Belmont principles first got into international documents. It was in 1986 or '87. I was invited to be part of a group for the World Health Organization--for WHO and U.N. AIDS--to develop criteria for guidelines for HIV prevention vaccine research. And I was put in charge of the legal and ethical writing group. And at the end of the first day, the proposal was made that I should include the Belmont principles in the report of the legal and ethical group. And I said, no, these are really American principles.
And so that night the leader of our group--the leader of the overall project--wrote the Belmont principles and presented them to us as a draft. And I said, "They really shouldn't be there." And the group said, "Oh, yes, we want them there." I said, "Well, in that case, if you want them there, let's get them right. Because what you've written is not what they are." So I corrected them, and ever since that time I've been given the credit--or the blame--whatever you will--for introducing the Belmont principles into the international documents.
One thing you notice, though, as you look at all these documents--U.N. AIDS, WHO, CIOMS--is that they all mention respect for persons, beneficence and justice in the preamble, and they're never heard from again in the entire rest of the document. There's no explicit appeal to any of these.
Why do I say these are peculiarly American? I'm not going to bore you with the whole--I can give an hour lecture on that. But respect for persons, let's say--as the principle was articulated by Immanuel Kant--could be considered universal, particularly at that level of formality and abstraction. But once you begin to say that the primary attribute of the principle of respect for persons is autonomy, then you've become American. Autonomy is very much less a dominant value even in the rest of Western civilization. And in much of the rest of the world, communitarian concerns will compete on an even keel with individual self-determination.
So this is why I say there is not the universality that there appears to be there. Our very, very different way of seeing self-determination from, let's say, a citizen of--well, most citizens in Asia--accounts for why we have such profound misunderstandings over whether or not we're going to get informed consent, and what form informed consent should take when we're trying to do research, let's say, in China or in Malawi, or somewhere.
INTERVIEWER: This is even more important today than it has been in the past, at a time when some 30 percent of data to support new drugs--for the FDA--are from studies done outside the U.S.A. Coming back to the three principles, there's been a lot of discussion about whether or not they are independent, as opposed to sequential, and the question of whether or not respect for persons trumps the other two. How much agreement was there within the Commissioners--between the Commissioners--about whether or not they should be independent?
DR. LEVINE: Well, first, the issue was never joined during a meeting of the Commission. I have done my best to put together everything I can find, and everything I can recall about the Commission's deliberations, and I come out with the conclusion--which I published in my Ethics and Regulation of Clinical Research book--that the Commission intended that, at least in the abstract, each of these principles had equal moral force; no thought of "trumping." I don't even know if the Commission had any bridge-players. But I think "trumping" was introduced by my friend Robert Veatch, who did not meet with the Commission, and who introduced the notion of trumping only after 1978. So it was not contemporaneous.
So, it's important to recognize this, because the requirements that derive from the three principles are often inconsistent with each other. Almost invariably there is going to be tension between respect for persons and distributive justice. And the way I like to put it in my teaching is that this virtually guarantees that we will have productive discussions for as long as anyone remembers the Belmont Report.
It's not as if you can say, "Well, first let's take care of autonomy, and then we'll worry about 'doing good.'" I don't think that's right.
Now, I don't know if you're interviewing Al Jonsen--you're interviewing Steve Toulmin, I see. Jonsen and Toulmin, I think, would say, well, we wrote principles, but we weren't guided by them at all; it was all an exercise in casuistry. I think casuistry--I mean, it's natural for human beings to do a little bit of amateurish casuistry--"Well, how did we handle this one last time out"--you know? But I don't think we ever appealed to casuistic principles--you know, identify paradigm cases and so on and so forth.
I think you can look through the--you don't have to look at the appendices, you look at the parts of each report called "Deliberations and Conclusions," and you'll see statements that make it very clear that this is principle-based. One of the statements on justice has to do with equitable distribution of burdens, however minor those burdens might be. And I think that's verbatim. That's a very principled stand. That has nothing whatever to do with any other theoretical model of ethics.
INTERVIEWER: Do you think there's need to rewrite the Belmont Report, in view of the situation today?
DR. LEVINE: I think there's always the need to update our understandings of ethics from time to time. I would not say it's time to rewrite the Belmont Report. But I think it's time to write things that are fresh and responsive to the needs of the year 2004, and perhaps for the next five or 10 years.
And we always have a need to do that. And this comes out in many, many forms. For example, the American Society for Bone and Mineral Research recently put out its document on placebo-controlled trials in evaluation of new drugs for the treatment of osteoporosis. Without reinterpreting--without redoing--the Belmont Report, they give us a fresh vision that responds to how things look today.
Now, you've got a lot of very thoughtful, respectable, ethics people in on the writing. You know, they weren't all a bunch of osteoporosis doctors.
INTERVIEWER: There's been a shift away from protecting human subjects from risk, toward permitting access to potentially useful, helpful innovations, whether they're drugs or devices or whatever else. Is that shift something that is consistent with the principles?
DR. LEVINE: Sure. I mean, what you're really talking about is interpretations of the principle of justice--what the Commission called "justice"--which most philosophers would call "distributive justice." It's all about distributing the burdens and benefits of an enterprise fairly or equitably. And you use the same principle to distribute burdens as you do to distribute benefits.
In the 1970s, as the Commission was writing, the prevailing view was that participation in research was a great burden--that research participation was always potentially harmful, always potentially exploitative. One of the major papers--consultant papers--that the Commission had written was by the Boston philosopher-ethicist Marx Wartofsky. His paper was on "doing it for money." This is not a paper you write about a field where you're focused on distribution of the benefits. This is not the title you would choose for such a paper.
I think even in the '70s there were a small number of people who called attention to the benefits of participation. A paper I co-authored with Commissioner Karen Lebacqz referred to the benefits. We were examining the oft-repeated accusation against the Veterans Administration that it was exploiting VA patients as research subjects. And our paper said, well, you know, for all we know these veterans might think it's a benefit to get all this attention, and new anti-hypertensive therapies. Why don't you get them together and ask them? That was '77, '78.
I don't think, though, we really turned it around. We didn't really take seriously the fact that there was a responsible voice calling attention to the benefits of research participation until '86. And that's when we first heard from a group that came to be known as the AIDS activists. And they said, "We keep reading in the paper that participation in this clinical trial--this placebo-controlled trial of AZT versus placebo for the treatment of AIDS--is a burden. Well, from our perspective, it's a benefit. It's the only way these people can get even a 50 percent chance of getting the only drug that anyone thinks is addressed to the cause of this disease. Moreover," they said, "participation in any randomized clinical trial is the only way many of these people can even get a decent physical examination." You may recall, in the mid-'80s health insurance companies were desperately trying to find ways to exclude people in the at-risk groups for AIDS. They couldn't even get to see a doctor.
That's when it began to turn around. After that, then we said, "Well, now, last year we told you you had to keep women who could get pregnant out. Now, if you don't recruit them, we're going to penalize you in your scientific priority score at NIH, and we might reject your new drug application." Very different way of looking.
INTERVIEWER: There is that--the prevailing thought at that time was to protect the vulnerable groups. As you say, now there are more women in trials than there are men. We have pushed for research in children, and there's a fair amount of research going on--primarily of a social and behavioral nature--on prisoners. Does that make you nervous?
DR. LEVINE: Nope. If you believe Curt Meinert's studies, there were more women being studied than men even before the new policies were promulgated. Curt was part of the Institute of Medicine group.
And he came out with that astonishing conclusion, which I had written years earlier based on no data whatever, but I said you look around my metabolic research unit, and mostly what you see are middle-aged women, because--and it's got nothing to do with ethics, it's because--I might get struck down for saying this, years later, but men had to go to work. That's before women had to go to work. Times do change.
DR. LEVINE: If you edit that piece, do it very carefully.
–END OF INTERVIEW–