Andrews, Lori B. "Money Is Putting People at Risk in Biomedical Research," The Chronicle of
Higher Education, 10 March 2000, pp. B4, B5.

When the Food and Drug Administration halted research into human-gene therapy at the
University of Pennsylvania in January, after the death of a research subject, Jesse Gelsinger,
administrators at universities across the United States took notice. If a research program at an Ivy
League institution, run by a scientist as prominent as James M. Wilson, was being accused of
serious breaches of law and ethical standards, what problems lurked in the growing number of
studies at their own institutions? The Universities of Alabama, Colorado, and Illinois, as well as
Duke and Virginia Commonwealth Universities, also have recently faced embarrassing public
disclosures about their research programs. For example, federal regulators charged scientists at
the University of Illinois at Chicago with failing to obtain subjects' informed consent in a study
of an antipsychotic drug, and charged Duke's institutional review board with not monitoring
continuing research.

Can other institutions stay out of trouble by giving researchers more education about their ethical
and legal obligations? Or do we need radical new laws to remedy the situation?

The number of research studies on campuses is soaring, and the potential risks for subjects are
increasing. In the 1970s, as an undergraduate majoring in psychology, I earned either course
credit or a few dollars by participating in studies in which the major risk was inconvenience or
boredom. Now, healthy college students can earn upwards of $2,000 for taking part in medical
studies but may face permanent harm.

The financial incentives for researchers are escalating, too. Last year, The New York Times
reported that pharmaceutical companies often pay doctors handsomely, in some cases $1-million
per year, for enrolling patients in studies. The result: Doctors from one field enroll their patients
in drug trials in another field. For example, asthma specialists run studies on psychiatric
medications. Patients who are not appropriate candidates for a study have received drugs for
conditions they did not have, sometimes without even being told that the drugs were
experimental. That not only subjected them to unnecessary risks, which is malpractice, but also
compromised the study results.

Even $1-million a year is small potatoes, though, in light of the incentives created for university
researchers in the 1980s by the federal laws governing technology transfer. Before that time,
research conducted at universities and supported by public funds belonged to the public. But the
new laws give academic researchers intellectual-property rights; now they can, for example,
patent a gene they discover or an invention they make, even if the entire enterprise has been
financed by taxpayers through the National Institutes of Health or another federal agency.

Academic researchers can form biotech companies or enter into joint ventures with them. Penn's
James Wilson, for example, founded a gene-therapy company. Many university biologists have
become millionaires, with stock options and fees as directors of corporations far exceeding their
university salaries.

Is it any wonder, then, that federal investigators from the Food and Drug Administration and the
N.I.H.'s Office for Protection from Research Risks find that, even though federal regulations
require scientists to disclose the risks of participating in research to potential subjects, the
researchers often underplay the risks and enroll inappropriate candidates? At Penn, the
informed-consent form that Jesse Gelsinger signed did not disclose the fact that two monkeys
had died after receiving the gene-therapy vector that he, too, was given. And researchers had
accepted Gelsinger as a subject despite the fact that, according to the government, his liver
function was not good enough to meet the study's criteria.

Other universities have covered up the risks of gene therapy in studies conducted on their
campuses. Disregarding federal rules, researchers reported promptly to the N.I.H. only 39 of the
691 deaths and illnesses suffered by participants in gene-therapy research who had received the
same vector as Gelsinger.

AT CONGRESSIONAL HEARINGS in February, sponsored by Sen. Bill Frist, a Republican
from Tennessee, the testimony by federal regulators was disappointing. It sounded almost exactly
like that of the biotech companies. Jay Siegel, of the F.D.A., pointed to the "potential for
tremendous patient benefit" of gene therapy. He reported that "since the first human gene transfer
in the late 1980s, human gene therapy products have become one of the fastest growing areas of
product development."

Referring to "products" makes it seem as if valid gene therapies are being marketed already. In
reality, gene therapy has been a dud: not one successful experiment has been reported in a
peer-reviewed journal. Yet, thanks to the hype by companies and regulators alike, 36 percent of
the public, according to a survey conducted by the National Center for Genomic Resources,
thinks that gene-therapy treatments have already succeeded in curing human diseases. Research
volunteers thus may not realize the preliminary nature of the experiments they have agreed to
participate in.

The business perspective among researchers is even more obvious. As far back as September
1995, at a meeting at Stanford University on genetics research, faculty members and graduate
students filled the auditorium when George Poste, of SmithKline Beecham, was speaking. But
when Nancy Wexler, then chair of the N.I.H.-D.O.E. Working Group on the Ethical, Legal, and
Social Implications of the Human Genome Project, took to the stage to discuss the risks to
participants in genetic research, many members of the audience left. I overheard researchers in the
hall discussing how to handle psychological risks: not to their subjects, but to the researchers
themselves, when there was a dip in the value of their biotech stock.

Whether the corporate mentality of many researchers is part of the reason for the abuses recently
uncovered in research with human subjects, or whether investigators have simply become more
aggressive, is an interesting question. It is clear, however, that the violations of the federal
regulations protecting human subjects are serious.

According to the American Bar Association's Coordinating Group on Bioethics and the Law,
between January 1997 and June 1999 the F.D.A. issued 36 warning letters to investigators
undertaking research with drugs, medical devices, and biological products. The reported
violations are similar to those found by the Office for Protection from Research Risks for human
research in general. They include failures to submit studies to an institutional review board, to
obtain informed consent, to report adverse effects of the research, and to determine the suitability
of research subjects' participation, as well as inadequate record-keeping.

The F.D.A. and the risk-protection office also found that some universities did not seem to be
taking seriously the review of human-subjects research. Their I.R.B.'s had too few members for
the volume of studies they were expected to review, and their researchers had inappropriately
claimed that studies fell within regulatory exceptions and so did not need to be approved by the
institution's I.R.B.

Administrators and researchers at many colleges and universities seem to think that they should
be exempt from regulation because they work at nonprofit institutions. Yet human research
subjects deserve protection no matter who is conducting the study. That is particularly true in an
era when universities are acting more and more like businesses, even to the point of forming
for-profit companies to commercialize their professors' research.

The University of Minnesota developed, manufactured, and sold an antirejection drug for use
in transplants, making more than $85-million. It was forced to stop marketing the drug when
federal investigators found that the university had failed to get F.D.A. approval for the drug, and
had not properly reported serious adverse reactions, including deaths. The government sued the
university for allegedly selling an unlicensed drug, fraudulently billing Medicare for the drug,
and submitting to the N.I.H. false grant applications claiming that Minnesota was distributing the
drug at cost rather than making a profit. The university settled the suit in 1998 for $32-million.

The academics involved in the affair seemed to feel that they were above the law. An internal
university report suggested that "cost recovery in any form is strictly illegal, but ... when it is
done on behalf of the University, the FDA will not take action."

WHAT CAN BE DONE to better protect the subjects of academic research? Individual colleges
and universities need to put more resources, both people and money, into their institutional review
boards. They need to do a better job of educating their employees about the federal regulations
and state laws that govern human research.

Academe as a whole should also devote more effort to the question of I.R.B.'s. We need research
on how well the system is working, mechanisms to evaluate the performance of individual
boards, and new ways to reward faculty members who serve on I.R.B.'s, such as offering them
release time from other duties and counting I.R.B. service in promotion and tenure decisions.

Universities must also guard against conflicts of interest among members of I.R.B.'s. The
risk-protection office found that some review-board members do not recuse themselves from
voting when they have such conflicts. At Duke, both the director and the assistant director of the
medical center's Office of Grants and Contracts, which is responsible for bringing in grants, had
improperly served as voting members of the medical center's I.R.B.

The government needs to do a better job of monitoring research, including conducting periodic
reviews of I.R.B. files, and of making available information about the risks involved. Serious
side effects uncovered in studies paid for by industry are sometimes labeled proprietary
information and, though they are reported to the F.D.A., are not disclosed to the public. Such
censorship should not be allowed.

The Department of Health and Human Services has not provided sufficient personnel or
resources, in its Office of Biotechnology Activities, for example, to complete important public
databases about research risks. Such data are crucial both to people's decisions about whether to
participate in research, and to investigators' determinations about when research is too dangerous
to continue.

More generally, Congress needs to reconsider the laws governing technology transfer. By trying
to ensure that products are brought to market quickly, lawmakers have so commercialized
academe that there are now few neutral scientists who can provide credible assessments of the
risks and benefits of new medical developments. We should revise the laws so that academics
and universities cannot profit from the sale of treatments created through research that the
taxpayers have paid for.

The federal regulations developed in the mid-1970s for use by a kinder, gentler generation of
researchers may be outdated now. Last August, the bar association's Coordinating Group on
Bioethics and the Law initiated a major project to analyze the research abuses of the past three
years. The group is assessing what proportion of abuses violate existing regulations, and what
proportion could have been prevented only by new regulations. Many things have changed since
the current federal regulations were adopted, including greater financial incentives for
researchers, more and larger conflicts of interest, more research on tissue samples collected for
other purposes, and greater risks to people's privacy when genetic information about them is
obtained.

"When lives are at stake, and my son's life was at stake," Paul Gelsinger told Senator Frist and
his colleagues in February, "money and fame should take a back seat. The concern should not be
on getting to the finish line first, but on making sure no unnecessary risks are taken, no lives
filled with potential and promise are lost forever, no more fathers lose their sons."

Academic researchers seem to be focusing on their new role as businesspeople. Universities and
regulators need to remind them of their obligations to their subjects and to society.

Lori B. Andrews is a professor of law at the Chicago-Kent College of Law at the Illinois Institute
of Technology, and director of the Institute for Science, Law, and Technology there. This spring
she is a visiting professor at Case Western Reserve University's School of Law. A revised edition
of her most recent book, The Clone Age: Adventures in the New World of Reproductive
Technologies, will be available from Henry Holt this spring.