Ethics in Science

Henry H. Bauer


Consequences of Misconduct in Science

The knowledge filter works properly if and only if peer review - mutual criticism - is objective and impartial. You need to judge work by how good it is, not by who did it or where they come from. The jigsaw puzzle gets put together efficiently and properly only if the players are open with one another and behave as honestly as they can.

Nowadays, many people don't seem to understand that honesty is the best policy in science; witness, for instance, the AIDS activist who said, "Show me someone with a perfectly clean record who's able to get anything done" (11).

Literally dozens of books published in the last ten years or so tell the supposedly true inside stories of successful scientific discovery as episodes of cut-throat competition and corner-cutting by scientists anxious to get there first and win the biggest prizes and grants (see Table 2).

I said just now that puzzle players need to "behave as honestly as they can", because complete honesty and objectivity aren't achievable by the typical human being. It's difficult to award grades without being influenced by how you like or dislike a student's manner. It gets especially difficult if the student is also a friend, child, spouse, or lover. That's why having such people in your class should be avoided: it protects you from an intolerable conflict of interest, between wanting to be fair and wanting to please someone you care about. It also protects the rest of the class from being suspicious of the teacher and jealous of his protégé. Conflicts of interest erode the fabric of trust on which worthwhile social interactions rely.

People don't seem to understand that any more. Influential voices are suggesting that we ignore conflicts of interest in order to get important things done: new drugs developed, better facilities for universities established, the balance of trade improved. I think this rush to get things done even if it means doing risky things, cutting corners, is a real threat to science. I think the effect can already be seen in the "hottest" research areas of molecular biology and genetic engineering.

In September 1989, the National Institutes of Health (NIH) proposed that people applying for grants should disclose all sources of support, including honoraria and consulting fees; and that people funded by NIH (or their assistants, consultants, spouses, or children) shouldn't own stock in companies that would be affected by the outcome of the research; and that results could not be shared with private firms before they had been made public.

Doesn't all that sound reasonable? Yet NIH was flooded with protests (12):

The Association of Biotechnology Companies said the guidelines were a "draconian remedy which . . . will create real problems for the biotechnology industry". What problems?

The guidelines would "effectively eliminate contact between academia and industry for many small biotechnology companies" that don't have the financial resources to pay researchers in cash for their work. Well, so what? Why should companies, large or small, be able to get something without paying for it? How many companies in fields other than biotechnology are able to get going without the necessary financial resources?

One private cancer research center complained (13) that keeping all those financial records would take 4.3 feet of file space per year. Does that sound like a reason or like an excuse?

Some people asked, how could investigators know beforehand which companies might benefit from the fundamental research the investigators did? I suppose the best answer is, because they're not stupid. But in any case, if unexpected applications of some work open up, you can explain what happened and find some legitimate way of getting rid of the stock. Again, this is an excuse rather than a reason.

The NIH guidelines would have prohibited investigators from taking money from companies whose products they were evaluating in a government-funded project. Seems a sensible enough safeguard, doesn't it? When the Air Force tests a new airplane, we do hope the testers aren't getting something from the aircraft company.

Or do we? "Clinical researchers . . . pointed out that ties with industry were crucial for rapid progress". But who was arguing against ties with industry? Only against those ties that produce conflicts of interest. And what is this preoccupation with rapidity? Is it better to be fast than to be sound? Is it better to get false results quickly than valid results slowly? Doesn't the knowledge filter demand time as well as disinterested participation?

Those anonymous clinical researchers also argued that "an individual's bias . . . could hardly be a major factor in influencing the outcome of rigorously controlled multi-center trials". But where's the rigorous control if conflict of interest is permitted? And if it's permitted, why assume that only one person would suffer from it rather than everyone, in all those multiple centers?

George Levy directs a lab at Syracuse University; he also owns a company that commercially exploits what he does in his university lab, and the company pays royalties that support his lab. "Undoubtedly, I have split loyalties. That really is a problem", he said. "But the alternative is to let the Japanese buy the United States" (14). Really? Don't they have rules against conflicts of interest in Japan? Is it really for that reason that the United States has lost out in trade to Japan, or is it for such reasons as inferior quality control and inept management?

Levy also said (15), "If you take away anybody with a conflict of interest, you take away all the experts". Mark Wrighton, chairman of the Chemistry Department at MIT, was "very concerned about the development of guidelines that will have the effect of precluding the opportunity to pursue the very research that is most interesting". But how do those guidelines preclude any work from getting done? And interesting for what reasons and to whom? Interesting for scientific reasons or because there's money to be made out of it?

"Blanket prohibitions don't work", said the Vice-President for Research at a major university. I suppose he believes that all existing laws should be wiped from the books, like against drunken driving.

These protests against the NIH guidelines showed that influential people don't understand the dangers posed by conflicts of interest; or that they believe that ends justify means; or both. A spokesman for the National Association of State Universities and Land-Grant Colleges said, "Many people have apparent conflicts of interest". But what's an apparent conflict of interest? How does it differ from a real conflict of interest?

Perhaps that spokesman was thinking - or rather not thinking - along the same lines as Nobel-Prize-winner David Baltimore when he was a professor at MIT and about to become, at the same time, Director of the Whitehead Institute for Biomedical Research. How, he was asked, would he handle that conflict of interest? Well, Baltimore replied (16), "I think people are entitled to ask that of me. But I do think the statements and decisions I make come from the highest sense of integrity". The conflict is only apparent, in other words; no harm will come of it.

Baltimore's blind spot owes something, I think, to belief in the mythical scientific method. Baltimore is the most successful kind of scientist, having won the Nobel Prize; scientists, so goes the old syllogism, use the scientific method, which guarantees objectivity; therefore, Baltimore, having shown that he's a master of that method, must be particularly able to be objective in all things at all times.

But if we understand that science is a human, social activity, we may be less likely to swallow that fallacy. Among the things we know about human beings is that they - meaning we, all of us - don't see the world directly; we "see" an interpretation of what our eyes and their nerves signal to the brain. It's the same with intellectual signals or information: we "see" the data through the distortions of our beliefs and preconceptions. We tend to see what we expect to see and want to see, and we're very apt to overlook things that we don't expect or want to see. That's all largely automatic; we don't have much control over it. Even when we're aware of the general problem, it's very difficult to counteract, and it's impossible to counteract perfectly and completely in every actual instance.

Financial advisers and salesmen, not to mention outright confidence tricksters, make good livings because they understand how little encouragement most of us need to fool ourselves into imagining something to be true just because we want it to be true. There's a book about this that I highly recommend, by Thomas Gilovich, called How We Know What Isn't So (17).

The plain fact of the matter is that no human being can be trusted as much under a conflict of interest as when there's no such conflict. It isn't that people will always do the wrong thing; it isn't that we always put our private interests ahead of the public interest; it's just that if you have several mutually conflicting wishes or interests, then you can't satisfy all of them. So it's in the public's interest - which means in everyone's interest - that those who serve us shouldn't have reasons for wanting to short-change us.

And still, influential people don't understand that. A television program (NOVA) a few years ago showed many doctors, clinicians, and researchers who thought it quite all right to give speeches praising the value of a given drug even when the manufacturer of the drug was paying them to give the speech. The chairwoman of a clinical department said that her opinion could be entirely objective even though her department was getting a lot of money from drug manufacturers.

A spokesman for the Association of American Universities - the most prestigious organization of research universities in the United States - said that since everyone has conflicts of interest, we shouldn't try to avoid them in government-funded research; we should just decide how much to allow. But you can't quantify ethics or conflicts of interest like that. Quite the contrary, our experience of life and of history is that small breaches of honesty tend to lead to bigger ones.

Another misconception about conflict of interest is that it's a personal matter, a problem for individuals. But institutions too can and do suffer from conflicts of interest. Universities, for example, need money to do the desirable and valuable things they do; but is it permissible for a university to do anything and everything to get money for those worthwhile goals?

Accepting gifts from parents, graduates, and other benefactors has long been standard practice, and it doesn't usually cause trouble - unless, of course, a wealthy donor has a stupid nephew whom he wants enrolled and given a degree.

Is it quite all right for universities to operate bookstores, cafeterias, and stores that sell mementos?

What about cashing in through patents on discoveries made by the faculty?

What about setting up a for-profit company to exploit such discoveries, instead of just getting royalties from private companies that buy patent rights? That's what the University of California planned to do last year, before protests called a halt to it.

What about hiring lobbyists to persuade the government to designate some funds for a new building or program? That's become a growth industry. Universities hire - for six-figure fees - people who try to persuade members of Congress to put into some bill, say, $60 million for a supercomputing center at Cornell University. That's an actual example from about ten years ago. On that occasion, Cornell's President said that his university wouldn't accept the money: they were perfectly able to compete through the traditional, peer-reviewed channels that decide which university can make the best use of funds for any given project. But a vice-president here at Tech sneered at the scruples of Cornell's President, saying it was a fine example of allowing principles to get in the way of getting things done.

Making decisions by political clout rather than intellectual merit - pork-barreling, in other words - is an almost hallowed American tradition, but it has only recently been taken up by universities. The Association of American Universities admitted that this pork-barreling is a bad thing and wished that it wouldn't happen; but it refused to criticize those of its members who were doing it, on the grounds that the need for resources is so great.

In other words, the Association was saying that the ends justify the means. But ends can never justify means, because it's the means you use that determine the ends you'll get. Use force instead of persuasion, and you'll have a society that's controlled by force; use pork-barreling to get what you want, and you'll have a society that works through bribery and who you know, not on the basis of merit.


Version: 1.0, text updated: 5/12/1995
Send questions or comments about this essay to:
Henry H. Bauer
Professor of Chemistry & Science Studies
Virginia Polytechnic Institute and State University, Blacksburg, VA 24061-0227
hhbauer@vt.edu, (540) 951-2107, http://www.cis.vt.edu/stshome/faculty/bauer.htm