Fraudulent scientists seem to be everywhere these days. In recent years we have been regaled with cheats trying to foist upon us claims that rabbit blood can be turned into an AIDS vaccine, that eating meat makes people more selfish, and that transistors can be made out of virtually anything. Normal folk don’t know where to turn, so they doubt everything: evolution, climate change, vaccination. What is happening? Has science become propaganda from PhDs perverted by the search for research money and prizes?
The popular, idealized image of us scientists casts us as disciples of the pursuit of knowledge. With scant regard for the trivialities of life, we forgo bodily pleasure, food and sleep in our endless search for the ultimate truths of life, the universe and everything, scrupulously obeying the doctrine of the scientific method as we solemnly unroll the red carpet of transformational discovery. Little wonder, then, that whereas the uncovering of lawyerly or political fraud is met with knowing snickers, each new revelation of scientific misconduct is considered tantamount to apostasy.
Serious scientific fraud – the fabrication of some high-impact but plausible new phenomenon – propels the perpetrator ephemerally
into the academic stratosphere, while misleading large numbers of fellow
researchers and misdirecting precious resources. But serious fraud in science is
relatively rare, if only because the perpetrators, if not delusional, know they
are likely to eventually be exposed by curious colleagues.
Serious fraud must, of course, be unearthed and punished, but my contention here is that it is the least of
science’s problems. You see, we scientists commit many sins, all of which lead
to some sizeable proportion of our published work being at least partially misleading
or wrong. These sins, which do not involve fraudulent deception, are
far more widespread and more damaging to scientific progress.
Let’s delight in a trawl through the seven deadly sins of scientists. We start with incompetence
and ignorance. Our hypotheses may be balderdash, logically inconsistent. Our work may be ‘shoddy’ or ‘sloppy’: we may not perform experiments that actually test our hypotheses, may fail to test alternative explanations, and may not know to perform elementary ‘control’ experiments. Our computer programs may contain critical errors. We may not estimate the statistical errors in our
data. We
may look at data and draw completely the wrong conclusions because we don’t
know the underlying principles that govern the phenomenon under scrutiny. We may
write our papers as if a logical sequence of experiments had been done when in fact we randomly tried things and then assembled them into something that makes a pretty story.
We can also be lazy. We may only do one or
two quickie experiments, nowhere near enough to justify the grandiose
conclusions we then draw, and hope the reviewers and editor of our papers are
themselves too lazy or busy to read our manuscript properly. We may not even bother to search the literature properly to find out who has already done work related to our study. We may take the path of least resistance, that of
expediency, to spin a story aligned with our vision.
Then there is illiteracy. We may be unable to describe our findings in a way that
anybody else can possibly follow; we may omit steps in our argument, and our
writing may be grammatically awful, leaving even qualified readers flailing.
True to our nerdy stereotype, we are
often myopic. Our publications may deliberately
ignore closely related but highly pertinent findings of others, concentrating
only on our own past achievements, such that we do not put our work into
context. The citations to others’ work that we do include may be to papers we have not even read.
We are also self-aggrandizing. In print and in
person, and especially in grant proposals, we puff up the importance of our
work and belittle other, perfectly legitimate studies. I may have been cited 20,000 times, but
of course it should have been 200,000!
Great scientists can be highly intuitive,
but this intuition also blinds us all. Many Nobel laureates have suffered from
this. Take, for example, my ‘academic grandfather’*, Linus Pauling, arguably the
greatest chemist of the last century. He spent his last decades misleading
humanity by trumpeting unsubstantiated ideas about the health benefits of megadoses of vitamin C.
You see, Pauling, in his later years,
fell victim to that ubiquitous scientist’s plague: wishful thinking. This arises naturally from the initiating, creative act in science, in which various half-formed ideas coalesce into a concept to which we cling and on which we may base our careers, fed by long-held desires and prejudices. We believe in something: a beautiful process or an imagined principle.
So, blinded by our belief, we may see a trend in our data that is not really there,
or a small peak in a spectrum that is really just noise. We may remove that
lone, recalcitrant data point that doesn't fit our model – that’s not fraud
because we really believe the data point can’t be right. The temptation to airbrush data is irresistible: let’s add a calibration factor, a fudge factor, a cosmological constant. We smooth, filter and transform data onto scales that make them look more accurate. Some run an experiment ten times until they get the result they want, then publish only that one. We may refuse to give others access to our raw data – after all, we haven’t finished analyzing them ourselves
and, anyway, others would misuse them.
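To see why quietly repeating an experiment and publishing only the run that worked is so corrosive, here is a minimal illustrative sketch; the 5% significance threshold and the ten hidden repeats are assumptions chosen for the example, not figures from any study discussed here. If a null effect is tested at the 5% level, picking the best of ten runs inflates the chance of a spurious positive from 5% to roughly 1 − 0.95^10 ≈ 0.40, that is, about 40%.

```python
import random

# Illustrative sketch with assumed numbers (5% threshold, 10 hidden repeats):
# how often does cherry-picking the best run turn a null effect into a
# publishable "positive" result?

FALSE_POSITIVE_RATE = 0.05   # chance a single honest null run looks significant
RUNS_PER_STUDY = 10          # hidden repeats before "the" result is chosen
N_STUDIES = 100_000          # number of simulated studies

def cherry_picked_study_succeeds() -> bool:
    """True if at least one hidden repeat crosses the significance threshold."""
    return any(random.random() < FALSE_POSITIVE_RATE for _ in range(RUNS_PER_STUDY))

false_positives = sum(cherry_picked_study_succeeds() for _ in range(N_STUDIES))
theory = 1 - (1 - FALSE_POSITIVE_RATE) ** RUNS_PER_STUDY

print(f"Honest single run: {FALSE_POSITIVE_RATE:.0%} false positives")
print(f"Best of {RUNS_PER_STUDY} runs:    {false_positives / N_STUDIES:.0%} observed "
      f"(theory: {theory:.0%})")
```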
So, we scientists are ignorant, incompetent,
illiterate, lazy, myopic, self-aggrandizing wishful thinkers. Each of these seven
sins has the same effect as outright fraud – wrong results, erroneous interpretations,
false conclusions. So shoddy, dubious science is everywhere, leading to a large proportion of submitted papers being rejected after anonymous peer review, and to a fair proportion of the manuscripts that do manage to sidle past peer review still being wrong. In my, of course unbiased, opinion, about half of the interesting papers in my field published in the top journals, such as Nature and Science, are basically wrong – they may be brilliant, thought-provoking, beautiful and even inspirational, but they are still wrong.
Every
Wednesday my lab holds a Journal Club, in which we select one or two papers to
read, and we try to understand what was done, its significance, and its
validity. Sometimes we leave the room exhilarated by a timely and impactful piece
of research. But often, when we try to ascertain whether the main conclusions
of the authors are justified by the data presented, we regretfully must conclude
that the answer is “no”, and occasionally we go ballistic, especially me. A while ago I had one of those ballistic days.
We read a published paper on the computational design of drugs to overcome
antibacterial resistance, and concluded that every one of the main conclusions
was wrong. The paper was total pigswill. If anyone reads this paper and starts
a program of drug design based on it, they will have been sadly misled. [Naturally, though, it is inconceivable that anyone would hold such a subversive meeting criticizing our own work. Inconceivable (ahem).]
Why, then, is there so much fluff and junk
out there? Well, unlike other professional pursuits, scientific research
tackles the unknown. This makes it inherently
very difficult to know which questions to ask and how to go about things. Also,
we scientists are condemned to membership of a certain species of animal
endowed with primitive, instinctual, jealous and lustful traits. Each new problem will therefore have each of us looking at it with our own biases, framed by our own imperfect training and experience. So it’s hardly surprising that there can
be a lot of trash to wade through before an advance can be solidified.
So,
what to do about it all? Well, the world could try a science detox, doing
without science completely, but then there would be no cures for cancer, no saving the environment, no endless supply of energy, and no technological foiling of terrorists. Another option is to keep doing what the authorities are concentrating on now: fraud detection, witch hunting, and the setting up of Offices of Scientific Integrity, Research Integrity and the like. But that is no panacea. You see, the seven scientific sins are counterbalanced by seven virtues: curiosity, intelligence, vision, drive, rigour, integrity and insight, virtues propelled by appreciation of the
beauty of truth. The virtues win out in the end.
*academic grandfather: the adviser of my
adviser, Martin Karplus. Pauling has hundreds of such grandkids.