Bad Science
Free-market academic research policies have unleashed medical quackery and scientific fraud, forcing consumers to pay premiums for discoveries we’ve already funded as taxpayers.
At the heart of the US healthcare system’s profit-based approach to medical science is the harsh truth that money alone can prolong life. Take, for example, the class of genes dubbed “tumor suppressors.” Because of their ability to regulate cell growth, tumor suppressors are at the forefront of cancer-prevention research. A positive test for mutations in a tumor suppressor gene like BRCA1 or BRCA2 is a leading indication of high risk for breast or ovarian cancer.
But despite the potential life-saving importance of the discovery, the BRCA1 and BRCA2 test is prohibitively expensive. At $4,000 a test, it costs four times as much as a full genetic sequencing. The only reason the price of a potentially life-saving evaluation could be this outrageous is the actions of one company, Myriad Genetics. While the Supreme Court recently struck down Myriad’s claim to the BRCA1 and BRCA2 genes, declaring that human genes can’t be patented, Myriad continues to assert its monopoly on the test for susceptibility to breast cancer.
What’s even more egregious about Myriad’s price-gouging is that many of the costs of developing the BRCA1 and BRCA2 test have already been paid by the public. The research that identified mutations in those genes as markers of cancer risk was publicly funded through the University of Utah School of Medicine. Myriad Genetics was simply a startup founded by researchers at the university to take possession of the patent after the test’s discovery. And it was only because of the Bayh-Dole Act that this could take place.
At the time of its passage, the 1980 Bayh-Dole Act was intended to drive innovation in academic research. By removing restrictions on what universities could do with their scientific discoveries, it would ostensibly bring more money to the university system. To pay for their work, academic research facilities could now sell off their patents, or hand out exclusive licenses to private industry. With a monopoly on intellectual property provided by the patent, the private sector would be incentivized to quickly develop those discoveries into advanced consumer products and services.
The supporters of Bayh-Dole claimed that the opportunity to make more money would push academic science to make more discoveries and encourage private industry to bring more of those discoveries to market. Not long after its passage, the financial repercussions were already being realized. Researchers at Columbia University applied for patents on the process of DNA cotransformation, known as the Axel patents, which would eventually earn the university hundreds of millions of dollars in licensing fees. The Cohen-Boyer patent on recombinant DNA would earn Stanford over two hundred million. Along with the 1980 Diamond v. Chakrabarty Supreme Court decision, which allowed genetically engineered living organisms to be patented, it was the beginning of the biotech boom. Universities scrambled to build advanced research labs and stake new claims on intellectual property, from software to DNA sequencing, that could be patented and sold to the public.
Previously, discoveries made by public universities could only be given out to private industry through non-exclusive licenses. Private entities could develop new drugs and new inventions based on groundbreaking research, but so could any other company. The supporters of Bayh-Dole argued that this arrangement was essentially a disincentive to innovate. If one company didn’t have exclusive rights to an invention, then there was little money to be made in its development. Why bother innovating if the competition could do the same and eat away at the potential profit margin? Inventions would be left to “rot on the shelf.”
Yet what might seem like an arcane bit of legal minutiae related to intellectual property is at the heart of the university research system’s decline. The non-exclusive-license requirement protected academic research from descending into an intellectual-property gold rush. Removing it has unleashed a flood of capital from private industry eager to possess a monopoly on cutting-edge scientific advancements. Private bodies now help fund academic institutions in return for priority in the process of “tech transfer” — the exclusive licensing of publicly funded research to private industry. Giant pharmaceutical conglomerates like Merck and GlaxoSmithKline fund partnerships with private and state universities on projects to research currently incurable diseases, with the explicit stipulation that those companies will reap the benefits by obtaining exclusive licenses on any forthcoming discoveries. Those discoveries, whether they are related to the original aim of the project or not, are then turned into overpriced, brand-name pharmaceutical drugs.
Not only do patents push higher prices onto consumers, they also burden the research world with the cost of the intellectual property needed to do further work. Labs must pay thousands of dollars for the strains and processes required to build upon current developments, adding still more expense to cutting-edge research. The profit-driven atmosphere of the current research system is a far cry from the one Jonas Salk worked in when he developed the polio vaccine. His discovery, which spared millions around the world from a debilitating disease, was effectively given away for free. While Salk rhetorically wondered whether it was acceptable to “patent the sun” to make a profit, today’s race for intellectual-property claims is quickly approaching that absurd proposition.
Though more investment in public education and faster development of new technology are ostensibly public goods, the influence of capital from private industry is largely corrupting. Combined with the sharp decline in state funding for education, Bayh-Dole has helped privatize the public university system. Without those public funds, universities have become ever more dependent on private investment through grants and donations. And with that money come corrosive influences on academia.
Nowhere is this conflict of interest more prevalent than in pharmacology and biotechnology. Academics in those fields are commonly paid to sign their names to ghostwritten journal articles, promote drugs, and pursue drug discovery based on market potential rather than the public good. They earn outsized consulting fees and lucrative speaking deals at industry-funded conferences in exchange for their compliance. In the case of Pfizer and its anticonvulsant drug Neurontin, academics were paid $1,000 a paper to sign their names to journal articles written by unknown medical ghostwriters and to speak at conferences extolling the virtues of a drug, initially intended for epilepsy sufferers, to treat anything from bipolar disorder, post-traumatic stress disorder, and insomnia to restless leg syndrome, hot flashes, migraines, and tension headaches. Not only are consumers misinformed about the safety and efficacy of the prescription drugs they take, but they pay the costs three times over: by funding the public university research that discovers these drugs, by paying the higher prices of patented drugs, and by subsidizing the pharmaceutical industry’s tax write-offs for its university sponsorships.
Even with limited public funding and an increased dependence on private financing, universities haven’t stopped spending, particularly on new facilities. A McGraw-Hill Construction survey estimated that over $11 billion was spent on construction by higher-education institutions between 2010 and 2012. By floating massive bonds to pay for new biomedical research facilities and state-of-the-art gymnasiums, schools hope to attract the students, star researchers, and funding that will help pay for it all. But these schools have wildly overcommitted themselves, and in doing so they’ve entered the vicious cycle of a debtor’s beauty contest. They are spending heavily to do research that can attract the grants and land the intellectual-property jackpot needed to cover the bloated administrative costs and the debt they’ve incurred.
The burden of this scramble for money and fame falls on the students. Over the last thirty years, tuition costs have increased sixfold. There are fewer and fewer post-graduate opportunities, even in the world of academic research where so much is being spent. The flood of private money coming into the research system hasn’t gone toward expanding academic careers. Instead of employing more staff scientists, labs hire underpaid postdoctoral researchers at half the cost to produce the eye-catching research that attracts grant money. Those postdocs then graduate into a field flooded with others competing for a dwindling number of established research positions, leaving too many fighting for too few jobs.
Across the whole university system, the pressure to cut costs means that tenure-track positions are being replaced by adjuncts with low pay and no job security as the salaries of administrators and college presidents continually rise.
The result is what Georgia State University economics professor Paula Stephan has referred to as an academic pyramid scheme: a growing pool of underpaid post-docs and adjuncts with minimal career prospects competing for a diminishing number of tenured, well-paid, established star-scientist positions, in what amounts to a tournament structure for scientific inquiry. It is a cutthroat beauty-contest atmosphere that takes its toll on the science being done. More and more earth-shattering studies by star scientists need to be published in prominent journals to garner the attention and the grants needed to keep up appearances and keep the lights on in the lab. In Stephan’s words, “Bigger is seen as better: more funding, more papers, more citations, and more trainees — regardless of whether the market can sustain their employment.”
The end result is a greater imperative not just to publish or perish, but to publish, in highly respected journals, groundbreaking, provocative insights into our understanding of the world around us that invite further investigation — or perish. In the words of Stephen Quake, professor of bioengineering at Stanford, it is “funding or famine.” Within that decision matrix, falsifying findings, cutting corners, and cherry-picking data become ever more advantageous. Whatever it takes to get more papers out the door and more grants coming in. It has reached the point where academics insist that “there is no cost to getting things wrong. The cost is not getting them published.” In a meta-analysis of published research for the Public Library of Science (PLOS), John P. A. Ioannidis placed the blame specifically on the financial underpinnings of research, noting that “the greater the financial and other interests and prejudices in a scientific field, the less likely the research findings are to be true.”
The results are readily apparent. The number of retractions due to flawed methodology and outright misconduct over the last decade is staggering. Almost every field has seen a rash of inaccuracies. The percentage of scientific articles retracted because of fraud has increased tenfold since 1975. Only a fraction of heart disease and cancer studies have held up to scrutiny; the rest could not be reproduced. The free-radical theory of aging, once a well-regarded theory of how antioxidant enzymes affect cell life, has been thrown out, along with the USDA’s guidelines for measuring antioxidants in food. This, in turn, has called into question the whole supplemental-vitamin industry, which is based in large part on the need for more antioxidants. The positive effects of omega-3 fatty acids on everything from cancer prevention to brain development have been challenged after follow-up studies showed no significant effect. The benefits of regular mammograms have been called into question as the results of the Canadian National Breast Screening Study showed no decline in breast cancer mortality owing to their use, and regular testing sometimes led to overdiagnosis.
While there is certainly still a center of reputable, respectable, and reproducible science, it is surrounded by a cloud of inaccuracy and chicanery. Enthusiastic discoveries about possible cancer cures are swallowed whole and regurgitated by a media that is desperate for content and unwilling or unable to decipher the false leads, flawed methodology, and erroneous statistics used to get those results. The public’s understanding of controversial topics like genetically modified organisms and endocrine disruptors is muddled further by the release of inaccurate studies supporting each disputed side. Those stories are then turned into short-lived diet fads and health scares, like those linking autism to childhood vaccinations.
Results that are quick to produce and quick to publish are more likely to be inaccurate. Proper science takes time, and refuting flawed science can take even longer. While it took over nine months to disprove a recent genetic test for autism, it took only three days for the original study to go from submission to print. Few of those who heard the exciting news of the initial discovery are likely to hear of the disappointment surrounding its correction. When a paper is published trumpeting the discovery of a genetic test for longevity, it immediately inspires cottage industries dedicated to providing longevity exams. When that paper is retracted — not because of fraud or misconduct but because of a flawed approach — those genomic testing operations don’t necessarily disappear overnight. They survive in a gray-market economy that profits off the public’s lack of knowledge of current scientific research.
The privatization of academic research not only hinders the scientific process, it also means that direct corruption — where scientists are paid off by private industry to deceive the public about toxins in their food or pollution in their air — has more opportunity to continue unabated. Researchers desperate for funding to maintain their positions and sustain their work are more susceptible to financing from industries eager to distort science to their own whims. It only encourages the perverse incentives of the free market to take advantage of what were once public institutions. When the health risks of carcinogenic flame-retardant chemicals can be distorted by an industry eager to make money off of their proliferation, then science ceases to work for the public interest. Eventually, the market-based approach to academic research ceases to be about science at all and becomes a matter of attracting attention and money under the gloss of scientific research.
If anything, the neoliberal approach to academic research is a return to the privately funded, pre-tenure origins of the university system, when numerous schools were simply research labs and promotional arms for private industry rather than institutions of knowledge advancing science in the public interest. Back then, professors worked at the behest of the school’s donors and board of trustees. They could be easily fired for outspoken criticism or for publicizing research that affected the school’s or their donors’ bottom line. Supporting labor rights, advocating socialist policies, believing in evolution, speaking out against slavery, or informing the public about the toxic consequences of copper smelting fumes could lead to instant dismissal. Thorstein Veblen went so far as to acknowledge an unspoken blacklist amongst academics:
So well is the academic blacklist understood, indeed, and so sensitive and trustworthy is the fearsome loyalty of the common run among academic men, that very few among them will venture openly to say a good word for any one of their colleagues who may have fallen under the displeasure of some incumbent of executive office.
With tenure and public funding, researchers could speak freely and focus on topics beyond short-term, consumer-based, money-making propositions. Advancements that might not have an immediate profit potential could be developed without a constant need to publish or perish. In the postwar era, government investment in academia and research led to many of the innovative breakthroughs we take for granted today. What so many have ascribed to the digital revolution, from the internet and GPS to the DNA cotransformation techniques behind the Axel patents, began as large-scale, government-funded projects on university campuses decades before the Bayh-Dole Act was even conceived of.
Despite the claims of Bayh-Dole proponents, those inventions have not been left on the shelf to rot.