Scientific retractions have increased approximately 10-fold over the past two decades, rising from one in 5,000 papers in the early 2000s to roughly one in 500 papers today, according to data presented by Ivan Oransky, MD, Director of the Center for Scientific Integrity and co-founder of Retraction Watch, during a September 29 presentation to physicians in Toronto.
Despite this dramatic increase, Dr. Oransky argued that retraction rates remain inadequate. "[R]etraction should represent about 2% of papers. So about [one] in 50," he stated, citing multiple lines of evidence supporting this figure. That rate would represent a 10-fold increase over the current rate of roughly one in 500.
The evidence base for the 2% threshold included a 2009 meta-analysis published by Fanelli et al in PLOS One showing that when researchers were surveyed anonymously, 2% admitted to committing misconduct. Additional support came from a 2016 investigation published by Bik et al in mBio, which examined 20,000 papers and "found that half of the 4%, so 2% of them [...] exhibited features suggestive of deliberate manipulation," Dr. Oransky stated.
Misconduct now accounts for two-thirds of all retractions, a shift driven largely by improved detection rather than necessarily a rise in fraud itself. Dr. Oransky explained that independent researchers specializing in identifying problems such as image manipulation, statistical irregularities, plagiarism, and ethical violations now perform much of the work of catching fraudulent papers.
Systemic Delays Undermine Scientific Record
Journal response times to misconduct allegations remain problematic. Dr. Oransky presented cases in which universities requested retractions "after investigations, after official findings. Lawyers have signed off on it, and yet journals do nothing." One analysis of orthopedic research found that "fewer than half the journals had even said what they were going to do after [one] year."
The consequences may extend to clinical practice. Research by Alison Abritis, Research Director at Retraction Watch, found that "more than 90% of the time, retracted papers are cited as if they hadn't been retracted." In mental health literature specifically, 40% of retracted papers couldn't be identified as retracted even when researchers checked publisher websites.
Paper Mills and Gaming Systems
The proliferation of paper mills—organizations that fabricate research papers or sell authorship positions—represents a growing threat. Dr. Oransky described "an entire industry that [...] is polluting the scientific literature." Some services explicitly target international medical graduates seeking US residencies, guaranteeing systematic review publications to strengthen applications.
The scope of the paper mill problem became apparent when publishers Hindawi and Wiley retracted 13,000 papers, representing more than 20% of all retractions recorded in the Retraction Watch database. "[T]he markets actually started taking an interest in this because these are publicly traded companies," Dr. Oransky noted.
Academic ranking systems based on citations could also create perverse incentives. Dr. Oransky highlighted that in systems such as Times Higher Education rankings, "citations are also the easiest thing to game." This has spawned "citation cartels" and "citation rings," where researchers coordinate to artificially inflate citation counts.
Geographic and Demographic Patterns
Analysis of the Retraction Watch leaderboard—tracking researchers with the most retractions—revealed striking demographic patterns. "There are no women on this list," Dr. Oransky observed. "Men are nine times more likely to retract papers for misconduct than women are," even after controlling for publication rates.
Canadian institutions haven't been immune to misconduct cases. Dr. Oransky cited the Motherisk case at the University of Toronto and work by Jonathan Pruitt at McMaster University, though he emphasized: "These are just some cases. That doesn't mean there are more than you would expect or less than you would expect."
Detection Infrastructure Evolves
Several developments suggest improved detection capabilities. Some journals have hired research integrity specialists to screen submissions prior to publication. Reference management software, including Zotero, Mendeley, EndNote, and Papers, now integrates the Retraction Watch database, providing automated alerts when papers in personal libraries are retracted.
The Cochrane Collaboration has similarly incorporated retraction checking into its systematic review processes. Dr. Oransky emphasized the practical importance: "If you are publishing, or just doing research, or just keeping a library, a personal electronic library of papers [...] all of this software actually uses the Retraction Watch database."
The Center for Scientific Integrity has expanded beyond Retraction Watch to include a Sleuths in Residence Program, the Elisabeth Bik Science Integrity Fund supporting independent investigators, a Hijacked Journal Checker identifying compromised publications, and the recently launched Medical Evidence Project examining papers that inform clinical guidelines.
Institutional Response Required
Both Canadian and UK parliamentary committees have examined scientific integrity issues. The UK House of Commons report concluded: "[T]his should happen much faster. We should stop having months or even years of not correcting the record."
Dr. Oransky warned of consequences for self-governance failure: "If you don't self-police, if you don't correct the record, if you don't sanction people who commit fraud, [...] someone else will come in and do that. And it very well could be a government that you don't like."
India's university ranking system now penalizes institutions with above-average retraction rates, though Dr. Oransky acknowledged implementation challenges. The approach represents one attempt to create institutional accountability for research integrity.
While industry-sponsored research showed comparable or lower rates of outright fraud compared with other funding sources, Dr. Oransky attributed this to heightened regulatory scrutiny rather than inherently better practices. He advocated for data transparency initiatives such as the Yale Open Data Access project, which provides public access to pharmaceutical company trial data under specified conditions.