John Ioannidis is perhaps best known for a 2005 paper “Why Most Published Research Findings Are False.” One of the most highly cited researchers in the world, Ioannidis, a professor at Stanford, has built a career in the field of meta-research. Earlier this month, he published a heartfelt and provocative essay in the Journal of Clinical Epidemiology titled “Evidence-Based Medicine Has Been Hijacked: A Report to David Sackett.” In it, he carries on a conversation begun in 2004 with Sackett, who died last May and was widely considered the father of evidence-based medicine. We asked Ioannidis to expand on his comments in the essay, including why he believes he is a “failure.”
Retraction Watch: You write that as evidence-based medicine “became more influential, it was also hijacked to serve agendas different from what it originally aimed for.” Can you elaborate?
John Ioannidis: As I describe in the paper, “evidence-based medicine” has become a very common term that is misused and abused by eminence-based experts and conflicted stakeholders who want to support their views and their products, without caring much about the integrity, transparency, and unbiasedness of science.
RW: You also write that evidence-based medicine “still remains an unmet goal, worthy to be attained.” Can you explain further?
JI: The commentary that I wrote offers a personal, confessional perspective on whether evidence-based medicine currently fulfills the wonderful definition that David Sackett came up with: “integrating individual clinical expertise with the best external evidence.” This is a goal that is clearly worthy to be attained, but, in my view, I don’t see that this has happened yet. Each of us may ponder whether the goal has been attained. I suspect that many/most will agree that we still have a lot of work to do.
RW: You write that clinical evidence is “becoming an industry advertisement tool” and that “much ‘basic’ science [is] becoming an annex to Las Vegas casinos.” Provocative — what do you mean?
JI: Since clinical research that can generate useful clinical evidence has fallen off the radar screen of many/most public funders, it is largely left up to the industry to support it. The sales and marketing departments in most companies are more powerful than their R&D departments. Hence, the design, conduct, reporting, and dissemination of this clinical evidence becomes an advertisement tool. As for “basic” research, as I explain in the paper, the current system favors PIs who make a primary focus of their career how to absorb more money. Success in obtaining (more) funding in a fiercely competitive world is what counts the most. Given that much “basic” research is justifiably unpredictable in terms of its yield, we are encouraging aggressive gamblers. Unfortunately, it is not gambling for getting major, high-risk discoveries (which would have been nice), it is gambling for simply getting more money.
RW: Studying what ails science doesn’t make you popular with other researchers — until they want to publish with you, of course, as you point out in your piece. But those criticisms can also lump you in with those whom you describe as “pseudo-scientists and dogmatists…trying to exploit individuals and populations and attack science.” How do you differentiate your own work?
JI: I definitely can’t complain about a lack of popularity. I feel privileged to have worked with thousands of other scientists over the years and to have learnt from them. It is not possible to make everybody happy all the time, but the work of my team aims to protect science, defend the scientific method, question dogma, and enhance the capability and efficiency of research methodology and research practices. In this regard, it is at the very opposite pole from those who want to attack science, question the scientific method, and promote dogmas.
RW: You’re worried that Cochrane Collaboration reviews — the apex of evidence-based medicine — “may cause harm by giving credibility to biased studies of vested interests through otherwise respected systematic reviews.” Why, and what’s the alternative?
JI: A systematic review that combines biased pieces of evidence may unfortunately give another seal of authority to that biased evidence. Systematic reviews may sometimes be most helpful if, instead of focusing on the summary of the evidence, they highlight the biases that are involved and what needs to be done to remedy the state of the evidence in the given field. This often requires a bird’s-eye view in which hundreds and thousands of systematic reviews and meta-analyses are examined, because then the patterns of bias are much easier to discern as they apply across diverse topics in the same or multiple disciplines. Much of the time, the solution is that, instead of waiting to piece together fragments of biased evidence retrospectively, one needs to act pre-emptively and make sure that the evidence to be produced will be clinically meaningful and unbiased, to the extent possible. Meta-analyses should become primary research, where studies are designed with the explicit anticipation that they are part of an overarching planned cumulative meta-analysis.
RW: What are your hopes for evidence-based medicine moving forward?
JI: The right ideas are there, and there are many superb scientists and clinicians who want to do the right thing, so I am always cautiously hopeful. We should keep trying.