“In my view, hydroxychloroquine should not be used in the hospital setting,” says Martin Landray, a physician and researcher in the University of Oxford’s Nuffield Department of Population Health and one of the heads of Recovery. “Outside the hospital setting it would be reasonable to use it in the context of a randomized controlled trial, but not otherwise.”
Speaking of which: The Minnesota study looked at people not in the hospital, so by definition not as sick. And that caused some methodological problems. The lack of easy and fast Covid-19 testing in the US meant that not everyone in the study population had a diagnosis made via PCR testing—that is, taking a sample via a nasal swab and analyzing it for the virus’s genetic material. For those participants, the team of researchers confirmed instead that they had Covid-like symptoms and that they had been in contact with someone whose infection was confirmed with a test. It’s a slightly dicier set-up, but still valid.
The Minnesota team had originally intended to use death or hospitalization numbers as a marker of whether the drug helped people in the study. But even though raw numbers of both are sky-high in the US, the overall mortality and hospitalization rates are low—at least too low to show up significantly in a group of just under 500 people, the size of the study. So without looking at the data, the team switched to another metric: symptom reduction. (Participants reported their own symptoms on a 10-point visual scale day by day; the most common ones were cough, fatigue, and headache.) Here, too, hydroxychloroquine made no difference. Two weeks after starting, 24 percent of the 201 people taking the drug still had symptoms, versus 30 percent of the 194 people taking a placebo—a gap too small to be statistically significant.
Those results were originally going to be part of an earlier paper from the team, which showed that hydroxychloroquine likewise didn’t work as a preventative—it didn’t keep people from getting sick after they’d been exposed to the disease. That “post-exposure prophylaxis” paper got accepted to the New England Journal of Medicine quickly and came out in early June. But as time went on and the drug faded a bit from the news and presidential briefings, it was tougher to find a home for the paper about how the drug fared as a treatment. “The negative fact that hydroxychloroquine didn’t work was not as newsworthy, I guess. They weren’t as interested in a null study,” says David Boulware, the infectious disease physician running the team. “To design the study was eight or nine days. To do the study was seven weeks. To actually get it published was two and a half months … In a normal timeframe that’s fast. In a Covid timeframe, that’s glacially slow.”
The lack of confirmed, PCR-based testing also makes the study slightly less bombproof. “The true believers are going to criticize it. Not everyone had PCR testing, because it’s the United States and people didn’t have access to PCR testing,” Boulware says. “It’s not a perfect study, but I think it’s correct.”
By “true believers,” Boulware means people who remain unshakably convinced of the drug’s value. For months, they’ve parsed every hydroxychloroquine study for things they think the researchers did wrong—factors that might have blunted the drug’s effectiveness: too high a dose, too low a dose, given too soon, given too late, given without supposedly important adjuncts like zinc. Proponents of the drug’s use have proposed all of those as critical to its success. In some respects, they’re right—dosage does matter. One major study of the drug in Brazil stopped early because of serious heart problems in people taking it, a known side effect. But that study was also using extraordinarily high doses, well beyond the levels used preventatively or even as a treatment. The Recovery and Minnesota teams used a more typical protocol.