As the coronavirus pandemic sweeps the world, shutting down entire countries and threatening the worst economic crisis since 2008, developed nations are asking their citizens for the kind of sacrifices that are normally restricted to wartime. A shelter-in-place order now covers millions of residents in California and Illinois. Colleges and universities around the country are shuttered. Are these unprecedented efforts to “flatten the curve” really worth it? One renowned researcher – John Ioannidis, a specialist in scientific methodology at Stanford University – is skeptical, arguing that there’s not enough data to justify wrecking the economy by putting the country on lockdown. His analysis is being taken seriously by scientific luminaries such as Steven Pinker. But it has some significant shortcomings.
Ioannidis is famous for his 2005 paper, “Why Most Published Research Findings Are False.” This article was a stinging rebuke to the cultural underbelly of science, where researchers often prioritize flashy findings over accuracy, bend results to fit funders’ agendas (a particular problem in pharmaceutical research), and rely on sloppy statistical analyses. The trouble was that this underbelly wasn’t just a small, seedy corner of the scientific world. In fact, a great deal of scientific research – including research that drives public policy and healthcare decisions – is probably deeply flawed: hard to replicate, biased, and spurious. Since Ioannidis released this bombshell article, it’s been cited more than 8,000 times and has helped spark massive reforms in how sciences such as psychology carry out their research.*
So Ioannidis has his bona fides. People – important people, influential people – trust him. He’s also an inveterate skeptic and data nerd, the world’s go-to expert for sniffing out the holes in fallacious scientific arguments. One of his biggest pet peeves is arguments that aren’t built on a foundation of rock-solid data. In the midst of the public-policy debates surrounding the coronavirus pandemic, he thinks that the case for preemptively closing down cities and regions to contain the spread of the illness is just such an argument. Writing in STAT on Wednesday, he argued that we simply lack the data to determine whether such extreme measures are a good idea. But his hard-hitting analysis illustrates a unique danger of data wonkishness: putting so much stock in scientific abstractions that reality itself becomes invisible.
* Not that psychology has exactly conquered the problem yet.