Should Scientists Be Held Legally Responsible for Their Results?

On March 31, 2009, a panel of scientists and civil servants met to assess the risk presented by a recent series of tremors in the Abruzzo region of Italy. They concluded that a major seismic event was unlikely. Soon thereafter, Bernardo De Bernardinis, the vice-director of Italy’s Department of Civil Protection, the organization that put together the panel, told reporters that citizens should not worry, and even agreed with a journalist who suggested that people should relax with a glass of wine.

Six days later, a major earthquake struck L’Aquila, a city in Abruzzo, killing more than 300 people. Soon after, citizens requested an investigation into the panelists’ findings, and the public prosecutor obliged. De Bernardinis and the panelists were charged with manslaughter and now face up to 15 years in prison. The L’Aquila judge who determined that the case could go to court said the defendants provided “imprecise, incomplete and contradictory information” and effectively “thwarted the activities designed to protect the public.”

Many seismologists around the world say that criminalizing the Italian panel’s assessments will have a chilling effect on science. Sheila Jasanoff, a professor at Harvard University’s John F. Kennedy School of Government who studies the role of science and expertise in politics and the law, told me that, though the Italian trial is an extreme example, public scrutiny of how scientists convey low-probability, high-danger situations is not in itself unreasonable. Nor is it unprecedented. Jasanoff said the trial in L’Aquila called to mind the fallout from research on bovine spongiform encephalopathy, or mad cow disease, in the U.K. In 1989 a scientific advisory group reported that it was unlikely that BSE could be transmitted to humans. Through the early 1990s, government ministries reassured the public that it was safe to eat homegrown beef. As it happened, BSE was transmissible to humans. After dozens died of variant Creutzfeldt-Jakob disease, the human form of BSE, the British government launched an inquiry rather than a prosecution, Italian-style. The inquiry’s report, released in 2000, criticized scientists and civil servants alike for not adequately communicating that what’s unlikely is not impossible—for failing to admit openly that they could not rule out the risk of transmission.

Just as scientists in Britain did not yet know whether BSE would jump to humans, seismologists in Italy could not predict whether a major earthquake would occur. The through-line from mad cow to the Italian earthquake is not science—good, bad, or otherwise; it is the decision to placate the public by tempering scientific findings with (in hindsight, misplaced) reassurances. If, months from now, a court finds the scientists guilty, that verdict would be unfair to them and would set a dangerous precedent for panelists on future advisory committees, who might feel reluctant to offer any opinion at all. The ongoing trial should draw researchers’ attention to the benefits of openly acknowledging their own uncertainty, and it should remind the rest of us of the chasm between factual evidence and practical advice.