The AstraZeneca vaccine crisis in Europe was never really about the science

AstraZeneca vaccine vials pictured in Huelva, Spain on March 24, 2021. Photo: Reuters / Marcelo del Pozo.

A crisis was brewing in Europe over the safety of the Oxford-AstraZeneca vaccine against COVID-19. Europeans in several countries had reported abnormal blood clots and bleeding after receiving it, sometimes requiring hospitalization. In response, regulators across Europe, where the vaccine is most widely distributed, had suspended its use and were reviewing its safety. But scientific and medical experts were frustrated. They pointed out that the incidence of blood clots among vaccinated people was actually lower than in the general population and that the vaccine was safe. They feared that the regulatory response would amplify vaccine hesitancy and increase life-threatening COVID-19 infections. The science, they insisted, was clear and must be trusted.

But this crisis was not really about science. It was about public trust, and frightened citizens cannot easily be reassured by expertise that feels remote. Our solutions must reflect this.

There is a long-standing perception that vaccine hesitancy is the result of public ignorance or rejection of science. But studies show that vaccine hesitancy stems from mistrust of our governing institutions, including those dedicated to science and technology. Citizens are alarmed by the often cozy relationships between regulators and the industries they oversee, and frustrated by the role of private interests in the research enterprise.

My own research, conducted as part of the Technology Assessment Project at the University of Michigan, identifies two other types of institutional failure that breed citizen mistrust. We have seen serious problems due to what sociologist Diane Vaughan calls the “normalization of deviance,” in which dangerous bureaucratic practices become routine so long as they do not cause immediate disaster. Examining the Space Shuttle Challenger disaster, Vaughan found that NASA’s organizational culture essentially prevented managers from hearing engineers’ concerns about the weaknesses of the O-ring, the technical component ultimately blamed for the explosion after it cracked in extremely cold temperatures. Likewise, organizational problems at the U.S. Centers for Disease Control and Prevention led to its flawed COVID-19 test at the start of the pandemic.

In addition, marginalized communities often feel that decisions made by governments and technical experts simply do not represent them. Take the recent water crisis in Flint, Michigan. Shortly after officials changed the city’s water source to save money, residents (54% of whom are Black) noticed that their water smelled foul and was discolored. They began losing their hair, and their skin broke out in rashes. Some residents died of Legionnaires’ disease. They raised the alarm with local experts and government officials, including environmental and health regulators, but their concerns were summarily dismissed for months. This episode inflamed existing fears that government officials did not respect the community’s knowledge or needs, and to this day there is deep skepticism about the safety of Flint’s water.

Seen in this light, we should applaud the decisions of European regulators to take a precautionary approach to the Oxford-AstraZeneca vaccine. To ensure public confidence in vaccines – and in technologies more generally – governments must take adverse events and emerging community concerns seriously. By suspending distribution of the vaccine and reviewing the data, European governments allow citizens to feel that their concerns are heard and to believe that their governments truly represent their interests. In the short term, these governments can continue to build community confidence by being transparent about their review findings, including the uncertainties and the risks and benefits they weigh when deciding how to proceed. This is especially crucial in Europe, where vaccine hesitancy rates were high even before the blood clot reports emerged.

But this episode also offers lessons for the long haul. Responding quickly and transparently during a crisis is not enough, especially because citizens do not always share their leaders’ priorities – achieving herd immunity, for example. Instead, building and maintaining public trust requires systemic solutions.

First, drug regulators need to create systems that require physicians to report all adverse drug reactions, along with clear rules on what types of data may trigger further scrutiny. The data and rules that guide government decision-making should be accessible to citizens. While most countries have reporting systems, few are mandatory. A mandatory system makes important data available and, if communicated effectively, can reassure citizens that their leaders are monitoring pharmaceuticals and are ready to act when necessary. It can also help curb the spread of misinformation, since citizens can view the data for themselves.

Second, regulators should include the views of citizens – especially communities that have been historically marginalized or are likely to be skeptical – when reviewing vaccines. Community members can advise bureaucrats during initial review processes and sit on advisory committees alongside technical experts. This would provide crucial expertise to decision-makers, helping them understand which risks concern citizens and how citizens balance risks against benefits. It would simultaneously give these communities a sense of inclusion and empowerment, along with a better understanding of the vaccine approval process that they can take back to friends and family who might be hesitant.

Finally, it is essential that drug regulators, research funding bodies, and other science and technology policy institutions make community concerns central to their day-to-day work. In other words, we cannot attend to the concerns of alienated communities only when they affect us directly. To build long-term trust, leaders must learn more about the concerns and priorities of historically marginalized populations and shift research and regulatory priorities to address them.

Some might argue that such steps take too long and will distract from the goal of developing science and technology in the public interest. But if the past year has taught us anything, it’s that broad social trust is essential to the success of public health initiatives and, ultimately, to our survival.

‘Future Tense’ is a partnership of Slate, New America, and Arizona State University which examines emerging technologies, public policy, and society.
