An Oxford study has found that general vaccine mistrust, conspiracy beliefs, low government trust and watching YouTube are key predictors of an individual’s COVID-19 vaccine hesitancy.
Conducted with 1,476 UK adults between 12–16 December 2020, the study found that “trust is a core predictor, with distrust in vaccines in general and mistrust in government raising vaccine hesitancy.” Researchers found that common misunderstandings among COVID-19-vaccine-hesitant people fell into three groups: believing that herd immunity provides protection from the virus, fearing rapid vaccine development and unknown side effects, and believing that the virus is man-made and used for population control.
After government mistrust, use of the social media channels Snapchat, TikTok and YouTube were the next strongest predictors of vaccine hesitancy. Facebook and Twitter showed little correlation with either vaccine hesitancy or eagerness.
Researchers pointed to built-in features of the social media platforms as directly responsible for hesitancy. The study singled out “relatively unregulated social media sources—such as YouTube—that have recommendations tailored by watch history.” Algorithmic features such as Top 5 and Up Next were also blamed for increasing the likelihood of users getting stuck in echo chambers. Furthermore, fake news appears to spread more effectively than real news: an analysis of 1,300 Facebook pages during the 2019 measles outbreak found that anti-vax pages grew by 500%, compared with 50% growth for pro-vaccine pages.
Increasing numbers of people are turning to social media for health advice. As early as 2013, 72% of Americans and 83% of Europeans used the internet as a source of health information, and the COVID-19 pandemic has only accelerated the trend. The Oxford vaccine hesitancy study pressed for intervention from governments, health officials and social media companies to combat health misinformation in the future.
When vaccine-hesitant participants were asked why they did not want to take a COVID-19 vaccine, the study reported, “they offered either some adapted understanding of herd immunity, or arguments that the virus was not as deadly as described (linked to scepticism of registered deaths), concluding that most people do not need a vaccine. Similarly, in justifying their decision not to get a vaccine, they highlighted their belief that the vaccine process had been rushed, that not enough testing had been undertaken, and the potential of unknown side effects.”
Some believed that those who were most vulnerable to COVID-19 should receive a vaccine, but, as they did not see themselves as personally at risk, concluded that they did not need one. The WHO listed vaccine hesitancy as one of the top ten threats to global health in 2019.
COVID-19 vaccine hesitancy draws on long-standing issues regarding general vaccine trust. A study of vaccine-related YouTube posts found that 65.5% discouraged vaccine use, focusing on autism, undisclosed risks, adverse reactions, and alleged mercury content.
The study suggested that “emotional thinking during conditions of uncertainty” can leave individuals unable to assess risk accurately, citing the example of the 0.0004% incidence of rare blood clot disorders associated with the Oxford/AstraZeneca and Johnson & Johnson vaccines.
Potential interventions could include advertisers refusing to place advertisements alongside harmful content, altering keyword searches to redirect individuals to authoritative sources, banning overt conspiracy groups such as QAnon, flagging misinformation, and rapidly removing content. A major difficulty in regulating social media sites is the speed at which content spreads; YouTube and Facebook removed a video titled “Plandemic,” but only after it had been viewed 1.8 million times.
As for direct government action, the study warned that “reversals in advice” can generate a great deal of mistrust, adding that “social trust enables the collective action needed to achieve sufficient population vaccination level.”