
Op-Ed: What You Hear About Covid in the Metaverse Should Scare You


Brian Castrucci is an epidemiologist, public health practitioner and president and CEO of the de Beaumont Foundation. Frank Luntz is a Republican pollster and communication advisor.

It's not what you say that matters. It's what people hear. And what people are hearing on social media regarding Covid-19 in general, and the vaccines in particular, should scare you.

On the day that Mark Zuckerberg announced Meta, John Carmack, the consulting chief technology officer for Oculus (Facebook's virtual reality unit), acknowledged the potential harm in the digital world, saying, "If there is a demonstrated harm, then yes, we should try to mitigate the harm ... I think generally the right thing to do is to wait until harm actually manifests."

That's akin to the fire department arriving at a house only after it has burned to the ground. As the Covid pandemic has made clear, we're already too late.

The impact of social media on health goes beyond Covid. Long before the pandemic, social media had been linked to worsening mental health, increased risk for eating disorders, and misinformation about cures for diseases like cancer and diabetes. The question isn't whether we should act. It is why we haven't acted yet.

In the real world, there are rules and regulations in place to protect the public. From food and product safety to air and water quality measures, Americans expect some level of common-sense protection from known harm for the things we use and consume daily. Why should a virtual world be any different?

Social media has been weaponized to spread misinformation about Covid and the vaccines, which has contributed to lower vaccination rates and, ultimately, cost American lives. We don't yet know what the "metaverse" will look like, but it's not hyperbolic to assume misinformation will fester and spread just the same, if not worse.

A recent poll that Morning Consult conducted for us provides additional evidence of the negative impact of social media use on our ability to save lives during the pandemic. Those who said they share information daily on social media are the most likely to believe unfounded and inaccurate statements about Covid, including incorrect information about infertility, the impact of the mRNA vaccine on DNA, and the severity and prevalence of the virus.

Social media is also influencing people's willingness to get vaccinated, or not. Just over half of the unvaccinated respondents said social media was pushing them to wait or forgo the vaccine, and the vaccination rate among people who said social media was one of their primary sources of information was 16% lower than the rate among the general public.

Even in this era of hyper-sensitivity to free speech and expression, much of it deserved, 53% of Americans agree that social media companies should restrict or remove what they determine to be misinformation or disinformation about Covid and vaccines. The battle against misinformation goes beyond this pandemic as false and misleading information poses a real and measurable threat to our collective and individual health.

We have the opportunity to address the vulnerabilities that the pandemic has exposed. Regulators must hold social media companies and others accountable by engaging the public health community and ensuring that internet regulations include common-sense public health protections.

Our nation, and especially the public health community, can't be caught flat-footed again when the next crisis or pandemic hits. As the digital world evolves, so must public health. In a new era of "techno public health," collaboration between public health practitioners and social media companies could include:

  1. Partnerships between social media companies and public health practitioners to create, adopt, and implement accepted public health principles and protocols for the digital world.
  2. Congress creating a "digital world" safety office at the Centers for Disease Control and Prevention to monitor incidence of misinformation and deliberate disinformation on social media platforms and support ongoing research about the impacts of social media, misinformation, and public health knowledge and outcomes.
  3. State and local governmental public health agencies creating roles for digital community health workers. Community health workers are trusted public health educators and health care navigators in the real world and could be trusted influencers in the digital space. Agencies can start to develop these roles now, leveraging American Rescue Plan Act resources.

When it comes to Covid and the vaccines, there's room for debate about policies like vaccine and mask mandates. And social media channels provide a place where ordinary people can have robust discussions. However, facts are not debatable.

The harmful impact of misinformation is not hypothetical: it is real, and it is personal. Alaska's chief medical officer, Dr. Anne Zink, recently wrote about her observations as an emergency physician.

"My patient (who remains hospitalized) was suffering not just because of the virus, but also because of the deadly combination of misinformation and disinformation in a broken health-care system, in a country of broken trust," she said.

When do we decide as a society that enough is enough?

