Reexamining Misinformation: How Unflagged, Factual Content Drives Vaccine Hesitancy

Posted on May 30, 2024 · Updated July 17, 2024 · Author: Ian Scheffler
[Image] Factual, vaccine-skeptical content on Facebook has a greater overall effect than “fake news,” discouraging millions from the COVID-19 shot. (Marcela Vieira via Getty Images)

What threatens public health more: a deliberately false Facebook post about tracking microchips in the COVID-19 vaccine that is flagged as misinformation, or an unflagged, factual article about the rare case of a young, healthy person who died after receiving the vaccine?

According to Duncan J. Watts, Stevens University Professor in Computer and Information Science at Penn Engineering and Director of the Computational Social Science (CSS) Lab, along with David G. Rand, Erwin H. Schell Professor at MIT Sloan School of Management, and Jennifer Allen, 2024 MIT Sloan School of Management Ph.D. graduate and incoming CSS postdoctoral fellow, the latter is much more damaging. “The misinformation flagged by fact-checkers was 46 times less impactful than the unflagged content that nonetheless encouraged vaccine skepticism,” they conclude in a new paper in Science. 

Historically, research on “fake news” has focused almost exclusively on deliberately false or misleading content, on the theory that such content is much more likely to shape human behavior. But, as Allen points out, “When you actually look at the stories people encounter in their day-to-day information diets, fake news is a minuscule percentage. What people are seeing is either no news at all or mainstream media.” 

[Portrait] Duncan Watts, Stevens University Professor in Computer and Information Science

“Since the 2016 U.S. presidential election, many thousands of papers have been published about the dangers of false information propagating on social media,” says Watts. “But what this literature has almost universally overlooked is the related danger of information that is merely biased. That’s what we look at here in the context of COVID vaccines.” 

In the study, Watts, one of the paper’s senior authors, and Allen, the paper’s first author, used thousands of survey results and AI to estimate the impact of more than 13,000 individual Facebook posts. “Our methodology allows us to estimate the effect of each piece of content on Facebook,” says Allen. “What makes our paper really unique is that it allows us to break open Facebook and actually understand what types of content are driving misinformed-ness.” 
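The estimation strategy described here can be thought of as a simple decomposition: a post's total impact is its per-view persuasive effect multiplied by how many times it was viewed. Below is a minimal sketch of that decomposition in Python; the `Post` fields, effect sizes, and view counts are hypothetical illustrations, not values or code from the paper, which estimated per-view effects from survey experiments and an AI model that extrapolated them to the full set of posts.

```python
# Minimal sketch of the impact decomposition described above.
# All names and numbers are hypothetical, not taken from the paper.
from dataclasses import dataclass

@dataclass
class Post:
    headline: str
    views: int                 # exposure on the platform
    persuasive_effect: float   # estimated change in vaccination intent per view
                               # (from survey experiments or an AI model)

def total_impact(posts):
    """Aggregate impact = sum over posts of (per-view effect x views)."""
    return sum(p.persuasive_effect * p.views for p in posts)

# Illustration: flagged misinformation is more persuasive per view,
# but unflagged vaccine-skeptical content is seen vastly more often.
flagged = [Post("microchip conspiracy", views=100_000, persuasive_effect=-0.005)]
unflagged = [Post("healthy doctor dies after vaccine", views=50_000_000,
                  persuasive_effect=-0.001)]

print(total_impact(flagged))    # -500.0
print(total_impact(unflagged))  # -50000.0: 100x larger despite a weaker per-view effect
```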

One of the paper’s key findings is that “fake news,” or articles flagged as misinformation by professional fact-checkers, has a much smaller overall effect on vaccine hesitancy than unflagged stories that the researchers describe as “vaccine-skeptical,” many of which focus on statistical anomalies that suggest that COVID-19 vaccines are dangerous. 

“Obviously, people are misinformed,” says Allen, pointing to the low vaccination rates among U.S. adults, in particular for the COVID-19 booster vaccine, “but it doesn’t seem like fake news is doing it.” One of the most viewed URLs on Facebook during the time period covered by the study, at the height of the pandemic, for instance, was a true story in a reputable newspaper about a doctor who happened to die shortly after receiving the COVID-19 vaccine. 

That story racked up tens of millions of views on the platform, multiples of the combined number of views of all COVID-19-related URLs that Facebook flagged as misinformation during the time period covered by the study. “Vaccine-skeptical content that’s not being flagged by Facebook is potentially lowering users’ intentions to get vaccinated by 2.3 percentage points,” Allen says. “A back-of-the-envelope estimate suggests that translates to approximately 3 million people who might have gotten vaccinated had they not seen this content.” 
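Allen's back-of-the-envelope conversion is simple arithmetic: multiply the 2.3 percentage-point drop in vaccination intentions by the size of the exposed user base. The ~3 million figure implies a base on the order of 130 million U.S. Facebook users; that base is inferred here (3,000,000 / 0.023) rather than stated in the article:

```python
# Back-of-the-envelope check of the ~3 million estimate.
intent_drop = 0.023          # 2.3 percentage-point drop in vaccination intent
exposed_users = 130_000_000  # implied user base (assumption, inferred
                             # as 3_000_000 / 0.023)
print(f"{intent_drop * exposed_users:,.0f}")  # ~2,990,000 people
```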

[Portrait] Jennifer Allen, incoming CSS postdoctoral fellow and lead author of the new paper

Although the survey results showed that fake news identified by fact-checkers was more persuasive on a per-view basis, so many more users were exposed to the factual, vaccine-skeptical articles with clickbait-style headlines that the overall impact of the latter outstripped that of the former.

“Even though misinformation, when people see it, can be more persuasive than factual content in the context of vaccine hesitancy,” says Allen, “it is seen so little that these accurate, ‘vaccine-skeptical’ stories dwarf the impact of outright false claims.” 

As the researchers point out, being able to quantify the impact of misleading but factual stories points to a fundamental tension between free expression and combating misinformation, as Facebook would be unlikely to shut down mainstream publications. “Deciding how to weigh these competing values is an extremely challenging normative question with no straightforward solution,” the authors write in the paper. 

Allen points to content moderation that involves the user community as one possible means to address this challenge. “Crowdsourcing fact-checking and moderation works surprisingly well,” she says. “That’s a potential, more democratic solution.” 

With the 2024 U.S. Presidential election on the horizon, Allen emphasizes the need for Americans to seriously consider these tradeoffs. “The most popular story on Facebook in the lead-up to the 2020 election was about military ballots found in the trash that were mostly votes for Donald Trump,” she notes. “That was a real story, but the headline did not mention that there were nine votes total, seven of them for Trump.” 

This study was conducted at the University of Pennsylvania’s School of Engineering and Applied Science, the Annenberg School for Communication, and the Wharton School, along with the Massachusetts Institute of Technology Sloan School of Management, and was supported by funding from Alain Rossmann.
