With a growing body of research showing the role of social media in the promotion of anti-science sentiment and misinformation, pressure has been building on the tech companies themselves to clamp down on the spread of ideas like anti-vaccination conspiracy theories. The companies, especially social media firms like Facebook, have already faced mounting scrutiny over the last year for their handling of data privacy and their involvement in the spread of other fake news.
Now, lawmakers like California Representative Adam Schiff are focusing their attention on how social media can lend a veneer of legitimacy to ideas that have no foundation in science – some of which can cause widespread harm when put into practice.
As Washington State faces the latest in a series of measles outbreaks in the US, Schiff sent letters last week to Facebook CEO Mark Zuckerberg voicing concerns over the spread of anti-vaccination propaganda on Facebook and Instagram, and to Google CEO Sundar Pichai discussing the same issues on YouTube.
“Repetition of information, even if false, can often be mistaken for accuracy,” he said.
He also cited reporting earlier this month from The Guardian that showed the prevalence of inaccurate anti-vaccine information on Facebook and YouTube. In experiments that a number of journalists have conducted and published on Twitter, and that Schiff tried himself, Facebook search results on vaccines turn up largely anti-vaccine content.
In that reporting, The Guardian’s Julia Carrie Wong also found, even more alarmingly, that YouTube’s video recommendation algorithm fuels the spread of the propaganda.
Another recent study has implicated YouTube in the spread of another conspiracy theory that roundly dismisses the relevant science – a movement promoting the idea that Earth is flat rather than round, in spite of thousands of years of scientific evidence.
Those researchers said that while YouTube hadn’t done anything explicitly wrong to promote the spread of Flat Earth misinformation, it could certainly choose to modify its algorithm to make such content harder for users to stumble upon.
They found that after watching conspiracy theory videos on other topics, such as 9/11, the Moon landing, and the Sandy Hook school shooting, YouTube’s algorithm began suggesting videos promoting Flat Earth theories. In this way, the site exposes the ideas to those most likely to accept them.
And while a belief that the Earth is flat may pose less immediate risk than anti-vaccination views, the researchers highlighted how it plays into a larger problem:
“Believing the Earth is flat in and of itself is not necessarily harmful, but it comes packaged with a distrust in institutions and authority more generally,” according to Asheley Landrum, who led the Flat Earth research at Texas Tech University. “We want people to be critical consumers of the information they are given, but there is a balance to be had.”
Researchers point to what they call a “data void” as the source of the problem. For certain search terms, there is a minimal amount of new content discussing the settled science, but a plethora of propaganda that dismisses all of the evidence in favor of fearmongering.
Sometimes, the consequences have been even more drastic, with lynch mobs in Sri Lanka and India responding to misinformation about child abductions on WhatsApp, a messaging service owned by Facebook.
The companies are starting to respond to criticism. Both Facebook and YouTube have put policies in place to tamp down on misleading information, and last month, YouTube said it would curtail recommendations of videos that “could misinform users in harmful ways.”
Last summer, Facebook started deleting misinformation that aimed to promote “violence or physical harm.”
Pinterest, another social network focused on images, has instituted a broad policy against misinformation, with rules specifically targeting “promotion of false cures for terminal or chronic illnesses and anti-vaccination advice.”
Banning content from prominent anti-vaccination activists was just the beginning. Pinterest has now “blacklisted” certain problematic search terms so that they return no results at all, on the reasoning that showing nothing is preferable to showing misinformation. Users can no longer find results of any kind when searching “vaccination” on Pinterest.
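The mechanism Pinterest describes amounts to a blocklist check at search time: if a query matches a restricted term, the search returns nothing rather than surfacing misinformation. The sketch below is a hypothetical illustration only; the term list, function names, and index structure are assumptions for the example, not Pinterest’s actual implementation.

```python
# Hypothetical sketch of a search-term blocklist, loosely modeled on the
# approach Pinterest describes: queries matching a restricted term return
# no results at all. Terms and names here are illustrative assumptions.

BLOCKED_TERMS = {"vaccination", "anti-vax"}  # assumed example terms

def search(query: str, index: dict) -> list:
    """Return results for a query, or nothing if it hits the blocklist."""
    normalized = query.strip().lower()
    # Suppress the query entirely if any blocklisted term appears in it.
    if any(term in normalized for term in BLOCKED_TERMS):
        return []  # show no results rather than potentially harmful content
    return index.get(normalized, [])

# Toy index standing in for a real search backend.
index = {"knitting": ["pin1", "pin2"]}
print(search("Vaccination", index))  # blocked: prints []
print(search("knitting", index))     # allowed: prints ['pin1', 'pin2']
```

The trade-off Ozoma describes is visible even in this toy version: suppressing a term blocks harmful results and benign ones alike, which is why she frames it as “breaking the site” until better ranking of good content is possible.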
“We’re hoping that we can move from breaking the site to surfacing only good content. Until then, this is preferable,” said Ifeoma Ozoma, a Pinterest public policy and social impact manager.
“This has not been a focus for other platforms,” according to Ozoma. “We’ve been working on this since 2017, but we’ve been kind of alone on it. Hopefully there’s more discussion going forward.”
There are other long-term solutions. The internet needs more content promoting settled science to drown out unscientific voices. Unlike traditional media, social media gives more weight to what is new and popular than to what is supported by expert consensus. Platforms like Facebook, Pinterest, and YouTube need to work towards giving preference to evidence-based content rather than actively promoting misinformation. But until then, with issues so important to the welfare of society as a whole, perhaps other companies should take a cue from Pinterest, if that’s what it takes to stop the spread of harmful misinformation.