Why is Facebook so bad at countering vaccine misinformation?


It’s been six months since Facebook announced a major reversal of its policies on vaccine misinformation. Faced with a rising tide of viral rumors and conspiracy theories, the company said it would remove vaccine mistruths from its platform. Notably, the effort encompassed not only content about COVID-19 vaccines, but all vaccines. That includes many kinds of claims it had long allowed, like those linking vaccines and autism, or statements that vaccines are “toxic” or otherwise dangerous.

The move was widely praised, as disinformation researchers and public health officials have long urged Facebook and other platforms to treat vaccine misinformation more aggressively. Since then, the company has banned some prominent anti-vaxxers, stopped recommending health-related groups and shown vaccine-related PSAs across Facebook and Instagram. It now labels any post that mentions COVID-19 vaccines, whether factual or not.

Yet, despite these efforts, vaccine misinformation remains an urgent problem, and public health officials say Facebook and other social media platforms aren’t doing enough to address it. Last month, the Surgeon General issued an advisory warning of the dangers of health misinformation online. The accompanying 22-page report didn’t call out any platform by name, but it highlighted algorithmic amplification and other issues commonly associated with Facebook. The following day, President Joe Biden made headlines when he said that misinformation on Facebook was “killing people.”

While Facebook has pushed back on that criticism, citing its numerous efforts to quash health misinformation during the pandemic, the company’s past lax approach to vaccine misinformation has likely made that job much more difficult. In a statement, a Facebook spokesperson said vaccine hesitancy has decreased among its users in the US, but the company has also repeatedly rebuffed requests for more data that could shed light on just how big the problem really is.

“Since the beginning of the pandemic, we have removed 18 million pieces of COVID misinformation, labeled hundreds of millions of pieces of COVID content rated by our fact-checking partners, and connected over 2 billion people with authoritative information through tools like our COVID information center,” a Facebook spokesperson told Engadget. “The data shows that for people in the US on Facebook, vaccine hesitancy has declined by 50% since January, and acceptance is high. We will continue to enforce against any account or group that violates our COVID-19 and vaccine policies and offer tools and reminders for people who use our platform to get vaccinated.”

Facebook’s pandemic decision

Throughout the pandemic, Facebook has moderated health misinformation much more aggressively than it has in the past. Yet for the first year of the pandemic, the company drew a distinction between coronavirus misinformation — e.g., statements about fake cures or posts disputing the effectiveness of masks, which it removed — and vaccine conspiracy theories, which did not break the company’s rules. Mark Zuckerberg said he would be reluctant to moderate vaccine misinformation the same way the company did COVID-19 misinformation.

That changed this year, with the advent of COVID-19 vaccines and the rising tide of misinformation and vaccine hesitancy that accompanied them, but the damage may have already been done. A peer-reviewed study published in February found that exposure to misinformation about the COVID-19 vaccines “lowers intent to accept a COVID-19 vaccine” by about 6 percent.

People are also more likely to be unvaccinated if they primarily get their news from Facebook, according to a report from the COVID States Project. The researchers surveyed more than 20,000 adults across all 50 states and found that those who cited Facebook as a primary news source were less likely to be vaccinated. While the authors note that the correlation doesn’t prove that using Facebook affects someone’s choice to get vaccinated, they found a “surprisingly strong relationship” between the two.

“If you rely on Facebook to get news and information about the coronavirus, you are substantially less likely than the average American to say you have been vaccinated,” the report’s authors wrote. “In fact, Facebook news consumers are less likely to be vaccinated than people who get their coronavirus information from Fox News. According to our data, Facebook users were also among the most likely to believe false claims about coronavirus vaccines.”

The researchers speculate that this could be because people who spend a lot of time on Facebook are less likely to trust the government, the media or other institutions. Or, it could be that spending time on the platform contributed to that distrust. While there’s no way to know for sure, we do know that Facebook has for years been an effective platform for spreading disinformation about vaccines.

A spotty record

Doctors and researchers have warned for years that Facebook wasn’t doing enough to prevent lies about vaccines from spreading. Because of this, prominent anti-vaxxers have used Facebook and Instagram to spread their message and build their followings.

A report published earlier this year by the Center for Countering Digital Hate (CCDH) found that more than half of all vaccine misinformation online could be traced to just 12 people, who are part of a long-running effort to undermine vaccines. But while the company has banned some accounts, some of those individuals still have a presence on a Facebook-owned platform, according to the CCDH. Facebook has disputed the findings of that report, which relied on analytics from the company’s CrowdTangle tool. But the social network’s own research into vaccine hesitancy indicated “a small group appears to play a big role” in undermining vaccines, The Washington Post reported in March.

There are other issues, too. For years, Facebook’s search and recommendation algorithms have made it extraordinarily easy for users to fall into rabbit holes of misinformation. Simply searching the word “vaccine” would be enough to surface recommendations for accounts spreading conspiracy theories and other vaccine disinformation.

Engadget previously reported that Instagram’s algorithmic search results associated anti-vaccine accounts with COVID-19 conspiracies and QAnon content. More than a year later, a recent study from the advocacy group Avaaz found that although this type of content no longer appears at the top of search results, Facebook’s recommendation algorithms continue to suggest pages and groups that promote misinformation about vaccines. In their report, researchers document how users can fall into misinformation “rabbit holes” by liking seemingly innocuous pages or searching for “vaccines.” They also found that Facebook’s page recommendation algorithm appeared to associate vaccines and autism.

“Over the course of two days, we used two new Facebook accounts to follow vaccine-related pages that Facebook suggested for us. Facebook’s algorithm directed us to 109 pages, with 1.4M followers, containing anti-vaccine content — including pages from well-known anti-vaccine advocates and organizations such as Del Bigtree, Dr. Ben Tapper, Dr. Toni Bark, Andrew Wakefield, Children’s Health Defense, Learn the Risk, and Dr. Suzanne Humphries. Many of the pages the algorithm recommended to us carried a label, warning that the page posts about COVID-19 or vaccines, giving us the option to go directly to the CDC website. The algorithm also recommended 10 pages related to autism — some containing anti-vaccine content, some not — suggesting that Facebook’s algorithm associates vaccines with autism, a thoroughly debunked link that anti-vaccine advocates continue to push.”

Facebook has since removed some of these pages from its recommendations, though it’s not clear which ones. Avaaz points out that there’s no way to know why Facebook’s recommendation algorithm surfaces the pages it does, as the company doesn’t disclose how these systems work. The association is notable because content linking vaccines with autism is among the claims Facebook said it would ban under its stricter misinformation rules during the pandemic. That Facebook’s suggestions are intermingling the two topics is, at the very least, undermining those efforts.

Claims and counterclaims

Facebook has strongly contested these claims. The company repeatedly points to its messaging campaign around COVID-19 vaccines, noting that more than 2 billion people have viewed its COVID-19 and vaccine PSAs.

In a blog post responding to President Biden’s comments last month, Facebook’s VP of Integrity, Guy Rosen, argued that “vaccine acceptance among Facebook users in the US has increased.” He noted that the company has “reduced the visibility of more than 167 million pieces of COVID-19 content debunked by our network of fact-checking partners so fewer people see it.”

He didn’t share, however, how much of that misinformation was about vaccines, or details on the company’s enforcement of its more general vaccine misinformation rules. That’s likely not an accident. The company has repeatedly resisted efforts that could shed light on how misinformation spreads on its platform.

Facebook executives declined a request from the company’s own data scientists, who asked for additional resources to study COVID-19 misinformation at the start of the pandemic, The New York Times reported. It’s not clear why the request was turned down, but the company has also pushed back on outsiders’ efforts to gain insight into health misinformation.

Facebook has declined to share the results of an internal study on vaccine hesitancy on its platform, according to Washington DC Attorney General Karl Racine’s office, which has launched a consumer protection investigation into the company’s handling of vaccine misinformation.

“Facebook has said it’s taking action to address the proliferation of COVID-19 vaccine misinformation on its site,” a spokesperson said. “But then when pressed to show its work, Facebook refused.”

The Biden Administration has also — unsuccessfully — pushed Facebook to be more forthcoming about vaccine misinformation. According to The New York Times, administration officials have met repeatedly with Facebook and other platforms as part of an effort to curb misinformation about coronavirus vaccines. Yet when a White House official asked Facebook to share “how often misinformation was viewed and spread,” the company refused. According to The Times, Facebook responded to some requests for information by talking about “vaccine promotion strategies,” such as its PSAs or its tool to help users find vaccine appointments.

One issue is that it’s not always easy to define what is, and isn’t, misinformation. Factual information, like news stories or personal anecdotes about vaccine side effects, can be shared with misleading commentary. This, Facebook has suggested, makes it difficult to study the issue in the way that many have asked. At the same time, Facebook is a notoriously data-driven company. It’s constantly testing even the smallest features, and it employs scores of researchers and data scientists. It’s difficult to believe that learning more about vaccine hesitancy and how misinformation spreads is entirely out of reach.



