NEW YORK – Ten weeks after Facebook Inc. pledged to fight vaccine misinformation, such content remains widely available across its platforms as the social-media giant grapples with how aggressively to limit the spread of hoaxes and deceptions.
Facebook as of this week is still running paid ads for a prominent antivaccination group that suggests unethical doctors have conspired to hide evidence of the harm vaccines do to children. Both the company’s main site and its Instagram app recommend additional antivaccine content to users who view similar material. And the top three vaccine-related accounts recommended by Instagram are “vaccinetruth,” “vaccinesuncovered” and “vaccines – revealed” – all advocates for the discredited claim that vaccines are toxic.
“We’re not where we want to be,” Monika Bickert, Facebook’s head of global policy management, said in an interview. “And we know that.”
Public health officials in the US, which is in the midst of its biggest measles outbreak in more than a quarter-century, have long promoted the efficacy and safety of vaccines.
Vaccine misinformation is only one of many fronts on which Facebook continues to catch flak related to how it polices content, even as it invests billions of dollars and dedicates thousands of employees to the task. Last week, Facebook took fire for failing to remove a doctored video of House Speaker Nancy Pelosi, with Bickert saying at the time it wasn’t the company’s place to determine the truth of the altered footage.
Facebook also has been criticized for removing content. After it temporarily shut out a group dedicated to adherents of a diet plan affiliated with CrossFit, the popular fitness company accused Facebook of shilling for an “unholy alliance of academia, government, and multinational food, beverage, and pharmaceutical companies.” CrossFit said last week that it would stop using Facebook services until further notice.
Facebook has declined to comment on the reason for the suspension.
Bickert said the company is mindful that it could be accused of overreach when it comes to pulling content from the platform. With that in mind, she said, the company’s vaccine effort aims only to prevent the spread of specific types of false information, not to silence antivaccine activists.
Amid calls from some lawmakers to break up the company, Facebook Chief Executive Mark Zuckerberg has contended that its size and resources make it uniquely capable of tackling the problem. “Facebook spends more on safety than Twitter’s whole revenue for the year,” Zuckerberg said on a conference call earlier this month. “We’re able to do things that I think are just not possible for other folks to do.”
The incremental pace of Facebook’s vaccine initiative stands out in part because, by the company’s admission, the issue is more straightforward and easier to address than fast-moving disinformation campaigns or other instances where the truth isn’t clear.
“Here there are persistent hoaxes that are widely circulating on and offline, and there’s a consensus among the leading experts that these are false,” Bickert said.
Yet she cautioned against expecting immediate results in what users see on the platform. Facebook is still working with public health groups on informational material that will be provided to users atop vaccine-related search results, and automated tools for detecting banned antivaccine content aren’t ready for deployment.
Once Facebook has come up with solutions to these problems, Bickert said, it will have to test those prototypes before implementing them. “If you don’t get it right, you could be pushing people in the wrong direction or toward shying away from engagement at all,” she said.
Some solutions implemented by the company amount to less than meets the eye. Facebook’s commitment to cull advertising from antivaccine groups, for example, is being applied only to ads that include falsehoods in their actual text.
A prominent antivaccination organization, the World Mercury Project, is pitching a free e-book alleging that vaccines can cause autism, sudden infant death syndrome and sterility – all claims that Facebook would ban under its stated policy. But the company deems the ad acceptable because the ad text itself makes only vague claims about “conflicts of interest” and “tainted science,” before directing users to material containing claims explicitly banned by Facebook.
In other instances, Facebook hasn’t yet implemented solutions it said it had put in place.
As part of Facebook’s recent efforts to provide users with more private forums for communication, Zuckerberg said the company had made it harder to find groups whose administrators promote misinformation. And Facebook’s initiative against antivaccine misinformation included a pledge to remove groups and pages featuring such material from Facebook’s search results. But administrators of several of the top-recommended Facebook groups explicitly endorse the claim that vaccines cause autism, and the most popular Facebook page related to vaccines – with 106,000 likes – is for an antivaccine documentary making similar claims.
Likewise, Instagram held an event for reporters earlier this month in which the photo-sharing app announced that it was actively deleting hashtags that had become overrun by false antivaccination content, making it harder for such material to spread. Several weeks later, hashtags including #vaccineskill and #Vaccinesharm are no longer active. However, #vaccineinjury, #vaccinetruth and #antivaccine show up among the most popular vaccine-related hashtags, filled with thousands of posts containing the sort of material that Instagram said would trigger crackdowns.
“But I wouldn’t say that’s an indication that work isn’t taking place,” said Karina Newton, Instagram’s global head of public policy, adding that approximately 200 people across the company are working on the issue.
Anecdotally, both vaccine opponents and advocates say Facebook’s efforts to date haven’t produced a noticeable impact on what is displayed. Peter Hotez, dean of the National School of Tropical Medicine at Baylor College of Medicine, said he has observed no changes since Facebook said it would intervene. Hotez, who develops vaccines for neglected diseases and also has an autistic daughter, has become a lightning rod for antivaccine activists on Facebook, appearing in derisive memes and being accused of profiting from harming children.
Facebook’s activities to date on antivaccine content are “the minimum possible in order to give the illusion of corporate responsibility,” he said.
Facebook’s announced crackdown hasn’t yet affected the platform’s utility to the antivaccine movement, said Elaine Shtein, a California natural-living advocate and antivaccine activist. She said she grew concerned about vaccine safety in 2011 when her 18-month-old son was diagnosed with autism that she believes was the result of “toxic overload for his body.”
More than any other forum, she said, Facebook has fostered online communities of parents who believe their children were harmed by vaccines and are trying to warn others. “We’ve educated a lot of people,” she said, crediting Facebook with enabling antivaccine activists to make their case in the face of skepticism from mainstream media.
Shtein said she is aware of Facebook’s plans to restrict the spread of antivaccine content and doesn’t believe the company should have the right to determine what is or isn’t bogus. For now, her posts appear near the top of numerous Facebook and Instagram hashtags related to vaccines.
“I don’t feel like my content is being censored,” she said. “I still have things being shared out, go viral.”