Troubled by the number of unvaccinated COVID-19 patients at his hospital, the French doctor logged into Facebook and uploaded a video urging people to get vaccinated. He was soon inundated with dozens, then hundreds, then more than 1,000 hate messages from an anti-vaccination extremist group known as V_V. The group, active in France and Italy, has harassed doctors and public health officials, vandalized government offices and attempted to disrupt vaccine clinics.
Warned about the abuse of its platform, Facebook removed several accounts affiliated with the group last December. But that didn’t stop V_V, which continues to use Facebook and other platforms and, like many anti-vaccination groups around the world, has expanded its portfolio to include climate change denialism and anti-democratic messages.
“Let’s go get them at home, they don’t have to sleep anymore,” read one post from the group. “Fight us!” read another.
The group’s largely unchecked attacks on the indisputable health benefits of vaccines highlight the limits of a social media company’s ability to block even the most destructive misinformation, especially without sustained, active effort.
Researchers at Reset, a US-based nonprofit, have identified more than 15,000 abusive or misleading Facebook posts from V_V – activity that peaked in the spring of 2022, months after the platform announced actions against the organization. In a report on V_V’s activities, Reset researchers concluded that the group’s continued presence on Facebook raises “questions about the effectiveness and consistency of Meta’s self-reported interventions.”
Meta, Facebook’s parent company, noted in its response that its 2021 actions were never intended to remove all V_V content but rather to take down accounts found to be engaging in coordinated harassment. After the Associated Press notified Facebook of the group’s continued activities on its platform, the company said it removed 100 more accounts this week.
Meta says it’s trying to strike a balance between removing content from groups like V_V that clearly violate rules against harassment or dangerous misinformation, and not silencing innocent users. That can be especially difficult when it comes to the controversial issue of vaccines.
A spokesperson for Meta told the AP: “This is a highly adversarial space, and our efforts are ongoing: since the initial takedown, we’ve taken numerous actions against this network’s attempts to return.”
V_V is also active on Twitter, where Reset researchers found hundreds of accounts and thousands of posts from the group. Many of the accounts were created shortly after Facebook took action against the group last winter, Reset found.
In response to Reset’s report, Twitter said it had taken enforcement actions against several accounts linked to V_V but did not detail those actions.
V_V has proved particularly resilient to efforts to stop it. Named for the movie V for Vendetta, in which a lone masked man seeks revenge against an authoritarian government, the group uses fake accounts to avoid detection and often coordinates its activities on platforms like Telegram, which lacks Facebook’s more aggressive moderation policies.
According to Jack Stubbs, a researcher at Graphika, a data analytics company that has tracked V_V’s activities, that adaptability is one reason why it’s so hard to stop the group.
“They understand how the Internet works,” said Stubbs.
Graphika estimated the group’s membership at 20,000 at the end of 2021, with a small number of members participating in the group’s online harassment efforts. In addition to Italy and France, Graphika’s team has found evidence that V_V is trying to sow divisions in Spain, the United Kingdom, Ireland, Brazil and Germany, where a similar anti-government movement known as Querdenken is active.
Groups and movements like V_V and Querdenken have increasingly alarmed extremism researchers and law enforcement, who say there is evidence that far-right groups are using skepticism about COVID-19 and vaccines to expand their reach.
More and more such groups are moving from online harassment to real-world action.
For example, in April, V_V used Telegram to announce plans to pay a bounty of 10,000 euros to vandals who spray-painted the group’s logo (two red Vs in a circle) on public buildings or vaccine clinics. The group then used Telegram to disseminate photos of the vandalism.
A month before Facebook took action against V_V, Italian police raided the homes of 17 anti-vaccination activists who had used Telegram to threaten government, healthcare and media figures because of their support for COVID-19 restrictions.
Social media companies have struggled to deal with a wave of vaccine misinformation since the start of the COVID-19 pandemic. Earlier this week, Facebook and Instagram suspended Children’s Health Defense, an influential anti-vaccination organization led by Robert F. Kennedy Jr.
One reason is the difficult balancing act between censoring harmful content and protecting free speech, according to Joshua Tucker of New York University, who co-directs NYU’s Center for Social Media and Politics and is a senior advisor at Kroll, a technology, governance and economic consulting firm.
Striking the right balance is especially important as social media has emerged as an important source of news and information around the world. Leave up too much bad content and users can be misinformed. Remove too much and users will start to distrust the platform.
“It’s dangerous for us as a society when we’re going in a direction where nobody feels that they can trust the information,” Tucker said.