Meta disclosed Thursday that it recently removed a network of activity affiliated with the Proud Boys after it detected members of the violent extremist group making inroads back onto Facebook and Instagram. The company says it removed around 480 Proud Boys accounts, pages, groups and events through a strategy it calls "strategic network disruption" — basically neutralizing a network of activity linked to a banned group in a targeted, simultaneous sweep.
Using this tactic, Meta says it's able to act effectively against dangerous organizations like hate and terror groups seeking to maintain a foothold on the platform, lessening the chances that those accounts will coordinate and pop back up.
"While there’s no silver bullet here, our approach is impacting these dangerous organisations, and we can see adversaries trying harder to hide their affiliation & change tactics," Meta Counterterrorism Policy Lead Dina Hussein wrote on Twitter. "We'll continue to stay vigilant and share our findings."
Beyond that specific targeted enforcement, Meta says that it also removed 750 other accounts, groups, pages and events linked to the Proud Boys during the course of its normal moderation efforts in 2022 so far. Across some of that activity, Proud Boys members were directing Facebook users to other platforms where the organization is not banned, though Meta declined to name those services.
Facebook banned the Proud Boys in October 2018 following Twitter's decision to do the same that August, designating the group as a dangerous hate organization under its platform rules. Prior to the ban, TechCrunch investigated how the Proud Boys leveraged Facebook as a key recruitment hub, operating a national network of well-organized chapters to grow its ranks through the social network's groups and algorithmic recommendations.
While the Proud Boys were once out and proud on Facebook, their efforts to reestablish a presence there are much subtler now. That includes members concealing their affiliation, promoting front groups and pushing more benign content that doesn't contain overt extremist messaging.
Meta doesn't always share the moves it makes against extremist and hate groups, particularly when those actions are part of an ongoing effort. On Twitter, Hussein explained the company's decision to publicize its recent actions against the Proud Boys, saying it wanted to "highlight the adversarial mutations we are noticing" among banned groups that make persistent efforts to claw their way back onto the platform.
This is a fair point, there are times when, as we face an especially determined or adversarial org when we opt to make our actions public to highlight the adversarial mutations we are noticing. https://t.co/wQ1sUIudpx
— Dina Hussein دينا (@DinaHussein) August 25, 2022
Meta's approach to extremism has evolved considerably since the online heyday of the Proud Boys, QAnon conspiracists and myriad violent anti-government militias, which once organized in the open on Facebook and Instagram. Now, Meta implements lessons learned through its more traditional, longstanding counterterrorism efforts as well as its more recently developed strategies for dealing with what it calls "coordinated inauthentic behavior" — influence campaigns spreading disinformation or other propaganda that are often tied to authoritarian governments.
The violent far-right organization infamous for stoking street fights in left-leaning U.S. cities during the Trump era is now a centerpiece of the investigation into the January 6 Capitol attack. This June, the Justice Department indicted five members, including former Proud Boys leader Henry "Enrique" Tarrio, on charges of seditious conspiracy for their alleged role in planning and participating in the attack.