Meta’s bombshell announcement is set to spur heightened activity from fake accounts and troll farms, which thrive on spreading deliberate falsehoods. The organisation’s decision to discontinue its fact-checking programs raises significant concerns about the potential rise of disinformation online and its impact on society.
Fact-checkers have played a crucial role in enabling users to identify false information by marking content as potentially misleading and providing links to reliable research. The system did not remove content; it gave users the context needed to make informed judgements, and any decision to remove content always rested with Meta, not the fact-checkers.
Vaidotas Šedys, Chief Risk Officer at Oxylabs, said: “The discourse surrounding Meta’s decision may inadvertently become a self-fulfilling prophecy: public concern that disinformation will flood users of Meta’s social media platforms might act as an invitation for troll farms to test the new boundaries. The transition period will therefore be precarious, requiring a heightened level of caution from users, policymakers and Meta itself.”
“Meta’s shareholders stand to gain the most from this shift,” added Šedys. “Removing fact-checking is likely to increase user activity and engagement, key metrics for monetisation. However, troll farms, which often exploit controversial and divisive topics, are poised to capitalise on the platform’s vulnerabilities. By fuelling public dissatisfaction, especially on politically charged issues, they will find ample opportunities to amplify their agendas. This dynamic underscores the risk that Meta’s platforms may become battlegrounds for disinformation campaigns, with potentially significant societal repercussions.”
Viktoras Daukšas, Head of Debunk.org, a Lithuania-based disinformation analysis centre, independent think tank and partner of Oxylabs Project 4beta, added: “Meta is following in the footsteps of X, and the move risks creating an informational vacuum that could become a tool for manipulation, especially as state-sponsored trolls from countries like Russia or China gain unrestricted access to platform content. This chaos could severely hinder efforts to combat disinformation in the U.S. and undermine the public’s ability to distinguish trustworthy information from false narratives.”
“By limiting access for researchers and organisations working to analyse disinformation, platforms may be attempting to avoid communication and legal crises,” added Daukšas. “However, this approach significantly jeopardises public safety and the quality of information available. In Europe, such actions could violate the Digital Services Act, which holds platforms accountable for ensuring safety within the EU. If Meta’s decision to remove fact-checkers applies to Europe, the European Commission will likely respond decisively to protect information integrity and prevent such changes from becoming widespread.”
“This situation reflects broader debates about platform responsibility and the influence of political shifts on corporate decisions. The European Union will defend users’ rights with strict regulatory measures, ensuring that similar decisions do not set a dangerous precedent.”