
Facebook Must Stop Dodging Responsibility for Its Role in Destabilizing Democracy

Hamsini Sridharan | June 29, 2020

On Friday, Facebook CEO Mark Zuckerberg announced several updates to the company’s content moderation policies, including changes meant to promote authoritative voting information and defuse concerns about how it exempts “newsworthy” politicians from moderation. The announcements come in response to public outcry over Facebook’s refusal to moderate recent posts by President Trump, including posts that threatened violence against Black Lives Matter protesters and spread unsubstantiated claims undermining mail-in voting. Facebook has faced significant backlash over this policy of inaction, from civil rights leaders to the company’s own employees. A growing advertiser boycott, led by the #StopHateForProfit campaign, is putting financial pressure on the company.

Facebook is clearly feeling the mounting pressure and has felt the need to respond publicly—but its response continues to dodge responsibility for the company’s part in destabilizing democracy. Incremental policy changes and voter information campaigns are not enough to address problems rooted in Facebook’s platform design and business strategy. Facebook needs to do more—and lawmakers must be ready to hold it to account.

The recent announcements fail to address several major concerns:

1. Inconsistent moderation of politicians: Zuckerberg claims that content that incites violence or spreads false information about voting processes is against Facebook’s policies and will be removed regardless of who it comes from—yet when President Trump called for violence against protesters and spread false information about voting by mail, the company chose to do nothing, arguing that its policies did not apply. In other cases, Facebook has removed advertising by the president’s re-election campaign, such as when it spread misleading information about the Census or, more recently, used Nazi iconography—but only after pressure from reporters and civil rights groups; the company had approved the ads to run in the first place. The gaps and glaring inconsistencies in the design and enforcement of its moderation policies demonstrate that Facebook is more concerned with avoiding political reprisal than with protecting democratic expression.

Saying that it will now label problematic content from “newsworthy” political figures (instead of leaving it up unchecked) is not enough. Facebook must consistently enforce its moderation policies, including against politicians. And it must update its policies to reflect that spreading false information about voting by mail and threatening violence against protesters are part of a long tradition of anti-democratic political suppression that should not be given a platform.

2. Invasive microtargeting of political ads: Facebook recently announced that it would allow users to opt out of seeing political ads, blatantly dodging systemic problems with how it allows advertisers to target political communications. Microtargeting enables advertisers to target divisive and harmful messaging to narrow groups of users, or even individuals, including those from marginalized communities. Facebook’s opaque ad delivery system compounds this by serving ads in potentially discriminatory ways. The company’s supposed transparency tools for political advertising offer little insight into who is being targeted or how.

Instead of shifting the burden onto users, Facebook should limit the ways it allows political messaging to be targeted. In addition, it should provide transparency into how political ads are targeted and whom they reach, both on the ads themselves and in its Ad Archive.

3. Disinformation spread by Facebook Groups: Last year, Mark Zuckerberg laid out a new privacy-focused vision for the platform, leading it to increasingly promote Facebook Groups. However, researchers have found that Facebook Groups are hotbeds of disinformation and offer little information about who is behind them. While the company occasionally bans Groups advocating violence, it has not done nearly enough to address the ways its own design promotes such activity.

Facebook needs to increase transparency and moderation of Groups. Users need to know who is behind each Group. Facebook must more proactively deplatform Groups that are large-scale spreaders of dangerous disinformation and users who are coordinating to spread disinformation across Groups. Beyond that, the platform must consider how its algorithm promotes Groups and Pages to users, which runs the risk of funneling them into increasingly extreme communities.

Facebook is able to affect political participation at a massive scale—and right now, it does so without meaningful public accountability. Instead of falling for political theater, we must call on Facebook to address the systemic problems it has given a platform to. Beyond that, we must demand that lawmakers enact rules and standards to hold it, and other major social media platforms, to account.