Decoder Newsletter: Facebook Introduces More Bans

Margaret Sessa-Hawkins and Viviana Padelli | October 12, 2020

Produced by MapLight, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we'll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!

  • Q-ban: Facebook expanded its restrictions on accounts espousing the QAnon conspiracy theory last week. Ann Ravel, MapLight’s Digital Deception Project Director, called the move “a blaring reminder of how the integrity of our democracy currently depends far too heavily on the goodwill of a handful of technology and social media companies and their CEOs,” and said it was another indication that it is Congress, not tech companies, that must stem the spread of disinformation.

  • But will it work?: The ban might also be a case of too little, too late. The Telegraph found that Facebook had already allowed ads promoting the conspiracy theory to be shown 2.4 million times, The Guardian pointed out that the move will omit several high-profile Australian accounts, and iNews reported that adherents of the conspiracy theory are now jumping to other platforms. The founder of 8chan, Fredrick Brennan, told CNN’s Donie O’Sullivan that he is worried about QAnon supporters turning violent in the wake of the election, and that Facebook should have taken action long ago. In The New York Times, Kevin Roose examined why conspiracy theories are particularly addictive right now, and for Insider, sex-trafficking survivors explained why the conspiracy is particularly infuriating for them.

  • Post-election ad ban: Facebook also announced that it would be indefinitely banning political ads following the presidential election. (Google had already stated that it would stop political ads after the election, while Twitter has introduced its own suite of temporary changes -- including giving users a ‘timeout’ before spreading information, warning users when they share inaccurate content, and appending labels to any posts claiming election victory before results are announced -- meant to slow the spread of electoral disinformation). 

  • Reactions: Over at Wired, Evan Greer reminds everyone that political ads account for a very small percentage of disinformation, and the New Yorker looks at why Facebook doesn’t actually want to stop the spread of disinformation from powerful figures in particular. Meanwhile, David Gilbert of Vice reports that Facebook’s fiercest critics, the so-called ‘Real Facebook Oversight Board’, were forced offline after the platform apparently complained that their site was involved in phishing.

  • Holocaust denialism banned: Facebook has also (finally) decided to ban any posts that deny the Holocaust. The social media network had long resisted calls to ban Holocaust denialism, but Mark Zuckerberg explained his about-face on the issue by saying he was motivated by seeing information about the rise in anti-Semitic behavior.

  • Charity?: Meanwhile, in Recode, Theodore Schleifer looks at how the Chan/Zuckerberg $250 million donation to election offices has raised some thorny issues. The article explores what it means when philanthropy begins compensating for state funding, as well as the fact that false rumors about the donations being partisan are (somewhat ironically) spreading on social media. The odds are that the Chan/Zuckerbergs will continue to have enough money for big philanthropy projects like this for a while, though: CNBC recently reported that with Twitter and Google limiting political ads to stop the spread of disinformation, Facebook is reaping the benefits of a fairly unregulated marketplace.

  • Facebook and kidnapping plots: A plan to kidnap Michigan Governor Gretchen Whitmer was discussed on Facebook, according to an FBI affidavit. The group involved recruited new members by getting in touch with a militia known as the Wolverine Watchmen, writes Taylor Hatmaker in TechCrunch. The Wolverine Watchmen were later removed in Facebook’s purge of the Boogaloo movement. In The New York Times, Charlie Warzel suggests that Facebook needs to ‘break itself’ in order to stop conspiracy groups from coalescing on the platform.

  • Discrimination & social media: Lest we forget that social media can be a cesspool, there were a few stories this past week examining the role it plays in discrimination. The Washington Post took a look at the mechanics of how racist and sexist attacks against Democratic Vice Presidential candidate Kamala Harris spread online. A new Swedish report found that nearly a third of all social media posts about Jews are hostile, while the Anti-Defamation League examined some of the vitriol Jewish lawmakers in particular receive. 

  • Content moderation suppressing minority voices: The problem of racism and social media is not only one of vitriolic attacks, however. In The New York Times, Ashanti M. Martin looked at how Black voices have recently become more prominent on LinkedIn. This newfound involvement, however, has also led to many posts being taken down or accounts frozen for violating LinkedIn’s vague ‘decorum’ rules. Black business owners are seeing a similar problem on Instagram, where the ad-approval algorithm often flags product advertisements as relating to ‘social issues’. (A new study in the journal Philosophy and Technology points out many of the endemic problems with algorithm-driven content moderation).

  • Misinformation hearing: On Tuesday, the House Administration subcommittee held a hearing on combating misinformation and voting. Witnesses including U.S. Election Assistance Commission Chair Benjamin Hovland urged Congress to provide more resources for state and local election officials to educate voters, while Spencer Overton of the Joint Center for Political and Economic Studies spoke about how foreign and domestic actors are using online disinformation to specifically target and suppress Black votes. Addressing the same issue, Rep. Lauren Underwood (D-Ill.) sent letters to the CEOs of Facebook, Twitter, and YouTube on Tuesday expressing concern that social media would be used as a tool to suppress the vote of Black individuals.

  • Antitrust report release: Congress released its long-awaited antitrust report Tuesday. Though the report was meant to be bipartisan, disagreements over how to handle monopolistic behavior rose to the point that Republicans released their own alternative report. One of the more interesting aspects of the majority’s report, as outlined in CNN, was that big tech companies are using “the vast amounts of data they've gathered on consumers and other businesses in order to muscle out rivals.” Politico has a nice roundup of political opinions on the report, while Alec Stapp writes in MIT’s Technology Review that there isn’t really a case for breaking up the companies. Instead, he argues that ‘sunshine laws’ like the Honest Ads Act and direct subsidies for local news are needed to increase transparency and halt the spread of digital disinformation. The European Commission, meanwhile, is finalizing a far-reaching legislative package aimed at regulating large tech firms.

  • Climate disinformation: Climate disinformation ads had 8 million impressions, according to a new analysis by InfluenceMap. The analysis looked at 51 ads that ran over a six-month period. The ads were run by well-known climate disinformation groups including PragerU, the Mackinac Center for Public Policy, and the Competitive Enterprise Institute. Of the 51 ads, only one had been taken down by Facebook, and some were still running as of the first of October.

  • Who regulates political speech?: Should big tech companies be regulating political speech? The Centre for International Governance Innovation took on that question by asking five experts for their opinions. While their answers varied, most agreed that there needs to be more transparency around the moderation process, and that the current patchwork of regulations across platforms makes reining in disinformation extremely difficult.