NEWS

Decoder Newsletter: Facebook Cracks Down on QAnon

Margaret Sessa-Hawkins | August 24, 2020

Produced by MapLight, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we'll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!

  • In a large crackdown on QAnon, Facebook has restricted more than 10,000 groups, pages, or accounts associated with the conspiracy theory. The company has not banned QAnon content outright; instead, it is restricting it by removing pages and groups from its recommendation algorithms and reducing their presence in newsfeeds and search results. The restrictions stem from Facebook’s policy toward movements that have “demonstrated significant risks to public safety.” The FBI has flagged the theory as a potential domestic terrorism threat.

  • Facebook’s move to restrict the conspiracy theory comes after a period of record growth: according to The New York Times, activity around some of the largest QAnon groups has risen by 200 to 300 percent in the last six months as the theory spreads around the globe. In Russia, government-supported organizations are even helping to amplify it. For The Verge, Casey Newton reflects that the crackdown may have come too late, and examines the optimal time to rein in conspiracy theories. Newton also took a look at some of the new ideas being floated to combat other forms of misinformation on the platform (such as those related to coronavirus), including a “circuit breaker” method that the company says it will pilot.

  • Facebook’s campaign contributions to House Judiciary Committee members poured in despite that committee’s antitrust hearings, a new MapLight analysis finds. So far, $1 of every $12 spent by Facebook's political action committee, top executives, and employees during the 2020 election cycle has gone to House Judiciary Committee members. At the end of July, Facebook CEO Mark Zuckerberg testified at an antitrust hearing in front of a Judiciary subcommittee. While Congress cannot break up tech companies, it can pass legislation making it easier for federal agencies to press antitrust complaints against tech monopolies.

  • Coronavirus misinformation has been viewed four times as often as authoritative content on Facebook, according to a new study by the left-leaning human rights group Avaaz. The group estimated that disinformation about vaccines and other health topics has been viewed roughly 3.8 billion times, despite Facebook’s long-touted efforts to combat coronavirus misinformation.

  • There are multiple angles to explore with these numbers, though. For Vice News, David Gilbert compared the data in the report with the platform’s recent success in blocking the spread of the sequel to the viral ‘Plandemic’ video. The New York Times interviewed doctors to explore how misinformation is affecting their jobs -- from treating patients who have tried fake cures to being the subject of online rumors. In New Zealand, meanwhile, Duncan Grieve of The Spinoff looked at social media companies’ role in enabling a malicious coronavirus rumor to go viral -- and the lack of consequences they have faced for facilitating misinformation during a pandemic.

  • Meanwhile, some countries are using the pandemic to crack down on journalism, writes Jenna Hand for Nieman Lab. Nations including Hungary, Romania, Algeria, Thailand, and the Philippines have instituted emergency decrees allowing authorities to imprison people for creating or spreading false information during the pandemic. There are fears that this legislation is being used to limit speech, including by justifying the detention or fining of journalists critical of their governments.

  • On Sunday, Twitter placed a disclaimer on another of President Trump’s tweets about mail-in voting, saying the tweet violated its “civic and election integrity” rules. Since late May, the company has placed disclaimers on multiple tweets in which the President attacked mail-in voting, citing its policies against voter suppression. Facebook, meanwhile, is considering whether it will need to take action on the president’s posts after the election if he uses the platform to try to delegitimize the results. The talks have even included the possibility of a “kill switch” to shut off political advertising.

  • Twitter also released its biannual transparency report last week. The report, housed in the company’s new Transparency Center, touts increased enforcement of its policies. According to the company, this period saw the largest increase to date in actions taken against accounts violating Twitter’s abuse policies. The Transparency Center includes data on information requests, child exploitation and extremism, and accounts that were taken down.

  • A new fact-checking podcast on the African continent is gaining traction. “What’s Crap on WhatsApp,” produced by the fact-checking organization Africa Check in partnership with the podcast company Volume, is a five-to-seven-minute show that analyzes viral rumors. So far, the show has nearly 6,000 subscribers, up from roughly 1,700 in January. Volume CEO Paul McNally said the show’s goal is to be a tool listeners can use to have conversations with family and friends about misinformation.