Decoder Newsletter: Kamala Harris and Digital Disinformation

Margaret Sessa-Hawkins | August 17, 2020

Produced by MapLight, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we'll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!

  • Kamala Harris is already facing digital disinformation. Numerous false stories have surfaced and resurfaced on social media since Joe Biden named the California senator as his running mate last Tuesday, and her Wikipedia page was briefly vandalized. The stories are examples of the extra harassment women -- and especially women of color -- face online. In response, a coalition of women’s groups has pledged to refute sexist and false information as it surfaces, while a separate group of female lawmakers called on Facebook to do more to combat disinformation targeting women leaders online.

  • Twitter plans to expand its misinformation policies to cover mail-in and early voting, a move that could spark further clashes with President Trump. The company told Politico it will try to implement rules covering all mischaracterizations of those processes. The move comes amid increasing scrutiny of the President’s continuing efforts to sabotage mail-in voting, including denying the USPS aid money, having the agency deactivate mail sorting machines, and appointing a prolific Republican fundraiser as postmaster general. In concert with these efforts, the President has repeatedly and falsely linked mail-in voting to fraud in social media posts.

  • Facebook announced a number of new measures meant to combat racism. In an update to its hate speech policies, blackface and antisemitic stereotypes are now banned on the platform. Facebook still allows Holocaust denial, however, and a recent investigation by a UK-based anti-extremism group found that the company’s algorithm actually promotes it. The #StopHateForProfit campaign also issued a damning indictment of the company, releasing a progress report showing that a month into the campaign, Facebook still has a long way to go in addressing civil rights issues on the platform. Reddit, meanwhile, also banned a racist subreddit, but only after co-founder Alexis Ohanian called it out on Twitter.

  • Marjorie Taylor Greene, a supporter of the QAnon conspiracy theory, is likely headed to Congress after winning the primary for Georgia’s heavily Republican 14th Congressional District. Meanwhile, CNN has examined which other congressional candidates potentially support the conspiracy theory, and an internal Facebook investigation has found thousands of QAnon groups and pages with a combined following of more than three million. In The New York Times, Kevin Roose looks at QAnon’s effect on groups that are actually fighting sex trafficking.

  • Major tech companies have expanded a coalition to safeguard the election from foreign interference and digital disinformation. The group, which has been meeting with representatives from agencies such as the FBI and Department of Homeland Security, already included tech giants such as Google, Microsoft, Twitter and Facebook. Now it also includes the Wikimedia Foundation, LinkedIn, Reddit and Verizon Media. Many big tech companies -- including Google, Twitter, Snapchat and Facebook -- have also announced new voter initiatives meant to counter digital disinformation around elections; however, these initiatives still fall far short of what is needed to fully address the growing problem of election disinformation.

  • Around the world, Facebook has been facing criticism for its lax response to harmful content. In Papua New Guinea, users were concerned after the company failed to swiftly remove an image that was seen as potentially encouraging child abuse. In Egypt, a gay man was beaten up after a TikTok video of him in front of a rainbow flag was posted on Facebook without his knowledge. The man sent the company multiple requests to remove the post, all of which were ignored. One of the issues, according to Jillian York of the Electronic Frontier Foundation, is that the platform has “a dearth of content moderators in a number of languages where they should have a lot more.”

  • Trump has extended the time ByteDance, the Chinese parent company of TikTok, has to divest its U.S. operations. In a new executive order, issued Friday, the President gave the company 90 days, instead of the initial 45, to find a U.S. buyer. One day after issuing the new executive order, the President joined Triller, a TikTok rival. Another TikTok rival, Bigo, has stated that it will switch its servers from Hong Kong to Singapore to avoid a potential crackdown. Trump has declared that he is looking at banning other Chinese-owned companies, including, potentially, Alibaba. 

  • Trying to ensure state-run media outlets don’t contribute to digital disinformation is a thorny issue. Courtney Radsch of the Committee to Protect Journalists takes an in-depth look at why in a recent analysis exploring how the different platforms decide what is and isn’t ‘state-run’, what rules each platform applies to such outlets, and what questions and controversies arise throughout the process.