Produced by MapLight, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we'll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!
- Disinformation spreads: Although Joe Biden was finally projected as the winner of the presidential election Saturday, delayed election results created an atmosphere ripe for disinformation. Trump, along with his family and surrogates, retweeted local narratives alleging electoral fraud, advanced conspiracy theories, and helped drive a ‘dramatic’ increase in electoral disinformation on Twitter. In one case, a Georgia poll worker was forced to go into hiding after false information spread about him on Twitter. Facebook also saw an increase in disinformation and ‘worrying activity’ around the election.
- Platforms put to the test: In many cases, the platforms took a more proactive approach to this disinformation than they have in the past. The New York Times tracked Twitter’s moderation of Trump’s tweets. Facebook blocked select hashtags, tightened its policies on election disinformation, and has started putting some groups on probation. YouTube, however, took a more lax approach and saw a greater spread of disinformation, and TikTok has also become an unexpected source of false election claims. In Protocol, Issie Lapowsky examines whether these measures made a substantial impact. In the New York Times, Kevin Roose reminds everyone that Facebook and Twitter only reduced disinformation by making their platforms perform worse.
- Platform jumping: Following action from Facebook and Twitter, some conspiracy theories began jumping platforms as conservatives organized groups elsewhere in anticipation of a ban. In the wake of the election, conservative pundits are urging their followers to adopt these alternative platforms, leading to a rapid increase in downloads.
- Trump Twitter: In the wake of the election, several Democratic lawmakers called for Trump’s Twitter account to be suspended until the outcome was decided, in order to curtail the spread of disinformation. Nonpartisan groups including Common Cause and the Lawyers’ Committee for Civil Rights Under Law later joined the call. Twitter did not suspend the account, but Kurt Wagner writes in Bloomberg that the company has confirmed Trump will lose his ‘public interest’ protections once he leaves office. One-time Trump surrogate Steve Bannon, meanwhile, has been permanently suspended from Twitter after suggesting that some government officials should be beheaded.
- Speaking of inciting violence: The metric Facebook uses to track ‘violence and incitement trends’ is rising, BuzzFeed News reports: the internal metric, which has not previously been reported, increased 45%. A new survey from Reuters and digital intelligence firm CounterAction also found that violent rhetoric was rife in thousands of political groups on the platform.
- Privacy please: As Wired reports, one of the subplots of this election has been voters’ clear preference for stronger privacy protections. In California, voters passed Proposition 24, which regulates data tracking and collection. In Michigan, there was overwhelming support for Proposal 2, which requires a warrant to access a person’s electronic data. As the Wired article details, the measures have somewhat scrambled traditional partisan divides and suggest that strengthening privacy laws after 2020 could be a complex endeavor.
- QAnon & polling errors?: In the wake of the election there are many questions about why the polls (at least by current tallies) seem to have been so wrong in some states. One study, by researchers at the University of Southern California, suggests part of the answer may lie with the QAnon conspiracy theory. The researchers, “identified a strong statistical correlation between state polls that underestimated Mr. Trump’s chances and a higher-than-average volume of QAnon activity in those states.”
- What next?: In the Columbia Journalism Review, Joel Simon writes that many major tech platforms seem to have finally dropped their guiding principle of avoiding content moderation. He reviews how we got to this point, as well as what the next steps for big tech should be, and examines some interesting proposals for reducing online disinformation.
- Addressing online harassment: President-elect Joe Biden’s plan for ending violence against women includes convening a national task force on the connection between online harassment, mass shootings, extremism, and violence against women. The task force will also consider the thorny issue of platform accountability, as well as reporting requirements and best practices.