Produced by MapLight, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we'll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!
- Election concerns: The election is tomorrow, and President Trump has already told confidants he’ll try to declare a premature victory. With polls finding that a majority of respondents in swing states have seen ads questioning the validity of the election, reports that local election officials are experiencing ‘tsunamis of misinformation,’ and examinations of voting misinformation ‘super-spreader’ pages on Facebook, there is valid concern that social media platforms will amplify his claims. Americans have expressed bipartisan support for prioritizing accuracy over speed in election night results, but they are also afraid of potential violence.
- What’s being done: Most social media platforms have released plans for handling election day. YouTube has said it will remove content that violates its policies, limit the spread of ‘borderline’ content, and promote information from authoritative sources. Twitter announced that it will work to ‘prebunk’ disinformation, and TikTok will limit distribution of content that can’t be verified by fact-checkers. In addition to previously announced policies, Facebook has reportedly prepared measures in case of unrest around the election. The Guardian has a roundup of the steps various social media platforms have implemented to combat electoral misinformation.
- What’s gone wrong: Of course, these election policies are far from perfect. Facebook’s political ad block was riddled with issues, and politicians have found a way around the ad ban: paying influencers. Misinformation has also been spreading on other platforms, with Reuters reporting on disinformation reaching Indian-Americans through WhatsApp and The Washington Post finding disinformation spreading through text messages.
- What happens next: What will the days and weeks after the election look like? The good news: the Senate seems likely to confirm three new members of the Federal Election Commission, giving the group the quorum necessary to consider complaints (there are currently more than 400 on its docket). In the bad-news column, Facebook’s lack of an end date for its ads blackout has raised concerns on both sides of the aisle about fundraising and campaigning in races -- including runoff races -- after the election. The Atlantic looks at how the Trump presidency (whether it continues or not) has permanently changed the internet.
- NYU Ad Observatory: MapLight, along with dozens of other organizations, has signed onto an open letter calling for Facebook to withdraw its cease-and-desist letter against the NYU Ad Observatory. As mentioned in the last Decoder, the Observatory uses a browser plug-in to collect information about ad targeting. However, Facebook has said that because the plug-in ‘scrapes’ data, it violates the company’s terms of service. In Nieman Lab, Andy Sellars refutes this claim.
- Section 230 Hearing: On Wednesday, the CEOs of Google, Twitter, and Facebook were grilled in a Senate Commerce Committee hearing on Section 230. Twitter CEO Jack Dorsey proposed a ‘three-pronged solution to increase transparency and consumer choice,’ while The Information broke down each platform’s view of the liability shield.
- The Takes: Politico had a good summary of the top takeaways from the hearing. Maria Cantwell, the Senate Commerce Committee’s ranking Democrat, released a report detailing the harm big tech’s ‘unfair practices’ are doing to local news. In Slate, Danielle Keats Citron and Spencer Overton argue that the hearing was actually about Republicans reiterating and amplifying claims of bias in order to cow social media platforms into performing less content moderation.
- About that...: Speaking of conservative claims of censorship, a new study from Media Matters for America undermines that claim (again). The Markup also looked at advertising and found that Facebook charged Joe Biden a higher price than Trump for campaign ads (MapLight’s Ann Ravel is quoted in the analysis), while The Washington Post reports that Trump allies are frequently permitted to spread misinformation unchecked. Also in The Washington Post, Steven Johnson, Brent Kitchen, and Peter Gray point out that if there is any bias on Facebook, it’s toward an echo chamber. As they found in their research, the platform’s algorithm is to blame.
- Worth a look: New EU proposals could open platforms’ algorithms up to regulation, while a new campaign by Avaaz highlights the personal harm unregulated algorithms can cause (Facebook has currently suspended algorithmic recommendations for political and social issue groups). Slate examines how Facebook’s content moderation errors interfered with the #EndSARS campaign, and a new study by the think tank ISD provides an overview of “coordinated inauthentic behavior” on Facebook.