Decoder Newsletter: Platform Governance

Hamsini Sridharan | July 16, 2020

Produced by MapLight, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we'll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. If you're interested in receiving the Decoder in your inbox, sign up here.

  • What comes next after Facebook’s civil rights audit? In Recode, Rebecca Heilweil and Shirin Ghaffary enumerate major takeaways from the report released last week (full report here). As they point out, Facebook is not bound to follow the auditors' recommendations; it will be interesting to see what, if anything, sticks. For a refresher on the fast and furious Facebook activity of the past few weeks, check out this timeline from Clare Duffy at CNN. Meanwhile, critics remain concerned that the sheer scale of Facebook’s business means that mounting public pressure on the platform will not be enough to drive meaningful change, as Julia Carrie Wong writes in The Guardian.

  • Facebook fracas continued: For the Washington Post, Elizabeth Dwoskin and Craig Timberg report that the company is now considering banning all political ads in the days before the November election. Meanwhile, in Bloomberg, Bill Allison and Misyrlena Egkolfopoulou observe that the Trump campaign seems to be outpacing the Biden campaign on microtargeted Facebook advertising, even as the president levels new charges of political bias against the platform — this time directed at its voter registration drive, per Eric Newcomer.

  • For CNBC, Salvador Rodriguez reports that Facebook has removed a network of more than 100 accounts and pages tied to Roger Stone that had engaged in “coordinated inauthentic behavior.” The network spent over $308,000 on Facebook ads and had more than 320,000 followers across Facebook and Instagram. (Facebook also removed Stone’s personal social media accounts, according to Davey Alba at The New York Times.) This report from Graphika offers a deeper analysis.

  • What does “coordinated inauthentic behavior” even mean? For Slate, legal scholar Evelyn Douek usefully points out that we’ve come to take the term for granted, but it has no clear definition beyond what platforms say it is in practice. And speaking of platform governance, this piece from John Herrman in The New York Times tackles what (if anything) the recent spate of bans of white supremacists says about whether change is coming for social media. Herrman writes, “Governance-wise, social platforms are authoritarian spaces dressed up in borrowed democratic language. Their policies and rules are best understood as attempts at automation. Stylistically, they’re laws. Practically, and legally, they’re closer to software updates.”

  • CDA 230: For The New Yorker, Anna Wiener explores how Section 230 of the Communications Decency Act connects to the vexing content moderation questions of the last several weeks and discusses some of the alternatives that have been proposed. One recent option, the PACT Act, is considered more moderate than many other proposals, writes Cat Zakrzewski in the Washington Post. The bill, which has bipartisan sponsorship in the Senate, would require tech companies to provide greater transparency for content moderation.

  • In this fantastic webinar, Mutale Nkonde (Stanford Digital Civil Society Lab), LaTosha Brown (Black Voters Matter Fund), Leonard Cortana (NYU), Charlton McIlwain (NYU), and Maria Rodriguez (Hunter College) discuss disinformation that disenfranchises voters of color, including Nkonde’s research on “disinformation creep.” And the Lawfare Arbiters of Truth podcast talks to Brandi Collins-Dexter (Harvard/Color of Change) about her research on Covid-19 misinformation in Black communities, as well as the #StopHateforProfit campaign.

  • Interregnum: For Roll Call, Gopal Ratnam examines how social media platforms are preparing for potential disinformation following the 2020 election. It is possible that increased mail-in voting due to the pandemic will result in a delayed election outcome — creating an information vacuum that is ripe for misinformation and manipulation.

  • The party may stop for TikTok if the Trump administration bans the app — an option it is considering, according to comments by Secretary of State Mike Pompeo reported by Arjun Kharpal for CNBC. In The Verge, Russell Brandom contextualizes concerns about TikTok in light of anxiety about its ties to China. For a deeper dive on those concerns, check out Ben Thompson’s analysis in Stratechery, which looks at TikTok’s algorithm and the risks of the Chinese government accessing user data, censoring content, and pushing propaganda. Meanwhile, for TechCrunch, Connie Loizos notes that Facebook may be the real winner if TikTok is banned, and Bloomberg’s Egkolfopoulou and Shelly Banjo report that the teens who used the platform to organize against Trump’s Tulsa rally are now review bombing his official campaign app.

  • News or political campaign? For Nieman Lab, Duke University researchers Jessica Mahone and Philip Napoli discuss their research showing that hundreds of hyperpartisan sites — funded by candidates, PACs, and party operatives — are posing as local news sites. And in Politico, Alex Thompson examines Courier, a publication launched by the Democratic digital organization Acronym that is blurring the lines between news and dark money political campaigning.

  • Heads up: The CEOs of Apple, Facebook, Amazon, and Google are set to testify before the House Judiciary antitrust subcommittee on July 27 — the first time all four will appear together before Congress, according to Lauren Feiner at CNBC.