WASHINGTON – Top Democrats in Congress, who will soon oversee the tech industry's handling of disinformation and lies on its platforms, are questioning whether the moves by Facebook, Twitter and Google to ban President Donald Trump and remove thousands of fake accounts are sufficient, or whether they came too late.
In the days after the Jan. 6 attack by Trump supporters on the U.S. Capitol, Facebook and Twitter initially suspended Trump’s accounts. They later moved to permanently ban him from their platforms. YouTube said it was cracking down on election disinformation by Trump and his supporters on its channels.
Reddit said it had banned a group devoted to Trump, and Twitch, a video platform used mostly by gamers, said it had banned a Trump channel. As thousands of Trump supporters moved to Parler — seen as an alternative to Twitter — Apple and Google first warned the service to step up moderation, then removed it from their app stores. Amazon, whose cloud services hosted Parler, ejected the company from its servers, effectively turning off Parler’s lights.
But several Democrats in Congress said the flurry of action by social media companies came too late.
Sen. Mark Warner, D-Va., who stands to become the chairman of the Intelligence Committee, called the actions to ban Trump from social media too late and not nearly enough. He highlighted additional issues that Democrats say Facebook, Twitter and YouTube, which is owned by Google, have failed to address.
“Disinformation and extremism researchers have for years pointed to broad network-based exploitation of these platforms,” Warner said, adding that they “have served as core organizing infrastructure for violent, far-right groups and militia movements. The platforms, researchers say, have helped groups to recruit, organize, coordinate and, in many cases, generate profits.”
Warner’s committee produced five volumes of reports on Russian interference in the 2016 election, including extensive documentation of how the Kremlin had used social media companies to target Americans and help Trump win that race.
Social media companies “have done enduring damage to their own credibility,” Sen. Richard Blumenthal, D-Conn., told The Washington Post last week. And the attack on the Capitol “will renew and refocus the need for Congress to reform Big Tech,” he said.
Blumenthal, a member of the Judiciary Committee and a longtime critic of Big Tech, told the Post that he blames the companies “for ignoring repeated red flags and demands for fixes.”
Rep. Bennie Thompson, D-Miss., who chairs the House Homeland Security Committee, said he couldn’t help but wonder if the decision by social media companies to crack down on Trump “was an opportunistic one, motivated by the news of a Democratic-controlled Congress.”
Tech industry representatives defended their members’ actions. “This was not a fourth-dimensional chess move with regard to politics, this was the platforms deciding to do what was best for their users,” said Carl Szabo, general counsel of NetChoice, which lobbies on behalf of technology companies including Facebook, Twitter and Google.
Nevertheless, the flurry of action by tech companies since Friday is unlikely to buy them goodwill with the party on the cusp of controlling the House, Senate and White House when President-elect Joe Biden takes office Jan. 20.
The challenge for Congress is to figure out how to address a series of interrelated issues that affect how the tech industry and social media companies function in the future, said Emma Llanso, director of the Free Expression Project at the Center for Democracy and Technology.
One item sure to be on the agenda is a provision of a 1996 law, known as Section 230, that shields online companies from lawsuits over content posted by individual users and allows them to moderate their own websites. Although Democrats say social media companies have used the protection to escape responsibility for hosting violent content, Republicans have said the tech companies have used the liability protection to stifle right-wing voices.
“I don’t really see the kind of gap between the very partisan divide on Sec. 230 closing in response to any of this,” Llanso said, referring to the Jan. 6 attack and its aftermath. “There are no easy legislative fixes, and there’s not one kind of idea for members of Congress to coalesce around,” she said.
Early comments from lawmakers show that both sides are deeply entrenched in their respective positions, Llanso said. Democrats see social media companies acting too late, and Republicans see the ban on Trump and the shutting down of Parler as an anti-conservative bias in the tech industry, she said.
In the 116th Congress, Blumenthal and five other Senate Democrats backed an effort by Judiciary Chairman Lindsey Graham, R-S.C., and 10 other Republicans that would have stripped social media companies of their Section 230 immunity if they did not comply with content moderation best practices set by a group of government officials.
Another bipartisan bill in the 116th, by Senate Majority Whip John Thune, R-S.D., and Sen. Brian Schatz, D-Hawaii, would have allowed companies to retain immunity as long as they post their content moderation practices online and explain the decision-making process behind removed content.
Both bills could be reintroduced in the new session and more could be on the way. Rep. Jan Schakowsky, D-Ill., who chairs the House Energy and Commerce subcommittee on consumer protection, is planning swift action on legislation that would limit Section 230 privileges for companies that fail to enforce their own terms of service, according to an Axios report.
Some companies, including Facebook and Twitter, have indicated that they would support minor changes to Section 230, but the industry remains largely opposed. Szabo said last week’s events demonstrated the importance of Section 230 because it empowers platforms to move quickly and decisively to remove objectionable content.
Democrats also could face opposition from civil liberties activists who argue that paring Section 230 would erode free speech and expression online. Evan Greer, deputy director of the digital rights group Fight for the Future, said gutting Section 230 to punish social media companies would be utterly counterproductive.
The use of social media platforms by right-wing extremists to organize the attack on the Capitol also may lead to Congress revisiting the debate over online surveillance, Llanso said.
After the rise of the Islamic State in the Middle East in 2012 and 2013, Congress held hearings on what social media companies ought to do about stopping the terrorist group from spreading its propaganda and recruiting members, Llanso said.
But Congress didn’t come to any clear conclusion about what online platforms should do in confronting al-Qaeda and the Islamic State, she said, because it is “extremely messy” to draw a line on what is and is not acceptable speech without unfairly targeting people.
The coming debate also may encompass discussions on the role of law enforcement agencies in conducting online surveillance of domestic terrorists and extremists, similar to the one Congress had after the 9/11 attacks, Llanso said.
Congress passed the anti-terror law known as the Patriot Act, which gave sweeping surveillance powers to intelligence agencies and the FBI, powers later revealed to have been abused. After former National Security Agency contractor Edward Snowden leaked details of secret surveillance programs, Congress was forced to amend the Patriot Act and pass restrictions.
On Monday, the FBI warned that armed protests were being planned in all 50 state capitals and at the U.S. Capitol from Jan. 16 through Jan. 20.
Lawmakers’ positions and the debate about the tech industry could shift further depending on what happens between now and Inauguration Day, Llanso said, and that could “fuel the legislative activity that we’ll see in Congress in coming months.”