WASHINGTON — Ex-Facebook employee and whistleblower Frances Haugen implored lawmakers Wednesday to avert the usual congressional stalemates as they weigh proposals to curb abuses on social media platforms by limiting the companies’ free-speech protections against legal liability.
“Facebook wants you to get caught up in a long, drawn out debate over the minutiae of different legislative approaches. Please don’t fall into that trap,” Haugen testified at a hearing by a House Energy and Commerce subcommittee. “Time is of the essence. There is a lot at stake here. You have a once-in-a-generation opportunity to create new rules for our online world. I came forward, at great personal risk, because I believe we still have time to act. But we must act now.”
Lawmakers introduced the proposals after Haugen made the case in October that Facebook’s systems amplify online hate and extremism and fail to protect young users from harmful content.
Her previous disclosures have energized legislative and regulatory efforts around the world aimed at cracking down on Big Tech, and she made a series of appearances recently before European lawmakers and officials who are drawing up rules for social media companies.
Haugen, a data scientist who worked as a product manager in Facebook’s civic integrity unit, buttressed her assertions with a massive trove of internal company documents she secretly copied and provided to federal securities regulators and Congress.
When she made her first public appearance this fall, laying out a far-reaching condemnation of the social network giant before a Senate Commerce subcommittee, she shared how she believes Facebook’s platforms could be made safer and offered prescriptions for action by Congress. She rejected the idea of breaking up the tech giant as many lawmakers are calling for, favoring instead targeted legislative remedies.
Most notably, they include new curbs on the long-standing legal protections for speech posted on social media platforms. Both Republican and Democratic lawmakers have called for stripping away some of the protections granted by a provision in a 25-year-old law — generally known as Section 230 — that shields internet companies from liability for what users post.
“Let’s work together on bipartisan legislation because we can’t continue to wait,” said Rep. Mike Doyle, D-Pa., the chairman of the communications and technology subcommittee. The tech giants want nothing more than partisan division and dithering over the legislation, he said.
Facebook and other social media companies use computer algorithms to rank and recommend content, governing what shows up in users’ news feeds. Haugen’s idea is to strip those protections in cases where algorithm-driven recommendations prioritize massive user engagement over public safety.
“Facebook will not change until the incentives change,” Haugen told the House panel. “I hope that you guys act because our children deserve much better.”
That’s the thought behind the Justice Against Malicious Algorithms Act, which was introduced by senior House Democrats about a week after Haugen testified to the Senate panel in October. The bill would hold social media companies responsible by removing their protection under Section 230 for tailored recommendations to users that are deemed to cause harm. A platform would lose the immunity in cases where it “knowingly or recklessly” promoted harmful content.
Rep. Frank Pallone, D-N.J., who heads the full Energy and Commerce committee, said a proposal from its senior Republican, Rep. Cathy McMorris Rodgers of Washington, isn’t identical to the Democrats’ bill but represents a good start for potential compromise.
“Big Tech should not be the arbiter of truth,” Rodgers said, renewing conservatives’ assertions that social media platforms censor conservative viewpoints. Rodgers’ proposal would allow users to challenge the platforms’ content decisions.
All of the legislative proposals face a long road to final enactment by Congress.
Some experts who support stricter regulation of social media say the Democrats’ legislation as written could have unintended consequences. They suggest it does not specify clearly enough which algorithmic behaviors would trigger the loss of liability protection, making it hard to see how the bill would work in practice and inviting wide disagreement over what it would actually do.
Meta Platforms, the new name of Facebook’s parent company, has declined to comment on specific legislative proposals. The company says it has long advocated for updated regulations.
Meta CEO Mark Zuckerberg has suggested changes that would only give internet platforms legal protection if they can prove that their systems for identifying illegal content are up to snuff. That requirement, however, might be more difficult for smaller tech companies and startups to meet, leading critics to charge that it would ultimately favor Facebook.
Other social media companies have urged caution in any legislative changes to Section 230.