WASHINGTON — Last week was a decidedly bad one for Facebook, as the social media giant faced the dual crises of a server outage on Monday and a whistleblower detailing to senators a day later how the company has knowingly chosen profits over protecting children and curbing misinformation and hate, a charge she also made Sunday during a “60 Minutes” interview on CBS.
The testimony of Frances Haugen, a former Facebook data scientist who revealed her identity after leaking internal documents to the Wall Street Journal, helped to light a path for Congress to crack down on the trillion-dollar company that operates Instagram and the messaging service WhatsApp in addition to its namesake social media platform.
“The company’s leadership knows ways to make Facebook and Instagram safer and won’t make the necessary changes because they have put their immense profits before people,” Haugen told a Senate subcommittee. “Congressional action is needed. They cannot solve this crisis without your help.”
If Congress is going to take action and regulate Facebook, two Washington lawmakers will play leading roles.
Democratic Sen. Maria Cantwell and Rep. Cathy McMorris Rodgers, a Spokane Republican, lead their respective parties on panels charged with regulating tech companies like Facebook — Cantwell as chair of the Senate Committee on Commerce, Science and Transportation and McMorris Rodgers as the top-ranking GOP member on the House Energy and Commerce Committee.
While both Democrats and Republicans have aimed criticism at Facebook, so far the parties have largely disagreed on what to do. Now, Haugen and the internal reports she released have given lawmakers a clearer target: the algorithms that decide what users see on Facebook and Instagram, which the company's own internal research shows amplify misinformation and target teens with ads and posts that harm their mental health.
The company’s impact on kids has been a particular focus of McMorris Rodgers, who asked Facebook CEO Mark Zuckerberg directly in a March hearing, “Has Facebook conducted any internal research as to the effect your products are having on the mental health of our children?”
Zuckerberg replied he believed it had. Haugen told senators the internal research she leaked is likely just the tip of the iceberg.
In July, McMorris Rodgers introduced draft legislation that would strip Facebook and other social media companies of the legal protection that shields them from liability for content posted on their platforms. The proposal also aims to protect children from online harm by mandating a straightforward process for parents to report cyberbullying, creating an annual education campaign about the mental health risks of social media and requiring companies to disclose the mental health impact their products have.
Cantwell, who has called a series of hearings on digital privacy in recent weeks, said in a Sept. 30 hearing with Facebook’s global head of safety that collecting data on children “should have more aggressive attention,” calling for an update to the 1998 law designed to protect kids online.
The effectiveness of that law — the Children’s Online Privacy Protection Act, or COPPA — is limited by a requirement that a company like Facebook have “actual knowledge” that it’s collecting personal information from kids. Its strongest protections also apply only to children under 13.
A proposal from COPPA’s original authors — Sens. Ed Markey, D-Mass., and Richard Blumenthal, D-Conn. — would ban several features for users under 16, including auto-playing videos, push alerts that encourage kids to open apps, badges that reward users for increasing their time on apps and features that count “likes” and followers.
Facebook and Instagram require users to be at least 13 to create an account, but the company has been developing a separate “Instagram Kids” app for those 10 to 12. Facing criticism, the company paused that project on Sept. 27.
In an interview with Cheddar News on Sept. 30, McMorris Rodgers said she was happy that Facebook paused Instagram Kids but asked the company “just to stop.”
“The evidence is clear that these companies are not being transparent,” she said. “They’re not being transparent about the impact of their products on our kids and the risk to our children’s well-being.”
Unlike some of Facebook’s critics, Haugen told senators she didn’t support breaking up the company. Instead, she urged Congress to focus its efforts on the algorithms that control what users see on Facebook and Instagram, something the federal government currently has little visibility into.
“I came forward because I recognized a frightening truth: Almost no one outside of Facebook knows what happens inside Facebook,” Haugen said, likening the situation to the government regulating cars without being allowed to sit in one “or even know that seat belts could exist.”
If Congress is to rein in Facebook, it will take cooperation between both parties. During Haugen’s testimony, that seemed more possible than ever, as Blumenthal noted during the hearing.
“If you closed your eyes, you wouldn’t know if it was a Republican or a Democrat,” he said. “Every part of the country has the harms that are inflicted by Facebook and Instagram.”