“Facebook wants you to get caught up in a long, drawn out debate over the minutiae of different legislative approaches. Please don’t fall into that trap,” Haugen testified at a hearing by a House Energy and Commerce subcommittee. “Time is of the essence. There is a lot at stake here. You have a once-in-a-generation opportunity to create new rules for our online world. I came forward, at great personal risk, because I believe we still have time to act. But we must act now.”
After Haugen presented evidence in October that Facebook's systems amplify online hate and extremism and fail to protect young users from harmful content, lawmakers proposed legislation.
Her previous disclosures energized legislative and regulatory efforts around the world aimed at cracking down on Big Tech, and she recently appeared before European lawmakers and officials who are developing rules for social media companies.
Haugen, a data scientist who worked as a product manager in Facebook's civic integrity unit, backed up her claims with a massive trove of internal company documents that she secretly copied and gave to federal securities regulators and Congress.
She made her first public appearance this fall, laying out a broad condemnation of the social network giant before a Senate Commerce subcommittee and sharing her thoughts on how Facebook's platforms could be made safer, as well as prescriptions for congressional action. She rejected the idea of dismantling the tech behemoth, as many lawmakers have urged, in favor of targeted legislative solutions.
They include, most notably, new restrictions on long-standing legal protections for speech posted on social media platforms. Both Republican and Democratic lawmakers have called for repealing some of the safeguards provided by Section 230, a provision of a 25-year-old law that shields internet companies from liability for what users post.
"Let's work together on bipartisan legislation because we can't wait any longer," said Rep. Mike Doyle, D-Pa., chairman of the subcommittee on communications and technology. He said the tech giants want nothing more than partisan bickering and dithering over legislation.
Facebook and other social media companies use computer algorithms to rank and recommend content, controlling what appears in users' news feeds. Haugen's proposal would strip the legal safeguards in cases where algorithm-driven promotion of content favors massive user engagement over public safety.
"Facebook will not change until the incentives change," Haugen told the House Energy and Commerce Committee. "I hope you guys take action because our children deserve so much better."
That's the idea behind the Justice Against Malicious Algorithms Act, which senior House Democrats introduced about a week after Haugen testified before a Senate committee in October. The bill would hold social media companies accountable by removing Section 230 protection for personalized recommendations deemed harmful to users. A platform that "knowingly or recklessly" promoted harmful content would lose its immunity.
Rep. Frank Pallone, D-N.J., who chairs the full Energy and Commerce Committee, said a proposal from the committee's senior Republican, Rep. Cathy McMorris Rodgers of Washington, isn't identical to the Democrats' bill but is a good starting point for potential compromise.
"Big Tech should not be the arbiter of truth," Rodgers said, reiterating conservatives' claims that social media platforms censor conservative viewpoints. Conservatives would be able to challenge the platforms' content decisions under Rodgers' proposal.
All of the legislative proposals face a long road to final enactment by Congress.
Some experts who support stricter social media regulation believe the Democrats' legislation, as written, may have unintended consequences. It doesn't make it clear enough which specific algorithmic behaviors would result in the loss of liability protection, they argue, making it difficult to see how it would work in practice and leading to widespread disagreement about what it might actually do.
Facebook's parent company, recently renamed Meta Platforms, has declined to comment on specific legislative proposals. The company says it has long advocated for updated regulations.
Meta CEO Mark Zuckerberg has proposed changes that would grant legal protection to internet platforms only if they can demonstrate that their systems for detecting illegal content are up to date. That requirement, however, may be harder for smaller tech companies and startups to meet, prompting critics to argue that it would ultimately benefit Facebook.
Other social media companies have urged caution on any changes to Section 230.
