Legislators are attempting to weaken the legal safeguards that allow social-media platforms such as Twitter Inc. and Meta Platforms Inc.'s Facebook to host user content without being held liable for it.
That protection was granted in Section 230 of the Communications Decency Act of 1996, a provision that major technology companies regard as a cornerstone of the modern internet.
During a hearing on Section 230 bills on Wednesday, House Energy and Commerce Committee Chairman Frank Pallone (D., N.J.) said the legal protections have played an important role in the growth of the internet, but they have also encouraged companies to promote divisive and harmful content in order to attract more users.
Some of the proposed changes would allow people who have been seriously harmed by social-media algorithms, including teenage girls with body-image issues, to seek restitution in court, Mr. Pallone said.
"For years, these platforms have operated above the law, beyond the reach of regulators and the general public, and it is time for that to change," he said. "The time has come to act."
Other proposed changes would subject tech platforms to liability for content that violates users' civil rights.
"Every day, big tech corporations increase the number of people in our country who believe racial and ethnic violence is a solution, not a problem, and they increase the number of businesses in our country that practice discrimination and systematically exploit consumers," Rashad Robinson, president of Color of Change, a nonprofit civil rights advocacy group, said.
The current immunity could also be lifted in cases of harassment and stalking, wrongful death, and violations of international human rights laws, among other things.
Ahead of Wednesday's hearing, tech groups expressed their displeasure.
"Changing Section 230 is a poor solution to a problem that social media platforms are already working hard to address," said Chris Marchese, counsel at NetChoice, a technology trade association.
According to a report released on Wednesday by NetChoice, social-media companies removed billions of posts in the second half of 2020 to keep their platforms safe.
The hearing also highlighted a potential schism between Democrats and Republicans over Section 230, one that could prove difficult to bridge.
In general, Democrats favor tightening the restrictions on tech platforms, including those on hate speech and misinformation. Republicans, on the other hand, believe that tech companies frequently use these restrictions to censor conservatives. This could cause friction between the two parties when deciding which online harms to target.
"I'm deeply troubled by the path before us," said Rep. Cathy McMorris Rodgers (R., Wash.), the committee's top Republican, referring to some of the bills under consideration sponsored by Democrats. "It begs for more censorship."
Sen. Marsha Blackburn (R., Tenn.) added in a statement that "any legislative action involving Section 230 must make it easier for Americans to freely express themselves."
Final passage of any legislation remains a long way off. In June, a House committee passed legislation to tighten antitrust rules, only for leadership to block the bills before they could be debated on the floor, owing to opposition from tech firms.
A major question looming over all legislative efforts is what, if anything, the closely divided Senate can pass.
Sen. Richard Blumenthal (D., Conn.) said senators are making progress on legislation, including Section 230 bills. He described Wednesday's House hearing as "further evidence of Congress's powerful momentum for addressing big tech concerns."
Frances Haugen, a former Facebook employee who released internal documents showing harms from the company's products, ranging from teenagers' mental-health problems to poisoned political debate, was among those testifying at Wednesday's hearing. Her documents served as the foundation for The Wall Street Journal's Facebook Files series.
In her testimony, she cited examples of Facebook harm, such as teenagers whose mental health is jeopardized by Instagram, healthcare professionals who must deal with the fallout from Covid-19 vaccine conspiracies, and people all over the world who are affected by online radicalization.
"Facebook may not be the source of all of these issues," she said. "However, the company has unquestionably contributed to their deterioration." Facebook is aware of what is going on on the platform, but they are doing far too little about it—in fact, they have incentives for it to stay that way. That is what must change."
According to a Meta spokesman, the Facebook platform aims to provide users with a positive experience and to bring people closer together.
"That's why, even if it hurts our bottom line, we take steps to keep people safe," the spokesman said. "What we need is a set of updated internet rules established by Congress that companies must follow, which is why we've been requesting this for nearly three years."
