A pair of legislators have a plan to save Section 230: kill it so that Congress is forced to come up with a better version.

That was the topic of discussion at a hearing on Wednesday in the House Energy and Commerce subcommittee on communications and technology. It came on the heels of the committee leaders’ proposal for the sunset, which they announced in a Wall Street Journal op-ed last week. E&C Chair Cathy McMorris Rodgers (R-WA) and Ranking Member Frank Pallone (D-NJ) want to give Congress 18 months to come up with a new framework to replace Section 230 or risk losing it entirely. The idea is to force their colleagues to do something to change the law that’s been the subject of bipartisan ire for years.

“Big Tech lobbied to kill them every time. These companies left us with no other option.”

“Our goal is not for Section 230 to disappear,” McMorris Rodgers said at the hearing. “But the reality is that nearly 25 bills to amend Section 230 have been introduced over the last two Congresses. Many of these were good-faith attempts to reform the law, and Big Tech lobbied to kill them every time. These companies left us with no other option.”

Section 230 of the Communications Decency Act is the law that protects social media platforms from being held responsible for what their users post. It’s also what enables the platforms to moderate content on their services how they see fit, without fearing that doing so will land them in a lengthy legal dispute.

While industry players say this is essential for how the internet operates — keeping the most abhorrent content off of mainstream services while allowing for mostly open conversations and giving smaller platforms a shot at existence without being drowned in legal fees that larger platforms are able to shoulder — many policymakers have soured on the law as tech companies have grown in power.

Republicans and Democrats often have very different ideas of how exactly the law should change. Republicans who support Section 230 reform often want platforms to have fewer protections for their content moderation decisions to combat what they see as censorship of conservative views, while Democrats who support reform tend to want platforms to moderate or remove more content, such as disinformation. These days, however, both sides seem open to changes that could further protect children on the internet, as proposals like the Kids Online Safety Act have gained steam.

“I do really worry that’s the unintended consequence here”

Wednesday’s hearing showcased both sides of the 230 discussion, inviting experts engaged in advocacy both for and against reform to field questions about how far Congress should go in making changes. Kate Tummarello, executive director of startup advocacy group Engine (which has received funding from Google), shared her experience seeking out reproductive care information in online communities while experiencing a pregnancy loss two years ago. Following the Dobbs v. Jackson Women’s Health Organization decision that overturned Roe v. Wade, Tummarello said the same online communities she turned to for emotional and practical support shrank, and she saw women express fear of posting about seeking reproductive care online.

“I don’t think anyone’s intending to repeal 230 to get at women like me,” Tummarello said. “But I do really worry that’s the unintended consequence here. And I don’t want the women who are dealing with this today and in the future to not have the resources I had, to have the community that I was able to lean on. Because, again, to me, it was life-saving.”

But victims’ rights attorney Carrie Goldberg and Organization for Social Media Safety CEO Marc Berkman said throughout the hearing that the fear that Section 230 repeal would lead to a tsunami of lawsuits was overblown. “When we’re talking about content removal … there still has to be a cause of action,” Goldberg said. She added that it can still take years to go through the courts. “There’s not going to be some sort of mythical rush to the courthouse by millions of people because there has to be an injury,” she said.

But Tummarello said that even without successful legal cases, stripping 230 protection could make it easier to pressure tech companies into removing lawful speech. “Absent 230, it’s not that the platforms would be held liable for the speech, it’s that the platforms could very easily be pressured into removing speech people don’t like,” she said.

Lawmakers want to hear how Section 230 will collide with generative AI

Lawmakers were also curious how Section 230 would apply to content created by generative AI and whether that should update their understanding of it. Goldberg said, “Section 230 right now is going to be used by all these companies as a reason to not be held liable” in the early days of the technology. “We’re really early in the roll-out here and we’ve already seen extremely concerning examples,” Berkman said, pointing to Snapchat’s AI bot, which came under fire for engaging in mature conversations when the platform hosts many minors. Tummarello said startups are also using AI to find and moderate harmful content.

Even though many members seemed open to some level of 230 reform, not everyone appeared on board with the proposed sunset of the protections — and it’s not yet clear whether the proposal can gain wide support. McMorris Rodgers would need to bring it to a vote before the full committee for it to have a shot at reaching the floor under normal operating procedure.

“This is where I start to have a problem,” said Rep. Jay Obernolte (R-CA). “It seems like the premise of repealing 230 is that the world would be a better place if we just all sued each other more often.” Obernolte said that rather than “rely on the indirect route of the threat of being sued,” Congress should outlaw the specific acts they don’t want to occur. “This is something that we can solve a different way than just expanding liability.”
