Pallone Opening Remarks at Hearing on Section 230 Legislation to Hold Big Tech Accountable


The following press release was published by the House Committee on Energy and Commerce on Dec. 1, 2021. It is reproduced in full below.

Energy and Commerce Chairman Frank Pallone, Jr. (D-NJ) delivered the following opening remarks at today’s Communications and Technology Subcommittee legislative hearing titled, “Holding Big Tech Accountable: Targeted Reforms for Tech’s Legal Immunity”:

Today’s hearing is the first of two in which this Committee will discuss legislative reforms to hold social media companies accountable. We have two panels today. The first will focus on the insidious problems from which some social media platforms are profiting. And the second will consider how reforms to Section 230 of the Communications Decency Act can play a part in addressing those problems. Then, next week, in a Consumer Protection and Commerce Subcommittee hearing, we will discuss how consumer protection-focused proposals can increase these companies’ accountability to the public.

These two legislative hearings come after years of repeated, bipartisan calls for online platforms to change their ways. Since 2018, we’ve held six hearings examining tech platform accountability, and our members have sent countless letters.

The most prominent online platforms have repeatedly feigned ignorance before this Committee, but our suspicions have unfortunately been confirmed again and again, most recently by former Facebook employee Frances Haugen. We’ve learned how the platforms downplayed research showing that teen girls were especially vulnerable and suffering online. We’ve learned how executives knew their algorithms amplify harmful and divisive content and rejected proposals to fix the issue. We’ve seen a pattern of platforms highlighting COVID-19 misinformation, conspiracy theories, and divisiveness. We’ve learned that, during a civil rights audit, one platform failed to disclose that its algorithms disproportionately harmed minority groups. For years now, these platforms have acted above the law and outside the reach of regulators and the public, and it is time for that to change.

The legal protections provided by Section 230 of the Communications Decency Act have played a role in that lack of accountability by stopping victims from having their cases heard. In one recently filed suit, a video chat platform commonly used for online sex between users paired a young girl with a middle-aged man. He convinced her to send nude photos and videos of herself, in part by blackmailing her. This man forced her to engage in sexual performances for himself and his friends and even to recruit others. Based on court precedent, Section 230 may very well threaten justice for this young girl. I hope it does not, because the platform was responsible for pairing the young girl with the middle-aged man.

Judges and a whole host of diverse interests, including many of our witnesses, have suggested that courts may have interpreted Section 230 more broadly than Congress intended and have urged reform.

To be clear, Section 230 is critically important to promoting a vibrant and free internet, but I agree with those who suggest the courts have allowed it to stray too far. Judge Katzmann, the late Chief Judge of the Second Circuit, brought some clarity to this issue in his dissent in Force v. Facebook. He stated that Section 230 does not and should not bar relief when a plaintiff brings a claim that is based not on the content of the information shown but rather on the connections a platform’s algorithms make between individuals.

Of course, that was not the court’s ruling in that case, and the challenge for us is to clarify the statute, if the courts do not, while ensuring that we balance the statute’s good against the pain it inflicts.

Today, we will consider four proposals that would amend or clarify Section 230 to protect users while promoting open and free online dialogue. These bills do not impose liability on the platforms and do not directly restrict the content that platforms make available. They simply limit the Section 230 protections in certain circumstances, including when platforms use algorithms to amplify certain content. These targeted proposals for reform are intended to preserve the benefits of vibrant, free expression online while ensuring that platforms cannot hide behind Section 230 when their business practices meaningfully contribute to real harm.

I am disappointed that my Republican colleagues chose not to introduce the discussion drafts they released in July so they could be included in today’s hearing. In order to actually pass legislation that will begin to hold these platforms accountable, we must work together, and I urge my colleagues not to close the door on bipartisanship for an issue that is so critical.

After all, I believe there is more that unites us than divides us on clarifying Section 230. For example, Ranking Member Rogers’ discussion draft includes a provision similar to my Justice Against Malicious Algorithms Act, in that her proposal would clarify that Section 230 immunity does not apply to algorithmic recommendations. While the proposals aren’t identical, this is a place for us to start what I hope could be bipartisan work.

The time to act is now. Congress must come together to hold these companies accountable for making the internet a safer place.

Source: House Committee on Energy and Commerce
