Section 230, which turned 25 years old this month, has played a central role in shaping the internet. Over the past year, Congress has introduced several proposals to change the law — some of them drastic — but the bills have often focused on a handful of very large tech companies like Facebook and Google. In reality, Section 230 has created a lot of the web as we know it.

On Monday, March 1st, we’re holding an event on Section 230 and the future of tech regulation. After a keynote from Sen. Amy Klobuchar (D-MN), I’ll be sitting down with Wikimedia Foundation general counsel Amanda Keton, Vimeo general counsel Michael Cheah, and writer and strategist Sydette Harry to discuss how changing Section 230 could change the web. For a broader sense of its impact, however, I also spoke to a range of companies, nonprofits, legal experts, and others with a stake in preserving or reforming the law.


Laurent Crenshaw, head of policy at Patreon

Patreon is an online funding platform for artists, musicians, streamers, and other creatives. Founded in 2013, it supports over 200,000 creators with more than 6 million patrons. Its users include podcaster Joe Budden, musician Amanda Palmer, and news host Philip DeFranco.

How does Section 230 impact Patreon?

Section 230 matters to Patreon every single day, because we are a platform with hundreds of thousands of creators and millions of patrons communicating with each other, and with creators producing new works daily. If we had to proactively identify, monitor, validate, and verify that content prior to posting, it would be incredibly difficult for the platform to exist.

Even little tweaks, like when they talk about removing language around “otherwise objectionable” content, which really is what gives platforms the ability to have their own moderation guidelines — it’s just incredibly concerning.

Because otherwise, we’re talking about a scenario where platforms are encouraged not to have any type of content moderation while still potentially facing liability, or where they face much more litigation without the protection that Section 230 provides.

Let’s say Section 230 gets repealed completely. What would the next year look like for Patreon?

We would do everything within our power to continue to be a platform that funds creators. But at the same time, I think we would have to reassess how we go about deciding who we can fund through the platform, if suddenly we have to consider whether or not we’ll be liable for any content we’re directly funding.

What about more moderate options in bills like the PACT Act, which would do things like require publishing transparent moderation guidelines?

Transparency reports are not the most controversial thing in the world. It’s something that we’re looking at doing in the future as well. Some of the items that are being proposed, like call centers, I think sound better on paper than they would work in practice.

But at the end of the day, I honestly think any proposals lawmakers are considering need to make sure that they aren’t solving problems specific to larger companies, and at the same time, that they don’t throw the baby out with the bathwater by creating a whole new regulatory model that is overly burdensome.


Carrie Goldberg, author and victims’ rights attorney

Carrie Goldberg represents victims of online harassment, stalking, blackmail, and other crimes. She is the author of Nobody’s Victim: Fighting Psychos, Stalkers, Pervs, and Trolls, and she represented Grindr user Matthew Herrick in a high-profile lawsuit that was ultimately dismissed on Section 230 grounds.

What is the core effect that you see Section 230 having online?

It emboldens platforms to give zero fucks about users. Since users can’t hold platforms liable for harms caused, platforms have no incentive to invest in design and operations to reduce or mitigate injuries. And I’m not talking about harms like being called a bitch on Twitter. I’m talking about child abuse victims’ rapes being on porn sites, 1,200 men arriving at your home for a gangbang because somebody is impersonating you on Grindr.

What would be your proposed changes to the law?

First choice is to completely abolish Section 230. Our current civil legal system already has barriers to entry to prevent frivolous suits.

Second choice is to eliminate the immunity for platforms that are solely in the business of being malicious; for instances where a platform is put on notice of a harmful user; and for all causes of action alleging unlawful activity on platforms — especially dissemination of child sexual abuse material, child sexual exploitation, revenge porn, deepfakes, stalking, and known rapists and abusers using dating apps.

What do you think of some of the more common proposed changes to Section 230, like Facebook’s suggestion that sites should be required to release transparency reports?

Facebook’s suggestion that it be required to release transparency reports is not a change to Section 230. Obviously, any proposal set forth by a platform should be regarded with side-eye. I want to throw up any time somebody suggests transparency as a fix. Transparency means nothing if nobody has the right to do anything about all the atrocities these companies are being transparent about.

One of the criticisms of Section 230 reforms is that sites like Facebook can afford to effectively litigate around them, while smaller forums without the same resources will be disproportionately affected. Is that a reasonable fear? If it is, do you see a way to mitigate it?

If you harm your users, you should be sued, no matter how big or small. The solution for small platforms is to honor the people using your product and make it safe. In no other industry is liability for harms based on the size of the business. My law firm of 13 employees can be sued just as easily for harms we cause as a law firm with 2,000 employees. My solution is to honor and cherish my clients and not harm them.

If lawmakers did something like entirely repeal Section 230, how do you practically think the internet would change?

It would improve it.


Katie Jordan, director of public policy and technology for the Internet Society

The Internet Society is a nonprofit founded in 1992 by internet pioneers Vint Cerf and Bob Kahn. It supports efforts to increase internet access worldwide, construct community networks in unconnected areas, and maintain openness and security online.

What’s the Internet Society’s position on changing Section 230?

I think it’s a great goal to prevent misinformation and hate speech online, but I’m really not convinced that Section 230 is the right way to do that. The risks of change, or of unintended consequences, are significant. And I don’t know if we’re having a large enough conversation about what those risks are.

What sort of risks are you worried about?

When the Internet Society approaches intermediary liability protection, we look at the whole stack. Section 230 is not just about platforms like Facebook, Google, and Amazon. It’s also about the underlying infrastructure and the intermediaries that move content and data from one place to another, whether it’s cloud providers, domain name registries, email servers, things like that.

If cloud providers get wrapped up in this conversation about pulling back intermediary liability protection, then by default, they’re going to have to weaken their privacy and security practices, because they’ll have to look at the content they’re storing for you to know whether they’re breaking the law.

How does that translate to actual pieces of legislation?

The more specific we can be about the goal, the better the legislation will be. So if we’re very clear that there is a problem on social media platforms with misinformation, disinformation, and hate speech, then our goal can be to fix those things. But a lot of the legislation we’ve seen is so broad that it would be very difficult to limit it to just those things.

I think it would be fairly easy to say, there is a public concern with a handful of platforms dealing with speech in a way that maybe is not in society’s best interest. Let’s start there and see what we can fix. Because my fear is that if we don’t look very closely at the target, the unintended consequences of this will be so far-reaching that it will do more harm than good.


Yiota Souras, senior vice president and general counsel for the National Center for Missing and Exploited Children

NCMEC is a government-funded nonprofit organization fighting the abduction, abuse, and exploitation of children. Among other work with victims and law enforcement, it operates the CyberTipline, a system for reporting child exploitation on the internet.

How does Section 230 affect your work at NCMEC?

Section 230 denies victims the ability to bring a lawsuit if they have been exploited in certain ways through or on a particular platform. And because social media companies have that immunity, it reduces the potential that companies will do the most they can to make sure child exploitation content is not posted, not distributed, and is taken down right away, because there is really no legal incentive for them to take those extra steps.

Companies do have a lot of non-legal incentives, though, and many of them already put a lot of resources toward this problem. What kind of steps do you think adding legal liability would make them take?

There are a lot of companies, especially some of the very large companies, that engage in really tremendous voluntary measures. But there are a lot of companies that don’t, and there is no legal requirement for them to use any kind of detection or screening. I think the fact that there are some companies that are trying as hard as they can and devoting immense resources and are prioritizing this is great. But they’re just a handful of the companies out there.

There’s been one carveout already, FOSTA-SESTA, which removed liability protections for trafficking and sex work-related material. What has been the impact of that for you?

There have been a lot of ramifications. Backpage.com was taken down before FOSTA-SESTA, and what we’ve seen is that there has been no other central marketplace that’s come up and said “we’re Backpage 2.0.” It’s not normalized in the same way anymore.

So you think the value has been as a deterrent?

Right. And there have been some criminal and some civil cases that have moved forward representing victims against websites — some that involve a lot of claims, including FOSTA-SESTA. And they’re still working their way through court.


Maxine Doogan, president of the Erotic Service Providers Legal Education and Research Project

ESPLERP is a legal advocacy group supporting decriminalization and legal protection for sex workers. It was one of several organizations to oppose FOSTA-SESTA on the grounds that it penalized consensual sex work.

What’s been the impact of FOSTA-SESTA on sex workers?

It’s been a disaster. We lost a whole bunch of our sites, a whole bunch of our advertising. Now we have sites that are charging us huge amounts of money in bitcoin. I think Congress is very conscious of the position that they’re forcing us into, which is that, you know, a bunch of people have lost their jobs. A bunch of people have been forced back onto the streets. A bunch of people have been forced to work together in order to maintain their indoor location because the rents are so high. You could afford it before FOSTA-SESTA, but now you can’t because you have less access to advertising. So now you have to share space.

What do you think of newer efforts to change Section 230?

They’re missing the point, and they’re missing the bigger issues that society has refused to address.

What kinds of issues should lawmakers be focusing on?

They need an anti-discrimination law for all erotic service providers — we are discriminated against, regardless of our legal or illegal status, in housing, child custody, and access to education. Financial institutions are being allowed to systematically deny us access to their services. They need to be providing us with some actual rights with their legislation, instead of continuing to strip our rights away from us.


Al Smith, communications strategist for the Tor Project

The Tor Project is the nonprofit behind Tor, an open-source encryption tool for remaining anonymous online. Tor is used by journalists communicating with whistleblowers, human rights activists living under repressive governments, and millions of others worldwide.

What does Section 230 mean for the Tor Project?

Tor relies on Section 230. Without it, the organization would be vulnerable to lawsuits brought by anyone who believes that a Tor user harmed them. For instance, because of Section 230, if someone uses Tor to protect their identity in order to criticize or report on a government official (something journalists and activists often do), Tor cannot be sued for facilitating or aiding and abetting that speech. Tor is a small nonprofit, and even a suit without merit could easily sink it.

What do you think of some of the most common proposals for changing Section 230?

Overall, these proposals seem to be based on anger at Facebook and Google and a few other tech giants. A huge variety of tools are protected by Section 230, and most of the proposals for changing it will cause tremendous collateral damage to important parts of the internet like Wikimedia, the Internet Archive, and Tor.

There are a few specific proposals on the table, including rules that require services to remove illegal content. How would those affect you?

Because Tor does not host content, and because of the nature of the tool, it is impossible to know every single person and organization that uses its free, open-source technology. Tor could not comply with any “removal” requirements put forth by some proposals that change Section 230.

Any such law must differentiate between tools that store and host content and those that do not — Tor is the latter. Often these proposals are aimed at forcing tools like Tor to weaken or eliminate their use of encryption (e.g., EARN IT Act), and that would be extremely dangerous for all Tor (and all internet) users who need strong encryption to stay safe.

Since Tor is a global service, how much does Section 230 matter for people using it outside the US?

Section 230 is hugely important for people using Tor outside the United States. Many of our users outside of the US rely on Tor because, in order to stay safe and fight for a better world, they require encryption and security — human rights defenders, political dissidents, journalists, whistleblowers, domestic violence survivors looking to escape, and more. Any effort to “reform” Section 230 that makes Tor less secure or provides an opportunity for litigation against it for what its users do could be catastrophic for users around the world.


Mary Anne Franks, president of the Cyber Civil Rights Initiative

Mary Anne Franks is a professor at the University of Miami School of Law as well as the president of the Cyber Civil Rights Initiative, an organization that combats civil rights abuses online. She also drafted the first model criminal statute banning nonconsensual pornography, or “revenge porn,” online.

What has Section 230’s impact been on the internet? What would the internet look like if it had never been passed?

The internet with Section 230 is like an auto industry with no safety regulations or a restaurant industry with no hygiene standards. Imagining an internet without Section 230 requires imagining what kinds of industry standards would have developed over the last 20 years if tech companies had not been assured that they could act as recklessly as they wanted without fear of liability.

How might companies have designed their platforms and services differently if they knew they could be held responsible for failing to take reasonable steps to avoid foreseeable harms? How much less likely would it be that the internet would be controlled by a tiny number of multibillion-dollar corporations wielding power without responsibility?

How deep should this responsibility go? Should infrastructure providers like Cloudflare, web hosts, and domain registrars be responsible for their customers’ content, or for the content their customers allow?

These kinds of questions are impossible to answer in advance because they are so contingent on the facts of particular situations, and Section 230 keeps them from getting to courts where they can be worked out.

In addition to very robust (some, like me, would say overly robust) First Amendment protections, there are lots of challenges to holding a third party responsible for the conduct of someone else. If Section 230 were repealed tomorrow, it wouldn’t mean that Cloudflare or Facebook or any online intermediary would suddenly become liable for user behavior. It would simply mean that, at least some of the time, online intermediaries would have to show up to court and explain why they aren’t liable, and sometimes they would lose.

A lot of content that people complain about, like misinformation, seems like a First Amendment question rather than a Section 230 question.

The First Amendment is not a fixed rule, and what it protects in 2021 is very different from what it protected in 1996 or 1896 or 1789. It is not, despite many Americans’ passionate belief to the contrary, a clear, consistent, or principled doctrine. It has chiefly served to protect powerful groups’ right to speak freely, even and especially when that speech chills the speech of less powerful groups.

The Section 230 debate is so fraught in part because the public has been trained to believe that everything they do online is speech, and that none of this speech can be regulated. But people use the internet for a vast array of activities, from buying toilet paper to making hotel reservations to watching movies, and it has been used by intermediaries to avoid liability for many things that would not qualify as protected speech under the First Amendment.

That problem could be addressed by replacing the word “information” in Section 230(c)(1) with “speech,” making it clear that the law’s protections cannot extend beyond protected speech. That by itself won’t solve all the problems with Section 230, but it would be a way to rein in some of its worst excesses.


Rebecca Tushnet, co-founder of the Organization for Transformative Works

Rebecca Tushnet is a professor at Harvard Law School. She is also a co-founder of the Organization for Transformative Works, a nonprofit that supports fanworks through projects like the Archive of Our Own (AO3), a Hugo Award-winning archive hosting more than 7 million fanworks.

How is Section 230 important for the Archive of Our Own?

Section 230 shapes what happens every day — it allows us to moderate with confidence that, if we make one mistake, the hundreds of times we remove content that violates our policies won’t be held against us. 230 is vital in allowing our volunteers who handle abuse complaints to make reasonable decisions without having a lawyer review every ticket they see.

Some lawmakers want to require sites to remove illegal content within 24 hours. How would that affect AO3?

That seems both not very troubling and completely unresponsive to the things that everyone is complaining about. There is only one site of any note of which I am aware — Ripoff Report — that refuses on 230 grounds to remove content that a court has found illegal (and even it wiggles on that, I believe). Most sites respond to court orders voluntarily, enough so that there’s a real problem of forged court orders used to remove content.

I think this is a big gap in understanding: even most awful, abusive messages on their own are not illegal. If one person sends a stream of abusive messages, maybe that’s harassment, but if 100 different people send one message, none of them are actually liable for criminal harassment unless we change the criminal law. And there are reasons we might do that, but we definitely haven’t yet.

Are there legal changes that could push Facebook or Google to moderate better, but wouldn’t catch smaller operations like AO3 in the crossfire?

I think we’re increasingly seeing really interesting discussions around using antitrust law. So antitrust remedies don’t necessarily have to include breaking somebody up. They can include saying, “Okay, this is how you have to behave.” I have to say, though, I have some — not a great deal, but some — sympathy for Facebook. Because remember the scale they’re dealing with. So even if they’re right 99.9999 percent of the time, the one time that they screw up is going to be bad.

There’s a counterargument that if a site is designed to scale in a way that makes tragedies inevitable, you should just say it’s sort of “unsafe at any speed.”

The question that I have is, what do you want the world to look like? Because let’s be realistic about this: a site that does not make billions of dollars a year in advertising will not have 100,000 people doing content moderation. If you want a full-time staff moderating all content, then you need to figure out how that will be paid for.

If what you’re saying is, we don’t want this volume of speech online — that is a thing to say, but I think it has real consequences for lots of good stuff, too.
