The Internet / Government

Should Some Sites Be Liable For The Content They Host? (nytimes.com)

America's lawmakers are scrutinizing the blanket protections in Section 230 of the Communications Decency Act, which lets online companies moderate their own sites without incurring legal liability for everything they host.

schwit1 shared this article from the New York Times: Last month, Senator Ted Cruz, Republican of Texas, said in a hearing about Google and censorship that the law was "a subsidy, a perk" for big tech that may need to be reconsidered. In an April interview, Speaker Nancy Pelosi of California called Section 230 a "gift" to tech companies "that could be removed."

"There is definitely more attention being paid to Section 230 than at any time in its history," said Jeff Kosseff, a cybersecurity law professor at the United States Naval Academy and the author of a book about the law, The Twenty-Six Words That Created the Internet .... Mr. Wyden, now a senator [and a co-author of the original bill], said the law had been written to provide "a sword and a shield" for internet companies. The shield is the liability protection for user content, but the sword was meant to allow companies to keep out "offensive materials." However, he said firms had not done enough to keep "slime" off their sites. In an interview with The New York Times, Mr. Wyden said he had recently told tech workers at a conference on content moderation that if "you don't use the sword, there are going to be people coming for your shield."

There is also a concern that the law's immunity is too sweeping. Websites trading in revenge pornography, hate speech or personal information to harass people online receive the same immunity as sites like Wikipedia. "It gives immunity to people who do not earn it and are not worthy of it," said Danielle Keats Citron, a law professor at Boston University who has written extensively about the statute. The first blow came last year with the signing of a law that creates an exception in Section 230 for websites that knowingly assist, facilitate or support sex trafficking. Critics of the new law said it opened the door to create other exceptions and would ultimately render Section 230 meaningless.

The article notes that while lawmakers from both parties are challenging the protections, "they disagree on why," with Republicans complaining that the law has only protected some free speech while still leaving conservative voices open to censorship on major platforms.

The Times also notes that when Wyden co-authored the original bill in 1996, Google didn't exist yet, and Mark Zuckerberg was 11 years old.
Comments:
  • No. (Score:5, Insightful)

    by rsilvergun ( 571051 ) on Saturday August 10, 2019 @09:48PM (#59075096)
    Section 230 of the CDA [eff.org] exists for a reason. The internet can't exist without it. I like the fact that I can get information and commentary from folks like TYT, Secular Talk, Beau of the Fifth Column and even David Pakman.

    Establishment media will have no trouble getting around this; the entire point of this is to shut down the anti-establishment media outlets.
    • by vix86 ( 592763 )

      Section 230 has to exist because of the speed of content creation on the internet, not necessarily because of what big corporations can or cannot do.

      Nearly 6,000 tweets are posted every second on Twitter. That's 500 Million a day.

      Nearly 500 hours of video are uploaded to YouTube every minute. That's ~82 years worth of video every day.
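
      A quick sanity check of those two figures, as a minimal Python sketch (the input rates are the commonly cited approximations, not exact measurements):

          TWEETS_PER_SECOND = 6_000   # approximate Twitter posting rate
          YT_HOURS_PER_MINUTE = 500   # approximate YouTube upload rate

          tweets_per_day = TWEETS_PER_SECOND * 60 * 60 * 24
          video_hours_per_day = YT_HOURS_PER_MINUTE * 60 * 24
          video_years_per_day = video_hours_per_day / (24 * 365)

          print(f"{tweets_per_day:,} tweets/day")                 # 518,400,000 (~500 million)
          print(f"{video_years_per_day:.0f} years of video/day")  # ~82

      Both headline numbers check out, which is the point: no human review pipeline scales to that volume.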

      It's physically impossible to manually review all of this content, so algorithms have to do it. Bad shit slips through all the time. Without Section 230 providing the immunity coverage, there

    • Re: (Score:2, Insightful)

      You could get information just fine before the internet, both from conventional sources and BBSes and other dial-up services (remember CompuServe? Minitel?) and *gasp* schools and libraries and specialized periodicals. The internet doesn't deserve special treatment. If a site can't exist without exemptions for hosting revenge porn or hate speech, then it needs to come up with a different business model, or a different business. Same as, when Uber runs out of OPM, it should not get a government bailout.

      W

      • Me too.

        ASL?

        You have mail.

        I remember the flood of AOL users. Life was miserable. The Internet was geekland back then, and then here came the kiddies.

        I forget what process I used to splice together binary photo files from Usenet. It was time-consuming.

        And to address your "What would you do ...," whatever I did would be a heck of a lot faster now.

      • Re:No. (Score:5, Insightful)

        by swillden ( 191260 ) <shawn-ds@willden.org> on Saturday August 10, 2019 @11:31PM (#59075340) Journal

        What would you do if the Web were to disappear tomorrow, leaving the other internet services intact? I'd have no problem going back to Usenet Newsgroups, email would still be around, as would FTP.

        Nonsense.

        The only reason those services were able to function as well as they did a few decades ago is because the Internet was restricted to a tiny minority of the population, mostly academics.

        As soon as it became a tool of the whole population, that changed, and it has changed forever. Eternal September, and much, much more. There is no going back to the pre-Web Internet. Just as the Web would sink into a morass without some filtering, so would Usenet and FTP if they became widely used today. They don't need to be filtered now because no one uses them, but kill the Web and everyone will move, and wherever everyone moves, the crap will go, too. Unfiltered email is already unusable; thankfully, email filtering has gotten so good that we don't much notice how awful email is.

        (As an aside: Even aside from filtering, the replicate-the-world architecture of Usenet simply could not scale to modern needs. It was getting really creaky even before Usenet began to decline. And, actually, many news servers engaged in filtering because it was necessary even then.)

        There really are only two choices: Either allow service providers to continue waging their endless battle against the flood of crap, recognizing that they'll never succeed perfectly, or else enable some other organization -- probably governmental -- to do the job. Rolling the clock back 25 years is not going to happen.

        Personally, I don't think the current situation is great, but I really, really don't want to see government get involved. As long as the filtering is all voluntary, services will have to try to find a balance, and that balance will have to roughly match the sensibilities of the community they serve. Filter too much and the door is open to competitors who allow more freedom. Filter too little, and the door is open to competitors who create a less crap-filled environment. And since "community sensibilities" are only a rough mean with wide variance, there will always be fringe sites that cater to non-mainstream views. Oh, and absolutely anyone is free to run their own servers if they want -- and there's always Tor.

        The status quo is messy, complicated and makes no one really happy, but I think it's the best we can do.

        • CGNAT, AUP, and RBLs (Score:5, Informative)

          by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Sunday August 11, 2019 @12:02AM (#59075412) Homepage Journal

          Oh, and absolutely anyone is free to run their own servers if they want

          This is technically correct, as anyone can run a server on his or her local area network. However, one's own server may not be able to do anything useful on the Internet. This has several causes:

          - ISPs with insufficient IP address allocation use carrier-grade network address translation (CGNAT). This has the side effect of blocking incoming connections.
          - ISPs threaten to disconnect home users who run a server reachable from the Internet and then fulfill those threats.
          - Established servers use blackhole lists to determine with which other servers not to communicate. These lists cover home and home business IP address blocks.
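
          For the third point, a minimal sketch in Python of how a DNS-based blackhole list (DNSBL) lookup works, assuming the standard convention of reversing an IPv4 address's octets and querying them against the list's zone (zen.spamhaus.org is a real list, though some lists refuse queries that arrive via large public resolvers):

              import socket

              def is_listed(ip: str, dnsbl_zone: str = "zen.spamhaus.org") -> bool:
                  # DNSBL convention: 203.0.113.7 -> 7.113.0.203.zen.spamhaus.org
                  reversed_ip = ".".join(reversed(ip.split(".")))
                  try:
                      socket.gethostbyname(f"{reversed_ip}.{dnsbl_zone}")
                      return True   # any A record returned means the IP is listed
                  except socket.gaierror:
                      return False  # no record: not on this list

              print(is_listed("127.0.0.2"))  # standard DNSBL test address; should print True

          A mail server on a residential IP block fails this check on the receiving end no matter how well it is run, which is the practical barrier described above.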

          • by AmiMoJo ( 196126 )

            So what, though? You can technically make a public speech from your front lawn, but the audience may be pretty limited, and if you try to do it at 2AM you might find the cops take an interest.

            There has never been a more powerful tool for reaching large numbers of people than the internet. We are talking about a tiny number of people who are involved in domestic terrorism having their websites booted off. Far-right politics are mainstream. There is zero evidence that it's affecting political discord, in and in

      • Before the Internet, I thought Joe Biden was a friend of the working class. Thanks to the Internet I have this [youtube.com] and this [youtube.com] and this [medium.com].

        If I just went by the pre-Internet information I'd cheerfully vote for that sellout. Yeah, everything above is public news, but good luck finding it without the Internet.
        • Damn, I didn't think it was possible, but after reading that third link I almost LIKE Biden.

          Too bad the handful of claims I checked turned out to be either blatant bullshit or exaggeration.

    • Re: No. (Score:2, Interesting)

      Which of those rely on hosting third-party content at a scale they can't practically police well?

      We have AM talk radio stations all over the US (and a very healthy number are hard right conservative political talk shows). They are not immune to lawsuits related to their content AFAIK. They screen their callers heavily and have dump buttons at the ready. Somehow they apparently even make money.

      This doesn't prevent all kinds of hateful garbage from going on air from hosts and random callers alike, but the a

  • Only after YouTube has turned into an unsuable behemoth of digital publishing. After Twitter and Facebook (and all its derivatives) have laid their financial claim in the U.S. lobby landscape. NOW, they are contemplating the right to freely communicate without compensating them. Convenient. VERY convenient. Will this reduce any of the real-world violence? No. Will this foster any progress and innovation with industries? No. Will this restrict and entangle foreign companies and opposing views? ... You th
    • Why don't you get serious like Walmart did, you insensitive clod?

      In an internal memo, the retailer told employees to remove any violent marketing material, unplug Xbox and PlayStation consoles that show violent video games and turn off any violence depicted on screens in its electronics departments.

      Employees also were asked to shut off hunting season videos in the sporting goods department where guns are sold. "Remove from the salesfloor or turn off these items immediately," the memo said.

      Walmart will still sell the violent video games and hasn't made any changes to its gun sales policy, despite pressure from workers, politicians and activists to do so.

  • by presidenteloco ( 659168 ) on Saturday August 10, 2019 @09:51PM (#59075104)
    Instead, content aggregator/distributor sites should be required to have an open API allowing development of third-party content filters that users can install if they choose to protect themselves from this or that kind of content (a rough sketch follows below).

    Allow those who want a censored experience to obtain one. Don't force some single opinion of what should be censored on everyone.
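
    A rough Python sketch of what such a plugin architecture could look like. Every name here (Post, ContentFilter, render_feed) is invented for illustration; no such API exists today:

        from dataclasses import dataclass
        from typing import Protocol

        @dataclass
        class Post:
            author: str
            text: str

        class ContentFilter(Protocol):
            # Interface a third-party filter would implement.
            def should_hide(self, post: Post) -> bool: ...

        class KeywordFilter:
            """Example filter a user might choose to install."""
            def __init__(self, blocked_words: set[str]):
                self.blocked_words = blocked_words

            def should_hide(self, post: Post) -> bool:
                return any(w in post.text.lower() for w in self.blocked_words)

        def render_feed(posts: list[Post], filters: list[ContentFilter]) -> list[Post]:
            # Filtering happens per reader, using filters the reader opted into,
            # rather than via one platform-wide policy.
            return [p for p in posts if not any(f.should_hide(p) for f in filters)]

    The design point: the choice of filter, and whether to filter at all, rests with the reader rather than the platform.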
  • by Eravnrekaree ( 467752 ) on Saturday August 10, 2019 @09:54PM (#59075112)

    This would be an enormous benefit to Facebook and Google. It would be the end of independent sites. Google and Facebook want this. Don't fall for it. I can guarantee Facebook and Google have lobbyists pushing for this. When they say it's to punish Facebook, it means Facebook wants it and it's a gift to Facebook. It's called regulatory lock-in. Facebook wants to be heavily regulated to get rid of all of its weaker competitors.

    Exposing a site to liability for things a user posts would make it impossible to run any kind of message board and would kill free speech! How? You want to bring down a site? Just post something libelous to it; it really doesn't have to be much, a single slanderous or libelous comment would do. The fact is, only large sites like Google can cope with the expense of having to monitor massive websites. This effectively locks out smaller sites and would create a large monopoly of Facebook, Google, etc.

    Call your reps and tell them to oppose this.

  • As this [xkcd.com] relevant XKCD so eloquently puts it, these companies have no obligation to host your content on their dime.

    But they're the "public square"; it's the only way you can easily reach a bunch of people, right? Yeah, why don't you try setting up your own business in the parking lot of Walmart without their permission and see how long that lasts. You're not entitled to someone else's audience or customer base, sorry. Same reason I can't come along and start sticking my preferred political party's campai

    • You're right: instead of complaining about private sites doing this, they need to set up their own websites to host their stuff. Just don't use Facebook or YouTube if you don't like their policies. Instead, what they want to do is actually make YouTube and Facebook the only sites available, which is what repealing the protections would do, because only YouTube and Facebook could cope with the regulatory burden of the massive policing of content that it would require, as well as the billions of dollars of le

    • A lot of cultures had common areas - the agora in ancient Greece and the forum in Rome, for example - where the public could meet and discuss. Of course that might have been the male, non-slave, land-owning public from the right families - but the concept is still valid.

      There is no equivalent place on the internet. Facebook and the like are like private residences - where people can talk, but only with the consent of the landowners.

      A real public forum would serve this purpose, but unlike ancient governments that were w

      • You know what the real goddam problem is?

        The real goddam problem is that the fucking goddam unwashed masses take Facebook and that shit seriously.

        It's a cat video venue -- a game platform. Anyone getting their news from Facebook is seriously impaired. Who would do that?

        Legitimate news does exist outside the bubble.

        I say we all have a happening and dump Facebook.

        That's what I did. I'm a little uninformed about my nephew's drug abuse, but all in all, I'm happier.

        #DeleteFacebook

    • If they were smarter they'd just ask for a government-run "geocities" type hosting provider, and then they'd have all their frozen peaches intact. Of course, that would be Socialism, so nevermind. Thank goodness they can't stomach government services, or they'd be able to ask for something that would "solve" their "problem!"

      • by epyT-R ( 613989 )

        Except that it wouldn't. Politicians would not be able to keep from imposing their own narratives on the system.

      • Bad idea. Once it was subject to government administration and funding, it would very quickly become subject to a lot more government control of content - the AUP would become even more restrictive than most private-sector hosts. The very first thing to be banned would be obscenity, followed by a broadening of the definition of obscenity to catch all the weird stuff. It'd be downhill from there.

    • Completely wrong. (Score:2, Insightful)

      by thesupraman ( 179040 )

      Do you even know what section 230 is about?

      Without that, a company has two choices, either:
      a) you are a common carrier: you do NOT control content, just carry it, and you have no responsibility; or
      b) you are a publisher (like a newspaper): you DO control content, and you have responsibility for that content.

      However, Section 230 is a nice little gift to the companies who want the best of BOTH: the control without the responsibility.

      This is NOT a free speech issue, it is an issue with a special allowance f

      • It's anyone who runs a computer-based service. Google started out fairly small and had some good tech and a fast-loading web page in an era of dog-slow portals. People forget how they got their start, which is odd since /. is full of old codgers who were there for it. You set your homepage to Google because it loaded quickly while Yahoo took 2 minutes+ on dial-up.

        Point is, 230 made innovation possible and it continues to.
      • by 0xdeadbeef ( 28836 ) on Sunday August 11, 2019 @07:28AM (#59075922) Homepage Journal

        ISPs are not common carriers, according to the same people promoting this false understanding of Section 230, so how do they rationalize social networks as common carriers? That is not, and never has been, a choice available to them. Social networks are not utilities.

        And Section 230 explicitly says they are not the publishers of the content contributed by others. Here's what it does say: it encourages them to moderate objectionable content by explicitly shielding them from liability for doing so:

        No provider or user of an interactive computer service shall be held liable on account of -

        (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

        (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1)

        What you're all really so butt-hurt about is that your political opinions are considered objectionable content by a majority of people. You spout racist gibberish, you harass women and gays, you spread falsehoods and conspiracy theories, and despite having the absolute freedom to form your own social networks where you're perfectly free to moderate comments according to your own twisted community values, you want the government to inflict a new fairness doctrine on the social networks that the cool kids already dominate. You're basically admitting that you're losers on the Internet and you want the government to help you be winners, despite being the ones who more expertly spread "memes" and propaganda.

      • However, Section 230 is a nice little gift to the companies who want the best of BOTH: the control without the responsibility.

        Not quite. It was meant to encourage hosts to moderate, since they would no longer be in danger for failing to moderate perfectly in all possible respects. (That's the risk of the publisher approach -- one tiny slip and you're totally fucked)

        It has nothing to do with large companies or "normal" people. Anyone who creates content remains liable for it, but no one else is. Even if

  • for "Where's my campaign contributions?"
    • Only a complete idiot, or somebody who only recently returned to the surface after decades underground, would think that Senator Wyden gives a rat's ass about "campaign contributions."

  • by zugmeister ( 1050414 ) on Saturday August 10, 2019 @10:09PM (#59075150)
    Do they moderate?
    If yes, then they should be responsible for the site they moderate.
    If no, they should not be responsible for the site they are not moderating; they're just hosting someone else's content.
    • 8chan, anyone?

    • You are not contributing to the discussion; you just restated Section 230 of the Communications Decency Act (CDA).

      The discussion is about whether Section 230 needs to be reformed or repealed, because it does not have an enforcement mechanism and it did not foresee the rise of social media.

      • by OYAHHH ( 322809 )

        Social media has existed since the advent of USENET. The formatting of the discussion or the delivery vehicle is immaterial.

  • by brainchill ( 611679 ) on Saturday August 10, 2019 @10:35PM (#59075214)
    This is disturbing to even talk about. We have free speech for a reason ... as long as you are not telling people to take rights away from or physically hurt others with your speech, there should be zero regulation of it ... period. I do not think that the opinions of legitimate Nazis or democratic socialists are worth listening to, as an example, but I would defend their right to share their stupid opinions to my last breath...
    • Private companies have no obligation to some country's idea of free speech. THIS site has stopped AC posting, as an example; where's your complaint? Yeah. It's laudable that you'd defend a right to say something, but there's no right to broadcast it, be forced to listen to it, or not suffer consequences from what's said.
      • We are not discussing private companies' obligation to protect free speech in this post. What we are discussing is the government reversing its position and holding private companies accountable for the things that their users say ... my position stands the same ... there is nothing to hold companies accountable for unless the users are telling people to physically harm or strip the rights of others.
        • Re: (Score:2, Informative)

          by Silverhammer ( 13644 )

          Each service provider (hosting, website, phone app, whatever) must decide whether it is a publisher or a platform, under the terms of Section 230 of the Communications Decency Act (CDA). If it is a publisher, it is legally liable for its content and therefore may censor the speech of its users. Conversely, if it is a platform, the federal government grants it protection from legal liability as long as it does not censor the speech of its users. Social media companies like Facebook, Twitter, and YouTube are

          • Conversely, if it is a platform, the federal government grants it protection from legal liability as long as it does not censor the speech of its users.

            No, it doesn't, you lying moron. A website can moderate anything it damn well pleases.

            Learn something: https://www.techdirt.com/artic... [techdirt.com]

    • Re: (Score:3, Insightful)

      Free speech is a prohibition against the US government. The places you are talking about are private property.

      Free speech does not apply.

      The ToS is a binding contract that YOU agree to before using the site.

      Don't like stuff? Exercise the only real right you do have: Leave.

    • This is requiring private companies to be responsible for the content that they host. In effect, this is more about enforcing libel and hate-crime laws on the internet.

      Say a website is allowed to publish a message that says, "Brainchill is a faggoty pedophile spic who crossed the Rio to rape and murder our children in our sleep. He lives at 123 4th St, El Paso, TX. Kill him, and you'll be a hero." Except, it's your real name, not your alias. Say a lawyer reaches out to the sysop for the site and says, "This

  • There's a difference between moderators being asleep/inconsistent/slow to remove stuff that violates their TOS, and a policy/tendency of moderators to welcome certain content, or to look the other way when they find it.
    Google etc. not hiring enough third-world mods to stare at horrible stuff all day is different from, say, 8chan mods allowing everything and anything.

    It might be most useful for the govt. to fund research into machine learning software to detect stuff that violates a given site's TOS (whatever
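
    As a toy illustration of that idea: a per-site classifier trained on labeled examples of that site's own TOS decisions. This sketch uses invented toy data; a real system would need large labeled corpora and human review of borderline calls:

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Hypothetical labeled examples (1 = violated this site's TOS)
        texts = ["buy cheap meds now", "great article, thanks",
                 "spam spam click here to win", "interesting point about Section 230"]
        labels = [1, 0, 1, 0]

        model = make_pipeline(TfidfVectorizer(), LogisticRegression())
        model.fit(texts, labels)

        print(model.predict(["cheap meds, click here"]))  # likely [1]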

  • Absolutely not, unless you hate free speech. Everyone needs to recognize that even the speech they disagree with is protected, and that such protections are vital to ensuring what they do agree with is allowed to continue.

    • An approach that lasts until the first photograph of child sexual abuse gets posted. There are a few true free-speech absolutists around who will defend the right to publish and view absolutely anything, but it's really a tiny niche. The vast majority of people are happy to accept some form of ban on certain content; they just differ regarding what needs to be banned.

  • If you censor your users, you should be liable for the content you are curating. If you're a common carrier of anybody's content, you should not be. The current system works pretty well.

    • Yes, that's what Section 230 says.

      But you still think it's working, after all the news about Facebook, Twitter, and YouTube over the last few years?

      It's not working, because there is no enforcement mechanism. That's why people are talking about reforming or repealing it.

    • If you censor your users, you should be liable for the content you are curating. If you're a common carrier of anybody's content, you should not be. The current system works pretty well

      What you describe is not the current system. The current system is that you are not liable even if you curate the content.

      There is a widespread myth that curating means liability. It isn't true.

  • n/t

  • Is this why AC comments went away?

  • This is exactly the thinking of a person more interested in social (mob) justice than law. If only SOME sites should be liable, which sites would be liable? What test would you apply?

    If a site wants to editorialize its content (e.g., Facebook and Twitter blocking voices that don't follow their corporate ideology) then it can do so and bear the consequences of ALL the content posted. If it instead wants to be a platform, then it only has the obligation to remove things that are a clear violation of law after a law

  • Sites should only be liable for content posted by their users if they are notified of content that is illegal (sex trafficking, criminal activity, etc.) and fail to act to remove that content.

    But if someone is knowingly hosting something that is outright illegal and fails to act, they lose their safe harbor for that content.

  • Short answer: A resounding NO. To rule otherwise in law would invite utter chaos and have an extremely chilling effect (at least here in the United States; YMMV) on Freedom of Speech rights.

    Now, that having been said: privately owned websites are not required under law to allow any and all content to be posted on their pages; they can exercise any level of moderation of user-generated content they choose, so long as it is consistent with their own stated rules. To provide more specific examples of what I mean, if I'm not clear enough: if the owner of a website that allows user-generated content does not wish to have racist, bigoted, or sexist content, discussion of illegal activities, discussion of drug use, or My Little Pony discussions for that matter, then it is entirely within their rights to remove content as they see fit and even revoke the access rights of any users as they see fit, too. Affected persons are of course free to pursue legal action against that website and its owners in civil court, but (using the U.S. as an example) there are no constitutional protections against censorship guaranteed with regards to privately-held companies so far as I can see.
  • So...Republicans and Democrats are teaming up to hurt companies because they are not censoring the way the government wants.

    This is not a good sign.
