
Should Facebook, Google Be Liable For User Posts? (reuters.com)

An anonymous reader quotes a report from Reuters: U.S. Attorney General William Barr on Wednesday questioned whether Facebook, Google and other major online platforms still need the immunity from legal liability that has prevented them from being sued over material their users post. "No longer are tech companies the underdog upstarts. They have become titans," Barr said at a public meeting held by the Justice Department to examine the future of Section 230 of the Communications Decency Act. "Given this changing technological landscape, valid questions have been raised about whether Section 230's broad immunity is necessary at least in its current form," he said.

Section 230 says online companies such as Facebook, Alphabet's Google and Twitter cannot be treated as the publisher or speaker of information their users provide. This largely exempts them from liability for content posted by users, although they can be held liable for content that violates criminal or intellectual property law. The increased size and power of online platforms have also left consumers with fewer options, and the lack of feasible alternatives is a relevant part of the discussion, Barr said, adding that the Section 230 review came out of the Justice Department's broader look at potential anticompetitive practices at tech companies. Lawmakers from both major political parties have called for Congress to change Section 230 in ways that could expose tech companies to more lawsuits or significantly increase their costs. Barr said the department would not advocate a position at the meeting. But he hinted at the idea of allowing the U.S. government to take action against recalcitrant platforms, saying it was "questionable" whether Section 230 should prevent the American government from suing platforms when it is "acting to protect American citizens."
The attorney general of Nebraska, Doug Peterson, noted that the law does not shield platforms from federal criminal prosecution; the immunity helps protect against civil claims or a state-level prosecution. Peterson said the exception should be widened to allow state-level action as well. Addressing the tech industry, he called it a "pretty simple solution" that would allow local officials "to clean up your industry instead of waiting for your industry to clean up itself."

Matt Schruers, president of the Computer and Communications Industry Association, which counts Google and Facebook among its members, said such a solution would result in tech giants having to obey 50 separate sets of laws governing user content. He suggested law enforcement's energies might be better spent pursuing the millions of tips the tech industry sends over every year, only a small fraction of which, he noted, result in investigations.
  • by AHuxley ( 892839 ) on Thursday February 20, 2020 @08:57PM (#59748884) Journal
    Passing information on? That's a telco, a network, a provider. All risk stays with the user. Full gov protection.

    Have staff sort, approve, censor users' art, work, comments, links? Have a brand state it has its own special political test for users' art, jokes, comments, words, reviews?
    That's a publisher, the role of an editor. Welcome to the world of being a publisher. No more free gov protection.
    Welcome to a new tax rate, no gov support, and being responsible for every nation's mil/faith/gov rule, regulation and law.

    Become a charity? Non profit? NGO? Think tank? That might work :)
    • by RazorSharp ( 1418697 ) on Thursday February 20, 2020 @10:00PM (#59749080)

      The problem is that for most websites, especially social network sites, it doesn't make sense to classify them as either a telco or a publisher. Insisting that those are the only possible classifications is a false dichotomy and fails to appreciate the clear differences between an internet forum, a book, and a telephone network.

      All internet forums cease to be useful once moderation is banned (even /.'s user-moderation is moderation: when the trolls come, people just move that slider to hide -1 comments; a minimal sketch of that kind of threshold filtering follows this comment). Likewise, making websites liable for content posted on their forums because they don't want giant swastikas trolling every other post will create an absurd number of lawsuits, forcing websites and ISPs to turn over tons of user information to the government when people get subpoenaed. Insisting on the telco/publisher either/or is completely impractical and undesirable.
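      A minimal sketch of that reader-side threshold filtering, assuming nothing more than an integer score per comment; the names and the -1..5 range are illustrative, not Slashdot's actual code:

        from dataclasses import dataclass

        @dataclass
        class Comment:
            author: str
            body: str
            score: int  # -1 (troll) .. 5 (excellent), assigned by user moderators

        def visible(comments, threshold=0):
            """Return only the comments at or above the reader's chosen threshold."""
            return [c for c in comments if c.score >= threshold]

        thread = [
            Comment("troll", "giant swastika spam", -1),
            Comment("regular", "on-topic reply", 1),
            Comment("expert", "insightful analysis", 5),
        ]

        # A reader who moves the slider to 0 never sees the troll post, yet
        # nothing is deleted: the filtering happens entirely on the reader's side.
        print([c.author for c in visible(thread, threshold=0)])  # ['regular', 'expert']

      Nothing in this scheme removes content; each reader picks a threshold, which is why moderation of this kind is hard to characterize as publishing.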

    • by cpt kangarooski ( 3773 ) on Thursday February 20, 2020 @10:00PM (#59749082) Homepage

      You've failed to think it through. Without the safe harbor, there can be no moderation of any sort, not even the smallest, most innocuous amount, because assuming even the tiniest iota of responsibility exposes a site to total liability for everything: not only for moderation it engages in, but also for failures to moderate where it had the ability to but chose not to, or just missed something. There is simply way too much material being posted to vet it all, and yet the law without the safe harbor requires perfect vetting of everything, with no second chances, if there is any moderation.

      So a site that allowed users to post could not moderate for language, nor for indecent or obscene matter, nor for hateful speech; family-friendly sites would be gone. It could not moderate to remove spam. It could not moderate to remove malware. It could not moderate to remove defamatory statements. Basically, it would be crap. I remember what happened to USENET -- the Internet's original discussion boards, totally destroyed by the late 90s because, among other things, no one could stop all the spam.

      The only alternative would be to not allow users to post anything at all.

      Eliminating the safe harbor turns the Internet into tv and magazines -- we become passive readers and watchers rather than being able to actively participate and converse.

      This is why it was recognized that the old laws -- which applied to things like newspapers, which had comparatively little content, all of which went through editors for approval, and which did not allow individuals to contribute freely -- just didn't work. We wanted sites online to be able to do some moderation, and we needed to give them the safe harbor because otherwise they would never do it.

      • by guruevi ( 827432 ) on Thursday February 20, 2020 @11:45PM (#59749352)

        Yet, back in the day we found solutions to those things. Right here on Slashdot there is a user-driven moderation system, as Reddit and the *chans have; back in the day we had IRC with community-driven ops and halfops, and for Usenet we simply had the filter systems on our clients AND IT WORKED! Even if you have a soapbox, the community will shout you down if you're an asshat.

        The problem is that modern corporations and the media don't want the community to drive the narrative; after the Clinton impeachment, and after people across nations voiced shared concerns throughout the Bush and Obama administrations, Internet self-moderation was incrementally killed off.

      • by AmiMoJo ( 196126 )

        The current common carrier rules say that they can moderate their platform on pretty much any basis they like.

        https://www.law.cornell.edu/us... [cornell.edu]

        (2) Civil liability
        No provider or user of an interactive computer service shall be held liable on account of --

        (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

      • Re: (Score:2, Insightful)

        Comment removed based on user account deletion
        • by cpt kangarooski ( 3773 ) on Friday February 21, 2020 @09:40AM (#59750286) Homepage

          Your ignorance about section 230 and your blatantly wrong statements about it are appalling.

          The safe harbor was intended to allow sites to remove legal content posted by users, according to the arbitrary (but earnest) desires of the sites where the content was being posted. The law was a specific reaction to the Prodigy case, where that service was successfully sued for libel (ironically the claimed libel was later revealed to be true, but too late to help anyone) because Prodigy had been moderating to remove things like curse words and off-topic comments and had failed to delete this one libelous comment -- not that there was ever any reason that it would know it was libelous -- so was therefore responsible for it.

          This led to everyone not wanting to moderate anything. Congress wanted moderation. The CDA -- the Communications Decency Act -- was meant to make the entire Internet child-friendly. No cursing, no legal porn, no discussions of adult subjects -- none of that where kids might run into it. Congress was exasperated by sites' refusal to moderate because of the Prodigy case holding that any moderation means total liability for any mistake -- including failures to moderate -- so they provided a broad safe harbor that allows any degree of moderation, for basically any reason. They were encouraging sites to use it. If anything, what we've seen is that sites still don't want to moderate too much because it's a pain in the ass, and doesn't produce revenues.

          Your claims about their intent and about what the law permits sites to do are 100% false. The mid-90s statements of its sponsors, like prominent shitbag Senator Exon, are easy to find, and perhaps you'd enjoy that old Time Magazine where the cover story was that the brand new "Information Superhighway" had porn on it that children might see!

          It's never had to do with 'respect for cultural and legal norms' or shit like that. The Congress that passed it hated free speech and tried to crush it. Luckily the rest of the CDA got thrown out by the Supreme Court for exactly that reason. But the safe harbor survived.

          Do us all a favor and go learn something about this and shut the fuck up until you have. Start with what people were saying at the time in '95 and '96, read the entire CDA, read the ACLU v Reno case that struck most of it down, and read some of the early cases about section 230 in particular.

        • by Holi ( 250190 )
          Can you have a discussion without your political ideology clouding your ability to think? Really, can you try to move beyond a left-bad-right-good mentality and understand that the world doesn't actually work like that at all, in any way, and that things are far more complicated and deserve real thought, not just blind ideological obedience?
      • You've failed to think it through. Without the safe harbor, there can be no moderation of any sort, not even the smallest, most innocuous amount, because assuming even the tiniest iota of responsibility exposes a site to total liability for everything: not only for moderation it engages in, but also for failures to moderate where it had the ability to but chose not to, or just missed something. There is simply way too much material being posted to vet it all, and yet the law without the safe harbor requires perfect vetting of everything, with no second chances, if there is any moderation.

        • Congratulations on your novel.

          One thing: what we're mostly discussing here is the safe harbor of 47 USC 230 -- the Communications Decency Act (CDA) safe harbor. This is the one that allows sites to moderate user posts or not, as the sites see fit and as they are able or unable to actually manage, and to not be held responsible for them. Everyone including sites remains responsible for what they themselves post online. This safe harbor doesn't provide protection for intellectual property issues, however.

          You'

        • there's too much out there for anyone to really know who owns what.

          Did you mean 'there's too much out there for us to know who knows all the works'?

          should I expect Slashdot to know that this was copyright infringement?

          No, because that's not how copyright enforcement works. It's a response mechanism.

          • should I expect Slashdot to know that this was copyright infringement?

            No, because that's not how copyright enforcement works. It's a response mechanism.

            For now. They're actively trying to change that.

      • So, do you consider "You support the wrong sex's rights," "You support the wrong race's rights," or "You support the wrong political view" to be valid reasons to remove content from your hosted public forum?
        • The government provides public fora, though probably not so many online. I can't think of any offhand.

          Internet fora belong to the people running them: Facebook is owned by Facebook, Inc., not by the public. Twitter is owned by Twitter, Inc., not by the public. Slashdot is owned by Slashdot Media.

          Subject to a few comparatively minor exceptions, they can control whom they allow to use the sites, and what's posted there. This is because just as the users have rights of free speech and free

          • >so too does the site have rights of free speech and free association that prevent the government from controlling what speech they allow and disallow

            Why does this logic not apply to telcos like Verizon? Why is that logic of free association not applicable to protected classes like race and other public services? There are state and federal laws that explicitly limit companies' rights of free association. Maybe you need to apply the same standard to those laws as well and file lawsuits in th

            • Verizon is a utility. In exchange for some perks (like getting to use the state's right to take property from people to run lines in), they accept substantial regulations (like having to offer service to anyone who will pay). This is also related to their status as a natural monopoly (ie the capital investment needed to run enough lines to compete with them, and the utter futility of everyone running redundant lines to everywhere, when only a fraction would be used, means there's really not going to be comp

              • Aside from the physical lines, that sounds applicable to current tech companies (capital investment and market share). The government passed a law saying X. That is acceptable to you because you agree with X. That's fine. But that doesn't lend itself to saying why limiting those companies is any different from what is proposed here in addressing the stated problem with social media.

                In limited aspects social media has been defined as public squares. That lends credence to the idea that there are times and places on

                • Aside from the physical lines

                  That's the entire basis of them being a monopoly.

                  In limited aspects social media has been defined as public squares. That lends credence to the idea that there are times and places online that should not be censored by private interests to protect the rights of individuals.

                  Not in any legal context. Private property isn't a public forum unless the private owner is functionally the local government or a close partner with it. The cases are Marsh v. Alabama, 326 US 501 (a town owned by a company couldn't prohibit people from distributing religious literature on the sidewalk), Lloyd v. Tanner, 407 US 551 (a shopping mall could prohibit people from passing out anti war literature inside the mall), and these led to the really key cas

                  • > them being a monopoly.

                    A natural monopoly, sure. But monopolies are also defined by market share. Is physicality your only argument as to why these companies should be exempt from any scrutiny? Sounds like a path to never being able to address evolving technologies and the virtual landscape of the internet.

                    > Private property isn't a public forum unless the private owner is functionally the local government or a close partner with it

                    These are evolving issues on new technologies and changing social issue

                    • But monopolies are also defined by market share.

                      The safe harbor protects big and small alike. And a cap on the safe harbor based on size will only discourage growth, which is generally felt to be a bad idea. It's certainly hard to see how it would work logistically; it would be as if your phone could only call people on your network, but not arbitrary phone numbers.

                      Zuccotti Park in NYC during Occupy Wall Street comes to mind, as it was a privately owned public space.

                      And the protesters were evicted and the courts ruled against them. Not the best example you could cite.

                      As does Trump's twitter feed.

                      That's not because of Twitter, that's because of Trump. It's not the account that's p

                    • > The safe harbor protects big and small alike.

                      Irrelevant when there is acknowledgement that they may qualify for breakup. There are at least some circumstantial examples of anti-capitalist behavior by these companies (coordinated efforts of de-platforming and de-monetizing). So at the very least any rule change can recognize their status and coordinated actions.

                      >No one is being censored,

                      Repeating this does not make it true.

                      > Illuminate us as to your amazing plan.

                      I think Barr is moving in the right direction.

                    • This is off the top of my head

                      Holy cats! I never pictured you as having a head. My mental image of you was always that you just had a second ass atop your neck, constantly spewing shit from a nasty-looking diseased asshole, and the shit dribbled on a keyboard, which is how you typed all the crap you post. There were a lot of undigested corn kernels in it for some reason.

                      examples of anti-capitalist behavior

                      Well there's nothing wrong with a company being anti-capitalistic.

                      I think Barr is moving in the right direction.

                      Well, Barr is a totally corrupt authoritarian unamerican shitbag so I can see how you'd like him. Also

    • by Kjella ( 173770 ) on Thursday February 20, 2020 @10:30PM (#59749174) Homepage

      Have you actually seen what happens if you don't filter at all? Where's that dude posting endless pages of Nazi crosses when you need him? People who argue that any human touching the content should mean losing all legal protection are either blissfully naive or trying to kill free speech, because what you get then is a trash heap spammed to hell by bots, not usable by anyone. If you want to see what the Internet would really look like, get a job moderating Facebook or something like that. It's so ugly the moderators get PTSD, but you know what? The rest of the world doesn't have to see that shit.

      If you take away Section 230 it's not like Facebook and YouTube will go away, but what will happen is that the law is broken constantly and enforcement is almost nil. At any time the government can yank any website's chain and say: unless you start dancing to our fiddle, we'll prosecute every little violation. You've got a president in office who's screaming about fake news; get on his bad side and your website is suddenly on the shit list for prosecution. I could just start copy-pasting books in here and make Slashdot liable for $750-$150,000 per infringement (a back-of-envelope sketch of that exposure follows this comment). How long would they last?
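      A back-of-envelope sketch of that exposure, taking the $750 to $150,000 statutory-damages range the comment cites at face value; the count of infringed works is a made-up illustration, and a real award would depend on willfulness and on how many distinct works are involved:

        MIN_PER_WORK = 750        # statutory minimum per infringed work
        MAX_PER_WORK = 150_000    # willful-infringement maximum per work

        def exposure(works_infringed: int) -> tuple[int, int]:
            """Best-case and worst-case statutory damages for N infringed works."""
            return works_infringed * MIN_PER_WORK, works_infringed * MAX_PER_WORK

        low, high = exposure(1_000)  # a thousand chapters pasted from distinct books
        print(f"${low:,} to ${high:,}")  # $750,000 to $150,000,000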

    • by Sloppy ( 14984 )

      That's a publisher, the role of an editor. Welcome to the world of being a publisher. No more free gov protection.

      About that last sentence: Why?

      It's a matter of routine for small websites' robots to delete obvious junk, and for operators to occasionally intervene manually or tune the robots. Are you saying that doing this means the website suddenly "says" all the remaining posts that it didn't delete?

      And is this an analysis by you, or a proposed policy? Are you saying this is how it is, or is it

  • by fuzzyfuzzyfungus ( 1223518 ) on Thursday February 20, 2020 @08:58PM (#59748888) Journal
    I'm sure that we all would like the internet better if it were subject to the whims of what some hickistani sheriff thinks 'decency' looks like.
    • by penandpaper ( 2463226 ) on Thursday February 20, 2020 @09:05PM (#59748906) Journal

      Totally. Tumblerinas in Wokeland are going to have better decency laws. We can count on them to end the rampant sexualization of women that reinforces heteronormativity and cis-gendered racist oppression of LGBTQ+ people of color in video games, movies, and media. In the name of Progressive Intersectional Feminism I banneth thee!

      • The really unpleasant bit is that, if you let local authorities claim jurisdiction, you'll get the worst of both: if everything on the internet has to survive potential attack by any of 50 state attorneys general, the remnants will be thin and banal indeed. It will make the textbooks that get homogenized until they will sell in both Texas and California look like great works of literature.
    • Or some California Circuit Court judge for that matter.
  • Good. (Score:2, Insightful)

    by Anonymous Coward

    Play political games. Win political prizes. Tech companies want to be gods to the new landscape of information and discussion. No company should have that much power.

    What is funny about this discussion: the left, which normally advocates for government, is suddenly libertarian about companies' ability to censor their political opponents. Can't win the argument? Ban 'em all. Ban anything not progressive.

    • 20 years ago:

      The left was pro big-government and anti big-corporation.
      The right was anti big-government and pro big-corporation.

      Now the left is pro big-government and pro big-corporation.
      The right is anti big-government and anti big-corporation.

      This is how the left became fascist last time, and it's how they became fascist this time.
  • No (Score:4, Insightful)

    by Snotnose ( 212196 ) on Thursday February 20, 2020 @09:04PM (#59748904)
    Not sure what else to add, but I need to fill out the comment.
  • >"Should Facebook, Google [and similar] Be Liable For User Posts?" [...]"questionable" whether Section 230 should prevent the American government from suing platforms when it is "acting to protect American citizens."

    I believe they are both engaged in manipulating content, manipulating ratings, using broad "acceptability" rules to squash things they don't like, promoting content they like, shadow banning, and worse, and perhaps they should both be labeled as publishers and lose their liability protection.

    • Translation: the government wants censorship of anything they don't like, AKA the great firewall of China, and wants companies like Google to do it for them (because corporations are exempt from the Constitution and many laws?)
  • Yes (Score:1, Troll)

    by sexconker ( 1179573 )

    They censor, suppress, promote, shadow ban, etc.
    They are a publisher, not a platform.

    Full liability.

    • by Skapare ( 16644 )
      So we make them liable for the parts they publish. I like that idea.
    • Moron. They'll just shut down user posting instead. If you really want that so much, go fuck off and delete your /. account -- this site will have to shut down too.

      • by guruevi ( 827432 )

        Nope, they won't. Slashdot didn't censor (at least not for the longest time): you get downvoted and nobody sees your content. That's been the way of dealing with asshattery in the public sphere since the Greeks.

        • You think that will save them from lawsuits where people make the argument? Even if Slashdot is safe from liability, the costs of proving that could be crippling on their own. Worse if there's even the slightest middle ground to argue about. Half of the value of the safe harbor is its clarity.

      • If Facebook shuts down user posting, Facebook shuts down Facebook; it ain't happening.
        • If they lose the safe harbor they will either shut down or be sued into oblivion as being responsible for every last word, picture, and video Facebook users have posted. There's enough users posting enough bad stuff that they wouldn't last long. Even Facebook doesn't have that much money.

          To paint a nice mental image, imagine the safe harbor as the metal door keeping a million ravenous piranhas from swimming over to a nice plump cow that's cooling off in a pond. The piranhas -- lawyers like me -- will eat ol

  • I build a wooden panel on a set of poles next to a building I own or lease near the center of town, with a sign that says "what are your thoughts?". There are rows of nails people can push their papers onto. I or my friends come by daily and tear down anything offensive or any commercial advertising. I put up my own ads to cover the cost of maintenance. Should I be sued in civil court for something someone puts up there? Consider cases where the person who posted it is known and unknown.
    • by nagora ( 177841 )

      Good analogy. I'd say: yes. Once you start advertising, you are running a business based on the facility, not simply providing a public asset.

      Known and unknown make no difference to whether you are responsible for what your business displays. It obviously makes a difference to whether the poster is *also* liable or not.

      Most courts would make an allowance based on your personal effort to police it, but ultimately it was your choice to vet material retroactively instead of editing ahead of time. No one for

    • Perhaps an improved analogy:

      You put up the panel with the purpose of making money (or secretly collecting photos of people walking up to the board), not just covering costs. People use it to post drug ads and to complain about the local invasion of foreigners stealing their jobs and eating their dogs. If they didn't post such things, your board would be empty, and nobody would pay you for maintaining it.

  • by DogDude ( 805747 )
    Absolutely, yes. It's their servers, it's their information. I'm liable for what's on my server. There's no reason they shouldn't be liable for what's on theirs.
    • But they're liable in the same ways that you are liable. If you use your server to host a forum, you're still under the same rules they are.

      You frame it as if you're liable for your server but they're not liable for theirs. This isn't true. You're both subject to the same liabilities and afforded the same protections.

    • I'm liable for what's on my server.

      Not if someone else uploaded it. (Aside from special categories like pirated materials, child porn, etc.)

      And they're still liable for things they post; just not for what their users post.

      Only users are liable for their own posts, and that's entirely appropriate.

      • by DogDude ( 805747 )
        You installed a program on your server that allows people to upload random shit to your server. That's 100% your fault. Servers were designed to serve data, hence the name. If you want to serve some horrible shit that you allowed somebody to upload to your server, then that's your problem.
    • I run a forum. It's a small one and we have moderators to kick out spammers, but we don't have resources anywhere close to Google/Facebook. Should I be liable if someone posts copyrighted content on my forum and I and my moderators don't catch it? What if I post copyrighted material here? I published a novel that few people have read. If I entered a chapter of it here as a comment, then it wouldn't be copyright infringement because I own the copyright. However, if you bought a copy, copied a chapter, and po

  • by aicrules ( 819392 ) on Thursday February 20, 2020 @09:11PM (#59748930)
    No.
  • The reason those companies are afforded that protection has nothing to do with being "underdog upstarts", but rather for two reasons:

    • You can't have a different set of rules that applies to companies simply because they are big, and some other companies in that space *are* underdog upstarts.
    • Exercising prior restraint on a large scale is infeasible, effectively making potential liability infinite.

    Thus, any argument against such protection is tantamount to saying that the Internet and free speech we

    • by RazorSharp ( 1418697 ) on Thursday February 20, 2020 @10:23PM (#59749156)

      I like this analogy.

      I also think people should think of it this way: Suppose you have a small blog that supports comments and has a forum. You have a fairly loyal base of readers and because they discuss quite a bit on your blog, you are able to generate revenue through ads. Not a lot, but you bring in about $1,000 a month of profit. Not enough to quit your day job, but a nice little benefit for running a passion project.

      Suddenly, the law changes and YOU become liable for what these people post. Let's say your blog is about open source software and your users really like to trash certain tech billionaires. If one of those tech billionaires were to sue you--even over something frivolous that might get thrown out--it would both cost you financially and greatly inconvenience you. So you decide it's no longer worth it to allow comments and host a forum. But this was the glue that held your community of readers together. They still come to your site to read your posts, but not enough to generate any money. Soon the hosting costs exceed the income.

      There's an economic loss, free speech is stifled, and the internet becomes less diverse.

      • by dgatwood ( 11270 )

        There's an economic loss, free speech is stifled, and the internet becomes less diverse.

        Worse than that. It pretty much guarantees that only the largest companies will feel safe hosting user-generated content, which cements the oligopoly of the major players at that particular moment in time (e.g. if the law changed today, Facebook, Google, Twitter, Snap, and that's about it) and pretty much shuts everybody else out. Permanently. No more opportunities for innovative companies to shake up the industry, no

  • by WaffleMonster ( 969671 ) on Thursday February 20, 2020 @09:18PM (#59748956)

    Is quite a lame, hackable and inherently illogical concept. Lawyers are destroying this country with increasingly ridiculous theories of liability.

    • by cpt kangarooski ( 3773 ) on Thursday February 20, 2020 @10:05PM (#59749096) Homepage

      First, not all lawyers. I'm a lawyer, and I support the safe harbor completely. (I wish more sites would take advantage of it, but it can't be compulsory)

      Second, there's no increase here. These are old-time concepts. In olden times, media were responsible for everything they printed or aired, so they carefully limited what they ran and vetted it to make sure it was all good first. Distributors, such as a bookstore, would not be liable (they can't read every book) unless they knew or had reason to know of the problem with what they were distributing. Adapting this to the net, the rule was that if you moderated anything, no matter how trivial, you were responsible for everything, even things you didn't know about that went across your system; if you moderated nothing, you escaped liability, but were also inundated with crap you couldn't keep out. The safe harbor was meant to fix that -- to encourage sites to moderate by giving them freedom to do so without risking total liability for things they might have moderated incorrectly or that they should have moderated but failed to.

      • Under this system, what restrictions might apply to a 'public' forum hosting site refusing to host a women's issues forum?
        • None, probably.

          They can't discriminate against users based on gender, for example, but the site can control the topics of discussion. So I would expect that if there were, say, a discussion board about glass eyes, it couldn't be women-only. But that doesn't mean the site operator has to host discussions about non-glass-eye subjects.

  • To sue Twitter for all the harmful and offensive Tweets POTUS makes.

  • NOTE: I (of course) hate Facebook and pretty much all so-called 'social media' because I think it's toxic and a trap and exists to steal from you.
    HOWEVER: If you make them responsible for ALL user-generated content, then ALL user-generated content has to be approved by a human being before being posted on the public-facing site. Otherwise they open themselves up to a seething hell of legal liability. A company like Facebook or Twitter would have to employ millions of people as moderators, or accumulate a backlog of user posts numbering in the billions or trillions every year (a back-of-envelope version of that arithmetic follows this comment). In other words they'd be out of business in less than a week, likely as soon as the law took effect.
    Of course on the other hand such a law would spell the end to 'social media' across the board, so I wouldn't exactly be unhappy about that.
    But it would also spell the end of the Internet being Read/Write; it would become Read Only, with all content generated exclusively by companies and other 'responsible parties'. You wouldn't even be able to have customer reviews posted on your website. Hell, you might not even be able to have email anymore, depending on how such a law was written; imagine some administrator working for your ISP having to read all your private email after you send it, before it's allowed on to its destination, then an administrator at the other end having to review it before putting it in the destination's inbox!

    One must wonder at the actual intent of such a suggestion, and here's one theory: Maybe they want to make the Internet 'Read Only'. Then they'd have total control over all of it, and we'd have none.
    Of course someone like me would just cancel it immediately, and I suspect most of you would do the same.
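    A back-of-envelope version of the staffing arithmetic above; every input is an assumption chosen for illustration, not a measured figure:

      POSTS_PER_DAY = 3_000_000_000     # assumed daily posts on a Facebook-scale site
      SECONDS_PER_REVIEW = 30           # assumed human review time per post
      SHIFT_SECONDS = 8 * 3600          # one eight-hour moderator shift

      reviews_per_moderator_per_day = SHIFT_SECONDS // SECONDS_PER_REVIEW  # 960
      moderators_needed = POSTS_PER_DAY / reviews_per_moderator_per_day

      print(f"{moderators_needed:,.0f} full-time moderators")  # ~3,125,000

    Even with generous assumptions, pre-approval lands in the millions of full-time reviewers, which is the comment's point.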
    • But it would also spell the end of the Internet being Read/Write; it would become Read Only, with all content generated exclusively by companies and other 'responsible parties'

      You're completely wrong. Anybody can post anything to the Internet that they want on their own server. Anybody. You can set up a web site today for pennies a month, and you'll always be able to do it. The idea that content can only be posted through Giant Super Ultra Mega Omni Corp is nonsense.
      • Okay, but we're talking theoretically here, not the real world right now.
        If they made all Internet sites responsible for their content, then depending on how such a (fucked-up) law was written, the hosting company might have to 'approve' all your content before they'd allow it to 'go live'.
        It doesn't matter, really; this is all just theoretical conversation. The thing that puts the kibosh on any nonsense like this is the First Amendment, really.
      • Anybody can post anything to the Internet that they want on their own server. Anybody. You can set up a web site today for pennies a month, and you'll always be able to do it.

        And that's great, but most people can't build or buy a server, house it somewhere, get good connectivity, and pay to run and maintain it -- especially not for the sole purpose of sharing a video of what their cat did. It certainly costs more than pennies a month, and would also fragment the net terribly making communication harder. Say what you will about the major social media sites -- I sure don't care for them -- but they do actually enable people to communicate with one another better than we managed

        • by DogDude ( 805747 )
          A. Yes they can. I have one running in my house. It costs pennies per month to run.

          B. If you want to pay for server space, you can do that, too. https://www.nearlyfreespeech.n... [nearlyfreespeech.net]

          So yeah, maybe Youtube wouldn't exist any more. I don't think that's a bad thing. The Internet has turned into a cesspool precisely because nobody is liable for anything they do. It's pretty insane.
          • Ooh, a house -- There certainly aren't any people who don't have those. And also everyone in the world gets to enjoy the same standard of living as you, and doesn't have to prioritize what they spend money on.

            Re: server space, nope -- the host would risk liability.

            nobody is liable for anything they do

            Nope! Everyone is liable for what they do. For example your post above, you're liable for it. It's Slashdot that isn't liable. Which is sensible since they didn't write it, they don't know what's in it, and they have no particular reason to th

            • You can run a server off a raspberry pi with PV cells and a battery in your backpack. You can carry your forum with you anywhere.
            • by DogDude ( 805747 )
              That's the nice thing about the safe harbor. It protects sites from liability caused by third parties, but doesn't protect anyone from liability they cause themselves.

              That's NOT nice. That's exactly why the Internet has turned into a toxic sewer filled with Russian misinformation, and regular people being absolutely awful to other people: There's no responsibility. This model of the Internet clearly does not work.
              • Well, option one is no moderation and you get more of that; option two is no user posting at all -- which either means people can't talk to each other, or they go to other fora in other jurisdictions, which may be worse (like if all the right wingers move from Facebook to a site hosted by VKontakte). Option three is the middle road, which you seem to dislike, where sites may moderate but are not obligated to. That's the one we're on now.

                What would you suggest? Please let us know how much speech you think th

                • by DogDude ( 805747 )
                  People can say whatever they want (within certain guidelines), but they have to be responsible for what they say. The Internet has turned into garbage precisely because people can and do say whatever they please, with zero repercussions. The Internet is not a place where productive speech happens, in my opinion. I would suggest that the Internet has done much more to harm the public good than it has done to help it.

                  I think that every device connected to the Internet is the responsibility of whoever o
                  • People can say whatever they want (within certain guidelines), but they have to be responsible for what they say. The Internet has turned into garbage precisely because people can and do say whatever they please, with zero repercussions.

                    This is true now. To borrow my example from elsewhere, if I were to defame you by saying you fuck goats, you could sue me for libel. You couldn't sue Slashdot for 'publishing' my libelous statement though. This means that there is not much point to tracking me down only to find that I haven't got enough assets to pay a judgment for the reputational harm I would have done to you, less the cost of suing. Slashdot presumably has more money, but you can't get at it. And maybe it really didn't harm you -- aft

                    • by DogDude ( 805747 )
                      That's what usually happens. Most harms are minor and not worth the trouble, so no one responsible for them is actually held to account.

                      I disagree. We have a very large problem with a very large part of the population being dangerously misinformed. And right now, there is no way to hold anybody accountable.

                      . So he can sell anything without facing liability unless he has or reasonably should have actual knowledge of something in a book giving rise to liability, whereupon he needs to stop selling tha
          • by radl33t ( 900691 )
            Pennies per month? Unlikely. Even a mobile phone serving pages is going to cost dozens of pennies per month to run.
    • by Falos ( 2905315 )

      They should trial it. Clog the facetweets for a week.

      "Your post is 34768295 in the queue of 817451345 awaiting approval in compliance with portions of the Patriotic Internet Freedom Protection Safety Rights Act, sec.2(a), thank you for your patience."

      Either the political theater (hey everyone, I'm fixing the internet) stops or the socnets do. Win-win.

      Okay, one particular subset of theater, but still.

  • by AK Marc ( 707885 ) on Thursday February 20, 2020 @09:23PM (#59748968)
    Facebook has an explicit policy of censorship; as such, anything that gets by should be 100% their responsibility. I don't know which service they are talking about for Alphabet, as Google does so many things. Facebook does a good job of banning anyone who calls a Nazi a Nazi. Their filters look to be keyword-based, not content-based: saying "you shouldn't call someone a *****" will get you banned if you spell out *****. Even if you are speaking against it, the filters simply don't work.

    Sarcasm and satire are banned, as they could sound like the real thing.

    Facebook bans 20k users per day. Some bans are temporary, others permanent. All are proof that Facebook doesn't practice common-carrier standards of neutrality: they take an active interest in content and are responsible for what they let through. (A toy sketch of this kind of context-blind keyword filtering follows.)
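    A toy version of that context-blind keyword filtering; "slurword" is a placeholder token, not a real list, and this is not Facebook's actual system:

      BANNED_WORDS = {"slurword"}

      def naive_filter(post: str) -> bool:
          """True if the post trips the filter -- a bare keyword match."""
          tokens = {word.strip('.,!?"\'').lower() for word in post.split()}
          return bool(tokens & BANNED_WORDS)

      print(naive_filter("those slurword people should leave"))        # True: abuse, caught
      print(naive_filter('never call anyone a "slurword"'))            # True: condemnation, flagged anyway
      print(naive_filter("a slur implied by context, word unspoken"))  # False: the real thing, missed

    Matching on tokens alone cannot distinguish use from mention, which is exactly the failure mode described above.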
  • They WERE a platform, they ARE a publisher. Section 230 is for platforms.

  • by Strill ( 6019874 ) on Thursday February 20, 2020 @10:02PM (#59749092)

    Facebook and Google should not be liable for user posts, as long as they don't curate and editorialize those posts. If they choose to censor users, or to selectively promote some posts over others, they should lose those protections.

    • So companies should be punished if they don't censor the way politicians want?

      Doesn't sound so great, or constitutional, put that way.

      • by Strill ( 6019874 )

        I'm saying that social media companies should be held responsible for their own speech. If they leave user posts alone, then the user posts are the user's speech. If they editorialize the user posts, then the user posts become the company's speech.

    • And, of course, promoting some posts over others is exactly what they do, and they should not enjoy those protections. I don't regard them as a simple carrier, and there is an urgent need to regulate, especially Facebook, which is a threat to society in its current form.

      Should Facebook be liable when millions of people take dangerous medical advice from some quack's Facebook page that Facebook has been repeatedly notified of but refused to take down --- and that they show prominently because it is so popular?

      I

  • by peterofoz ( 1038508 ) on Thursday February 20, 2020 @10:45PM (#59749210) Homepage Journal
    Generally, the social media platforms are carriers and should be immune from prosecution for what their users post. Having said that, to earn that immunity they would also need to provide real verified identities, much as speakers in a town hall do for public comments. The forums have it in their best interest to maintain a somewhat civil environment for discourse. Bad actors espousing violence, extremist views and such would shy away from the real-id aspect of the platform in favor of anonymous platforms. Further, a self-regulating platform, much like Slashdot with its crowd-sourced +/- post ratings, should normalize the conversation over time to acceptable social norms within each post-publishing visibility domain. What one group finds acceptable, another group far away may find offensive; one size does not fit all.
  • I like how the retards who are pissy over Section 230 think that if websites can be sued over user content, the moderation of their racist trolling will somehow end.

    No, the moderation policies will become ten times more aggressive, and the social-justice ideology that drives you bonkers will be the only controversial opinion allowed. No one will risk being sued because a right-wing terrorist was allowed to speak his mind and no one acted on the warning signs. You are only fucking yourselves and increasing

  • by dmt0 ( 1295725 ) on Thursday February 20, 2020 @11:11PM (#59749290)

    Should HTTP be responsible for censoring websites?
    That's how it should be with social nets. Facebook should be replaced with a protocol and become completely distributed (a toy sketch of the idea follows). Then the issue of censorship would not arise in the first place.
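    A toy sketch of the idea, using only the standard library; the field names are invented, and the shared-secret HMAC merely stands in for the public-key signatures (e.g. Ed25519) a real federated protocol would use:

      import hashlib, hmac, json, time

      def make_post(author_id: str, body: str, key: bytes) -> dict:
          """Author a post as plain signed data that any relay can carry."""
          post = {"author": author_id, "body": body, "ts": time.time()}
          payload = json.dumps(post, sort_keys=True).encode()
          post["sig"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
          return post

      def verify(post: dict, key: bytes) -> bool:
          """Any node can check a post's integrity without trusting a platform."""
          unsigned = {k: v for k, v in post.items() if k != "sig"}
          payload = json.dumps(unsigned, sort_keys=True).encode()
          expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
          return hmac.compare_digest(expected, post["sig"])

      key = b"author-held secret"
      p = make_post("alice@node.example", "hello, federation", key)
      assert verify(p, key)        # relays can verify without a central server
      p["body"] = "tampered"
      assert not verify(p, key)    # tampering in transit is detectable

    With posts as portable signed data, taking something down becomes a per-node filtering choice rather than a platform decision, which is the comment's point about censorship not arising.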

  • Google indexes content, Facebook hosts it.

    Big fucking difference.

  • by TVmisGuided ( 151197 ) on Friday February 21, 2020 @10:22AM (#59750388) Homepage

    Is a railroad responsible for what the graffiti on the side of a rail car says?

  • ... welcome state gun regulations regarding license to carry, making purchases of weapons and ammunition.

  • by Chas ( 5144 ) on Friday February 21, 2020 @01:09PM (#59751068) Homepage Journal

    Are they going to be a neutral carrier with no editorial control?

    If "no", then YES they SHOULD be liable.

    If "yes", then NO, they should NOT be liable.

    They cannot continue to have their cake AND eat it.

  • There's always an issue where one party feels such-and-such has gone too far but the other party feels fine about it; if there is a transfer of power, it becomes the other way around. Now we have online fictional articles posing as fact (fictional characters are more interesting than real people, which is why fiction is popular) getting more attention and manipulating real-world thinking.

    Then there's the issue where Google and Facebook have tens (hundreds?) of billions in extra cash, tax free, in their basements to d

  • Google and Facebook should not be liable for user posts.
