Stricter Rules for Internet Platforms? What are the Alternatives... (acm.org) 83

A law professor serving on the EFF's board of directors (and advisory boards for the Electronic Privacy Information Center and the Center for Democracy and Technology) offers this analysis of "the push for stricter rules for internet platforms," reviewing proposed changes to the liability-limiting Section 230 of the Communications Decency Act — and speculating about what the changes would accomplish: Short of repeal, several initiatives aim to change section 230. Eleven bills have been introduced in the Senate and nine in the House of Representatives to amend section 230 in various ways.... Some would widen the categories of harmful conduct for which section 230 immunity is unavailable. At present, section 230 does not apply to user-posted content that violates federal criminal law, infringes intellectual property rights, or facilitates sex trafficking. One proposal would add to this list violations of federal civil laws.

Some bills would condition section 230 immunity on compliance with certain conditions or make it unavailable if the platforms engage in behavioral advertising. Others would require platforms to spell out their content moderation policies with particularity in their terms of service (TOS) and would limit section 230 immunity to TOS violations. Still others would allow users whose content was taken down in "bad faith" to bring a lawsuit to challenge this and be awarded $5,000 if the challenge was successful. Some bills would impose due process requirements on platforms concerning removal of user-posted content. Other bills seek to regulate platform algorithms in the hope of stopping the spread of extremist content or in the hope of eliminating biases...

Neither legislation nor an FCC rule-making may be necessary to significantly curtail section 230 as a shield from liability. Conservative Justice Thomas has recently suggested a reinterpretation of section 230 that would support imposing liability on Internet platforms as "distributors" of harmful content... Section 230, after all, shields these services from liability as "speakers" and "publishers," but is silent about possible "distributor" liability. Endorsing this interpretation would be akin to adopting the notice-and-takedown rules that apply when platforms host user-uploaded files that infringe copyrights.

Thanks to Slashdot reader Beeftopia for sharing the article, which ultimately concludes:

- Notice-and-takedown regimes have long been problematic because false or mistaken notices are common and platforms often quickly take down challenged content, even if it is lawful, to avoid liability...

- For the most part, these platforms promote free speech interests of their users in a responsible way. Startup and small nonprofit platforms would be adversely affected by some of the proposed changes insofar as the changes would enable more lawsuits against platforms for third-party content. Fighting lawsuits is costly, even if one wins on the merits.

- Much of the fuel for the proposed changes to section 230 has come from conservative politicians who are no longer in control of the Senate.

- The next Congress will have a lot of work to do. Section 230 reform is unlikely to be a high priority in the near term. Yet, some adjustments to that law seem quite likely over time because platforms are widely viewed as having too much power over users' speech and are not transparent or consistent about their policies and practices.


Comments Filter:
  • Still others would allow users whose content was taken down in "bad faith" to bring a lawsuit to challenge this and be awarded $5,000 if the challenge was successful.

    What about users who post content in "bad faith", will someone get to sue them too?

    • It's pretty bad when you have lawmakers writing this stuff with such a poor understanding of how the internet works. These platforms are privately owned and as a user, you're a guest, just the same as if you've set foot inside your local mall. The owner is perfectly within their rights to toss you out if you can't follow the rules, or to tell you to turn off your boombox because you're annoying other patrons.

      I'm not really sure where folks started getting this idea that they have a God-given right to post

      • Re:And ... (Score:5, Insightful)

        by Fringe ( 6096 ) on Sunday March 21, 2021 @06:13PM (#61183524)

        It really isn't like that at all. That mall you set foot in cannot even allow racist behavior. If you trip and fall there, you can sue it for liability. But Facebook, while private (as you say), has S.230 protections, making it closer to a public beach. Except it's a public beach that can be racist.

        Right now, they have the best of both worlds; they can arbitrarily censor, and simultaneously claim immunity for posts that would otherwise violate the law. The trouble is that they are moderating out what they disagree with, under S.230, while not moderating out violations. Definitely not a shopping mall.

        • That mall you set foot in, ... If you trip and fall there, you can sue it for liability.

          Well... perhaps for some things, but not so much if you trip on your own untied shoelace, etc ...

        • Re: (Score:2, Interesting)

          by Powercntrl ( 458442 )

          A slip and fall injury lawsuit has nothing to do with free speech.

          The fact is, in meatspace, you have very limited free speech within private establishments. It's not generally a problem because most people who aren't familiar with each other tend to ignore each other while out in public, and disruptive behavior is swiftly dealt with. Most people seem to understand that you need to be on public property if your intentions are to rabble-rouse.

          That's the issue: If you think the internet needs something ana

          • However, the government does have it within their power to shut down social media companies in an instant - just repeal S230. Then *nobody* could afford the self-policing necessary to avoid drowning under a tide of lawsuits. (even perfect policing probably wouldn't protect them)

            Since social media has proven themselves unable (or unwilling) to limit censorship, and quite happy to promote the spread of severely destructive misinformation on their platforms so long as it improves engagement, it would be compl

            • Facebook is exercising their own discretion and voice to amplify the impact of the posts they feed you first, at the expense of the ones that fall off the bottom of the list.

              You've also described how Google's page rank system works, and pretty much how the entire entertainment industry has operated since its inception. It's only a fairly recent thing that people have gotten it into their minds that they have some sort of God-given right to access someone else's platform. If you don't like the way Facebook is run, don't use it. If you have something to say that they don't want on their platform, host it yourself. Facebook doesn't owe you access to their audience any more tha

              • Re:And ... (Score:4, Insightful)

                by Immerman ( 2627577 ) on Sunday March 21, 2021 @10:35PM (#61184086)

                >You've also described how Google's page rank system works
                Yes, absolutely - and they should be held accountable for how they wield their power.

                >, and pretty much how the entire entertainment industry has operated since its inception
                A very different situation, since the entertainment industry *is* held legally accountable for what it publishes, and thus has an incentive to police the content it publishes (at least where libel and slander are concerned; some accountability for misinformation that harms people other than those being talked about might be in order)

                >If you have something to say that they don't want on their platform, host it yourself.
                Honestly, I have no problem with that, I only concede it's an issue worth considering as we rewrite the rules. I do however have a BIG problem with Facebook being able to publish whatever misinformation they want completely free of any repercussions, hiding behind section 230 because someone else wrote the words they choose to shout from the rooftops. They're wielding the power of a publisher, they should face the legal responsibility as well. If a newspaper publishes a slanderous letter to the editor, they can be (and often have been) dragged to court over it.

            • Given that environment, it seems a perfectly reasonable compromise to say "if you want to be in this business, you must address these issues".

              Nope! With only the narrowest of exceptions, the government has no business whatsoever in deciding what sort of speech is okay, and what isn't, particularly where favoring or disfavoring particular opinions is involved. Their best course is to butt out.

              Seems to me there's a very strong argument to make that the latter is a form of publication, and thus deserving of fewer protections.

              Seems to me that you're ignorant of the law. Making even the teeniest, tiniest editorial decision makes it publication. Prodigy just wanted to be family friendly, so they tried to filter out curse words and such, and wound up having 100% liability for every

              • Re: (Score:3, Insightful)

                by vyvepe ( 809573 )

                Happily we have 47 USC 230 which states very clearly that no site or user is to be treated as a publisher, no matter what. This gives them the freedom to remove things -- like spam, hate speech, porn, malware, etc -- without being liable for whatever they fail to remove.

                In other words, you give companies the right to censor without them being responsible for the content. No surprise publishers do not like it. It really is unfair.

                • In other words, you give companies right to censor without them being responsible for the content. No surprise publishers do not like it. It really is unfair.

                  No! Everyone gets to moderate online content (to the extent they want to and have authority to) without being held liable for the content that stays up that they didn't create themselves.

                  So a traditional newspaper publisher like the New York Times gets to have an online comments section if they wish (and in fact they do) and can also run any other sort of user generated content online.

                  There's no unfairness at all. Nothing is stopping older offline media from running social media sites online or ad networks

            • However, the government does have it within their power to shut down social media companies in an instant - just repeal S230

              This is true, but I think the collateral damage would be enormous (and hence unlikely to actually become law). Slashdot is just one of the sites that would be affected. Nobody could afford the potential liability of any user-posted comments if they want to limit any posts (i.e. the moment they moderate or edit, they become publishers along with the liability of being a publisher). This includes sites like GitHub (it's defamation to post that bug report!), YouTube, and newspapers with user-comments, WordPres

              • I agree - my preferred solution, assuming we can't find some method of curtailing the rapid spread of misinformation on social media, would be to just shut down sites that actually act more like publishers than moderated forums. E.g. revise section 230 to apply to online forums but NOT to social media.

                I think the dividing line between the two is not difficult to draw:
                - Social Media determines what content you see and in what order, using their own voice to promote specific content by putting it near the

      • It's pretty bad when you have lawmakers writing this stuff with such a poor understanding of how the internet works.

        ... It's a series of tubes, right?

        • by rtb61 ( 674572 )

          It is real name social media versus pseudonym social media. Pseudonym media used to be troublesome, but it did not really cause that much of a problem: if illegal content went up, the law could uncover the real name and act, but mostly people ignored it and most everyone had fun and messed around.

          Real name social media twisted that all up, made it extremely destructive, and people became addicted because they had to protect their identity, now exposed to the entire connected general public. It then became cur

      • The purpose of 230 was to create the legal fiction that your post under your name was your property even if it is served up by my hardware. The reasoning being that we don't really want to restrict freedom of the press to only those who can purchase a printing press, and paper, and ink, and everything else to run it.

        That's an idealistic interpretation of course. A much bigger reason was that big tech (it existed back then too) didn't want to get sued out of existence while monetizing the internets. But the

      • It has reached a point where corporations are so integrated with society and so essential for certain people that the concept of a "private company" needs to be given a long hard look. A website, or any place that has public access, should not be able to pick and choose when to enforce its rules. I mean, nothing stops me enforcing a "No sneakers" rule but only applying it to Asian people, or applying a "No black clothing" rule to orthodox Jews. Sure, you can claim that I only enforced it against you because

    • What about internet services that fully operate outside of US jurisdiction? Section 230 isn't international law...

      Best the US can do in those circumstances is block them - or at least attempt to. Which opens an interesting kettle of fish for sure.

      • I think the opposite problem is a bigger problem. What if another region wants to enforce different rules than the US?

        The US has enormous levers that enforce their laws when they want to. Look at what happened with Europe and the Iranian nuclear deal - Europe couldn't effectively get around US sanctions because too much of the banking system requires using US dollars and anybody who uses that system can be sanctioned by the US (meaning any company big enough to really matter). This is despite the European e

        • Which is why Instex was created - to remove US dominance in this area.

          And it's also why companies like Airbus are starting to find alternatives to US suppliers, so they can't be forced to not sell to countries or entities that the US dislikes.

    • You would have to prove who actually posted the content and that it was indeed in bad faith. I imagine bouncing through the Tor network before posting with my fake account while using a public wifi would be sufficient to cover my tracks.

      The only proposed bill up there that I think would be helpful is the one that expands liability protection against civil lawsuits. I interpret that as a 3rd party is not allowed to sue the platform because Billy said something bad, terrible, horrible, whatever. The most I

    • Of course, what is "bad faith"? Was "A Modest Proposal" written in "bad faith"?
    • We already have defamation laws to cover that sort of thing.
      • We already have defamation laws to cover that sort of thing.

        I was thinking of things clearly untrue and/or misleading, but not libelous. For example, almost everything the 45th President posted ... :-)

  • by Fringe ( 6096 ) on Sunday March 21, 2021 @06:07PM (#61183502)

    The initial challenge with S.230 was that it provides "common carrier" status to providers - e.g. discussion boards, Facebook, Twitter - that then act in a biased way, refuting the "common" part of "common carrier". They often do this in the name of "decency" or "false/fake news", but these are not generally legal distinctions. (Meaning they don't have a legal definition, not that claiming them is unlawful.)

    But, upon realizing that censorship can work, the other side launched a series of crusades to render ever more words and ideas forbidden. And to rail against any "fake news" they don't like, regardless of the science. (Science is suspect too, dontchaknow.) A year after certain ideas about, e.g., a major virus were branded fake, we see that those "fake" ideas were in fact the real ones... but nobody is going after the initial censors on sites such as Facebook and Nextdoor.

    So, roughly from the right, if the carrier refuses to be common, why should they have common carrier protections?

    And, roughly from the left, if they're allowed and even expected to moderate out offensive content, and yet refuse to, why shouldn't we hold them accountable? After all, they are offending us!

    I don't see a modification or fine-tuning that will work. S.230 has to go.

    • Re: (Score:3, Insightful)

      by Powercntrl ( 458442 )

      Some people keep making this argument because they want to see Facebook and Twitter punished, but they don't realize what would actually happen if section 230 went *poof*.

      Let's say you decided to roll your own right-wing forum as a weekend warrior project. Turns out, it doesn't take long before you start getting tons of posts that don't really jibe with the audience you're trying to attract. Gee, wouldn't it have been nice to have the ability to remove posts you didn't want on your forum, and s

      • The gay cake case keeps coming up in discussions of 230, despite having at best a superficial resemblance. It would only make sense as a comparison if, to post something on social media, you had to communicate your sentiment to an employee who would then compose the actual post on your behalf.

        I agree that people are too quickly calling for 230 to be culled. Removing 230, and not putting something comparable in its place, would make user generated content incredibly risky.

        • The gay cake case keeps coming up in discussions of 230

          I could've used the Democrats considering to abolish the filibuster as another example. Take your pick. It's just any change you assume will help your side, but lack the foresight to realize that the other side could easily weaponize it against you.

          230 in its present form benefits anyone who has something to say on the internet, and understands that Facebook/Twitter/AWS are under no obligation to plant your political signs in their front yards. Free speech is the right to speak, it's not the right to fre

          • What use is a right of speech without the ability to distribute it?

            Especially AWS is relevant in that respect. If you can't get hosting and DoS protection you can't speak on the internet.

            • What use is a right of speech without the ability to distribute it?

              It works the same way as the 2nd amendment. You have the right to own a gun, but paying for it is still your responsibility. Expecting a platform to be forced to carry your speech is the same as expecting Smith and Wesson to provide you with a free gun.

              "So, if you don't have money, you don't have free speech?" Yeah, and if you don't have money, you don't get your gun, either. It's free as in freedom, not free as in gratis.

              • So the 1st Amendment doesn't apply to poor people?
                And for that matter, anti-fascism, i.e. democracy, doesn't apply to poor people as well?

                • Of course it does. It's just not handed to you. If you have an unpopular idea, then you need to pay to get it heard, or you need to find a platform willing to post your unpopular idea for free. Just don't expect that the unpopular idea will be applauded. Go back 50 years and the conspiracy theories were dealt with using badly mimeographed newsletters, and often requiring a fee to cover the cost of distribution (even if it's "send a self addressed stamped envelope"). If you're too far out of the mainstr

            • Not really. Consider just how successful The Pirate Bay has been at staying online all these years while under continual attack by the copyright cartels. AWS is merely one cloud provider among three major ones and many minor ones. You'll be able to find one somewhere that agrees with you, and they do connect to common carriers. In addition to TPB: Infowars, Breitbart, Stormfront, the Blaze, Fox News, and plenty of others remain online. It's just Parler that went beyond the pale on 1/6 and got booted.

              And AWS's

            • Especially AWS is relevant in that respect

              No it isn't. Parler was incompetent, that's all. Plenty of other people have somehow managed with a massive target on their back. Or do you think 4chan and the pirate bay are on Amazon?

            • by AmiMoJo ( 196126 )

              You can speak, just not as loudly as you might like.

              That's how it works in the real world too. If all you have is a soapbox and a megaphone then you can't reach as many people as someone who has a prime time show on national TV. Doesn't mean someone has to put you on TV, or that you have been censored and your free speech violated.

            • by Bert64 ( 520050 )

              You can host from home on a small scale, especially if your content is mostly text based.
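              For what it's worth, serving a small static text site really is trivial with nothing but the Python standard library. A minimal sketch (the "site" directory and port are placeholders, and a real home deployment would still want TLS, a domain, and some DoS protection in front):

```python
# Minimal sketch of self-hosting a small, text-based static site using
# only Python's standard library. The "site" directory and port 8080
# are placeholder assumptions, not a production setup.
import http.server
import socketserver
import threading
from functools import partial


def serve(directory="site", port=8080):
    """Serve static files from `directory` on `port` in a background
    thread; returns the server object so the caller can shut it down."""
    handler = partial(http.server.SimpleHTTPRequestHandler,
                      directory=directory)
    httpd = socketserver.TCPServer(("", port), handler)
    threading.Thread(target=httpd.serve_forever, daemon=True).start()
    return httpd
```

Passing port 0 lets the OS pick a free port, which is handy for local testing before you forward a real one through your router.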

            • The ability to distribute free speech is always there. But you're not necessarily able to use the exact method that you want in the way that you want. Do you want a full front-page advertisement in the New York Times? Then be prepared to pay a shitload of money for that privilege; otherwise use the standard rates for a tiny spot on page 30 of the advertiser supplement. Freedom of speech does not mean that all speech will become mainstream speech.

              Right now, *everyone* can speak on the internet. Their au

          • Why not just stick with the matter under discussion? The problem with the private company argument is that 230 explicitly relieves these companies of the obligations ordinarily applying to publishers. There is a need to clarify what kind of filtering or editing sites can perform without crossing that line. There is a point reached where a moderator becomes an editor. Despite this I have repeatedly cautioned against fiddling with 230. I do not trust politicians to not ruin it, due to incompetence or corrupti

            • There is a need to clarify what kind of filtering or editing sites can perform without crossing that line.

              No, there isn't. It's already been clarified. S.230 of the CDA clarified it.

      • Let's say you decided to roll your own right-wing forum as a weekend warrior project.

        Or as a former US President: Trump To Return To Social Media On His Own Platform Within Three Months, Senior Advisor Says [forbes.com]

        "This is something that I think will be the hottest ticket in social media," Miller said Sunday. "It's going to completely redefine the game, and everybody's going to be waiting and watching to see what exactly President Trump does, but it will be his own platform."

        Can't wait to see what he and the politicians think about Section 230 then ...

        • by sinij ( 911942 )

          Just like with the Presidency, Trump will quickly discover this is much harder than it appears. Maybe he has enough resources to build his own Internet, but the very likely outcome is that inventing new ways to shut it down will become a competitive sport in SV, and collectively they do own the Internet.

        • You know Trump’s social media site is going to be a real shit show. Funded by user fees and hosted as cheaply as possible. Expect database dumps within a week.

          • And you just know he's not doing it for the good of the people or for some higher political ideals. He just misses his adoring fans, so it's for his ego. With his looming financial problems expect lots of solicitations for donations.

        • by AmiMoJo ( 196126 )

          We don't really have to wait, we can just look at what Trump has already said on the subject.

          Trump is all about "fairness". He thinks tech companies are very unfair because they don't like him or conservatives in general. Unfortunately his idea of being fair means censoring all the lefties and allowing very fine people like Trump to say what they like... At least until it starts an insurrection, then he will disavow.

      • And Parler did exactly this. Trolls showed up and they got banned. Not much complaining from those customers who wanted a free speech forum, because *their* ideas were being allowed so they didn't care if some opposing view was banned despite the public facing claim of "anything goes".

        Rules are fine, they just need to be well defined and applied fairly by the platform. But the platforms have struggled with this from the start. They don't want to be seen as a family unsafe product because it's bad for th

    • by Areyoukiddingme ( 1289470 ) on Sunday March 21, 2021 @06:38PM (#61183594)

      The initial challenge with S.230 was that it provides "common carrier" status to providers...

      No it doesn't. It never did. It explicitly isn't common carrier, on purpose. Common carrier status already existed. It's Title II of the Communications Act of 1934 and all of the businesses involved in the Internet took great pains (and spent millions in lobbying dollars) to make sure they were not subject to it. Common carrier does not mean what you think it means. It definitely does not mean Section 230 of the Communications Decency Act of 1996. They are very different, intentionally so. There was no reason to pass a new law if they weren't different.

      If section 230 is repealed and not replaced with something substantially similar, all user-generated content vanishes off of the Internet, instantly. Not just no new content, but all old content as well. Not even Google can afford the lawyers required to litigate the legality of it all, and they definitely wouldn't try. The Internet would become cable television, with a small, heavily restricted flea market on the side. Nothing else would survive. The threat of fantastically expensive litigation that never ends would see to that.

      Eliminating section 230 is untenable if the Internet is to survive in any recognizable form. And even the Powers That Be, much as they hate not utterly controlling the narrative, are reluctant to destroy the economic engine that the Internet has become. But sure, if what you want is cable TV, a flea market, and some private business to business communications that you're not allowed to see, keep advocating for the repeal of section 230.

      • by Whibla ( 210729 )

        If section 230 is repealed and not replaced with something substantially similar, all user-generated content vanishes off of the Internet, instantly. Not just no new content, but all old content as well. Not even Google can afford the lawyers required to litigate the legality of it all, and they definitely wouldn't try. The Internet would become cable television, with a small, heavily restricted flea market on the side. Nothing else would survive. The threat of fantastically expensive litigation that never ends would see to that.

        Eliminating section 230 is untenable if the Internet is to survive in any recognizable form.

        What about the internet that doesn't exist within the borders of the USA? How exactly does that survive without Section 230?

        Is it possible that there are other 'assumptions', legal considerations, or maybe even cultural 'norms' specific to the USA that are equally relevant to the question? If so, why is all the focus on Section 230?

      • by vyvepe ( 809573 )

        If section 230 is repealed and not replaced with something substantially similar, all user-generated content vanishes off of the Internet, instantly.

        It will not vanish. There are two easy options for user content to continue. One is that hosting companies will not censor users at all - that includes opaque ranking based on unknown criteria that users cannot predict. The other option is that users will host their content themselves. There is no need to worry about user content disappearing. There may be less of it until better platforms appear, but that is a temporary setback.

        • There are two easy options for user content to continue. One is that hosting companies will not censor users at all.

          And all meaningful content vanishes under an avalanche of spam. That's not even remotely a solution.

          Other option is that users will host their content themselves.

          And get sued into oblivion by a cottage industry of lawyers like the sort who decided to get rich extorting people for porn piracy, only without the fraud. Or by cleverly camouflaged minions of Big Content who no longer have to tolerate any competition at all for their paid material.

          The most innocuous of that material can stay up for a time, but no individual user can afford the cost of hosting their conten

          • by vyvepe ( 809573 )

            User defined filters can get rid of spam.
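            To sketch what that could look like (a purely hypothetical client-side keyword filter under the user's own control, not any real platform's API):

```python
# Hypothetical sketch of a user-defined filter: each reader keeps a
# personal blocklist of patterns and hides matching posts client-side,
# instead of relying on the host to censor for everyone.
import re


def make_filter(blocked_patterns):
    """Compile the user's patterns once; return a predicate for posts."""
    compiled = [re.compile(p, re.IGNORECASE) for p in blocked_patterns]

    def is_spam(post):
        return any(rx.search(post) for rx in compiled)

    return is_spam


def visible_posts(posts, blocked_patterns):
    """Return only the posts the user has not chosen to block."""
    is_spam = make_filter(blocked_patterns)
    return [p for p in posts if not is_spam(p)]
```

The point of the design is that the blocklist lives with the user, so two readers of the same unmoderated feed can see entirely different subsets of it.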

            Don't post illegal stuff and you will not get sued. Or if you do get sued, then you can counter-sue. It is not an issue. There are many more posters than there are people starting lawsuits; only a few of the posters can be hit. There could be class-action lawsuits against companies starting mass lawsuits against individuals. If you are really scared about being sued, then do not post as an individual; post as a tiny limited liability company.

            You do not need a private computer to host

      • by AmiMoJo ( 196126 )

        One of the reasons why there are so many successful internet companies in the US is S230 protections. Other countries have weaker rules and companies trying to create user-content services have suffered for it.

        For example European tech companies are more focused on infrastructure and backend, because the dirty business of handling user content is not as well protected and thus a lot more expensive to do.

    • What part of Section 230 mentions common carrier? Hint: none. The idea behind the law was that a site owner should have protection from liability for user postings. But it does allow sites to moderate according to whatever criteria they like, allowing a whole category of sites that were on very thin ice before the law.

      I wouldn't want to have, say, a site dedicated to gardening where my only choice is

      1. Allow anything, meaning non-gardening posts, or posts about gardening for illegal activities, must be allowed

    • by stikves ( 127823 )

      The discussion is like "You can have 4chan, or the New York Times, nothing in between"

      Section 230 allows reasonable moderation (automated or otherwise) to enable neutral platforms on the Internet. However whatever is reasonable for most people will always be unreasonable for some. You don't like Fox News, and someone shares on Facebook? Take Facebook down. You don't like PBS, and someone shares it on Twitter? Again, take Twitter down.

      What could be fixed is people's attitudes towards the ideas they don't like.

  • EFF and EPIC wade too much into partisanship. They could outline what they think social media should look like and if somebody says, "so basically Gab or the one Trump is launching?" they're all like, "no, of course not."

    Regardless of the issue, the government can't (as a matter of normative political philosophy) grant an entity immunity to do things that the government itself is prohibited from doing. The same holds for mercenary war as for censorship. S.230 must be interpreted in that light for the First Amendment to be the supreme governing

    • Yes, the first amendment needs to apply to platforms, especially those in dominant positions, similar to the European model. At the same time, I’m pretty happy the goatse stuff is off /. There are limits on freedom of speech, and efforts to encourage social discourse should be allowed.

      • Yes, the first amendment needs to apply to platforms

        It does. They are fully protected by the First Amendment just like anyone else, and the government cannot ignore the First Amendment and try to control them.

        Much better than the old days when, for example, the Supreme Court held that movies were not protected by the First Amendment.

        Or did you mean something much much stupider?

  • Let's make them repeal the CDA and demand a neutral open internet. That is the only acceptable "alternative"

  • It is ok to have individual companies set their own policies, but the problem is that the platforms have become essentially monopolies and they need to be broken up to manage their spheres of influence. The rules end up either being gamed or simply ineffective at influencing the desired outcome.

    The same issue holds true for Visa/Mastercard, SWIFT, and a few other organizations outside of technology.

  • If authoritarian governments want to survive, they need to limit the spread of ideas that challenge authoritarianism. I've begun to wonder if democracies which want to survive need to - and always have, one way or another - limit the spread of ideas that challenge democracy. There's a curious coda to the Wikipedia article on Nazi book burnings, for example, in which millions of Nazi and Nazi-adjacent books were destroyed after the war by the Allies and they admitted that the order in principle was no diff [wikipedia.org]
  • by rsilvergun ( 571051 ) on Sunday March 21, 2021 @06:57PM (#61183650)
    the entire framing of this is that we must do *something*. It's wrong. Section 230 is as good as it gets.

    Right now some very, very wealthy and powerful people are annoyed that they don't have total control over a major communications platform. In the past they could just buy up any newspaper, radio or TV station and then only their message would get out. The Internet changed that, and Section 230, along with Net Neutrality, is what made that possible.

    Attacks on S230 are them reasserting control. There is no way in hell in this age of dark money that we're going to get anything better. Instead we're going to get the DMCA 2.0. Where a single takedown notice gets you silenced and 3 of them get you silenced permanently. People with money will be able to get around that and the rest of us will be told to sit down and shut up.

    Make no mistake: Attacks on S230 exist only to censor you. Anyone who tells you differently is either lying or being lied to.
    • by sinij ( 911942 )

      Make no mistake: Attacks on S230 exist only to censor you.

      What is your message to conservatives that are already being censored by the existing system?

      • by Anonymous Coward

        What is your message to conservatives that are already being censored by the existing system?

        Well, to start with, it's going to get much worse for said conservatives if Section 230 is repealed.

      • Generally they're not being kicked off of the major platforms. I'd be happier if they were, but generally they're not. But if they're really upset, go to a different platform. No one's stopping them.

      • by Anonymous Coward

        My message to them is that they demonstrably are not being censored. They are, in fact, more prevalent on Facebook than any other political faction. [usatoday.com] How could they possibly be the most censored when there are more of them than anyone else?

      • You have 3 24/7 news networks, all of talk radio, 60-70% of local TV stations, dozens of professionally made YouTube channels, 80% of Facebook, and most local newspapers. Your message is the default in political discourse because it benefits the 1%, so they heavily promote it on your behalf.

        My message is that at no point in time is any conservative in America censored. Ever. You own the media in America. And now you're trying to repeal Section 230 so you can use DMCA style laws to silence your opposition.
        • by sinij ( 911942 )

          My message is that at no point in time is any conservative in America censored. Ever.

          This is a counter-factual view of reality. You have to redefine "conservative" and/or "censored" for this to be a true statement. So what exactly do you mean by these words?

      • the message is that you're not being censored. You have very valid views most of the time, but there are extremists that are damaging you from within. Those extremists are scaring the mainstream, and media has *always* been about the mainstream. Ideas on the fringe will have to resort to alternative means of getting the message out; there is no requirement or responsibility for the government or major companies to offer you a free ride. Even some of the big examples aren't even posts or mess

    • Comment removed based on user account deletion
  • by Beeftopia ( 1846720 ) on Sunday March 21, 2021 @08:35PM (#61183848)

    From the CACM article: [acm.org]

    Section 230 provides Internet platforms with a shield from liability rising from content posted by others.

    It started in 1995, when a Prodigy user accused the company Stratton Oakmont of securities fraud. Stratton Oakmont sued Prodigy for defamation and won. In 1996, Section 230 was enacted as part of an overhaul of U.S. telecommunications law.

    The first court decision interpreting section 230 came in 1997, with Zeran v. America Online. Someone posted fake ads on AOL offering t-shirts glorifying the 1995 Oklahoma City bombing and listing Ken Zeran's telephone number. Zeran received hundreds of phone calls, including death threats. AOL would not take down the ads despite Zeran's requests. Zeran sued AOL. AOL asked the court to dismiss the suit based on the new section 230. AOL won and the case was dismissed.

    Relying on Zeran, online platforms have routinely avoided legal liability through 230 defenses. Numerous cases have featured very sympathetic plaintiffs, such as victims of revenge porn, fraudulent ads, and professional defamation, and some unsympathetic defendants who seem to have encouraged or tolerated harmful postings.

    In late 2020, the Senate introduced a bill that would repeal 230 outright. Civil liberties groups, Internet platforms, and industry associations still support 230, as do Senator Wyden and former Congressman Chris Cox, who co-sponsored the bill that became 230. Wyden and Cox have pointed out that an overwhelming majority of the 200 million U.S.-based Internet platforms depend on 230 to protect them against unwarranted lawsuits by disgruntled users and those who may have been harmed by user-posted content of which the platforms were unaware and over which they had no control.

  • All we need to do is make is so that "recommendations" are not shielded from liability.

    Then people can post freely, comment freely, share all the crazy they want with their friends, but the tech companies won't then "recommend" it to other people unless it has been vetted by somebody they trust.

    • "I recommend you kill yourself and everyone else kills you." -- Some Internet person who was triggerend and could not feel empathy towards you because you don't even feel like a real person to him, nor does he to you.

      Yes, I agree, but also, no, it's not quite that easy. ;)

  • How about you're not our nanny, and have the balls to let us decide.
    And then stop raising us in a way that makes us forever-immature, passive-thinking, knee-jerking, offendable trigger fests with no scientific understanding to speak of.

    How about you give us ALL the tools to make a good decision. And you don't get to decide what is "good". Because you aren't any more grown-up either.

    One of Murphy's laws is:
    "If you make something simpler, nature just invents a better idiot."
    That is because we are in an environment

  • NO! The internet is supposed to be an open platform for the free exchange of thought and idea. Keep all of your little grubby authoritarian paws off it. We will not cede control to you.

  • Comment removed based on user account deletion
