Social Networks | Democrats | United States

Democratic Bill Would Suspend Section 230 Protections When Social Networks Boost Anti-vax Conspiracies (techcrunch.com) 282

Two Democratic senators introduced a bill Thursday that would strip away the liability shield that social media platforms hold dear when those companies are found to have boosted anti-vaccine conspiracies and other kinds of health misinformation. From a report: The Health Misinformation Act, introduced by Senators Amy Klobuchar (D-MN) and Ben Ray Lujan (D-NM), would create a new carve-out in Section 230 of the Communications Decency Act to hold platforms liable for algorithmically-promoted health misinformation and conspiracies. Platforms rely on Section 230 to protect them from legal liability for the vast amount of user-created content they host.

"For far too long, online platforms have not done enough to protect the health of Americans," Klobuchar said. "These are some of the biggest, richest companies in the world and they must do more to prevent the spread of deadly vaccine misinformation." The bill would specifically alter Section 230's language to revoke liability protections in the case of "health misinformation that is created or developed through the interactive computer service" if that misinformation is amplified through an algorithm. The proposed exception would only kick in during a declared national public health crisis, like the advent of Covid-19, and wouldn't apply in normal times.

This discussion has been archived. No new comments can be posted.


Comments:
  • Freedom of Speech (Score:5, Insightful)

    by phantomfive ( 622387 ) on Thursday July 22, 2021 @05:33PM (#61609407) Journal

    If there is a problem with one of the vaccines, you need to be able to tell people about it.

    Freedom of speech means that you can regulate the form of speech (like no yelling in the middle of the night, when people want to sleep), but you can't stop a particular idea from being shared.

    These people want to stop an idea from being shared. If an idea is false, then the solution is to answer it with more speech, not with censorship. In this case the authorities are right, but if you give them the power to censor, they will abuse that power.

    • Not only that, "health misinformation" is pretty damn broad. This could include just about anything, such as a low-carb diet or not wearing sunscreen.
      • Or homeopathy.

        • ...acupuncture, chiropractic, colloidal silver therapy, hair analysis (yes, that's a thing), iridology, neuro-linguistic programming, urine therapy, ... You know what? The list is waaay too long. It just wears me out to think of how fast people can generate so much bullshit. Can we just call it a day & burn Facebook & Twitter to the ground? That'd just leave the conspiracy theorists here on /.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Exactly. What happens when another Snowden-type revelation shows that a "conspiracy" was actually right all along?

    • Re: (Score:3, Informative)

      You are right, of course. But they're not censoring.
      I will concede that they are in fact chilling free speech.
      The law removes an immunity granted to online platforms when those platforms spread scientifically false, medically important information, specifically during a national medical emergency.

      There have been many laws like this over time (specifically wartime laws).

      Whether right or wrong, I don't think it's quite as dire as you're afraid of.
      • It's just another step in killing "the people's" ability to communicate in public. Facebook and Twitter are essentially the public square at this point. If you let these platforms get sued into oblivion, they will quickly ban posting before they die.

        Worse though, Facebook and Twitter have lots of money and can hang on for a lot longer. All the little sites will be dead after the first lawsuit. That, or those sites will just stop allowing user-generated content. Say goodbye to public forums of any kind.


        • I understand where you're coming from, and I agree with you.
          I've always been for S.230. The case that preceded and instigated its creation was a perfect example of why it needed to exist.

          I'm merely saying there is precedent for this kind of thing happening, and the rules in place, established by the Supreme Court, are tailored specifically against allowing things like censorship.

          In general, the government is allowed to do all kinds of things in a crisis, particularly if what they're doing is reasonably required.
    • Re: (Score:3, Insightful)

      by Anubis IV ( 1279820 )

      These people want to stop an idea from being shared. If an idea is false, then the solution is to answer it with more speech, not with censorship. In this case the authorities are right, but if you give them the power to censor, they will abuse that power.

      So what's the happy balance? On the one hand, we want to preserve the freedom of speech: a person who wants to talk about the earth being flat should be able to do so, regardless of how wrong they are. On the other hand, we want to preserve the freedom of association: a wholesome and encouraging online community for kids who love Legos should be able to block indecent or inappropriate content that they don't want. Both rights are guaranteed by the First Amendment and both must be considered when you write laws.

      • No one should be able to sue Facebook because I made an account and started spouting bullshit. At most, the offended party should probably file a civil suit against me, the person that made the account to spout bullshit. Facebook should be compelled to give up whatever information it has on me if ordered by a judge.

        That's about the end of it right there. Facebook owns the platform and if they want to ban stuff, that's up to them. They pay the bills. If they don't want to ban anything, that's fine as well.

    • If there is a problem with one of the vaccines, you need to be able to tell people about it.

      Does that include, "Aunt Martha said it made her right tit hurt, her friend had the same problem, skip the vax!"?

      If you quote doctor-unverified anecdotal evidence or cherry-pick medical opinions, you should be required to state so. Fox & Infowars find nutty outlier doctors and don't mention that they are outliers. That shouldn't be allowed. A "news" organization, or anything acting like news, should at least take

      • Does that include, "Aunt Martha said it made her right tit hurt, her friend had the same problem, skip the vax!"?

        Yes. There are alternatives to censorship. People who knee-jerk immediately to the solution of censorship are the problem.

    • All it says is that you can be sued and are no longer protected if you allow health misinformation on your platform.
      If the "misinformation" is in fact actual information not proven to be false, then you still can't be sued.

  • "The proposed exception would only kick in during a declared national public health crisis, like the advent of Covid-19, and wouldn't apply in normal times.

    Well, since the elitists are getting very rich off maintaining this particular version of "normal" in the world, you might as well throw this new legislation in the garbage where it belongs, since the first thing professional amplifiers are going to do is lobby to define a new state of "normal" to ensure this never kicks in.

  • The proposed exception would only kick in during a declared national public health crisis, like the advent of Covid-19, and wouldn't apply in normal times.

    From the White House: [whitehouse.gov]

    But I also — today, we’re taking steps to confront not just the gun crisis, but what is actually a public health crisis.

  • Not a good plan (Score:5, Informative)

    by cpt kangarooski ( 3773 ) on Thursday July 22, 2021 @06:12PM (#61609571) Homepage

    First, it actually swings too far the other way. 47 USC 230 was meant to avoid the situation where an Internet service was treated as the publisher of information a user posted, rendering it liable for defamation and other offenses. The impetus comes from a pair of cases in the 1990s: one held that where a provider did not review or moderate users' posts, the provider was not a publisher but merely a distributor, and the other held that if it reviewed or moderated user posts in the least bit, the site was a publisher and liable for anything it should have moderated but didn't. For example, if you moderate to remove porn, offensive language, malware, spam, etc., you're also liable if a user defames someone and you fail to do anything about it, even though you have no way of knowing whether it is defamatory unless you review and fact-check literally every post for every possible thing before it can go live, and never make the error of a false negative.

    This was clearly a bad system, so we got a statutory policy that allows and encourages moderation on whatever grounds a service prefers (you could remove fucking spam but not offensive fucking language, for example) without adding liability for taking things down or leaving them up. Sites haven't done a great job of policing themselves, but at least it's better than any of the three extremes you get otherwise -- wall-to-wall crap like what happened to Usenet, turning the Internet into cable TV in which users cannot post, or just not offering any services because it's a pain in the ass.

    This law doesn't simply go back to the default condition (which has the potential, as it is common law, to evolve) but instead affirmatively imposes liability on sites with regard to "health misinformation." So even failing to moderate doesn't save a site; moderation is absolutely mandatory under this law, even if only for this one thing.

    Next, there is no definition of what "health misinformation" is. The government is directed to provide 'guidance' but it's easy for someone to claim that some post was misinformative despite compliance with the guidance.

    Further, this is impossible to accomplish. It means every post must be checked, manually, for any possible "health misinformation" before it goes live, and that any post inadvertently made live exposes a site to liability. Even if the moderator made the right call, the cost of litigating frivolous suits is enough to destroy virtually any site around.

    This is not a sensible bill. This is someone trying to destroy the ability of people to post things online.

  • For such legislation to be anything other than a partisan swipe, it would have to be written to cover all lies about generally accepted scientific facts. How big a Bureau Of Bullshit would it take to monitor all of social media for violations? Get ready for endless tribal bickering over the precise wording of everyone's tweets and Facebook ruminations. It would turn into a jobs program for people who owe huge student loans for critical theory degrees.

  • by rsilvergun ( 571051 ) on Thursday July 22, 2021 @06:59PM (#61609777)
    Basically Republicans in everything but name. This is a really, really bad idea. Very quickly, all the protections on the internet for free speech will be removed and replaced with DMCA-style takedown rules. So if you say anything that anyone with enough money to hire a lawyer disagrees with, you'll be shut off the internet.

    The internet lives and dies by two things: net neutrality and Section 230. Anyone who tells you different is trying to destroy the internet or has been fooled by someone who is trying to destroy the internet. The C-levels don't like that we can just talk amongst ourselves, and they're moving to take over. If they had understood what the internet was back in the 90s, they never would have let us have it.
  • Woke Dictionary (Score:2, Insightful)

    "Misinformation", n. Information I disagree with.

    "Disinformation", n. Information I disagree with that also contradicts what I'm trying to say.

  • Sounds great! Crack down on those social media companies for the misinformation and anti-vax content in general. Next, since the government cares so much about my health and how companies in general are somehow responsible for my health, I propose we ban all advertisements for alcohol, fast food, soda pop, and really any food that is processed.

    Surely that would be the socially responsible thing to do, since we care so much about society's health and all. Also, why aren't cigarettes banned yet? They literally cause cancer.

  • When the news suggests vaccines are unsafe based on no evidence, and your family member listens to that advice and dies, that should be an open-and-shut wrongful death suit, as surely as the case of that girl who convinced her boyfriend to commit suicide for laughs. This is no different: it's a bad actor convincing someone else to hurt themselves for the actor's own benefit. That girl was found guilty; so should everyone else peddling garbage as truth.

  • I don't understand.
    Being an idiot anti-vaxxer isn't against the law and shouldn't ever be.
  • by backslashdot ( 95548 ) on Thursday July 22, 2021 @08:33PM (#61610001)

    I am no anti-vaxxer, but you never want the government deciding what is misinformation and what isn't. You may be cool with it now, but if (actually "when," not "if") the government shifts, they may not like what you say.

