Ron Wyden, a senior U.S. Senator from Oregon, argues there should be consequences for internet companies that refuse to remove hate speech from their platforms. An anonymous reader shares an excerpt from an op-ed Wyden wrote for TechCrunch:
I wrote the law that allows sites to be unfettered free speech marketplaces. I wrote that same law, Section 230 of the Communications Decency Act, to provide vital protections to sites that didn't want to host the most unsavory forms of expression. The goal was to protect the unique ability of the internet to be the proverbial marketplace of ideas while ensuring that mainstream sites could reflect the ethics of society as a whole. In general, this has been a success -- with one glaring exception. I never expected that internet CEOs would fail to understand one simple principle: that an individual endorsing (or denying) the extermination of millions of people, or attacking the victims of horrific crimes or the parents of murdered children, is far more indecent than an individual posting pornography.
Social media cannot exist without the legal protections of Section 230. That protection is not constitutional, it's statutory. Failure by the companies to properly understand the premise of the law is the beginning of the end of the protections it provides. I say this because their failures are making it increasingly difficult for me to protect Section 230 in Congress. Members across the spectrum, including far-right House and Senate leaders, are agitating for government regulation of internet platforms. Even if government doesn't take the dangerous step of regulating speech, just eliminating the 230 protections is enough to have a dramatic, chilling effect on expression across the internet. Were Twitter to lose the protections I wrote into law, within 24 hours its potential liabilities would be many multiples of its assets and its stock would be worthless. The same for Facebook and any other social media site. Boards of directors should have taken action long before now against CEOs who refuse to recognize this threat to their business.

In an interview with Recode, Wyden said that platforms should be punished for hosting content that goes against "common decency." "I think what the Alex Jones case shows, we're gonna really be looking at what the consequences are for just leaving common decency in the dust," Wyden told Recode's Kara Swisher. "...What I'm gonna be trying to do in my legislation is to really lay out what the consequences are when somebody who is a bad actor, somebody who really doesn't meet the decency principles that reflect our values, if that bad actor blows by the bounds of common decency, I think you gotta have a way to make sure that stuff is taken down."