Should Some Sites Be Liable For The Content They Host? (nytimes.com)
America's lawmakers are scrutinizing the blanket protections in Section 230 of the Communications Decency Act, which lets online companies moderate their own sites without incurring legal liability for everything they host.
schwit1 shared this article from the New York Times: Last month, Senator Ted Cruz, Republican of Texas, said in a hearing about Google and censorship that the law was "a subsidy, a perk" for big tech that may need to be reconsidered. In an April interview, Speaker Nancy Pelosi of California called Section 230 a "gift" to tech companies "that could be removed."
"There is definitely more attention being paid to Section 230 than at any time in its history," said Jeff Kosseff, a cybersecurity law professor at the United States Naval Academy and the author of a book about the law, The Twenty-Six Words That Created the Internet .... Mr. Wyden, now a senator [and a co-author of the original bill], said the law had been written to provide "a sword and a shield" for internet companies. The shield is the liability protection for user content, but the sword was meant to allow companies to keep out "offensive materials." However, he said firms had not done enough to keep "slime" off their sites. In an interview with The New York Times, Mr. Wyden said he had recently told tech workers at a conference on content moderation that if "you don't use the sword, there are going to be people coming for your shield."
There is also a concern that the law's immunity is too sweeping. Websites trading in revenge pornography, hate speech or personal information to harass people online receive the same immunity as sites like Wikipedia. "It gives immunity to people who do not earn it and are not worthy of it," said Danielle Keats Citron, a law professor at Boston University who has written extensively about the statute. The first blow came last year with the signing of a law that creates an exception in Section 230 for websites that knowingly assist, facilitate or support sex trafficking. Critics of the new law said it opened the door to create other exceptions and would ultimately render Section 230 meaningless.
The article notes that while lawmakers from both parties are challenging the protections, "they disagree on why," with Republicans complaining that the law has only protected some free speech while still leaving conservative voices open to censorship on major platforms.
The Times also notes that when Wyden co-authored the original bill in 1996, Google didn't exist yet, and Mark Zuckerberg was 11 years old.
No. (Score:5, Insightful)
Establishment media will have no trouble getting around this, the entire point of this is to shut down the anti-establishment media outlets.
Re: (Score:3)
Section 230 has to exist because of the speed of creation on the internet, not necessarily because of what big corp can or can not do.
Nearly 6,000 tweets are posted every second on Twitter. That's 500 Million a day.
Nearly 500 hours of video are uploaded to YouTube every minute. That's ~82 years worth of video every day.
It's physically impossible to manually review all of this content so algorithms have to do it. Bad shit slips through all the time. Without Section 230 providing the immunity coverage, there
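The volume figures quoted above hold up arithmetically; a quick back-of-the-envelope check:

```python
# Sanity-check the scale claims: ~6,000 tweets/second and
# ~500 hours of video uploaded to YouTube per minute.
SECONDS_PER_DAY = 24 * 60 * 60

tweets_per_day = 6_000 * SECONDS_PER_DAY          # ~518 million, i.e. "500 million a day"
video_hours_per_day = 500 * 60 * 24               # 500 hours/minute -> 720,000 hours/day
video_years_per_day = video_hours_per_day / (24 * 365)  # ~82 years of footage per day

print(tweets_per_day)                 # 518400000
print(video_hours_per_day)            # 720000
print(round(video_years_per_day, 1))  # 82.2
```

At that scale, even a review team of thousands could see only a vanishing fraction of uploads, which is the commenter's point about needing algorithmic moderation.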
Re: (Score:2, Insightful)
You could get information just fine before the internet, both from conventional sources and bbs and other dial-up services (remember CompuServe? MiniTel?) and *gasp* schools and libraries and specialized periodicals. The internet doesn't deserve special treatment. If a site can't exist without exemptions for hosting revenge porn or hate speech, then they need to come up with a different business model, or a different business. Same as, when Uber runs out of OPM, they should not get a government bailout.
W
Re: (Score:2)
Me too.
ASL?
You have mail.
I remember the flood of AOL users. Life was miserable. The Internet was geekland back then, and then here came the kiddies.
I forget what process I used to splice together binary photo files from Usenet. It was time-consuming.
And to address your "What would you do ...," whatever I did would be a heck of a lot faster now.
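The splicing process the commenter half-remembers was typically uudecoding: binaries were posted as uuencoded text, split across multiple messages, and you concatenated the parts in order before decoding. A minimal sketch of the decode step (the filename `photo.jpg` and the part-joining convention here are illustrative, not from the original post):

```python
import binascii

def uudecode(text: str) -> bytes:
    """Decode a uuencoded payload (the text format old Usenet binaries used)."""
    out = bytearray()
    in_body = False
    for line in text.splitlines():
        if line.startswith("begin "):          # e.g. "begin 644 photo.jpg"
            in_body = True
            continue
        if not in_body:
            continue                            # skip headers/junk before "begin"
        if line.strip() in ("end", "`", ""):    # terminator and padding lines
            continue
        out += binascii.a2b_uu(line)            # each line carries its own length char
    return bytes(out)

# Splicing multipart posts: save each part, concatenate in order, then decode once.
# parts = [part1_text, part2_text, ...]
# data = uudecode("".join(parts))
```

Time-consuming indeed by hand; newsreaders later automated exactly this.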
Re:No. (Score:5, Insightful)
What would you do if the Web were to disappear tomorrow, leaving the other internet services intact? I'd have no problem going back to Usenet Newsgroups, email would still be around, as would FTP.
Nonsense.
The only reason those services were able to function as well as they did a few decades ago is because the Internet was restricted to a tiny minority of the population, mostly academics.
As soon as it became a tool of the whole population, that changed, and it has changed forever. Eternal September, and much, much more. There is no going back to the pre-Web Internet. Just as the Web would sink into a morass without some filtering, Usenet and FTP would if they became widely used today. They don't need to be filtered now because no one uses them, but kill the web and everyone will move and wherever everyone moves the crap will go, too. Unfiltered email is already unusable. Thankfully, email filtering has gotten so good that we don't much notice how awful email is.
(As an aside: Even aside from filtering, the replicate-the-world architecture of Usenet simply could not scale to modern needs. It was getting really creaky even before Usenet began to decline. And, actually, many news servers engaged in filtering because it was necessary even then.)
There really are only two choices: Either allow service providers to continue waging their endless battle against the flood of crap, recognizing that they'll never succeed perfectly, or else enable some other organization -- probably governmental -- to do the job. Rolling the clock back 25 years is not going to happen.
Personally, I don't think the current situation is great, but I really, really don't want to see government get involved. As long as the filtering is all voluntary, services will have to try to find a balance, and that balance will have to roughly match the sensibilities of the community they serve. Filter too much and the door is open to competitors who allow more freedom. Filter too little, and the door is open to competitors who create a less crap-filled environment. And since "community sensibilities" are only a rough mean with wide variance, there will always be fringe sites that cater to non-mainstream views. Oh, and absolutely anyone is free to run their own servers if they want -- and there's always Tor.
The status quo is messy, complicated and makes no one really happy, but I think it's the best we can do.
CGNAT, AUP, and RBLs (Score:5, Informative)
Oh, and absolutely anyone is free to run their own servers if they want
This is technically correct, as anyone can run a server on his or her local area network. However, one's own server may not be able to do anything useful on the Internet. This has several causes:
- ISPs with insufficient IP address allocation use carrier-grade network address translation (CGNAT). This has the side effect of blocking incoming connections.
- ISPs threaten to disconnect home users who run a server reachable from the Internet and then fulfill those threats.
- Established servers use blackhole lists to determine with which other servers not to communicate. These lists cover home and home business IP address blocks.
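Both of the last two obstacles can be probed programmatically. A sketch, with the caveats that `zen.spamhaus.org` is one real blackhole list among many (and rejects queries from some public resolvers, so a lookup failure is not proof of non-listing), and that the CGNAT check only detects the RFC 6598 shared address space, not every NAT deployment:

```python
import ipaddress
import socket

def rbl_query_name(ip: str, zone: str = "zen.spamhaus.org") -> str:
    # DNSBLs are queried by reversing the IPv4 octets under the list's zone:
    # 192.0.2.1 -> 1.2.0.192.zen.spamhaus.org
    return ".".join(reversed(ip.split("."))) + "." + zone

def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    try:
        socket.gethostbyname(rbl_query_name(ip, zone))
        return True               # any A record means the IP appears on the list
    except socket.gaierror:
        return False              # NXDOMAIN: not listed (or the lookup was refused)

def behind_cgnat(wan_ip: str) -> bool:
    # RFC 6598 reserves 100.64.0.0/10 for carrier-grade NAT; seeing an address
    # in this range on your router's WAN side suggests your ISP uses CGNAT.
    return ipaddress.ip_address(wan_ip) in ipaddress.ip_network("100.64.0.0/10")
```

Home-broadband blocks are frequently listed on such zones wholesale, which is why "just run your own mail server at home" rarely works in practice.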
Re: (Score:2)
So what though? You can technically make a public speech from your front lawn, but the audience may be pretty limited and if you try to do it at 2AM you might find the cops take an interest.
There has never been a more powerful tool for reaching large numbers of people than the internet. We are talking about a tiny number of people who are involved in domestic terrorism having their web sites booted off. Far right politics are mainstream. There is zero evidence that it's affecting political discord, in and in
Re:CGNAT, AUP, and RBLs (Score:4, Informative)
There is no evidence of that being the case. So far it's only been sites involved in actual terrorism that have had services withdrawn. Doesn't apply to far left websites because all extremist murders last year (and I'll bet this year too) were by the far right.
https://www.businessinsider.nl... [businessinsider.nl]
I didn't see people complaining about this when it was ISIS sites getting taken down. None of the freeze peach warriors were quitting YouTube and Patreon over Islamic terrorism videos getting removed. Strange that the government didn't want that report to be released either, almost as if far right extremism looks bad for them.
No I couldn't (Score:2)
If I just went by the pre-Internet information I'd cheerfully vote for that sellout. Yeah, everything above is public news, but good luck finding it without the Internet.
Re: No I couldn't (Score:3)
Damn, I didn't think it was possible but after reading that third link I almost LIKE biden.
Too bad the handful of claims I checked turned out to be either blatant bullshit or exaggeration.
Re: No. (Score:2, Interesting)
Which of those rely on hosting third party content at a scale they can't practically police well?
We have AM talk radio stations all over the US (and a very healthy number are hard right conservative political talk shows). They are not immune to lawsuits related to their content AFAIK. They screen their callers heavily and have dump buttons at the ready. Somehow they apparently even make money.
This doesn't prevent all kinds of hateful garbage from going on air from hosts and random callers alike, but the a
Re: No. (Score:2)
Rush never pretends to be neutral. That's a big part of it.
AM stations are not liable, though (Score:4, Insightful)
Basically, the radio stations are only on the hook if the caller says "fuck" - that's all the FCC cares about.
Re: AM stations are not liable, though (Score:2)
Then Internet services should be in the same position, regarding liability for defamation.
I'm not seeing anything wrong with this.
And at the right time (Score:2)
Re: (Score:2)
Why don't you get serious like Walmart did, you insensitive clod?
In an internal memo, the retailer told employees to remove any violent marketing material, unplug Xbox and PlayStation consoles that show violent video games and turn off any violence depicted on screens in its electronics departments.
Employees also were asked to shut off hunting season videos in the sporting goods department where guns are sold. "Remove from the salesfloor or turn off these items immediately," the memo said.
Walmart will still sell the violent video games and hasn't made any changes to its gun sales policy, despite pressure from workers, politicians and activists to do so.
No. Unless they made the content. (Score:5, Insightful)
Allow those who want a censored experience to obtain one. Don't force some single opinion of what should be censored on everyone.
will destroy independants, benefit google (Score:5, Interesting)
This would be an enormous benefit to Facebook and Google. It would be the end of independent sites. Google and Facebook want this. Don't fall for it. I can guarantee Facebook and Google have lobbyists pushing for this. When they say it's to punish Facebook, it means Facebook wants it and it's a gift to Facebook. It's called regulatory lock-in: Facebook wants to be heavily regulated to get rid of all of its weaker competitors.
Exposing a site to liability for things a user posts would make it impossible to run any kind of message board and would kill free speech. How? You want to bring down a site? Just post something libelous to it; it doesn't take much, even a single defamatory comment will do. The fact is, only large sites like Google can cope with the expense of monitoring massive websites. This effectively locks out smaller sites and would entrench a monopoly of Facebook, Google, etc.
Call your reps and tell them to oppose this.
The Right doesn't have a censorship problem (Score:2, Insightful)
As this [xkcd.com] relevant XKCD so eloquently puts it, these companies have no obligation to host your content on their dime.
But they're the "public square"; it's the only way you can easily reach a bunch of people, right? Yeah, why don't you try setting up your own business in the parking lot of Walmart without their permission and see how long that lasts. You're not entitled to someone else's audience or customer base, sorry. Same reason I can't come along and start sticking my preferred political party's campai
Re: (Score:2)
You're right: instead of complaining about private sites doing this, they need to set up their own websites to host their stuff. Just don't use Facebook or YouTube if you don't like their policies. Instead, what they want to do is actually make YouTube and Facebook the only sites available, which is what repealing the protections would do, because only YouTube and Facebook could cope with the regulatory burden of the massive policing of content it would require, as well as the billions of dollars of le
Re: (Score:2)
A lot of cultures had common areas - the agora in ancient Greece and the forum in Rome, for example - where the public could meet and discuss. Of course that might have been the male, non-slave, land-owning public from the right families - but the concept is still valid.
There is no equivalent place on the internet. Facebook and the like are like private residences - where people can talk, but only with the consent of the land owners.
A real public forum would serve this purpose, but unlike ancient governments that were w
Re: (Score:2)
You know what the real goddam problem is?
The real goddam problem is that the fucking goddam unwashed masses take Facebook and that shit seriously.
It's a cat video venue -- a game platform. Anyone getting their news from Facebook is seriously impaired. Who would do that?
Legitimate news does exist outside the bubble.
I say we all have a happening and dump Facebook.
That's what I did. I'm a little uninformed about my nephew's drug abuse, but all in all, I'm happier.
#DeleteFacebook
Re: (Score:3)
If they were smarter they'd just ask for a government-run "geocities" type hosting provider, and then they'd have all their frozen peaches intact. Of course, that would be Socialism, so nevermind. Thank goodness they can't stomach government services, or they'd be able to ask for something that would "solve" their "problem!"
Re: (Score:2)
Except that it wouldn't. Politicians would not be able to keep from imposing their own narratives on the system.
Re: (Score:2)
Bad idea. Once it was subject to government administration and funding, it would very quickly become subject to a lot more government control of content - the AUP would become even more restrictive than most private-sector hosts. The very first thing to be banned would be obscenity, followed by a broadening of the definition of obscenity to catch all the weird stuff. It'd be downhill from there.
Completely wrong. (Score:2, Insightful)
Do you even know what section 230 is about?
Without that, a company has two choices, either:
a) you are a common carrier, you do NOT control content, just carry it, and you have no responsibility.
b) you are a publisher (like a newspaper) and you DO control content, and you have a responsibility for that content.
However, section 230 is a nice little gift to the companies who want the best of BOTH: the control without the responsibility.
This is NOT a free speech issue, it is an issue with a special allowance f
It's not just large companies (Score:3)
Point is, 230 made innovation possible and it continues to.
Re:Completely wrong. (Score:4, Insightful)
ISPs are not common carriers, according to the same people promoting this false understanding of Section 230, so how do they rationalize social networks as common carriers? That is not, and never has been, a choice available to them. Social networks are not utilities.
And Section 230 explicitly says they are not the publishers of the content contributed by others. Here's what it does say: it encourages them to moderate objectionable content by explicitly shielding them from liability for doing so:
What you're all really so butt-hurt about is that your political opinions are considered objectionable content by a majority of people. You spout racist gibberish, you harass women and gays, you spread falsehoods and conspiracy theories, and despite having the absolute freedom to form your own social networks where you're perfectly free to moderate comments according to your own twisted community values, you want the government to inflict a new fairness doctrine on the social networks that the cool kids already dominate. You're basically admitting that you're losers on the Internet and you want the government to help you be winners, despite being the ones who more expertly spread "memes" and propaganda.
Re: (Score:3)
However, section 230 is a nice little gift to the companies who want the best of BOTH: the control without the responsibility.
Not quite. It was meant to encourage hosts to moderate, since they would no longer be in danger for failing to moderate perfectly in all possible respects. (That's the risk of the publisher approach -- one tiny slip and you're totally fucked)
It has nothing to do with large companies or "normal" people. Anyone who creates content remains liable for it, but no one else is. Even if
Re: The Right doesn't have a censorship problem (Score:3)
CDA 230 is being misused as cover for unamerican censorship of overtly political speech.
That's a total lie. The rights of free speech and free association are held by the sites too, and allow sites to pick a side and to deny certain people from using it. What could be more American than telling people you don't like to get off your lawn?
The people seeking to clarify CDA 230 want public conversation platforms ... to choose between being common carriers or publishers.
Another lie. Their purp
Sounds like legislator speak (Score:2)
Re: (Score:2)
Only a complete idiot, or somebody who only recently returned to the surface after decades underground, would think that Senator Wyden gives a rats ass about "campaign contributions."
Simple... (Score:3)
If yes, then they should be responsible for the site they moderate.
If no they should not be responsible for the site they are not moderating, they're just hosting someone else's content.
Re: (Score:2)
8chan, anyone?
Re: (Score:2)
You are not contributing to the discussion; you just restated Section 230 of the Communications Decency Act (CDA).
The discussion is about whether Section 230 needs to be reformed or repealed, because it does not have an enforcement mechanism and it did not foresee the rise of social media.
Re: (Score:3)
Social media has existed since the advent of USENET. The formatting of the discussion or the delivery vehicle is immaterial.
Re: (Score:2)
In the physical world you can't just post a notice somewhere that there is a party happening at some location you own or rent and let anything happen that might happen.
Actually, you can.
If criminal activity happens at that party, or it becomes a nuisance, the police could break up the party....which will then start again the next night (presuming the owner of the property wants it to continue)
Free speech is free speech, period (Score:3)
"Free speech is free speech, period." No. Its not. (Score:3)
Re: (Score:2)
Re: (Score:2, Informative)
Each service provider (hosting, website, phone app, whatever) must decide whether it is a publisher or a platform, under the terms of Section 230 of the Communications Decency Act (CDA). If it is a publisher, it is legally liable for its content and therefore may censor the speech of its users. Conversely, if it is a platform, the federal government grants it protection from legal liability as long as it does not censor the speech of its users. Social media companies like Facebook, Twitter, and YouTube are
Re: (Score:3)
No, it doesn't, you lying moron. A website can moderate anything it damn well pleases.
Learn something: https://www.techdirt.com/artic... [techdirt.com]
Re: (Score:3, Insightful)
Free speech is a prohibition against the US government. The places you are talking about are private property.
Free speech does not apply.
The ToS is a binding contract that YOU agree to before using the site.
Don't like stuff? Exercise the only real right you do have: Leave.
This is not revoking "free speech" (Score:2)
This is about requiring private companies to be responsible for the content they host. In effect, it is more about enforcing libel and hate-crime laws on the internet.
Say a website is allowed to publish a message that says, "Brainchill is a faggoty pedophile spic who crossed the Rio to rape and murder our children in our sleep. He lives at 123 4th St, El Paso, TX. Kill him, and you'll be a hero." Except, it's your real name, not your alias. Say a lawyer reaches out to the sysop for the site and says, "This
If They Court Toxicity (Score:2)
There's a difference between moderators being asleep/inconsistent/slow to remove stuff that violates their TOS; and a policy/tendency of moderators to welcome certain content, or to look the other way when they find it.
Google etc. not hiring enough 3rd-world mods to stare at horrible stuff all day is different from, say, 8-chan mods allowing everything and anything.
It might be most useful for the govt. to fund research into machine learning software to detect stuff that violates a given site's TOS (whatever
Should sites be liable for content they host? (Score:2)
Absolutely not, unless you hate free speech. Everyone needs to recognize that even the speech they disagree with is protected, and that such protections are vital to ensuring what they do agree with is allowed to continue.
Re: (Score:2)
An approach that lasts until the first photograph of child sex gets posted. There are a few true free-speech-absolutists around who will defend the right to publish and view absolutely anything, but it's really a tiny niche. The vast majority of people are happy to accept some form of ban on certain content, they just differ regarding what needs to be banned.
We have a system for this (Score:2)
If you censor your users, you should be liable for the content you are curating. If you're a common carrier of anybody's content, you should not be. The current system works pretty well.
Re: (Score:2)
Yes, that's what Section 230 says.
But you still think it's working, after all the news about Facebook, Twitter, and YouTube over the last few years?
It's not working, because there is no enforcement mechanism. That's why people are talking about reforming or repealing it.
Re: (Score:3)
If you censor your users, you should be liable for the content you are curating. If you're a common carrier of anybody's content, you should not be. The current system works pretty well
What you describe is not the current system. The current system is you are not liable even if you curate the content.
There is a widespread myth that curating means liability. It isn't true.
no (Score:2)
n/t
So... (Score:2)
Is this why AC comments went away?
Why some sites? (Score:2)
This is exactly the thinking of a person more interested in social (mob) justice than law. If only SOME sites should be liable, which sites would be liable? What test would you apply?
If a site wants to editorialize its content (e.g. Facebook and Twitter blocking voices that don't follow their corporate ideology) then it can do so and bear the consequences of ALL the content posted. If it instead wants to be a platform, then it only has the obligation to remove things that are a clear violation of law after a law
Only if they don't act. (Score:2)
Sites should only be liable for content posted by their users if they are notified of content that is illegal (sex trafficking, criminal activity, etc.) and fail to act to remove that content.
But if someone knowingly hosts something that is outright illegal and fails to act, they lose their safe harbor for that content.
Should some (web)sites be held liable for content? (Score:3)
Now, that having been said: Privately owned websites are not required under law to allow any and all content to be posted on their pages; they can exercise any level of moderation of user-generated content they choose, so long as it is consistent with their own stated rules. To give more specific examples, if I'm not being clear enough: if the owner of a website that allows user-generated content does not wish to host racist, bigoted, or sexist material, discussion of illegal activities or drug use, or My Little Pony discussions for that matter, then it is entirely within their rights to remove content as they see fit and even revoke the access rights of any users as they see fit, too. Affected persons are of course free to pursue legal action against that website and its owners in civil court, but (using the U.S. as an example) there are no constitutional protections against censorship with regard to privately held companies so far as I can see.
And so it ends (Score:2)
So...Republicans and Democrats are teaming up to hurt companies because they are not censoring the way the government wants.
This is not a good sign.
Re:Boring (Score:5, Insightful)
Re: (Score:2)
Re: (Score:3)
Opposite day for me: CD.
Re:Boring (Score:5, Insightful)
Re: (Score:3)
Right...
But the AC you and I are remembering is a different breed of AC.
Re: (Score:2)
That's not really true at all. As we can all tell by the comment count on articles, and the recent flood of 6.1 million accounts registered lately, most of the insightful comments (not merely ones modded up on feelings) come from ACs. Go look at any article older than 48 hours... I also read at -1, so you have to endure a strong scrolling finger.
Re: (Score:3)
Re: (Score:2)
While most trolling may come from ACs this makes the troll post highly visible as well. And since we notice things that stand out, our perception is skewed easily. All while sensible AC posts go unnoticed.
Having to register on the other hand and using a comment rating system by the community creates some kind of hive mind for a good portion of the participants. It acts as social engineering since most people would prefer to
Re: Boring (Score:4)
I really do like the idea of having an anonymous option, but there's no denying that a handful of trolls abused the shit out of it. They didn't just "stick out" because they were anonymous, but rather because they were far far worse than any of the comments from registered accounts.
I don't miss the AC postings, but I have to admit they were a part of what made /. /.. But sometimes there is a need for us to post or reply to something anonymously. One compromise would be to restore anonymous posting for registered accounts: we would still have some sort of anonymity, but abuse could still be linked to an account.
Or we could just get over it and move on. I'm really good with either.
Re: (Score:3)
4chan is at least entertaining and creative at some points. This shit was neither.
Re: (Score:3)
The key word is compromise. Registered users would retain the ability to post anonymously by clicking the checkmark box, like they could before the ban on AC. Just as those posts are linked to the account that checked the box, future anonymous postings would be linked to an account.
You didn't assume just because you checked that box to post anonymously that they can't track your posts did you?
Re: (Score:3)
Hope not. Removing anonymity (even if it's only a pseudonym and/or a throw-away account) has a chilling effect on free speech.
Oh no, someone said something online i don't agree with, whatever will I do?
Re: (Score:3)
" Oh no, someone said something online i don't agree with, whatever will I do? "
If you're on Reddit, you get down voted into oblivion. Once your karma drops below a certain threshold, you're pretty much screwed with no means to recover because most subreddits won't allow you to post anything at all with a low karma score. ( Even if your post is in full agreement with the subject at hand. It's auto-removed. )
Hence, the dangers of moderated systems.
They eventually turn into nothing but an echo chamber beca
Re: (Score:3)
There is one right that /. provides: your comments are your opinions, not statements of fact, and cannot be interpreted as such.
Now that is the real difference between being held accountable and not being held accountable. Does the site present itself as the truth, does the site reinforce this by using real names, does the site claim to present the real factual news and even curate the news to reinforce this, does the site generate profit based upon the content itself acting as
Re: (Score:2)
If you can not realise that 4chan and 8chan and trolls trolling trolls, you will twist your head around in circles
I'll confess, that sentence left my head spinning.
Re: (Score:3)
I never said anything about 'rights'.
What a simplistic point of view though: Freedom of speech only exists as seen through the VERY narrow definition found in the constitution -- Rather than the cornerstone of a free society.
Re: (Score:2)
I do like the ability to post AC though. Won't be the same without it. Makes me reflect on the AC trolls I've seen over the years, from the humble shiteater
Re: (Score:2)
Re: (Score:2)
I get an email every time someone (not anonymous, although that doesn't matter for the moment) replies to my posts.
Maybe you should enable that.
They need to update the FAQ (Score:2)
Most of the trolls and useless stuff comes from "Anonymous Coward" posters. Why not eliminate anonymous posting?
We've thought about it. We think the ability to post anonymously is important, though. Sometimes people have important information they want to post, but wouldn't do so if they could be linked to it. Anonymous posting will continue for the foreseeable future.
Re: (Score:2)
Does this explain why Slashdot disabled AC posting? Are they concerned they could be liable
TFSummary explains they are not liable.
I think there should be liability if the site hosting the content reasonably should have known about it, and known that it was illegal
Do you have any idea of the number of jurisdictions on the planet?
If those things actually happened, since Slashdot should have known (due to the content being flagged), then yes, there should be some liability
And what magic is it you expect Slashdot to do in this situation? It's not like police currently investigate things like SWATting due to the numerous jurisdiction issues.
Also, "you should kill yourself" is not legally actionable.
Also, intentionally not having a way to report abuse shouldn't be an excuse for not knowing. On the other hand, a site that gets thousands of comments per day can't be expected to review everything, especially if it's never reported.
......haven't thought this through, have you?
Re: (Score:2)
Have you seen the trolls here? Having an inappropriate flag would be meaningless starting on day one. You can be sure that any perfectly on-topic, accurate, and completely inoffensive comment would be flagged within seconds of posting.
When every comment is marked inappropriate, the flag becomes meaningless.
Beyond that, if by 'flagged', you mean modded down, that's hardly the same thing.
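The point that a flag applied to everything carries no signal can be shown with a few lines of arithmetic. (The numbers below are made up purely for illustration.)

```python
# Toy illustration: if trolls flag every comment, the flag's
# precision collapses to the base rate of genuinely bad comments.
# All fractions are assumed values, not measured data.

actually_bad = 0.02       # assumed fraction of genuinely abusive comments
flag_rate_on_bad = 1.0    # every bad comment gets flagged...
flag_rate_on_good = 1.0   # ...but so does every good one

flagged_bad = actually_bad * flag_rate_on_bad
flagged_total = flagged_bad + (1 - actually_bad) * flag_rate_on_good
precision = flagged_bad / flagged_total

print(precision)  # 0.02 -- a flag tells a moderator nothing
```

When the flag rate on good comments is 100%, seeing a flag gives a moderator no more information than picking a comment at random, which is the sense in which the flag "becomes meaningless."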
Re: (Score:2)
Have you seen the trolls here? Having an inappropriate flag would be meaningless starting on day one. You can be sure that any perfectly on-topic, accurate, and completely inoffensive comment would be flagged within seconds of posting.
When every comment is marked inappropriate, the flag becomes meaningless.
Beyond that, if by 'flagged', you mean modded down, that's hardly the same thing.
Are you still set to force v2 of the site from back during the beta push? Or using noscript or similar?
If so you probably don't see the flag option.
But day one of flagging was almost a decade ago, and it was all human-reviewed. Timothy said they only acted on reports of spam full of links and of posts that were random words padding out the length limit.
(Which to be fair is an accurate description of APK's later posts)
Can't say what the new management's policy is, but it doesn't look all that different from before.
Re: (Score:2)
I see the flag, but since it doesn't make the posts automagically disappear and rarely causes anything to happen, it is an inappropriate post flag in name only.
If it did anything more than practically nothing, it would surely be abused far more often than used appropriately.
Re:HARMFUL CONTENT PROBLEM OF INTERNET (Score:4, Interesting)
Complete rubbish. The Internet can be the ultimate honeypot for finding stupid people who want to do bad things. We should be encouraging them to post.
Re:HARMFUL CONTENT PROBLEM OF INTERNET (Score:5, Insightful)
Explain what is harmful. Lots of people throw that word around. Really, all it seems to mean is "stuff I don't like."
Re: (Score:2, Insightful)
Indeed. This is just backhanded censorship. If both Ted Cruz and Nancy Pelosi are for something, you know it is a bad idea.
Re: HARMFUL CONTENT PROBLEM OF INTERNET (Score:2)
+1 One Party, Two Faces
Re: (Score:2)
Too many "IMHO"s, exclamation points and @ symbols. Not nearly enough white space. 0/10 would not read again.
Re: (Score:2)
" People who say they're being humble usually aren't. If they were, "
Exactly! OTOH (I'm one of those with many hands) it would be dangerous to eliminate all opinion pieces in the press, even if they are stupid, too left or too right.
Re: (Score:3)
Is that how far you got in reading his post before you commented? Or did you deliberately quote only that little snippet because it makes it easier to respond if you ignore all the other stuff he said?
U get booboo feelings too, bro?
My point, my friend, is that virtually no one gets censored here. Although some folks see -1 as censorship. It ain't.
And despite some people hating it, it actually works. If you want to see Nazi ASCII art, you can see it here. If you want to see severe psycho-homosexual projection, or bullshit about certain people of Middle Eastern origin, or hate directed at folks of African descent with dark pigmentation, just turn that little bar down to -1, and it's all there.
Re: (Score:3, Insightful)
Being forced to create an account is the worst. This is absolutely terrible.
Wow. Talk about snowflakes and first world problems. Kind of ironic that all the anonymous trolls calling everyone else snowflakes were so delicate. I hope you're not triggered by this post.
Re: (Score:2)
Yer killin' me.
Re: (Score:2)
Hi Bill. Didn't I see you at Bass Pro yesterday? What's up with the name thingy?
Re: (Score:2)
Re: Good vs evil (Score:3)
Old school liberals have nothing whatsoever in common with the Corporate Social Just-Us Nazis who control Big Tech.
The old left/right dichotomy is obsolete and stultifying. Let's give it a rest.
Re: (Score:3)
Re: The left does not want to end 230 protections (Score:3, Informative)
Take a look at who's been banned and it's uniformly the same thing: White Supremacists and Wink and Nod calls to violence.
Heh. Twitter permanently banned Milo Yiannopoulos, a Jewish homosexual married to a black man who has never once advocated violence against anyone.
This is the problem with the left; you've decided that you can label absolutely anyone who disagrees with you a "violent white supremacist", and then argue that all of those people should be banned/beaten/killed depending on how extreme you happen to be. It used to only be a problem with the far left, but it's become mainstream over the last decade. And it's
Re: (Score:2)
If we include the title of your comment, you are saying:
"I am good, and everything I do is good. You are evil, and everything you do is evil."
Wow, you can't get much more self-righteous than that.
Re: Couple of reasons why today's Internet is usel (Score:2)
I dunno. I can still download tarballs and build packages from source on my NetBSD system. I can still download PDF datasheets from chip manufacturers. At work I can still chase down the particulars of the weird decade-old equipment that customers threaten to send in for me to repair. It doesn't seem like anything I care about is ending.
Re: (Score:2)
but there is no effective or substantive difference between someone in a public square urging others toward violent action (illegal!) and someone doing so in an online forum or via social media
The owner of that square is not liable for the inflammatory speech.
Re:The Conservictims ... (Score:5, Insightful)
he says, then proceeds to explain how he ran a denial-of-service attack against
But yeah, nothing but tolerance of people with opposing points of view here...
Re: Why is this even a discussion? (Score:3)
"you are either a publisher or a platform"
That's what it _should_ be. However, that is not the actual law. Thus the push for reform.
https://www.law.cornell.edu/us... [cornell.edu]