Topics: The Internet, Censorship, Electronic Frontier Foundation, Facebook, Government, Social Networks, Twitter

What Would The Internet Look Like If America Repeals Section 230? (wbur.org) 519

"REVOKE 230!" President Trump tweeted Friday, and NPR reports that the movement to revoke its safeguards "is increasingly becoming a bipartisan consensus... But experts caution that eliminating the legal protections may have unintended consequences for Internet users that extend far beyond Facebook and Twitter." "We don't think about things like Wikipedia, the Internet Archive and all these other public goods that exist and have a public-interest component that would not exist in a world without 230," said Aaron Mackey, staff attorney at the Electronic Frontier Foundation, a digital civil liberties nonprofit.

Without Section 230, experts argue, sites would have less tolerance for people posting their opinions on YouTube, Reddit, Yelp, Amazon and many other corners of the Internet...

The tech industry, unsurprisingly, is fighting hard to preserve Section 230, said Jeff Kosseff, the author of a book about Section 230, The Twenty-Six Words That Created the Internet. "The major platforms came into existence because of 230," Kosseff said. "Without 230, their operations would have to be substantially changed." In particular, Facebook, Twitter and Google would likely become aggressive about removing content and may side more often with complaining users, Kosseff said. Mackey with the Electronic Frontier Foundation agrees. "It could create a prescreening of every piece of material every person posts and lead to an exceptional amount of moderation and prevention," Mackey said. "What every platform would be concerned about is: 'Do I risk anything to have this content posted to my site?'"

Another possible ripple effect of repealing, Kosseff said, is making it more difficult for whatever company is hoping to emerge as the next big social media company. "It will be harder for them because they will face more liability at the outset," Kosseff said. Eric Goldman, a professor at Santa Clara University Law School and co-director of the High Tech Law Institute, said rescinding Section 230 could reduce the number of online platforms that welcome open dialogue.

This discussion has been archived. No new comments can be posted.

  • No Slashdot... (Score:5, Insightful)

    by Baby Yoda's Daddy ( 6413160 ) on Saturday May 30, 2020 @10:36PM (#60127044)
    RIP 2020
    • by drnb ( 2434720 ) on Saturday May 30, 2020 @11:51PM (#60127228)

      No Slashdot... RIP 2020

      Nope. Slashdot remains a platform because they are not editing or censoring content, nor are they even affecting its visibility. 3rd parties are rating the content and elevating or diminishing its visibility. Slashdot is not.

      • by raymorris ( 2726007 ) on Saturday May 30, 2020 @11:57PM (#60127244) Journal

        Nice try, but that doesn't fly. See Stratton Oakmont.
        Users did the moderation, the owners of the site are still liable.

        • Re: (Score:2, Insightful)

          by drnb ( 2434720 )

          Nice try, but that doesn't fly. See Stratton Oakmont. Users did the moderation, the owners of the site are still liable.

          Nope. Federal legislation, the CDA, overrides and supersedes that. 230 is going to be rewritten, not completely discarded. If you act like a publisher, you will no longer be able to claim to be a platform. You will have to act like a platform. Slashdot is in really good shape in that scenario.

          • by raymorris ( 2726007 ) on Sunday May 31, 2020 @12:59AM (#60127360) Journal

            > Nope. Federal legislation, the CDA, overrides and supersedes that.

            Yeah, which *section* of the CDA gets rid of Oakmont v Prodigy? Section 230! If section 230 is repealed, we're back at Oakmont.

            • Re: (Score:3, Informative)

              by raymorris ( 2726007 )

              To summarize:

              Me: Without section 230, we'd have Stratton Oakmont v Prodigy - Slashdot is responsible for anything they allow to be posted

              You: No we wouldn't, because of the CDA

              Me: Section 230 is the part of the CDA you're talking about. If section 230 of the CDA were repealed, it would no longer provide safe harbor. We'd be back to Stratton Oakmont v Prodigy

              To be clear, when a law is repealed it goes away. Section 230 of the CDA won't protect Slashdot if Section 230 of the CDA is repealed.

              • by drnb ( 2434720 ) on Sunday May 31, 2020 @01:29AM (#60127424)

                To summarize:
                Me: Without section 230, we'd have Stratton Oakmont v Prodigy - Slashdot is responsible for anything they allow to be posted
                You: No we wouldn't, because of the CDA
                Me: Section 230 is the part of the CDA you're talking about. If section 230 of the CDA were repealed, it would no longer provide safe harbor. We'd be back to Stratton Oakmont v Prodigy

                That is a very poor summary. Here is a more accurate one:
                raymorris: Straw man that won't happen, i.e. repeal.
                drnb: actual scenario repubs and dems are discussing, i.e. modification.

                To be clear, when a law is repealed it goes away. Section 230 of the CDA won't protect Slashdot if Section 230 of the CDA is repealed.

                Again, a straw man. It's not going to be repealed. It's going to be modified. Modification is one of those incredibly rare things in the universe that both Republicans and Democrats are agreeing upon. Platforms will be shielded IFF they act like platforms; act like a publisher and there is no shield. That is the likely result.

              • Re: (Score:3, Insightful)

                Comment removed based on user account deletion
          • by AmiMoJo ( 196126 ) on Sunday May 31, 2020 @03:17AM (#60127638) Homepage Journal

            What does "act like a platform" actually mean in your scenario?

            Can you block/delete spam?
            Can you post warnings next to phishing attacks?
            Can you block links to certain websites like goatse?
            Can you hide certain content by default?
            Can you block certain content entirely, e.g. YouTube's no porn rule?
            Can staff have accounts they use to comment on things, as Slashdot editors sometimes do?
            Can the ToS have terms like "no harassment"?

            I can't really see how the current version of 230 can be improved on but I'm happy to listen to your specific suggestions.

        • by rtb61 ( 674572 )

          You are only moderating visibility based upon the choice of the readers. The reader chooses at what level to view and read comments, from high to low; it is the reader's choice, and the comments are always as visible as the reader chooses.

          They are actively creating a new work, based upon the deletion of legal content, the hiding of legal content and the active promotion of content presented falsely in a fraudulent and hence illegal fashion, as being the most popular, the one the majority of people approve, to convinc

      • Slashdot remains a platform because they are not editing or censoring content, nor are they even affecting its visibility.

        I hate to break it to you, but every site deletes content. Including Slashdot. If they didn't, they would be overrun by spam.

        But it's worse than that. Slashdot deletes stuff that isn't spam too. Hint: where has all the "Remier" drivel gone? Wanna take a guess what happened to it?

        What happened is /.'s previous owners did nothing about it, went broke, and sold out. They allowed grossly off-topic posts, so the site was overrun by personal feuds, Nazi posts, racial slurs - anything but commercial spam was allowed, from what I can tell. But apparently people don't want to have to sift through that shit to have a discussion about the topic they clicked on, and so they lost users. Who would have thunk it?

        If you ever wanted an example of how deleting posts is a necessary ingredient for creating a place where people engage in interesting discussions, today's /. is it. I don't know what the previous owners thought they were doing. There are lots of discussion sites out there. They cater to a particular demographic. They all, including 8chan, curate content and get rid of crap they know won't appeal to that demographic. It starts with spam, but goes well beyond it. The fact that this happens all the time under your nose, right here, and you apparently have never noticed is mind-boggling.

        By the way, 230 does not give site owners a lot of "power". They are in a life-and-death fight for readers and posters. It's the readers and posters who have that power now - not individually, but in aggregate. Take away 230 and other people will have a say via the courts. It will be American-style justice - he with the biggest bank account has the most say. That's the style of America Trump wants, of course - one where he gets a say in what people living in the land of the free are free to say.

  • by Anonymous Coward on Saturday May 30, 2020 @10:40PM (#60127050)

    ...mean that most of Trump's tweets would probably be deleted because the sites would now have to? Trump's tweets are the very definition of the posts that would have to be deleted.

    • Won't rescinding 230 mean that most of Trump's tweets would probably be deleted because the sites would now have to?

      No. Keep in mind that both parties are complaining about social media companies acting as publishers but claiming to be platforms. What the legislators are going to do is not eliminate protections for platforms; what they are going to do is only let you claim platform protection IFF you act like a platform. Act like a publisher, i.e. delete stuff, and you will be a publisher in the eyes of the law.

      • by Joce640k ( 829181 ) on Sunday May 31, 2020 @12:36AM (#60127326) Homepage

        Act like a publisher, ie delete stuff, and you will be a publisher in the eyes of the law.

        Um, Twitter hasn't deleted anything, they just attached a warning label.

        • by ShanghaiBill ( 739463 ) on Sunday May 31, 2020 @01:06AM (#60127376)

          Um, Twitter hasn't deleted anything, they just attached a warning label.

          Platforms are supposed to be neutral conduits.

          Warning labels are not neutral.

          They are making an editorial decision to warn about some things but not other things.

          That editorial control makes them a publisher, not a platform.

          • by narcc ( 412956 ) on Sunday May 31, 2020 @02:01AM (#60127492) Journal

            Platforms are supposed to be neutral conduits.

            Ummm... No. You're thinking of carriers.

            The internet is mostly made up of things which are in no way carriers and obviously aren't publishers. It's a new thing that needed different kinds of rules.

            Warning labels are not neutral.

            What would a neutral warning label look like?

            That editorial control makes them a publisher, not a platform.

            When the beekeeping forum deletes all your posts about skydiving, are they suddenly now a publisher or are they just deleting spam? Don't be ridiculous. That absurdity would kill every forum and comments section on the internet.

      • No. The right wants everything to be like 8chan, without any restrictions on posting. The people criticizing it from the left (who are just nuts, I don't understand their position at all) ostensibly want more moderation.

        (To the extent they want more vigorous antitrust enforcement, that's really a totally separate issue, yet they're not seeing that)

        The problem is you can't force moderation, and exposing sites to liability for not moderating enough will absolutely backfire.

  • That's the whole intent anyway. User input is dangerous. The three networks must regain control.

    • Comment removed based on user account deletion
      • Good idea... until the service becomes nothing but spam, porn, memes, racism... basically /pol/.

      • Don't curate or moderate user content. You can let the users do that themselves (a la Slashdot). If the hosting service doesn't get involved, it does NOT lose 230 protection.

        Yes it does. The Stratton Oakmont case that created the need to have section 230 involved users moderating a BBS chat room.

  • Whoop de doo. (Score:5, Insightful)

    by LordWabbit2 ( 2440804 ) on Saturday May 30, 2020 @10:43PM (#60127058)
    Host it outside the US.
    Problem solved.
    America shooting itself in the foot again.
    Offshoring factories turned out to be a bad idea.
    Offshoring IT turned out to be a bad idea.
    Offshoring the internet has to work, right?
    • by HiThere ( 15173 )

      Pretty much, with the caveat that it would need to be hosted by parties that never intended to visit the US even to change planes, and it would have to be hosted at a site that was willing to stand up to the US.

      Note that foreign sites tend to have their own rules about what content it is legal to host, however. And Iceland, e.g., doesn't have a lot of fat pipes. (Many countries are a lot more hospitable to coverage in a language that is not native to that country.)

      • Pretty much, with the caveat that it would need to be hosted by parties that never intended to visit the US even to change planes, and it would have to be hosted at a site that was willing to stand up to the US.

        Note that foreign sites tend to have their own rules about what content it is legal to host, however. And Iceland, e.g., doesn't have a lot of fat pipes. (Many countries are a lot more hospitable to coverage in a language that is not native to that country.)

        It would certainly cause a global rearrangement of CDNs, but that is what they are for. It would probably still be worse for the US than for any other country, though. Now your content has to be served from somewhere else. Sucks it is so slow.

      • Nope. Let's look, for example, at YT.

        Assume that they do move overseas. This is not a criminal action, like De Beers vs the DoJ. This is civil. So someone sues and wins. That means that they can attach any assets in the US. Including payments from US companies. Also probably assets in Europe.

        That means no US advertiser would do business with them. Which means that they would wind up running ads like those on the more illicit sites, e.g. the one with the model, the woman who got Professor Paul Frampton in trouble w

  • by backslashdot ( 95548 ) on Saturday May 30, 2020 @10:46PM (#60127064)

    Websites will be unable to control the types of conversations on their forums. Stack Overflow would become Reddit. Basically, it will be impossible for public discourse to occur, and the right-wing fascists in government will control the narrative until the ultra-leftists grab it... so it will stay fucked forever if Section 230 is repealed. Political websites especially will be screwed... for example, a Christian forum can be taken over by atheists, and an anti-abortion site can become dominated by pro-abortion posters... so the result will be that discussion forums won't exist at all. Nobody will be willing to risk being sued or spend money on moderating them... it won't even be remotely financially possible.

    • Websites will be unable to control the types of conversations on their forums.

      This is quite a common complaint anyway. I don't see why this would make it any different.

      • Right now they can, and to a limited extent, they actually do. Change the law and they can't in any way whatsoever, unless they do a perfect job of it every time, which is impossible. Or they ban user posting altogether.

  • by raymorris ( 2726007 ) on Saturday May 30, 2020 @10:50PM (#60127078) Journal

    A quick primer on what Section 230 is:

    It makes sense that a newspaper is responsible for the stories they publish. They are responsible for what is in their paper. Prior to section 230, if I attempted to place a classified ad in the newspaper, the newspaper could be held responsible for the content of that ad, because they published it.

    Prior to section 230, a content-neutral carrier, such as the phone company, which does NO filtering, was not responsible for the content. That left platforms, online and offline, with a choice between two options:

    A. Screen posts and be responsible for the content.
    B. Allow all spam, obscenity, calls to violence, etc (like the phone company allows you to say anything you want in a phone call)

    Having to choose between those two options sucks.
    So section 230 added a third option. Under section 230, message boards like Slashdot, social media platforms like Facebook, etc are allowed to undertake good faith efforts to screen posts containing certain things: obscenity, excessive violence, harassment, and a couple of other categories. Under section 230, good faith efforts to reduce these things do NOT make the platform responsible for the content of all posts.

    Repealing 230 would mean Slashdot and other platforms are responsible for whatever we post here, if they undertake any efforts to fight obscenity, etc. For example, without 230 they couldn't filter specific words and phrases which indicate the post is an ad for a very weird porn site (a toy sketch of that kind of filter is below). Gotta either allow everything, or be responsible for every post that is allowed.

    Here is the actual text of the law:
    https://www.law.cornell.edu/us... [cornell.edu]
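
    To make the "filter specific words and phrases" point above concrete, here is a minimal sketch of the kind of good-faith keyword screen a small site might run. The phrase list and function name are invented for illustration; this is not how Slashdot or any particular site actually does it:

```python
import re

# Hypothetical phrase list -- a real site would maintain and tune its own.
BLOCKED_PHRASES = [
    r"\bbuy cheap pills\b",
    r"\bhot singles in your area\b",
]

def passes_good_faith_screen(post_text: str) -> bool:
    """Return False if the post matches any blocked phrase (case-insensitive)."""
    return not any(re.search(p, post_text, re.IGNORECASE) for p in BLOCKED_PHRASES)

# A post that trips the filter is held back instead of published.
if not passes_good_faith_screen("HOT SINGLES IN YOUR AREA, click here"):
    print("Post rejected by spam screen")
```

    The point is only that this kind of screening is exactly the "good faith effort" 230 protects; without 230, running even this trivial filter could make the site answerable for everything it lets through.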

    • Comment removed based on user account deletion
      • Really? What if the users are the ones doing the moderation instead of the owners of the platform.....oh wait!!

        If self-moderation is the work-around, I'd think the Internet would be much better if it uses a slashdot type moderation system.

        Sorry, there's no workaround. In the Stratton Oakmont case that gave rise to the need for the safe harbor, users were volunteering as moderators and the site was held liable for them failing to perfectly moderate things they were not even looking for and couldn't have realistically checked.

      • Repealing 230 would mean Slashdot and other platforms are responsible for whatever we post here, if they undertake any efforts to fight obscenity, etc.

        Really? What if the users are the ones doing the moderation instead of the owners of the platform.....oh wait!!

        If self-moderation is the work-around, I'd think the Internet would be much better if it uses a slashdot type moderation system.

        Unfortunately, if the mods are incompetent, the platform owner is still responsible.
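
        Since the thread keeps invoking a "slashdot type moderation system," here is a rough sketch of what that means in practice. The class, function names, and exact score range are illustrative assumptions, not Slashdot's actual code: readers with mod points raise or lower a comment's score, each reader chooses a viewing threshold, and the site itself never deletes anything.

```python
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    score: int = 1  # Slashdot-style scores run roughly from -1 to 5

def moderate(comment, delta):
    """A reader spends a mod point to nudge the score; the comment itself is never removed."""
    comment.score = max(-1, min(5, comment.score + delta))

def visible(comments, threshold):
    """Each reader picks a threshold; low-scored comments are hidden from them, not deleted."""
    return [c for c in comments if c.score >= threshold]

thread = [Comment("insightful analysis", score=5), Comment("frist psot!", score=0)]
moderate(thread[1], -1)                                   # a moderator marks the junk down to -1
print([c.text for c in visible(thread, threshold=1)])     # reader browsing at +1 sees only the good one
print([c.text for c in visible(thread, threshold=-1)])    # reader browsing at -1 sees everything
```

        The design point is that visibility is decided by readers in aggregate rather than by the site operator, which is what the grandparent means by users doing the moderation.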

    • by raymorris ( 2726007 ) on Saturday May 30, 2020 @11:37PM (#60127184) Journal

      I forgot to include my usual "disclaimer". Every time I explain what the law is on a particular subject, or what a particular statute or case says, somebody gets really upset with me, saying "so you think ... should ...". I'm not saying what the law SHOULD be. I didn't write section 230. I read section 230 and I stated what the law *is*.

      I was around and cognizant of the debate when 230 was being passed and implemented through more detailed regulations, but to the best of my memory I didn't participate in the comments on that one. So I get no credit or blame for it. I just know what it says, that's all.

      Another related item -

      Trump recently issued an executive order directing a federal department to do a report on whether major platforms such as Facebook and Twitter are in fact doing "good faith" screening related to the categories they are allowed to screen for under section 230. For example, if there were emails from Zuckerberg directing Facebook staffers to censor only conservative-leaning posts and NOT mark any Antifa posts as "excessive violence", that would not be good faith screening for excessive violence and would not be within the protections of section 230.

      Some headlines and stories have suggested that Trump's order "repeals section 230" or something. That's false. A president does not have the power to repeal a law, no matter how much our last two presidents have wished for that power. The president's order tells his administration to see whether or not the social media giants are operating within the law.

      Lastly -
      Having said that my previous post simply tells what the law is, not what I think it should be, I guess now I'll say what I think it should be. I think there are probably some tweaks to 230 that could improve it. I'd love to hear any specific suggestions. Repealing it would be very problematic, in my experience working with people prior to 230, and before it was widely understood by relevant citizens.

      • by cpt kangarooski ( 3773 ) on Sunday May 31, 2020 @12:19AM (#60127290) Homepage

        For example, if there were emails from Zuckerberg directing Facebook staffers to censor only conservative-leaning posts and NOT mark any Antifa posts as "excessive violence", that would not be good faith screening for excessive violence and would not be within the protections of section 230.

        Well, you're making a number of assumptions, but let's say you were right. Zuckerberg could just say that the official policy of Facebook is that conservative posts are offensive and anti-conservative posts are not. And if they moderated posts accordingly and didn't misfilter posts for some other reason (like letting a particular post slide because the user paid a fee to avoid being filtered), then that would be protected under the law.

        Having political viewpoints and filtering based on them is fine under 230(c)(2), as long as you really believe the things you don't want on your site are offensive to you.

        On the other hand, you could not remove a post as offensive when really you want the user to pay to keep it up and it didn't really offend you; that would be an example of bad faith moderation.

        The president's order tells his administration to see whether or not the socal media giants are operating within the law.

        No. Trump is too stupid to conceal his motives. He's mainly a) upset that Twitter would post fact checks on his posts, and b) he's being directed by his handlers (he is also too stupid to have an actual agenda aside from corrupt personal enrichment and self-aggrandizement) to attack moderation online because conservatives in the English-speaking world are basically synonymous with wacko hate groups now, and fairly mundane policies about moderating out hate speech and other such shit are affecting them.

        I think there are probably some tweaks to 230 that could improve it. I'd love to hear any specific suggestions.

        Oh for sure. Repeal subsections (d) (which is pointless) and (e)(5) (which was predicted to be, and then actually turned out to be, entirely counterproductive). Otherwise the law is good. The problem is really that lily-livered sites don't use it. Twitter, Facebook, etc., should be moderating a lot more heavily, and the law enables them to, but they don't due to laziness and greed. There's really not a way for the law to force them to do it though; it would violate the First Amendment.

        • I agree with you on subsection d. It probably does very little good. It also probably does little harm, but I'd prefer to not have unnecessary laws.

          > > The executive order directs

          > No. Trump is too stupid to conceal his motives. He's mainly a) upset

          Well yes that's WHY. You seem to be conflating his motive for issuing the order with what the order actually says, though. The order does not say "I am mad because....", it says the commerce department, and specifically the FCC, shall check into whet

    • Re: (Score:4, Informative)

      by srichard25 ( 221590 ) on Saturday May 30, 2020 @11:42PM (#60127198)

      Under section 230, message boards like Slashdot, social media platforms like Facebook, etc are allowed to undertake good faith efforts to screen posts containing certain things: obscenity, excessive violence, harassment, and a couple of other categories.

      Which allowed category do these examples fall into:
      "Posting hoax political meme on July 20, 2018, specifically a fake message, supposedly from Democrats, that urged men not to vote in the midterm elections). "
      "suspended for a tweet attacking Ilhan Omar by accusing her of being "pro-Sharia" and "pro-FGM""
      "Suspended for tweets about the fictional character Baby Yoda"
      "Banned after tweeting criticism of Michigan Governor Gretchen Whitmer's stay-at-home order and encouraging the state's citizens to violate the directive"

      If social media platforms stuck to blocking posts/people within those major categories, I don't think most people would have a problem with it. The issue is that they often have gone far beyond the obviously objectionable and started banning stuff that they just don't like.

      • > If social media platforms stuck to blocking posts/people within those major categories, I don't think most people would have a problem with it. The issue is that they often have gone far beyond the obviously objectionable and started banning stuff that they just don't like.

        The president has ordered the Commerce Department to look into whether screening by Facebook and Twitter is in fact limited to good-faith efforts to address the types of content they are allowed to filter under section 230.

        If you

        • by Qzukk ( 229616 )

          whether screening by Facebook and Twitter is in fact limited to good-faith efforts to address the types of content they are allowed to filter

          Thing is, S.230 allows that effectively without limitation:

          any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.

          So, say a site run

      • The issue is that they often have gone far beyond the obviously objectionable and started banning stuff that they just don't like.

        Duh, that's what 'objectionable' means, idiot. It's entirely subjective, a tremendous catch-all, and it's simply what is objectionable to the owners and operators of the site, whatever that should happen to be.

        No one cares what you find objectionable, but if you ran your own site, you could moderate it based on your own personal beliefs, and it would be fine.

    • Everyone knows Trump's intention (or at least what he claims to want) is to stop political discrimination / censorship. If those legal experts think repealing 230 will do the opposite of what he publicly calls for, then provide a draft for a 230 replacement!
      • You're not wrong.

        However, what Trump wants, his motives, is immaterial to the discussion of repealing section 230.

  • by Sebby ( 238625 ) on Saturday May 30, 2020 @10:55PM (#60127090)

    If Twitter didn't have the protections from section 230, they'd likely feel compelled to take down most of his tweets.

    • All that affected websites would need to do is move out of the US. This 'section 230' is US law and its revocation will not affect other countries.
  • Full Diaper (Score:2, Insightful)

    by PopeRatzo ( 965947 )

    It's worth remembering that this entire "Repeal 230" business was started as an impulsive tantrum by a petulant man-baby. He didn't discuss it with legal experts or the FCC or anyone else. He saw something that displeased him during his daily "executive time" and went storming through the halls like some syphilitic Mad King George zonked on Adderall, and everybody around him had to try to appease him by doing something that they knew to be foolish (and ultimately ineffective) policy.

    And now those that support

    • It's worth remembering that this entire "Repeal 230" business was started as an impulsive tantrum by a petulant man-baby.

      Politicians have been attacking it in a bipartisan way for a while [senate.gov]; in the case I linked to, they are trying to allow law enforcement to get through encryption (despite their pleas of "protecting the children").

  • Comment removed based on user account deletion
  • There is a very easy fix for this mess... go outside the USA!
    Between the risk of losing much of the world market AND having to screen ALL posts, and the risk of losing some (or all, if they ban the site) of the USA market and still doing the same... probably for many companies, leaving the USA is a better option... for others, the USA is too big to lose, but they can try the move and see what happens... maybe in a few years there is a different idiot in the White House and everything returns to normal

    The fun fact i

  • It is the moderation equivalent of saying 'hey, you have cops, okay, you can do what you please, we won't hold you to any standards at all.' You edit posts to promote/discourage a view... that's fine. You promote certain views and suppress others... that's fine too. You completely ignore your own rules to let certain views go by, all fine and dandy. You completely make up rules on the spot to suppress others' views too, don't worry, you're totally in the clear. Hell, rules? Who needs those, just arbitrarily decide who posts or doesn't with no justification whatsoever, you're okay there too. Doesn't matter what those views are, so long as you moderate you can do as you please. That's not what 230 was intended for, but that is what it has become. It has become the get-out-of-jail-free card for any moderation policy no matter how ridiculous, unbalanced, or dishonest. Honestly, it is time to see that it is on its deathbed. Democrats want it gone. Republicans want it modified at the very least, but there are also strong rumbles wanting it gone, mostly due to the strong monopolies a few companies have on social media. The EU wants it gone and is already starting to ignore the principle of it with various laws. Pretty much every other country has already acted as if it doesn't exist (for good reason: for them it doesn't, and there is usually no equivalent). It was a nice first attempt, but everyone is finding all the exploits in it, and it is time to trash it and figure out something to replace it, because the ship has already started sinking and it's going down fast.
  • by BrendaEM ( 871664 ) on Sunday May 31, 2020 @02:06AM (#60127504) Homepage
    No. It's your internet. Tell him that, now, and in November.
  • 230 is NOT a bad thing. It is actually needed. However, the problem is that it removed ALL RESPONSIBILITY FOR ALL SIDES: the ISP, social media, any company providing services, AND THE END USER. End users have little to no responsibility. As such, we get huge liars like Trump, and on here, we have plenty (crimson tsunami, caffeinated bacon, etc).

    It makes no sense to hold social media responsible for the content, when content is coming from end users. BUT, social media NEEDS to hold the END USERS responsible. This can ONLY be done IFF the end users are vetted. It is far too expensive for social media to vet end users. As such, we need vetted digital certificates for end users so that end users are held responsible for what they post, harassment, or whatever.
    And if we do not hold end users responsible, then we need to start holding social media responsible for controlling their postings.
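
    To give a concrete (and very much simplified) sense of what "vetted digital certificates for end users" could look like mechanically, here is a sketch using an Ed25519 keypair: the user signs each post, and the platform verifies the signature against a public key that some vetting authority has tied to a real identity. The library choice and the whole flow are illustrative assumptions, not a proposal from the comment above; the hard part, actually vetting the identity behind the key, is not shown.

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# In a real scheme the keypair would be issued/vetted by some authority;
# here we just generate one locally for illustration.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()     # what the platform would keep on file for the user

post = b"This is my post, and I'm accountable for it."
signature = private_key.sign(post)        # the user signs what they submit

try:
    public_key.verify(signature, post)    # the platform checks the post really came from that key holder
    print("Post verifiably attributed to the key holder")
except InvalidSignature:
    print("Signature does not match; reject or flag the post")
```

    The signature only provides attribution; the "responsibility" part still depends on whoever vets the certificate being willing and able to tie the key to a person.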
