Facebook VP Says Company Won't Use Experts To Fix Fake News Because It is Worried About Criticism (theoutline.com) 155

Joshua Topolsky, writing for The Outline: According to Axios reporter Ina Fried, the vice president of global communications, marketing, and public policy (phew!) at Facebook shook off suggestions that the network should use outside media literacy watchdogs as opposed to outsourcing its "fake news" problem to a "statistically representative" group of its own users. While speaking at the tech conference DLD (Digital Life Design) in Munich, he revealed that the real motivation behind the company's decision was one based almost entirely on optics. This shouldn't come as much of a surprise, as the company has been totally ignorant and outrageously slow in accepting responsibility for what has been a disaster for its users. While Twitter is turning to media literacy groups such as Common Sense Media and the National Association for Media Literacy for solutions to its own troll and fake news epidemic, Facebook continues to cower behind a broken concept that the company is a neutral platform where all of its participants are equally weighted.
  • by gatkinso ( 15975 ) on Monday January 22, 2018 @09:03AM (#55977349)

    I just want to look at my friends' kids and pet pics. Is that so much to ask?

    • by Anonymous Coward

      You like looking at kids? SICK!!!

    • by OtisSnerd ( 600854 ) on Monday January 22, 2018 @09:16AM (#55977427)
      Take a look at the F. B. (Fluff Busting) Purity add-on, it hides a lot of that unwanted junk on Facebook. It works with Firefox, Chrome, Edge, Safari, Opera, and Maxthon browsers, on Windows, Mac, and Linux. http://www.fbpurity.com/ [fbpurity.com] I've been using it for over a year, and it's updated regularly.
    • by tsqr ( 808554 )

      I just want to look at my friends' kids and pet pics. Is that so much to ask?

      Same here, though the "news" doesn't bother me as much as the weird quizzes that appear to be designed to make you feel smart but in reality should be easy for anyone with an IQ over 80. Just keep scrolling.

    • by Anonymous Coward

      Reporting you to the FBI for being a fucking pedophile.

      • by Anonymous Coward
        This is trolling on so many levels. Simple verbal abuse on the face of it. Look deeper and you see it is actually a smart attempt to silence somebody completely by branding them in a socially repulsive way - that is a civil death for you, and done for personal entertainment, not for any real deed. On another level it is a show of how far we as a society have fallen: accusations of this type, much like the MeToo nonsense, are once-thrown-sticks-forever idiocy, used as a weapon to end any discussion.
    • Except a lot of the fake news is coming from your friends' political links.

      I wish they could find a way to ban all of it. Just go back to pictures of food and kids and those desperate "You don't care enough to hit reply" posts.

      I can feel some of their pain, any attempt to deal with "fake news" is going to scream censorship and bias to someone. The term has no actual meaning anymore, when actual news is being labelled fake, and fake news is being bought into by masses of tin-foil hat conspiracists.

      • by swb ( 14022 ) on Monday January 22, 2018 @11:01AM (#55978103)

        Except a lot of the fake news is coming from your friends' political links.

        This is the real crux of the problem. People on Facebook have become militantly political, along all the usual dividing lines. People I've known for 10-20 years who never appeared to have any political opinions are now rabidly political on Facebook.

        My own sense is that this grew out of Facebook enabling the easy re-sharing of links and pictures. At first it was just mostly dumb memes, but as election season hit it quickly became a way of sharing and ultimately manufacturing and reinforcing partisan outrage.

        I don't think Facebook can fix this without some heavy AI analysis of submitted links and images that eliminates the ability to re-share political ones without hindering the ability to re-share non-political content. News links might be more amenable to machine analysis, but images would be tough. The only other option is just disabling re-sharing of any of them, which I think users would find an unacceptable functionality limitation.

        An individual can "solve" this themselves by unfriending people who post too much political stuff, but my own sense is that means gutting even moderate-sized friend lists, rendering the entire thing kind of pointless.

        I've kind of abandoned it completely myself. I'm missing out on some socialization, but mostly I think it's false socialization with people in ordinary circumstances I wouldn't keep in touch with. A loss perhaps for some old friends and family, but not enough to make it worth putting up with.

        • Facebook *could* solve this issue just by giving people the tools to easily reduce how much of the share spam shows up on their wall. But they won't do that because their bean counters tell them that more shares = more data = more money.

          • by swb ( 14022 )

            I think it's a chicken and egg problem. You could allow people to reduce the amount of shared links and images people see in their newsfeeds. However, the easy sharing of links and especially images has caused people to become accustomed to low-effort posting and sharing, especially of images *and* it's impossible to really know whether you only got 2 likes because nobody thought your image was worth much or because it got no visibility.

            The users have basically been trained to pump out low-effort content

    • I just want to look at my friends' kids and pet pics. Is that so much to ask?

      Mod the parent up! I originally joined facebook to keep in touch with friends. After the last election and all of the negativity, I deleted my account. I am now 85 days free.

      • Two and a half years for me, though I must go back in and harvest any email addresses I missed sometime. I'll use whatsapp and email to talk to people, Facebook can go fcuk themselves.

    • by Anonymous Coward

      The fundamental building block of Facebook is populism. A news delivery platform driven by populism can never be fair, accurate, complete, objective, or any other measure that one might want to apply. Conversely, it is the ideal groupthink platform. Yes, alternative ideas can exist and be put forth, but they'll only thrive based on their own popularity.

      FB would serve us all better by dropping attempts to control news and simply educating users on just how poor a news platform it is.

  • by Anonymous Coward on Monday January 22, 2018 @09:08AM (#55977371)

    Why should Facebook take any responsibility for content that is posted by their users? Why should they suddenly become a media curator, instead of a social network? Why is TFA written as though it was completely obvious and indisputable truth?

    Don't get me wrong, I hate Facebook as much as the next guy, but I hate the tone of this article even more and the righteous anger radiating from it is just revolting. It's like the author is begging for even more corporate censorship and is extremely pissed off that it's not being dished out in a timely manner.

    • Fake news, Russian interference.

      • by InPursuitOfTruth ( 2676955 ) on Monday January 22, 2018 @09:28AM (#55977477)
        Prove it. I saw an alleged post by such yesterday. It was just a guy wearing a t-shirt making a joke. This isn't "fake news" because it is a joke, not news. When did making jokes become "election interference"? When did FREE SPEECH become a threat?
        • by bondsbw ( 888959 )

          It's not so much that free speech is a threat. It's that the truth tends to be more boring than sensationalism.

          • by lgw ( 121541 )

            And this has been the case for longer than America has existed, yet democracy seems to do OK. You think the anonymous handbills the Founding Fathers were printing to foment revolution were fair and balanced? Man, they'd make 4chan blush.

            Biased propaganda to influence elections is precisely what the First Amendment is there to protect - good thing too, as some "news" networks are 100% anti-Trump propaganda 24/7.

            I know it's fashionable with pseudo-intellectuals (e.g., TFA) to believe that the average person

            • by Anonymous Coward

              I know it's fashionable with pseudo-intellectuals (e.g., TFA) to believe that the average person is incapable of making any important decision, and thus media should only exist to guide them to the decision made by their betters, but that's aristocracy, not democracy.

              Having long been a Jeffersonian instead of a Hamiltonian, I am nonetheless bothered by the observation that far too many people buy religion from a scifi writer who wrote "the way to get rich is start a religion" and politics from a promoter who wrote "the art of the deal is to tell people what they want to hear." The Dunning-Kruger effect is far too widespread for comfort.

    • by Anonymous Coward

      > Why should they suddenly become a media curator, instead of a social network?

      They've always been one, ever since the newsfeed left the simple "newest first" algorithm behind. What gets buried, and why? That's an editorial decision, regardless of whether it's made by a person in real time or by a programmer implementing some algorithm or neural network or whatever.

  • by Kierthos ( 225954 ) on Monday January 22, 2018 @09:12AM (#55977403) Homepage

    ... they're going to be criticized no matter what they do about this.

    I mean, if they hire an outside group to handle this, the user-base will complain that either the wrong group was picked, or that group was not conservative enough, not liberal enough, not whatever enough... you pick. Hell, they'd probably get accused of everything under the sun.

    However, the same thing will happen if they pick internal users to be their test-bed for this. "Oh, you picked the wrong users. They're too conservative, too liberal, whatever." It doesn't matter, there's bad optics no matter what.

    Doesn't mean that they shouldn't try. Just be prepared for the butt-hurt no matter what you do.

    • Not to worry, at least Russian FB users won't say anything about it at all; for some reason they're too busy looking at what their friends are up to now (Who's dating Ivan now -- Ivana know, tovarisch!)

    • by mwvdlee ( 775178 )

      Option A: Spend millions of dollars on curators and get criticized for it.
      Option B: Do nothing and get criticized for it.

      Oh, and option A has the added risk of accidentally blocking some not-entirely-fake news and getting sued because of it.

      • I wonder about option "C": Allow the user to pick their own review board. For example, have people from the Daily Kos, Breitbart, der Spiegel, CNN, MSNBC, Comedy Central, and other news organizations offer a review/weighting service for articles, with the ability for a user to pick and choose among them. This way, they are not stuck with what one group deems as valid.

        This way, FB can't be accused of being partisan, since people can choose who (if any) reviews news articles and sets validity scores for t

        • by mwvdlee ( 775178 )

          Having news company X review messages posted by news company Y. What could possibly go wrong?

          • by Cederic ( 9623 )

            Well, Steam implemented their curator system and it's led to a wide and varied choice of game recommendations.

            Why not offer something similar for news curation? Yes, there'll be a large number of highly partisan curators. But the market will choose whether to stay with those, or to use the people with a more balanced perspective.

            If people only want news that matches their views then that's going to happen anyway. What Twitter have done is mandate that, and restricted it to a specific viewpoint.

        • I wonder about option "C": Allow the user to pick their own review board. For example, have people from the Daily Kos, Breitbart, der Spiegel, CNN, MSNBC, Comedy Central, and other news organizations offer a review/weighting service for articles, with the ability for a user to pick and choose among them. This way, they are not stuck with what one group deems as valid.
          This way, FB can't be accused of being partisan, since people can choose who (if any) reviews news articles and sets validity scores for them.

          This is also, accidentally, how FB works already.

          The only difference is that instead of paying professionals for their views, it pools the data it already has for free: the opinions and behaviour of your friends.

          It also has the same big drawback: it creates a biased bubble. Except that currently the bubble isn't based on partisan media, but on what you talk about with your friends and what's popular in your circle. And currently the bubble costs nothing.

          What you're proposing will just cost FB more money (gotta pay the re

  • Fix what? (Score:5, Insightful)

    by InPursuitOfTruth ( 2676955 ) on Monday January 22, 2018 @09:22AM (#55977457)
    The concept that there is a "problem" is premised on several notions:
    - Only those in the US should have free speech on the Internet.
    - US readers are unable to think for themselves and scrutinize; they need a protective overlord.
    What would be more acceptable is tagging content that has certain attributes, then letting readers do what they wish with such tagged content. In fact, instead of debating what algorithms or filters FB, Twitter or any other potential big brother should have, how about letting readers customize their own algorithms and be empowered to control what they see in their feeds? Why isn't this concept being proposed?
    • The reason that this is a problem is that it's easier to spread appealing lies than boring truth. As Facebook sells ads, they have a financial incentive to keep people engaged. But they recognize that if the platform becomes overwhelmed by falsehoods (aka "alternative facts"), people will stop engaging at all. Facebook works due to network effect, so if an exodus were to start it may happen quickly. They want to clean up their platform and it has little to do with free speech.
    • The concept that there is a "problem" is premised on several notions:
      - Only those in the US should have free speech on the Internet.
      - US readers are unable to think for themselves and scrutinize; they need a protective overlord.
      What would be more acceptable is tagging content that has certain attributes, then letting readers do what they wish with such tagged content. In fact, instead of debating what algorithms or filters FB, Twitter or any other potential big brother should have, how about letting readers customize their own algorithms and be empowered to control what they see in their feeds? Why isn't this concept being proposed?

      One can already filter and control what is seen in their own feeds. And when the signal-to-noise ratio gets too low, often the best way to control the feed is to fucking unplug from it. Many people have left Farcebook for this very reason, and have been better off without it.

      As far as tagging content goes, that's a dead idea from the start. Clicks and likes are all that matter these days, which is exactly why we continue to have a bullshit peddling problem. And as long as clicks and likes generate mass

  • Facebook can't win (Score:5, Insightful)

    by MobyDisk ( 75490 ) on Monday January 22, 2018 @09:25AM (#55977465) Homepage

    Facebook can't win here. If they spread fake news, people blame them directly. And if they use a panel of experts, then the experts are controlling the news people see. That's not good either. The article attacks them for the decision, saying it is marketing, but I think Facebook is right here. It isn't their job to be the news police.

    This simply isn't Facebook's problem. The users are to blame, and this isn't a new problem. Just like Slashdot or Reddit or any other internet forum, the content is provided by the users and it is not the responsibility of Facebook to tell people that they are idiots. Studies show that people click like on things, and then repost them, without even reading the articles. And most people don't seem to be able to distinguish political fact from fiction even if they do read the articles.

    This problem happened before the internet. In the US, go to a grocery store and look at what news is available for purchase. It is 40% tabloids (AKA "fake news"), 40% celebrity gossip, 40% real news. Facebook is no different.

    This is why I come to Slashdot: there are educated people here, and debunkers here. I go straight to the discussion first because half the stories are garbage.

    • by tsqr ( 808554 )

      the content is provided by the users and it is not the responsibility of Facebook to tell people that they are idiots.

      I agree with respect to user-submitted content. But, when Facebook accepts payment for advertisements, Facebook is responsible for that content.

      • by MobyDisk ( 75490 )

        But, when Facebook accepts payment for advertisements, Facebook is responsible for that content.

        Was the discussion about advertising content? I thought it was about fake news articles submitted by users. I suppose both are a problem. So then Facebook has to filter out news articles posted by ads. That still sounds like a slippery slope, although it does limit the scope. Maybe they just shouldn't allow political ads at all.

        • by tsqr ( 808554 )

          But, when Facebook accepts payment for advertisements, Facebook is responsible for that content.

          Was the discussion about advertising content? I thought it was about fake news articles submitted by users. I suppose both are a problem. So then Facebook has to filter out news articles posted by ads. That still sounds like a slippery slope, although it does limit the scope. Maybe they just shouldn't allow political ads at all.

          Didn't this whole controversy start with ads purportedly paid for by Russian actors?

          • by MobyDisk ( 75490 )

            I think the Russian Facebook ads are what got the public aware. But the problem runs older and deeper than that. "Fake news" isn't just about Russian propaganda. It includes American propaganda. I've had acquaintances and family members passing around articles about Bill Gates' secret eugenics mission, various causes of autism, cures for cancer, proof Obama was born in Kenya, etc.

            Oooh! I just found this article on the history of fake news. [bbc.com]

            Some time ago Slashdot linked to a study showing that users "like

    • by pots ( 5047349 )

      And if they use a panel of experts, then the experts are controlling the news people see. That's not good either.

      ... Why? What if, instead of calling them "experts," we called them "journalists"? Is that still bad? Even though that's how it is anyway, and how it has always been?

      You could make the standard argument about bias, but that's why we get our news from multiple sources (multiple journalists) instead of just one. Let's try a medical analogy: you rely on your doctor for medical advice, because your doctor spends all of his time on that crap and knows a lot more about it than you do. If you disagree with your

      • And if they use a panel of experts, then the experts are controlling the news people see. That's not good either.

        ... Why? What if, instead of calling them "experts," we called them "journalists"? Is that still bad? Even though that's how it is anyway, and how it has always been?

        You could make the standard argument about bias, but that's why we get our news from multiple sources (multiple journalists) instead of just one. Let's try a medical analogy: you rely on your doctor for medical advice, because your doctor spends all of his time on that crap and knows a lot more about it than you do. If you disagree with your doctor or don't like what he says, then you get a second opinion from another knowledgeable doctor who, let me repeat, follows developments in his field closely and knows more about it than you do.

        To complete the analogy: Where do you go to get your second opinion when only doctors who have the same opinion are allowed to talk about it on the forum you use? Perhaps you can go to another forum where only doctors with a different opinion are allowed?

        The only end I see with this approach is building up the walls to your personal echo chamber.

        • by pots ( 5047349 )
          The thing about a second opinion is, it's not necessarily different from the first. The point of seeking a second opinion is not to keep talking to doctor after doctor until one says something you like. If every doctor has the same opinion, it's likely because that opinion is what is true.

          And I did provide an example of someone you can go to in order to get a differing opinion if that's really what you're after: the kook down the street.
          • by MobyDisk ( 75490 )

            We aren't talking about people going to a news site. We are talking about people going to a news aggregator site. Nothing stops us from going to a different site for our news. That's an important distinction. The site gets its news from the users. So it naturally reflects what the users are sharing. That's all it should do.

            We need to stop relying on social media companies and algorithms to apply human moral judgement to data. If my Mom shares a fake news article about how the president is secretly a

          • The thing about a second opinion is, it's not necessarily different from the first. The point of seeking a second opinion is not to keep talking to doctor after doctor until one says something you like. If every doctor has the same opinion, it's likely because that opinion is what is true.

            And I did provide an example of someone you can go to in order to get a differing opinion if that's really what you're after: the kook down the street.

            What I think you are proposing prevents someone from finding the "kook down the street", because the forum is controlled by a governing body - in this case some consensus of "expert" doctors. What will happen is that new ideas and medical procedures will be suppressed alongside the "kooks". Who controls the governing body? European doctors use different techniques and drugs than US doctors, which differ again from Chinese doctors. What about procedures or drugs that are illegal in some places but widely us

            • by pots ( 5047349 )

              If you use set theory to only allow what is universally "correct" much information is excluded.

              This is certainly true, and is what I meant when I said, "we could expect a real cacophony of medical advice and products, all being pushed by interested parties. The volume of information would certainly be larger, but would we be better informed for that?"

              All of your questions are valid, but you seem to be phrasing them as though they were hypothetical. All of the things you say are true right now: we do indeed suppress new medical ideas and procedures, with the reasoning that the rejected procedures

    • Re: (Score:1, Insightful)

      "Educated people"? - Have you seen the Trumptards running amok in Slashdot lately?
      • by MobyDisk ( 75490 )

        No. Because they get modded down. And they get modded down by the users of the site, not by the admins. That's the key.

    • I'm glad that we have 120% news here. Good to know we are leading in that statistic. But this isn't a case of news for sale. With the Facebook news feed it's more like somebody shouting in your face. If the local grocery store let lunatics stand outside and harass customers with conspiracy theories, the business would lose customers. When people see fake news, they don't feel like engaging with Facebook. I don't know how big of a problem this is. I don't see much fake news but then I follow some ver
      • by MobyDisk ( 75490 )

        When people see fake news, they don't feel like engaging with Facebook.

        The problem here is that fake news makes people engage with Facebook *more*! They don't seem to know the difference! And now that someone told them they have been fooled, they want to blame the messenger instead of their own dang selves. I really hope this is a wake-up call to educators.

        • Short-term this is true. And for some demographics it will be true forever. But it will also scare others away. FB recognizes that the additional engagement they are getting from some users is not worth the cost of having others leave the platform. Hence they want to resolve this. I don't think anybody realizes that they've been fooled. The mainstream media has a (mostly) liberal bias. The fake news media has a (strongly) grey-haired white republican bias. FB needs to appeal across demographics so t
  • by Pyramid ( 57001 ) on Monday January 22, 2018 @09:32AM (#55977495)

    No service should be filtering what you see based on unseen algorithms or so-called "experts". Give the individual a robust tool-set to make their own decisions.

    To those of you saying a system should be implemented: to whom exactly would you cede control over what you see? Who knows better than you?

    • All that needs to happen is users' and advertisements' countries of origin need to be properly unmasked, and all sock-puppet accounts connecting through VPNs or proxies need to be banned. Basically just a cursory effort towards enforcement of their existing TOS, actually.

  • Great (Score:1, Insightful)

    by Anonymous Coward

    The 'fact checking' is entirely political.

    The worst fake news comes from establishment media companies.

  • by Charcharodon ( 611187 ) on Monday January 22, 2018 @09:47AM (#55977571)
    They are smart not to outsource their issue. Those so-called "experts" out there all end up just peddling a different form of fake news. Facebook is already in enough hot water; bringing those dumbasses in would have just made things worse. In the end they are still destined for Myspace glory because they can't leave well enough alone.

    Just give me a way to block annoying sources of content, whether they be a person (the crazy Aunt) or a bunch of asshats trolling clickbait. I don't need you telling me what to think,
    • I don't need you telling me what to think,

      Yes, you do. Clearly you also need me to remind you to finish your sentences.

      • Ms. Delarco (my 8th grade English teacher), stop stalking me. I passed your class; you can't tell me what to do anymore.
    • "There's only one thing to fix about "fake news""

      The readers.

      Well, actually educating the populace, especially in 'critical thinking' skills so they may do an effective job of parsing "news" for themselves, is right out, as neither side wants that!

      That kind of reckless action could bring about the end of FB and Twitter, et al, as well as put the vast majority of elected officials out of office and destroy both major US political parties. Oh, the humanity!

      Strat

      • Yes, the fundamentals are well apparent, but great effort is spent to evade them because then it becomes personal.

  • "Our engineers are aware of the problem and we have our top men on it." Maybe they should just turn it off and on again....

  • by Uberbah ( 647458 ) on Monday January 22, 2018 @10:24AM (#55977811)

    The worst psyop since Saddam planning 9/11 and having WMDs is the propaganda about Russian propaganda. [newyorker.com]

    • by Anonymous Coward

      Except Russia's continual war on free countries everywhere is a fact you can't argue with -- and they've never stopped.

      Russians hate democracy and the rule of law like the Devil hates holy water.

      • Except Russia's continual war on free countries everywhere is a fact you can't argue with -- and they've never stopped.

        The CIA has overthrown dozens of governments since WWII. Can you name the last one overthrown by Russia without going back to the USSR? You can skip Crimea unless you want me to explain how much of an idiot you are, as the population wouldn't have voted to join Russia if your literal neo-Nazi pals hadn't taken over Ukraine in a US-backed coup. Furthermore....

        Which country has illegally a

  • by DeplorableCodeMonkey ( 4828467 ) on Monday January 22, 2018 @10:27AM (#55977845)

    This doesn't violate Facebook standards [pjmedia.com]. Or for those who won't read TFL:

    Group Threatening to Burn 'Activist Mommy' Alive Doesn't Violate Standards, Facebook Says

    Amazing how the examples in the linked article don't constitute a violation of standards on harassment and threatening violence. Must be like the Sarah Palin principle. You can say "someone should shit down Sarah Palin's throat" on national television and not be roundly condemned, but say anything about women on the other side, no matter how tepid the statement, and it's going to be a 2 minute hate session.

  • There is no reason people can't read all sorts of things and make a determination on what to believe. They do it all the time: UFOs, aliens, bigfoot, and yes, even politics. Facebook, stay out of it. Be a platform, period.

  • What is this talking about? But I agree with others: no news on Facebook would probably be better.

  • The bucket of reasons for bailing out of Facebook continues to fill. At some point, people are going to decide that they can get their cat video fix somewhere else; somewhere that doesn't unload propaganda and "fake news" on them on every visit (the FB Purity plug-in can't eliminate it all, after all). This goes for other sites that chose to follow Facebook's model, too. (Talkin' about you, LinkedIn.)

  • by fish_in_the_c ( 577259 ) on Monday January 22, 2018 @12:16PM (#55978797)

    The problem with any censor is they will always favor themselves and censor those things that criticize them.
    The problem with having no censor is people will have to decide for themselves what is true or false and will sometimes get it wrong.

    The argument that 'the normal individual can't be trusted, or expected, to know what is true and false' is the greatest argument against any kind of democratic rule that can be made. It is, however, actually the reason why our constitution created a republic with strong states' rights. Inasmuch as we have moved away from that model, we have made it much easier for small groups of people to control the population.

  • OK, because Twitter is turning to blatantly partisan groups. The founder of Common Sense Media is Jim Steyer, the brother of Tom Steyer...Tom Steyer is to the Democratic Party what people think the Koch brothers are to the Republican Party. The National Association of Media Literacy is composed of administrators from various universities and environmentalist groups (which leads me to believe that they are people who believe that Republican=Nazi).
  • Fact-checking experts will appear partisan in today's political climate, where one party lies much more than the other party does. Unfortunately, a popularity contest simply won't be able to distinguish what is factual and what is false propaganda. Partisans (on both sides) will consider truthful anything that reinforces their distorted world-views or tribalism. Human brains are not good at distinguishing truth from falsehood without extensive training and practice; just think about why social engineering,

  • Either eliminate news from FB altogether, or hire people from both sides of the aisle and set guidelines for what actually is fake news (i.e. demonstrably false in a plain language reading based on facts already proven).

    The problem that Facebook, Twitter and Google have is they all exist in an alt left echo chamber and can't figure out why 60% of the country gets pissed at them when they use "fake news" as an excuse to censor legitimate conservative news sources and stories. Not agreeing with a story does

  • Facebook continues to cower behind a broken concept that the company is a neutral platform where all of its participants are equally weighted.

    Are the phone companies accountable for what conversations you hold over their networks? Is the US Postal Service responsible for the content of letters? Come off it guys. You don't get to play thought police in politics and you don't get to play thought police in business. In fact, you don't get to play thought police at all. If you don't like the way people are voting, convince them otherwise or suck it up and deal with it. Don't wrap yourself in flag and scripture and purple robes to tell us what we shou

  • Used to be we'd say that any fool with a printing press can print stuff. That was because a lot of people, if they saw it in print, thought it was true. Some news people published crap. They came up with a word for that: Yellow Journalism - https://en.wikipedia.org/wiki/... [wikipedia.org] . Today CNN fits that bill to a T with their supposed un-named sources when in fact there was no source. It was just speculative bullshit. A good example right now is the collusion delusion. There is no source, there is no proof, yet h
