Facebook Admits Flaw in Image Moderation After BBC Report (bbc.com)

From a report on BBC: A Facebook executive has admitted to MPs its moderating process "was not working" following a BBC investigation. BBC News reported 100 posts featuring sexualised images and comments about children, but 82 were deemed not to "breach community standards." Facebook UK director Simon Milner told MPs the problem was now fixed. He was speaking to the Commons Home Affairs Committee alongside bosses from Twitter and Google as part of an investigation into online hate crime. The BBC investigation reported dozens of posts through the website's reporting tool, including images from groups where users were discussing swapping what appeared to be child abuse material. When journalists went back to Facebook with the images that had not been taken down, the company reported them to the police and cancelled an interview, saying in a statement: "It is against the law for anyone to distribute images of child exploitation."

Comments:
  • "Appeared" to be (Score:4, Insightful)

    by Anonymous Coward on Tuesday March 14, 2017 @01:51PM (#54038525)
    "Appeared" to be child abuse material, but under no reasonable law would they be classified as such.
    • by arth1 ( 260657 ) on Tuesday March 14, 2017 @02:15PM (#54038659) Homepage Journal

      "Zero tolerance" laws are always, no exception, a bad idea.
      It's sad when an e-mail service flags and blocks pictures I send to my brother of himself as a child, in a bath tub. Who knows what lists you can get added to because the laws are just plain wrong.

      • ""Zero tolerance" laws are always, no exception, a bad idea."

        Er, is that a "law"?

        Sorry, I haven't had my coffee yet and am still feeling pedantic.

      • "Zero tolerance" laws are always, no exception, a bad idea.

        I like to call them Zero Intelligence laws.

      • So, would you say you have zero tolerance for zero tolerance laws?

        • by arth1 ( 260657 )

          Laugh! No, I tolerate them, but they are still a bad idea. I support getting rid of them, no matter what the zero tolerance policy is; whether it's banning Erlenmeyer flasks, banning knives at school (prompting a crafts teacher to hand children chisels), or anything else.

          (Which is why I'm also against common law and prefer civil law - in common law, stupid precedents become law, and courts can't use common sense to interpret the meaning of the law, and get stuck on the letter of the law. Zero tolerance p

          • I agree. I would throw "three strikes" laws in with this as a subcategory of zero tolerance-type laws.

            • by arth1 ( 260657 )

              I agree. I would throw "three strikes" laws in with this as a subcategory of zero tolerance-type laws.

              Yes, someone who steals *food* a total of three times going to jail for life is not too helpful. And far more expensive than e.g. creating work programs.

              In my opinion, most knee-jerk good intention laws are flawed.
              That also includes idiotic laws that make rape carry far stronger sentences than other acts of violence, which have unintended consequences like women not reporting rape because they don't want their husband or family member to go to jail for life, or where the perpetrators may choose to kill their

      • by cusco ( 717999 )

        I'm reminded of the couple in the '90s who were charged with Child Pornography for putting a photo of their 2 year-old playing naked in the lawn sprinkler on their web page.

        • by rtb61 ( 674572 )

          This makes you stop and think about all of the fuss over nakedness: is it somehow unnatural?

      • Zero tolerance laws seem to exist mostly to protect officials from accountability, so they can overreact in safety.

        "Yes, I know it seems unreasonable to expel your child for pretending to shoot at a friend with a drinking straw, but my hands are tied. I'm sorry, but I have no choice in the matter. It's the law."

  • for future finances
  • by Anonymous Coward

    ...Is the capture and punishment of those creating these materials. The fact that these materials (photos, videos, etc.) exist is secondary to the more serious crime of child exploitation itself. Humanity often seems to be more concerned with treating the symptom instead of the root cause of the problem, so to speak.

    • by Teun ( 17872 )
      No thieves without fences.

      The interesting thing is Facebook reporting the reporters to the police.

  • by Anonymous Coward

    It's like asking AI to define pornography. You'll 64% know it when you see it, maybe, if the image fits a certain general profile. That's not going to work. Hire eyeballs with your BILLIONS OF DOLLARS.

  • by Baby Duck ( 176251 ) on Tuesday March 14, 2017 @02:11PM (#54038643) Homepage
    Maybe this is why whenever I flag a video for showing actual homicide, it never gets taken down.
    • by Anonymous Coward

      Why should they be taken down? Videos showing actual homicide aren't illegal. From the tone of the article, neither were the images BBC was reporting. Sounds more like it was teenager bikini selfies and stuff out of underwear catalogs. Just because someone might be offended isn't a good reason to go around censoring everything.

      • by Anonymous Coward

        Why does it matter if someone masturbates to a swimsuit catalogue? Even if that catalogue was for younger girls? The point is the parents of those children signed releases for their images to be used in advertising material and distributed widely. In some cases the producer and artistic director of the shoot pushed the boundaries of what society considers decent, or maybe they did not: though at all times the figure behind the [entirely sanctioned and legal] camera was trying to make the shot "special" or t

  • Without knowing exactly what was in the offending posts, how can we possibly know what to think? Somehow I doubt it's that easy to find hardcore child abuse images on Facebook, so these might be almost innocent - it wouldn't surprise me if most of them are just swimsuit images from someone's family holiday, or screencaps from Toddlers and Tiaras or some other TV program.

    It reminds me of certain very socially-conservative news sites I've seen decrying something or other as an affront to all that is good in t

  • When is Facebook going to admit that it is physically impossible to filter out all the kiddie porn? According to this website https://zephoria.com/top-15-va... [zephoria.com] there are 300 million pictures posted EVERY DAY. Even if someone can review 1,000 pictures a day, Facebook would have to hire 300,000 people to ensure none of the pictures posted are 'kiddie porn'.

    And computer scientists know there is no automated way to screen these photos without generating false positives. Even an algorithm that was 99% accurate would mean that 1% of 300 million pictures, or 3 MILLION pictures, would get falsely reported as child pornography and taken down every single day. (A back-of-the-envelope sketch of this arithmetic follows at the end of this thread.)
    • Even if someone can review 1,000 pictures a day, Facebook would have to hire 300,000 people to ensure none of the pictures posted are 'kiddie porn'

      Good. Facebook's stupid system should be unbearably expensive, even with content moderator farms in Morocco where your image doesn't get more than a second's consideration.

      It's currently a kiddie-pool design, where people can get accounts semi-anonymously, people can report things that make them feel uncomfortable anonymously, and the images are taken down anony

      • How 'bout this? You show some ID to get an account.

        How 'bout I don't show some ID to get an account at a place that doesn't require all that shit?

        Facebook is not a goddam governmental agency where membership is a requirement.

        Facebook is a business and will spend money to fend off external efforts to get it to spend money.

    • by dgatwood ( 11270 )

      ... Even if someone can review 1,000 pictures a day, Facebook would have to hire 300,000 people to ensure none of the pictures posted are 'kiddie porn'

      And computer scientists know there is no automated way to screen these photos without generating false positives. Even an algorithm that was 99% accurate would mean that 1% of 300 million pictures, or 3 MILLION pictures, would get falsely reported as child pornography and taken down every single day. And let me tell you, our image recognition algorithms are nowher

    • Even if someone can review 1,000 pictures a day, Facebook would have to hire 300,000 people to ensure none of the pictures posted are 'kiddie porn'

      Why did you pull such a bad number out of your ass?

      1000 is basically only 2 images per minute for an 8 hour work shift. If you worked for me I'd be looking into your productivity right now.

      • I will admit, I pulled '1,000' pictures a day out of my ass. But it still seems pretty reasonable to me; it's 30 seconds an image. If it's a picture of a dog then that's only 1s to analyze and move on, but then you come to someone who posted a picture of their 8 year old in a swimsuit. Is that okay? Better ask the manager. That's more than 30 seconds. Manager says it's okay. 15 pictures later and it's a pic of an 8 year old in a swimsuit lying on his back smiling at the camera. That was probably innocently taken
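
A minimal back-of-the-envelope sketch of the arithmetic in the thread above, assuming the commenters' figures rather than anything Facebook has published: 300 million photo uploads a day (the zephoria.com claim), 1,000 manual reviews per moderator per day, and a hypothetical classifier that is 99% accurate, with every error counted as a false positive:

    # Sketch of the thread's moderation arithmetic. All figures are the
    # commenters' assumptions, not published Facebook data.
    UPLOADS_PER_DAY = 300_000_000   # zephoria.com claim: 300 million photos/day
    REVIEWS_PER_MODERATOR = 1_000   # assumed manual reviews per person per day
    FALSE_POSITIVE_RATE = 0.01      # hypothetical 99%-accurate classifier

    # Headcount needed to manually review every single upload.
    moderators_needed = UPLOADS_PER_DAY // REVIEWS_PER_MODERATOR

    # Innocent images wrongly flagged per day if every error is a false positive.
    false_positives_per_day = int(UPLOADS_PER_DAY * FALSE_POSITIVE_RATE)

    print(f"Moderators for full manual review: {moderators_needed:,}")  # 300,000
    print(f"Images falsely flagged per day at 99% accuracy: {false_positives_per_day:,}")  # 3,000,000

Both numbers line up with the thread's claims: full manual review implies a workforce of about 300,000 people, and even a 99%-accurate automated filter would wrongly flag some three million images every day.
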
  • We need to levy fines large enough that they act as a deterrent. FB would quickly find a solution to this problem.

  • by PPH ( 736903 ) on Tuesday March 14, 2017 @03:43PM (#54039277)

    ... let BBC stand for the British Broadcasting Corporation. Or I'm not clicking on that link.

  • Disclaimer: I don't want to see child porn or ISIS propaganda of people getting their heads cut off. EVER.
    However: So-called 'content moderation' should be called what it is: censorship. Doesn't matter if it's a non-governmental, privately-owned company enforcing their own rules on their own website; it's still censorship. Stop with the 'newspeak' already.
    • by Maritz ( 1829006 )
      If it isn't state censorship, nobody cares. And that's precisely how it should be. If you don't like a platform's policies, off you fuck.
