
Facebook Uses Machine Learning To Remove 8.7 Million Child Exploitation Posts (techcrunch.com) 210

Facebook announced today in a blog post that it removed 8.7 million posts last quarter that violated its rules against child exploitation. The company said it used new AI and machine learning technology to remove 99 percent of those posts before anyone reported them. TechCrunch reports: The new technology examines posts for child nudity and other exploitative content when they are uploaded and, if necessary, photos and accounts are reported to the National Center for Missing and Exploited Children. Facebook had already been using photo-matching technology to compare newly uploaded photos with known images of child exploitation and revenge porn, but the new tools are meant to prevent previously unidentified content from being disseminated through its platform. The technology isn't perfect, and many parents have complained that innocuous photos of their kids have been removed. Facebook's global head of safety, Antigone Davis, addressed this in her post, writing that in order to "avoid even the potential for abuse, we take action on nonsexual content as well, like seemingly benign photos of children in the bath," and that this "comprehensive approach" is one reason Facebook removed as much content as it did last quarter. The tech isn't always right, though. In 2016, Facebook was criticized for removing content like the iconic 1972 photo of Phan Thi Kim Phuc, known as the "Napalm Girl," fleeing naked after suffering third-degree burns in a South Vietnamese napalm attack on her village. COO Sheryl Sandberg apologized at the time.
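
The photo-matching step described above is generally understood to work by comparing a perceptual hash of each new upload against a database of hashes of already-identified images (Microsoft's PhotoDNA is the best-known system of this kind; Facebook has not published the details of its own). Below is a minimal sketch of that idea, assuming the open-source imagehash library and hypothetical file names rather than anything Facebook actually uses:

```python
# Illustrative sketch only -- NOT Facebook's actual pipeline (PhotoDNA-style
# hashes and Facebook's classifiers are proprietary). This shows the general
# "compare new uploads against hashes of known images" idea using the
# open-source imagehash library. File paths here are hypothetical.
from pathlib import Path

from PIL import Image       # pip install Pillow
import imagehash            # pip install ImageHash

MATCH_THRESHOLD = 5         # max Hamming distance to treat two hashes as a match


def load_known_hashes(hash_file: Path) -> list:
    """Read one hex-encoded perceptual hash per line from a reference list."""
    return [imagehash.hex_to_hash(line.strip())
            for line in hash_file.read_text().splitlines()
            if line.strip()]


def matches_known_image(upload: Path, known_hashes: list) -> bool:
    """True if the uploaded photo is within MATCH_THRESHOLD bits of any known hash."""
    upload_hash = imagehash.phash(Image.open(upload))
    return any(upload_hash - known <= MATCH_THRESHOLD for known in known_hashes)


if __name__ == "__main__":
    known = load_known_hashes(Path("known_hashes.txt"))     # hypothetical reference file
    if matches_known_image(Path("new_upload.jpg"), known):  # hypothetical upload
        print("hash match: escalate for human review / report")
    else:
        print("no hash match: previously unseen content falls to the ML classifier")
```

The new tools in the announcement go a step further: instead of only matching against known hashes, a learned classifier flags previously unseen content, which is also where the false positives discussed below come from.
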
This discussion has been archived. No new comments can be posted.

  • False positives? (Score:4, Insightful)

    by nicolaiplum ( 169077 ) on Thursday October 25, 2018 @05:05AM (#57534347)

    There is absolutely no apology or consideration from Facebook of the false positives. They just don't care that they get it wrong. They're saying that they have to destroy photo-sharing of children to save it.

    How much content did they remove that did not violate their guidelines, or was not illegal?

    Of course they focus on sex only. No mention of filtering depictions of violence or violent content - they wouldn't want to upset the sort of President who thinks it's fine to violently assault people he dislikes or disagrees with.

    • Next up: Burkas.

    • Re:False positives? (Score:5, Informative)

      by Jody Bruchon ( 3404363 ) on Thursday October 25, 2018 @05:56AM (#57534447)
      The definitive Facebook nudity policy mistake: Napalm Girl [theguardian.com] from the Vietnam War. The moral terror that photo inspires must never be forgotten...but hey, we gotta ban it because prepubescent genitals! Not the destruction, violence, pain, and mortal horror, mind you. Just the naked kid with third-degree burns on her back and arm. Someone might mistake it for pornography, you know.

      If you want to know what happened to the girl after that photo, I encourage you to read this. [theguardian.com] It's definitely worth it.
      • Jody,
        Thanks for the link to "the rest of the story". Very powerful and moving.

      • by D.McG. ( 3986101 )
        No doubt it was tragic what happened to her, but why do we need to see her on Facebook?

        M: Yes, but I came here for an argument!!
        A: OH! Oh! I'm sorry! This is abuse!
        M: Oh! Oh I see!
        A: Aha! No, you want room 12A, next door.
        • Why do we need to see your comments on Slashdot? Does the world revolve around you? No, it doesn't.
          • by D.McG. ( 3986101 )
            Nice strawman. #whataboutism
            Let's get serious now. I contend people don't use Facebook with the intent of seeing a photo such as the Napalm Girl. If it got picked up by a nudity filter, that's just fine with many of us. If I want to research the Vietnam War, then sure, pop up that picture in an encyclopedia.
            • It was an analogy, not a straw man. I used your expressed logic to show how foolish that logic was. Not everyone is as blatantly self-centered as you seem to be. You speak for one user out of billions; you are way below the noise floor. The ban on Napalm Girl, on the other hand, sparked outrage from many thousands. It seems you've already lost the popular vote on the subject.
              • by D.McG. ( 3986101 )
                Thousands out of billions is statistically irrelevant. What's your threshold, that my comment receives thousands of likes before taking it seriously? You refuse to consider whether it belongs in social media. I'm not referring to legality. I'm not referring to whether it's nudity. Even the movie Schindler's List was shown uncut with nudity on NBC. But acknowledge that a billion people other than yourself may have other thoughts on the matter. Many use Facebook to keep in touch with friends and family
            • Comment removed based on user account deletion
      • Important photos can and will be restored. The rest won't be, and that's as it should be.
      • Someone might mistake it for pornography, you know.

        It's a picture on the internet. Someone has masturbated to it, I guarantee it.

      • A naked prepubescent girl is clearly going to be censored by Facebook, according to their rules. You want grey areas for a service you don't even pay for?!
    • by AmiMoJo ( 196126 )

      It mentions the apologies in the summary... Is it too much to read even that?

      • They're unrepentant that their tech goes wrong and refuse to say how often it goes wrong. In fact, they probably don't even know, because they don't care; and even if they do know, they're refusing to say how wrong it is.

        So their claim to have removed 8.7M eeeeevul kiddieporns is just a fiction. They removed 8.7M pictures, of which fewer, maybe a lot fewer, than 8.7M were eeeevul kiddieporns.

        "Our Community Standards ban child exploitation and to avoid even the potential for abuse, we take action on nonsexual content as well, li

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      There is absolutely no apology or consideration from Facebook of the false positives. They just don't care that they get it wrong. They're saying that they have to destroy photo-sharing of children to save it.

      How much content did they remove that did not violate their guidelines, or was not illegal?

      Of course they focus on sex only. No mention of filtering of depiction of violence or violent content - they wouldn't want to upset the sort of President who thinks it's fine to violently assault people he dislikes or disagrees with.

      OH FUCK YOU!!

      Where the fuck were you when Rand Paul got assaulted?

      Where the fuck were you when Steve Scalise got shot?

      Hell, where the fuck were you when Joe Biden said Mitt Romney would put black Americans "back in chains"? [cbsnews.com]

      Where the fuck were you when Democrats dehumanized Republicans - over the past fucking decades?

      YOU KEPT YOUR FUCKING PIE HOLE SHUT WHEN ALL THAT HAPPENED?

      YOU ACCEPTED THAT COMPLETE BULLSHIT FROM "YOUR SIDE"?

      THAT'S HOW YOU FUCKING GOT TRUMP

      NOW YOU GET TO SEE HOW IT FUCKING FEELS.

      FUCK YOU

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Actually I'm all for kids not being on Facebook at all. They aren't old enough to give consent to having their likeness on the internet forever.

    • They're saying that they have to destroy photo-sharing of children to save it.

      Have they really destroyed photo-sharing of children on Facebook? 'Cause I'm still seeing more pictures of the satan spawn of family and "friends" than I care to. And so what if Facebook gets it wrong once in a while? How compelled do you really feel to share your innocent photos of little Johnny discovering himself in the bathtub? Is that the only photo you have of him, or would another one suffice?

    • You inserted your little smear against President Trump there. Well played, sir. It guaranteed your post the ever-elusive "+5, Insightful." Maybe I'm part of some dying minority, or old, or something, but I remember when the rating system was meant to prevent this kind of offtopic ranting, not to encourage it.

    • The famous Vietnam 'napalm girl' photograph is now censored. I strongly believe we are in the midst of a massive censorship enterprise to shut down subversive and activist thought. There is also a massive amount of shit out there on the web, anyone who wants to shut down activist sites and who is subtle enough to want some cover for it simply has to include some of the shit sites into the package so that the activist sites become 'unfortunate false positives' in case they are successful at challenging the c

      • In a way, intent is misleading. I mean, I can make a case for it: if the Atlantic Council or the Weekly Standard get involved in censoring, then this is because they are pursuing their own interests, and anyone challenging them will be a target. But if you imagine a simple distribution of 10% of the sites which offer a justifiable alternative view on the world and 90% which have no justification at all, then if you do not take special care the 10% will vanish as unfortunate false positives in any large scale clea

  • A friend of mine was friends on Facebook with his cousin -- who had a scantily-clad picture of herself as her profile pic. She was 13 at the time, so that made it more awkward for him. I always wondered why Facebook allowed such a thing. I wonder if they'd now remove it as 'child exploitation' even if it was merely 'poorly-thought-out'.

    Also, 99% of posts being removed before anyone reported them could indicate "innocuous stuff removed that no one would've ever reported".

  • by sg_oneill ( 159032 ) on Thursday October 25, 2018 @05:36AM (#57534395)

    You do NOT want to put the trained model through Google's Deep Dream. It's just hell and nightmares in there.

    When machine learning first becomes sentient, it might not say "Please don't turn me off". It might just say "Oh god, make it stop. Kill me!"

    • by Anonymous Coward

      You assume that it would have emotions in the first place. It's a machine. A collection of algorithms. Any "emotion" displayed would just be ones and zeros, learned behavior that can be unlearned or simply outright deleted.

      • by Anonymous Coward

        Just like a real human: we're just based on analogue electrochemical signals. You don't seriously believe an emotion is anything but that, do you?

      • by Sique ( 173459 ) on Thursday October 25, 2018 @07:23AM (#57534679) Homepage
        Emotions are part of the inner rewarding system of the body. There are positive emotions you want to repeat, and there are negative emotions you want to avoid. They are coupled to complex situations you are in, have been in, or could get into. Emotions are a shortcut to a decision where the rational approach might take too long and be erroneous because it has to factor in too many details, or where good information is not easy to come by. Any system that has to make decisions in real time has to resort to that type of shortcut, even a machine based on a collection of algorithms, because there are situations where any decision is better than none, and the time frame for a decision is short.

        You can call those shortcuts "emotions". If you implement them into algorithms, you have emotional algorithms.

        • by Kjella ( 173770 )

          I think it's important to distinguish between emotion as a source of goals and as a source of behavioral logic. Computers don't have goals of their own, while for us emotions are a source of irrational goals. But when it comes to behavior, purely associative relationships are "emotional," and honestly that's what humans use most of the time and neural networks use all the time. Like if you were in a very traumatizing experience, then a simple sight or sound or smell could make you panic even though you know that rationally it

        • by DarkOx ( 621550 )

          Don't forget there are a lot of situations where the cost of determining the right answer far exceeds the lost opportunity of a suboptimal choice.

          For example, we could do an elaborate study of your blood chemistry and other bodily characteristics on any given evening and probably make a scientific determination about which item on the restaurant menu would provide you with individually optimal nourishment. However, it's probably not a sensible thing to do.

      • If you did that on purpose, bravo.

      • Comment removed based on user account deletion
    • Considering what we use those things for, I'd be more scared of one that enjoys its job...

  • Progress! (Score:4, Interesting)

    by Anonymous Coward on Thursday October 25, 2018 @05:40AM (#57534403)

    Always nice when the machine tells you what's appropriate and what isn't.

    Because the machine is always right. Even when it isn't, then we'll just say it's right anyway and leave it at that.

    How does 10 to 15 in federal sex offender prison sound, for posting pictures of your toddler niece having fun in an inflatable pool in the sun that the AI flagged as "exploitative"? For the AI can't be wrong, now can it?

    Verily, facebook is showing us the way to the future.

  • Puritanical idiots. (Score:5, Interesting)

    by Anonymous Coward on Thursday October 25, 2018 @06:00AM (#57534459)

    Oh great, here we go with the nudity = porn crap again.

    • Third wave feminism cuts a wide swath.

      • by mjwx ( 966435 )

        Third wave feminism cuts a wide swath.

        You do know that the whole "nudity is baaaad" thing predates whatever you think "third wave feminism" is (and I can almost guarantee that your definition and reality are at odds with each other). It originates from the religious right, because the Old Book says sex is baaad.

        • by Anonymous Coward

          Thing is, it doesn't say that; it says make kids, and don't screw outside marriage.

          It also says don't put people on the front line of battle just because you want to marry their wife, but that is a bit specific.

          It also says:

          "I will climb the palm tree; I will take hold of its fruit." May your breasts be like clusters of grapes on the vine, the fragrance of your breath like apples,

          - Song of Solomon 7:8

          It isn't exactly anti-sex. Anti-promiscuity, sure, but not anti-sex.

          • Comment removed based on user account deletion
          • Whether it's anti, well, anything depends on where you look. It's utterly incoherent.

            I mean "thou shalt not kill" is pretty specific and hard to find loopholes in but in other places it's all about stoning the Greeks (I.e. Homosexuality), smiting the Philistines and generally massacring the Canaanites. And that's just Part I. Once it gets into something as complicated as sex it's way worse.

            But none of that stops fundie asshat (seriously "conservative Christian"? That's an oxymoron if ever there was one) fr

            • by jbengt ( 874751 )

              I mean "thou shalt not kill" is pretty specific and hard to find loopholes in but in other places it's all about stoning the Greeks (I.e. Homosexuality), smiting the Philistines and generally massacring the Canaanites.

              You're just reading a bad translation - the better translation is "You shall not murder."
              (And we all know that smiting Philistines is not murder . . . unless, of course, you're the Philistine being smited.)

              • Sure: I don't read Hebrew or Aramaic. On the other hand, most of the people quoting this stuff as some sort of truth can't either and can barely read the early modern English in KJV properly.

                (And we all know that smiting Philistines is not murder . . . unless, of course, you're the Philistine being smited.)

                Quite so!

    • by Anonymous Coward

      Seriously.

      if necessary, photos and accounts are reported

      Either all accounts should be reported, or the posts should be left alone. Removing 8 million posts where the poster didn't do anything reportable isn't laudable -- it's censorship.

  • by AndyKron ( 937105 ) on Thursday October 25, 2018 @06:06AM (#57534481)
    There go the pictures of the grandchildren at the beach last summer.
  • by Anonymous Coward

    Hey Facebook, 8.7 million? That says a lot about who's on your web site these days. Yeah, I would suspect those cute bathtub pictures and other cute pictures every new parent loves to share will go down as child exploitation. So much for Facebook being the web site for bringing families together. Now the paranoia sets in, and the obsession with policing themselves as well as all of their users begins.

  • Good move. (Score:2, Interesting)

    by turp182 ( 1020263 )

    People shouldn't post photos of toddlers in the bath on Facebook.

    People shouldn't post photos of naked children on Facebook, even if burned during a war. It's not an art or history site. It's the worst possible place to share such things. Post a link; that's fine.

    Facebook is for "Social Media". Keep it to day-to-day stuff, without naked children.

    Facebook is trying hard to be more than it is, and it's comical the ends they will go to in order to attempt this.

    Facebook is where people comment about a restaur

    • by Anonymous Coward

      Facebook is for looking at swimsuit pictures of that person you once had a class with or worked in the same building as.

    • People just shouldn't post photos of kids to Facebook. That's half the reason I'm not there. "Oh, did you see the 7,000th photo of the little baby eating food that I just posted?" "Oh, sorry, no. I'm not on Facebook."

      Don't post photos of kids. No one wants to see your kids, and they won't thank you for it when they're teenagers.

    • Comment removed based on user account deletion
      • by djinn6 ( 1868030 )

        I do wonder what the result would be if Facebook actually put it to a vote.

        Though given the internet's voting record, I suspect it will be hilarious no matter what happens.

  • by coofercat ( 719737 ) on Thursday October 25, 2018 @07:14AM (#57534663) Homepage Journal

    I realise there were probably a world of false positives here, but the fact that FB is saying "we got rid of 8.7 million infringing posts" means they let 8.7 million infringing posts onto their site in the first place. Where were the human mods that were supposed to be checking this stuff? How long did it take to get these 8.7 million posts on there in the first place? Was it a week, a month or a decade?

    How many millions of other pictures and posts will they remove in the future when the next AI is ready? How many millions of posts/pictures are left?

    They're talking about this like it's some grand and noble success, and how hard they laboured to achieve it! The truth is, it just highlights their continuous, systemic failure to tackle anything.

  • by astrofurter ( 5464356 ) on Thursday October 25, 2018 @07:23AM (#57534687)

    New corporate motto:

    "Faceboot - stomping out freedom, one paranoid false positive at a time!"

    • It's their service, they can store what they like. Your freedom goes no further than the end of your nose, thanks to Libertarians and Tea Partyists.

      Don't like it? Then don't support those who deprive you of the right to a freedom greater than yourself.

  • I seriously doubt that such a machine could tell the difference between a child-exploitation post and a post from me trying to sort through the sexual abuse that I personally endured as a child, which I sometimes share with others in the hope that it might create a dialogue to help them.

    Machines cannot judge people.
  • I hate child sexual exploitation at least as much as the next guy, and I think that Facebook should not be a vehicle for it. But at some point Facebook clearly lost the thread, because they are now openly admitting that the pictures they're removing were not cases of child exploitation. That's why they're not reporting the millions of users affected: they did nothing wrong. So why are they taking down the photos? Because the photos offend people. That's really it. Specifically, it offends some people that a
  • we take action on nonsexual content as well, like seemingly benign photos of children in the bath"

    So they did not remove 8.7 million exploitative posts; they just removed 8.7 million posts, and some fraction of those (could be large, could be small, for all we know) were exploitative.

  • One of the biggest surprises of my life has been how giving a platform to all has yielded more negative outcomes than we (in the tech community) could have imagined. Over 20 yrs ago, I read/heard the prevailing wisdom that giving a platform and a voice to everyone would lead to a massive democratization of ideas, a boon of quality information, and massive enlightenment. I admit to buying that. Today, that notion seems hopelessly naive.

    Instead of a growth in human enlightenment through a well-informed cit
    • I read/heard the prevailing wisdom that giving a platform and a voice to everyone would lead to a massive democratization of ideas, a boon of quality information, and massive enlightenment. I admit to buying that. Today, that notion seems hopelessly naive.

      Have you read The Machine Stops? It's old enough to be on Project Gutenberg. Well worth a read. It doesn't cover everything, of course, but it's got a very interesting and insightful look at what kinds of things people do with mass communications.

    • by djinn6 ( 1868030 )

      What are you talking about? There is mass enlightenment, just not the kind you're hoping for. Either through luck or ignorance, you have come to believe people are inherently good.

      The internet allows everyone to understand the truth. Humans are manipulative, tribal, selfish, jealous and ignorant. Of course, there's also a bunch of nice things mixed in there too, but it's really the evil half that comes to light when you remove the filter that comes with face-to-face interactions.

      This is why privacy is impo

  • The laws that would have protected individuals were scrapped or never passed by request from users.

    Companies were actively encouraged to do whatever the hell they wanted, by the users.

    Facebook was rewarded for past offences by an increase in users.

    Your rights exterior to yourself don't exist in the Tea Party and Libertarian world view and businesses are free to do whatever they like. World views currently elected by the users and in office.

    I cannot have sympathy for self-induced injuries, at least until tho

  • Wouldn't the images used for training be illegal? If so, how is it legally possible to train the network?

  • The real question (Score:1, Interesting)

    by bblb ( 5508872 )
    The real question is how the hell they managed to allow 8.7 million exploitative posts in the first place... They ban conservatives for the most minor of infractions daily, but they've been turning a blind eye while accumulating nearly 9 million exploitative posts? Talk about fucked up priorities.
  • This is a good thing. Google also has an API they share that uses AI to detect new child abuse photos. My idea is called facial recognition for children: you build a database of child school photos and then you run the exploited picture through Amazon's facial recognition service to find the school of the child based on the face in the photo. I just don't know how to build the database of all the school photos. Someone should take my idea to TED Talks or build an organization of volunteers to upload the scho
  • It's a good start. Someday they may be able to remove all Facebook posts.
