Yahoo Open Sources a Deep Learning Model For Classifying Pornographic Images (venturebeat.com)

New submitter OWCareers writes: Yahoo today announced its latest open-source release: a model that can figure out whether images are pornographic in nature. The system uses a type of artificial intelligence called deep learning, which involves training artificial neural networks on lots of data (like dirty images) and getting them to make inferences about new data. The model, now available on GitHub under a BSD 2-Clause license, comes pre-trained, so users only have to fine-tune it if they so choose. The model works with the widely used open-source Caffe deep learning framework. The team trained the model using its now open-source CaffeOnSpark system.
The new model could be interesting to look at for developers maintaining applications like Instagram and Pinterest that are keen to minimize smut. Search engine operators like Google and Microsoft might also want to check out what's under the hood here.
The tool gives images a score between 0 and 1 indicating how NSFW a picture looks. The official blog post from Yahoo outlines several examples.
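
For the curious, scoring an image is a short pycaffe script. A minimal sketch, assuming a working Caffe install and a local checkout of the open_nsfw GitHub repo (the model file paths and the `test.jpg` input below are illustrative placeholders following the repo's layout):

```python
# Minimal sketch: score one image with Yahoo's open_nsfw model via pycaffe.
import caffe
import numpy as np

net = caffe.Net('nsfw_model/deploy.prototxt',
                'nsfw_model/resnet_50_1by2_nsfw.caffemodel',
                caffe.TEST)

# Standard Caffe preprocessing: HxWxC float image in [0,1] -> CxHxW BGR,
# mean-subtracted, scaled to [0,255].
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2, 0, 1))             # HWC -> CHW
transformer.set_mean('data', np.array([104, 117, 123]))  # per-channel BGR mean
transformer.set_raw_scale('data', 255)                   # [0,1] -> [0,255]
transformer.set_channel_swap('data', (2, 1, 0))          # RGB -> BGR

img = caffe.io.load_image('test.jpg')                    # RGB float in [0,1]
net.blobs['data'].data[...] = transformer.preprocess('data', img)
out = net.forward()

# The 'prob' output is a two-way softmax: [SFW, NSFW].
nsfw_score = out['prob'][0][1]
print('NSFW score: %.3f' % nsfw_score)
```

The project's documentation suggests treating scores below 0.2 as likely safe and scores above 0.8 as likely NSFW, with the middle band best left to human review.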

Comments:
  • Cool (Score:4, Insightful)

    by plopez ( 54068 ) on Friday September 30, 2016 @03:41PM (#52991265) Journal

    Now all you need is a good definition of what is pornographic. Personally I find gratuitous violence to be pornographic.

    • Re:Cool (Score:5, Insightful)

      by cdrudge ( 68377 ) on Friday September 30, 2016 @03:45PM (#52991281) Homepage

      That's easy. You'll just know it when you see it [wikipedia.org]

    • by mentil ( 1748130 )

      It seems to be trying to classify images based on how NSFW they are, which is different from how pornographic they are. For example, there are some artistic nudes which may not be commonly considered pornographic, but are still NSFW.

      • Re:Cool (Score:5, Insightful)

        by ShanghaiBill ( 739463 ) on Friday September 30, 2016 @04:43PM (#52991635)

        As usual, Yahoo is missing the market. Rather than a binary porn/not-porn, there would be a MUCH bigger market for a porn classifier that could help people find what they like. If their DL-NN is based on RBMs [wikipedia.org] they could even use it in generative mode to create porn to individual tastes.
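
For readers who haven't met RBMs: "generative mode" means running the network backwards with block Gibbs sampling, alternately sampling hidden and visible units until the visible layer settles into something the model considers plausible. A toy numpy sketch with random stand-in weights; note that Yahoo's actual release is a feed-forward CNN, not an RBM, so this illustrates the poster's hypothetical, not the shipped model:

```python
# Toy RBM "generative mode": start from noise, Gibbs-sample toward the
# model's distribution. W, b, c are random stand-ins for trained parameters.
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 64, 16
W = rng.normal(0, 0.1, (n_visible, n_hidden))  # weights (from training, in reality)
b = np.zeros(n_visible)                        # visible bias
c = np.zeros(n_hidden)                         # hidden bias

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    """One round of block Gibbs sampling: v -> h -> v'."""
    h = rng.random(n_hidden) < sigmoid(c + v @ W)    # sample hidden units
    v = rng.random(n_visible) < sigmoid(b + W @ h)   # sample visible units
    return v.astype(float)

v = rng.integers(0, 2, n_visible).astype(float)      # start from random noise
for _ in range(1000):
    v = gibbs_step(v)
print(v.reshape(8, 8))                               # a "generated" 8x8 binary image
```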

      • by allo ( 1728082 )

        Slashdot is NSFW, like everything at work that isn't work-related.

        If you're allowed to browse on your workstation, artistic nudity should not be a problem.

    • by gnick ( 1211984 )

      My guess is that "pornographic" will be roughly equivalent to "containing nudity." Nudity, of course, being the most dangerous thing a person can be exposed to. Gods forbid that a child sees a nipple.

      • I am afraid that The Origin of the World [musee-orsay.fr] will be considered pornographic by any algorithm capable of identifying it. I wouldn't be surprised if many people actually think it is pornographic.
        • Maybe - but from an NSFW standpoint, that link definitely qualifies, DavisMZ.

        • Pornography is in the eye of the beholder

          Just go with the "what would your employer think" standard and it almost universally becomes images with nudity.

          I am afraid that The Origin of the World [musee-orsay.fr] will be considered pornographic by any algorithm capable of identifying it

          Is that really something you fear? Are you really concerned that too much nude artistry is going to be filtered out from your search results?

      • Gods forbid that a child sees a nipple.

        Well, it is a slippery slope, where does it end? The next thing you know we'll be exposing innocent newborn infants to nipples. Degenerates!!!
  • Is that what they're calling it nowadays?

    • by Tablizer ( 95088 )

      Is that what they're calling it nowadays?

      Deep learning can now penetrate your pathways in expanding ways and inject fulfilling content that triggers a euphoria of discovery and edification.

    • How deep, baby? Tell me how deep your learning goes!

      (did yahoo! classify this? will they tell us in two years?)

  • Great, it's about time ads got this kind of treatment.

  • now with AI
  • About time! (Score:3, Funny)

    by Anonymous Coward on Friday September 30, 2016 @03:52PM (#52991341)

    Glad to hear AI is finally being used for a benevolent purpose: To more easily locate pornographic images! So now we can just bypass the Google images search with safe search off when we're looking for stuff for the spankbank, right? And, practically overnight, Yahoo! becomes relevant yet again.

  • by clonehappy ( 655530 ) on Friday September 30, 2016 @03:53PM (#52991347)

    What possible uses does this have other than censorship?

    • Re:Why? (Score:5, Interesting)

      by ArtemaOne ( 1300025 ) on Friday September 30, 2016 @03:56PM (#52991365)
      My guess is that since they purchased Tumblr, they're facing the fact that it is one of the biggest nude image collections ever. For years I've joked with my wife about scrolling: funny image, social justice post, kitten, woman receiving anal, puppy dog, web comic, nude woman, kitten
      • Well, you can laugh all you want, but I object to the de facto censorship imposed on us by these de facto monopolies like Facebook, Google, and now Yahoo(?)

        As if simply seeing something is the worst affront that one can suffer, so we need this AI nanny.

        I agree with others on many points:

        - I would rate violent images worse, automatically

        - what about artistic nudes? Is this thing smart enough to discriminate between guys with cameras and the good stuff?

        - what about legitimate naturist and nude beach mementos?

        • I get part of what you're saying, but what is with the attitude as if I'm pro censorship? When did I laugh? Try to turn down your douche-nozzle. The nudity is one of the best things about Tumblr, and I'm afraid of Yahoo ruining that to try to make Tumblr more family friendly, which it is fundamentally not.
        • > - what about artistic nudes? Is this thing smart enough to discriminate between guys with cameras and the good stuff?

          Doubt it.

          I guess Leonard Nimoy's (yes, "Spock") photography books will be classified as pornographic:

          * Shekhina [amazon.com]
          * The Full Body Project: Photographs by Leonard Nimoy [amazon.com]

        • by dbIII ( 701233 )

          - what about artistic nudes? Is this thing smart enough to discriminate between guys with cameras and the good stuff?

          Of course not, you'd need an automatic poet for that.
          To do it just follow Stanislaw Lem's instructions.
          First simulate a universe ...

        • by bmo ( 77928 )

          Are we to raise an entire generation to think that shooting (imaginary) people until blood splatters the virtual screen is just peachy keen, but those photos of our trip to the nude beach are just oh so terrible?

          I suggest you study your own question a bit more and I'm sure you'll come up with your own answer about the utility and value of violence (for the people who are part of the club ("it's a big club and you ain't in it" -- Carlin)) as opposed to the value of art, mementos, and porn (I suggest that por

    • So I can sort my porn collection into softcore and harder. Duh.

    • Mainstream ad networks (such as Google AdSense) don't allow their ads on sites with adult/NSFW content. So if you run an ad-supported site with user-generated content, you have to screen the images somehow.

      Call it censorship if you want to, but these algorithmic methods are getting better, and it's pretty useful as there will always be nitwits uploading pics of their manhood.

    • by ceoyoyo ( 59147 )

      Well, some engineer at Yahoo convinced his boss that he should spend his work time surfing porn... for training the model, yeah, that's it.

      Also, if Yahoo wanted to be profitable again they could have the best porn search engine by tomorrow.

    • To power the "safe search" option on a search engine. Self-censorship does not really count as censorship. And if I need to search for questionable words at work, I'd prefer not to have NSFW results.

    • Curation.

  • an overweight guy in sweat pants in his mother's basement?

  • Welcome to Nazi Germany. Papers, please.

    • by ffkom ( 3519199 )
      Actually, "Nazi Germany" wasn't quite as uptight with regards to nudity as today's USA is - on 10th July 1942, for example, the Nazi regime generally allowed naked bathing in open waters in a new "Reichsverordnung".
  • Who cares about filtering these images? I want to hook this up to an internet spider and have it go out and fetch me a vast collection of glorious pornographic images.

  • by Snotnose ( 212196 ) on Friday September 30, 2016 @04:03PM (#52991407)
    If it can tell various types of porn apart then it can categorize, um, my friend's collection he keeps meaning to organize.
  • Classify THIS [youtu.be], Mariss.
    • by PPH ( 736903 )

      Schwing!

    • Wow, that thing's erection is...amazing. So long, and versatile, and hard. I bet women look at it and go, "Oh my god!"

      I wish I could say I was the proud owner of a Putzmeister.

    • WARNING: The video above depicts multiple men yanking an extremely long tool all over the place. Most of the video focuses on the men slowly getting the tool erect.

  • "open source"? "pornography classification"?

    I see no possible way this could go awry.
    Do carry on.
  • So is deep learning about learning how to deep throat without choking the whole time?

  • Great,

    But can it detect duplicates and organize them by category for me?

  • by watermark ( 913726 ) on Friday September 30, 2016 @05:44PM (#52991927)

    New challenge: find an image that scores a perfect 1.

  • by PacoSuarez ( 530275 ) on Friday September 30, 2016 @05:45PM (#52991933)
    You can feed an image to this network and use backpropagation to compute the gradient of the NSFW score with respect to the pixel values in the input. A few steps of gradient ascent/descent and you'll get a spiced up/down version of the original image. I believe this is roughly what DeepDream does. The results could be hilarious. It is very possible that Yahoo has inadvertently created an open-source porn generator.

    Any takers?
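
The recipe PacoSuarez describes is straightforward to try. A rough pycaffe sketch, continuing from the scoring snippet in the summary above (same illustrative model paths, image already preprocessed into the 'data' blob); one assumption to note: Caffe only propagates gradients back to the input blob if the deploy.prototxt is edited to set force_backward: true.

```python
# Sketch of gradient ascent on the NSFW score w.r.t. the input pixels.
# Requires "force_backward: true" in the prototxt, otherwise Caffe skips
# computing the gradient at the 'data' blob.
import caffe
import numpy as np

net = caffe.Net('nsfw_model/deploy.prototxt',
                'nsfw_model/resnet_50_1by2_nsfw.caffemodel',
                caffe.TEST)

x = net.blobs['data'].data.copy()  # preprocessed image already loaded here
step = 1.0                         # step size in (mean-subtracted) pixel units

for i in range(20):
    net.blobs['data'].data[...] = x
    net.forward()
    # Seed backprop with d(score)/d(prob): 1 on the NSFW class (index 1).
    grad = np.zeros_like(net.blobs['prob'].diff)
    grad[0, 1] = 1.0
    net.backward(prob=grad)
    g = net.blobs['data'].diff[0]
    x[0] += step * g / (np.abs(g).mean() + 1e-8)  # normalized gradient ascent
    print('iter %d: NSFW score %.3f' % (i, float(net.blobs['prob'].data[0, 1])))
# Use -step instead to run gradient descent and "clean up" an image.
```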
  • *knock knock* "What are you doing in there in the bathroom, son?"

    *furtive rustling noises* "I'm, uh, doing Deep Learning, mom!"

    "But Jimmy, you've been in there for hours!"

    "Uh, yeah, mom, but there's a lot of sites- I mean, ummm material to look at."

  • Do you think the software that analyzes the porn will do it until it goes blind or until it is wearing thick glasses?
  • The tool gives images a score between 0 and 1 indicating how NSFW a picture looks.

    Wake me when the thing turns it up to 11.

  • So it doesn't classify porn images, it just works out whether they are porn.
    Phrasing, msmash!
