Yahoo Open Sources a Deep Learning Model For Classifying Pornographic Images (venturebeat.com) 119
New submitter OWCareers writes: Yahoo today announced its latest open-source release: a model that can figure out if images are specifically pornographic in nature. The system uses a type of artificial intelligence called deep learning, which involves training artificial neural networks on lots of data (like dirty images) and getting them to make inferences about new data. The model that's now available on GitHub under a BSD 2-Clause license comes pre-trained, so users only have to fine-tune it if they so choose. The model works with the widely used Caffe open-source deep learning framework. The team trained the model using its now open-source CaffeOnSpark system.
The new model could be interesting to look at for developers maintaining applications like Instagram and Pinterest that are keen to minimize smut. Search engine operators like Google and Microsoft might also want to check out what's under the hood here. The tool gives images a score between 0 and 1 on how NSFW the pictures look. The official blog post from Yahoo outlines several examples.
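As a rough sketch of how an application might consume that 0-to-1 score: Yahoo's guidance for the model suggests treating scores below 0.2 as likely safe and scores above 0.8 as likely NSFW, with everything in between needing human review. The helper name and exact threshold constants below are illustrative, not part of the released code.

```python
def interpret_nsfw_score(score: float) -> str:
    """Map the model's 0-1 NSFW score to a coarse label.

    Thresholds follow Yahoo's published guidance: below 0.2 the image
    is likely safe, above 0.8 it is likely NSFW, and anything in
    between is uncertain and may need human review.
    """
    if not 0.0 <= score <= 1.0:
        raise ValueError("NSFW scores are always in [0, 1]")
    if score < 0.2:
        return "likely safe"
    if score > 0.8:
        return "likely NSFW"
    return "uncertain"
```

A site with user-generated content could auto-approve the first bucket, auto-reject the last, and queue the middle bucket for moderators.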
Cool (Score:4, Insightful)
Now all you have to have is a good definition of what is pornographic. Personally I find gratuitous violence to be pornographic.
Re:Cool (Score:5, Insightful)
That's easy. You'll just know it when you see it [wikipedia.org]
REALLY cool! (Score:3)
Yes, this is excellent! With this available, now I won't have to suffer with non-pornographic image retrievals and suggestions any more from these stupid image search engines. Finally!
Wait, it does what?
Re: (Score:1)
I wonder what it thinks about this Rosetta image [twimg.com]:
Re: (Score:2)
Re: (Score:3)
It seems to be trying to classify images based on how NSFW they are, which is different from how pornographic they are. For example, there are some artistic nudes which may not be commonly considered pornographic, but are still NSFW.
Re:Cool (Score:5, Insightful)
As usual, Yahoo is missing the market. Rather than a binary porn/not-porn, there would be a MUCH bigger market for a porn classifier that could help people find what they like. If their DL-NN is based on RBMs [wikipedia.org] they could even use it in generative mode to create porn to individual tastes.
Re: (Score:2)
Well, it's open source, so feel free to improve and share!
Re: (Score:2)
Slashdot is NSFW, like everything not work-related at work.
If you're allowed to browse on your workstation, artistic nudity should not be a problem.
Re: (Score:1)
My guess is that "pornographic" will be roughly equivalent to "containing nudity." Nudity, of course, being the most dangerous thing a person can be exposed to. Gods forbid that a child sees a nipple.
Pornography is in the eye of the beholder (Score:2)
Re: (Score:2)
Maybe - but from a NSFW standpoint, that link is definitely one, DavisMZ.
Re: (Score:2)
That depends on where you work
Re: (Score:2)
Pornography is in the eye of the beholder
Just go with the "what would your employer think" standard and it almost universally becomes images with nudity.
I am afraid that The Origin of the World [musee-orsay.fr] will be considered pornographic by any algorithm capable of identifying it.
Is that really something you fear? Are you really concerned that too much nude artistry is going to be filtered out from your search results?
Re: (Score:2)
Well, it is a slippery slope, where does it end? The next thing you know we'll be exposing innocent new born infants to nipples. Degenerates!!!
Deep Learning (Score:2)
Is that what they're calling it nowadays?
Re: (Score:2)
Deep learning can now penetrate your pathways in expanding ways and inject fulfilling content that triggers a euphoria of discovery and edification.
Re: Deep Learning (Score:2)
How deep, baby? Tell me how deep your learning goes!
(did yahoo! classify this? will they tell us in two years?)
Re: (Score:2)
Pornographic Deep Learning Blocks? (Score:2)
Great, it's about time ads got this kind of treatment.
rule 34 (Score:2)
Re: (Score:1)
My thoughts exactly.
We may have finally managed to out-weird the Japanese.
About time! (Score:3, Funny)
Glad to hear AI is finally being used for a benevolent purpose: To more easily locate pornographic images! So now we can just bypass the Google images search with safe search off when we're looking for stuff for the spankbank, right? And, practically overnight, Yahoo! becomes relevant yet again.
Why? (Score:3)
What possible uses does this have other than censorship?
Re:Why? (Score:5, Interesting)
Re: (Score:2)
Well, you can laugh all you want, but I object to the de facto censorship imposed on us by these de facto monopolies like Facebook, Google, and now Yahoo(?)
As if simply seeing something is the worst affront that one can suffer, so we need this AI nanny.
I agree with others on many points:
- I would rate violent images worse, automatically
- what about artistic nudes? Is this thing smart enough to discriminate between guys with cameras and the good stuff?
- what about legitimate naturist and nude beach mementos?
Re: (Score:2)
Re: (Score:2)
> - what about artistic nudes? Is this thing smart enough to discriminate between guys with cameras and the good stuff?
Doubt it.
I guess Leonard Nimoy's (yes, "Spock") photography books will be classified as pornographic:
* Shekhina [amazon.com]
* The Full Body Project: Photographs by Leonard Nimoy [amazon.com]
Re: (Score:2)
Of course not, you'd need an automatic poet for that. ...
To do it just follow Stanislaw Lem's instructions.
First simulate a universe
Re: (Score:2)
Are we to raise an entire generation to think that shooting (imaginary) people until blood splatters the virtual screen is just peachy keen, but those photos of our trip to the nude beach are just oh so terrible?
I suggest you study your own question a bit more and I'm sure you'll come up with your own answer about the utility and value of violence (for the people who are part of the club ("it's a big club and you ain't in it" -- Carlin)) as opposed to the value of art, mementos, and porn (I suggest that por
Re: (Score:2)
So I can sort my porn collection into softcore and harder. Duh.
Re: (Score:1)
No mainstream ad networks (such as Google AdSense) are allowed on sites with adult NSFW content. So if you have an ad-supported site with user-generated content, you have to screen the images somehow.
Call it censorship if you want to, but these algorithmic methods are getting better, and it's pretty useful as there will always be nitwits uploading pics of their manhood.
Re: (Score:2)
Well, some engineer at Yahoo convinced his boss that he should spend his work time surfing porn... for training the model, yeah, that's it.
Also, if Yahoo wanted to be profitable again they could have the best porn search engine by tomorrow.
Re: (Score:2)
To power the "safe search" option on a search engine. Self-censorship does not really count as censorship. And if I need to search for questionable words at work, I'd prefer not to have NSFW results.
Re: (Score:2)
Curation.
So, "Deep Learning" is the cyber equivalent of (Score:1)
an overweight guy in sweat pants in his mother's basement?
welcome to nazi germany papers please (Score:1)
Re: (Score:2)
Who cares about filtering? (Score:2)
Who cares about filtering these images? I want to hook this up to an internet spider and have it go out and fetch me a vast collection of glorious pornographic images.
Great for organizing a collection (Score:4, Insightful)
Re: (Score:2)
Re: (Score:2)
Foo. I never get the fun projects... (Score:2)
Re: (Score:2)
Schwing!
Re: (Score:2)
Wow, that thing's erection is...amazing. So long, and versatile, and hard. I bet women look at it and go, "Oh my god!"
I wish I could say I was the proud owner of a Putzmeister.
Re: (Score:1)
WARNING: The video above depicts multiple men yanking an extremely long tool all over the place. Most of the video focuses on the men slowly getting the tool erect.
Tay says "*%^#(@ 4CHAN *$#) HITLER C*CK *#&@" (Score:2)
I see no possible way this could go awry.
Do carry on.
Deep Learning (Score:2)
So is deep learning about learning how to deep-throat without choking the whole time?
AI learns to go deep. (Score:1)
Great,
But can it detect duplicates and organize it by category for me?
Going for a perfect score (Score:4, Funny)
New challenge: find an image that gets a perfect score of 1.
Re: (Score:2)
DeepWetDream anyone? (Score:4, Insightful)
Any takers?
Let the jokes begin (Score:2)
*knock knock* "What are you doing in there in the bathroom, son?"
*furtive rustling noises* "I'm, uh, doing Deep Learning, mom!"
"But Jimmy, you've been in there for hours!"
"Uh, yeah, mom, but there's a lot of sites- I mean, ummm material to look at."
Re: (Score:2)
analyze porn (Score:1)
0 to 1? (Score:2)
The tool gives images a score between 0 and 1 on how NSFW the pictures look.
Wake me when the thing turns it up to 11.
Classification? (Score:1)
So it doesn't classify porn images, just works out if they are.
Phrasing msmash!