Google AI

When It Comes to Gorillas, Google Photos Remains Blind (wired.com) 306

Tom Simonite, writing for Wired: In 2015, a black software developer embarrassed Google by tweeting that the company's Photos service had labeled photos of him with a black friend as "gorillas." Google declared itself "appalled and genuinely sorry." An engineer who became the public face of the clean-up operation said the label gorilla would no longer be applied to groups of images, and that Google was "working on longer-term fixes." More than two years later, one of those fixes is erasing gorillas, and some other primates, from the service's lexicon. The awkward workaround illustrates the difficulties Google and other tech companies face in advancing image-recognition technology, which the companies hope to use in self-driving cars, personal assistants, and other products. WIRED tested Google Photos using a collection of 40,000 images well-stocked with animals. It performed impressively at finding many creatures, including pandas and poodles. But the service reported "no results" for the search terms "gorilla," "chimp," "chimpanzee," and "monkey."
This discussion has been archived. No new comments can be posted.

  • by TimothyHollins ( 4720957 ) on Thursday January 11, 2018 @03:48PM (#55910499)

    More than two years later, one of those fixes is erasing gorillas, and some other primates, from the service's lexicon. The awkward workaround illustrates the difficulties Google and other tech companies face in advancing image-recognition technology, which the companies hope to use in self-driving cars, personal assistants, and other products.

    So what do their cars do now when they spot a gorilla crossing the road?

    • Re: (Score:3, Funny)

      by Anonymous Coward

      They pretend that the gorilla is a large rabbit. This usually keeps the car from hitting the gorilla.

    • "What does it do when it spots a black pedestrian?" is a much better way to troll.
      • If it thought the pedestrian was a gorilla... brake and avoid followed by 'accelerate away quickly'.

        An adult silverback can run 400 lbs of angry muscle, so it's going to wreck your car if you hit it... and if you scare it, it's going to wreck your car anyway.

    • So what do their cars do now when they spot a gorilla crossing the road?

      Answer: Anything it wants.

      • I once ran one over in my pajamas. How he got into my pajamas I'll never know.
    • So what do their cars do now when they spot a gorilla crossing the road?

      It's not about crossing the road. It's about hailing a ride from a self-driving taxi. Until Google solves this problem, gorillas will be able to go wherever they want by car, which will cause the other animals to get mad.

    • Swerve and hit the white guy instead?
    • by Solandri ( 704621 ) on Thursday January 11, 2018 @04:49PM (#55911019)
      You're trying to be funny, but I think this highlights part of the problem. It's not necessarily Google's image recognition software; it's also poor photographs. A self-driving car (if it needed to distinguish between gorillas and black people) wouldn't have as much of a problem, because it can adjust the camera's exposure to where it can see enough detail to distinguish the two. But a lot of photos of black people are taken with the wrong exposure, and the black skin tones end up crushed down into just a few discrete color values near black, or clipped to 0. The AI then ends up trying to distinguish one black humanoid-shaped blob from another.

      When I shot weddings on film, I had to use special low-contrast film (it had a larger dynamic range). That was the only way to retain detail in both the bride's white dress and the groom's black tuxedo. And even then, if the wedding was held in sunlight, a lot of the detail might still be unrecoverable. Modern cameras are getting around the problem with automatic HDR photo mode (it takes two photos at different exposures to preserve detail in both the highlights and shadows, then combines them nonlinearly). But there's still a huge library of badly exposed photos out there (from the pre-HDR days, and being added to by current photographers taking simple snapshots without really caring about exposure), just waiting to trip up any image-recognition AI.
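
      For the curious, here's a toy single-scale version of that HDR blend in Python. It's only a sketch of the idea (real cameras do multi-scale pyramid fusion, and this is nobody's shipping code): weight each exposure by how well-exposed each pixel is, then blend.

      import numpy as np

      def fuse_exposures(under, over):
          """Blend an under- and an over-exposed image (float arrays in [0, 1]).

          Each pixel is weighted by how close it sits to mid-grey, so detail
          survives in both the white dress (highlights) and the black tuxedo
          (shadows).
          """
          def well_exposed(img):
              # Gaussian weight that peaks at mid-grey (0.5)
              return np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2))

          w_u, w_o = well_exposed(under), well_exposed(over)
          total = w_u + w_o + 1e-8  # guard against divide-by-zero
          return (under * w_u + over * w_o) / total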

      White skin tends to be slightly brighter than the average background, while black skin tends to be much darker. So to properly expose a portrait of a black person, you have to either make sure their skin, and not the background or their clothing, dominates the camera's auto-exposure algorithm, or put additional light specifically on their face (e.g. fill flash) so it's not so dark relative to the background. A professional photographer knows this. The average person taking a snapshot, and the auto-exposure algorithm in their camera, do not.
      • Re: (Score:3, Interesting)

        by Anonymous Coward

        A self-driving car (if it needed to distinguish between gorillas and black people) wouldn't have as much of a problem, because it can adjust the camera's exposure to where it can see enough detail to distinguish the two. ...

        A professional photographer knows this. The average person taking a snapshot, and the auto-exposure algorithm in their camera, do not.

        There is a fascinating contradiction here that reveals why "self driving cars" are not anywhere close to being a reality.

        In order for the car to adjust the camera exposure to see enough detail to distinguish between a black person and a gorilla, it needs to somehow "know" that there is a problem with what it "thinks" it sees. Of course a professional photographer has the skill to do this, because they are considered to be intelligent. If the car is able to do this, then it, too, will be considered intelligent. The AI researchers have their hands full with getting their programs to identify street signs, but now the code needs to have some common sense.

        • > The AI researchers have their hands full with getting their programs to identify street signs, but now the code needs to have some common sense

          Your naivety is endearing. Do you think you just discovered the problem of AI? Also, you're misinformed regarding street signs - AI is better than us at that task.
      • by AmiMoJo ( 196126 )

        It's a weakness of the AI too. Humans know how other humans act. They stand upright, they sit in chairs, they eat with cutlery, they tend to look at the camera, they smile, they have patches of skin not covered by hair... Even with a bad photo, a human can tell it's a person and not another primate from clues like these.

      • by dgatwood ( 11270 )

        White skin tends to be slightly brighter than the average background, while black skin tends to be much darker.

        ... unless you're doing digital photography of people on a painted-black stage. Then, dark skin tends to have similar brightness to the background, and white skin tends to be a blown out pile of poo (unless you under-expose by at least a couple of stops). And, of course, the background ends up at 50% grey, so you can see every scuff mark on the floor. *sigh*

    • So what do their cars do now when they spot a gorilla crossing the road?

      Stop so the passenger can learn the punch line?

  • by Baron_Yam ( 643147 ) on Thursday January 11, 2018 @03:54PM (#55910549)

    Have you ever gone to the zoo and looked at the larger primates? They're fascinating because they're so much like us; I defy you to look a silverback in the eyes and not see a near-human intelligence looking back at you.

    To a human, they're obviously not human... but to an algorithm checking out just the facial features? I'm surprised this didn't happen sooner.

    • Re: (Score:3, Funny)

      by Anonymous Coward

      As the silverback looks back thinking "That's what you get for not nominating Bernie".

    • I think you should avoid eye contact with gorillas, as they may view it as a form of aggression. There was an incident at a Dutch zoo several years ago where a gorilla escaped its exhibit and attacked a woman who had been constantly visiting the zoo, making eye contact and smiling at the gorilla (why they didn't kick her out or ban her, I don't know), which was making it absolutely pissed.

      Regardless of how much or little they're like us, or the amount of intelligence they possess, they are ridiculously strong.
      • >I think you should avoid eye contact with gorillas as they may view it as a form of aggression.

        Absolutely. And because I'm smarter than a gorilla (I hope!) it's on me to bend to its instincts. They're already locked up in a smallish habitat with a bunch of hairless apes constantly walking through their territory, they don't need us entering an eye-contact dominance contest with them to stress them out.

        My local zoo has extremely thick Plexiglas (or equivalent) on the gorilla enclosure. I've seen the b

        • by aevan ( 903814 )
          Pretty sure it's more because it can bend your limbs like a pretzel than intelligence. Though actively avoiding being bent like a pretzel is a case for being intelligent.
      • Sure it wasn't the smiling part? That is also a sign of aggression that somehow got reversed with just us humans.
    • ...obscuring contrasts and therefore shapes.

      It would be interesting to see how their algorithm did on pics with various color bit depths.

    • The problem is how the algorithm makes its determination. Most of the programs take a photo and look for shapes and features; for example, facial recognition programs use the corners of eyes and mouths and a few other easily identified points on the face. The problem is that almost all primates have the same points and symmetry, but they should be able to solve the problem by expanding beyond these points and looking at things like hair patterns, teeth, and other features that distinguish the different primate species.
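
      To make that concrete, here's a purely hypothetical landmark matcher in Python (not how Google Photos actually works). If recognition only compares normalized distances between a handful of facial keypoints, any two faces with the same relative keypoint layout look identical to it, whether they belong to a human or a gorilla.

      import numpy as np

      def landmark_signature(points):
          """points: (N, 2) array of facial keypoints (eye corners, mouth corners, ...).

          Returns the pairwise distances scaled by the largest one, so the
          signature ignores how big the face is in the photo.
          """
          diffs = points[:, None, :] - points[None, :, :]
          dists = np.linalg.norm(diffs, axis=-1)
          upper = dists[np.triu_indices(len(points), k=1)]
          return upper / upper.max()

      def same_face_score(a, b):
          # Higher score = "same face" to this matcher, regardless of species.
          return 1.0 / (1.0 + np.linalg.norm(landmark_signature(a) - landmark_signature(b)))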

      • I find the question of exactly how WE identify a unique human face to be interesting. And given we're not certain about that, replicating it with an algorithm becomes an interesting challenge.

        How do you tell the difference between an orangutan face and that of an ugly, old, hairy fat guy? As far as we're concerned, "you just do", which is a real bitch of a rule to program.

  • by swb ( 14022 ) on Thursday January 11, 2018 @03:57PM (#55910593)

    Why is this so hard to accept as not only true, but also a giant image recognition/computer vision challenge?

    You go to nearly any zoo with large primates and you're bound to hear someone say "They look so human!" Well of course they do, humans are primates.

    Which means that it works in reverse, too: primates look like humans. And it's not surprising that blacks look more like gorillas. I mean, there is the whole black coloration to begin with, but also the flatter nose and other facial features of gorillas which are shared with blacks more than with Caucasians.

    Of course no reasonable human would think that a black *is* a gorilla or vice versa. But computer vision? It's like version 0.01 alpha and the similarities are strong enough that it's not surprising at all that it would misidentify blacks as gorillas or vice versa.

    • by Baron_Yam ( 643147 ) on Thursday January 11, 2018 @04:03PM (#55910641)

      It's like that recent H&M ad scandal with the little kid wearing a "Coolest monkey in the jungle" shirt. Kids get called monkeys all the time... when they're playing (especially climbing trees!) there's not a hell of a lot of difference between them and other young primates playing.

      Because of (primarily) American racism issues, everyone assumes if you're calling a dark-skinned kid a 'monkey' you're trying to chain him and put him to work picking cotton. Same thing here - it's an understandable situation that gets people all bent out of shape because of shit that SHOULD be nothing but embarrassing history that died with our grandparents' generation.

      • by lgw ( 121541 ) on Thursday January 11, 2018 @04:12PM (#55910717) Journal

        And yet, if you said what swb said inside Google, you'd be blacklisted by managers, targeted by Google's peer-pressure diversity acceptance program, and possibly threatened with violence (according to the screenshots presented as evidence in the lawsuit).

        None of this should be controversial, none of this should be interesting, and yet there's a whole political group in the US (well, more than one) who exist only to benefit from identity politics, so everything is offensive.

        • Re: (Score:2, Informative)

          by AmiMoJo ( 196126 )

          Bollocks. You have zero evidence that is true, and in fact no one seems to have been fired and Google assigned engineering resources to finding a solution.

          It's hard because the AI is simplistic. It's actually a good test case.

          • by lgw ( 121541 )

            There's plenty of evidence this is true: evidence in a court case, in the form of screenshots of Google communications. Could be faked, of course, but presented under oath.

            No one was fired, of course, because no one said what swb said.

      • Grandparents? Try great-great grandparents. (I'm probably old enough to be the grandfather of most slashdotters, and my great grandfather was born during the Civil War.)

        • If you want to play that game... it should never have happened in the first place. "Do unto others as you would have others do unto you" is a pretty universal bit of wisdom humans routinely ignore if it interferes with their tribal instincts.

          I went with 'grandparents' because as a middle-aged white guy I had racist grandparents who are no longer living. And mostly because I wasn't raised in a racist environment and the idea of one is (to me) foreign to my generation.

        • by swb ( 14022 )

          I'm 51 this year and my great-grandmother was only born in 1882.

          You must be over 60, and the generations before you must have had their kids somewhat later in life, for your line to go back to the Civil War.

      • by aevan ( 903814 )
        What do they call Monkey Bars at an inner city school?
        And you're spot on about the playing. Nephew used to love just clambering all over people to get onto their shoulders, or hanging upside-down off things.
    • by danbert8 ( 1024253 ) on Thursday January 11, 2018 @04:22PM (#55910811)

      Considering the equivalent intelligence of a computer, misidentifying a gorilla and a black person based on facial features alone isn't half bad... It's exactly what you'd expect from a low-intelligence entity with little experience and a limited comprehension of the ramifications of the identification. As with a toddler, don't be surprised when computers start thinking all fat people are going to have a baby, just because they have been told that there is a baby inside a big belly.

      It's pattern recognition. The computers are seeing a pattern, but it's incomplete and thus wrong. It's not like the computer was programmed to be offensive...

    • by Ichijo ( 607641 )

      Technically, people are apes [wikipedia.org]! But not gorillas, as the software determined.

      It's likely that the image recognition software decided with high certainty that the two were primates, and decided with low certainty that they were probably gorillas. With the certainty so low on the latter, the software should not have been so specific. Just "primates" or "apes" would have been factually correct.

      It's like when mapping software provides coordinates that are ridiculously precise [slashdot.org] and leads investigators to that farm in Kansas.
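
      A sketch of that idea in Python (toy taxonomy and made-up numbers, nothing to do with Google's actual pipeline): accumulate the classifier's leaf probabilities up a label hierarchy and report the most specific label whose subtree collects enough mass.

      # child -> parent in a toy label taxonomy
      TAXONOMY = {"gorilla": "ape", "chimpanzee": "ape", "human": "ape",
                  "ape": "primate", "monkey": "primate"}

      def ancestors(label):
          chain = [label]
          while label in TAXONOMY:
              label = TAXONOMY[label]
              chain.append(label)
          return chain

      def safe_label(probs, threshold=0.9):
          """probs: leaf label -> classifier probability (summing to ~1)."""
          mass = {}
          for leaf, p in probs.items():
              for node in ancestors(leaf):
                  mass[node] = mass.get(node, 0.0) + p
          safe = [n for n, m in mass.items() if m >= threshold]
          # deepest qualifying label = most specific claim we can defend
          return max(safe, key=lambda n: len(ancestors(n))) if safe else "unknown"

      # safe_label({"gorilla": 0.55, "human": 0.40, "monkey": 0.05}) -> "ape"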

    • The word also means a low-level worker. In Nelson's navy a powder monkey was a boy who would run down to the magazine to fetch ammunition. Today we have grease monkeys, code monkeys and editing monkeys.

    • Why is this so hard to accept as not only true

      I doubt many people are accepting it as not true.

      What the rest of us are also accepting as true is that comparisons to apes have been and still are used as racial slurs.

      What we also accept as true is that just because it's an algorithm does not magically absolve you of responsibility. You wrote it, tested it, and deployed it; therefore you are responsible for what it's doing.

      It's not racist to accidentally create a thing. However, if you create something that acts in a racist way, you're responsible for fixing it.

  • http://www.johnperkins.com/Eagle%20Monkeys.jpg

  • by doconnor ( 134648 ) on Thursday January 11, 2018 @04:25PM (#55910849) Homepage

    I expect a lot of humans wouldn't get 100% accuracy at telling species from the same order apart either. We have certain hardcoded advantages when it comes to our own species.

  • by SirJorgelOfBorgel ( 897488 ) on Thursday January 11, 2018 @04:28PM (#55910875)

    I'm as white as they come, and Google Photos has tagged several monkeys in my pictures as me. Nobody is writing news stories about that (as well they shouldn't!), but because this guy is black, the world ended?

    • by AmiMoJo ( 196126 )

      That's quite an incredible coincidence! Google's face identification uses things like the distance between your eyes, the position of your nose and mouth, etc., to tell you apart from other people. It works very well.

      So for it to misidentify monkeys as you, somehow it must be measuring the monkey's face as very close to your own.

      The issue in TFA is different though. It's not identifying a specific person, just confusing humans and monkeys in general.

    • but because this guy is black the world ended?

      Yes. Any questions?

  • by FeelGood314 ( 2516288 ) on Thursday January 11, 2018 @04:43PM (#55910989)
    You do not have the right not to be offended. Generally I shouldn't go out of my way to do something just to offend you, but that's not even close to the case here. I seriously doubt many people were offended. I do, however, think a certain group of people used this as an opportunity to criticize Google. This group cares less about the difficulties black people face than about being seen to care about black issues. There is a reason SJW is a derogatory term.

    There are so many actual issues that black or native North Americans face where the solutions are actually hindered by SJWs. It is quite frustrating.
    • >There are so many actual issues that black or native North Americans face

      When I was a kid, dark-skinned people didn't show up well in photos... primarily because white people chose the film chemistry to make the faces they were familiar with (pale ones!) look good in pictures.

      I think THAT was probably deserving of some indignant complaining. THIS is a quick laugh and a "Well, let's try and figure out how to make the algorithm better". Anything more tells you a lot more about the person complaining than about the algorithm.

    • by AmiMoJo ( 196126 )

      Freedom of speech means that you absolutely do have a right to be offended, and to express that offense. No one has to listen to you, but you have a right to speak about the issue.

      Anyway, I don't think anyone is really offended here, just frustrated that they can't solve this engineering challenge.

      • Anyway, I don't think anyone is really offended here, just frustrated that they can't solve this engineering challenge.

        There are many engineering challenges left in image recognition. The difference is that an AI recognizing a table correctly as a table in 99.99% of cases gets praised as a very accurate system, but when it labels 99.99% of black people as black people and 0.01% as gorillas, people get offended and demand an immediate fix.

        • by AmiMoJo ( 196126 )

          If it was 0.01% then no one would care. Unfortunately it's actually kinda common. Web cams with face tracking that can't see black people, standard auto settings on cameras not handling black skin well... There was a great example on Twitter of a hand dryer with optical hand detection that couldn't see black skin.

      • Anyway, I don't think anyone is really offended here, just frustrated that they can't solve this engineering challenge.

        Have you read the thread? There are a lot of people here who are INCREDIBLY offended that Google's engineers are actually taking responsibility for their code rather than just saying "it's an algorithm" and letting it continue.

        I think, fundamentally, what they can't stand is the idea that people might be responsible for what they do.

  • If I had an algorithm that occasionally misidentified people in a way that can cause public outrage, I would filter the outputs to avoid controversy too.

    Wake me up when they release a fixed version. Hell, a paper describing the issue in detail would be interesting---even fascinating, if I were any sort of expert.

    Google themselves admitted that their algorithms still make the same mistake. This article boils down to "hard problem takes longer than 3 years to solve"---with excessive puffery.

  • I wonder if this is why Google is also blind to steam donkeys?

    I can only find ONE picture of a steam donkey, and it is not of one in actual operation, loading or unloading a ship, but one that has been half buried in an outdoor exhibit.

    The rest are just pictures of the lyrics to the sea shanty "Donkey Riding".

    YMMV, as this might only be because I was, in fact, previously searching for audio files of the sea shanty "Donkey Riding". I wanted to see what a real steam donkey actually looks like. Google doesn't seem to want to show me one.

  • by account_deleted ( 4530225 ) on Thursday January 11, 2018 @04:56PM (#55911075)
    Comment removed based on user account deletion
  • Black people look more like gorillas than average, in the same way white people look more like white bird poop than average. There are 'bad' things black people will look more like than other groups, and there are 'good' things they'll look more like than other groups. All this algorithm did was uncover this relationship in its rough state. As it is refined, it will uncover relationships closer and closer to what is intended.
  • So your self driving car detects 4 gorillas on one side and one man on the other, and it must hit one of the two groups. Which one will it select?

    • Why do people always come up with such idiotic scenarios?
      How the funk should a car that has its speed adapted to the conditions come into a situation where it has to choose which group of pedestrians to hit? Pedestrians walking directly on the road? Directly behind a curve in a wood?
      The car will break, that is all; it won't turn away from 5 people in front of it to hit one to the left or one to the right. And you can damn well be assured it does not even check whether what is in front of it is a human, another obstacle, or two cows. It sees an obstacle and breaks, that's it.

      • A human who steps out within 60 feet of a car going 40 MPH that has no alternate path available to it will get hit, brakes or no brakes. There will come situations where any alternate path will also have humans on it. Self-driving cars will sometimes be killing humans; that is a certainty.

      • The car will break, that is all; it won't turn away from 5 people in front of it to hit one to the left or one to the right. And you can damn well be assured it does not even check whether what is in front of it is a human, another obstacle, or two cows. It sees an obstacle and breaks, that's it.

        What good is a car that needs to be fixed whenever it encounters an obstacle?

    • So your self driving car detects 4 gorillas on one side

      When in doubt, plow into Harambe.

    • The brakes, obviously. Hit the brakes.

  • About 20 years ago, jewwatch was in the top 3 listings when searching for Jews on Yahoo search, due to the way that links were created. Too many people got upset, so the results were patched, with code added to explicitly exclude it.
