Google Apologises For Photos App's Racist Blunder
Mark Wilson writes: Google has issued an apology after the automatic tagging feature of its Photos app labeled a black couple as "gorillas". This is not the first time an algorithm has been found to have caused racial upset. Earlier in the year, Flickr came under fire after its system tagged images of concentration camps as sports venues and black people as apes. The company was criticized on social networks after a New York software developer questioned the efficacy of Google's algorithm. Accused of racism, Google said that it was "appalled" by what had happened, branding it "100% not OK".
I know how this is going to be fixed... (Score:4, Insightful)
Anything that's politically incorrect will be blacklisted from being labeled as a result. No more gorillas or a million and one other potentially offensive labels!
Although gorillas might be labeled as people, which would actually make some SJWs happy.
Mod parent down (Score:5, Funny)
This person used the word "blacklisted". Mod it down.
Re:I know how this is going to be fixed... (Score:5, Insightful)
Re:I know how this is going to be fixed... (Score:5, Insightful)
No, your comment is sad. Go and view the tagged pictures, and tell me that there is no similarity. You're lying, or have so deeply fooled yourself that you have difficulty thinking un-good thoughts.
Amazing: modern man believes man descended from apes, and when a computer algorithm mistakes a black-colored, ape-shaped ape descendant for, you know, an ape, it's because the photo-algorithm guys are a bunch of frat boys.
It's sad the world is bending over backwards affecting this total 'how COULD this happen!!!!' bullshit, though really it isn't the world. It's the media and blog bubble, who are encouraged, nay, validated! by the corporate "This. Is. 100% not. ok." But show the pic and explain the story off-line and see how normal mankind really still is.
Re: (Score:2)
You, sir (or ma'am), have left me a hugely confused mess.
Don't be confused, be amused. It was probably intended as a joke. :)
algorithms aren't racist (Score:5, Insightful)
Algorithms aren't racist, and teaching a computer to visually recognize objects is hard. Move along.
that's right (Score:4, Informative)
And as Richard Dawkins has said, we ARE apes - all of us humans.
Re:that's right (Score:5, Informative)
"Ape" is a general language term that can be used by different people to specify quite different groups of creatures; it's more correct to say we're all hominids.
Re: (Score:2)
That's probably what the algorithm should return, just to avoid the possibility of offending anyone. Just label any photos of intelligent hirsute bipedal mammals as "Hominids" and call it a day.
Who knows, maybe the term will even catch on in the larger culture. "Machine learning" doesn't mean we can't learn from our machines.
Re:that's right (Score:5, Insightful)
No, "Ape" is a very specific term used to specify members of Hominoidea. It is unfortunate many are ignorant of the meaning of the term and use it improperly to include monkeys.
https://en.wikipedia.org/wiki/... [wikipedia.org]
Humans are apes - specifically, great apes (aka Hominidae, aka "hominids"). "Hominid" simply means human-like. It used to mean only humans; then it included other extinct human-like creatures, and now it generally includes all Hominidae. While "hominid" (or alternatively "great ape") is a more specific term, it is certainly NOT a more correct term, merely the family within the superfamily.
One could say that humans are mammals and it would be no less correct. Humans are animals, chordates, mammals, primates, apes, and also great apes.
It's unfortunate that the Google facial recognition software was not aware that humans don't like being reminded that they are indeed very closely related to other great apes and could easily be confused with gorillas by a non-human intelligence. Our indignation at the notion we're apes that look a lot like gorillas is rather silly -- like zebras being offended at being miscategorized as ordinary horses.
Granted, I understand the racist implication that those flagged erroneously as gorillas are somehow less human than others. Thankfully, the computer isn't racist. It merely wasn't sophisticated enough to discern the difference given the input, the algorithm, and its training.
I'm impressed it figured out the object in the photo was a living thing and got the kingdom, phylum, class, order, superfamily, family and sub-family correct. If it had chosen chimp or bonobo, it would have been even closer.
Heck, check out this comparison of a gorilla baby and a human baby -- no one would have blinked an eye if the software said the gorilla was a human baby.
http://intentblog.com/wp-conte... [intentblog.com]
Another cute gorilla baby -- a bit older:
http://www.ctvnews.ca/polopoly... [ctvnews.ca]
Re: (Score:3)
it's more correct to say we're all hominids.
Now we're going from racism to homophobia. Thanks Slashdot!
Re: (Score:3, Informative)
Re:that's right (Score:5, Informative)
Richard Dawkins is a biologist. He would never say something so stupid.
I'm curious what you feel is stupid about that straightforward statement. Regardless, Richard Dawkins did, in fact, say exactly that.
Gaps in the Mind, by Richard Dawkins [animal-rig...ibrary.com]
"We admit that we are like apes, but we seldom realise that we are apes."
"In truth, not only are we apes, we are African apes. The category 'African apes', if you don't arbitrarily exclude humans, is a natural one"
"'Great apes', too, is a natural category only so long as it includes humans. We are great apes."
I did a search for the words "dawkins" and "ape" and the first result was a video of Dawkins saying that he is an ape [youtube.com]. I challenge you to find any living biologist that claims otherwise.
we are all hominids, and we are certainly not apes.
Gorillas are hominids, and all hominids are apes. Humans are apes and hominids, just like gorillas.
Re: (Score:3)
Some asshole must have changed wikipedia to make you wrong. It says:
The Hominidae (/hmndi/), also known as great apes,[notes 1] or hominids, form a taxonomic family of primates, including four extant genera: orangutans (Pongo) with two species extant; gorillas (Gorilla) with two species; chimpanzees (Pan) with two species; and humans (Homo) with one species.[1]
You'd better go in there and correct it to say humans are not apes.
Re:algorithms aren't racist (Score:5, Interesting)
I followed the link and looked at the photos. I could see how it would make that mistake.
1. The color balance was off: what we call black skin is actually a richer brown, but the skewed balance gave their skin more of a true black/gray tone, the natural coloring of a gorilla.
2. The angle of the shot: the tilt makes it appear that they are not upright but slouching.
3. They were making funny faces at the camera, producing expressions unusual for human photos.
4. The dark hue of the shirt of the gentleman behind, combined with the lady's hairstyle, makes the body seem to have much broader shoulders.
I expect a combination of factors produced the wrong choice. Computer decision-making, while getting good, isn't perfect, but it is often better than not having it, because otherwise it wouldn't be possible to catalog the millions of images at all. We need to accept that computers make mistakes and that there should be a way to fix them when they are found.
Many of our derogatory terms come from the fact that we find similarities with something else, so it stands to reason that a computer may make an honest mistake that reinforces such a derogatory meaning.
Re:algorithms aren't racist (Score:4, Insightful)
Re: (Score:3, Insightful)
But the people writing the algorithm and choosing the input data *can* be racist. And even in the absence of malice, you can create racist outcomes.
If your training set has many photos of white people and few photos of black people, it's not going to be great at recognizing black people. If it doesn't know what black people look like, it's bound to misclassify them more often than white people.
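The training-imbalance effect is easy to reproduce with a toy nearest-neighbour classifier on synthetic data. This is only an illustrative sketch with made-up Gaussian "features"; it says nothing about Google's actual system:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample(n, mean):
    """Draw n two-dimensional points around a class mean (toy feature vectors)."""
    return rng.normal(mean, 1.0, size=(n, 2))

# Training set: class 0 is heavily over-represented (1000 vs 20 examples).
X_train = np.vstack([sample(1000, 0.0), sample(20, 2.0)])
y_train = np.array([0] * 1000 + [1] * 20)

def knn_predict(x, k=5):
    """Plain k-nearest-neighbours majority vote over the training set."""
    dist = np.linalg.norm(X_train - x, axis=1)
    votes = y_train[np.argsort(dist)[:k]]
    return np.bincount(votes).argmax()

# Balanced test set: the under-represented class gets misclassified far more.
X_test = np.vstack([sample(200, 0.0), sample(200, 2.0)])
y_test = np.array([0] * 200 + [1] * 200)
pred = np.array([knn_predict(x) for x in X_test])
err0 = float(np.mean(pred[y_test == 0] != 0))
err1 = float(np.mean(pred[y_test == 1] != 1))
print(f"majority-class error: {err0:.2f}, minority-class error: {err1:.2f}")
```

Same data distribution, same algorithm; the only asymmetry is how many examples of each class the classifier ever saw.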
Anecdotally, I noticed that the Microsoft "how old are you" site a while back recognized me (a white person) in eve
Re: (Score:3)
But the people writing the algorithm and choosing the input data *can* be racist.
Call the grand scrutinizer and ready the scourges.
And even in the absence of malice, you can create racist outcomes.
The universe didn't agree with a totalitarian's philosophy.
Re:algorithms aren't racist (Score:5, Insightful)
I find it hard to believe that there was racism intended in any way, shape, or form. It is unfortunate that this took place but Google certainly took care of the problem in short order, as is right.
There are too many of the LBTO (looking to be offended) crowd these days. Come on, there are plenty of real problems with racism, there's no need to label inadvertent and unintentional things.
Re:algorithms aren't racist (Score:5, Informative)
It's also one thing if this was a program just designed to distinguish between different people. But it looks like it's trying to recognize objects of all sorts and distinguish between people and just about everything else. That's a hard problem, and the only response to this sort of thing is to take a regular failure case and feed it back into the training data so you can hit the next regular failure case. Hopefully that one will be less coincidentally embarrassing, but it will definitely be there. Perhaps confusing bald men with balloons or something like that.
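The feed-the-failure-back loop described above can be sketched with a toy nearest-centroid classifier; all class names and numbers here are invented for illustration and say nothing about Google's real pipeline:

```python
import numpy as np

# Invented per-class example sets "learned" from some initial corpus.
examples = {"person": [np.array([0.2, 0.8])], "gorilla": [np.array([0.7, 0.3])]}
centroids = {label: np.mean(pts, axis=0) for label, pts in examples.items()}

def classify(x):
    """Assign x to the class with the nearest centroid."""
    return min(centroids, key=lambda label: np.linalg.norm(x - centroids[label]))

failure = np.array([0.55, 0.5])   # a "person" feature vector the model gets wrong
before = classify(failure)
print(before)                      # misclassified as "gorilla"

# Feed the corrected failure case back in and re-fit the affected class.
examples["person"].append(failure)
centroids["person"] = np.mean(examples["person"], axis=0)
after = classify(failure)
print(after)                       # now correctly "person"
```

Each corrected failure nudges the model toward the next, hopefully less embarrassing, failure mode.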
But I also think people underestimate how much skin color affects machine vision problems. I spent years in the biometrics industry, and one consistent fact is that people with darker skin just don't provide as much easy-to-recognize detail as people with lighter skin. There will be more misclassifications as long as the image is taken using the visible spectrum. To a computer extracting features, dark-skinned people and gorillas are both human-ish face shapes with a particular color range and somewhat indistinct geometry due to weak contrast and shadows. Distinguishing between those two sets just isn't as easy as distinguishing between fair-skinned blondes and gorillas. You can make that decision just by looking at the color histograms and not even bothering with geometry.
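The weak-contrast point can be put in numbers: any feature extractor that works from intensity gradients gets proportionally less signal when the dynamic range is compressed. A crude synthetic sketch, with random texture standing in for a photo:

```python
import numpy as np

rng = np.random.default_rng(1)

# One underlying pattern of detail, rendered at two contrast levels.
pattern = rng.random((64, 64))        # values in [0, 1)
high_contrast = pattern * 200 + 20    # spread across most of the 0-255 range
low_contrast = pattern * 50 + 20      # same detail, a quarter of the range

def edge_energy(img):
    """Mean gradient magnitude: a crude proxy for extractable edge detail."""
    gy, gx = np.gradient(img)
    return float(np.mean(np.hypot(gx, gy)))

ratio = edge_energy(high_contrast) / edge_energy(low_contrast)
print(f"edge energy ratio: {ratio:.2f}")  # 4x: gradients scale with contrast
```

And in a real pipeline it is worse than this, because quantization and sensor noise eat into the already smaller low-contrast signal.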
Re:algorithms aren't racist (Score:5, Insightful)
Re:algorithms aren't racist (Score:5, Informative)
It isn't a racist outcome. It is the outcome of a flawed algorithm.
You're not paying attention. These days, any process whose outcomes happen to lean statistically toward results not in perfect balance with skin color as a function of population (though only in one direction) must be considered racist, regardless of intention, purpose, or simple transparent standards. The whole "disparate impact" line of thinking is based on this. If you apply a standard (say, physical strength, attention to detail, or quick problem solving) to people applying to work as, say, firefighters, and (REGARDLESS of the mix of people who apply) more white people get the jobs, then the standards must surely be racist, even if nobody can point to a single feature of those standards that can be identified as such. Outcomes now retroactively reinvent the character of whoever sets a standard, and find them to be a racist. Never mind that holding some particular group, based on their skin color, to some LOWER standard is actually racist, and incredibly condescending. But too bad: outcomes dictate racist-ness now, not policies, actions, purposes, motivations, or objective standards.
So, yeah. The algorithm, without having a single "racist" feature to it, can still be considered racist. Because that pleases the Big SJW industry.
It's the same thinking that says black people aren't smart enough to get a free photo ID from their state, and so laws requiring people to prove who they are when they're casting votes for the people who will govern all of us are, of course, labeled as racist by SJW's sitting in their Outrage Seminar meetings. It's hard to believe things have come that far, but they have.
Re: (Score:2)
Great post, you'd have a mod point from me if I hadn't already commented.
Re: (Score:3)
I don't disagree with the idea of preventing voter fraud, nor with government-issued IDs as the tool to accomplish that task. That said it is pretty obvious that the main proponents of voter laws are Republicans because they know it will benefit them in elections, and the main opponents of voter laws are democrats because they know it will not benefit them in elections. Neither side cares about fairness; they only care about winning.
If voter fraud were a big problem, I think the disparity in outcome would not
Re: (Score:3)
That said it is pretty obvious that the main proponents of voter laws are Republicans because they know it will benefit them in elections, and the main opponents of voter laws are democrats because they know it will not benefit them in elections.
Backwards. The Republicans know that the biggest source of bogus voter registrations, and the areas with the largest number of actively dead registered voters and turnout at polling places where the number of votes exceeds the eligible population, are in places where Democrat activists work the hardest to hold on to power. It's not that knowing people who vote are voting legally and only once isn't going to benefit Democrats, it's that such a process is counter to what liberal activist groups work so hard
Re: (Score:3)
If you're worried about people not knowing there's an election coming up, and not bothering to get an ID (really? you can't go to the doctor, fill a prescription, collect a welfare check, or much of ANYTHING else with already having an ID),
I'm not worried because I'm not a democrat, and don't have any interest in helping democrats win elections.
What I am saying is that voter fraud is currently not a problem. There just isn't a significant number of fraudulent votes. It is pretty clear that the problem is inflated by republican politicians and strategists to get voter ID laws passed to help them win elections.
then why not encourage the Democrats to apply the same level of effort they put into the shady practices described above, and focus it instead on getting that rare person who never sees a doctor, never gets a prescription, collects no government benefits of any kind, doesn't work (but whom you seem to suggest none the less are a large voting block) and, with YEARS to work with between elections ... just getting them an ID?
I'm not saying it's hard to get an ID. I'm saying that it will result in fewer votes for democrats, and everyone knows that. The republicans k
Re: (Score:3)
the only person saying "black people aren't smart enough" is you.
Over here we live in reality, and the reality is that getting one of those IDs requires taking time off from work that we frequently either don't get or can't afford to take. It also frequently requires traveling clear across town, when you don't have reliable personal transportation, and live in towns that lack adequate public transportation. Or dealing with the fact that some states (*cough*Georgia*) have been intentionally shutting down DMV'
Re: (Score:3)
Over here we live in reality, and the reality is that getting one of those IDs requires taking time off from work that we frequently either don't get or can't afford to take
Really. What sort of job do you have that didn't involve showing ID in order to submit the required federal tax forms as you were hired? What sort of paycheck are you getting that doesn't involve you using an ID in order to open a bank account or cash a check? Please be specific about the people who are working full time, so hard, that not once in their entire life can they be bothered to get a form of ID. And, out of curiosity, how on earth did they find time to go register to vote, or find time TO vote?
Re: (Score:3)
I don't believe the algorithm is impugning the humanity of my offspring, I just think it is far-from-perfect.
But is the algorithm even wrong? I think the question to the Google recognizer is "of the images in my collection which ones look most like a seal"? If the collection is mostly all pictures of your kids, it'll show you the pictures of your kids that it thinks have the most in common with what it has as an idea about what seals look like. This isn't to make fun of your kids, of course, it's just it
Re: (Score:2)
But the people writing the algorithm and choosing the input data *can* be racist. And even in the absence of malice, you can create racist outcomes.
This just in:
Fwipp, who doesn't know shit about machine learning, has decided that deep convolutional networks can be cleverly programmed to be racist. Fwipp knows that he doesn't know shit about machine learning, but feels that his expertise in finding racist versions of both bubble sort and hello world qualifies him as an expert here.
Re: (Score:3)
#include <racist_version_of_stdio.h>
int main(void)
{
    printf("Hello World, but only if you're white!");
}
Then yeah helloworld.c can be pretty fucking racist.
Re: (Score:2)
I can't help but wonder if this person (which I haven't seen) actually did resemble a gorilla. Wouldn't be the first time I've seen such a thing.
I suspect that if the person misidentified was white, this wouldn't be news however.
Re:algorithms aren't racist (Score:4, Interesting)
OTOH If it was a pic of a gorilla but labelled 'Black Afro-American' then you would have the same issue.
Re: (Score:2)
I can't help but wonder if this person (which I haven't seen) actually did resemble a gorilla. Wouldn't be the first time I've seen such a thing.
I suspect that if the person misidentified was white, this wouldn't be news however.
Yeah, god forbid you actually glance at the fucking linked article. I'm not even expecting you to read it, just look at the pictures. I know delaying your insightful reply by 10 seconds would be torture, otherwise how could you proclaim your ignorance?
Re: (Score:2)
I did open the article, moron, but for whatever reason no pictures were showing on my mobile.
Re: (Score:3)
Algorithms aren't racist, and teaching a computer to visually recognize objects is hard. Move along.
Oblig Better Off Ted [vimeo.com]
Re:algorithms aren't racist (Score:5, Informative)
The algorithm wasn't that far off. I'm sure the gorilla will get over it.
Re: (Score:2)
Re: (Score:3)
Algorithms aren't racist
FacialRegion *face = DetectFace(bmp);
if (face != nullptr)
{
if (face->avg_col.r < 10 && face->avg_col.g < 10 && face->avg_col.b < 10)
{
// Be racist
result->order = ordPrimate;
result->genus = genGorilla;
}
else
{
Casper is Concerned (Score:3)
So, do really pale "white" people get mis-labeled as ghosts? Inquiring minds are somewhat concerned because they are rather pale....
Re: (Score:2)
No, but it did tag Lena Dunham as a white manatee.
Re:Casper is Concerned (Score:5, Informative)
So, do really pale "white" people get mis-labeled as ghosts? Inquiring minds are somewhat concerned because they are rather pale....
One of the articles I saw about this mentioned that in the past, light-skinned people had been identified as dogs and seals. Strangely, there was no outrage about that.
Re:Casper is Concerned (Score:5, Insightful)
Searched for "dog" in my Google Photos. 6 photos came up, all of my kids or kid and wife. I don't care. It's an algorithm.
Searched for "seal" in my Google Photos. Only one came up, and it's of my elder kid. I don't care. It's an algorithm.
People who feel "offended" by an algorithm are batshit crazy.
Re: (Score:2)
Thank you for having children, and presumably propagating your common sense.
Re: (Score:2)
People who feel "offended" by an algorithm are batshit crazy.
No, for the most part they have no idea what an algorithm is. The deliberate "monkey" references used to refer to blacks touch a very painful nerve for many and, not even caring why, they are disturbed. They should not be, but they are.
Re: (Score:3)
Could you please elaborate on the "painful part" thing? Mind you, I'm Romanian and might not fully understand what's happening, but I am interacting with people from the States on a daily basis and have quite a few good friends there (Americans, that is). My conversations with them on the "blacks" subject prompted me to draw these conclusions (which might be correct or incorrect):
- "Positive discrimination" is prevalent. Black people have grown to abuse it, hence "Because I'm black!" which is used as an a
Re: (Score:3, Insightful)
Historically racists have called black people apes and monkeys. Therefore this accidentally and somewhat embarrassingly mimics that behaviour.
Historically white people were not, to my knowledge, insulted and discriminated against by being compared to seals and dogs. It's a bit more embarrassing to have women labeled as dogs because they are sometimes called bitches as an insult.
It's really not hard to understand. Context and history attach additional meanings and sentiments to some words.
It's an algorithm (Score:5, Informative)
It's impressive that it can even recognize and classify things as such. Great apes and humans share about 99% of our DNA, any 'alien' entity would classify us amongst the apes.
The fact that black people are black, and thus bear a closer resemblance to the generally darker great apes, is not racist, because an algorithm that is not programmed to have biases cannot be racist. It's just people's interpretation of the facts that makes things 'racist'. Superficially, black people and apes look mathematically more alike than white people and apes. If the thing had been trained on albino apes (which do exist), white people would be classified as apes AND NOBODY WOULD THINK IT WAS RACIST.
Re: It's an algorithm (Score:4, Insightful)
One could point out that there are fewer instances of white males being miscategorized. I suspect this has less to do with any actual racism and more to do with the fact that the people who developed the algorithm are likely predominantly white males and they tend to first test the algorithm on their own collection of photos or those in their circle.
This is an argument for a more diverse workforce...
Re: (Score:3, Insightful)
It's an argument for more QA testing
Re: (Score:3, Funny)
Testing, let alone QA testing or MORE testing, is NOT the Google way.
Re: (Score:2)
Google's phases are usually as follows:
1. Develop product
2. Release product as beta
3. Let product die after X years
Re: It's an algorithm (Score:5, Insightful)
One could point out that there are fewer instances of white males being miscategorized. I suspect this has less to do with any actual racism and more to do with the fact that the people who developed the algorithm are likely predominantly white males and they tend to first test the algorithm on their own collection of photos or those in their circle.
This is an argument for a more diverse workforce...
Yeah, because I bet that's how Google develops their image recognition algorithms - white guys walk around taking pictures of themselves.
As is more likely the case, there are few pictures of albino gorillas in their machine learning corpus (racist against albino gorillas!), and hence less chance for white folks to match gorillas based on macro-level color characteristics.
Re: (Score:2)
White males are just about the easiest faces to categorize. They tend to have short hair that doesn't obscure facial features or create oddball shapes that confuse the classifiers. Their skin tone makes photographing them, finding edges, and extracting features easier than it is with darker-skinned people. White people have a greater variety of eye colors that can be used to distinguish among them. "White guy face" is
Re: (Score:3)
This is an argument for a more diverse workforce...
You're right, the best thing for black people would be to replace otherwise functional employees with people who didn't make the grade but happen to be black.
I love the way you "Diversity, Diversity" types are actually incredibly condescending and racist. Perhaps that's why you types perceive "racism" in nearly everything.
Re: (Score:3)
Actually I am impressed that it did see how similar Apes and people are. Honestly people getting upset over it are just a bit silly. The problem is people think that someone put person_of_african_descent == ape in the code and that is not true. The algorithm just confused one great ape with an expressive face with another. It is no more racist or intentional than the same system confusing a Camaro with a Firebird.
Re: (Score:2)
It's just people's interpretation of the facts that makes things 'racist'.
Fair enough. but
Were it you and yours, you'd probably be more than just slightly (and equally justifiably) upset as well.
Re: (Score:2, Insightful)
Furthermore, if a 4-year-old had pointed out the same thing, we'd regard it as incorrect and then calmly correct for the real interpretation. That's not racism, that's "learning". We should regard this engine as if it were a small child and see that it's "learning". The only analog of this response I can see is when we blame the parents for perceived racism. It's the drawback of learning on the world stage, under the scrutiny of every man, woman, dog, goat and goldfish on the planet. Living under the politic
Re: (Score:3)
But it is racist because the Republicans that run Google, like Republicans in general, use the term ape as a racist term. They're the ones that decided to do this. It was premeditated decades ago by their kind.
Living in the Southeast USA, I'm exposed to many Republicans. Basically my entire family, extended family, and at least half of my friends and co-workers are Republican. I have never heard any of them use the term "ape" to describe a black person. Some are mildly prejudiced, but none are bigots or racists. I've known a few bigots (likely what you think of as racists) in my life and they have invariably been Democrats, probably due to the lingering legacy of the Dixiecrats from decades ago. I'm not saying th
In other news (Score:5, Funny)
Google's algorithm also identified photos of some sunburnt Essex chavettes on the beach as Yorkshire pigs. A Google spokesperson said "no apology is necessary - it's an accurate assessment".
AI has come a long way (Score:2, Funny)
Yonatan Zunger, Google’s chief architect of social: "Machine learning is hard" Now tell us more about them self-driving cars.
"Software" has no opinions of race. (Score:5, Insightful)
Apologizing for a program miscategorizing an image it has never seen before as somehow "racist" makes about as much sense as GE apologizing because my toaster looks like a frowny-face from just the right angle.
Yes, Virginia, we've taken this shit too far.
Re: (Score:2)
Apologizing for a program miscategorizing an image it has never seen before as somehow "racist" makes about as much sense as GE apologizing because my toaster looks like a frowny-face from just the right angle
Most recognition software learns by going through a ton of examples and being told when it's right and when it's wrong. Most likely what happened here is that the learning phase used images of gorillas, but for "humans" used almost all pictures of white people. The computer doesn't know any better than its training, and if it wasn't trained to see black people as human too, then IMHO Google has well earned the crap it is getting.
Re:"Software" has no opinions of race. (Score:4, Insightful)
Actually, apologizing makes sense in this case. It's not about being at fault (they are not), but common courtesy.
Early in our marriage, my wife taught me to say "I'm sorry" when those around me were hurt or bothered, to be a nice person. Prior to that I only said it if I was at fault.
So if someone spills soda on their shirt, then "I'm sorry". Same for Google, it is the decent thing to do.
Regarding photo identification, Google should have images of the Confederate flag show up in the category "Racist"...
Re: (Score:2)
I hope you are never involved in a not at fault accident.
You are giving horrible advice.
When someone is butthurt, you say 'I'm sorry you're butthurt...'
Re: (Score:2)
I know when not to say it. And my dashcam would show I wasn't at fault...
Re: (Score:2)
Is this a problem? (Score:2)
What's wrong with being classified as a rather violent but otherwise perfectly fine animal like an ape?
What about white men who risk being classified as Bill Gates, or Poettering? That would be really offensive.
Oh the outrage a non sentient can be racist (Score:5, Insightful)
What next, a hairy fat guy in a pool gets tagged as a walrus? His girlfriend a whale?
Oh the horror the machine was mean.
In related news... (Score:5, Funny)
How is that racist? (Score:2, Informative)
That woman does look like a gorilla when she makes faces like that, can't blame the computer that automatically flagged you for that.
Algorithms don't know how to be PC (Score:2, Insightful)
Black, simian, short dark hair, big lips. Ape.
Gorillas aren't so bad (Score:5, Insightful)
I'm not going to downplay the feeling of insult that the black couple experienced. There is a long history of racism against blacks, referring to them as apes and other things, with the intent of putting them down. In *this* case, it was an accident of a flawed algorithm, but there's some history here that makes that a hot button. For the sake of repairing the effects of racism of the past, we should be careful about how we use racial slurs, even accidentally.
That all being said, we're learning more and more about gorillas and other higher apes and how intelligent they are. We're closely related. To an alien from another planet, they may look at humans and other apes and not perceive much difference. To compare humans (in general) to apes (in general) isn't all that unreasonable. And some day, when all this racism shit is behind us, mistakes like what happened here might be merely amusing.
Prediction (Score:5, Insightful)
They tweak the algorithm a bit. A week from now, a gorilla in a photo is tagged as 'black person'. Hilarity ensues.
Young children generalise too. (Score:2)
It's stupid (Score:2)
Re:Accepting Responsibility (Score:5, Insightful)
Let's see, we'll do this completely-innocent thing, which is hard, but helps society. Suddenly, hard thing does some harmless, amusing, not-entirely-predicted thing, and people whine about it. OMG, LET'S LEGITIMIZE THEIR STUPIDITY AS A VALID OPINION!
No, you're admitting fault here for something that is NOT YOUR FAULT. You're admitting bad behavior and bad decisions for something that was good behavior and good decision-making, but produced a bad outcome.
THIS IS WHY WE HAVE SHIT SCHOOL SYSTEMS!!! If we have 60% success rate and improve the school system by broad, visible measures to give a better education and improve to an 85% success rate, 15% OF PEOPLE WILL CRY THAT OUR NEW EDUCATION SYSTEM FUCKED OVER THEIR KIDS! Someone will point to all the failures, create a collage, and claim we're totally incompetent!
The appropriate response to bitchwhining about this non-issue is to tell people to stop fucking whining.
Comment removed (Score:5, Insightful)
Re: (Score:3)
Re: (Score:2)
Should watch this video from Slavoj Žižek about political correctness. Man makes some pretty amazing points.
https://youtu.be/5dNbWGaaxWM [youtu.be]
Re: (Score:2)
It's called an "apology" - did you skip that day in kindergarten?
Re:Accepting Responsibility (Score:5, Insightful)
It's called an "apology" - did you skip that day in kindergarten?
When the apology is a completely over-wrought bit of silly nonsense rendered in response to gleeful press releases from the Big SJW industry (who desperately NEED there to be events like this, whipped hugely out of proportion, in order to have things to sound angry about), then it's not an apology. It's a forced sacrifice on the altar of Political Correctness gone (ever more) insane. There's nothing to apologize for here, because nobody at Google sat down to create a racist process or racist results. People who can't mentally untangle the difference between intent and coincidence should just shut up ... except they're all media darlings now, because it's fashionable to be completely irrational on that front.
If Google tagged me as "albino ape" or "yeti" or "Stay-Puft Marshmallow Man" I'd think it was hilarious. Those manufacturing faux offense at this bit of completely benign nonsense are the real racists. They are the ones who are saying that black people aren't smart enough to understand the situation. As usual, the racist SJW condescension is the most actually offensive thing in the room.
Re: (Score:2)
This is kind of like being hit in the arm by a baseball as you are walking by your neighbor's yard. It's probably no big deal, probably didn't hurt much and is unlikely to have caused permanent damage of any kind. But it's still respectful for your neighbor to apologize.
but helps society
Remember that Google gets money from this, primarily indirectly through advertising. Anything they do to help society also lines their wallets (which is pretty much the definition of how capitalism is supposed to work).
Don't fall into the
Re: (Score:2)
And to be extremely clear, I never said "Google was being racist", in any form or fashion at all. Let's nip that in the bud before someone argues against that straw man.
Re:Accepting Responsibility (Score:4, Insightful)
This is kind of like being hit in the arm by a baseball as you are walking by your neighbor's yard.
It's kind of like being hit in the arm by a baseball THAT YOU IMAGINED, BUT WHICH DOESN'T ACTUALLY EXIST, as you are walking by your neighbor's yard.
Re: (Score:3)
What a piece of work you are. Just the sort of person I'd want to work with or have designing products. I'm sure that when a bad outcome comes about, despite your behavior and decision-making clea
Re: (Score:3)
Hm, so, you're saying if you wrote some software that has undesired, incorrect behavior that could easily be considered deeply insulting and someone told you about it or even-gasp-complained,
I would assume they're being ridiculous. There's a difference between "Oh, that's not quite right" and "OMG LOOK AT THIS HORROR! YOU MUST APOLERGIZE!" This is an unremarkable bug, not a slight against anyone; an apology makes no sense here, aside from patting someone on the head to placate them.
I'm sure that when a bad outcome comes about, despite your behavior and decision-making clearly having been perfect, your response will be polite and professional.
It might be, but it won't be an apology. When people start rallying and screaming on my Facebook page because 85% of people who watched Planet of the Apes also watched a Martin Luther King documenta
Re: (Score:3)
It might be, but it won't be an apology.
Unless you are Steve Jobs reincarnate, I doubt this position will get you as far in life.
Re: (Score:2, Troll)
How is this not their fault? They clearly didn't test their software properly. Their software produced completely unacceptable results. They're right to apologize.
You're admitting bad behavior and bad decisions for something that was good behavior and good decision-making
Good decisions? Sorry, but releasing poorly tested software like this was obviously a bad decision. The bad outcomes were a direct result of their poor decision making.
Reasonable people take responsibility for their mistakes. They don't play pretend that "they did everything right" and that their failures were completely beyond their control.
Re: (Score:2)
How is this not their fault? They clearly didn't test their software properly.
They may have tested it with hundreds or thousands of photos available on Picasa and not had it tag anyone "Canus Lupus Homus Sapius Chimpanzeeus", and then released it and in a week had someone take a picture at their wedding and get tagged "Chimpanzees". If your face is hard, deeply-wrinkled, and sporting a bolt-on pair of enormous, leathery ears, it might tag you as a monkey; I think I've encountered exactly one person in my life who looked like that, so it's not surprising it'd miss him in testing. M
Re: (Score:2)
How good does the cutting edge of object recognition need to be before it's not "poorly tested" anymore, especially when it's for a silly photo app and not a medical or military application? I never hear this type of thing from people who have actually had to solve these types of problems. The reality is that objects are going to be confu
Re: (Score:2)
Woah bud, calm down. You weren't on this development team, were you?
Re: (Score:3)
there's no issue. just a hissy fit thrown by a guy who has heard this expression addressed at him/his friends in a negative manner in the past (understandable). i mean, just look at the faces the girl in those pictures is making... isn't gorilla the first thing you think of?
what's next? are we going to pretend a closeup of a bald patch doesn't look like a billiard ball? or that asians don't look like they're winking in photos? or that dwarves in funny hats don't remind us of garden gnomes? some people just
Re: (Score:3)
An apology is in order, nothing more. And only to the misidentified people, not to t
Re: (Score:2)
agent smith plz go.
Re: lol (Score:5, Insightful)
The truth is that there are a ton of entitled idiots who believe they have the right to be offended, and a gigaton of idiots who choose to oblige them. That's why we don't have nice things anymore.
Re: lol (Score:4, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
I think we're looking for racism where there is none. Observing physical characteristics is not racism, and the fact of the matter is that some black people tend to have a more protruding jawline and fuller lips than one would typically see among caucasians. If you compare those two very specific physical characteristics with the great ape family, you see similarities.
Now, before anyone starts screaming about how I'm racist (too late), sharing one or two physical characteristics with another species, out of hundreds of physical characteristics, doesn't mean a damn thing when it comes to humans easily identifying people as not belonging to the same subgroup that shares those characteristics. It's the same reason we don't think that someone with a striped mohawk is a zebra. We're able to take contextual clues and infer that that person is not, in fact, a zebra.
However, it DOES mean that an algorithm that has been largely trained on insufficient or faulty data sets can make incorrect inferences based on the characteristics it has been trained to recognize. If anybody is at fault here, it isn't the algorithm, but whatever engineer fed their image-recognition system training data so woefully insufficient that it would confuse a human with an ape based on just a physical characteristic or two, out of any number of data points that would indicate, "Hey, this is obviously a human being."
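To make the training-data point concrete, here's a minimal, hypothetical sketch (nothing to do with Google's actual system): a toy 1-nearest-neighbour "tagger" whose training set covers only a narrow slice of the "human" feature space. The feature vectors and labels are entirely made up for illustration; the point is that anything falling outside the sampled region snaps to the nearest label the model has seen, however wrong.

```python
# Toy 1-nearest-neighbour tagger. Feature vectors are invented stand-ins
# for whatever an image model would extract; all names are hypothetical.

def nearest_label(features, training_set):
    """Return the label of the closest training example (1-NN)."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(training_set, key=lambda ex: sq_dist(ex[0], features))[1]

# The training set represents only one narrow kind of "human" example,
# so humans whose features sit outside that region have no close match.
training = [
    ((0.9, 0.1), "human"),    # only a single slice of humanity sampled
    ((0.2, 0.8), "gorilla"),
]

# A perfectly ordinary input outside the sampled "human" region lands
# nearer the other class and gets the wrong tag:
print(nearest_label((0.4, 0.6), training))  # prints "gorilla"
```

The fix in this toy setup isn't a smarter distance function; it's a training set that actually covers the population the tagger will see, which is exactly the "insufficient data" failure described above.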
Having done some random searches, I agree this is just a mistake, and comparable to others. A few examples that I have found are:
- a search for "dolphins" including a picture of my daughter swimming.
- a search for "squirrel" including meerkats
- a search for "cat" including some dogs
- a search for "ghost" including a slightly blurred picture of my wife
- a search for "man" showing some women and "woman" showing some men