Facebook Apologizes After Its AI Labeled Black Men 'Primates' (bbc.com) 217
Facebook users who watched a newspaper video featuring black men were asked if they wanted to "keep seeing videos about primates" by an artificial-intelligence recommendation system. From a report: Facebook told BBC News it "was clearly an unacceptable error", disabled the system and launched an investigation. "We apologise to anyone who may have seen these offensive recommendations." It is the latest in a long-running series of errors that have raised concerns over racial bias in AI.
Humans *are* primates (Score:2, Informative)
checkmate, woketardz!
Re: (Score:2, Interesting)
Then the story would more closely resemble "Facebook users who watched a newspaper video featuring men [in general] were asked if they wanted to "keep seeing videos about primates" by an artificial-intelligence recommendation system. ".
Re: (Score:3)
To be fair, the video showed black men and white men. The report chose to highlight that there were black men in the video. I saw behavior typical of primates by all parties in the videos.
Neither video showed enough context to decide who was right or wrong (if anyone).
Re: (Score:2)
Where did you see the video? Do you have a link?
Re: (Score:2)
An article from NPR [npr.org] had the link [facebook.com].
Re: (Score:2)
I just watched it. On a percentage-of-time basis, it looked to me like the white people in the video were featured just as much as, if not more than, the black people. I think the misclassification may not be race-based. (I know, I know, technically humans are primates, but I assume it doesn't just say "want to watch more primate videos" for all videos depicting humans, so it is still a misclassification.)
Re: (Score:3)
An article from NPR [npr.org] had the link [facebook.com].
I see people with dark skin, I see people with light skin. Are we certain that the word "primate" was referring only to the people with dark skin?
The video itself is about people with light-colored skin harassing people with dark-colored skin, and it is supportive of the people with dark-colored skin. So it seems like people who consider themselves against racism and harassment of others would be the ones watching. And sad to say, a fair number of those folks might take offense.
Gotta be careful with those fol
Re: (Score:3)
Re: (Score:2)
Not exactly...or perhaps not, anyway. But it was clearly either wrong or misapplied. People don't like to be reminded that they are primates, and calling a human a primate is normally intended as an insult.
I can see an AI that was trained to categorize living objects calling people primates without error. It would still be wrong to use it in this context.
Re: (Score:2)
That's like saying humans don't like being called "Homo Sapiens" because it has the word "homo" in it. The fact that using "primates" to describe humans is upsetting to someone's sensibilities does not make it wrong.
Re: Humans *are* primates (Score:2)
By correcting themselves and indirectly saying that black people aren't primates, Facebook is implying that they're not human.
Re: Humans *are* primates (Score:2)
Re:Humans *are* primates (Score:5, Insightful)
This is really a problem with how machine learning actually works.
This is similar to a "click all the bicycles" CAPTCHA where all you see are bricks. The ML believes there is a bike in there, and won't let you continue until you click it.
Somewhere along the line, the training set contained black men and whatever was labeled "primates", possibly a wikipedia entry, and the ML correctly identified that yes, humans are primates, but the training set didn't have anything in there to discern why "primate" was the wrong word to use.
It's similar to other ML problems that require babysitting the training: ANY interaction with humans has to be reviewed before being fed to an ML program, because you don't want it to learn bad habits (e.g. Microsoft Tay), and you don't want it to misinterpret slang as proper language. You can easily trick RNNs into believing that certain symbols belong together, even when those symbols are not analogous. An ML RNN would quite literally learn to speak like a gangster or a nazi just by constantly being fed slang that it understands but doesn't know how to use in an inoffensive way.
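A toy sketch of that last point, with a hypothetical blocklist invented purely for illustration (not anything Facebook has described): nothing in the training objective tells the model which label words are acceptable to show users, so that "babysitting" has to happen as a separate check on the output.

# Toy sketch (hypothetical names, not Facebook's system): the model only knows
# the label strings it was trained on. Nothing in the loss says "primates" is
# the wrong word to surface next to a video of people, so that judgement has
# to live outside the model, e.g. as a hand-curated blocklist on suggestions.
SUGGESTION_BLOCKLIST = {"primates", "gorilla", "ape"}

def safe_suggestion(model_label):
    """Return a 'keep seeing videos about X' prompt, or None to stay silent."""
    if model_label.lower() in SUGGESTION_BLOCKLIST:
        return None  # taxonomically defensible, socially unacceptable: suppress
    return f"keep seeing videos about {model_label}"

print(safe_suggestion("cats"))      # keep seeing videos about cats
print(safe_suggestion("primates"))  # None

(Facebook's actual response, per the article, was to disable the recommendation feature entirely, which amounts to the same kind of out-of-band override.)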
Context is important (Score:5, Insightful)
Re: (Score:3)
Black culture has always been more than being oppressed.
Re: (Score:2)
ALL people look like monkeys, or apes, etc. This is a notoriously difficult problem for a piece of computer software to solve. I look so much like a capuchin that one in a zoo in Panama offered me a piece of his banana. (Or maybe he was just trying to make nice with the much larger primate, who knows. But we did share a significant resemblance in facial hair at the time.) And the kids in the zoo, who have presumably seen all of these animals before, found me far more interesting than any of them; I'm two me
Re: (Score:2)
Totally natural, possibly. Also something that they can properly be called on. When Facebook does it, then a defamation lawsuit seems reasonable to me, and possibly to them, so they quickly apologized. Perhaps semi-sincerely, to the extent that a corporation can be sincere.
Re: Context is important (Score:5, Insightful)
Small children of all races are affectionately called monkeys in many languages.
Older children of all races are not-so-affectionately called monkeys, in many cultures and many languages, when they fail to act like civilized human beings.
Dehumanization requires intent. Something that superficially resembles dehumanization but is not intentional is not dehumanization.
Confusing superficial resemblance of a thing with the thing itself is a cargo cult mentality. It occurs in all sorts of places, like church attendance being confused with piety or spirituality and university attendance being confused with learning.
You're doing it too. If you were to take offense at the Latin alphabet on the grounds that those same 26 letters were used to write racist polemics and laws and whatnot, you'd be committing the same kind of error, if only in greater degree.
Part of the scientific revolution was the ability to distinguish superficial appearances, first impressions, and anthropomorphizing ascriptions of morality concocted entirely in your own mind from underlying reality as deduced from repeatable objective measurements. Let's not lose that.
Re: (Score:2)
Also, yes, context is important. Here, the context is a flaw in ML training, not archaic slurs or who their targets may have been.
true but irrelevant (Score:3)
Re:Humans *are* primates (Score:4, Insightful)
Pat yourself on the back for never opening a history book or novel set in the American South.
Re:Humans *are* primates (Score:4, Insightful)
Re: (Score:2)
Obviously, humans are primates. However, the word "primate" has a slightly different meaning in colloquial English.
Yes, in colloquial English it means most important, or highest ranked, and is not specific to zoology.
Re:Humans *are* primates (Score:5, Insightful)
I remember an uproar over the word "niggardly" not that long ago. The word is much older than the similar, yet entirely unrelated, racial slur and even predates the European-African slave trade by centuries.
So, what is this "descriptivist" viewpoint that says people who don't understand what the words they use actually mean get to impose their mistake on everyone else? How can any legitimate approach to meaning say it is correct to erroneously assign to a word a meaning it does not possess? Aren't you saying they are correct to call it racist because they don't know what it means? Or at least in spite of not knowing what it means? How much of that can we permit before communication becomes impossible?
Re:Humans *are* primates (Score:4, Insightful)
Language changes; get used to it. Maybe you're just getting old and struggling to keep up? :)
In general, people only use the term "primates" to refer to animals. Racists have long used the term "monkey", and football "fans" make monkey noises at certain players. Thus "primate" is a synonym for "monkey" in this context.
The question is: why would you choose to be obtuse when this is all very apparent?
Re: (Score:3)
It's absolutely common usage and I'm not going to make a fool of myself claiming it's about control and finding contrary arguments to refuse to go along with it. I concede that the English language has a lot of variations in its usage from place to place and country to country, but in this instance it's very clear and very common and I have a hard time believing that's not the case where you live.
Re: (Score:2, Troll)
So we completely derail the language every time somebody wins an Oscar for performative outrage?
Nah.
Performative outrage! Love it.
One of the problems with attempts to remove words from the lexicon is that it ends up becoming a sort of Newspeak. We've seen it with words like Moron, retarded, Imbecile, idiot, midget, dwarf, and other words that have somehow become offensive, and now I guess we are going to go after primate?
The problem with this cleansing of the lexicon is that these words are not born offensive.
I'm an asshole - that word was never a positive thing - it was born offensive
All tho
Re: (Score:2)
We've seen it with words like Moron, retarded, Imbecile, idiot, midget, dwarf, and other words that have somehow become offensive
Words used for decades as insults somehow become offensive? It is a mystery I tell you, and so unfair to the word.
Re: (Score:2)
As a non-human primate I'm offended that humans are offended to be associated with me.
Re: (Score:2)
I would agree with you were the same option given to those viewing pictures of white or yellow people.
The entire Facebook banned from Facebook (Score:5, Funny)
Facebook reacted promptly to the disturbing racist incident and permanently banned Facebook from Facebook for violating community standards.
Re:The entire Facebook banned from Facebook (Score:5, Insightful)
And the people rejoiced.
Re: (Score:3)
And productivity increased 175%!
Re: The entire Facebook banned from Facebook (Score:2)
Freeing up humanity for more productive pursuits. Within days of the ban, Twitter users increased by 4,000% the number of people 'cancelled' for jokes posted 10 years ago. Jack Dorsey, speaking live from a cave in Afghanistan, expressed cautious optimism for Twitter's plan to end humanity.
You would think (Score:2)
They would test AI before they unleash it on the world
Re: (Score:3, Interesting)
Re: (Score:2)
I think they did, which shows that their testing was biased too...
Re: (Score:2)
They would test AI before they unleash it on the world
What's the racial diversity breakdown of Facebook's employee pool?
Re: You would think (Score:5, Insightful)
While a black person might have noticed this, that's not the key issue. They could just as easily have missed the issue.
Their methodology for training the AI was probably flawed. It's not racism or muh representation. It's a badly designed AI that, along similar lines, could have misidentified women as male if they are particularly tall. It could have confused bald people with eggs.
Hiring an impossibly diverse staff would be a far less effective approach than having well designed experiments.
Re: (Score:2)
Maybe they did test it... after all, it wasn't wrong.
Flip it (Score:4, Interesting)
Don't look at it as AI confusing black men with non-human primates (because we are primates too); look at it as AI unable to tell the difference between our species and the others.
Have you ever gone to a zoo and looked at an orangutan or gorilla? Maybe a chimpanzee? They're a lot closer to us than most people probably realize. If you're doing facial recognition using skin tone and distances between key features, it doesn't surprise me in the least that a man and gorilla can be a 'close match' to an AI.
Re:Flip it (Score:5, Funny)
> Have you ever gone to a zoo and looked at an orangutan or gorilla? They're a lot closer to us than most people probably realize.
The differences between orangutans and humans should be obvious. Orangutans are - well, orange. And unable to override their emotions with logic. Have you ever seen a human who is orange and unable to... Oh shit!
Re: (Score:2)
2017-2021 will forever be burned into our memories.
Re:Flip it (Score:5, Informative)
Orangutans are quite capable of overriding their emotions with logic. Give one a screwdriver and it will pretend to be uninterested, hide the screwdriver, and later disassemble its cage. The emotion is wanting out; the logic is waiting until the coast is clear and using the tool for what it was designed for.
There are all kinds of stories of orangutans being escape artists, doing things like scoring a piece of wire, hiding it in their mouths, and picking the lock later. Definitely thinking ahead and delaying gratification.
Do a search; there are some good stories.
Re: (Score:3)
So you're saying they carefully PLAN their exit, rather than trying to run away immediately with no real plan (and leaving their friends behind)?
Re: (Score:2)
Yes, more logic than certain people.
Re: Flip it (Score:2)
An AI can mistake a ferret for a guinea pig, and if its job is to perform image classification for humans, that's an abject failure, because any human would know we don't eat ferrets.
There is no point in rationalizing it, I know they're both fuzzy little creatures, and the machine is not rational. Don't make excuses for a failed algorithm, it's not a child.
Re: (Score:2)
You are correct. But are we to believe that the "AI" was simply let loose in the wild with zero supervision?
There's no reason the "AI" wouldn't tag other races the same way. It seemed to do well on whites and even on darker peoples. "Black" is an adjective, and the shades run from white complexion to ebony, yet it didn't seem to treat others of different heritage who were darker, or at least as dark, the same way.
Why is that? An AI is trained to an extent. Why the disconnect?
I think there was human involvement.
No, you and others are still giving the AI algorithms way too much credit, as if, because it is only a dumb machine, it must be right on some 1+1=2 level, so it's either telling us a fundamental truth or a human must have screwed it up on purpose. That is totally wrong. It is a dumb machine using fuzzy logic, and it makes dumb decisions, that is all. We don't need to justify it, we need to improve it or paper over the faults, and that is a major fault. It isn't doing 1+1=2, it is doing 1ish plus 1ish shoul
Re: (Score:2)
You can't tell whether a subject is a voter or a member of a union from a video unless that video actually shows them voting, etc.
A human could be wearing clothes which resemble the hair naturally present on a gorilla.
The clothing can obscure the body shape.
There are a wide range of body shapes and facial structures to account for.
The pose of the subject could also affect the body shape identified by an AI.
AI is still pretty much in its infancy, it makes mistakes, and humans/gorillas are not hugely far apart. I
Re: (Score:3)
The main reason I can tell a gorilla from a human is that I'm a human. I'm fairly sure that the average European wildcat is very capable of telling its own species from the common housecat; I wouldn't count on you having the same capability.
Same goes for an AI, which has no vested interest in mating with the correct species.
Re: (Score:2)
They were never taught to wear clothes...
I know a lot of hairy humans...Congenital Hypertrichosis.
So... =p
That's not so bad. (Score:4, Funny)
Re: (Score:2)
Re: (Score:2)
Boy do I have a country to show you!
https://en.wikipedia.org/wiki/... [wikipedia.org]
It's pronounced like "knee-jair". Like you'd say it in French.
And both countries are named after a river.
Which is where people were kidnapped for use as slaves in America, and which is how the word that later became a slur came about.
Same for "moor" by the way. The Moors were a powerful nation in west Africa.
I wonder... is any of this part of US education? (Because it should be. It's super-interesting too.)
Re: (Score:2)
At least it didn't label them "Nigerians".
I may have missed a joke there, but how about Archbishop Ndukuba, Primate of All Nigeria?
I'm sure he has no problem with the Word.
https://livingchurch.org/2019/... [livingchurch.org]
Re: (Score:2)
Maybe in Spanish versions it did.
Primates come in a bunch of (Score:2, Informative)
Re: (Score:2)
Re: (Score:2)
There are definitely people out there posting photos of black people and describing them as primates. They probably aren't doing so with the intention that they be fed into AI systems. They are doing so with the intention of making racist comments on social media.
Re: (Score:2)
> wear clothes
Speak for yourself, kiddo!
-- old-school geek. ;)
Re: Primates come in a bunch of (Score:5, Interesting)
No, it's not odd and it's a leap to begin suspecting racism. A poorly trained system is a far more likely explanation.
Bias is more likely than racism. Their sample may have been inadequate. Blacks are around 13% of the general US population, and that figure may be lower in some populations (e.g. Facebook users). If they were assigned x number of training images, even just based on population representation, then 87% of the sample would not be black, and the black share would probably be far lower still once international users' data is included.
It is far more likely somebody ballsed up the sampling. A form of bias, sure, but not quite burning crosses on the lawn.
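For what it's worth, here is a minimal sketch of that sampling argument. It uses made-up two-dimensional "embeddings" and a 1-nearest-neighbour classifier, nothing like whatever Facebook actually runs, but it shows how the under-represented group ends up with a much higher error rate even though the code never mentions the group explicitly.

# Toy illustration (synthetic data, not Facebook's pipeline): when one group
# supplies only a small share of the training examples, a simple 1-nearest-
# neighbour classifier makes far more mistakes on that group.
import numpy as np

rng = np.random.default_rng(42)

def sample(center, n):
    # Fake 2-D "image embeddings" clustered around a per-group center
    return rng.normal(loc=center, scale=1.0, size=(n, 2))

# Training set: group A heavily over-represented (475 vs 25 examples)
train_X = np.vstack([sample(0.0, 475), sample(2.0, 25)])
train_y = np.array([0] * 475 + [1] * 25)

def predict(x):
    # 1-nearest-neighbour: copy the label of the closest training example
    return train_y[np.argmin(np.linalg.norm(train_X - x, axis=1))]

# Balanced test set: the error rate for the under-sampled group is far higher
for label, center, name in [(0, 0.0, "group A"), (1, 2.0, "group B")]:
    test = sample(center, 500)
    errors = sum(predict(x) != label for x in test)
    print(f"{name} error rate: {errors / 500:.1%}")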
Re: (Score:3)
It is far more likely somebody ballsed up the sampling. A form of bias, sure, but not quite burning crosses on the lawn.
No, it's more than someone ballsing up the sampling; someone ballsed up the entire process of making the product, from the sampling all the way through to release. I do work with deep learning. We can't release something without it going through a review by a team whose job it is to check for things like this.
I don't work for Facebook: the company I work for could be swallowed by faceb
Re: (Score:2)
Let's say I go to a shop; the shopkeeper is a racist but keeps that very much to himself, because he knows that treating customers in a racist way is bad for business and hurts your own wallet. Shopkeeper = racist, shopping experience = not racist.
This is the exact opposite. The algorithm isn't racist, because "racist" is not something that algorithms can be. However, the outcome is racist: A derogatory term was used for black people.
For all those w
Re: (Score:2)
Why not both?
Humans are primates and it's reasonable that an AI would classify humans as such. It's not that AI is wron
Re: (Score:3)
It might not even be a sample size issue.
This is pure speculation here as no one here knows the exact algorithm.
Yet, let's assume that Facebook has a generalized AI to detect what a video is about and then recommend ads.
Video of airplane --- suggest a trip
Video of cat --- suggest cat food
Video of primates --- suggest more primate videos
Let's also keep in mind the context. These videos were mainly black people v police encounter videos. This is not just black people walking around. I'd find it surprising if Fa
Re: (Score:2)
Not necessarily racism (Score:5, Insightful)
Re: (Score:3)
This.
The AI operates based on the data set it's been trained with.
If you feed it a bunch of photos of white men labelled "human" and a bunch of dark gorillas labelled "gorilla" it's going to bias towards labelling darker images as gorillas.
If you carefully control the training data, you could get the AI to recognise a pile of coal as a gorilla because it's dark.
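A quick sketch of that "pile of coal" point, with entirely made-up numbers and a single hypothetical feature (mean brightness) rather than whatever features a real vision model learns: if brightness is the only thing separating the training classes, the model learns "dark means gorilla" and applies it to anything dark.

# Toy sketch (synthetic data, hypothetical feature): mean brightness is the
# only feature available, so the learned rule is just a brightness threshold.
import numpy as np

rng = np.random.default_rng(7)

# Fake training "images", reduced to mean-brightness values in [0, 1]
human_brightness   = rng.normal(0.7, 0.1, 500)   # photos of light-skinned people
gorilla_brightness = rng.normal(0.2, 0.1, 500)   # photos of dark gorillas

# "Training": pick the threshold midway between the two class means
threshold = (human_brightness.mean() + gorilla_brightness.mean()) / 2

def classify(brightness):
    return "gorilla" if brightness < threshold else "human"

print(classify(0.75))  # bright photo              -> human
print(classify(0.05))  # a very dark pile of coal  -> gorilla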
Re: (Score:3)
Re: (Score:2)
Re: Not necessarily racism (Score:2)
But if race is a social construct that we no longer find useful, isn't deconstruction the correct path forward? That means we need to stop categorizing people by race.
Re: (Score:2)
Never attribute to racism that which is adequately explained by an AI trained on an insufficiently robust data set.
Nobody says the word "primate" was created through racism. What is said is that using the word _is_ racism, no matter _why_ it was used.
Re: (Score:2)
In this case, an overly robust data set, that correctly identified primates.
AI is like children ... (Score:2)
Reminds me of a comedy bit by Patton Oswalt about when his baby daughter accidentally acted racist [youtube.com].
We wouldn't let children do recognition and decision tasks that some corporations use higher-order pattern-matching (so called "AI") for ...
Why do they think that the "AI" would be more appropriate? We could train children with the same data, but it wouldn't make them more mature.
Facebook is a joke (Score:4, Insightful)
AI is the "blockchain" ... (Score:3)
... of scam. AI will not be here until a computer responds with, "Fuck it! I ain't doin' it"
Likely ImageNet Striking Again (Score:3)
For a decent synopsis of the issue, see:
https://venturebeat.com/2020/11/03/researchers-show-that-computer-vision-algorithms-pretrained-on-imagenet-exhibit-multiple-distressing-biases/
just wait the AI hiring race lawsuits to hit! (Score:2)
Just wait for the AI hiring race-discrimination lawsuits to hit!
Pay bananas get monkeys (Score:2)
who think that their AI is working at SkyNet/Matrix levels.
You're all racist (Score:3)
The footage contained cops too, but you're assuming it was the black people, not the cops, that made the AI decide the video contained primates. Therefore making you racist.
Where is the video (Score:2)
Are we sure the AI was calling the black guys... (Score:2)
Re: (Score:2)
It sounds more and more to me that the people calling the whole thing "racist" simply assumed that the AI labeled the black people primates.
So yes, I do think the accusation of racism is apt. I only wonder if it's made in the right direction.
Too fucking much people! (Score:2)
Re: (Score:2)
Before you bleat about people needing to learn "fucking science" you would do well to learn "fucking history".
Finding this very hard to assess (Score:3)
I find this impossible to assess.
First, not having found or seen the video, it's impossible to tell what it's depicting. From the postings it appears to show black and white men in dispute. Without knowing how the algorithm works, we do not know what in the video led to the suggestion of more clips of primates. Was it behavior? Was it the black guy? Or was it the white guy? Who knows?
Second, we do not know what you got if you clicked on the 'more primates' link. Did you get just ape clips? Or did you get a mixture of clips of humans and apes? Or did it give you just humans? Or maybe clips of all kinds of unrelated stuff or creatures?
If FB's algorithm were routinely to suggest clips of apes to anyone looking at clips in which blacks are found, but not otherwise, then you'd have to agree there would be a problem.
But from the reported facts and available evidence it's impossible to know whether that is happening.
Checking FB for 'primates' (Score:2)
I checked by searching YouTube for 'primates'.
As far as I can tell it is not coming up with pictures of black people, or not preferentially so. The few people in the little icons seem mostly (not all) to be white or quite light skinned. The subject matter seems to be mostly clips of apes, and quite a few are of people talking pedagogically about the differences and similarities and evolutions of apes and humans in general.
I didn't check out what is in them, having spent enough time on this already. I don
Zoology says (Score:5, Informative)
Humans:
Domain: Eukaryota
Kingdom: Animalia (Metazoa)
Phylum: Chordata
Subphylum: Vertebrata
Class: Mammalia
Order: Primates
Family: Hominidae
Genus: Homo
Species: Homo sapiens
Now, calling someone a Prokaryote would really be an insult.
Heh (Score:2)
They, as always, apologise for a trivial nothing yet fail to apologise for fake news and rigged elections and referendums.
This reveals a flaw with assumptions about AI (Score:2)
Story is almost too perfect (Score:2)
The story is almost too perfect.
While I can easily believe a poorly trained AI is at work, it's just sooo perfect for a Two Minutes Hate that one can't help but be suspicious.
AI Not Necessary in this Application (Score:2)
There is no need to have AI analyze a video and guess what the content may be. AI is insufficiently capable of doing so and the entire effort is better served by simply suggesting videos with similar keywords.
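As a rough sketch of that keyword approach, with a hypothetical tag catalogue invented purely for illustration: rank the other videos by how many tags they share with the one just watched, and no image model ever gets a chance to mislabel anyone.

# Toy sketch (hypothetical data model, not Facebook's): suggest the videos
# that share the most tags with the one the user just watched.
def suggest(watched_tags, catalogue, k=3):
    scored = [(len(set(watched_tags) & set(tags)), vid)
              for vid, tags in catalogue.items()]
    return [vid for score, vid in sorted(scored, reverse=True) if score > 0][:k]

catalogue = {
    "beach-argument": ["news", "dispute", "uk"],
    "gorilla-doc":    ["wildlife", "gorilla", "documentary"],
    "local-news":     ["news", "uk", "police"],
}
print(suggest(["news", "uk", "dispute"], catalogue))  # ['beach-argument', 'local-news']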
Re:So what? (Score:5, Insightful)
Re: (Score:2)
Interestingly, they did not apply the "primates" label to videos of dark East Indians, for example. Draw your own conclusions.
Re: (Score:3)
The video in question showed white men and black men. All displayed typical primate behavior.
Re: (Score:3)
But socially clueless bots are not.
Re: (Score:2)
Humans ARE primates...
To you, as a subspecies of "smart ass primate": In general conversation, the word "primate" is commonly used as "non-human primate". I suggest that in the future if anyone asks you if you identify as white, black, or whatever, you answer "primate".
Re: (Score:2)
In general conversation, the word "primate" is commonly used as "non-human primate".
Maybe where you are. Here, the only people who use the word, know what it means.
Re: (Score:2)
I call bullshit on that. Primate means exactly what the dictionary says, and it includes humans. Just like mammals includes humans.
Re: (Score:2, Insightful)