Apparent Racial Bias Found in Twitter Photo Algorithm (venturebeat.com)
An algorithm Twitter uses to decide how photos are cropped in people's timelines appears to be automatically electing to display the faces of white people over people with darker skin pigmentation. From a report: The apparent bias was discovered in recent days by Twitter users posting photos on the social media platform. A Twitter spokesperson said the company plans to reevaluate the algorithm and make the results available for others to review or replicate. Twitter scrapped its face detection algorithm in 2017 for a saliency detection algorithm, which is made to predict the most important part of an image. A Twitter spokesperson said today that no race or gender bias was found in evaluation of the algorithm before it was deployed "but it's clear we have more analysis to do." Twitter engineer Zehan Wang tweeted that bias was detected in 2017 before the algorithm was deployed but not at "significant" levels.
Its abit strange. (Score:1, Flamebait)
Re: (Score:1)
Seems a bit strange that it's only racist when it's in favour of someone white. I'm sure the algorithm would not be considered equally racist if it prioritized a black person.
Who said it was racist?
Re:Its abit strange. (Score:4, Informative)
Seems a bit strange that it's only racist when it's in favour of someone white. I'm sure the algorithm would not be considered equally racist if it prioritized a black person.
Who said it was racist?
The headline only said it was racially biased. Since the algorithm seems to have a bias based on skin color (which is to say, race), that headline seems accurate to me.
Re: (Score:2)
Which is it, biased based on skin color, or on race? The two aren't exactly the same thing... (Pretty clear it's skin color, not race.)
Re: (Score:2)
Which is it, biased based on skin color, or on race? The two aren't exactly the same thing... (Pretty clear it's skin color, not race.)
You are indeed right in your pedantry, but if we are going to be pedants then the fact that "race" doesn't exist from a biological point of view should figure into it. I guess they meant skin colour, and most people would understand that; we conflate the two since one doesn't really exist.
Re: (Score:2)
Yep, exactly. Race doesn't exist from a biological point of view.
What's more, I'm pretty sure the pattern-matching system in question doesn't have a concept of race that matches the one held by those wielding identity politics and making everything (including things like this, which simply aren't...) about race.
Which is why people with different eye shape (as opposed to skin color) likely don't factor into this problem, despite that being just as much about "race" as popularly conceived as skin color is. So then who gains
Re: Its abit strange. (Score:2)
Re: (Score:2)
Why don't you talk to a doctor instead of taking on the mantle of spreading idiocy until someone corrects you? Could it be that what doctors consider race and what you consider race are two different terms describing two different things?
Nah! That couldn't possibly be it!
Re: Its abit strange. (Score:2)
Re: (Score:2)
If you think I am an SJW, you are an idiot. That is very exact, isn't it? You are an idiot, and I gave you the reason why.
Now, can you give a real example when you went to a doctor and they asked you for your race? Don't give me "it happened" bs. Tell me a real incident that happened to you. What was the illness? Did you ask the doctor why he needed that? Do you know or have some idea why the doctor asked you that?
Because I have NEVER been asked for it in all my fucking life.
Re: (Score:2, Interesting)
The article is content-free: it says that a bias was detected, but not what inputs the algorithm looked at.
The real answer might be that it showed more popular photos and those happened to be of white actors or something like that. In other words, it's probably the case that the bias here came from input from the users that was fed into the algorithm. Given that everything they do is based on popularity, it will be interesting to see how much they can overrule the decisions of the users before the users a
Re:Its abit strange. (Score:5, Insightful)
"Apparent" because this is something people have noticed and needs investigation.
When photos need to be cropped for Twitter, especially on mobile, it happens automatically. Face recognition is used to try to make sure that the subject is in the frame, but when a white person and a black person appear in the same image it usually picks the white person.
It's a well-known problem. Face recognition often fails with dark skin, and devs often don't test for it. I saw it happen in a thread about Zoom removing black people's heads because the face recognition failed and considered them part of the background.
Google, Bing, DuckDuck Go, etc. (Score:1)
On each search engine, enter "hot bikini girls" or something similar.
99% white chicks.
Re: (Score:2)
On each search engine, enter "hot bikini girls" or something similar. 99% white chicks.
On an image search engine, enter 'jamila', which is Swahili for 'beautiful woman', a language spoken by people who are mostly of black African descent.
99% dark skinned or black women.
What a strange coincidence.
Re: (Score:1)
So beautiful women don't wear bikinis?
Nothing in "hot bikini girls" implies race.
Re: (Score:2)
Nothing in "hot bikini girls" implies race.
Other than being an English phrase, spoken mostly by whites. You've restricted your data set to mostly one type; don't be surprised when your sample reflects it.
Re: Its abit strange. (Score:2)
So a contrast issue? Seems a little over the top to call it inherent racial bias. Conjuring up images of cross burnings and lynchings does little to solve a technical problem of using shadows to detect bone structure and the geometric spacing of eyes, ears, and nose.
Headlines should really say: black people get a break from Big Brother using facial recognition to catalog and track them.
Re: (Score:2)
Unfortunately what usually happens is the police buy a "99.999% accurate" facial recognition system that has only been tested on white people and it incorrectly identifies a load of black people who get a knee to the neck from the cop who is convinced they are the person they are looking for.
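The parent's point about a "99.999% accurate" system tested mostly on one group can be made concrete with a toy sketch. All the numbers below are invented for illustration; this is not data about any real system:

```python
# Illustrative only: made-up numbers showing how a high aggregate
# accuracy can hide a much worse error rate for a subgroup that was
# underrepresented in testing.

def aggregate_accuracy(groups):
    """groups: list of (n_trials, n_errors) per demographic group."""
    total = sum(n for n, _ in groups)
    errors = sum(e for _, e in groups)
    return 1 - errors / total

# Hypothetical test set: 99,000 trials on group A (10 errors),
# only 1,000 trials on group B (90 errors).
groups = [(99_000, 10), (1_000, 90)]

print(aggregate_accuracy(groups))   # 0.999  ("99.9% accurate" overall)
print(1 - 90 / 1_000)               # 0.91   (group B alone)
```

The headline number is dominated by whichever group the test set contains most of, which is exactly why per-group evaluation matters.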
Re: (Score:2)
I've been against facial recognition since before the movie Minority Report. I have never had a Facebook account, and I have made sure my wife crops me out of anything she ever uploads; at no time is my image tagged or given any other identity reference should I appear in the background of a picture. What I do not understand is all these anti-maskers. Masks are really screwing up the facial recognition systems, except for the dumbasses that are literally taking selfies of themselves wearing a mask
Re: (Score:2)
> "Apparent" because this is something people have noticed and needs investigation.
Maybe they should investigate and find out what's going on *before* writing the story? Would it really take that long to figure out what input is going into the algorithm? A simple code dive like that should take less than a day, so why didn't they?
Re: Its abit strange. (Score:2)
Re: (Score:2)
Where did you get "racist devs" from?
The problem is incompetence and lack of proper testing.
Re: (Score:2)
That would make it a technical limitation and not a racist bias. TFS states "Racial bias" in the title. I don't think RGB (or CMYK or whatever kids use these days) is very racist, but perhaps I'm just blind to the plight of color schemes these days.
But racial bias here doesn't mean "racist." It simply means the algorithm has a bias, one that happens to involve how it perceives skin color and how it uses that to make decisions. It doesn't mean the algorithm was implemented that way on purpose, but that it was simply modeled with input that for whatever reason led to this behavior (the data-feeding process itself not necessarily being ZOMG racist).
The algorithm could easily have gone in some other direction: gender bias, or age bias. There's nothing sinister
Re: (Score:2)
There are people arguing that the algorithm is racist just a bit further down, though.
Re:Its abit strange. (Score:5, Interesting)
As a photographer and programmer...
White faces are typically defined by their shadows. Therefore you light white people so that the shadows on their face fall appropriately, because it's going to be the shadows that your brain uses to define their facial structure.
Black faces are typically defined by their highlights. Therefore you light black faces so that the highlights on their face fall appropriately, because it's going to be the highlights that your brain will use to define their facial structure.
You can blast a black person with light and light their face the same as a white person. It will work, but it will look more like a flash-blasted sports/documentary type of look, and it won't necessarily look natural and that's not the way we see them out and about in normal life.
You can also light a white person like a black person too, but the result will be a deliberately "dark" portrayal of the person and again it won't look natural or the way we see them in normal life. This is actually a mature technique for shooting something in broad daylight, while making it look like it was shot at night. Many nighttime scenes from classic hollywood were shot during the day actually, like many of the scenes in Psycho or Jaws for example. This is not going to look normal and neither is over-lighting a black person.
It's all about contrast either way, but whether the algorithm is programmed, or trained, or a combination of both, we should expect that a facial recognition algorithm would need to use really different techniques, and/or be trained on completely different datasets, in order to detect black faces as well as white ones. In normal lighting situations, the physical light patterns will be more or less totally different. If you train with a dataset of mostly white people, I would expect the resulting algorithm to be bad at detecting black people, which is really just common sense, isn't it?
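The lighting point above can be put in numbers. A rough sketch using Michelson contrast, one common contrast measure; the luminance values are invented for illustration and assume the same dim background for both subjects:

```python
def michelson_contrast(l_face, l_background):
    """Michelson contrast between two luminance values (0-255 scale)."""
    hi, lo = max(l_face, l_background), min(l_face, l_background)
    return (hi - lo) / (hi + lo)

# Hypothetical luminance values against the same background (60):
print(round(michelson_contrast(200, 60), 2))  # 0.54  light skin tone
print(round(michelson_contrast(80, 60), 2))   # 0.14  dark skin tone
```

With identical lighting, the darker subject presents a fraction of the contrast signal, which is consistent with the photographer's point that the two cases need different lighting (or, for a detector, different features or training data).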
Re: (Score:2)
Thank you for giving a much more comprehensive answer to this than I did. All I'd add is that an algorithm doesn't have to actually light the people in a photo properly, something you clearly know how to do.
Re: (Score:2)
Thanks, I'd mod you up if I had points. So it sounds like to make it work you would first have to identify which algorithm to run on a face (for dark vs. light skin), so you would need at least 3 algorithms (1 to choose which of the other 2 to use).
Re: Its abit strange. (Score:2)
Re: (Score:2)
No, it is racist no matter what. Take, for example, facial recognition not recognizing black people as well: racist, right? What if someone made a facial recognition system that recognized black people better than white people? That would be racist too: police are targeting black people. I think it is a fundamental problem with the way people think; we don't use the facts to come to a conclusion, we have a view of the world and arrange the facts to match our view of the world. So if you think the world is racist you
Re: Its abit strange. (Score:1)
Re: (Score:2)
Well, the first problem is that 'race' is a construct and a term that needs to be dropped. 'Race' doesn't exist, because 'race' implies a culture as well as physical characteristics. 'Racism' is actually caused by multiple effects and then usually reinforced by social norms etc. Some we can fix, some we can't.
As you have pointed out, the human mind/eye is automatically drawn to that which moves and that which is different. It is a basic survival mechanism and the things that are changing are most likely to
Too smart is dumb (Score:3)
Twitter trying to find what parts of a photo I'm most interested in seeing as a thumbnail seems to be a problem too hard to solve.
Their attempt at focusing at faces may have seemed like a good idea, but some faces are easier to detect than others, depending on how you've trained the algorithm.
Twitter would be better off scaling the whole image down to an actual thumbnail, and stop trying to guess what I want to see.
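The "just scale the whole image down" suggestion is easy to state precisely: fit the image inside a thumbnail box while preserving aspect ratio, so nothing is cropped out. A minimal sketch (the 150-pixel box size is an arbitrary choice):

```python
def thumbnail_size(width, height, max_side=150):
    """Fit the whole image inside a max_side x max_side box,
    preserving aspect ratio, so no part of the image is cropped."""
    scale = max_side / max(width, height)
    if scale >= 1:          # already small enough; leave it alone
        return width, height
    return round(width * scale), round(height * scale)

print(thumbnail_size(1200, 800))   # (150, 100)
print(thumbnail_size(600, 1500))   # (60, 150)
```

The trade-off is that faces become tiny in tall or wide images, which is presumably why Twitter tried saliency cropping in the first place.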
Re: (Score:2)
Re: (Score:2)
Congratulations, you made an observation. Now go and look to see why people concerned by social hierarchy and the stratification of classes in our society have drawn those seemingly arbitrary delineations and maybe you can turn that observation into insight.
Re: (Score:2)
apparently everything old becomes new again at some point.
colored => bad!
PoC => good!
segregation => bad
building wakanda in rural GA => good (of course if whites tried this there would be a joint ATF/FBI raid within a week.)
All them Whities in San Fran are racially biased (Score:1)
Re: Photons are systematically racist (Score:2)
Contrast differences (Score:2)
When your algorithms look at points of contrast, any picture with less contrast will be harder to classify in detail.
People with darker complexions naturally have less contrast, so they're harder to classify. This is not racist - it is, as someone else pointed out, a property of light reflecting off surfaces.
What might be racist is asking a computer to classify things for you, since it isn't going to be affected by prejudice that says you must get a certain outcome, regardless of facts.
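A toy illustration of the contrast point above: many classical detectors lean on local gradient (edge) strength, and compressing the luminance range shrinks those gradients proportionally. This is a synthetic sketch, not Twitter's algorithm:

```python
import numpy as np

def edge_energy(img):
    """Mean absolute horizontal + vertical gradient: a crude stand-in
    for the local-contrast features many detectors rely on."""
    gx = np.abs(np.diff(img.astype(float), axis=1)).mean()
    gy = np.abs(np.diff(img.astype(float), axis=0)).mean()
    return gx + gy

rng = np.random.default_rng(0)
base = rng.random((32, 32))
high_contrast = base * 200 + 30   # features span a wide luminance range
low_contrast  = base * 40 + 30    # same features, compressed range

print(edge_energy(high_contrast) > edge_energy(low_contrast))  # True
```

Same "face," same structure; only the dynamic range differs, and the contrast-based signal falls in direct proportion.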
Re: (Score:2)
"What might be racist is asking a computer to classify things for you, since it isn't going to be affected by prejudice that says you must get a certain outcome, regardless of facts."
You were doing well until that part. If the goal is to detect faces and sometimes it doesn't because a face has dark skin, then it has failed, period. And if the way it is written causes it to demonstrate bias, then it is biased. It doesn't mean it's racist, even if it's racially biased, because it doesn't know anything about race.
Won't someone think of the CCDs? (Score:2)
How much of this has to do with the charge-coupled device sensors [youtu.be] in the first place? Wouldn't it at least partially have to do with biasing light-gathering towards a section of the total dynamic range, after which you'd probably be better able to tweak the algorithms that do their own raw luminance -> detection-biased luminance mapping prior to detection?
Racial bias or contrast bias? (Score:5, Insightful)
Re: (Score:3)
"Biased" simply means that the algorithm preferentially selects white faces as interesting over black faces.
The existence of bias has nothing to do with whether that was built into the code, or the algorithm learned it from scratch.
(It is almost certainly due to the fact that the algorithm was trained on more white faces than black faces. Is that bias? There are more white faces than black faces in America; that's just demographics. But if the result is preferentially selecting white over black, it is still bias, even if the bias is caused by demographics. By definition.)
Re: (Score:2)
Yes, but reading through the Twitter thread linked in the article, it seems there is an assumption of preference without actually demonstrating it.
Let's say you have 100 pictures, 20 of black people and 80 of white people. If the first step of your algorithm is to randomly select 10 pictures, the uniformly sampled random result (i.e., unbiased) would have 2 black people and 8 white people. In reality, a single sample may not have these exact proportions. For example, with 1 sample, you may get 1 black person
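The sampling argument above is just the binomial distribution. A quick check, using the same hypothetical 20/80 split:

```python
from math import comb

def binom_pmf(k, n, p):
    """Probability of exactly k successes in n draws with success rate p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Population: 20% black, 80% white; draw 10 photos uniformly at random.
p_exact = binom_pmf(2, 10, 0.2)   # probability of the "expected" 2/8 split
print(round(p_exact, 3))          # 0.302
print(round(1 - p_exact, 3))      # 0.698
```

So even a perfectly unbiased sampler misses the exact 2/8 split about 70% of the time, which is why a handful of anecdotal crops can't establish bias on their own; you need many trials and a significance test.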
Re: (Score:2)
I think you're missing the subject here. The twitter algorithm is taking images that have a white person and a black person in them, and cropping them to show just the white person.
You say it takes something more to determine if the algorithm has bias, but this is the very definition of bias.
Re: (Score:2)
It is cropping them because it is trying to display a larger image on devices with smaller screens (the problem seems most apparent on mobile devices). It could just grab a random part of the image to crop, which is what a lot of these viewers do, but Twitter was trying to be clever by selecting the "most important" part of the image to crop. Notwithstanding the obvious subjectivity of "importance", clearly this is a difficult problem.
One way to approach it is to think about it as a semantic segmentation task
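A minimal sketch of saliency-based cropping as described above: given a saliency map (however it was produced), slide a crop window across it and keep the placement with the highest total saliency. This is illustrative only, not Twitter's implementation:

```python
import numpy as np

def best_crop(saliency, crop_h, crop_w):
    """Return (row, col) of the crop window with the highest summed
    saliency. Brute-force sliding window; fine for small maps."""
    H, W = saliency.shape
    best, best_rc = -np.inf, (0, 0)
    for r in range(H - crop_h + 1):
        for c in range(W - crop_w + 1):
            s = saliency[r:r + crop_h, c:c + crop_w].sum()
            if s > best:
                best, best_rc = s, (r, c)
    return best_rc

# Toy 6x6 saliency map with a hot spot in the lower-right corner:
sal = np.zeros((6, 6))
sal[4:6, 4:6] = 1.0
print(best_crop(sal, 3, 3))   # (3, 3): the window covering the hot spot
```

Note that whatever produces the saliency map determines everything downstream: if it systematically scores one group's faces lower, the crop will systematically exclude them.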
Re: (Score:2)
(It is almost certainly due to the fact that the algorithm was trained on more white faces than black faces. Is that bias? There are more white faces than black faces in America, that's just demographics. But if the result is preferentially selecting white over black, it is still bias, even if the bias is caused by demographics. By definition.)
You make a reasonable point, but I wonder if you're too kind about the demographics. When I'm testing a function, I focus my effort on what's likely to cause my system to fail, not on what's most common. I think a lot of programmers do. Saying that you're going to train on a dataset with 90% white faces because that's the demographics of the target population is like saying you're going to test your division function with 1-9 because 0 only happens 10% of the time.
Re: (Score:2)
Which is it? This is a site for nerds, so let's talk about the actual technical problem, not society's problem: things such as algorithms having a hard time with low-contrast images, poor training of AI models, etc. If there is actual racial bias, let's talk about whether it is because of intentional bias, unconscious bias, or just technical momentum (biased or not) because whoever trained the model happened to have access to more Caucasian images than images showing people of color. I would be interested to know if the algorithm is actually biased, as the headline suggests (meaning some code specifically makes a racial judgement), or if the process leading to the creation of the algorithm was influenced by bias (meaning that the headline is not technically accurate), or if there isn't actually any bias.
Bro, chill the fuck out. It is a legitimately technical discussion to discuss "bias" in an AI algorithm, which could be "racial bias," which is distinct from "racist bias."
If there's someone who is technically incapable of distinguishing between these terms, that's you, so just stop telling the rest of us what is technical and what is not.
Or just keep ranting at the wind, going off on a tangent, if it makes you feel better; I am not judging.
Re: (Score:2)
Unfortunately, as with many articles here, the article is so low on useful information as to make such discussions little more than speculative. If the algorithm is selecting 'white' faces as 'interesting', as seems to be suggested, where did it get the definition of interesting? I mean, if the majority of the people looking are white, and most of them are more interested in seeing family or people who look like themselves than not, it may simply be performing as designed and giving 'the majority' of users
Re: (Score:2)
So much easier to just cry "RACISM" and let some angry people burn shit down.
Nobody reads to the 2nd tier comments anyway.
Light reflectivity is not an excuse (Score:3, Insightful)
Look, if the software was developed by black scientists, it would not have been released because it failed to identify black faces.
The racism is not in the failure to identify black faces as easily as white because of light reflectivity. Instead it is a) not testing the software enough and b) using a software that clearly fails when used on black faces.
Fix the software, then implement it, not the other way around.
Re: (Score:2)
Look, if the software was developed by black scientists, it would not have been released because it failed to identify black faces.
The racism is not in the failure to identify black faces as easily as white because of light reflectivity. Instead it is a) not testing the software enough and b) using a software that clearly fails when used on black faces.
Fix the software, then implement it, not the other way around.
Why would this be racist? A "racial bias" can occur in a system without requiring racism among the implementers. You are absolutely right that if this had been developed by black scientists, this problem would not have occurred, but we can be almost sure that it would have been biased in another way. This industry is in its infancy.
Why there were no black scientists or testers involved, that's a good question. And for all we know, there might have been; just because a test cycle doesn't detect a bug (in particular
Re: (Score:2)
The racism is not in the failure to identify black faces as easily as white because of light reflectivity. Instead it is a) not testing the software enough and b) using a software that clearly fails when used on black faces.
That isn't racism. Racism is hating people based on their race. I'm not sure where the notion comes from that anything having to do with race is racist, but that's not how it works. Only intentional malice is actually bigotry. Everything else is usually accident, lack of awareness, ignorance, or at worst negligence.
Just because some programmers failed to account for this in training their models doesn't make them racist.
Re: Light reflectivity is not an excuse (Score:2)
Re: (Score:2)
So if it works for 90% of the use cases it isn't good enough, because it makes other people feel bad? I wish someone would send Microsoft the memo; I think most of the time they figure 80% success on all use cases is really good testing.
Re: (Score:2)
It does not fail to identify black faces. The claim is that it chooses white faces over black faces when asked to identify the "important" part of the picture. I say "claim" because it is stated, not proven. It's easy to say they "just didn't test it enough", but assigning a category such as importance is much harder than simply identifying it. There is likely to be a high error rate and a significant number of exception cases. The algorithm may in fact be biased, but we can't just assume that, not if you w
Re: (Score:2)
Black people in "More black-coloured" shock (Score:1)
There is horrible racism in America and other countries. Relative reflectivity is not part of that problem.
Post this stuff on WhineDot: News for social scientists. Stuff that we can make matter.
I guess the racial bias (Score:2)
...is the contrast setting.
And we're just going to ignore... (Score:2, Troll)
...Google image search biasing the other way [battleswarmblog.com]?
Social Justice is a totalitarian, anti-rational ideology [battleswarmblog.com].
Yeah, there's racial bias (Score:5, Interesting)
I love the way Slashdot has suddenly become the home of dozens of expert photographers, many of whom are throwing around terms like "relative reflectivity" and "contrast settings" and "luminance", to prove this whole kerfuffle is nothing more than SJWs getting their knickers in a knot for no reason.
Well, I've worked as a professional photographer, and I can tell you one thing: if this algorithm is actually making poor cropping choices based on skin colour, it's because whoever wrote it was too lazy or too stupid to ask a pro how we somehow magically manage to create group photos that with minimal correction or none at all make people with a wide variety of skin tones look pretty much the same. And by "the same", I mean we don't produce results where black people look fine while white people look washed out and flat, or white people look fine while black people look like featureless shadows with eyes.
Re: (Score:1)
Interesting. So, as a professional photographer, do you have an algorithm that just magically does your job for you no matter what the starting image looks like? Or do you have to manually touch up each image, applying different filters as necessary to achieve the result you are looking for? Can you work your magic on a low resolution out of focus compressed JPEG, or do you typically start from a high quality RAW image?
Before criticizing the programmer, maybe take some time to appreciate the difficulty of y
Re: (Score:2)
I love the way Slashdot has suddenly become the home of dozens of expert photographers, many of whom are throwing around terms like "relative reflectivity" and "contrast settings" and "luminance", to prove this whole kerfuffle is nothing more than SJWs getting their knickers in a knot for no reason.
Well, I've worked as a professional photographer, and I can tell you one thing: if this algorithm is actually making poor cropping choices based on skin colour, it's because whoever wrote it was too lazy or too stupid to ask a pro how we somehow magically manage to create group photos that with minimal correction or none at all make people with a wide variety of skin tones look pretty much the same. And by "the same", I mean we don't produce results where black people look fine while white people look washed out and flat, or white people look fine while black people look like featureless shadows with eyes.
That isn't apples to apples, and I feel you're missing the point; or at least, if you do get it, you don't indicate it. FWIW, I totally get what you are saying, and I've taken plenty of photos of white people in dark clothing/dark hair against dark backgrounds, and of very dark-skin-tone people in pale clothing against dark backgrounds, yadda yadda, all without blown highlights or crushed blacks. I am not a pro, as I do it as a hobby and not for a living for various reasons; however, I'm a bit more able than most
When all you've got is a hammer... (Score:1)
Just physics? (Score:2)
Re: (Score:3, Insightful)
forgive me, but why wouldn't he be down-modded? he added nothing to the conversation and is only trying to incite the trolls. good on the mods that downvoted him - me being one of them.
Re: (Score:1, Flamebait)
For example— the swastikas below your comment have been up for 10 minutes and are at score 0. My comment was downvoted twice within 2 minutes.
Re: (Score:3)
However, your 2 previous comments are still at 1. You posted first, so that post was the most prominent; Slashdot hasn't got lots of active users, so moderation is kind of slow. The swastikas I have seen are always modded to -1; I don't know how long it takes on average to get there, but they get there. I would agree with the moderation of your post as well: it did nothing to encourage debate, all it did was try to pre-insult anyone with an opinion not matching yours. This stifles debate, not encourages it.
Re: (Score:2)
They're at 1 because a) one or more people came in and upvoted them very recently, and b) my karma is excellent so I get extra points.
My initial post was modded to -1 in 2 minutes and the swastikas were unmodded for 20.
"But dismissing everybody else's opinion that disagrees with you because you are good and they are evil racists, is not productive."
That is a completely disingenuous way to describe what I did.
Re: (Score:2)
Did I say there was a conspiracy? I was comparing the demonstrated priorities of people who modded down my post twice, instantly, while ignoring the swastikas for nearly 20 minutes. I said as much multiple times. If you need more information, AC, reread this comment.
Re: (Score:2)
Gotta be quick on the draw to get in before the rampant empty whataboutism and "well actually, it's not racist if (a whole bunch of ill-conceived bullshit)"
Re: (Score:1)
That people aren't willing to spend their mod points downvoting a 10-post-long banner of swastikas, but are willing to spend their mod points downvoting calling out racists, is the only evidence needed to prove my point. No matter what sort of "this is beneath refuting" justification you pull out of your ass, it's a clear show of priorities.
Re: (Score:2)
That you are using anything from your experience with Frist Psot, moderation, and swastika posts on /. as evidence for anything resembling a rational conclusion means you cannot be trusted with conclusions, presuppositions, and rationality.
Re: (Score:2)
Time to empty your poop sock.
Re: (Score:1)
* a dick to nazis
Re: (Score:2)
If you're offended about me saying impolite things about nazis I couldn't care less. I'm far beyond the age of needing to tickle someone's nuts because they're more focused on decorum than the actual problem.
Re: (Score:1)
Re: (Score:1)
But why is it that leftist institutions are such actual cesspools of racism?
Well, once one pressure-hoses away the political venom that drips off your post, it basically boils down to business value. When people are being cropped out of photos because of some badly trained AI, it's (a) plain annoying and (b) a crappy product. When people are being cropped out of video conferencing calls it's even worse, because the whole point of video conferencing software is to see other people while you talk to them:
https://twitter.com/colinmadla... [twitter.com]
When people are being denied healthcare because
Re: (Score:1)
Re: (Score:2)
Have any specific examples? I imagine that reasons for racism on the left are just as varied as they are on the right.
The alt-right on the other hand is pretty uniformly paranoid, dickless, wannabe third reich losers who blame their lack of personal success on people in a lower societal position than them. That would be "ha ha" sad if it wasn't so "uh oh" sad, considering the body count of right-wing terrorists in this country.
Re: (Score:2)
> Have any specific examples? I imagine that reasons for racism on the left are just as varied as they are on the right.
All sad and inexcusable, mind you, but it's not like there's one pathetic mental malfunction that drives it.