
Twitter Wants To Tackle Its Biased Image Cropping Problem By Giving Users More Control (thenextweb.com) 31

An anonymous reader shares a report: Last month, a number of users tweeted about how Twitter's image cropping algorithm seems to be biased towards fair-skinned people. When users posted uncropped images containing both light- and dark-skinned people, the social network's algorithm often showed the light-skinned person in the preview. At the time, Twitter said that while its algorithm had been tested for bias, it would conduct further investigations to resolve the issue. Last night, in an update, the company said it's planning to give users more control over how the final image will look.

Comments Filter:
  • Re: (Score:2, Insightful)

    Comment removed based on user account deletion
    • Re:Not a bias (Score:4, Insightful)

      by ZackSchil ( 560462 ) on Friday October 02, 2020 @11:42AM (#60565314)

      I am not sure what you're getting at. It seems like the AI in all your examples made actual, real mistakes. They were mistakes that seemed especially bad because of historical racism but they were all mistakes nonetheless, and it ought to be retrained not to do those things. I think you know why putting images of gorillas on news stories about human criminals is not correct, or why not identifying black pedestrians is a problem, so let's look at the other two.

      In the case of predicting crime in minority neighborhoods, presumably this is a network that was supposed to guide police to where they should increase enforcement. But high reported crime rates in minority neighborhoods are partly due to already high enforcement. This is classic sampling bias. Using enforcement level to guide enforcement level creates a feedback loop that hampers the goal of the AI. You want to adjust for the current level of enforcement, at which point the statistical bias goes away and you get a more useful system.
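      The feedback loop described above is easy to demonstrate with made-up numbers. This is a hedged sketch, not real crime data; the rates and patrol shares are purely illustrative:

```python
# Illustrative sketch of the sampling-bias feedback loop (made-up numbers).
# Two areas have the SAME true crime rate, but area 0 starts with more
# patrols; reported crime scales with patrol presence, and patrols are
# then re-allocated by reported crime.
true_rate = [1.0, 1.0]   # identical underlying rates
patrols = [0.7, 0.3]     # historical imbalance in enforcement

for _ in range(20):
    reported = [p * r for p, r in zip(patrols, true_rate)]  # sampling bias
    total = sum(reported)
    patrols = [rep / total for rep in reported]  # "go where the crime is"

# The imbalance never corrects: the model keeps "confirming" that area 0
# has more crime, purely from enforcement history.
print([round(p, 3) for p in patrols])   # [0.7, 0.3]

# Adjusting for enforcement (reports per unit of patrol presence)
# recovers the true, equal rates:
patrols = [0.7, 0.3]
reported = [p * r for p, r in zip(patrols, true_rate)]
adjusted = [rep / p for rep, p in zip(reported, patrols)]
print(adjusted)                          # [1.0, 1.0]
```

      The naive allocation is a fixed point of the loop: equal true rates can never undo an unequal starting allocation, which is exactly the hazard the comment describes.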

      In the case of mistaking black women for black men, perhaps identifying gender by appearance is ill-posed and we should reconsider the utility of such an algorithm.

      • Re: (Score:3, Insightful)

        Comment removed based on user account deletion
        • by Twinbee ( 767046 )
          It sounds a bit strange, but I'm guessing some people would like all men and women to look alike, so we become a single amorphous mass without any distinguishing features. I would find that kind of world bland, for the same reason I enjoy colour instead of greyscale video.
        • by iNaya ( 1049686 )

          It's kind of laughable that you think black people resemble gorillas, unless the only thing you're judging by is skin colour, and most 'black' people aren't really all that black anyway. I guess by that logic they also resemble black cats.

          Gorillas have a lot of fur, and a gorilla's face is a very different shape from a human's. I mean, look at the muzzles on those things! I wouldn't call those minute characteristics.

          As far as the crime stuff goes, yeah, it makes sense; but your claim that black people resemble gorillas is really absurd.

        • by Ly4 ( 2353328 )

          If it sends cops to minority neighborhoods ...

          then the police will find more crime there, because that's where they are looking, and the cycle will be reinforced.

  • No (Score:4, Informative)

    by nospam007 ( 722110 ) * on Friday October 02, 2020 @11:29AM (#60565276)

    "seems to have a bias towards fair-skinned people."

    No, it's a contrast bias.

    Everybody who has ever tried to photograph a black cat knows what I mean.

    No need to search for racism, it's very easy to find.

    • Some of this is spillover bias from photography, both the historical chemical process and the modern sensors and algorithms in smartphones. Both are tuned to produce nice exposures of lighter skinned people, while darker skinned people are not given as much consideration. This is baked right into the gamma curves, the auto-exposure settings, the focus algorithms, and even the gain of the sensor.

      You could argue given the average skintone among human photography subjects, this tuning makes sense, but it does leave some people out.

      • "You could argue given the average skintone among human photography subjects, this tuning makes sense, but it does leave some people out."

        Sure, but these rules and algorithms were created during the time my racist uncle was created.
        It's not a problem dating from today.

      • It's not even so much "consideration" as two more fundamental issues. We put the spotlight on someone to draw attention to them because the eye is naturally drawn to the brightest area.

        For most people, those from North America, South America, Europe, Asia, and the Middle East, it's pretty easy to put some light on them and have the subject brighter than the background. You photograph to get the subject which is directly lit, not the slightly darker background.

        With a very dark skin tone, the subject may be

        • by AmiMoJo ( 196126 )

          Have a look at some early colour movies. The skin tones look really fake, and that's with make up designed to make them more realistic.

          Over time the chemistry improved, lighting improved, make up improved. Modern cameras, especially phone ones, are tuned to produce good skin tones. Even so it rarely looks right on TV; we are just used to it, same as we don't blink at photoshopped magazines.

          Anyway, the point is that with some work we can improve the photography of dark skin tones a lot.

          • I understand why you want to think that.

            The fact is, there are just a lot more shades of color between 20-180 than there are between 20-40.

            Magicians use the fact that you can't tell the shape of a black surface. They open the front door of a box and you can see inside, seeing that the box is empty (and painted black). What you don't know is that you aren't seeing the inside of the box at all; you're seeing a black surface mounted right behind the door.

            https://www.amazon.com/Black-A... [amazon.com]

            It's just a fact that you can't

            • by AmiMoJo ( 196126 )

              Right, but black skin is not a matte black surface designed to fool the eye.

              A good recent example of this is "Father" from Raised by Wolves. He has pretty dark skin but plenty of detail is visible on screen.

              You talk about luminance values but with no scale. The scale depends on the exposure. And besides, even when the image is very poor, humans can recognize a face; we are just bad at training AI to do the same.

              • > You talk about luminance values but with no scale. The scale depends on the exposure.

                The scale is 0-255, Ami. With your sig I thought you'd know that.

                Sure you can redefine your values, doubling all brightness levels such that anything brighter than half intensity is rendered as full bright white. Then you lose 63% of the scene, because you've made most of the scene nothing but white. Not the best first step when you're trying to figure out what's in the scene - to throw most of it away.

                It's just scie

                • I guess I should say using a dark background isn't the only option if you know you're setting up for a dark subject. You can also use a plain single-color background, so there is no detail to be lost. You can use short depth of focus so the background is blurred and it doesn't matter much that it's also blown out. There are different options, but regardless you lose detail on lighter objects when you set the exposure to get the details of dark objects.

                • by AmiMoJo ( 196126 )

                  0-255 is the output V value. When you capture an image you have to set the exposure to convert the huge dynamic range of the real world to a 0-255 scale. To put it another way: how many photons does a value of 5 equal?

                  • And there's pretty much one useful way to map that, given you don't know what's in the scene: the brightest areas are white, the darkest black, log2(ls/k). Anything else is throwing away information.

                    What else are you going to do, make 40% of the scene bright white, 30% pitch black, and use all of the other values to see only 30% of the scene? I would suggest that throwing away 70% of the information isn't a good way to start trying to figure out what it represents.
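                    The mapping argument above can be sketched numerically. The luminance values here are arbitrary illustrative units, not real sensor data:

```python
import math

# Scene luminances span a huge range; an 8-bit image must squeeze them
# into 0-255. A log mapping between the darkest and brightest values
# keeps every region distinguishable; an exposure set for the shadows
# clips the top of the range to white.
scene = [10, 40, 160, 640, 2560, 10240]   # luminance, arbitrary units

lo, hi = math.log2(scene[0]), math.log2(scene[-1])
log_mapped = [round(255 * (math.log2(v) - lo) / (hi - lo)) for v in scene]
print(log_mapped)   # [0, 51, 102, 153, 204, 255] -- all six regions distinct

# Linear mapping exposed for the darker values: everything above 1280
# saturates, so the two brightest regions collapse into the same white.
clipped = [min(255, round(255 * v / 1280)) for v in scene]
print(clipped)      # [2, 8, 32, 128, 255, 255]
```

                    The clipped mapping gains shadow detail but merges the brightest regions, which is the trade-off being argued over in this thread.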

                    • by AmiMoJo ( 196126 )

                      Well there are a few ways you could do it, but the obvious one would be to do face detection on the image as-is and then crank the gamma up and do it again to see if you find any more faces. Or maybe just train your AI better; after all, humans can see black faces even in very poor lighting.
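                      The gamma-and-retry idea could look something like this sketch. The "detector" below is a hypothetical stand-in, not a real face detection API; a real pipeline would use something like a Haar cascade or a DNN and merge boxes by overlap:

```python
def gamma_adjust(pixels, gamma):
    """Map 0-255 values through a power curve; gamma < 1 lifts shadows."""
    return [round(255 * (v / 255) ** gamma) for v in pixels]

def two_pass_detect(pixels, detect_faces, gamma=0.5):
    """Detect on the image as-is, then again after lifting shadows."""
    first = detect_faces(pixels)
    second = detect_faces(gamma_adjust(pixels, gamma))
    return list(dict.fromkeys(first + second))  # merge, drop duplicates

# Toy stand-in "detector" that only fires on reasonably bright regions:
def toy_detector(pixels):
    return ["face"] if max(pixels) > 100 else []

dark_face = [20, 30, 40]                          # underexposed region
print(toy_detector(dark_face))                    # [] -- missed as-is
print(two_pass_detect(dark_face, toy_detector))   # ['face'] after gamma lift
```

                      With gamma 0.5, a dark value of 20 is lifted to about 71, which is the kind of shadow boost that can put a face back inside a detector's working range.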

                    • > Or maybe just train your AI better, after all humans can see black faces even in very poor lighting.

                      So can the AI. The complaint is that it can see white faces *better*, because they have more contrast, just like you and I and anything can. We see using light. Light absorbed can't be used for seeing.

                      It's the same as complaining that it finds things in daylight images better than nighttime images. Yep, that's how photography (and vision) works.

                      Let's say you have 4000 photons per unit area coming fr

                    • by AmiMoJo ( 196126 )

                      Right, it's just a bad algo, it doesn't weight black faces properly and then they select the one with the highest rating instead of saying "oh there are two here, better include them both".

                      That said, it seems like some of these systems can't see black faces at all sometimes. There was an example with Zoom where it removed the guy's entire head because it thought it was part of the background. It was really weird; the rest of his body was there, looked like a movie effect.

                    • I think we've passed the point of useful discussion, so thanks for the chat. Really. You're interesting to talk to even when I think your starting point is odd.

                      I suppose what it comes down to is: I think 800 > 400 is a mathematical fact; you think it's "a bad algorithm". Thanks for the conversation.

                    • by AmiMoJo ( 196126 )

                      To be honest I didn't find it that useful. You just kept repeating the same thing and ignoring the point I was making. Barely meets the classification of "discussion" according to my AI.

                    • > ignoring the point I was making

                      Certainly not intentionally. Perhaps it would be interesting to state your point in a single, clear sentence just because I'm curious.

    • by AmiMoJo ( 196126 )

      That's a ridiculously reductive thing to say. It's a face detection algorithm.

  • by _xeno_ ( 155264 ) on Friday October 02, 2020 @11:36AM (#60565298) Homepage Journal

    Oh, so it took reports of the cropping algorithm being biased to get them to finally give some control to the users like they should have done in the first place.

    The Twitter cropping algorithm has always been somewhat terrible. Considering that there's basically one aspect ratio it uses when cropping (2:1 landscape), you'd think they'd have been able to let users select that crop since day 1, especially since their basic "detect the most face-like thing in the picture" approach fails spectacularly for pictures that don't have people in them.

    Even in pictures with people, if the people aren't looking at the camera head-on, you can get somewhat hilariously bad crops, like the image cropping to someone's legs instead of their face.

    Letting the person posting the picture decide the crop is how this should have worked from the start.

    (It's also worth noting that it's less that it "picks the lighter skinned person" and more that it picks whatever face has the highest contrast ratio. It's unclear that the examples show Twitter's bias as much as they show photographers' biases.)
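    The "highest contrast wins" behaviour described above can be sketched as a toy saliency score. This is a guess at the behaviour for illustration, not Twitter's actual algorithm; the regions are just lists of 0-255 luminance samples:

```python
def contrast_score(region):
    """Score a candidate region by its tonal range (a crude contrast proxy)."""
    return max(region) - min(region)

def pick_crop(regions):
    """Pick the region with the highest contrast score to crop around."""
    return max(regions, key=lambda name: contrast_score(regions[name]))

faces = {
    "well-lit face":     [40, 120, 210, 250],  # strong highlights and shadows
    "underexposed face": [15, 30, 45, 60],     # compressed tonal range
}
print(pick_crop(faces))   # "well-lit face" -- higher contrast wins the crop
```

    Under a rule like this, the photographer's lighting decides the crop: whichever face got the better exposure wins, regardless of skin tone per se.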

  • There are significantly more fair-skinned people in the US than dark-skinned, something like 8x or so going by African-American population figures, especially in lilywhite Sillycon Valley. At a certain level of sophistication, the population gap is so large that the accuracy the model gains from looking for white skin outweighs what it loses by excluding dark skin. No racism needed.
  • by GuB-42 ( 2483988 ) on Friday October 02, 2020 @11:47AM (#60565338)

    People will find bias even when, on average, there isn't any. If I remember correctly, someone did a small study with various faces and found that the black person was selected slightly more often, but not in a statistically significant way.

    The test is a forced choice and Twitter's algorithm can't win. So how do you resolve the issue? Shift the burden of choice to the user.

  • Complete and perfect solution that makes everyone happy instantly:

    Don't automatically crop images.

    This gives everyone what they want, which is to post the image they posted instead of a different image.

  • Twitter's image cropping algorithm seems to have a bias towards fair-skinned people

    [ Taking a cue from the show Better Off Ted [wikipedia.org], season 1, episode 4, Racial Sensitivity [fandom.com] ... ]

    Dear Twitter Users,

    The Twitter cropping algorithm works by detecting the light reflecting off people's skin and has a problem with darker skin. We are working to correct this, but we would "like everyone to celebrate the fact that it sees Hispanics, Asians, Pacific Islanders, and Jews."

    While we work on the issue, Twitter encourages darker-skinned people to simply stand next to lighter-skinned people in their ph
