Google Apologises For Photos App's Racist Blunder

Mark Wilson writes: Google has issued an apology after the automatic tagging feature of its Photos app labeled a black couple as "gorillas". This is not the first time an algorithm has been found to have caused racial upset. Earlier in the year, Flickr came under fire after its system tagged images of concentration camps as sports venues and black people as apes. The company was criticized on social networks after a New York software developer questioned the efficacy of Google's algorithm. Accused of racism, Google said that it was "appalled" by what had happened, branding it "100% not OK".
This discussion has been archived. No new comments can be posted.

  • by CajunArson ( 465943 ) on Wednesday July 01, 2015 @02:20PM (#50027599) Journal

    Anything that's politically incorrect will be blacklisted from being labeled as a result. No more gorillas or any other of a million and one potentially offensive labels!

    Although gorillas might be labeled as people, which would actually make some SJWs happy.

  • by Anonymous Coward on Wednesday July 01, 2015 @02:22PM (#50027611)

    Algorithms aren't racist, and teaching a computer to visually recognize objects is hard. Move along.

    • that's right (Score:4, Informative)

      by Anonymous Coward on Wednesday July 01, 2015 @02:37PM (#50027759)

      And as Richard Dawkins has said, we ARE apes - all of us humans.

      • Re:that's right (Score:5, Informative)

        by magarity ( 164372 ) on Wednesday July 01, 2015 @02:47PM (#50027853)

        "Ape" is a general language term that can be used by different people to specify quite different groups of creatures; it's more correct to say we're all hominids.

        • That's probably what the algorithm should return, just to avoid the possibility of offending anyone. Just label any photos of intelligent hirsute bipedal mammals as "Hominids" and call it a day.

          Who knows, maybe the term will even catch on in the larger culture. "Machine learning" doesn't mean we can't learn from our machines.

        • Re:that's right (Score:5, Insightful)

          by Ramze ( 640788 ) on Wednesday July 01, 2015 @04:23PM (#50028525)

          No, "Ape" is a very specific term used to specify members of Hominoidea. It is unfortunate many are ignorant of the meaning of the term and use it improperly to include monkeys.

          https://en.wikipedia.org/wiki/... [wikipedia.org]

          Humans are apes - specifically, great apes (aka Hominidae, aka "hominids"). "Hominid" simply means human-like. It used to mean only humans, then it included other extinct human-like creatures, and now it generally includes all Hominidae. While "hominid" (or alternatively "great ape") is a more specific term, it is certainly NOT a more correct term, merely the family within the superfamily.

          One could say that humans are mammals and it would be no less correct. Humans are animals, chordates, mammals, primates, apes, and also great apes.

          It's unfortunate that the Google facial recognition software was not aware that humans don't like being reminded that they are indeed very closely related to other great apes and could easily be confused with gorillas by a non-human intelligence. Our indignation at the notion we're apes that look a lot like gorillas is rather silly -- like zebras being offended at being miscategorized as ordinary horses.

          Granted, I understand the racist implication that those flagged erroneously as gorillas are somehow less human than others. Thankfully, the computer isn't racist. It merely wasn't sophisticated enough to discern the difference given the input, the algorithm, and its training.

          I'm impressed it figured out the object in the photo was a living thing and got the kingdom, phylum, class, order, superfamily, family and sub-family correct. If it had chosen chimp or bonobo, it would have been even closer.

          Heck, check out this comparison of a gorilla baby and a human baby -- no one would have blinked an eye if the software said the gorilla was a human baby.
          http://intentblog.com/wp-conte... [intentblog.com]

          Another cute gorilla baby -- a bit older:
          http://www.ctvnews.ca/polopoly... [ctvnews.ca]

        • it's more correct to say we're all hominids.

          Now we're going from racism to homophobia. Thanks Slashdot!

      • Re: (Score:3, Informative)

        by faway ( 4112407 )
        Richard Dawkins is a biologist. he would never say something so stupid. we are all hominids, and we are certainly not apes.
        • Re:that's right (Score:5, Informative)

          by Barefoot Monkey ( 1657313 ) on Wednesday July 01, 2015 @06:19PM (#50029275)

          Richard Dawkins is a biologist. he would never say something so stupid.

          I'm curious what you feel is stupid about that straightforward statement. Regardless, Richard Dawkins did, in fact, say exactly that.
          Gaps in the Mind, by Richard Dawkins [animal-rig...ibrary.com]
          "We admit that we are like apes, but we seldom realise that we are apes."
          "In truth, not only are we apes, we are African apes. The category 'African apes', if you don't arbitrarily exclude humans, is a natural one"
          "'Great apes', too, is a natural category only so long as it includes humans. We are great apes."

          I did a search for the words "dawkins" and "ape" and the first result was a video of Dawkins saying that he is an ape [youtube.com]. I challenge you to find any living biologist that claims otherwise.

          we are all hominids, and we are certainly not apes.

          Gorillas are hominids, and all hominids are apes. Humans are apes and hominids, just like gorillas.

        • Some asshole must have changed wikipedia to make you wrong. It says:

          The Hominidae (/hɒˈmɪnɪdiː/), also known as great apes,[notes 1] or hominids, form a taxonomic family of primates, including four extant genera: orangutans (Pongo) with two species extant; gorillas (Gorilla) with two species; chimpanzees (Pan) with two species; and humans (Homo) with one species.[1]

          You'd better go in there and correct it to say humans are not apes.

    • by jellomizer ( 103300 ) on Wednesday July 01, 2015 @02:48PM (#50027865)

      I followed the link and looked at the photos. I could see how it would make that mistake.
      1. The color balance was off: what we call "black" skin is actually a rich brown, but the color balance shifted it toward a true black/gray, the natural color of a gorilla.

      2. The angle of the shot: the tilt makes it appear that they are not upright but slouching.

      3. They were making facial expressions unnatural for humans: funny faces at the camera.

      4. The dark shirt of the gentleman behind, combined with the lady's hairstyle, makes the body appear to have much broader shoulders.

      I expect a combination of factors produced the wrong choice. Computer decision making, while getting good, isn't perfect, but it is often better than not having it at all, because without it cataloging millions of images wouldn't be possible. We need to accept that computers make mistakes and have a way to fix them when they are found.

      Many of our derogatory comments come from finding similarities with something else, so it stands to reason that a computer may make an honest mistake that reinforces such a derogatory meaning.

    • Re: (Score:3, Insightful)

      by Fwipp ( 1473271 )

      But the people writing the algorithm and choosing the input data *can* be racist. And even in the absence of malice, you can create racist outcomes.

      If your training set has many photos of white people and few photos of black people, it's not going to be great at recognizing black people. If it doesn't know what black people look like, it's bound to misclassify them more often than white people.

      Anecdotally, I noticed that the Microsoft "how old are you" site a while back recognized me (a white person) in eve
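The skewed-training-data point above is easy to demonstrate with a toy sketch. Everything below is synthetic and invented (this is nothing like Google's actual pipeline): a simple 5-nearest-neighbour classifier trained on 500 examples of one class but only 5 of another misclassifies the under-represented class far more often, with no malice anywhere in the code.

```python
import random

random.seed(42)

def sample(cx, cy, n, spread=2.0):
    """Draw n 2-D feature vectors scattered around a class centre."""
    return [(cx + random.uniform(-spread, spread),
             cy + random.uniform(-spread, spread)) for _ in range(n)]

# Imbalanced training set: 500 examples of class "A", only 5 of class "B".
train = [(p, "A") for p in sample(0, 0, 500)] + \
        [(p, "B") for p in sample(1, 1, 5)]

def knn_classify(q, k=5):
    # Vote among the k nearest training points (squared Euclidean distance).
    nearest = sorted(train,
                     key=lambda t: (t[0][0] - q[0]) ** 2 + (t[0][1] - q[1]) ** 2)[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

# Balanced held-out sets drawn from the same two distributions.
test_a, test_b = sample(0, 0, 200), sample(1, 1, 200)
err_a = sum(knn_classify(q) != "A" for q in test_a) / 200
err_b = sum(knn_classify(q) != "B" for q in test_b) / 200
print(f"error on well-represented class A: {err_a:.2f}")
print(f"error on under-represented class B: {err_b:.2f}")
```

The minority class loses most votes simply because its region of feature space is flooded with majority-class training points; rebalancing the training data, not rewriting the voting rule, is the fix.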

      • But the people writing the algorithm and choosing the input data *can* be racist.

        Call the grand scrutinizer and ready the scourges.

        And even in the absence of malice, you can create racist outcomes.

        The universe didn't agree with a totalitarian's philosophy.

      • by chipschap ( 1444407 ) on Wednesday July 01, 2015 @03:29PM (#50028141)

        I find it hard to believe that there was racism intended in any way, shape, or form. It is unfortunate that this took place but Google certainly took care of the problem in short order, as is right.

        There are too many of the LBTO (looking to be offended) crowd these days. Come on, there are plenty of real problems with racism, there's no need to label inadvertent and unintentional things.

      • by lq_x_pl ( 822011 ) on Wednesday July 01, 2015 @03:46PM (#50028281)
    It isn't a racist outcome. It is the outcome of a flawed algorithm. One might even argue that wider testing (and improvement) is needed for the image sensors in computer-attached video equipment. If I search my own photo albums for "seal" or "dog" I get pictures of my kids in both. I don't believe the algorithm is impugning the humanity of my offspring, I just think it is far-from-perfect. The outcomes of my search aren't hateful. The outcomes of the picture labels in this story aren't racist.
        • by ScentCone ( 795499 ) on Wednesday July 01, 2015 @04:40PM (#50028621)

          It isn't a racist outcome. It is the outcome of a flawed algorithm.

          You're not paying attention. These days, outcomes that have nothing to do with intention, purpose, or simple transparent standards, but which happen to lean statistically towards results not in perfect balance with skin color as a function of population (though, only in one direction) ... the process must be considered racist. The whole "disparate impact" line of thinking is based on this. If you apply a standard (say, physical strength or attention to detail or quick problem solving, whatever) to people applying to work as, say, firefighters ... if (REGARDLESS of the mix of people who apply) you get more white people getting the jobs, then the standards must surely be racist, even if nobody can point to a single feature of those standards that can be identified as such. Outcomes now retro-actively re-invent the character of whoever sets a standard, and finds them to be a racist. Never mind that holding some particular group, based on their skin color, to some LOWER standard is actually racist, and incredibly condescending. But too bad: outcomes dictate racist-ness now, not policies, actions, purpose, motivation, or objective standards.

          So, yeah. The algorithm, without having a single "racist" feature to it, can still be considered racist. Because that pleases the Big SJW industry.

          It's the same thinking that says black people aren't smart enough to get a free photo ID from their state, and so laws requiring people to prove who they are when they're casting votes for the people who will govern all of us are, of course, labeled as racist by SJW's sitting in their Outrage Seminar meetings. It's hard to believe things have come that far, but they have.

          • Great post, you'd have a mod point from me if I hadn't already commented.

          • I don't disagree with the idea of preventing voter fraud nor government issued ID's as the tool to accomplish that task. That said it is pretty obvious that the main proponents of voter laws are Republicans because they know it will benefit them in elections, and the main opponents of voter laws are democrats because they know it will not benefit them in elections. Neither side cares about fairness, they only care about winning.

            If voter fraud were a big problem, I think the disparity in outcome would not

            • That said it is pretty obvious that the main proponents of voter laws are Republicans because they know it will benefit them in elections, and the main opponents of voter laws are democrats because they know it will not benefit them in elections.

              Backwards. The Republicans know that the biggest source of bogus voter registrations, and the areas with the largest number of actively dead registered voters and turnout at polling places where the number of votes exceeds the eligible population, are in places where Democrat activists work the hardest to hold on to power. It's not that knowing people who vote are voting legally and only once isn't going to benefit Democrats, it's that such a process is counter to what liberal activist groups work so hard

                If you're worried about people not knowing there's an election coming up, and not bothering to get an ID (really? you can't go to the doctor, fill a prescription, collect a welfare check, or much of ANYTHING else without already having an ID),

                I'm not worried because I'm not a democrat, and don't have any interest in helping democrats win elections.

                What I am saying is that voter fraud is currently not a problem. There just isn't a significant number of fraudulent votes. It is pretty clear that the problem is inflated by republican politicians and strategists to get voter ID laws passed to help them win elections.

                then why not encourage the Democrats to apply the same level of effort they put into the shady practices described above, and focus it instead on getting that rare person who never sees a doctor, never gets a prescription, collects no government benefits of any kind, doesn't work (but whom you seem to suggest none the less are a large voting block) and, with YEARS to work with between elections ... just getting them an ID?

                I'm not saying it's hard to get an ID. I'm saying that it will result in fewer votes for democrats, and everyone knows that. The republicans k

          • by dywolf ( 2673597 )

            the only person saying "black people aren't smart enough" is you.

            Over here we live in reality, and the reality is that getting one of those IDs requires taking time off from work that we frequently either don't get or can't afford to take. It also frequently requires traveling clear across town, when you don't have reliable personal transportation, and live in towns that lack adequate public transportation. Or dealing with the fact that some states (*cough*Georgia*) have been intentionally shutting down DMV'

              Over here we live in reality, and the reality is that getting one of those IDs requires taking time off from work that we frequently either don't get or can't afford to take

              Really. What sort of job do you have that didn't involve showing ID in order to submit the required federal tax forms as you were hired? What sort of paycheck are you getting that doesn't involve you using an ID in order to open a bank account or cash a check? Please be specific about the people who are working full time, so hard, that not once in their entire life can they be bothered to get a form of ID. And, out of curiosity, how on earth did they find time to go register to vote, or find time TO vote?

        • I don't believe the algorithm is impugning the humanity of my offspring, I just think it is far-from-perfect.

          But is the algorithm even wrong? I think the question to the Google recognizer is "of the images in my collection, which ones look most like a seal?" If the collection is mostly pictures of your kids, it'll show you the pictures of your kids that it thinks have the most in common with its idea of what seals look like. This isn't to make fun of your kids, of course, it's just it
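That ranking behaviour (returning whichever image is *closest* to the query concept, whether or not any true match exists) can be sketched with cosine similarity. The photo names, feature values, and the "seal" concept vector below are all invented for illustration:

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Invented 3-D "feature vectors" for a photo collection with no seals in it.
photos = {
    "kid_in_pool.jpg":  [0.8, 0.6, 0.1],
    "birthday.jpg":     [0.1, 0.9, 0.7],
    "beach_sunset.jpg": [0.4, 0.2, 0.9],
}

# Invented vector for the query concept "seal".
seal_concept = [0.9, 0.5, 0.2]

# Ranking always returns *something*, even though nothing here is a seal.
best = max(photos, key=lambda name: cosine(photos[name], seal_concept))
print(best)  # → kid_in_pool.jpg
```

A nearest-match search has no notion of "none of the above", so the least-bad match wins; real systems add a confidence threshold before attaching a label.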

      • But the people writing the algorithm and choosing the input data *can* be racist. And even in the absence of malice, you can create racist outcomes.

        This just in:

        Fwipp, who doesn't know shit about machine learning, has decided that deep convolutional networks can be cleverly programmed to be racist. Fwipp knows that he doesn't know shit about machine learning, but feels that his expertise in finding racist versions of both bubble sort and hello world qualifies him as an expert here.

        • To be fair, if your first C program is:

          #include <racist_version_of_stdio.h>

          int main(void)
          {
                  printf("Hello World, but only if you're white!\n");
                  return 0;
          }

          Then yeah helloworld.c can be pretty fucking racist.
    I can't help but wonder if this person (whom I haven't seen) actually did resemble a gorilla. Wouldn't be the first time I've seen such a thing.

      I suspect that if the person misidentified was white, this wouldn't be news however.

      • by Whiteox ( 919863 ) on Wednesday July 01, 2015 @03:00PM (#50027941) Journal

        OTOH If it was a pic of a gorilla but labelled 'Black Afro-American' then you would have the same issue.

      • I can't help but wonder if this person (which I haven't seen) actually did resemble a gorilla. Wouldn't be the first time I've seen such a thing.

        I suspect that if the person misidentified was white, this wouldn't be news however.

        Yeah, god forbid you actually glance at the fucking linked article. I'm not even expecting you to read it, just look at the pictures. I know delaying your insightful reply by 10 seconds would be torture, otherwise how could you proclaim your ignorance?

    • Algorithms aren't racist, and teaching a computer to visually recognize objects is hard. Move along.

      Oblig Better Off Ted [vimeo.com]

    • by johanw ( 1001493 ) on Wednesday July 01, 2015 @03:33PM (#50028183)

      The algorithm wasn't that far off. I'm sure the gorilla will get over it.

    • Algorithms are not created by humans? Computers do as they are told, using algorithms created by humans. Was the person who created this algorithm a racist? I doubt it very much... but you never really know, right?
    • Algorithms aren't racist

      FacialRegion *face = DetectFace(bmp);
      if (face != nullptr)
      {
              if (face->avg_col.r < 10 && face->avg_col.g < 10 && face->avg_col.b < 10)
              {
                      // Be racist
                      result->order = ordPrimate;
                      result->genus = genGorilla;
              }
              else
              {
  • by eyepeepackets ( 33477 ) on Wednesday July 01, 2015 @02:23PM (#50027623)

    So, do really pale "white" people get mis-labeled as ghosts? Inquiring minds are somewhat concerned because they are rather pale....

    • No, but it did tag Lena Dunham as a white manatee.

    • by eth1 ( 94901 ) on Wednesday July 01, 2015 @02:57PM (#50027929)

      So, do really pale "white" people get mis-labeled as ghosts? Inquiring minds are somewhat concerned because they are rather pale....

      One of the articles I saw about this mentioned that in the past, light-skinned people had been identified as dogs and seals. Strangely, there was no outrage about that.

      • by war4peace ( 1628283 ) on Wednesday July 01, 2015 @03:12PM (#50028019)

        Searched for "dog" in my Google Photos. 6 photos came up, all of my kids or kid and wife. I don't care. It's an algorithm.
        Searched for "seal" in my Google Photos. Only one came up, and it's of my elder kid. I don't care. It's an algorithm.
        People who feel "offended" by an algorithm are batshit crazy.

        • by fche ( 36607 )

          Thank you for having children, and presumably propagating your common sense.

        • People who feel "offended" by an algorithm are batshit crazy.

          No, for the most part they have no idea what an algorithm is. The deliberate "monkey" references used to refer to blacks touches a very painful part of many and, not even caring why, they are disturbed. They should not be, but they are.

          • Could you please elaborate on the "painful part" thing? Mind you, I'm Romanian and might not fully understand what's happening, but I interact with people from the States on a daily basis and have quite a few good friends there (Americans, that is). My conversations with them on the "blacks" subject prompted me to draw these conclusions (which might be correct or incorrect):
            - "Positive discrimination" is prevalent. Black people have grown to abuse it, hence "Because I'm black!" which is used as an a

      • Re: (Score:3, Insightful)

        by AmiMoJo ( 196126 )

        Historically racists have called black people apes and monkeys. Therefore this accidentally and somewhat embarrassingly mimics that behaviour.

        Historically white people were not, to my knowledge, insulted and discriminated against by being compared to seals and dogs. It's a bit more embarrassing to have women labeled as dogs because they are sometimes called bitches as an insult.

        It's really not hard to understand. Context and history attach additional meanings and sentiments to some words.

  • It's an algorithm (Score:5, Informative)

    by guruevi ( 827432 ) <evi@evc i r c u its.com> on Wednesday July 01, 2015 @02:24PM (#50027639) Homepage

    It's impressive that it can even recognize and classify things as such. Great apes and humans share about 99% of our DNA, any 'alien' entity would classify us amongst the apes.

    The fact that black people are black and thus have a closer resemblance to the generally 'darker' great apes is not racist, because an algorithm that is not programmed to have biases cannot be racist. It's just people's interpretation of the facts that makes things 'racist'. Superficially, black people and apes look mathematically more alike than white people and apes. If the thing was trained on albino apes (which do exist), white people would be considered apes AND NOBODY WOULD THINK IT WAS RACIST.

    • by imgod2u ( 812837 ) on Wednesday July 01, 2015 @02:34PM (#50027737) Homepage

      One could point out that there are fewer instances of white males being miscategorized. I suspect this has less to do with any actual racism and more to do with the fact that the people who developed the algorithm are likely predominantly white males and they tend to first test the algorithm on their own collection of photos or those in their circle.

      This is an argument for a more diverse workforce...

      • Re: (Score:3, Insightful)

        by Anonymous Coward

        It's an argument for more QA testing

      • by buybuydandavis ( 644487 ) on Wednesday July 01, 2015 @02:41PM (#50027797)

        One could point out that there are fewer instances of white males being miscategorized. I suspect this has less to do with any actual racism and more to do with the fact that the people who developed the algorithm are likely predominantly white males and they tend to first test the algorithm on their own collection of photos or those in their circle.

        This is an argument for a more diverse workforce...

        Yeah, because I bet that's how Google develops their image recognition algorithms - white guys walk around taking pictures of themselves.

        As is more likely the case, there are few pictures of albino gorillas in their machine-learning corpus (racist against albino gorillas!), and hence little data with which white folks might match gorillas on macro-level color characteristics.

      • by Copid ( 137416 )

        One could point out that there are fewer instances of white males being miscategorized.

        White males are just about the easiest faces to categorize. They tend to have short hair that doesn't obscure facial features or create oddball shapes that confuse the classifiers. Their skin tone makes photographing them, finding edges, and extracting features easier than it is with darker-skinned people. White people have a greater variety of eye colors that can be used to distinguish among them. "White guy face" is

      • This is an argument for a more diverse workforce...

        You're right, the best thing for black people would be to replace otherwise functional employees with people who didn't make the grade but happen to be black.

        I love the way you "Diversity, Diversity" types are actually incredibly condescending and racist. Perhaps that's why you types perceive "racism" in nearly everything.

    • by LWATCDR ( 28044 )

      Actually I am impressed that it did see how similar Apes and people are. Honestly people getting upset over it are just a bit silly. The problem is people think that someone put person_of_african_descent == ape in the code and that is not true. The algorithm just confused one great ape with an expressive face with another. It is no more racist or intentional than the same system confusing a Camaro with a Firebird.
       

    • It's just people's interpretation of the facts that makes things 'racist'.

      Fair enough, but

      Were it you and yours, you'd probably be more than just slightly (and equally justifiably) upset as well.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Furthermore, if a 4 year old had pointed out the same thing we'd regard it as incorrect and then calmly correct for the real interpretation. That's not racism, that's "learning". We should regard this engine as if it were a small child and see that it's "learning". The only analog of this response I can see is when we blame the parents for perceived racism. It's the drawback of learning on the world-stage, under the scrutiny of every man, woman, dog, goat and goldfish on the planet. Living under the politic

  • by oobayly ( 1056050 ) on Wednesday July 01, 2015 @02:25PM (#50027649)

    Google's algorithm also identified photos of some sunburnt Essex chavettes on the beach as Yorkshire pigs. A Google spokesperson said "no apology is necessary - it's an accurate assessment".

  • by Anonymous Coward

    Yonatan Zunger, Google’s chief architect of social: "Machine learning is hard." Now tell us more about them self-driving cars.

  • by pla ( 258480 ) on Wednesday July 01, 2015 @02:27PM (#50027669) Journal
    Software doesn't hate black people. Software doesn't dislike Islam. Software doesn't think kids these days need to pull their damned pants up and stop playing that crap music too loudly.

    Apologizing for a program miscategorizing an image it has never seen before as somehow "racist" makes about as much sense as GE apologizing because my toaster looks like a frowny-face from just the right angle.

    Yes, Virginia, we've taken this shit too far.
    • by T.E.D. ( 34228 )

      Apologizing for a program miscategorizing an image it has never seen before as somehow "racist" makes about as much sense as GE apologizing because my toaster looks like a frowny-face from just the right angle

      Most recognition software learns by going through a ton of examples and being told when it's right and when it's wrong. Most likely what happened here is that the learning phase used images of gorillas, but for "humans" used almost all pictures of white people. The computer doesn't know any better than it was trained, and if it wasn't trained to see black people as human too, then IMHO Google has well earned the crap it is getting.

    • by turp182 ( 1020263 ) on Wednesday July 01, 2015 @04:20PM (#50028513) Journal

      Actually, apologizing makes sense in this case. It's not about being at fault (they are not), but common courtesy.

      Early in our marriage, my wife taught me to say "I'm sorry" when those around me were hurt or bothered, to be a nice person. Prior to that I only said it if I was at fault.

      So if someone spills soda on their shirt, then "I'm sorry". Same for Google, it is the decent thing to do.

      Regarding photo identification, Google should have images of the Confederate flag show up in the category "Racist"...

       

      • I hope you are never involved in a not at fault accident.

        You are giving horrible advice.

        When someone is butthurt, you say 'I'm sorry you're butthurt...'

  • What's wrong with being classified as a rather violent but otherwise perfectly fine animal like an ape?
    What about white men who risk being classified as Bill Gates, or Poettering? That would be really offensive.

  • by Crashmarik ( 635988 ) on Wednesday July 01, 2015 @02:41PM (#50027807)

    What next, a hairy fat guy in a pool gets tagged as a walrus? His girlfriend as a whale?

    Oh the horror the machine was mean.

  • by wbr1 ( 2538558 ) on Wednesday July 01, 2015 @02:46PM (#50027843)
    Google announces that it will change the tagging of Caucasians from 'cracker' to 'saltine'.
  • How is that racist? (Score:2, Informative)

    by Anonymous Coward

    That woman does look like a gorilla when she makes faces like that, can't blame the computer that automatically flagged you for that.

  • by Anonymous Coward

    Black, simian, short dark hair, big lips. Ape.

  • by Theovon ( 109752 ) on Wednesday July 01, 2015 @03:05PM (#50027979)

    I'm not going to downplay the feeling of insult that the black couple experienced. There is a long history of racism against blacks, referring to them as apes and other things, with the intent of putting them down. In *this* case, it was an accident of a flawed algorithm, but there's some history here that makes that a hot button. For the sake of repairing the effects of racism of the past, we should be careful about how we use racial slurs, even accidentally.

    That all being said, we're learning more and more about gorillas and other higher apes and how intelligent they are. We're closely related. To an alien from another planet, they may look at humans and other apes and not perceive much difference. To compare humans (in general) to apes (in general) isn't all that unreasonable. And some day, when all this racism shit is behind us, mistakes like what happened here might be merely amusing.

  • Prediction (Score:5, Insightful)

    by sootman ( 158191 ) on Wednesday July 01, 2015 @03:37PM (#50028205) Homepage Journal

    They tweak the algorithm a bit. A week from now, a gorilla in a photo is tagged as 'black person'. Hilarity ensues.

  • I know it is hard for many to be rational and factual about some topics, but I have seen the same categorisation choices made by young children on more than one occasion. We must be forgiving and understanding toward Google's AI and its faux pas, because it has not received the social conditioning necessary to discriminate politically and override its simple visual correlation matches. We may also learn something about the deeper causes of xenophobia and politically incorrect behaviour in humans if we cons
  • How can a computer algorithm be racist? I just did a search on mine, and when I typed in "dog" it found my white cat and my 8-year-old son. It's no different from autocorrect. People need to lighten the fuck up.
