
Google Allowed Advertisers To Target 'Jewish Parasite,' 'Black People Ruin Everything' (buzzfeed.com) 139

Alex Kantrowitz, reporting for BuzzFeed News: Google, the world's biggest advertising platform, allows advertisers to specifically target ads to people typing racist and bigoted terms into its search bar, BuzzFeed News has discovered. Not only that, Google will suggest additional racist and bigoted terms once you type some into its ad buying tool. Type "White people ruin" as a potential advertising keyword into Google's ad platform, and Google will suggest you run ads next to searches including "black people ruin neighborhoods." Type "Why do Jews ruin everything," and Google will suggest you run ads next to searches including "the evil jew" and "jewish control of banks." BuzzFeed News ran an ad campaign targeted to all these keywords and others this week. The ads went live and were visible when we searched for the keywords we'd selected. Google's ad buying platform tracked the ad views. Following our inquiry, Google disabled every keyword in this ad campaign save one -- an exact match for "blacks destroy everything" is still eligible. Google told BuzzFeed News that just because a phrase is eligible does not guarantee an ad campaign will run against it. A total of 17 ad impressions were served before the keywords were disabled.
This discussion has been archived. No new comments can be posted.


  • by WoodstockJeff ( 568111 ) on Friday September 15, 2017 @12:27PM (#55203777) Homepage

    Deja vu, except for the company name.

    • Indeed, can we mark TFA as redundant?

      • Yep. Slashdot ran pretty much this same story yesterday [slashdot.org], except Facebook instead of Google.

        Are we going to get the same story every day until we run out of different platforms you can put a search term into?

        • TFA raises important issues about what kind of society we want to have. Should giant corporations be the moral gatekeepers? Should people be allowed to shop for a Confederate flag, or should Google ban products based on the ideology they symbolize? What about a "Black Lives Matter" t-shirt?

          You may have an opinion about Confederate flags and/or BLM t-shirts, but if so, then you are missing the point. TFA is advocating that your opinion DOESN'T MATTER, and instead of individual opinions, we should just ac

          • More to the point, how much of this is automated?
            The problem with these adaptive, learning systems is that they pick up trends without any moral background. And at Google's size it is nearly impossible to monitor everything.

            Given that nearly every human probably has some moral failing -- often one they don't realize and may even celebrate -- it is difficult to code an unbiased moral engine.

            Most racists vindicate themselves with some logic, often saying that they are unfairly targeted, or the groups they hat

    • by Luthair ( 847766 ) on Friday September 15, 2017 @12:32PM (#55203831)

      Really though, are the creators of automated systems meant to think of every possible questionable phrase?

      If someone pins a hateful ad on a bulletin board are the owners letting the person do it or have they just not seen it yet?

      • Really though, are the creators of automated systems meant to think of every possible questionable phrase?

        That is not a good argument. Google could easily build a filter that could stop 90% or more of these phrases. They will never stop 100%, but they could easily do way better.

        But should they? Is it really their role to be society's ideological and moral gatekeepers?
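For what it's worth, the kind of filter being described is trivial to sketch. The following is a hypothetical toy, not Google's actual system: the blocked terms and the word-level matching rule are invented for illustration, and a real implementation would need normalization, phrase-level matching, multilingual coverage, and human review.

```python
# Toy keyword blocklist for an ad-buying tool (illustrative only).
# The term list and matching rule are invented for this sketch.

BLOCKED_TERMS = {"parasite", "ruin", "destroy"}  # invented examples

def is_eligible(keyword_phrase: str) -> bool:
    """Reject a proposed ad keyword if it contains any blocked term."""
    words = keyword_phrase.lower().split()
    return not any(word in BLOCKED_TERMS for word in words)

print(is_eligible("raspberry pi case"))             # True
print(is_eligible("black people ruin everything"))  # False
```

Even this crude word-level check would have caught the specific phrases quoted in TFA; whether such a gatekeeper should exist at all is the separate question being debated here.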

        • > Google could easily build a filter that could stop 90% or more of these phrases. They will never stop 100%, but they could easily do way better.

          If we made a list of objectionable phrases, we may find that Google DID block 90% of them. Without checking, I can't agree with "they could do way better" - we don't know how well they did. We only know that somebody was able to come up with a few phrases that weren't blocked.

          > But should they? Is it really their role to be society's ideological and moral

          • > Google could easily build a filter that could stop 90% or more of these phrases. They will never stop 100%, but they could easily do way better.

            If we made a list of objectionable phrases, we may find that Google DID block 90% of them. Without checking, I can't agree with "they could do way better" - we don't know how well they did. We only know that somebody was able to come up with a few phrases that weren't blocked.

            > But should they? Is it really their role to be society's ideological and moral gatekeepers?

            That is indeed a very good question. It gets real interesting when you consider the types of racist things Al Sharpton says, or the things many black comedians say.

            If they did block all those phrases, then you would not be able to google this slashdot discussion.

            • This discussion is about how people can target their ads, not about totally removing a site from Google's index, or indeed removing it from the web completely.

              Suppose you are selling unique cases for Raspberry Pi. You wouldn't want to show that ad to just anybody and everybody at random, that would be wasteful. Instead, you'd want to advertise Pi cases to people who search for "Raspberry Pi", "Pi case", "Pi model B", etc.

              When you advertise through Google, you can show your ads to likely buyers by selecting
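To make the mechanics concrete, here is a rough sketch of how keyword-to-query matching might work. The "exact" and "phrase" match types loosely mirror concepts found in ad platforms, but the rules here are simplified assumptions for illustration, not Google's actual algorithm.

```python
# Simplified sketch of keyword-to-query matching for ad targeting.
# "exact" and "phrase" loosely mirror ad-platform match types; the
# rules are assumptions for illustration, not the real algorithm.

def matches(keyword: str, query: str, match_type: str = "phrase") -> bool:
    kw, q = keyword.lower(), query.lower()
    if match_type == "exact":
        return kw == q
    # "phrase": the keyword must appear verbatim inside the query
    return kw in q

campaign_keywords = ["raspberry pi case", "pi model b"]
query = "best raspberry pi case 2017"
print([k for k in campaign_keywords if matches(k, query)])
# ['raspberry pi case']
```

Under a scheme like this, the racist phrases in TFA are just strings like any other: the matcher has no notion of which keywords are objectionable unless a separate filter supplies one.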

      • You don't get to absolve yourself of responsibility just because you created an automated system. If you are making money off such things, then you are responsible.

    • by Daetrin ( 576516 )
      ProPublica caught Facebook, and Buzzfeed caught Google. So which important high-traffic website is next? Is anyone focusing on Bing to see if it suffers from the same problem?

      [crickets]

      Hello? Anyone there?

    • I'm amazed they got 17 impressions. Was the campaign likely to get hundreds, or low-budget and picking up 2-3 a day?
    • Which just goes to show, those people are all the same.

      Why do ad people have to ruin everything?

  • by Timothy Hartman ( 2905293 ) on Friday September 15, 2017 @12:35PM (#55203855)
    I'm not seeing why this is a problem. You're not going to eliminate racism by hiding it. It's not like a Confederate statue staring you in the face: you have to go looking for racist garbage to find it on Google search, and if you do find some racist garbage, you can get rid of it fairly easily with their reporting tools.

    People want information and advertisers want information. You can find perfectly legal porn sites and the like, and they can advertise. I'm just not seeing the social dilemma. It's not like they're getting links to fake pharmacies that can sell you knock-off Viagra that could be potentially dangerous and lacks quality control (which they had an issue with before and fixed). I'm a big proponent of people being accepting, and part of that is being accepting of people who I think have horrible world views. As long as it doesn't harm me, people can do what they like, unless it is BuzzFeed News. That's literally Hitler. #ICan'tEven
    • Re: (Score:2, Insightful)

      You're not going to eliminate racism by hiding it.

      Sure you are. No baby is born racist, they get taught that shit by someone or something.

      Your idea is sound in principle ("information wants to be free"), but in practice it doesn't really work out that way.

      I'm just not seeing the social dilemma.

      Others do, though. Are their opinions worth any less than yours?
      • To summarize the summary of the summary: people are a problem.
        -- Douglas Adams

      • by Anonymous Coward

        Actually, science shows that babies demonstrate in-group preferences. So, yes babies are born racist.

      • by Anonymous Coward

        Sure you are. No baby is born racist,

        Actually, babies are born racist. Well, not literally racist as "race" is too complex of a concept for babies to grasp, but babies do discriminate. One of the early stages of development is learning to discriminate between people who look like you/your parents, and those who don't. It's part of learning to identify people they know (parents) from strangers

        This is why much emphasis is placed on creating "diverse" environments. If a baby (or even a grown adult) is exposed to seeing people of all races working

        • Appearance is a proxy for relatedness; who looks most like you? Your parents, children & siblings. Who looks least like you? People of other races.

          So any gene that caused people to like similar looking people (and by implication dislike those who looked different) would be favouring itself. Once it existed, it would be self-reinforcing.

      • by gweihir ( 88907 )

        Actually, people find out about racism all by themselves. It is one of the common ways for fuckups to elevate themselves above others. Fuckups that are still kids discover it all by themselves. The idea is pretty simple and obvious: identify some characteristic somebody else has that they cannot do anything about and are not actually responsible for, and that you yourself do not have. Then attach negative meaning to it.

      • >No baby is born racist, they get taught that shit by someone or something.

        They're not born racist, but they do learn fairly quickly to prefer people who look most like their parents.

        As you get older, you tend (not always, there's also the allure of the exotic) to look for a mate who looks similar - but not too similar - to the people you grew up around. There's also some evidence there's a pheromone effect in play, as women have been found to prefer body odour similar to their father's.

        We have plenty o

    • by Vermonter ( 2683811 ) on Friday September 15, 2017 @12:50PM (#55204009)

      When you ban certain ideas, those ideas end up getting discussed in private where they are far less likely to be disputed. The best thing to do is to invite bigoted ideologies to be discussed openly, so that counterarguments can be put forth and the general public can see why they are bad ideas. Sadly it's less effort to ban offensive ideas than to debate them.

      Just like prohibition didn't stop drinking, banning offensive ideas does not kill them.

      • by Anonymous Coward

        This isn't about banning racism, it's about whether Google is profiting from racism.

        • by gweihir ( 88907 )

          Google is profiting from what people want. If you force them to censor, you are banning things. While this state of affairs is not good, the proposed "cure" is far worse.

          Caveat: I do think Google is an evil large corporation these days, but that is the normal state of things in capitalism. (No, communism is even worse, there you have an evil, all-encompassing state.)

          • It seems Google itself is not comfortable with profiting from racist targeting. No force needed. This is just pointing out that it is happening and showing that even Google does feel some shame for selling this kind of targeting. Freedom of speech does not guarantee your right to use a particular advertising service in a particular way. If Google and Facebook want to be the fascists' and the racists' best buddies, they might lose some respect. (Now I agree that respect is largely undeserved, but there
      • I'm afraid that's not true. These ideas are sticky in a lot of people's minds, especially the uneducated, the auto-didacts, and the not fully educated. If we let them see these ideas, they're going to believe them despite the ridiculous conclusions they lead to, and before we know it the trains are running on time to the camps.

        The best thing we can do is attempt to inoculate people beforehand by crapping all over the ideas before anyone sees them, and the next best thing is to eliminate them from the pub

        • by gweihir ( 88907 )

          You are advocating censorship and propaganda. These are universally abused as soon as they are established, and they are far worse than having stupid people think stupid things on their own. The only thing we can do without making matters much worse is to be resilient as a society and tolerate the idiots. Any attempt at suppression is not only futile, but dangerous.

      • The issue is that, in this moment in time, people who strongly identify with either right-wing or left-wing ideology tend to prefer shouting down those who hold opposing views instead of engaging them in actual discussion. Additionally, there appears to be extreme reluctance - on both sides - to even acknowledge that perhaps some part of what "the other side" believes is a legitimate concern.

      • by gweihir ( 88907 )

        Indeed. And it validates those ideas and the people who hold them via the "David vs. Goliath" effect. (It works something like this: "They have to suppress the idea. Hence the idea must have merit, because otherwise they would have actual arguments against it and would not need to suppress it.")

        But the cave-man reflex is to apply violence to anything they do not like, in modern times by proxy of the big fetish of "the law". This routinely makes matters much worse.

    • by hey! ( 33014 )

      Well, eliminating racism is kind of a straw argument. Is anyone actually arguing that racism will disappear because Google took action in this case?

      I suspect the reason that Google stepped in here was profit maximization, not social engineering.

      • by gweihir ( 88907 )

        My guess would be that Google is actually just using some weak AI mechanisms and did not feed in a list of things that are "forbidden". That does say bad things about some of their customers, but the world is full of idiots and _that_ cannot be fixed.

        • by hey! ( 33014 )

          Right. In a world where everyone was perfectly logical and unemotional we'd say, "Well, the AI classifier identified distinct groups of racist customers." But in the world we live in reputation matters, and the most valuable customers don't want to be associated, even tenuously, with these morons.

          • by gweihir ( 88907 )

            In that world we would not have racists, because racism is just a mechanism of losers to claim "I am better than those others because they have race characteristic XYZ, but I do not". Listen to some black racists some time and the utter stupidity of the racist idea becomes glaringly obvious.

    • Unless you've got a solution to eliminate prejudices.

    • by dbialac ( 320955 )
      Not only that, but if you want to create ads to fight racism, you can do that by targeting these kinds of terms. But the knee-jerk reaction by SJWs doesn't give them a chance to actually consider this fact.
      • Totally good point. It's the same way that competing companies post ads against their competition. It's not as though there aren't some non-profits flush with cash who should be doing this. Even if it's not their target market, they could get racists to rage about them and thus get more word of mouth. It could be a better spend than targeting people who already agree with their viewpoint in not condoning racism.
    • ..Okay, here's where I have problems with what you're saying:

      You're not going to eliminate racism by hiding it.

      While as a stand-alone statement this is true (racism is like cockroaches, after all; it prefers existing away from the light -- at least until they feel they have 'strength in numbers'), this does not mean that anyone, an individual or especially a business of any kind, should CATER TO IT/THEM -- not unless they want to be 'guilty by association'. By catering to it/them (by selling them ads) that is precisely what Google did -- and they implicitly ad

    • by AmiMoJo ( 196126 )

      It's a problem because Google doesn't want to enable people to send targeted messages to the far right.

      I'll defend your right to free speech to the death, but I won't help you advertise your racist messages.

  • by Anonymous Coward

    Anyone can put together any combination of words that adds up to a custom-made definition of "hate". Furthermore, what about other languages? Should Google hire people to cover all languages and spend their days thinking of hateful sentences to ban?

    If only 17 ads were served during X number of hours or days or weeks before the phrase was banned (and those were probably served to the BuzzFeed employees who did the research), then they are not really addressing any problem. It seems like fake news.

    It is likely,

  • Is "Pit bulls dangerous" allowed? How about ads targeting "Crooked Hillary"? Or, even worse, what if I want to market a product to members of the gay community? Can I use the word "Queer"?
  • 1. Google's primary income is from their ad platform.
    2. Targeting is what makes their ad platform competitive.
    3. Hate speech is protected speech in the US. ...
    4. Profit?

  • Come on msmash! The story this is a dupe [slashdot.org] of is still on the main page!

    Just like old times...
  • Opportunities Lost (Score:5, Interesting)

    by Jonathan C. Patschke ( 8016 ) on Friday September 15, 2017 @12:52PM (#55204035) Homepage

    Ages ago, before torrents and automated enforcement of the DMCA, one could find direct-download "warez," "keygens," and "cracks" easily through major search engines.

    One day, I was searching for a particular software crack so that I could try it without borrowing a key dongle, and I got a notice from Google directing me to a drug rehabilitation hotline. I'd never even considered that people might use Google to look for crack cocaine, but, thanks to the naiveté of keyword-matching, there was the opportunity to get help for an addiction I thankfully did not have.

    I'm sure all sorts of hateful organizations paid Google for the opportunity to sell swag to bigots. I'm sure Google spent their money just as easily as they spent money from people buying access to nicer keywords.

    Want to burn a bigot in the ass? Let it come out that every dime spent on buying access to "Jewish Parasite" and "Black People Ruin Everything" went to the ADL and NAACP. Give organizations like the ADL and NAACP access to those keywords gratis. Offer alternatives to hate as easily as alternatives to drugs.

    Google missed an opportunity to, relatively cheaply, buy a huge PR win and help overcome hate.

    • by Kartu ( 1490911 )

      It wouldn't change jack shit, not to mention that any organization could do it without Google's help, although not for free.

    • by Anonymous Coward

      The ADL's own bigotry would likely just push anyone who is opposed to their ideas further away. It probably wouldn't help crush the conspiracy theories that google is using the ADL and other free speech control groups to undermine freedom of thought on their platforms. Of course like any advertising entity, the profit of the advertiser is always the largest motivation, so if so called "anti-racist" groups want to push ads to presumed racists that's fine with the advertising provider.

      Would be interesting t

  • Did they set it up specifically that way, or is there some algorithm that picked up on dreadful people chatting and tried to sell to them?

    Everybody remembers Tay, right? [theverge.com] She started off as a simple chatbot and wound up as the grand wizard of the KKK by the end of the day.

  • Google told BuzzFeed News that just because a phrase is eligible does not guarantee an ad campaign will run against it.

    You'd think a company as rich as Google could afford a competent spokesperson who wouldn't say really stupid shit like that.

  • ... the day they start engaging in significant censorship is the day they become too powerful and get slapped with common carrier status and broken up as a monopoly.

  • by Anonymous Coward

    In the Soviet Union, self-identification in the form of "I am Russian" or "I am Georgian" or "I am Ukrainian" etc. was considered imperial chauvinism or Nazism, and the person was put under surveillance by the secret service. We are living in a free country with free speech and the right of expression. The problem should be resolved from the other end, via counter-propaganda. Prohibition does not solve the problem; it only lets those groups proliferate their ideas faster.

  • Free speech includes ignorant speech, idiotic ideas, and even speech designed to fan the flames of hatred. Trying to limit free speech is expensive, troublesome, and almost a total failure. The centuries-old fight against pornography is a great example. Today we have more access to a huge variety of porn than ever before. About the only item that has been beaten down is child porn, simply because almost all people hate child molesters. And even when we put the sick types in prison or other facility we spend
  • by Khashishi ( 775369 ) on Friday September 15, 2017 @04:47PM (#55205987) Journal

    The English language has allowed people to voice hate speech and conspire to conduct terror. But worry not, the truth ministry is working on New English which will rectify these shortcomings.

  • This whole discussion and the way it's being moderated are severely turning me against continuing to use Slashdot. I've been here a long time, probably 12-13 years on this account, and I don't even know when I registered the first one. I think we were on SuSE 7.1 or so back then, which I actually bought the DVD + CD set for. I remember when this was a free place to discuss ideas and ideals, and even bring the occasional website crawling to its knees from the flood of traffic. Those were good times. Since about 2015
