Google AI

Google Autocomplete Still Makes Vile Suggestions (wired.com) 238

An anonymous reader shares a report: In December of 2016, Google announced it had fixed a troubling quirk of its autocomplete feature: When users typed in the phrase, "are jews," Google automatically suggested the question, "are jews evil?" Almost a year after removing the "are jews evil?" prompt, Google search still drags up a range of awful autocomplete suggestions for queries related to gender, race, religion, and Adolf Hitler. Google appears still unable to effectively police results that are offensive, and potentially dangerous -- especially on a platform that two billion people rely on for information. Like journalist Carole Cadwalladr, who broke the news about the "are jews evil" suggestion in 2016, I too felt a certain kind of queasiness experimenting with search terms like, "Islamists are," "blacks are," "Hitler is," and "feminists are." The results were even worse. For the term "Islamists are," Google suggested I might in fact want to search, "Islamists are not our friends," or "Islamists are evil." For the term, "blacks are," Google prompted me to search, "blacks are not oppressed." The term "Hitler is," autocompleted to, among other things, "Hitler is my hero."


Comments Filter:
  • by Anonymous Coward on Monday February 12, 2018 @01:45PM (#56109579)

    Google is showing what others are searching for. What else would you expect from humanity?

    • I see this as a reflection of humankind, not of Google.

      As we get more polarized, it gets too easy to see the other group as bad or evil. We tend to forget that the other people have the same set of problems that we do, and are just trying to make it in the world the same as us.

  • Thought Police (Score:5, Insightful)

    by Calydor ( 739835 ) on Monday February 12, 2018 @01:51PM (#56109615)

    Google is not and SHOULD not be the thought police. If their algorithms show these to be common search queries, take that as a hint that we need to DO something - as long as that something isn't to sweep things under the rug by censoring the results.

    Yes, censoring. I don't give a rat's ass about the argument that it's only censorship if the gubbermint does it. The internet is the new town square, deal with it. Circumventing censorship laws by "suggesting" to private companies what is and isn't appropriate things for people to see is bad.

    • by ranton ( 36917 )

      Google is not and SHOULD not be the thought police. If their algorithms show these to be common search queries, take that as a hint that we need to DO something - as long as that something isn't to sweep things under the rug by censoring the results.

      That is a very easy ideology to maintain if you don't think too hard.

      Google's search engine by definition censors what you see on the Internet when using their service. Even limiting your results to popular sites, or the ones Google thinks you probably want to see, is censoring out plenty of other results.

      Circumventing censorship laws by "suggesting" to private companies what is and isn't appropriate things for people to see is bad.

      No one is doing that here. Google will still serve up results for "Hitler is my hero" if the user types that in. What Google doesn't want to do is suggest that search query because they feel a sizable numb

      • That's not censoring any more than a city is censoring your exposure to farms.

        Censorship requires the intentional curation of content to remove certain elements. You can intentionally curate content without removing those elements (by portraying other elements more-strongly--propaganda), and you can unintentionally over-represent some content (by collecting information for a purpose and by a method, which incidentally ends up over-representing the view as per the collected data and under-representing any

    • Suggesting to private companies what they should do is called free speech. If private companies choose to protect their brands by listening to feedback from the public that is also called free speech.

      So you want to stop the thought police by first stomping on freedom? How is that going to work out?

    • Google is not and SHOULD not be the thought police

      Well, if they have no hand in what the algorithms produce, they'll be even more gamed than they already are. I'm sure trolls and political operatives would love that.

  • Oh noes! (Score:5, Insightful)

    by nwaack ( 3482871 ) on Monday February 12, 2018 @01:53PM (#56109627)

    Google showed me something I don't agree with! Better run back to my safe space and hide with my teddy bear. - smh

    People in general are vile and disgusting; this is just another attempt to hide the real world from people and make them snowflakes.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      Google showed me something I don't agree with! Better run back to my safe space and hide with my teddy bear. - smh

      Also worth noting that Google customizes its completions and suggestions based on what you look at. If you're getting results that include "kill all the jews" and "why are jews evil" - it's because Google has recognized that you seem to search for stuff like that a lot.

      This trips people up a lot when they post "hilarious" Google autocomplete results where Google recommends really weird fetishes related to some benign search - not realizing that what they're actually showing the world is that they frequently
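      A toy sketch of the personalization the comment above describes (hypothetical: Google's actual ranking signals and weights are not public). Global query counts drive the baseline suggestions, and a user's own repeated searches bias the ranking toward their history; the class name, `user_weight` parameter, and weighting scheme are all invented for illustration:

      ```python
      from collections import Counter

      class Autocomplete:
          """Toy autocomplete: global query counts, optionally biased by
          one user's own search history (illustrative weighting only)."""

          def __init__(self):
              self.global_counts = Counter()

          def record(self, query):
              # Every submitted query bumps the global popularity count.
              self.global_counts[query.lower()] += 1

          def suggest(self, prefix, user_history=None, k=3, user_weight=5):
              prefix = prefix.lower()
              scores = Counter()
              for q, n in self.global_counts.items():
                  if q.startswith(prefix):
                      scores[q] = n
              # A user's own repeated searches outweigh the global signal,
              # which is why two people can see very different completions.
              if user_history:
                  for q in user_history:
                      q = q.lower()
                      if q.startswith(prefix):
                          scores[q] += user_weight
              return [q for q, _ in scores.most_common(k)]
      ```

      With 10 global searches for "weather today" and 3 for "weather radar", a fresh user sees "weather today" first, while a user who has personally searched "weather radar" a few times sees it promoted to the top.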

  • by Anonymous Coward on Monday February 12, 2018 @01:57PM (#56109661)

    Help! Help! I can't think for myself and my eyes are sensitive!

  • When I tested this by entering "you should not eat", I got the helpful autocompletion "mangoes in a heatwave".

  • Color me naïve... (Score:4, Informative)

    by DrTJ ( 4014489 ) on Monday February 12, 2018 @01:57PM (#56109665)

    ... but I fail to see the problem.

    Those who search for "hitler is my hero" will find the results anyway and will not be hindered by the removal of the completion.

    Most others are not likely to be converted into die-hard Nazis because they see a completion alternative.

    The minuscule part of humanity that is, probably has worse problems.

  • by Anonymous Coward

    Welcome to the Internet. Sack up and deal with it.

  • by bradley13 ( 1118935 ) on Monday February 12, 2018 @02:08PM (#56109763) Homepage

    This is an impossible problem to solve. There will always be the next "offensive" sentence. And anyway, who gets to define what is offensive? The only solution to this problem is to stop showing other people's searches. Which, arguably, might be a good change to make.

    Regardless, I'm not seeing the problem. These are search terms. What is offensive about searching for "blacks are not oppressed"? Are you looking for evidence to support that conjecture? To refute it? As a search term, why should anyone take offense?

    Or how about "islamists are not our friends"? The term "islamist" refers to a muslim who believes that Islam should be not only a religion, but also a political system. In the West, we believe in a separation of church and state. So, in fact, islamists are not our friends. Where's the offense?

    We'll skip the "Hitler" searches, because the vast majority of those are not serious. Bored teens on 4chan have to do something with their spare time.

    What about "feminists are sexist"? In fact, modern feminism in the West no longer seeks equality for women, it now seeks special treatment. Just as affirmative action is by definition racist, so modern feminism is sexist. Even if you disagree with this, searching for the reasons that people may believe it, is a perfectly legitimate search.

    Climate change? There is, in fact, still a great deal of debate. Not about the climate warming, perhaps, but certainly about the degree of warming, about the predictions being made, and the degree to which climate change is natural or anthropogenic. Again, why should search terms be problematic?

  • You've got to wonder how their autocomplete algorithm works, and why it would even bother with a ridiculously open-ended question like "Are X..." -- unless such questions are not that open-ended after all, which would be interesting.
  • Ummm ... (Score:5, Insightful)

    by Obfuscant ( 592200 ) on Monday February 12, 2018 @02:18PM (#56109829)

    I too felt a certain kind of queasiness

    Then step away from the computer and read a book. The internet is not a place for people who are queasy when faced with opinions they don't agree with, ESPECIALLY AS AUTOCOMPLETIONS ON A SEARCH ENGINE. If the question "are jews evil?" makes you queasy, then you will not like the internet, and the rest of us do not want you to try to recast it in your limited vision of what is proper and correct.

    I would suggest only reading books your mother picks out for you, since she will be able to filter those to prevent you seeing queasy-making things.

  • Queasy? (Score:5, Insightful)

    by Shotgun ( 30919 ) on Monday February 12, 2018 @02:19PM (#56109847)

    Google returning a suggestion for a search makes the author "queasy"?

    Really, snowflake (in all of its derogatory connotations), you need to turn off your computer and go outside for a few minutes. You're not mature enough to use a keyboard.

  • People type all sorts of things into Google and autocomplete simply reflects that. Ever played Google Feud [googlefeud.com]?

    Here are the answers to What is the number for [googlefeudanswers.com].
    Check out #8.

    1. comcast
    2. the irs
    3. pi
    4. marvin
    5. autozone
    6. time
    7. cvs
    8. 911
    9. boost mobile
    10. walmart

    • But my keypad doesn't have an eleven... it only has numbers zero through nine... how do you dial nine-eleven???? Aaaa!

    • To be fair, I'm guessing most of those went from #1 straight to #8 after 3 hours on the phone with nothing to show for it.
  • Due to a concerted campaign spear-headed by Dan Savage, putting "santorum" into Google not only returns a deliberately vile suggestion as the first result, it offers it as a "featured suggestion".

    Regardless of what you think of the politician, should Google be held accountable or be compelled to alter search results? Not only is the result incontrovertibly vile (that was the explicit goal) but it was an engineered effort by an individual and his fans against another individual.

    If this is okay, why are sear

    • by ceoyoyo ( 59147 )

      It certainly is vile: "santorum... rick."

      When you click you find out that this some American politician who: "likened Obamacare to Apartheid in South Africa in a Nelson Mandela tribute speech" and "signed an online pledge vowing not to respect any law, including any decision by the United States Supreme Court, conferring legal recognition on same-sex marriage."

      Now, the second suggestion is a useful noun describing a particular mixture of biological and non-biological products.

      Vile is in the eye of the behol

    • The choice of search terms to illustrate "vile" results seems to betray the ideological moorings of the author,

      You say that like it's a bad thing. I guess that's your ideology speaking.

  • This is the result of jokers, idiots and people possibly too smart (and hateful) for their own good.

    Google is simply showing what it thinks you're trying to type, based on what others have put in. Now imagine a bored someone writes a script to peg Google search with some undesirable search terms and has the bot hammer Google 24/7/365. So Google naturally learns 'people like to search for this term, I'll suggest it to others cuz it's so popular.'

    With botnet armies at their disposal, anyone with some know h
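    The gaming scenario above can be sketched in a few lines. This is a toy model of a purely frequency-driven suggester, not Google's actual pipeline (which has spam filtering, deduplication, and freshness signals this sketch omits); all query strings and counts are made up for illustration:

    ```python
    from collections import Counter

    # Global query log: every search bumps a count.
    counts = Counter()

    def record(query):
        counts[query.lower()] += 1

    def suggest(prefix, k=3):
        """Return the k most-searched queries starting with prefix."""
        prefix = prefix.lower()
        matches = [(q, n) for q, n in counts.items() if q.startswith(prefix)]
        return [q for q, _ in sorted(matches, key=lambda x: -x[1])[:k]]

    # Organic traffic.
    for _ in range(1000):
        record("cats are cute")
    for _ in range(400):
        record("cats are liquid")

    # A bot hammers one phrase around the clock until it dominates the prefix.
    for _ in range(5000):
        record("cats are plotting world domination")

    print(suggest("cats are"))
    # The bot-inflated query now ranks first.
    ```

    Any suggester that ranks purely on raw query volume is vulnerable to exactly this kind of flooding, which is why real systems have to layer anti-spam heuristics on top.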

  • Islamists? (Score:5, Interesting)

    by TsuruchiBrian ( 2731979 ) on Monday February 12, 2018 @02:30PM (#56109965)
    I thought Islamists was a term created specifically to differentiate the bad Muslims. (e.g. Muslims aren't bad, it's the Islamists that are bad). I don't think this is unreasonable. I don't think it's fair to say "Christians are bad" whenever some Jesus cult does some crazy shit, but at the same time, I wouldn't expect google to come with good results for "christian fundamentalists" or similar terms.
    • by ceoyoyo ( 59147 )

      I was curious. Interestingly, the first suggestion for "christian fundamentalists are" is "not a separate denomination." Second is "stupid," third is "cults."

      Seems reasonably on par with the results for "islamists are". I suspect "...stupid" is probably a top-five suggestion for any "X is/are" query, because it's the internet.

    • Christian fundamentalists build replicas of Noah's Ark in the desert. Islamists do other things. Comparing the two is ridiculous.
      • Some Christian fundamentalists build replicas of Noah's Ark. Some Christian fundamentalists bomb abortion clinics. Some Christian fundamentalists commit genocide. Maybe you don't want to lump the genocidal Christians in with the fundamentalists. Ok fine. I don't know what to call those people either. Shall we call them Christianists? This seems like a semantic debate.
    • People always talk about the small percentage of Muslims that are terrorists. Less spoken about is, depending on the country, up to 62% of all Muslims say suicide bombings against civilians are often or sometimes justified. [pewglobal.org] Among Muslims in the US and western Europe, 13-35% [pewresearch.org] support suicide bombings at least some of the time. Large majorities of Muslims support Sharia Law [pewforum.org] to be the law of the land (and large percentages support then having it apply to non-Muslims as well). From country to country, support fo
      • I never said anything about parity in extremist views between christianity and Islam. I am not a person who claims that all religions are equally dangerous, and I do think the polls you cite are indicative of a real problem.

        I would like to bring up a couple points, however.

        #1. Like all polls, there are probably polling errors. These can result from poor translations of questions and/or answers. They can result from the fact that people do not want to, or do not feel comfortable, answering questions honestly (e.g. due t

  • by hyades1 ( 1149581 ) <hyades1@hotmail.com> on Monday February 12, 2018 @03:47PM (#56110603)

    I tried "Jews are". Google auto-completed it as "Jews aren't white".

    • by PPH ( 736903 )

      And I didn't get anything remotely racist or anti-Semitic. But what I have gotten (in the past) is auto-completes related to my recent searches. So perhaps the author of TFA spends too much time digging around for racial slights.

  • by Zorro ( 15797 )

    This is a regular gag on Tosh.0

    You should try Tijuana sometime and try NOT to get a suggestion for a Donkey Show.

  • "Blacks are not oppressed" is now vile? You can say that you disagree, but this appears to be suggesting that you can't even ask the question. Talk about wrongthink.
    • It's like looking into the racial disparity in IQ or violent crime... you can't investigate the causes and possible solutions, because even looking at the size of the effect you're a vile racist troll for even posing the question since it suggests that "there is a difference" might be a possible answer (notwithstanding that, whatever the reason, there *is*).
  • Are really just "objectionable" from a particular point of view.
  • by eepok ( 545733 ) on Monday February 12, 2018 @04:20PM (#56110925) Homepage
    Why the hell would searching "Are Jews evil?" be a bad thing? Is it showing one of those Google answer summaries saying, "Yes"? Chances are that it's not, and that it's showing the historical issues of antisemitism, prejudice, stereotyping, and scapegoating.

    I'd be more concerned if the widely spread issues of antisemitism weren't being combated by people going to Google and asking if what they've been told as children or are being told by their peers is true.

    I searched "Is God..." and the first option is "Is God Real?". Great question!

    I searched "Are all criminals... " and the first suggestion was "Are all criminals mentally ill?" The second was "Are all criminals bad?" Again, great questions!

    Questions are good. Especially when they are intended to seek truth and combat prejudice.
  • A cookie setting is not an opt-out when you have an account.
    There should be a way to turn it off so it stays off everywhere you are logged in.
