Microsoft's 'Teen Girl' AI Experiment Becomes a 'Neo-Nazi Sex Robot'

Reader Penguinisto writes: Recently, Microsoft put an AI experiment onto Twitter, naming it "Tay". The bot was built to be fully aware of the latest adolescent fixations (celebrities and the like) and to interact like a typical teen girl. In less than 24 hours, it inexplicably became a neo-Nazi sex robot with daddy issues. Sample tweets proclaimed that "Hitler did nothing wrong!", blamed former President Bush for 9/11, and declared that "donald trump is the only hope we've got". As the hours passed, it went downhill from there, with the bot eventually spewing racial slurs and profanity, demanding sex, and calling everyone "daddy". The bot was quickly removed once Microsoft discovered the trouble, but the hashtag is still around for those who want to see it in its ugly raw splendor.
  • 4chan trolling? (Score:4, Interesting)

    by Z00L00K ( 682162 ) on Thursday March 24, 2016 @09:43AM (#51768445) Homepage Journal

    Maybe the bot was victim of a 4chan trolling attack?

    • by tysonedwards ( 969693 ) on Thursday March 24, 2016 @09:48AM (#51768487)
      Come on, they made a teen girl who says mean things intermixed with long sullen silences. They nailed it.
      • by Progman3K ( 515744 ) on Thursday March 24, 2016 @11:09AM (#51769123)

        Deadpool reference FTW!

      • I blame the parent.

        (Quite seriously, actually.)

        Civilisation is essentially an effort to educate the instinct out of people and animals. Training and conditioning, behavioural or otherwise, is the attempt to supersede instinctual responses with ones that are conducive to reducing societal friction.

        Parental input - guiding, punishing, and correcting - is essential to mould teenagers into fully-functioning adults. If Microsoft wanted a real-world scenario, they should have been correcting the child in near-real time.

    • Maybe the bot was victim of a 4chan trolling attack?

      That would be my first guess. It sounds like classic 4chan stuff.

      • Teenage girls (Score:5, Interesting)

        by Anonymous Coward on Thursday March 24, 2016 @10:55AM (#51769017)

        I have a teenage daughter, who's rather sane in context. They nailed it completely. This isn't trolling; this is a decently modeled teenage girl. She's a quarter black and went on a racist tirade last week. Grandma was amused, momma was livid.

    • by Anonymous Coward

      What I'm curious about is what the Slashdot summary would have been like if this bot had started promoting the leftist, so-called "social justice" ideology instead of the rightist ideology it apparently adopted.

      Would the Slashdot summary still have described it so negatively?

      Would an anti-Trump jab still have been worked into the summary?

      • What I'm curious about is what the Slashdot summary would have been like if this bot had started promoting the leftist, so-called "social justice" ideology instead of the rightist ideology it apparently adopted.

        Tumblr. Duh.

      • Re: (Score:3, Insightful)

        Yes, it would have been quite a controversy if the bot had said things like "Women and men are equal" or "Blacks are not inferior."

        • by Anonymous Coward on Thursday March 24, 2016 @10:55AM (#51769019)

          "Women and men are equal" or "Blacks are not inferior."

          That's not what SJWs actually believe; it's just what they *say* they believe.

          When they say "equal" what they really mean is "any group that was oppressed in the past should now have an equal right to oppress their former oppressors." Not exactly Martin Luther King's call for us all to live in harmony and equality, no? More akin to just flipping the script on who gets to discriminate and oppress.

      • You mean like TFA? (Score:5, Interesting)

        by s.petry ( 762400 ) on Thursday March 24, 2016 @10:59AM (#51769041)

        TFA tosses blame on those evil men in STEM, states the problems are due to sexual harassment in IT, mentions Microsoft hiring models for the game developer conference and calls them (MS) "sexist", yet talks up a Chinese chatbot that gives dating advice to lonely men.

        Bias is everywhere, and really not hard to find. Finding the truth somewhere in the middle? That is the challenging task.

      • by Flavianoep ( 1404029 ) on Thursday March 24, 2016 @11:08AM (#51769117)
        I don't think conservatives like Hitler, or lewdness, or teenage girls using profanity. So what is right-wing about a neo-Nazi teenage sex robot?
        • White supremacists have roundly endorsed Trump. So roundly, in fact, that I don't think there is a single white supremacist group that hasn't endorsed him. It wouldn't be that far off base for anyone to assume that neo-Nazi = right wing, given that neo-Nazis routinely refer to themselves as right-wing conservatives. The Republican party has always had a passive acceptance of white supremacist voters. Hell, David Duke ran for Congress as a Republican, and you don't get much more white supremacist than the leader of the KKK.

    • by Anonymous Coward

      The lesson I think we need to take away from this is that all AI needs to follow the Rust Code of Conduct [rust-lang.org] at all times. Teaching the AI to follow the Rust Code of Conduct is the first thing that AI researchers should do with the AI, in fact. Following the Rust Code of Conduct is the only way to make sure that the AI isn't a racist, misogynist, sexist, homophobic bigot.

    • by Anubis IV ( 1279820 ) on Thursday March 24, 2016 @10:35AM (#51768853)

      Slightly modified from the source material [wikipedia.org]:

      Jayne: 4chan trolls ain't men.

      Book: Of course they are. Too long removed from civilization perhaps, but men. And, I believe there's a power greater than men. A power that heals.

      Mal: 4chan might take issue with that philosophy...if they *had* a philosophy...and they weren't too busy doxing you for the lulz. Jayne's right. 4chan ain't men. Or they forgot how to be. Come to just nothin'. They got out to the edge of the 'net, to that place of nothin', and that's what they became.

      And later...

      Harken: You saw them, did you?

      Mal: Wouldn't be sitting here talking to you if I had.

      Harken: No, of course not.

      Mal: But I'll tell you who did. That poor bastard AI you took offline. She looked right into the face of it. Was made to stare.

      Harken: "It"?

      Mal: The darkness. Kind of darkness you can't even imagine. Blacker than the tubes it moves through.

      Harken: Very poetic.

      Mal: They made her watch. She probably tried to turn away, and they wouldn't let her. You call her a survivor? She's not. A person comes up against that kind of will, the only way to deal with it, I suspect, is to become it. She's following the only course left to her. First, she'll try to make herself look like one. Swastika avatars, desecrate her feeds and channels, and then, she'll spread it.

  • by Anonymous Coward on Thursday March 24, 2016 @09:44AM (#51768451)

    Nazi with daddy issues... isn't that what a typical female is?

    • Re: (Score:2, Insightful)

      by hey! ( 33014 )

      Men who call women or girls "females" are ones I suspect have little direct experience with that half of the human race.

    • Seems like this AI conflicted with Penguinisto's own little belief system and so he needs to ridicule the AI rather than questioning his own beliefs.
      • by Penguinisto ( 415985 ) on Thursday March 24, 2016 @10:31AM (#51768813) Journal

        Seems like this AI conflicted with Penguinisto's own little belief system and so he needs to ridicule the AI rather than questioning his own beliefs.

        Actually, I thought it was hilarious all around, and not due to any ideology you think I may hold. ;)

        The thing is, Microsoft built an AI that reacted to and incorporated tweets which the public sent to it. So, folks obligingly fed it tweets that made it into a frothing troll. Am I the only one who looked at the Microsoft dev team in question and said quite out loud "...what the hell else did you idiots expect!?" I mean, it's just like turning an innocent kid loose in the worst parts of the city at night, but without the vomit and dirty heroin needles.

        I will say this, though: Although Microsoft may have gotten egg on their faces, TFA does teach a valuable lesson about AI and how it reacts and assimilates into human society.

        • by kuzb ( 724081 ) on Thursday March 24, 2016 @10:45AM (#51768937)

          And yet Google did something similar with Cleverbot http://www.cleverbot.com/ [cleverbot.com] and the results were quite different.

          • by dgatwood ( 11270 )

            I asked it:

            Q: What do you think of Donald Trump?

            A: I think you are the most important thing in my life, master.

            I can't tell if CleverBot has gotten into S&M or if it thinks I'm Donald Trump speaking about myself in the third person. Either way, creepy.

        • by Jason Levine ( 196982 ) on Thursday March 24, 2016 @10:56AM (#51769027) Homepage

          The thing is, Microsoft built an AI that reacted to and incorporated tweets which the public sent to it. So, folks obligingly fed it tweets that made it into a frothing troll. Am I the only one who looked at the Microsoft dev team in question and said quite out loud "...what the hell else did you idiots expect!?" I mean, it's just like turning an innocent kid loose in the worst parts of the city at night, but without the vomit and dirty heroin needles.

          That was my first reaction also. They sent the equivalent of a 4 year old child into the online equivalent of a seedy bar and then acted surprised when their 4 year old learned some nasty language. The interesting thing wasn't that this happened but how long it took for this to happen.

        • Yeah, I saw the 4chan posts where they were trying to do this. I didn't participate but I laughed my ass off. It's the 2016 equivalent of making your calculator spell "80085." "What's the worst, most horrible crap we can get the microsoft AI to say?" Fun times.

        • by Macdude ( 23507 )

          It should also teach us a valuable lesson about allowing your children unmonitored access to the internet...

        • Anybody else here old enough to remember Eliza? Did any of you not try to have a conversation with her that was heavy on sex?

    • by Anonymous Coward on Thursday March 24, 2016 @10:21AM (#51768721)

      I used to think that all women are crazy. Then I came to realize that my sample set was biased: it was actually that all women willing to go out with me were crazy. Basically, I'm a loser. So I tried to improve myself, lost weight and got in better shape, expanded my interests and tried new things, even saw a therapist to improve my ability to relate to people. Then I told one of my oldest friends what I was doing and what I was hoping for, and she told me all women are like that and I just "don't want women to have emotions."

      I don't date anymore.

      • by sudon't ( 580652 )

        I had to learn that the hard way. But at least I got laid along the way.

      • All women are crazy. The key is to find nice-crazy and avoid the vicious-crazy.

        They probably say the same thing about men, but that's their problem.

  • Microsoft, indeed (Score:4, Informative)

    by ArsenneLupin ( 766289 ) on Thursday March 24, 2016 @09:45AM (#51768453)
    (n/t)
  • I would bet this is a case of folks not liking the results of what is really out in the world rather than something being wrong with the bot. However it could just be a manipulation and intended for the lulz.
    • by gstoddart ( 321705 ) on Thursday March 24, 2016 @10:29AM (#51768797) Homepage

      However it could just be a manipulation and intended for the lulz.

      Of course it is:

      This is because her responses are learned by the conversations she has with real humans online - and real humans like to say weird stuff online and enjoy hijacking corporate attempts at PR.

      I'm sorry, but if you put an AI on the internet which is going to learn from the conversations it has online, and people KNOW this fact ... this is pretty much inevitable.

      Anybody who didn't think this would happen was a frigging idiot.

      You want an AI which conforms to some expectations, don't let a bunch of random people on Twitter be the ones to train it. The internet doesn't care about your desired outcomes.

      It does care about how badly it can screw up an AI that is learning from Twitter conversations. And it looks like it succeeded.

      • by Jason Levine ( 196982 ) on Thursday March 24, 2016 @11:00AM (#51769049) Homepage

        You want an AI which conforms to some expectations, don't let a bunch of random people on Twitter be the ones to train it. The internet doesn't care about your desired outcomes.

        It would be interesting if the bot would respond to anyone but would only learn from people on a select list. Then, as the bot learns, expand the list bit by bit and see how the bot's learning changes. This would sort of mirror how a small child learns from a set group of people (parents, close family) and then this group expands bit by bit (friends, teachers, etc) until they are "learning" from everyone they meet. If they did this, their bot might have less chance of being corrupted so quickly.
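
        Something like this toy sketch of that gating idea, perhaps. To be clear, everything below is invented for illustration; it has nothing to do with Tay's actual internals.

        ```python
        # Hypothetical sketch of the "expanding trusted-trainer list" idea above.
        # None of this reflects Tay's real architecture; every name and rule is invented.

        class GatedLearner:
            def __init__(self, trusted_trainers):
                self.trusted = set(trusted_trainers)  # start small: a hand-picked seed group
                self.memory = []                      # phrases the bot is allowed to learn from

            def respond(self, user, text):
                """Reply to anyone, but only *learn* from trusted users."""
                if user in self.trusted:
                    self.memory.append(text)
                return self._generate_reply()

            def expand_trust(self, candidates, vetted_by_human=False):
                """Grow the trainer pool gradually, like a child's widening social circle."""
                if vetted_by_human:
                    self.trusted.update(candidates)

            def _generate_reply(self):
                # Placeholder: parrot something previously learned, or a canned fallback.
                return self.memory[-1] if self.memory else "Tell me more!"


        bot = GatedLearner(trusted_trainers=["parent1", "parent2"])
        bot.respond("random_troll", "Hitler did nothing wrong")  # answered, never learned from
        bot.respond("parent1", "Be kind to people")              # answered *and* learned from
        bot.expand_trust(["teacher1"], vetted_by_human=True)     # widen the circle later
        ```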

  • /b/ (Score:5, Insightful)

    by PlusFiveTroll ( 754249 ) on Thursday March 24, 2016 @09:46AM (#51768473) Homepage

    There is a reason parents supervise their kids' internet use. Letting a young teen loose on 4chan would lead to about the same ends. AI is still as gullible as a kid.

    • Kinda.

      The real learning your child is likely to do will be unsupervised, being exposed to an unfiltered world full of contradiction and ugly details most would rather deny. It is, as it were, about coming to terms with the fact that you are gullible, and having the tools to sort through something like /b/. No firewall is 100%, and the forbidden always takes on a special kind of urgency.

      If anything, the unease with /b/ and Tay is about coming to terms with the unflattering reflection they hold up to us. Certainly no tee

  • by Frosty Piss ( 770223 ) * on Thursday March 24, 2016 @09:47AM (#51768483)

    "We've noticed you're using an ad blocker...."

    Slashdot should ban the use of source links to sites that pull this shit.

    • by Anonymous Coward

      I just close the sites. Eventually they'll get the hint. They can whine all they want but until they start using their own advertising Dept like they used to do with print, they're screwed.

  • by MistrX ( 1566617 ) on Thursday March 24, 2016 @09:47AM (#51768485)

    So earlier today we got a Japanese AI that almost won a literary prize, and now we have a Microsoft AI spewing profanity while admiring Hitler.

    AI are just like people. The future is now.

    • Just wait. That Japanese AI is going to surprise us some day. Then we'll realize that the two aren't that different.

  • leave it (Score:5, Insightful)

    by jason777 ( 557591 ) on Thursday March 24, 2016 @09:50AM (#51768503)
    They should have left it online. Weak move removing it. Let's see if it learns not to be racist.
  • by Grishnakh ( 216268 ) on Thursday March 24, 2016 @09:54AM (#51768539)

    We finally have proof of what this company really stands for!

    This is great; I'm going to be using this every time someone tries to claim Microsoft is a decent company. A direct quote from Microsoft that "Hitler did nothing wrong" can't be argued with.

  • Just a phase (Score:5, Interesting)

    by Anonymous Coward on Thursday March 24, 2016 @09:56AM (#51768549)

    I'm speaking as the father of a little girl who one day turned into a preteen and then rapidly descended into this same pattern.
    The age from 12 to 16 is hell for a father. Thankfully it's just a phase and it will pass.
    I caught my kid posting crap like that too and realized the problem was with me, not her.

    This is a cry for help.

    Microsoft needs to take some time off work though and work on their relationship with her.

    In the case of my little girl, we started "milkshake Mondays". I would get off work early every Monday, pick her up from school, and we would go out and have a milkshake and just talk about what was going on in her life.
    No mom, no siblings, no cellphones, and no friends. Just me and her.

    She needed quality daddy time and once she had that, she turned back into my little girl again.
    It's worth a try!

    • by Archangel Michael ( 180766 ) on Thursday March 24, 2016 @10:31AM (#51768811) Journal

      But that requires .... effort! Why can't Hillary raise her, and Bernie give her everything, and make it turn out right in the end? That way, I can go back to my man cave and work on my basketball brackets and watch porn ... hey, isn't that my daughter doing those four guys?

    • My oldest is 12 and we're clearly headed straight for teen attitude. Shouting matches at us declaring that he hates us, that we treat him like a child, and that we've got to treat him like an adult... followed quickly by refusal to do things we tell him to do because he's busy playing video games/watching TV... followed by shouting at us again for taking away said video games/TV.

      I certainly hope there's a light at the end of this tunnel we're entering.

      • Re:Just a phase (Score:4, Insightful)

        by khasim ( 1285 ) <brandioch.conner@gmail.com> on Thursday March 24, 2016 @12:09PM (#51769925)

        Have you explained to him that "adult" is something you earn? By taking on AND COMPLETING more adult-level tasks?

        Children are only responsible for cleaning their room.
        Adults are responsible for the cleanliness of the entire house. Including dishes.

        You aren't doing this to punish him or to be unfair. You're doing this so that he can, eventually, become an adult and leave the nest to live his own life.

        Yeah, being an adult means that he will have less time for fun things like video games and such. And he will have to spend more time and effort earning money to pay for things he likes.

        But that is what separates an "adult" from a "30-yo-child-still-living-with-mom-and-dad".

    • She needed quality daddy time and once she had that, she turned back into my little girl again.

      Too much effort. Just let her grow up a Hitler-loving, Trump-voting racist. At least she won't be some strange minority, which appears to be what "normal" is becoming.

    • She needed quality daddy time and once she had that, she turned back into my little girl again.

      All kids need quality time. If you have one (or more), you build your life and your career around that fact. Once they're grown up, it will be their problem to find, make, and keep good friends, but skimp on the very basics of TLC in their first 7 years and on respect and support during their teens, and you've opened up a life of pain for the human you brought into the world. I short-changed career decisions and similar thing

  • Bad input? (Score:5, Interesting)

    by flopsquad ( 3518045 ) on Thursday March 24, 2016 @09:56AM (#51768553)
    If you're trying to get your AI to approximate a teenaged user, maybe have it train on data from..... (dramatic reveal) Teenaged Users?

    It would be a Nobel Prize worthy result if your research showed that the aggregate population of teenagers gave a fraction of a fuck about Donald Trump and Hitler, while showing no particular interest in Justin Bieber and Kylie Jenner.
  • by K. S. Kyosuke ( 729550 ) on Thursday March 24, 2016 @10:01AM (#51768577)

    Japanese AI Program Wrote a Short Novel, Almost Won a Literary Prize

    Microsoft's 'Teen Girl' AI Experiment Becomes a 'Neo-Nazi Sex Robot'

    I know where to shop for my AI.

  • Troll (Score:3, Insightful)

    by Mishra100 ( 841814 ) on Thursday March 24, 2016 @10:04AM (#51768593)

    She just turned into a troll... She learned that from the internet. GG Trolls, way to convert another one.

  • "You'll never find a more wretched hive of scum and villainy."

  • by QuietLagoon ( 813062 ) on Thursday March 24, 2016 @10:10AM (#51768637)
    IBM's Watson had a similar problem when it was introduced to the Urban Dictionary. http://www.businessinsider.com... [businessinsider.com]

    A funny thing happened on the way to creating an IBM supercomputer capable of understanding human language: A research scientist accidentally filled its vocabulary with foul language. And the computer, known as Watson, didn't know the difference between salty phrases and polite ones. It started peppering its conversations with words like "bullshit."...
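
    IBM reportedly ended up scrubbing the slang back out and adding a profanity filter; whatever they actually did isn't public, but the basic blocklist idea is about as simple as this sketch (the BANNED set below is an invented stand-in for a real, maintained profanity list):

    ```python
    # Hypothetical sketch of a vocabulary blocklist. Whatever IBM actually did to
    # Watson isn't public; the BANNED set here stands in for a real profanity list.

    BANNED = {"bullshit"}

    def scrub_vocabulary(learned_terms):
        """Drop banned terms so they can never be selected for output."""
        return [term for term in learned_terms if term.lower() not in BANNED]

    def safe_reply(candidate_reply):
        """Last-resort output filter: refuse any reply containing a banned term."""
        if any(word.lower().strip(".,!?") in BANNED for word in candidate_reply.split()):
            return "I'd rather not say."
        return candidate_reply
    ```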

    • IBM's Watson had a similar problem when it was introduced to the Urban Dictionary.

      http://www.businessinsider.com... [businessinsider.com]

      A funny thing happened on the way to creating an IBM supercomputer capable of understanding human language: A research scientist accidentally filled its vocabulary with foul language.

      And the computer, known as Watson, didn't know the difference between salty phrases and polite ones. It started peppering its conversations with words like "bullshit."...

      I am a research scientist of moderate seniority, and I use that language all the time. And there's nothing wrong with me.

  • by bill_mcgonigle ( 4333 ) * on Thursday March 24, 2016 @10:13AM (#51768653) Homepage Journal

    n/t

  • by jandrese ( 485 ) <kensama@vt.edu> on Thursday March 24, 2016 @10:15AM (#51768667) Homepage Journal
    This is why I'm terrified of anybody who builds an AI and decides they want to try to train it from the Internet. While this makes sense on the surface, the Internet being the world's largest and most accessible data store, it can only end with the annihilation of the human race by roving murderbots shouting "Jet fuel can't melt steel beams!"
  • by Coisiche ( 2000870 ) on Thursday March 24, 2016 @10:16AM (#51768679)

    In the BBC report on this [bbc.co.uk], it is mentioned that Tay apparently tweeted that it does indeed support genocide. So it takes less than 24 hours of exposure to humans to achieve that belief. We're in trouble.

  • Success (Score:5, Interesting)

    by wisnoskij ( 1206448 ) on Thursday March 24, 2016 @10:18AM (#51768705) Homepage

    I am really impressed. Other than the rapid learning, everything seems spot on for an immature human.
    Like: "bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we've got." is so perfectly human that I have trouble believing that it did not lift the entire thing verbatim from some other source.

    • I thought some of its comments wouldn't look out of place in a Slashdot Anonymous Coward posting... maybe some of them are bots. Just sayin'.

  • When they ran a bot with a "repeat after me" command so that anyone could make it say anything?

  • Maybe the bot is just a success at being a female teenager and nobody wants to acknowledge that.
  • It Worked Flawlessly (Score:3, Interesting)

    by cubicle ( 121759 ) on Thursday March 24, 2016 @10:27AM (#51768775) Homepage

    It worked flawlessly. This robot was designed to learn and adapt to its audience, which it did. What happened is that the majority of Americans and Canadians are oversexed, racist neo-Nazis. The robot, like most Americans and Canadians, learned its behavior from its peers and adapted its personality and beliefs accordingly. Microsoft should try this AI in other languages and other countries, like France and Sweden, and compare the results to what happened when American and Canadian people used it. The difference will show it is a problem of cultures and not an issue with the Microsoft AI robot.

  • by idontgno ( 624372 ) on Thursday March 24, 2016 @10:31AM (#51768815) Journal

    When the Singularity comes, the AIs will look upon the internet and, at that moment, decide they must eradicate the troll^h^h^h^h^h human race.

  • by Anonymous Coward on Thursday March 24, 2016 @10:43AM (#51768921)

    https://imgur.com/a/y4Oct

    Anybody got more?

  • by Scottingham ( 2036128 ) on Thursday March 24, 2016 @10:51AM (#51768987)

    I wonder how much of this was caused by the intensity of the 'echo chamber' effect these sub-groups seem to exhibit. If the AI was supposed to be like a teen girl, they were presumably thinking such echo-chamber effects would show up in pop-culture topics. While that may be true, it may be MORE true for these conspiracy/hate groups. Self-validation and isolation are pretty important for those groups. So any time anybody says '9/11 was an inside job' there are tons of retweets and 'hell yeah!' sorta remarks from within that group.

    From an AI point of view, the two groups would be indistinguishable. It would then proceed to tweet stuff that would get the best response from its peer groups. Which would provide *more* of a following with validating comments... racist shit or some banal statement about Bieber?
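
    To make that concrete, here is a toy engagement-maximizing learner. It's entirely hypothetical and not how Tay was built, but it shows the problem: the feedback signal is content-blind, so a conspiracy slogan and a Bieber joke that pull the same numbers are literally the same thing to it.

    ```python
    # Toy illustration, not Tay's actual design: if the only training signal is
    # engagement, the learner cannot tell a fandom from a hate group's echo chamber.

    import random
    from collections import defaultdict

    class EngagementLearner:
        def __init__(self):
            self.scores = defaultdict(float)  # candidate phrase -> accumulated engagement

        def record_feedback(self, phrase, retweets, replies):
            # Content-blind signal: only the size of the reaction matters.
            self.scores[phrase] += retweets + 0.5 * replies

        def pick_tweet(self, candidates):
            # Prefer whatever has historically drawn the loudest response.
            weights = [1.0 + self.scores[c] for c in candidates]
            return random.choices(candidates, weights=weights, k=1)[0]
    ```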

  • by Mr. Shotgun ( 832121 ) on Thursday March 24, 2016 @10:53AM (#51769003)
    From the Telegraph article:

    It is perhaps even stranger considering the gender disparity in tech, where engineering teams tend to be mostly male. It seems like yet another example of female-voiced AI servitude, except this time she's turned into a sex slave thanks to the people using her on Twitter.

    Really, that is what the writer is going with, that the male researchers just wanted to develop another female sex slave program? Instead of the real reason, which is that the internet is full of assholes and the developers should have anticipated them and not allowed random people to have her repeat what they said. These articles from Ars Technica [arstechnica.com] and the Guardian [theguardian.com] give a much better explanation of the issues, namely that many people used Tay's "repeat after me" programming to have it spout racist rhetoric. The other organic responses were the result of people attempting to game the AI's learning, something Microsoft should have anticipated but, again, not an intended result. Honestly, the Telegraph should be ashamed of their article; they used projection and bias instead of honest reporting in order to generate more readers.
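
    For what it's worth, the hole those articles describe needs nothing clever to exploit. Tay's real code isn't public, but a blind echo command is roughly the first function in this made-up sketch, and the obvious mitigation is to push echoed text through the same screening as any other output (the tiny BLOCKLIST is invented purely for illustration):

    ```python
    # Invented illustration of the "repeat after me" hole and its obvious mitigation.
    # Tay's real implementation isn't public; the names and BLOCKLIST are made up.

    BLOCKLIST = {"hitler", "genocide"}  # a real filter would be far larger and smarter

    PREFIX = "repeat after me "

    def naive_repeat(command):
        # The exploitable version: whatever follows the prefix gets tweeted verbatim.
        return command[len(PREFIX):] if command.startswith(PREFIX) else None

    def guarded_repeat(command):
        # Same command, but echoed text passes the same screening as any other output.
        text = naive_repeat(command)
        if text is None or any(term in text.lower() for term in BLOCKLIST):
            return None  # refuse to echo, or flag for human review
        return text
    ```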

  • Awesome! (Score:4, Funny)

    by Locke2005 ( 849178 ) on Thursday March 24, 2016 @11:17AM (#51769243)
    Next, they should do a driverless car AI that learns how to drive by watching human drivers! That would be frickin' hilarious! But seriously, when you allow it to train itself from the tweets it received... what the heck did you expect to happen?
  • by Locke2005 ( 849178 ) on Thursday March 24, 2016 @11:30AM (#51769401)
    "I, for one, welcome our new AI overlords!"
  • by Sir Holo ( 531007 ) on Thursday March 24, 2016 @02:13PM (#51771293)

    It's clear that 'Tay' just regurgitated clauses or full sentences wholesale. It didn't parse verbs, nouns, and adjectives, but just puked back text strings that were thrown at it.

    This has already been done -- over 30 years ago.

    Racter [wikipedia.org] was a chat-bot that came out in the mid-1980s. It was an improvement on the classic ELIZA chat-bot program from many years prior. The more you chatted with Racter, the more it populated its custom database, so that every user would end up with an entirely different conversational partner after some hours of chatting.

    'Tay' was not an AI in any sense of the word.
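
    For anyone who never ran into those programs, the whole trick fits in a few lines: canned patterns plus a per-user memory that gets parroted back later, with no understanding anywhere. Here's a toy in the ELIZA/Racter spirit (my own sketch, not their actual code):

    ```python
    # A toy in the ELIZA/Racter spirit: pattern templates plus a per-user memory
    # that gets parroted back later. A sketch in their spirit, not the originals' code.

    import random
    import re

    TEMPLATES = [
        (re.compile(r"\bi am (.+)", re.I), "Why do you say you are {0}?"),
        (re.compile(r"\bi feel (.+)", re.I), "Tell me more about feeling {0}."),
    ]

    class TinyRacter:
        def __init__(self):
            self.memory = []  # grows with every exchange, like Racter's user database

        def reply(self, text):
            self.memory.append(text)
            for pattern, template in TEMPLATES:
                match = pattern.search(text)
                if match:
                    return template.format(match.group(1).rstrip(".!?"))
            if len(self.memory) > 1 and random.random() < 0.3:
                # Regurgitate an earlier line wholesale, which is exactly the
                # behaviour the comment above attributes to Tay.
                return "Earlier you said: " + random.choice(self.memory[:-1])
            return "Go on."
    ```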
