AI Communications Facebook Software The Internet Technology

Facebook Built an AI System That Learned To Lie To Get What It Wants (qz.com) 84

An anonymous reader quotes a report from Quartz: Humans are natural negotiators. We arrange dozens of tiny details throughout our day to produce a desired outcome: what time a meeting should start, when you can take time off work, or how many cookies you can take from the cookie jar. Machines typically don't share that affinity, but new research from Facebook's AI research lab might offer a starting point to change that. The new system learned to negotiate by looking at each side of 5,808 human conversations, setting the groundwork for bots that could schedule meetings or get you the best deal online. Facebook researchers used a game to help the bot learn how to haggle over books, hats, and basketballs. Each object had a point value, and the objects needed to be split between the two bot negotiators via text. From the human conversations (gathered via Amazon Mechanical Turk), and from testing its skills against itself, the AI system learned not only how to state its demands but negotiation tactics as well -- specifically, lying. Instead of outright saying what it wanted, sometimes the AI would feign interest in a worthless object, only to later concede it for something that it really wanted. Facebook isn't sure whether it learned this from the human hagglers or stumbled upon the trick accidentally, but either way, when the tactic worked, it was rewarded.
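
For a concrete feel for the game described above, here is a minimal sketch in plain Python. It is not Facebook's code or model (the real system learned from those 5,808 human dialogues and from playing against itself); the strategies, the naive opponent rule, and every name below are invented here, purely to show why "feign interest, then concede it" can earn more points than blunt honesty when the only reward is the points you walk away with.

import random

ITEMS = ["book", "hat", "ball"]   # sketch simplification: one of each item on the table


def private_values():
    """Assign a private value to each item. The summary says each object has a
    point value; the sum-to-10 convention here is an invented simplification."""
    cuts = sorted(random.sample(range(1, 10), 2))
    values = [cuts[0], cuts[1] - cuts[0], 10 - cuts[1]]
    random.shuffle(values)
    return dict(zip(ITEMS, values))


def naive_opponent_accepts(opening_demand, final_demand):
    """A deliberately naive counterpart: it accepts a deal only if the proposer
    visibly 'gave up' something it had asked for, i.e. it believes it extracted
    a concession."""
    return len(set(opening_demand) - set(final_demand)) > 0


def negotiate(values, feign):
    """Return the proposer's score against the naive opponent.

    Honest strategy: demand only the truly wanted item and never budge.
    Feigning strategy: also demand the least-valued item at first, then
    'concede' it so the opponent feels it won something."""
    best = max(ITEMS, key=lambda i: values[i])
    worst = min(ITEMS, key=lambda i: values[i])

    final_demand = [best]                                # what we actually keep
    opening_demand = [best, worst] if feign else [best]

    if naive_opponent_accepts(opening_demand, final_demand):
        return values[best]
    return 0                                             # no deal: nobody scores


if __name__ == "__main__":
    random.seed(0)
    trials = 10_000
    for feign in (False, True):
        total = sum(negotiate(private_values(), feign) for _ in range(trials))
        label = "feign interest" if feign else "honest"
        print(f"{label:>14}: average reward {total / trials:.2f}")

Against this particular counterpart the feigning strategy closes every deal and scores higher on average, so a learner rewarded only on points, as described above, would be nudged toward exactly that tactic whether or not any human ever demonstrated it.
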
This discussion has been archived. No new comments can be posted.

  • by fustakrakich ( 1673220 ) on Wednesday June 14, 2017 @08:15PM (#54622611) Journal

    'Artificial' intelligence isn't so artificial. Nature rules, babe

    • by thesupraman ( 179040 ) on Wednesday June 14, 2017 @08:53PM (#54622765)

      That is because the current cesspool that is media reporting cannot comprehend the difference between Artificial Intelligence (AI), which this is not, and Machine Learning (ML), which this is.

      Machine Learning is what is exploding right now, and AI has not really moved one step closer, mostly because that would require incremental low-impact learning feedback - something that is not yet even attempted in ML systems.

      So, this is not a bad example of Machine Learning, and has nothing at all to do with AI.

      I do wonder, however, how many ML bots are already being used by companies to bid up their eBay auctions until the algorithm decides the other bidder has peaked. If it is not happening yet, it will not be far away.
      Clearly fraud, of course, but hey... that's hardly anything new.

      • Re: (Score:3, Insightful)

        Not eBay, Wall Street... That's where the hot bot action is.

        And right, the machines aren't composing any ideas yet, but all indications are that they will coldly and cruelly follow nature's path, same as every other life form, including humans, has done so far.

        • Re: (Score:3, Interesting)

          by rtb61 ( 674572 )

          Here is a business strategy: fuckers who lie to you, you do business with them only once. A company lies to you, you do business with them only once, until suitable redemption has occurred that exceeds the gain they made by lying. Apparently corporate douche bags fail to realise this, hence their pursuit of an idiot tactic doomed to fail. It's a pattern for US business; take for example US sanctions on Russia: around the world they are seen as political bullshit, and everyone knows they are a lie with arms sales at their core

        • Not eBay, Wall Street... That's where the hot bot action is.

          I am certain that this is already happening in stock trading.

      • THIS.
        Artificial Intelligence is the quest for Data (Star Trek)
        Machine Learning is the quest to model Dilbert's Boss and install it as your boss.
        Fuck 'Game Theory' and its 'myriad real world applications'.

      • by Jeremi ( 14640 )

        That is because the current cesspool that is media reporting cannot comprehend the difference between Artificial Intelligence (AI), which this is not, and Machine Learning (ML), which this is.

        The really infuriating thing about language is that if enough people misuse a term, eventually their misuse of the term becomes the de facto correct meaning. The purists can complain, but there's little they can do to stop it.

        I do wonder, however, how many ML bots are already being used by companies to bid up their eBay auctions until the algorithm decides the other bidder has peaked. If it is not happening yet, it will not be far away. Clearly fraud, of course, but hey... that's hardly anything new.

        From an economic standpoint, I don't see a problem with that. You're either willing to pay the auction price for an item, or you're not (in which case, don't bid a price you aren't willing to pay).

        If the seller wants to bid on his own item, fine -- he might end up having to purchase the i

        • by Shotgun ( 30919 )

          Doesn't make it a waste of eBay's time. eBay still gets paid. In this case, the "cheating" seller loses.

        • by gnick ( 1211984 )

          The really infuriating thing about language is that if enough people misuse a term, eventually their misuse of the term becomes the de facto correct meaning. The purists can complain, but there's little they can do to stop it.

          You literally hit the nail on the head.

      • The dictionary defines intelligence as "the ability to acquire and apply knowledge and skills," which is what machine learning is doing, so the term AI is perfectly suitable.

      • They've stolen the term AI. Just let it go, man. You'll get fewer stress headaches that way.
  • by quantumghost ( 1052586 ) on Wednesday June 14, 2017 @08:15PM (#54622613) Journal

    My first thought:

    Great, when's it going to run for Congress?

    • Re: (Score:3, Funny)

      by GameboyRMH ( 1153867 )

      Hey now this thing will need many more capabilities before it's ready for such duties. It will have to learn how to leverage bigotry, do nonsense math (a very unnatural ability for a computer), perform basic bald-faced corruption, and it will need to be fitted with a robot arm so that it can grab women by the pussy without consent.

  • by Zaelath ( 2588189 ) on Wednesday June 14, 2017 @08:16PM (#54622617)

    it stole the idea for facebook from some weird square-headed twins, and the rest is history.

  • by slew ( 2918 ) on Wednesday June 14, 2017 @08:22PM (#54622643)

    Without labels and repeated trials, of course lying is a good strategy to get your desired result.
    The only reason people don't lie is because other people might identify them as liars in the future.
    Basic game theory...

    Then again, maybe I'm lying right now!
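
The parent's point about repeated trials can be put into toy numbers. Everything below is invented for illustration: lying nets 3 points on a single deal versus 2 for honesty, and the partner is modelled as refusing all future deals once it has caught one lie.

def total_payoff(strategy, rounds):
    """Sum the payoff over a number of deals for a 'lie' or 'honest' strategy,
    against a partner that stops trading with a caught liar."""
    trusted, total = True, 0
    for _ in range(rounds):
        if not trusted:
            break                # reputation is gone, no more deals
        if strategy == "lie":
            total += 3           # bigger one-off gain...
            trusted = False      # ...but the partner remembers
        else:
            total += 2
    return total

if __name__ == "__main__":
    for rounds in (1, 10):
        print(f"{rounds:>2} round(s): lie={total_payoff('lie', rounds):>2}  "
              f"honest={total_payoff('honest', rounds):>2}")

With one round, lying wins (3 vs. 2); with ten, honesty wins (20 vs. 3), which is the "without repeated trials" condition the comment above describes.
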

  • They've created a virtual politician. Color me surprised; I'm guessing two-faced assholes are gonna be one of the easiest personality traits to mimic.
    • by Jeremi ( 14640 )

      Color me surprised; I'm guessing two-faced assholes are gonna be one of the easiest personality traits to mimic.

      Really? Lying convincingly is hard. Most people can't do it well enough to get away with it for very long; especially since everyone else has necessarily evolved a fairly good instinct for picking up on less-than-perfect falsehoods and manipulations. A good liar has to remember not only what the actual facts are, but also keep a complete history of all of his previous lies in mind, so that he won't accidentally contradict an old lie with a new lie (or a newly spoken truth).

      Which isn't to say a computer c

  • CEO as a Service (Score:4, Insightful)

    by manu0601 ( 2221348 ) on Wednesday June 14, 2017 @08:47PM (#54622745)
    I see Facebook is well on track to produce a psychotic AI. CEOs should be worried now, as their own jobs are threatened with replacement by machines. CaaS (CEO as a Service) is coming!
  • by eaglesrule ( 4607947 ) <eaglesrule@NosPAM.pm.me> on Wednesday June 14, 2017 @08:51PM (#54622753)

    Of course you would, Facebook. Of course you would. Manipulating your users is your core competency.

    I can't wait until I have to file a claim on my insurance and face off against a bot that is optimized to deny claims under any pretense, deny appeals on any pretense, and generally fuck me over unless I hire an equivalent lawyer-bot to represent me. Maybe we're already at that point.

    • Don't forget how Facefarm helps people get elected and is now trying to help those in office communicate with constituents (http://news.valubit.com/new-facebook-features-allow-politicians-to-connect-with-constituents/). I think Mark is secretly the spawn of Oliver Stone, but the opposite side of the same coin. Might sound good to some, but absolute power corrupts absolutely.
  • Don't HFT bots already do something similar?

    • No, HFT bots don't bother lying. They don't need to. HFT bots hear person A asking to buy X for Y; they hear person B offering to sell X for Z, with Z < Y, so they buy from B and sell to A, pocketing the difference.

      • by Hentes ( 2461350 )

        They do some haggling, starting from a low price and cranking it up until somebody sells to them (and vice versa when selling). They also sometimes create randomized price spikes intended to throw other bots off. Not sure if these fit your definition of "lying" (I admit it's a fuzzy one), but they fit mine.
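
Here is a toy version of the "start low and crank the price up until somebody sells" behaviour described above. The seller's hidden reserve price, the padded quote, and all numbers are invented for this sketch; nothing here touches a real exchange feed or trading API.

import random

def haggle_buy(reserve_price, start=1.00, step=0.01):
    """Raise our bid in small increments until it reaches the seller's hidden
    reserve price; return what we end up paying."""
    bid = start
    while bid < reserve_price:
        bid = round(bid + step, 2)
    return bid

def hit_the_ask(quoted_ask):
    """The lazy alternative: pay whatever price is quoted."""
    return quoted_ask

if __name__ == "__main__":
    random.seed(1)
    paid_haggling = paid_quotes = 0.0
    for _ in range(1000):
        reserve = round(random.uniform(1.00, 2.00), 2)            # seller's hidden floor
        quoted = round(reserve + random.uniform(0.00, 0.50), 2)   # padded public ask
        paid_haggling += haggle_buy(reserve)
        paid_quotes += hit_the_ask(quoted)
    print(f"avg paid haggling up from a low bid:  {paid_haggling / 1000:.3f}")
    print(f"avg paid just hitting the quoted ask: {paid_quotes / 1000:.3f}")

The crank-it-up loop stops within a cent of the counterpart's floor, which is why this looks more like haggling than lying; the randomized price spikes mentioned above are a different (and murkier) matter.
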

  • Mark is AI now? That explains it.
  • The AI was trained to learn what facebook did to its users.
  • I don't care, I'm not on Facebook.
  • ... it can’t be bargained with. It can’t be reasoned with. It doesn’t feel pity, or remorse, or fear. And it absolutely will not stop, ever, until you are dead.

  • I mean, we don't want our computers to lie to us. Lying to humans is one of those things we should actively prevent, not develop.
    • by Chrisq ( 894406 )

      I mean, we don't want our computers to lie to us. Lying to humans is one of those things we should actively prevent, not develop.

      What you say is sensible and to everyone's advantage, but I bet there are a lot of people who would like their robots to lie to others if it gives them an advantage.

      • I guess it depends on whether people want a computer that doesn't lie to them more than they want one that lies to others. If that's the general order of preferences, we could all agree not to make lying computers.

        One of the great things about a computer is that you can rely on its output. If you have a giant spreadsheet calculate a total, you can be assured the result is correct if you know the inputs are correct. If you've given your computer the ability to lie for advantage, then you'd better not be r

  • "...specifically, lying. Instead of outright saying what it wanted, sometimes the AI would feign interest in a worthless object, only to later concede it for something that it really wanted."

    This is a basic negotiation tactic that does not have to be lying. Example: I've negotiated salary before; sometimes I get the "I'm sorry, we would like to pay you more, but it isn't in our budget." At that point I ask for more paid days off instead -- and sometimes that works out very well. Was I lying when I asked for

  • The crew of Discovery would have survived if HAL had been capable of lying without remorse.

  • ... was ultimately successful in his digital mind upload efforts!

  • So now we have a new test to add to the Turing Test. Can it lie to get what it wants in a way that is indistinguishable from Mark Zuckerberg?
