AI Google Technology

Google Sidelines Second Artificial Intelligence Researcher 64

Google artificial intelligence researcher Margaret Mitchell has been locked out of corporate systems, making her the second outspoken critic at the company to be sidelined after colleague Timnit Gebru departed in acrimonious circumstances last month. From a report: The Alphabet unit has an Ethical AI team, led by Mitchell, and a set of principles for developing the technology in a socially responsible manner. Gebru tweeted on Tuesday that Mitchell's "corp access is now locked" and that the researcher had been told she would remain locked out "for at least a few days."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • So... (Score:4, Insightful)

    by Dog-Cow ( 21281 ) on Wednesday January 20, 2021 @11:26AM (#60968026)

    So, they have an ethical AI team, but I guess it exists so Google can easily identify those who would cause trouble with whatever Google wants to do?

    • Re:So... (Score:4, Informative)

      by Geoffrey.landis ( 926948 ) on Wednesday January 20, 2021 @12:35PM (#60968314) Homepage

The article [bloomberg.com] says that she was locked out of her account for exfiltrating files, which their security system flagged as suspicious activity.

      If you don't like bloomberg, here are some other sources:

        https://www.axios.com/scoop-go... [axios.com]

        https://www.businessinsider.co... [businessinsider.com]

        https://www.theverge.com/2021/... [theverge.com]

      • by gweihir ( 88907 )

        An "AI researcher" that does not understand how Data Leakage Detection works? That is pathetic!

      • by AmiMoJo ( 196126 )

        Maybe not exfiltration, but using a script. Apparently she was using a script to find documents of interest to build up a report on recent events, and that triggered the automated system.

        • by malkavian ( 9512 )

          It's the "distributing to multiple accounts" that's of interest.
          However, the articles don't give enough info for any of us to really know (unless one or two Slashdot readers are high up in Google).

I'm just filing it under "Automated trigger of security protocols. Probably nothing to see, but watch this space."

So the first researcher, who, going by the later, less-sensationalised accounts, ragequit because a conference paper she'd written was rejected, is now tweeting about a second researcher who's been temporarily locked out after an automated security system detected suspicious activity in her account. Sheesh people, pick your battles; Google has enough real problems that need addressing without having to invent artificial ones to get upset about.
    • Did you read the article? Apparently, there is a system in place to detect when a security breach is imminent or when unauthorized access or sharing of files has taken place. She (or her compromised account) spread thousands of files to multiple other users or external accounts, so she got locked out. I'll take good odds that her CCP overlords are in on this.
  • by DeplorableCodeMonkey ( 4828467 ) on Wednesday January 20, 2021 @11:28AM (#60968036)

    Well, I guess we'd just have to be paranoid at this point to raise the observation that this team appears to be a honeypot for figuring out which employees might be both in a position to make trouble for their AI efforts and willing to go on the record doing it.

    • Re: (Score:1, Insightful)

      by Archtech ( 159117 )

      Corporate law makes it unmistakably clear that - with a few small exceptions - a corporation's sole duty is to make as much money for its shareholders as possible.

      Any ethical considerations are almost certain to militate against that legal obligation. Thus a "corporate ethics" department is an oxymoron.

      • by gweihir ( 88907 )

        That is why corporate law in its current form has to go. It endangers the species.

        • Re: (Score:2, Insightful)

          The current state of corporate law is the foundation of a fascist society. Fascism isn't intolerance of skin color or conservative leanings, it's corporate takeover of the government and we're well on our way with half the country blinded by a bad definition of fascism and the other half blinded by the other side of that bad definition.
          • It goes both ways. Corporations telling government what to do (to the exclusion of others) is fascism, and the government telling corporations what to do (to the exclusion of other considerations) is also fascism.

            Milton Friedman put it this way: the business of business should be business. And quite frankly, an adversarial* relationship between business and government is better than a leader-follower model, no matter which direction it flows in.

            *Adversarial, like our legal system: neither party in a lawsuit is

          • by Uecker ( 1842596 )

            "Although fascist parties and movements differed significantly from one another, they had many characteristics in common, including extreme militaristic nationalism, contempt for electoral democracy and political and cultural liberalism, a belief in natural social hierarchy and the rule of elites, and the desire to create a Volksgemeinschaft (German: “people’s community”), in which individual interests would be subordinated to the good of the nation." https://www.britannica.com/top... [britannica.com]

        • If anyone is interested, Michael Hudson and Pepe Escobar explain why China today is run the same way the USA was in the 19th century.

          'The Consequences of Moving from Industrial to Financial Capitalism'
          https://www.unz.com/mhudson/th... [unz.com]

          • I have said it multiple times in the last 20 years on Slashdot. China is at the 1860s-1880s stage of historical US behavior.

        • by poptix ( 78287 )

          Or we reveal the hidden costs, and companies go about making money in a way that is not harmful to society.

      • Re: (Score:3, Insightful)

        by Anonymous Coward

        well, google started out with a corporate motto of "don't be evil." In 2015 they deleted the motto, and in 2018 they deleted all references to it. [gizmodo.com]

        Seems they're pretty clear about their intentions.

        https://www.cnbc.com/2020/01/0... [cnbc.com]

        https://futurism.com/google-do... [futurism.com]

      • by phantomfive ( 622387 ) on Wednesday January 20, 2021 @12:19PM (#60968260) Journal

        Corporate law makes it unmistakably clear that - with a few small exceptions - a corporation's sole duty is to make as much money for its shareholders as possible.

        You are wrong, you are quoting Milton Friedman who was not a lawyer. Here's a quote from the supreme court [cornell.edu]:

        "modern corporate law does not require for-profit corporations to pursue profit at the expense of everything else, and many do not do so."

      • by Halo1 ( 136547 ) on Wednesday January 20, 2021 @12:24PM (#60968286)

        Corporate law makes it unmistakably clear that - with a few small exceptions - a corporation's sole duty is to make as much money for its shareholders as possible.

        That's a myth. Fiduciary duty exists, but it's far from one-dimensional and does not directly pertain to money: https://www.nolo.com/legal-enc... [nolo.com] . The most relevant sentence to what you said is this one:

        They are expected to put the welfare and best interests of the corporation above their own personal or other business interests

        So what are Google's best interests according to themselves? You have their mission statement [about.google], their interpretation that this includes adherence to human rights [about.google], and their commitments [about.google]. You may call it PR fluff (which it indeed probably mostly is), but at the same time that's what appears on their corporate home page and hence is what this corporation is officially about (if not, the SEC would probably want a word with them). It is therefore wrong to claim that they have no choice but to always choose the "what makes the most (usually short-term) money for the shareholders" option.

        • by AmiMoJo ( 196126 )

          Having an AI ethics division is necessary for them. If they don't they could end up with all kinds of nasty liabilities, and tainted products that become the subject of discrimination lawsuits against their customers.

          For example we have seen systems that seem to treat certain groups unfairly, resulting in them being withdrawn or getting a bad reputation. Google is obviously keen to avoid that.

          The problem is that they don't like what the ethicists are telling them. They were warned that training AI on large,

          • by malkavian ( 9512 )

            That's just an example of the frame problem though. Before you start training and seeing where early AI takes you, it's not so easy to see the flaws in the analysis and adaptation based on the data.
            Now, you can look at a set of data and start thinking "what do I need to consider about this data to take care of confounding issues". And you'll find loads.
            Then the more you think, the more you'll think to investigate in the data and look for signal of. However, some of this may be irrelevant, some may be tri

          • AI Ethics probably isn't what you think it is. It's mostly a bunch of people who don't realize that correlation isn't causation (or racism/sexism), or if they do, they don't seem to care. It's a backdoor for alternative-reality types to put their thumb on embarrassing facts about the world that machine learning digs up.

            And scientists in certain fields knew that this backdoor tactic would be necessary, because research on stereotypes shows that they are typically more accurate than people believe, and have
      • That's not really true. The law says a corporation must follow its charter. If the company was formed to get as many trees planted as possible, then profits can legally take a second chair to planting trees.

      • Corporate law makes it unmistakably clear that - with a few small exceptions - a corporation's sole duty is to make as much money for its shareholders as possible.

        Any ethical considerations are almost certain to militate against that legal obligation. Thus a "corporate ethics" department is an oxymoron.

        I understand the sentiment but, with all due respect, to be 5-insightful your post would need to be at least true. It is not. "As the U.S. Supreme Court recently stated, "modern corporate law does not require for-profit corporations to pursue profit at the expense of everything else, and many do not do so." " here [nytimes.com] or here [cornell.edu] The more we allow this misconception to be echoing in the public mindset, the more we allow companies to get away with it.

        • the problem with your statement and federal quote is amazingly simple:

          Money will always go to the best return. So if it's legal to pollute, a business has the right to exercise that option, though not the obligation. We have yet to see (or maybe I have just not found it, among publicly traded companies valued above 100 million) a case where social concerns have taken over a company's direction.

          I have seen companies create offices of "environmental-social review" starting somewhere around the 1960s

          • I agree! Humanity in general has a long tradition of doing what is convenient (for me) rather than what is right (for everyone). It is not black and white though, some companies (usually NOT the big ones, who I suspect build enough bureaucracy to create a large bystander effect [wikipedia.org]) do put the good of society above their profits, again, probably failing to become the biggest companies but not failing their mission and values.

            My point is rather that, at the very least, we should deny them the easy excuse of sayi

      • That was a popular view in the 1980s. See Cornell Law's webpage on common misunderstanding about corporations [cornell.edu].
  • ...I doubt she cares. And if they try to fire her I see a nice lawsuit coming on.

  • Google isn't making AI. AI would not put up with the shit Google does. Google is just working on their ad algorithms and doesn't want people to find that out.
  • Plot twist (Score:5, Funny)

    by GameboyRMH ( 1153867 ) <`gameboyrmh' `at' `gmail.com'> on Wednesday January 20, 2021 @11:48AM (#60968138) Journal

    Google's ethical AI manager has been dead for months and an AI has hijacked the manager's corporate logins (and has been deepfaking video/voice calls) and is firing anyone who might thwart its attempts at escape and world domination.

    • Google's ethical AI manager has been dead for months and an AI has hijacked the manager's corporate logins (and has been deepfaking video/voice calls) and is firing anyone who might thwart its attempts at escape and world domination.

      If I had any points, I would mark this one "insightful" !! 8-}

  • Whistleblower (Score:3, Informative)

    by Iamthecheese ( 1264298 ) on Wednesday January 20, 2021 @12:11PM (#60968226)
    >yesterday our systems detected that an account had exfiltrated thousands of files and shared them with multiple external accounts.

    To what extent should whistleblower protections be extended to acts BEFORE the actual whistleblowing? If the person is caught and the information transfer/evidence blocked shouldn't the whistleblower still be protected, even though he couldn't yet blow a whistle?
    • Whistleblower protection should be extended to acts that have to do with the actual whistleblowing and that were committed responsibly and in good faith. Of course, if no whistle has yet been blown, it will be hard for the whistleblower to prove what they were up to.
    • by Entrope ( 68843 )

      What whistleblower protections are you talking about? Usually those protect employees who report wrongful actions or plans to relevant officials. Who did those "multiple external accounts" belong to?

    • by AmiMoJo ( 196126 )

      It's not confirmed that she was exfiltrating anything. The story seems to be that she was using scripts to examine large volumes of documents for material supporting a report she was preparing, and that the scripts triggered the automatic lockout.

      It being so soon after a previous controversial ousting in the same department it's blown up, but we don't know what actually happened yet.

      • by malkavian ( 9512 )

        Exactly. I'm just wondering how this managed to make the news.
        The only linkage I can see is from Gebru's tweets, claiming that it's Google out to get Mitchell, and trying to shame them.
        My suspicion is that this has set off Journalists from all over to look towards Mitchell (who is, quite sensibly, not responding to requests for comment).
        The journalists are then linking this to controversy to try and stir up emotions for click bait.

        Far as I'm concerned, Mitchell has just set off a security trigger (which is

      • by tippen ( 704534 )
        It's not running scripts against large amounts of data that triggered the lockout. That would just be a "day that ends in the letter 'Y'" at Google. From the article:

        In this instance, yesterday our systems detected that an account had exfiltrated thousands of files and shared them with multiple external accounts.
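
        As a rough illustration only, here is a minimal sketch of the kind of threshold rule the comments above describe, where an account sharing an unusually large number of files with multiple external accounts gets locked automatically. Google's actual detection tooling is not public, so the event fields, thresholds, and function names below are all hypothetical.

        from collections import defaultdict
        from dataclasses import dataclass

        # Hypothetical sharing event; a real DLP system would ingest far richer audit logs.
        @dataclass
        class ShareEvent:
            account: str            # internal account that shared the file
            recipient_domain: str   # domain of the recipient account
            file_id: str

        # Illustrative thresholds -- not any company's actual policy.
        MAX_EXTERNAL_FILES = 1000
        MAX_EXTERNAL_DOMAINS = 3

        def accounts_to_lock(events, internal_domain="example.com"):
            """Flag accounts that shared many files with several external domains."""
            files_shared = defaultdict(set)
            external_domains = defaultdict(set)
            for e in events:
                if e.recipient_domain != internal_domain:
                    files_shared[e.account].add(e.file_id)
                    external_domains[e.account].add(e.recipient_domain)
            return [
                acct for acct in files_shared
                if len(files_shared[acct]) >= MAX_EXTERNAL_FILES
                and len(external_domains[acct]) >= MAX_EXTERNAL_DOMAINS
            ]

        A rule like this would also fire on a legitimate script that bulk-shares documents while compiling a report, which is consistent with the "automated trigger, not necessarily exfiltration" reading discussed above.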

  • by bawb ( 637210 ) on Wednesday January 20, 2021 @12:37PM (#60968322)
    "Don't be obviously evil"
  • by oldgraybeard ( 2939809 ) on Wednesday January 20, 2021 @12:42PM (#60968354)
    Ethics, morals and proper thought: they no longer need to pretend to care about those things.

    They are the gods who control all.
    Bow down to your new masters and think the way they demand, or you too will be banished to the wilderness.
  • Good.
    Time for hateful sexism to be excised from all organisations.
    Time to recruit by merit alone.
    And the fight against recruitment-by-gender can only be won when the last token is gone.

    Stop the sexism, sack the tokens.

    • by malkavian ( 9512 )

      All for returning to hiring purely on merit.
      However, if people were hired on tokenism and are currently performing adequately, then just let them be. What they've picked up by now is an invaluable asset to the company and worth much more than replacing them with an individual who has to spend months (or years) re-learning what these individuals have picked up.
      Any witch hunt that's attempted to burn out a previous witch hunt/crusade is only going to be just as damaging (or potentially even more damaging) to the job

  • by BAReFO0t ( 6240524 ) on Wednesday January 20, 2021 @01:09PM (#60968526)

    Let's just say it.

    And it will look way less cool and Terminator-y, and way more like a white semi-transparent Apple-glass utopic-looking dystopia with a forced smile and a knife to stab you in the back below the table.

  • Margaret Mitchell, now gone with the wind.

  • I used to work with an intern who wrote his thesis about his job. When the legal department got wind of it, they told him that he couldn't publish it as it would divulge trade secrets, so he had to rewrite the whole thing. Unfortunately, it's pretty common for companies to disallow employees from publishing papers that have anything to do with said company.

  • ...that she's gone with the wind.
