AI Businesses

Citigroup Mandates AI Training For 175,000 Employees To Help Them 'Reinvent Themselves' (fortune.com) 35

Citigroup has rolled out mandatory AI training for all 175,000 of its employees across 80 locations worldwide, a sweeping initiative that CEO Jane Fraser describes as helping workers "reinvent themselves" before the technology permanently alters what they do for a living.

The $205 billion bank sent out an internal memo last year requiring staffers to learn prompting skills specifically. Fraser told the Washington Post at Davos that AI "will change the nature of what people do every day" and "will take some jobs away." The adaptive training platform lets experts complete the course in under 10 minutes while beginners need about 30 minutes. Citi reported last year that employees had entered more than 6.5 million prompts into its built-in AI tools, and Q4 2025 data shows a 70% adoption rate for the bank's proprietary AI tools.
This discussion has been archived. No new comments can be posted.

Citigroup Mandates AI Training For 175,000 Employees To Help Them 'Reinvent Themselves'

Comments Filter:
  • Ok, what's wrong with this?

    • Oh, nothing. Everyone loves a mandatory training.
      • Re:Hm (Score:5, Insightful)

        by Junta ( 36770 ) on Tuesday January 27, 2026 @05:42PM (#65953266)

        Perhaps a better question is 'why is it newsworthy?'

        All these companies push all sorts of mandatory training. It's only vaguely indicative of what they *might* think needs to be taken seriously, or where they need to show they care, about things like legal compliance, harassment, discrimination, privacy, or security.

        It's hilarious when those trainings include knowledge-check questions that boil down to "what bad thing is *technically* ok for you to do?"

        • Hardly newsworthy. Likely just about any white collar company over 100 people has offered or made mandatory some sort of AI training by this point.
          • "The adaptive training platform lets experts complete the course in under 10 minutes while beginners need about 30 minutes."

            So, will the time to complete and error rates be used in evaluating employees for merit pay increases, promotions and retention?

      • Oh, nothing. Everyone loves a mandatory training.

        Who said mandatory training is to be loved? It's an essential part of working life. Not something worth putting on Slashdot.

        Yeah I was enrolled in mandatory CoPilot training too. Whoop de fucking do. We do training for various company tools all the time, some of them turn out to be useless.

        • Re:Hm (Score:4, Interesting)

          by hobbes75 ( 245657 ) on Tuesday January 27, 2026 @06:01PM (#65953310)

          I had a similar training recently and used an AI browser extension to do it for me. It was significantly slower than doing it by hand, but surprisingly it mostly worked. It also used a lot of tokens and "AI" (which leadership seems to aim for).

        • Oh, nothing. Everyone loves a mandatory training.

          Who said mandatory training is to be loved? It's an essential part of working life. Not something worth putting on Slashdot.

          Yeah I was enrolled in mandatory CoPilot training too. Whoop de fucking do. We do training for various company tools all the time, some of them turn out to be useless.

          I see what you're saying, but the article frames this as something bigger than just another new tool to be learned. And it sounds like Citigroup management thinks that way too.

          I wonder if it is similar to the introduction of computers on the desk for office workers.

          I think this transition happened sometime in the 1990s. In the 1980s, only computer programmers had computers on their desk, with the occasional exception of specialised work like CAD (Computer Aided Design) which was just starting to come in.

            Everything is something bigger; articles overhype things. Management forced everyone to do mandatory Excel training not because they liked wasting hours but because they hoped efficiency and computer proficiency would produce more work output with less staffing cost. This has literally been the status quo since the invention of the production line.

            In management's view everything will revolutionise the world.
            In the media's view everything is unique and newsworthy. ...

            News at 11.

    • by sjames ( 1099 )

      Can't wait until the AI mixes up the Zimbabwe dollar and the U.S. dollar in a transaction...

    • They are training their replacement.

      Just a wild ass guess though, but knowing efficiencies are always driven from the top, this is the primary likelihood.

    • Nothing. Look, the AI reaper is going to come with the sickle eventually. Citigroup is saying get on board or eventually get off-boarded.

      Making the training voluntary is the worst option. They could do nothing and let the reaper start swinging. That would be the cleanest. They could make the training mandatory, which they have, and when the time comes it will be about performance or adherence to expected standards. But voluntary training traps them in an unpleasant limbo of refusal, expectation negotiation,

      Well, who doesn't want to reinvent themselves? Don't you?
  • by Pseudonymous Powers ( 4097097 ) on Tuesday January 27, 2026 @05:58PM (#65953296)
    I fear what these employees will be reinventing themselves as is drones with ever less expertise, who can be ever less helpful, and who will eventually devolve into nothing but intermediary impediments between the customer and the chatbot they never wanted to talk to in the first place.
  • And once it gives you BS data to a seemingly correct input, then you can use that as justification to commit dollarcrime.

    • by Vrallis ( 33290 )

      I work for a company damn near the same size as Citigroup, and while there definitely wasn't mandatory training for AI tools, it's heavily encouraged (as it is in every large business, especially in tech businesses).

      Simple coding project, 600 lines of code. ChatGPT can't handle it. Internal support had me open a support ticket directly with OpenAI, and after many many emails back and forth (and easily demonstrating the issue repeatedly) they basically threw their hands up and gave up, saying they just g

      • by abulafia ( 7826 )
        I work for a similarly-sized moneycorp.

        I guess we're less top-down than some peers, at least about this stuff.

        But we have a mostly free hand to use LLMs, or not. There has been vague encouragement to experiment with them, and the data security policy of course applies, but otherwise, no mandates or even heavy-handed suggestions.

        I think the main use here is as a coding assistant, but engineers are expected to support/talk about/defend the code they check in, and the way we work enforces that.

        We're buil

      • Your company had you open a support ticket because ChatGPT couldn't write some code? I'm pretty sure ChatGPT isn't provided with any kind of guarantee or warranty that it will do that. Much like Google search, it's meant to be a tool to assist, and a human should vet that its output is valid for the use case before throwing it into production.
      • Genuine question: Would ChatGPT still be useful if you broke up the 600 line project into more bite-sized tasks?

        Even the task of breaking up the project into sub-tasks could itself be given to ChatGPT. ("We need to write software for xxx. Can you first provide a breakdown of xxx? I want a list of sub-tasks that are reasonably self-contained.")
        Then each sub-task (presumably less than 600 lines) could be given to ChatGPT to code.

        Do you think this approach might work?

        Personally, I have found ChatGPT useful as a
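        For what it's worth, the decomposition workflow being suggested is easy to sketch. This is a minimal Python sketch under assumptions: `ask_llm` is a hypothetical placeholder for whatever chat API you actually call (here stubbed with a canned reply), and the prompts are illustrative, not a recommendation.

```python
# Sketch of the "break the project into sub-tasks" approach.
# ASSUMPTION: ask_llm is a hypothetical stand-in for a real LLM call;
# here it just returns a canned numbered list so the sketch is runnable.
def ask_llm(prompt: str) -> str:
    return "1. Parse input\n2. Validate records\n3. Write report"

def decompose(project_description: str) -> list[str]:
    """Ask the model for a numbered list of self-contained sub-tasks."""
    reply = ask_llm(
        f"We need to write software for: {project_description}. "
        "Provide a numbered list of reasonably self-contained sub-tasks."
    )
    tasks = []
    for line in reply.splitlines():
        line = line.strip()
        if line and line[0].isdigit():
            # Drop the leading "N." numbering, keep the task text.
            tasks.append(line.split(".", 1)[1].strip())
    return tasks

def implement(task: str) -> str:
    """Ask the model to code one sub-task, each well under 600 lines."""
    return ask_llm(f"Write a small, self-contained function that does: {task}")

subtasks = decompose("a 600-line reporting tool")
snippets = [implement(t) for t in subtasks]
```

        Whether the per-task outputs actually compose into a working program is the open question the parent comment is asking about; the human still has to do the integration and review.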

    • Your job as an employee being forced to use AI tools is to feed it bogus garbage data. That way when they eventually try to use it to replace you it royally blows up in their face.
  • You type extended prompts into the LLM then wade through masses of verbiage, which either hallucinates or contradicts itself in the same text. When you point this out, the LLM asks if you want to speak to a mental health professional.
  • I've trained my replacement before.
  • by SlashbotAgent ( 6477336 ) on Tuesday January 27, 2026 @06:22PM (#65953372)

    Oh, Citigroup, you absolute trailblazers of the corporate cosmos! Nothing screams "reinvent yourself" quite like forcing 175,000 people to sit through a glorified crash course in typing clever sentences into a chatbot. Truly, the Renaissance has arrived—except instead of Michelangelo sculpting David, we've got Bob from accounting learning how to ask AI for a pivot table in 30 minutes flat.

    And let's not forget the memo—because nothing inspires innovation like a sternly worded email reminding you that your future depends on mastering the art of prompt engineering. "Adapt or be replaced," says Jane Fraser at Davos, presumably while sipping champagne and nodding gravely about how the robots are coming for your job.

    Experts breeze through in under 10 minutes, beginners slog for 30—because apparently the difference between mastery and mediocrity is just 20 minutes of corporate e-learning. Meanwhile, Citi proudly reports 6.5 million prompts entered, which is basically the digital equivalent of shouting "Hey Siri" into the void until something useful happens.

    And that 70% adoption rate? Bravo! Nothing says success like a majority of employees reluctantly poking at proprietary AI tools while wondering if their "reinvention" will end with them being replaced by the very thing they just trained to use.

  • If your mandatory AI course is ten minutes long, and your mandatory sexual harassment training is two hours long, you might have a problem.

  • It's the same thing when companies insist that users must use the documentation, cite the documentation, reference the documentation, and add their notes to the documentation, and it has to be in a way that someone can completely follow what you did even if they aren't technical.

    So that your time and skills are guaranteed to add training documentation the company can use to train future employees at a reduced cost. It's the same strategy with AI. If they want to develop the tool, they need everyone to

  • I'm surprised it's only 10-30 minutes. The mandatory AI training at my job was 8 hours, unless you skipped large sections of the videos.

    The training I was subjected to wasn't just about how to converse with chat bots, but training you to do missionary work in spreading the AI gospel. There were multiple, graded (by AI) role-plays (with AI), where you have to extol the virtues of the software and convince your skeptical (AI) co-worker to use AI.

    It was all very meta. Wait, not that Meta. Wait, yes, that Meta

  • So Citigroup is dumbing down its employees. Here's what *.ai use buys: *.ai addicts are 'cowed' into behavior, have poorer memory, stop intuiting, have a lower ability to integrate disparate information, stop using geometric analogs, forget how to pattern-match, and lose back-of-envelope O(n) maths skill.

    Clearly the *.ai addict "progresses" from a thinker to an ant -- from yeoman to peon; just what anti-ableist SJWs, C-Suites & investors want. Enjoy. Of-cour
  • The training takes 10-30 minutes? This isn't training in any sense of the word. It's not for the benefit of the worker, nor is it so Citigroup can benefit from their workers using the new tools. The sole purpose of this (and most other corporate "training" programs) is so companies can tick off a checkbox somewhere. Citigroup doesn't care if their employees know or use AI; they only care that they can tell someone (maybe the government, maybe a big customer, maybe their own board) that they're hip to the late

  • This is essentially the same as saying everybody needs to get trained now in how to use spreadsheets in like 1985.

"The Mets were great in 'sixty eight, The Cards were fine in 'sixty nine, But the Cubs will be heavenly in nineteen and seventy." -- Ernie Banks
