Tech CEOs Suddenly Love Blaming AI For Mass Job Cuts (bbc.com) 66
An anonymous reader quotes a report from the BBC: Sweeping job cuts at Big Tech companies have become an annual tradition. How executives explain those decisions, however, has changed. Out are buzzwords like efficiency, over-hiring, and too many management layers. Today, all explanations stem from artificial intelligence (AI). In recent weeks, giants including Google, Amazon, Meta, as well as smaller firms such as Pinterest and Atlassian, have all announced or warned of plans to shrink their workforce, pointing to developments in AI that they say are allowing their firms to do more with fewer people. [...] But explaining cuts by pointing to advances in AI sounds better than citing cost pressures or a desire to please shareholders, says tech investor Terrence Rohan, who has had a seat on many company boards. "Pointing to AI makes a better blog post," Rohan says. "Or it at least doesn't make you seem as much the bad guy who just wants to cut people for cost-effectiveness."
That does not mean there is no substance behind the words, Rohan added. Some of the companies he's backing are using code that is 25% to 75% AI-generated. That is a sign of the real threat that AI tools for writing code represent to jobs such as software developer, computer engineer and programmer, posts once considered a near-guarantee of highly paid, stable careers. "Some of it is that the narrative is changing, some of it is that we really are starting to see step changes in productivity," Anne Hoecker, a partner at Bain who leads the consultancy's technology practice, says of the recent job cuts. "Leaders more recently are seeing these tools are good enough that you really can do the same amount of work with fundamentally less people."
There is another way that AI is driving job cuts -- and it has nothing to do with the technical abilities of coding tools and chatbots. Amazon, Meta, Google and Microsoft are collectively planning to pour $650 billion into AI in the coming year. As executives hunt for ways to try to ease investor shock at those costs, many are landing on payroll, typically tech firms' single biggest expense. [...] Although the expense of, for example, 30,000 corporate Amazon employees is dwarfed by that company's AI spending plans, firms of this size will now take any opportunity to cut costs, Rohan says. "They're playing a game of inches," Rohan says of cuts at Big Tech firms. "If you can even slightly tune the machine, that is helpful." Hoecker says cutting jobs also signals to stock market investors worried about the "real and huge" cost of AI development that executives are not blithely writing blank cheques. "It shows some discipline," says Hoecker. "Maybe laying off people isn't going to make much of a dent in that bill, but by creating a little bit of cashflow, it helps."
BS (Score:5, Insightful)
Re:BS (Score:5, Informative)
The annoying thing is that the "AI" seems to stand for Actually India, as they're replacing US developers and testers with foreign contractors when the AI automation doesn't work out as planned. It's really the same outsourcing effort that we saw in the 2000s and 2010s, with a better cover story.
Re:BS (Score:4, Insightful)
My experience has been slightly different. AI seems to be leveling the playing field, much like Google Maps flattened the field and allowed all and sundry to become cab drivers in any city: knowing all the city routes, traffic choke points, etc. is no longer a "skill". I am now seeing previously clueless Indian developers delivering not-so-bad software solutions, obviously with copious help from AI (Claude Code in our specific case). The result? More software development/testing jobs are moving to India, thanks to AI. Anyone retiring or leaving the org for whatever reason is being replaced by an Indian, while there is a hiring freeze in the western hemisphere (thankfully no layoffs yet).
Re:BS (Score:5, Interesting)
The CEOs of these companies are trying to justify inflated stock prices that were high based on the expectation of future growth.
No, CEOs are trying to show their board, investors, and activist investors that they have a plan for how to take advantage of AI and can at least keep up with their competitors' use of AI, if not surpass them. I work at a large enterprise (close to 50k employees), and VPs are being told that they need to find ways for AI to have an impact on their department, or their leaders will find someone who can. If it isn't happening fast enough, consultants are brought in to take over the department's transformative roadmap, and leaders who can't keep up are relegated to being SMEs until they are eventually replaced. I'm not in the room when that message is given, but I've seen VPs who were raising alarms turn into AI cheerleaders nearly overnight.
If you work for a publicly traded or VC backed company I assure you your CEO does not have a choice on whether to jump on the AI bandwagon. That's not how hype driven bubbles work.
1% personality type tap dance for Wall Street (Score:2)
It is the same old, same old that corporate leaders use to 1) sell the infinite-growth fantasy, 2) keep being employed, and 3) hide that most companies in the S&P 500, excluding the mega caps, have had top-line sales revenue growth below inflation (a decline in inflation-adjusted terms) over the last few years.
US companies have for 20 years followed a business-damaging plan: exist by cutting costs (layoffs), selling the same old products, and buying back stock instead of developing new products.
Un
look at the narrative (Score:2)
Once a few companies successfully blame their lack of revenue growth and profitability growth on "economic topic X", many leaders at other companies follow suit, using the same "blame economic topic X" line as cover for lackluster business performance.
Wall Street analysts ask softball questions during earnings calls so that their Wall Street firm can get new business from the company being covered.
One of the indicator questions from a Wall Street analyst is "Can you provide more color on X?" asked by
Re: (Score:2)
Your description sounds exactly like a hype driven bubble - No one has a clue but the competition is doing this thing therefore we have to too.
Re: (Score:2)
The CEOs of these companies.
It constantly demonstrates that there has been a dearth of competence at the management level of IT for a long time.
They probably had incompetent people anyway... (Score:1)
75% AI-generated code suggests lots of boilerplate...
which suggests that neither developers nor management cared about proper abstraction layers...
Re: (Score:3)
Based on some codebases I've seen...
AI slop can be bad, but has *nothing* on the closed source codebases I've seen for low quality slop.
Re: (Score:3)
>> the closed source codebases I've seen for low quality slop
Same here, most established companies have a gigantic amount of legacy software written long ago by people who left. Little to no documentation and a rickety, brittle structure that is always teetering on failure. At least with AI you can make it document the code it writes and the architecture.
Fortunately you can also tell it to evaluate and document the legacy code. Very helpful.
Re: (Score:3)
At least with AI you can make it document the code it writes and the architecture.
Who was it who said that no documentation was better than incorrect documentation? Aside from you in the not-too-distant future, I mean.
Seriously, this is right up there with silly nonsense like "just have AI write the tests as well". It's like you hate your future self and want them to suffer.
Fortunately you can also tell it to evaluate and document the legacy code.
Yes, I suppose you can tell it to do that ... it's just not something that it can actually do.
Very helpful.
Is it really? Every AI fluffing article we've seen that makes those claims has, after even cursory examination, turned o
Re: (Score:2)
Well, in one respect it is 'very useful'. Executive direction that the legacy codebase must be 'documented' fully. Poof, it is 'documented'. Is it correct? Who knows; no one will ever read it, but it fluffs the executives' "thought leadership". It's the compromise between 'port the code', which is a risk no one will take, and 'document the code to prepare for a porting effort that will never come'.
Just be careful to keep the LLM vomit clearly distinguished from actually curated documentation, lest some naive pe
Re: (Score:3)
>> Executive direction that the legacy codebase must be 'documented' fully
I'll assume you don't work in the profession; the execs usually don't care or even know. If they had wanted documentation, they would have emphasized it when the code was being written, or at least shortly afterwards. Software rots over time, and if it's a valuable internal application, someone will eventually have to fix it. I've been paid very good money at times to do that job.
Sometimes the first task is to determine where the so
Re: (Score:2)
Yeah I have seen this whining from you previously. Clearly you don't use this tech and all you know about it is " Every AI fluffing article we've seen".
Re: (Score:2)
Clearly you don't use this tech
That's because I know better. You'll figure it out someday as well. It's a little surprising you haven't figured it out already.
I suspect you might know already, even if you don't want to admit it. That's why you're attacking me instead of addressing my claim.
There are other dangers to using AI far worse than just some shoddy code and sketchy documentation, such as cognitive atrophy and loss of brain plasticity, that should concern you as well.
There is also a growing body of evidence that calls the produ
Re: (Score:2)
>> That's because I know better.
You obviously don't know a damn thing, lol.
Re: (Score:2)
In many companies, the worse the documentation is, the better. If docs are good, the dev gets replaced. If the dev's code can't be fixed and it is critical, he will remain onboard, because it would cost the company more to fire him and get 2-3 cheaper devs.
At previous jobs I've seen devs have two Git servers. Their personal one on their desktop or build box, and the one used for their check-ins. They run an obfuscation script before their pushes that turns variable names into variants of 1, I, 0, O, l.
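A minimal sketch of what such an obfuscation pass might look like, purely as illustration of the anecdote above. The function names, the glyph set, and the regex-based renaming are all assumptions, not the actual script the poster describes:

```python
import itertools
import re

# The look-alike glyphs the poster mentions: digit one, capital I,
# digit zero, capital O, and lowercase L.
AMBIGUOUS = "1I0Ol"

def ambiguous_names():
    """Yield an endless stream of unique names built from look-alike glyphs."""
    for length in itertools.count(6):
        for combo in itertools.product(AMBIGUOUS, repeat=length):
            name = "".join(combo)
            if not name[0].isdigit():  # identifiers can't start with a digit
                yield name

def obfuscate(source: str, identifiers: list[str]) -> str:
    """Replace each listed identifier with a unique ambiguous name."""
    names = ambiguous_names()
    mapping = {ident: next(names) for ident in identifiers}
    for ident, new in mapping.items():
        source = re.sub(rf"\b{re.escape(ident)}\b", new, source)
    return source

print(obfuscate("total = price * quantity", ["total", "price", "quantity"]))
# → I11111 = I1111I * I11110
```

A real tool would parse the code rather than use word-boundary regexes, but even this toy version makes the result effectively unreviewable by anyone but its author.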
Re: (Score:2)
They run an obfuscation script before their pushes that turns variable names into variants of 1, I, 0, O, l. They never were the ones laid off.
Yeah, they weren't laid off because they were fired for cause first.
Re:They probably had incompetent people anyway... (Score:4, Interesting)
AI-generated code is just this generation's version of copying code from the web or from another part of the codebase. Sometimes that person understood the code fully, and sometimes they just checked to see if the output matched what they expected.
The only uniquely dangerous thing about this recent iteration of that problem is the massive scale.
Re: (Score:2)
The big worry is that AI can pump out a lot of slop. At least it took a tiny bit of thought to copy and paste from Stack Overflow. Now, entire projects can be AI slop that have never seen a debug pass. And this stuff winds up going live.
Who wouldn't use this trick? (Score:3)
You can deceive investors by telling them that the company is getting more efficient, and lay off people with minimal damage. Who wouldn't make this choice? The problem is that it's not going to fix your company, and people will figure it out eventually.
Re: (Score:3)
May not ever 'figure it out'.
A lot of 'leadership' saw "everyone is hiring tech" in the aftermath of the pandemic and so they did, with or without any vision.
This represents a narrative consistent with shedding those people they didn't have business value for. So they end up no more broken than they were in 2019, and it provides a narrative consistent with doing things "right".
Re: (Score:2)
This is why we should have a stakeholder system, and not shareholders. For stakeholders, layoffs damage a company, because institutional memory and talent is lost. For shareholders, it is just a bunch of faceless schmucks replaced by an offshore group for cheaper, so their gains this quarter are great, although they get surprised when the pace of new products slows or halts.
Not me guv (Score:5, Insightful)
It's all because of AI!
Re: Not me guv (Score:1)
It turns out that acquihiring a twenty person company just to get a couple developers and two UX people is a bad idea. Who knew?
Insider perspective: AI helps with amnesia only (Score:5, Interesting)
OK, so what about things that take me minutes?...like writing a unit test? Well, that's my favorite use case for AI. I'd LOVE to see it succeed, but I work primarily in Java, a compiled language, and it is strict about getting things right, so I see the errors immediately. Python users who vibe code just ship bugs and let their users find them. OK, so with enough tries, it barfs out a unit test. It looks pretty good... after all, LLMs are top-notch guessers. Unfortunately, the unit test is completely wrong and useless, so I have to go make it actually test the code instead of testing bean getters and setters and stupid shit like that. The scary part is that it looks good. It looks correct. But it often isn't, so you have to evaluate it line by line.
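To make the complaint concrete, here is a hedged Python stand-in for the Java case. The `Invoice` class and both tests are invented examples, contrasting the vacuous getter-checking test the poster describes with one that actually exercises behavior:

```python
import unittest

class Invoice:
    """Toy example class; any domain object with real logic would do."""
    def __init__(self, subtotal: float, tax_rate: float):
        self.subtotal = subtotal
        self.tax_rate = tax_rate

    def total(self) -> float:
        return round(self.subtotal * (1 + self.tax_rate), 2)

class VacuousTest(unittest.TestCase):
    """The kind of test the poster complains about: it only re-checks assignment."""
    def test_fields(self):
        inv = Invoice(100.0, 0.2)
        self.assertEqual(inv.subtotal, 100.0)  # always passes; proves nothing

class BehavioralTest(unittest.TestCase):
    """What the test should do: exercise the actual computation."""
    def test_total_applies_tax(self):
        self.assertEqual(Invoice(100.0, 0.2).total(), 120.0)
```

Both tests pass and both "look good" at a glance, which is exactly the trap: only the second one would catch a regression in `total()`.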
One of my coworkers is more bullish on AI and introduced over 20 bugs last week with his AI slop, including undoing half my fixes for the week. His boss is considering putting him on a Performance Improvement Plan for his AI use. He's not a dumb guy. He just didn't understand the pieces I worked on, didn't read my docs and comments, and was fooled by the AI when it undid all my code to make his component's test pass. He is in India; he didn't wait for me to review the code and had someone in IST review it who knew even less.
The only powerful use case I've found for them is for scenarios where you need to work with a technology you used to know well, but have forgotten. As a backend software engineer, this would be front-end code, RegExes, obscure stored procedure method calls, etc. For RegExes, I write them maybe 2x a year...so I never am confident of things I write. I can review the code better than I can write it from scratch.
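As a sketch of that "review rather than write" workflow, here is an invented example: an ISO-date regex of the kind an assistant might hand back, annotated so each token can be audited even by someone who writes regexes twice a year:

```python
import re

# Hypothetical assistant output, broken into commented pieces for review.
ISO_DATE = re.compile(
    r"^(\d{4})"                   # year: exactly four digits
    r"-(0[1-9]|1[0-2])"           # month: 01-12
    r"-(0[1-9]|[12]\d|3[01])$"    # day: 01-31 (no per-month validation)
)

print(bool(ISO_DATE.match("2024-02-29")))  # True — doesn't check month lengths
print(bool(ISO_DATE.match("2024-13-01")))  # False — month out of range
```

Reviewing it line by line surfaces the limitation (Feb 30 would pass), which is much easier than composing the alternation for days from scratch.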
If you've never used a technology, the code is unreliable. At best, it might save you some time learning. For example, if I had to write something in C#, a language I barely touched 20 years ago, it might double the ramp-up time, but I'd still have to spend a lot of time learning the fundamentals of the language. It would take the place of a really well written book...helpful, but not a game changer.
The point being... AI doesn't tangibly save time. It might save a bit under some circumstances, but not enough to justify layoffs. The CEOs are full of shit. They're AI-washing routine layoffs. Either they overhired, or they wanted to shut down products and features, or they wanted to get rid of dead weight... but apparently it's more fashionable to overtly lie to investors? It baffles me why shareholders haven't filed a lawsuit against Benioff... or any other CEO.
Re: (Score:1)
AI code doesn't have to be perfect. It just has to be better than a modern CS graduate. It is.
Re: (Score:2)
I have no idea what graduates you look at, but this is not the case here (Europe).
No, it does have to be perfect! (Score:2)
Re: (Score:2)
False. There is a minimum standard. If a "modern CS graduate" doesn't meet it then he, and AI, can disappear. A high school graduate can exceed the performance of AI at many things, though.
Re:Insider perspective: AI helps with amnesia only (Score:4)
The point being...AI doesn't tangibly save time. It might save a bit under some circumstances, but not enough to justify layoffs. The CEOs are full of shit.
Pretty much this. LLMs can be convenient, but they are not magic, and that they can make competent coders slower is pretty well established by now.
Re:Insider perspective: AI helps with amnesia only (Score:4, Interesting)
Forgive me, but I'm going to rant some, because this is the only place I can do so.
I've started having to tell my friends to stop talking to me about AI.
Don't get me wrong. I use it. I find it helpful and saves time with stupid scripting tasks, throwing together modals, etc. There's a ton of ways that it helps me be more efficient with my human person job.
But actual work - architecture, design, thinking through a full process...that still requires a human.
What I'm starting to get really freaking irritated at is that everyone talks about AI like it's magic, and all I *hear* is "I couldn't do my job myself, but *now* I think I can!!".
Quit treating the fact that you spent money on Claude credits like some kind of proof of value. If you want to talk to me about something cool you're working on and a problem you had to solve - awesome. If you want to brag about how you spent all day crafting a prompt and then AI did all the work for you, then I kinda just want to punch you in your stupid face.
The one rather depressing bright spot I have is that the owner of the company discovered OpenClaw and managed to set one up (even though he required me to do the really complicated stuff, like signing up for a Twilio account). His LinkedIn posts suddenly got way more articulate and added a ton of graphics, and he is trying to sell people on his new agentic workflow that's running his company. Meanwhile, I know that nothing at all has changed, and that all he's managed to do is have the AI create a post and a graphic and post it.
The "bright" point there is that it finally hit me that that's what literally all of the AI spam in my LinkedIn feed is -- a bunch of other people's bosses in the same boat -- and that real people are still required to do anything of actual, legitimate value.
Re:Insider perspective: AI helps with amnesia only (Score:5, Insightful)
Remember how people used to add up numbers by hand until it became easier to do it with a calculator, and now they can't do math?
One wonders what other skills will atrophy due to AI reliance.
Re: Insider perspective: AI helps with amnesia onl (Score:2)
No, I remember owning a calculator watch with phone directory because I could never remember phone numbers, outside of a few I used all the time. Then I got a cellphone and now I don't need to wear a watch.
Re: (Score:2)
One wonders what other skills will atrophy due to AI reliance.
No wondering is necessary. AI reliance will absolutely kill off any remaining critical thinking you may have. Reality is going to be absolutely and completely miserable for humanity from here on out. Your life will be miserable, but you will be unable to figure out why... just asking yourself over and over, "why are things like this?", but being unable to come up with an answer.
Hey, wait a minute, we are already like that without AI.
NO MORE WOOL!
Re: (Score:2)
"Python users who vibe code, just ship bugs and let their users find them."
I think that's not just Python users who vibe code; that's all CI/CD.
"The scary part is that it looks good. "
The purpose of all automated testing.
It is popular to repeat the lie that a corporation's only responsibility is to make money for its stockholders. As long as people accept that, corporations will keep acting like sociopaths, and this kind of thing will be the consequence.
Re: Insider perspective: AI helps with amnesia onl (Score:2)
s/best/only/
Re: (Score:2)
> The point being...AI doesn't tangibly save time. It might save a bit under some circumstances, but not enough to justify layoffs.
Agreed with all of the above, but my even bigger concern with turning programming into babysitting electronic code writers, and doing the same for other parts of the business, is that we're losing knowledge. Actively destroying knowledge, indeed.
If luddites were in charge of the world, they could do nothing more effective to their cause than promote AI. AI means nobody u
Re: (Score:2)
That really depends on the type of work one has to do. For me, for example, AI is quite helpful (if frustrating at times) because I have to be a jack of all trades and know a shitload of different systems, but that also makes me a master of none. So I am not able to keep all the details about all the different systems in my head, but I still know enough to either do the job reasonably well but slow without the AI, or do my work quickly with the help of an AI and still know when it screws up.
"Suddenly" my ass (Score:1)
The trick has been around since industrial spies noticed the first liar got away with it.
I wonder if stock-holders can successfully sue for lying about the cause of a sales slump?
Re: (Score:2)
I wonder if stock-holders can successfully sue for lying about the cause of a sales slump?
If they have a smoking gun? Sure. And the SEC will want in, too. Otherwise? GLWT
The blind marketing to the blind ... (Score:4, Interesting)
MBAs are sheep. Blindly following the flock is what MBA schools teach them to do - and questioning conventional wisdom is strictly verboten.
Why else would they worship at the altar of stockholder supremacy? Transferring all their company's liquid assets to high-volume stock trading algorithms forces them to borrow money at commercial interest rates to fund their "investment" in AI, instead of using cash on hand for the purpose and saving the interest payments for other investments.
Suddenly? (Score:4, Informative)
What is sudden about this? CEOs have been doing this for over two years (at least). Duolingo and Klarna were among the first, and both of those were in Q1 2024. This is not new behavior.
Re: (Score:2)
100% this.
CEOs have known for years that Wall Street prefers to see downsizing framed as "productivity gains due to AI."
Growing companies hire people (Score:2)
Does not matter how much tech they use to increase efficiency.
If your company is firing people, it is past its prime. It may be a solid, safe investment, but it is not going to double profits over the next two years.
Re: (Score:2)
You are part of the problem.
We're dooming (Score:2)
Step 1: Ask the AI agent how to solve the problem.
Rather than continue training the AI agent to do my job, I'm resigning.
Copyright (Score:2)
Hello naysayers (Score:3)
Now here is the grand problem: you need to spot that your jobs are going away and change how you vote BEFORE your jobs actually go away, and if you can't do that, the average joe surely can't either.
Re: Hello naysayers (Score:3)
Google xkcd extrapolate
Re: (Score:2)
I don't think the point is to deny that AI is useful. It's to point out that it has become a handy excuse for layoffs, regardless of the usefulness of AI.
Are you a troll or just clueless? (Score:2)
Can you spot a trend? Even if your experience shows AI lacking now, look at where it was three years ago: zero. So in three years it went from "zero" to "useful sometimes/as good as a recent graduate", and this is the worst it will ever be. If that trend continues at pace, you are out of a job in three years. Things slowing down, you say? The breakthroughs keep coming, in software, like increased memory efficiency: https://techcrunch.com/2026/03... [techcrunch.com] and in hardware, like Rubin rack computing units: https://www.tomshardware.com/t... [tomshardware.com] Now here is the grand problem: you need to spot that your jobs are going away and change how you vote BEFORE your jobs actually go away, and if you can't do that, the average joe surely can't either.
Everything you said is true about theoretical AI, but not about real-world AI...or at the very least Claude Opus/Sonnet 4.6. It hasn't improved in any measure that I can identify in the last 2 years. I've been using it every workday, by mandate from my employer. So I am not a naysayer...I'm a frustrated user.
Look, you can show me bullshit theoretical papers and press releases all you want. There's a simple test that will prove either of us right or wrong and that's the real world and the market.
C
Re: (Score:2)
If that trend continues at pace you are out of a job in three years.
Your logic is weak. AI has not shown any real improvements since it was released to the public. They are dialing in the exact weights and such needed to maintain "accuracy", but there will be no further improvements to the current technology that will lead to future breakthroughs. LLMs are a thing, and MIGHT be part of a future, very capable AI; however, LLMs will NEVER be AGI, regardless of what is done with them.
Sounds like I'll be able to get a job in a year (Score:2)
Let them do all their code with AI. Please. When it all comes crashing to the ground, they will have to hire real developers for real money to come in and try to fix the slop.
AI only needs to provide a marginal gain (Score:2)
I've been writing software for over 30 years. I used to spend 20% of my time writing and maintaining unit tests, one-off scripts, and stuff like that. I would not trust AI to touch my core code. But tests and one-off scripts? Stuff that fits easily in its context window? Sure.
No, it doesn't *always* work, but it *usually* does. And it doesn't take very much time to figure out when it's not working. And rather frighteningly, it does a *better* job on most of those one-off scripts than I would. (I'm n
Re: AI only needs to provide a marginal gain (Score:2)
Re: (Score:2)
I'm not saying you're completely wrong, but this is the exact same stuff they used to say about offshoring, and look at how well that went. It stuck in a few places to some degree, and maybe some even managed to eke out some savings without everything going to shit. But on average, and for most companies, it just made them lose their best employees and know-how, and ultimately ended up costing them a lot more money, too.
And it's the exact same stuff they said about factory automation.
And it's the exact same stuff they said about using machines in factories at all.
Yes, companies that offshored too much or stuff that was too important to the core of the company did screw themselves over. But most companies did not go that far and lose their best employees and know-how. That would have caused most companies to collapse.
And that's where we are now. Companies that use AI for too much or stuff that's too important will discov
AI does not make Developers obsolete (Score:2)
AI makes development more efficient and effective, but the classic problem in development is not a lack of people; it is a lack of development progress.
I doubt that you will be able to lay off people; you just have to deal with more code to shape and scrutinise.
Take software maintainers as an example. You need more human maintainers to deal with the AI slop of reputation farming, not fewer.
Sooner or later the AI/LLM market will collapse on the financial markets, but we will continue to need programmers who use AI.
The