AI-Generated 'Workslop' Is Destroying Productivity (hbr.org)

40% of U.S. employees have received "workslop" -- AI-generated content that appears polished but lacks substance -- in the past month, according to research from BetterUp Labs and Stanford Social Media Lab. The survey of 1,150 full-time workers found recipients spend an average of one hour and 56 minutes addressing each incident of workslop, costing organizations an estimated $186 per employee monthly. For a 10,000-person company, lost productivity totals over $9 million annually.
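A quick sanity check of how those figures could fit together (a sketch only; the assumption that the $186/month cost applies just to the ~40% of employees who received workslop is not stated in the summary):

```python
# Sanity check of the workslop cost figures quoted above.
# Assumption (not from the study): the $186/month figure applies
# only to the ~40% of employees who actually receive workslop.
cost_per_employee_month = 186   # dollars, from the study
company_size = 10_000           # employees
share_receiving = 0.40          # fraction who received workslop

affected = company_size * share_receiving
annual_cost = affected * cost_per_employee_month * 12
print(f"${annual_cost:,.0f} per year")  # prints "$8,928,000 per year"
```

Under that assumption the result lands close to the "over $9 million" figure quoted above.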

Professional services and technology sectors are disproportionately affected. Workers report that 15.4% of received content qualifies as workslop. The phenomenon occurs primarily between peers at 40%, though 18% flows from direct reports to managers and 16% moves down the hierarchy. Beyond financial costs, workslop damages workplace relationships -- half of recipients view senders as less creative, capable, and reliable, while 42% see them as less trustworthy.
This discussion has been archived. No new comments can be posted.


  • by Mr. Dollar Ton ( 5495648 ) on Tuesday September 23, 2025 @10:46AM (#65678100)

    Nothing that can't be solved by dumping another few trillion borrowed dollars to generate even better "AI" slop.

    • Use your own AI to summarize the manglement slop, then you only need to read one paragraph.
      • Re:Not a problem. (Score:5, Insightful)

        by Fons_de_spons ( 1311177 ) on Tuesday September 23, 2025 @01:39PM (#65678474)
        That is the thing with AI. You use an AI to summarize the long text from management to something brief and to the point. The manager, on the other hand, threw a few bullet points into an LLM and asked it to write the text.
        The manager could have just emailed the bullet points. But no, now we need two nuclear power plants: one to generate the text from the bullet points and another to summarize the text back down. AI slop...
        Just... talk... to... each other.
        • So you're saying AI consumes too much energy. GPT-4 training consumed as much energy as 300 cars over their lifetimes, which comes to about 50 GWh. Not really that much; it could be just the families on a short street burning that kind of energy. As for inference, an hour of GPT-4 usage consumes less energy than an hour of watching Netflix. If you compare datacenter energy usage to everything else, it amounts to about 5%. Economizing on LLMs won't save the planet.
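The arithmetic behind that comparison can be checked (a rough sketch; the per-car share is simply 50 GWh divided by 300 cars, and the gasoline-car lifetime figures below are common approximations, not numbers from the comment):

```python
# Rough check of the energy comparison above.
training_energy_gwh = 50   # GPT-4 training estimate from the comment
cars = 300
per_car_mwh = training_energy_gwh * 1000 / cars
print(f"{per_car_mwh:.0f} MWh per car")  # prints "167 MWh per car"

# For scale, a gasoline car's lifetime fuel energy, using common
# approximations (assumed values, not from the comment):
miles = 200_000
mpg = 30
kwh_per_gallon = 33.7   # approximate energy content of gasoline
lifetime_mwh = miles / mpg * kwh_per_gallon / 1000
print(f"{lifetime_mwh:.0f} MWh lifetime")  # prints "225 MWh lifetime"
```

The two figures are the same order of magnitude, which is what the comparison in the comment relies on.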
    • Re:Not a problem. (Score:5, Insightful)

      by godrik ( 1287354 ) on Tuesday September 23, 2025 @01:16PM (#65678434)

      AI is like XML: if it doesn't solve your problem, use more!

  • by khchung ( 462899 ) on Tuesday September 23, 2025 @10:53AM (#65678112) Journal

    "workslop" -- AI-generated content that appears polished but lacks substance

    Throughout decades of my career, I have regularly received human-generated content from management that appears polished but lacks substance.

    I wonder why no one has done a study on how much money management workslop has wasted.

    • by gurps_npc ( 621217 ) on Tuesday September 23, 2025 @10:59AM (#65678132) Homepage

      Wait, it is possible to receive management content that has substance in it? Why didn't anyone tell me this?

      Which companies have management that does not provide slop?

      • by davidwr ( 791652 )

        Wait, it is possible to receive management content that has substance in it? Why didn't anyone tell me this?

        When engineering managers come from the engineering ranks, you can get non-slop content from managers. It's not guaranteed, but it can happen.

      • Wait, it is possible to receive management content that has substance in it?

        I've heard of such a thing repeatedly over my 50+ years in the workforce but I've never actually seen it in the wild.

    • I couldn't agree more! Maybe we could use AI to reduce slop and not increase it?

      IMHO, people use this "padding" or "slop" as an illusion of productivity
      • I couldn't agree more! Maybe we could use AI to reduce slop and not increase it? IMHO, people use this "padding" or "slop" as an illusion of productivity

        Expecting AI to slow the tide of slop is expecting the AI providers to fight themselves. They want to justify their existence, so the creation of "communications" must be filled with the MAXIMUM slop that management teams will allow while still accepting the output. That way the AI companies can justify filtering the slop on the user side by summarizing it into slightly less filthy slop. Gotta generate massive quantities of useless slop, in order to justify slimming down the slop on the other end.

        And someho

        • The AI companies can profit on this too... Efficiency! Fewer tokens! More bang for the buck!! Less electricity! We are green!

          Meanwhile the rest of us get to use it to get rid of bloody "fat middle" in management
        • AI will read the texts you provide and write generic text on any topic; if you don't bring anything original, don't expect anything but slop. On the other hand, if you have plenty of original and useful ideas to communicate, why not use AI as your wordsmith?
          • AI will read the texts you provide and write generic text on any topic; if you don't bring anything original, don't expect anything but slop. On the other hand, if you have plenty of original and useful ideas to communicate, why not use AI as your wordsmith?

            If you have plenty of original and useful ideas to communicate, why would you need AI as your wordsmith? To make sure it gets fluffed to the point the original and useful ideas are obfuscated beyond recognition?

        • by JustAnotherOldGuy ( 4145623 ) on Tuesday September 23, 2025 @03:48PM (#65678794) Journal

          Expecting AI to slow the tide of slop is expecting the AI providers to fight themselves.

          It's like trying to extinguish an oil fire by throwing aviation gas on it.

    • by wyHunter ( 4241347 ) on Tuesday September 23, 2025 @11:56AM (#65678258)
      What? You mean leveraging off our core competencies and synergizing in a cross-functional team environment on a go-forward basis isn't a crystal-clear way to communicate our new strategy? I'm shocked, just shocked.
    • I will keep saying this, but AI should replace management. I can't be the only one noticing that ChatGPT can talk itself out of anything. Perfect manager!

      Then again, if all management jobs are replaced by AI, those managers will need to do technical work. OK, AI should not replace managers. We will just need to keep them busy ourselves.
      • If all you need is someone to spew mountains of buzzword-compliant text to justify its existence and incidentally generate heat, I'd say AI is a perfect replacement.

        • AI has no skin, can't be sued, punished, shamed, or imprisoned. But managers on the other hand ... hmm they are also safe. Never mind
    • That explains why management loves AI so much.
  • Human slop (Score:4, Interesting)

    by fluffernutter ( 1411889 ) on Tuesday September 23, 2025 @11:02AM (#65678140)
    My place of work sends human slop. They are constantly giving us impediments to doing our job while complaining that we should be more productive. Emails every 5 minutes. Teams going off constantly from the various channels we are 'required' to be in. Then when you miss something because it was buried in all the crap, people look down on you. So you need to wade through all the crap. Oh yes, and now people are just setting up websites for their teams and expecting you to go out of your way to check them and stay informed. They act all indignant, like you are a bad employee, if you aren't up to date on every one.
    • Re:Human slop (Score:4, Interesting)

      by phantomfive ( 622387 ) on Tuesday September 23, 2025 @11:37AM (#65678208) Journal
      Depending on your manager, it's intentional. For managers, personal power is a result of having more people below you on the organization tree. The most logical way to respond to this incentive is to hire more people below you. And if people are working slowly, then it's a justification to hire more people.

      Not all managers do this, but the incentive exists, and managers who do it tend to gain power faster than those who don't.

      So, because incentives are misaligned, evolutionary pressure exists in corporations to make you, as an individual contributor, inefficient.
    • Re:Human slop (Score:4, Interesting)

      by karmawarrior ( 311177 ) on Tuesday September 23, 2025 @12:29PM (#65678336) Journal

      You're the second person to write this, and it's all true, although when someone asks you "Is the website down?" and you sigh because FFS we have eleventeen different fucking websites and anyway I'm a dev, there's at least a reality behind it. When your manager's manager tells you they're merging two teams in order to facilitate the cross-pollination of synergies to promote teamwork, they are actually telling you your team is being merged.

      But... what we don't need is all the human slop augmented by a giant mountain of AI-generated shit. Because if you think the human slop is being replaced by AI shit, you have another think coming. And the AI shit, unlike the human slop, isn't long-winded statements of things you need to be aware of (or the terse opposite, as in the "Is the website down?" example); most of the time it's useless crap from start to finish.

      • > it's most of the time useless crap from start to finish.

        Maybe... you forgot to put any ideas in the prompt. That would explain it.
        • You mean maybe the people sending it to you forgot to put any ideas into the prompt?

          Here's a fun idea: have a look at the open source contributors who are dealing with this shit, "security bug reports" day after day after day of AI-generated slop, generated by people desperately hoping the AI will spit out at least one legitimate bug report so they can collect a bounty. People who blindly submit this shit without understanding what it means.

          And fuck you BTW. Trying to blame the victims to justify your AI fet

  • by rsilvergun ( 571051 ) on Tuesday September 23, 2025 @11:03AM (#65678144)
    To write documentation that I know nobody is ever going to read or use. I have to admit that if I had had AI back then, I absolutely would have used it to write that nonsense.

    Also, and this is somewhat terrifying, but I have encountered people at the director level who are incapable of reading. I don't mean that as an insult; I mean they are literally functionally illiterate. I figured it out when I had one of them and I kept emailing a basic explanation of something to them and they didn't understand, but the moment I told them on a phone call, they got it.

    If you are trying to hide functional illiteracy I could see AI being extremely useful for that.

    Now, a proper civilization would not have let somebody get out of school functionally illiterate, and I don't mean we failed them, so you right wingers can take your dicks out of your hands. We know how to take someone who is functionally illiterate and make them literate; it's just a hell of a lot of work and it costs a bit of money. It's well worth doing, but there's a bunch of people who get off on the idea that there are people out there who can't read.

    It's always about the totem pole. The hierarchy. There has to be somebody beneath me no matter what. So keeping a handful of people unable to read when you yourself can read becomes highly desirable to a certain type of jerkwad.
    • by Gilmoure ( 18428 ) on Tuesday September 23, 2025 @01:11PM (#65678414) Journal

      After working tech support for the VIP / VP floor, I decided that the less you read, the more money you made.

      VP customer: "It won't let me log on."

      [Admin scrambles IT support, flashing red lights down the hallway]

      PFY: "It says your password has expired. You need to create a new one."

      VP [cogitating]: "I have to create a new password?"

      PFY: "Yes"

      VP: "Sheila, I need you to get me a new password, STAT!"

    • To write documentation that I know nobody is ever going to read or use.

      I have read documentation that was clearly written by somebody who thought nobody was ever going to read or use it. It was worthless, but it sure would have been nice to have some actual useful documentation. This was pre-LLM by at least a decade or two, so I'm pretty sure it was written by a human who generated "slop". And now I'm not sure how to differentiate "AI slop" from human "slop".

  • Real progress is being made in creating useful tools that deliver real value. I find Perplexity and ChatGPT to be very useful for a limited number of tasks. Future AI may deliver tools that help us solve previously intractable problems.

    Unfortunately, a lot of crap is being produced. Image generators, music generators, vibe coding tools, robot friends, robot therapists, corporate email generators and more are wasting time and money and making everything worse.

    This seems to be true of every new tech. Some us

    • by gweihir ( 88907 )

      Real progress is being made in creating useful tools that deliver real value.

      Please elaborate and cite research. Because, for coding for example, AI decreases productivity by an average of 20%, while most users mistakenly think they have had a productivity increase.

    • I think the majority of AI text people read is their own AI text. There are already a billion users; OpenAI alone has 800M. Seeing clearly AI-generated text in the wild is rare; seeing your own chat is frequent. I think most AI usage is a private affair. Almost all LLM text is read just once. If people complain of AI slop, then it's their own fault. You have to chip in some original ideas; you can't expect the model to be original. But there are also times when the model surprises you with a really useful idea or connec
  • This is in no way unexpected and adds to the overall performance decreases caused by "AI" observable from other effects. For example, for coding, productivity decreases on average by 20%, not counting the cost of lower code quality, while typical users mistakenly believe they have higher productivity.

    • There is a way to use DNN/LLM "AI"s correctly: use them like a search engine.

      Ask a hyper specific question, and scrutinize the answer given thoroughly.

      In the same way that crowdsourced intelligence made Google a useful tool for search, and social media created a great pool of questions and answers for that search to run over, DNNs are just an extension of search.

      They are a wonderful improvement in the areas of (1) parsing the query and (2) re-jiggering the resultant hits.

      (1) They can decode the user's quest

      • I would say, with judicious use of a search-engine DNN/LLM, any programmer should expect perhaps a 1% to 2% productivity increase on average.

        You'll get more productivity gains if you get some more Zs instead of wasting that time telling the AI slop container how to search for stuff or generate better hallucinations.

      • by gweihir ( 88907 )

        I would say, with judicious use of a search-engine DNN/LLM, any programmer should expect perhaps a 1% to 2% productivity increase on average.

        Any programmer who tries to ask it to write code or solve problems will likely eat the worm, and suffer a 20%-50% decline in real productivity. Hopefully, any programmer caught doing this would face some kind of disciplinary action.

        These numbers are probably pretty accurate. But the coders having AI write code for them are not only allowed to do so, they are encouraged and forced to do it by "management".

  • by Anonymous Coward

    ...at bullshitting and wasting people's time.

    People interested in doing the work properly will do the work properly. The rest of the work force, who bullshit and autopilot their way through their jobs and waste people's time, will still be doing the same thing with AI. They managed to keep their jobs before with bullshit, and they'll now be doing the same with the power of AI.

    • It's kind of like the Matrix, though. Those of us who want to get actual stuff done get left alone to do it more, because the drones around us are too busy dealing with all the AI slop.

  • Maybe our white collar jobs are safe after all.

  • Sounds familiar. Five, maybe ten years ago, we hit the peak of Stack Overflow Cut and Paste.

    I was fortunate to mostly retire from tech a few years ago, but it was already becoming way, way, way too much of my job to walk on eggshells while code reviewing junior people who would submit large PRs and then clam up and get real awkward when asked any questions about them, because, well, they hadn't written more than 10% of it. Being diplomatic about it was a never-ending challenge.

    This just seems like the next, much

"Why should we subsidize intellectual curiosity?" -Ronald Reagan

Working...