
AI Use at Work Has Increased, Gallup Poll Finds (apnews.com)

An anonymous reader shared this report from the Associated Press: American workers adopted artificial intelligence into their work lives at a remarkable pace over the past few years, according to a new poll. Some 12% of employed adults say they use AI daily in their job, according to a Gallup Workforce survey conducted this fall of more than 22,000 U.S. workers.

The survey found roughly one-quarter say they use AI at least frequently, which is defined as at least a few times a week, and nearly half say they use it at least a few times a year. That compares with 21% who were using AI at least occasionally in 2023, when Gallup began asking the question, and points to the impact of the widespread commercial boom that ChatGPT sparked for generative AI tools that can write emails and computer code, summarize long documents, create images or help answer questions...

While frequent AI use is on the rise with many employees, AI adoption remains higher among those working in technology-related fields. About 6 in 10 technology workers say they use AI frequently, and about 3 in 10 do so daily. The share of Americans working in the technology sector who say they use AI daily or regularly has grown significantly since 2023, but there are indications that AI adoption could be starting to plateau after an explosive increase between 2024 and 2025...

A separate Gallup Workforce survey from 2025 found that even as AI use is increasing, few employees said it was "very" or "somewhat" likely that new technology, automation, robots or AI will eliminate their job within the next five years. Half said it was "not at all likely," but that has decreased from about 6 in 10 in 2023.

A bar chart lists the sectors most likely to be using AI at their jobs:
  1. Technology (77%)
  2. Finance (64%)
  3. College/University (63%)
  4. Professional Services (62%)
  5. K-12 Education (56%)
  6. Community/Social Services (43%)
  7. Government/Public Policy (42%)
  8. Manufacturing (41%)
  9. Health Care (41%)
  10. Retail (33%)

This discussion has been archived. No new comments can be posted.


  • Now do ... (Score:5, Funny)

    by PPH ( 736903 ) on Saturday January 31, 2026 @07:55PM (#65961748)

    ... alcohol.

    • by Anonymous Coward

      AI Use at Alcohol Has Increased, Gallup Poll Finds

  • How many ... (Score:5, Insightful)

    by evanh ( 627108 ) on Saturday January 31, 2026 @08:07PM (#65961758)

    ... just use it as a power-wasting search engine?

    • by evanh ( 627108 )

      How many of that list, I meant.

    • It's actually fantastic as a way to find that one bit of info you need in 100 pages of documentation.
      • Comment removed based on user account deletion
        • Which is great unless the docs are spread across a dozen web pages or you don't know exactly what word to search for. Natural language queries that can use the bug you're working on as context are a massively powerful tool.
          • Depends on your level of expertise in the domain. The old way of doing things is 1) programmer reads *all* the documentation once, then 2) programmer knows what keywords will find the API documentation that defines the corner case of interest.

            Personally, I'm dubious about the new way of only asking about the one thing without knowing the context, but YMMV.

          • Comment removed based on user account deletion
        • by allo ( 1728082 )

          Which works fine until the documentation uses a synonym for the word you thought it would use.

          • Comment removed based on user account deletion
            • by allo ( 1728082 )

              Yeah, you can type ten synonyms, or have an LLM find the information even if you don't know the right synonym. Hey, nobody is taking ctrl-f away from you, but some like to have modern options as well.

    • ... just use it as a power-wasting search engine?

      Uh, exactly what else is it supposed to do at work?

      Maybe an actual manager who actually knows what it takes to run the business should sit and have a chat with Toddler AI. See exactly how useful that tool is beyond a glorified search engine that can (kinda) be trusted to provide accurate feedback. Maybe even compare the results to traditional search engines to ensure it's not actually making the business worse. Or employees dumber.

      The proverbial jury is no longer out on the end results of "social" media.

      • Hmm, to be fair, if the Internet was just a global network that could be queried for factual information, it would still be insanely useful to have around. I imagine in the future we'll be losing our posting rights on centralized "social media" when the government eventually kills s230. Most platforms won't be able to handle the liability or have the sophistication to filter "illegal stuff" out.

  • by ffkom ( 3519199 ) on Saturday January 31, 2026 @08:10PM (#65961762)
    It's not as if the makers of LLM-based coding tools don't know what the consequences will be. To cite https://www.anthropic.com/rese... [anthropic.com]:

    On average, participants in the AI group finished about two minutes faster, although the difference was not statistically significant. There was, however, a significant difference in test scores: the AI group averaged 50% on the quiz, compared to 67% in the hand-coding group—or the equivalent of nearly two letter grades (Cohen's d=0.738, p=0.01). The largest gap in scores between the two groups was on debugging questions, suggesting that the ability to understand when code is incorrect and why it fails may be a particular area of concern if AI impedes coding development. [...] Given time constraints and organizational pressures, junior developers or other professionals may rely on AI to complete tasks as fast as possible at the cost of skill development—and notably the ability to debug issues when something goes wrong.

    This is exactly what will happen: more AI slop produced faster, with more bugs that never get fixed, and skilled IT personnel replaced by unskilled prompt-monkeys, because cheaper.
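    For reference, the Cohen's d in the quoted study is just the difference in group means divided by the pooled standard deviation. A minimal sketch of that calculation (the group sizes and the ~23-point pooled SD below are assumptions for illustration, not figures from the study; only the 50% and 67% quiz means are quoted above):

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Effect size between two groups: mean difference in pooled-SD units."""
    pooled_var = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(pooled_var)

# Hypothetical reconstruction: quiz means from the quote (67% hand-coding
# vs 50% AI group), with assumed equal group sizes (n=25 each) and an
# assumed pooled SD of 23 percentage points.
d = cohens_d(67, 23, 25, 50, 23, 25)
print(round(d, 3))  # 0.739, in the ballpark of the reported d=0.738
```

    With these assumed inputs the sketch lands near the reported effect size, which gives a feel for how large a 17-point score gap is relative to typical score spread.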

    • People flunk by using it, but they finish slightly faster. So AI wins!

    • Nobody knows how to write assembly code any more either. So much deskilling!

      Technology always causes people to stop learning old skills replaced by new automation.

      Nobody knows how to hand-wash clothes any more. Or for that matter, many young people have no idea how to cook. And certainly, few know how to shoe a horse.

      We might be "deskilled" as far as traditional programming techniques, but skill will still be required, just different skills.

  • by SeaFox ( 739806 ) on Saturday January 31, 2026 @08:23PM (#65961782)

    I've seen many people comment on other discussion platforms that they are using AI simply because the leadership at their employer is requiring it. And many of those comments follow up with how the results of the AI usage are not helpful, or worse, require checking/correcting work, resulting in a net negative for productivity versus if they had just done it themselves.

    • I too was wondering how much of this adoption was being forced by management rather than taken up of the user's own free will; now that would be a really interesting poll.

    • by Hasaf ( 3744357 ) on Sunday February 01, 2026 @12:09AM (#65961968)
      As a teacher I use it quite a bit. I am required to rewrite my lesson plans every two years. This is about filling in boxes on a form, which regularly changes in order to ensure that the biennial rewrites are not just a copy-and-paste activity.

      It is also about rechecking the state standards codes, which are frequently revised even though there is little fundamental change, just new codes. These are the things AI is a great help with. I have them done without ever having to look at them, and admin can check and see that they are done and revised on schedule.

      My real lesson plans are entirely different documents. I used to put them at the bottom of my lesson plans until I was told that what I was putting on the plan were "lesson notes," which have no place on a "lesson plan."

      So, AI is useful for doing things that have no need to be done, but are required to be done.
    • Comment removed based on user account deletion
    • That's what I was wondering. Are employees doing it to trick their managers into believing that they're being more productive, or are they using it as a part of a corporate drive towards using more AI? If it's the latter, then sooner or later, those people will be gone, b'cos their work can just as easily be done by AI

      A few days ago, I was watching a clip of a person who claimed to have written scripts that end up with him only having to work one hour a day, rather than eight. A recruiter who played thi

    • I've seen many people comment on other discussion platforms they are using AI simply because the leadership at their employer is requiring it.

      I'd imagine one of the fastest ways to reverse that mentality is to start bragging to your boss about how awesome AI is at doing their job.

  • by Anonymous Coward on Saturday January 31, 2026 @08:23PM (#65961784)

    AI is very good for novices, people who don't know something well.

    Executives use it for office tools.

    Programmers and engineers use it when working with languages and tools they don't know well. We're all novices at some languages and tools, so AI can be useful to everybody, to some extent.

    The big problem is that most of the material on the Internet about programming and engineering is produced by novices. Professionals don't have the time to be posting, and even if they did, they aren't allowed to share their work.

    Academics have generally sold their souls to the publish-or-perish system and seldom have professional level skills, so no help there.

    AI is basically all about pattern matching, so it matches work done by novices, and in practice people blindly apply that work.

    This means people aren't spending time learning how to do it right, which usually involves design time, abstraction, thinking about data structures, experiments, and so forth.

    Then one makes the inevitable mistakes, and learns from those mistakes - and ultimately gets better.

    With AI, that whole process is short-circuited. You get something that sort of works, but without the whole learning process.

    So it is not that AI is a bad thing, but it can be - people who don't have wisdom or guidance from others aren't going to get the learning experiences they need to get good. They will be condemned to mediocrity. So we're creating a culture of amateurs that don't know how to do anything well.

    Really good organizations supplement the learning process with really good mentoring - and that's key to turning the talented amateurs that graduate from colleges into really good professionals.

    Unfortunately, really good mentoring is quite rare, because most technical people aren't good enough at either their technical skills or the people skills (or both) to be good mentors, and most organizations don't know how to run a good mentoring program. By that I mean substance, not illusion. Lots of managers can manage the illusion part - we're mentoring, check that box off!

    A good mentoring program can help ensure that AI doesn't lead organizations down the wrong paths, and that people develop the genuine high level skills they need to produce high quality products that will generate long term income from repeat customers.

    Most organizations don't know how to run such a program, so they rely on patents, trademarks, copyright, and cost-of-entry barriers to make up for all the quality failures in their products that happen because their people don't have the skills they need.

    It's another reason to be very suspicious of whether or not IP law is actually providing a genuine benefit to society, at least as it is currently implemented. Basic legal protection from counterfeiting would probably be more efficient and beneficial to society than current IP law.

    • by ffkom ( 3519199 ) on Saturday January 31, 2026 @08:32PM (#65961798)

      AI is very good for novices, people who don't know something well.

      There is plenty of evidence already that novices using AI will remain novices, rather than develop advanced skills. So yes, as a "novice", you can get to some result quicker by using AI, but the result will be that of a "fool with a tool", and your next work's result won't be better, because you didn't learn anything.

      • AI is very good for novices, people who don't know something well.

        There is plenty of evidence already that novices using AI will remain novices, rather than develop advanced skills. So yes, as a "novice", you can get to some result quicker by using AI, but the result will be that of a "fool with a tool", and your next work's result won't be better, because you didn't learn anything.

        It depends...

        So, I'm a very experienced software engineer. Going on 40 years in the business, done all kinds of stuff. But there are just too many tools and too many libraries to know, and you never use the same ones on consecutive projects, that's just reality. What I've found is that telling an LLM to do X using this tool I've never used before and then examining the output (including asking the LLM for explanations, and checking them against the docs) until I understand it is at least an order of ma

        • by twdorris ( 29395 )

          I just wrote up a slightly different take on the exact same situation you described. As a fellow "advanced" gray beard with decades of experience and broad knowledge of all the various "cool architectures du jour" that have popped up and gone away in that time, I feel that I am (as you are) fully qualified in writing code that others consider to be of "superior" quality.

          In the hands of such a person, these tools can build on our knowledge. But only because we have all these learned system architectures upon whi

          • by ffkom ( 3519199 )

            Now...what we do not know is the long term playout of this. There was a time, I'm sure because I was there as I suspect you were as well, where higher level compilers were considered suspect. I've had to review the assembly they spit out to find floating point bugs I *knew* weren't in my code. But now? In 2026?

            The mature compilers have become reasonably dependable with regards to correctness (which LLMs are far away from), the performance of compiled code is also usually ok, but people never having read assembler output of compilers still tend to program in ways that force compilers to generate machine code that is worse than what could be achieved if people were aware of what the underlying CPU can do efficiently.

            But more importantly: People did not stop at using compilers, they added libraries, then "framewor

          • *Those* are the novices I am / we are concerned about never advancing beyond "novice" level.

            Indeed. That's a very real concern. We can safely and effectively use LLMs because of our experience and deep understanding of all the layers. But, clearly, novices who come up with LLM assistance will never have to do that. They'll rely on the AI.

            I suspect it's more of that than what some are claiming that "software is doomed" and "we're going to lose all experienced coders". Nah...I suspect we're just changing the type of coder that's going to be considered "experienced" and the domain we're going to consider them experienced in.

            That's... plausible! And honestly the most hopeful thing I've heard in a while about what the future of the profession looks like. I like your analogy with compilers and other low-level tools that we used to have to know how to double-check.

            But my point was

        • "I'm writing a new crypto library"

          yeah ok so you can be put on ignore.

          • "I'm writing a new crypto library"

            yeah ok so you can be put on ignore.

            Sigh. That's why I clarified that I'm not writing algorithms.

            Also, you should consider that I wrote the primary crypto library used on Android, some three billion devices. I'm neither a dilettante nor a clueless noob. I've been a professional crypto security engineer for over 20 years. The reason I'm writing a library with a new API is because I have broad and deep experience with all of the existing libraries and the footguns they provide, and I'm trying a novel approach that I think will reduce user

            • by twdorris ( 29395 )

              Sigh. That's why I clarified that I'm not writing algorithms.

              You're feeding the trolls...don't feed the trolls. ROFL!

              Genuinely great conversation here. Thanks for that! Been a while...thus why I don't usually post. ;)

              • Sigh. That's why I clarified that I'm not writing algorithms.

                You're feeding the trolls...don't feed the trolls. ROFL!

                Fair!

                Genuinely great conversation here. Thanks for that!

                Same. Have a good one.

    • Comment removed based on user account deletion
    • by twdorris ( 29395 )

      You're right, but....

      I mean, yes, you're right in that a novice using AI will very likely never advance much beyond being a novice. Which sucks and I don't have any answers for that yet. It's a HUGE looming issue and even my younger son is caught up in this mess trying to find a job when companies have absolutely put the brakes on hiring "novices" out of school (not even sure I'd call him a novice as he's quite competent but apparently "right out of school" immediately removes you from consideration at mo

  • by Morromist ( 1207276 ) on Saturday January 31, 2026 @09:35PM (#65961852)

    Some of this is real, no doubt about it, especially if you're a coder or someone whose job is to create spam or shitty advertisements or run chatbots.

    But a lot of this is people using AI because it's been pushed violently in front of us. Just see how MSFT tried to replace Microsoft Word with it, or how Google has you look at AI before any actual search results. And then there are the employers who mandate it. When you do that and then survey the results and find people are using it more, that doesn't translate to it actually being more useful.

  • by gweihir ( 88907 ) on Saturday January 31, 2026 @11:26PM (#65961932)

    I do use LLM-type AI "daily" too. DDG gives an AI answer by default. As that is somewhat useful in, maybe, 50% of all simpler searches, I leave it on. But does that mean I have "integrated AI into my workflow"? Hell, no. That claim would be a direct lie.

  • Well DUH (Score:4, Interesting)

    by the_skywise ( 189793 ) on Sunday February 01, 2026 @01:50AM (#65962028)

    While it's not mandatory at my company (which is fairly large), all employees in every role are strongly encouraged to use AI. My team is tracking our AI usage of a particular type to see how useful it is. Team leads have integrated AI into their workflows with PR reviews and generating meeting agendas, not to mention code production. All because of a giant top-down push for AI, AI, AI.

    And, of course, the company is now providing "AI" products. (thanks Steve)

    Is it any wonder that a poll finds AI use has increased when CEOs and CTOs alike are demanding it be used and people are being fired over it?

    In all my years as an engineer, I've never seen this kind of ramrodded adoption and supercharged spending on an unproven technology/process simply so they can get on the bandwagon. Yes, I use it for unit test generation and function generation. But I'm no longer gaining skillz about the minutiae of mocking for unit tests, and while I can let the AI generate modern code syntax, like current C++ syntax or the latest Java syntax, I'm not "learning" it. I'm just having the AI generate the code, reading up on the routine and going "oh, that looks right", but it's not working its way into my inner thought processes.

    Maybe I'm just an old man yelling at clouds but while it's cool to get the AI to do your homework for you (In a very talk to the Star Trek computer way) we're moving the ability to craft code, especially for younger programmers, further away. Although maybe this is just another argument of going from machine language to assembly to high level languages... (but even then I think some of those arguments were valid, though in the age of cloud computing and virtual computers... coding to the metal is a lost art unless you're an embedded programmer)

  • by zephvark ( 1812804 ) on Sunday February 01, 2026 @03:07AM (#65962064)

    This study notes that it has defined "frequent use" to mean "a few times a week", which most people would consider "occasional" rather than "frequent". It then goes on to use the word "frequent" according to this fraudulent definition.

  • ...that asks poorly phrased questions like 'do you have a best friend at work'?

    Yeah, not enough grains of salt for their bullshit.

  • I've started using one of the new AIs for some mechanical, thermal and high-power electrical engineering. It's surprisingly good. Of course it makes mistakes, but not really any more often than a human engineer would. It seems to do better at these sorts of problems than it does at coding - which surprised me.

  • Things like
    - Creating job descriptions
    - Creating root cause analysis documents
    - Making presentation slide decks
    - Taking notes

    Personally, I couldn't be happier.

  • You know what I mean. People totally forgot about how the PC came to dominate the landscape. The average moron at work was suddenly able to type 40 words a minute without having to use "white out". Files could be saved and updated on a disk. Little things. LITTLE THINGS. Everybody is always looking at the "systems" level for success. But they don't see the average person just trying to find an easier way to get the mundane things done. "How do I do this in my spreadsheet?" "Can I automate this somehow

