Follow Slashdot stories on Twitter

 



Forgot your password?
typodupeerror
Windows AI Microsoft

Microsoft Warns Its Windows AI Feature Brings Data Theft and Malware Risks, and 'Occasionally May Hallucinate' (itsfoss.com) 65

"Copilot Actions on Windows 11" is currently available in Insider builds (version 26220.7262) as part of Copilot Labs, according to a recent report, "and is off by default, requiring admin access to set it up."

But maybe it's off for a good reason...besides the fact that it can access any apps installed on your system: In a support document, Microsoft admits that features like Copilot Actions introduce " novel security risks ." They warn about cross-prompt injection (XPIA), where malicious content in documents or UI elements can override the AI's instructions. The result? " Unintended actions like data exfiltration or malware installation ."

Yeah, you read that right. Microsoft is shipping a feature that could be tricked into installing malware on your system. Microsoft's own warning hits hard: "We recommend that you only enable this feature if you understand the security implications." When you try to enable these experimental features, Windows shows you a warning dialog that you have to acknowledge. ["This feature is still being tested and may impact the performance or security of your device."]

Even with these warnings, the level of access Copilot Actions demands is concerning. When you enable the feature, it gets read and write access to your Documents, Downloads, Desktop, Pictures, Videos, and Music folders... Microsoft says they are implementing safeguards. All actions are logged, users must approve data access requests, the feature operates in isolated workspaces, and the system uses audit logs to track activity.

But you are still giving an AI system that can "hallucinate and produce unexpected outputs" (Microsoft's words, not mine) full access to your personal files.

To address this, Ars Technica notes, Microsoft added this helpful warning to its support document this week. "As these capabilities are introduced, AI models still face functional limitations in terms of how they behave and occasionally may hallucinate and produce unexpected outputs."

But Microsoft didn't describe "what actions they should take to prevent their devices from being compromised. I asked Microsoft to provide these details, and the company declined..."

Microsoft Warns Its Windows AI Feature Brings Data Theft and Malware Risks, and 'Occasionally May Hallucinate'

Comments Filter:
  • urg (Score:5, Insightful)

    by MilenCent ( 219397 ) <johnwh@noSPAm.gmail.com> on Sunday November 23, 2025 @03:49AM (#65813133) Homepage

    Tech companies: Security is such a huge priority that we'll load our software with power and memory wasting countermeasures that annoy the hell out of you. You may hate being that using two-factor authentication requires you to grab your phone for a text message before you log into anything, but it's all in the name of security! You should learn with it, it's all for the best!

    Also tech companies: It's so important to lard our work with generative AI features that a little security compromise is fine!

    • Re:urg (Score:5, Interesting)

      by fluffernutter ( 1411889 ) on Sunday November 23, 2025 @07:38AM (#65813291)
      At one time companies used to ask themselves "will our customers like this". Lately it seems they have entire departments geared towards squeezing every cent out of everything. Now it's fine If you're device wakes up on its own in the middle of the night and plays commercials, or if they tell you your product has shipped when it hasn't really shipped, or they take away HECV support to save a few dollars after calculating that most people won't notice. At some point every little scrap will be collected. Then what?
  • because.. (Score:5, Informative)

    by fortunatus ( 445210 ) on Sunday November 23, 2025 @04:21AM (#65813161)
    it's not an AI, it's an LLM being marketed as an AI. on the other hand, if it /were/ an actual AI, it could simply be /convinced/ to spy, steal and damage!
    • AI is not magic (Score:2, Informative)

      by Anonymous Coward

      AI is the broadest category, not the finest. AI means the chess computer, it means ELIZA, it means the next word prediction on your phone, it means generative AI, it means expert systems, it means Markov chains ... you're thinking about AGI.

      • your point resonates with AI as a research discipline, and many of its outcomes. but here we have an LLM/GPT based product being marketed as "AI", or even as "an AI", which is different than talking about AI as a research area.
  • by SeaFox ( 739806 ) on Sunday November 23, 2025 @04:24AM (#65813169)

    "Using Windows AI may cause data theft and malware risks, so don't come to us when it happens, you were warned (not that you had the choice to disable the AI...)"

    • it's definitely legalese.

    • by KiloByte ( 825081 ) on Sunday November 23, 2025 @06:10AM (#65813233)

      Not any different from using Windows without AI.

      • by gweihir ( 88907 )

        Au contraire! For attackers, this is an exciting new feature that will offer endless new functionality and may finally prevent users from sabotaging their efforts by actually having a clue and being careful.

    • by Vlad_the_Inhaler ( 32958 ) on Sunday November 23, 2025 @07:31AM (#65813285)

      (not that you had the choice to disable the AI...)

      Meanwhile the first paragraph of the summary above includes

      "and is off by default, requiring admin access to set it up."

      Along with several warnings from Microsoft to only activate this if you really want to take the risk.

      If I wanted to use this "feature" I'd want a standalone machine with no actual real data, but I don't so it's moot anyway.

      • by organgtool ( 966989 ) on Sunday November 23, 2025 @02:05PM (#65813735)
        Anybody who has been around long enough knows that once Microsoft developers have had a bit more time to fine-tune the software and Microsoft marketers have had enough time to soften the blowback of this anti-feature, Microsoft will enable it by default. And while you'll be able to disable it, a future Windows update will silently re-enable it or it will be discovered that it still collects and silently leaks information back to Microsoft to help train their algorithms. Microsoft has been disrepecting Windows users' wishes for decades now, and while some users take pride in spending lots of time to figure out clever workarounds to get Windows to behave the way they actually want, at a certain point you'll realize that if you have to work that hard to get your computer to do what you want, it's not really your computer.
        • If they do that they'll have to take responsibility for the consequences when shit happens, and it's not as though they can weasel out with "we didn't know" - this warning makes it clear that they are aware of what can go wrong.

          • by SeaFox ( 739806 )

            If they do that they'll have to take responsibility for the consequences when shit happens...

            Yup, just like most companies today face meaningful punishments when they have security bre-- BWAHAHAHAHAHA. Sorry, I couldn't continue that with a straight face.

    • by jowifi ( 1320309 )

      "Using Windows may cause data theft and malware risks, so don't come to us when it happens, you were warned"

      FTFY

  • Obvious answer (Score:5, Informative)

    by NotEmmanuelGoldstein ( 6423622 ) on Sunday November 23, 2025 @04:31AM (#65813173)

    ...what actions they should take to prevent their devices from being compromised.

    Obviously, uninstall Windows. Because one can't uninstall AI crap-ware MS Recall and MS Co-pilot.

    • Re:Obvious answer (Score:5, Informative)

      by Mr. Dollar Ton ( 5495648 ) on Sunday November 23, 2025 @04:40AM (#65813181)

      An advice that is almost 30 years late, but welcome.

    • ...what actions they should take to prevent their devices from being compromised.

      Obviously, uninstall Windows. Because one can't uninstall AI crap-ware MS Recall and MS Co-pilot.

      I nuked my nice fast Windows 10 laptop that was no longer eligible for W11 update. Now it works flawlessly and fast. My new W11 laptop is fast, but W11 is buggy. This operating system feels and acts like early beta.

      Maybe Microsoft could think about getting W11 to function first instead of providing users with roulette wheel malware.

  • Interesting times (Score:5, Interesting)

    by gweihir ( 88907 ) on Sunday November 23, 2025 @04:48AM (#65813187)

    And dangerous for dumb people. Remember that "malware installation" usually means lateral movement and then compromise of the whole organization these days, because AD security sucks and then it is often misconfigured on top of that.

    I would not trust this on a hardened Linux with network access. Windows? Do you want to get hacked?

    Also note that they only put that in there because the lawyers told them they had to. This means this technology represents a fundamental and systematic novel risk they do NOT have under control. The usual limitations of warranty are not enough. Providing or using this feature will likely fall under gross negligence. Microsoft can get out of the resulting liability by explicitly warning its users that this is not a feature you can use securely and that result quality is very much not ensured. Or in other words, this is a very dangerous toy, not a professional product.

    That they feel they need to add a warning with this uncommon level of clarity is very telling. I am sure all the MS fans and all the nil wits will still ignore it. So let me add this, because it will be relevant: We told you so.

    • by sinij ( 911942 )

      Also note that they only put that in there because the lawyers told them they had to. This means this technology represents a fundamental and systematic novel risk they do NOT have under control.

      Exactly. This means that an average user taking reasonable precautions would be impacted. No dumb users falling for exploits are necessary to exploit this. What a mess.

      In the age of widespread AI with superuser permissions we need to create a secure and authenticated prompt. There now must be a difference between what used actually typed in as a query/prompt and text that AI may have across that contains query/prompt. This means security redesign.

      • by gweihir ( 88907 )

        Exactly. This means that an average user taking reasonable precautions would be impacted. No dumb users falling for exploits are necessary to exploit this. What a mess.

        Indeed. This is about average users behaving in reasonable ways not being able to be reliably secure anymore.

    • I'd be less concerned about the malware on windows knowing the typical home user. It's just yet another method, joining all the infected game hacks (or non-functional ones that are just malware claiming to be Roblox or Minecraft mods) , infected "useful" plug in with the same story, MS office documents that still represent about a third of all pfishing, or Microsoft letting .zip files run as executables if you just renamed the exe BINARY. The malware is something people already do, so it is not surprising.

      • by gweihir ( 88907 )

        The home-user does not really matter in this. It is corporate users that provide basically all MS profits.

  • by evanh ( 627108 ) on Sunday November 23, 2025 @05:58AM (#65813225)

    The LLM can and will make a mess all on its own. There is no need for external malice to get screwed by the LLM.

    • by gweihir ( 88907 )

      Well, MS made sure to live up to everybody's expectations by not only making this a security mess, but also a reliability mess! So much quality. So much winning. So much improvement.

  • in anticipation of fees to be earned from class action lawsuits. Win or lose these parasites will come out on top.

  • Maybe they trained Copilot on Microsoft Bob data. If you enter your password wrong three times it offers to change it for you so you can get in.

  • by serviscope_minor ( 664417 ) on Sunday November 23, 2025 @06:49AM (#65813253) Journal

    Feature Brings Data Theft and Malware Risks, and 'Occasionally May Hallucinate'

    As a long time Linux user I can't tell if they talking about AI or just regular Windows.

    • by sinij ( 911942 )
      I think your optimism in expecting Linux to be immune from AI infestation is unwarranted. The difference between Linux/Open Source and Microsoft is that latter has billions to burn to push early adoption on AI.
      • by allo ( 1728082 ) on Sunday November 23, 2025 @08:17AM (#65813317)

        Linux is customizable and has a huge variety of distributions with different focus. Of course it gets support for different AI systems (and already has), but you choose what you want and what you don't. The comparison "Linux will follow" rarely makes sense, because the difference is not how it is built right now or will be built together, but the philosophy of how users are allowed to customize it.

      • You misunderstand: I was being a smug neckbeard about Linux.

        With that aside: yeah well maybe GNOME. And systemd! Bring on the hate whoop! whoop!

        But seriously though the system is in my control. I have a GPU and pytorch installed. I can install an LLM coding bot if I want (I don't). I can use any of the pure local ones, etc etc.

        • by gweihir ( 88907 )

          Gnome? What's that? I do remember throwing off systemd because it gave me problems in the first hour of having it on a system. If it looks like crap and smells like crap ...

          So far I have noticed zero disadvantages of my approach.

        • by sinij ( 911942 )

          With that aside: yeah well maybe GNOME.

          I don't think you are using "maybe" correctly here. Maybe is used to indicate there is some level of doubt.

  • A new AI feature !!
  • by zmollusc ( 763634 ) on Sunday November 23, 2025 @07:16AM (#65813271)

    Why is this even being discussed in public? If a pc is running windows, that means all the hardware and software now belongs to microsoft, along with any data typed in, results of wifi scans, audio and video from any peripherals, everything from attached storage. It all belongs to microsoft and will all be stored, used and sold by microsoft as it sees fit.
    AI stealing data is just an inefficiency as it is data duplication in the silos.

    • by gweihir ( 88907 )

      Not in Europe. They could do it, but then their executives should probably never visit the EU again and they should close all dependencies here.

  • by aRTeeNLCH ( 6256058 ) on Sunday November 23, 2025 @07:41AM (#65813299)

    But you are still giving an AI system that can "hallucinate and produce unexpected outputs" (Microsoft's words, not mine) full access to your personal files.

    Nope, LLMs don't hallucinate. Their algorithmic output is deterministic and just gives the output based on the input, the training data, and various settings and configuration values.

    So (repeating myself): AI don't hallucinate from time to time. Every answer they ever give is equally made up. What people call hallucinations are merely cases where the made up answers are ostensibly wrong.
    Any AI may apologise when it's pointed out that their answer was incorrect, even if in fact, it happened to be correct.

    • by Anonymous Coward

      Your brain is also deterministic, if you aren't religious. Hallucinations just describe consistent plausible but wrong results. A single strange idea crossing your mind is quickly dismissed just like a wrong word in an LLM output. But sometimes your brain or the LLM may produce a chain of wrong but consistent results, which are harder to detect as wrong. They hallucinate.

      • by HiThere ( 15173 )

        Agree about the meaning of "hallucinate" in this context, but...

        You can't be sure your brain is deterministic. It may well have features that operate at the quantum level, with the implied genuine uncertainty. Transistors are normally scaled to avoid that problem. This isn't exactly "free will" in any normal sense, but it *is* non-deterministic behavior, at least as far as we can tell. (Yeah, superdeterminism is a valid interpretation of quantum theory, and so is the multi-world interpretation and a few

    • by gweihir ( 88907 )

      Just as a reminder, LLMs get randomized to better resemble a person. People are not deterministic in any way (unless you are a quasi-religious physicalist fuckup that mistakes religion for Science), and hence LLMs are made to not be either.

  • Microsoft Warns Its Windows AI Feature Brings Data Theft and Malware Risks, and 'Occasionally May Hallucinate'.

    With friends like that behind the corporate firewall, where’s Pablo Escobar the HR Director when you need him.

    Seriously. A sneeze-activated cocaine dispenser on the CEOs desk sounds better for business than that shit. And ironically is what currently works to keep their stock price higher than giraffe pussy.

  • by jpellino ( 202698 ) on Sunday November 23, 2025 @08:40AM (#65813337)

    and on some candidate resume, it says that in their professional capacity they have/will commit some data theft, inject malicious code into the software they are operating, and occasionally hallucinate on the job.

    You hiring this person?

  • by TrentTheThief ( 118302 ) on Sunday November 23, 2025 @09:27AM (#65813387)

    When micorsuck introduced VBA, touting it as a panacea, Dvorak said it would lead to increasingly difficult to avoid exploits.

    Here we go again.

    Microsuck does not care how about shitty their OS security is because everyone is stuck with it.

    • by gweihir ( 88907 )

      Well, there is a "late stage monopoly effect". It is when the product gets so bad that you cannot base your business on it anymore. I guess MS is close to that point now.

      • After reading the post about them finally admitting w11's core functions are hosed, i certainly agree.

        • by gweihir ( 88907 )

          Then add Azure getting hacked and being massively insecure, several times now. Add that many people are looking into leaving o365 because MS blocked a user for political reasons.

          MS is done for. They just will take quite a while dying. But there is no realistic chance they can turn things around anymore.

  • They won't say because Microsoft doesn't want to tell you to install Linux.
  • I use Linux at home but Windows at work. If I was an IT person at work, I wouldn't know what I should be doing.
    Microsoft ended support for Windows 10, and Windows 11 is riddled with half-assed AI features that even Microsoft is warning about. What are they even doing at this point?
  • And now they need to kill it and start over and produce a usable operating system that actually respects people's privacy, anything less deserves nothing but contempt, and leave the fucking AI & LLMs out and keep it only as an optional package users can CHOOSE if they want it
  • by FudRucker ( 866063 ) on Sunday November 23, 2025 @11:26AM (#65813515)
    Abandon Microsoft Windows, anything less condones their corrupt backstabbing evil shenanigans, there is plenty of good Linux distributions to choose from
  • Who thought letting a glorified spreadsheet write your gateway product was a GOOD idea, again?
  • I re-state a question I have asked before:

    Just how much more abuse will their customers take? When will users of Microsoft (or Apple) operating systems finally say ENOUGH? How much is too much? How far is too far? When will people finally say "all we wanted was a sold reliable OPERATING SYSTEM!" and turn their backs on these corporate dictators?

    YOU own a computer and you want an OS for it... but THEY say when you MUST update, what features you MUST accept, where your files will be stored, who can see those

  • "We know it sucks, we know it's bad for you, we know you don't want it. We're doing it anyway."

  • Not the 1st time this has happened on this site !! Microsoft admits that features like Copilot Actions introduce " novel security risks ...surprised they didn't say it was a new feature !

Nothing succeeds like excess. -- Oscar Wilde

Working...