Encryption Intel Hardware Technology

Intel Demos Chip To Compute With Encrypted Data (ieee.org)

An anonymous reader quotes a report from IEEE Spectrum: Worried that your latest ask to a cloud-based AI reveals a bit too much about you? Want to know your genetic risk of disease without revealing it to the services that compute the answer? There is a way to do computing on encrypted data without ever having it decrypted. It's called fully homomorphic encryption, or FHE. But there's a rather large catch. It can take thousands -- even tens of thousands -- of times longer to compute on today's CPUs and GPUs than simply working with the decrypted data. So universities, startups, and at least one processor giant have been working on specialized chips that could close that gap. Last month at the IEEE International Solid-State Circuits Conference (ISSCC) in San Francisco, Intel demonstrated its answer, Heracles, which sped up FHE computing tasks as much as 5,000-fold compared to a top-of-the-line Intel server CPU.

Startups are racing to beat Intel and each other to commercialization. But Sanu Mathew, who leads security circuits research at Intel, believes the CPU giant has a big lead, because its chip can do more computing than any other FHE accelerator yet built. "Heracles is the first hardware that works at scale," he says. The scale is measurable both physically and in compute performance. While other FHE research chips have been in the range of 10 square millimeters or less, Heracles is about 20 times that size and is built using Intel's most advanced, 3-nanometer FinFET technology. And it's flanked inside a liquid-cooled package by two 24-gigabyte high-bandwidth memory chips—a configuration usually seen only in GPUs for training AI.

In terms of scaling compute performance, Heracles showed muscle in live demonstrations at ISSCC. At its heart, the demo was a simple private query to a secure server. It simulated a request by a voter to make sure that her ballot had been registered correctly. The state, in this case, has an encrypted database of voters and their votes. To maintain her privacy, the voter would not want to have her ballot information decrypted at any point; so using FHE, she encrypts her ID and vote and sends it to the government database. There, without decrypting it, the system determines if it is a match and returns an encrypted answer, which she then decrypts on her side. On an Intel Xeon server CPU, the process took 15 milliseconds. Heracles did it in 14 microseconds. While that difference isn't something a single human would notice, verifying 100 million voter ballots adds up to more than 17 days of CPU work versus a mere 23 minutes on Heracles.
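The homomorphic property the article describes can be seen even in textbook RSA, which is multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product of the plaintexts, so a server can combine values it cannot read. A minimal sketch in Python with toy, insecure parameters (unpadded RSA, illustration only -- real FHE schemes are lattice-based and far more involved):

```python
# Textbook (unpadded) RSA is multiplicatively homomorphic:
# E(m1) * E(m2) mod n decrypts to m1 * m2. The "server" below
# never touches the private key. Toy primes -- insecure by design.

p, q = 61, 53                       # real keys use ~2048-bit primes
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def encrypt(m):
    return pow(m, e, n)             # c = m^e mod n

def decrypt(c):
    return pow(c, d, n)             # m = c^d mod n

m1, m2 = 7, 12
c1, c2 = encrypt(m1), encrypt(m2)

# Server side: multiply ciphertexts, no private key needed.
c_prod = (c1 * c2) % n

# Client side: decrypt the combined result.
assert decrypt(c_prod) == (m1 * m2) % n   # 84
```

Full FHE extends this idea so that both addition and multiplication (and hence arbitrary circuits) can be evaluated under encryption, which is where the enormous overhead comes from.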

  • Cannot trust (Score:3, Interesting)

    by manu0601 ( 2221348 ) on Tuesday March 10, 2026 @07:30PM (#66034352)
How can the end user trust such a system? Even if we ever find a way to ensure it runs the software it is supposed to run (does it really encrypt your data?), the inner workings involve concepts that even an IT engineer does not master.
    • by znrt ( 2424692 )

      the end user delivers encrypted data and decrypts the result. if you trust the encryption method then you can trust the output.

      now, i have no idea how this system in particular works and what its capabilities or intended use are (didn't rtfa), but for specific operations and given specific assumptions this is entirely possible.

    • Re:Cannot trust (Score:4, Informative)

      by fuzzyf ( 1129635 ) on Tuesday March 10, 2026 @07:49PM (#66034380)
Computing on encrypted data is more of a theoretical exercise (I know, I know, I mean "not particularly useful") than a practical one. The limitations are so many that it can hardly be called processing. You can't make decisions on encrypted data, because then you could figure out what the data is (think 20 questions). You can only do some limited math in specific scenarios.

      It's interesting, but it can't really process your health data or much of any real world data imho.
      • Re:Cannot trust (Score:5, Informative)

        by abulafia ( 7826 ) on Tuesday March 10, 2026 @08:11PM (#66034420)
        Fully homomorphic encryption [quarkslab.com] is mostly theoretical, but that's because it is incredibly slow and uses huge amounts of memory, not because you can't write conditionals.

You can compute anything using FHE that you can with any other Turing machine. As long as you can wait long enough [arxiv.org].

        If Intel can provide 1000x+ speedups, some of this might become usable in limited ways. Because right now it costs multiple seconds to do a single FHE multiply, and it needs something like 20000x the memory space of unencrypted computation.
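        The point about conditionals works because FHE programs are compiled into straight-line arithmetic rather than data-dependent branches. A plaintext sketch of that shape (the names `mux` and `eq_bits` are illustrative; in real FHE every value below would be a ciphertext):

```python
# FHE circuits cannot branch on secret data, so "if/else" becomes
# branch-free arithmetic the scheme can evaluate under encryption.
# Shown on plaintext integers purely to illustrate the circuit shape.

def mux(cond, a, b):
    # select a if cond == 1 else b, with no data-dependent branch
    return cond * a + (1 - cond) * b

def eq_bits(x, y, bits=8):
    # equality as arithmetic: product of per-bit XNORs is 1 iff x == y
    # (roughly the shape of the encrypted voter-ID match in the article)
    result = 1
    for i in range(bits):
        xi, yi = (x >> i) & 1, (y >> i) & 1
        result *= 1 - (xi + yi - 2 * xi * yi)   # XNOR(xi, yi)
    return result

assert mux(1, 10, 20) == 10
assert mux(0, 10, 20) == 20
assert eq_bits(42, 42) == 1
assert eq_bits(42, 43) == 0
```

        Every branch gets evaluated both ways and the result is selected arithmetically, which is part of why FHE does so much more work than plaintext computation.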

        • Fully homomorphic encryption [quarkslab.com] is mostly theoretical, but that's because it is incredibly slow and uses huge amounts of memory, not because you can't write conditionals.

          You can compute anything using FHE that you can with any other Turing machine. As long as you can wait long enough [arxiv.org].

          For certain (3 letter?) agencies, knowing that the data always stays encrypted may be worth the trade-off of slow. Not for all situations, of course, but for very specific targeted uses.

    • by serviscope_minor ( 664417 ) on Tuesday March 10, 2026 @08:19PM (#66034428) Journal

      You don't need to trust the system, that's the beauty of FHE.

      You encrypt the data and send it to the FHE chip presumably in the cloud along with your code. It crunches the code and the output is still encrypted because it doesn't have the keys. You get it back and decrypt it.
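      That flow can be sketched with the Paillier cryptosystem, which is additively homomorphic (full FHE adds multiplication as well): the server sums encrypted values by multiplying ciphertexts and never needs the key. Toy parameters and fixed randomness, insecure, illustration only:

```python
# Paillier: E(m1) * E(m2) mod n^2 decrypts to m1 + m2.
# The "cloud" computes on ciphertexts without the private key.
from math import gcd

p, q = 293, 433                  # toy primes -- insecure by design
n, n2 = p * q, (p * q) ** 2
g = n + 1                        # standard generator choice
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)   # private decryption constant

def encrypt(m, r):
    # c = g^m * r^n mod n^2; r should be fresh randomness coprime to n
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Client side: encrypt and send to the cloud.
c1 = encrypt(1000, 17)
c2 = encrypt(234, 29)

# Cloud side: add under encryption -- multiply ciphertexts, no key.
c_sum = (c1 * c2) % n2

# Client side: decrypt the returned result.
assert decrypt(c_sum) == 1234
```

      The cloud only ever sees ciphertexts in and ciphertexts out; whether the output leaks anything is a property of the encryption, not of the server's honesty.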

      • Nope, you still need to trust the system. It doesn't know the exact data, but it does know the decision process for generating output based on the encrypted data. That means that just like unmasking people using TLS to connect to websites, the metadata (what decisions are possible / which ones are taken / when they are taken / who is doing the processing / the time it takes to process that data / the time the data was queued for processing / etc.) is important and can be used to defeat the encryption.
      • You don't need to trust the system, that's the beauty of FHE.

        You run some software on your device, and you trust it to actually encrypt the data, and not exfiltrate stuff through a side channel.

  • Now imagine this isn't your workloads running on a cloud server, but software you paid for running on your own computer.

    Working DRM would be the worst invention in the history of computing, and specialized hardware that can hide things from a root user makes it much more plausible.

    • Yep. Encrypted software running in an encrypted system that is completely opaque and only the results are available. User, developer, and owner hostile. Perhaps the only use of it is for running stuff on other people's untrusted machines without disclosing what the data or code is doing.
      • Aka an accountability nightmare for anyone concerned with such things, and a gift from god for those with ill-intent. Of course it will be deployed everywhere in a few years under mandate.

        Or at least it would be, if unlike all other forms of "use other people's resources with impunity" it actually worked long term. The reality is that analytics will get better and those pushing this will quickly move on to the next "secure" chip that they can sell you to "protect" your data from the prying eyes of the
  • Yeah, that's gonna happen.
  • How do you work on an encrypted set of data without decrypting it? This sounds like absolute BS, and coming from Intel, with their recent track record, it probably is. I mean if a processor can work on encrypted data without decrypting it and knowing what the data actually is, what's to stop someone from writing code making that processor spit out all the plain text of that encrypted data?
    • by HiThere ( 15173 )

      I think it's real. This isn't the first time I've heard about it. So far it hasn't been practical. If this one actually works, trusting it is going to depend on trusting the implementation, not on understanding or auditing it.

      • by gweihir ( 88907 )

        It is real: https://en.wikipedia.org/wiki/... [wikipedia.org]

        But it is exceptionally slow. This is something you only do when you have no choice.

        • Ok, but I still think it's BS that Intel has anything close to working at a decent performance level in this field. If anything I would consider this field to be up Nvidia's alley, and if they don't have anything going on here, I have little hope that anything Intel has is anything more than smoke and mirrors.
      • trusting the implementation, not on understanding or auditing it.

        That's called blind faith, not trust. And these kinds of "secure" chips have had serious implementation bugs before. Not being able to audit it is just asking to get owned.

    • by ledow ( 319597 )

      Homomorphic encryption is well-documented, it's just incredibly slow with conventional technology.

      You can do any binary process on encrypted data using homomorphic encryption - it will modify the encrypted data in-situ without ever needing or knowing what the unencrypted data is. It literally doesn't care, and can't tell.

      Think of it like running, say, "AND" or "OR" Boolean commands on specially-encrypted data. You design it in such a way that the "AND"/"OR" processes manipulate the encrypted data. Which,
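      The gate-level intuition shows up even in the simplest possible setting: under a one-time pad, XOR is already homomorphic for free; the entire difficulty of FHE is making AND work too, since XOR plus AND suffices for any Boolean circuit. A sketch:

```python
# Under a one-time pad (XOR with a key bit), XORing two ciphertexts
# yields an encryption of the XOR of the plaintexts -- the server
# manipulates encrypted bits without knowing or caring what they are.

def encrypt(bit, key):
    return bit ^ key

def decrypt(cbit, key):
    return cbit ^ key

k1, k2 = 1, 0                 # secret key bits (fixed here for brevity)
m1, m2 = 1, 1
c1, c2 = encrypt(m1, k1), encrypt(m2, k2)

# Server side: XOR the ciphertexts with no key at all.
c_xor = c1 ^ c2

# Client side: decrypt with the combined key k1 ^ k2.
assert decrypt(c_xor, k1 ^ k2) == m1 ^ m2

# AND does not carry over this way: (c1 & c2) is not an encryption of
# m1 & m2 under any key the client can derive -- supporting both gates
# at once is what the lattice-based FHE schemes exist to solve.
```
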

      • Microsoft can remove old database entries before a certain age

        And the reason you'd want them to? Doesn't matter if it's encrypted or not, if Microsoft is deleting your data, there's problems with that alone. Never mind that you just violated your assumption of "Microsoft doesn't know what's in your SQL database." They know the entries in question are older than the given criteria. Every single operation performed is revealing more metadata about the contents of the database, creating more and more violations through side-channels. Make enough of those violations, and

        • by ledow ( 319597 )

          You literally aren't understanding.

          It means a foreign host can host your database, perform database actions that you ask them to, and at no point reveal anything to that host about the contents of your database.

          Microsoft CAN, on your instruction, remove a row, or filter, or sort or whatever SQL you want to do, on your database, without knowing what the data is. The database will be manipulated from an encrypted database WITHOUT that action being performed to an encrypted database WITH that action having been performed.

  • Here's a better idea (Score:4, Informative)

    by MpVpRb ( 1423381 ) on Tuesday March 10, 2026 @10:03PM (#66034506)

    Don't use the cloud
    The cloud is a trap
    Run away

  • For that reason, this is hardware that has very limited use.

  • Either this will be fraud and the chip will somehow decrypt your data, or more likely, they'll use the same think of the children shit to attack it that they use on end-to-end encryption.
  • Wastes a lot of money on specialized tech and electricity, doesn't add value anywhere else, and permits hiding their CSAM and stolen PII.
  • Hello,

    I looked at that a while ago in the context of voting systems. And yes, you can perform any operations on fully encrypted data. The flow is: user encrypts, sends data to system, system performs the algorithm, system sends data to user (might be another user) and decryption happens there...

    Problem is it's slow...

    So, they are proposing to solve the slowness...

    Great.. But it would still be faster to do the compute on your own computer on unencrypted data!!!!

    This seems like a solution looking for a problem!

    C

  • Whether you are stealing programmers' and creators' hard earned work, or making undressing porn of someone's kid sister, or helping bomb another country, you can sleep well at night--until your computer gets infected with work units you didn't ask for, and you have high power bills, and very little computing results.
