Software Security

Computer Scientists Develop 'Mathematical Jigsaw Puzzles' To Encrypt Software

another random user writes "The claim here is that the encrypted software can be executed, but not reverse-engineered. To quote from the article: 'UCLA computer science professor Amit Sahai and a team of researchers have designed a system to encrypt software so that it only allows someone to use a program as intended while preventing any deciphering of the code behind it. According to Sahai, previously developed techniques for obfuscation presented only a "speed bump," forcing an attacker to spend some effort, perhaps a few days, trying to reverse-engineer the software. The new system, he said, puts up an "iron wall," making it impossible for an adversary to reverse-engineer the software without solving mathematical problems that take hundreds of years to work out on today's computers — a game-change in the field of cryptography.'"
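
The claim is easier to parse with a toy example of the goal (not the researchers' construction): a program that can be used exactly as intended, here a password check in Python, while its shipped form does not give away the secret it tests.

    # Toy illustration only - not the researchers' construction. The goal it
    # mimics: a program you can use as intended (it tells you whether a guess
    # is right) whose shipped form does not reveal the secret it tests.
    import hashlib

    # A shipped build would contain only this hex digest, precomputed offline;
    # it is computed inline here so the demo stays self-contained.
    STORED_DIGEST = hashlib.sha256(b"correct horse battery staple").hexdigest()

    def check_password(candidate: str) -> bool:
        """Accept exactly one secret input without comparing plaintext."""
        return hashlib.sha256(candidate.encode()).hexdigest() == STORED_DIGEST

    print(check_password("wrong guess"))                   # False
    print(check_password("correct horse battery staple"))  # True
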
This discussion has been archived. No new comments can be posted.


  • I Call BS (Score:5, Insightful)

    by MightyMartian ( 840721 ) on Thursday August 01, 2013 @12:36AM (#44443399) Journal

    I'm sure they can further obfuscate the actual code, but at the end of the day the processor is going to have to run machine code, and one way or the other you can tap the processor's activity to read the "decrypted" code. Beyond that, I imagine the performance penalties involved will be monstrous. Even normal obfuscation techniques have pretty heavy penalties.
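
    A minimal sketch of that point in Python: once code actually runs on a host you control, a tracing hook sees every call and line, hidden logic included. The routine and its constants below are invented for the demo.

        # Minimal sketch: code that runs on a host you control can be watched
        # as it runs. Python's built-in tracing hook reports every call and
        # line, so the "hidden" arithmetic below shows up in the trace.
        import sys

        def hidden_routine(x):
            key = (x * 31 + 7) % 256
            return key ^ 0x5A

        def tracer(frame, event, arg):
            print(f"{event:<7} {frame.f_code.co_name}:{frame.f_lineno} "
                  f"locals={frame.f_locals}")
            return tracer

        sys.settrace(tracer)     # install the hook
        hidden_routine(42)       # every step is now reported
        sys.settrace(None)       # remove the hook
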

  • by technosean ( 169328 ) on Thursday August 01, 2013 @12:38AM (#44443417)

    Who gains from this? Most of all? Virus writers.

    Perhaps the real solution is to have the OS never execute code looking anything like this.

  • make a crackme (Score:3, Insightful)

    by ZeroNullVoid ( 886675 ) on Thursday August 01, 2013 @12:41AM (#44443433)

    If they really think it is so good, then they should put their money where their mouth is.
    Make it into a crackme, offer a large reward for solving it.
    Post it online. I give it a few weeks max, if that.
    And who is to say it can't still be manipulated once running?
    Think of the performance cost.

    Either way, I have no faith in an article with so little detail.

  • by MightyMartian ( 840721 ) on Thursday August 01, 2013 @12:44AM (#44443447) Journal

    Yup. At the end of the day, if this is to be at all usable on the hardware and OSs out there now, it's still going to have to execute, and if it executes, you can run it through a debugger and watch it.

  • Re:I Call BS (Score:5, Insightful)

    by MightyMartian ( 840721 ) on Thursday August 01, 2013 @12:46AM (#44443459) Journal

    Unless the software can magically tell that it's running in a debugger or a sandbox and refuse to execute, its activity, from stack activity to system calls to memory allocations, can be traced.
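
    A concrete sketch of that kind of outside-in tracing on Linux, assuming strace is installed; the target binary name is made up.

        # Sketch of outside-in tracing on Linux. Assumes strace is installed;
        # "./encrypted_app" is a hypothetical target binary.
        import subprocess

        subprocess.run(
            ["strace",
             "-f",                  # follow any child processes
             "-o", "syscalls.log",  # write the trace to a file
             "./encrypted_app"],
            check=False,
        )

        # The log lists every system call made - files opened, memory mapped,
        # sockets created - no matter how the code that asked for them is packed.
        print(open("syscalls.log").read()[:1000])
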

  • by wierd_w ( 1375923 ) on Thursday August 01, 2013 @12:52AM (#44443501)

    One way around this (for reverse engineering) would simply be to run it inside a VM with a built-in stack-trace debugger, like Bochs.

    You can peek at the raw instructions percolating through the VM's emulated CPU that way. The application itself is then the decryption key, since the system has to be able to run the application.

    PITA, but I don't see how this is anything more than just a "bigger" speedbump, at least as far as disassembly goes. Now, making said decryption require nasty bits of hardware to thwart using a VM, and clinging tightly to treacherous computing with a TPM and a hardware hypervisor on top of this? That's more dicey.

    The question here is... why do this? Is preventing Cracker Jack from the warez scene from simply disabling your "authenticity checks" so horrifically important? That's the major application for this that I really see, besides making epically horrific malware. (Though if you ask me, they are both the same thing.)

    Seriously, other than turning personal computing into something out of communist Russia, what is the benefit of this?
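
    Following the suggestion above of peeking at the raw instructions moving through an emulated CPU, here is a rough sketch of the last step, turning captured instruction bytes back into assembly, using the third-party capstone package; the bytes and load address are placeholders.

        # Rough sketch: instruction bytes captured from an emulator or memory
        # dump can be disassembled back into readable assembly. Requires the
        # third-party capstone package (pip install capstone); the bytes and
        # address below are stand-ins, not a real dump.
        from capstone import Cs, CS_ARCH_X86, CS_MODE_64

        dumped = bytes.fromhex("554889e5b82a000000c9c3")  # push rbp; mov rbp,rsp; mov eax,42; leave; ret
        base = 0x401000                                    # hypothetical load address

        md = Cs(CS_ARCH_X86, CS_MODE_64)
        for insn in md.disasm(dumped, base):
            print(f"0x{insn.address:x}\t{insn.mnemonic}\t{insn.op_str}")
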

  • Re:I Call BS (Score:5, Insightful)

    by MightyMartian ( 840721 ) on Thursday August 01, 2013 @01:44AM (#44443757) Journal

    That's because VMs really don't try to hide themselves from the guest. While it might be pretty hard to build a VM that does a good enough job of fooling software trying to identify whether it's running on physical or virtualized hardware, we have the source for a number of VMs (e.g. KVM and Xen), and if this kind of obfuscated software started showing up on the market, I'm sure there would be a much greater push to make a rock-solid virtualized environment that mimicked physical hardware with much more fidelity.
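
    For a sense of what such software actually checks from inside the guest, here is a crude Linux-only sketch of common hypervisor tells; a rock-solid VM would need to mask all of these and more.

        # Crude guest-side VM detection on Linux - the kind of check a
        # "rock solid" virtualized environment would have to defeat. The
        # flag and DMI strings below are common tells, not an exhaustive list.
        from pathlib import Path

        def looks_virtualized() -> bool:
            # The CPUID hypervisor bit shows up as a flag in /proc/cpuinfo.
            if " hypervisor" in Path("/proc/cpuinfo").read_text():
                return True
            # DMI strings often name the virtual platform outright.
            dmi = Path("/sys/class/dmi/id/product_name")
            if dmi.exists():
                name = dmi.read_text().strip().lower()
                if any(v in name for v in ("virtualbox", "vmware", "kvm", "qemu")):
                    return True
            return False

        print("virtualized" if looks_virtualized() else "no obvious hypervisor tells")
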

  • Re:I Call BS (Score:4, Insightful)

    by MikeBabcock ( 65886 ) <mtb-slashdot@mikebabcock.ca> on Thursday August 01, 2013 @01:59AM (#44443823) Homepage Journal

    This exactly -- virus writers have been at the forefront of code that hides and obfuscates itself, and VM-type systems were developed to essentially run the code in a controlled environment to determine its effects without letting it loose for real.

    So long as the code can be executed, it can be reverse-engineered.
