

Imparting Malware Resistance With a Randomizing Compiler
First time accepted submitter wheelbarrio (1784594) writes with this news from the Economist: "Inspired by the natural resistance offered to pathogens by genetically diverse host populations, Dr Michael Franz at UCI suggests that common software be similarly hardened against attack by generating a unique executable for each install. It sounds like a cute idea, although the article doesn't provide examples of what kinds of diversity are possible whilst maintaining the program logic, nor what kind of attacks would be prevented with this approach." This might reduce the value of MD5 sums, though.
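The Economist piece is light on detail, but one family of semantics-preserving transformations such a compiler could use is layout randomization: for example, inserting NOPs or shuffling code so that every build ends up with different addresses while the program logic is untouched. A rough Python sketch of that idea (a toy pass over a textual assembly listing, purely illustrative and not Franz's actual toolchain):

import random

def diversify_asm(asm_lines, seed, nop_rate=0.5):
    # Toy semantics-preserving pass: sprinkle NOPs into an x86 listing so
    # that each install gets a unique code layout. Illustrative only.
    rng = random.Random(seed)              # per-install seed -> unique binary
    out = []
    for line in asm_lines:
        out.append(line)
        stripped = line.strip()
        # Pad only after instructions, not labels or assembler directives.
        if stripped and not stripped.endswith(":") and not stripped.startswith("."):
            if rng.random() < nop_rate:
                out.append("    nop")
    return out

listing = ["main:", "    push rbp", "    mov rbp, rsp",
           "    xor eax, eax", "    pop rbp", "    ret"]
print(diversify_asm(listing, seed=1) != diversify_asm(listing, seed=2))
# True: two installs, same behaviour, different layouts.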
Cute but dumb (Score:5, Insightful)
Would cause major debugging headaches (Score:5, Insightful)
Can you imagine parsing a stack trace or the equivalent from one of these? Every stack trace would be different.
And that's ignoring the fact that Heisenbugs would become far more prevalent.
Part of programming is paring down states. The computer is an (effectively) infinite-state machine; when you add bounds and checks you're reducing the number of states. This would add a great many states back, making bugs more prevalent. Since a lot of attacks are based on bugs, it may actually increase the likelihood of some attacks.
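To make the stack-trace problem concrete: with per-install layouts, a raw crash address means nothing unless you also have that particular install's own symbol map. A hedged Python sketch (the map format here is hypothetical):

import bisect

def symbolize(addr, symbol_map):
    # symbol_map: sorted list of (start_address, name) for ONE build.
    starts = [s for s, _ in symbol_map]
    i = bisect.bisect_right(starts, addr) - 1
    return symbol_map[i][1] if i >= 0 else "??"

# The same logical function lands at different addresses in two installs:
build_a_map = [(0x1000, "parse_input"), (0x1480, "handle_request")]
build_b_map = [(0x1000, "handle_request"), (0x13a0, "parse_input")]

crash_addr = 0x14f0
print(symbolize(crash_addr, build_a_map))  # handle_request
print(symbolize(crash_addr, build_b_map))  # parse_input -- same address, different meaning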
....why? (Score:5, Insightful)
..would a professor of CompSci think this is a good idea, despite the hundreds of problems it *causes* with existing practices and procedures?
Oh, wait.. maybe because the idea is patented and he'll get paid a lot.
http://www.google.com/patents/US8239836
So we're stuck with the source then? (Score:4, Insightful)
So should we use something like ABS with that randomisation enabled? Or should we trust downloading a distinct blob every time? For the latter: nice try, NSA, but I don't want you to be able to incorporate spyware into my download without it being noticed.
It's already a pity that software gets signed by only so few entities (usually one at a time, at least for deb). Perhaps I know that the blob came from Debian, but I can't verify whether it is the version the public gets, or the special version with some ... extra features. The blobs should be signed by more entities, so that all of them would have to be NSLed.
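If you wanted that "more signers" property today, one low-tech approach is to require a quorum of independent detached signatures before trusting a blob. A rough sketch using the stock gpg CLI (the quorum policy and file layout are my own assumptions, not anything Debian actually ships):

import subprocess

def signatures_that_verify(blob_path, sig_paths):
    # Each entry in sig_paths is a detached signature; the signers' public
    # keys must already be in the local keyring.
    ok = []
    for sig in sig_paths:
        result = subprocess.run(["gpg", "--verify", sig, blob_path],
                                capture_output=True)
        if result.returncode == 0:
            ok.append(sig)
    return ok

def trusted(blob_path, sig_paths, quorum=3):
    # Only accept the download if enough independent entities signed it,
    # so a single NSL'ed signer can't slip a special build past you.
    return len(signatures_that_verify(blob_path, sig_paths)) >= quorum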
Re:Cute but dumb (Score:5, Insightful)
Re:Cute but dumb (Score:2, Insightful)
And it would make that buggy software nearly impossible to patch.
Every time a security vulnerability is found, you'd essentially have to reinstall the whole application.
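Concretely, ordinary delta patching breaks because a binary patch is built against one reference build and checks the base file's hash before applying. A toy Python sketch (the {offset: byte} delta format is just a stand-in for something like bsdiff):

import hashlib

def apply_delta_patch(base, expected_base_sha256, delta):
    # delta: {offset: replacement_byte}, built against ONE reference binary.
    # A diversified install has a different base hash, so the precondition
    # fails and the user is back to downloading the whole application.
    if hashlib.sha256(base).hexdigest() != expected_base_sha256:
        raise ValueError("binary does not match the reference build this patch targets")
    patched = bytearray(base)
    for offset, value in delta.items():
        patched[offset] = value
    return bytes(patched)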
Knock on wood, but I've not had enough bad experiences with malware to think the tradeoff is worth it.
Explain Like I'm Five (Score:5, Insightful)
The problem with this in "Explain like I'm Five" terms:
You can have no idea what the program you are running does.
You cannot trust it. You cannot know it hasn't been tampered with. You cannot know a given copy works the same as another copy. You cannot know your executable has no back doors.
On the security-minded front there's a trend towards deterministic builds, so that we have some confidence, and a way of validating, that the source-to-executable transformation hasn't been tampered with and that the binaries you just downloaded really were generated from the published source code.
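The deterministic-build check boils down to: rebuild the source yourself and compare your artifact, digest for digest, with what the vendor shipped. A minimal Python sketch, with the build command and paths as placeholders:

import hashlib, subprocess

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_official_build(build_cmd, local_artifact, official_artifact):
    # Run a (hopefully reproducible) build, then compare outputs.
    subprocess.run(build_cmd, check=True)
    return sha256_of(local_artifact) == sha256_of(official_artifact)

# e.g. matches_official_build(["make", "all"], "build/app", "downloads/app")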
Another technique I'm seeing in security-conscious environments is executable whitelisting, where IT hashes and whitelists approved executables, and anything not on the whitelist is flagged and/or rejected.
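A whitelist like that is basically a set-membership test on file digests. A small sketch, with the paths and the whitelist source being illustrative:

import hashlib
from pathlib import Path

def sha256_of(path):
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def not_whitelisted(executables, approved_digests):
    # Flag anything whose digest IT hasn't approved.
    return [p for p in executables if sha256_of(p) not in approved_digests]

# With per-install diversified binaries every copy hashes differently,
# so digest whitelisting like this simply stops working.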
Now this guy comes along and runs headlong in the other direction, suggesting every executable should be different. And I'm not sure I see any real benefit, never mind one that offsets the losses outlined above.
Re:Cute but dumb (Score:5, Insightful)
Re:the crutch of determinism (Score:4, Insightful)
All in all, your post reads like a smug "Code better, noob!" while completely ignoring the tremendous extra cost of properly testing hundreds of thousands of randomized builds for consistency.