In Face of Flame Malware, Microsoft Will Revamp Windows Encryption Keys
coondoggie writes "Starting next month, updated Windows operating systems will reject encryption keys smaller than 1024 bits, which could cause problems for customer applications accessing Web sites and email platforms that use the keys. The cryptographic policy change is part of Microsoft's response to security weaknesses that came to light after Windows Update became an unwitting party to Flame Malware attacks, and affects Windows XP, Windows Server 2003, Windows Server 2003 R2, Windows Vista, Windows Server 2008, Windows 7, and Windows Server 2008 R2 operating systems."
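For illustration only, the policy amounts to a minimum-length check on the RSA modulus. A minimal sketch of that idea (assuming the check is a simple bit-length comparison; this is not Microsoft's actual implementation):

```python
def key_is_acceptable(modulus: int, minimum_bits: int = 1024) -> bool:
    """Return True if an RSA public modulus meets the minimum length."""
    return modulus.bit_length() >= minimum_bits

weak = 2 ** 511 + 1     # a 512-bit number, standing in for an old key
strong = 2 ** 2047 + 1  # a 2048-bit number

print(key_is_acceptable(weak))    # False: rejected under the new policy
print(key_is_acceptable(strong))  # True
```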
Moles at Microsoft and apple (Score:1, Insightful)
Re:Moles at Microsoft and apple (Score:5, Interesting)
That's a pretty serious "fact". And not to sound like a smart-ass Wikipedia editor, but some kind of citation would be great.
One can certainly believe there are moles at Microsoft/Apple. One can even reasonably assume that the United States Government has the power to compel Microsoft/Apple to do things that are in the U.S.'s best interests. However, for a foreign mole to be able to insert back doors into the Windows source code - which, I would add, is fairly well vetted, since most governments and educational institutions have read access to the source - would be quite remarkable, to the point of being unbelievable.
Re:Moles at Microsoft and apple (Score:5, Insightful)
Indeed. Why have a mole try to alter the code, and run the risk of being discovered, when you have a copy of the source, and can find existing bugs to use?
Re:Moles at Microsoft and apple (Score:5, Interesting)
True, but as ITWorld's Kevin Fogarty says:
Still, the assumption seems to be true metaphorically, if not physically, so it's safer to assume Microsoft and its software have both been compromised. Given the track record of Stuxnet, Duqu and Flame for compromising everything they're aimed at, that assumption isn't even much of a stretch.
http://www.itworld.com/security/281553/researcher-warns-stuxnet-flame-show-microsoft-may-have-been-infiltrated-nsa-cia [itworld.com]
Personally, I use Linux because it's lower maintenance and less overhead, and gets out of my way when I'm working, but if I was a business lead, I'd certainly be avoiding Windows for anything requiring data security. The wonder is that we're not seeing users suing over compromised data/systems.
Re: (Score:1)
Well - there are reasons for that.
First - the EULA especially gives no guarantee your data will be preserved, uncompromised, or protected. Yes, that's right - the end user has no guarantee the software he/she/it uses is safe, and Microsoft is not responsible for anything. Great, hm?
Now - nothing is guaranteed with Linux either, but at least every end user can go through the source code to discover faults. Most end users will not do that, or are not capable of doing that, but at least the chances of discovery are b
Re: (Score:3)
"but at least every end user can go through the source code to discover faults"
Yes, sure they can. And every user has such ability, and kernel-level hacking skills. And each and every individual and company should employ people to do this.
Yessss, right. While this is philosophically a nice point, I have to say that the real-world aspect is delusional. And I'll add another point. It's open source. If governments can insert code into MS codebases, they can do so with any open source. The fact is they might only
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
How many need it? Submit a patch that fixes a known issue, and quietly creates a new one. Who is going to know, off-hand, that the code causes a hardware glitch at regular intervals on Intel / AMD processors that causes the kernel to check a memory location and execute any code found there?
It's not like someone is going to submit a patch that looks like "LOL. if(username == "w00t!") { privilegeescalationcode(); fork(); } "; they aren't even going to do "code that patches something(); code for a completely u
Re: (Score:3, Insightful)
Personally, I use Linux because it's lower maintenance and less overhead, and gets out of my way when I'm working, but if I was a business lead, I'd certainly be avoiding Windows for anything requiring data security. The wonder is that we're not seeing users suing over compromised data/systems.
I know, right... What are the chances, out of the bazillion open source projects that go into your average Linux distribution, that any of them could be infiltrated by a three-letter agency from this or any other nation... Impossible.... totally ...utterly..... impossible... ..right...?
I know some people will say, well, it's open source, others would have the code and would just know. Just like they knew about that Debian "SSL patch"... Or any of hundreds of "innocent" security bugs having later been discovered by attack
Re: (Score:3)
How long was kernel.org compromised? Without anyone knowing?
We don't know. At least 17 days was the initial assessment, but kernel.org has - despite claims about openness - never gotten around to writing up a post mortem.
Many of us would like to know what the attack vector was. How a compromised *user* account could lead to the kernel being rooted. How a relatively amateurish infection (Phalanx is an old, known, and pretty trivial rootkit) could penetrate so deeply into infrastructure run by the people most knowledgeable on Linux *in the world*.
Was the actual attack which
Re: (Score:2)
Re: (Score:1)
when you have a copy of the source, and can find existing bugs to use?
--
“Forget Jesus. The stars died so you could be born.”
Re: (Score:2)
Because it's difficult to place bugs in the source code that won't be discovered during quality testing (laugh with me, but yes, some companies do do it) or by your fellow developers.
Re:Moles at Microsoft and apple (Score:5, Insightful)
Citation: my contacts at Microsoft and apple. Obviously I can't name names.
Obviously you can't be taken seriously, either. It's not that I don't believe you, it's that I can't ever cite you.
Re: (Score:3)
Re: (Score:1)
Long live TrueCrypt!
Re:Moles at Microsoft and apple (Score:5, Insightful)
Others have come to the same conclusion as noh8rz5
Well, I know this is one of those things annoying people say to be annoying, but the plural of anecdote is not data. I have come to the same conclusion, too, but I don't state it as fact, because there's no citable evidence.
Re:Moles at Microsoft and apple (Score:4, Insightful)
Also, I seriously doubt a 'contact' at Apple or Microsoft is going to know about spies.
Re: (Score:2)
The NSA can break the encryption anyway if they put the effort into the task.
Re: (Score:3)
They can do that on their 2k-qubit quantum computer?
Re: (Score:3)
Well, until MS explains what the NSAKey does, I'll just assume the worst.
http://web.archive.org/web/20000520001558/http://www.microsoft.com/security/bulletins/backdoor.asp [archive.org]
You could have stopped assuming the worst over a decade ago. If you really think that the NSA would allow its back door to carry such an obvious name, then you need to get your head checked. Here is the sort of back door I might be willing to attribute to the NSA, but even this seems a little too obvious:
http://www.wired.com/politics/security/commentary/securitymatters/2007/11/securitymatters_1115 [wired.com]
Re:Moles at Microsoft and apple (Score:5, Informative)
Not to mention:
http://www.schneier.com/crypto-gram-9909.html#NSAKeyinMicrosoftCryptoAPI [schneier.com]
Re: (Score:2)
Re: (Score:2)
Re:Moles at Microsoft and apple (Score:5, Informative)
Well said.
Nothing compels you to run Microsoft's encryption APIs either. They are convenient, and well documented, so most programmers do use them, but you can write or bring your own from any platform you trust. If your platform is backdoored none of this will help you much.
The assertion is that there are backdoors, in spite of no one ever finding one and every single person in the chain of knowledge for the last 20+ years keeping their mouth shut right into the grave.
Re: (Score:2)
Re: (Score:2)
However for a foreign mole to be able to insert back doors into the Windows source code - which I would add is fairly well vetted since most governments and educational institutions have read access to the source - would be quite remarkable to the point of being unbelievable.
Yep. Anything that totally compromises the system would have been discovered by now. There's a lot of people interested in finding such things.
Re: (Score:3)
Yes but if the key length is sufficiently large they lose plausible deniability.
No one's going to believe that anyone brute-forced a 2048-bit cert key, but if you start mentioning MD5 and keys of 1024 bits or less, then it could be anyone.
Re: (Score:2)
The Flame certificate's key was replaced with a shorter RSA key due to bad usage of crypto by Microsoft.
The estimated computational effort to crack a 1024-bit RSA key is quite large (but maybe, maybe feasible if you can devote large amounts of computational time to it). But 2048-bit RSA? No, not unless a three-letter agency has figured out a new way to factor integers.
Re: (Score:2)
With current (non secret) algorithms? Well, they could do it fast (in geological terms), but they'll need a star thousands of times larger than our Sun powering a computer running near to the optimum efficiency.
Somehow I doubt the NSA has such infrastructure.
Re: (Score:1)
Somehow I doubt the NSA has such infrastructure.
If they don't yet, they sure as hell will soon [wired.com]
Re: (Score:2)
Better yet.. don't answer that.. instead simply explain why your 4-digit Slashdot ID isn't backed up by someone aware?
Re: (Score:2)
You mean his FIVE-digit slashdot ID right? Or did you find a fancy new factoring algorithm ?
Re:Moles at Microsoft and apple (Score:4, Informative)
I wonder how long it would take a modern cray or a cluster of 1,000 computers to crack a 2048 cert?
Throwing 1000 computer instances at the problem does not make the difference you think it does.
Re:Moles at Microsoft and apple (Score:5, Informative)
Even if you know that it's the square of the power required to crack 1024-bit certs, which themselves are the square of the power to crack 512-bit certs, which are themselves the square of the power to crack 256-bit certs.. when you are ignorant of how much power THAT is, you are still just guessing.
No organization on earth considers the breaking of 256-bit hashes/encryption trivial. That's a 1 followed by a whopping 77 zeros. That's only about 3 zeros away from the number of baryons in the entire visible universe.
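The magnitude claim is easy to verify: a 256-bit keyspace is a 78-digit number, i.e. roughly 1.16 x 10^77:

```python
# Size of the keyspace for a 256-bit key.
keyspace = 2 ** 256
print(len(str(keyspace)))  # 78 digits, on the order of 1e77
```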
Re: (Score:3)
What are you on about? 1024-bit RSA could reasonably be cracked within the next 10 years.
This is RSA; it's not as strong as AES for the same bit length.
http://en.wikipedia.org/wiki/Key_size#Asymmetric_algorithm_key_lengths [wikipedia.org]
According to the Wikipedia approximation, you need about 4 billion times the compute instances to crack a 2048-bit key compared to a 1024-bit one.
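The "4 billion" figure comes from the heuristic running time of the general number field sieve. A back-of-the-envelope check, assuming the standard L-notation exponent and ignoring constant factors:

```python
import math

def gnfs_log_cost(bits: int) -> float:
    """Natural log of the heuristic GNFS running time for a modulus of the
    given bit length: L_n[1/3, (64/9)^(1/3)], constant factors ignored."""
    ln_n = bits * math.log(2)
    return (64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)

# Relative difficulty of factoring a 2048-bit modulus vs. a 1024-bit one.
ratio = math.exp(gnfs_log_cost(2048) - gnfs_log_cost(1024))
print(f"{ratio:.1e}")  # on the order of 1e9: billions of times more work
```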
Re: (Score:2)
Symmetric ciphers like AES are secure at short key lengths like 128-256 bits. For asymmetric ciphers like public/private key encryption, you need to go up to 1024 or 2048 bits in order to get the same level of protection (same approximate resis
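The equivalences the (truncated) comment is pointing at are tabulated in NIST SP 800-57 Part 1; as a quick lookup (approximate guidance, not exact conversions):

```python
# Symmetric security strength (bits) -> comparable RSA modulus length (bits),
# per NIST SP 800-57 Part 1. Approximate guidance values.
NIST_EQUIVALENT_RSA_BITS = {
    80: 1024,
    112: 2048,
    128: 3072,
    192: 7680,
    256: 15360,
}

# A 2048-bit RSA key offers only about 112 bits of security: less than
# AES-128 despite the much longer key.
print(NIST_EQUIVALENT_RSA_BITS[112])  # 2048
```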
Re:Moles at Microsoft and apple (Score:5, Informative)
Yes and no. Open source doesn't guarantee security. For example, BIND had a long history of bugs (many of which involved security) due to poor design prior to version 9. You didn't need a mole or any malicious intent when the software was so full of big holes you could drive your car through them. OpenBSD had an alleged FBI back door [marc.info] in the news a couple years ago that had lain unnoticed for years.
Then again, there are examples of open source uncovering security issues. A quick google search uncovered this old one [freebsd.org] and this more recent one [freebsd.org]. By the way, if it sounds like I'm picking on BSD, I was searching for that FBI link. The other stuff just popped up. I know the various BSDs have a reputation for stability and security.
Re: (Score:1)
I think the question that GP hinted at was if you can not do code review, can it ever be secure. Code that is closed source only leaves one method to measure security, and that’s past performance. Based on past performance, you can make an educated guess over how secure something is. In that sense, new products and new market actors with closed software should never, ever, be trusted in anything requiring security. Without a past performance track record there is a complete lack of evidence to provide
Re:Moles at Microsoft and apple (Score:5, Interesting)
Why do people assume there is a large group of developers who actually understand OS source code and are capable of locating and correcting any problems found? Most of the people with the necessary skills to do this are already busy working for companies that actually pay them for their services. The vast majority of security issues are discovered by companies and individuals who specialize in this area and expect payment for their services. OS troubleshooting and development also require well-stocked labs to test all of the different permutations of hardware and software behaviors. The low-hanging fruit has already been grabbed, which forces deeper analysis of the OS code to locate potential issues and determine the impact proposed changes will have. Just because someone is halfway competent in application development does not mean they have the skills needed to understand OS development. OS development is quite different from application development. Just downloading the OS source code and building it can be a gigantic pain in the ass when trying to sort out all of the dependencies and compiler configurations for a particular environment.
If you want a secure system, you are better off making sure the system administrators and application developers are doing their jobs. Some of the most harmful security exploits have targeted known issues that were corrected long before someone started exploiting them. And those happen because system administrators failed to stay current on their security-related service packs.
Re:Moles at Microsoft and apple (Score:5, Informative)
The only way out of this is to use an open source operating system where you can do your own code review
Have you ever tried to do this? I have tried, and trust me, no single person can review all of the software that runs on their system. There are a lot of places where a back door could be hiding, especially if you are talking about cryptography. Even something as seemingly innocuous as the default environment variables that programs see could be part of a back door (in case anyone does not know, the length of the environment variables can affect the alignment of memory, which can affect cache misses and potentially amplify a side channel).
Have you reviewed the millions of lines in the Linux kernel? Have you reviewed OpenSSL? Have you reviewed GnuPG? Have you reviewed glibc, libstdc++, ld, bash, gcc, your python interpreter, your X server, your email client, your web browser, etc?
Re: (Score:2)
Have you reviewed the millions of lines in the Linux kernel? Have you reviewed OpenSSL? Have you reviewed GnuPG? Have you reviewed glibc, libstdc++, ld, bash, gcc, your python interpreter, your X server, your email client, your web browser, etc?
There was a story a while ago with some reasonable mathematics that showed that if your CPU delivers a predictable incorrect result for _one_ 64x64 bit multiplication then this could be used to crack certain encryption. And that would be totally undetectable.
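That result sounds like the classic Bellcore fault attack on RSA-CRT: a single wrong arithmetic result in one half of a CRT signature lets an attacker factor the modulus with a gcd. A toy sketch with deliberately tiny, insecure parameters (the fault here is an artificial bit flip, standing in for the hardware glitch described above):

```python
import math

# Toy RSA parameters -- far too small to be secure, illustration only.
p, q = 1000003, 1000033            # small primes
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (Python 3.8+)

m = 42  # "message" (really a hash) to sign

def crt_sign(fault: bool) -> int:
    """Sign m with the CRT optimization; optionally inject a fault."""
    s_p = pow(m, d % (p - 1), p)
    s_q = pow(m, d % (q - 1), q)
    if fault:
        s_q ^= 1  # one wrong bit in the mod-q half, like the glitch above
    h = (pow(q, -1, p) * (s_p - s_q)) % p  # Garner recombination
    return s_q + q * h

bad = crt_sign(fault=True)

# Bellcore attack: bad^e == m (mod p) still holds, but not mod q,
# so gcd(bad^e - m, n) recovers the secret prime p.
recovered = math.gcd(pow(bad, e, n) - m, n)
print(recovered == p)  # True: one faulty signature leaks the factorization
```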
Re: (Score:3)
Re:Moles at Microsoft and apple (Score:4, Informative)
Except the OpenBsd back door claim was never proven and dismissed by basically everyone - subsequent audits of code and checkins haven't revealed anything suspicious.
It was basically someone who wanted to get their name in the papers, that's all.
Re: (Score:2)
Fact is, domestic and foreign govt agencies have moles working at Microsoft and apple to insert back doors or defeat encryption at the source. This is how stuff like flame happens. The only way out of this is to use an open source operating system where you can do your own code review, and where one guy doesn't have a bottle neck of control. Same goes for ios vs android.
If that rabid nonsense that you are posting about Microsoft and Apple were true, what makes you think that the Linux code that you look at is the Linux code that gets compiled and shipped?
Re: (Score:2)
This is only true if your compiler hasn't been compromised [bell-labs.com], or the one that compiled it, or the one that compiled that one, and on and on.
The reality is that no matter how clever you are, how long you spend reading the source code for your favorite operating system, or how well you understand the results of that reading, you have to trust someone some time.
Even aside from that, the number of people who truly understand the source and design of any given OS completely could probably be counted without resortin
Re: (Score:2)
Er, export restrictions? (Score:1)
IIRC, crypto algorithms that use keys that large qualify as munitions and are subject to ITAR export regulations. Which means a lot of people with legal licenses will be (legally, anyway) prevented from making use of any Windows feature which requires a key length of 1024 bits or more.
This also raises the question of why they allowed shorter keys to begin with... o_o
Re: (Score:2)
Re: (Score:1)
And the second amendment means I have the right to encryption.
Assuming that you mean the 2nd amendment to the US Constitution, then you are simply WRONG.
The parent post said EXPORT restrictions, so it was referring to users outside the US. Users outside the US aren't covered by the US Constitution.
Re: (Score:2)
Re: (Score:3, Interesting)
Because doubling the key length roughly increases the required time by a factor of 7. Increasing compute time by a factor of 7^20 is a little extreme, when just doubling it is good for a while.
Re:Er, export restrictions? (Score:4, Interesting)
Sorry, got my maths wrong - it's only about 300 million times longer.
Re:Er, export restrictions? (Score:5, Informative)
Because RSA-2048 keys (twice the length of RSA-1024) take about four times as long to operate on (http://www.cryptopp.com/benchmarks.html). RSA-15360 (which is roughly the strength of AES-256 (http://csrc.nist.gov/publications/nistpubs/800-57/sp800-57-Part1-revised2_Mar08-2007.pdf, page 63)) would take about (15360/1024)^3 = 3300 times as long as RSA-1024 (http://www.design-reuse.com/articles/7409/ecc-holds-key-to-next-gen-cryptography.html). This isn't a big deal for your local PC, where a single signature verification might take 250 ms rather than the sub-ms that it does with RSA-1024, but it has huge impacts on the servers that you're talking to - imagine increasing your server load by 330,000%.
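The cubic cost model used above is just arithmetic; a quick check of the 15360-bit figure (this is the comment's rule of thumb, not a benchmark):

```python
def relative_cost(bits: int, baseline: int = 1024) -> float:
    """RSA private-key operations scale roughly with the cube of the
    modulus length -- a rough rule of thumb, not a measured benchmark."""
    return (bits / baseline) ** 3

print(relative_cost(15360))  # 3375.0, the "about 3300 times" figure above
```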
Re: (Score:2)
Faster than RSA-1024 by a few factors, and about the strength of a 256-bit symmetric key, putting it at "universe lifetimes" for how long it takes to break.
Get rid of crypto patents (Score:2)
Re: (Score:2)
The number does indeed refer to the length of the key: RSA-1024 is a 1024-bit key, that is, the key is a 1024-digit binary number. 2048 bits will indeed be twice as long.
Re: (Score:2)
Why are we screwing with 1024-bit keys?
We are not supposed to be, 2048 should be considered the minimum going forward.
Why aren't we using keys that are 1048576-bits?
It would take too long to encrypt anything, and there are diminishing returns when key sizes grow so large. If you are using more than 16384-bit keys, you are doing it wrong -- if you really need security that far into the future, you should be using ECC (which is more efficient in terms of key sizes) or something that is secure even in the presence of a quantum computer, like McEliece.
Also, keep in mind that s
Re: (Score:2)
what's wrong with shorter, cascading keys using different algorithms?
Re: (Score:2)
http://secgroup.ext.dsi.unive.it/teaching/security-course/composition-of-ciphers/ [unive.it]
Or if you prefer a more rigorous treatment of this topic,
http://www.cs.toronto.edu/~myers/BBCEuro.pdf [toronto.edu]
You should also be careful about composing a cipher with a compression function:
http://news.slashdot.org/story/11/05/26/1933219/Chapel-Hill-Computational-Linguists-Crack-Skype-Calls [slashdot.org]
Re: (Score:2)
what's wrong with shorter, cascading keys using different algorithms?
The complexity that results adds a lot of risky unknowns. There are many tales of protocols being broken due to people looking at the various pieces in unexpected ways. Take DES, for example. Ordinary DES takes 56 bits of key material and offers 56 (OK, 55) bits of security. When DES was starting to appear fragile to brute force attacks, people looked for ways to strengthen it without replacing it. They thought of running two instances of DES, each with its own key, one after the other. How much secur
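The construction being described falls to the meet-in-the-middle attack, which is why 2DES yields roughly 57 bits of security rather than 112. A toy 8-bit "cipher" (purely hypothetical, not DES) makes the shape of the attack concrete:

```python
# Toy 8-bit cipher: NOT DES, just a stand-in with the same interface.
def toy_encrypt(key: int, block: int) -> int:
    return (block + key * 131) % 256

def toy_decrypt(key: int, block: int) -> int:
    return (block - key * 131) % 256

KEYSPACE = range(256)  # 2^8 keys

k1, k2 = 77, 200  # the secret double-encryption key pair
plain = 42
cipher = toy_encrypt(k2, toy_encrypt(k1, plain))

# Meet in the middle: ~2 * 2^8 cipher operations instead of 2^16.
# Encrypt forward under every k1 candidate, then decrypt backward under
# every k2 candidate and look for a collision in the middle.
forward = {toy_encrypt(k, plain): k for k in KEYSPACE}
matches = [(forward[mid], k) for k in KEYSPACE
           if (mid := toy_decrypt(k, cipher)) in forward]

# This toy cipher is so weak that every k2 candidate collides; a second
# plaintext/ciphertext pair would narrow matches down further. The point
# is the work factor, not this particular key recovery.
print((k1, k2) in matches)  # True
```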
Re:Er, export restrictions? (Score:5, Insightful)
IIRC, crypto algorithms that use keys that large qualify as munitions and are subject to ITAR export regulations. Which means a lot of people with legal licenses will be (legally, anyway) prevented from making use of any Windows feature which requires a key length of 1024 bits or more.
Maybe ... if your time machine works and they are all sent back to 1997. Because since then it is no longer restricted by ITAR and can be freely exported...
financial implosion (Score:2)
If encryption is a part of this hack or any such security failures, we can't afford any more security theatre and survive financially.
Seriously? Dupe already? heh (Score:1)
http://it.slashdot.org/story/12/07/10/2122220/microsoft-revokes-trust-in-28-of-its-own-certificates [slashdot.org]
Re: (Score:2)
ok, I suppose it's not quite a dupey as it could be. But still, heh.
Re: (Score:2)
It's a follow-up to the supposed dupe.
ISPs have been rejecting CSR requests (Score:1)
ISPs have been rejecting CSR requests with less than 1024-bit keys for a long, long time. Looks like Windows is forcing a long-overdue change back at the server, but I suspect providers have already forced most hands earlier.
Re: (Score:2)
Wrong. ISPs, as in Internet Service Providers, don't care what you are doing with your bits... It's the CAs (Certificate Authorities) who are not issuing certificates for CSRs with keys shorter than 1024 bits.
Well at least they're making a change (Score:5, Informative)
If only there was a standards group, like NIST, that could determine what the acceptable key lengths were.
Oh yeah, NIST does have a publication on this topic and stated that 1024 bit keys were no longer acceptable back in ... 2010. [nist.gov]
By the way, is it really 1024-bit encryption keys as stated in the article? I thought that the encryption keys were symmetric and it's the signature of the public key that's 1024-bit.
Re: (Score:2)
Those are also encryption keys. They are most often used for signing, or for encrypting a symmetric key, but they are encryption keys.
Re: (Score:2, Informative)
Correct... sort of.
For secure email, the sender encrypts with a 1024-bit public key. The recipient uses the matching private key to decrypt. Only holders of that key can do so.
For signing, the signer encrypts with a 1024-bit private key. The verifier uses the public key to decrypt. If the verifier can do that and the hashes match, it's a legit signature. Anything encrypted with the private key is by definition signed. The 1024-bit public key isn't the signature - it's the key used to verify the messag
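The flow described above can be sketched with textbook RSA and toy numbers (real systems hash the message and add padding, both omitted here):

```python
# Textbook RSA with classic toy parameters -- insecure, illustration only.
p, q = 61, 53
n = p * q                          # 3233
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, 2753

digest = 1234  # stand-in for the hash of the message

signature = pow(digest, d, n)     # "encrypt" the hash with the private key
recovered = pow(signature, e, n)  # anyone can "decrypt" with the public key

print(recovered == digest)  # True: the hashes match, the signature verifies
```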
Re: (Score:2)
Re:1024? (Score:4, Insightful)
Re: (Score:3)
How does that help? (Score:1)
AFAIK, the problem with Flame was a trust problem, not a bit-strength problem. They allowed Terminal Services certificates signed by Microsoft to be used to sign application code, and the certificate chain still passed. Presumably those TS certs could have been 2048-bit or higher.
PCI Compliance (Score:2)
I had to update my certs 2 years ago to meet PCI compliance. Honestly, I'm shocked vendors still allow 1024-bit certs to be distributed.
The tin-foil-hat guy inside me says this is great for vendors, who will get to charge fees to upgrade everyone's certs....