
UK Government Wants a Backdoor Into Windows 598

Posted by CmdrTaco
from the there-are-plenty-of-worms-available dept.
REBloomfield writes "The BBC is reporting that the British Government is working with Microsoft to gain backdoor access to hard drives encrypted by the forthcoming Windows Vista file system. Ross Anderson, professor of security engineering at Cambridge University, urged the Government to contact Microsoft over fears that evidence could be lost if suspects claim to have forgotten their encryption keys."
Comments Filter:
  • Why? (Score:2, Interesting)

    by jjares (141954) on Wednesday February 15, 2006 @09:52AM (#14723744) Homepage
    This simply doesn't make sense. What prevents a user from using a different tool without said backdoor?
  • by autopr0n (534291) on Wednesday February 15, 2006 @09:55AM (#14723764) Homepage Journal
    If someone gets a hold of your whole computer, they can read files. If someone hacks your system, they can read your files.

    About the only thing windows encryption seems to be able to do is prevent you from recovering your files if your PC ever dies.

    What's the point?
  • Re:Why? (Score:3, Interesting)

    by mustafap (452510) on Wednesday February 15, 2006 @09:56AM (#14723775) Homepage
    Simply that the vast majority of users will use Windows defaults.

    You would be surprised how dim some crooks can be, like thinking that swallowing a SIM card will destroy the data. Or even snapping it in two - that might break the bond pad connections, but not the die. Easy to fix.
  • by Anonymous Coward on Wednesday February 15, 2006 @09:59AM (#14723797)
    Seeing as they are talking to the UK about it, I am sure they will have no problem building a backdoor key into the system for each government... Right?
  • Not "lost" (Score:5, Interesting)

    by ajs (35943) <<moc.sja> <ta> <sja>> on Wednesday February 15, 2006 @10:01AM (#14723807) Homepage Journal
    This is that definition of "lost" that appeared in the late 20th century. It's akin to the money the music industry is "losing" to file sharing. The evidence is not lost; it is, as yet, undiscovered, and in any civilized country we would not assert that there WAS any evidence unless we could actually see it. In the U.K., however, they actually have a law that says you must reveal your secret keys to the authorities, with no provision for simply not knowing them. You can be convicted of the crime of having white noise on your disk that authorities assert is encrypted data to which you are refusing to reveal the key. Heck, you could be convicted for not divulging the key to /dev/random, which is clearly some secret message channel from an unknown party, since messages arrive from it in small bursts!
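The white-noise point above is demonstrable: the output of any decent cipher is statistically indistinguishable from random bytes, so "this is encrypted data" is an unfalsifiable assertion. A minimal sketch of the comparison (the counter-mode SHA-256 keystream is a toy stand-in for illustration, not real crypto):

```python
import os, math, hashlib
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Shannon entropy of the byte distribution; 8.0 is perfectly uniform."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# "White noise": raw bytes from the OS RNG.
noise = os.urandom(1 << 16)

# Toy "ciphertext": XOR highly repetitive plaintext with a SHA-256
# counter-mode keystream. (Illustrative only -- not a real cipher,
# just statistically similar output.)
key = b"a long and convoluted passphrase"
plaintext = (b"incriminating spreadsheet row\n" * 2200)[: 1 << 16]
keystream = b"".join(hashlib.sha256(key + i.to_bytes(8, "big")).digest()
                     for i in range(len(plaintext) // 32 + 1))
ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))

print(entropy_bits_per_byte(noise))       # close to 8.0 bits/byte
print(entropy_bits_per_byte(ciphertext))  # also close to 8.0 -- statistically alike
```

Both streams look like uniform noise to a statistical test, which is exactly why a law demanding "the key to that noise" has no way to distinguish a refusal from the absence of any key at all.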
  • Re:Pfff (Score:5, Interesting)

    by elrous0 (869638) on Wednesday February 15, 2006 @10:03AM (#14723824)
    What bad guy would be stupid enough to trust any encryption or security scheme introduced by a major corporation to begin with? If you want encryption, you go with open source. With any corp that has to answer to the government, you might as well assume there WILL be a backdoor.

    In the end, the bad guys will use real encryption and the backdoor won't affect them. It will only serve as a security risk for legitimate users.

    -Eric

  • Contempt of court (Score:4, Interesting)

    by springbox (853816) on Wednesday February 15, 2006 @10:04AM (#14723831)
    I often see arguments like this one [slashdot.org]. What's the point for some people of encrypting their files (other than temporary privacy) if you're going to get in trouble later in court anyway for not revealing your keys? Now this might actually be unlikely, but what if an average Windows user genuinely forgets their password? Seems kind of unfair.
  • keyloggers (Score:5, Interesting)

    by Barbarian (9467) on Wednesday February 15, 2006 @10:09AM (#14723859)
    How about making governments install a keylogger before they seize the computer? Hardware or software, it would go in the old tradition of installing a telephone tap. It's not that hard either. Did the government demand that paper notebook makers supply a backdoor so they could decipher drug accounts written in code?
  • by seanellis (302682) on Wednesday February 15, 2006 @10:12AM (#14723879) Homepage Journal
    Anyone with something to really hide will use a third-party encryption system, and "lose" the keys to that instead.

    Everyone else* will have a computer with a guaranteed back door, which I am willing to bet will be open to hackers on about Day 3 after Vista's launch.

    * - Well, everyone else who's not running Linux, of course.
  • Don't attribute.... (Score:3, Interesting)

    by gmuslera (3436) on Wednesday February 15, 2006 @10:14AM (#14723897) Homepage Journal
    to idiocy what can be explained by malice. There are a lot of backdoors around, and Windows has had functional ones for years (WMF, anyone?), though their intentionality could be doubted. Now, if a known, proven, by-design backdoor is added, one that will never be removed by any hotfix because it is a "feature", two things will probably happen: the bad guys will figure out how to exploit it, making every backdoored Windows machine a target, and the bad guys will also figure out how to disable it. So the most harmed will be the good people who supposedly have nothing to hide (and, because of that, removing/disabling the backdoor would make them look suspect).
  • Re:China & PGP (Score:4, Interesting)

    by iagreewithmichael (927220) on Wednesday February 15, 2006 @10:20AM (#14723944)
    It seems we may see the fragmenting of the OS market, with each local government insisting that only a domestic version be sold within its borders, all in the name of security.
  • Re:China & PGP (Score:4, Interesting)

    by OhHellWithIt (756826) on Wednesday February 15, 2006 @10:24AM (#14723977) Journal
    You may remember the "clipper chip". The idea, proposed during the first Bush administration, was that encryption technologies would have to include a back door for U.S. intelligence agencies and law enforcement. I forget whether this was just for export, or whether it included domestic products as well. The argument "pro" was that we could trust the U.S. government not to misuse the key; the argument "con" was that it would inhibit exports of U.S. products, because while Americans might trust their government with keys to their back door, why would anyone else? And there was also the issue that foreigners might be smart enough to come up with something that the NSA couldn't crack. I was disappointed to see the Clinton administration follow through on the idea. Ultimately, export controls were relaxed somewhat, but I'd be surprised if there weren't back doors and/or key-cracking algorithms available in Fort Meade.

    It'll be interesting to watch this play out. I'm sure any resolution will disappear deep within the inner pages of the paper, if it is discussed at all.

  • Re:Why? (Score:3, Interesting)

    by CastrTroy (595695) on Wednesday February 15, 2006 @10:25AM (#14723981) Homepage
    Couldn't they just brute force the password? Assuming that the password was under 15 characters (most cases), and the information was valuable enough, they could do it. A lot easier than brute forcing the 256-bit encryption or whatever it is they are using.
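The parent's cost comparison is easy to put numbers on. A back-of-envelope sketch (the 10^9 guesses/second rate is an illustrative assumption, not a benchmark; note that even a full search over 15 truly random printable characters is itself out of reach, so the practical edge comes from human-chosen passwords being far from random):

```python
# Compare the cost of guessing a short password against attacking a
# 256-bit key directly. RATE is an assumed, illustrative guess rate.
PRINTABLE = 95          # printable ASCII characters
RATE = 10**9            # assumed guesses per second

SECONDS_PER_YEAR = 3600 * 24 * 365

def years_to_search(length: int) -> float:
    """Worst-case years to exhaust all passwords of the given length."""
    return PRINTABLE**length / RATE / SECONDS_PER_YEAR

for n in (8, 10, 12, 15):
    print(n, f"{years_to_search(n):.3g} years")

# Direct search of a 256-bit keyspace, for contrast:
print(f"{2**256 / RATE / SECONDS_PER_YEAR:.3g} years")
```

Eight random characters fall in months at this rate; the 256-bit keyspace is astronomically beyond reach either way, which is why attackers go after the password, not the key.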
  • Time to switch! (Score:4, Interesting)

    by caveat (26803) on Wednesday February 15, 2006 @10:27AM (#14723994)
    OS X FileVault...AES128 encryption of your home directory with no backdoors! (At least not that I know of). Ain't nobody reading your files without your key.
  • by Gadzinka (256729) <rrw@hell.pl> on Wednesday February 15, 2006 @10:38AM (#14724080) Journal
    Why in the world would they have to boot your computer simply to read your hard drive?

    Because all the sectors on my hard drive are encrypted on the fly. When you read it directly in another computer, all you get is nearly random gibberish. There's not even a proper filesystem on it. Only after you mount it, giving my long and convoluted passphrase, does the OS decrypt the sectors on the fly so you can read the files. Switch the power off, reboot my machine, or unmount the partition, and there is no way to access my data again.

    Is that easier to grok?

    Robert
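The scheme Robert describes can be sketched in a few lines. This is a toy stand-in (a SHA-256 counter keystream, not a real disk cipher); production systems such as dm-crypt or BitLocker use AES in a sector-tweaked mode, but the shape is the same: the key and the sector number together determine the transformation, so the disk at rest holds only gibberish.

```python
# Toy sketch of transparent per-sector encryption: each 512-byte sector
# is enciphered with a keystream derived from the passphrase and the
# sector number. Illustrative only -- not a real cipher.
import hashlib

SECTOR = 512

def _keystream(passphrase: bytes, sector_no: int, length: int) -> bytes:
    """Deterministic keystream bound to (passphrase, sector number)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(passphrase + sector_no.to_bytes(8, "big")
                              + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def crypt_sector(passphrase: bytes, sector_no: int, data: bytes) -> bytes:
    ks = _keystream(passphrase, sector_no, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))  # XOR: same op encrypts and decrypts

plain = b"filesystem metadata".ljust(SECTOR, b"\0")
on_disk = crypt_sector(b"long convoluted passphrase", 7, plain)
assert on_disk != plain                                                  # gibberish at rest
assert crypt_sector(b"long convoluted passphrase", 7, on_disk) == plain  # mounted view
```

Because every read and write passes through `crypt_sector`, nothing readable ever touches the platters; without the passphrase there is no filesystem to find.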
  • by tezza (539307) on Wednesday February 15, 2006 @10:46AM (#14724135)
    Anyone who values their privacy already uses non-OS provided encryption. This will raise public awareness of the need to do the same.

    The pleasant result of all this is that it dispels the whiff of paranoid conspiracy theory. The government has been advised to ask for backdoor access, by a British Cambridge expert. There is every reason to think Microsoft will agree.

    There is now simple historical evidence to point the public to. Previously there were more technical, less convincing arguments.

    The average person is not going to care if Microsoft accidentally included some debugging code in a patch, even if that made it look like it had a backdoor key. "Whatever does that mean?", they'll say.

    A BBC news article about an expert asking for such a backdoor is a lot more convincing.

  • by brother.sand (952928) on Wednesday February 15, 2006 @10:54AM (#14724204)

    Unless of course the password / passphrase that you enter in is still held in the pagefile in some obtainable manner. Anyone want to take a guess as to whether Windows Vista keeps your passphrase in the pagefile? Anyone want to further bet that the Fed already knows this?

    D.
    --
    The history of science resembles a collection of ghosts remembering that once they too were gods.
    -- David Berlinsky, theoretical mathematician
  • by abb3w (696381) on Wednesday February 15, 2006 @11:22AM (#14724443) Journal
    If someone gets a hold of your whole computer, they can read files. If someone hacks your system, they can read your files.

    Having needed to break into someone's system to recover encrypted files, I can say it's not that simple.

    Windows NTFS encryption is certificate based. For installs done by anyone not a professional paranoid, the user has access to the file recovery certificate, and the domain administrator may have access to a file recovery certificate valid domain-wide. To use a certificate stored on the hard drive, you MUST have the password to that certificate... which is NOT changed when you force-change an account password.

    So, yes, you can hack a machine, install a trojan, and read the user's files when they next log in. But you have to wait until the user logs in (which, yeah, is usually a short wait) and starts the trojan running under their user ID and password before it can decrypt the files to examine/copy them. Alternately, you can get a dump of the encrypted password files and try a brute-force crack. But if the password used on the account (and, ergo, certificate) is, say, 12 random printable characters... dude, you are so SCREWED.

    Fortunately, the time I needed to break in for someone, the password was "only" nine random characters. I used a boot disk to dump the password file. Then, we wandered over to the operator for the school 128-processor Linux cluster with a case of good beer at 3:30 on Friday, explained the problem, and he agreed it would be OK this once to "not notice" the copy of the cracker program that would be blatantly running over the weekend in violation of several rules. We left, "not noticing" the case we were leaving behind. At 9AM Monday morning, I checked my email, and my batch job had left the user password sitting in my inbox.

    If it had been a 12 random printable character password, we'd still be waiting for the rest of our lives. And, for the professionally paranoid, I understand it's possible to use a non-default certificate (with potentially a different password) for encrypting files... where the decryption certificate need not be on the machine.

    Afterwards, I gently explained to the user that EFS should generally be reserved for situations where you consider the data's loss preferable to its disclosure. "EFS is not quite blow-up-the-building-first security, but it's close." He now reserves EFS for his financial information and consulting work covered under legal privilege.

  • by Anonymous Coward on Wednesday February 15, 2006 @11:26AM (#14724479)
    It's worth noting that harm can come not only from data being revealed under coercion, but also from data becoming unavailable.

    If terrorists or an oppressive government take your computer and hard drives away, anyone who depends on that data is very much out of luck.

    For this reason, local encrypted filestores and plausible deniability are only part of the puzzle. Quite a lot more is required, in particular cryptographic online distribution.

    A comprehensive solution will need to use a large population of fixed-size raw dataspaces spread across the net, instead of local disks. Quite likely, it would be stored steganographically 1:<large-N>:1 so that (for example) changing webcam images could be used as repositories. And it will need cryptographically random access for site selection, dataspace selection, and access to individual bits in the dataspaces. And it'll need huge redundancy, since the online storage will be inherently unreliable, yet without laying the scheme open to pretty simple differential cryptanalysis.

    That's a very tall order.
  • Re:Pfff (Score:5, Interesting)

    by Kadin2048 (468275) <slashdot.kadin@xoxFREEBSDy.net minus bsd> on Wednesday February 15, 2006 @11:37AM (#14724576) Homepage Journal
    In addition, you'd want a system whereby you could enter a distress password, and unlock one level of security, while at the same time transparently destroying data, from the most secure level on upwards. So let's say you had three levels of encrypted data. The first layer is just some dodgy pictures of you and your wife. The second contains some emails showing you were evading taxes. The third is whatever you really want to protect.

    For each level there are two passwords, one which will unlock it as normal, and another which will unlock it, and also begin a routine which will start securely erasing the third level data, then the second level, and then the first level + OS, and maybe trigger a lump of thermite sitting on top of the RAM for good measure. Or maybe it would be better just to get rid of the third level silently, so that it's as if it never existed. That's probably healthier, on second thought.

    So that after you provide a good show of resisting giving out the password, you hand over the 'distress' one and let them have fun getting through the first level of junk data, while at the same time the system is slowly eating away at the stuff you really don't want, down on the third level.

    You could even set it up so that the mal-effects caused by the distress passwords increase as you move through the levels of security. The distress password on the first level of security just starts the "silent erase" mechanism. The distress password on the second level speeds it up at the cost of less subtlety (because obviously they're getting closer to the actual data, so you need it gone faster). The distress password on the third level physically destroys the system in some sort of obvious (but quick) fashion. That way you're almost guaranteed not to compromise the data, but you also don't have to necessarily compromise yourself, unless they're really close to getting the stuff.
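The layered duress scheme described above reduces to a small dispatch table: each layer pairs a normal unlock password with a distress password that also fires an escalating destruction routine. A hypothetical sketch; every password and action name here is invented for illustration:

```python
# Hypothetical duress-password dispatch for a three-layer scheme.
# Passwords are stored hashed; the duress variant of each password
# unlocks the layer AND returns an escalating destruction action.
import hashlib

def _h(pw: str) -> str:
    return hashlib.sha256(pw.encode()).hexdigest()

# layer -> (unlock hash, duress hash, duress action name)
LAYERS = {
    1: (_h("hunter2"), _h("hunter2!"), "silent_erase_level3"),
    2: (_h("taxes99"), _h("taxes99!"), "fast_erase_level3"),
    3: (_h("realkey"), _h("realkey!"), "destroy_hardware"),
}

def try_unlock(layer: int, password: str):
    unlock, duress, action = LAYERS[layer]
    digest = _h(password)
    if digest == duress:
        return ("unlocked", action)   # opens the layer AND fires the routine
    if digest == unlock:
        return ("unlocked", None)     # normal access, no side effects
    return ("denied", None)

assert try_unlock(1, "hunter2") == ("unlocked", None)
assert try_unlock(2, "taxes99!") == ("unlocked", "fast_erase_level3")
```

The key design property is that a duress entry is indistinguishable from a normal one to the observer: both "succeed", and the destruction runs out of sight.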
  • Keyloggers (Score:5, Interesting)

    by Kadin2048 (468275) <slashdot.kadin@xoxFREEBSDy.net minus bsd> on Wednesday February 15, 2006 @11:46AM (#14724646) Homepage Journal
    Worth pointing out that keyloggers are exactly the route that the FBI here in the US has taken:
    http://www.epic.org/crypto/scarfo.html [epic.org]

    That's US v. Scarfo; basically a mobster was using PGP to encrypt his communications and rather than breaking the encryption the hard way, the investigators got a warrant to install a keylogger. I'm not sure exactly how they did it, but I'm pretty certain that it was a hardware device implanted in the keyboard, rather than software. (The warrant they got was pretty much a blanket thing, approval for 'hardware, software, and firmware as necessary...') However they didn't divulge the exact methodology in the trial, because they successfully claimed an exemption under the Classified Information Procedures Act.
  • Re:Contempt of court (Score:4, Interesting)

    by geoffspear (692508) on Wednesday February 15, 2006 @12:14PM (#14724886) Homepage
    One would hope that you're not going to be forced to reveal your password unless the Government establishes probable cause that you've committed a crime.

    It's kind of silly to think that an average user with no incriminating evidence encrypted is going to be randomly ordered to turn over a password, and thrown in jail for legitimately forgetting it. It's a disturbing thought that the law, as written, could lead to that, but it's not a compelling argument against using encryption if you're not a criminal.

    Using this sort of hypothetical scenario to argue against routine use of encryption is a bit like arguing against keeping sharp knives in your kitchen, because you're afraid the police might claim you stabbed someone with one of them and cleverly removed all forensic evidence of the stabbing from the knife.

  • Private Disk (Score:4, Interesting)

    by gr8dude (832945) on Wednesday February 15, 2006 @12:33PM (#14725043) Homepage
    Well, TrueCrypt is freeware and open-source, but there is also another aspect that has to be taken into account - it is NOT a certified product.

    Institutions such as NIST test the implementations of the algorithms, then the program either gets certified or not.

    The problem is that without certification, we do not know whether what they've implemented is what they think they've implemented*.

    The point is that they might use some obscure algorithm nobody knows - which has no guaranteed strength; thus one cannot rely on it. They can also implement standard algorithms such as AES or DES - but were they correctly implemented?

    Sure - "why don't you take the sources and look at them yourself?" some might say, but is everybody competent enough to do that?

    On the other hand, implementing something and then certifying it, means that:
    [a] it was done right
    [b] it is as strong as the standard says


    In the case of encryption, the strength is in the key itself and in the mathematical basis of the algorithm, NOT in the obscurity of the mechanisms applied within the software.

    One minor thing - NIST certification is expensive, I doubt TrueCrypt will pass it, unless some company pays for this. Commercial encryption software is a different thing, if they want to be treated seriously, they must go for it. An example is Private Disk [dekart.com].

    * an old saying:
    "The problem with computer programs and programmers is that the program does what the programmer wrote, not what he thought he wrote".
  • Decide for yourself (Score:3, Interesting)

    by Kadin2048 (468275) <slashdot.kadin@xoxFREEBSDy.net minus bsd> on Wednesday February 15, 2006 @12:34PM (#14725054) Homepage Journal
    Although I don't know the man, I just looked up what I think is his blog, and provided he's not lying through his teeth, the Politics and Public Policy section of his blog seems quite agreeable in spirit to me.

    He also has some really interesting papers on there. (Check out the "Cocaine Auction Protocol" and "Programming Satan's Computer" -- the first is a methodology for creating an un-mediated auction house, the latter is about programming on untrusted networks.)

    Of course, to each his own.

    Here's the link:
    http://www.cl.cam.ac.uk/~rja14/#Lib [cam.ac.uk]
  • USA & 5th amendment (Score:5, Interesting)

    by SnprBoB86 (576143) on Wednesday February 15, 2006 @01:15PM (#14725404) Homepage
    I'm not sure about the UK, but in the USA, wouldn't this be a 5th amendment rights issue?

    The summary states that this backdoor is desirable over "fears that evidence could be lost by suspects claiming to have forgotten their encryption key", but why would a suspect have to say they forgot their encryption key? Why not just plead the 5th?

    The 5th amendment states: "No person shall [...] nor shall be compelled in any criminal case to be a witness against himself [...]"

    I honestly do not believe that the contents of a person's hard drive falls into the same category of evidence as eye witnesses or DNA. A personal computer's hard drive, particularly one with an encrypted file system, is effectively an extension of that person's memory and hence any data extracted from it seems very much like testifying against oneself.
  • by raitchison (734047) <robert@aitchison.org> on Wednesday February 15, 2006 @01:45PM (#14725679) Homepage Journal

    Maybe his long-term goal is Muslim rule (though I'm not convinced he's anything more than a power-hungry madman who's merely using Islam), but his short-term goals generally revolve around hurting/killing people and the general undermining of societies he doesn't like.

    He doesn't like our way of life, with our quasi-democracy and capitalism and relative tolerance of different faiths. And every time we change our way of life, every time we give up one of our rights in the name of "fighting terrorism" we are delivering a victory to him and people like him.

  • by Anonymous Coward on Wednesday February 15, 2006 @02:11PM (#14725878)
    Osama has always been a CIA agent. There's so much proof available that it's quite interesting that some seem to choose to ignore it.

    http://www.globalresearch.ca/articles/CHO311A.html [globalresearch.ca]

    ... and now he's dead [whatreallyhappened.com], and has been for a while. Take good notice as to when the "tapes" appear - it's always when the media at home needs to concentrate at something else besides the administration - and scare US citizens with "boo terrorists" into accepting something new and Orwellian.

  • by lkcl (517947) <lkcl@lkcl.net> on Wednesday February 15, 2006 @02:28PM (#14726010) Homepage
    He said: "From later this year, the encryption landscape is going to change with the release of Microsoft Vista." The system uses BitLocker Drive Encryption through a chip called TPM (Trusted Platform Module) in the computer's motherboard. It is partly aimed at preventing people from downloading unlicensed films or media.

    Oh please, yes please. Switch on encryption that uses the TPM. Then all it takes is a virus to overwrite the TPM keys in the BIOS memory and that's it - game over: your entire hard drive rendered useless. Mwhahahahah

  • by Anonymous Brave Guy (457657) on Wednesday February 15, 2006 @03:26PM (#14726392)
    Now we just have to wait for the media companies, which lobbied for the TPM in the first place, to demand access to the back door so that they can check machines for illegal movies.

    And so, inevitably, the Powers That Be(TM) competing to dominate the lives of the Minions(TM) come into conflict.

    If the governments get their way, there will be no true encryption permitted, because otherwise they can't spy on people.

    If there is no true encryption, there is no point whatsoever to having the TPM, the entire DRM concept just got screwed, etc. It doesn't matter whether it's "only governments" who can break the codes, because someone will crack/leak/otherwise work around that restriction within days, and the Internet will do the rest within hours.

    So, the media industry's current prime directive and major investment just came into direct opposition with the government's current prime directive and major political hot potato. The blue touch paper has been lit; please retire to a safe distance, and wait to see which of the rights you thought you were losing will be staying after all...
