
Google Security Expert Finds, Publicly Discloses Windows Kernel Bug

hypnosec writes "Security expert Tavis Ormandy has discovered a vulnerability in the Windows kernel which, when exploited, would allow an ordinary user to obtain administrative privileges on the system. Google's security pro posted the details of the vulnerability back in May to the Full Disclosure mailing list rather than reporting it to Microsoft first. He has now gone ahead and published a working exploit. This is not the first time Ormandy has opted for full disclosure without first informing the vendor of the affected software."
  • Who cares. (Score:2, Insightful)

    by gr8_phk ( 621180 )
    Seriously. I think it was a comic strip (possibly xkcd) that pointed out that an exploit with user-level privileges could impersonate someone on web sites, do money transfers at their bank, etc... while a system-level exploit would merely allow it to install drivers. Woohoo!
    • Re:Who cares. (Score:5, Informative)

      by ericloewe ( 2129490 ) on Tuesday June 04, 2013 @04:26PM (#43908567)
    • Re:Who cares. (Score:5, Insightful)

      by khasim ( 1285 ) <brandioch.conner@gmail.com> on Tuesday June 04, 2013 @04:37PM (#43908667)

      That is correct for home users.

      But for corporate users, a system level exploit allows things like installing sniffers and key loggers so that more passwords can be collected. Including the admin/root passwords.

      Which can be used against the computers in the Accounting department to transfer money from the corporate accounts to "money mules".

      • by Richy_T ( 111409 )

        Let's not forget multi-user systems too. If you're really paranoid, you can keep one account for the important stuff and one for general day-to-day crap.

      • Re:Who cares. (Score:4, Informative)

        by nmb3000 ( 741169 ) on Tuesday June 04, 2013 @07:04PM (#43909813) Journal

        But for corporate users, a system level exploit allows things like installing sniffers and key loggers so that more passwords can be collected. Including the admin/root passwords.

        Absolutely. What takes it to the next level is that most (effectively all) Windows sysadmins log into workstations using accounts that are members of the Domain Admins group. If a standard user is able to gain administrative access on their computer and then get a sysadmin to log in to "look at a problem" (very easy), they will likely gain full control over the local domain. This includes the ability to distribute a malicious binary over the network to every computer in the domain, allowing them to collect personal credentials and information from every other person in the company.

        Even without getting a Domain Admin to log into their workstation, there is potential for other security problems. For example, the user might extract the hashed passwords stored in the local domain credential cache, which likely contains an entry for a domain-privileged user. They could then attempt a brute-force attack on this (salted and hashed) cached password. With modern GPU farms such brute-force attacks aren't as crazy as they used to be, especially if the password is weak.
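        To make the mechanics concrete: the offline step is nothing more than a guessing loop over a wordlist, which is exactly what a GPU cracker parallelizes. A minimal C sketch, where dcc2_hash() is a hypothetical stand-in for the real cached-credential hash computation (a salted, iterated hash derived from the username and password), not the actual algorithm:

        #include <stdio.h>
        #include <string.h>

        #define HASH_LEN 16

        /* Hypothetical stand-in: NOT the real cached-credential algorithm,
           it only exists so the sketch compiles and the loop is runnable. */
        static void dcc2_hash(const char *username, const char *guess,
                              unsigned char out[HASH_LEN])
        {
            size_t i;
            memset(out, 0, HASH_LEN);
            for (i = 0; username[i]; i++) out[i % HASH_LEN] ^= (unsigned char)username[i];
            for (i = 0; guess[i];    i++) out[i % HASH_LEN] ^= (unsigned char)guess[i];
        }

        /* Try every line of a wordlist against an extracted target hash. */
        static int try_wordlist(const char *username,
                                const unsigned char target[HASH_LEN], FILE *wordlist)
        {
            char guess[256];
            unsigned char candidate[HASH_LEN];

            while (fgets(guess, sizeof guess, wordlist)) {
                guess[strcspn(guess, "\r\n")] = '\0';      /* strip newline */
                dcc2_hash(username, guess, candidate);
                if (memcmp(candidate, target, HASH_LEN) == 0) {
                    printf("match: %s\n", guess);          /* password recovered */
                    return 1;
                }
            }
            return 0;                                      /* wordlist exhausted */
        }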

    • Re:Who cares. (Score:4, Informative)

      by AmiMoJo ( 196126 ) * on Tuesday June 04, 2013 @04:45PM (#43908737) Homepage Journal

      No, user-level programs can't generally do that. Since Vista, user privileges don't give access to other apps' data or any system files. There is no easy way to steal credentials out of a browser or read email or anything like that.

      That is why viruses often try to trick the user into granting them admin level permissions via a UAC warning prompt. In this case a way has been found to take those permissions without a prompt, giving the user a false sense of security and not alerting them to potentially dangerous behaviour.

      As for drivers, even a kernel-level exploit usually won't be able to install them these days. Drivers need to be signed before Windows will allow them to be installed. On Windows 7 you can install unsigned code after the user gives permission, but Windows 8 flat out refuses to install unsigned binaries as drivers.

      • That is why viruses often try to trick the user into granting them admin level permissions via a UAC warning prompt. In this case a way has been found to take those permissions without a prompt, giving the user a false sense of security and not alerting them to potentially dangerous behaviour.

        You described a trojan. Viruses exploit a vulnerability to install themselves and spread.

        As for drivers, even a kernel-level exploit usually won't be able to install them these days. Drivers need to be signed before Windows will allow them to be installed. On Windows 7 you can install unsigned code after the user gives permission, but Windows 8 flat out refuses to install unsigned binaries as drivers.

        I haven't written shellcode for Windows since XP (I work on the defensive side of security now), but I do suspect you are not correct here. If you can get your shellcode to execute in kernel space, it can do anything. You could read a driver file from the network, copy it into kernel space and execute it, completely bypassing the signature check. You could also disable the signed-driver requirement so that a rootkit

        • by AmiMoJo ( 196126 ) *

          You described a trojan.

          I meant it as a generic term for malware; apparently I should have been more specific.

          If you can get your shellcode to execute in kernel space, it can do anything.

          If you get in right at the very lowest level you can theoretically do pretty much anything. Practically though there are two things stopping you.

          Firstly, getting in at that level is hard. The kernel is not monolithic, and the different parts have different permissions. That's why you don't see many viruses that actually do that any more - all the attack vectors that are exposed are for stuff that runs outside the core kernel level we are talking about.

          • Re:Who cares. (Score:4, Informative)

            by GoogleShill ( 2732413 ) on Tuesday June 04, 2013 @10:42PM (#43911163)

            Firstly getting in at that level is hard. The kernel is not monolithic, and the different parts have different permissions. That's why you don't see many viruses that actually do that any more - all the attack vectors that are exposed are for stuff that runs outside the core kernel level we are talking about.

            It is typically hard, but this exploit runs at ring-0.

            Even if you can get in at that level it still isn't easy to just install your driver. The driver management code won't accept unsigned code even from the inner kernel. You would have to replicate those routines yourself and patch it directly into the driver system. Bypassing the driver loading system, as you say. Hardly trivial.

            I don't think you understand what it means to "install your driver". I'm not talking about adding a .dll and .inf file, I'm talking about actually executing driver/shellcode in the kernel. This exploit executes code in ring-0 which gives full access to the kernel memory, hardware, OS, filesystem, registry... everything. There is no need to bypass anything. You've already "installed the driver" and anyone with the skill to exploit a kernel vulnerability will have no trouble overwriting the crypto check function in program space with a "return success" stub. Since this attack does not require the exe to be signed, it can permanently install itself by adding a startup entry in the registry. SecureBoot won't protect against that.
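            To make "overwrite the check with a return-success stub" concrete, here is a minimal user-mode sketch in C (x86 assumed) that patches a function in the current process so it always reports success; the ring-0 version of the trick just writes the same kind of stub into kernel memory instead. The function name is a placeholder, not anything from the actual exploit:

            #include <windows.h>
            #include <stdio.h>
            #include <string.h>

            /* Placeholder for whatever verification routine is being defeated. */
            static __declspec(noinline) int check_signature(void)
            {
                return 0;   /* "signature invalid" */
            }

            int main(void)
            {
                /* x86 machine code for "mov eax, 1; ret" -- i.e. always succeed. */
                unsigned char stub[] = { 0xB8, 0x01, 0x00, 0x00, 0x00, 0xC3 };
                DWORD old;

                VirtualProtect((LPVOID)check_signature, sizeof stub,
                               PAGE_EXECUTE_READWRITE, &old);
                memcpy((void *)check_signature, stub, sizeof stub);

                printf("check now returns %d\n", check_signature());   /* prints 1 */
                return 0;
            }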

            What SecureBoot does protect against is some malware permanently installing itself on the system /after/ the OS has been patched.

      • Re:Who cares. (Score:5, Informative)

        by nmb3000 ( 741169 ) on Tuesday June 04, 2013 @06:54PM (#43909747) Journal

        No, user-level programs can't generally do that. Since Vista, user privileges don't give access to other apps' data

        I'm sorry, but you are incorrect. Programs running under the same user's security context are all on equal footing and can inspect and interact with each other. Notepad could, for example, read the entire contents of Firefox's private memory, or create a remote thread in the Firefox process to do whatever it pleased. Vista did not change this.
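        For the sceptical, a minimal C sketch of that claim using nothing but documented Win32 calls; no elevation is involved, the PID and address are placeholders, and error handling is omitted:

        #include <windows.h>
        #include <stdio.h>

        int main(void)
        {
            DWORD pid = 1234;             /* placeholder: PID of a same-user process */
            unsigned char buf[4096];
            SIZE_T got = 0;

            /* Opening another process running as the same user needs no admin rights. */
            HANDLE h = OpenProcess(PROCESS_VM_READ | PROCESS_QUERY_INFORMATION,
                                   FALSE, pid);
            if (h && ReadProcessMemory(h, (LPCVOID)0x00400000, buf, sizeof buf, &got))
                printf("read %lu bytes from PID %lu\n",
                       (unsigned long)got, (unsigned long)pid);

            if (h) CloseHandle(h);
            return 0;
        }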

        There is no easy way to steal credentials out of a browser or read email or anything like that.

        This is also not true. Firefox necessarily stores passwords using reversible encryption (how else could it send the plaintext passwords to websites?). Both the encrypted password and the decryption key are available to any program running under the user's context.

        "Reading email" is a little vague, but if absolutely nothing else, a program could capture the text being displayed in the email application using any number of Win32 API / accessibility calls.

        That is why viruses often try to trick the user into granting them admin level permissions via a UAC warning prompt

        UAC does nothing to prevent a program from gaining administrative access (elevating). This has been reliably demonstrated many times by different people, and even Microsoft has said that UAC is not a security boundary. It was created (essentially) for one thing: to force software vendors to start writing programs that did not assume or require the user to have administrator rights. It had the positive side effect of making Microsoft look more focused on security.

        As for drivers, even a kernel-level exploit usually won't be able to install them these days. Drivers need to be signed before Windows will allow them to be installed.

        I'm sorry, but this is also incorrect. Keep in mind there are multiple meanings of "driver", but once you are executing code inside kernelspace, all bets are off. As Raymond Chen likes to say, "it rather involved being on the other side of this airtight hatchway" [msdn.com].

        Windows 8 flat out refuses to install unsigned binaries as drivers

        That's unfortunate for independent/small software development shops and open-source software projects. I remember when I had control over what ran on my computer; those were good days. If, however, malicious code has found its way into the kernel, your machine is still fully compromised.

    • by oGMo ( 379 )

      The comic (as previously posted) was amusing and also wrong; a user-level exploit might be able to get you those things, if credentials aren't encrypted. A browser exploit can probably scrape your pages or similar, which is of course bad. However, a system-level exploit can do all this and more:

      • All of the above, plus for every user on a multi-user system
      • Read your keystrokes, and thus get passwords without decryption
      • Read directly from memory, therefore also bypassing the need for decryption, and accessing even more sensitive information
      • by EvanED ( 569694 )

        Read your keystrokes, and thus get passwords without decryption

        I'm not sure, but this may already be possible (for the current user) now, without root.

        Even if it's not in general, you could still do something like install a browser extension for the user that does it while they're in the browser. (At least for Firefox; not sure if Chrome extensions are powerful enough to do that.)

        Read directly from memory, therefore also bypassing the need for decryption, and accessing even more sensitive information unaid

      • Not to mention with access to a privileged account the malware becomes substantially harder to remove.

      • by nmb3000 ( 741169 )

        I think you're making some assumptions here about user capabilities and how encryption is used that are incorrect.

        if credentials aren't encrypted

        User credentials are never encrypted in such a way that the current user cannot access them. What would be the point? Secure storage exists to protect users from other users, and to some extent from nosy administrators (though you can't protect *anything* from a determined and nosy administrator). Bob needs to be able to read Bob's plaintext password or Bob cannot make use of it.
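        That is exactly how the standard Windows primitive for "secure storage" (DPAPI) behaves: protection is scoped to the user, not to the program that stored the secret. A minimal C sketch, with the protected blob read from a placeholder file:

        #include <windows.h>
        #include <wincrypt.h>
        #include <stdio.h>

        #pragma comment(lib, "crypt32.lib")

        int main(void)
        {
            BYTE raw[4096];
            FILE *f = fopen("protected.bin", "rb");        /* placeholder DPAPI blob */
            DWORD len = f ? (DWORD)fread(raw, 1, sizeof raw, f) : 0;
            if (f) fclose(f);

            DATA_BLOB in, out;
            in.cbData = len;   in.pbData = raw;
            out.cbData = 0;    out.pbData = NULL;

            /* Succeeds for ANY process running as the user who protected the data. */
            if (len && CryptUnprotectData(&in, NULL, NULL, NULL, NULL, 0, &out)) {
                printf("recovered %lu plaintext bytes\n", (unsigned long)out.cbData);
                LocalFree(out.pbData);
            }
            return 0;
        }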

        A browser exploit can probably scrape your pages or similar

        No exploit nee

    • It's sweet and all that you think paraphrasing xkcd shows that you have some kind of deeper insight, but you're clearly missing the point. A kernel mode exploit can do all the things that a user mode exploit can do, as well as install nasty malware like keyloggers, or worse... which in turn (likely) allows everything that physical access to the machine would have granted anyway.

      So who cares? Me, and everyone even remotely versed in security.
    • Generally user-land viruses will be immediately picked up by antivirus, while a kernel-level exploit can install undetectable keylogger drivers.

  • by Bugler412 ( 2610815 ) on Tuesday June 04, 2013 @04:23PM (#43908547)
    If he were an independent researcher doing this it might be one thing, but in this case he's not revealing the vulnerability based on full-disclosure principles, he's doing it to give his employer's largest competitor a black eye. Motives matter.
    • by Nimey ( 114278 ) on Tuesday June 04, 2013 @04:28PM (#43908585) Homepage Journal

      You don't know his motivations, you're making an assumption.

    • by Hatta ( 162192 ) on Tuesday June 04, 2013 @04:34PM (#43908635) Journal

      Why does it matter? Full disclosure is the only responsible choice. That doesn't change no matter who your employer is.

      • I also don't see him posting that he is doing this as a Google employee or really, that he is related to them in any way. It's an interesting fact, but not necessarily relevant.

      • IMHO, full disclosure after a reasonable period of private disclosure is the responsible choice. Such a policy should be applied uniformly to all vendors regardless of relationship; although I suppose you could argue that if there's a partnership then it's quasi-internal. You might even be bound to nondisclosure by the partnership agreement.

        Anyway, I digress. By keeping it private for a fixed time and then disclosing, you give the subject time to fix it before an exploit gets produced and you give them a

        • by Hatta ( 162192 )

          IMHO, full disclosure after a reasonable period of private disclosure is the responsible choice.

          Why give an attacker a window of time in which he can use his exploit freely? Inform the public immediately, and they can stop using the software, or decide if it's worth the risk.

          you give the subject time to fix it before an exploit gets produced

          Why do you assume an exploit does not already exist? If you can find it, an attacker can find it too. The prudent assumption is that any bug that can be exploited is b

    • by bug1 ( 96678 )

      Motives matter

      If he had bad motives he wouldn't have disclosed it in the first place.

      If Microsoft are too dumb to monitor popular outside forums where faults in their products are discussed, then they deserve a black eye; it doesn't matter who gives it to them.

  • Target Microsoft (Score:5, Interesting)

    by mrbluejello ( 189775 ) on Tuesday June 04, 2013 @04:23PM (#43908549)

    If it hadn't been Microsoft, Google might have been a bit more responsible about this, but since it makes their competitor look bad, time to forget about "do no evil".

    • by chuckinator ( 2409512 ) on Tuesday June 04, 2013 @04:27PM (#43908573)
      "Do no evil" means "don't get caught doing something that will put handcuffs on our executives." Get your definitions straight.
      • According to the Jargon File, it implies that you won't design software that no-one wants to use. Instead, you design software that everyone wants to use and then Spring Clean it away!
    • by Hatta ( 162192 )

      You cannot be more responsible than full disclosure. The responsible thing to do when you find a bug is to inform those who are at risk from the bug. Any delay leaves those people at risk unnecessarily, and is irresponsible.

    • ...forget about "do no evil".

      Google is still better than AT&T, whose motto is "Now I am become Death, the destroyer of worlds." Executive bonus recovery fee tagged to your wireless bill: $0.96

  • by danbuter ( 2019760 ) on Tuesday June 04, 2013 @04:24PM (#43908555)
    I'm betting this is the only way to get MS to fix the problem in a timely fashion. If it's in the wild, they HAVE to fix it, and fast. Guys had to do this with Apple, as well, because they never fixed any bugs unless absolutely forced to.
    • Microsoft is actually pretty good about timely patches.
      • Re: (Score:2, Insightful)

        Yes, if you call releasing all patches at the same time, once a month, "timely." Personally, I'd like to get patches as soon as they're ready, especially security patches. That's one of the many reasons why I use Linux, not Windows.
    • I'm betting this is the only way to get MS to fix the problem in a timely fashion. If it's in the wild, they HAVE to fix it, and fast. Guys had to do this with Apple, as well, because they never fixed any bugs unless absolutely forced to.

      So why not report it, wait two weeks, and then disclose it publicly?

      This entire conversation assumes reporting it to the vendor and disclosing it publicly are mutually exclusive. Report to the vendor, and give them a deadline as to when you'll disclose it. If they don't patch by the deadline, it gets disclosed. Thus they have to patch it quickly.

  • by intermodal ( 534361 ) on Tuesday June 04, 2013 @04:27PM (#43908575) Homepage Journal

    The irony of the difference between closed source and open source is that while Ormandy has posted an exploit for this Windows bug, in the open-source world he could potentially have posted a fix too, considering he seems to understand the bug itself better than anyone...

  • Just Desserts (Score:2, Insightful)

    by Anonymous Coward

    Been a long time coming, but we finally don't have Microsoft pushing us around any longer.

    Some of us with long memories see absolutely no issue with disclosing MS bugs on public forums.

  • by anthony_greer ( 2623521 ) on Tuesday June 04, 2013 @04:42PM (#43908721)

    Can Google and/or this guy be prosecuted for this? Releasing the working demo is basically aiding and abetting a criminal.

    • Subject should be "1986 Fraud and Abuse Act" - didn't proofread the subject - D'oh

  • by anthony_greer ( 2623521 ) on Tuesday June 04, 2013 @04:58PM (#43908865)

    I guarantee every talking head on TV would be calling for the DoJ to look into it...

    This is all about PR and image. Google and Apple are sexy, MS is big and boring, but arguably more critical to daily life (you have no idea how many of the devices and backend systems you use every day run on Windows).

  • What is the exploit that makes the carriage return in posts on /. work?
  • Win 32bit only? Meh (Score:4, Interesting)

    by snikulin ( 889460 ) on Tuesday June 04, 2013 @05:16PM (#43909023)

    The code is clearly targeted at x86 only, not x64 (__declspec(naked)).
    I don't have an x86 PC.
    On Win7 x64 the code simply crashes.

    Unimpressed.
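    For anyone wondering why __declspec(naked) pins the published code to x86: MSVC only accepts naked functions with inline __asm when compiling for x86; the x64 compiler supports neither inline assembly nor naked functions, so a stub like the following simply will not build for x64. A trivial illustration (not the exploit code):

    /* Builds with cl.exe targeting x86; rejected outright by the x64 compiler. */
    __declspec(naked) void stub(void)
    {
        __asm {
            mov eax, 1      ; hand-written body, no compiler prologue/epilogue
            ret
        }
    }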

  • by LazLong ( 757 ) on Tuesday June 04, 2013 @05:24PM (#43909087)

    ...but not disclosing it to the vendor first and giving them a chance to release a fix is both unprofessional and irresponsible. The fact that this is coming from a Google employee makes it inexcusable, and reflects poorly on Google. If I were his manager he would certainly receive a reprimand.
