
New Standard For Issuance of SSL/TLS Certificates 62

wiredmikey writes "In light of the many security breaches and incidents that have undermined the faith the IT industry has in Certificate Authorities (CAs) and their wares, the CA/Browser Forum, an organization of leading CAs and other software vendors, has released the 'Baseline Requirements for the Issuance and Management of Publicly-Trusted Certificates,' an industry-wide baseline standard for the operation of CAs issuing SSL/TLS digital certificates natively trusted by the browser. The CA/Browser Forum is requesting Web browser and operating system vendors adopt the requirements (PDF) as part of their conditions to distribute CA root certificates in their software. According to the forum, the Baseline Requirements are based on best practices from across the SSL/TLS sector and touch on a number of subjects, such as the verification of identity, certificate content and profiles, CA security and revocation mechanisms. The requirements become effective July 1, 2012, and will continue to evolve to address new risks and threats."
  • Or (Score:5, Insightful)

    by Spad ( 470073 ) <slashdot@ s p a d . co.uk> on Friday December 16, 2011 @05:14PM (#38404094) Homepage

    In other news: Certificate Authorities who were already half-assing their verification processes, hiding their security breaches and not bothering to secure their internet-facing phpmyadmin installs will continue to do exactly that under the new regime right up to the point that they get caught, just like now.

    • Re:Or (Score:4, Informative)

      by Bloopie ( 991306 ) on Friday December 16, 2011 @05:21PM (#38404202)
      And (it is hoped) Web browser and operating system vendors will see that these CAs don't meet the new proposed criteria, and will stop including their root certificates in the browsers/OSes. It says that right in the summary.
      • by jd ( 1658 )

        Ah yes. Microsoft enforcing security protocols. A fascinating idea. Yes, you said "it is hoped", but I have to wonder who is left with that much hope. It's well beyond the hope budget of most individuals.

        • Well, we've already seen Google, Mozilla and Microsoft remove root CA certs from their products. Each one of them could do a lot of damage to a CA by removing a root cert, even without cooperation with the others. And yet, in the very recent past we've seen their security teams cooperate closely when dealing with compromised CAs. I think it's reasonable to believe that at least one of them will stay on the ball, when we've already seen proof that they can actually coordinate pretty well in exactly this sort

  • You mean that until now there has been no standard for issuing publicly trusted SSL Certs?

    • Re:What? (Score:5, Informative)

      by jd ( 1658 ) <imipakNO@SPAMyahoo.com> on Friday December 16, 2011 @05:32PM (#38404318) Homepage Journal

      There was an agreement, way back when, which involved multiple levels of certs and a general understanding of what level of integrity each level of cert offered. It wasn't hard-and-fast, certainly no standard, but it was generally accepted and generally used. Then people wanted certs cheap and now, not something high levels of integrity checking really allow for, so what agreement did exist simply went up in smoke as vendors pandered to customers over and above common sense.

      • Not just that. (Score:4, Insightful)

        by khasim ( 1285 ) <brandioch.conner@gmail.com> on Friday December 16, 2011 @05:44PM (#38404452)

        Then people wanted certs cheap and now, not something high levels of integrity checking really allow for, so what agreement did exist simply went up in smoke as vendors pandered to customers over and above common sense.

        Not just that. Certs are also now MARKETED as a means of verifying the web site you're connecting to.

        The process you mention is one means of attempting to fill that role.

        But certs were not designed as a means of verifying a web site. They're just for encryption. And for encryption they work pretty well.

        The question now is how do you verify that the site at a.b.c.d is REALLY the site you think it is. Here's a hint: you cannot rely upon the certificate to validate it.

        • by Anonymous Coward

          If certificates were intended solely for encryption, why do we even have certificate authorities?

          • You can't have encryption without authentication, otherwise anyone can stand between you and the endpoint and impersonate the other end.

            A MitM is somewhat harder than passive eavesdropping, but not in any fundamental way -- there's surprisingly little difference in being able to intercept your communication to read it and being able to modify it on the fly.
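The parent's point can be made concrete with a toy Diffie-Hellman exchange (tiny illustrative numbers, nothing here is real cryptography): without authentication, a MitM simply runs one key exchange with each side, and ends up sharing a key with both.

```python
# Toy Diffie-Hellman over a tiny prime to illustrate the point above:
# without authentication, a man-in-the-middle runs one key exchange
# with each side, and neither side can tell. (Illustration only; real
# DH uses large primes and, crucially, an authenticated handshake.)

P, G = 23, 5  # tiny public parameters, purely for demonstration

def dh_public(secret):
    return pow(G, secret, P)

def dh_shared(their_public, my_secret):
    return pow(their_public, my_secret, P)

# Honest parties pick secrets...
alice_secret, bob_secret = 6, 15
# ...but Mallory intercepts both public values and substitutes her own.
mallory_secret = 13
mallory_public = dh_public(mallory_secret)

# Alice thinks she is keyed with Bob, but shares a key with Mallory:
key_alice = dh_shared(mallory_public, alice_secret)
# Bob likewise shares a (different) key with Mallory:
key_bob = dh_shared(mallory_public, bob_secret)

# Mallory can compute both keys and transparently re-encrypt traffic.
key_mallory_alice = dh_shared(dh_public(alice_secret), mallory_secret)
key_mallory_bob = dh_shared(dh_public(bob_secret), mallory_secret)

assert key_alice == key_mallory_alice
assert key_bob == key_mallory_bob
assert key_alice != key_bob  # two separate "secure" channels, both owned
```

Both links are genuinely encrypted, which is exactly the GP's scenario: encryption happens, security doesn't.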

            • Sure you can. (Score:5, Interesting)

              by khasim ( 1285 ) <brandioch.conner@gmail.com> on Friday December 16, 2011 @06:43PM (#38404916)

              You can't have encryption without authentication, ...

              Sure you can. Encryption by itself does not mean secure communications. Bruce Schneier has a great book on the subject, "Practical Cryptography".

              ... otherwise anyone can stand between you and the endpoint and impersonate the other end.

              That does not mean that the transmissions are not encrypted, just that the communications channel is compromised. Transmissions between you and the MitM are encrypted, as are communications between the MitM and the site you think you're connected to.

              Which gets to the core problem. The way it is currently set up depends upon too many points of failure with no way to validate any of the connections.

              How do you KNOW that the site you've connected to is your bank in the USofA? Are you going to check to see that the CA issuing that certificate is not based out of Romania?

              And the reason that it is set up this way is because it is EASIER for the banks to pass on any losses to the customer or business. Change that and you'll see fixes happening.

              Right now your computer accepts a SINGLE source for encryption and "authentication". There should be at least 2 or 3 different checks.

              • Right now my browser combines the concepts of encrypted and authenticated into one value. Actually that's not always entirely true. Some modern browsers display a scary warning about something or other (self-signed certificates... when did those become bad? As long as they represent the domain in question they are fine but not according to browser makers).

                Anyway, the point is that we need to get away from treating an SSL certificate as more than they are. Hell, I can create as many certificates as I want an

      • by makomk ( 752139 )

        Then people wanted certs cheap and now, not something high levels of integrity checking really allow for

        For a lot of applications that's all that's really needed. In fact, for many of them even just SSH-style remembering of the last certificate seen would provide enough security - but unfortunately web browsers don't support that, so if you've got a website with anything you don't want transmitted over the internet in the clear then you need a certificate from a CA.

  • by Anonymous Coward

    The more rules you're required to follow before you're accepted as a trusted root CA, the fewer CAs there will be. The fewer CAs there will be, the more they can charge, and charge more they will, because they have to follow stricter rules and actually pretend to verify your identity, you know.

    Do not be fooled: DNSSEC-based certificates cannot give you the green URL bar that you crave.

    • by jd ( 1658 )

      The CAs shut down or investigated were added since the relaxation of the tiered certificate scheme. The CAs that have integrity now are the ones that had integrity then. Thawte was cheap when there were few CAs and has raised its prices as CAs have been added. Do explain.

      • by Anonymous Coward

        So? They'll raise prices even more when there is less competition. It is important to understand that the quality of a CA used to be binary: It either was a trusted root CA or it was not. You could not pay more and expect a better product, because the amount of verification a CA does does not improve security one bit. An attacker can dupe a less diligent CA, regardless of the CA the actual domain owner chose and paid. Nowadays you can pay more and get extended validation certificates, but that's just that m

      • by jd ( 1658 )

        Note to those abusing their mod points: It's not off-topic to point out that a lack of standards is bad for SSL certification and that prior proto-standards were extremely good for certification.

        Note to real Slashdotters: Do try and meta-moderate, at least once a year. Maybe twice. It'll limit the mod trolls at least a bit.

  • by pwileyii ( 106242 ) on Friday December 16, 2011 @05:29PM (#38404290)

    They call them baseline requirements, but since they have no authority, they are really just guidelines. Ultimately, these really don't fix anything; they are just an attempt to say "look, we are doing something!" You could argue that something is better than nothing, but until we have a real system for managing CAs that doesn't involve browser and operating system updates, we will stay in the same boat of trusting a third party (whose method of becoming a valid CA is questionable or unknown) and getting CA updates in a slow manner.

  • A fig-leaf endorsement of a fundamentally broken system. And those "identity ascertaining" provisions cost certificate requesters their privacy without any actual security gain. You still cannot really trust any of those 600-odd root CAs and their many, many subsignees on anything but their own say-so. Well, that's reassuring. By the same token, nothing in that standard provides guarantees against fuckups and breaches, just like PCI could not prevent TJX. Plus ça change...

  • by jdastrup ( 1075795 ) on Friday December 16, 2011 @05:33PM (#38404342)
    I remember the days when SSL certs were at least $200 USD/year and required faxes and forms, including one notarized. It was a pain in the butt to get a legit SSL cert that worked in most browsers. Looks like we may be going back to that.
    • by jd ( 1658 )

      Perhaps, but the certs were at least genuine. In the long-run, losses due to fraud are bound to swamp the savings of the cheaper (but useless) certs currently out there.

      • by makomk ( 752139 )

        On the other hand, a whole bunch of data that could really have used encryption instead got sent across the net in the clear because it wasn't worth the cost of a certificate, and I've no doubt that some of that could've been used for fraud...

  • Interesting. (Score:3, Informative)

    by jd ( 1658 ) <imipakNO@SPAMyahoo.com> on Friday December 16, 2011 @05:38PM (#38404388) Homepage Journal

    Seems like Go Daddy and a number of other CAs don't do much beyond checking your WhoIs entry in the way of integrity checking, and don't actually generate a full key pair (they just sign the public key, which means they don't know who actually generated it, and no signature is placed on the private key, which they never see, so the installer of the private key can't know that the private key corresponds to the public key they're offering).

    That's really, really, really bad design. Especially if you remember the problems with NULLs being added to strings, the ability to add arbitrary data to files without altering the MD5 hash, etc. I would never trust a signed cert if no individual in the chain could actually certify that one end-point matched the other. That's asking for a MITM.

    • Re:Interesting. (Score:4, Informative)

      by nullchar ( 446050 ) on Friday December 16, 2011 @05:47PM (#38404488)

      You can't trust GoDaddy or anyone else to generate your private key! Thus it would no longer be private. Granted, more checking besides Whois data should happen for the ridiculous prices the CAs demand. Also, the owner of the private key obviously knows the public key, and when they install the CA-generated certificate along with the keypair, the cert must match the public key.
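nullchar's last point, that the keypair holder can check the cert against the public key, is just a round-trip test. A toy RSA sketch with textbook demonstration primes (not secure, purely illustrative):

```python
# Toy RSA with tiny primes to illustrate the comment above: the holder
# of a private key can confirm a certificate's public key corresponds
# to it simply by round-tripping a value. (Demonstration numbers only;
# nothing here is cryptographically secure.)

def keys_correspond(public, private, probe=42):
    """Check that decrypt(encrypt(probe)) == probe for the two halves."""
    n, e = public
    n2, d = private
    if n != n2:
        return False  # different moduli can never be one keypair
    return pow(pow(probe, e, n), d, n) == probe

# p=61, q=53 -> n=3233, phi=3120, e=17, d=2753 (since 17*2753 = 15*3120+1)
good_public, good_private = (3233, 17), (3233, 2753)
other_private = (3127, 1019)  # some unrelated key

assert keys_correspond(good_public, good_private)
assert not keys_correspond(good_public, other_private)
```

The same round-trip logic is what a web server effectively performs when you install a cert alongside its keypair: a mismatched pair simply fails.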

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      It appears that you fail basic understanding of the concepts of public key cryptography and certification authorities. OF COURSE the CA does not generate the private key because this would break the security concept. There is nothing wrong with having the CA only sign the public key. Your musings about the private key are total nonsense, using a private key which does not belong to the public key will simply not work.

      Really, get yourself educated before posting such rubbish.

    • by Anonymous Coward

      don't actually generate a full key pair (they just sign the public key, which means they don't know who actually generated it, and no signature is placed on the private key which they never see, so the installer of the private key can't know that the private key is the key corresponding to the public key they're offering).

      You have just demonstrated an astounding level of ignorance of how PKI works. Please refrain from posting further on this topic until you have educated yourself. I recommend, at minimum, spending some quality time with the Google query "how PKI works." Next, you may wish to cuddle up with a copy of Applied Cryptography [schneier.com]. Hope this helps.

  • by nullchar ( 446050 ) on Friday December 16, 2011 @05:40PM (#38404398)

    It's great the CA/Browser Forum, made up of the most prominent Certificate Authorities, is taking steps to standardize their rules for certificates. Many rules in the PDF are technical and exact, which will help with software enforcement.

    However, even this necessary step of not issuing public certs for non-FQDN hostnames and reserved IP addresses won't take effect until late 2016!

    As of the Effective Date of these Requirements, prior to the issuance of a Certificate with a subjectAlternativeName
    extension or Subject commonName field containing a Reserved IP Address or Internal Server Name, the CA
    SHALL notify the Applicant that the use of such Certificates has been deprecated by the CA / Browser Forum and
    that the practice will be eliminated by October 2016. Also as of the Effective Date, the CA SHALL NOT issue a
    certificate with an Expiry Date later than 1 November 2015 with a subjectAlternativeName extension or Subject
    commonName field containing a Reserved IP Address or Internal Server Name. Effective 1 October 2016, CAs
    SHALL revoke all unexpired Certificates whose subjectAlternativeName extension or Subject commonName field
    contains a Reserved IP Address or Internal Server Name.

    If we're going to spend time and resources updating our browsers and operating systems to enforce some of these requirements and properly query certificate revocation lists, we may as well throw out the entrenched CA model and try something else [convergence.io].
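The quoted requirement amounts to a filter over SAN and CN values. A rough sketch of that filter using only the Python standard library; the "internal name" test here is deliberately naive, since a real CA would have to check against the public suffix list, which this skips:

```python
# Sketch of the check the quoted requirement implies: flag SAN or CN
# values that are reserved IP addresses or internal (non-public) server
# names, which CAs must stop issuing for by October 2016.
import ipaddress

def is_reserved_ip(value):
    try:
        # is_private covers RFC 1918 space and other IANA reserved ranges
        return ipaddress.ip_address(value).is_private
    except ValueError:
        return False  # not an IP literal at all

def is_internal_name(value):
    # No dot at all (e.g. "intranet") can't resolve in public DNS;
    # a trailing ".local" is likewise internal. (Naive; see lead-in.)
    return "." not in value or value.endswith(".local")

def disallowed_after_2016(san_value):
    return is_reserved_ip(san_value) or is_internal_name(san_value)

assert disallowed_after_2016("10.0.0.5")
assert disallowed_after_2016("intranet")
assert disallowed_after_2016("fileserver.local")
assert not disallowed_after_2016("www.example.com")
assert not disallowed_after_2016("8.8.8.8")  # public IP literal passes this filter
```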

  • This sounds like adding more red tape to solve a technical problem.
    Trust is still defined by a single path up to a root that I apparently should trust blindly.

  • by Anonymous Coward

    For a system that is meant to supply you with a level of trust, there is a surprisingly small foundation for that trust, just a lot of money paid to a virtual trust launderer. There is a better model in how DNS is organized, where parties handle everything within their own domain and central parties explicitly guarantee uniqueness and handle updates and transfers. Now if only the protocol were secure...

  • by thegarbz ( 1787294 ) on Friday December 16, 2011 @05:45PM (#38404476)

    These are companies in which we place our full trust. There shouldn't just be standards, codes of practice, or anything else that could let an idiot with a computer become a root CA.

    Instead the requirements should open them up to vigorous external audits. The auditors should be security experts and should be able to look at every part of the internal and external infrastructure owned by the CA. Any CA that fails should be kicked off the list.

    Maybe then we'd weed out the incompetents from the companies with which we trust our security.

    • by mcrbids ( 148650 )

      They *are* held up to vigorous external audits. However, the audits are based on accountancy principles (See SAS 70) and are performed under the guidelines of "reasonable" which provides a lot of wiggle room.

      I did some research for a SAS 70 audit to become a CA some years ago. In anticipation, I designed a certificate exchange system with not a single operation performed "in the clear", where even with full compromise of any stage in the process, a certificate couldn't be invalidly given!

      When I presented my

        Makes you wonder, then, if the auditors aren't the ones who need to change. It's a bit misdirected to ensure two-factor authentication to physical hardware and then have a CA go south because someone managed to get root keys out by exploiting a problem in a company website.

  • Why was there never such a governing body making these decisions from day one?!

    • by Mashiki ( 184564 )

      There should be one, the question is who to trust to do it? Because I sure the hell can't think of any off the top of my head. The UN? US? EU or European countries? Asian or Middle eastern? China? Canada? Australia? African nations? Fuck no. I don't trust any of them with that level of authority. I think the best option will probably be to go with high level registrars to sit down and hammer out a governing body agreement, but to have the entire group draw from the lottery who heads the body eve

  • by Animats ( 122034 ) on Friday December 16, 2011 @05:57PM (#38404580) Homepage

    This is a step forward. Not a huge step, but a step. There's a standard, and although it's weak, it's at least there. And, importantly, there's a list of who's signed onto complying with it. The CA Browser Forum says that 94% of the issued certificates were issued by their members, and there's a list of all 40 members. All root certs from non-Forum members should be removed from browsers. Some browsers now recognize as many as 200 CAs. A purge is in order.

    There's now a way to check whether a CA claims to comply with these rules. If the cert contains OID 2.23.140.1.2.2, the identity of the business behind the cert has supposedly been validated. If the cert contains OID 2.23.140.1.2.1, only the domain behind the cert has been validated. If it doesn't contain either of those, after July 2012, it's worthless. Browsers should be adjusted accordingly. Note that, for the first time, there's an actual verification process for business identity for non-EV certs. This is a big step forward.

    I'm very interested in the validation process for business name information, as described in section 11.2.1 of the specification [cabforum.org]. We use that info in SiteTruth [sitetruth.com], and we'll be tightening up our cert validation, now that there are standards.

    It's not clear how this will interact with Akamai's practice of issuing secondary certs for their customers, so that Akamai's caching servers can present SSL certs from companies Akamai represents. Akamai isn't a CA itself, and isn't a member of the CA/Browser forum.
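The OID check described in this comment reduces to inspecting the certificatePolicies extension. A minimal sketch, assuming the policy OIDs have already been parsed out of the certificate:

```python
# Classify a certificate by the CA/Browser Forum policy OIDs it
# asserts. The OID values are the ones the comment cites from the
# Baseline Requirements; the input is assumed to be the list of
# certificatePolicies OIDs already extracted from the cert.
BR_DOMAIN_VALIDATED = "2.23.140.1.2.1"
BR_ORG_VALIDATED = "2.23.140.1.2.2"

def classify_policy(policy_oids):
    if BR_ORG_VALIDATED in policy_oids:
        return "organization-validated"  # business identity checked
    if BR_DOMAIN_VALIDATED in policy_oids:
        return "domain-validated"        # only domain control checked
    return "no-baseline-assertion"       # per the comment: worthless after July 2012

assert classify_policy(["2.23.140.1.2.2"]) == "organization-validated"
assert classify_policy(["2.23.140.1.2.1"]) == "domain-validated"
assert classify_policy(["1.3.6.1.4.1.99999.1"]) == "no-baseline-assertion"
```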

  • only a fool trusted anything about certificates and the companies that issued them. for that matter, it's irritating as hell that firefox and other software vendors make it such a hassle for me to use a self-signed certificate. ssl certs should not cost money, and no corporation should be in control of them. instead, a free decentralized system of authenticating a domain should be employed. trivial, as we already do similar things with signed encrypted files.
  • by pauljlucas ( 529435 ) on Friday December 16, 2011 @07:52PM (#38405540) Homepage Journal
    Why aren't SSL certs used only to encrypt the transmission so data can't be packet-sniffed? Why must the cert also certify that foo.com's owners paid $X for a cert?

    If I connect to mybank.com, can't I clearly tell from the URL that I'm going to where I think I'm going?

    In contrast, when I ssh between computers, I don't need any certs for that. Assuming I typed the host's name correctly, I'm going to where I think I'm going. Right?

    • by mhogomchungu ( 1295308 ) on Friday December 16, 2011 @09:26PM (#38405958)

      Why aren't SSL certs used only to encrypt the transmission so data can't be packet-sniffed? Why must the cert also certify that foo.com's owners paid $X for a cert?

      SSL uses PKI (public key infrastructure). PKI provides two things: authentication and encryption. Authentication is critical because it proves the encrypted message is going to the intended recipient and there is nobody in the middle.

      Why must the cert also certify that foo.com's owners paid $X for a cert?

      It only certifies that foo.com owns the certificate; it says nothing about how much the certificate cost. A certificate is a signed public key.

      If I connect to mybank.com, can't I clearly tell from the URL that I'm going to where I think I'm going?

      If you type "mybank.com" into your browser, your browser will make a DNS request to get "mybank.com"'s IP address. Somebody could hijack the DNS request and return "iownyou.com"'s IP address instead, and all of your data will be sent there instead of to "mybank.com". Here is where the authenticity of the connection comes in.

      In contrast, when I ssh between computers, I don't need any certs for that. Assuming I typed the host's name correctly, I'm going to where I think I'm going. Right?

      When you ssh to a new computer, you are presented with the other computer's signature and asked whether you trust that the connection is coming from where you think it's coming from; it is your responsibility to authenticate the connection. The CA system puts that responsibility on somebody else. The way ssh works is equivalent to self-signed keys online. They will give you encryption but not authenticity. If you go to "mybank.com" and they say "we are mybank.com, trust us, we are who we say we are, here is an encrypted connection, use it to send your bank info", would you proceed? I hope you won't.
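The ssh model described here, trust-on-first-use, can be sketched in a few lines: pin a fingerprint the first time a host is seen, and refuse (or warn loudly) if it ever changes. A toy known_hosts, omitting the cryptographic binding of the key into the session that real ssh performs:

```python
# Trust-on-first-use (the ssh model): remember a host's key fingerprint
# the first time, and complain loudly if it ever changes afterwards.
import hashlib

known_hosts = {}

def check_host(hostname, public_key_bytes):
    fingerprint = hashlib.sha256(public_key_bytes).hexdigest()
    if hostname not in known_hosts:
        known_hosts[hostname] = fingerprint  # first use: trust and pin
        return "new-host-pinned"
    if known_hosts[hostname] == fingerprint:
        return "match"
    return "MISMATCH-possible-mitm"          # key changed: refuse or warn

assert check_host("mybank.com", b"key-A") == "new-host-pinned"
assert check_host("mybank.com", b"key-A") == "match"
assert check_host("mybank.com", b"key-B") == "MISMATCH-possible-mitm"
```

Note the trade-off the thread is circling: this catches a MitM on every connection except the very first, and it shifts the authentication burden onto the user instead of a CA.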

      • by grahammm ( 9083 ) *

        If you go to "mybank.com" and they say "we are mybank.com, trust us, we are who we say we are, here is an encrypted connection, use it to send your bank info", would you proceed? I hope you won't.

        Many banks do exactly that when they phone you. They withhold their number and then ask you to answer their security questions (the ones you have to answer when you phone them). Then they act surprised if you tell them that, since they called you anonymously, they first have to demonstrate that they are calling from the bank and are not scammers attempting to elicit your security details.

        If you were presented with a fingerprint on first connection and mybank published its fingerprint 'out of band;' (eg having prin

    • Wrong. MITM attacks are possible with DNS poisoning. You need both authentication and encryption.
      That said, the two should not be one mechanism. Encryption should be separated from authentication, and should be always on. Reducing the number of attackers with the resources to complete an attack is adding security.
    • Because you make HTTPS connections to more places than you make ssh connections.

      When you connect to a new server via ssh, you confirm the RSA fingerprint with the machine owner, right? Do you really want to have to do that with every https site you visit?

      And no, DNS can be spoofed, so you aren't necessarily going where you think you are. Heck, if the attacker is "in the middle" enough, they can just route packets for the IPs they want elsewhere.

  • by xeno ( 2667 ) on Friday December 16, 2011 @07:59PM (#38405570)

    I was just in another window, messaging a colleague about how there's still value in doing really lame or stupid things as long as you do those things consistently, and establish a common scope and language... so that you can then start to do real work. IOW: "You don't know how f---ed up things really are until you try."

    This doc is basically the product of a terribly depressing concall on which CA after CA lamented the lack of standards... and 5min from the end, one of the participants stepped up and said something like "Hey, we drafted this amateur-hour recommendation doc by ourselves -- how would the group like to adopt it?" This document is a very sad, sad, incomplete, short-sighted, sad (did I say sad?) first step -- basically munging together RFC 3647 with some ideas from PCI, but it still sets no real standards for actual operational security of a CA.

    However, if this gets adopted & reissued by a real standards-issuing body, /then/ people can say "Hey, ISO/IEC 2XXXX security standard for CAs really sucks; why don't we make it not suck..." THEN this doc will have had real value in ensuring there's a place for the non-suck document when it's done. (BS7799=suck, but it became 27002 and in the process set the stage for other standards that are, frankly, quite good.) The first step out of a swamp is still a step in the swamp.

    -Jon

  • by Anonymous Coward

    Consider that in some jurisdictions the government does not actually know very well who you are. For example, there have been reports that something like 15% of SSNs in the US are faked. Some of these belong to illegal immigrants, but far from all. Consider the difficulty of relating anything to birth certificates, which in many cases carry no evidence of which human they refer to. The new recommendations are similarly weak: let someone steal a domain and a check that the domain "belongs" to the thief will p

  • issuing SSL/TLS digital certificates naively trusted by the browser.

    FTFY

  • This is just an attempt to put a good face on a bad game. We call it bluffing.

    There is nothing that can be done to make this process more useful and secure; a new process must be established and recognized, one which involves treating self-signed certificates with some dignity (not pretending they are WORSE than plain-text HTTP). That's a first step.

    Second step is to start working on a distributed fingerprint / key system as well, while the site managers themselves need to recognize what they are dealing wi
