Mozilla Begins To Move Towards HTTPS-Only Web

jones_supa writes: Mozilla is officially beginning to phase out non-secure HTTP in favor of HTTPS. After a robust discussion on its mailing list, the company will boldly start removing capabilities from the non-secure web. There are two broad elements to this plan: setting a date after which all new features will be available only to secure websites, and gradually phasing out access to browser features for non-secure websites, especially features that pose risks to users' security and privacy. The plan still allows the "http" URI scheme for legacy content. With HSTS and the upgrade-insecure-requests CSP directive, the "http" scheme can be automatically upgraded to "https" by the browser and thus run securely. The goal of this effort is also to send a message to the web developer community that they need to move to HTTPS. Mozilla expects to make some proposals to the W3C WebAppSec Working Group soon.
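
For reference, both mechanisms mentioned above are ordinary HTTP response headers. A minimal illustration (the max-age value is just an example a site operator might pick):

    Strict-Transport-Security: max-age=31536000; includeSubDomains
    Content-Security-Policy: upgrade-insecure-requests

The first tells the browser to talk to the site only over HTTPS for the next year; the second asks it to rewrite any http:// subresource URLs on the page to https:// before fetching them.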
This discussion has been archived. No new comments can be posted.

  • Excellent. (Score:5, Insightful)

    by Anonymous Coward on Friday May 01, 2015 @09:58AM (#49593427)

    More wildcard certs for me to buy.

    • Re:Excellent. (Score:5, Informative)

      by kthreadd ( 1558445 ) on Friday May 01, 2015 @10:10AM (#49593539)

      More wildcard certs for me to buy.

      If Let's Encrypt takes off, and it's fairly likely to do so given the sponsors they have (including Mozilla), you won't have to buy any certs at all. They will just be there automatically.

        • I have my own personal web site. It uses HTTP. Several years ago I looked into upgrading it to HTTPS. No thanks. Why not? Because [a] I had to shell out my own money to buy a certificate to vouch for my domain name, and [b] it seemed wrong to me to have somebody else vouch for me. Maybe Mozilla can solve the first problem. But if you go to my domain name, why do you need anybody else to swear that that really is me? Seems wrong, somehow.

        - Andy Canfield
        www.andycanfield.com

        Ahah! Do you believe what

        • Re: Excellent. (Score:4, Insightful)

          by amxcoder ( 1466081 ) on Friday May 01, 2015 @02:58PM (#49595967)
          Actually this. I'm in the same boat, with my own domain on shared hosting. I'm not going to shell out money to a third party for a cert that really isn't needed for a website that just gives info about me and my business.

          On another note, I program embedded control systems for a living, and often incorporate automation that reaches out to either scrape data from web servers for different reasons (to display weather or energy usage stats) or to control home security monitors, etc. These embedded platforms don't have the encryption frameworks to access most HTTPS sites. That means doing something as simple as scraping info from an HTTPS page requires delving into encryption protocols, rolling your own encryption implementation, and getting it to run on a platform that is less powerful than a typical phone. It all started when all the email servers went to TLS, and suddenly getting an automation system to send a status or alert email turned into a major PITA. Now the whole web is going to be like that. I love how, at the dawn of IoT, everyone assumes that all these microprocessors are going to be running standard full-fledged web frameworks and all the goodies that go with them, including encryption, XML, JSON, REST and other frameworks that are common on your big 5 OSes, but not so common in the land of proprietary OSes running on embedded platforms.

          BTW, I program AMX and Crestron automation systems, if anyone was wondering what platforms I'm specifically referring to, but there are others as well.
    • by Anonymous Coward on Friday May 01, 2015 @10:27AM (#49593713)

      Mozilla used to be the Savior of the Web. But after these last few years, I fear they've lost that role.

      The UI changes to Firefox were totally unwanted, and have pretty much killed it as a product. Its share of the market keeps dropping and dropping. When we look at global web browser usage stats like these [caniuse.com], we see that Firefox is now maybe 10% of the market, if even that. Chrome for Android alone, Chrome 41 alone and Chrome 40 alone each have about the same or more users than all versions of Firefox. Heck, even IE 11 alone and Safari have about the same number of users these days.

      Mozilla has also engaged in numerous other half-arsed efforts, like Firefox OS and Persona, that nobody wants. Every review I've seen of Firefox OS has been negative. Nobody likes it, and nobody wants it, even the third-worlders they've had to resort to targeting it at. With Android, iOS, and so many other alternatives that are so much better, why the heck would anyone sensible use Firefox OS? The only reason to use it is to try to conform with some weird fringe ideology that worships HTML5/JS/CSS above all else, even above usable, working applications.

      Then there was the whole Eich debacle. Regardless of your stance, it's pretty disgusting that somebody had to lose his job merely because of his beliefs regarding same-sex marriages. It would be considered unacceptable if a homosexual was forced out of a job for supporting same-sex marriage, and it should be considered just as unacceptable if a heterosexual was forced out of a job for not supporting same-sex marriage. This is no place for hypocrisy or double standards.

      Now there's this shit that will cause headaches and problems for so many Web users.

      We need a new organization to save us, and the Web, from Mozilla. We need an organization that will put out a usable browser. We need an organization that focuses on doing what's right and what the Web community wants, rather than its own agenda. We need an organization that will listen to and respect its users, rather than trampling on them and ignoring their pleas. We need a new Savior, and we need it now.

      • by Grishnakh ( 216268 ) on Friday May 01, 2015 @11:27AM (#49594331)

        Then there was the whole Eich debacle. Regardless of your stance, it's pretty disgusting that somebody had to lose his job merely because of his beliefs regarding same-sex marriages.

        Bullshit.

        When you're the CEO of a company, your personal beliefs are no longer your own; anything you do in public reflects on that company. You are in effect the company's face and public image. So if the company's board of directors doesn't like the image you're conveying of the company, they are entirely within their rights to fire you and hire someone else.

        Simpletons like you don't seem to understand that being a CEO is not a normal job where you come to work, punch a time clock, do what you're told, and collect a paycheck and go home to live your private life. When you're CEO, you have no private life. Just look at Steve Jobs when he was alive: he was well-known, famous, he was Apple. Everything he did represented that company. Not only does the CEO direct the company and make all the big decisions, he also serves as the public face of the company.

        Granted, Mozilla isn't as big or prominent a company as Apple Computer, but it's still fairly well-known, as countless people do use their browser (or have in the past). If they thought that Eich was making their company look bad, they had a very good reason to replace him.

        Are you going to try to argue that if Coca-Cola hires some celebrity to do some ads for them, and that celebrity gets caught on camera spouting a bunch of racist stuff like Mel Gibson, that they shouldn't fire him, and that they should continue running ads featuring this now-controversial personality and thus completely ruin their public image?

      • Also the whole H.264 non-support debacle. Of course, to be fair, Google waffled on that too... but from the flip side, and never actually followed through and removed it from their mainstream browser.

        At some point Mozilla decided its philosophical (and sometimes political) agenda was going to be the driving force behind its decisions, rather than the users' wants and needs. That's fine; they're certainly free to do that - but if their users don't see value in them doing so, they're going to fade into obscurity.

  • Wait a minute... (Score:5, Insightful)

    by jez9999 ( 618189 ) on Friday May 01, 2015 @10:02AM (#49593459) Homepage Journal

    If my website just serves up public data that I don't care about the government seeing, you're going to disable new features on it anyway? Seems a bit extreme.

    • Re:Wait a minute... (Score:5, Informative)

      by LordLimecat ( 1103839 ) on Friday May 01, 2015 @10:05AM (#49593487)

      Not sure if you've been watching the news, but China has been using Baidu effectively as a botnet because they are able to intercept and modify javascript sent via HTTP.

      Stops a lot of threats, even if you're just a hobbyist; it ensures that an attacker can't just intercept your hobby page and drop a bunch of exploit kits on it.

      • Re: (Score:2, Insightful)

        by jez9999 ( 618189 )

        What about development, though? Do you want to go through the PITA of setting up HTTPS for every development site? This also stops you from using Wireshark to see what data is actually being transmitted.

        • This also stops you from using Wireshark to see what data is actually being transmitted.

          Is that not the point of HTTPS?

          • Wireshark is a useful debugging tool. The ability to snap off encryption to analyze things at the wire is a lifesaver.

            That said, if I'm debugging something a browser is doing, the developer console is usually better anyway. There remains the case where you are trying to debug a tester's experience without access to their browser, but the scenarios where that is true *and* it would be a good idea to disable TLS are limited. Being able to disable encryption is more important for clients that aren't so developer-friendly.

            • That said, if I'm debugging something a browser is doing, the developer console is usually better anyway.

              Yes, it is, and the same holds everywhere. Being able to grab the data on the wire has long been an easy way to get sort of what you want to see, but it's almost never exactly what you're really looking for. HTTPS will force us to hook in elsewhere to debug, but the "elsewhere" will almost certainly be a better choice anyway.

            • If you're trying to write software which bypasses the browser and does HTTPS directly, using Wireshark is extremely useful for debugging, and not easily replaced any other way.

        • by AmiMoJo ( 196126 )

          Considering how much effort Mozilla puts into providing tools for developers, I'd be amazed if they hadn't considered development and wire sniffing for debugging. Also, one of the other major goals of the effort to make HTTPS the default is to provide a simple way to enable it.

        • by Lennie ( 16154 )

          I believe an exception for localhost is included.

      • by markhb ( 11721 )

        Do you have an English reference for the Baidu comment (I'm not doubting, just want to see the details)?

      • Not sure if you've been watching the news, but China has been using Baidu effectively as a botnet because they are able to intercept and modify javascript sent via HTTP.

        Now that you mention it I vaguely remember hearing something about CNNIC and that CAs have effectively become key escrow for governments around the world.

    • If my website just serves up public data that I don't care about the government seeing, you're going to disable new features on it anyway? Seems a bit extreme.

      TLS can actually be used without encryption: the data is transferred in the clear, but you still get the authentication, which is something you want even if the data itself isn't secret.
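
      (These are the NULL cipher suites. If your OpenSSL build still ships them, you can list what exists with something like

          openssl ciphers -v 'eNULL'

      which prints suites such as NULL-SHA256 that authenticate and integrity-protect the connection without encrypting it. In practice almost no public server enables them, and mainstream browsers won't negotiate them, so this is more a protocol curiosity than a deployable option.)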

    • by mothlos ( 832302 )

      Securing traffic protects against a whole class of man-in-the-middle attacks that allow third parties to inject malicious code into non-sensitive communications.

      More importantly, however, requiring security of everyone makes secure sites more secure. The big problem is that security notifications for users don't work. It is simply too difficult and error-prone to notify users of important security problems while also ignoring unimportant ones. False negatives put users at risk, and false positives train users to ignore the warnings.

      • Re:Wait a minute... (Score:5, Interesting)

        by Todd Knarr ( 15451 ) on Friday May 01, 2015 @11:10AM (#49594115) Homepage

        The problem is that requiring HTTPS doesn't make sites more secure. It prevents an attacker who can't obtain a legitimate SSL certificate for the domain from running a mid-transit MITM attack, nothing more. The biggest problems seem to be a) phishing attacks that convince the user to visit a rogue site, eliminating the need for MITM, b) local system compromises (client- or server-side) that have access to the cleartext traffic and don't need an MITM, and c) rogue CAs who issue certificates for domains the recipient isn't authorized for, which allows mid-transit MITM even with HTTPS. The first two can't be mitigated by anything other than smarter users (HAH!), and mitigating the third requires massive changes to certificates so that it's possible to determine whether a certificate belongs to a given site without depending on anything in the certificate itself and without depending on the CA having validated the recipient.

    • If my website just serves up public data that I don't care about the government seeing, you're going to disable new features on it anyway? Seems a bit extreme.

      I get the feeling Mozilla don't want anyone to use their browser...

    • by Lennie ( 16154 )

      The features they are talking about are things like:

      enable the webcam

      Do you really want a man-in-the-middle attack inserting some extra JavaScript when you enable the webcam on some site?

      I would think the answer is: no

    • I'd think by your UID you'd have been around long enough to recognize this pattern.

      This is just how Netscape manages itself into... well, not being in business. Just because they changed their name to Mozilla after AOL realized how shitty they were doesn't mean it's a different company, really.

      Netscape has never had a grasp on what their customers wanted or needed. They have always coded themselves right out of existence by doing stupid shit JUST like this. No one at Netscape that makes decisions should be

    • I don't think it's extreme at all. I think we're past the point where it's socially reasonable or responsible not to encrypt all traffic by default.

      Even if you're 100% OK with visitors to your site being snooped on, consider that adding to the amount of crypto in use worldwide makes it hard for repressive governments to tell what their citizens are doing online. Maybe your site would be the straw that broke the Great Firewall's back and lets some kid read uncensored news.

  • F**** you, Mozilla! (Score:2, Interesting)

    by Anonymous Coward

    First, you introduce "features" like https://bugzilla.mozilla.org/show_bug.cgi?id=435013 and then you want to block the rest of the pages the mighty Mozilla Security Council does not approve?? Get stuffed.

    • It's almost like they even consider 11% too much... It's like they forgot why they forked to Firefox in the first place.

      I'll miss "Password Maker [mozilla.org]" but I think it's really time for me to ditch it completely.

      Does Chrome have anything like Firebug?

      • Re: (Score:3, Informative)

        by Anonymous Coward

        Does Chrome have anything like Firebug?

        Oh my yes!! I quit using Firefox for Javascript development because the Chrome developer tools are so much better than Firebug. I didn't think that anyone could improve on Firebug, but I was quite pleasantly surprised.

  • So.... (Score:4, Funny)

    by Continental Drift ( 262986 ) <slashdot@@@brightestbulb...net> on Friday May 01, 2015 @10:03AM (#49593471) Homepage
    No more http://slashdot.org [slashdot.org]?
  • by QuietLagoon ( 813062 ) on Friday May 01, 2015 @10:09AM (#49593525)
    My bank still insists on using RC4 ciphers and TLS 1.

    If Firefox were to stop supporting the bank's insecure website, it would surely get their attention better than I've been able to.

    • My bank still insists on using RC4 ciphers and TLS 1.

      If Firefox were to stop supporting the bank's insecure website, it would surely get their attention better than I've been able to.

      What bank is this? There's nothing wrong with public shaming in cases like this, in fact it does the world a service.

      Also, you should seriously consider switching banks. Your post prompted me to check the banks I use. One is great, one is okay. I'll watch the okay one.

    • My bank still insists on using RC4 ciphers and TLS 1.

      If Firefox were to stop supporting the bank's insecure website, it would surely get their attention better than I've been able to.

      As others have pointed out, they might claim that the latest Firefox was defective and encourage users to stay on an old version or switch browsers "until it is fixed". Once such decisions are written into policy, front-line workers unwittingly protect the decision makers from having to find out that they were wrong. They will simply 'teach' the users one-by-one to 'fix the problem' by installing a different browser.

      It would be better to have Firefox warn that the site has "outdated security" or something like that. The warnings could start out hardly noticeable and gradually become more conspicuous. It could start with a subtle change in the lock icon, then a mild click-through warning, then a warning with a scary graphic and phrases such as "proceed at your own risk".

      The idea is to get the message in front of as many Firefox-using customers as possible before the businesses are aware of it. That instantly makes it "a well-known security flaw in our website" rather than "a known problem with a version of Firefox used by two customers".

      At that point they can either fix their website or block Firefox. But now if they block Firefox the reason will be widely known and the bank subject to public ridicule.

  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Friday May 01, 2015 @10:13AM (#49593575)
    Comment removed based on user account deletion
  • by klapaucjusz ( 1167407 ) on Friday May 01, 2015 @10:18AM (#49593617) Homepage
    There's still no opportunistic encryption [wikipedia.org] in HTTPS. Does that mean I'm going to have to buy a TLS certificate for my printer every year?
  • by kav2k ( 1545689 ) on Friday May 01, 2015 @10:26AM (#49593709)

    I fully support this proposal. In addition to APIs, I'd like to propose prohibiting caching any resources loaded over insecure HTTP, regardless of Cache-Control header, in Phase 2.N. The reasons are:
    1) MITM can pollute users' HTTP cache, by modifying some JavaScript files with a long time cache control max-age.
    2) It won't break any websites, just some performance penalty for them.
    3) Many website operators and users avoid using HTTPS, since they believe HTTPS is much slower than plaintext HTTP. After deprecating HTTP cache, this argument will be more wrong.

    I'm sure the users will appreciate the extra traffic!

    I can see 1 being a thing, but 2 is a penalty for the end-user on metered connections, and 3 is an argument for "Mozilla is much slower than [insert browser here]".
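
    To make point 1 concrete: over plain HTTP, an on-path attacker who tampers with a script once can also attach a header like

        Cache-Control: public, max-age=31536000

    and the poisoned copy will keep being served out of the local cache for up to a year, long after the attacker is gone.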

    • I think it's even worse than that. Are there ANY caching services, EdgeCast, or CDNs that support encryption? HTTPS is great when you need it, but for static content like images it makes caching next to impossible, and it requires several times more servers to serve the same amount of traffic: an HTTP server can serve over double the number of pages per second of an HTTPS server, and that's without even counting all the traffic that caching and CDNs avoid sending in the first place.

      • by dbrueck ( 1872018 ) on Friday May 01, 2015 @11:23AM (#49594285)

        I do worry about the downsides of this in terms of how it'll cause higher load on servers because of higher traffic. That said, all major CDNs support HTTPS on the edges and non-HTTPS between the origin and the CDN, so they'll be fine. Where this will probably hurt more is with forward proxies at universities and businesses and transparent intermediate caches at ISPs.

      • by Strider- ( 39683 ) on Friday May 01, 2015 @11:28AM (#49594341)

        Also, for those of us operating network connections to remote locations, HTTPS everywhere is absolutely destructive to network performance. Right now, our WAAS setup gives us about a 30% boost on the satellite connection, mostly through low-level de-duplication and compression. When you have 50+ people depending on a 1.8Mbps satellite connection, every bit counts. Enabling HTTPS for things that don't need it is a huge performance penalty.

        Basically, the people making these decisions assume that everyone has an unlimited, fast internet pipe. This is simply not the case.

        • Good point. Yet another example is in-flight wifi like Gogo - not only do those guys rely heavily on caching, they'll even do things like recompress jpegs on the fly to be smaller. I'll sidestep the debate around whether that is good or bad, but another consequence of HTTPS-only web is that stuff like that has the potential to get even slower.

    • I'm sure the users will appreciate the extra traffic!

      Only users??

      Most serious hosters still charge by traffic. The web-site owners too would appreciate the increased traffic and higher bill.

  • Choice, not force. (Score:2, Insightful)

    by Tablizer ( 95088 )

    I hope they give a setting choice similar to:

    * Block all non-HTTPS sites
    * Prompt on all non-HTTPS sites (view/no-view confirmation, perhaps with a "remember choice for this site" option.)
    * Automatically allow all non-HTTPS sites, with a yellow warning bar and disabling of JavaScript.
    * Automatically allow all non-HTTPS sites, with a yellow warning bar.
    * Automatically allow all non-HTTPS sites, withOUT a warning bar.

    (There may be a way to simplify this by putting some of the questions in the warning bar.)

    Moz

  • by debrain ( 29228 ) on Friday May 01, 2015 @10:41AM (#49593839) Journal

    Unintended Affordances [pocoo.org] (or why I believe encrypting everything is a bad idea) is worth a read on this.

    I am not sure I agree on every point, but it's a well-thought-out post.

  • by Anonymous Coward

    HTTP needs to be phased out, but that doesn't mean everything needs to be encrypted. A lot of sites serve static content that's not a secret to anyone. Even in an encrypted stream, the contents of static files aren't really a secret. What you don't want is some man in the middle intercepting your request for some static file and responding with something malicious, like the Great Cannon.

    If static content were signed with the server's cert, its authenticity could be verified more cheaply than with HTTPS. Th

  • Self-signed (Score:5, Insightful)

    by Dwedit ( 232252 ) on Friday May 01, 2015 @10:43AM (#49593859) Homepage

    Okay, but if you're going to do that, you might want to throw out all the incredibly dire warnings about self-signed certificates. Nobody should be forced to pay a cartel for SSL certificates.

    Instead, throw up the dire warnings when the self-signed certificate isn't what was expected, such as when it changes.
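
    That "warn only when it changes" model is basically trust-on-first-use, the same thing SSH does with host keys. A rough sketch of the idea in Python (a toy illustration, not how any browser actually does it): remember the SHA-256 fingerprint of the certificate the first time you see a host, and complain if a later connection presents a different one.

        import hashlib
        import socket
        import ssl

        def cert_fingerprint(host, port=443):
            # Fetch the server certificate without validating it; we only want
            # the raw DER bytes so we can pin their hash ourselves.
            ctx = ssl.create_default_context()
            ctx.check_hostname = False
            ctx.verify_mode = ssl.CERT_NONE
            with socket.create_connection((host, port)) as sock:
                with ctx.wrap_socket(sock, server_hostname=host) as tls:
                    der = tls.getpeercert(binary_form=True)
            return hashlib.sha256(der).hexdigest()

        known = {}  # host -> fingerprint recorded on first use

        def check(host):
            fp = cert_fingerprint(host)
            if host not in known:
                known[host] = fp                    # first visit: remember it
            elif known[host] != fp:
                raise RuntimeError("certificate for %s changed!" % host)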

    • Re:Self-signed (Score:4, Interesting)

      by Strider- ( 39683 ) on Friday May 01, 2015 @11:32AM (#49594401)

      Okay, but if you're going to do that, you might want to throw out all the incredibly dire warnings about self-signed certificates. Nobody should be forced to pay a cartel for SSL certificates.

      It gets worse. Chrome throws the dire warnings on self-signed SSL certificates, and then refuses to do the username/password autofill on those pages. I've basically ditched Chrome for most of my network admin stuff that goes over HTTPS because of this.

        • If you are not willing to spend the 30 minutes it takes to set up your own CA and deploy that cert on your own desktop, please hand in your "network admin" card immediately and seek alternative employment.
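
          For anyone who hasn't done it before, the whole exercise is roughly this (file and host names are made up):

              # one-time: CA key plus a long-lived self-signed CA certificate
              openssl req -x509 -newkey rsa:4096 -nodes -days 3650 \
                  -keyout ca.key -out ca.crt -subj "/CN=Home Lab CA"

              # per device: key and CSR, then sign the CSR with the CA
              openssl req -newkey rsa:2048 -nodes -keyout router.key \
                  -out router.csr -subj "/CN=router.lab.local"
              openssl x509 -req -in router.csr -CA ca.crt -CAkey ca.key \
                  -CAcreateserial -days 825 -out router.crt

          Import ca.crt into your browser's trust store and the scary warnings (and Chrome's autofill refusal) go away for everything signed by it.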

    • Contrariwise, what we need is a trustable CA that gives out free certificates.

  • by bradley13 ( 1118935 ) on Friday May 01, 2015 @10:49AM (#49593923) Homepage

    HTTPS is all well and good, but the certificate situation is just a mess. Currently, essentially any CA can issue a certificate for any website anywhere. That means that every time you surf, you are placing your trust in literally hundreds of CAs.

    Meanwhile, self-signed certificates bring up horrendous warnings, or are simply refused. The chances of verifying a self-signed certificate (for example, by getting the fingerprint via another channel) are a lot better than the chances of verifying that some random CA hasn't been bribed or pressured.

    Can we please fix this mess, along the way to making HTTPS standard?
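
    At least checking a fingerprint out-of-band is straightforward. On the server, something like

        openssl x509 -in cert.pem -noout -fingerprint -sha256

    prints a hash (the file name is just an example) that can be published or read over the phone, and the browser's certificate viewer shows the same value for comparison.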

  • Doesn't that depend on the configuration and purpose? If the HTTP server's running on my own machine and the URL is "http://localhost/...", am I automatically insecure because I can't get an SSL certificate for "localhost"? And how would an attacker not already on my machine exploit this?

    If I can't test the full capabilities of a Web site because the browser won't let me, I'm going to have to switch browsers and relegate Firefox to testing-only just like IE is currently.

  • There are still plenty of clients out there that support neither SNI nor IPv6, so the implication of everyone going to SSL is that everyone needs a static IPv4 address. That sounds unsustainable to me.

    • I was wondering about that. It's been a number of years since I've had to worry about configuring Apache, but when I did it was for a government department that had a fair number of virtual hosts. Most of them didn't have HTTPS, so they were all grouped onto one IP address and configured as name-based virtual hosts. But if they all needed to be on HTTPS and you still couldn't use a virtual host for configuration, then I can see that being a huge pain for them. The web configuration isn't too bad but it would i

    • This has not been the case for a long, long time.
       
      All major web server software will allow virtual hosts on shared IPs using Server Name Indication, which has been part of the TLS standard since version 1.0.
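
      For example, name-based TLS virtual hosts in Apache look just like the plain-HTTP kind; the server picks the certificate from the SNI hostname (paths and names below are made up):

          <VirtualHost *:443>
              ServerName site-one.example.org
              SSLEngine on
              SSLCertificateFile    /etc/ssl/site-one.crt
              SSLCertificateKeyFile /etc/ssl/site-one.key
          </VirtualHost>

          <VirtualHost *:443>
              ServerName site-two.example.org
              SSLEngine on
              SSLCertificateFile    /etc/ssl/site-two.crt
              SSLCertificateKeyFile /etc/ssl/site-two.key
          </VirtualHost>

      Clients that can't send SNI (old Android, IE on XP) simply get the first vhost's certificate.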

  • by ftobin ( 48814 ) * on Friday May 01, 2015 @11:04AM (#49594059) Homepage

    It would be nice if they focused on fixing the certificate authority structure by supporting DANE, using DNS records to indicate certificates. Even though there is plenty of interest at https://bugzilla.mozilla.org/s... [mozilla.org], Mozilla doesn't seem interested in solving this problem:
    https://bugzilla.mozilla.org/s... [mozilla.org]
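
    For anyone unfamiliar with DANE: the idea is a DNSSEC-signed record that states which certificate or key a TLS service must present, e.g. a TLSA record roughly of the form

        _443._tcp.www.example.com. IN TLSA 3 1 1 <sha-256 of the server's public key>

    where "3 1 1" pins the end-entity certificate's SubjectPublicKeyInfo by its SHA-256 digest, so the browser wouldn't have to trust any CA at all for that host.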

  • Yet another reason (Score:4, Informative)

    by JohnFen ( 1641097 ) on Friday May 01, 2015 @11:07AM (#49594083)

    Thanks, Mozilla, for yet another reason to stop using Firefox.

    • Thanks, Mozilla, for yet another reason to stop using Firefox.

      You'd think that they would take a hint from their declining usage [wikipedia.org], instead of doing crazier and crazier shit.

    • At the rate that Google is going with their crusade against insecurity, I believe it is only a matter of time before they follow suit with Chrome.

  • Can't upgrade since it causes me to be locked out of the Windows domains at work if I go to 37.

    [John]

  • Last time I tried, https didn't work. Kinda surprised me.
