
Google Chrome Wants To Block Some HTTP File Downloads (zdnet.com) 207

An anonymous reader writes: Google wants to block some file downloads carried out via HTTP on websites that use HTTPS. The plan is to block EXE, DMG, CRX, ZIP, GZIP, BZIP, TAR, RAR, and 7Z file downloads when the download is initiated via HTTP but the website URL shows HTTPS.

Google said it's currently not thinking of blocking all downloads started from HTTP sites, since the browser already warns users about a site's poor security via the "Not Secure" indicator in the URL bar. The idea is to block insecure downloads on sites that appear to be secure (loaded via HTTPS) but where the downloads take place via plain ol' HTTP.
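
For the curious, here is a rough sketch of the check being described, assuming the listed formats map onto the obvious file extensions. It is illustrative only, not Chrome's actual logic, and a real implementation would also have to deal with redirects and script-initiated downloads, which the summary does not detail.

    from urllib.parse import urlparse

    # Illustrative mapping of the formats named in the article to extensions.
    BLOCKED_EXTENSIONS = (".exe", ".dmg", ".crx", ".zip", ".gz", ".bz2",
                          ".tar", ".rar", ".7z")

    def should_block_download(page_url: str, download_url: str) -> bool:
        """Block when a secure (HTTPS) page starts a plain-HTTP download
        of one of the listed archive/executable types."""
        page_secure = urlparse(page_url).scheme == "https"
        download_insecure = urlparse(download_url).scheme == "http"
        risky_type = urlparse(download_url).path.lower().endswith(BLOCKED_EXTENSIONS)
        return page_secure and download_insecure and risky_type

    # An HTTPS page linking to an HTTP .exe would be blocked:
    print(should_block_download("https://example.com/downloads",
                                "http://cdn.example.com/setup.exe"))  # True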

This discussion has been archived. No new comments can be posted.

  • UGh. (Score:5, Insightful)

    by flippy ( 62353 ) on Wednesday April 10, 2019 @02:17PM (#58416542) Homepage
    Why oh why does Google think that they know better than everyone? Give a warning, sure, and then let the user decide. Just the same way it handles an HTTP page vs an HTTPS page.
    • On top of that, there are a ton more file extensions that should be added to the list if they are trying to stop on-the-fly binary replacement. This is Google playing nanny again.

      • by flippy ( 62353 )
        100%. I get that they want to try to protect people from their own mistakes, but just outright disallowing stuff isn't the way to do that.
        • Yeah, if you ask 10 random people what TLS is, you'll find out why Google security engineers think that they know security better than the average consumer does. It's their JOB to know security, so they SHOULD be much better informed than the average user. They shouldn't forget that fact when they make *defaults* and *warnings*.

          On the other hand, I've been an internet security professional for twenty years. I can reasonably decide to override the defaults in selected situations. I am not a typical user i

    • Re:UGh. (Score:5, Insightful)

      by supremebob ( 574732 ) <themejunky&geocities,com> on Wednesday April 10, 2019 @02:53PM (#58416858) Journal

      I wish that Google gave you the ability to suppress those warnings as well. I have a few internal development sites with invalid SSL certificates on them, and Chrome throws an obnoxious "YOUR CONNECTION IS NOT PRIVATE" warning every time I hit them.

      Congratulations, Google, you're training people to click on the "Proceed to x (unsafe)" link EVERY time they see that page as a muscle memory reaction, whether or not it's a real security issue.

      • It's incredibly stupid that browsers don't make a peep about plaintext HTTP connections, but go into full "DANGER WILL ROBINSON!!1" alert for HTTPS connections with a self-signed or invalid cert. In what way could the latter possibly be less secure than the former?

        • If you had a 2,000-pound physical safe which would open whenever someone tapped it on the left side, that would be a defective product. Since you presumably bought the safe to protect valuables, you'd want to know that it doesn't actually offer any security. A security warning about that safe would be warranted.

          A cardboard box would not be defective if it could be opened easily. You don't store gold in a cardboard box and expect high security.

          By applying TLS, the site operators are essentially declaring that the con

          • The problem is that you suggest the common user can tell the difference between a cardboard box and a safe. They can't (thus the green locks and such), and yet we're still treating a safe with potentially no lock (or potentially the best lock of all, if you roll your own cert, verify keys out-of-band, and save them) as a less secure container than a cardboard box. Which it in no way is.

              • The purpose of the cert is for the browser to know whether it is talking to your server, or to my MITM proxy, which I made on a Raspberry Pi and which presents itself as a WiFi network called "Convention Guest WiFi".

              If you don't tell the browser WHICH cert you've rolled, it's unable to distinguish your cert from my imposter cert, and therefore you have almost zero security.

              > you suggest the common user can tell the difference between a cardboard box and a safe. They can't (thus the green locks and such)

              I suspect u

              • The green lock is there because the user doesn't know the difference between http: and https: in the URL bar. As such, the browser should display the green lock for an HTTPS connection with a valid cert and not display one for an HTTP connection or an HTTPS connection with an invalid cert. It's even easier for your RasPi to MITM an HTTP connection, but the browser will happily use that protocol without complaint.
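
                As an aside, "roll your own cert, verify keys out-of-band, and save them" (mentioned upthread) is essentially certificate pinning. A minimal sketch of what that check can look like, with a placeholder host and a placeholder fingerprint recorded out-of-band:

                    import hashlib
                    import socket
                    import ssl

                    HOST = "internal.example.test"   # hypothetical internal dev host
                    PORT = 443
                    # Placeholder: the SHA-256 fingerprint you saved on first trusted contact.
                    PINNED_SHA256 = "0" * 64

                    def cert_fingerprint(host: str, port: int) -> str:
                        """Return the SHA-256 fingerprint of the server's certificate (DER form)."""
                        ctx = ssl.create_default_context()
                        ctx.check_hostname = False      # we trust the pin, not the CA chain
                        ctx.verify_mode = ssl.CERT_NONE
                        with socket.create_connection((host, port), timeout=5) as sock:
                            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                                der = tls.getpeercert(binary_form=True)
                        return hashlib.sha256(der).hexdigest()

                    if cert_fingerprint(HOST, PORT) != PINNED_SHA256:
                        raise SystemExit("Certificate fingerprint changed -- possible MITM, aborting.")
                    print("Pinned certificate matched; proceeding.")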

        • It's not stupid at all.

          Do you expect a warning, every time you walk down the street talking, that your conversation may be overheard by the person walking next to you?
          Do you expect a warning when you're in your home and someone has installed a microphone in your closet?

          These are completely different use cases.

      • by AmiMoJo ( 196126 )

        You are supposed to install a local root certificate that you use to produce your own test certs.

        • by tepples ( 727027 )

          I've read it's a lot harder to install a local root certificate on an iPhone, iPad, Android phone, or Android tablet than on, say, a desktop computer. Besides, as of Android 7, local root certificates don't even work in all apps unless each app's developer has opted into using local root certificates through the app's Network Security Config.

          • by AmiMoJo ( 196126 )

            It's fairly easy: https://support.google.com/nex... [google.com]

            Should work with all major browsers.

            • by tepples ( 727027 )

              Should work with all major browsers.

              From the page you linked:

              Most apps don't work with CA certificates that you add

              In Android 7.0 and up, by default, apps don't work with CA certificates that you add. But app developers can choose to let their apps work with manually added CA certificates.

              Do you mean that the publishers of Chrome, Firefox, and other major browsers do in fact "choose to let their apps work with manually added CA certificates"?

              • by AmiMoJo ( 196126 )

                Read it again, carefully. That caveat only applies to CA certificates, not ones you make yourself.

                • by tepples ( 727027 )

                  If you're acting as a private CA, what's the difference between "a local root certificate that you use to produce your own test certs" and "CA certificates"?

                  • by AmiMoJo ( 196126 )

                    Basically it's the level of trust that each gets. CA certs are generally handled transparently, without any user interaction, and are accepted as validating identity. Self-signed certs are just used to encrypt the connection and don't prove identity, and some clients may choose to ask the user to confirm their use, typically only once, the first time they are encountered.

      • as a muscle memory reaction,

        Just hand Chrome (Chromium) over to the UI/UX experts. They'll have your errant muscle memory fixed in no time. And even better, it'll STAY fixed, since they'll keep moving it around and changing its appearance.

      • by jythie ( 914043 )
        Even worse, there are some cases where it doesn't even give you the 'Proceed to x (unsafe)' link anymore. It makes dealing with outdated (gasp, only a few years old!) embedded devices REALLY frustrating.
      • by gmack ( 197796 )
        Setting up an internal Certificate Authority is not that hard.
      • Comment removed based on user account deletion
      • Congratulations, Google, you're training people to click on the "Proceed to x (unsafe)" link EVERY time

        No. They are training IT experts, who should be immune to the training, to do so. The number of times an ordinary user will encounter a page with a legitimate SSL certificate error that they need to routinely click through is close to zero. The result is that people actually pause.

        Quite telling that your example talks about internal development sites. I'm not convinced many ordinary users have to worry about those.

    • Firefox isn't much better. I had no idea all download URLs were transmitted to a malware check until after I spent 8 hours on a super slow download of one part of a zipped video file (i.e. not even an executable). Then Firefox, without warning, said it was malware and deleted it so thoroughly that forensic software couldn't get it back a minute later. And the only option around this is disabling the service entirely, which was fine for me, since I was appalled at it transmitting all my URLs to a 3rd party without warning
    • Most people have no ability to decide. Provided the feature can be turned off, I have absolutely no problem with a default that blocks files that are the most frequent delivery agents of malware.

    • As if Mozilla (and others who usually make browsers based on Chromium) is much better. I remember I couldn't paste a fairly large piece of JS code into the Firefox dev console, since Mozilla thinks it's very insecure to execute something from an unknown source. Actually, it's uncommon for an average clueless user to open the browser dev tools, then paste and execute JS code. This is just an example of ruining the developers' experience. And many similar things can be said for user experience. Others aren't signi
    • Why oh why does Google think that they know better than everyone?

      I'm going to guess it's because they spend more money on R&D and human interaction studies than the typical armchair warrior does.

  • by SuperDre ( 982372 ) on Wednesday April 10, 2019 @02:21PM (#58416578) Homepage

    But HTTP or HTTPS doesn't really matter these days; even malicious sites are using HTTPS.
    As long as you get a warning when downloading and you are still able to download the file, I don't have anything against it. But if they just block the download completely because it isn't coming from an HTTPS site, then I won't be using Chrome anymore. As I said, HTTPS doesn't say anything about the file being safe.

    • But it does mean that the executable file wasn't altered in transit.

      • Automatic checksumming would be better all around.

        • by Anonymous Coward

          I always wondered... if the checksum was from the same place, how do you know some MITM attack didn't change the checksum?

      • In case you didn't know, MITM attacks are now also possible through https....... It's not as simple as with http, but it is possible..
      • Re:uhh,, (Score:4, Insightful)

        by Chris Mattern ( 191822 ) on Wednesday April 10, 2019 @02:56PM (#58416884)

        But it does mean that the executable file wasn't altered in transit.

        Catching executables in flight and altering them sounds like a really hard way to do something unless your ISP is doing it to you (and if your ISP would do that to you, you have much bigger problems). It ranks way down on my list of worries, being massively overshadowed by the possibilities that the site itself has been hacked or is intentionally serving up malware--neither of which this does anything to help you cope with.

    • Considering how many people are downloading over WiFi or an untrusted router, it doesn't make sense to use HTTP, because your file can get changed during the download. HTTPS won't slow you down and will offer basic security against the hacker next door. For web hosting, https://letsencrypt.org/ [letsencrypt.org] offers free SSL certificates.
  • Ok, and how exactly do they expect people to be able to download software, or other files?

    Apparently in Google's world everyone has gigabit fibre, so very large log files (for example) are not an issue. But for those of us in the real world, being able to compress stuff before sending is still incredibly valuable.

    (And for anyone that plans to latch onto the log file example like a starving dog on a steak and say "Well you should be splitting up your log files!", I kindly invite you to eff off in advance. I'm

    • Or, I could read the summary and article a little more carefully and realize it's restricted to HTTP downloads from an otherwise HTTPS site.

      I can see why they would do that, since an HTTP connection can be MITM'ed easily. But that goes for literally anything. Malicious office docs, PDFs... There are tons of files that can have a malicious payload beyond the ones they mentioned. Hell, someone MITMing an HTTP connection can basically send whatever they want, so it would be far simpler to just bring up a warnin

      • The big difference I see is that executables inherently get pretty much full, unrestricted user-level access to the machine, whereas compromised documents rely on exploiting vulnerabilities in applications (i.e. it's somebody else's problem). Those applications are typically constantly being updated to remove vulnerabilities (well, so long as they're not "too big to care about users' needs" at least, which perhaps covers the specific examples you mentioned...)

        That said - yeah, it does seem like simply warnin

    • Comment removed based on user account deletion
  • Is that we could all agree on some sort of standard whereby from a secure site you could initiate a download, have that download be unencrypted, but the download link would include a sha256 checksum that would be checked automatically by the browser once the download was complete.

    This would allow popular downloads to be cached closer to the user, while providing for verification of the download integrity.

    • by Anonymous Coward

      if you think about it a bit longer you'll discover that doesn't solve anything

      • The idea, which was not communicated clearly, is that the hash would be transmitted over the encrypted channel, thus extending trust to the object served outside of that trusted umbrella.

      • I'm not seeing it.

        Obviously the checksum would have to be sent over the encrypted channel, but so long as you do that, sending the data itself unencrypted and cacheable is a non-issue. (well, aside from surveillance)

        I've often wondered why such a thing isn't common myself - not just for security purposes, but to reliably and transparently detect accidental transmission errors.

      • by tepples ( 727027 )

        Would a "signing-only cipher suite" make sense?

  • Mostly Pointless (Score:5, Insightful)

    by EndlessNameless ( 673105 ) on Wednesday April 10, 2019 @02:32PM (#58416684)

    Most sites provide their file hashes over HTTPS. If I'm going to verify the file on my end anyway, there's no real reason for the site to waste CPU encrypting the entire ISO every time someone downloads it.

    Digital signatures and hash verification address the same security concerns with less impact.

    • by Kjella ( 173770 )

      Most sites provide their file hashes over HTTPS. If I'm going to verify the file on my end anyway

      Well to my knowledge there's no standard way to do this. Like if you could have an <a href="http://my.plain.download" sha-256="e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"> this would be fine, simply fail if it doesn't verify. But if you're expecting the average user to verify security certificates etc. then 99.9% of them won't do that.

    • >Most sites provide their file hashes over HTTPS.
      I'm going to have to disagree. In my experience, most sites don't provide hashes, and most users don't know how to check them anyway.

      If you're downloading ISOs you're probably a fellow Linux enthusiast, which puts you in a (generally) much more technically competent group, but a group so small as to be largely irrelevant to the attack channels used against the broader population.

      Also - even for technical users they're not talking about blocking all HTTP d

    • by DRJlaw ( 946416 )

      If I'm going to verify the file on my end anyway, there's no real reason for the site to waste CPU encrypting the entire ISO every time someone downloads it.

      Nobody will see this due to the wall of trolling that's accumulated under your post, but...

      Sure there is. "They'll"* know that you downloaded that file through their deep packet inspection gear.

      *They being the government (three letter agencies), or the transit provider, or the cable/DSL oligopoly, or FAANG because why the hell not.

    • by AmiMoJo ( 196126 )

      What percentage of internet users do you think actually know what a hash is?

      This is just another step towards fixing a very old mistake. Security should be the default.

      • What percentage of internet users do you think actually know what a hash is?

        Is this an issue that needs solving? Are people actually being owned by this vector? What percentage of Internet users have been attacked in this way? Where is the evidence supporting this position? Why are less assertive measures insufficient?

        Or, like Google's bullshit reasoning for reducing security by removing public key pinning and falsely claiming certificate transparency is an effective analogue, is this just another scheme for punishing *undesirable* sites who link to third-party files for download

        • by AmiMoJo ( 196126 )

          Is this an issue that needs solving?

          Yes. MITM attacks are used by everyone from governments to ISPs to spread malware.

          there is something fucked up when one company imposes restrictions on the entire Internet because it can.

          The entire internet runs on Chrome?

          • Yes. MITM attacks are used by everyone from governments to ISPs to spread malware.

            There isn't a government in the world worth mentioning that lacks the resources to MITM HTTPS. Numerous CAs are located in effective dictatorships, some even state-run.

            This fact is why it was so damaging and counterproductive for Google to have removed key pinning protections from Chrome while falsely claiming certificate transparency to be an analogous replacement. The man asked and Google complied.

            As for ISPs spreading malware this is illegal criminal behavior in most of the world. Any ISP caught doing so fa

    • If I'm going to verify the file on my end anyway

      What are you working for the NSA or something? Normal people don't do that.

  • Let me guess: So that a site without a Hollywood approved security certificate can't make use of HTTPS to encrypt and circumvent mandatory Hollywood file inspection?

    • by tepples ( 727027 )

      What does the neighborhood of Hollywood or even the US movie industry have to do with HTTPS? Let's Encrypt offers free certificates to anyone who owns a domain name.

  • It brings back bad memories of when Google decided that certain filetypes were too dangerous to be handled by gmail, so suddenly I could no longer access a bunch of .js files a buddy had sent me long ago that I had in my mailbox. I was not impressed.
  • Including .EXE but forgetting about .SCR, .COM, .BAT, .LNK, and possibly other extensions that are treated just like .EXE files?

    (Yes, .BAT files which are valid PE executables will run as executables; Windows won't try to execute them as a command prompt script.)

  • by Anonymous Coward

    I think there's actually a push to encrypt too much. It's got obvious benefits for privacy and security, but encrypted traffic can't use the internet's caching infrastructure, which would benefit popular downloads, which tend to be ZIPs, EXEs, TARs and such. What I'd really like to see is browser integration to insecurely download files and securely confirm their fingerprints.

    That said, making the user aware of an unencrypted download from an encrypted site would be good. Blocking it would be bad.

    • encrypted traffic can't use the internet's caching infrastructure which would benefit popular downloads

      A CDN contracted by the operator of the origin server, such as CloudFront or Cloudflare, can cache HTTPS just as easily as cleartext HTTP.

  • ... web site to go through all their web pages and make sure that no instances of "http:" are accidentally left in pages where downloads are available?

    It might be easier for web sites to merely add a browser detector to their pages to warn the user that they're using a product from a vendor that's actively trying to make their use of the Internet into a royal pain in the behind?
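
    For what it's worth, that kind of audit is easy to script. A sketch, assuming a placeholder page URL, that flags plain-HTTP links to the file types Chrome plans to block:

        from html.parser import HTMLParser
        from urllib.request import urlopen

        RISKY = (".exe", ".dmg", ".crx", ".zip", ".gz", ".bz2", ".tar", ".rar", ".7z")

        class InsecureDownloadFinder(HTMLParser):
            """Print anchor hrefs that point at plain-HTTP downloads of risky types."""
            def handle_starttag(self, tag, attrs):
                if tag != "a":
                    return
                href = dict(attrs).get("href") or ""
                if href.startswith("http://") and href.lower().endswith(RISKY):
                    print("Insecure download link:", href)

        PAGE_URL = "https://www.example.com/downloads.html"   # placeholder page to audit
        finder = InsecureDownloadFinder()
        finder.feed(urlopen(PAGE_URL).read().decode("utf-8", errors="replace"))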

  • by nadass ( 3963991 ) on Wednesday April 10, 2019 @02:55PM (#58416872)
    The Google Chrome engineer who posted this ask to the W3C mailing list ( https://lists.w3.org/Archives/... [w3.org] ) also made a social media poll, https://twitter.com/estark37/s... [twitter.com]

    Essentially, they're reinforcing their own echo-chamber effect, listening only for confirmation of their preconceived notion of correctness rather than truly encouraging discourse on the matter. The poll options are "yes" and "yes" -- and several Twitter replies have been deleted.

    Personally, it seems they are an engineer looking for a problem to solve to help justify their job... and that's just sad in itself.
  • It's getting close to a point where we need to make our own web and web browsers, with blackjack and hookers (but seriously!). There are lots of free TCP ports left, let's just choose another one and walk away from the HTTP2/3/AMP/QUIC bullsh$t. Make a translator gateway if someone wants to visit the "Google-web" with all of its limitations.

    This is a clear violation of stack layers -- a general purpose APP for browsing and fetching online content should NOT attempt to be the gatekeeper for what the TRANSPOR

  • It already blocks .torrent files on certain sites. Says they are dangerous. Orly?
  • Chrome? (Score:2, Funny)

    by sremick ( 91371 )

    As long as it continues to let me download FirefoxSetup.exe, we're good.

  • Why not include office documents and PDFs as well? They've been a source of infections too.
    Well, not much else is left except pure media (video/audio/picture) files.
