FTP Resources Will Be Marked Not Secure in Chrome Starting Later This Year (google.com) 152
Google engineer Mike West writes: As part of our ongoing effort to accurately communicate the transport security status of a given page, we're planning to label resources delivered over the FTP protocol as "Not secure", beginning in Chrome 63 (sometime around December, 2017). We didn't include FTP in our original plan, but unfortunately its security properties are actually marginally worse than HTTP (delivered in plaintext without the potential of an HSTS-like upgrade). Given that FTP's usage is hovering around 0.0026% of top-level navigations over the last month, and the real risk to users presented by non-secure transport, labeling it as such seems appropriate. We'd encourage developers to follow the example of the linux kernel archives by migrating public-facing downloads (especially executables!) from FTP to HTTPS.
As someone who has to administer firewalls... (Score:5, Insightful)
...FTP just needs to die. The two-port requirement and, worse still, people who don't get it insisting on 'active' FTP, are a pain in the backside for firewall admins. (We had one vendor insist that passive mode was 'insecure' and active mode was somehow 'secure', but after some browbeating and the threat of the wire brush of enlightenment they accepted that they should use this newfangled "SFTP", which has none of the drawbacks of FTP, passive or active.)
FTP's day was done over ten years ago.
Re:As someone who has to administer firewalls... (Score:4, Interesting)
Re: (Score:2)
Yes, FTP isn't secure by itself, but it's simple.
Spoken like someone who has never looked at the FTP protocol or the code in a client or server. HTTP is far simpler to implement than FTP and, unlike FTP, is also trivial to add TLS support to, easier to scale up with CDNs, and so on. FTP hasn't been the right tool for any job for well over a decade.
Re: (Score:2)
Not for uploads. You'll need a server-side script for those.
HTTP PUT (Score:2)
Uploads through the POST method require a server-side script. But I thought a server could handle the PUT method by itself.
Re: (Score:2)
By default, no. You need DAV enabled. On Apache that's mod_dav, and on nginx it's HttpDavModule.
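From the client side, the PUT described above is a single request with no script behind it. A minimal sketch using only Python's stdlib, assuming a server with DAV enabled for the path (the host and path are hypothetical):

```python
# Sketch: uploading a file with a bare HTTP PUT, assuming the server
# has WebDAV (mod_dav / ngx_http_dav_module) enabled for this path.
import http.client

def put_file(host, path, data, content_type="application/octet-stream"):
    """Issue a single PUT request and return the response status code."""
    conn = http.client.HTTPSConnection(host)
    conn.request("PUT", path, body=data,
                 headers={"Content-Type": content_type,
                          "Content-Length": str(len(data))})
    resp = conn.getresponse()
    resp.read()          # drain the body so the connection closes cleanly
    conn.close()
    # Servers typically answer 201 (Created) on first upload and
    # 204 (No Content) when overwriting an existing resource.
    return resp.status

# Usage (hypothetical server):
# status = put_file("dav.example.com", "/uploads/report.txt", b"hello")
```

No CGI or server-side script is involved; the server's DAV module writes the body straight to disk, gated by whatever auth the location is configured with.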
Re: (Score:2)
And then every malware bot will use your HTTP server with PUT enabled to distribute itself.
401 Authorization Required
Re: (Score:2)
A failed WebDAV PUT can be resumed with a second PUT specifying a Content-Range.
Re: (Score:2)
Or you can use WebDAV. It's about 5 lines of httpd configuration and you can tie it in to use whatever auth module you are already using.
Re:As someone who has to administer firewalls... (Score:5, Informative)
1) FTP uploads are easier to support than HTTP uploads. HTTP uploads require CGI scripts to handle them, and if configured wrongly can lead to security issues (see the FCC website w.r.t. its comment system).
2) FTP supports TLS - it's called FTPS (not to be confused with SFTP: the former is FTP with a TLS session layered on, the latter runs over SSH). Modern FTP clients and servers support a STARTTLS-style command (AUTH TLS, per RFC 4217) to initiate TLS, and they do it before the USER/PASS commands so credentials are encrypted from the get-go. Note that you need to use passive mode while doing this, as most NAT gateways snoop on FTP sessions to set up dynamic mappings, and TLS prevents them from doing so.
3) HTTP doesn't allow easy downloading of multiple files other than picking and saving them one at a time. Browser extensions may try to simplify this, but in general you can't select a list of files and transfer it. Triply so if you want to upload multiple files - either the web page and its script have to implement support or you're uploading files one at a time. Clever JavaScript can help with that, but then you're relying on client-side and server-side scripts, and not all websites that support uploads support multiple file transfers.
Granted, it's time for a modern upgrade to FTP that gets rid of the multiple port requirements, but HTTP is not a complete replacement for FTP. FTPS still has all the issues with FTP. SFTP is a lot better, but support is generally lacking across the board, including bypassing strict firewalls.
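Point 2 above maps directly onto Python's stdlib FTPS support. A minimal sketch of an explicit-TLS session (the hostname and credentials are hypothetical):

```python
# Sketch: explicit FTPS (FTP over TLS, RFC 4217) with Python's stdlib.
import ftplib

def secure_listing(host, user, password):
    ftps = ftplib.FTP_TLS(host)   # control connection opens in cleartext
    ftps.auth()                   # AUTH TLS is sent *before* USER/PASS,
                                  # so the credentials travel encrypted
    ftps.login(user, password)
    ftps.prot_p()                 # PROT P: encrypt the data channel too
    # ftplib defaults to passive mode, which is required here: a NAT
    # gateway can no longer snoop the (now encrypted) control channel
    # to learn the data-connection port and open it dynamically.
    names = ftps.nlst()
    ftps.quit()
    return names
```

This is a sketch, not production code; a real client would also verify the server certificate via an ssl.SSLContext passed to FTP_TLS.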
STARTTLS stripping (Score:2)
Modern FTP clients and servers support STARTTLS as a command to initiate TLS
Unless the ISP intercepts the STARTTLS command sent by the client and turns it into a garbage command that produces a 502 (command not implemented) response, fooling the client into thinking the server doesn't support TLS. This has happened [eff.org], Ars Technica has reported on it [arstechnica.com], and there's even a proof of concept on PyPI [python.org]. What's FTP's counterpart to HSTS?
Re: (Score:2)
Unless the server sincerely doesn't support STARTTLS. Without some counterpart to HSTS, how is a client supposed to distinguish a server that doesn't support TLS from one that does but is behind a proxy that changes "Send a STARTTLS command first" to "Send a XXXXXXXX command first"?
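What an HSTS analog for FTP would amount to is a client-side fail-closed policy: once a server has been seen to support TLS, never accept cleartext from it again. A rough sketch (the trust-on-first-use cache here is an in-memory dict; a real client would persist it, and names are hypothetical):

```python
# Sketch: a "remember TLS" policy, roughly what an HSTS analog for FTP
# would look like on the client side.
_seen_tls = {}   # hostname -> True once TLS has succeeded

def negotiate(host, server_supports_tls):
    """Return the mode to use, or raise if a downgrade is suspected."""
    if server_supports_tls:
        _seen_tls[host] = True
        return "tls"
    if _seen_tls.get(host):
        # This server spoke TLS before but "can't" now: assume STARTTLS
        # stripping by a middlebox and fail closed.
        raise ConnectionError(f"possible STARTTLS stripping for {host}")
    return "cleartext"   # first contact, nothing pinned yet
```

Like HSTS before preload lists, this is still blind on first contact: a sincerely TLS-free server and a stripped one look identical the first time you connect.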
Re: (Score:2)
No kidding. I just used FileZilla to download 250GB of weather data samples from the NOAA site (ftp://ftp.ncdc.noaa.gov). It's spread over close to 100,000 files. I just dragged the folders I wanted over to my drive and all of the files got queued up. Left it downloading overnight and came back to a completed transfer.
I don't see using any sort of HTTP based solution as being able to do that.
It's public data (and very likely not going to be MITM'ed), so unencrypted FTP was the perfect solution for this.
Re: (Score:2)
1) FTP uploads are easier to support than HTTP uploads. HTTP uploads require CGI scripts to handle them, and if configured wrongly can lead to security issues (see the FCC website w.r.t. its comment system)
Nope, HTTP supports several verbs including PUT and POST. PUT doesn't require any scripting and can be configured in most servers to allow uploads based on the current authentication (which can be a client-side SSL certificate, a password, or a couple of other things).
2) FTP supports TLS - it's called FTPS (not to be confused with SFTP: the former is FTP with a TLS session layered on, the latter runs over SSH). Modern FTP clients and servers support a STARTTLS-style command (AUTH TLS, per RFC 4217) to initiate TLS, and they do it before the USER/PASS commands so credentials are encrypted from the get-go. Note that you need to use passive mode while doing this, as most NAT gateways snoop on FTP sessions to set up dynamic mappings, and TLS prevents them from doing so.
Needing to support passive mode isn't too much of a problem, because active mode is pretty broken with a lot of NATs anyway. Unfortunately, the STARTTLS mode is trivial to attack with downgrade attacks and most FTP clients don't complain. I
Re: (Score:2)
HTTP Compared to FTP:
Re: (Score:2)
Actually, no: when somebody explains that there are use cases with different considerations, you can't just disagree and have any chance of being right. Disagreeing just shows you didn't comprehend the words he used. And they were simple words. But you thought you had a magical brain that can see other people's use cases better than they can, even if all you know is that it involved FTP! Durrrrrrrrrrrr
Re: (Score:2)
Re: (Score:2)
SFTP is a file-transfer protocol layered on SSH, with an interface reworked to look like an FTP client's. It really has no relation to FTP.
Re: As someone who has to administer firewalls... (Score:1)
Yeah, that's why the AC asked if SFTP would have these problems. You see, the knowledge that they are different is inherent in the question. If they were not different the answer would automatically be "yes".
Reading comprehension is truly on the decline around here.
Anonymous SSH (Score:2)
To what extent do common clients and servers for SSH, the session protocol underlying SFTP, support sessions that do not require authenticating with a username and password or username and public key?
Re: (Score:2)
Yes, FTP isn't secure by itself, but it's simple.
Go to bed, you're drunk. The only thing "simple" about FTP is the minds that came up with a system that requires multiple ports, with connections established in different directions - the bane of modern NAT'd routing on the internet.
Just because it's plain text and in English doesn't make it "simple".
PASV (Score:2)
a system that requires multiple ports with connections established in different directions
A data connection established with PORT does go in the opposite direction of the control connection. But I thought PASV, which runs both connections in the same direction and cooperates better with NAT, had become more common. The exception is so-called "FXP" transfers from one server to another, where the client opens control connections to two servers and sends PASV to one and PORT to the other in order not to have to bounce a file off a residential last mile.
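The PORT/PASV distinction described above is a one-line toggle in, for example, Python's stdlib FTP client (shown here without actually connecting anywhere):

```python
# Sketch: choosing the data-connection direction in Python's ftplib.
import ftplib

ftp = ftplib.FTP()      # not yet connected; just showing the toggle

ftp.set_pasv(True)      # PASV: the client opens *both* the control and
                        # data connections outbound - NAT/firewall
                        # friendly, and ftplib's default
ftp.set_pasv(False)     # PORT: the server connects *back* to the client
                        # for data - breaks behind most NATs unless the
                        # firewall snoops the control channel
```

The FXP trick in the parent comment is the one case PASV-only clients give up: it needs PORT on one of the two servers so they can talk directly to each other.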
Re: (Score:2)
But I thought PASV, which runs both connections in the same direction and cooperates better with NAT, had become more common.
It has, and brought its own problems with it. But you highlighted my point well. Active, passive, one port, two ports, server-to-server FXP - and all of this is apparently "simple", when in reality it's a damn complicated protocol with a lot of intricacies, extended and hacked about many times over the years.
Re: (Score:2)
There still exists a subset of FTP that's simple to support if the server operator doesn't need FXP functionality, namely PASV-only. Or is FXP too important in practice to consider it expendable?
Re: As someone who has to administer firewalls... (Score:1)
Re: (Score:2)
A million times this.
FTP (and Telnet) are antique protocols that cannot be made adequately secure and have long since had more secure alternatives available.
They just need to go away now.
Re: (Score:3)
If you're giving a file away for free to everyone, how secure do you need the transport protocol to be?
Re: (Score:2)
As secure as possible. You don't want MITM attackers modifying the file, and you don't want your server to be compromised.
Re: (Score:3)
Re: (Score:2)
I generally agree with you. I actually laughed when I read your comment because I can't tell you the number of times people have berated me here for making very similar arguments.
Maximum security is not the right answer for all circumstances. However, as much security as you can reasonably implement is a good idea. Multilayered security is always desirable.
Case in point: in my home network, almost everybody behind my firewall still talks using encrypted channels. Why not, when the machinery supports this wi
MITM can alter the hash (Score:2)
It's a better use of just about anything I can think of to encrypt the file or make a secure hash of it or whatever ONCE, and transmit it in the clear
If you transmit the hash in the clear, a man in the middle can alter the hash in transit.
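The difference is easy to demonstrate: a bare hash authenticates nothing, because a MITM who swaps the file just swaps the hash too, whereas a keyed MAC (a stand-in here for a real offline digital signature) can't be forged without the key. A self-contained sketch with stdlib hashlib/hmac (the key is hypothetical and would never travel in the clear):

```python
import hashlib
import hmac

original = b"installer v1.0"
tampered = b"installer v1.0 + malware"

# A plain hash sent alongside the file is trivially replaced in transit:
mitm_hash = hashlib.sha256(tampered).hexdigest()   # MITM recomputes it

# A keyed MAC over the same bytes can't be recomputed without the key,
# so the swap becomes detectable.
key = b"secret signing key"   # hypothetical; held only by the publisher
good_tag = hmac.new(key, original, hashlib.sha256).hexdigest()
forged = hmac.new(b"guess", tampered, hashlib.sha256).hexdigest()

assert not hmac.compare_digest(good_tag, forged)   # tampering detected
```

The same argument applies to real signatures (GPG-signed checksum files): the hash only helps if it arrives over a channel, or under a key, the attacker doesn't control.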
Re: (Score:2)
There are many use cases where the a security signature or certificate needs to be transmitted once and the data transmitted orders of magnitude more times. In which case, the computational overhead of establishing an encrypted channel needs to be accepted infrequently and for a relatively small payload instead of every time for a large payload.
There are many use cases where the key distribution mechanism isn't over the internet at all.
There are use cases where you really honestly don't car
Re: (Score:2)
There are use cases where you really honestly don't care if the cat video you're getting to kill time is authentic or not.
But you do care whether the operating system UI presented after the conclusion of the cat video is authentic and not a phishing attempt.
Re: (Score:2)
Know your sender (Score:2)
You can just as easily spoof a UI image over an encrypted channel as you can over a cleartext channel.
With an encrypted channel, the browser has a fully qualified domain name with which to associate all images transmitted over that channel. With an unencrypted channel, the browser can't tell what domain has injected the images.
Re: (Score:2)
Re: (Score:2)
The MITM could insert malicious code into files transferred through FTP. Even if the file looks like a video, several video containers such as WMV have contained functionality to download DRM code to obtain a license needed to decrypt the video in the WMV file. This functionality can be and has been used for dropping trojans [stackexchange.com].
Re: (Score:2)
Re: (Score:2)
Or unless it's a video whose publisher doesn't want it sent in the clear to be viewed by other people who haven't paid for it.
Re: (Score:2)
There are many use cases where the a security signature or certificate needs to be transmitted once and the data transmitted orders of magnitude more times. In which case, the computational overhead of establishing an encrypted channel needs to be accepted infrequently and for a relatively small payload instead of every time for a large payload.
A prime example of this is software updates. From Microsoft, from Apple, from whomever. Everyone should be keeping their systems as up to date as possible. That said, the downloads for these updates are huge, so leaving them in the clear allows transparent caching infrastructure to work properly. What about a MITM attack on the binaries? Well, you can resolve that by making the control link from update service to the update client HTTPS, and including sha256 checksums in the system to verify the cleartext d
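The scheme described above is easy to sketch: a small checksum manifest travels over the trusted HTTPS control link, the big payload travels in the clear (so caches work), and the client verifies locally. A minimal self-contained version, with the payload simulated as in-memory bytes rather than an actual download:

```python
# Sketch of cleartext downloads verified against an HTTPS-delivered
# manifest. The payload is simulated; no network access is performed.
import hashlib

def verify(payload: bytes, expected_sha256: str) -> bool:
    """Check a (possibly cache-served, cleartext) payload against the
    checksum obtained over the trusted channel."""
    return hashlib.sha256(payload).hexdigest() == expected_sha256

# Manifest entry as it might arrive over the HTTPS control link:
payload = b"big update blob"
manifest_entry = hashlib.sha256(payload).hexdigest()

assert verify(payload, manifest_entry)             # clean download
assert not verify(payload + b"X", manifest_entry)  # tampered in transit
```

This gives integrity without paying TLS overhead on the bulk transfer - though unlike TLS it does nothing for the privacy of who downloaded what.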
Re: (Score:2)
You're right! The only thing is that the Internet was never designed to be "secure", and likely never will be. FTP works just fine for moving files around. Not everybody needs security on the Internet. If you want to pretend that you can slap some software on top of an inherently insecure network to try to make it "secure", then go right ahead. I'll happily continue
Re: (Score:2)
Ahh, the old "if it can't be perfectly secure, then there's no point" argument! Personally, I'll take "as secure as possible" as preferable over "not secured at all".
Re: (Score:2)
FTP works just fine for moving files around.
But what works fine for ensuring that the file the receiver receives is identical to the file the sender sent?
Re: (Score:2)
...FTP just needs to die. The two port requirement and worse still, people who don't get it still insisiting on 'active' FTP, is a pain in the backside for firewall admins (we had one vendor insist that passive mode was 'insecure' and active mode was somehow 'secure' but after some browbeating and the threat of the wire brush of enlightenment accepted they should use this new fangled "sftp" which didn't have any of the drawbacks of ftp, passive or active).
FTP's day was done over ten years ago.
What? As someone else who administers firewalls, I have to ask: what the hell kind of ancient packet filter are you using?
Any modern stateful firewall for almost the last two decades has been able to inspect unencrypted FTP control channels, and dynamically open the appropriate data channel ports. Both active and passive usually work just fine. As a security person, I hate FTP for its lack of encryption and clear text password transmission, but technically, it's one of the easiest.
SFTP will often end up wit
Wherdid the vaby go? (Score:1)
Re: Wherdid the vaby go? (Score:2)
Re: (Score:1)
On the other hand, whatever you are using to type needs to go.
Re: (Score:2)
I don't think I've ever actually seen FTPS in use.
FTP and checksums or SFTP.
So how about FTPS (Score:5, Interesting)
FTP can be done using TLS and there is also SSH-FTP. FTPS is no more or less secure than HTTPS.
Have you ever downloaded large files over HTTP? It's not built for it; you practically need a download manager, because browsers will just choke or won't be able to continue unfinished downloads. There are hacks that make it work, but many configurations aren't set up to resume partial downloads.
Re: (Score:2)
I've noticed a lot of https or torrent as download options for large files.
Re: (Score:2)
FTP can be done using TLS and there is also SSH-FTP. FTPS is no more or less secure than HTTPS.
FTP over TLS is a pain, because you still end up needing multiple connections and the protocol is a nightmare. It's also implemented in a variety of different and incompatible ways by different people and so there's basically no good way of supporting it. SFTP is an entirely different protocol that shares some initials with FTP but is otherwise unrelated.
Have you ever downloaded large files over HTTP? It's not built for it; you practically need a download manager, because browsers will just choke or won't be able to continue unfinished downloads. There are hacks that make it work, but many configurations aren't set up to resume partial downloads.
I don't think I've downloaded anything over about 50GB via HTTP, but I've had no problems doing so. The protocol supports range requests and both comman
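The range-request resume mechanism being discussed is small enough to sketch. The offset logic below is real HTTP semantics; the actual fetch is omitted, so this only shows the header a resuming client (wget -c, curl -C -) sends:

```python
# Sketch: computing the Range header for resuming an interrupted
# HTTP download from a partially written local file.
import os

def resume_headers(partial_path):
    """Headers asking the server for only the bytes we don't have yet."""
    have = os.path.getsize(partial_path) if os.path.exists(partial_path) else 0
    if have == 0:
        return {}                       # nothing on disk: plain full GET
    return {"Range": f"bytes={have}-"}  # server replies 206 Partial Content
                                        # and we append to the local file
```

One caveat the thread touches on: a server that ignores Range answers 200 with the whole body, so a careful client checks for status 206 before appending rather than blindly concatenating.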
Re: (Score:3)
FTPS is relatively standard; I've never had an issue with it, although you are right that there are various shoddy closed-source FTP servers. The only "problem" is that most use self-signed certificates.
I'm currently trying to download 300GB via HTTP from a server that limits each user to a single connection at 128kbps (yay, academia). I know it supports range requests, but as I said, they are particularly flaky on many non-Apache servers (looking at you, IIS) and many admins misconfigure them, especially on nginx
Re: (Score:2)
Usually "academia" has loads of bandwidth including Internet2 availability. Many of my Linux distro's mirrors are .edu domains. I get a lot more than 128kbps on those downloads (usually more like 5-10 MB/s) despite having no affiliation with the university at all. How are you possibly constrained to such a low speed?
Just a guess, but perhaps the high speeds are available only to people who have an affiliation with some university, even if not the same university. Transfers between one university and the other are fast; transfers to or from the "public" are slow.
Re: (Score:1)
I download large files over HTTP using wget. You can even configure wget to ignore robots.txt.
Wget is a download manager (Score:2)
you practically need a download manager because the browsers will just choke
I download large files over HTTP using wget.
True, GNU Wget offers more robust resume support than a web browser. But it illustrates guruevi's point because it's a download manager, not a browser.
Re: (Score:2)
What would the CLI counterpart to said derpy clients look like? I imagine not unlike Wget.
Re: (Score:2)
The problem with HTTP is that range requests are a bit of a hack: there's a lot of protocol overhead, and they don't work universally. HTTP has no universal session-state mechanism, so when your session expires, often so does your download.
Re: (Score:1)
FTPS is WORSE. The problem with FTP in general is with stateful firewalls and NAT/PAT.
FTP is a ridiculous protocol. For FTP to work at all these days, firewalls actually need to go out of their way to snoop in on the control channel and watch for the data channel IP & port, and then use that to pre-populate the state table with an entry for the data session.
This breaks down on a lot of firewalls if you change the control port, or if you use FTPS, where the control session is encrypted and the firewall can no longer snoop it
Re: (Score:2)
Some (most) people would say NAT is a ridiculous situation. The protocol isn't broken, NAT is.
Re: (Score:2)
FTP can be done using TLS and there is also SSH-FTP. FTPS is no more or less secure than HTTPS.
True, but that doesn't fix the rest of the things that make FTP suck.
Have you ever downloaded large files over HTTP?
Who said to use HTTP? That's not the only (or even a good) alternative.
Re: (Score:2)
Re: (Score:2)
The explicit FTPS method allows a client to revert to an unencrypted control channel after authentication, but it is only at the client's discretion. The FTPS server can't tell the client to issue a CCC (Clear Command Channel) command. So if your site requires it, you have to tell your users to configure their FTPS client that way.
And that's a good thing. Otherwise, neither side can be sure that the file that is received is identical to the file that was sent.
Re: (Score:2)
Re: (Score:3)
Yes, yes it is, it is the same protocol. You obviously have no clue and attempting to argue with a 20+ year Linux/Unix veteran is only going to embarrass you further.
Re: (Score:1)
Google won't hire you if you're that old.
Re: (Score:2)
FTPS =/= SSH-FTP. Either way, even "anonymous" FTP is authenticated, you just authenticate with a bogus username/password.
The answer is (Score:2, Funny)
GOPHER!
Re: (Score:2)
Could be worse (Score:1)
At least it's not providing a false sense of security, unlike several other protocols (those with an 's' on the end) that allow any fraudulent certificate authority to issue certificates for any domain they feel like.
Re: (Score:1)
On the HTTP side, there is no analog to something like ProFTPD, which has all kinds of cool options (upload/download ratios, advanced per-user chroot support, encryption, RADIUS accounting support). You aren't going to find a WebDAV service that supports all that. It's not a valid replacement for FTP.
Bring Back ZMODEM (Score:2, Funny)
Make file transfers great again. Trimp 2018!
Typical (Score:1)
This is typical of Google; always trying to gain CONTROL over our lives. Fortunately, I never use Chrome, and I am to the point where I am thinking of creating my own browser that enables the user to have FULL control.
Imagine...
Being able to block popups on one site but be able to enable them on another.
Blocking some sites but not others.
Enable cookies at one site, but not another.
False seeding cookies
etc. etc.
If somebody else comes up with one of these, and no not just another addon, then I will try it be
Re: (Score:3)
Now that would be a plugin! One where people can choose the (ad) cookies to share with the other users of the plugin, basically rendering any and all data collected absolutely worthless because nobody can ever know anymore who used what ad cookie to visit a page.
Re: (Score:2)
Now that would be a plugin! One where people can choose the (ad) cookies to share with the other users of the plugin, basically rendering any and all data collected absolutely worthless because nobody can ever know anymore who used what ad cookie to visit a page.
Interesting idea. It's not so certain that it would be a clear win for users, though. Inability to target ads means that ads are worth less, all else being equal. This leads advertisers to try to make their ads worth more by making them attention grabbing (bigger, brighter, blinkier, self-playing video, etc.), or to simply pay less for ad space. If they pay less for ad space, then site owners are incented to increase the amount of ad space on their sites, or else to stop depending on advertising revenue and
Re: (Score:2)
The benefit of more obnoxious ads is more people blocking them.
And yes, people start doing that. "Normal" people. Not geeks. The same people that dutifully close 20 popups and error messages every time they launch a browser. The same people that do the same every time they start their machines. The same people that have a 5 by 4 inch browser window because the rest is occupied by "free" search bars.
Can you imagine just how much you have to piss people like THAT off to make them install something? And they d
Re: (Score:2)
Re: (Score:2)
ISPs get money from their customers to connect them. Hosters get money from their customers to host content. What else do you need?
Re: (Score:2)
ISPs get money from their customers to connect them. Hosters get money from their customers to host content. What else do you need?
How do content producers get money to produce content? To take one specific example, who would pay for slashdot's hosting?
Re: (Score:2)
Most likely the users. At least if the content is worth it.
Re: (Score:2)
Most likely the users. At least if the content is worth it.
Which means that 90% of the content on the web will disappear, and the remainder will all be paywalled and inaccessible to many people. There's good reason to dislike advertising, but there are also extremely good reasons that it has been the primary mechanism for funding broadly-distributed content for centuries. No other approach has proven to work remotely as well.
But, actually, none of that will happen. Instead, we'll just have an arms race between adblockers and adblocker-blockers, and the better-fun
Re: (Score:2)
Then the login credentials to said ad networks will be shared. If they are tied to Facebook accounts, fake FB accounts will be generated by the dozen per nanosecond to fill the need. You build the better mouse trap and the better mouse will evolve.
In the end, I can't say that seeing 90% of the gunk that clogs the internet vanish would be a bad thing. I do remember the internet before its commercialization. It was smaller. But the average IQ was at least double its current rate.
Enjoy looking at paywall notices (Score:2)
The biggest benefit I can think of would be for all of the advertisers to join all of the lawyers at the bottom of the ocean.
That won't work in practice for one simple reason. Publishers will sw...
To read the rest of this comment, log in to your comments by tepples account or subscribe to comments by tepples.
Re: (Score:2)
(close window)
(choose next result in google)
(add the page to my "-site:..." search string list)
Another addon I'd like: Something that automatically adds "-site:..." to every google search string to weed out clickbaiters.
Why is it supported at all? (Score:2)
Browsers should have stopped supporting FTP at least 10 years ago. We're never going to force the dinosaurs to upgrade until we stop enabling them! FTP has a proud place in the history of the internet, but its time has long since passed, and it is time to retire the protocol, forever.
Re: (Score:2)
Because it has its uses, especially with capable clients like ncftp. mget * is much easier and more reliable than building up a list of switches for wget.
Ah, publicly secure (Score:2)
One day, someone will explain to me why a completely public piece of information, distributed freely to everyone everywhere, needs to be delivered securely.
In a world where FedEx drops packages at my front door, and leaves them there when I'm away.
In a world where the only thing stopping my speeding car from hitting an on-coming speeding car is a line of yellow paint.
In a world where my front door is locked with a deadbolt, right next to a glass window.
Can anyone say "overboard"? I don't need to encrypt so
Re: (Score:2)
This has been the big thing with security all along. As long as you are fine with HTTP, why wouldn't you be fine with FTP?
Not everything needs to be secure.
Next time I drive by a billboard on the side of the highway it better be encrypted so I can't read the ad without some security certificates on my end!
That said, browsers have always been positively HORRIBLE FTP clients, so if people decide to use FTP clients instead of browsers to use FTP sites, it's not really a huge loss.
Re: (Score:2)
One day, someone will explain to me why a completely public piece of information, distributed freely to everyone everywhere, needs to be delivered securely.
Because, while the information is public, the fact that you requested it should be private. Even seemingly innocuous requests can reveal personal information which can be used against you, regardless of whether you are doing anything "wrong". Limiting how much strangers know about you is a powerful first line of defense: a form of social camouflage.
Because you need to be able to trust that the information is from the expected source and hasn't been tampered with in transit. This is particularly true when th
Re: (Score:2)
Who said anything about unsigned .deb packages? The web page you're reading contains executable code (JavaScript). If you browse the Web with JS enabled, like a normal person, then you need to be careful about provenance of the HTML and JS files which make up the pages. They run in a sandbox, true, but that doesn't prevent all issues (e.g. ad injection, tracking, cross-site attacks) and there is always the possibility of an exploit which can escape the sandbox. As a content provider, if you want people to b
Re: (Score:2)
The way to do this is with digital signatures. You have no idea if the source server has been compromised.
Message authentication codes (HMAC) are part of HTTPS. They're generated automatically on demand, so not quite as good as a manual signature generated offline after careful examination of the content and verified after every download by the end user with a proper web of trust - but still better than nothing and no worse than any other signature generated on the server itself. They at least prove that the content did originate from a system authorized to sign it on behalf of the administrator. Whether or not you c
Re: (Score:2)
Answer: third parties. Do you want anyone who can see the traffic to be able to insert themselves into the traffic flow and alter what you're receiving? Because that's what can happen with unencrypted traffic. And while it might not be easy to do that at the level of say the backbone routers, it's really easy for someone to hack the WiFi router at a coffee shop and install a transparent proxy to hijack downloads and replace them with malware. In fact, if you make regular use of public WiFi access, the hijac
Re: (Score:2)
Obligatory car analogy [xkcd.com].
Since it's Chrome, they need to add: (Score:1)
Re: (Score:3)
Given that FTP's usage is hovering around 0.0026% of top-level navigations over the last month
or... the kind of people who use FTP are also the kind that disable telemetry.
(and... the kind that use sFTP are the kind that don't use a browser.)
Re: (Score:2)
Marking cleartext HTTP as "not secure" is actually the eventual plan, as I understand blog posts by Google [chromium.org], Mozilla [mozilla.org], and DigiCert [digicert.com]. First, documents delivered over HTTP containing a password form were marked not secure. Then documents delivered over HTTP containing any form. Then documents delivered over HTTP containing scripts. And finally, all documents delivered over HTTP other than from localhost.
Re: what about HTTP (Score:1)
It's also ass-backwards. Do I need to watch cat videos over SSL? No. But it insists on SSL and cuts the battery life down by doing so. Not to mention destroys caching and consumes more power on the server too.
Forcing SSL everywhere also screws up the trust model of looking for the SSL indicator, since shitty phishers just register paypal-fraud.cn and give it an SSL cert, or push it through Cloudflare.
What follows the cat video? (Score:2)
Do I need to watch cat videos over SSL?
TLS ensures that you're watching only the cat video, not the cat video followed by an ad inserted by a man in the middle, nor the cat video followed by a full-screen phishing form inserted by a man in the middle [feross.org].
Re: (Score:2)
not the cat video followed by an ad inserted by a man in the middle
And how do you know it's not a cat in the middle?
Re: (Score:2)
Why does Google know the number of visits to FTP servers from Chrome ?
Because Chrome users have opted into sending anonymous coverage data to Google in exchange for Google developers not deprecating and removing features that the users use on grounds that nobody uses them.
Re: (Score:2)
It would seem that the only insecure bit of an FTP download would be some asshole manipulating packets at a router somewhere down the line. That should be illegal anyway; imagine if they did that with TV.
Cable TV operators already replace a small number of commercials per hour per their retransmission contracts with the networks. You can tell this is happening on (say) The Weather Channel because the crawl at the bottom disappears.