
Most Alarming: IETF Draft Proposes "Trusted Proxy" In HTTP/2.0

Lauren Weinstein writes "You'd think that with so many concerns these days about whether the likes of AT&T, Verizon, and other telecom companies can be trusted not to turn our data over to third parties whom we haven't authorized, that a plan to formalize a mechanism for ISP and other 'man-in-the-middle' snooping would be laughed off the Net. But apparently the authors of IETF (Internet Engineering Task Force) Internet-Draft 'Explicit Trusted Proxy in HTTP/2.0' (14 Feb 2014) haven't gotten the message. What they propose for the new HTTP/2.0 protocol is nothing short of officially sanctioned snooping."
This discussion has been archived. No new comments can be posted.

  • by the_B0fh ( 208483 ) on Sunday February 23, 2014 @11:51AM (#46316043) Homepage

    You don't understand how things work, do you? This bypasses your "acceptance" requirement.

    They can just do it transparently.

  • by the_B0fh ( 208483 ) on Sunday February 23, 2014 @11:52AM (#46316047) Homepage

    You have no clue what you are talking about. The "legally required" shit is already being done. There's no need to do any IETF crap.

    This is for ISPs to do it to you, without you being able to prevent it.

  • The current solution (Score:2, Informative)

    by Anonymous Coward on Sunday February 23, 2014 @11:54AM (#46316055)

    If you want to do this now, you're typically in one of two situations:

    You need to proxy the traffic for all users of a company, in order to filter NSFW content and to scan for viruses and other malware. In this case you add your own CA to all company computers. Then you MITM all SSL connections. This doesn't work for certain applications which use built-in lists of acceptable CAs, but mostly the users will be none the wiser.

    The other situation is that you want a reverse proxy in front of your hosting infrastructure. In this case you just have the proxy operator install your certificate and make it look like the proxy is your actual server.

    In both cases, the Trusted Proxy extension would make more transparent what's actually going on, instead of pretending that there is no proxy when in fact there is.
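The pin-list failure mode mentioned above can be sketched in a few lines. This is a toy illustration, not real certificate code; the CA names and trust stores are entirely hypothetical:

```python
# Toy model (hypothetical CA names) of why apps with built-in CA lists
# break behind a corporate MITM proxy while ordinary browsers do not.
SYSTEM_STORE = {"GlobalTrust Root CA", "Corp-IT MITM CA"}  # IT pushed its own CA
PINNED_CAS = {"GlobalTrust Root CA"}                       # app ships a fixed list

def browser_accepts(issuer: str) -> bool:
    # Browsers trust whatever the OS certificate store holds.
    return issuer in SYSTEM_STORE

def pinned_app_accepts(issuer: str) -> bool:
    # Pinning apps ignore the OS store and consult only their own list.
    return issuer in PINNED_CAS

# The proxy re-signs every connection with its own CA:
proxy_issuer = "Corp-IT MITM CA"
print(browser_accepts(proxy_issuer))     # True  - user is none the wiser
print(pinned_app_accepts(proxy_issuer))  # False - connection fails
```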

  • by MobyDisk ( 75490 ) on Sunday February 23, 2014 @11:56AM (#46316069) Homepage

    My employer uses a MITM HTTPS proxy. The IT department pushed down a trusted corporate certificate, and most people don't even know their HTTPS connections aren't secure any more. The real problem is when some application, other than a browser, needs internet access and it fails. This includes things like web installers that download the app during installation, automatic update systems, secure file transfer software, or things that call home to confirm a license key. On occasion a developer curses some installer for not working, then we inspect the install.log file and find something about a certificate failure.

    IT departments forget that HTTPS is used for more than just browsing the web.

  • by Jane Q. Public ( 1010737 ) on Sunday February 23, 2014 @07:48PM (#46319267)

    "someone didn't RTFM!"

    And apparently that someone was not alone.

    Right there on the first page, it also says it calls for a mechanism for the person making the request to provide consent for the "trusted" proxy to, well, be a proxy.

    Granted, there could be problems with people consenting when they shouldn't. There might also be problems with essentially coerced "consent", as in a situation where the proxy is the only avenue for accessing that resource. But those are different problems from someone simply inserting themselves as a man-in-the-middle.

  • by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Sunday February 23, 2014 @08:45PM (#46319599) Homepage Journal

    In the vast majority of cases, when you are using an encrypted connection it is because the information you are exchanging is a private matter between you and the other endpoint.

    Even if the only private piece of information is the session cookie identifying the logged-in user to the site, that's still "a private matter between" the user and the site. Since the Firesheep tech demo [slashdot.org] became public, it has become common for some web sites to go all HTTPS all the time to prevent intruders from snooping and replaying session cookies. Facebook [slashdot.org] and Twitter [slashdot.org] do this, and Wikipedia [wikimedia.org] turned it on at the end of August of last year. The biggest historical obstacle to HTTPS implementation for any site on a VPS or bigger has been mixed content introduced by ad networks, but in September of last year, Google finally enabled HTTPS for AdSense [google.com].
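The usual post-Firesheep fix pairs all-HTTPS with cookie flags that keep the session token off the wire in cleartext. A minimal stdlib sketch, with a placeholder cookie name and value:

```python
from http.cookies import SimpleCookie

# Mark the session cookie Secure (never sent over plain HTTP, so it
# can't be sniffed and replayed) and HttpOnly (hidden from page scripts).
cookie = SimpleCookie()
cookie["session"] = "abc123"          # placeholder value
cookie["session"]["secure"] = True
cookie["session"]["httponly"] = True

header = cookie.output(header="Set-Cookie:")
print(header)  # the header now carries the HttpOnly and Secure flags
```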
