ISPs Experimenting With New P2P Controls 173

alphadogg points us to a NetworkWorld story about the search by ISPs for new ways to combat the web traffic issues caused by P2P applications. Among the typical suggestions of bandwidth caps and usage-based pricing, telecom panelists at a recent conference also discussed localized "cache servers," which would hold recent (legal) P2P content in order to keep clients from reaching halfway around the world for parts of a file. "ISPs' methods for managing P2P traffic have come under intense scrutiny in recent months after the Associated Press reported last year that Comcast was actively interfering with P2P users' ability to upload files by sending TCP RST packets that informed them that their connection would have to be reset. While speakers rejected that Comcast method, some said it was time to follow the lead of Comcast and begin implementing caps for individual users who are consuming disproportionately high amounts of bandwidth."
  • less peering (Score:4, Interesting)

    by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Friday June 20, 2008 @07:10PM (#23880447) Homepage
    Give increased speeds when traffic doesn't leave the network. Downloads will complete faster, so less peering will be done.
  • by Odder ( 1288958 ) on Friday June 20, 2008 @07:17PM (#23880527)

    Here's how media companies will kill the free internet we all know and love:

    "Legitimate" media caches and disruption of all other P2P traffic only makes step one worse. They will continue to slow the rest to lower than their heavily filtered networks can deliver. The result will look like broadcast media does today, one big corporate billboard, instead of a free press. Part of censorship is shouting louder than others.

    Yeah, I've said this before [slashdot.org]. As long as ISPs have the same story, so will I.

  • by MrKaos ( 858439 ) on Friday June 20, 2008 @07:21PM (#23880583) Journal
    My ISP very cleverly tells me I can download 12GB per month, which is true. What they don't tell me is that anything I upload while I'm peering also counts toward that 12GB total.

  • by Anonymous Coward on Friday June 20, 2008 @07:23PM (#23880601)

    I wonder what they'll do about encrypted traffic.
    Not cache it, obviously. Unencrypted traffic will therefore be faster, so people will use less encryption. Was that a trick question?
  • by Vectronic ( 1221470 ) on Friday June 20, 2008 @07:31PM (#23880667)

    Curious, how do you know you have downloaded (and/or uploaded) 12GB?

    I mean, I doubt you grab the calculator every time you download a file or a webpage finishes loading... They could even be inserting corrupt packets and counting those in the 12GB total. And what about ICMP, ping, DNS lookups? Surely that's included as well, which probably adds at least tens, if not hundreds, of MB on top of the 12GB...

    "No no, see this graph? Says right there it was 12GB."

    I've always gone for the DL/UL speed-limited ISPs rather than the quota-capped ones, because then, as slow as it may (or may not) be, I know I'm getting what I can get in a given amount of time, including overhead and corruption.

  • by NoobixCube ( 1133473 ) on Friday June 20, 2008 @07:31PM (#23880669) Journal
    Having a local cache server, while it does spark privacy concerns, is probably the best solution they've come up with yet. ISPs won't have to spend a great deal of money upgrading infrastructure, and users don't get shafted by reset packets. It's something of a compromise between doing it the right way (upgrading everything) and the wrong way (strangling the users).
  • by EWAdams ( 953502 ) on Friday June 20, 2008 @07:35PM (#23880697) Homepage


    I'm told I get 10Mbps. As far as I'm concerned, that means 10Mbps 24 hours a day, 7 days a week, for as long as I pay my bill. Any effort to throttle that back and I sue for false advertising.

  • by drDugan ( 219551 ) on Friday June 20, 2008 @07:40PM (#23880755) Homepage

    P2P shifts the cost of distribution away from central servers and spreads the load among the downloaders. This is *helpful*, and it is more equitable given that the marginal cost of copying data is near zero, pushing the price of downloaded content lower and lower.

    The pricing seems like such a non-issue. The elephant in the room is that companies like Comcast are making a killing, taking a ton of money selling services that largely go unused. Many service businesses oversell their capacity to keep utilization high, but broadband has taken it to an absolute extreme.

    The obvious and easy solution is for providers of cable and DSL services to price their offerings according to usage, and when it comes to bandwidth, the accurate way to do that is 95th-percentile billing: use a ton of bandwidth and you get charged more. They don't really want to do this, though; they make a lot more money buying bandwidth in bulk and selling small access packages at much higher rates than the bandwidth actually used.

    One huge upside of changing home Internet pricing to 95th-percentile billing is that you don't have to meter and cap bandwidth to homes. People could get an *extremely* fast connection, but if they utilize it fully 24/7, they get billed at a higher rate. This is not that complex a concept to implement technically.
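
    To make the idea concrete, here's a rough sketch of how 95th-percentile billing is usually computed (my own illustration with made-up sample numbers, not any ISP's actual billing code): usage is sampled periodically, commonly every 5 minutes, the top 5% of samples are thrown away, and the customer is billed on the highest sample that remains.

        import math

        def ninety_fifth_percentile(samples_mbps):
            """Billable rate: drop the top 5% of samples, bill on the highest one left."""
            ordered = sorted(samples_mbps)
            cutoff = math.ceil(len(ordered) * 0.95)   # index of the 95th-percentile sample
            return ordered[cutoff - 1]

        # A 30-day month of 5-minute samples (8640 total): mostly idle at 2 Mbps,
        # with bursts of heavy P2P use at 95 Mbps.
        samples = [2.0] * 8240 + [95.0] * 400         # bursts are ~4.6% of samples
        print(ninety_fifth_percentile(samples))       # under 5% of the month, so billed at 2.0
        samples = [2.0] * 8000 + [95.0] * 640         # bursts are ~7.4% of samples
        print(ninety_fifth_percentile(samples))       # over 5%, so the bursts count: 95.0

    The point being: a bursty P2P user pays nothing extra unless the bursts actually dominate the month.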

  • by Nom du Keyboard ( 633989 ) on Friday June 20, 2008 @07:41PM (#23880763)
    You should be allowed to use the bandwidth you paid for as you please. It's not your ISP's business what you decide to do with what they sold you. Whether it's downloading via BT, or watching video on Hulu, no one else should be trying to decide which are Good Bits and which are Bad Bits.
  • "Legitimate" media caches and disruption of all other P2P traffic only makes step one worse.
    It's a tiered internet in disguise, one step at a time. Not only that, it's a double-edged sword... I download OpenOffice via P2P, but in reality I assume the "legitimate" cache would be so underutilized that they would take the numbers to Congress as some measure of "proof" to pass anti-P2P legislation.
  • Re:Legal content? (Score:5, Interesting)

    by Opportunist ( 166417 ) on Friday June 20, 2008 @07:54PM (#23880863)

    No, but it will be a good "proof" for the argument against P2P. Akin to "See? We have caches with all the legal P2P content and yet no decline in P2P traffic. So it's proven that P2P is mainly used for illegal means".

    Yes, I know it's no proof. Tell your congressman, not me.

  • by Anonymous Coward on Friday June 20, 2008 @08:07PM (#23880947)

    This just gave me an idea. Why not have the next generation of P2P protocols include ad space in the client? The catch is, the ads are pushed from other P2P clients. The receiver would then display ads from the top 3 seeders (measured by bitrate or bytes sent, or whatever makes sense). Then all of a sudden, we have an incentive for these ISPs to seed.
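
    A toy sketch of that selection rule (my own illustration; the Peer record and its fields are assumptions, not any existing client's API): rank the peers you downloaded from by bytes contributed and show the ads of the top three.

        from dataclasses import dataclass

        @dataclass
        class Peer:
            peer_id: str
            bytes_received_from: int   # how much of the file this peer actually sent us
            ad_payload: str            # the ad the peer pushed alongside its data

        def ads_to_display(peers, top_n=3):
            # Reward the heaviest contributors: sort by bytes received, keep the top N.
            best = sorted(peers, key=lambda p: p.bytes_received_from, reverse=True)[:top_n]
            return [p.ad_payload for p in best]

        # An ISP-run cache that seeds the most data would win the ad slot.
        peers = [Peer("isp-cache", 900_000_000, "Brought to you by your ISP"),
                 Peer("home-user", 40_000_000, "ad B"),
                 Peer("uni-mirror", 200_000_000, "ad C")]
        print(ads_to_display(peers))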

  • by Triv ( 181010 ) on Friday June 20, 2008 @08:10PM (#23880969) Journal

    I'd probably choose an ISP that carries the latest kernel downloads locally...


    hahahahaha. You think the ISPs are going to start caching the Linux kernel? Where's the money in that? Now, if you want the latest Britney Spears video (kickbacks for promotion from the RIAA) or movie trailers (ditto from the MPAA) or game demos, you're set.

    You gotta understand: to the content distribution companies, "legal P2P" = "free shit that we'll give you in the hope that you'll spend money later". Linux absolutely isn't on that list.

  • by Sponge Bath ( 413667 ) on Friday June 20, 2008 @08:17PM (#23881021)

    A more realistic car analogy:
    It's a bit like an average person having a fast car.
    They drive it to work, school, shopping, and entertainment.
    Most of the time it is unused, but when they are using it the extra speed is useful.

  • by John Hasler ( 414242 ) on Friday June 20, 2008 @08:26PM (#23881087) Homepage

    But they have to pay by the gallon for the gasoline they use.

  • Re:alt.binaries (Score:1, Interesting)

    by Core-Dump ( 148342 ) on Friday June 20, 2008 @08:50PM (#23881289)

    Thank God not here (Netherlands).
    Here it is LEGAL by law to download music or movies for your own use, and binaries are just the perfect way of getting them... as long as it lasts.
    But then again, seeding torrents is illegal here, yet the Dutch MPAA/RIAA equivalents aren't able to sue private persons because of the privacy laws here.

  • The plan to avoid it (Score:1, Interesting)

    by Anonymous Coward on Friday June 20, 2008 @11:07PM (#23881955)

    The truth is, content blocking (which is what this really is, none of that 'filtering' crap) is yet another hurdle to overcome. Things like encrypted BitTorrent and Distributed Hash Tables (DHT, i.e. decentralized BitTorrent) are only the first step.

    Without a legal or free-market solution (since most ISPs are geographical monopolies or duopolies), a technological solution must be developed. These content-blocking appliances are pervasive in the ISP's network and can pretty much masquerade as the intended endpoints for most traffic. The most effective solution, it seems, is to blind these appliances.

    It makes the most sense for these boxes to take a multi-pronged approach:

    1. Protocol recognition (combated by encrypted BitTorrent)
    2. Tracker snooping (still an open hole, AFAIK)
    3. Traffic/connection heuristics

    If a P2P application is moved to a 'plane' above the usual network protocols and conventions, with encryption every step of the way, P2P communications will appear as noise or an unrecognized protocol, or not even as a single application at all.

    Imagine this: a .torrent file that contains a URL to an authentication server. The authentication server is an HTTPS server, with its certificate published right alongside the .torrent on thepiratebay.org or a similar site. After verifying the certificate, the BitTorrent client (Azureus, uTorrent, whatever) contacts the authentication server, which presents a Turing test (CAPTCHA). After the user selects all the kittens in the picture, the authentication server returns the tracker URL and the public key used for communication. The tracker then behaves like a normal BitTorrent tracker.

    This way of joining the swarm completely prevents packet sniffing by using encryption every step of the way, and it also stops the content-blocking appliance from masquerading as a BitTorrent client, thanks to the CAPTCHA. Without knowing the members of the P2P swarm, the appliance cannot block the connections made to and from those hosts.
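
    As a rough sketch of that join flow (all hostnames, paths, and response fields here are my own assumptions; nothing like this exists in current clients):

        import ssl
        import requests   # third-party HTTP library

        AUTH_HOST = "auth.example.org"           # hypothetical field read from the .torrent
        PUBLISHED_CERT = "published_cert.pem"    # certificate distributed alongside the .torrent

        def join_swarm():
            # 1. Check that the live server presents exactly the published certificate,
            #    so a middlebox cannot impersonate the authentication server.
            live_cert = ssl.get_server_certificate((AUTH_HOST, 443))
            with open(PUBLISHED_CERT) as f:
                published = f.read()
            if live_cert.strip() != published.strip():
                raise RuntimeError("certificate mismatch - possible appliance in the middle")

            # 2. Fetch the CAPTCHA over HTTPS, pinned to the published certificate.
            session = requests.Session()
            session.verify = PUBLISHED_CERT
            challenge = session.get(f"https://{AUTH_HOST}/challenge").json()

            # 3. A human answers; an automated blocking appliance cannot.
            answer = input(f"Solve CAPTCHA {challenge['id']}: ")

            # 4. The server hands back the real tracker URL and the swarm's public key.
            reply = session.post(f"https://{AUTH_HOST}/tracker",
                                 json={"id": challenge["id"], "answer": answer}).json()
            return reply["tracker_url"], reply["public_key"]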

    As soon as the client becomes part of the swarm, it must also take steps to avoid heuristics-based detection. It must not listen for connections on a single port. In fact, every server (I'm talking about TCP clients and servers now) must accept only a small number of connections, and new server ports must be opened to accept new clients. After a short time, those connections must be severed and new ones opened, effectively migrating the data streams to completely new client and server ports and giving the appearance of multiple different TCP/IP applications being used at different times. The traffic might even be hidden inside a known protocol, such as HTTP, to get past appliances that throttle all unrecognized protocols. (BitTorrent over HTTP, now that's funny.)
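
    A minimal sketch of that port-rotation idea (again my own illustration; the per-port cap and port range are arbitrary assumptions):

        import random
        import socket

        MAX_PEERS_PER_PORT = 4   # assumed small cap before the listener migrates

        def handle_peer(conn):
            # Placeholder for the real (encrypted) peer protocol.
            conn.close()

        def serve_rotating():
            while True:
                port = random.randint(20000, 60000)            # fresh ephemeral port each round
                srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
                try:
                    srv.bind(("", port))
                except OSError:
                    srv.close()
                    continue                                    # port already in use, try another
                srv.listen()
                for _ in range(MAX_PEERS_PER_PORT):
                    conn, _addr = srv.accept()
                    handle_peer(conn)
                srv.close()                                     # sever and move to a new port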

    So far, with this kind of detection avoidance, the only flaw is a spike in the user's bandwidth usage, which may not really be a big deal.

    The beauty of this plan is that it requires little modification to current BitTorrent clients and trackers, as opposed to what's involved in a completely new protocol.

  • by Jah-Wren Ryel ( 80510 ) on Friday June 20, 2008 @11:13PM (#23881967)

    Yet, obviously these caches will have to be legal content, which means filtering out illegal content,
    I don't know if you can make that assumption. We have a mechanism in place by which an ISP is essentially given immunity for hosting 'illegal content': the much-maligned DMCA notice. As long as they respond to DMCA notices, they have very little legal liability.


    It seems plausible, at least, that an ISP could deploy a 'torrent sniffer' that automatically joined the swarms of any torrents the ISP's users were in and then served only local users from its cache. It might even be possible to spoof the tracker so that the ISP could redirect all requests for cached content to itself rather than out over expensive, bottlenecked peering connections.
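
    As a rough sketch of that 'blind-eye' cache (entirely my own illustration; the cache-node methods and the join threshold are hypothetical, not any real product's API):

        from collections import Counter

        announce_counts = Counter()   # infohash -> announces seen from this ISP's subscribers
        JOIN_THRESHOLD = 3            # assumed: cache once a few local users want the same torrent

        def on_subscriber_announce(infohash, cache_node):
            announce_counts[infohash] += 1
            # Join the swarm once enough local interest exists to justify caching it.
            if announce_counts[infohash] == JOIN_THRESHOLD and not cache_node.is_seeding(infohash):
                cache_node.join_swarm(infohash)      # hypothetical cache-node API

        def on_dmca_notice(infohash, cache_node):
            # Honoring a takedown just means dropping that one cache entry.
            cache_node.drop(infohash)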

    So every once in a while they have to respond to a DMCA notice and kill a cache entry. It's not the end of the world; eventually someone else will come along and start a new torrent for the same content anyway, and the game begins again.

    Unfortunately, I think the only reason ISPs are not more interested in something like that, which would deliberately follow the letter of the law, is that they want to make nice-nice with the MAFIAA so they can resell MAFIAA content directly to their own subscribers. If ISPs would stick to being INTERNET service providers and stop trying to diversify into being CONTENT providers, I think we would already see such automated 'blind-eye' caching mechanisms in place.

  • Answer me one question before applauding the idea: How are they going to discriminate between legal and illegal content without looking at what you're downloading?

    They can't. Even if they know for sure what you are downloading, they have no way of knowing whether or not you have the permission of the copyright owner to download it. They are saying "legal" to avoid a pre-emptive attack by the RIAA. When the cache is installed, it will turn out that it doesn't discriminate, and they hope the RIAA won't be able to persuade Congress to declare it illegal.

  • by Maxo-Texas ( 864189 ) on Saturday June 21, 2008 @12:54AM (#23882411)

    The hole in this is the huge Microsoft patches and downloads (though the largest I ever got was double-digit megabytes, never gigabytes).

  • by Layth ( 1090489 ) on Saturday June 21, 2008 @01:02AM (#23882461)

    These issues are complex, but going by the article summary I'm not sure we're all on the same page.

    It sounded (to me) like they're looking for ways to keep carrying the traffic, while alleviating some of its cost by using caches. Just because you pledge to give certain levels of users access doesn't mean you have to provide that functionality in the MOST expensive way possible.

    If they want to brainstorm ways to improve the means, I say have at it.
    Also, I see nothing wrong with having certain users pay a higher fee for using a higher percentage of the system. It doesn't make any sense for somebody who likes to browse HTML at broadband speeds to be placed in the same category as a different person who likes to download 2-3 new DVD-quality movies a day.

  • by Anonymous Coward on Saturday June 21, 2008 @07:56AM (#23883895)

    I'm quite happy with my unlimited service. After taking one ISP's (almost) top 8Mb package and quickly (within days) getting cut off for overuse, I switched to a UK ISP (an Entanet reseller) that offers a truly unlimited connection at 2Mb, albeit for a *little* more than usual. I know they mean it, because their other packages list transfer limits like 320GB per month off-peak, and this one simply says n/a in those columns. Just in case, I saved a copy of the package comparison page.

    You CAN still get a decent product, if you don't accept the B.S. products and keep looking.

"I've seen it. It's rubbish." -- Marvin the Paranoid Android

Working...