
Net Neutrality Opponent Calls Google a "Bandwidth Hog"

Adrian Lopez writes "According to PC World, an analyst with ties to the telecom industry — in a baseless attack on the concept of Net Neutrality — has accused Google Inc. of being a bandwidth hog. Quoting: '"Internet connections could be more affordable for everyone, if Google paid its fair share of the Internet's cost," wrote Cleland in the report. "It is ironic that Google, the largest user of Internet capacity, pays the least relatively to fund the Internet's cost; it is even more ironic that the company poised to profit more than any other from more broadband deployment expects the American taxpayer to pick up its skyrocketing bandwidth tab."' Google responded on its public policy blog, citing 'significant methodological and factual errors that undermine his report's conclusions.' Ars Technica highlighted some of Cleland's faulty reasoning as well."
  • by senorpoco ( 1396603 ) on Sunday December 07, 2008 @12:46PM (#26021233)
    Loaded fine for me. Here is the post:

    "Response to phone companies' "Google bandwidth" report
    Thursday, December 4, 2008 at 3:28 PM
    Posted by Richard Whitt, Washington Telecom and Media Counsel

    Earlier this week I thought that the announcement of a broadband access "call to action" was an encouraging sign that the phone and cable carriers could set aside their differences with Internet companies and public interest groups over network neutrality, and focus on solving our nation's broadband challenges. Unfortunately, a report issued today suggests that some carriers would still rather point fingers and keep fighting old battles.

    Scott Cleland over at Precursor Blog is, of course, not exactly a neutral analyst. He is paid by the phone and cable companies -- AT&T, Verizon, Time Warner, and others -- to be a full-time Google critic. As a result, most people here in Washington take his commentary with a heavy dose of salt. The report that Mr. Cleland issued today -- alleging that Google is somehow unfairly consuming network bandwidth -- is just the latest in what one blogger called his "payola punditry." Not surprisingly, in his zeal to score points in the net neutrality debate, he made significant methodological and factual errors that undermine his report's conclusions.

    First and foremost, there's a huge difference between your own home broadband connection, and the Internet as a whole. It's the consumers voluntarily choosing to use our applications who are actually using their own broadband bandwidth -- not Google. To say that Google somehow "uses" consumers' home broadband connections shows a fundamental misunderstanding of how the Internet actually works.

    Second, Google already pays billions of dollars for the bandwidth and server capacity necessary to connect our data centers together, and then to carry traffic from those data centers to the Internet backbone. That is the way the Net has always operated: each side pays for their own connection to the Net.

    Third, Mr. Cleland's cost estimates are overblown. For one, his attempt to correlate Google's "market share and traffic" to use of petabytes of bandwidth is misguided. The whole point of a search engine like Google's is to connect a user to some other website as quickly as possible. If Mr. Cleland's definition of "market share" includes all those other sites, and then attributes them to Google's "traffic," that mistake alone would skew the overall numbers by a huge amount. Mr. Cleland's calculations about YouTube's impact are similarly flawed. Here he confuses "market share" with "traffic share." YouTube's share of video traffic is decidedly smaller than its market share. And typical YouTube traffic takes up far less bandwidth than downloading or streaming a movie.

    Finally, the Google search bots that Mr. Cleland claims are driving bandwidth consumption don't even affect consumers' broadband connections at all -- they are searching and indexing only websites.

    We don't fault Mr. Cleland for trying to do his job. But it's unfortunate that the phone and cable companies funding his work would rather launch poorly researched broadsides than help solve consumers' problems."
  • Re:Probably true (Score:1, Informative)

    by Anonymous Coward on Sunday December 07, 2008 @01:04PM (#26021357)

    Learn to use the robots.txt file before moaning, duh! But then, judging from your previous posts, you are full of shit and don't know what you are talking about.
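    As a minimal sketch (the bot name and path here are just placeholders), a robots.txt that keeps one crawler out of one directory looks like:

    User-agent: Googlebot
    Disallow: /cgi-bin/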

  • Re:Charge more? (Score:3, Informative)

    by Anonymous Coward on Sunday December 07, 2008 @01:09PM (#26021411)

    Because they're not in a business relationship with Google. The traffic from Google appears at their network borders as a result of transit contracts with tier-1 carriers, not with Google directly.

    Basically, some providers see themselves in an important enough position to try to negotiate deals which put them higher up in the food chain. Instead of bargaining with their worldwide network backbone connections, these ISPs try to bargain with their end-user reach.

    Network neutrality is a (necessary) kludge, because many home users cannot choose a different provider. If users could always choose another provider, then the market would indeed deal with ISPs which overestimate their importance.

  • by Ceriel Nosforit ( 682174 ) on Sunday December 07, 2008 @01:10PM (#26021427)

    There's a local company offering a 1.5TB external drive when you order a 2mbit or faster internet connection. Since few people are likely to fill the drive up with holiday photos, the use for this combo is obvious.

    ISPs and digital storage manufacturers benefit from online piracy. I'd wager the profits are greater than the loss the content producers face, and are of net benefit to the global economy.

    But, my perspective on the issue is skewed. I've been a pirate since I was 7. :p

  • by Anonymous Coward on Sunday December 07, 2008 @01:14PM (#26021467)

    Nah, the telcos would just pass this cost on to the end consumer and effectively get exactly what they want.

  • by Gothmolly ( 148874 ) on Sunday December 07, 2008 @01:16PM (#26021485)

    The people who go to Google are the hogs. If your pricing model doesn't take your consumers' usage patterns into consideration, then FAIL.

  • Re:Bandwidth hog? (Score:5, Informative)

    by macemoneta ( 154740 ) on Sunday December 07, 2008 @01:35PM (#26021653) Homepage

    That's exactly right. The customers paid for a shared connection. Google (YouTube) paid for a commercial connection. The ISPs are already being paid twice for transporting the same bits.

    Since the customer's connection is shared, there is no service guarantee. If contention is too high, bits get dropped. If too many bits get dropped, and the customer has a choice, they can go to another ISP.

    To summarize, ISPs are currently double-dipping, and they don't like competition. To solve this "problem", they propose triple-billing for transport so they don't have to re-invest as much in infrastructure. The "net neutrality" spin is just an obfuscation of what would otherwise be an obvious abuse of their position.

  • by flycream ( 1381739 ) on Sunday December 07, 2008 @01:43PM (#26021737)
    Crawl-delay directive

    Several major crawlers support a Crawl-delay parameter, set to the number of seconds to wait between successive requests to the same server:

    User-agent: *
    Crawl-delay: 10
  • by darkpixel2k ( 623900 ) on Sunday December 07, 2008 @01:43PM (#26021739)

    I don't see a way to use robots.txt to limit the number of crawler hits per interval, other than just denying the crawler outright. So you can block it, but that's undesirable if you want people to find your site. It's also undesirable to have a robot hit your site every two seconds, if ShieldW0lf is telling the truth, but robots.txt only addresses this with a simplistic allow/disallow.

    I'm not sure if any of the other search providers implement this, but Google does: Sitemaps [sitemaps.org]

    It lets you specify how often certain content changes and which URLs to include. It's a more advanced robots.txt.
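    For instance, a minimal sitemap file (the URL and dates below are made up) declares a location and an expected change frequency:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2008-12-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>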

  • if-modified-since (Score:5, Informative)

    by SgtChaireBourne ( 457691 ) on Sunday December 07, 2008 @01:51PM (#26021825) Homepage

    Crawl-delay directive

    Several major crawlers support a Crawl-delay parameter, set to the number of seconds to wait between successive requests to the same server:

    User-agent: *
    Crawl-delay: 10

    Further, not only do the Google crawlers obey the robots.txt [robotstxt.org] described above (or other standards for robot exclusion), they also use HTTP's if-modified-since [w3.org] to make a conditional request. The file is only returned to the crawler if it has been changed. That saves a lot of time and bandwidth.
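    As a rough sketch (the host and date are made up), the conditional exchange looks something like this; if nothing changed, the server answers with headers only:

    GET /index.html HTTP/1.1
    Host: www.example.com
    If-Modified-Since: Sat, 06 Dec 2008 18:00:00 GMT

    HTTP/1.1 304 Not Modified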

    PC World will also lose out if double-dipping is allowed.

  • Re:Probably true (Score:5, Informative)

    by earlymon ( 1116185 ) on Sunday December 07, 2008 @01:54PM (#26021851) Homepage Journal

    Oh - and here's a big PS: If you feel you're getting too much spider traffic - meaning you're somehow SO wildly popular that you really believe Google is hitting you too often - you can reduce the Google crawl frequency via your Google webmaster account - voila, your (non-existent) problem solved.

    And for those who don't use the service (I do): the Google webmaster features in no way require you to be hosted at Google.

  • Re:Probably true (Score:3, Informative)

    by RobertM1968 ( 951074 ) on Sunday December 07, 2008 @02:22PM (#26022163) Homepage Journal

    Google hits my server regularly - but doesn't use much bandwidth in doing so. But then again, I run Google ads on my sites, so they monitor the content to show more relevant ads. Considering most sites are 80% graphics and 20% HTML/CSS/JavaScript, these requests are no big deal.

    When it comes to them indexing the site for their search engine, a simple directive in the robots.txt file telling them how frequently you wish them to stop by is all that is needed - and it is spelled out in numerous places on the Internet (including, of course, on their own pages). Any webmaster who is not aware of that just doesn't know what they are doing - especially since Yahoo's bot is at least 20 times worse per my server records for www.startreknewvoyages.com, where there would be 10-15 GoogleBots and 200-300 YahooBots. Both Google and Yahoo honor it (the "how many times in x minutes to visit" flag in robots.txt). The only reason I put it in was for Yahoo, followed somewhere in between by Microsoft, with Google in the least invasive position at a fraction of the number of simultaneous bots.

    I don't care how many pages they index, but Google's bots at least seem a lot smarter. Often I would have 10 or more Yahoo bots reading the exact same page.

    Their overall traffic use (all combined) was nothing compared to normal site traffic from the same number of "requesters".

  • by SuperQ ( 431 ) * on Sunday December 07, 2008 @02:35PM (#26022313) Homepage

    There are a lot of things you can do. If Googlebot is using too much bandwidth, you could easily add an outbound QoS limit to your webservers (see man tc).

    http://www.google.com/search?q=googlebot+IP+range [google.com]
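    As a rough sketch (the interface, rates, and the Googlebot address range below are only examples; look the current range up via the search above), something like this would throttle traffic headed back to the crawler:

    # root HTB qdisc; unclassified traffic falls into class 1:20
    tc qdisc add dev eth0 root handle 1: htb default 20
    tc class add dev eth0 parent 1: classid 1:1 htb rate 100mbit
    # slow lane for crawler traffic
    tc class add dev eth0 parent 1:1 classid 1:10 htb rate 1mbit ceil 2mbit
    # normal lane for everything else
    tc class add dev eth0 parent 1:1 classid 1:20 htb rate 100mbit
    # send packets destined for the (example) Googlebot range into the slow lane
    tc filter add dev eth0 parent 1: protocol ip prio 1 u32 match ip dst 66.249.64.0/19 flowid 1:10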

    If you're unable to do this, there are the Google webmaster tools, which let you manage the crawl rate.

    http://www.google.com/support/webmasters [google.com]

  • Re:Charge more? (Score:3, Informative)

    by MooUK ( 905450 ) on Sunday December 07, 2008 @02:40PM (#26022377)

    If the government had just stayed out of it, there wouldn't be a problem.

    Alternatively, if the companies had been less greedy and, y'know, invested some of their huge profits back into infrastructure...

  • by HexOxide ( 1375611 ) on Sunday December 07, 2008 @02:42PM (#26022407) Journal
    If they think Google is being unfair, they should block it outright. Why haven't they done this? Because they know that 1) they're in the wrong, and 2) they would lose just about all of their customers. You know there's more to it when the simple and obvious solution is not employed and is then forgotten about. Google is what the consumers want; the consumers have paid for their internet connection, as has Google. End of story? Hah, I wish.
  • Re:Probably true (Score:3, Informative)

    by Wovel ( 964431 ) on Sunday December 07, 2008 @02:51PM (#26022489) Homepage
    Actually, if his first point is untrue, there is no reason for the site to exist. Anything that generates more traffic to spiders than to users has no point in existing.
  • Actually, you're wrong.

    From a business standpoint, you'd be right, except that Google was designed to be a search engine, not a way to sell advertising, so the GP is correct.

    Google is designed to be an excellent search engine with minimal interference that very quickly leads consumers to the sites they were searching for. It also sells advertising within that limitation.

    As proof that you're wrong: Google doesn't carry the high-profit pop-up or pop-under ads, Flash-based ads, or image ads on its own search engine, even though it offers them through AdSense. It doesn't offer them there because they'd get in the way of the primary design functionality of the Google website.

  • Re:Fair Share (Score:5, Informative)

    by adolf ( 21054 ) <flodadolf@gmail.com> on Sunday December 07, 2008 @03:48PM (#26023071) Journal

    Cable TV already does this -- they want to be paid for access to their tubes. We, as Time Warner Cable customers, recently lost our ability to watch the local Fox affiliate for a few weeks.

    Why?

    The cable company wanted to be paid to carry Fox, while Fox wanted to be paid to be carried on cable. This went on and on, with various hateful ads about Time Warner appearing on Fox prior to the blackout. And then, one day, it was dark.

    Eventually, they figured it out. I'm not sure who is paying whom, or if they just went back to the age-old arrangement wherein no money changes hands. But it's back, for now.

    It doesn't really matter to me, in this instance. All I watch on Fox is House, and it's easy enough to snag episodes from TPB.

    But if I sed s/Cable/AT&T/ and also sed s/Fox/Google/, it'd be a very sorry state of affairs.

  • by rossifer ( 581396 ) on Sunday December 07, 2008 @04:26PM (#26023473) Journal

    That would make the user experience worse for those users.

    Based on that fact and everything I know about Google, that type of change: Will. Not. Happen.

    (disclosure: I work for Google)

  • by N7DR ( 536428 ) on Sunday December 07, 2008 @04:52PM (#26023749) Homepage
    What this is really about is whether the ISPs still have common carrier status

    In the US, ISPs do not have, and never have had, "common carrier status".

  • Re:Probably true (Score:5, Informative)

    by jasen666 ( 88727 ) on Sunday December 07, 2008 @05:41PM (#26024193)

    Exactly how I see it.
    I paid my ISP their asking price for my bandwidth.
    Google paid their ISP for their bandwidth.
    Why the hell would Google have to pay my ISP a second time for my bandwidth?
    I see it as nothing more than greed.

  • by spazdor ( 902907 ) on Sunday December 07, 2008 @09:30PM (#26026205)

    No, but the telcos have, and some of them turned into ISPs too.

  • BBC (Score:2, Informative)

    by DriveMelter ( 1345271 ) on Monday December 08, 2008 @05:55AM (#26030231) Homepage
    I believe a similar argument was used against the BBC when it first brought out iPlayer; the big difference, however, was its use of a peer-to-peer arrangement.
  • by smoker2 ( 750216 ) on Monday December 08, 2008 @06:26AM (#26030387) Homepage Journal
    None of the URLs you just posted are the same. Are you complaining that Google indexes your site? To do that, they have to visit the various separate pages. Maybe if you didn't run everything through a couple of PHP scripts, they wouldn't put so much load on your server.
    Yahoo has been guilty of a large amount of spidering on my sites, but Google comes by only once every week or so. But then, I don't use PHP so much. If you don't like what Google is doing, script a robots.txt file that changes according to the day/date/whatever. Crontab might be your friend.
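    For instance (the paths and times here are made up), a crontab could swap in a stricter robots.txt during busy hours:

    # serve a restrictive robots.txt from 08:00, a permissive one from 22:00
    0 8 * * *  cp /var/www/robots-strict.txt /var/www/html/robots.txt
    0 22 * * * cp /var/www/robots-open.txt /var/www/html/robots.txt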
