The Internet | Communications

Weighing the Internet (144 comments)

the-dark-kangaroo writes "Jason Striegel has taken physics to a new dimension by 'Weighing the Internet.' Well, actually, by calculating the total number of users online in one day. The conclusion reached was that roughly 519 million users are online per day. Also, 'From what we calculated, it would appear that roughly 41 percent of internet users did not log in that day.'"
This discussion has been archived. No new comments can be posted.

  • what? (Score:2, Insightful)

    by Anonymous Coward
    seriously.

    what?
  • by NotBorg ( 829820 ) * on Thursday July 14, 2005 @08:24PM (#13069031)
    that their penis could be HUGE!
  • hmmm... (Score:4, Funny)

    by xor.pt ( 882444 ) on Thursday July 14, 2005 @08:26PM (#13069042)
    It's probably overweight too.
  • by SamAdam3d ( 818241 ) on Thursday July 14, 2005 @08:26PM (#13069045)
    What if I am on two computers at the same time? Isn't everyone? No?
    • What about Network Address Translation (NAT)? You can have a LAN in your house with like 20 computers connected to a broadband router, and if you're using NAT, to the outside world, your LAN is only 1 collective node/IP address on the internet.
    • If you try that, you'll get into trouble with Heisenberg.

      Unless you have infinite energy (and who does nowadays?)

      I'm certain of that!

  • Technique (Score:5, Funny)

    by Azadre ( 632442 ) on Thursday July 14, 2005 @08:26PM (#13069049)
    What algorithm did they use? The one involving magic?
  • Weight: (Score:5, Funny)

    by wot.narg ( 829093 ) <wot DOT narg AT gmail DOT com> on Thursday July 14, 2005 @08:27PM (#13069053) Homepage
    The answer is 42 (metric gigatons total).

    Breakdown of Internet Weight:

    10 gigatons of Flames.
    20 gigatons of Spam.
    10 gigatons of e-dicks.
    2 gigatons of information.
    • I think you have too much information. Let's add another category:

      News/Reports - looks like information, but it really isn't.

      That should bring information to a more logical 45 kilos.
    • you missed the pr0n.
    • Wait, you're missing one:

      10 gigatons of Flames.
      20 gigatons of Spam.
      10 gigatons of e-dicks.
      2 gigatons of information.
      2,473 gigatons of silicone.
  • by XFilesFMDS1013 ( 830724 ) on Thursday July 14, 2005 @08:30PM (#13069080)
    Jason Striegel continued by saying that "we didn't count anyone from Slashdot, because, let's face it, sitting in front of your computer all day eating Doritos tends to skew the results".
  • So 519 million users, each getting 10 spam messages a day... that's over 5 billion spam messages at the very least. 519 million users, and 519 billion pornographic web sites.
  • by Doc Ruby ( 173196 ) on Thursday July 14, 2005 @08:33PM (#13069098) Homepage Journal
    Accelerating electrons through wires makes them weigh more. And pushing photons through fibers makes them weigh more. I wonder how much extra weight the Internet accounts for? If we're going to count the users as the Internet's weight, we should also be asking "how pasty is the Internet?"
  • by istartedi ( 132515 ) on Thursday July 14, 2005 @08:35PM (#13069109) Journal

    The Internet and the computer won't really be finished until "booting up and logging in" is replaced with "turning it on and instantly getting what you want". We had nearly instant boots with 8-bit micros and ROMs. We gave 'em up for the flexibility of putting the OS on the hard disk. There was no need to log in when the thing wasn't networked. Alas, security concerns gave rise to the login; but we don't log in to our telephones, we just dial. There is no way to bring down the whole phone network just by dialing the wrong number or saying the wrong thing into it. So there is hope that one day the whole "boot up and login" hack that we're using can be eliminated. Then this whole "computer and the internet" project will be done. Of course, it was a government project, wasn't it? Maybe that's why it's taking so long to finish.

    • 2600 Hz (Score:1, Informative)

      by Anonymous Coward
      That was the "operator" login for telephones.
    • I think thirty seconds is a fine boot time.

      With landline phones, the "computer" is always-on, that is, the switch at the CO.

      If one just wanted to do email or UNIX command line stuff, then it would be trivial, assuming you are using a terminal and the computer at the other end is always on. Even then, the terminals of the old days still took maybe thirty seconds to warm up.

      I would say maybe you could do sleep mode? My computers wake up in a second or two, quicker than any of my monitors can start showin
    • Linux is already capable of booting extremely fast, but it's the distro guys that are lagging on making it happen. Basically, a large part of the boot time is starting a bunch of services sequentially. However, if you have proper service dependency information (like LSB-based distros should all have, and Gentoo has for sure), instead of just boot order numbers (/etc/rc2.d/SNNsomeservice), you can parallelize a lot of the boot process. Add to that the fact that, except for kernel upgrades, you don't really need to reboot Linux anyways. (A sketch of dependency-based parallel startup follows this sub-thread.)
      • you don't really need to reboot linux anyways

        It grates on me a bit whenever I hear that. Too many *NIX people are locked in to the "rackspace" mentality where shutting down is only done for maintenance. Most of us work with desktops, and although power consumption during hibernate or standby modes is not nearly as bad as just letting it sit there, it's still a hack and not a truly fast boot. I'll grant though, that it's a time-honored hack:

        When I was a kid TV tubes took a long time to warm up. Sol

        • Keeping the tube filaments partially heated was as much for extending tube life as it was to speed up turn on time.
        • "'you don't really need to reboot linux anyways '

          It grates on me a bit whenever I hear that. "


          Yeah, but apparently you're not catching the point of the grandparent post. You only read the part you quoted. Suspending to disk, done right, can allow you to entirely power the thing down. I've tested it with my Windows laptop even to the point of removing the battery, and then restoring. The idea is to take the current state of memory (which is volatile, lost on power down) write it to disk, and then, on
      • However, if you have proper service dependency information (like LSB-based distros should all have, and Gentoo has for sure), instead of just boot order numbers (/etc/rc2.d/SNNsomeservice), you can parallelize a lot of the boot process.

        Please explain to me how you speed up the boot process by parallelizing something on a one cpu - one system disk machine.
        • Please explain to me how you speed up the boot process by parallelizing something on a one cpu - one system disk machine.

          What do you think the CPU is doing while a single task is waiting on the hard disk?

          • Boot-up processing is most often almost completely IO, which is why I said "one system disk" too. The gain from parallelizing is often negligibly small. I think a far better approach would be to bypass the current boot-up script architecture completely and load only the bare minimum of services to get the desktop up, then load the rest once that has been accomplished.
        • Not until you take a basic course in computer I/O.
          • Not until you take a basic course in computer I/O.

            Surprise, surprise. I have. And I've written kernel patches. And I've done my own Linux system from source (no, not easy-peasy Gentoo). And I've written my own OS, which admittedly did little more than Linus' first teletype program.

            But as I said in reply to some other comment, the gains from parallelizing the current script architecture aren't very great. The amount of CPU work most boot-up scripts do is insignificant compared to the amount of IO they do. So as long as you
        • ### Please explain to me how you speed up the boot process by parallelizing something on a one cpu - one system disk machine.

          $ cd /etc/init.d/
          $ grep sleep * |wc -l
          49

          Do I need to say more?
          • Hehe. Good point. But I think you'd gain more from optimizing the scripts than from just parallelizing them.

            As I've said before, aside from those sleeps the slowness comes from IO. Any CPU gain you get by parallelizing will be lost because you now need another bit of code to check for finished processes and keep track of dependencies.
          • Addendum: I haven't run Linux since 2001 and was convinced that by now someone had rewritten those wretched scripts to be a bit more streamlined. Obviously I was wrong. After having looked at their current state, yes they could probably gain a bit from parallelization but I think that's the wrong turn to take since they would gain far more from some simple optimization.
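
    A minimal sketch of the dependency-based parallel startup idea from this sub-thread, in Python rather than init scripts. The service names, dependencies, and timings are made up for illustration; the point is only that each service waits on exactly the services it needs, so independent services start concurrently.

      # Sketch: start services in parallel once their dependencies have finished.
      # The service table and the 0.1s "work" are stand-ins, not real init scripts.
      import threading, time

      DEPS = {                      # service -> services it must wait for
          "syslog": [],
          "network": [],
          "cron": ["syslog"],
          "sshd": ["network", "syslog"],
          "httpd": ["network", "syslog"],
      }

      done = {name: threading.Event() for name in DEPS}

      def start(name):
          for dep in DEPS[name]:
              done[dep].wait()          # block only on what this service needs
          time.sleep(0.1)               # pretend to run the real init script
          print("started", name)
          done[name].set()

      threads = [threading.Thread(target=start, args=(n,)) for n in DEPS]
      for t in threads:
          t.start()
      for t in threads:
          t.join()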
    • Dude, I think what you're describing would have to be a government project. The internet is just kinda wild and untamed (part of its charm, there's actually danger here when most real-world danger is made up on TV or with falsified evidence), but once it's been tamed, as most non-technical governments seem to want to do, then you'll be required to have what you're talking about.

      I think internet should be more like space. If you're going to step out into no atmosphere don't blame the maintainers when you ge
    • You can get instant-on systems based around linux... though "instant" actually means a few seconds in tedious reality, they can boot as fast as you can get through the bios check. It's linux bios [linuxbios.org], and if you're willing to get compatible hardware, it can be quite cool.

      And yes, there are consumer applications [proteinos.com] in the wild.
    • Technically, we don't need logins. We could do as the phone lines do and be identified by the "port" at the CO we're connected to. Unless you are suggesting we just skip authentication altogether and allow everyone free Internet access. Pedos and terrorists and whatnot connecting without a trace; that'll fly as well as a cement truck. Never mind that we seem to be going that way with open WiFi...

      The other part is about making an appliance. Great, make the next WebTV. I don't want it. I don't want a PC in console dra
    • There are several points I would like to refute.

      There is no way to bring down the whole phone network just by dialing the wrong number or saying the wrong thing into it.

      You can't take down the whole internet just by connecting to some website or crashing particular servers either. What you can do is take down particular servers, which might cause paths to be re-routed, but the internet still holds. To take down the entire internet would require synchronized attacks on all or most major backbones, which you wou
  • by Anonymous Coward
    So, when does his server reach critical mass due to slashdotters?
    • > So, when does his server reach critical mass due to slashdotters?

      There's a goatse joke in there somewhere, folks.

      (Preferably behind the event horizon. You really don't want to see the naked singularity.)

  • by Anonymous Coward
    Does the internet weigh more in the US?
  • hackaday (Score:1, Offtopic)

    by a3217055 ( 768293 )
    This was on hackaday already; you can check it out at http://www.hackaday.com/ [hackaday.com]. Sorry Slashdot, it ain't the news site that it used to be. Maybe tomorrow will be better.
  • by Kaorimoch ( 858523 ) on Thursday July 14, 2005 @08:42PM (#13069146) Journal
    Perhaps a better term would be "Counting the people on the internet"? That weighing stuff is for things with, well, MASS.
    • Weighing the internet based on the number of people using it. Isn't that like weighing all ideas based on the number of physical stimulants in the human environment, or the weight of health based on the number of insurance cards people carry around?
  • Woah, not even close (Score:4, Informative)

    by kyndig ( 579355 ) on Thursday July 14, 2005 @08:43PM (#13069150) Homepage
    This is so horribly full of conjecture, uncontrolled data sources, and just pure speculation. The figures are based on Alexa Toolbar users and one website's visitor ratio. The author uses these as the basis for a simple division/multiplication approach to postulating the gross number of internet users.

    A suggestion for more accurate collection of information: talk to ICANN, or that nifty website senderbase.org [senderbase.org], which has a broader view of traffic flow across the internet.
    • How can they detect people behind NATs?

      Many ISPs never give their users even the illusion of a public IP, so they would be totally hidden.

      I just don't think it's possible... ISP numbers might be interesting in themselves, but I really think this is something that won't be possible.
    • ...are you sure the site is accurate? It doesn't even list google.com or gmail.com in the top senders list... Moreover, it just monitors email traffic which will again skew the results.
  • by erice ( 13380 ) on Thursday July 14, 2005 @08:44PM (#13069158) Homepage
    The trouble with these kinds of measurements is not that it is hard to get the data. The trouble is that it is hard to get data that makes any sense and even harder to define what sort of sense it is supposed to make.

    This isn't the 80's. People don't connect to the Internet in discrete blocks every few days. They are connected 24x7 at home, at work, even on their phones. Who is to say that someone who doesn't visit some popular website isn't online? Who is to say that a particular visit to a web site even represents a person?
    • Most likely that visit is from Google's spider, anyway.
    • Bah... just count the number of unique IPs querying Google. If you're 'online' you're going to use a search engine at some point, and very many people use Google. Not sure what to do about IP spoofing, but whatever differentiates the different sub-addresses should be able to differentiate different users as well. (A sketch of counting unique IPs follows this sub-thread.)
    • My sister uses dial up.
      I turn my computer off before I go to bed.
      My mother's cell phone logs out when she isn't using it.

      In the United States cell phones can't stay connected to the cell network 24/7. (Plus all the times we turn our phones off for either politeness, policy, or law.)

      I remember people in the UK complaining about having to use the BT dial-up "pay by the bandwidth" style service. (I don't know if this is still the case, as I don't live there.)

      In some parts of the world Internet access means going to a
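
    A minimal sketch of the "count unique IPs querying Google" idea from a couple of comments up, applied to an ordinary web server access log. The log path and the common-log-format assumption (client IP is the first field) are illustrative only, and this does nothing about NAT or dial-up address reuse, which is the whole objection.

      # Sketch: count distinct client IPs in an access log (common log format assumed).
      def unique_ips(path="access.log"):    # hypothetical log path
          ips = set()
          with open(path) as log:
              for line in log:
                  fields = line.split()
                  if fields:                # first field is the client IP in CLF
                      ips.add(fields[0])
          return len(ips)

      if __name__ == "__main__":
          print("distinct client IPs:", unique_ips())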
  • by michaeldot ( 751590 ) on Thursday July 14, 2005 @08:45PM (#13069164)
    Here's an idea: as there are clearly an enormous number of people accessible via the internet, if we could all be coordinated to use our weight by jumping up and down at an agreed time, we may be able to influence the orbit of the Earth.

    We could have time zone +0 GMT start jumping at one part of the day, then time zone +12 GMT do it twelve hours later.

    The cumulative effect might be enough to push the Earth into a longer orbit, thus moving us further away from the sun and cooling the planet.

    (Of course, it's not solely proximity to the sun that determines global temperature, and Newton's Third Law + the weight of the planet vs the weight of humans might have something to say about whether jumping would actually work, but don't let that spoil some silly science!)
  • what the hell (Score:5, Insightful)

    by hobotron ( 891379 ) on Thursday July 14, 2005 @08:51PM (#13069198)

    Horrible Horrible "study".

    "So we can figure out the number of people who view hackaday by dividing 72,500 by 1.4, which gives us roughly 51,800 daily viewers."

    Wrong. Bad sample population, a low sample size of ONE DAY, and NO accounting for error propagation. When you multiply estimates, you multiply error as well (a short illustration follows this sub-thread).

    "With this knowlege, you can easily estimate the traffic to other sites. If we go by the 471 million estimate, Slashdot gets a whopping 380,000 daily readers."

    Pretty sure I F5 more than that.

    "Alexa... Alexa... Alexa...etc."

    I don't know about you, but Alexa is bordering on adware with this. Call me paranoid; I don't care.

    Also, not everyone (like me) would sign up and run a dumb banner like this on their browser, so your sample excludes pretty much everyone that got hit with the smarts bat growing up.

    Perhaps I'm missing some gross humorous overtone, but mod article -1, Statistical Chicanery.
    • "With this knowlege, you can easily estimate the traffic to other sites. If we go by the 471 million estimate, Slashdot gets a whopping 380,000 daily readers."

      Pretty sure I F5 more than that.

      Sure, but that information is encapsulated in the page-views-per-visit figure, which will be different for slashdot than for the site for which the original author had some sample data. And you might be a statistical outlier anyway.

      Let's consider the actual figure the original author extracted for the size of the i
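
    A short illustration of the error-propagation point in the "what the hell" comment above: when independent estimates are multiplied or divided, their relative uncertainties add in quadrature, so every rough factor makes the final number rougher. The sample values and uncertainties below are hypothetical, not taken from the article.

      # Sketch: relative errors add in quadrature when independent estimates are multiplied/divided.
      from math import sqrt

      toolbar_users = (1_000_000, 0.20)   # (value, relative uncertainty), hypothetical
      toolbar_share = (0.002, 0.50)       # assumed fraction of all users running the toolbar

      internet_users = toolbar_users[0] / toolbar_share[0]
      rel_err = sqrt(toolbar_users[1]**2 + toolbar_share[1]**2)
      print(f"users ~ {internet_users/1e6:.0f}M +/- {internet_users*rel_err/1e6:.0f}M ({rel_err:.0%} relative)")
      # A 20% and a 50% uncertainty combine to roughly 54%, so the headline figure is soft.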

  • the answer to the really important question:

    how many Libraries of Congress does it weigh?
  • horribly ? (Score:4, Funny)

    by William Robinson ( 875390 ) on Thursday July 14, 2005 @08:59PM (#13069239)
    and am horribly attracted to women everywhere,

    'Horribly' is not accepted as a standard word in scientific research publications. The description must be quantitative, like 'and am attracted to women 91% of the time at 45% of the places'. A graph of level of attraction vs. cup size would be great!!

  • Wait a second...didn't we conclude yesterday that 1/3 of all studies are bunk? Well, at least these guys did admit their data wasn't statistically valid ;).
  • I'm using the internet. Is anyone else here using the internet too? I'm so unique.
  • What does this have to do with Physics? Sure, weight is a physical phenomenon. Does that mean that buying a half a pound of ham from the deli is taking "Physics to a new dimension"?

    How about "Abusing statistics in an unconscionable manner."? That seems more apt.

    -Peter
    • RTFA:
      In 1798 Henry Cavendish, known for his scientific brilliance and terrible fear of women, developed a system for calculating the gravitational constant (G) by measuring the gravitational attraction between two small spheres. In essence, he was able to "weigh the earth" by comparing the relationship between two known objects.

      He used the term "Weighing the Internet" because he used an analogous technique, comparing the number of actual visitors to a website to the number recorded for the same website
      • I disagree. Cavendish (apparently) derived a constant from the interaction of small objects. Given that this is a constant it naturally generalized to all massive objects.

        This guy, as I said, (ab)used statistics to show . . . whatever the fuck.

        Is that clearer? Cavendish didn't use statistics.

        -Peter
  • What portion were overweight?
  • If users equate to weight, that means the maximum-size buddy list in AIM (which is 150) weighs about 0.00003% of the Internet. (Atkins) Instant Messenger, anyone?
  • The numbers pumped through his equation seem inflated, or at least raise suspicion of inflated figures.
    Public reports of ratings fraud have been mentioned in his blog, and there is lots and lots of cash riding on the Nielsen/NetRatings reports at issue.

    Sounds like the SEC or FTC should start sniffing up Nielsen's skirts.
    SEC raids Nielsen next.

    Bet the Best Buy stores around Nielsen's HQ will be out of shredders tonight.

  • ...roughly 1978:

    Professor A. ("Affidavit") Donda (the offspring of an unfortunate genetical experiment including 3 women and a microscope slide) invents "Svarnetics" (roughly: "Advanced Mumbo-Jumboistics") as a pretense to load the worlds biggest computer with as much data as possible to find out if information has physical weight. He succeeds; however at the moment information is actually so dense that it becomes matter/weighable it turns into an info-black-hole, swallowing all information so far accumul
  • That would match the title of this article a bit better... anybody care to take a stab at the number?

    Assuming X number of electrons to store a bit, multiplied by the amount of traffic in a given day, multiplied by the weight of an electron.

    In Stone, of course, because nothing beats an ancient, obscure weight system used by exactly ONE country on this planet.
  • My old CRT weighs about 35 lbs. Does that count? Because I switched to an LCD, did the Internet lose weight?
  • Log in? (Score:3, Insightful)

    by Transcendent ( 204992 ) on Thursday July 14, 2005 @09:38PM (#13069453)
    How do I "log in" to the internet?
  • According to Alexa [alexa.com], BBC News [bbc.co.uk] has a daily reach of about 20,000 per million. After the London bombings last week, that shot up to about 32,000.

    So a daily reach of 32,000 per million means that a fraction of 0.032 (3.2 percent) of users visit the BBC News website.

    Now according to this article [bbc.co.uk], the BBC news website had a record 115 million page views last Thursday, so with 5.9 page views per user (from Alexa), that's 19.49 million users.

    Dividing 19.49 by 0.032 gives 609M.

    Of course, something is totally out of whack because that arti
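
    The arithmetic in the comment above, spelled out. The reach, page-view, and views-per-user figures are the commenter's (via Alexa and the BBC), not independently verified here.

      # Sketch: re-doing the BBC News cross-check from the parent comment.
      reach_per_million = 32_000                        # Alexa daily reach after the London bombings
      reach_fraction = reach_per_million / 1_000_000    # 0.032 of internet users visit BBC News
      page_views = 115_000_000                          # record daily page views reported by the BBC
      views_per_user = 5.9                              # Alexa page views per user

      bbc_users = page_views / views_per_user           # ~19.5 million visitors
      internet_users = bbc_users / reach_fraction       # ~609 million users
      print(f"{bbc_users / 1e6:.1f}M BBC visitors -> {internet_users / 1e6:.0f}M internet users")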
  • by kihjin ( 866070 ) on Thursday July 14, 2005 @09:48PM (#13069491)
    Internet: "I'm not fat, I'm just sufficiently back-boned."
  • The 41% off-line number matches what we see too. On our little (ADSL) ISP with 20,000 end users, we typically have 60% on line at any one time.
  • by louarnkoz ( 805588 ) on Friday July 15, 2005 @01:15AM (#13070500)
    The methodology presented here is deeply flawed: it extrapolates a large number based on a very small sample and on unsupported assumptions about browsing habits. Yet, it is possible to actually measure the number of users with some proper method.

    The most obvious method is a basic opinion poll. Take a large enough random sample of the Earth's population, ask simple questions like "have you used the Internet ever, this year, this month, this week, today", compute the average and extrapolate.

    In practice, taking a world-wide poll is not very practical, but it is certainly possible to perform polls on a country by country basis, and then combine the results. In fact, such polls are regularly conducted, and the results are just a google search away, at least for major countries.

    Polls are snapshots of a moment in time, and this is problematic. If you don't pay attention, you end up adding the number of users measured in China last January, in the US last month, in Finland in May, etc. So, you want to complement the polls with an indication of trend, something that you can easily measure at frequent intervals.

    One possibility is to use Internet host counts, which can be obtained by sampling the DNS (see the Internet Domain Survey [isc.org]). One can measure the number of hosts in a country and the number of users at the time of the poll, take the current number of hosts in the same country, and extrapolate.

    There are other potential sources, e.g. measure the volume of traffic, the number of dial-up and broadband subscriptions, etc. Again, it is possible to link these numbers to various poll data, and maintain estimates.

    By the way, the Internet Domain Survey in January 2005 showed 317.6 million IP addresses in use. The typical broadband connection uses one IP address per household, i.e. for 1 to maybe 4 or 5 users. A dial-up connection typically only uses an address a fraction of the time, so the ratio is even higher. Then, there are about 650 million PCs available worldwide, many of which are shared. Based on that, there were probably somewhere between 500 million and a billion users on the Internet.
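
    A minimal sketch of the kind of range estimate the comment above describes, using its 317.6 million figure from the Internet Domain Survey. The users-per-address bounds are assumptions chosen for illustration, not survey data.

      # Sketch: bracket the user count from the address count, per the parent comment.
      addresses_in_use = 317_600_000    # ISC Internet Domain Survey, January 2005

      low = addresses_in_use * 1.6      # assume ~1-2 users behind each always-on address
      high = addresses_in_use * 3.0     # assume heavier sharing (households, dial-up reuse)

      print(f"rough range: {low / 1e6:.0f}M to {high / 1e6:.0f}M users")
      # Roughly 500 million to a billion, consistent with the comment's conclusion.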

  • Can anyone tell me, with all the users supposedly online and with the popularity of Slashdot, why there are so few comments per story? The number of comments seems to be in the range of 100-1000. There are millions of readers. Why so few comments?
  • At least the minimal possible weight of the information.

    (1) Calculate the minimum energy required to represent one bit.
    (2) Calculate the number of bits stored on the Internet.
    (3) Multiply (1) by (2) and divide by c^2.

    I am not a physicist, but I'm sure there must be some physical minimum amount of energy required to ensure a single bit is in a determined state.
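
    A minimal sketch of the calculation this comment proposes, taking the minimum energy per bit to be the Landauer limit (kT ln 2 at room temperature) and converting via E = mc^2. The total number of bits is a placeholder assumption, since nobody knows the real figure.

      # Sketch: minimum mass-equivalent of the Internet's stored information.
      from math import log

      k = 1.380649e-23          # Boltzmann constant, J/K
      T = 300.0                 # assume room temperature, K
      c = 2.998e8               # speed of light, m/s

      energy_per_bit = k * T * log(2)          # ~2.9e-21 J, Landauer limit per bit
      mass_per_bit = energy_per_bit / c**2     # ~3.2e-38 kg per bit

      total_bits = 8e21                        # placeholder: assume ~1 zettabyte stored
      print(f"mass-equivalent: {mass_per_bit * total_bits:.2e} kg")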

"Life sucks, but death doesn't put out at all...." -- Thomas J. Kopp

Working...