
Average Web Page Approaches 1MB

MrSeb writes "According to new research from HTTP Archive, which regularly scans the internet's most popular destinations, the average size of a single web page is now 965 kilobytes, up more than 30% from last year's average of 702KB. This rapid growth is fairly normal for the internet — the average web page was 14KB in 1995, 93KB by 2003, and 300KB in 2008 — but by burrowing a little deeper into HTTP Archive's recent data, we can discern some interesting trends. Between 2010 and 2011, the average amount of Flash content downloaded stayed exactly the same — 90KB — but JavaScript experienced massive growth from 113KB to 172KB. The amount of HTML, CSS, and images on websites also showed a significant increase year over year. There is absolutely no doubt that these trends are attributable to the death throes of Flash and emergence of HTML5 and its open web cohorts." If you have a personal home page, how big is it?
This discussion has been archived. No new comments can be posted.

  • How Big? (Score:5, Funny)

    by Anonymous Coward on Thursday December 22, 2011 @03:21PM (#38463874)

    That's rather personal.

    • Re:How Big? (Score:5, Insightful)

      by Pharmboy ( 216950 ) on Thursday December 22, 2011 @06:02PM (#38465988) Journal

      Well, I don't mind bragging about mine. Mine was 100k, but it has swollen to 150k this year. As to *real* servers, I try to keep our ecommerce pages below 250k for gateway pages. Until this year, I tried to keep them under 150k. Up until 2008, 100k was the target. Before 2003, 50k. This is kind of light, and a few pages bust this, but very few. Before 2000, I used to spend lots of time just optimizing graphics; now I just use some common sense, PS, and very little time.

      What I have found is that the total k of data isn't as important as the number of items and hosts the page calls. I find I can make my pages faster by using image maps, which make for a larger image size (1 image containing all 12 items instead of 12 separate images) but load faster because they take fewer connections. There are a few tools online that can help you figure out total load times. Nowadays, load time is NOT purely a function of the size of the data. If you can cut down on the number of GETs and cross-domain GETs (i.e. DNS lookups) you can radically cut down load time and improve reliability.

      Also, pages that don't need to be dynamic, shouldn't be. Our gateway (to product categories) pages are generated as we update the site, and stored static. This allows them to be cached. It sounds old fashioned, but the fact is that it greatly reduces perceived latency. I am amazed at how many websites are generated via PHP and SQL on the fly, yet aren't updated more than a couple times a day or less. That is a lot of wasted CPU cycles on the server, and a lot of wasted potential for caching, both locally and down the line. And yes, it makes your website load slower, making it seem like your pages are larger than they are.
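      A minimal sketch of that publish-time generation idea (Node.js is assumed here purely for illustration - the poster's stack is PHP/SQL - and the category data is made up):

        // regenerate the gateway pages whenever the catalog changes, then let the
        // web server hand out plain .html files (cacheable, no per-request work)
        var fs = require('fs');

        var categories = [                                   // hypothetical catalog data
          { slug: 'widgets', name: 'Widgets', items: ['A', 'B', 'C'] },
          { slug: 'gadgets', name: 'Gadgets', items: ['D', 'E'] }
        ];

        if (!fs.existsSync('public')) fs.mkdirSync('public');
        categories.forEach(function (cat) {
          var html = '<html><head><title>' + cat.name + '</title></head><body><h1>' +
                     cat.name + '</h1><ul>' +
                     cat.items.map(function (i) { return '<li>' + i + '</li>'; }).join('') +
                     '</ul></body></html>';
          fs.writeFileSync('public/' + cat.slug + '.html', html);  // static and cache-friendly
        });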

      • Also, pages that don't need to be dynamic, shouldn't be. Our gateway (to product categories) pages are generated as we update the site, and stored static. This allows them to be cached.

        • Re:How Big? (Score:4, Informative)

          by buchner.johannes ( 1139593 ) on Friday December 23, 2011 @05:32AM (#38469702) Homepage Journal

          With a good caching engine, a dynamically generated webpage should be nearly as fast as a static page - the page itself is parsed and cached, then only re-parsed if the input changes.

          The Linux kernel can take a file and put it on the socket without Apache loading it even partially into memory. *This* is fast.
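          A rough sketch of the "parse once, re-parse only when the input changes" behaviour described above (plain JavaScript; getPage, the version key, and renderPage are all made up for the example):

            // memoize rendered pages keyed by their input; regenerate only when the
            // underlying data's version changes
            var cache = {};                              // key -> { version, html }

            function getPage(key, version, renderPage) {
              var entry = cache[key];
              if (entry && entry.version === version) {
                return entry.html;                       // hit: about as cheap as a static file
              }
              var html = renderPage(key);                // miss: expensive dynamic generation
              cache[key] = { version: version, html: html };
              return html;
            }

            // usage: bump the version whenever the underlying story is edited
            var page = getPage('/story/12345', 7, function (key) {
              return '<html><body>rendered ' + key + '</body></html>';
            });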

  • by Anonymous Coward on Thursday December 22, 2011 @03:21PM (#38463878)

    It's a good thing phone carriers don't limit your data consumption....

    oh wait..

    • by MichaelSmith ( 789609 ) on Thursday December 22, 2011 @03:39PM (#38464220) Homepage Journal

      The browser on my phone crashes on pages that size, including most articles on slashdot, so the data it uses is somewhat self limiting.

    • It's a good thing phone carriers don't limit your data consumption....

      oh wait..

      Yes, it is a good thing. We get unlimited 3G data for an increment of 3 euro/month on the basic cellphone service per phone. It's only a 384kbps data service, but it can be used non-stop without incurring any extra fees. For the three cellphones I pay for (me and two kids), the combined bill rarely reaches 20 euro/month, including taxes and all calls.

      Oh wait... you weren't actually being facetious, were you?

    • by hipp5 ( 1635263 ) on Thursday December 22, 2011 @05:21PM (#38465570)
      The phone issue is interesting. I was just on the "How Much Data Do You Need?" page for a local provider. You slide the bar for various things, like how many web pages you visit in a month (as if anyone really knows that). Their assumption was 0.17MB/page. I know there are mobile versions of pages and such, but this still seems like a gross underestimate given this story.
  • Not surprised (Score:5, Insightful)

    by kannibal_klown ( 531544 ) on Thursday December 22, 2011 @03:21PM (#38463888)

    With the growth of JavaScript libraries like jQuery for more UI features, plus more and more images, I can see it reaching that high.

    Meanwhile, web developers don't care because more and more people are getting faster and faster broadband speeds. So as long as the page-load metric works OK on their rig, or perhaps what they envision most of their viewers have... they think it's all OK.

    • Re:Not surprised (Score:5, Insightful)

      by Fujisawa Sensei ( 207127 ) on Thursday December 22, 2011 @03:23PM (#38463916) Journal
      Apparently it is, because users are still hitting their websites.
    • Re:Not surprised (Score:5, Insightful)

      by Anonymous Coward on Thursday December 22, 2011 @03:26PM (#38463998)

      Web developers don't care because the majority of their images/css/js is cacheable by each visitor (and most people have jQuery cached from the official site and many sites link to that directly). 1MB page but it's only 45k on the next visit.

    • ... reimplementing jQuery (which is 31K, btw) badly in uncacheable custom ways without being able to draw on the years of expertise of the developers of jQuery would be a great alternative.

    • Certainly, JS frameworks do contribute to the total size of a page, but that framework is generally cached and isn't re-downloaded on subsequent pages on the same site. So, your 965KB page just dropped to 800KB after the first page load. Images that are carried through a site (logos, widget buttons, backgrounds) should also only be counted on the first page load.

      I tend to focus on keeping things small, reusing anything I can. Some web developers do care...at least I do.

    • Well, stuff like jQuery/Dojo/etc libraries shouldn't be loading every time you view a page.
      The first view, your browser will need to load all the associated CSS, HTML, etc.
      After that, included files should hopefully be cached, and only page content need be loaded.

      Also, with JS libraries and AJAX, one should be able to build pages that load the overall template once, but don't require pulling large HTML files for updates (rather just pull content with AJAX).
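      A small sketch of that "load the template once, then only pull content" pattern (jQuery assumed, since the thread is about it; the /api/story URL, the data-id attribute, and the #content element are hypothetical):

        // the first page load brings down the template, CSS and JS once; later
        // "page views" just fetch a small JSON payload and swap it into the DOM
        $(document).on('click', 'a.story-link', function (e) {
          e.preventDefault();
          $.getJSON('/api/story', { id: $(this).data('id') }, function (story) {
            $('#content').html('<h1>' + story.title + '</h1>' + story.body);
          });
        });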

    • I don't know about other developers, but I do care, and try to keep pages small. More and more people are accessing the web on mobile devices, so minimizing the data going back and forth, and round trips to the server, is important to user experience. In the design community, designing with mobile devices in mind is a growing practice.

    • Re:Not surprised (Score:4, Interesting)

      by Threni ( 635302 ) on Thursday December 22, 2011 @04:37PM (#38465010)

      Forgive me piggybacking here, but I've a web question. I read Slashdot predominantly on my phone (doesn't everyone?), but once you get 5 or 6 levels of replying in, the posts become unreadable. Each reply has a shorter width than the one above, meaning you end up with a handful of characters per line, and the rest of the horizontal space as just that - space. Is that really how it's supposed to be - completely unreadable? Is there no way of overriding it and saying 'look, I know it's a reply due to the context'? I've tried poking around in the various options within Slashdot, but I don't understand what most of them do, and the so-called help is completely useless and doesn't describe what the options mean nor how to use them. I think the problem is that the designers of Slashdot believe everyone is using a monitor, so you'd probably need to be about 30 or 40 levels in to get to the same problem.

      I'm using Dolphin HD on Android but a friend with an Apple phone has the same problem. Is there an answer?

      • Re: (Score:3, Insightful)

        by matmota ( 238500 )

        First select the "classic discussion system (D1)" under "Discussions" in your options (gear icon). Then, in the settings just below the summary pick the "flat" view instead of "nested".

  • Missing data (Score:5, Interesting)

    by instagib ( 879544 ) on Thursday December 22, 2011 @03:22PM (#38463896)

    Average information content - does a page view give me more insight as a user now than it did 10 years ago?

    • Re:Missing data (Score:5, Insightful)

      by Anonymous Coward on Thursday December 22, 2011 @03:29PM (#38464056)

      10 years ago online video was virtually nonexistent, and where it did exist it was never larger than 320x240. Pictures were equally low resolution and page formatting was minimal. Allowing user comments was rare, and user contribution based sites like YouTube and Wikipedia were nonexistent. Oh yea, and the "blink" tag was still popular. So yes, I would say the amount of information has increased significantly.

  • Ad Content (Score:2, Insightful)

    by Anonymous Coward

    And how much of it is ads?

  • by MetricT ( 128876 ) on Thursday December 22, 2011 @03:23PM (#38463944)

    and the 3G users, and the satellite users, and everyone else that has a low-bandwidth and/or high cost per byte connection.

    My parents can't get DSL or cable. They're stuck with 22k dial-up, and use AdBlock Plus, NoFlash, and Propel accelerator with compression set to the point where you can barely recognize photos, and it still takes 2 minutes for a reasonably normal page (CNN, MSNBC) to load, much less anything with a ton of Javascript or Flash.

    Can't websites automatically detect connection speed the first time a client visits, and store a cookie so that us slow people get a nice, simple website?

    Oh, and Propel, please move to JPEG2000 and XZ compression. Some people need every byte they can get.
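    Something along the lines of that speed-probe-plus-cookie idea could be sketched like this (plain JavaScript; /probe.jpg, its 50 KB size, and the 256 kbps cut-off are all invented for the example, and a real site would want something less naive):

      // time the download of a small resource of known size, then remember the
      // result in a cookie so the server can choose a lighter template next time
      var PROBE_URL = '/probe.jpg';                    // hypothetical ~50 KB image
      var PROBE_BYTES = 50 * 1024;

      var img = new Image();
      var start = new Date().getTime();
      img.onload = function () {
        var seconds = (new Date().getTime() - start) / 1000;
        var kbps = (PROBE_BYTES * 8 / 1024) / seconds;
        var tier = kbps < 256 ? 'slow' : 'fast';
        document.cookie = 'bandwidth=' + tier + '; path=/; max-age=' + 30 * 24 * 3600;
      };
      img.src = PROBE_URL + '?nocache=' + start;       // cache-bust so the network is measured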

    • If they have cell reception then how about http://www.virginmobileusa.com/mobile-broadband/ [virginmobileusa.com]

      • by MetricT ( 128876 )

        They don't have any cell reception, barring standing in the right spot in the yard. They live at the bottom of a large valley that blocks cell signals.

        One thing I have thought about is buying two antennas and building a passive reflector to beam some signal into their valley, but I'm waiting for Verizon (or anyone, for that matter) to roll out 4G before I spend money on it.

    • by Mitreya ( 579078 )
      Can't websites automatically detect connection speed the first time a client visits, and store a cookie so that us slow people get a nice, simple website?

      Ooooh, ooooh, I have a better idea! Can't websites ditch the f**king crap they now use (reaching 1MB average?) and just re-do their sites as the nice, simple website you describe?
      I will never understand why any website would have a legitimate need for background music. Or an interactive (a-la-DVD opening screen) navigation with 1-second delay and bu

  • by stating_the_obvious ( 1340413 ) on Thursday December 22, 2011 @03:25PM (#38463966)
    It's not the size of the home page, it's the motion of the .GIF
  • Ads (Score:5, Interesting)

    by Anonymous Coward on Thursday December 22, 2011 @03:25PM (#38463968)

    And.... when running AdBlock Plus, this figure goes down to 100kB. I run AdBlock mostly for the massive speed increase that comes with it.

  • by Anonymous Coward

    My fully featured CMS that uses jQuery, jQuery UI, and a lot of heavy libraries takes 140kb. Learn to optimize, people!

  • by burning-toast ( 925667 ) on Thursday December 22, 2011 @03:26PM (#38463992)

    I have a homepage, and it's only 4.92Kb. Granted it is the "It Works!" page for CentOS which has all of the other text and icons and such but who needs more than that? Do people really have personalized home pages now that Facebook came about (other than some hobbyists or professionals who run a side business)?

    I wonder what the average "Facebook" homepage size is... since that is what most people will be seeing regularly.

    - Toast

    • by wjcofkc ( 964165 )
      I don't know about Facebook but I'm willing to bet that the average MySpace page is up to 5 megabytes. Then again I haven't been there since '07.
  • by ackthpt ( 218170 ) on Thursday December 22, 2011 @03:27PM (#38464030) Homepage Journal

    I think eBay led the curve on this one. I complained bitterly to them about how long it took their bloated pages to load when I was still on dialup. Nobody cares.

    I suppose the telecoms do. This increases the likelihood of blowing through your monthly bandwidth cap without even watching videos.

  • Compression? (Score:4, Insightful)

    by s7uar7 ( 746699 ) on Thursday December 22, 2011 @03:29PM (#38464050) Homepage
    If the bulk of the increase is from JavaScript, wouldn't turning on compression on the web server solve the problem? They're text files; they compress down massively.
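    For what it's worth, a minimal sketch of serving a script gzipped when the client allows it (Node.js used only for illustration - on Apache this is just mod_deflate - and ./app.js is a placeholder):

      var http = require('http');
      var fs = require('fs');
      var zlib = require('zlib');

      http.createServer(function (req, res) {
        var raw = fs.createReadStream('./app.js');     // placeholder text asset
        if (/\bgzip\b/.test(req.headers['accept-encoding'] || '')) {
          res.writeHead(200, { 'Content-Type': 'application/javascript',
                               'Content-Encoding': 'gzip' });
          raw.pipe(zlib.createGzip()).pipe(res);       // JS/CSS typically shrink 3-4x
        } else {
          res.writeHead(200, { 'Content-Type': 'application/javascript' });
          raw.pipe(res);
        }
      }).listen(8080);
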
  • Larger Pages (Score:5, Insightful)

    by Master Moose ( 1243274 ) on Thursday December 22, 2011 @03:29PM (#38464060) Homepage

    And Less Content.

    I remember the days when a site would include a 10-paragraph article on one page - not 10 pages with a paragraph on each.

    • Re: (Score:2, Interesting)

      by greenhollow ( 63021 )

      On most sites that I go to that have a paragraph per page model, I just click the "Print" button/link on the site and they combine the pages for printing. Then I read it without needing to print it. Sometimes they require printing it. If they do, I am less likely to read the article at all.

    • If you consider ads the content the site wants to deliver, it makes more sense.

  • Flawed methods ... (Score:5, Interesting)

    by oneiros27 ( 46144 ) on Thursday December 22, 2011 @03:34PM (#38464130) Homepage

    This only matters if people go to the first page, and never go to any additional ones.

    For most websites these days, you'll take the initial hit from javascript and the 'branding' images when you first get to the site ... but the changing content per page is much lower.

    If websites are using standard javascript libraries being served by Google's CDN [google.com], then it's possible that someone visiting your page already has jquery, mootools or similar cached and doesn't need to load yet another copy.

    I also didn't see if they had any comparison between transferred size vs. used size (e.g., JavaScript that's sent compressed) ... and as this is from a new archive ... does anyone know if Archive.org could analyze their holdings to see what the longer-term trends are?

  • by rbrander ( 73222 ) on Thursday December 22, 2011 @03:36PM (#38464152) Homepage

    My home page remains where it has been since 1993 at the Calgary Unix Users Group: http://www.cuug.ab.ca/branderr [cuug.ab.ca] ...clocks in at 9.2K, plus a 15K GIF and a 9.1K JPG (if you "turn on images" in your browser - remember when it was a realistic option not to?)

    I have held the line, while Viewing With Alarm (VWA) the growth of web pages for the entire 18 years since. I wrote Bob Metcalfe when he had a column at InfoWorld 15 years back, and he was Viewing With Alarm the exponential growth in Internet traffic and predicting the "collapse of the Internet" (had to eat those words - literally) because of it. My letter pointed out that his column constituted 2K of text - that was all the generated content that was bringing in the readers (unless you count the 10K gif of Bob Metcalfe, and I don't) - and the page had an additional 100K of framing and advertising-related image GIFs. His reply was somewhat defensive.

    This last year, I had occasion to travel on the Queen Mary 2, where all internet is via satellite at a minimum of 34 cents per minute with their bulk plan. How quickly I grew to resent the giant Flash blobs that would be automatically downloaded with every page of a newspaper so I wouldn't miss the animated ads for the latest in car buys. At QM2 speeds, I'd have to wait about two minutes before I even had an "X" mark to click on to dismiss the ad. I was rather quickly cured of almost any interest in the Internet content at ALL: I did my e-mail, checked the Google News headlines (fewest high-byte ads), and logged off.

    My point: 90% of mail is spam. So are 90% of web page bytes. We just don't call it spam. We call it "the whole outside frame around the news page that we try not to see, but keeps jumping around into our field of view".

    • From a usability perspective:

      • Blue background. Why? Are you trying to accomplish some artistic purpose we're not privy to?
      • Why are the pictures laid out vertically rather than horizontally? Why is there lots of text to the right of the second picture rather than to the right of both pictures? That means that your contact info is obscured/invisible to potential readers -- it's also out of context in that place.
      • Why do your anchors span multiple sentences rather than just a few semantically relevant key words?

      In short: You fail web page design, so who the fuck cares if your page is 10K?

      • by mickwd ( 196449 ) on Thursday December 22, 2011 @07:04PM (#38466538)

        "In short: You fail web page design, so who the fuck cares if your page is 10K?"

        As a normal human being possessing the ability to read, I found his site perfectly accessible, and it gave me a decent amount of information about the guy in a quick, concise manner.

        If I was to be snarky here, I would say something like:

        In short: You fail meaningful criticism, and who the fuck cares if his "anchors span multiple sentences rather than just a few semantically relevant key words"?

    • Don't feel too smug. Your page isn't even compressed.

    • by whisper_jeff ( 680366 ) on Thursday December 22, 2011 @06:14PM (#38466082)
      Sorry to be a dick but you're bragging about that page? Really? You know when they say "size doesn't matter"? Yeah - sometimes it also means being as small as possible is not necessarily a good thing. I would have thought that page was trash ten years ago when Geocities webpages were everywhere so, now, it's really not good... Seriously, stop bragging about it and spend some time designing a real page.

      Sorry to be a dick - someone had to tell you...
  • How much is cached? Yeah, initial page load sucks terribly, but how much has to be loaded on subsequent page requests?

    How many copies of jQuery and etc. do people have cached on their machines?

    It almost feels like we need dependency managers for browsers! I mean, I know there is the Google-hosted stuff, and other projects urge you to use their hosted version and fall back on a local copy.
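    The usual pattern looks roughly like this (the CDN path is Google's hosted-libraries URL for jQuery 1.7.1; the local /js path is a placeholder):

      // in the page's HTML, a <script> tag first requests
      // //ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js (often already
      // cached from another site); this inline script then falls back to a local copy
      if (!window.jQuery) {
        document.write('<script src="/js/jquery-1.7.1.min.js"><\/script>');
      }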

  • by Animats ( 122034 ) on Thursday December 22, 2011 @03:37PM (#38464178) Homepage

    There is absolutely no doubt that these trends are attributable to the death throes of Flash and emergence of HTML5 and its open web cohorts.

    No, it's not about HTML 5. A lot of it is about bloated content management systems and templates.

    I was looking at a Wall Street Journal page recently, and I brought it into an HTML editor so I could eliminate all non-story content. The story required an HTML page with only 72 lines. The original page was over 4000 lines. It contained a vast amount of hidden content, including the entire registration system for buying a subscription. All that junk appears on every page - inline, not in an included file.

    On top of that, there are content management systems which create a custom CSS page for each content page. So there's no useful caching in the browser.

    Remember those people who said CSS was going to make web pages shorter? They were wrong. Look at Slashdot - bloated, slow pages that don't do much, yet consume CPU time when idle.

    • by Desler ( 1608317 ) on Thursday December 22, 2011 @03:45PM (#38464298)

      But Slashdot is now web 2.0, Ajax-enabled and social. Pretty soon it'll be "hosted on the cloud" and provide SaaS so it can win at buzzword bingo!

    • I would mod you up but I have commented already - it's a major problem, and it seems worse in old-school publishers unfortunately. Luckily the one I work for has finally seen the light; let's hope I see some changes next year.
    • by archen ( 447353 )

      I agree that the reason things are getting bigger is because of extra "crap" getting served. Commercial pages are the biggest chunk of it, but even stuff like WordPress can toss out a lot of junk with templates. With bigger screen resolutions and assumed high-speed internet, I'm seeing many sites being much more sloppy with large graphics too. The slashdot question at the end makes it sound like personal pages are relevant to this statistic. What percentage of the population actually has a personal homepage

    • by Trixter ( 9555 )

      It contained a vast amount of hidden content, including the entire registration system for buying a subscription. All that junk appears on every page - inline, not in an included file.

      Which reminds me: What's so bad about frames again? Is it so incredibly wacky to have static border/background/scripts downloaded only once per visit?

    • by mjwx ( 966435 )

      No, it's not about HTML 5. A lot of it is about bloated content management systems and templates.

      What do you think HTML5 is all about?

      An all new way to deliver bloated CMS's. Why do some people think HTML5 is some kind of magic fix for all the ills of the web?

      The problem is bad design and lack of care. No one gets punished for creating a crap system, bad developers get coddled, customers are coerced, sweet talked and sometimes forced into accepting bad CMS's.

      HTML5 will not stop bloat, will not s

  • by Anonymous Coward on Thursday December 22, 2011 @03:39PM (#38464216)

    That's still too much flash for me

    Sent from my iPhone

    • Re: (Score:2, Insightful)

      by Anonymous Coward
      Waaaah

      Sent from pretty much any Android
  • by Anonymous Coward

    Ironically posted on a website that is itself a bloated pig.

  • My personal site's home page? Fairly large, 18k of which 11k is images. I mean, it's a home page not an image gallery or something like that where you expect a lot of large content.

    I've seen some of those sites with large pages, and mostly I hate visiting them. The loading makes them feel like I'm wading through molasses, and the amount of stuff they're loading and the complexity of the scripts means more and more glitches and things that break when the network isn't perfect or they didn't expect the exact

  • by lkcl ( 517947 ) <lkcl@lkcl.net> on Thursday December 22, 2011 @03:42PM (#38464250) Homepage

    my site's a pyjamas application. it is therefore 1,000 lines of python.... or, when compiled (and therefore including the pyjs runtime which does stuff like dict, list, exceptions etc. all emulated in javascript, as well as including the library of widgets that are used on the page) it's 1.3mb of really obtuse but functionally correct javascript.

  • by nick357 ( 108909 ) on Thursday December 22, 2011 @03:43PM (#38464268)

    It remains the size of one of those animated "under construction" gifs.

  • by mikael ( 484 ) on Thursday December 22, 2011 @03:44PM (#38464274)

    I've been able to run both CPU and GPU based CFD and 3D visualisation on my laptop without any problems, yet some flash games which are just doing 2D animation will roast a 2.7 GHz CPU to the point that the kernel decides to call it a day and shut down the whole system.

    Unbelievably, these flash games aren't doing anything more complex than playing a retro 2D platform game. I'm guessing that this is due to the way in which all the separate texturemaps/pixelmaps are treated as generic webpage images rather than as a single DOOM style WAD file.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      In 5 more years there will be another layer of abstraction and 5GHz CPU's will be dragged to a crawl by a Super Mario Brothers clone.

    • It's caused by inept code that just runs at maximum frame rate regardless of display. Those 2d animations are probably being generated at several thousand FPS, just because the programmer didn't know how to limit it to something more reasonable.
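      A sketch of the fix (written in JavaScript, though the offending games are Flash/AS3; update and render are stubs invented for the example):

        // cap rendering at ~30 FPS instead of redrawing as fast as the CPU allows
        var TARGET_FPS = 30;
        var frameInterval = 1000 / TARGET_FPS;
        var lastFrame = 0;

        function update() { /* advance game state (stub) */ }
        function render() { /* draw the frame (stub) */ }

        function loop(now) {
          requestAnimationFrame(loop);
          if (now - lastFrame < frameInterval) {
            return;                                    // too soon: skip, let the CPU idle
          }
          lastFrame = now;
          update();
          render();
        }
        requestAnimationFrame(loop);
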
    • by amaupin ( 721551 )
      No doubt you ran into a game using the Flixel library for AS3. 99% of the time I see the Flixel logo pop up before a game I know my laptop fan is going to turn on and the game is going to run choppy. I don't know what that library does behind the scenes, but it's an amazing CPU hog. ...And yet in my own games when I implement the built-in AS3 BitmapData.copyPixels() routines to move around massive amounts of sprites, my CPU doesn't even break a sweat.
  • by LiquidMind ( 150126 ) on Thursday December 22, 2011 @03:46PM (#38464300)

    http://www.the5k.org/ [the5k.org]

    It seemed so long ago. Didn't /. have an entry as well?

  • The abuse of __VIEWSTATE in certain pages makes the actual viewstate bigger than the site itself, per click, growing and growing. Which basically must count for something. I have always wondered how Microsoft could have thought this out, or maybe it's down to the lack of education of its "developers".
  • My personal web site's home page is 2KB. It's HTML5, no CSS, no JS. My research group site has a bit of all three plus a handful of images and comes in at 125KB. Big website I sysadmin weighs in at 1.1MB. A nice variety there. I think my personal site claims the crown as the fastest loading and quickest to render.
  • by rollingcalf ( 605357 ) on Thursday December 22, 2011 @03:56PM (#38464458)

    Some sites use Javascript to display what is semi-static data that should be assembled on the server side before transmitting to the user. For example, a news site where the stories are loaded by Javascript.

    Some sites even have pages that are entirely blank if Javascript is turned off. It seems that some of these "web programmers" don't even know how to dynamically build a page with server-side scripting instead of Javascript.

    • by catbutt ( 469582 )
      Actually that's a really smart thing to do from a bandwidth point of view. There are all kinds of reasons not to do that (some of those are gradually disappearing, since now Google's crawler is starting to run some javascript to build the page as the user will see it), but if you are concerned about bandwidth, having javascript build your page for you is a very good way to do it.
  • by keith_nt4 ( 612247 ) on Thursday December 22, 2011 @04:04PM (#38464588) Homepage Journal

    Damned FrontPage

  • From 1995 to 2003, 26.7% annual growth (take the eighth root of (93/14) and then subtract 1). From 2003 to 2008, 26.4%. From 2008 to 2010, 53.0%. Last year's growth was 37.5%. All percentages rounded to the nearest tenth of a percent.
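    Spelled out (JavaScript used only as a calculator, with the figures from the summary):

      // compound annual growth rate: (end / start) ^ (1 / years) - 1
      function cagr(start, end, years) {
        return Math.pow(end / start, 1 / years) - 1;
      }

      cagr(14, 93, 8);    // 1995 -> 2003: ~0.267, i.e. 26.7% per year
      cagr(93, 300, 5);   // 2003 -> 2008: ~0.264, i.e. 26.4% per year
      cagr(300, 702, 2);  // 2008 -> 2010: ~0.530, i.e. 53.0% per year
      965 / 702 - 1;      // last year alone: ~0.375, i.e. 37.5%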

  • by Anonymous Coward

    What's worse is that the "payload" of text is less and less interesting. Bandwidth isn't the problem. I have more than enough bandwidth for these pages. When they hit the browser, they take forever just to render. There are a handful of web sites I still use, Slashdot among them. Most new sites I just back right up. If your site does that on day 1, it's not worth the bother. I'm not buying a new machine just to look at your crap web site that's probably just a rehash of every Internet meme.

    We're well

  • Including 9366 bytes worth of images. It's been pretty steady since 1992 or so (initially hosted over ftp instead of http as my Dept didn't have an http server yet).

  • Riiigght... Javascript increases by about 50Kb, so it's responsible for the other several hundred Kb of increase over the last few years?

    Everyone realizes that gzipped jQuery is only 31Kb, right? I'm sick of people blaming Javascript for bloat. Do you realize how much work it would be to produce several hundred Kb of it? Much less think of reasons to produce that much?

    I've been a web designer for years, and where the increases in page size I've seen actually come from is just plain old images. Monitors are

  • Well maybe if you include all the images and the PDFs. I have a rather extensive website and if I recall, even when I backed up the entire thing, it came out to maybe 76MB, and that included all the image hosting I was doing for a different website.

    The problem is the same problem we're having now with "windows" software. Bloated because it's being generated by machine rather than hand-coded. All these WYSIWYG HTML code generators that allow people to just drag and drop text and pictures and let Dreamweaver

  • by cvtan ( 752695 ) on Thursday December 22, 2011 @04:38PM (#38465022)
    Well, that explains why my 400MHz WINXP laptop with 128M memory has a heart attack trying to scroll down a page of comments on Slashdot.
  • by Solandri ( 704621 ) on Thursday December 22, 2011 @08:19PM (#38467214)
    Median is the measure you want.

    If you use the mean, 90% of web pages could stay the exact same size, but if the other 10% go nuts and increase their size 20x, the mean will grow nearly 3x.
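    A quick check of that arithmetic (the page size is arbitrary, chosen only to match the 90%/10% example):

      var s = 100;                              // KB; every page starts at size s
      var oldMean = s;
      var newMean = 0.9 * s + 0.1 * (20 * s);   // 90% unchanged, 10% grow 20x
      console.log(newMean / oldMean);           // 2.9 -> the mean nearly triples
      // the median is still s: the "typical" page didn't change at all
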
  • by adolf ( 21054 ) <flodadolf@gmail.com> on Thursday December 22, 2011 @10:34PM (#38467986) Journal

    So a web page today is about 10x bigger than it was in 2003. I can accept that.

    But in 2003, I had a baseline 2-megabit-per-second Internet connection and could have had a 3- or 5-megabit connection for a bit more cash.

    Today, 8 years later, the "normal" connection speed for my ISP is 6-megabit.

    So according to my observations and their statistics, folks are expected to download 10 times the amount of stuff using just 3 times as much available bandwidth.

    In other words, the web is currently more than three times slower than it was in 2003.

    Hooray!

  • by retroworks ( 652802 ) on Friday December 23, 2011 @05:03AM (#38469610) Homepage Journal
    About half my regular blog readers are based in emerging markets / less developed countries. I began to notice that hittership was dropping in Africa and India. Reviewing about a thousand posts, I noticed that the more photos and "blogger apps" I put on the web page, the lower the readership in countries with low bandwidth. I've been more conscientious now about which photo resolution I post and tend to avoid videos. And a lot of the cool little blogger widgets don't seem as important when measured in seconds to open the page. http://retroworks.blogspot.com/2010/12/blog-has-widget-fever.html [blogspot.com] Of course my content also sometimes sucks, and it also helps if I lay off the haiku.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...