
Average Web Page Approaches 1MB

Posted by timothy
from the includes-the-toaster-and-the-pool dept.
MrSeb writes "According to new research from HTTP Archive, which regularly scans the internet's most popular destinations, the average size of a single web page is now 965 kilobytes, up more than 30% from last year's average of 702KB. This rapid growth is fairly normal for the internet — the average web page was 14KB in 1995, 93KB by 2003, and 300KB in 2008 — but by burrowing a little deeper into HTTP Archive's recent data, we can discern some interesting trends. Between 2010 and 2011, the average amount of Flash content downloaded stayed exactly the same — 90KB — but JavaScript experienced massive growth from 113KB to 172KB. The amount of HTML, CSS, and images on websites also showed a significant increase year over year. There is absolutely no doubt that these trends are attributable to the death throes of Flash and emergence of HTML5 and its open web cohorts." If you have a personal home page, how big is it?
This discussion has been archived. No new comments can be posted.


  • How Big? (Score:5, Funny)

    by Anonymous Coward on Thursday December 22, 2011 @04:21PM (#38463874)

    That's rather personal.

    • Re:How Big? (Score:5, Insightful)

      by Pharmboy (216950) on Thursday December 22, 2011 @07:02PM (#38465988) Journal

      Well, I don't mind bragging about mine. It was 100k, but has swollen to 150k this year. As to *real* servers, I try to keep our ecommerce gateway pages below 250k. Until this year, I tried to keep them under 150k. Up until 2008, 100k was the target. Before 2003, 50k. That's on the light side, and a few pages bust it, but very few. Before 2000, I used to spend lots of time just optimizing graphics; now I use some common sense, PS, and very little time.

      What I have found is that the total size of the data isn't as important as the number of items and hosts the page calls. I find I can make my pages faster by using image maps, which mean larger image sizes (one image of all 12 items instead of 12 separate images) but faster loading, because fewer connections are needed. There are a few tools online that can help you figure out total load times. Nowadays, load time is NOT purely a function of the size of the data. If you can cut down on the number of GETs, and especially cross-domain GETs (i.e. extra DNS lookups), you can radically improve both load time and reliability.
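The request-count point above can be sketched with a toy model (all numbers here are illustrative assumptions, not measurements):

```python
# Crude serial-fetch model: every request pays a round-trip,
# and the bytes pay bandwidth. Figures are illustrative only.
def load_time_ms(requests, total_kb, rtt_ms=100, bandwidth_kbps=1500):
    transfer_ms = total_kb / bandwidth_kbps * 1000
    return requests * rtt_ms + transfer_ms

# Twelve separate product images vs. one combined image of the same
# total size: identical data, very different round-trip cost.
separate = load_time_ms(requests=12, total_kb=240)
combined = load_time_ms(requests=1, total_kb=240)
print(separate, combined)  # 1360.0 260.0
```

Real browsers parallelize and pipeline fetches, so the absolute numbers are fiction; the point is that the per-request term can dominate the per-byte term.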

      Also, pages that don't need to be dynamic shouldn't be. Our gateway (product category) pages are generated as we update the site, and stored static. This allows them to be cached. It sounds old fashioned, but the fact is that it greatly reduces perceived latency. I am amazed at how many websites are generated via PHP and SQL on the fly, yet aren't updated more than a couple of times a day, if that. That is a lot of wasted CPU cycles on the server, and a lot of wasted potential for caching, both locally and down the line. And yes, it makes your website load slower, making it seem like your pages are larger than they are.

      • by Burning1 (204959) on Thursday December 22, 2011 @09:25PM (#38467286) Homepage

        Also, pages that don't need to be dynamic shouldn't be. Our gateway (product category) pages are generated as we update the site, and stored static. This allows them to be cached. It sounds old fashioned, but the fact is that it greatly reduces perceived latency. I am amazed at how many websites are generated via PHP and SQL on the fly, yet aren't updated more than a couple of times a day, if that. That is a lot of wasted CPU cycles on the server, and a lot of wasted potential for caching, both locally and down the line. And yes, it makes your website load slower, making it seem like your pages are larger than they are.

        With a good caching engine, dynamically generated webpages should be nearly as fast as static pages - the page itself is parsed and cached, then only re-parsed if the input changes.
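The render-once idea these two posts describe can be sketched in a few lines of Python (the function names are made up for illustration, not any real CMS's API):

```python
# Minimal sketch of render-once caching: build the page when the
# content version changes, serve the cached HTML otherwise.
_cache = {}

def render_page(page_id, content_version, build):
    """Run the expensive build() only on a cache miss."""
    key = (page_id, content_version)
    if key not in _cache:
        _cache[key] = build()  # expensive templating / SQL would happen here
    return _cache[key]

html = render_page("gateway", 1, lambda: "<html>v1</html>")
again = render_page("gateway", 1, lambda: "<html>SHOULD NOT RUN</html>")
print(html == again)  # True: the second call never rebuilds
```

Bumping `content_version` when the site is updated invalidates the entry, which is exactly the "regenerate on update, serve static otherwise" scheme from the parent.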

  • by Anonymous Coward on Thursday December 22, 2011 @04:21PM (#38463878)

    It's a good thing phone carriers don't limit your data consumption....

    oh wait..

  • Not surprised (Score:5, Insightful)

    by kannibal_klown (531544) on Thursday December 22, 2011 @04:21PM (#38463888)

    With the growth of Javascript libraries like jQuery for more UI features, and more images, I can see it reaching that high.

    Meanwhile, web developers don't care, because more and more people are getting faster and faster broadband speeds. So as long as the page-load metric works OK on their rig, or on what they envision most of their viewers have, they think it's all OK.

    • Re:Not surprised (Score:5, Insightful)

      by Fujisawa Sensei (207127) on Thursday December 22, 2011 @04:23PM (#38463916) Journal
      Apparently it is OK, because users are still hitting their websites.
    • Re:Not surprised (Score:5, Insightful)

      by Anonymous Coward on Thursday December 22, 2011 @04:26PM (#38463998)

      Web developers don't care because the majority of their images/css/js is cacheable by each visitor (and most people have jQuery cached from the official site and many sites link to that directly). 1MB page but it's only 45k on the next visit.

    • by warrax_666 (144623) on Thursday December 22, 2011 @04:30PM (#38464070)

      ... reimplementing jQuery (which is 31K, btw) badly in uncacheable custom ways without being able to draw on the years of expertise of the developers of jQuery would be a great alternative.

    • by Crudely_Indecent (739699) on Thursday December 22, 2011 @04:57PM (#38464482) Journal

      Certainly, JS frameworks contribute to the total size of a page, but the framework is generally cached and isn't re-downloaded on subsequent pages of the same site. So your 965KB page just dropped to 800KB after the first page load. Images that are carried through a site (logos, widget buttons, backgrounds) should also only be counted on the first page load.

      I tend to focus on keeping things small, reusing anything I can. Some web developers do care...at least I do.

    • by phorm (591458) on Thursday December 22, 2011 @05:19PM (#38464796) Journal

      Well, stuff like jQuery/Dojo/etc libraries shouldn't be loading every time you view a page.
      The first view, your browser will need to load all the associated CSS, HTML, etc.
      After that, included files should hopefully be cached, and only page content need be loaded.

      Also, with JS libraries and AJAX, one should be able to build pages that load the overall template once, but don't require pulling large HTML files for updates (rather just pull content with AJAX).

    • by jenn_13 (1123793) <jenn@bohm.gmail@com> on Thursday December 22, 2011 @05:27PM (#38464884)

      I don't know about other developers, but I do care, and try to keep pages small. More and more people are accessing the web on mobile devices, so minimizing the data going back and forth, and round trips to the server, is important to user experience. In the design community, designing with mobile devices in mind is a growing practice.

    • Re:Not surprised (Score:4, Interesting)

      by Threni (635302) on Thursday December 22, 2011 @05:37PM (#38465010)

      Forgive me piggybacking here, but I've a web question. I read Slashdot predominantly on my phone (doesn't everyone?), but once you get 5 or 6 levels of replies in, the posts become unreadable. Each reply has a shorter width than the one above, meaning you end up with a handful of characters per line, and the rest of the horizontal space as just that - space. Is that really how it's supposed to be - completely unreadable? Is there no way of overriding it and saying 'look, I know it's a reply due to the context'? I've tried poking around in the various options within Slashdot, but I don't understand what most of them do, and the so-called help is completely useless: it describes neither what the options mean nor how to use them. I think the problem is that the designers of Slashdot believe everyone is using a monitor, so you'd probably need to be about 30 or 40 levels in to hit the same problem.

      I'm using Dolphin HD on Android but a friend with an Apple phone has the same problem. Is there an answer?

  • Missing data (Score:5, Interesting)

    by instagib (879544) on Thursday December 22, 2011 @04:22PM (#38463896)

    Average information content - does a page view give me more insight as a user now than it did 10 years ago?

  • Ad Content (Score:2, Insightful)

    by Anonymous Coward on Thursday December 22, 2011 @04:23PM (#38463942)

    And how much of it is ads?

  • by MetricT (128876) on Thursday December 22, 2011 @04:23PM (#38463944) Homepage

    and the 3G users, and the satellite users, and everyone else that has a low-bandwidth and/or high cost per byte connection.

    My parents can't get DSL or cable. They're stuck with 22k dial-up, and use AdBlock Plus, NoFlash, and Propel accelerator with compression set to the point where you can barely recognize photos, and it still takes 2 minutes for a reasonably normal page (CNN, MSNBC) to load, much less anything with a ton of Javascript or Flash.

    Can't websites automatically detect connection speed the first time a client visits, and store a cookie so that us slow people get a nice, simple website?

    Oh, and Propel, please move to JPEG2000 and XZ compression. Some people need every byte they can get.

  • by stating_the_obvious (1340413) on Thursday December 22, 2011 @04:25PM (#38463966)
    It's not the size of the home page, it's the motion of the .GIF
  • Ads (Score:5, Interesting)

    by Anonymous Coward on Thursday December 22, 2011 @04:25PM (#38463968)

    And.... when running AdBlock Plus, this figure goes down to 100kB. I run AdBlock mostly for the massive speed increase that comes with it.

  • by Anonymous Coward on Thursday December 22, 2011 @04:25PM (#38463970)

    My fully featured CMS that uses jQuery, jQuery UI, and a lot of heavy libraries takes 140kb. Learn to optimize, people!

  • by burning-toast (925667) on Thursday December 22, 2011 @04:26PM (#38463992)

    I have a homepage, and it's only 4.92Kb. Granted it is the "It Works!" page for CentOS which has all of the other text and icons and such but who needs more than that? Do people really have personalized home pages now that Facebook came about (other than some hobbyists or professionals who run a side business)?

    I wonder what the average "Facebook" homepage size is... since that is what most people will be seeing regularly.

    - Toast

  • by ackthpt (218170) on Thursday December 22, 2011 @04:27PM (#38464030) Homepage Journal

    I think eBay led the curve on this one. I complained bitterly to them about how long their bloated pages took to load when I was still on dialup. Nobody cares.

    I suppose the telecoms do. This increases the likelihood of blowing through your monthly bandwidth cap without even watching videos.

  • Compression? (Score:4, Insightful)

    by s7uar7 (746699) on Thursday December 22, 2011 @04:29PM (#38464050) Homepage
    If the bulk of the increase is from javascript, wouldn't turning on compression on the web server solve the problem? They're text files; they compress down massively.
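A quick check of the compression claim, using Python's standard gzip module on a repetitive stand-in for a minified script (the sample text is invented):

```python
import gzip

# JavaScript is text full of repeated tokens, so it gzips well.
# This repetitive sample merely stands in for a real library.
js = ("function cachedFetch(url){return fetch(url).then("
      "function(r){return r.json();});}\n" * 200).encode()

compressed = gzip.compress(js)
print(len(js), len(compressed))  # the compressed copy is a small fraction
```

Real minified JavaScript is less repetitive than this sample, but large savings from gzip are still typical for text assets.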
  • Larger Pages (Score:5, Insightful)

    by Master Moose (1243274) on Thursday December 22, 2011 @04:29PM (#38464060) Homepage

    And less content.

    I remember the days when a site would include a 10-paragraph article on one page - not 10 pages with a paragraph on each.

  • Flawed methods ... (Score:5, Interesting)

    by oneiros27 (46144) on Thursday December 22, 2011 @04:34PM (#38464130) Homepage

    This only matters if people go to the first page, and never go to any additional ones.

    For most websites these days, you'll take the initial hit from javascript and the 'branding' images when you first get to the site ... but the changing content per page is much lower.

    If websites are using standard javascript libraries being served by Google's CDN [google.com], then it's possible that someone visiting your page already has jquery, mootools or similar cached and doesn't need to load yet another copy.

    I also didn't see whether they had any comparison between transferred size and used size (e.g. javascript that's sent compressed) ... and as this is from a new archive ... does anyone know if Archive.org could analyze their holdings to see what the longer-term trends are?

  • by rbrander (73222) on Thursday December 22, 2011 @04:36PM (#38464152) Homepage

    My home page remains where it has been since 1993 at the Calgary Unix Users Group: http://www.cuug.ab.ca/branderr [cuug.ab.ca] ...clocks in at 9.2K, plus a 15K GIF and a 9.1K JPG (if you "turn on images" in your browser - remember when it was a realistic option not to?)

    I have held the line, while Viewing With Alarm (VWA) the growth of web pages for the entire 18 years since. I wrote Bob Metcalfe when he had a column at InfoWorld 15 years back, and he was Viewing With Alarm the exponential growth in Internet traffic and predicting the "collapse of the Internet" (had to eat those words - literally) because of it. My letter pointed out that his column constituted 2K of text - that was all the generated content that was bringing in the readers, (unless you count the 10K gif of Bob Metcalfe, and I don't), and the page had an additional 100K of framing and advertising-related image GIFs. His reply was somewhat defensive.

    This last year, I had occasion to travel on the Queen Mary 2, where all internet is via satellite at a minimum of 34 cents per minute with their bulk plan. How quickly I grew to resent the giant Flash blobs that would be automatically downloaded with every page of a newspaper so I wouldn't miss the animated ads for the latest in car buys. At QM2 speeds, I'd have to wait about two minutes before I even had an "X" mark to click on to dismiss the ad. I was rather quickly cured of almost any interest in the Internet content at ALL, I did my E-mail, checked the google news headlines (fewest high-byte ads), and logged off.

    My point: 90% of mail is spam. So are 90% of web page bytes. We just don't call it spam. We call it "the whole outside frame around the news page that we try not to see, but keeps jumping around into our field of view".

    • by warrax_666 (144623) on Thursday December 22, 2011 @05:16PM (#38464744)

      From a usability perspective:

      • Blue background. Why? Are you trying to accomplish some artistic purpose we're not privy to?
      • Why are the pictures laid out vertically rather than horizontally? Why is there lots of text to the right of the second picture rather than to the right of both pictures? That means your contact info is obscured/invisible to potential readers -- it's also out of context in that place.
      • Why do your anchors span multiple sentences rather than just a few semantically relevant key words?

      In short: You fail web page design, so who the fuck cares if your page is 10K?

      • by mickwd (196449) on Thursday December 22, 2011 @08:04PM (#38466538)

        "In short: You fail web page design, so who the fuck cares if your page is 10K?"

        As a normal human being possessing the ability to read, I found his site perfectly accessible, and it gave me a decent amount of information about the guy in a quick, concise manner.

        If I was to be snarky here, I would say something like:

        In short: You fail meaningful criticism, and who the fuck cares if his "anchors span multiple sentences rather than just a few semantically relevant key words"?

    • by sakdoctor (1087155) on Thursday December 22, 2011 @06:02PM (#38465332) Homepage

      Don't feel too smug. Your page isn't even compressed.

    • by whisper_jeff (680366) on Thursday December 22, 2011 @07:14PM (#38466082)
    Sorry to be a dick, but you're bragging about that page? Really? You know when they say "size doesn't matter"? Yeah - sometimes it also means being as small as possible is not necessarily a good thing. I would have thought that page was trash ten years ago, when Geocities pages were everywhere, so now it's really not good... Seriously, stop bragging about it and spend some time designing a real page.

      Sorry to be a dick - someone had to tell you...
  • by Rinisari (521266) on Thursday December 22, 2011 @04:36PM (#38464158) Homepage Journal

    How much is cached? Yeah, initial page load sucks terribly, but how much has to be loaded on subsequent page requests?

    How many copies of jQuery and etc. do people have cached on their machines?

    It almost feels like we need dependency managers for browsers! I mean, I know there's the Google-hosted stuff, and other projects urge you to use their hosted version and fall back on a local copy.

  • by Animats (122034) on Thursday December 22, 2011 @04:37PM (#38464178) Homepage

    There is absolutely no doubt that these trends are attributable to the death throes of Flash and emergence of HTML5 and its open web cohorts.

    No, it's not about HTML 5. A lot of it is about bloated content management systems and templates.

    I was looking at a Wall Street Journal page recently, and I brought it into an HTML editor so I could eliminate all non-story content. The story itself required only 72 lines of HTML. The original page was over 4000 lines. It contained a vast amount of hidden content, including the entire registration system for buying a subscription. All that junk appears on every page, inline, not in an included file.

    On top of that, there are content management systems which create a custom CSS page for each content page. So there's no useful caching in the browser.

    Remember those people who said CSS was going to make web pages shorter? They were wrong. Look at Slashdot - bloated, slow pages that don't do much, yet consume CPU time when idle.

    • by Desler (1608317) on Thursday December 22, 2011 @04:45PM (#38464298)

      But Slashdot is now web 2.0, Ajax-enabled and social. Pretty soon it'll be "hosted on the cloud" and provide SaaS so it can win at buzzword bingo!

    • by mjwalshe (1680392) on Thursday December 22, 2011 @04:52PM (#38464402)
      I would mod you up, but I have commented already - it's a major problem, and it seems worse in old-school publishers, unfortunately. Luckily the one I work for has finally seen the light; let's hope I see some changes next year.
    • by archen (447353) on Thursday December 22, 2011 @05:01PM (#38464546)

      I agree that the reason things are getting bigger is because of extra "crap" getting served. Commercial pages are the biggest chunk of it, but even stuff like WordPress can toss out a lot of junk with templates. With bigger screen resolutions and assumed high-speed internet, I'm seeing many sites being much sloppier with large graphics too. The Slashdot question at the end makes it sound like personal pages are relevant to this statistic. What percentage of the population actually has a personal homepage these days? 0.2%? Facebook and blogging cover 99% of what most people need.

      I've had a page since 1997 and the average page size hasn't changed a whole lot. With the transition from table layouts to CSS the file sizes went down for a while; then my CSS got more complicated and it's about the same. A big page adds up to maybe as much as 15-20k. Anyone actually try writing 20k worth of text, grammar-checking it, proofreading and editing it multiple times? That's a LOT of work for a file that small. I've got over 765 pages, with about 28MB of stuff all together. I have no idea how much CSS is involved, but my javascript adds up to something like 2k total (not including the Google spy stuff).

    • by Trixter (9555) on Thursday December 22, 2011 @05:47PM (#38465112) Homepage

      It contained a vast amount of hidden content, including the entire registration system for buying a subscription. All that junk appears on every page.. Inline, not in an included file.

      Which reminds me: What's so bad about frames again? Is it so incredibly wacky to have static border/background/scripts downloaded only once per visit?

    • by mjwx (966435) on Thursday December 22, 2011 @09:38PM (#38467378)

      No, it's not about HTML 5. A lot of it is about bloated content management systems and templates.

      What do you think HTML5 is all about?

      An all-new way to deliver bloated CMSes. Why do some people think HTML5 is some kind of magic fix for all the ills of the web?

      The problem is bad design and lack of care. No one gets punished for creating a crap system, bad developers get coddled, and customers are coerced, sweet-talked and sometimes forced into accepting bad CMSes.

      HTML5 will not stop bloat, will not stop bad, bandwidth-consuming ads; it won't stop anything we don't like. In fact, everything we don't like (even about Flash) can and likely will be delivered via HTML5, because like Flash, HTML5 is a delivery system, not content (yes, I know this was the parent's point).

      Another problem is that as bandwidth increases, developers put less and less time into optimisation. They don't take into account customers on mobile broadband, who a lot of the time get 500-ish KB/s connections, which still exist.

  • by Anonymous Coward on Thursday December 22, 2011 @04:39PM (#38464216)

    That's still too much flash for me

    Sent from my iPhone

  • by Anonymous Coward on Thursday December 22, 2011 @04:39PM (#38464218)

    Ironically posted on a website that is itself a bloated pig.

  • by Todd Knarr (15451) on Thursday December 22, 2011 @04:40PM (#38464228) Homepage

    My personal site's home page? Fairly large: 18k, of which 11k is images. I mean, it's a home page, not an image gallery or something like that where you expect a lot of large content.

    I've seen some of those sites with large pages, and mostly I hate visiting them. The loading makes them feel like I'm wading through molasses, and the amount of stuff they're loading and the complexity of the scripts means more and more glitches and things that break when the network isn't perfect or they didn't expect the exact combination of things I've got at the moment. The pages come across as not being able to stay out of their own way, and more and more often they actually get in the way of my seeing what I came to the page for. There's merchants I've actually walked away from even though they had the product I wanted and had the best price on it simply because I couldn't get their pages to work well enough to get to the product page let alone order it. And I'm a techie who knows how to tweak the browser to make pages work when they don't want to, I shudder to think what it's like for someone who isn't a techie and is afraid to touch the security settings.

  • by lkcl (517947) <lkcl@lkcl.net> on Thursday December 22, 2011 @04:42PM (#38464250) Homepage

    my site's a pyjamas application. it is therefore 1,000 lines of python.... or, when compiled (and therefore including the pyjs runtime which does stuff like dict, list, exceptions etc. all emulated in javascript, as well as including the library of widgets that are used on the page) it's 1.3mb of really obtuse but functionally correct javascript.

  • by nick357 (108909) on Thursday December 22, 2011 @04:43PM (#38464268)

    It remains the size of one of those animated "under construction" gifs.

  • by mikael (484) on Thursday December 22, 2011 @04:44PM (#38464274)

    I've been able to run both CPU and GPU based CFD and 3D visualisation on my laptop without any problems, yet some flash games which are just doing 2D animation will roast a 2.7 GHz CPU to the point that the kernel decides to call it a day and shut down the whole system.

    Unbelievably, these flash games aren't doing anything more complex than playing a retro 2D platform game. I'm guessing that this is due to the way in which all the separate texturemaps/pixelmaps are treated as generic webpage images rather than as a single DOOM style WAD file.

  • by LiquidMind (150126) on Thursday December 22, 2011 @04:46PM (#38464300)

    http://www.the5k.org/ [the5k.org]

    It seemed so long ago. Didn't /. have an entry as well?

  • by Skinkie (815924) on Thursday December 22, 2011 @04:51PM (#38464378) Homepage
    The abuse of __VIEWSTATE on certain pages makes the actual viewstate bigger than the site itself, growing and growing with every click. Which basically must count for something. I have always wondered how Microsoft could have thought this through, or maybe it's the lack of education of its "developers".
  • by unts (754160) on Thursday December 22, 2011 @04:51PM (#38464388) Homepage Journal
    My personal web site's home page is 2KB. It's HTML5, no CSS, no JS. My research group site has a bit of all three plus a handful of images and comes in at 125KB. Big website I sysadmin weighs in at 1.1MB. A nice variety there. I think my personal site claims the crown as the fastest loading and quickest to render.
  • by rollingcalf (605357) on Thursday December 22, 2011 @04:56PM (#38464458)

    Some sites use Javascript to display semi-static data that should be assembled on the server side before being transmitted to the user. For example, a news site where the stories are loaded by Javascript.

    Some sites even have pages that are entirely blank if Javascript is turned off. It seems that some of these "web programmers" don't even know how to dynamically build a page with server-side scripting instead of Javascript.

  • by keith_nt4 (612247) on Thursday December 22, 2011 @05:04PM (#38464588) Homepage Journal

    Damned FrontPage

  • by colinrichardday (768814) <colin.day.6@hotmail.com> on Thursday December 22, 2011 @05:22PM (#38464836)

    From 1995 to 2003, 26.7% annual growth (take the eighth root of (93/14) and then subtract 1). From 2003 to 2008, 26.4%. From 2008 to 2010, 53.0%. Last year's growth was 37.5%. All percentages rounded to the nearest tenth of a percent.
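The arithmetic above can be reproduced directly (sizes taken from the article summary):

```python
# Compound annual growth rate between two page-size measurements.
def cagr(start_kb, end_kb, years):
    return (end_kb / start_kb) ** (1 / years) - 1

# Figures from the article: 14KB (1995), 93KB (2003), 300KB (2008),
# 702KB (2010), 965KB (2011).
print(round(cagr(14, 93, 8) * 100, 1))    # 26.7
print(round(cagr(93, 300, 5) * 100, 1))   # 26.4
print(round(cagr(300, 702, 2) * 100, 1))  # 53.0
print(round(cagr(702, 965, 1) * 100, 1))  # 37.5
```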

  • by Anonymous Coward on Thursday December 22, 2011 @05:24PM (#38464854)

    What's worse is that the "payload" of text is less and less interesting. Bandwidth isn't the problem. I have more than enough bandwidth for these pages. When they hit the browser, they take forever just to render. There are a handful of web sites I still use, Slashdot among them. Most new sites I just back right up. If your site does that on day 1, it's not worth the bother. I'm not buying a new machine just to look at your crap web site that's probably just a rehash of every Internet meme.

    We're well into the "nobody comes here anymore it's too crowded" and/or "57 million web sites and nothing on" stage.

  • by pruss (246395) on Thursday December 22, 2011 @05:28PM (#38464894) Homepage

    Including 9366 bytes worth of images. It's been pretty steady since 1992 or so (initially hosted over ftp instead of http as my Dept didn't have an http server yet).

  • by JobyOne (1578377) on Thursday December 22, 2011 @05:34PM (#38464958) Homepage Journal

    Riiigght... Javascript increases by about 50Kb, so it's responsible for the other several hundred Kb of increase over the last few years?

    Everyone realizes that gzipped jQuery is only 31Kb, right? I'm sick of people blaming Javascript for bloat. Do you realize how much work it would be to produce several hundred Kb of it? Much less think of reasons to produce that much?

    I've been a web designer for years, and where the increases in page size I've seen actually come from is just plain old images. Monitors are bigger. That means web page designs need to be bigger. That means the images that make them up need to be bigger.

    Think about it: it used to be common practice to design a fixed-width website to render on an 800x600 (or even 640x480) monitor. I remember doing them 450px wide, back in the dark ages. Now I do them at 960px, or wider if the audience is likely to have higher resolution monitors. That means the images that make up the layout need to be a little more than twice as wide, if they're made equally taller that's a little more than 4 times as many pixels, just to do the same old things at the new standard resolution.

    Since bandwidth is less of an issue today (for the vast majority of people, anyway) we also compress our images less, in the name of things not looking like crap. When designing a website (especially a graphics-heavy one) you also need to spend more care on the stuff around the main design, for people with maximized browsers on high-resolution screens. So not only are we making images with 4X the pixels, compressed a little less, we need to add a few more images to the mix to maintain the same level of visual excitement for people using big screens.

    On websites produced by competent people HTML5, CSS, Flash and Javascript have basically fuck-all to do with the recent ballooning of bandwidth. Specific implementations of Javascript that you happen to have a grudge against have less that fuck-all to do with it. It's just a fact of life that bigger screens connected to fatter pipes will wind up with more pixels piped to them. Deal with it.

    That said, my portfolio website is only 300-odd Kb on the homepage, and it's CMS-driven, includes jQuery and is pretty damn heavy on the graphics. I like to think I'm sharper than the average copy-pasting "web designer," though.
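The pixel arithmetic in the post above checks out: a bit more than doubling the width more than quadruples the pixel count.

```python
# Width growth vs. pixel growth when a fixed-width layout scales up
# and images are enlarged proportionally in both dimensions.
old_w, new_w = 450, 960  # px widths mentioned in the post
width_ratio = new_w / old_w
pixel_ratio = width_ratio ** 2

print(round(width_ratio, 2), round(pixel_ratio, 2))  # 2.13 4.55
```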

  • by tekrat (242117) on Thursday December 22, 2011 @05:35PM (#38464974) Homepage Journal

    Well maybe if you include all the images and the PDFs. I have a rather extensive website and if I recall, even when I backed up the entire thing, it came out to maybe 76MB, and that included all the image hosting I was doing for a different website.

    The problem is the same problem we're having now with "windows" software. It's bloated because it's being generated by machine rather than hand-coded: all these WYSIWYG HTML code generators that let people just drag and drop text and pictures and let Dreamweaver do the rest -- or worse -- those crazy websites that just build other websites (e.g. Blogger).

    The point is: You can get away with a lot less, but nobody cares because bandwidth is fast and cheap and so is processing time. If the NY Times took 10 minutes to load, you'd better believe they would do something to optimize it.

    But that's like asking modern programmers to hand-optimize their code for office applications. It's just not going to happen. MS Word loads and runs fast enough even though the code for that thing is a nightmare. Yet, for the majority of what you use Word for, the free, scaled down desk accessory "Wordpad" is more than enough.

  • by cvtan (752695) on Thursday December 22, 2011 @05:38PM (#38465022)
    Well, that explains why my 400MHz WINXP laptop with 128M memory has a heart attack trying to scroll down a page of comments on Slashdot.
  • by Solandri (704621) on Thursday December 22, 2011 @09:19PM (#38467214)
    Median is the measure you want.

    If you use the mean, 90% of web pages could stay the exact same size, but if the other 10% go nuts and increase their size 20x, the mean will grow nearly 3x.
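A quick illustration of the mean-vs-median point (the 90/10 split is the hypothetical from the post, not real data):

```python
from statistics import mean, median

# 90% of pages stay at 1 unit; the other 10% balloon to 20 units.
sizes = [1.0] * 90 + [20.0] * 10

print(mean(sizes))    # 2.9 -- the mean nearly triples
print(median(sizes))  # 1.0 -- the median doesn't move
```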
  • by adolf (21054) <flodadolf@gmail.com> on Thursday December 22, 2011 @11:34PM (#38467986) Journal

    So a web page today is about 10x bigger than it was in 2003. I can accept that.

    But in 2003, I had a baseline 2-megabit-per-second Internet connection and could have had a 3- or 5-megabit connection for a bit more cash.

    Today, 8 years later, the "normal" connection speed for my ISP is 6-megabit.

    So according to my observations and their statistics, folks are expected to download 10 times the amount of stuff using just 3 times as much available bandwidth.

    In other words, the web is currently more than three times slower than it was in 2003.

    Hooray!
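The back-of-the-envelope behind that conclusion, using the article's average page sizes and this poster's connection speeds:

```python
# Content grew roughly 10x (93KB in 2003 -> 965KB in 2011) while
# this poster's bandwidth grew 3x (2 Mbit -> 6 Mbit).
size_growth = 965 / 93
bandwidth_growth = 6 / 2
slowdown = size_growth / bandwidth_growth

print(round(slowdown, 2))  # 3.46 -- over three times slower, all else equal
```

This ignores latency, caching, and parallel fetches, so it is an upper-bound sketch of raw transfer time, not a measured load time.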

  • by retroworks (652802) on Friday December 23, 2011 @06:03AM (#38469610) Homepage Journal
    About half my regular blog readers are based in emerging markets / less developed countries. I began to notice that hittership was dropping in Africa and India. Reviewing about a thousand posts, I noticed that the more photos and "blogger apps" I put on the web page, the lower the readership in countries with low bandwidth. I've been more conscientious now about which photo resolution I post and tend to avoid videos. And a lot of the cool little blogger widgets don't seem as important when measured in seconds to open the page. http://retroworks.blogspot.com/2010/12/blog-has-widget-fever.html [blogspot.com] Of course my content also sometimes sucks, and it also helps if I lay off the haiku.
