
Modern Web Bloat Means Some Pages Load 21MB of Data (tomshardware.com) 110

Christopher Harper reports via Tom's Hardware: Earlier this month, Danluu.com released an exhaustive 23-page analysis/op-ed/manifesto on the current state of unoptimized web page and web app performance, finding that just loading a web page can bog down an entry-level device that can otherwise run the popular game PUBG at 40 fps. The Wix webpage, for instance, requires loading 21MB of data for one page, while the more famous websites Patreon and Threads load 13MB of data for one page. This can result in slow load times of up to 33 seconds or, in some cases, pages that fail to load at all.

As the testing above shows, some of the most brutally intensive websites include the likes of... Quora, and basically every major social media platform. Newer content production platforms like Squarespace and newer forum platforms like Discourse also have significantly worse performance than their older counterparts, often to the point of unusability on some devices. The Tecno S8C, one of the prominent entry-level phones common in emerging markets, is one particularly compelling test device that stuck out. The device is actually quite impressive in some ways, including its ability to run PlayerUnknown's Battlegrounds Mobile at 40 FPS -- but the same device can't even run Quora and experiences nigh-unusable lag when scrolling on social media sites.

That example is most likely the best summation of the overall point, which is that modern web and app design is increasingly trending toward an unrealistic assumption of ever-increasing bandwidth and processing. Quora is a website where people answer questions -- there is absolutely no reason any of these websites should be harder to run than a Battle Royale game.

  • Q? (Score:5, Insightful)

    by dohzer ( 867770 ) on Tuesday March 19, 2024 @08:06PM (#64329383)

    the most brutally intensive websites include the likes of... Quora

    Why is this website still around? All it seems to do is degrade search results.

I remember limiting database forms, queries, edit pages and reporting to about 250 kbytes per page to remain spunky across the T1/OC3 corp WAN. These were very data-dense things, about the same as a ServiceNow page today, without today's style and formatting options.

That it has only grown by a factor of 10 to 100 over the past decade is an amazingly small amount of growth when bandwidth to the user is up 100 to 1000 times.
    • by rogersc ( 622395 )
      Many sites continue to limit lists to about 5 items, forcing the user to click next pages dozens of times to see everything. All to save a few kilobytes in a web that is many megabytes.
      • by sixoh1 ( 996418 ) on Tuesday March 19, 2024 @10:17PM (#64329615) Homepage

        Many sites continue to limit lists to about 5 items, forcing the user to click next pages dozens of times to see everything. All to save a few kilobytes in a web that is many megabytes.

Uhh, did you not realize those listicles are limited so that you have to generate clicks? They measure engagement and harvest your eyeball-attention-usage data from those clicks. Even with JavaScript they don't get much data from you scrolling through a page, but make it clickable and they can behaviorally profile your interest, consumption, and maybe even the person using the mouse, depending on whether we're thinking Cambridge Analytica/Palantir-level magic, or just a good guess if you don't believe in the BS. Either way, listicles have not a single thing to do with data/bandwidth saving.

      • by AvitarX ( 172628 )

        I'll take that over loads as I scroll any day.

    • bandwidth to the user is up 100 to 1000 times.

      Is that per second or per month? I'm under the impression that a lot of people are still stuck on satellite or cellular Internet access with a monthly Internet data transfer quota in low double digit gigabytes or even in the single digit gigabytes.

    • by jonadab ( 583620 )
      I used to make sure my pages loaded reasonably quickly over dialup connections. Most of them probably still would, if dialup were still a thing, apart from inherently image-heavy things like the photo gallery. Although the need to add media queries to make things reasonable on handheld devices with absurdly small screens has more than doubled the length of my stylesheets.

      And yeah, if a web page exists principally to show an hour-long high-definition video, then it's excused from fast-load-over-slow-connec
  • by cstacy ( 534252 ) on Tuesday March 19, 2024 @08:18PM (#64329409)

    How much does Slashdot download?
    How about Ars Technica, from where most of the Slashdot content comes?

    • Open the developer tools in your browser and find out.

      I just did, and then did a reload on this page. 88.4kB transferred, 1.8MB total resources.
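      For anyone who would rather script it than eyeball the Network tab, here is a rough sketch (TypeScript, using the standard Resource Timing API; cross-origin resources report a transferSize of 0 unless the server sends Timing-Allow-Origin):

        // Total up what the current page actually fetched.
        const entries = [
          ...performance.getEntriesByType("navigation"),
          ...performance.getEntriesByType("resource"),
        ] as PerformanceResourceTiming[];

        // transferSize is the compressed on-the-wire size (including headers);
        // decodedBodySize is what the browser holds after decompression.
        const transferred = entries.reduce((sum, e) => sum + e.transferSize, 0);
        const decoded = entries.reduce((sum, e) => sum + e.decodedBodySize, 0);

        console.log(
          `${entries.length} requests, ` +
            `${(transferred / 1024).toFixed(1)} kB transferred, ` +
            `${(decoded / 1024 / 1024).toFixed(2)} MB decoded`,
        );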

  • by russotto ( 537200 ) on Tuesday March 19, 2024 @08:26PM (#64329419) Journal

    I guess if there's any place to point out that forums worked fine over 300bps connections, it's here. Although the upgrade to 1200bps was admittedly a welcome change.

  • by PPH ( 736903 ) on Tuesday March 19, 2024 @08:36PM (#64329435)

    ... tracking code.

    Or lazy developers who pull in megabyte JavaScript libraries just to use one function. Which calls a bunch of tracking code.
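    The sad part is how little the "one function" usually amounts to. A sketch of the hand-rolled alternative (a generic debounce written from scratch; no particular library's API implied):

      // A ten-line debounce instead of a megabyte utility library.
      function debounce<A extends unknown[]>(fn: (...args: A) => void, waitMs: number): (...args: A) => void {
        let timer: ReturnType<typeof setTimeout> | undefined;
        return (...args: A) => {
          if (timer !== undefined) clearTimeout(timer);
          timer = setTimeout(() => fn(...args), waitMs);
        };
      }

      // Usage: don't fire a search request on every keystroke.
      const onInput = debounce((q: string) => console.log("search:", q), 300);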

    • Re: (Score:3, Interesting)

      I know it has been an extension for yonks but I recently installed NoScript. Sure, it's a pain to whitelist essential JavaScript on the couple of dozen sites I regularly visit.

      But doing my bit in fighting climate change to save a couple of kWh a year in not processing garbage from random URLs. :)

      • > But doing my bit in fighting climate change to save a couple of kWh a year in not processing garbage from random URLs. :)

        Then you might be interested in this:
        https://addons.mozilla.org/en-... [mozilla.org]

        Also available for Chrome...

      • by nmb3000 ( 741169 )

        I know it has been an extension for yonks but I recently installed NoScript. Sure, it's a pain to whitelist essential JavaScript on the couple of dozen sites I regularly visit.

        I love NoScript and have been using it for ages, but I do wish there was a "curated" mode with a minimal trusted whitelist you could opt into using, sort of like a reverse ad-blocker. There are times when visiting a new site that the list of blocked domains is monumental, and figuring out the minimal set necessary for the page to function is pretty much impossible without investing way too much time.

        Even an option akin to "temporarily allow all" but instead using a whitelist built from contributions (maybe

    • ... tracking code.

      Or lazy developers who pull in megabyte JavaScript libraries just to use one function. Which calls a bunch of tracking code.

      Both

  • more importanly (Score:4, Interesting)

    by spaceman375 ( 780812 ) on Tuesday March 19, 2024 @08:39PM (#64329441)
    They should have measured how much data each site uploads.
  • by caseih ( 160668 ) on Tuesday March 19, 2024 @08:46PM (#64329449)

    Both addons give you a live total of the number of connections blocked. It's quite staggering what some web sites try to load. Privacy Badger works very well to reduce the bloat and nearly all sites I've ever seen work just fine without all the superfluous trackers. If you stay on some sites long enough I've seen Privacy Badger block thousands of connections.

    Both of these addons are essential to using the web safely these days. To me most web sites are unusable without them.

    • Heh. I use Privacy Badger and uBlock for everything, including here.
    • Privacy Badger's permission requirements are incredibly invasive. No thanks.
      • by caseih ( 160668 )

        How would you expect it to work if it could not access and modify web site data streams, browser tabs, and browser activity? Seems a pretty silly objection to me. Certainly the alternative (nothing) is a pretty poor choice.

  • Ad Blocker (Score:5, Interesting)

    by crow ( 16139 ) on Tuesday March 19, 2024 @08:46PM (#64329451) Homepage Journal

    A good study of web site bloat would also report on how much bandwidth ad blockers save. And comparing various ad blockers would give a good metric as to how effective they are.

    And as someone else suggested, also measure upload data, especially with and without ad blockers.

    • I agree. Every page I open now seems to have animated ads or run some sort of video, either in the page or on an overlaid minipage. I am pretty sure that that is where most of the volume is going. Since ads pay for the pages, I am not sure what we can really do. Ad blockers are a possibility, but you can be sure that the people who own the websites will do everything they can to defeat them.
      • Since ads pay for the pages, I am not sure what we can really do.

        Some hardliners on this site have recommended that viewers stop patronizing websites run as a business and start patronizing more websites run by the public sector, by charities, or by enthusiasts as a hobby out of said enthusiasts' own pocket. What makes this practical or impractical?

      • by vlad30 ( 44644 )
        Many sites, in particular news sites, won't load unless you turn off the ad blocker for them. Brave seems to work, though.
  • by e065c8515d206cb0e190 ( 1785896 ) on Tuesday March 19, 2024 @08:48PM (#64329455)
    say Ghostery + uBlock Origin
  • My WASM page (Score:5, Interesting)

    by OrangeTide ( 124937 ) on Tuesday March 19, 2024 @09:06PM (#64329503) Homepage Journal

    I was freaking out because my WASM page using Emscripten, SDL2, etc. was a little over 4MB of wasm and .js loaders just to have some basic spinning triangles. So I spent a bunch of time getting it down to 100kB. All because I felt that 4MB was an excessive amount to download to try my dumb little toy apps. No actual business reason even, just politeness. Oh well.

  • My company uses Flutter as its framework and our site with fonts is over 20MB; trying to do cache invalidation on the client is a nightmare. Our devs say there's nothing they can do to really reduce the package size, as the majority of the dependencies come from our vendors' software development kits. The bloat is amazing; devs see no reason to optimize as everyone is on 4G/LTE mobile or 100Mbit connections, so data transfer isn't a concern. I'm glad we're starting to reach a point where clients are starting
    • The bloat is amazing, devs see no reason to optimize as everyone is on 4G/LTE mobile or 100mbit connections so data transfer isn't a concern.

      Do users on 4G/LTE mobile stop visiting your site when they run out of Internet data transfer quota for the month?

  • The main reason for the bloat is that huge frameworks and helpers are downloaded in the header of the website before any code even gets run. These are frequently provided for free by CDNs, so the downside for the website coder is nil but the upside is a much easier job.

    20-30 MB of downloads happen in a flash on modern broadband anyways so I don't see this as a problem.
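    (For what it's worth, the usual mitigation is not to ship those helpers in the header at all but to pull them in lazily when they are first needed. A minimal TypeScript sketch, where "./heavy-charts" is a made-up module name standing in for any large dependency:)

      // The heavy module is fetched only when the user actually asks for the chart,
      // so it never delays the initial page load.
      async function renderChartOnDemand(container: HTMLElement, data: number[]): Promise<void> {
        const { drawChart } = await import("./heavy-charts"); // hypothetical module
        drawChart(container, data);
      }

      document.querySelector("#show-chart")?.addEventListener("click", () => {
        const el = document.querySelector<HTMLElement>("#chart");
        if (el) void renderChartOnDemand(el, [1, 2, 3]);
      });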

    • Re:the main reason (Score:5, Interesting)

      by Baron_Yam ( 643147 ) on Tuesday March 19, 2024 @10:35PM (#64329641)

      I run ad, script, and tracking blockers on my browsers. It's amazing how much cross-site crap people expect you to just load because the server you went to tells you to.

      As a general rule, if your site breaks because of that I go somewhere else. I wish more people did the same to push everyone back to hosting everything on a page in one place. Own your shit.

      And yeah I'm old, but I miss the days when people worried about efficiency. When I started doing web design as a teen, if you had a bloated page people would get bored waiting and click away. You'd lose traffic. I don't want auto-loading videos. Ever. I don't want auto-starting audio. EVER.

      • >> ad, script, and tracking blockers on my browsers

        I run those too, and as a result webpages load up quickly. Commonly used frameworks and helpers can load in with your webpage, no problem if it is just a couple dozen MB. I write website code myself and I understand how it is.

      • by tepples ( 727027 )

        As a general rule, if your site breaks because of that I go somewhere else.

        Is gracefully falling back to a subscription prompt considered "breaking"? Otherwise, if every viewer were to block ads, how would a website's writers continue to eat?

        • Host your own ads instead of trusting a third party won't insert something malicious deliberately or through incompetence. Or use my loading of their ad as a way to track my activity across all sites that third party serves.

          Ad blockers don't block same-host ads without custom filters.

          • by tepples ( 727027 )

            Host your own ads instead of trusting a third party won't insert something malicious deliberately or through incompetence.

            How does the publisher of a newly launched website go about seeking advertisers to advertise on that website without going through a network or exchange?

    • Re:the main reason (Score:5, Insightful)

      by dryeo ( 100693 ) on Wednesday March 20, 2024 @01:54AM (#64329843)

      20-30 MB of downloads happen in a flash on modern broadband anyways so I don't see this as a problem.

      Not everyone has modern broadband.

      • It also isn't the total amount that is the problem as much as the MANY connections that amount tends to have. The overhead for each connection to a (different) server is vastly more time than the download once the connection is made. So if you have ads and javascript libraries and images and content (each from a different server sometimes!), you are going to see much worse performance than a huge chunk of HTML loaded in one connection session.
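        That connection-setup cost is visible per request in the Resource Timing data, if you want to see it for yourself; a rough sketch (reused connections, and cross-origin resources without Timing-Allow-Origin, will simply report zeros):

          // How much of each resource's wall time went to DNS + TCP/TLS setup?
          for (const e of performance.getEntriesByType("resource") as PerformanceResourceTiming[]) {
            const dns = e.domainLookupEnd - e.domainLookupStart;
            const connect = e.connectEnd - e.connectStart; // includes TLS when present
            const total = e.responseEnd - e.startTime;
            console.log(`${e.name}: ${(dns + connect).toFixed(0)} ms setup of ${total.toFixed(0)} ms total`);
          }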

        • by kackle ( 910159 )
          Amen. I HAD to replace my parents' 2.8 GHz machine: even at 15 years old it could handle everything they wanted it to do, including 3D games for the grandkids, but its single core couldn't keep up with the websites--the browsers got slower and slower.
      • by mjwx ( 966435 )

        20-30 MB of downloads happen in a flash on modern broadband anyways so I don't see this as a problem.

        Not everyone has modern broadband.

        Yep, try opening a web site that downloads 12 MB of crap on a mobile connection in a busy city centre... you'll be waiting ages.

        Terrible internet is still very much a first world problem.

    • by tepples ( 727027 ) <tepples.gmail@com> on Wednesday March 20, 2024 @11:31AM (#64330863) Homepage Journal

      20-30 MB of downloads happen in a flash on modern broadband anyways so I don't see this as a problem.

      20 to 30 MB per page in a flash is a good way to blow through your monthly mobile Internet data transfer quota in a day, leaving you without mobile Internet access for 29 days until your cap resets. (At 25 MB a page, a 2 GB monthly quota is gone in roughly 80 page loads.)

  • by paul_engr ( 6280294 ) on Tuesday March 19, 2024 @10:45PM (#64329649)
    I was clearing my cache a few months ago and I had 135MB of cached data for kfc.com. I've never been to KFC.com, nor have I even been to a KFC in the past decade.
  • Too many web sites fail to show anything/much if Javascript for them is blocked. The result is that I will often go somewhere else.

    JavaScript is great for making a web page slicker, but it should not be needed just to see it. It is done for the web dev's convenience, not for the web surfer's experience. Efficiency seems to be a concept that many developers today do not understand.

    • by tepples ( 727027 )

      Javascript is great to make a web page slicker but should not be needed just to see it.

      How would you make, say, an IRC to HTTP proxy without JavaScript? Would it be practical to require the user to periodically press F5 to refresh the view of the channel to check for new messages sent to the channel?

      • by pacinpm ( 631330 )

        You do it normally with JS and use pressing F5 as a fallback for people who can't/won't use JS. It works worse but still works.
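        As a sketch of that fallback (a hypothetical, minimal channel-log page using Node's built-in http module; not any real proxy's code): serve the log as plain HTML with a meta refresh, and let JS-capable clients poll or use SSE on top of it.

          import { createServer } from "node:http";

          // Hypothetical in-memory buffer standing in for whatever the IRC side collects.
          const messages: string[] = ["<alice> hi", "<bob> hello"];

          const escapeHtml = (s: string) =>
            s.replace(/[&<>]/g, (c) => (c === "&" ? "&amp;" : c === "<" ? "&lt;" : "&gt;"));

          createServer((_req, res) => {
            // Plain HTML that re-fetches itself every 10 seconds: works with JavaScript disabled.
            const body =
              '<!doctype html><meta http-equiv="refresh" content="10"><title>#channel</title><ul>' +
              messages.map((m) => `<li>${escapeHtml(m)}</li>`).join("") +
              "</ul>";
            res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
            res.end(body);
          }).listen(8080);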

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Wednesday March 20, 2024 @02:27AM (#64329859)

    Senior WebDev here.

    I've been building websites and webapps for 24 years now, and the shit we have around today would've gotten people fired on the spot back then. Entire generations who know nothing but bloated VDOM setups and use them to build even the most trivial websites. Pagebuilder bloat loaded with images in print resolution and bullshit "consent management" widgets, pointless trackers and services that load half a GB to show some text and a button. Dimwit deciders who couldn't tell a client from a server and haven't seen a single line of HTML in their entire life. Websites that load 10x the entire Amiga operating system before they can even render a single pixel. ... it's a disaster and one giant stinking mess. And one of the reasons I'm shifting my career as we speak. Too many clueless people calling the shots. Way too frustrating.

    • Fellow curmudgeon here... and yeah, I mostly agree.

      20-25 years ago most of us would've killed for a 1.5Mb T1 line or even a 256k frame relay connection to our house. Even at my place, which is somewhat rural, I have 25Mb down / 3Mb up DSL service.

      Think it is a combination of the commercialization of the web, and folks who do/did graphic design work "on paper" tried to move their absolute layout control over to the web, which caused a ton of javascript and css stuff to get created that as others have noted leads

      • by Merk42 ( 1906718 )
        Do either of you have good examples of websites that meet the expectations of users today, with regards to design and functionality, but are also quite lean?
        • Don't you know how being a curmudgeon works? We don't have to show our work, we just tell you how much better it was in our day! (And, yes, I'm one of those curmudgeons! Been at this for almost 30 years now!)

        • by SpzToid ( 869795 )

          Do either of you have good examples of websites that meet the expectations of users today, with regards to design and functionality, but are also quite lean?

          Most pages on drupal.org [drupal.org] are exemplary per your requirements.

      • 22 1/2 years ago I moved, and went from 56K dial-up to 3 Mbps Cable. The new place also had DSL as an option; the old place had neither.

    • ^^^^^ THIS

      I've been building web properties since 1996 and the web has turned into a massive pile of bloated crap.

      The sizes of the pages I built are positively anemic compared with almost any recent site, or anything that's been built in the last 15 years.

      I'm no longer doing active web development, but it makes me sick to see all the shit that a modern page loads, it's utterly insane.

      • My primary site has a mobile/desktop friendly layout, and the index page weighs in at 450kb. Checked a random page on my site and that was 450kb too. I'd be amazed if any page is more than a megabyte.

        Then again, I write all my html by hand, and use my own homebrew text compiler (ygen2) to generate all the pages from templates and includes. I put the site up in 1999 I think it was, and all pages still use the same URLs I publish them with. None of that content management crap.
        • Then again, I write all my html by hand

          That's what I did too. I wrote literally thousands of pages using nothing more than Metapad and/or Notepad++. All my HTML was lean and mean without any extra garbage. I never used an IDE or anything like that.

          Most CMS apps suck balls (I'm especially looking at you, e107), so I went back to writing all the code myself. All those sites are still up, although traffic has tapered off in the last few years.

          Later I did upgrade a few of the sites to use a CMS, but I wrote the scripts that imported the pages myself and

    • VDOM WAFs were built for sites that need streaming/dynamic content -- client-side pub/sub systems that update the DOM dynamically, basically. They're trying to make HTML browsers look and act like native apps where lots of dynamic data is concerned. I see a lot of mostly static web sites trying to be single-page apps when there's absolutely zero reason to do so. We older folks are always like "hrrrr, what problem are we trying to solve here??" and the answer is pretty much we already decided to use it becau

  • Many social media sites nowadays assume they are always open -- so on the scale of what's important, initial loading time and data use are way down.

  • Just for fun, I ran Wix on a desktop Chromium without an ad-blocker on a decade-old PC. And guess what... nothing extraordinary happened. The page loaded quickly. Maybe because my connection is fast, I dunno.

    There are other websites that would quickly consume all of my RAM, slow my Linux desktop to a crawl, and sometimes even force me to perform a hard reboot to recover. There are other websites that bring with them a gazillion subframes (visible in Chromium Task Manager), each doing something - two

  • I could not find a Firefox add-on/plugin that would display the weight of a tab. Any suggestions? Does it even exist?
  • Pi-hole, Firefox with NoScript and uBlock Origin, and I see very few ads.

    When I implemented those some years ago when I was still on 8Mbit ADSL, I was astonished at how much faster web pages were able to load, albeit with a few empty frames where the ads were supposed to go.

    Now that I've got Starlink it probably wouldn't be such a big difference, but I'm used to the ad-free* experience now.

    *some websites serve the ads from their own domain and I still see some ads when visiting places like distrowatch.

  • by Askmum ( 1038780 ) on Wednesday March 20, 2024 @05:13AM (#64329975)
    With delayed loading that repositions text after it has already loaded and while you are already reading it, forcing you to keep scrolling until the website is finally fully loaded. Modern websites are total crap.
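    That text-jumping-while-you-read behaviour is exactly what the layout-shift metric measures. A sketch that logs it as the page keeps reflowing (the "layout-shift" entry type is a Chromium feature, and its fields aren't in the standard TypeScript DOM types, hence the hand-written type):

      type LayoutShift = PerformanceEntry & { value: number; hadRecentInput: boolean };

      let cls = 0;
      new PerformanceObserver((list) => {
        for (const entry of list.getEntries() as LayoutShift[]) {
          if (!entry.hadRecentInput) cls += entry.value; // ignore shifts caused by user input
        }
        console.log(`Cumulative Layout Shift so far: ${cls.toFixed(3)}`);
      }).observe({ type: "layout-shift", buffered: true });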
  • Developers these days are dependent on various frameworks and libraries to build web apps. Each one can add many megabytes to a page load. Not to mention eye candy and other bells and whistles that don't add anything to the functionality. The use of these resources can usually be optimized (tree shaking), but that requires extra time that may be limited due to release schedules. Don't they realize or care that their users are probably not using the latest and greatest developer laptops on gigabit networks or

    • It's worse when you are aware of the bloat and you can (and want to) avoid bloat on your site, but your boss demands that you use the trendy (and extremely bloated) framework "because it will save time".
  • 15 years ago, I sent an email to the USA Today editor. I used wget to pull an article from their website. I ran it through tidy to clean it up. It was 25% the original size. The rest was unnecessary spaces in the html. I assumed it was either sloppy coding, or a deal with the phone companies to create data overages.
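    The same experiment is easy to repeat today; a rough sketch (Node 18+ global fetch, with naive whitespace collapsing standing in for what tidy did, and example.com as a placeholder URL):

      // Compare a page's raw HTML size against a crudely whitespace-stripped copy.
      const res = await fetch("https://example.com/some-article"); // placeholder URL
      const html = await res.text();
      const collapsed = html.replace(/>\s+</g, "><").replace(/[ \t]+/g, " ");
      const pct = ((collapsed.length / html.length) * 100).toFixed(0);
      console.log(`raw ${html.length} bytes, collapsed ${collapsed.length} bytes (${pct}% of original)`);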
  • A truly great and humorous take on this topic is at https://idlewords.com/talks/we... [idlewords.com]

  • My own company just moved to the online version of Office 365. What that means for me is a 4s lag after I hit Ctrl-F in Word before I can search, then another 4s waiting for it to start searching.

    The odd part is they appeared to do extra work so I could hit Ctrl-F and then start typing immediately, with all of that buffered and processed eventually, rather than entering some void mode where the user has to wait for the prompt to appear in the find window, and only then start accepting typing.

    Which means they

  • Yeah, it is a big problem for Google and very nice news for everybody else: the internet is slowly but steadily transforming into an internet of applications, away from that ancient-as-mammoth-shit HTML standard.
  • by bradley13 ( 1118935 ) on Wednesday March 20, 2024 @10:30AM (#64330653) Homepage

    The favorite excuse - also in the /. comments - is the use of frameworks. Tie in a dozen JS libraries, which themselves tie in a dozen more. Huge attack surface, and lots of security holes - who cares? Hit your users with dozens of trackers - again, who cares?

    I'm in Europe, so I always get the cookie popups. Yesterday, I accidentally clicked "show details" instead of "reject". There was an autogenerated list of the 3rd-party cookies, alphabetical by source. On the screen, I could only see to the letter 'B' - it went on for literally hundreds of entries.

    Do we blame this on clueless managements? Underqualified developers? I don't know, but I do think it's time to take the regulation regarding 3rd-party tracking a step farther: Don't ask for permission to send tracking information to 3rd parties - just make it flat-out illegal.

  • I *DESPISE* the jerks who, to "speed up loading", lazy-load from 20 websites. Like, for example, last night, creating a Bluesky account, where it didn't load the links to the captcha until it was ready for you to prove "you're human". I had to redo the whole damn thing.

    And anyone doing any development who doesn't assume that 90% of visitors do *NOT* have script and link blocking (like NoScript) is a moron, working for an arsehole who tells them to do magic.

  • This cookie consent bullshit is even worse. I have to click every . single . webpage . It needs to die.
    • Right now, 24MB seems wasteful for a single web page.
    • A few years ago, 2MB would have seemed wasteful.
    • Before that, hundreds of KB would have been wasteful.
    • And in the pre-web days applications would be wasteful if they used tens of KB.
    • Or single-digit numbers of KB.
    • In even earlier days, using extra individual bytes was wasteful.
    • People have been known to optimize code just to find individual *bits* of unused space.

    The point is, the problem of creeping bloat has existed ever since the world's second program was

  • Anyway...
