Modern Web Bloat Means Some Pages Load 21MB of Data (tomshardware.com) 48

Christopher Harper reports via Tom's Hardware: Earlier this month, Danluu.com released an exhaustive 23-page analysis/op-ed/manifesto on the current status of unoptimized web pages and web app performance, finding that just loading a web page can even bog down an entry-level device that can run the popular game PUBG at 40 fps. In fact, the Wix webpage requires loading 21MB of data for one page, while the more famous websites Patreon and Threads load 13MB of data for one page. This can result in slow load times that reach up to 33 seconds or, in some cases, result in the page failing to load at all.

As the testing above shows, some of the most brutally intensive websites include the likes of... Quora, and basically every major social media platform. Newer content production platforms like Squarespace and newer forum platforms like Discourse also have significantly worse performance than their older counterparts, often to the point of unusability on some devices. The Tecno S8C, one of the prominent entry-level phones common in emerging markets, proved a particularly compelling test device. The device is actually quite impressive in some ways, including its ability to run PlayerUnknown's Battlegrounds Mobile at 40 FPS -- but the same device can't even run Quora and experiences nigh-unusable lag when scrolling on social media sites.

That example is most likely the best summation of the overall point, which is that modern web and app design is increasingly trending toward an unrealistic assumption of ever-increasing bandwidth and processing. Quora is a website where people answer questions -- there is absolutely no reason any of these websites should be harder to run than a Battle Royale game.


Comments Filter:
  • Q? (Score:5, Insightful)

    by dohzer ( 867770 ) on Tuesday March 19, 2024 @08:06PM (#64329383)

    the most brutally intensive websites include the likes of... Quora

    Why is this website still around? All it seems to do is degrade search results.

I remember limiting database forms, queries, edit pages, and reporting to about 250 kbytes per page to remain snappy across the T1/OC3 corporate WAN. These were very data-dense things, about the same as a ServiceNow page today, minus today's style and formatting options.

That it has only grown by a factor of 10 to 100 over the past decade is an amazingly small amount of growth when bandwidth to the user is up 100 to 1000 times.
    • by rogersc ( 622395 )
      Many sites continue to limit lists to about 5 items, forcing the user to click next pages dozens of times to see everything. All to save a few kilobytes in a web that is many megabytes.
      • by sixoh1 ( 996418 ) on Tuesday March 19, 2024 @10:17PM (#64329615) Homepage

        Many sites continue to limit lists to about 5 items, forcing the user to click next pages dozens of times to see everything. All to save a few kilobytes in a web that is many megabytes.

Uhh, did you not realize those listicles are limited so that you have to generate clicks? They measure engagement and harvest your eyeball-attention data from those clicks. Even with JavaScript they don't get much data from you scrolling through a page, but make it clickable and they can behaviorally profile your interest, your consumption, and maybe even the person using the mouse -- depending on whether we're talking Cambridge Analytica/Palantir-level magic, or just a good guess if you don't believe in the BS. Either way, listicles have not a single thing to do with data/bandwidth saving.

      • by AvitarX ( 172628 )

        I'll take that over loads as I scroll any day.

  • by cstacy ( 534252 ) on Tuesday March 19, 2024 @08:18PM (#64329409)

    How much does Slashdot download?
    How about Ars Technica, from where most of the Slashdot content comes?

  • by russotto ( 537200 ) on Tuesday March 19, 2024 @08:26PM (#64329419) Journal

    I guess if there's any place to point out that forums worked fine over 300bps connections, it's here. Although the upgrade to 1200bps was admittedly a welcome change.

300 bps? Try 75 baud upload speed on the Minitel. With only 7 bits. With 2- or 3-byte videotex characters, the modem could not keep up with a typing speed of just 3 keystrokes per second. At one point I could type 120 wpm -- way faster than the modem could handle.

  • by PPH ( 736903 ) on Tuesday March 19, 2024 @08:36PM (#64329435)

    ... tracking code.

    Or lazy developers who pull in megabyte JavaScript libraries just to use one function. Which calls a bunch of tracking code.

I know it has been an extension for yonks, but I recently installed NoScript. Sure, it's a pain to whitelist essential JavaScript on the couple of dozen sites I regularly visit.

But I'm doing my bit in fighting climate change, saving a couple of kWh a year by not processing garbage from random URLs. :)

    • ... tracking code.

      Or lazy developers who pull in megabyte JavaScript libraries just to use one function. Which calls a bunch of tracking code.

      Both
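On the "megabyte library for one function" point above: a minimal sketch of the alternative, which is hand-rolling the one helper you actually need. The `chunk` function below is a hypothetical stand-in for a typical utility-library import; a dozen lines of plain JavaScript replaces the whole dependency (and whatever tracking code rides along with it).

```javascript
// Hypothetical stand-in for a utility-library function: split an
// array into chunks of a given size. When this is the only thing
// you use from a multi-hundred-kilobyte library, a few lines of
// plain JavaScript does the same job with zero download cost.
function chunk(array, size) {
  if (size < 1) throw new RangeError("size must be >= 1");
  const result = [];
  for (let i = 0; i < array.length; i += size) {
    result.push(array.slice(i, i + size));
  }
  return result;
}

console.log(chunk([1, 2, 3, 4, 5], 2)); // [ [ 1, 2 ], [ 3, 4 ], [ 5 ] ]
```

Obviously this doesn't scale to every dependency, but it illustrates the trade-off the comment is making: convenience for the developer versus payload for every visitor.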

  • by spaceman375 ( 780812 ) on Tuesday March 19, 2024 @08:39PM (#64329441)
    They should have measured how much data each site uploads.
  • by caseih ( 160668 ) on Tuesday March 19, 2024 @08:46PM (#64329449)

Both addons give you a live total of the number of connections blocked, and it's quite staggering what some web sites try to load. Privacy Badger works very well to reduce the bloat, and nearly all sites I've seen work just fine without all the superfluous trackers. Stay on some sites long enough and Privacy Badger will block thousands of connections.

    Both of these addons are essential to using the web safely these days. To me most web sites are unusable without them.

  • Ad Blocker (Score:5, Interesting)

    by crow ( 16139 ) on Tuesday March 19, 2024 @08:46PM (#64329451) Homepage Journal

    A good study of web site bloat would also report on how much bandwidth ad blockers save. And comparing various ad blockers would give a good metric as to how effective they are.

    And as someone else suggested, also measure upload data, especially with and without ad blockers.

I agree. Every page I open now seems to have animated ads or run some sort of video, either in the page or on an overlaid minipage. I am pretty sure that is where most of the volume is going. Since ads pay for the pages, I am not sure what we can really do. Ad blockers are a possibility, but you can be sure that the people who own the websites will do everything they can to defeat them.
  • by e065c8515d206cb0e190 ( 1785896 ) on Tuesday March 19, 2024 @08:48PM (#64329455)
    say Ghostery + uBlock Origin
  • by OrangeTide ( 124937 ) on Tuesday March 19, 2024 @09:06PM (#64329503) Homepage Journal

I was freaking out because my WASM page using Emscripten, SDL2, etc. was a little over 4MB of wasm and .js loaders just to show some basic spinning triangles. So I spent a bunch of time getting it down to 100kB, all because I felt that 4MB was an excessive amount to download to try my dumb little toy apps. No actual business reason even, just politeness. Oh well.

My company uses Flutter as its framework and our site with fonts is over 20MB; trying to do cache invalidation on the client is a nightmare. Our devs say there's nothing they can do to really reduce the package size, as the majority of the dependencies come from our vendors' software development kits. The bloat is amazing; devs see no reason to optimize because everyone is on 4G/LTE mobile or 100 Mbit connections, so data transfer isn't a concern. I'm glad we're starting to reach a point where clients are starting
The main reason for the bloat is that huge frameworks and helpers are downloaded in the header of a website before any code even gets run. These are frequently served for free by CDNs, so the downside for the website coder is nil, but the upside is a much easier job.

    20-30 MB of downloads happen in a flash on modern broadband anyways so I don't see this as a problem.

    • I run ad, script, and tracking blockers on my browsers. It's amazing how much cross-site crap people expect you to just load because the server you went to tells you to.

      As a general rule, if your site breaks because of that I go somewhere else. I wish more people did the same to push everyone back to hosting everything on a page in one place. Own your shit.

      And yeah I'm old, but I miss the days when people worried about efficiency. When I started doing web design as a teen, if you had a bloated page peop

      • >> ad, script, and tracking blockers on my browsers

        I run those too, and as a result webpages load up quickly. Commonly used frameworks and helpers can load in with your your webpage, no problem if it is a just couple dozen MB. I write website code myself and I understand how it is.

    • by dryeo ( 100693 )

      20-30 MB of downloads happen in a flash on modern broadband anyways so I don't see this as a problem.

      Not everyone has modern broadband.
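The "frameworks downloaded in the header" pattern criticized in the thread above has a well-known alternative: defer heavy dependencies until a code path actually needs them. A minimal sketch of the memoized-loader idea, with everything hypothetical (`loadCharts` stands in for a real `import("./charts.js")`, simulated here so the snippet is self-contained):

```javascript
// Defer a heavy dependency until first use instead of loading it
// in the page header. `once` wraps a loader so the underlying
// fetch happens at most one time, even with concurrent callers.
function once(loader) {
  let promise = null;
  return () => (promise ??= loader()); // reuse the same in-flight promise
}

let loads = 0; // counts how often the "library" is actually fetched
const loadCharts = once(async () => {
  loads += 1; // real code would instead: return import("./charts.js");
  return { draw: () => "drawn" };
});

// Both calls share one load: the library is fetched at most once,
// and only when something actually needs it.
Promise.all([loadCharts(), loadCharts()]).then(([a, b]) => {
  console.log(loads, a === b); // 1 true
});
```

Pages built this way still pay for the dependency eventually, but the initial render no longer waits on 20-30 MB of framework code, which is exactly the case the slower-connection reply is complaining about.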

I was clearing my cache a few months ago and I had 135MB of cached data for kfc.com. I've never been to kfc.com, nor have I even been to a KFC in the past decade.
    • by nickovs ( 115935 )
      Perhaps this is why so many people in the USA are overweight. It's not the fried chicken that they eat; it's the billion bits that go with it!
Too many web sites fail to show anything (or much) if JavaScript for them is blocked. The result is that I will often go somewhere else.

    JavaScript is great for making a web page slicker, but it should not be needed just to see the page. It is done for the web dev's convenience, not for the web surfer's experience. Efficiency seems to be a concept that many developers today do not understand.

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Wednesday March 20, 2024 @02:27AM (#64329859)

    Senior WebDev here.

I've been building websites and webapps for 24 years now, and the shit we have around today would've gotten people fired on the spot back then. Entire generations who know nothing but bloated VDOM setups and use them to build even the most trivial websites. Pagebuilder bloat loaded with print-resolution images, bullshit "consent management" widgets, pointless trackers, and services that load half a GB to show some text and a button. Dimwit deciders who couldn't tell a client from a server and haven't seen a single line of HTML in their entire lives. Websites that load 10x the size of the entire Amiga operating system before they can even render a single pixel. ... It's a disaster and one giant stinking mess, and one of the reasons I'm shifting my career as we speak. Too many clueless people calling the shots. Way too frustrating.

Many social media sites nowadays assume they are always open -- so on the scale of what's important, initial loading time and data use is way down.

  • Just for fun, I ran Wix on a desktop Chromium without an ad-blocker on a decade-old PC. And guess what... nothing extraordinary happened. The page loaded quickly. Maybe because my connection is fast, I dunno.

    There are other websites that would quickly consume all of my RAM, slow my Linux desktop to a crawl, and sometimes even force me to perform a hard reboot to recover. There are other websites that load with themselves a gazillion of subframes (visible in Chromium Task Manager) each doing something - two

I could not find a Firefox add-on/plugin that would display the weight of a tab. Any suggestions? Does it even exist?
Pi-hole, Firefox with NoScript and uBlock Origin, and I see very few ads.

    When I implemented those some years ago when I was still on 8Mbit ADSL, I was astonished at how much faster web pages were able to load, albeit with a few empty frames where the ads were supposed to go.

    Now that I've got Starlink it probably wouldn't be such a big difference, but I'm used to the ad-free* experience now.

    *some websites serve the ads from their own domain and I still see some ads when visiting places like distrowatch.

Delayed loading repositions text that has already loaded while you are reading it, forcing you to keep scrolling until the website has finally finished loading. Modern websites are total crap.
Developers these days are dependent on various frameworks and libraries to build web apps. Each one can add many megabytes to a page load. Not to mention eye candy and other bells and whistles that don't add anything to the functionality. The use of these resources can usually be optimized (tree shaking), but that requires extra time that may be limited due to release schedules. Don't they realize or care that their users are probably not using the latest and greatest developer laptops on gigabit networks or
