Modern Web Bloat Means Some Pages Load 21MB of Data (tomshardware.com)
Christopher Harper reports via Tom's Hardware: Earlier this month, Danluu.com released an exhaustive 23-page analysis/op-ed/manifesto on the current state of unoptimized web page and web app performance, finding that merely loading a web page can bog down an entry-level device capable of running the popular game PUBG at 40 fps. Wix, for instance, requires loading 21MB of data for a single page, while the better-known Patreon and Threads load 13MB per page. This can result in load times of up to 33 seconds or, in some cases, in the page failing to load at all.
As the testing above shows, some of the most brutally intensive websites include the likes of... Quora, and basically every major social media platform. Newer content-production platforms like Squarespace and newer forum platforms like Discourse also perform significantly worse than their older counterparts, often to the point of unusability on some devices. The Tecno Spark 8C, a prominent entry-level phone common in emerging markets, made a particularly compelling test device. The device is actually quite impressive in some ways, including its ability to run PlayerUnknown's Battlegrounds Mobile at 40 FPS -- but the same device can't even run Quora and suffers nigh-unusable lag when scrolling on social media sites.
That example is probably the best summation of the overall point: modern web and app design increasingly rests on an unrealistic assumption of ever-increasing bandwidth and processing power. Quora is a website where people answer questions -- there is absolutely no reason any of these websites should be harder to run than a battle royale game.
Q? (Score:5, Insightful)
the most brutally intensive websites include the likes of... Quora
Why is this website still around? All it seems to do is degrade search results.
Re:Q? (Score:5, Insightful)
Re:Q? (Score:4, Interesting)
A related question: why does Google still keep Quora around by including it in search results?
I get solicited by Google periodically asking me to please contribute to Quora. I suppose they need more input to train their AI on.
Re: (Score:3)
Google only cares what advertisements you see above the results. I use the uBlocklist extension for this very reason.
Re: Q? (Score:3)
If a device can't load Quora, that should be listed as a feature.
250kbytes in 2003 (Score:2)
That it has grown by only a factor of 10 to 100 since then is an amazingly small amount of growth when bandwidth to the user is up 100 to 1000 times.
Re: (Score:1)
Re:250kbytes in 2003 (Score:5, Insightful)
Many sites continue to limit lists to about 5 items, forcing the user to click "next page" dozens of times to see everything. All to save a few kilobytes on a web that is many megabytes.
Uhh, did you not realize those listicles are limited so that you have to generate clicks? They measure engagement and harvest your eyeball-attention data from those clicks. Even with JavaScript they don't get much data from you scrolling through a page, but make it clickable and they can behaviorally profile your interests, your consumption, and maybe even the person using the mouse -- depending on whether we're talking Cambridge Analytica/Palantir-level magic or, if you don't believe in that BS, just a good guess. Either way, listicles have not a single thing to do with data/bandwidth saving.
Re: (Score:2)
I'll take that over load-as-you-scroll any day.
Slashdot (Score:3)
How much does Slashdot download?
How about Ars Technica, from where most of the Slashdot content comes?
Re: (Score:2)
My own scratch-built (i.e., I wrote the framework) dynamic site for a customer averages a 30kB full load and under 170ms total load time on a first visit.
Inexcusable for forums to take up so much. (Score:5, Insightful)
I guess if there's any place to point out that forums worked fine over 300bps connections, it's here. Although the upgrade to 1200bps was admittedly a welcome change.
Re: Inexcusable for forums to take up so much. (Score:2)
300 bps? Try 75 baud upload speed on the Minitel. With only 7 bits. With 2- or 3-byte videotex characters, the modem could not keep up with a typing speed of just 3 keystrokes per second. At one point I could type 120 wpm. Way faster than the modem could handle.
It's either ... (Score:3)
Or lazy developers who pull in megabyte JavaScript libraries just to use one function. Which calls a bunch of tracking code.
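To make that concrete, here is a minimal TypeScript sketch of the alternatives (lodash-es really does ship per-function modules; the save handler is a made-up stand-in):

    // Heavy: pulls the whole library into the bundle just for one function.
    // import _ from 'lodash';
    // const debouncedSave = _.debounce(save, 300);

    // Lighter: import only the one function you actually use.
    import debounce from 'lodash-es/debounce';

    function save() { /* hypothetical handler */ }
    const debouncedSave = debounce(save, 300);

    // Lightest: a few lines of your own, zero dependencies, zero trackers.
    function myDebounce<A extends unknown[]>(fn: (...args: A) => void, ms: number) {
      let timer: ReturnType<typeof setTimeout> | undefined;
      return (...args: A) => {
        clearTimeout(timer);
        timer = setTimeout(() => fn(...args), ms);
      };
    }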
Re: (Score:3)
I know it has been an extension for yonks, but I recently installed NoScript. Sure, it's a pain to whitelist essential JavaScript on the couple of dozen sites I regularly visit.
But I'm doing my bit to fight climate change, saving a couple of kWh a year by not processing garbage from random URLs. :)
Re: (Score:2)
> But I'm doing my bit to fight climate change, saving a couple of kWh a year by not processing garbage from random URLs. :)
Then you might be interested in this:
https://addons.mozilla.org/en-... [mozilla.org]
Also available for Chrome...
Re: (Score:3)
Or lazy developers who pull in megabyte JavaScript libraries just to use one function. Which calls a bunch of tracking code.
Both
more importantly (Score:3)
Install Privacy Badger and uBlock Origin (Score:5, Insightful)
Both addons give you a live total of the number of connections blocked. It's quite staggering what some web sites try to load. Privacy Badger works very well to reduce the bloat, and nearly all sites I've ever seen work just fine without all the superfluous trackers. If you stay on some sites long enough, Privacy Badger will block thousands of connections.
Both of these addons are essential to using the web safely these days. To me most web sites are unusable without them.
Re: (Score:2)
Re: (Score:2)
Ad Blocker (Score:5, Interesting)
A good study of web site bloat would also report on how much bandwidth ad blockers save. And comparing various ad blockers would give a good metric as to how effective they are.
And as someone else suggested, also measure upload data, especially with and without ad blockers.
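That measurement is easy to script. A minimal Puppeteer sketch (assuming Node with the puppeteer package installed; the URL is a placeholder) that counts actual bytes on the wire via the DevTools protocol -- run it once in a vanilla profile and once with the blocker loaded, then diff the totals:

    import puppeteer from 'puppeteer';

    // Sum compressed (on-the-wire) bytes for one page load.
    async function pageWeight(url: string): Promise<number> {
      // To compare against an ad blocker, launch once more with e.g.
      // puppeteer.launch({ headless: false, args: ['--load-extension=/path/to/blocker'] })
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      const cdp = await page.createCDPSession();
      await cdp.send('Network.enable');

      let bytes = 0;
      // encodedDataLength is the size as transferred, i.e. after gzip/brotli.
      cdp.on('Network.loadingFinished', (e) => { bytes += e.encodedDataLength; });

      await page.goto(url, { waitUntil: 'networkidle0' });
      await browser.close();
      return bytes;
    }

    pageWeight('https://example.com').then((b) => console.log(b, 'bytes transferred'));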
Re: (Score:2)
Now try with an adblocker (Score:3)
Re: (Score:3)
I never leave 127.0.0.1 without them.
My WASM page (Score:3)
I was freaking out because my WASM page using Emscripten, SDL2, etc. was a little over 4MB of wasm and .js loaders just to show some basic spinning triangles. So I spent a bunch of time getting it down to 100kB, all because I felt 4MB was an excessive amount to download to try my dumb little toy apps. No actual business reason, even; just politeness. Oh well.
Re: (Score:2)
Dear peer, I salute you.
Framework and dependency bloat is real (Score:1)
the main reason (Score:2)
The main reason for the bloat is that huge frameworks and helpers are downloaded in the head of the page before any code even gets run. These are frequently served for free by CDNs, so the downside for the website coder is nil, but the upside is a much easier job.
20-30 MB of downloads happen in a flash on modern broadband anyway, so I don't see this as a problem.
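For what it's worth, nothing forces the framework into the head. A hedged sketch of the alternative, pulling a heavy dependency in on demand with a dynamic import() ('./chart.js' and the element IDs are hypothetical):

    // Nothing framework-sized is fetched at page load; the browser only
    // downloads the heavy module when the user actually wants the feature.
    document.querySelector('#show-chart')?.addEventListener('click', async () => {
      const { renderChart } = await import('./chart.js'); // hypothetical heavy module
      renderChart(document.querySelector('#chart-container'));
    });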
Re: (Score:3)
I run ad, script, and tracking blockers on my browsers. It's amazing how much cross-site crap people expect you to just load because the server you went to tells you to.
As a general rule, if your site breaks because of that I go somewhere else. I wish more people did the same to push everyone back to hosting everything on a page in one place. Own your shit.
And yeah I'm old, but I miss the days when people worried about efficiency. When I started doing web design as a teen, if you had a bloated page peop
Re: (Score:2)
>> ad, script, and tracking blockers on my browsers
I run those too, and as a result webpages load quickly. Commonly used frameworks and helpers can load in with your webpage, no problem, even if it is just a couple dozen MB. I write website code myself and I understand how it is.
Re: (Score:2)
20-30 MB of downloads happen in a flash on modern broadband anyway, so I don't see this as a problem.
Not everyone has modern broadband.
Re: (Score:2)
Text resources are usually sent gzip-compressed and unpacked by the client. The original screed covers this, but it's somewhat buried, and it also seems to be overlooked in a lot of the discussion. I would expect that in your example "transferred" is just that, the compressed bits over the network, while "resources" is how large the data unpacks to.
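Node's built-in zlib shows the gap directly; a toy illustration (the markup string is made up, but repetitive text like real HTML compresses about this well):

    import { gzipSync } from 'node:zlib';

    // Repetitive text, like real HTML/JS, compresses heavily.
    const resource = '<div class="item">hello</div>\n'.repeat(10_000);
    const wire = gzipSync(resource);

    console.log('resources (unpacked):  ', resource.length, 'bytes');
    console.log('transferred (gzipped): ', wire.length, 'bytes');
    // Expect a ratio well over 10:1 for markup like this.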
That's not even bad (Score:2)
Re: (Score:3)
Too much Javascript (Score:2)
Too many web sites fail to show anything (or much of anything) if JavaScript is blocked. The result is that I will often go somewhere else.
JavaScript is great for making a web page slicker, but it should not be needed just to see the page. It is done for the web dev's convenience, not for the web surfer's experience. Efficiency seems to be a concept that many developers today do not understand.
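The old-school answer is progressive enhancement: the page works as plain HTML, and script merely upgrades it. A minimal sketch (the #search form, #results element, and server-rendered endpoint are hypothetical):

    // Without JS, the form submits normally and the server renders the results page.
    // With JS, we intercept the submit and swap the results in without a full reload.
    const form = document.querySelector<HTMLFormElement>('#search');
    if (form) {
      form.addEventListener('submit', async (ev) => {
        ev.preventDefault();
        const params = new URLSearchParams();
        new FormData(form).forEach((value, key) => params.append(key, String(value)));
        const res = await fetch(`${form.action}?${params}`);
        document.querySelector('#results')!.innerHTML = await res.text();
      });
    }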
Oh, don't get me started ... (Score:5, Informative)
Senior WebDev here.
I've been building websites and webapps for 24 years now, and the shit we have around today would've gotten people fired on the spot back then. Entire generations who know nothing but bloated VDOM setups and use them to build even the most trivial websites. Pagebuilder bloat loaded with images in print resolution, bullshit "consent management" widgets, pointless trackers, and services that load half a GB to show some text and a button. Dimwit deciders who couldn't tell a client from a server and haven't seen a single line of HTML in their entire lives. Websites that load 10x the size of the entire Amiga operating system before they can even render a single pixel. It's a disaster and one giant stinking mess, and one of the reasons I'm shifting my career as we speak. Too many clueless people calling the shots. Way too frustrating.
Always Open (Score:2)
Many social media sites nowadays assume they are always open - so on the scale of what's important, initial loading time and data use are way down.
Maybe the problem is elsewhere (Score:2)
Just for fun, I loaded Wix in desktop Chromium without an ad blocker on a decade-old PC. And guess what... nothing extraordinary happened. The page loaded quickly. Maybe because my connection is fast, I dunno.
There are other websites that would quickly consume all of my RAM, slow my Linux desktop to a crawl, and sometimes even force me to perform a hard reboot to recover. There are other websites that load a gazillion subframes (visible in Chromium's Task Manager), each doing something - two
Any Firefox Extension Displaying Weight? (Score:2)
Blockers make a difference (Score:2)
Pihole, firefox with noscript and ublock origin, and I see very few ads.
When I implemented those some years ago when I was still on 8Mbit ADSL, I was astonished at how much faster web pages were able to load, albeit with a few empty frames where the ads were supposed to go.
Now that I've got Starlink it probably wouldn't be such a big difference, but I'm used to the ad-free* experience now.
*some websites serve the ads from their own domain and I still see some ads when visiting places like distrowatch.
Modern websites are crap anyway (Score:2)
Blame it on JS frameworks and libraries (Score:2)
Developers these days are dependent on various frameworks and libraries to build web apps. Each one can add many megabytes to a page load. Not to mention eye candy and other bells and whistles that add nothing to the functionality. The use of these resources can usually be optimized (tree shaking), but that requires extra time that may be limited due to release schedules. Don't they realize or care that their users are probably not using the latest and greatest developer laptops on gigabit networks or
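A footnote on the tree-shaking point: it only works when the bundler can see static, named imports. A hypothetical two-file sketch:

    // utils.ts -- side-effect-free named exports are what a bundler can prune
    export function formatDate(d: Date): string {
      return d.toISOString().slice(0, 10);
    }
    export function slugify(s: string): string {
      return s.toLowerCase().replace(/\s+/g, '-');
    }

    // app.ts -- only slugify lands in the bundle; formatDate is shaken out
    import { slugify } from './utils';
    console.log(slugify('Modern Web Bloat'));

    // CommonJS-style require() defeats the analysis, so the whole module is kept:
    // const utils = require('./utils');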