
Front-End Developer Decries 'Garbage' Design Choices on 'The Bullshit Web' (pxlnv.com)

"Ever wondered why pages seem to load slower and slower? Or why it is that browsing seems to take just as long to load a page, even though your broadband connection doubled in speed a couple of months ago?" gb7djk, a long-time Slashdot reader, blames "the bullshit web" -- as described in this essay by Calgary-based front-end developer Nick Heer (who does his testing on a 50 Mbps connection). A story at the Hill took over nine seconds to load; at Politico, seventeen seconds; at CNN, over thirty seconds. This is the bullshit web... When I use the word "bullshit" in this article, it isn't in a profane sense. It is much closer to Harry Frankfurt's definition in On Bullshit: "It is just this lack of connection to a concern with truth -- this indifference to how things really are -- that I regard as of the essence of bullshit...." The average internet connection in the United States is about six times as fast as it was just ten years ago, but instead of making it faster to browse the same types of websites, we're simply occupying that extra bandwidth with more stuff. Some of this stuff is amazing.... But a lot of the stuff we're seeing is a pile-up of garbage on seemingly every major website that does nothing to make visitors happier -- if anything, much of this stuff is deeply irritating and morally indefensible.

Take that CNN article, for example. Here's what it contained when I loaded it:

- Eleven web fonts, totalling 414 KB
- Four stylesheets, totalling 315 KB
- Twenty frames
- Twenty-nine XML HTTP requests, totalling about 500 KB
- Approximately one hundred scripts, totalling several megabytes -- though it's hard to pin down the number and actual size because some of the scripts are "beacons" that load after the page is technically finished downloading.

The vast majority of these resources are not directly related to the information on the page, and I'm including advertising... In addition, pretty much any CNN article page includes an autoplaying video... Also, have you noticed just how many websites desperately want you to sign up for their newsletter?

The essay also deals harshly with AMP, "a collection of standard HTML elements and AMP-specific elements on a special ostensibly-lightweight page that needs an 80 kilobyte JavaScript file to load correctly....required by the AMP spec to be hotlinked from cdn.ampproject.org, which is a Google-owned domain. That makes an AMP website dependent on Google to display its basic markup, which is super weird for a platform as open as the web."

It argues AMP is only speedier "because AMP restricts the kinds of elements that can be used on a page and severely limits the scripts that can be used," calling it a pseudo-solution. "Better choices should be made by web developers to not ship this bullshit in the first place.... An honest web is one in which the overwhelming majority of the code and assets downloaded to a user's computer are used in a page's visual presentation, with nearly all the remainder used to define the semantic structure and associated metadata on the page."

Comments Filter:
  • by Anonymous Coward on Saturday August 04, 2018 @08:48PM (#57071056)

    The standards are not really that bad (well, maybe CSS isn't great). But the problem is what happens when the graphics and art people get done with a site. For example, I built a simple, in-house web tool for a customer. It had a bit of JavaScript (which I minified) because it was doing lots of processing, but it was maybe ~120K to load everything. I didn't even bother with stuff like jQuery, since the set of browsers I'm running on is well controlled.

    But it didn't look good. I'm color blind and have no interest in making things aesthetic; I made it functional. The powers that be wanted it made pretty. It's now ~2MB because it loads fonts, jQuery, Bootstrap and all sorts of stuff from all sorts of external servers (see the sketch below), which defeats a requirement that this work even when the Internet connection is down. The server for this content is an embedded device that the web interface controls, so there is no need for external Internet access whatsoever.

    But that's no longer the case now that the 'web fartists' have been through it. Perhaps people should worry far less about visual appearance and far more about functionality. It's not as if you can't make a visually appealing site without piles and piles of third-party libraries.

    Once the development culture of hammering in all sorts of third-party stuff wherever it will fit is dissolved, the web will come back to its original vision.
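
    A rough sketch of the difference being described, with hypothetical filenames and example domains (none of these are from the poster's actual project):

        <!-- Bloated version: every asset hotlinked from third-party servers,
             so the page breaks the moment the Internet connection is down -->
        <link rel="stylesheet" href="https://cdn.example-cdn.com/bootstrap/4.1/bootstrap.min.css">
        <script src="https://cdn.example-cdn.com/jquery/3.3/jquery.min.js"></script>
        <link rel="stylesheet" href="https://fonts.example.com/css?family=FancyFont">

        <!-- Self-contained version: assets vendored on the embedded device itself,
             so the interface keeps working with no external Internet at all -->
        <link rel="stylesheet" href="/static/site.css">
        <script src="/static/app.min.js" defer></script>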

    • Massive misnomer (Score:5, Insightful)

      by Anonymous Coward on Saturday August 04, 2018 @09:46PM (#57071276)

      To me the "content" is the useful bit I'm coming to see. For a news site, typically the actual write-up conveying some tidbit of news. Even that is debatable, as some news outlets (*cough* bbc *cough*) have a habit of writing a longish text that says the same thing three times over and wastes your time never actually adding much extra information, certainly not worth the while reading the text beyond the first few paragraphs. The trick is to scan for this repetition and stop reading in a timely fashion. Sometimes there's a picture but more and more often it's a stock photo that has no direct connection to the incident. The more the site has a habit of showing big-ish pictures ("optimised for tablets"), the more likely it is the pictures are stock.

      For, say, Twitter, the actual useful payload is "up to 140 characters". Okay, it's 280 now, and it took them years to get there when all it should have taken was adjusting a single value. Shows how well that thing is designed, but I digress. Just wget a tweet: the unadorned (and quite useless) bare HTML is on the order of 30 kB. Now load that same tweet in a browser and watch the network waterfall: it's easily a couple of megabytes. That's a very large cruft multiplier.

      I don't count that cruft as content. So the people who put that in there are not "content authors". I tend to simply call 'em web monkeys.

      The problem, as you rightly point out (and it's been endemic since the Eternal September), is exactly the wish to create an often literally picture-perfect website "experience". As in: the webmonkey gets handed a picture and orders to make the website "experience" look identical across all browsers.

      And then it gets worse, reinventing interfaces in ways that aren't useful to me but just happen to be popular. The dreaded "automatically load more as you scroll down", for one, turns the scrollbar into a nuisance instead of a useful tool, causing me to curse webmonkeys yet again as a bunch of PHP-grade tools take my useful and expected tools away. And so on, and so forth.

      HTML was never designed to do that "experience" thing, so the webmonkey "just has to" muck around with all sorts of bullshit to make it limp along somewhat. I don't agree that the problem isn't in the standards; the standards are where the problems got set in stone. But the root cause is that HTML is (a rather inept attempt at providing) a vehicle to convey content that might end up rendered differently across devices, not "experience". That was supposed to be a feature, but the webmonkeys oversold "experience", and providing it has become a rather large industry full of crap, crud, and idiot webmonkeys.

      An entire industry providing only homeopathic concentrations of what I think of as "content".

    • by Anonymous Coward on Saturday August 04, 2018 @09:49PM (#57071286)

      Good-looking design and functionality aren't mutually exclusive.

      That said, if you need much more than out-of-the-box HTML5 and CSS3 (and an occasional SVG) to make your application look good, you suck at design.

      I'm shocked at how many developers (both the 'fartists' you describe and grognardian 'senior' devs) have basically no clue what the browser gives them for free (see the sketch below).

      The majority of web developer interviews I've been on consist almost entirely of JavaScript algorithm efficiency and cleverness, and almost nothing on the baseline HTML5 spec.
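
      A small illustration of what the browser gives you for free, no JavaScript or third-party widgets required (the specific elements are just examples, not anything the poster named):

        <!-- Collapsible section: no accordion plugin needed -->
        <details>
          <summary>Show advanced options</summary>
          <p>... options here ...</p>
        </details>

        <!-- Built-in client-side validation: no form-validation library needed -->
        <form>
          <input type="email" required placeholder="you@example.com">
          <button>Subscribe</button>
        </form>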

      • Re: (Score:3, Insightful)

        by Narcocide ( 102829 )

        Really? They seem more interested in counting my gray hairs these days than evaluating my "algorithm efficiency."

        • by rtb61 ( 674572 )

          Well, I'll ignore your gray hairs and I'll do my own algorithm efficiency metrics, using https://noscript.net/ [noscript.net]. If your page is running a bit shite, I'll check out the scripts and start killing them off one by one until the page loads smoothly with most of the content intact, even approved ad sources. Seems like a lot of work for one web site, but many of those shite slow-loading scripts are all over the place; kill it once to kill it many times, and of course come back to that site in future and it will load

          • I used to use NoScript for years, until it broke in the Great Firefox Plugin Breakage (it got mostly fixed a couple of weeks later).
            Then I started using uMatrix [github.com]. It took some time to understand how to use it properly, but now that I do, I don't look back. It's far superior to NoScript, except maybe for some of the click-jacking stuff that NoScript can block (although I got mostly false positives with that, so it was more annoying than useful).

            One thing I really didn't like about NoScript is that if I unblocked

            • I agree on the value of uMatrix. Right now, for example, it is blocking 14 items from Slashdot. I can't imagine browsing the web without it or something similar.

              When I sometimes see people browse the web without it I am a bit shocked at what most web pages look like. Also, once I posted a link to a reasonable site without knowing that there was advertising-supplied "your computer has been infected" junk there -- because I usually just don't see junk like that.

              For personal web surfing, I mostly use a not-too

        • by Megol ( 3135005 ) on Sunday August 05, 2018 @07:53AM (#57072784)

          Sucks for me then - got my gray hairs at 15.

          And isn't the most efficient algorithm the one that needn't be run? Much of the javascript on the web isn't necessary at all.

    • Wrong. (Score:5, Insightful)

      by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Saturday August 04, 2018 @11:57PM (#57071680)

      Perhaps people should worry far less about visual appearance and far more about functionality.

      Wrong.

      Professionals should care about, and be able to handle, both. Aesthetics isn't that difficult, and if someone doesn't care how a website looks, they still have some work to do before they can call themselves a professional webdev. Not caring about aesthetics is just as bad as not caring about whether your stuff is processed client-side or server-side. That doesn't mean they have to do screens all day, but it does mean they should know how it's done, to a certain degree. Just as any screen designer should know what state, focus and context are, and the difference between a value and a variable.

      If someone can only build a lean website that can't help looking like shite, they're part of the problem. The far smaller part, but still a part. The others are people who build neat screen designs but wouldn't know a client from a server and think that because they are OK in print they're good on the web. Then there is that massive blob in between who are bad at both and still claim to be professionals.

      Bottom line: learn some basics about design. It's really not that difficult, and knowing some general things about modern Bauhaus minimalism shouldn't be too much work for any decent developer.

      • Am I the only remaining person on the internet who thinks that the word 'wrong', on a line by itself, as the opening salvo in a war of absolute moral superiority, is just painfully rude?

    • people should worry far less about visual appearance and far more about functionality

      In other words, find an ugly girl who can cook.

    • Do browsers NOT cache static content anymore? Or, in copyright frenzy, do sites mark & tag shit like jQuery, webfonts, etc. as the browser-equivalent of "copy-never" to explicitly prevent it from being cached (even within the same site, let alone sites linked to the same static content [like jQuery, Bootstrap, etc.])?

      It seems like the kind of stuff that *used* to be aggressively cached (across multiple sites, no less) by browsers like MSIE & Firefox now LITERALLY gets fetched over and over again.

      • I've seen some sites that put a random number in a ?r=23423423423 for each request in order to bypass any cache so that they can do click counting.
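
        A rough sketch of the two patterns, with made-up filenames (the Cache-Control value is a standard HTTP header; everything else here is illustrative):

          <!-- Cache-busting: a random query string forces a fresh fetch on every load -->
          <script src="/js/tracker.js?r=23423423423"></script>

          <!-- Cache-friendly: a content-hashed filename can be cached essentially forever,
               e.g. served with "Cache-Control: public, max-age=31536000, immutable";
               changing the file changes the hash, which is what invalidates the cache -->
          <script src="/js/app.3f2a9c.min.js"></script>
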
    • by vtcodger ( 957785 ) on Sunday August 05, 2018 @07:06AM (#57072670)

      "But that's no longer the case once the 'web fartists' got through with it. Perhaps people should worry far less about visual appearance and far more about functionality."

      The wretched hive of scum and infamy that is the modern Internet seems slowly to be evolving toward near-total unusability. I have to confess that some of what is going on seems almost inexplicable. For example, Amazon.com, which used to be relentlessly consumer-oriented, has a web site that is becoming so slow and unresponsive in every browser I try as to be virtually unusable. Google Maps has gone to enormous effort to put together a terrific GIS database -- at least for the US -- but for some reason they insist on presenting map data in a low-contrast format that often makes their maps pretty much unusable. You often can't even see secondary roads if you turn terrain display on. And Google itself is so busy spying on its users that many of the services it has gone to great effort to build are compromised. For example, Google News no longer works with the simple text browsers (links, lynx, etc.) used by the visually impaired.

      And then there is security. I (probably) loaded all the comments in this thread -- a capability that has worked only erratically this weekend presumably thanks to Slashdot's flaky site scripting. Only two comments mention security. Come on folks. Does anyone seriously believe that users can keep confidential information confidential and still load and execute random code from random web sites? Really? You folks believe that?

      I don't know how, when or where all this ends. But I'm guessing that it doesn't end very well.

  • by AHuxley ( 892839 ) on Saturday August 04, 2018 @08:48PM (#57071060) Journal
    Everything is filler around the ads.
    Servers now have the bandwidth, CPU power, RAM, OS, expert staff to be really great.
    All that they are used for is pushing better ads.
  • I actually can't stand browsing the web without at least these two items running. The amount of ads, scripts, and general fluff running on pages now is staggering, not to mention the RAM and processor usage, which could bring an obsolete computer to its knees.

    • Another essential one: Google link cleaner. [mozilla.org] Why spend all that latency and net traffic informing Google what you actually clicked on so they can sell your thoughts to the highest bidder?

      • Uhhh, about using regex to parse HTML [stackoverflow.com]
        • Sorry for exceeding your coding ability, I hope your neck isn't smoking. I will try not to do it again.

          For the rest of us... this particular link cleaner works fine and a bunch of other nice ones are available.

        • He's actually right and you're wrong. He's extracting the redirect from the link, not parsing HTML.
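
          For what it's worth, that kind of link cleaning doesn't need an HTML parser at all. A hedged sketch of the idea (the "q" parameter name is an assumption about how such redirect wrappers are commonly built, not taken from the extension above):

            <script>
              // Rewrite /url?q=... redirect wrappers so links point straight at their destination.
              for (const a of document.querySelectorAll('a[href*="/url?"]')) {
                const wrapped = new URL(a.href);               // parse the link itself, not the page's HTML
                const target = wrapped.searchParams.get('q');  // the real destination, if present
                if (target) a.href = target;
              }
            </script>
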
      • Nice Slashvertisement.

        • I didn't write this one, and it's not even the one I happened to install. But you will benefit from it or one of its kin, trust me. Say, you don't have a vested interest, do you?

      • by AHuxley ( 892839 )
        Thanks very much for the advice about the link cleaner. Always good to find new add-ons that make the internet better :)
  • by h33t l4x0r ( 4107715 ) on Saturday August 04, 2018 @08:54PM (#57071082)
    The clients are. They want more and more stuff. If they see a feature another site has, then they bring it up in a meeting and decide they need that too. And then when it gets too slow they want to know why it's so slow.
  • Take back control (Score:5, Interesting)

    by nmb3000 ( 741169 ) on Saturday August 04, 2018 @08:59PM (#57071100) Journal

    This isn't something that large content publishers or hosting sites are going to address or change themselves. They don't really care how much data their page downloads, and the big ones like CNN don't even care how long the page takes to load. As long as they get their ad impressions, user profiling, and 27 different kinds of analytics then they're happy.

    Individuals will need to take back control themselves. An adblocker and NoScript change CNN from a 30 second load to about 5 seconds. NoScript is the real champion, and yet it is so often maligned as "hard to use". The truth is that making sure the usual sites you visit work right takes just a few minutes, most sites work pretty well without scripts, and the vast majority work just fine if you enable first-party scripts only. And since it's the second- and third-party scripts (and fourth and fifth, ad infinitum) that load most of the garbage, this is usually a good tradeoff.

    What we really need (assuming it doesn't exist already?) is a curated whitelist for NoScript, like the subscription lists for AdBlock Plus. This would make the extension more user-friendly and allow a maximum level of functionality while still completely blocking a significant amount of unwanted and dangerous garbage.

    Oh, and don't waste your time with a hosts file. It's completely useless in the age of dynamic DNS entries which appear and disappear on a daily or hourly basis.

    • Re:Take back control (Score:4, Informative)

      by Bengie ( 1121981 ) on Saturday August 04, 2018 @11:19PM (#57071574)
      I got different results

      1) No addons/extensions 8 seconds to load all of the scripts
      2) No addons/extensions 1 second to load the contents of the page
      3) Block 3rd party scripts 800ms
      4) NoScript 40ms
    • An adblocker and NoScript change CNN from a 30 second load to about 5 seconds.

      Does it actually take that long? I read stories about GDPR causing websites to be served with less shitty scripts in Europe. I get CNN page load times of 4 seconds with adblock on (no noscript), and 6 seconds with adblock off.

  • You must be referring to the "offshore resources" developing websites for pennies on the dollar. They don't care about the UX or performance of a website; their only concern is whether it will pass pipeline testing and get released.

  • by ad454 ( 325846 ) on Saturday August 04, 2018 @09:08PM (#57071120) Journal

    When I visit "xyz.com", why should my browser need to connect to 1000 other websites just to get the homepage to load, any of which can contain potential browser exploits in order to inject malware.

    First party sites can still host their own malware-free ads, and everything would be much faster, safer, and privacy preserving.

    If advertisers require traffic evidence, they could still opt to share their web logs, regularly timestamped by a trusted timestamp authority. It is still a better option than the current obstructive tracking we have now.

    All it takes is for all of the non-Chrome browsers to make this a standard default.

    Especially since Google would not allow this for Chrome, since it would impact their bottom line too much.

    On a personal level, I am constantly complaining to my IT department that they are still using Google Analytics and other third-party trackers on our internal, employee-only corporate website on our intranet.

  • by NewtonsLaw ( 409638 ) on Saturday August 04, 2018 @09:13PM (#57071140)

    It's gotten so bad that one of our local news sites here drives the (admittedly older) dual-core (4 logical cores) 2.4GHz processor in my Linux web-browsing PC to nearly 100% utilization. It pegs the CPU for nearly 30 seconds per page, and I can't do much else during that time because of the high CPU usage.

    Seriously... you need *that* much resource just to show us some text and a few pictures????

    Ah, how fondly I remember the days of 1200/75 modems when good designers spent hours trying to make BBS screens load just a few seconds faster. These days, optimization and elegant simplicity are cuss-words within the online development community I guess.

    • by phantomfive ( 622387 ) on Saturday August 04, 2018 @09:27PM (#57071210) Journal
      Elegance is dead everywhere, even in the embedded world. To be elegant, you need to actually understand programming and how to organize code. In the modern world, we use rule-based programming: the structure Java enforces, the syntax Python enforces, the conventions Go enforces. If you see a way to make things more elegant, your coworkers will tell you it's not the proper way to do things. We've lost taste in exchange for propriety. The result is that bad programmers are able to write acceptable code, but we're a bunch of prudes.
      • by kbonin ( 58917 )
        This is so true. I've always prided myself on being someone who could write elegant code. I write distributed microservices that scale well to millions of nodes, but I still remember writing optical mouse firmware on a 4-bit slice processor, where we had to figure out how to decode quadrature inputs and emulate a UART in a few hundred bytes of assembly and only a few registers. Just last week I was asked to refactor a DevSecOps solution that I was quite happy with, since I was coordinating ephemeral Linux Do
      • by Cederic ( 9623 ) on Sunday August 05, 2018 @03:52AM (#57072164) Journal

        Elegance is irrelevant. Your website can be the most elegant hand crafted artisan display of html, CSS and four brutally optimised lines of javascript that implement an AI capable of delivering peace in the middle east, and it's still going to be shit:

        The marketing and sales team are going to overload it with third-party shit anyway: shit that upsets your users, tracks them, molests their children and ruins any attempt you've made to provide a clean, usable experience.

  • by bobstreo ( 1320787 ) on Saturday August 04, 2018 @09:20PM (#57071182)

    5 seconds is how long I'll wait for a page to load before I close the tab.

    I use NoScript, Privacy Badger and uBlock Origin.

    I've also been known to use links.

    • 5 seconds is how long I'll wait for a page to load before I close the tab.

      TFA is missing the point. Users don't consider the web slow because they don't sit around waiting for the scripts to finish doing their stuff. It may take CNN 30 seconds to load in the USA (I'm in the EU, the slowest I could get it was 6 seconds with adblockers turned off), but that doesn't change the fact that the content has finished loading and I'm able to browse the page after the 1st second.
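
      A quick way to see that gap for yourself, as a hedged sketch using the standard Navigation Timing API (the exact output is illustrative):

        <script>
          // Wait one tick after "load" so loadEventEnd has actually been recorded.
          window.addEventListener('load', () => setTimeout(() => {
            const nav = performance.getEntriesByType('navigation')[0];
            // Time until the HTML was parsed and the article text was usable...
            console.log('content ready:', Math.round(nav.domContentLoadedEventEnd), 'ms');
            // ...versus time until every ad, beacon and tracker finished loading.
            console.log('fully loaded: ', Math.round(nav.loadEventEnd), 'ms');
          }, 0));
        </script>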

  • Code doesn't run fast? That's OK, buy more RAM, a faster processor, faster disks, whatever. This is just that logic taken to the internet. I say blame bloated frameworks.
  • OMG! (Score:5, Insightful)

    by jetkust ( 596906 ) on Saturday August 04, 2018 @09:38PM (#57071238)
    OK. So I'm not the only one noticing this! Accessing the web now is HORRIBLE. Especially on a phone. It's like the web is back to dial up speeds. And when the page is finally loaded, what you have is a screen full of irrelevant garbage, like a flashy picture, and you have to haplessly scroll. No clear idea of what you are supposed to click on. No clear idea of whether the page is loaded, or whether right before you click on something the image will shift and you'll click on something else.

    Then there's the damned newsletter screen! Why did people all of a sudden start signing up for newsletters?? Didn't they stop doing that over a decade ago, maybe two? WHO THE HELL IS DOING THIS? The worst thing is you kinda know it's coming, but it's still just as annoying every time. Like a fullscreen popup ad but worse. You could literally be on a website trying to buy something FROM THE WEBSITE and a freaking newsletter screen will pop up preventing you from doing it.

    Then there is the whole navigation thing. In all these years of the internet, you would think they'd have figured out a way for back to return you to where you were. Instead, back really just reloads the page, which will take forever AGAIN and may, but likely will not, return you to where you were. And what sucks is that these are the biggest, richest companies. It's supposed to be the state of the art, and it already sucks.
    • Then there's the damned newsletter screen! Why did people all of a sudden start signing up for newsletters?? Didn't they stop doing that over a decade ago, maybe two? WHO THE HELL IS DOING THIS? The worst thing is you kinda know it's coming, but it's still just as annoying every time. Like a fullscreen popup ad but worse. You could literally be on a website trying to buy something FROM THE WEBSITE and a freaking newsletter screen will pop up preventing you from doing it.

      I am in complete agreement with you. Fortunately, with ublock origin, you only need to see it once.

    • Re:OMG! (Score:4, Insightful)

      by TeknoHog ( 164938 ) on Sunday August 05, 2018 @04:20AM (#57072244) Homepage Journal

      OK. So I'm not the only one noticing this! Accessing the web now is HORRIBLE. Especially on a phone. It's like the web is back to dial up speeds.

      When the "web on a phone" first appeared in the form of WAP etc., I thought it would mean the return of clean and simple web design. Mobile connection speeds were worse than dialup to begin with, and some of the display size and input limitations still apply today. But now we have fast connections and CPUs with small displays, so we get Fisher-Price text and image layout with all the advertising and tracking bits, or even less actual content to see. Of course, you get this on the desktop too because many sites only design for mobile now.

      Add to this the appification of the web, a great step backwards in platform independence and in the use of computers as universal tools. For example, posting videos on Instagram needs the app, so you need one real computer to produce the work and a toy device to distribute it. (I use Android-x86 under a VM, but you get the idea.)

    • people all of a sudden start signing up for newsletters

      I don't know that people are actually signing up, but the news organizations have big motivation for wanting this to be adopted. There are at least two reasons. The first is that news companies (especially print-based ones) are switching from subscription-based monetization to purely ad-based. They are desperately trying to get back to a degree of repeat viewing / consistent readership like in the old days, where they could reach a specific group of people regularly. They want some consistent baseli

    • You could literally be on a website trying to buy something FROM THE WEBSITE and a freaking newsletter screen will pop up preventing you from doing it.

      And the newsletter popup will appear at random. Sometimes it's time based, sometimes it's after you scroll down a certain amount.

      And they clued in that people were expecting it and clicking the X in the upper right. Now dismissing it requires more effort. You have to decipher which bit of 6 pt light-gray-on-slightly-lighter-gray text is the passive-aggressive option: "No, I don't want to receive email updates that will improve my life. I'd rather live as a Luddite."

  • And now the marketing people have corrupted it. We will never get the 80s and 90s back.

  • by QuietLagoon ( 813062 ) on Saturday August 04, 2018 @09:52PM (#57071302)
    Yes, I have. I've chalked it up to web developers who are more concerned about a site looking fancy than they are concerned about a site providing a good user experience. It's like the flaming logos all over again, except this time around the pages have moving things, and sliding things, and widgets, oh so many widgets.

    I'm looking for a company's phone number and I have to wade through slow loading times and tons of scrolling in order to get to the phone number. When I finally do get there, the phone number is in some super low contrast grey-on-grey text.

  • Preach it, brother.
    So many web pages are so full of crap, it is hard to measure.

  • by Khyber ( 864651 ) <techkitsune@gmail.com> on Saturday August 04, 2018 @10:49PM (#57071460) Homepage Journal

    That's what it is, bloated shit. This desire to make the web the platform for programs and applications has turned it into a totally-fucked hodge-podge of resource-sucking shit using bloated code and non-standard solutions.

  • Every page is so loaded down with script that my browser warns me that just displaying it is using too much in resources. You can't click on any link in the page until you have waited several minutes for it to stop thrashing around as ads huff and puff to assume their place in this mountain of script. After waiting this out some comment sections are usable, some are not.

    Why am I not running an ad blocker? I did until every site I visited detected its presence and demanded that I stop running it. Are there n

    • by vyvepe ( 809573 )
      Those ad-blocking detectors are just more scripts. You can often block them with a decent selective script blocker, e.g. uMatrix. The site is often usable after blocking the detectors. That may change in the future if they start loading the useful content with the same script that shows the ad-blocking notification.
  • by viperidaenz ( 2515578 ) on Saturday August 04, 2018 @11:17PM (#57071566)

    Especially on a product website.

    You scroll through their products, click on one, then when you go back, you've lost your place.

    I shouldn't have to open every link in a new tab just to keep my place in a list of things.

    • by Misagon ( 1135 )

      Oh. I really hate infinite scroll.
      The "infinite scroll" has only one valid use IMHO: When you are displaying a timeline, looking back from the present. That is what it was invented for in the first place.

      Another thing I hate is links that are not links but buttons that act like links, and only when left-clicked. Those can't be opened in a new tab or window (see the sketch below).
      Combined with infinite scroll, that is just hell.

      Or how about when you click on a "link", and the new "page" occupies the entire browser window ...
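
      A minimal illustration of the difference (the product path and class name are made up):

        <!-- A real link: middle-click, ctrl-click and "open in new tab" all work,
             and the browser shows the destination in the status bar -->
        <a href="/products/1234">Frobnicator 9000</a>

        <!-- A "link" in name only: navigates on left-click, but can't be opened
             in a new tab, bookmarked, or followed with the keyboard or a crawler -->
        <div class="product-link" onclick="location.href='/products/1234'">Frobnicator 9000</div>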

  • We solved the human-computer interface by the 1990s.

    The problem is that everyone insists that the old ideas are bad and only new ideas are good.

    All of this bending over backwards to get JavaScript and the single-threaded DOM to become "single screen applications" ignores that we already had all this almost thirty years ago.

  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Saturday August 04, 2018 @11:39PM (#57071634)

    Webdev here.

    I'm so on board with this guy and I so totally get his frustration. This is my personal daily plight. The problem is that ever since "Web Design" became a thing, we've had to deal with the vast majority of people in our field claiming to be "Web Designers" while not knowing squat about how the web works, what it does, what it can't do, and how it is done correctly. This shows at every corner. We need some serious steps toward professionalising our field. It has come quite a way, but we are not there yet.

    People think that because it's nice and shiny and they can click on it, they understand it. The problem is they don't. With web design and typography it is so easy for people to mistake the picture of a house for a house. After all, it looks the same, doesn't it? It frustrates the hell out of me talking to professional, award-winning web designers who after 20 years still babble utter nonsense about the 72dpi myth. I could hardly believe what I was hearing when I had this discussion last winter. That's because even the people handing out the awards don't know how the web works.

    I listened to a tech talk the other week from a blind guy who demonstrated, with a screen reader and a braille terminal, how he navigates the web. He also explained how to build semantically correct pages. It was such an eye-opener and a brilliant demonstration of where the wheat separates from countless metric tons of chaff: div soup, semantic hell and broken websites left, right and center (see the sketch below). If I were king, I'd pass a law that everyone who builds websites has to demonstrate their viability by navigating them blind, with a screen reader. The quality of the web would instantly improve by orders of magnitude.
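
    A hedged sketch of the kind of contrast a screen reader exposes (generic elements, not taken from the talk the poster mentions):

      <!-- Div soup: a screen reader hears an undifferentiated pile of anonymous groups -->
      <div class="top"><div class="big">Site name</div></div>
      <div class="stuff">
        <div class="item"><div class="head">Article title</div> ...</div>
      </div>

      <!-- Semantic markup: landmarks and headings let a screen-reader user jump straight
           to the navigation, the main content or a specific article -->
      <header><h1>Site name</h1></header>
      <nav aria-label="Main">...</nav>
      <main>
        <article>
          <h2>Article title</h2>
          ...
        </article>
      </main>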

  • by Anonymous Coward

    Remember back when everyone was still using standard hard drives and you finally acquired your first SSD and booted up in a few seconds, and your computer felt more responsive than you ever remembered? Life was good. That game that used to take 2 minutes to load now loads up in seconds!
    But what happens is that soon everyone is using SSDs, including those of us doing development. Now when I write code that accesses the disk way more than it really needs to, I won't care, because SSDs are so fast! But o

  • by zdzichu ( 100333 ) on Sunday August 05, 2018 @02:49AM (#57072002) Homepage Journal

    Visit https://eu.usatoday.com/ [usatoday.com] and try not to blink, or you will miss the page loading and rendering. They decided that getting rid of JS trackers was a better business decision than implementing all the consent gathering required by EU law. Now the USA Today page loads fast.

  • Thanks to GDPR, a lot of websites are serving up the same content to Europeans without all the bullshit attached. https://linustechtips.com/main... [linustechtips.com]

    I can't confirm whether CNN actually serves up pages that take 30 seconds to load, but I just clicked on the topmost story, about the wildfires, and got a load time of 4 seconds with adblocking and 6 seconds without.

  • We really shouldn't need adblockers, but we do, and this is precisely why. I don't mind a web site making a little cash from putting ads in front of my eyeballs -- it's exactly what publishers have always done with newspapers, magazines, and TV shows -- but when they waste my time or render a page unreadable then I'm done. Pages with delayed loading of ads or videos such that the text on the page is constantly moving after I have already loaded the page -- who ever thought that was a good idea? I understand

  • slashdot needs to fix their html code for displaying polls. https://imgur.com/a/HnX1CYV [imgur.com]
  • That's not news; websites have been getting fatter and fatter for a very long time now. Maciej Ceglowski calls it "the website obesity crisis", and he gave a very good talk about the problem that goes into a bit more detail than TFA. The text version is available here: http://idlewords.com/talks/web... [idlewords.com].
