Google Tests 'Never-Slow Mode' for Speedier Browsing (zdnet.com)

At some point in the future, Chrome may gain a new feature, dubbed 'Never-Slow Mode', which would trim heavy web pages to keep browsing fast. From a report: The prototype feature is referenced in a work-in-progress commit for the Chromium open-source project. With Never-Slow Mode enabled, it would "enforce per-interaction budgets designed to keep the main thread clean." The design document for Never-Slow Mode hasn't been made public. However, the feature's owner, Chrome developer Alex Russell, has provided a rough outline of how it would work to speed up web pages with large scripts. "Currently blocks large scripts, sets budgets for certain resource types (script, font, css, images), turns off document.write(), clobbers sync XHR, enables client-hints pervasively, and buffers resources without 'Content-Length' set," wrote Russell.
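
For context, the two page-authoring patterns explicitly named in that outline (document.write() and synchronous XHR) look roughly like this. This is an illustrative sketch only; the actual budgets and blocked calls have not been published, and the URLs are placeholders.

  // Patterns the mode reportedly penalizes (illustrative only; placeholder URLs).

  // 1) document.write() injecting a parser-blocking script mid-parse.
  document.write('<script src="https://cdn.example.com/huge-bundle.js"><\/script>');

  // 2) Synchronous XHR: the main thread is frozen until the server replies.
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/api/config', false);  // third argument false = synchronous
  xhr.send();
  console.log(xhr.responseText);          // nothing else ran while this waited
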
  • by Anonymous Coward on Tuesday February 05, 2019 @10:59AM (#58073178)

    Fuck's sake - this is exactly the reason random web page X stops working.

    Here's a hint, Google: You're fixing the WRONG problem.

    The correct problem to apply pressure to:

    1) Crap web code, and specifically better educating the people that write it.
    2) JavaScript's crappy threads.

    Your 'never slow mode' should only ever be a debug tool for people making web pages.

    • by Anonymous Coward
      Define "crap web code"
      • by DarkRookie2 ( 5551422 ) on Tuesday February 05, 2019 @11:14AM (#58073244)
        Just look at the code for any site made or redesigned in the last 5 years.
        • by Anonymous Coward

          Just look at the code for any site made or redesigned in the last 5 years.

          My career now is taking old functional web applications and rewriting them into slow javascript bloated web UIs.

          It's not my fault. It's what the clients want.

          • by Anonymous Coward

            Don't forget to add in blockchain, quantum computing, artificial intelligence and augmented reality. Your customers demand the very best, so you need to nail all the buzzwords when you're working on their projects.

      • Re: (Score:2, Redundant)

        by Penguinisto ( 415985 )

        Define "crap web code"

        It usually ends in *.php

        (/me ducks and runs, laughing maniacally.)

      • by PPH ( 736903 )

        Define "crap web code"

        Whatever generates the "A web page is slowing down your browser" message. Could be bad Javascript. Or bad server side code. Most often I suspect it's advertisers' domains being throttled by my ISP.

        • Each tab should have its own "main thread", right?
          Or if not, then a tab's content that is found to be using resources greedily should be relocated to its own thread, which can be de-prioritized so that browsing elsewhere, and the main browser controls, are not affected much performance-wise.

          Then I suppose "never-slow mode" could be enabled/disabled by user wrt a particular tab, or particular content, upon prompting from browser performance pop-up modal dialogs.
          • by PPH ( 736903 )

            The trouble is: I want the content in that tab. But the content is waiting on an advertising banner web site which has stalled. If you de-prioritize the entire tab's content, I'll still be waiting for what I want.

            Perhaps the solution is a per-tab content 'map' with resource and speed data available for each part of the web page. And a thread for each part. And the ability to click on the offending banner ad and kill its thread. And maybe a menu option to blacklist all future content from the responsible s

    • 1. We have had crappy code for generations now. It is up to the compiler/interpreter to handle it better. As long as writing code is open to anyone, there is going to be crappy coding. We could try to make certified, licensed, fully credentialed coders, but I only see this cutting down on creativity in coding and on what could be coded. For every 100 times we see crap code, there will be 1 time there is ingenious code that would be caught by this tool, basically preventing innovation. If IE 6 or F

      • by AmiMoJo ( 196126 )

        This is the reason behind the decision to introduce the new ad-blocking API that everyone was moaning about last week. It removes potentially slow add-ons from the critical performance path of the browser and replaces them with a native one that is presumably optimized and respects the time budget.

        Current discussion is around keeping the old API for add-ons that need it, but encouraging use of the new API. Google's usual pattern is to let that go on for a few years and then eventually retire the old API onc
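
        For reference, the new API being discussed is presumably the declarative model proposed in Manifest V3. A minimal sketch of a blocking rule under that model might look like the following; the rule ID and filter pattern are made-up examples, not anything Google has published for this feature.

          // Sketch of a declarative blocking rule (assumes Manifest V3's
          // chrome.declarativeNetRequest API; the ID and filter are made up).
          chrome.declarativeNetRequest.updateDynamicRules({
            removeRuleIds: [1],
            addRules: [{
              id: 1,
              priority: 1,
              action: { type: 'block' },
              condition: {
                urlFilter: '||ads.example.com^',                      // hypothetical ad host
                resourceTypes: ['script', 'image', 'xmlhttprequest']  // what to block
              }
            }]
          });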

        • In theory a good idea.

          In practice it means you have to trust Google.

          • by AmiMoJo ( 196126 )

            Google controls the entire browser, so how can it be any worse? They could nerf ad blockers any time if they wanted to, but instead went out of their way to provide the original deep API for blocking (it was actually extended a couple of times to allow blocking earlier in the loading process) and the new high performance one.

      • Re 1:
        No, seriously, no. Why is it up to the browser to accommodate shitty code (outside of gracefully aborting the load and kicking up the appropriate error code - is that what you meant)? I also don't really buy the 'brilliant-code-caught-in-the-trap' argument, either; exploiting ugly, standards-sloppy interpreter loopholes to do something awesome is cool, but the true innovation is to do it in a way that doesn't cause the browser to puke whenever the browser makers fix the bug you exploited for that 'i

      • by Anonymous Coward

        re 2: Plain JavaScript browser code *is not multi-threaded.* It has asynchronous behavior with callbacks or promises but it does not do multi-threading.

        The only exception to this is if you're using the Web Workers feature, which explicitly creates new threads.
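
        A minimal sketch of that exception, for anyone unfamiliar (the file name worker.js is just a placeholder):

          // main.js -- hand heavy work to a worker so the UI thread stays responsive.
          const worker = new Worker('worker.js');   // 'worker.js' is a placeholder name
          worker.onmessage = (e) => console.log('result:', e.data);
          worker.postMessage({ numbers: [1, 2, 3, 4] });

          // worker.js -- runs on its own thread and has no access to the DOM.
          onmessage = (e) => {
            const sum = e.data.numbers.reduce((a, b) => a + b, 0);
            postMessage(sum);
          };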

      • Comment removed based on user account deletion
    • Your anonymous cowardice is showing.
    • by swillden ( 191260 ) <shawn-ds@willden.org> on Tuesday February 05, 2019 @12:55PM (#58073806) Journal

      Your 'never slow mode' should only ever be a debug tool for people making web pages.

      I think the idea is that by making "slow mode" pages fail to work well, they'll force programmers to make better web pages. Chrome has had debug tools that provide all this information for ages, and the developers who make use of them can make very snappy sites. But those conscientious and careful developers aren't the problem. It's all the ones who won't do it right until doing it wrong results in user complaints that you need to reach.

      (Disclaimer: I work for Google but don't know anything about this beyond what I read in the summary. I didn't even RTFA.)

      • by sjames ( 1099 )

        The problem there is that Google is turning slow-but-working sites into broken sites, even if the slowness cannot be avoided without removing needed functionality.

        It looks like they're going so far as to limit image size. That's because Google can't ever be wrong, has done an extensive study of everything in consultation with God himself, and knows that no valid applications exist where 1 MB isn't good enough for everybody.

        They should be ashamed of themselves.

        My guess is that the workarounds will do

        • You're making a lot of unsubstantiated assumptions which implicitly assume that the developers of Chrome are idiots and don't care if people use their browser. You should think about whether either of those things are likely.
          • by sjames ( 1099 )

            Perhaps you should read the link [googlesource.com] provided in the summary.

            Everything I said is based on things the actual owner of the feature said.

          • You're making a lot of unsubstantiated assumptions which implicitly assume that the developers of Chrome are idiots and don't care if people use their browser.

            Well, they sure are acting like idiots in this instance.

            That's not necessarily the implicit assumption, given the market share Chrome enjoys. Some may well get tired of broken sites and switch, but a much more likely scenario is that users will complain to or blame the site owners and Chrome gets a pass.

      • As much as I hate to say it, forcing people to do the right thing never works. People who don't care about doing the right thing will always do terrible work, and you'll always make things more difficult for the people who do.

    • by AmiMoJo ( 196126 )

      Google has done this before and it worked quite well. Remember Flash? They first restricted it to running by default only on whitelisted pages, everything else became click-to-play by default. Then that became blocked by default. Finally after several years it was removed entirely, having given everyone plenty of time to stop (ab)using it.

      They will likely do something similar if they decide to go ahead with this. Enable one aspect at a time, in a way that causes minimal breakage, keep nudging developers to

      • by sjames ( 1099 )

        When they did that with Flash, it was only after HTML5 was fully capable of doing everything Flash was doing. They were sunsetting legacy code.

        It's a little hard to tell what we're signing up for here since the supporting design docs are all internal only. That is, they are presenting a contract with a blank cover sheet covering all but the dotted line and saying "just sign here".

        What it sounds like is that we will end up with abominations where pages that naturally and intrinsically need a function to take

        • OMG, your last paragraph would so improve my browsing experience.

          Of course, stopping Javascript means that Google can't follow your mouse cursor around the page, amongst other things.

    • 1) Crap web code, and specifically better educating the people that write it.

      Good luck with that. Exactly how do you plan to reach all these millions of developers writing "crap" code and forcibly educate them? Sometimes forced constraints are not such a bad thing.

      Your 'never slow mode' should only ever be a debug tool for people making web pages.

      Yeah, have you met people? Because NOBODY I know would stay in their lane on that, myself included.

      • by sjames ( 1099 )

        So your alternative is to make the browser broken in a way that some things that now work fine will just never work again?

        • So your alternative is to make the browser broken in a way that some things that now work fine will just never work again?

          "Broken"? Are you seriously arguing that everything is working fine now? Look, I have no idea if this proposal by Google is a good idea or not and I wasn't commenting on that. I'm merely arguing that the "solutions" proposed by the post I responded to are non-starters. You aren't going to educate developers into doing the Right Thing. There are ALWAYS idiots out there making crap code and by and large the only way to deal with them is with technical constraints. A lot of developers just aren't as good

          • by sjames ( 1099 )

            So what if someone has a medical app that needs to display a CT image that is more than 1MB? So sorry, no browser for you!

            A debugging tool that is off by default WILL get used by non-developers, but that's fine. Any brokenness that happens then is something the user signed up for.

    • The 'web development' players 'making web pages' keep shooting themselves in the foot; it's time to start amputating the damage. Instead of educating better developers, it's time to take away their gun.
    • They need to create an ad-blocker. That solves 95% of slowness problems.
    • The correct problem to apply pressure to:

      1) Crap web code, and specifically better educating the people that write it.
      2) JavaScript's crappy threads.

      That's exactly what they're doing. If your site craps out under this mode, you'll be pressured to fix it.

    • by sjames ( 1099 )

      Exactly. All they are doing is enshrining heisenbugs as a design feature. A congested network or a busy computer means pages are broken and useless rather than slow but functional (and then Google blames the web pages).

      There are a few pages where I use synchronous requests as a design decision. I do that since until that transaction completes, there is no valid user action available other than closing the tab. It's a documented API choice, so I don't feel at all bad about using it. Google should feel bad about bre

    • The correct problem to apply pressure to:

      1) Crap web code, and specifically better educating the people that write it.
      2) JavaScript's crappy threads.

      I don't think you fully understand what's going on. What happens is that a web page with crappy JavaScript code demands a lot of CPU cycles. Windows says "Oh, this thread needs more CPU. Here you go." Other pages then get starved of CPU and load more slowly.

      What this change will do is limit the max CPU any one web page can get in competition with others (e
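
      There is no web API that caps a page's CPU outright, but the main-thread hogging being described is at least observable today. A minimal sketch using the Long Tasks API, assuming a browser that supports it:

        // Log any task that blocks the main thread for more than ~50 ms.
        // (Minimal sketch; assumes Long Tasks API support.)
        const observer = new PerformanceObserver((list) => {
          for (const entry of list.getEntries()) {
            console.log('long task:', Math.round(entry.duration) + ' ms', entry.attribution);
          }
        });
        observer.observe({ entryTypes: ['longtask'] });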

    • Fuck's sake - this is exactly the reason random web page X stops working.

      You could act more like your dad and simply not randomly enable optional features, and tell your mom to not push random buttons she doesn't understand and then complain about the result.

  • by OffTheLip ( 636691 ) on Tuesday February 05, 2019 @10:59AM (#58073182)
    Web pages load okay without all of the crap added to them.
    • Well, I can't count the number of times I've seen a fully rendered page for an instant followed by "aw snap" - with JavaScript disabled. So not quite the whole answer.

      • by sjames ( 1099 )

        So the real fix is for Google to fix the "aw snap" display so it doesn't block you from reading what was already rendered successfully, rather than seeking to break even more stuff.

  • by Anonymous Coward on Tuesday February 05, 2019 @11:13AM (#58073240)

    I would turn off the tracking and monitoring and everything would be much faster!

    • Just on Slashdot, if you enable content blocking in Firefox (v65), 17 trackers and 3 third-party cookies are blocked. 17 trackers!!

  • by Opportunist ( 166417 ) on Tuesday February 05, 2019 @11:40AM (#58073366)

    Dump all the ads and it's gonna be blazingly fast.

  • That's exactly what BlackBerry did with their system. They would strip off all code that wouldn't render on a BB and only transmit what would. Back then it saved money on data as well as speeding things up.
    • by Anonymous Coward
      Blackberry could only do that because all web content was sent through their proxy servers. While big enterprise loved the monitoring and tracking features that enabled, it provided a completely shit user experience for anyone not within 100km of Blackberry's proxy servers.
  • I remember the days when a Flash ad would instantly peg CPU usage. I just killed Flash instead of looking into why. Maybe it was rendering 1000FPS instead of 60? Sure, I missed the latest Strong Bad Email, but those eventually disappeared, too.

    Now it's not so much CPU as RAM. When closing one small page frees up 2GB, that's not a good sign.

  • Do we really need WebGL, canvas, wasm, Node, jQuery and all this HTML5 crap? I remember I could open hundreds of tabs in Firefox on a system with just a gigabyte of RAM back in 2004. Now Waterfox struggles with about 10 tabs on a 16 GB system and I have to constantly re-open it. Just have HTML 4.01 with the video tag set to non-autoplay and make the web simple.
    • by tepples ( 727027 )

      So where would that leave web applications that have a legitimate use for "all this HTML5 crap"? As I understand your suggestion, they'd have to become native applications, which means they might not be made available at all for minority operating systems or CPU architectures.

      • by Anonymous Coward

        Each browser tab could be put into a certain "mode": 1) Game mode, 2) App mode, 3) Reading mode. Reading mode could be the default and only provides basic functionality.

        • by tepples ( 727027 )

          Until it becomes common for ad-supported websites to offer use in app mode without charge or use in reading mode for a monthly subscription. Anti-tracking-blocking measures on The Atlantic, MIT Technology Review, and other websites already do just this.

        • That's a damn good idea.

          The user gets to enable the bloated crapware per tab, and only when they need it.
      • by Fly Swatter ( 30498 ) on Tuesday February 05, 2019 @01:47PM (#58074086) Homepage
        It would leave them where they belong - not in a web browser.
        • by tepples ( 727027 )

          Assuming that applications belong in an environment that is not-a-web-browser: Which not-a-web-browser application environment is compatible with all major desktop and mobile platforms?

          • Wasn't this the original intent of Java? I mean the real one, not some script bolted onto a web browser because no one else wanted it.
      • by DdJ ( 10790 )

        So where would that leave web applications that have a legitimate use for "all this HTML5 crap"?

        Hopefully, with a relatively fine-grained exception system that allows this to be overridden explicitly when it makes sense to.

        • Hopefully, with a relatively fine-grained exception system that allows this to be overridden explicitly when it makes sense to.

          This raises two questions: First, how would a non-technical user learn to operate "a relatively fine-grained exception system" with the appropriate balance between safety and convenience? Second, how would a developer go about proving its application worthy of such an exception?

    • I remember I could open hundreds of tabs in Firefox on a system with just a gigabyte of RAM back in 2004.

      And I bet you are going to try to convince us that such a workflow is somehow practical too...

      Now Waterfox struggles with about 10 tabs on a 16 GB system and I have to constantly re-open it.

      Then I suggest you switch to a browser that actually works because I have no such problem with Firefox or Chrome or Edge or Safari.

  • ... all broken, slow and bloated websites.
    Many problems would go away really fast, and a year from now the essential web would suck way less because people would've adjusted with better code and better planning.

  • by HalAtWork ( 926717 ) on Tuesday February 05, 2019 @12:46PM (#58073754)

    How about they work on a 'never slow typing' mode for Android? How does everyone get this so wrong? It doesn't matter which phone: there's always a point where the text stops popping up as you're typing and then a ton of random characters barfs out at once, with the cursor position getting switched around as you type. It's maddening.

    • It's because the Snapdragon SoCs in these things are shit, and Android's scheduling is shit.

      When the SoC throttles (and it will), it clocks down so fucking hard that you can't fucking do anything. I believe Android's "project butter" change from 4 years back or so essentially just gave rendering the UI (which is always a graphical thing in Android) the highest absolute priority over anything. Your fucking typing goes to the back of the bus until the SoC, now running at a snail's pace, has now finished pro

    • huh?

  • Comment removed based on user account deletion
  • Has anyone else noticed how painful it is to use CNN? I don't know what they do, but the JS is constantly dynamically reflowing the page. I had to pull an old iPad (iOS 9) out of a drawer recently - and quickly noticed how slow the page is. While "struggles" is the wrong word, my new PC shows slowness while rendering, and on the iPad it was unusable.

    At first I thought - old iPad, bad iPad. Then I tried other news websites and it was fine. It's just friggen HTML. I can read the articles faster than

    • Has anyone else noticed how painful it is to use CNN?

      No, because I don't visit their webpage. Just looked, however, and it is fine for me. Bear in mind though that I have Privacy Badger, uBlock Origin, AdBlock Plus and the built-in Firefox privacy settings running, so I'm killing a TON of tracking and ad stuff. Kind of feels like fucking with 4 condoms on though when I go to sites like that.

  • by Perl-Pusher ( 555592 ) on Tuesday February 05, 2019 @01:43PM (#58074064)
    In my experience, the slowest loading pages all have the same browser status: "waiting on google analytics"! They should start there!
    • They should start there!

      Except that the browser never waits on Google Analytics. At least not unless you're using Google Analytics code from pre-2009 (when Google Analytics became asynchronous and thus stopped affecting page load times at all) and were stupid enough to put the code in the head of the HTML.
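
      The "asynchronous" part refers to the loader pattern where the tracking script is injected with the async flag, so the HTML parser never waits on it. The general shape is roughly this (a generic sketch with a placeholder URL, not the vendor snippet verbatim):

        // Generic async third-party loader: fetched in the background,
        // never blocks parsing or rendering. (Placeholder URL, not the real snippet.)
        (function () {
          var s = document.createElement('script');
          s.src = 'https://analytics.example.com/tracker.js';
          s.async = true;
          var first = document.getElementsByTagName('script')[0];
          first.parentNode.insertBefore(s, first);
        })();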

  • by Rick Schumann ( 4662797 ) on Tuesday February 05, 2019 @01:59PM (#58074162) Journal
    I have a better solution: how about we 'trim the fat' from the pages' sources themselves, instead of having these bloated monstrosities in the first place? I use NoScript and whitelist only a few domains, so for many sites I manually temporarily 'trust' only the domains I know are safe and don't collect data. When some website won't even load basic text without enabling Javascript, and when the NoScript list of domains that page 'needs' grows to the point where it's practically going to scroll off the bottom of the monitor, then I say there's something seriously wrong with the way webpages are created these days.
    • how about we 'trim the fat' from the pages' sources themselves, instead of having these bloated monstrosities in the first place?

      Indeed, you should start by writing the internet a strongly worded letter. I mean, people not being happy about ads got rid of ads too, right?

  • Reminds me of the time I got a call about one of our configuration interfaces not working. User kept getting XSS errors and the page wouldn't even load.

    Needless to say, I was quite impressed to find out browsers were employing naive black-box heuristic filters that not only were ineffective and could themselves be leveraged to mask attacks and as vectors for denial of service, but also caused random failures in non-defective code due to chance coincidences in naming conventions.

    The last thing the web needs i

  • So I won't have to block 6 trackers and 9 scripts here on /. myself?

  • This will just result in broken pages. And broken pages that are broken differently based on each device's specs and load from other applications at the time.

    clobbers sync XHR

    Some people like to design things such that events and procedures happen one after the other, you know. Some people need consistent and deterministic logic and data. Some people care about race conditions.
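
    For what it's worth, strict one-after-the-other ordering doesn't require blocking the thread; awaiting each step gives the same deterministic sequencing. A minimal sketch with placeholder endpoints:

      // The second request never starts before the first finishes, yet the
      // main thread stays free in between. (Placeholder endpoints.)
      async function runInOrder() {
        const cfg = await fetch('/api/config').then((r) => r.json());
        const res = await fetch('/api/commit', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify(cfg),
        });
        return res.json();
      }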

    • by DarkOx ( 621550 )

      Ah, but in Google land the user should never sit and see the throbber while the computer does something. No, rather they should get no useful feedback at all while their browser sits and polls inefficiently, over and over again, to see if the server has completed some operation.
