Chrome Now Reloads Pages 28% Faster (techcrunch.com) 124

Google has announced that it has worked with Facebook and Mozilla to make page reloads in Chrome for desktop and mobile significantly faster. According to Google's data, reloading sites with the latest version of Chrome should now be about 28 percent faster. From a report: Typically, when you reload a page, the browser ends up making hundreds of network requests just to see if the images and other resources it cached the first time you went to a site are still valid. As Google engineer Takashi Toyoshima notes in today's announcement, users typically reload pages because they either look broken or because the content looks like it should have been updated (think old-school live blogs). He argues that when browser developers first added this feature, it was mostly because broken pages were common. Today, users mostly reload pages because the content of a site seems stale.
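For context, those revalidation requests are typically conditional GETs: the browser resends the validator it saved (an ETag or Last-Modified date), and the server answers 304 Not Modified if the cached copy is still good. A rough TypeScript sketch of a single such check (the URL and ETag value are made up for illustration):

    // Revalidate one cached resource with a conditional GET.
    // A 304 response carries no body: keep using the cached copy.
    // A 200 response means the resource changed and a fresh copy was sent.
    async function revalidate(url: string, cachedEtag: string): Promise<boolean> {
      const res = await fetch(url, { headers: { "If-None-Match": cachedEtag } });
      return res.status === 304; // true => cached copy is still valid
    }

    // Hypothetical usage:
    revalidate("https://example.com/logo.png", '"abc123"')
      .then((stillValid) => console.log(stillValid ? "cache still valid" : "resource changed"));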
  • by zephvark ( 1812804 ) on Thursday January 26, 2017 @02:17PM (#53743741)

    I reload pages because they are broken, generally due to an excess of advertising. Yes, I could filter out advertising, but I often get paid for having it there. Not that I look at it.

  • Not all pages (Score:4, Informative)

    by Anonymous Coward on Thursday January 26, 2017 @02:17PM (#53743747)

    Chrome now reloads Facebook pages up to 28% faster. The rest of the web won't see the benefit.

    • by freeze128 ( 544774 ) on Thursday January 26, 2017 @02:24PM (#53743805)
      In other news, Chrome now uses 12% more RAM.
    • Chrome now reloads Facebook pages up to 28% faster. The rest of the web won't see the benefit.

      That's not what the article says. What it says is:

      Facebook, just like other pages, says its pages now reload 28 percent faster, too

      As for how this feat is accomplished (it would have been nice if that were in the summary), what now happens is that when you hit "reload," the browser only reloads the main page. It obviously has to load any resources requested by the new version of the page that weren't requested by the previous version, but it doesn't reload resources that were already loaded for the previous version.
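      A rough sketch of that policy in terms of the standard fetch() cache modes (an illustration of the described behavior, not Chrome's actual implementation): revalidate the main document with the server, and let subresources come straight from the HTTP cache.

        // Illustrative TypeScript sketch of a "simplified reload".
        // "no-cache" forces a conditional revalidation with the server;
        // "force-cache" uses any cached copy without asking the server first.
        async function simplifiedReload(pageUrl: string, subresourceUrls: string[]) {
          // Revalidate only the main resource.
          const page = await fetch(pageUrl, { cache: "no-cache" });
          // Subresources are served from cache when present; only genuinely
          // uncached ones actually hit the network.
          const assets = await Promise.all(
            subresourceUrls.map((u) => fetch(u, { cache: "force-cache" })),
          );
          return { page, assets };
        }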

      • So basically, if the reason for your reload was porked resources on the page, then Chrome now just ignores your request for a reload.
  • Hm perhaps we could use similar techniques to avoid making those hundreds of network connections in the first place...
    • That isn't how HTTP works: send the data and disconnect.
      We should have a better protocol that deals with Web 2.0 content, but that protocol wouldn't be HTTP.

      • by doom ( 14564 )

        What I'm getting at is that the solution to Web 2.0 content might be Web 1.0 content.

        Or technical tricks to emulate Web 1.0 content.

        (I bet Google has the chops to come up with a really killer ad blocker.)

        • by Anonymous Coward

          (I bet Google has the chops to come up with a really killer ad blocker.)

          Yeah, but one of the earlier ways they went evil was to buy doubleclick.net.

        • by epyT-R ( 613989 )

          ...but then providers wouldn't have control over every aspect of use. That's a big no-no these days. Being able to load an old version to restore functionality? That's piracy! Not selling usage data to advertisers (or the state)? That's leaving money on the table!

      • by tepples ( 727027 )

        HTTP has supported keep-alive and pipelining since 1.1, the first version to make support for name-based virtual hosts mandatory.

        • by unrtst ( 777550 )

          Neither of those solves the issue of checking the status of hundreds of elements. They help, but each check still needs to be made, and they can't all be made in parallel (sane connection-count limits on servers prevent that, so browsers only run N parallel requests, each kept alive).

          They have simply decided not to check any of those other elements. Ignoring hundreds of elements for a 28% speedup seems like bad math to me. They're only reloading 1 in hundreds of elements, so that should be a 100x speedup (10,000%).
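          To illustrate the connection-limit point: even with keep-alive, a browser only runs a handful of requests per host at a time, so hundreds of revalidations are drained in batches. A toy TypeScript sketch of that bounded concurrency (the limit of 6 mirrors a common per-host connection cap; everything else is made up):

            // Toy model: revalidate many cached URLs, never more than `limit` at once.
            async function revalidateAll(urls: string[], limit = 6): Promise<void> {
              const queue = [...urls];
              const worker = async () => {
                while (queue.length > 0) {
                  const url = queue.shift()!;
                  // Conditional request; a 304 means the cached copy is still valid.
                  await fetch(url, { cache: "no-cache" });
                }
              };
              // Start `limit` workers that drain the queue in parallel.
              await Promise.all(Array.from({ length: limit }, worker));
            }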

      • by Anonymous Coward

        Depends how you classify HTTP, but the protocol already exists. HTTP/2 has been around for a while and is gaining traction: https://http2.github.io/faq/

        I was sceptical at first (binary? WTF?) but it's actually pretty good.

      • HTTP/1.0? Probably not.
        HTTP/1.1? Yeah, you can keep the connection open and send requests before the response comes back.
        HTTP/2? Yes.

    • If today's average website weren't a steaming shitshow, then it wouldn't be necessary.

      But they are, so we do. *shrug*

    • We could even call it HTTP/2.0. Or I guess that is what we called it.

  • by Anonymous Coward

    I reload pages for a variety of reasons depending on what I am doing. I already have to drop into dev tools and choose from a variety of reload "flavors" to get some tasks done. If they must persist in deciding for me what I really want to do based on what other people "typically" want to do, at least make the option to express my actual goal easier to access.

    Or, you know, GTFO with your over-engineered "solutions".

  • Actual Summary (Score:3, Informative)

    by Anonymous Coward on Thursday January 26, 2017 @02:23PM (#53743797)

    Because it'd be awful if the summary actually summarized what was being done. From the article:

    To overcome this issue, the team simplified Chrome’s reload behavior and it now only validates the main resource. Facebook, just like other pages, says its pages now reload 28 percent faster, too, so the next time you want to check if your friends finally posted new pictures of their cute corgis to Facebook (and you are using the web app instead of the native FB app), you’ll now get the answer faster.

  • TLDR (Score:5, Informative)

    by Anonymous Coward on Thursday January 26, 2017 @02:23PM (#53743799)

    One liner description of the change...
    They made refresh 28% faster by having it no longer refresh.

  • by gQuigs ( 913879 ) on Thursday January 26, 2017 @02:28PM (#53743837) Homepage

    Guess it makes Ctrl-F5 even more useful...

  • I'm hoping this article is either wrong or incomplete. Otherwise, won't this mean a significant increase in breakages? Suppose the main resource relies on two resources, one of which is in the cache, the other of which isn't. Those two resources implicitly depend on one another in some way (e.g. a new version of JavaScript code might require a new version of CSS, or else rendering would be wrong, and vice versa). If the browser validates only the main resource, then unless the URLs for the resources change, you could end up with a fresh copy of one and a stale cached copy of the other.

    • by tepples ( 727027 )

      I've had so much trouble with CloudFlare caching that I've started putting version numbers in every JS and CSS filename

      Such a versioned URL scheme has in fact been the best practice for several years now, as it lets you use far-future Expires: dates to reduce bandwidth use by return visitors.
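      As a concrete sketch of that versioned-URL practice (hypothetical file names, not anything from the article): a build step bakes a content hash into each asset's name, so the URL itself changes whenever the content does, and the old cached copy is simply never requested again.

        import { createHash } from "node:crypto";
        import { readFileSync } from "node:fs";

        // Derive a short content hash and put it in the file name,
        // e.g. app.js -> app.3f2a91c8.js (the hash value shown is hypothetical).
        function versionedName(path: string): string {
          const hash = createHash("sha256")
            .update(readFileSync(path))
            .digest("hex")
            .slice(0, 8);
          return path.replace(/(\.[^.]+)$/, `.${hash}$1`);
        }

        // The hashed file can then be served with a far-future lifetime,
        // e.g. Cache-Control: public, max-age=31536000, immutable.
        console.log(versionedName("app.js"));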

    • Just give new versions of your content unique names, and problem solved.

  • From what I understand, this changes reload behavior so that reload doesn't completely reload a page. Won't this break the reload behavior when testing a page you're developing and/or when browsing pages that glitched temporarily? Would we end up with a "hold shift during reload" that all tech people will use instead?

    • by Anonymous Coward

      Given how many problems I've encountered with aggressive or incorrect caching in Chrome (seriously, there are won't-fix bugs listed because they interpret the RFCs differently from everyone else), I'm not sure how this differs from the current shitty behaviour. Awesome: more hacks and workarounds to get things to reload correctly, because Chrome's developers crave speed over usability or RTFRFC.

      Y'know what speeds pages up massively? Blocking all the ads, tracking scripts, and other crap that gets downloaded.

    • If you're doing web development, try opening the web development tools and turning off the browser cache. Press F12. In Chrome it's a checkbox called "Disable cache" under the "Network" tab. IE has an "Always refresh from server" option on its network page too.

  • Will the advertisers on the page see the additional hit?
  • by Anonymous Coward

    First a scheduler and now caching. What OS feature should come next?

    • The next version of Chrome will contain a complete implementation of Emacs, which qualifies as an operating system.

      Although the Emacs operating system needs a better editor.

  • by xfizik ( 3491039 ) on Thursday January 26, 2017 @02:54PM (#53744039)
    So Mozilla helped Google make Chrome faster? It's not clear what Mozilla's role and benefit in all of this were...
    • Their tagline is "Internet for people, not profit".
      I wasn't aware it was "Let's shun the other browser makers and continue to fuck up Firefox".

    • Re: (Score:2, Informative)

      by Anonymous Coward

      There's actual information at this link, as opposed to the glorified Chrome ad that was submitted: https://code.facebook.com/posts/557147474482256/this-browser-tweak-saved-60-of-requests-to-facebook

    • by AmiMoJo ( 196126 )

      Firefox seems to have been doing this for a while. When I refresh a page in FF it doesn't usually reload big images, just the main HTML.

  • by WaffleMonster ( 969671 ) on Thursday January 26, 2017 @03:26PM (#53744307)

    Engineer A: Let's reinterpret intent to make it faster.
    Engineer B: Let's reinterpret intent to make it current.
    GOTO A

    The history of HTTP cache headers is filled with this same contention between different people trying to reinterpret the meaning of words to further their narrow agendas.

    This crap always ends with everyone having a headache without solving anything.

    If you want to make reload better, try adding mechanisms to explicitly signal intent so it can be explicitly acted on, rather than hacking shit to make it work better for *you* because you can.

    • by DamonHD ( 794830 )

      I don't think that that's entirely fair.

      Sometimes, only through extensive use in a huge variety of use cases does it become clear that the previously-thought-simple 'cache for a bit' needs rather more nuance...

      Can this be cached only if public?

      What if a different language was requested the second time?

      What if different content encodings are acceptable to my next cache user?

      Can I continue to show a slightly stale copy for a while rather than failing completely, and if so how long?

      Can I see if the meaning of

  • by viperidaenz ( 2515578 ) on Thursday January 26, 2017 @03:47PM (#53744477)

    Add the ability to put an ETag in the HTML document alongside the resources, so the browser doesn't have to make a conditional "If-None-Match" request to check whether it is stale.

    <img href="blah" etag="12345" />

    • by Anonymous Coward

      <img href="blah?v=12345" />

      done.

  • Great, now I get to the six-minute-long "Waiting for cache" message 28% faster!
  • Can you read 28% faster??
  • I'm not a Facebook member. I never will be. Yet when I load pages from almost anywhere on the web I can detect background traffic to Facebook (unless I explicitly block traffic to Facebook in my HOSTS file). Facebook has absolutely no right to know what I'm doing on sites as diverse as my local TV station, Slashdot and Linux support sites, yet that traffic is considerable and can't help but affect my Internet performance. I bet they could have made Chrome work even faster if they blocked traffic to Facebook.
    • Facebook has absolutely no right to know what I'm doing on sites

      I'm sure you know this, but what's happening is that these sites you visit have an agreement with Facebook, and write/use code to share this information. They certainly have a right to do this (not that I like it, and in fact use an Adblock filter to strip Facebook from all non-Facebook sites).

  • So they have been doing a Shift-refresh for every refresh?
  • The original article quotes a Facebook article, which talks about reducing requests by using caching with long-expiry headers.

    Their approach: an expiry header of one year, and a filename containing a content hash.
    This means Facebook spams your browser cache with data that will never be accessed again once they change it, just to reduce the number of "If-Modified-Since" requests.
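    For reference, a minimal sketch of the serving side of that approach (hypothetical paths and port, not Facebook's actual setup): hashed assets get a one-year, immutable cache lifetime, and the content hash in the URL, rather than a conditional request, is what invalidates old copies.

      import { createServer } from "node:http";
      import { readFile } from "node:fs/promises";

      // Anything under /static/ is assumed to carry a content hash in its name
      // (e.g. /static/app.3f2a91c8.js), so it is safe to cache for a year.
      createServer(async (req, res) => {
        if (req.url && req.url.startsWith("/static/")) {
          const body = await readFile("." + req.url); // hypothetical file layout
          res.writeHead(200, { "Cache-Control": "public, max-age=31536000, immutable" });
          res.end(body);
        } else {
          res.writeHead(404);
          res.end();
        }
      }).listen(8080); // hypothetical port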
