HTTP 103 - An HTTP Status Code for Indicating Hints (ietf.org)

The Internet Engineering Task Force (IETF) has approved the new HTTP status code 103. The new status code is intended to "minimize perceived latency." From the RFC:

It is common for HTTP responses to contain links to external resources that need to be fetched prior to their use; for example, rendering HTML by a Web browser. Having such links available to the client as early as possible helps to minimize perceived latency. The "preload" ([Preload]) link relation can be used to convey such links in the Link header field of an HTTP response. However, it is not always possible for an origin server to generate the header block of a final response immediately after receiving a request. For example, the origin server might delegate a request to an upstream HTTP server running at a distant location, or the status code might depend on the result of a database query. The dilemma here is that even though it is preferable for an origin server to send some header fields as soon as it receives a request, it cannot do so until the status code and the full header fields of the final HTTP response are determined. [...]

The 103 (Early Hints) informational status code indicates to the client that the server is likely to send a final response with the header fields included in the informational response. Typically, a server will include the header fields sent in a 103 (Early Hints) response in the final response as well. However, there might be cases when this is not desirable, such as when the server learns that they are not correct before the final response is sent.

A client can speculatively evaluate the header fields included in a 103 (Early Hints) response while waiting for the final response. For example, a client might recognize a Link header field value containing the relation type "preload" and start fetching the target resource. However, these header fields only provide hints to the client; they do not replace the header fields on the final response. Aside from performance optimizations, such evaluation of the 103 (Early Hints) response's header fields MUST NOT affect how the final response is processed. A client MUST NOT interpret the 103 (Early Hints) response header fields as if they applied to the informational response itself (e.g., as metadata about the 103 (Early Hints) response).
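A minimal sketch of the exchange the RFC describes, in Python. This is not a production server; the port, file name, and artificial delay are made up for illustration, and the wire format is spelled out by hand so the 103/200 sequence is visible:

```python
import socket
import time

HTML = b"<!doctype html><link rel='stylesheet' href='/style.css'>hello"

# The informational response: status 103 plus the "preload" hint.
EARLY_HINTS = (
    b"HTTP/1.1 103 Early Hints\r\n"
    b"Link: </style.css>; rel=preload; as=style\r\n\r\n"
)

# The final response repeats the Link field, as the RFC says is typical.
FINAL = (
    b"HTTP/1.1 200 OK\r\n"
    b"Content-Type: text/html\r\n"
    b"Link: </style.css>; rel=preload; as=style\r\n"
    b"Content-Length: %d\r\n\r\n" % len(HTML)
) + HTML

with socket.create_server(("127.0.0.1", 8080)) as srv:
    conn, _ = srv.accept()
    with conn:
        conn.recv(4096)            # read (and ignore) the request head
        conn.sendall(EARLY_HINTS)  # the hints go out immediately...
        time.sleep(0.5)            # ...while the slow backend "thinks"...
        conn.sendall(FINAL)        # ...and the final response follows
```

A client that understands 103 can start fetching /style.css during that half-second gap instead of only after the HTML arrives.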
  • by Anonymous Coward

    Great guys! I can see a few ways to use this to fake out servers and do some nasty things!

    We love it!

    Yours:

    The Russian Hacker's Association

  • by Opportunist ( 166417 ) on Tuesday October 31, 2017 @09:09AM (#55463005)

    Strip your bullshit JS code and deliver the content rather than the ads. Until you do, no header will improve performance or "user experience".

Let's face it, no user gives a shit just how quickly you serve your ads. He wants the content, and guess what, you don't give half a shit about how fast he gets it.

    • by Merk42 ( 1906718 )
      I agree! Everyone knows all websites have the static functionality of a Wikipedia article.
• If you encounter a viewer who desires only "the static functionality of a Wikipedia article", do not force the viewer into more interaction than that. Build navigation through links to other documents on your site. Use the styling and transition functionality built into CSS to style your HTML. Build collapsible elements out of a hidden checkbox with a visible label, as sketched below.
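For instance, a no-script collapsible element of the kind just described can be built like this (a minimal sketch; the id and class names are made up):

```html
<style>
  #toggle { display: none; }                    /* hide the checkbox itself */
  #toggle + label { cursor: pointer; }          /* its visible label toggles it */
  .panel { display: none; }                     /* collapsed by default */
  #toggle:checked ~ .panel { display: block; }  /* expand when checked */
</style>
<input type="checkbox" id="toggle">
<label for="toggle">Show details</label>
<div class="panel">Collapsible content, no JavaScript involved.</div>
```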

        • by Merk42 ( 1906718 )
          Full page reloads for any action from the user.
          Outdated information for dynamic values (as in something that often changes, like a sports score, or stocks)

          Would you care to cite any websites that do what you claim that aren't just a list of articles?
          • by tepples ( 727027 )

            Let me predict what an anti-script curmudgeon might think of your objections.

            Full page reloads for any action from the user.

            "It'd still end up loading less data than a half-dozen JS frameworks and ad exchanges' real-time bidding scripts."

            Outdated information for dynamic values (as in something that often changes, like a sports score, or stocks)

            "When I want up-to-date information, I'll request it myself. My F5 key isn't broken, you know."

            Would you care to cite any websites that do what you claim that aren't just a list of articles?

            "I can, for example, get the weather on National Weather Service (weather.gov) without script. If I wanted something more interactive than an HTML form, I would download an application, compile it, and install it on my compute

            • by Merk42 ( 1906718 )

              Full page reloads for any action from the user.

              "It'd still end up loading less data than a half-dozen JS frameworks and ad exchanges' real-time bidding scripts."

              I'm sure that's the case on strawman.com

              Outdated information for dynamic values (as in something that often changes, like a sports score, or stocks)

              "When I want up-to-date information, I'll request it myself. My F5 key isn't broken, you know."

              "Full page reloads for any action from the user."

              Would you care to cite any websites that do what you claim that aren't just a list of articles?

              "I can, for example, get the weather on National Weather Service (weather.gov) without script.

Thanks for the example. Weather.gov looks great on my phone (not). It's a perfect example of what curmudgeons would want, if web development had just stopped around 2004.

Not that I'm moving goalposts, but I'm wondering if there is an example of something like an e-commerce site (therefore few or no ads) that is something more complicated than a Google question.

              If I wanted something more interactive than an HTML form, I would download an application, compile it, and install it on my computer."

              We're sorry, the following website is not compatible with your

              • by tepples ( 727027 )

I'm wondering if there is an example of something like an e-commerce site (therefore few or no ads) that is something more complicated than a Google question.

                How well does, say, Phil's Hobby Shop [philshobbyshop.com] work with script off?

                Disclosure: Phil's Hobby Shop employs me.

                • by dave420 ( 699308 )

                  It looks like someone threw up on a time-travelling toaster.

                • by Rakarra ( 112805 )

Looks pretty good, but all the links on the big promotion window are broken. I'll admit, I would hope that the DX8 and the VisionAire labels would actually just change tabs on that promotion window, but right now all of them take you to a "discontinued item" page.

I'll admit that it looks "ok" on my phone. Not great, but most of the over-architected sites look WORSE on the phone instead of better, so that's a point in favor of the static site.

              • by Rakarra ( 112805 )

Thanks for the example. Weather.gov looks great on my phone (not). It's a perfect example of what curmudgeons would want, if web development had just stopped around 2004.

                Hmm. Maybe web sites looked better and operated a hell of a lot faster in 2004 than the overloaded sites we have today.

                • by Merk42 ( 1906718 )

                  Hmm. Maybe web sites looked better and operated a hell of a lot faster in 2004 than the overloaded sites we have today.

Yes, I love having to constantly zoom in and out on my phone [slashdot.org].

                  • by Rakarra ( 112805 )

Half the web sites I go to I have to "request desktop site" because the mobile version sucks. I think I prefer the zoom-in paradigm.

                    But having a mobile version doesn't mean you need a 2017-style website nightmare.

            • by Rakarra ( 112805 )

              Outdated information for dynamic values (as in something that often changes, like a sports score, or stocks)

              "When I want up-to-date information, I'll request it myself. My F5 key isn't broken, you know."

But then you'll have used far more bandwidth, time, and processing power reloading the whole page than hours of a self-updating counter would have consumed.

What I -really- want is a browser feature where only the tab that the mouse is hovering over will have running javascript. Everything, everything else gets "frozen." If the browser doesn't have current focus, then no scripts run at all. I'm sure that will break a little functionality, but a user like me would appreciate an advanced feature like this.

              • by tepples ( 727027 )

What I -really- want is a browser feature where only the tab that the mouse is hovering over will have running javascript. Everything, everything else gets "frozen." If the browser doesn't have current focus, then no scripts run at all.

                With a user-managed whitelist for things like music streaming services and web-based rich chat rooms, I assume.

                • by Rakarra ( 112805 )

                  Sure, sounds good to me. I know it sounds complex and maybe not something the average user would want to hassle with, but maybe the average user is similarly frustrated with web browser slowness and might find that small bit of fiddling worth the overall faster experience.

                  Or maybe I'm the only guy with 5+ tabs open at once. ^_^

    • Google tried to tell our marketing group "our site is slow". Strangely, Google's "solutions" did not involve reducing the tagging on the site. We produced documentation showing that the site content loads in ~ 800ms and is responsive, and the remainder of the "slowness" is the barnacles and leeches.

• That user will care when the service goes away. Making content, like everything, costs money. Maybe if we ever get UBI we can talk. Until then, you live with the ads or stop going to the page and complaining.
• Let's try that, shall we? I keep hearing the story of the falling sky: if we abolish spam and ads, the internet will cease to exist, for it depends on both. But let's try it.

Hey, wait, I remember a time when it actually WAS that way. Odd. How did webpages exist in the pre-dot-com time, one has to wonder. I know. They were made by people who wanted to actually say something, and even needed to have the brain cells to slap together a webpage to do so, which had the nice side effect of improving the signal-to-noise ratio.

      • That user will care when the service goes away.

        I really wish a lot of it would just go away. I'm sick to death of scumbag aggregation sites appearing near the top of Google search results - existing just to serve Google ads on top of content stolen from reputable sites.

  • by Cigaes ( 714444 ) on Tuesday October 31, 2017 @09:11AM (#55463011) Homepage

“Nice” (i.e. commercial) websites have become immensely complex in design. Not because their needs have grown immensely complex; they have grown, but not that much. It is because they are poorly designed in their workings. Developers in the same company cannot be bothered to make reusable libraries, and when they can, they mess up API stability and require several versions of the same library within a single project. Requests are resolved through queries to caches to proxies to... No wonder the output of even a single request cannot be decided before examining the whole universe and its neighbourhood.

    It seems to me this feature is just another step in that direction: make things a little more complex for an immediate gain, and let the future take care of itself. Slowly.

    • by Anonymous Coward

      Because they are poorly designed in their workings.

      They have too much crap and ads and scripts. Why does there have to be a popup over everything my mouse hovers over?

And being stuck with an AT&T 1.5/0.25 Mbps shit connection, most websites load like shit. The page renders, I start reading, it renders some more, I wait as it continuously jumps around ... and I leave.

And with more and more news sites having video pop-ups - even though I turned off autoplay, the fucker still takes up a shitload of bandwidth and makes the site unreadable.

      • CNN and Huffington Post would be mostly useless regardless of how fast their content becomes available. IOW, their problem is their content, not how it is presented.

    • “Nice” (i.e. commercial) websites have become immensely complex in design. (...) Because they are poorly designed in their workings.

      Well, unfortunately, doesn’t this apply to all software in general (not just websites) nowadays?
      </off_topic_rant>

  • by fibonacci8 ( 260615 ) on Tuesday October 31, 2017 @09:29AM (#55463119)
    To me this looks like an opportunity for a client to opt out of undesirable content, but in a way that a server can detect prior to sending the desired content. At first glance it's another vector for undesirable content purveyors to bypass local DNS policy.

    A client can speculatively evaluate the header fields included in a 103 (Early Hints) response while waiting for the final response. For example, a client might recognize a Link header field value containing the relation type "preload" and start fetching the target resource. However, these header fields only provide hints to the client; they do not replace the header fields on the final response.

This part reads exactly as if the "hints" are meant as an opportunity to avoid delivering content when the hints aren't properly "obeyed". If the "preload" fetch doesn't happen, and a third party doesn't relay that the undesirable content was at least transmitted, the first server can continue to wait until the demand is met.

    • If the "preload" directive doesn't happen and a third party doesn't relay that the undesirable content is at least transmitted, the first server can continue to wait until the demand is met.

      That's only a minor inconvenience to ad blockers, as they can load the data then throw it away without passing it to the rendering layer.

      It seems to me that what's really happening is that the content servers will proxy the ads from the third parties through themselves, bypassing the host-based restriction that ad blockers provide. You can't block the ads without blocking the content.

  • by rsclient ( 112577 ) on Tuesday October 31, 2017 @09:42AM (#55463187) Homepage

    I think I've seen this rodeo before. What I see is that web developers work to make their site "fast enough". In Scrum terms, they don't apply premature optimizations. They use too many modules with too many dependencies and assume everyone has a fast internet.

My two predictions: this will just encourage web site bloat, and a bunch of people are going to discover that their cheap-and-barely-working HTTP parsers don't actually handle 100-series responses (a sketch of what that handling involves follows below). They are also going to discover that many high-level APIs don't provide any access to this new paradigm.
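For the record, "handling 100-series responses" only requires reading response heads until a non-1xx status line arrives. A sketch in Python, assuming a hypothetical HTTP/1.1 server on localhost:8080 (body parsing omitted for brevity):

```python
import socket

def read_head(f):
    """Read one status line plus headers from a file-like socket."""
    status = f.readline().decode().rstrip("\r\n")
    headers = []
    while (line := f.readline().decode().rstrip("\r\n")):
        headers.append(line)
    return status, headers

with socket.create_connection(("127.0.0.1", 8080)) as s:
    s.sendall(b"GET / HTTP/1.1\r\nHost: localhost\r\n\r\n")
    f = s.makefile("rb")
    while True:
        status, headers = read_head(f)
        if 100 <= int(status.split()[1]) <= 199:
            print("informational:", status, headers)  # e.g. act on preload hints
            continue                 # a final response head is still coming
        print("final:", status)      # non-1xx: this head owns the body
        break
```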

    • ... They use too many modules with too many dependencies and assume everyone has a fast internet. ...

They also assume everyone likes to read low-contrast text in sunlight.

• The only good use I can see for it is that, once most webservers have it, you can use much more aggressive timeouts before trying one of the other IPs you were given by DNS (a rough sketch follows below). This could dramatically improve responsiveness of content in cases where the primary route goes through an overloaded node, or one of the servers is down, without relying on a separate service like Cloudflare.
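A rough sketch of that idea in Python; the example host and the 300 ms per-address timeout are invented for illustration:

```python
import socket

def connect_any(host, port, timeout=0.3):
    """Try each address DNS returns for host, with a short timeout per IP."""
    last_err = OSError("no addresses for %s" % host)
    for family, socktype, proto, _, sockaddr in socket.getaddrinfo(
            host, port, type=socket.SOCK_STREAM):
        s = socket.socket(family, socktype, proto)
        s.settimeout(timeout)
        try:
            s.connect(sockaddr)
            return s                    # first server to answer wins
        except OSError as err:          # timed out, refused, unreachable...
            s.close()
            last_err = err              # ...so fall back to the next address
    raise last_err

# conn = connect_any("example.com", 80)   # hypothetical usage
```

An early 103 makes aggressive timeouts viable because a healthy server can prove it is alive within milliseconds, long before the full response is ready.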

    • by Rakarra ( 112805 )

      I think I've seen this rodeo before. What I see is that web developers work to make their site "fast enough". In Scrum terms, they don't apply premature optimizations. They use too many modules with too many dependencies and assume everyone has a fast internet.

      I also see a lot of websites that worked well when you had 10-20 users, but when you got to thousands, they started getting pokey slow.

  • With enhancements such as this one, are we just continuing to extend the life of the HTTP protocol, when we really should be taking it out behind the barn and putting it out of its misery?
• If you need it, just put it in the ordinary header, FFS. That way you don't break existing HTTP clients.

• So basically you can "tell" the client, before sending the page, that the page has a number of things in its "head" and "script" tags, and then the browser can pre-load them or multiplex the requests instead of waiting for the HTML to load, parsing it, and only then loading them.

The problem/security issue will be when you pre-load content from external sites, just as with regular JavaScript you load externally. It basically wastes a bit of bandwidth in the hope that you have a high-bandwidth, high-latency link.

  • HTTP 103

    Hint: I'm sorry, Dave. I'm afraid I can't do that.
