Programming / Security

A Plan for Improving JavaScript's Trustworthiness on the Web (cloudflare.com) 46

On Cloudflare's blog, a senior research engineer shares a plan for "improving the trustworthiness of JavaScript on the web."

"It is as true today as it was in 2011 that Javascript cryptography is Considered Harmful." The main problem is code distribution. Consider an end-to-end-encrypted messaging web application. The application generates cryptographic keys in the client's browser that lets users view and send end-to-end encrypted messages to each other. If the application is compromised, what would stop the malicious actor from simply modifying their Javascript to exfiltrate messages? It is interesting to note that smartphone apps don't have this issue. This is because app stores do a lot of heavy lifting to provide security for the app ecosystem. Specifically, they provide integrity, ensuring that apps being delivered are not tampered with, consistency, ensuring all users get the same app, and transparency, ensuring that the record of versions of an app is truthful and publicly visible.

It would be nice if we could get these properties for our end-to-end encrypted web application, and the web as a whole, without requiring a single central authority like an app store. Further, such a system would benefit all in-browser uses of cryptography, not just end-to-end-encrypted apps. For example, many web-based confidential LLMs, cryptocurrency wallets, and voting systems use in-browser Javascript cryptography for the last step of their verification chains. In this post, we will provide an early look at such a system, called Web Application Integrity, Consistency, and Transparency (WAICT), which we have helped author. WAICT is a W3C-backed effort among browser vendors, cloud providers, and encrypted communication developers to bring stronger security guarantees to the entire web... We hope to build even wider consensus on the solution design in the near future....

We would like to have a way of enforcing integrity on an entire site, i.e., every asset under a domain. For this, WAICT defines an integrity manifest, a configuration file that websites can provide to clients. One important item in the manifest is the asset hashes dictionary, which maps the hash of each asset the browser might load from that domain to that asset's path.
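To make that concrete, here is a sketch of what such a manifest might look like. The format is still being standardized (as the post notes below), so the field names here are hypothetical:

    {
      "version": 1,
      "asset_hashes": {
        "sha256-<base64 hash of app.js>": "/js/app.js",
        "sha256-<base64 hash of worker.js>": "/js/crypto-worker.js",
        "sha256-<base64 hash of main.css>": "/css/main.css"
      }
    }

A browser holding this manifest can hash each asset it fetches from the domain and refuse to load anything whose hash does not appear in the dictionary.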

The blog post points out that the WEBCAT protocol (created by the Freedom of the Press Foundation) "allows site owners to announce the identities of the developers that have signed the site's integrity manifest, i.e., have signed all the code and other assets that the site is serving to the user... We've made WAICT extensible enough to fit WEBCAT inside and benefit from the transparency components." The proposal also envisions a service storing metadata for transparency-enabled sites on the web (along with "witnesses" who verify the prefix tree holding the hashes for domain manifests).
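To illustrate the kind of check involved, the snippet below hashes a fetched asset and looks it up in a manifest shaped like the hypothetical one above. Note that in the actual proposal this enforcement would live inside the browser engine; page JavaScript cannot bootstrap its own trust, which is the whole point:

    // Illustrative sketch only, not the WAICT algorithm: verify one asset
    // against a hash -> path dictionary using the standard Web Crypto API.
    async function checkAssetIntegrity(manifest, path, bytes) {
      const digest = await crypto.subtle.digest("SHA-256", bytes);
      const b64 = btoa(String.fromCharCode(...new Uint8Array(digest)));
      return manifest.asset_hashes["sha256-" + b64] === path;
    }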

"We are still very early in the standardization process," with hopes to soon "begin standardizing the integrity manifest format. And then after that we can start standardizing all the other features. We intend to work on this specification hand-in-hand with browsers and the IETF, and we hope to have some exciting betas soon. In the meantime, you can follow along with our transparency specification draft,/A>, check out the open problems, and share your ideas."

A Plan for Improving JavaScript's Trustworthiness on the Web

Comments Filter:
  • Better idea. (Score:5, Insightful)

    by Gravis Zero ( 934156 ) on Sunday October 19, 2025 @08:28PM (#65737104)

    Phase out JavaScript.

    • by buck-yar ( 164658 ) on Sunday October 19, 2025 @08:33PM (#65737106)
      With the direction things are headed, we're probably more likely to see a return of Shockwave or Flash
      • The web was more fun when Flash was around.

        • At least there are no more "under construction" pictures of half-made websites. Commercialization is complete and gone are the hobbyists. Coincidentally, everything is enshittified (or maybe it's no coincidence).
        • Was it really? I'm guessing you were a graphic artist? Flash wasn't the web, it was a binary program. You've been able to do more and better things with CSS for a long time now.

          Not that I want things flying around and squawking on a web page, but flash *was* fun. Managers and graphic artists loved it, but it wasn't very good for anything but games or entertainment.
          • by dfghjk ( 711126 )

            It appears you don't even know what flash was. Replacing it took far more than CSS.

            • It was a closed-source binary installed as a browser plugin/extension. You needed the Adobe Flash Player, which had direct access to the hardware to display raster and vector graphics; web pages don't do that.
              • You can do more and better things with css for a long time now.

                binary that had direct access to the hardware to display raster and vector graphics

                Which is it? You can do more and better things, or you can access the hardware for cool effects? These seem to contradict.

                • I don't see a contradiction.
                  CSS is fully integrated with the DOM, whereas Flash was a separate binary you communicated with via an API, i.e., you had to call out of the browser. Two separate things. I'll turn it around: what can't you do with CSS that you could with Flash? If you want to make a game, you *may* have an argument for deeper hardware integration, but I developed some games that adhered to W3C standards, so it's not impossible. Flash is just like all software: people who develop in it want everyone to
      • I find it so sad that once Flash was killed off and we had an opportunity to turn the web back into a document platform, the first thing we did was re-invent a new application-centric platform with JavaScript.

        Huzzah, the Flash monopoly is dead! Long live the Chrome monopoly!

        The only thing the Flash killswitch did was destroy 25+ years of Internet culture, like all those great interactive cartoons. I miss those so much.

        • by znrt ( 2424692 )

          I find it so sad that once Flash was killed off and we had an opportunity to turn the web back into a document platform

          wrong tree.

          we still have plenty of opportunity for the web to be a document platform. nothing stops you from just publishing plain html documents, and even better than before because html has evolved a bit to allow for extra semantics and accessibility. there's people actually doing this, but they're the exception. why? because the vast (vast!) majority wants interaction on the web, and doesn't care. so your real issue is with people and the world, not with flash or javascript.

          the Flash monopoly is dead!

          flash never was a monopoly, no

      • or maybe CGI-BIN
      • by shanen ( 462549 )

        Mod parent funnier, but what was Microsoft's version called? Probably still around with new branding...

        • by kackle ( 910159 )
          I thought it was called "Blackbird". It was a lesson for me because I recall that Microsoft flew a bunch of developers out to their campus to learn it, and then a short time later (months?) it was all cancelled. I imagined the wasted effort of those poor folks and eventually saw that concept repeated all over the digital landscape over the decades--developer cats chasing flashlight spots on a wall.
    • Re:Better idea. (Score:5, Insightful)

      by PPH ( 736903 ) on Sunday October 19, 2025 @09:11PM (#65737156)

      At least, remove JavaScript's ability to communicate with anything other than the originating web server. JavaScript, as part of a web page, has some uses for offloading processing from the back-end server. And that is all it should be used for.

      I load a web page by opening an SSL connection. I download some HTML, style sheets and whatnot through that connection. I can also download some JavaScript through that same pipe and execute it locally. But it should never be more than an extension of that page. And if it needs to pass data back through that open connection to enhance the functionality of the page, OK. But it should have no ability to open its own connections and negotiate its own encryption with third parties. It should have no concept of the outside network beyond the server from which it came.

      • you're confusing a scripting language with a script executed by a browser... so you're barking up the wrong tree
        • by PPH ( 736903 )

          I could extend this principle to any script or executable run by a browser. Which is what I assume is meant by "on the Web".

          • by SirSlud ( 67381 )

            Then you would be wrong. Probably best to recuse yourself from weighing in on this discussion.

      • by dfghjk ( 711126 )

        "I can also download some JavaScript through that same pipe and execute it locally. But it should never be more than an extension of that page."

        What's a page?

        "But it should have no ability to open its own connections and negotiate its own encryption with third parties. It should have no concept of the outside network beyond the server from which it came."

        That would work great with websockets. Won't be asking you for architectural advice.

        • by PPH ( 736903 )

          That would work great with websockets.

          Sarcasm noted. On the other hand, perhaps we need to lock down websockets as well. There are too many sites that build up trust and get their own server certificates. But then sell out and become nothing more than portals for garbage apps coming from anywhere on the Internet.

        • by tepples ( 727027 )

          What's a page?

          A "page" is an HTML document retrieved through an HTTP or HTTPS URL. I think PPH is proposing enforcing a stricter same-origin policy. Instead of CORS, the document's origin server would have to act as a proxy to retrieve any third-party resources needed by the client-side script.

      • I could accept reading data from approved sources, but in most cases it's something the originating server could do.

      • by znrt ( 2424692 )

        this already exists and it's called sop (same origin policy), which is enabled by default but can be explicitly bypassed by enabling cors (cross origin resource sharing), and very often is because, you know, strict sop is simply too strict for many needs. you're not proposing a change in javascript, you're asking for the entire www to be redesigned.
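        for example, a server relaxes sop for a single trusted origin by sending cors response headers like these (a sketch; the origin is hypothetical):

            Access-Control-Allow-Origin: https://trusted-partner.example
            Access-Control-Allow-Methods: GET, POST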

        there are already ways to do what the author seems to want, but they're specific. i've not delved into the details (actually didn't rtfa (yet)) but i'll be skept

      • by tlhIngan ( 30335 )

        Also remove the ability to load JavaScript off any host other than the originating one. Right now things are a huge mess because scripts can be pulled from anywhere, increasing the attack surface. It's why things like NoScript default to not running scripts from unknown hosts.

        If every website had to host their own JavaScript, it would tighten things up a lot.

        Sure it would make some webmasters' jobs more difficult because they need to keep pulling in code from Facebook and Google and X and other social media
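        (Worth noting: for scripts a site does pull from a third-party host, Subresource Integrity already lets the page pin the exact bytes it expects, though that helps only if the page itself hasn't been tampered with. The hash below is a placeholder:)

            <script src="https://cdn.example/lib.min.js"
                    integrity="sha384-<base64 digest of lib.min.js>"
                    crossorigin="anonymous"></script>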

    • What would you replace it with? Just static pages? How is that an improvement?

      JavaScript has delivered on the unfulfilled promise of Java: a programming language that runs literally everywhere. As a software developer, it lets me write once, run anywhere: any browser, any manufacturer's device. Why is that bad?

      • Re:Better idea. (Score:4, Informative)

        by Waccoon ( 1186667 ) on Monday October 20, 2025 @03:44AM (#65737532)

        The problem isn't really JavaScript. The problem is the frameworks and libraries.

        I use an "alternative" web browser. Do you know how many web sites refuse to work on a browser that isn't explicitly branded as Chrome, Firefox, or Safari? Unless you're using vanilla JavaScript, which nobody actually does, hardly anything works universally across all browsers.

        Web developers are fucking idiots and aren't even aware that their code refuses to work on anything other than Chrome because they don't audit their frameworks. Hell, I still see tons of sites hard-linking their script resources directly to 3rd-party web sites because they're too lazy to update things locally and don't bother testing anything. Nobody seems to give a damn. If they can't be bothered to test before deployment, do you think they even remotely care about security?

        Honestly, yes, I'd much prefer the web went back to static pages. Please!

        Disclaimer: I've been a web developer since the late '90s. I stopped doing it professionally years ago because it's too damn depressing. Web development is, hands-down, the least disciplined field of software I've ever seen.

        • by AmiMoJo ( 196126 )

          Linking to 3rd parties is deliberate. It means the browser can use a cached version of that large Javascript framework, instead of re-downloading it and re-compiling it from scratch for every website.

          The situation with non-Chrome browsers is annoying. Even Firefox breaks a lot of sites, especially if you have privacy settings turned up. I tried a user agent changer, but Cloudflare detects it somehow even when it isn't active on that site, and you get into a captcha loop.

          It must be hell for people with disab

          • What you're experiencing is not a Firefox-specific thing, it's a "privacy settings turned up" thing. Website developers specifically *want* to be able to track you, regardless of browser. So when you block tracking, it breaks things.

        • There are reasons for the behaviors you see as "dumb."

          Linking to 3rd party websites for frameworks is a strategy that facilitates caching and improves security (since security updates will automatically be distributed).

          When it comes to "caring" about alternative browsers, what developers care about is the browsers their users are actually using. If 99% of their users are using Chromium, they aren't going to care about problems faced by the 1% who don't. Why put all that extra effort into supporting such a

          • Linking to 3rd party websites for frameworks is a strategy that facilitates caching and improves security (since security updates will automatically be distributed).

            Caching does not work across top sites anymore. Everything is partitioned.

            https://developer.mozilla.org/... [mozilla.org]
            https://privacysandbox.google.... [google.com]

            You might say you don't want pages that are built using javascript, and yet, you're on slashdot, which uses a bunch of javascript.

            Without noscript this site is mostly unusable.

            • Partitioning doesn't negate caching of things like frameworks. For example, if I have a script tag that references Bootstrap, it will certainly cache my copy of Bootstrap as I use it for my web site. This is called double-keyed caching. My copy of Bootstrap is cached for me, but other websites that use Bootstrap won't get MY cached copy.

              OK so apparently slashdot has done some work to make their site work without javascript. Many around the web have not.

    • "Specifically, they provide integrity, ensuring that apps being delivered are not tampered with, consistency, ensuring all users get the same app, and transparency, ensuring that the record of versions of an app is truthful and publicly visible."

      Wait, this is complete bullshit. App stores of COURSE have the same issue. There's INCREDIBLE trust put into Google and Apple to give you the same binary. But what if they gave 0.001% of users a different binary, due to being govt compelled to do so? What if the di

    • Do you realize that several popular desktop applications are written in Javascript and other web technologies, using Electron?

      Instead of phasing out JavaScript, how about supporting packaged Javascript applications that could be loaded and updated from the browser, with the consent of the user? (as opposed to Javascript files being loaded on demand at any time from any webpage). Big Tech doesn't like this, for various reasons. Firefox OS tried to do it, but failed. Progressive Web Applications may be an

      • by tepples ( 727027 )

        how about supporting packaged Javascript applications that could be loaded and updated from the browser, with the consent of the user?

        Chrome Web Store and addons.mozilla.org already implement "extensions".

  • Road to insanity (Score:4, Interesting)

    by WaffleMonster ( 969671 ) on Sunday October 19, 2025 @09:55PM (#65737216)

    The reference (from the Wayback Machine) lists three issues I could discern.

    Lack of a secure random number generator, implementation of an insecure password authentication scheme, and caching.

    This new document is gibberish: "By designing integrity and transparency together, we can make the web more trustworthy, ensuring that all users can rely on security-critical sites to deliver the code they promise, consistently and visibly."

    "Users visiting sites need to be sure that the content they are getting has not been tampered with by any actor between the content creator and the delivery of the content."

    "Users want to ensure they get access to the content that has not been modified by a third party; Websites that want to protect against malicious insider attacks; Websites that want to take additional steps to detect and protect against misconfigurations"

    Caching of common assets went out the window when caches were siloed by top site.

    There is now crypto.getRandomValues().
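    For instance, filling a buffer with cryptographically strong random bytes is now a one-liner:

        // Standard Web Crypto API: fills the typed array with CSPRNG output.
        const bytes = new Uint8Array(32);
        crypto.getRandomValues(bytes);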

    Custom password authentication algorithms are an extremely bad idea in the first place; executing JavaScript in the browser is too late / the wrong layer for this shit.

    The solution to sites not trusting "security critical" JS libraries is to load the damn libraries from a trustworthy source. What is the point of having secure libraries in the first place if your site isn't trustworthy? This whole scheme makes no sense. And all of the shit about users wanting assurance and transparency seems like BS. Even if there were a technical mechanism like CT, it would be completely foreign to normal people in the real world.

    The solution to missing primitives for managing PKI or password authentication is to bake high-level aspects/facilities into browsers. For example, rather than web forms soliciting passwords or people baking crypto via ad-hoc JavaScript, the browser's TLS stack would negotiate a PAKE ciphersuite and, preferably with involvement of the OS's secure attention sequence (SAS) mechanism, manage secure, phishing-resistant user password entry.

    Ditto for managing high level PKI operations between users and sites.

    I suspect ultimately the real point of these schemes will be enabling ad-hoc injection of facilities into browsers by large corporations. You will start seeing the equivalent of HTTPS-only features creep in, where only people in an exclusive trusted club get to leverage x, y, and z.

    • by AmiMoJo ( 196126 )

      For security we need to be able to modify Javascript, e.g. have a privacy enhancement add-on delete or re-write parts of it. Otherwise it will be abused for spying and DRM.

  • So their answer to trustworthiness is effectively a DRM scheme. This will go down well, I'm sure.

  • If you find yourself aiming to improve people's trust instead of making the language itself secure, you know you already lost
  • by _xeno_ ( 155264 ) on Monday October 20, 2025 @12:37AM (#65737422) Homepage Journal

    This is a plan intended to kill ad blockers. One of the major issues advertisers have right now is that they can't guarantee that their code will be executed as intended. A lot of ad systems now have anti-blocker technology that checks to make sure an ad loaded successfully, and if it didn't, or if the dimensions are wrong, or if the element is messed with, throws up a page-wide dialog to block access to the site. (Or do what Slashdot does at present: throw you into an infinite alert() loop.)
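    The check described above is usually some variant of the bait-element trick sketched below (generic illustration; the class name is just something a filter list might match, not any particular network's code):

        // Generic anti-adblock sketch: insert a "bait" element that filter
        // lists typically hide, then check whether it actually rendered.
        const bait = document.createElement("div");
        bait.className = "ad-banner";  // hypothetical class targeted by blockers
        bait.style.height = "10px";
        document.body.appendChild(bait);
        setTimeout(() => {
          if (bait.offsetHeight === 0) {
            // Hidden or removed: assume an ad blocker and nag the user.
            alert("Please disable your ad blocker");
          }
          bait.remove();
        }, 100);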

    But that requires JavaScript to work. Block that JavaScript, and you block the ability to block ad-blockers.

    Add in things to ensure "integrity" of JavaScript delivered to the client, and you break that. No more blocking scripts, no more blocking ads - or at least, no more blocking the scripts detecting if you're blocking ads.

  • by Anonymous Coward

    A Plan for Improving JavaScript

    Don't bother, still keeping it off as much as I possibly can.
    No amount of improvement will rescue it from the shitty ways sites use it. Against you.

  • I don't mean binary blobs.... I mean neckbeards who learned to blame China 25 years after the dotcom boom (conveniently a Y2K mass hysteria). Outsourcing and tampering EULAs are to blame for the decline in productivity. WASM 3 finally enables the use of REAL practice with REAL routines. Practical practices rudimentary routines... like memory safety! Like vibe-coding a sex game about JFK legalizing THC.
