Nvidia Rolls Out Its Fix For PC Gaming's 'Compiling Shaders' Wait Times (arstechnica.com)

Nvidia has begun rolling out a beta feature that automatically compiles game shaders while a PC is idle. It won't eliminate shader compilation the first time a game runs, but Ars Technica reports it could help reduce those repeated wait times. From the report: Nvidia's new Auto Shader Compilation system promises to "reduc[e] the frequency of game runtime compilation after driver updates" for users running Nvidia's GeForce Game Ready Driver 595.97 WHQL or later. When the feature is active and your machine is idle, the app will automatically start rebuilding DirectX shaders for your games so they're all set to roll the next time they launch.

While the feature defaults to being turned off when the Nvidia App is first downloaded, users can activate it by going to the Graphics Tab > Global Settings > Shader Cache. There, they can set aside disk space for precompiled shaders and decide how many system resources the compilation process should use. App users can also manually force shader recompilation through the app rather than waiting for the machine to go idle.

Unfortunately, Nvidia warns that users will still have to generate shaders in-game after downloading a title for the first time. The Auto Shader Compiler system only generates the new shaders needed after subsequent driver updates following that first run of a new title.

  • by Anonymous Coward
    Buffering....
  • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Thursday April 02, 2026 @12:32PM (#66074112) Homepage Journal

    Steam does this already and most of my games are delivered via steam, so most of my games have this already.

I think Steam does set the processes slightly nice, but I don't think they change the ioprio, so it can still have a negative impact on systems without fast storage. (I have mirrored NVMe SSDs, so this is only really a problem when it's done for infrequently played games, which are stored on HDD. That's a 3-way mirror too, though.)

    • by Bahbus ( 1180627 )

      Steam does this for Vulkan shaders, not DirectX.

      • Sucks for folks using a legacy, business desktop OS instead of something modern.
Are you talking about Vulkan? Because DirectX has quite a few features Vulkan doesn't, and it's generally considered easier to code with, providing a more consistent framerate with less stutter (though a slightly lower framerate overall).

Oh, sorry. You were not actually ignorant, just trying to poke fun at Microsoft. Hahahah. Yeah, good one.

          • I think Vulkan's overhead is slightly lower, but as far as I can tell it all balances out. I've never noticed any real difference in-game.
Heh, Steam does that at nice 19. Unless you're still running dnetc, it's going to yield to everything else.
      • That's good, I couldn't recall. But the IO priority is important, too. When I start intensive long-running processes where I don't care if they finish, I use nice ionice -c idle $* (I call it "nicer")

Yes and no. Steam does that only for Vulkan games, only for select hardware, and only for select drivers. Shaders are hardware *AND DRIVER VERSION* dependent. It works perfectly fine on something like the Steam Deck, but doesn't work if your choice of driver or hardware doesn't align with whatever Steam is doing, and doesn't help you at all if you don't have Steam always running. I.e., you update your drivers and a day later start Steam to play a game; there was no time to pre-compile the shader cache.

      Also back

You're not saying anything that I or someone else hasn't already said in this discussion, so I'm well on top of it, except that it's off by default; that part I didn't know. Perhaps I turned it on, I don't recall. As for your claim that it doesn't work if you're not "aligned" with Steam, whatever that's supposed to mean: it's always worked for me.

I think Steam does set the processes slightly nice, but I don't think they change the ioprio, so it can still have a negative impact on systems without fast storage. (I have mirrored NVMe SSDs, so this is only really a problem when it's done for infrequently played games, which are stored on HDD. That's a 3-way mirror too, though.)

      Mirroring an SSD? Does TRIM even work with mirrored SSDs?

Yes, I guess that is a stupidly simplistic question. On Steam, even on Linux, I have had that 'vulkan shaders' background thread running seemingly for tens of minutes any time a driver or game is updated. That's just stupid.
    • It takes tens of minutes here, too. It has to be updated when the game is changed because the assets which include the shaders have changed. It has to be updated when the driver is changed because the driver is what runs the shaders. If you don't precompile then the compilation has to be grunted out on demand, and your game will likely have chokes and stutters while it's done in realtime. IME for most titles it's not that bad and resolves itself in a few minutes.

      • your game will likely have chokes and stutters while it's done in realtime. IME for most titles it's not that bad and resolves itself in a few minutes

It's one of those issues that are often easy to ignore until suddenly they're game-breaking.

For many games that stream assets and build shaders on the fly, a bit of blurriness and stuttering when you first enter an area is something many players can forgive. Having that same experience walking into a boss's lair, where suddenly the game is choking and stuttering as resources are processed - that's a fatal flaw that can make the game difficult to play, or even outright kill the player while loading.

        The

        • by SirSlud ( 67381 )

Particularly for PvP competitive games that require a constant FPS (think CoD, Battlefield, Fortnite, etc.), runtime shader compilation is a nonstarter. CoD won't even let you matchmake without compiled shaders, even though the engine supports compiling them on demand.

    • My limited understanding is that the compiled shader is specific to the card and driver. So there are probably a lot of combinations.
      • by SirSlud ( 67381 )

Yes, this. There are many *many* combinations. Distributed compilation and a remotely hosted shader cache would cost a lot of money to host. I don't think the technical considerations are as prohibitive as simply the cost of hosting the service.

Microsoft announced Advanced Shader Delivery, which does that, delivering precompiled shaders, or partial ones when it's not possible to enumerate all combinations. They did it for the ROG Ally (obviously easier for fixed hardware) and now plan to extend it to PC. https://devblogs.microsoft.com... [microsoft.com]
          • by SirSlud ( 67381 )

Oh, that's pretty neat. Microsoft is definitely the right level to address this at - they already have permission to enumerate the HW, own the hardware and software infra to tackle this, enjoy economies of scale other players are not privy to, and can deliver a solution in a vendor-agnostic way. Thanks for the heads up. It's the right thing to happen.

        • by dgatwood ( 11270 )

Yes, this. There are many *many* combinations. Distributed compilation and a remotely hosted shader cache would cost a lot of money to host. I don't think the technical considerations are as prohibitive as simply the cost of hosting the service.

          What is the power consumption for doing this tens of millions of times, and how does the greenhouse gas emissions from that compare with the power consumption of running the servers? It seems like there are a lot of hidden costs in the current approach.

          • by SirSlud ( 67381 )

            Of course there are. Tragedy of the commons. My point is that no single entity is likely to absorb the costs unless they're already enjoying economy of scale advantages and there are business experience/optic benefits to doing so. The poster above you pointed out that Microsoft seems to be addressing this, which makes a lot more sense to me than doing it at the 3d HW vendor level.

      • by SirSlud ( 67381 )

        It's worth noting that many game studios/engines do support shared shader caches in their local studio pipelines, but the hardware config spread is much more limited, and the costs for lost productivity waiting for shaders is far greater than hosting a shader cache on premises.

  • So now, instead of taking a moment while the shaders compile to relax and eat some chips, maybe enjoy the local ambience of Mom's basement, you have to pause the game, eat some chips, then unpause? And you don't even get to read any helpful tips while you're eating? Sounds like a pain in the ass.
  • Console master race stays winning, you filthy PC gaming peasants.
    • by bn-7bc ( 909819 )
This just in: next-gen consoles (when/if they arrive) will be mostly PCs in fancy cases, running one version of Linux or another, so you might need to explain what exactly the big console advantage will be going forward?
      • Well, in this case, a distinct lack of needing to compile shaders.

        And consoles have been 'PCs in fancy cases' since the OG Xbox. Arguably the Dreamcast.

  • by reanjr ( 588767 ) on Thursday April 02, 2026 @12:59PM (#66074180) Homepage

    They need to implement BitTorrent or something. There's no reason everyone has to compile this shit themselves.

    • They can't be bothered to maintain any binary level compatibility within their drivers and instead make it your problem. But then that is exactly how the AI build-out works, they make it our problem and expense.
    • by SirSlud ( 67381 )

Asking people to host and serve a non-trivial amount of content to other players is a non-starter. (The compiled shaders for CoD can range from a couple of gigs to 10 gigs.) A torrent-like network would have to be opt-in; many people would just opt out (justifiably or not), minimizing the point of such a network.

You can probably assume that if you've thought of something, they've thought of it too. They simply have constraints and considerations - both technical and business oriented - you don't need or want to account for.

A torrent-like network would have to be opt-in; many people would just opt out

        Sure, but many people would opt in, especially if you explained that they would benefit.

        They simply have constraints and considerations - both technical and business oriented - you don't need or want to account for.

        Yeah, it's added complexity they would have to support and maintain. That alone is sufficient reason not to do it frankly.

        • by SirSlud ( 67381 )

          Sure, but many people would opt in, especially if you explained that they would benefit.

Maybe. Maybe not. Before committing to developing such a thing, you'd have to at least do some research and analysis to find out if that's true and how the likely opt-in/out ratios would impact the business case. Remember, this is hosting content in a daemon on your machine .. I think that'd be a non-starter for a lot of people, despite the upside of shorter shader updates. (I'm not super up on what the US ISP market/landsc

          • Data caps are still a thing but they mostly control download. Most users are on cable now, this is generally asymmetric, so the upstream is mostly just limited by practical considerations. (Upstream and downstream frequencies must differ in DOCSIS, and they dedicate more bandwidth to downstream for obvious reasons.)

      • World of Warcraft has been doing it for years

    • by SirSlud ( 67381 )

Also, a torrent-like network would be absolutely loaded with cache misses. You need to fetch a shader from somebody who has the exact same hardware/driver/game version combination as you do, and they need to have opted in. I suspect the majority case would be a cache miss, ending up compiling locally anyway.

      • That's a lot more common than you would think though because of automatic game and driver updates, and the march of upgrades necessary to play modern games. Most players of a particular game are on similar hardware.

      • by flink ( 18449 )

4080, Nvidia driver:latest, $popular_game:latest is gonna be a pretty easy tuple to match. Yeah, there will be misses, but who cares? You fall back to local compilation, waiting 10s.

    • by tlhIngan ( 30335 )

      They need to implement BitTorrent or something. There's no reason everyone has to compile this shit themselves.

      Technically it has to be done for every graphics card model out there.

Shaders are real programs, and your graphics driver ships with a compiler (usually based on LLVM) that takes those shader programs and produces the final binary for your specific video card. Now, usually the source code to the shaders is not given - instead they are in IR (intermediate representation), which is basically

    • They need to implement BitTorrent or something. There's no reason everyone has to compile this shit themselves.

      Yes there is, it's hardware and driver version dependent. It's far more efficient to just do the compilation in the background than to keep a precompiled version for each game for each combination of hardware and driver, x2 once for Vulkan and once for DirectX for games which support both.

For consistent platforms there is already a pre-compiled shader available. E.g. the Steam Deck will download precompiled shaders, provided you aren't running a beta version of Proton or an unofficial GPU driver.

      • by Jeremi ( 14640 )

        Yes there is, it's hardware and driver version dependent. It's far more efficient to just do the compilation in the background than to keep a precompiled version for each game for each combination of hardware and driver, x2 once for Vulkan and once for DirectX for games which support both.

        They could take that one step further: once your computer has compiled the appropriate shader for its particular combination of hardware/driver/etc, the game could upload that particular shader to a repository, so that the next install with the exact same combination of conditions could just download it instead of having to duplicate the work. I imagine there are a lot of people out there running functionally identical systems that would benefit.

        I suppose they don't do that because they don't trust people

        • That would add the requirement for the central repository as infrastructure which is probably not worth it bandwidth/storage-wise when so many gaming PCs are likely to be online at any time, but the possibility of a malware vector (or some kind of sabotage, maybe people would try to DoS a game by sharing corrupted compiled shaders as a form of protest) is worth considering with or without it.

          BOINC protects against errors or sabotage in their distributed computing system by having 2 random different users bo
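That redundancy check could be sketched like this (file names are placeholders; assumes a submission is accepted only when two independently compiled results are byte-identical):

```shell
#!/bin/sh
# BOINC-style redundancy check: accept an uploaded shader binary only
# when two independent submissions hash identically.
# result_a.bin / result_b.bin are placeholder file names.
hash_a=$(sha256sum result_a.bin | cut -d' ' -f1)
hash_b=$(sha256sum result_b.bin | cut -d' ' -f1)
if [ "$hash_a" = "$hash_b" ]; then
    echo "accept"
else
    echo "reject"
fi
```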

      • Steam already does this for PCs, not just the Steamdeck. None of us are unique and beautiful snowflakes with completely unknown hardware and drivers. There's a matrix. If a game is popular, you are highly likely to find other people sharing a cell with you in the matrix.

      • Those aren't reasons for everyone to be compiling on their own. In a BitTorrent-like system nobody would be "keeping" shaders they aren't using, just sharing shaders they've compiled because they're using them. If nobody's ever done it before for the hardware/driver combination then you fall back to compilation and then share your results so others can benefit and the same work doesn't have to be done again.

        Plus most users are probably on one of the latest driver versions so there would be far more hardware

    • by hvdh ( 1447205 )

I think you would need a specific compiled shader set for each combination of
      - exact game version
      - GPU driver version
      - GPU model & hardware revision

That's a very high number of combinations for each game.
Who would provide all the needed setups, create all that data, and host it somewhere?
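A sketch of how a cache could be keyed on exactly that tuple (all values below are made-up placeholders, not real version strings):

```shell
#!/bin/sh
# Derive a shader-cache key from the (game version, driver version,
# GPU model/revision) tuple listed above. Values are illustrative.
game_ver="1.2.3"
driver_ver="595.97"
gpu="RTX-4080-rev-a1"
key=$(printf '%s|%s|%s' "$game_ver" "$driver_ver" "$gpu" | sha256sum | cut -d' ' -f1)
echo "cache entry: $key"
```

Any change to any of the three fields yields a different key, which is why the combination count explodes.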

I doubt this helps laptop users going between a dedicated Nvidia GPU and AMD integrated graphics on the CPU (my 890M on an HX370 ProArt is plenty capable for most of my games, and it's quieter; but if I plug in, the system switches to the Nvidia and games have to do their recompile any time I launch after a swap). I hope Infinity Nikki gets its Mac port this year, hopefully with the next update, 2.5... and I hope Infold optimizes it enough to work as well on Neo as it does on iPhone.
I generally shut off shader caches. I have a Samsung 840 that has been used for video editing. Oddly, it still works. Although shader caches are a pittance compared to video editing, every little bit wears SSDs out. BTW, the 4TB Samsung 990 series NVMe I was looking at six months ago went up from $385 to $621. In the end, even excessive logging wears SSDs, as a whole block is written for even a one-byte update.
    • Oddly, it still works.

      Maybe ask yourself why that is before you make your preference. While asking yourself why not look up the SMART parameters for your drive. I also do video editing. The now 7 year old SSD I thrash with DaVinci has 97% remaining life (and that's a QVO drive, well known for having the shortest life expectancy rating of all of Samsung's lineup). My 5 year old 980 Pro drive I use for gaming has 100% remaining life.

      You won't wear out your SSD. Not with video editing, not with pre-compiling shaders. To think other

I should also add, shader pre-compilation happens once per driver update per game, and for most games it is in the order of 500MB to 1GB. I think if you downloaded every game on Steam (to the capacity limit of your SSD), ran every new driver ever released (you don't have to update GPU drivers), and precompiled every shader possible, you'll never be able to wear out an SSD, at least not before some other part of your PC breaks over the coming decade+. It's just not a big load for your drive.
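A back-of-envelope check of that claim, using the ~1GB-per-game-per-driver-update figure above (the game count and update frequency are illustrative assumptions):

```shell
#!/bin/sh
# Rough yearly write volume from shader pre-compilation.
games=100                   # assumed installed library
gb_per_compile=1            # ~1GB per game per driver update (see above)
driver_updates_per_year=12  # assumed roughly monthly updates
writes_gb=$((games * gb_per_compile * driver_updates_per_year))
echo "~${writes_gb} GB written per year"
```

That comes to roughly a terabyte a year, against consumer-SSD endurance ratings typically in the hundreds of terabytes written, so the drive would take decades to wear out from this alone.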

Microsoft Windows has been doing something similar with the .NET Framework for years. Whenever there is a .NET update, there is a background task that runs the JIT compiler across various files to reduce future app startup times. Android's modern ART AOT system does something similar. Improving the user experience (faster startup times) is a laudable goal.
  • Reticulating splines...

  • So you update your drivers and then your PC inexplicably grinds for half an hour. This will only teach people to stop updating their drivers.

    • So you update your drivers and then your PC inexplicably grinds for half an hour. This will only teach people to stop updating their drivers.

Except that's literally the status quo now. Every driver update requires shader recompilation. For most games, this process can be skipped or happens in the background, but that results in performance problems.

Also, not sure what you mean by half an hour. The longest I've ever seen shader recompilation take was about 10 minutes, and that was on an absolute potato of a computer.

  • I have been wondering for years why they couldn't/wouldn't do this.
  • they do something to bring costs down. I'm not a fan of how x070tis cost what x080s did just a couple of years ago.

    As soon as Nvidia saw how much people would pay for a GPU during Covid (which was all bitcoin driven, you cryptodicks), they jumped at the chance to bump all their prices up a tier. Which, to me, means they can drop them back down a tier now.
