This is an automated archive made by the Lemmit Bot.

The original was posted on /r/pcmasterrace by /u/SaladNations on 2024-10-17 16:45:18+00:00.


Hey all, so I am a YouTuber who does tech reviews, benchmarks PC parts, and does full in-depth performance comparisons between different model cards. I have been noticing something about newer cards that is outright bugging me, and something I'm surprised no tech YouTubers have brought attention to. The issue I'm talking about is our good friend "shader caching": when you run a game for the very first time, it has to compile and cache the shaders for your specific hardware, unless of course the game has a built-in shader compilation screen before the gameplay begins.
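For anyone unfamiliar with why only the *first* run stutters, here's a toy sketch of the idea (not AMD's or Nvidia's actual driver logic, just an illustration): the first request for a shader pays a compile cost, and every later request hits the cache for free.

```python
# Toy model of a driver-side shader cache. The first time a pipeline
# is requested it gets "compiled" (the slow step that shows up as a
# frametime spike); repeat requests are served from the cache.
class ShaderCache:
    def __init__(self):
        self._cache = {}
        self.compiles = 0  # how many slow first-time compiles happened

    def get_pipeline(self, shader_hash):
        if shader_hash not in self._cache:
            self.compiles += 1  # this is where the visible stutter lives
            self._cache[shader_hash] = f"compiled:{shader_hash}"
        return self._cache[shader_hash]

cache = ShaderCache()
cache.get_pipeline("water_v1")  # cold: compiles, would stutter
cache.get_pipeline("water_v1")  # warm: cached, no stutter
print(cache.compiles)  # -> 1
```

This is why clearing the cache before every benchmark run (as described below) is the only way to see the stutter reproducibly.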

I have noticed a difference in shader cache performance between AMD and Nvidia cards that is stark enough it bears mentioning. It only shows up on cards from the 6000 series and beyond for AMD: there is an obvious difference in the frametime graphs and more visual stutter on AMD than on the equivalent Nvidia card. Take for example the 6900 XT vs a 3080. I have these systems set side by side on my test bench, and each rig uses the exact same specs; the PCs are identical in order to keep parity between them and give each card a fair chance with no bias.

Anyways, when I clear the shader cache for each card and run a game, the AMD card under test always shows visual stutter and massive frametime spikes that don't happen on the Nvidia card. This was really odd to me; at first I thought something was wrong with the test rig, so I went over each component checking for out-of-date drivers, BIOS, firmware, chipset drivers, etc. No, all was good.
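If you want to put numbers on these spikes instead of eyeballing the graph, here's a minimal sketch of how I'd count them from a frametime capture. It assumes a PresentMon-style CSV with a `MsBetweenPresents` column (your capture tool's column name may differ), and the 4x-median threshold is an arbitrary choice for illustration:

```python
import csv
import io

def count_spikes(csv_text, threshold_mult=4.0):
    """Count frames whose frametime exceeds threshold_mult x the median.

    Assumes a PresentMon-style CSV with a 'MsBetweenPresents' column;
    adjust the column name for your capture tool.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    frametimes = [float(r["MsBetweenPresents"]) for r in rows]
    median = sorted(frametimes)[len(frametimes) // 2]
    return sum(1 for ft in frametimes if ft > threshold_mult * median)

# Six frames at ~60 fps with two big shader-compile-style hitches:
sample = "MsBetweenPresents\n16.7\n16.6\n120.0\n16.8\n95.3\n16.7\n"
print(count_spikes(sample))  # -> 2
```

Running this on a cold-cache pass vs a warm-cache pass of the same route makes the difference between the two cards easy to quantify.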

So I swapped the GPUs between the rigs, and now the other PC showed the same strange stutter and frametime spikes as before, even though all I did was swap the cards. This told me there was nothing wrong with my system and that the problem followed the card itself. I even pulled out the motherboards and swapped in an entirely different model (Asus vs Gigabyte B550), then tested a 6800 XT again, and it exhibited the same stutter. This made me really curious, so I searched all of Reddit and found there was something with AMD cards called DXNAVI that supposedly fixes these issues if you do some simple edits to the registry. Well, I did do those edits, and nothing changed; in fact, it made the stutters even worse and broke FreeSync (as verified through the monitor's own built-in refresh rate counter).

I spent another few days trying 12 different AMD drivers, and no matter what, I got the same stutter and frametime spikes in the same areas. Games tested were The Witcher 3, Subnautica, Deep Rock Galactic, Grim Dawn, No Man's Sky, Warframe (solo mode), Monster Hunter Rise, God of War, God of War Ragnarök, the Silent Hill 2 remake, Assassin's Creed Mirage, and Assassin's Creed Valhalla.

So what is this stuttering problem? I am not entirely sure. I have gone through four different AMD cards (6900 XT, 6800 XT, 7700 XT, 7900 XT), and all of them had the same problem.

I feel like maybe there is something I'm missing here, but I checked: I have the latest BIOS for each board and the latest chipset drivers. I have been building PCs for over 16 years, and I am not making this post lightly. I am not bashing AMD, nor am I here to say that they are bad cards. But is there something fundamentally wrong with the way that AMD cards cache shaders? Keep in mind the stuttering stops entirely after you visit an area at least once, so this is just a classic case of which card caches shaders faster. In my testing, when an AMD card caches a shader it causes huge stutters that are visible to the user and verifiable on a frametime graph.
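For the side-by-side graphs in my videos, the stutter also shows up in summary numbers. One common way to capture it is a "1% low" figure; here's a hedged sketch using one common definition (the average FPS of the slowest 1% of frames; other tools use the 99th-percentile frametime instead, so numbers won't match across tools):

```python
def percentile_low_fps(frametimes_ms, pct=1.0):
    """Average FPS over the worst pct% of frames.

    One common '1% low' definition; other tools compute it differently.
    """
    n = max(1, int(len(frametimes_ms) * pct / 100))
    worst = sorted(frametimes_ms, reverse=True)[:n]  # slowest n frames
    return 1000.0 / (sum(worst) / len(worst))

# 100 smooth ~60 fps frames plus one 100 ms shader-compile hitch:
frames = [16.7] * 100 + [100.0]
print(round(percentile_low_fps(frames), 1))  # -> 10.0
```

A single compile hitch tanks the 1% low even when the average FPS looks identical between the two cards, which is exactly what my cold-cache runs show.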

Oh, something I forgot to mention: each PC was also tested with the same motherboards and CPUs when compared side by side, but only with three different CPUs: the 5800X3D, 5700X3D, and 7800X3D. All CPUs showed, within margin of error, the same stutter and spikes on a frametime graph. So changing the CPU also did not alleviate the strange stutter on the AMD cards.

So what seems to be the problem here? As a PC enthusiast this kinda bugs me. I want to give each card a fair chance and not "hide" obvious and blatant issues, but when I put the AMD card side by side with the Nvidia card in my videos, the AMD side of the graph shows those huge micro-freezes, and that gives people the wrong impression about AMD when they see my videos.


I did some further digging and found that AMD switched to a new shader cache method some time ago when Navi came out, and this is likely the culprit. It seems like the change causes shaders to cache in a way that momentarily freezes the game's frametime. Nvidia cards do not seem to suffer from this problem; I have tested all my cards going back to the 1080 Ti to confirm.