A certain graphics card manufacturer has claimed that graphics cards in continual use lose around 10% of their performance each year. Of course, it's a near-myth, only saved by the fact that certain components (like fans and the thermal interface on certain cards) do lose performance over time due to factors beyond the user's control.
The solution is obviously to clean the damn thing.
Linus Tech Tips addressed this claim in a recent video comparing GPUs that were used for years to mine crypto against like-new identical or similar GPUs. It turns out the loss is barely 1% a year, and even that is likely down to reduced cooling capacity from a dusty card.
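If you want to check whether dust or a tired fan is actually costing you anything before and after a cleaning, you can just log temperatures and sustained clocks under load. A minimal sketch, assuming an Nvidia card with nvidia-smi on the PATH (the exact query fields supported can vary by driver version):

```python
import csv
import subprocess
import time

# Fields reported by nvidia-smi; availability varies a bit by driver version.
FIELDS = "timestamp,temperature.gpu,clocks.sm,power.draw,fan.speed"

def sample_gpu(duration_s=300, interval_s=5, out_path="gpu_log.csv"):
    """Poll nvidia-smi every `interval_s` seconds while a GPU load is running,
    appending samples to a CSV for before/after-cleaning comparison."""
    with open(out_path, "a", newline="") as f:
        writer = csv.writer(f)
        end = time.time() + duration_s
        while time.time() < end:
            out = subprocess.run(
                ["nvidia-smi",
                 f"--query-gpu={FIELDS}",
                 "--format=csv,noheader,nounits"],
                capture_output=True, text=True, check=True,
            )
            for line in out.stdout.strip().splitlines():
                writer.writerow([v.strip() for v in line.split(",")])
            time.sleep(interval_s)

if __name__ == "__main__":
    # Start your usual game or benchmark first, then run this script.
    sample_gpu()
```

If the same load runs hotter and at lower sustained clocks than when the card was new, that's the "degradation", and a cleaning usually recovers most of it.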
I'm not sure what the current situation is, but at least a few years ago my experience with both Nvidia and AMD was that upgrading past a certain driver version tended to make fps drop on older cards.
Due to the large number of architectures and models churned out by AMD and Nvidia, driver releases tend to concentrate on the performance of the latest and greatest cards and throw a bone to the previous architecture. Older cards are largely neglected and may see performance regressions after a driver upgrade.
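If you'd rather put a number on a suspected regression than go by feel, the simplest approach is to run the same benchmark several times on each driver version and compare the averages. A minimal sketch, assuming a hypothetical ./bench --headless command that prints an average fps line (substitute whatever benchmark you actually use and adjust the regex to match its output):

```python
import re
import statistics
import subprocess

# Hypothetical benchmark command; replace with your actual benchmark and
# adjust the regex to match however it reports its average framerate.
BENCH_CMD = ["./bench", "--headless"]
FPS_RE = re.compile(r"average fps:\s*([\d.]+)", re.IGNORECASE)

def run_once():
    out = subprocess.run(BENCH_CMD, capture_output=True, text=True, check=True)
    match = FPS_RE.search(out.stdout)
    if not match:
        raise RuntimeError("could not find an fps figure in benchmark output")
    return float(match.group(1))

def measure(runs=5):
    results = [run_once() for _ in range(runs)]
    return statistics.mean(results), statistics.stdev(results)

if __name__ == "__main__":
    # Run this once per driver version and compare the numbers;
    # a mean difference well outside the spread is a real regression.
    mean, stdev = measure()
    print(f"mean fps: {mean:.1f} +/- {stdev:.1f}")
```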