Hardware video decoding has worked perfectly for decades.
Sleep worked perfectly until Microsoft decided that device manufacturers should replace sleep with overheating in your bag (a much better sleep mode than, y'know, actual SLEEP).
Not sure what "modern UEFI features" means. Whenever something is described as "modern", that screams to me that someone is trying to conflate recency with quality, which is a red flag. UEFI itself has worked fine for as long as it has existed, as far as I know?
Why you would replace the userland with "managed language frameworks" is quite beyond me.
As things are, all browsers - including Firefox! - ship on Linux with hardware acceleration disabled on most video cards because it is "too unstable". The result is a 20% difference in battery life between Linux and Windows if you mostly do browsing.
Never experienced this myself, and I have used discrete and integrated graphics cards from a variety of manufacturers.
Meanwhile, on Windows I am not exaggerating when I say that every computer I have owned and every peripheral I have ever used has had serious issues: wireless headphones randomly disconnect, microphones need frequent unplug-replug cycles, reboots are often required, reinstalls are common, and mice and keyboards have weird compatibility issues with software drivers. Most people I have discussed this with share the same experience. People are just used to it.
Maybe it isn't Linux that is the problem. Maybe the problem is that consumer hardware is designed and built on the cheap and is not designed to last, and they get away with it because most people (1) have no idea it could be so much better and (2) have no insight into these issues before buying because they are rarely covered in reviews.
For some reason when this happens on Windows, the hardware is to blame, but when it happens on Linux, Linux is to blame.
As noted above, I have experienced it myself on a freshly bought laptop (Thinkpad T14e AMD) that is specifically touted as Linux-friendly. I was genuinely curious how battery life compares between Linux and Windows, so I set up a simple test that just did automated Reddit browsing and left it running. When I saw the results, the disparity was so unexpectedly large that I went to investigate and found out about hardware-accelerated rendering being disabled by default on Linux in both Chrome and Firefox, and why.
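For the curious, the test was nothing fancy - roughly this shape (a minimal sketch, not the exact script I ran; it assumes selenium with geckodriver is installed, the battery shows up as BAT0 under /sys/class/power_supply, and old.reddit.com is reachable):

    # Minimal sketch of the automated-browsing battery test described above.
    # Assumptions (not from the original test): selenium + geckodriver installed,
    # battery exposed at /sys/class/power_supply/BAT0, old.reddit.com reachable.
    import time
    from pathlib import Path
    from selenium import webdriver

    BATTERY = Path("/sys/class/power_supply/BAT0/capacity")

    def capacity() -> int:
        # Current charge as a percentage, read straight from sysfs.
        return int(BATTERY.read_text().strip())

    driver = webdriver.Firefox()
    driver.get("https://old.reddit.com/")

    with open("battery_log.csv", "w") as log:
        log.write("elapsed_s,capacity_pct\n")
        start = time.time()
        while capacity() > 5:  # stop before the machine powers off
            # Scroll like a reader would; reload once we hit the bottom of the feed.
            driver.execute_script("window.scrollBy(0, 600);")
            at_bottom = driver.execute_script(
                "return window.innerHeight + window.scrollY >= document.body.scrollHeight;"
            )
            if at_bottom:
                driver.refresh()
            log.write(f"{int(time.time() - start)},{capacity()}\n")
            log.flush()
            time.sleep(5)

    driver.quit()

On the Windows side you obviously need a different battery readout than /sys, but the browsing loop is the same idea.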
Then, of course, I was also curious whether their reasoning was grounded, so I manually enabled acceleration and re-ran the test - and found that both Chrome and Firefox will inevitably crash within 2-3 hours of active browsing with it enabled, so they disable it for a reason.
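"Manually enabled" here just means flipping the usual prefs and switches. Roughly like this (a sketch from memory - the pref and flag names have changed across Firefox/Chromium versions, and the profile path below is hypothetical):

    # Rough sketch of what "enabling acceleration" amounted to. Pref and flag
    # names are from memory and have changed across browser versions; the
    # Firefox profile path below is hypothetical.
    import subprocess
    from pathlib import Path

    # Firefox: force VA-API video decode by dropping a user.js into the profile.
    profile = Path.home() / ".mozilla/firefox/PROFILE.default-release"  # hypothetical
    prefs = [
        'user_pref("media.ffmpeg.vaapi.enabled", true);',
        'user_pref("media.hardware-video-decoding.force-enabled", true);',
    ]
    (profile / "user.js").write_text("\n".join(prefs) + "\n")

    # Chromium: the equivalent is command-line switches (feature names vary by version).
    subprocess.run([
        "chromium",
        "--ignore-gpu-blocklist",
        "--enable-features=VaapiVideoDecoder",
    ])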
As far as "maybe Linux isn't the problem" goes - you're broadly correct that it's really an issue of hardware quality and/or the lack of good first-party drivers. But from the end-user perspective, if you can't reliably use Linux with popular off-the-shelf hardware, it's not really "ready for the desktop", regardless of where the blame lies. I've been a Linux user for 25 years now, with about a decade of using it as my primary desktop OS, and this exact excuse has been around for as long as I can remember (I've used it myself plenty of times way back!). And yet, here we are.
Because on Windows it is the OEMs that provide the support, while on Linux, sadly, even after 30 years support is mostly reverse-engineered, unless we are talking about OEM custom distros with their own blobs, like Android, ChromeOS and WebOS.