Google's crawler renders pages with automated, modern Chrome instances. Suggesting that SEO isn't possible with JavaScript, or that Google can't index a React or Vue website, is very, very outdated advice.
Your suggestion makes a lot of sense, but cookies aren't the real threat. Sure, cookies are used for both wanted and unwanted tracking. Passive tracking, however (fingerprinting in any form), will remain the threat we can't block and won't even know is happening.
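To make that concrete, here's a minimal illustrative sketch (not any real tracker's code) of why fingerprinting is invisible to the user: every signal below is readable by any page, no cookies or storage involved, so clearing cookies changes nothing.

```typescript
// Illustrative only: a stable identifier derived purely from properties
// the browser exposes to every page. No cookies, no storage, nothing the
// user can see or clear.
async function passiveFingerprint(): Promise<string> {
  const signals = [
    navigator.userAgent,
    navigator.language,
    String(navigator.hardwareConcurrency),
    `${screen.width}x${screen.height}x${screen.colorDepth}`,
    Intl.DateTimeFormat().resolvedOptions().timeZone,
  ].join("|");

  // Hash the combined signals so the server only needs one opaque ID.
  const digest = await crypto.subtle.digest(
    "SHA-256",
    new TextEncoder().encode(signals)
  );
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}
```

Real fingerprinting scripts combine far more signals (canvas, WebGL, fonts, audio), but the principle is the same: the tracking happens entirely from what the browser freely reports.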
If we add a mechanism that lets the OS handle cookies, bypassing potentially untrustworthy browser vendors, we won't solve much and will create a false expectation, while (arguably) breaking more than we fix.
This doesn't mean we shouldn't try, but if a method is found, it should include a significantly more comprehensive form of anonymity.
I've been using it for years already.