WebGL should never have been invented, at least on current architectures. It's a ticking bomb, precisely for the reasons this article explored: drivers are complex, buggy black boxes that live in kernel space, so ring 0 access is never far from reach for many exploits. WebGL has the potential to be an exploit vector far more severe than what we've seen with other auxiliary web technologies like JavaScript or Java applets. Why not confine the web to (mostly) nicely behaved x86 VMs?
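To make that attack surface concrete, here's a minimal sketch, purely illustrative, of how little untrusted page script it takes to reach code paths that end in the GPU driver. These are standard WebGL calls; the scenario around them is an assumption, not an exploit:

```ts
// Hypothetical illustration only: ordinary WebGL calls from untrusted
// page script are ultimately serviced by the GPU driver stack, parts
// of which run in kernel space.
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl");
if (gl) {
  const buf = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, buf);
  // Attacker-controlled sizes and contents flow down toward driver code.
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(1 << 20), gl.STATIC_DRAW);
}
```

The point isn't that these particular calls are dangerous; it's that every argument here crosses the sandbox boundary into driver territory, which is exactly the part of the stack browsers don't control.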
Personally I think browsers will become service layers that expose the underlying OS in controlled and specific ways. Not a complete sandbox, but not a free-for-all either.
We've started creeping that way with a lot of the newer HTML5 stuff, but we have a long way to go.
Looking at the bug bounty records from Chrome and Mozilla, it seems that fuzzing-for-bounties vulnerability research has largely moved away from WebGL. That must be at least partly because WebGL has reached a reasonable level of robustness. Bounties may also be harder to claim since the bugs can depend on hardware, OS, driver version, and so on, but there still used to be a lot more WebGL bounties paid out.
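For context, the kind of fuzzing that used to surface those bounties can be sketched roughly like this. It's a toy illustration of the general shape, not any vendor's actual harness, and real fuzzers are far more structured:

```ts
// Toy fuzzing sketch (assumed shape): hammer a WebGL entry point with
// randomized arguments and watch for crashes rather than GL errors.
const gl = document.createElement("canvas").getContext("webgl");
const rnd = (n: number) => Math.floor(Math.random() * n);
if (gl) {
  for (let i = 0; i < 10_000; i++) {
    const tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    // Randomized levels and dimensions exercise validation paths in the
    // browser's GL translation layer and, below it, in the driver.
    gl.texImage2D(gl.TEXTURE_2D, rnd(16), gl.RGBA, rnd(4096), rnd(4096),
                  0, gl.RGBA, gl.UNSIGNED_BYTE, null);
  }
}
```

That hardware/OS/driver dependence is also why a crash found this way is harder to turn into a bounty: the same inputs may be perfectly harmless on a different GPU or driver version.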