I've honestly started questioning whether the gamma representation was even worth it. Does using 8-bit linear really lose that much fidelity? I guess to test that you would need a display capable of showing 10-bit colour, or maybe a CRT with adjustable gamma down to 1.
A gamma of 2.2 puts that '15' square at about 0.2% brightness ((15/255)^2.2 ≈ 0.002), and the '20' at about 0.4% ((20/255)^2.2 ≈ 0.004).
An 8-bit linear representation would make that '20' square the minimum brightness above zero, since its smallest nonzero value is 1/255 ≈ 0.39%. The next step up, 2/255 ≈ 0.78%, would land roughly at the '30' square.
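For what it's worth, here's a quick Python check of those numbers. It assumes a pure 2.2 power law; the exact sRGB curve has a small linear toe near black, so the real values shift slightly:

```python
# Sanity-check the numbers above, assuming a pure 2.2 power-law
# transfer function (real sRGB is piecewise, but close to this).

GAMMA = 2.2

def gamma_to_linear(code, bits=8):
    """Relative brightness (0..1) of an n-bit gamma-encoded code value."""
    return (code / (2**bits - 1)) ** GAMMA

for code in (15, 20, 30):
    print(f"gamma code {code:3d} -> {gamma_to_linear(code):.4%} brightness")

# The smallest nonzero brightnesses an 8-bit *linear* encoding can hit:
for step in (1, 2):
    print(f"linear step {step}  -> {step / 255:.4%} brightness")
```

That prints the '15' at ~0.20% and the '20' at ~0.37%, right next to the 1/255 ≈ 0.39% linear floor, with the 2/255 step landing just under the '30' square's ~0.90%.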
So yes, the gamma curve is very much necessary. Even 12-bit linear would still be a bad idea near black, so you'd want 14 or 16 bits minimum. And adding HDR costs something like 2 extra bits with gamma and 8 extra bits without.
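And a back-of-envelope way to see where that "14 or 16 minimum" comes from: for each step between adjacent 8-bit gamma codes, count how many linear bits you'd need for the linear quantization step to be at least that fine (again assuming a pure 2.2 power law, so the worst case near black is a little harsher than real sRGB):

```python
import math

GAMMA = 2.2

def linear_bits_needed(code, bits=8):
    """Linear bit depth needed so the brightness step between gamma
    codes `code` and `code + 1` spans at least one linear step."""
    hi = ((code + 1) / (2**bits - 1)) ** GAMMA
    lo = (code / (2**bits - 1)) ** GAMMA
    return math.ceil(math.log2(1 / (hi - lo)))

for code in (0, 1, 5, 15, 50, 128):
    print(f"gamma step {code}->{code + 1}: ~{linear_bits_needed(code)} linear bits")
```

The first few steps above black come out around 16 to 18 bits, while mid-greys only need about 9 or 10, which is exactly why a perceptual curve spends its code values so much more efficiently than linear does.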