I have one regret from this article, and that’s the explanation that sRGB hex color #ffffff (255, 255, 255) is being remapped. It’s not. The OS simply isn’t restricting the display pipeline to sRGB anymore, and that allows content which exceeds sRGB to do so.
#ffffff is L=100%. What is L=800%? It exists in HDR content, and we can’t just make the web color #ffffff a dim gray to the eye.
We must start thinking in terms of HSL or LAB or even RGBL, and consider that L > 100% is where HDR peak brightness lives.
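To make that concrete, here's a rough sketch of what I mean by thinking in "RGBL" terms - the type and field names are made up for illustration, not from any spec:

  // Treat color as sRGB chromaticity plus a separate linear luminance scale,
  // where anything above 1.0 is HDR headroom. All names here are illustrative.
  interface HdrColor {
    r: number;  // 0..1, sRGB-encoded red
    g: number;  // 0..1, sRGB-encoded green
    b: number;  // 0..1, sRGB-encoded blue
    l: number;  // luminance multiplier: 1.0 = paper white, 8.0 = "L = 800%"
  }

  const paperWhite: HdrColor = { r: 1, g: 1, b: 1, l: 1.0 };    // where #ffffff should stay
  const specularGlint: HdrColor = { r: 1, g: 1, b: 1, l: 8.0 }; // only HDR content asks for this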
HDR’s color space exceeds the luminance that sRGB hex triplets can represent, and remapping HDR color spaces into sRGB hex gives you horrendous banding and requires complex gamma functions. The CSS Color specs are finalizing support for this, but essentially we’re in the last days of hex codes being a great way to express color on the web. They’ll remain fine as a last resort, but it’s time to move a step forward.
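For reference, the "gamma function" in question is the sRGB transfer function; here's a minimal TypeScript version of it, plus the 8-bit quantization step where the banding comes from (assuming plain sRGB encoding, which is all a hex code can express):

  // sRGB transfer function (IEC 61966-2-1): what you have to undo and redo whenever
  // you move between linear light and hex values.
  function srgbEncode(linear: number): number {
    return linear <= 0.0031308
      ? 12.92 * linear
      : 1.055 * Math.pow(linear, 1 / 2.4) - 0.055;
  }
  function srgbDecode(encoded: number): number {
    return encoded <= 0.04045
      ? encoded / 12.92
      : Math.pow((encoded + 0.055) / 1.055, 2.4);
  }

  // Squeezing everything into 255 steps is where the banding comes from: any luminance
  // above 1.0 simply clips, and nearby values collapse to the same hex code.
  function toHexChannel(linear: number): number {
    return Math.round(srgbEncode(Math.min(linear, 1)) * 255);
  }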
Apple is pinning sRGB hex #ffffff to “paper white” brightness because the hex color specification can’t encompass the full range of modern displays anymore. The difference between #ffffff and #fefefe can be enormous on a display with 1800 nits of peak brightness, and if you map #ffffff to peak brightness, you burn out people’s eyes with every single web page on today’s legacy sRGB-color Internet (including Hacker News!). That’s why HDR calibration routines put “paper white” at around 400 nits.
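A quick back-of-envelope on that #ffffff vs #fefefe gap, using srgbDecode from the sketch above (the 1800 and 400 nit figures are the ones from this comment; real HDR pipelines use PQ/HLG rather than plain sRGB, so treat these numbers as illustrative):

  const step = srgbDecode(1.0) - srgbDecode(254 / 255);  // ~0.009 in linear light

  console.log(step * 1800);  // ~16 nits if #ffffff is stretched to an 1800-nit peak
  console.log(step * 400);   // ~3.6 nits if #ffffff stays at ~400-nit paper white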
So, then, sRGB hex colors have no way to express “significantly brighter than paper white #ffffff”, and UI elements have little reason to use this extended opportunity space - but HDR content does, and it’s nice to see Apple allowing it through to the display controller.
But there’s no way to make use of HDR in web content - other than embedded images and videos - if we continue thinking of color in terms of hex codes. This insistence that we remap hex codes onto thousands of nits of spectrum is why web colors in Firefox on an HDR display make your eyes hurt (such as the HN topbar): it’s rescaling the web to peak brightness rather than to paper white, and the result is physically traumatic to our vision system. Human eyes can handle occasional splashes of peak brightness, but when every web page is pouring light out of your monitor at full intensity, it causes eye strain and fatigue. Don’t be like Firefox in this regard.
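To spell out the two mappings being contrasted here - a toy sketch with made-up constants and function names, describing the behavior rather than Firefox's actual code:

  const PEAK_NITS = 1800;        // example HDR peak
  const PAPER_WHITE_NITS = 400;  // example SDR reference white

  // Comfortable: sRGB content tops out at paper white, leaving headroom for real HDR.
  function compositeToPaperWhite(srgbLinear: number): number {
    return srgbLinear * PAPER_WHITE_NITS;
  }

  // Eye-searing: sRGB content is stretched so #ffffff lands on the panel's full peak.
  function compositeToPeak(srgbLinear: number): number {
    return srgbLinear * PEAK_NITS;
  }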
“But how do we conceive of color, if not in hex codes?” is a great question, and a complicated one. In essence you select color and brightness independently of each other, and then make sure the result looks good when peak brightness is low and doesn’t sear your eyes when peak brightness is high.
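One way to picture "color and brightness selected independently" - again just a sketch with invented names, not an API from any spec:

  interface DisplayInfo {
    peakNits: number;        // what the panel can hit for highlights
    paperWhiteNits: number;  // where SDR white sits
  }

  // Pick hue/saturation separately, then fit the requested lightness to the headroom
  // the display actually has (1.0 = paper white, >1.0 = HDR highlight).
  function resolveLightness(requested: number, display: DisplayInfo): number {
    const headroom = display.peakNits / display.paperWhiteNits;  // e.g. 1800 / 400 = 4.5x
    return Math.min(requested, headroom);
  }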
If this interests you, and you’d like to start preparing for a future where colors can be dimmer or brighter than sRGB hex #FFFFFF, here are a couple of useful links to get you started:

https://news.ycombinator.com/item?id=15534622

https://news.ycombinator.com/item?id=22467744
As a final note, there are thermal reasons why peak brightness can be so much higher than paperwhite: your display can only use so much power for its thermal envelope. Yes, HDR displays have thermal envelopes. So overusing peak white, such as scaling #ffffff to the wrong brightness, can actually cause the total brightness of the display to drop when it hits thermal protections, while simultaneously wasting battery and hurting your users’ eyes.
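If it helps to picture the thermal limit, here's a toy model of auto-brightness limiting - the curve shape and numbers are invented; real panels have their own curves and time constants:

  // The more of the screen asks for peak white, the lower the peak the panel can sustain.
  function sustainablePeakNits(fractionOfScreenAtPeak: number): number {
    const smallHighlightPeak = 1800;  // small specular highlights
    const fullScreenPeak = 600;       // full-field white, thermally limited
    return smallHighlightPeak - (smallHighlightPeak - fullScreenPeak) * fractionOfScreenAtPeak;
  }

  sustainablePeakNits(0.02);  // ~1776 nits: tiny highlight, nearly full peak
  sustainablePeakNits(1.0);   //   600 nits: an all-#ffffff page dims the whole display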
That's already possible today, yes. Safari shipped support for all this live in production some time ago and I assume Chrome and Firefox are either already there or working towards it.
I've always wondered what it'll take for the major OS manufacturers to implement an anti-seizure filter for the content they transmit to their screens, and my bet is that it'll take a flickering ad at HDR max brightness causing seizures worldwide before they finally do.
I agree with you in general, but one important aspect of "HDR" is 10+ bit color processing on the displays (not necessarily displaying it natively; 8-bit + 2-bit FRC is already an improvement). That lets you create more grays, etc.
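For anyone unfamiliar with the FRC part, a simplified sketch of how an 8-bit panel fakes a 10-bit gray temporally (real FRC also dithers spatially; this is just the idea):

  // 8-bit = 256 gray levels per channel; 10-bit = 1024.
  // FRC 8+2 alternates between two adjacent 8-bit codes so the average over a few
  // frames lands on the 10-bit value.
  function frcFrames(tenBitValue: number, frameCount = 4): number[] {
    const base = tenBitValue >> 2;         // nearest lower 8-bit code
    const remainder = tenBitValue & 0b11;  // how many of the frames show base + 1
    return Array.from({ length: frameCount }, (_, i) => (i < remainder ? base + 1 : base));
  }

  frcFrames(513);  // [129, 128, 128, 128] -> averages to 513/4 = 128.25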
Yeah, I considered rewriting to try and use colors as percentages and show the idiv x/256 banding problem of hexadecimal colors, but for today's purposes, I will be satisfied if I inspire long-time web programmers to start realizing that sRGB hex is a dead-end way to think about color. It was enough to discuss the problem of peak luminosity being greater than 100% sRGB luminosity, and I couldn't find any way to do a better job at one without doing a worse job at both.
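And the quantization I was alluding to, in case it's not obvious why percentages don't survive the trip into hex (whether the divisor is 255 or 256 is a convention detail; the collapse is the point):

  function percentToHexChannel(percent: number): number {
    return Math.round((percent / 100) * 255);
  }

  percentToHexChannel(33.3);  // 85
  percentToHexChannel(33.4);  // 85  <- three different intents...
  percentToHexChannel(33.5);  // 85  <- ...one hex code: that's the banding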