Wow, that's crazy, it actually works -- just tried out the sample videos in QuickTime on my 2016 13" MBP (P3 gamut, running Big Sur) and confirmed working.
Basically: if I set my display to ~75% brightness and open the video, the whites in the video are 100% brightness, way brighter than #FFF interface white on the rest of my screen.
But if I increase my display brightness to 100%, the whites in the video are the same as the interface white, because it obviously can't go any brighter.
If I decrease my display brightness to 50%, the whites in the video are no longer at maximum 100% brightness, maybe more like 75%.
But it's also kind of buggy -- after messing around with system brightness a bit, the video stops being brighter and I've got to quit and relaunch QuickTime to get the effect back. Also, opening and closing the video file makes my cursor disappear for a couple of seconds!
I'm wondering if it switches from hardware dimming to software dimming when the video is opened and closed, and if that switch has something to do with the cursor disappearing. If so, though, it's flawlessly undetectable in terms of brightness -- the interface white and grays don't change at all.
Interestingly, confirming it: taking a screenshot of the video while the screen is at 75% brightness shows massive brightness clipping in the video, since it's "overexposed" relative to the interface's color range. But taking a screenshot while screen brightness is at 100% shows no clipping, because the video is no longer "overexposed" relative to the interface.
I'm just so surprised -- I had utterly no idea macOS worked like this. I'd never heard of this feature until now.
> it's flawlessly undetectable in terms of brightness -- the interface white and grays don't change at all.
In order to pull this off, you need to know exactly how bright the display is in nits, and you need complete software control of the actual hardware brightness. On Windows, you have neither. Enabling HDR mode completely throws off your current colors and desktop brightness. You then have to reset your physical monitor settings and dial in the new desktop white point with a software slider Microsoft buried in the advanced HDR settings (which almost nobody knows how to use), hoping to land somewhere in the vicinity of what you had before.
When it comes to display technology, vertical integration is a huge benefit. Look at high-DPI: the state of the art on Windows in 2020 is nowhere near as good, in terms of software implementation or actual user experience, as what Apple shipped on day one when it introduced Retina MacBooks back in 2012.
Mac OS has also had system-wide color management and calibration (ColorSync) since the early 90s, part of its legacy of being the preferred platform for desktop publishing.
On Windows the systemwide "color management" basically consists of assigning a default color profile that applications can choose to use - which is generally only done by professional design/photo/video software, and not by the desktop or most "normal" apps.
Windows color management is pretty much just a folder where ICC files go to die. Everything needs to be done by the application. Nothing is color managed by default. This alone makes HDR displays (which aren't sRGB in HDR mode) less than worthless on Windows. The situation is identical on Linux, which copied the Windows approach.
Is there a way to get Windows to behave better? I was sorely disappointed by how things look in HDR mode, especially the fact that gamut seems to be abysmal compared to the same display in SDR mode, not even considering the terrible black level artifacting :/
I think Windows provides all the tools necessary for accurate color management, it's just that not everyone has done the necessary bookkeeping for it to work.
The underlying problem is that APIs describe colors in the display colorspace. #ffffff means "send full power to the red, green, and blue subpixels"[1], without describing what color the red, green, and blue subpixels are. That was not a major problem until relatively recently; every display used the same primary colors, so there was no need to specify what colorspace you were sending values to the OS in. But, then it became cheap and easy to use better primaries ("wide gamut"), and we had a problem. Every color written down in a file suddenly became meaningless; an extra piece of information would be required to turn that (r, g, b) tuple into a display color. So, everyone kind of did their own thing! Image formats long had a way to tag the pixel data with a colorspace, so images with tags basically work everywhere. Applications can read that and tell the OS that colors are in a certain color space, and it can map that to your display. Most applications do that; if you have a wide-gamut display and take an AdobeRGB-space image off your digital camera, the colors will be better than if you looked at it on an sRGB display. Even web browsers handle this fine; if they are presented with an image with a colorspace tag, they'll make sure your monitor displays the right colors.
The problem is sources of color data that don't have a tag. CSS is a big offender. CSS doesn't specify the colorspace of colors, so typically browsers will just send whatever is in there directly to the display. That means if you're a web designer and you pick #ff0000 on your sRGB display, people using a wide-gamut display will see a much more vibrant shade of red, and everything will look off. In fact, pretty much everyone using a wide-gamut display will see wrong colors everywhere because of this; I have one, and I just forced it into sRGB mode because it's so broken. (On the other hand, a lot of people like more vibrant colors, so they think it's a good thing that they get artificial vibrance enhancement on everything they view. And they are then disappointed when an application handles colors correctly, and what they see on their monitor are the same boring colors their digital camera saw out in the field.)
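To make the difference concrete, here's a minimal Swift sketch (using AppKit's NSColor, assuming macOS 10.12+ for the Display P3 colorspace; the printed values are approximate) of correct conversion versus the raw "stretch":

    import AppKit

    // A color picked as #ff0000 on an sRGB display.
    let srgbRed = NSColor(srgbRed: 1.0, green: 0.0, blue: 0.0, alpha: 1.0)

    // Correct handling on a P3 panel: convert the value into the panel's space.
    // The same physical color needs smaller component values in the wider gamut,
    // roughly (0.92, 0.20, 0.14).
    if let p3Red = srgbRed.usingColorSpace(.displayP3) {
        print(String(format: "sRGB (1.00, 0.00, 0.00) ≈ P3 (%.2f, %.2f, %.2f)",
                     p3Red.redComponent, p3Red.greenComponent, p3Red.blueComponent))
    }

    // The "stretch" failure mode is effectively treating the same bytes as if they
    // were already P3, i.e. NSColor(displayP3Red: 1, green: 0, blue: 0, alpha: 1),
    // which drives the red subpixel to full power and gives the oversaturated red
    // described above.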
But, the problem is not Windows, the problem is applications and specs those applications use. Authors of specs don't want to say "sorry, there is no way you can ever use colors outside of sRGB without some new syntax", so they just break colors completely for everyone. That's why things look terrible on monitors that aren't sRGB; the code was built with the assumption that monitors will always be sRGB. Get rid of that assumption and everything will look correct!
There are also plenty of images out there that don't include color space tags, so it's undefined as to what colors they're actually trying to display. Some software assumes sRGB. Some software assumes the display colorspace. It's inconsistent. (I used to produce drawings in Adobe RGB and upload them to Pixiv, and their algorithm totally gets colorspaces wrong. It will serve your verbatim file to some users, but serve a version of the file to other users with the color tag removed, so that there is no possible way the viewer can see the colors you intended. I gave up on wide gamut and restricted myself to sRGB, because the Internet sucks.)
[1] It gets more complicated for shades of grey, involving gamma correction. #7f7f7f doesn't mean "send half as much electrical power to each subpixel", but rather maps to an arbitrary power level. The idea is to use the bits of the color most efficiently for human viewers; the eye can easily tell "0 power" from "0.01% power", so more code values are devoted to the dark shades. (You'll see this in practice when you write some code to control an LED from a microcontroller; if you just use the color as a PWM duty cycle, your images won't be the right colors on the display you just made. Of course, many addressable RGB LEDs do the gamma correction internally, so your naive approach of copying the image pixel values to the addressable LED will actually work. I learned this the hard way when I got addressable LED panels from two separate batches, and the old batch did gamma correction and the new batch didn't. I didn't realize it was gamma at play, so I built an apparatus to measure the full colorspace of the LEDs with a spectrophotometer (https://github.com/jrockway/apacal). When I plotted the results, I immediately realized one was linear and the other was gamma-corrected... which meant all the code I wrote to build a 3D LUT for the LEDs was pointless; some simple multiplication was all I needed to make the LEDs look the same. But I digress...)
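A minimal sketch of that gamma step (the piecewise curve is the standard sRGB transfer function; the "linear LED driver" it feeds is hypothetical):

    import Foundation

    // Standard sRGB decode: map an 8-bit code value to linear light (0...1),
    // which is roughly what a bare PWM duty cycle represents.
    func srgbToLinear(_ code: UInt8) -> Double {
        let c = Double(code) / 255.0
        return c <= 0.04045 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
    }

    // #7f7f7f is perceptually "half brightness", but only about 21% of full power.
    let duty = srgbToLinear(0x7f)                 // ≈ 0.21
    let pwmValue = UInt8((duty * 255).rounded())  // what you'd send to a linear LED driver
    print(duty, pwmValue)

Skip this step (or apply it twice) and you get exactly the mismatch between the two LED batches described above.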
> The problem is sources of color data that don't have a tag. CSS is a big offender. CSS doesn't specify the colorspace of colors, so typically browsers will just send whatever is in there directly to the display.
CSS defines that all hex and rgb() colors are always in the sRGB colorspace. There is also support for other colorspaces, such as P3 and Adobe RGB. [1]
The Safari web browser does color management for wide gamut displays correctly, so #ff0000 looks correct and not too vibrant. The biggest offenders are Chrome and Firefox, because they are not color managed. Those web browsers (and Windows) give you the wrong colors.
> There are also plenty of images out there that don't include color space tags, so it's undefined as to what colors they're actually trying to display. Some software assumes sRGB.
Images without colorspace tags have been defined to be sRGB images by the web specs [1] and all web browsers should already do that. Other software may do something else, as you said. I hope all software will copy how the web does it.
Firefox (and I think Chrome as well) does color management for images, and I think they even do it correctly these days (i.e. no tag = assume sRGB), which they did not for some time (out of the box).
FWIW I think the Windows approach to color management has proven wrong and off-base for today's world. It stems from the 90s where color management was seen as something only "pro" applications would ever need to do, so it was okay to require a lot of effort from those few application developers to implement color management in their apps. The MacOS approach where applications tag their surfaces with one of a few standard color spaces and the system does the rest is less powerful in theory but, on the other hand, means that things will actually work. Plus, I think MacOS has escape hatches so that apps can do their own color management based on output device ICC profiles if they really want to.
The desktop and most windowed apps definitely do use the systemwide color management. The main issue in the past has been whether apps correctly remap image colors based on the image’s tagged (or lack of) profile. It was broken on Macs too but these days isn’t that much of an issue.
So basically until displays advertise their physical specifications to Windows, and the Windows display stack takes advantage of that to auto-calibrate, it cannot match this kind of output.
I wonder what happens if you try this on the LG 5K hooked up to a Mac. It is physically the same panel that's in the iMac, so in theory it can present the same range. But if the OS needs to know the exact physical abilities of the display, it might not be able to detect that LG display. Or maybe Apple does detect it, because they partnered with LG.
I tried; I have an LG UltraFine 5K connected to a 5K iMac (2019). It works on the iMac but not on the LG.
This might explain why I've experienced a slight difference when working with photos and video on the iMac monitor vs the LG UF5K. Interestingly, I have a small program synchronizing the brightness of my LG with the iMac; the LG would light up every time I opened the video on the iMac and then come back down on closing. This might explain some weird brightness behaviors I've been noticing on the 5K every now and then. I usually decouple the syncing whenever that happens, and now I might know why.
I was just going to ask the same question. Everything points to macOS working so well because Apple owns the entire chain. You'd expect the external monitors that Apple promotes to work as smoothly as their own. So one more test would be to use a really nice monitor not promoted by Apple.
Not exactly related, but I’ve been using macOS, Windows, and (for the last week) Linux all on the same LG HiDPI screen (4K, but small screen), and no matter how I try to wrangle Windows I just can’t get it to look good. It’s incredibly frustrating when you know how good the monitor can look.
Linux seems better somehow but I haven’t had as much time with it there, so I’m not exactly sure why.
I'm with you on vertical integration. I've been looking at haptic feedback mice, and there are some from gaming companies, but no one has pulled it off well enough to enrich the gaming experience.
I have a Dell UP2414Q, which is a wide-gamut 4K monitor with a relatively high 375 nits peak brightness and 10-bit input.
It's not technically HDR, but if I view HDR videos with it in SDR mode with the brightness turned up to maximum, there is definitely a bit of an "HDR effect", similar to what Apple has achieved.
However, under Windows, without support from the operating system, this doesn't really work. The colours are shifted and the brightness of the content is way too low.
Microsoft could add something similar, but this is a company that thinks that colour is a feature in the sense of:
Colour: Yes.
We're approaching 2021 rapidly. I suspect we'll have to wait until 2030 before Microsoft gets its act together and copies even half of what Apple has in their display technology now.
So is this the reason why they removed the glowing apple on newer macbooks? Since it was powered by the backlight, it'd give that trick away real quick.
They removed it because the screens were getting too thin, and in bright conditions you could start to see through it - leaving a brighter blob in the middle.
What's cool is: 1) 4-year old Macs have become HDR laptops and 2) the implementation is subtle- you get full brightness in the HDR video without having the rest of the UI blast to full brightness.
I don't get it. Are they compressing the range of the SDR display for SDR content so that HDR content can use the display's full range?
As in "we crippled your SDR display so that you can more enjoy the HDR content and buy an expensive HDR display to be back at the normal SDR range for SDR content"?
No, at least not on my Mac. Nothing's crippled, don't worry.
If my screen is already at 100% brightness then there's no HDR effect. 100% brightness is true 100% brightness, the max capability of my backlight.
The only difference is that if my screen is at less than 100% brightness, the HDR content can be brighter than the rest of the screen because it has the headroom.
On conventional LCD displays you are presumably losing some contrast in SDR content here, because running the backlight brighter to allow for this will also elevate black. The rumor mill suggests Apple is planning a move to Mini-LED displays with thousands of local dimming zones, which would handily address this.
Not only contrast, but also the gamut. If the display is capable of 255 levels of color and you remap 255 to, e.g., 128 to leave the rest for HDR, you have only half of the possible colors available for SDR. Or how is this supposed to work?
The feature only works on P3 displays, so gamut when displaying UI (which I'd assume is nominally within the smaller sRGB space) likely isn't an issue.
Also, per later comments, this feature only kicks in when you're actually viewing HDR content, so there's no downside (even contrast) in everyday use.
Well it's not really a big deal since only your UX gets a smaller range... when you're presumably watching the content?
But it seems like Apple might be gradually moving from 8-bit to 10-bit displays. They don't really advertise it clearly, but if the technical specifications say "millions of colors" it means 8-bit, if they say "billions of colors" it means 10-bit. (256^3 vs 1024^3.)
So if you've got the iMac model with the 4K screen, it's 10-bit and therefore capable of 1,024 levels of color. So remapping the UX to e.g. 512 will still be fine, assuming (safely, I think?) the compositing is all done using 10-bit color.
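Back-of-the-envelope, with a 2x headroom factor that is purely an assumption for illustration (not Apple's actual figure):

    // How many code values remain for SDR content once "white" is remapped
    // to leave HDR headroom.
    let headroom = 2.0

    let sdrLevels8bit  = Int(256.0 / headroom)    // 128 levels per channel
    let sdrLevels10bit = Int(1024.0 / headroom)   // 512 levels per channel
    print(sdrLevels8bit, sdrLevels10bit)

With 10-bit output the SDR range still gets 512 steps per channel, twice what a plain 8-bit pipeline has, so banding shouldn't get any worse.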
I wonder how a large organization does something like this successfully. Like you need your OS video driver team working with the application team and so on.
My impression is that PMing this is really hard. And then each of the other guys is going to have an opinion that this shouldn't be done because it's so rare, etc.
Something must be organizationally right for capable engineering like this to have succeeded on such a barely noticeable feature.
I love it when products casually have cool things like this. Not quite the same scope, but IntelliJ's subpixel hinting option has each element of the dropdown displayed with the hinting it describes. You don't have to pick an option to see it; you can just preview it directly in the list.
I'm not sure it's as difficult as you suggest. As long as you don't let applications touch the pixel data of decoded images, it's all within libraries, and you can change the bit depth of buffers and change how they map to physical pixels easily.
Most frameworks don't let applications touch pixel data without jumping through some hoops, because by restricting it you can implement things like lazy loading, GPU JPEG decoding, GPU resizing, etc.
> My impression is that PMing this is really hard. And then each of the other guys is going to have an opinion that this shouldn't be done because it's so rare, etc.
It's easier when you control all parts of the stack. No way they could have pulled that one off with NVidia, who "famously" broke with Apple years ago when Apple demanded to write the drivers themselves... for valid reasons, when one looks at the quality of their Windows and Linux drivers. The Windows ones are buggy as hell and the Linux ones barely integrate with Linux because NVidia refuses to follow standards.
I still use a mac with an nvidia GPU, and the driver appears to be leaking memory a lot. Unless I reboot it about once a week, it becomes noticeably laggy.
It seems to me that Microsoft really makes no effort to improve subtle aspects of Windows and hardware integration. In fact, their OS is in such a shambles and is a disoriented mess with respect to UI consistency. They introduced Vista, Metro, and now Fluent, yet there's almost no coherence and the UI is now a mish-mash of XP-, Metro-, and Fluent-era elements. By the time they announce their next UI, you can bet you'll see yet another ingredient added to the jumbled soup.
It really is beyond belief that an organization with so many employees can fail to adhere to a uniform vision and standard and focus on correcting details.
I'm a lifelong Windows and Android user. But honestly, seeing articles like this, and how smooth the macOS UI is and how uniformly they apply new updates and UI changes, makes me extremely jealous and resentful that Microsoft is so bad at something so basic.
Features are great, but users interact with the UI first. They need to fix that before anything else.
Now they want to give you the option to run Android apps on Windows through emulation. This is just going to create a bigger jumbled mess.
>It seems to me that Microsoft really makes no effort to improve subtle aspects of Windows and hardware integration.
The latest Windows 10 iteration is by far the snappiest OS I've used in a long time, since it uses GPU acceleration for the desktop window manager. You can check this in Task Manager. The icing on the cake, if you have a laptop with 2 GPUs (Optimus), is that you can run a demanding 3D app like a game in windowed mode in parallel with other stuff, like watching videos on YouTube, and see in Task Manager how Windows uses the external GPU to render the game and the integrated GPU to accelerate your web browser, all running buttery smooth.
>In fact, their OS is in such shambles and is a disoriented mess with respect to UI consistency.
True, but that's what you get with 30 years' worth of built-in backwards compatibility. I can run a copy of Unreal Tournament 1999 that was just copied off an old PC, no sweat, right after ripping and tearing in Doom Eternal. Can you run 20-year-old software on current Apple hardware without emulation? Apple can afford to innovate in revolutionary ways when it dumps older baggage whenever it feels like it and starts from a fresh drawing board without looking back -- see the Intel to Apple Silicon transition. In 2 years x86 apps will be considered legacy/obsolete on Mac hardware. Microsoft can't really do this with Windows, so yeah, it's a mess of new GUI elements for the simple stuff and Windows 2000-era GUI elements for the deep pro settings. The advantage is that if you're an old-time Windows user you can easily find your way using the "old" settings, and if you're new to Windows you can do most configs through the "new" GUI without touching the scary-looking "old" settings.
Gotta be a point at which you decide to do a complete rewrite and ship a copy of the old OS in an emulation layer. That’s what Apple did with early OS X. Ship with a very well integrated Windows 2000 or whatever in an emulated container and be done with it
Oddly enough a number of good Windows features showed up first in Vista. I think people just weren't ready for the barrage of security permission dialogs and having their printer drivers broken. ;-)
A lot of Vista's issues were driver quality-- Creative Labs and NVIDIA played chicken with Microsoft over the "no more kernel drivers" decision and it took a lot of time for the post-XP drivers to get even remotely good. Creative Labs never did get caught up-- they decided to just do the bare minimum necessary.
WDDM's ability to do that has been progressively enhanced since Vista. Today's OSX is about where Windows was during Vista, and it really shows on Macs that are older but still supported, usually as latency in ordinary situations, e.g. just moving a window around on the screen (something every OS gets right except OSX).
Today's WDDM, however, is snappy as hell, even on my MBP from 2012, but OSX is, and always will be, a sluggish nightmare. Intel GPU alone, no Nvidia or AMD dGPU, old enough that it has no hardware scheduling features and isn't DX12 compliant, but with Win10 it's still just as fast as my brand new workstation build when it comes to just being a plain ordinary desktop.
Apple needs to fix their development culture internally, and it strongly shows in their software product quality. Sad, because the M1 seems like a cool chunk of hardware, could be a real winner if it wasn't held down by OSX.
One issue that is very visible is the Control Panel versus the new Settings app.
A key feature of Control Panel was that it was "pluggable": vendors could add their own items, and often did. These are ordinary Win32 apps written to match the style of the era. Worse, some system control panel items have plugins in turn. E.g.: drivers can define extra "tabs", network cards have protocol-specific popup windows, etc...
There is just no way to update the look & feel of these to match the new Settings app style; most of the code is third party and ships as binary blobs.
The Microsoft Management Console (MMC), used mostly for Administrative Tools and server consoles has a similar problem.
Combined, these two make up the majority of the OS GUI!
The long-standing problem with Microsoft's internal culture is that it rewards people and teams for 'innovating' where innovation in many cases is just reinvention.
I worked at Facebook for years and it now has a similar problem. Developers are evaluated every six months on their 'impact', which results in many dropping boring work and joining teams that are doing new things, even if they aren't needed.
Similar with Google. I'm amazed at how Apple can take something as old as typography, breathe new life into it, and make everyone shocked and awed. Another one: I have yet to use a trackpad as good as theirs. It's good because it doesn't have any moving parts. It's all haptic feedback.
> It really is beyond belief that an organization with so many employees can fail to adhere to a uniform vision and standard and focus on correcting details.
If you've not spent time in a huge company, this might seem to be the case. But Apple's uniform standard really is the exception. I'm sure there are other organizational costs for this, such as it being harder to take risks with products or execute quickly... but gosh, they are good at producing a cohesive, mostly consistent set of products. I deeply appreciate their attention to detail and long-term commitment.
Even on their own hardware (Surface Pro) I've run into really annoying issues with DPI scaling, especially when docking/undocking from a secondary monitor.
> This EDR display philosophy is so important to Apple that they are willing to spend battery life on it. When you map “white” down to gray, you have to drive the LED backlight brighter for the same perceived screen brightness, using more power. Apple has your laptop doing this all the time, on the off chance that some HDR pixels come along to occupy that headroom.
This is a bit misleading. The backlight isn’t at a higher level than necessary for sRGB content all the time, just whenever any HDR encoded videos or EDR apps are open. When you open an HDR video you can see the highlights getting brighter as the backlight gets pushed.
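A toy model of that remapping, assuming a 2x headroom factor (the real factor presumably depends on the panel and the brightness setting) and keeping everything linear for simplicity:

    // User brightness at 50%, assumed 2x EDR headroom.
    let userBrightness = 0.5   // what the brightness keys are set to
    let edrHeadroom    = 2.0   // assumed headroom factor while HDR content is on screen

    // While HDR content is visible, the backlight is driven harder and SDR
    // "white" is remapped down by the same factor:
    let backlight = userBrightness * edrHeadroom   // 1.0
    let uiWhite   = 1.0 / edrHeadroom              // 0.5

    // Perceived SDR white is unchanged: 0.5 * 1.0 == userBrightness.
    let perceivedUIWhite = uiWhite * backlight

    // HDR pixels can use the range above uiWhite, up to edrHeadroom times
    // brighter than interface white.
    let perceivedHDRPeak = 1.0 * backlight         // full panel brightness
    print(perceivedUIWhite, perceivedHDRPeak)

At 100% user brightness the headroom collapses to 1x, which matches the observation upthread that the effect disappears when the display is already at maximum brightness.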
> just whenever any HDR encoded videos or EDR apps are open
Yup, I think this has to be the case. But the crazy thing is, I can't perceive any shift in UX brightness whatsoever, even a flicker, when I open/close an EDR video.
I would have thought that there would be some slight mismatch at the moment the backlight is brightened and pixels are darkened -- whether it would be a close but not perfect brightness match, or a flicker while they're not synced. But nothing.
As I mentioned in another comment, the only giveaway is that my cursor (mouse pointer) disappears for a second or two. I have to guess that adjusting its brightness happens at a different layer in the stack that can't be so precisely synced.
Yes, this is what I came to see. It gradually increases the brightness over 5 seconds on my Catalina Macbook. It's very impressive that there are no visible brightness changes on the rest of the screen when the brightness of the backlight increases.
>When you open an HDR video you can see the highlights getting brighter as the backlight gets pushed.
but what about blacks? If you have a dark scene with bright highlights (e.g. a campfire at night), do the black parts of the scene get washed out because of backlight bleed?
I wonder if this contributed to Apple removing the light-up Apple logo on the backs of MacBook screens. If it were still there, it would give it away when the backlight brightness is changing and potentially become distracting.
I’ve had the Pro Display XDR since June, and use it primarily for dev. I just trusted Apple's focus on this direction. Suffice it to say I am very happy with it.
What I like about the conclusions of this piece is that they point to how strategically Apple is thinking about the leverage it gets from the breadth of distribution of its advanced hardware and software.
Apple is able to set entire new standards of expectation for customers that are _very_ hard for competitors to follow.
While competitors fixate on some single feature like night photo quality, Apple is also subtly chipping away at something like this.
There's no purer expression of a certain class of cloistered Apple-thought than this - the competitors "focus on one feature, photos at night" (come on, really?), in comparison to the focus Apple shows on a __six thousand dollar display__, which is being lauded for... checks notes... using a display-specific ICC profile, which is _table stakes_ for every modern operating system and premium hardware manufacturer when setting colorspaces and tuning per-display at the factory, plus a color profile for the display that uses an artificially low white point. A banal, baseline feature of displays and OSes a sixth the cost is laundered into a declaration of supremacy, and we all lose all the cool information about the underlying tech and color science we could have been discussing.
> Apple is able to set entire new standards of expectation for customers that are _very_ hard for competitors to follow.
The cheaper version of this display has a price tag of $5k, the more expensive one $6k.
I never spent even remotely as much money on a display, so I cannot speak from first-hand experience. But a quick search for graphics-design displays yields several competitor products that have similar or better specs, at generally even lower price tags.
Apple does many things, but certainly not bleeding-edge innovation. Of course it likes to sell its products as such, but I guess that's just how marketing works.
> I never spent even remotely as much money on a display, so I cannot speak from first-hand experience.
Translation: I don't understand this display.
> But a quick search for graphics-design displays yields several competitor products that have similar or better specs, at generally even lower price tags.
Translation: Despite my lack of knowledge, I feel qualified to say Apple sucks.
Don't take it personally; it's just an objective view on things. I'm aware many people here have some emotional attachment to Apple products, but this doesn't warrant such a passive-aggressive response.
> Apple does many things, but certainly not bleeding-edge innovation.
What? Having been in the inner sanctum of engineering within Apple for years, that's exactly the engineering priority for groups I saw and worked within. I'm genuinely curious why you assert otherwise, find it surprising.
Apple is known for taking something others invented and making it better/more user-friendly. That's great, but it isn't being bleeding edge. Like the first few iPhones: smartphones made better, but without simple functions like copy and paste for generations. I find it surprising anyone thinks Apple's strength is being bleeding-edge innovative. You can count on one hand how many times they have been.
- A14 in the iPhone 12 Pro (first 5-nanometer chip with 11.8 billion transistors) [1]
Previously, they introduced the first 64-bit CPU in a mobile device, which stunned competitors at the time [2].
Not to mention their excellence in hi-DPI displays. In 2012, Apple launched the MacBook Pro with a "retina" display. It took _years_ for non-Apple alternatives to materialize. In fact, the only comparable laptop display in existence today would probably be the Dell XPS 17" UHD+ (which I own, and it's excellent). However, the MacBook Pro 16" remains superior when it comes to sound quality, form factor (i.e. weight), touch pad precision and latency, thermal performance, etc., despite significant investments from Dell in those areas to catch up. [3]
That's cool, but nothing on that list strikes me as a revolutionary kind of development. To me, revolutionary is the invention of GPS, or the transistor, or the combustion engine, or the lightbulb -- those inventions changed the world.
With everything on that list, they made a better version of something that already existed, especially with the display -- it's not like they were the ones developing and manufacturing the displays.
Even the ARM architecture and instruction set is not created by them.
> In 2012, Apple launched the MacBook Pro a with "retina" display. It took _years_ for non-Apple alternatives to materialize.
Ironically this is now somewhere they could stand to improve.
The MacBook displays are excellent, particularly when it comes to colour reproduction, but for the past several years they default to a scaled display mode. For anyone not familiar, the frame buffer is a higher resolution, and scaled down for the display, trading sharpness for screen space.
Evidently the drop in sharpness is imperceptible to most people, but I can certainly tell, to the point where I forego the extra space and drop it back to the native resolution.
For a company that generally prides itself on its displays, I think the right option would be to just ship higher res panels matching the default resolution.
They have also done this with certain iPhone displays over the years, but at 400+ppi it’s well within the imperceptible territory for most people. For the 200-something ppi display on the MacBooks, not so.
> […] but for the past several years they default to a scaled display mode. For anyone not familiar, the frame buffer is a higher resolution, and scaled down for the display, trading sharpness for screen space.
My understanding of how scaled resolutions in macOS work is that graphics are always rendered at the display's native resolution. The scaling factor only decides the sizing of the rendered elements. Can you point to some documentation that supports your view? I'd like to learn if I'm wrong and understand all the details.
deergomoo is correct, Apple’s “Retina” displays work by displaying all screen elements using images/icons/text rendered at 2x the linear number of pixels as their non-retina counterparts. Since it’s a fixed 2x scaling, the only way to have anything other than the native panel resolution (with elements that are 2x their non-retina number of linear pixels) is to render at a frame buffer size larger than the actual screen. Then this frame buffer is scaled (by the GPU) to fit the actual screen size. Because it’s usually scaling down and not up, this theoretically results in only very minor blurring that most people don’t notice.
It used to be this non-native scaling was only an option and by default the MacBooks ran at the exact native panel resolution. But at some point that changed so the default is one “notch” on the “more space” slider. I presume most people preferred it that way as you don’t get a lot of text on the screen at the native “Retina” resolution. But the sharpness is worse than when running unscaled.
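To put numbers on it for the 15-inch 2880x1800 panel (a sketch of the arithmetic, not Apple's actual compositor code):

    // "Looks like 1680x1050" renders the UI at 2x into a larger backing buffer,
    // then the GPU scales that buffer down to the panel.
    let panel         = (w: 2880, h: 1800)
    let looksLike     = (w: 1680, h: 1050)  // the current default scaled setting
    let backingBuffer = (w: looksLike.w * 2, h: looksLike.h * 2)   // 3360x2100

    let downscale = Double(panel.w) / Double(backingBuffer.w)      // ≈ 0.857
    print(backingBuffer, downscale)

At the native "looks like 1440x900" setting the buffer is exactly 2880x1800 and no resample happens; every other setting goes through that extra scale, which is the slight softness being described.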
It's easy enough to set them to the native resolution first thing. They probably noticed a lot of people don't like small text and so they set the default to scaled
> In 2012, Apple launched the MacBook Pro a with "retina" display. It took _years_ for non-Apple alternatives to materialize. In fact, the only comparable laptop display in existence today would probably be the Dell XPS 17" UHD+ (which I own, and it's excellent).
Uhh, both Sony and Dell had 1080p, 1200 vertical and then QHD laptops in form factors down to 13" before Apple. I owned both before I moved to Apple myself.
1080, 1200, and 1440 are all of course smaller than the 1800 vertical resolution on the 15” MBP.
But it’s not just the resolution, it’s that Apple made such a high resolution usable via 2x rendering, and did so immediately for the entire system and all applications.
They are. But 1440 on a 12" laptop (like the Sony Vaio) competed well.
You can also get a 4K UHD Dell at 13".
> In fact, the only comparable laptop display in existence today would probably be the Dell XPS 17" UHD+
> But it’s not just the resolution
It was, above. Now it's the resolution and the ecosystem. "Apple did it first." "No they didn't." "Well, they were the first to do it right" (for varying definitions of "right").
I have no particular horse in the game. In fact, my entire home ecosystem, from Mac Pro to MBP to iPad, iPhone, and Watch, would if anything lean me in one particular direction, but ...
Wow, people talking about getting a 512GB SSD in their laptop for $2.5K as a good deal is kinda amazing -- I think my $450 HP Pavilion with a 1050 and AMD 3500H might just be faster in every single way (but it definitely has lower build quality...)
I paid a fortune for flash storage from 2009 to 2016 and it was totally worth it.
There are some technological improvements that are so transformative (wifi, flash storage, high-resolution/"retina" display, LTE data, all-day battery life) that once you try them you never want to go back.
Then there are the changes that make you go "hmm..." (butterfly keyboard, touchbar without a hardware escape key, giant trackpad with broken palm rejection...)
> Not sure about bleeding edge, because for the most part their products work, but at different times they have defined:
I'm ex-Apple and an Apple fan as much as anyone, but I also have the benefit of being old. Not to take anything away from Apple's collective accomplishments, in many of these categories I'd say they "redefined" more than "defined".
There were many smartphones before the iPhone (the Palm Treos were great), many MP3 players before the iPod, many tablets before the iPad (the Microsoft Tablet PC came out about a decade before the first iPad), all-in-one PCs go back 40 years now, etc.
Xerox never made the Macintosh because they missed key innovations such as regions, the Finder, consistent UI guidelines, the ability to put a usable GUI OS in an affordable package.
Doesn't matter what PARC's limitations were; what matters is that the Macintosh was a huge innovation that delivered what the Xerox Alto and Star were missing.
As someone who adopted smartphones years before the iPhone: anyone who thinks the iPhone was just “incremental” is deluding themselves. The capacitive display alone was a game changer, let alone everything else.
When the XDR came out it was competing with monitors that were 40,000 USD and up. (With some compromises, like the stand, which is just an art piece, and the viewing angles being squiffy.) If the competition is now priced competitively then that's very good for consumers.
Yeah, this, it really irks me how some people here keep parroting Apple's marketing without checking any facts/tests.
Their monitor is great for a consumer or prosumer monitor but just because it has PRO in the name doesn't mean it can dance in the ring with the actual PRO displays that are used to master million dollar motion pictures.
The point of the 6K Pro Display XDR wasn't to compete with the one or two $40k monitors used to master million-dollar motion pictures.
It was to replace the five to ten other $40k monitors used in other parts of the production pipeline, by being accurate/wide/bright enough for that purpose, and to provide good-enough accuracy to a whole swath of jobs where it was dearly needed but far too expensive.
In Apple's keynote announcing the XDR they literally called it the "World's best pro display" and then proceeded to compare it to Sony's $40k reference monitor. I agree the comparisons are unfair and that at $5k it occupies an interesting price point in the market, but Apple did really bring it upon themselves.
The stand makes a bit more sense when you consider this monitor for movie use. Reference monitors need to move from on-site racks for director use, to studio use for editing. The ability to easily detach it from a fixed stand and move it to a cart for on-site use is an important feature.
Maybe in small, low-budget indie filmmaking would the director be using the same equipment on set as they would in post. The kind where the director is also the editor is also the writer is also the producer. You know the ones. Typically, everything on set is a rental. The post studio would have their own equipment fitted for their rooms. I could see maybe a DIT taking their monitor to a cart and then back to a desktop in between gigs, but probably not then.
Bit of a random rant, but despite being so common, 27” at 4K is a really poor combination for a computer monitor. At 1x scaling, everything is too small, but at 2x scaling it’s far too big. So you have to go for a non-integer scale, which on macOS at least results in reduced image quality and graphics performance (it renders to a different sized frame buffer and then scales it).
The ideal is 5K, which is double 2560x1440 in each dimension, but essentially the only 5K displays available are the LG one made in partnership with Apple, which is over a grand and has numerous quality issues, or the one built into the iMac. It’s really annoying.
I like that Dell display (I mostly use Dell and LG displays) and most people should certainly save the money. The areas where the specs aren't the same are really significant though (e.g., resolution, as you noted, is a major difference despite only being one of the specs. Brightness, dynamic brightness, and size are also quite significant and drivers of major cost differences in just about any monitor comparison.)
If you're only using it for light HDR work, both are almost equally usable, and if you're mastering HDR content all the time, you'll need a dual-layer LCD panel anyway (like Sony's BVM-HX310).
But you're right, the additional $3,000 is definitely noticeable -- whether it's noticeable enough to justify that price tag is another question.
Agreed; ultimately the XDR is for people who, for whatever reason, just want a really nice, large monitor in sync with macOS' color management enough to pay the premium.
They're both IPS panels with local area dimming, the same amount of halo effect, the same color accuracy, and relatively similar brightness and dpi, especially considering there's a 4× price difference between them.
What's so "mediocre POS" about them in your opinion?
You've got a typo there, it's 4K at 27" with 163dpi vs 6K at 32" with 215dpi, so 1.5× vs 2× resolution. Which actually has an interesting effect when working with content, as the Pro Display XDR either has to show a letterbox around UHD content, or has to stretch it in a very blurry way with GPU upscaling which definitely isn't useful for content production workflows.
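For reference, the quoted densities work out (assuming the Dell under discussion is a 27-inch UHD panel; 6016x3384 at 32 inches is the Pro Display XDR's published resolution):

    // Pixels per inch from resolution and diagonal size.
    func ppi(_ w: Double, _ h: Double, diagonalInches: Double) -> Double {
        (w * w + h * h).squareRoot() / diagonalInches
    }

    let dell27UHD     = ppi(3840, 2160, diagonalInches: 27)   // ≈ 163 ppi
    let proDisplayXDR = ppi(6016, 3384, diagonalInches: 32)   // ≈ 216 ppi
    print(dell27UHD, proDisplayXDR)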
And the halo effect actually is an artifact from the backlighting which can be resolved with the backlight recalibration cycle, which the Pro Display XDR does automatically and invisibly during times with purely SDR content, while it has to be manually run and is very obtrusive on the Dell one, that much is true.
The main issue with comparing the specs is that the differences are not really shown in these specs, e.g. luminance and chromaticity uniformity (a.k.a. Delta E), among other things. Consumer-grade monitors usually do not have these, and when combining that with wide color gamut and _measured_ color space (not claimed), we're talking about at least the higher-grade EIZO monitors, which are not less than $1k.
The monitors that are comparable to the Pro Display XDR are the ASUS PA32UCX ($4,500) or the EIZO CG319X ($6,000), which usually require full recalibration after a certain amount of usage.
Random story about routine calibration. The old Sony CRT reference monitors were weird with their calibration as well. Had a 32" 16x9 HD monitor that had the mass of a dying star. It was in a dedicated film xfer/color correction bay. Once installed, it never moved until we moved to a new ___location. Once in the new ___location, it had some weird anomalies even after the typical calibration steps. Our Sony tech realized that we had changed the orientation of the monitor 90 degrees. His explanation was that the earth's magnetic fields were the culprit. Rotating the monitor 90 degrees made the issue go away. Don't remember ultimately what the fix was, but it was fixable.
Very fascinating! I wondered about this and went to search around a bit, and found this bit[1] (though this is more about CRT TV)
> When we used to manufacture TVs, we'd produce them for customers in the southern hemisphere too. When building these, we had to run them through our production lines upside down.
> When the old cathode ray tube TVs were built, the earth's magnetic field was taken into account during the production process. This ensured the current flowed the right way through the TV and so the TV was able to function normally.
There are both cheaper and far more expensive displays used for graphics design to special effects and major film mastering. In this case, Apple seems to be trying to bridge the gap with the XDR and create a new middle tier of price and functionality.
As for bleeding edge, Apple has pioneered plenty of technology. It's true that it builds upon foundations of technical designs and scientific discoveries by others but that applies to every other company as well. Very few organizations are capable of going straight from invention in a science lab to large scale commercial product all by themselves. If you judge by how much "new" technology has actually reached consumers though, Apple is clearly leading the field.
The Windows HDR implementation is complete crap, exactly because they don't have full control of the hardware stack as Apple does and can't change all the monitor settings like they would need to.
When you toggle HDR on Windows, the desktop becomes dull gray and desaturated exactly because they pull down the previous desktop brightness to something less than 255. So you then have to adjust your monitor's brightness up to compensate. The monitor's brightness effectively sets the upper cap of the HDR brightness, so let's say your brightness was set at 50% before; now you've got to fiddle with the monitor to boost the screen brightness to 100% to allow HDR to function, and to achieve your previous desktop white brightness (you'll probably also have to adjust the software "desktop white" point slider you mentioned, since MS has no clue what the correct monitor brightness and SDR pull-down amount should be, so good luck matching your previous desktop colors and brightness). In my experience very few people successfully manage to set up their Windows HDR correctly, and even if you do there's no way to "seamlessly" switch between the two modes (which you have to do, since tons of stuff on Windows doesn't work properly when HDR mode is enabled). I haven't checked Surface or other MS hardware; perhaps they're able to do something more clever there?
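To put rough numbers on that juggling act -- this is my mental model of the Windows pipeline, and the 80-nit scRGB reference and the slider behavior are assumptions on my part:

    // When HDR is on, the desktop is composed in linear scRGB, where
    // (1.0, 1.0, 1.0) corresponds to a nominal 80 nits, and the "SDR content
    // brightness" slider picks the nit level that SDR white should map to.
    let sdrWhiteNits = 200.0          // whatever the user dialed in with the slider
    let scRGBReferenceNits = 80.0

    // An SDR pixel at full white becomes this scRGB value before being encoded
    // for the display:
    let sdrWhiteInScRGB = sdrWhiteNits / scRGBReferenceNits   // 2.5
    print(sdrWhiteInScRGB)

    // Nothing in this path knows what the monitor's own brightness/contrast knobs
    // were set to in SDR mode, which is why the result rarely matches the desktop
    // you had before flipping the HDR toggle.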
What Apple does, is that when your display brightness is 50% and you display HDR content, the HDR content will seamlessly appear at a brightness somewhere between 75% to 100% of the maximum screen brightness. That is a seamless HDR effect, giving you the whiter than white experience next to your other windows that just works.
I haven't encountered such issues with LG C9. I don't need to touch the settings on my TV when enabling/disabling Windows HDR (which puts my TV in HDR mode), the previous SDR-mode desktop brightness is achieved in HDR mode just fine.
Though I remember having read that the Windows HDR stuff works slightly differently for internal monitors (e.g. in laptops), is your experience with those?
I imagine this depends on the configuration of the TV, if only because — if your C9 is anything like my slightly older LG IPS HDR TV — backlight brightness, color calibration, and other settings that affect the brightness of full-scale SDR white can be freely configured in any of the SDR modes without affecting the levels of any other SDR mode, or of any HDR mode at all.
In other words, assume SDR "Game" mode is set to Backlight 50, SDR "Cinema" mode is set to backlight 25, all other settings are equal, and, therefore, 100% white is considerably brighter in "Game" mode.
Then both values for "white" cannot possibly match a single, fixed level set by any other mode, HDR or otherwise.
It's therefore impossible for Windows or any other input source to "just work" when switching from an arbitrary SDR mode to a preset HDR mode.
Again, assuming your TV works more or less like mine, and mode switches use the most-recently-used preset in the target "mode family" (meaning not only SDR and HDR, but also Dolby Vision, which maintains its own collection of presets), and that the various presets are independent of one another.
And if this is not the case, and presets cannot be set independently, then I'm glad I don't have a newer TV, because some of my presets have color settings that are wildly different from standard calibration (e.g., a preset resetting the display to its native, uncalibrated white point and gamut, used with video players capable of internal HDR tone mapping and color correction given a custom 3D LUT generated from measurements).
Dolby Vision traces its origins to the acquisition of Brightside Technologies (https://en.wikipedia.org/wiki/BrightSide_Technologies), which was itself a spin-off from a group at the University of British Columbia. I wonder how extensively Dolby has profited from the patents it accumulated via that acquisition, since many of the key ideas behind HDR displays must surely be covered by them.
Huh, when I started as a Master's student at UBC we got a tour of some of the labs in the ECE and CS departments there, one of which was the media lab that had one of these HDR displays. I remember being blown away by how realistic the display looked compared to an SDR display. It's cool that some of their tech lives on!
It can also be argued that Apple’s approach is a bit user-hostile. If someone wants their display at a 50% brightness ceiling (e.g. working in a darker room) it would be jarring to see HDR content overriding that, especially as such content becomes more prevalent.
The huge Windows flaw is that it isn't color managed by default. It means that colors in most applications look extra saturated on wide-gamut displays. I wonder if the same flaw applies to HDR... apps would look extra bright.
Macs and its apps have been properly color managed for decades. That's why the transition from SDR to HDR monitors has been painless. Apps have been ready for it for a long time.
Windows apps don't even look correct on an SDR display!
I have a wide-gamut display and I can notice the difference between applications that incorrectly "stretch" sRGB to the display gamut versus apps that actually colour manage and map the colours correctly.
No app on Windows colour manages the UI widgets such as the icons, toolbars, etc... This is because the WDDM shell doesn't do any kind of colour management, it leaves that up to the application developers.
The sad thing is that Vista introduced an extremely wide scRGB gamut and WDDM had a number of internal features built around it. Unfortunately, it was only ever enabled for full-screen games, video overlays, and for internal use by apps that do colour management. They should have converted the entire desktop manager to use it.
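For anyone who hasn't run into it, scRGB keeps sRGB's primaries and white point but stores linear values that are allowed to fall outside 0...1. The values below are illustrative only, not computed conversions:

    struct ScRGB { var r, g, b: Float }

    let sdrWhite      = ScRGB(r: 1.0, g: 1.0, b: 1.0)   // ordinary sRGB white
    let brighterWhite = ScRGB(r: 2.0, g: 2.0, b: 2.0)   // an HDR highlight, 2x reference white
    let widerRed      = ScRGB(r: 1.2, g: -0.1, b: 0.0)  // a red outside the sRGB triangle

    // A compositor working in a space like this can hold wide-gamut and HDR colors
    // without clipping, then tone-map and convert them per output display.
    print(sdrWhite, brighterWhite, widerRed)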
The "non-HDR" displays are actually high-quality, wide gamut displays. They may not have been certified for HDR when they were made, but it's quite likely to be HDR400 capable.
As far as I'm aware, Apple hasn't released a display on any device with less than 400 nits brightness in years. So HDR400 capability would at least make some sense at a surface level.
In particular https://gitlab.freedesktop.org/swick/wayland-protocols/-/blo... was linked which discussed how a wayland compositor would have to display mixed "HDR" and SDR content on the same display. This document even has references to EDR. Ultimately this would end up achieving a similar result as what's described in the blog post here.
If you're interested in the technical details on what may be necessary to achieve something like this, the wayland design document might be a good read.
I noticed this 'extra bright whites in video' effect on my iPhone 12 just after upgrading. I had to go and shoot 'normal' + HDR video of the same thing a few times back and forth to be sure I knew what I was seeing.
The first thing you think is, how was I OK with this terrible standard video to begin with? The HDR version just looks SO MUCH better and the standard looks so flat next to it. Like comparing an old non HDR photo with an HDR one.
"On these non-HDR displays, Apple has remapped “white” to something less than 255-255-255, leaving headroom for HDR vales, should they be called for"
What is the source for this? I don't see any justification for this claim in the article. There are plenty of ways to implement this feature that don't involve permanently throwing out dynamic range on all your SDR panels. I'm not even convinced from reading this that they aren't HDR panels to begin with - the idea of an iPhone having a 9-bit or 10-bit panel in it isn't that strange to me, and while that wouldn't be enough for like Professional-Grade HDR it's enough that you could pair it with dynamic backlight control and convince the average user that it's full HDR.
Considering Apple controls the whole stack and uses a compositor there's nothing stopping them from compositing to a 1010102 or 111110 framebuffer and then feeding that higher-precision color data to the panel and controlling the backlight. Since they control the hardware they can know how bright it will be (in nits) at various levels.
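Apps can already opt into that extra range today; here's a minimal sketch using the public AppKit/Core Animation hooks (my reading of Apple's documented EDR APIs, not the private compositor path):

    import AppKit
    import Metal
    import QuartzCore

    // Ask how much headroom the current screen offers: 1.0 means "SDR only",
    // larger values mean pixel components above 1.0 render brighter than white.
    let headroom = NSScreen.main?.maximumExtendedDynamicRangeColorComponentValue ?? 1.0

    // Configure a layer whose contents may exceed SDR white.
    let layer = CAMetalLayer()
    layer.wantsExtendedDynamicRangeContent = true
    layer.pixelFormat = .rgba16Float    // leaves room for values > 1.0
    layer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearSRGB)

    // Anything drawn into this layer with components between 1.0 and `headroom`
    // should show up brighter than #ffffff UI white, while ordinary windows stay put.
    print("EDR headroom on this screen:", headroom)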
I just tested the sample file provided in the article on my MacBook with QuickTime, IINA, and VLC. I can confirm the result in the article, but it is QuickTime only. Colors in IINA and VLC are not as bright as in QuickTime.
It's amazing to me that there are still people out there who don't use adblockers in 2020. The web is unbelievably painful to browse without an adblocker.
I've also gotten into the habit of using uBlock Origin to kill all of the "Accept Cookies" type popups without ever clicking accept. Those are just as intolerable as ads to me.
I think most users have never even heard of ad blockers or how much they would improve their web browsing life. It's the kind of thing people never knew they wanted.
On another note, I advised a junior colleague to install an ad blocker in her browser; her reply was: "How can I watch the commercials then?" Umm... OKAY.
My assumption:
1) people just didn't know better or
2) they actually loved those flashing ads all along
The author is rightly excited about HDR, but could be slightly overstating its effect in some examples.
> But at key parts of the story, certain colors eek outside of that self-imposed SDR container, to great effect. In a very emotional scene, brilliant pinks and purples explode off the screen — colors that not only had been absent from the film before that moment, but seemed altogether outside the spectrum of the story’s palette. Such a moment would not be possible without HDR.
I think the author knows that this is a special case of the effect where you limit the color palette to some range of colors for a duration of the film and then exceed that range in places—HDR isn't fundamentally needed to make this possible.
True, HDR can give a greater effect in absolute colorimetric terms when the full palette is revealed, but the perceived magnitude depends on how restricted the original palette was prior to the reveal, and how masterfully the effect is used in general.
Macs can also play HDR content on external displays, such as HDR televisions and projectors. This document lists Mac models supporting HDR: https://support.apple.com/en-us/HT210980
It seems that the new MacBook Air with M1 can't play HDR content on external displays. :(
The "Mac models that support HDR" table on https://support.apple.com/en-us/HT210980 says that MacBook Airs introduced after 2018 can only do SDR on external displays - if it doesn't include the 2020 Air, they'd surely have called that out?
It could be a function of their integrated graphics approach with the M1 (SoC), as opposed to simply having the right ports. That's just a guess, however.
I think I must be some weird outlier on HDR. I recently upgraded to a display that supports it, and when I turn it on, everything just looks darker to me. As though the brightness went to half at the same time that contrast was cranked up high. Playing with color and exposure and other settings just made things worse.
It sounds like a color profile mismatch. Specifically, it sounds like you're viewing linear light through something like an sRGB profile without the proper color conversion. If pumping gamma up to 2.2 makes things look more or less normal, that's a strong indication it's an sRGB issue.
Same here, very frustrating. When HDR is on, everything is washed out, no matter the color profile. I was hoping to use full HDR brightness for the macOS UI, but no luck so far.
I'm not sure why you would say it's not really HDR. It is. It's the base standard; the others use the same 10-bit dynamic color range. The 400/500/600 just refer to the maximum brightness of the display, something that may be a benefit but is not related to whether or not something is "true" HDR. HDR10 is true HDR. All HDR600 etc. displays are also HDR10, with the added benefit of increased brightness.
HDR600 etc. don't just require a certain peak brightness but also a certain maximum black level. That gives you an actual dynamic range; 10-bit is not a range, it's a resolution.
HDR10 is a 10-bit color depth, not a resolution. 1080p is generally considered the minimum resolution, but that has nothing to do with the "10-bit" part of HDR10: "HDR10 Media Profile, more commonly known as HDR10, was announced on August 27, 2015, by the Consumer Technology Association and uses the wide-gamut Rec. 2020 color space, a bit depth of 10-bits" [0]
Also, no, the 400/600 etc is just brightness: HDR10 and HDR 400 are the same, except HDR 400 mentions the level of brightness on the display. [1].
You may be thinking of DisplayHDR True Black, which is a further enhancement, but something different. Deeper blacks can also be achieved by non-OLED displays that support full-array local dimming (FALD), and it looks like most displays that support FALD are also HDR displays.
> Also, no, the 400/600 etc is just brightness: HDR10 and HDR 400 are the same, except HDR 400 mentions the level of brightness on the display. [1].
VESA [0] disagrees with you. Note the maximum black level luminance restrictions. Minimum brightness + maximum black level = minimum contrast. That's why legacy display technologies with mediocre static contrast (TN, VA, IPS) need backlight hacks to support those specifications.
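To make that arithmetic concrete, here's a rough sketch; the nit figures are illustrative stand-ins rather than quotes from the spec, so check VESA's current DisplayHDR documents for the real numbers:

    def min_contrast(min_peak_nits: float, max_black_nits: float) -> float:
        """A minimum peak plus a maximum black level implies a minimum contrast ratio."""
        return min_peak_nits / max_black_nits

    # e.g. a tier requiring >= 600 nits peak and <= 0.1 nits black implies at
    # least 6000:1 contrast -- well beyond typical LCD static contrast, hence
    # the local-dimming "backlight hacks" mentioned above.
    print(min_contrast(600, 0.1))  # 6000.0
    print(min_contrast(400, 0.4))  # 1000.0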
Having some similar issues. The thing is that the OSD controls are nice and bright, so the brightness is there, but all the actual video content seems mostly darker than on the previous non-HDR TV.
I actually just got an HDR TV too, in addition to the computer display... Luckily HDR wasn't a selling point for either decision. On the TV (Samsung Q70 QLED) I just don't notice a difference with it on or off.
The industry is in a very strange place right now. HDR monitors are both quite expensive and unimpressive compared to HDR OLED TVs.
For the $5-6k that this display from Apple costs, I would certainly expect it to be an OLED display at LEAST. But it's not.
How is this the best we can get, and it's NOT OLED? Dimming zones for $6k? I don't understand what's going on; I just want a nice OLED monitor that will fit on my monitor arm. I'll even pay what an AMAZING 55" OLED TV from LG goes for: $1,500-2,000.
OLED isn't super useful on computer screens where elements stay in a static ___location for a long time due to burn in.
Also the amount of bright colors used in computer interfaces would cause some significant discomfort.
And I'm not sure how legitimate this claim is, but I've read that OLED suffers from dead pixels at a higher rate than other panel types. Don't take that too seriously without proof.
> OLED isn't super useful on computer screens where elements stay in a static ___location for a long time due to burn in.
I see this repeated a lot. Do you have any numbers/images on actual burn-in in OLED screens? It would be interesting to know how long an OLED screen remains usable when used as a PC monitor.
Or to put it another way: burn-in does not seem to be enough of a concern to stop Samsung etc. from putting OLED screens into phones.
> Also the amount of bright colors used in computer interfaces would cause some significant discomfort.
The entire point of Apple's solution here is that UI's max brightness is not the display's true max brightness.
Indeed, burn-in tests almost feel anecdotal because they involve no more than a handful of TVs. We don't have the MTTF-type numbers for pixels that we demand for hard drives.
The problem with contrast is more pronounced on my OLED TV than on my HDR LED monitor. I've also noticed that on the TV, if I watch Netflix with standard-size subtitles, their brightness overwhelms the lower part of the image in dark shows. I suspect this is less of an issue on LED monitors only because the contrast is not as extreme.
Again, all anecdotes. I do like OLED though, enough to make it my priority TV feature.
OLED has compromises that make it unacceptable for some professional work. The main issue with it is the lack of brightness (i.e. nits) required for true HDR, compared to miniLED and LCD.
Mixed HDR/SDR content is an interesting problem. To me, the example where the thumbnail is presented with HDR color feels wrong, because it makes the thumbnail stand out in a distracting way, as if it were highlighted. The iPhone example, on the other hand, looks really good; note how it seems to switch from SDR to HDR rendering when the picture is selected. For desktop, maybe the simplest heuristic for choosing HDR vs. SDR is whether the window is fullscreen/maximized or not.
I had noticed this on my iPhone 11 Pro and I hate it. HDR doesn’t give you “whiter than white”, it just adds more steps of gray.
White is white. White on a display is the brightest point the display can display.
What Apple is doing, as the article explained, is showing regular white as gray. That’s not cool, that’s just stupid. It’s exactly what TVs at Best Buy do when showing SD vs. HD content: they ruin the regular image just so you can see the difference.
The issue is that my monitor is not a demo display, it’s what I use sometimes in daylight, and I’d very much appreciate that extra brightness that Apple takes away from me.
You know what this means for you? Everything you see and watch on your computer is not as bright as it could be. On an LCD screen that’s a big deal, because suddenly your blacks are brighter (the backlight is at 100%) and your whites are dimmer (Apple holds brightness back on the off chance that you have HDR content).
I think this same thing happens when displaying HDR photographs. The main photo taken has much brighter highlights, but the rest of the Live Photo frames don’t.
In case anyone is worried about getting free LASIK eye surgery from this at night, the effect seems to dramatically roll off at very low brightness levels.
I have one regret about this article, and that’s the explanation that the sRGB hex color #ffffff (255, 255, 255) is being remapped. It’s not. The OS simply isn’t restricting the display pipeline to sRGB anymore, which allows content that exceeds sRGB to do so.
#ffffff is L=100%. What is L=800%? It exists in HDR content, and we can’t just make the web color #ffffff a dim gray to the eye.
We must start thinking in terms of HSL or LAB or even RGBL, and consider that L > 100% is where HDR peak brightness lives.
HDR’s color space exceeds the luminosity that sRGB hex triplets can represent, and remapping HDR color spaces into sRGB hex gives you horrendous banding and requires complex gamma functions. The CSS Color spec is finalizing work on this, but essentially we’re in the last days of hex codes being a great way to express color on the web. They’ll remain fine as a last resort, but it’s time to move a step forward.
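One rough way to see the banding problem, assuming (purely for illustration) an HDR range of 8x SDR white squeezed into the 256 codes a hex channel gives you:

    CODES = 256          # values available to one 8-bit hex channel
    HDR_RANGE = 8.0      # luminance relative to SDR white (1.0 == #ffffff); assumed

    step_sdr = 1.0 / CODES        # brightness covered by one code step in SDR
    step_hdr = HDR_RANGE / CODES  # per step if those same codes had to span HDR

    print(f"SDR:       one code step ~= {step_sdr:.2%} of white")  # ~0.39%
    print(f"HDR remap: one code step ~= {step_hdr:.2%} of white")  # ~3.13% -> visible bands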
Apple is pinning sRGB hex #ffffff to “paper white” brightness because the hex color specification can’t encompass the full range of modern displays anymore. The difference between #ffffff and #fefefe can be enormous on a display with 1800 nits of peak brightness, and if you map #ffffff to peak brightness, you burn out people’s eyes with every single web page on today’s legacy sRGB-color Internet (including Hacker News!). That’s why HDR calibration routines put “paper white” at around 400 nits.
So, then, sRGB hex colors have no way to express “significantly brighter than paper white #ffffff”, and UI elements have little reason to use this extended opportunity space - but HDR content does, and it’s nice to see Apple allowing it through to the display controller.
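A minimal sketch of that pinning idea; the 400-nit and 1600-nit figures are assumptions for illustration, not Apple's actual values or API:

    PANEL_PEAK_NITS = 1600   # hypothetical panel peak
    PAPER_WHITE_NITS = 400   # where sRGB #ffffff gets pinned (assumed)

    def panel_nits(relative_luminance: float) -> float:
        """Map luminance relative to SDR white (1.0 == #ffffff) to panel output."""
        return min(relative_luminance * PAPER_WHITE_NITS, PANEL_PEAK_NITS)

    print(panel_nits(1.0))  # web/UI white: 400 nits, comfortable on legacy sRGB pages
    print(panel_nits(4.0))  # an HDR highlight at 4x white: 1600 nits, the panel peak
    print(panel_nits(8.0))  # anything brighter clips at the hardware limit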
But there’s no way to make use of HDR in web content - other than embedded images and videos - if we continue thinking of color in terms of hex codes. This insistence that we remap hex codes onto thousands of nits is why web colors in Firefox on an HDR display make your eyes hurt (such as the HN topbar): it rescales the web to peak brightness rather than to paper white, and the result is physically traumatic to our visual system. Human eyes are built for splashes of peak brightness, but when every web page pours light out of your monitor at full intensity, it causes eye strain and fatigue. Don’t be like Firefox in this regard.
“But how do we conceive of color, if not in hex codes?” is a great question, and a complicated one. In essence, you select color and brightness independently of each other, and then make sure the result looks good when peak brightness is low and doesn’t sear your eyes when peak brightness is high.
If this interests you, and you’d like to start preparing for a future where colors can be dimmer or brighter than sRGB hex #FFFFFF, here are a couple useful links to get you started:
As a final note, there are thermal reasons why peak brightness can be so much higher than paperwhite: your display can only use so much power for its thermal envelope. Yes, HDR displays have thermal envelopes. So overusing peak white, such as scaling #ffffff to the wrong brightness, can actually cause the total brightness of the display to drop when it hits thermal protections, while simultaneously wasting battery and hurting your users’ eyes.
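As a toy model of that thermal point (every number here is made up): sustained full-screen brightness is capped by a power budget, so the true peak is only reachable over a small portion of the screen.

    PEAK_NITS = 1600                 # assumed highlight peak
    FULLSCREEN_SUSTAINED_NITS = 600  # assumed budget for 100% of the screen at white

    def sustained_nits(average_picture_level: float) -> float:
        """average_picture_level: fraction of the screen at full white (0..1)."""
        if average_picture_level <= 0:
            return PEAK_NITS
        # brightness the power budget allows at this APL, capped at the panel peak
        return min(PEAK_NITS, FULLSCREEN_SUSTAINED_NITS / average_picture_level)

    print(sustained_nits(0.10))  # small highlight: the full 1600-nit peak
    print(sustained_nits(1.00))  # everything at "white": throttled to 600 nits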
That's already possible today, yes. Safari shipped support for all this live in production some time ago and I assume Chrome and Firefox are either already there or working towards it.
I've always wondered what it'll take for the major OS manufacturers to implement an anti-seizure filter for the content they transmit to their screens, and I'd bet that a flickering ad at HDR max brightness causing seizures worldwide one day will finally compel them to do so.
I agree with you in general, but one important aspect of "HDR" is 10-bit+ color processing on the display (not necessarily native 10-bit panels; 8-bit + 2-bit FRC is already an improvement). That lets you produce more distinct grays, etc.
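The gray-step difference is easy to see with a little arithmetic (just counting code values, nothing display-specific):

    for bits in (8, 10):
        levels = 2 ** bits
        print(f"{bits}-bit: {levels} levels per channel, "
              f"smallest step ~= {1 / (levels - 1):.3%} of full range")
    # 8-bit:  256 levels,  ~0.392% steps
    # 10-bit: 1024 levels, ~0.098% steps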
Yeah, I considered rewriting it to express colors as percentages and show the x/256 integer-division banding problem of hexadecimal colors, but for today's purposes I'll be satisfied if I inspire long-time web programmers to start realizing that sRGB hex is a dead-end way to think about color. It was enough to discuss the problem of peak luminosity exceeding 100% sRGB luminosity, and I couldn't find a way to do a better job on the one without doing a worse job on both.
Wow, no one is calling this the hoax it is. It’s reducing contrast on the SDR content so that the HDR content is subjectively brighter. I mean, it’s fine and all, but you’re not seeing HDR on a non-HDR screen. You wouldn’t be able to use it for color grading.
Did Apple really remap white with macOS Catalina on MacBook Pros, and no one noticed? For some reason, I find that hard to believe. But it also makes me believe in the vertical integration powers of Apple: just imagine pulling off this trickery on Windows!
> The company, long known for shipping high quality, color-accurate displays,
This is simply not true; it's a lie. Everyone in Hollywood and the professional print world who needed color-accurate displays that could be calibrated was using HP DreamColor displays or similar products from companies like BenQ.
Btw, if you want pretty much the same specs with native 10-bit support and 1000 nits peak brightness for pretty much the same price, the Dell UP2718Q has been discounted quite a bit recently (I got one for myself).
The entire field is larger than movie-studio post-production. Apple wasn’t trying to get into that space until recently, but their displays were well above average for less demanding users for decades. There was an entire business around getting this set up for Windows print/design/photography shops during the era when support was tedious and error-prone.
Maybe, but I just don't like seeing the lie that "Hollywood uses Macs." Of course, the PAs and screenwriters do, but movie editors, color graders, and digital SFX folks don't. And, of course, render farms don't.