> Apple is able to set entire new standards of expectation for customers that are _very_ hard for competitors to follow.
The cheaper version of this display has a price tag of $5k, the more expensive one $6k.
I never spent even remotely as much money on a display, so I cannot speak from first-hand experience. But a quick search for graphics-design displays yields several competitor products that have similar or better specs, at generally even lower price tags.
Apple does many things, but certainly not bleeding-edge innovation. Of course it likes to sell its products as such, but I guess that's just how marketing works.
> I never spent even remotely as much money on a display, so I cannot speak from first-hand experience.
Translation: I don't understand this display.
> But a quick search for graphics-design displays yields several competitor products that have similar or better specs, at generally even lower price tags.
Translation: Despite my lack of knowledge, I feel qualified to say Apple sucks.
Don't take it personally, it's just an objective view of things. I'm aware many people here have some emotional attachment to Apple products, but that doesn't warrant such a passive-aggressive response.
> Apple does many things, but certainly not bleeding-edge innovation.
What? Having been in the inner sanctum of engineering within Apple for years, that's exactly the engineering priority for the groups I saw and worked within. I'm genuinely curious why you assert otherwise; I find it surprising.
Apple is known for taking something others invented and making it better/more user-friendly. That's great, but it isn't being bleeding edge. Take the first few iPhones: smartphones made better, but missing simple functions like copy and paste for generations. I find it surprising anyone thinks Apple's strength is bleeding-edge innovation. The times they have genuinely been on the bleeding edge can be counted on one hand.
- A14 in the iPhone 12 Pro (first 5-nanometer chip with 11.8 billion transistors) [1]
Previously, they introduced the first 64-bit CPU in a mobile device, which stunned competitors at the time [2].
Not to mention their excellence in hi-DPI displays. In 2012, Apple launched the MacBook Pro with a "retina" display. It took _years_ for non-Apple alternatives to materialize. In fact, the only comparable laptop display in existence today would probably be the Dell XPS 17" UHD+ (which I own, and it's excellent). However, the MacBook Pro 16" remains superior when it comes to sound quality, form factor (i.e. weight), trackpad precision and latency, thermal performance, etc., despite significant investments from Dell in those areas to catch up. [3]
That's cool, but nothing on that list strikes me as a revolutionary kind of development. To me, revolutionary is the invention of GPS, the transistor, the combustion engine, or the lightbulb - those inventions changed the world.
With everything on that list, they made a better version of something that already existed, especially with displays - it's not like they were the ones developing and manufacturing the displays.
Even the ARM architecture and instruction set was not created by them.
> In 2012, Apple launched the MacBook Pro a with "retina" display. It took _years_ for non-Apple alternatives to materialize.
Ironically this is now somewhere they could stand to improve.
The MacBook displays are excellent, particularly when it comes to colour reproduction, but for the past several years they default to a scaled display mode. For anyone not familiar, the frame buffer is a higher resolution, and scaled down for the display, trading sharpness for screen space.
Evidently the drop in sharpness is imperceptible to most people, but I can certainly tell, to the point where I forego the extra space and drop it back to the native resolution.
For a company that generally prides itself on its displays, I think the right option would be to just ship higher res panels matching the default resolution.
They have also done this with certain iPhone displays over the years, but at 400+ppi it’s well within the imperceptible territory for most people. For the 200-something ppi display on the MacBooks, not so.
> […] but for the past several years they default to a scaled display mode. For anyone not familiar, the frame buffer is a higher resolution, and scaled down for the display, trading sharpness for screen space.
My understanding of how scaled resolutions in macOS work is that graphics are always rendered at the display's native resolution. The scaling factor only decides the sizing of the rendered elements. Can you point to some documentation that supports your view? I'd like to learn if I'm wrong and understand all the details.
deergomoo is correct, Apple’s “Retina” displays work by rendering all screen elements (images/icons/text) at 2x the linear number of pixels of their non-retina counterparts. Since it’s a fixed 2x scaling, the only way to have anything other than the native panel resolution (with elements at 2x their non-retina number of linear pixels) is to render into a frame buffer larger than the actual screen. That frame buffer is then scaled (by the GPU) to fit the actual screen size. Because it’s usually scaling down and not up, this theoretically results in only very minor blurring that most people don’t notice.
It used to be that this non-native scaling was only an option, and by default the MacBooks ran at the exact native panel resolution. But at some point that changed, so the default is now one “notch” on the “more space” slider. I presume most people preferred it that way, as you don’t get a lot of text on the screen at the native “Retina” resolution. But the sharpness is worse than when running unscaled.
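For anyone who wants the mechanics spelled out, here's a back-of-the-envelope sketch of the scaling arithmetic (the panel and "looks like" numbers are assumptions based on the 2019 16" MBP, not anything official from Apple):

```python
# Rough sketch of macOS-style HiDPI scaling, not Apple's actual pipeline.
# The UI is always drawn into a framebuffer at 2x the "looks like" resolution,
# which the GPU then resamples to the panel's native pixel grid.

def hidpi_framebuffer(looks_like, panel):
    fb = (looks_like[0] * 2, looks_like[1] * 2)   # everything rendered at 2x
    scale = panel[0] / fb[0]                      # GPU resample factor to fit the panel
    return fb, scale

panel = (3072, 1920)                              # assumed: 2019 16" MBP panel

# Native "Retina" mode: looks like 1536x960 -> framebuffer equals the panel, no resampling.
print(hidpi_framebuffer((1536, 960), panel))      # ((3072, 1920), 1.0)

# Default "more space" notch: looks like 1792x1120 -> 3584x2240 framebuffer,
# downscaled by ~0.857 to fit 3072x1920 -- the slight softness described above.
print(hidpi_framebuffer((1792, 1120), panel))     # ((3584, 2240), ~0.857)
```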
It's easy enough to set them to the native resolution first thing. They probably noticed a lot of people don't like small text, and so they set the default to scaled.
> In 2012, Apple launched the MacBook Pro a with "retina" display. It took _years_ for non-Apple alternatives to materialize. In fact, the only comparable laptop display in existence today would probably be the Dell XPS 17" UHD+ (which I own, and it's excellent).
Uhh, both Sony and Dell had 1080p, 1200 vertical and then QHD laptops in form factors down to 13" before Apple. I owned both before I moved to Apple myself.
1080, 1200, and 1440 are all of course smaller than the 1800 vertical resolution on the 15” MBP.
But it’s not just the resolution, it’s that Apple made such a high resolution usable via 2x rendering, and did so immediately for the entire system and all applications.
They are. But 1440 on a 12" laptop (like the Sony Vaio) competed well.
You can also get a 4K UHD Dell at 13".
> In fact, the only comparable laptop display in existence today would probably be the Dell XPS 17" UHD+
> But it’s not just the resolution
It was, above. Now it's the resolution and the ecosystem. "Apple did it first." "No they didn't." "Well, they were the first to do it right" (for varying definitions of "right").
I have no particular horse in this race. In fact, my entire home ecosystem from Mac Pro to MBP to iPad, iPhone, and Watch would, if anything, lean me in one particular direction, but ...
Wow, people talking about getting a 512GB SSD in their laptop for $2.5K as a good deal is kinda amazing - I think my $450 HP Pavilion with a 1050 and AMD 3500H might just be faster in every single way (but it definitely has lower build quality...)
I paid a fortune for flash storage from 2009 to 2016 and it was totally worth it.
There are some technological improvements that are so transformative (wifi, flash storage, high-resolution/"retina" display, LTE data, all-day battery life) that once you try them you never want to go back.
Then there are the changes that make you go "hmm..." (butterfly keyboard, touchbar without a hardware escape key, giant trackpad with broken palm rejection...)
> Not sure about bleeding edge, because for the most part their products work, but at different times they have defined:
I'm ex-Apple and an Apple fan as much as anyone, but I also have the benefit of being old. Not to take anything away from Apple's collective accomplishments, in many of these categories I'd say they "redefined" more than "defined".
There were many smartphones before the iPhone (the Palm Treos were great), many MP3 players before the iPod, many tablets before the iPad (the Microsoft Tablet PC came out about a decade before the first iPad), all-in-one PCs go back 40 years now, etc.
Xerox never made the Macintosh because they missed key innovations such as regions, the Finder, consistent UI guidelines, and the ability to put a usable GUI OS in an affordable package.
It doesn't matter what PARC's limitations were; what matters is that the Macintosh was a huge innovation that delivered what the Xerox Alto and Star were missing.
As someone who adopted smartphones years before the iPhone: anyone who thinks the iPhone was just “incremental” is deluding themselves. The capacitive display alone was a game changer, let alone everything else.
When the XDR came out it was competing with monitors that were 40,000 USD and up. (With some compromises, like the stand, which is just an art piece, and the viewing angles being a bit squiffy.) If the competition is now priced competitively then that’s very good for consumers.
Yeah, this, it really irks me how some people here keep parroting Apple's marketing without checking any facts/tests.
Their monitor is great for a consumer or prosumer monitor but just because it has PRO in the name doesn't mean it can dance in the ring with the actual PRO displays that are used to master million dollar motion pictures.
The point of the 6k Pro HDR wasn't to compete with the one or two $40k monitors used to master million dollar motion pictures.
It was to replace the five to ten other $40k monitors used in other parts of the production pipeline, by being accurate/wide/bright enough for that purpose, and to provide good-enough accuracy to a whole swath of jobs where it was dearly needed but far too expensive.
In Apple's keynote announcing the XDR they literally called it the "World's best pro display" and then proceeded to compare it to Sony's $40k reference monitor. I agree the comparisons are unfair and that at $5k it occupies an interesting price point in the market, but Apple really did bring it upon themselves.
The stand makes a bit more sense when you consider this monitor for movie use. Reference monitors need to move from on-site racks for director use, to studio use for editing. The ability to easily detach it from a fixed stand and move it to a cart for on-site use is an important feature.
Maybe only in small, low-budget indie filmmaking would the director be using the same equipment on set as they would in post. The kind where the director is also the editor, is also the writer, is also the producer. You know the ones. Typically, everything on set is a rental. The post studio would have their own equipment fitted for their rooms. I could see maybe a DIT taking their monitor to a cart and then back to a desktop in between gigs, but probably not even then.
Bit of a random rant, but despite being so common, 27” at 4K is a really poor combination for a computer monitor. At 1x scaling, everything is too small, but at 2x scaling it’s far too big. So you have to go for a non-integer scale, which on macOS at least results in reduced image quality and graphics performance (it renders to a different sized frame buffer and then scales it).
The ideal is 5K, which is double 2560x1440 in each dimension, but essentially the only 5K displays available are the LG one made in partnership with Apple, which is over a grand and has numerous quality issues, or the one built into the iMac. It’s really annoying.
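To put rough numbers on it (back-of-the-envelope only, nothing vendor-specific):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch for a panel of the given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# 27" 4K: ~163 ppi. At 2x it "looks like" 1920x1080 -- elements are too big, workspace too cramped.
print(round(ppi(3840, 2160, 27)), (3840 // 2, 2160 // 2))

# 27" 5K: ~218 ppi. At 2x it "looks like" 2560x1440 -- the sweet spot for a 27" panel.
print(round(ppi(5120, 2880, 27)), (5120 // 2, 2880 // 2))

# Getting 2560x1440-equivalent space out of a 4K panel means rendering a 5120x2880
# framebuffer and downscaling it to 3840x2160 (a 0.75 factor): lost sharpness plus extra GPU work.
print(3840 / 5120)
```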
I like that Dell display (I mostly use Dell and LG displays) and most people should certainly save the money. The areas where the specs aren't the same are really significant though (e.g., resolution, as you noted, is a major difference despite only being one of the specs. Brightness, dynamic brightness, and size are also quite significant and drivers of major cost differences in just about any monitor comparison.)
If you're only using it for light HDR work, both are almost equally usable, and if you're mastering HDR content all the time, you'll need a dual-layer LCD panel anyway (such as the Sony BVM-HX310).
But you're right, the additional $3,000 is definitely noticeable - whether it's noticeable enough to justify the price tag is another question.
Agreed; ultimately the XDR is for people who, for whatever reason, want a really nice, large monitor that's in sync with macOS' color management badly enough to pay the premium.
They're both IPS panels with local area dimming, the same amount of halo effect, the same color accuracy, and relatively similar brightness and dpi, especially considering there's a 4× price difference between them.
What's so "mediocre POS" about them in your opinion?
You've got a typo there: it's 4K at 27" with 163 dpi vs 6K at 32" with 215 dpi, so 1.5× vs 2× resolution. Which actually has an interesting effect when working with content, as the Pro Display XDR either has to show a letterbox around UHD content, or has to stretch it in a very blurry way with GPU upscaling, which definitely isn't useful for content production workflows.
And the halo effect actually is an artifact from the backlighting which can be resolved with the backlight recalibration cycle, which the Pro Display XDR does automatically and invisibly during times with purely SDR content, while it has to be manually run and is very obtrusive on the Dell one, that much is true.
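To put numbers on the letterboxing/upscaling point above (back-of-the-envelope, using the published panel resolutions):

```python
# UHD content on a 6K panel: either pixel-for-pixel with a letterbox, or a blurry non-integer upscale.
uhd = (3840, 2160)      # UHD content
xdr = (6016, 3384)      # Pro Display XDR panel

# Pixel-for-pixel: the content covers only ~41% of the panel area, the rest is letterbox.
print(round((uhd[0] * uhd[1]) / (xdr[0] * xdr[1]), 2))   # ~0.41

# Fill the width instead: a ~1.57x non-integer upscale, hence the blur.
print(round(xdr[0] / uhd[0], 3))                         # ~1.567
```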
The main issue with comparing specs is that the real differences don't show up in them, e.g. luminance and chromaticity uniformity (Delta E), among other things. Consumer-grade monitors usually don't have these, and when combined with a wide color gamut and a _measured_ color space (not a claimed one), we're talking about at least the higher-grade EIZO monitors, which are not less than $1k.
The monitors that are comparable to the Pro Display XDR are the ASUS PA32UCX ($4500) or the EIZO CG319X ($6000), which usually require full recalibration after a certain amount of usage.
Random story about routine calibration. The old Sony CRT reference monitors were weird with their calibration as well. Had a 32" 16x9 HD monitor that had the mass of a dying star. It was in a dedicated film xfer/color correction bay. Once installed, it never moved until we moved to a new ___location. Once in the new ___location, it had some weird anomalies even after the typical calibration steps. Our Sony tech realized that we had changed the orientation of the monitor 90 degrees. His explanation was that the earth's magnetic fields were the culprit. Rotating the monitor 90 degrees made the issue go away. Don't remember ultimately what the fix was, but it was fixable.
Very fascinating! I wondered about this and went searching around a bit, and found this bit[1] (though it's more about CRT TVs)
> When we used to manufacture TVs, we'd produce them for customers in the southern hemisphere too. When building these, we had to run them through our production lines upside down.
> When the old cathode ray tube TVs were built, the earth's magnetic field was taken into account during the production process. This ensured the current flowed the right way through the TV and so the TV was able to function normally.
There are both cheaper and far more expensive displays used for graphics design to special effects and major film mastering. In this case, Apple seems to be trying to bridge the gap with the XDR and create a new middle tier of price and functionality.
As for bleeding edge, Apple has pioneered plenty of technology. It's true that it builds upon foundations of technical designs and scientific discoveries by others but that applies to every other company as well. Very few organizations are capable of going straight from invention in a science lab to large scale commercial product all by themselves. If you judge by how much "new" technology has actually reached consumers though, Apple is clearly leading the field.