Just to be clear - supporting High-DPI on Windows is not as simple as adding scaled up resources. Instead, it is an absolutely royal pain.
For one, you can't just throw hires bitmaps into a resource section of .exe and expect them to magically work. There needs to be code that looks at the current DPI and picks a matching bitmap.
For two, standard DPI levels are 96, 120, 144 and 192, but guess what? All other values in between and above 192 are fair game too. In fact, there's a nice little slider in the Control Panel that encourages you to put it somewhere in between. This means that your code either needs to rescale bitmaps to match these odd DPIs or use the largest one that fits. In either case the result will look like butt.
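To make those first two points concrete, here is a minimal sketch of what that pick-a-bitmap-then-rescale code tends to look like in plain Win32. This is my own illustration, not code from the original comment; it assumes assets pre-rendered for the standard DPI buckets and sorted ascending, and it uses GetDpiForWindow, which needs Windows 10 - older code would query GetDeviceCaps(hdc, LOGPIXELSX) instead.

    // Sketch only: one pre-rendered bitmap per standard DPI bucket,
    // sorted ascending by DPI (96, 120, 144, 192).
    #include <windows.h>

    struct DpiAsset { UINT dpi; HBITMAP bitmap; };

    // Pick the smallest asset at least as dense as the window's DPI,
    // falling back to the densest one we have.
    const DpiAsset* PickAsset(HWND hwnd, const DpiAsset* assets, size_t count)
    {
        UINT windowDpi = GetDpiForWindow(hwnd);   // 96, 120, 144, 192, or anything in between
        const DpiAsset* best = &assets[count - 1];
        for (size_t i = 0; i < count; ++i)
            if (assets[i].dpi >= windowDpi) { best = &assets[i]; break; }
        return best;
    }

    // Draw the chosen asset at the physical size that logicalW x logicalH
    // (96-DPI units) maps to, stretching it the rest of the way.
    void DrawScaled(HDC hdc, HWND hwnd, const DpiAsset* asset,
                    int x, int y, int logicalW, int logicalH, int srcW, int srcH)
    {
        UINT windowDpi = GetDpiForWindow(hwnd);
        int physW = MulDiv(logicalW, windowDpi, 96);
        int physH = MulDiv(logicalH, windowDpi, 96);

        HDC src = CreateCompatibleDC(hdc);
        HGDIOBJ old = SelectObject(src, asset->bitmap);
        SetStretchBltMode(hdc, HALFTONE);         // the least-bad resampling GDI offers
        StretchBlt(hdc, x, y, physW, physH, src, 0, 0, srcW, srcH, SRCCOPY);
        SelectObject(src, old);
        DeleteDC(src);
    }

The stretch step is exactly where the "looks like butt" part comes from: anything that isn't an exact pre-rendered match gets resampled by GDI.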
For three - the dialog layout. If your dialogs have text that's longer than 3-4 words, the chances are that it will either overflow, underflow or wrap differently under different DPIs. This in turn means that you need to test dialog appearance with at least 4 different font sizes and Tahoma 8px for Windows XP. Do you know how hard it is to word a longish sentence so that it would fill about the same space with all 5 combinations? Really damn hard and very time consuming.
But wait! There's more.
Every app icon needs to exist in at least 9 sizes, like so - http://imgur.com/5Pe2ZV0 - and this would still miss some cases where Windows will scale an arbitrarily chosen icon image and use it.
It really is a mess. However, this is nothing unexpected if you've been writing for Windows for a while. This mess is routine.
The other thing to remember is that most likely less than 1-2% of your whole customer base is currently using high DPI monitors, so all these very high and very real costs are sometimes just not justified by the ROI, especially for more niche markets. It's much like supporting IE6, or maybe IE5, for the 1% that still use it - would you do it for a niche product the way you would for a mainstream app? This will change with time as these monitors become more common, but right now the payoff is still very low compared to the costs.
That seems like a lowball estimate even today. If you go shopping for a new Windows computer or tablet in Best Buy or Staples or some other American store, maybe 1/3rd to 1/2 of what's on shelves is high DPI now, from the $400 tablets up to $1200 ultrabooks. Lenovo, Samsung, Asus, HP, all the big manufacturers have 2560x1440 or 3200x1800 ultrabooks and hybrids out, and even 1920x1080 tablets are set to 1.5x or 2x DPI scaling because those pixels are packed into 10" or smaller screens.
Tablets yes. Notebooks no. 'Ultrabooks', whatever they are, maybe. The majority of laptops up to 15" and ~$1200 are still 1366x768. Source: I recently went shopping for a laptop with roughly that budget. In fact, I don't think you can buy anything 15" beyond 1080p on that budget.
I actually tested this several months ago for one of my projects and it was about 5% (on a sample of a few hundred). That's a very respectable 1 in 20. Not that I disagree with your ROI point, it's just that playing well with non-default DPIs is becoming progressively more important now.
But was your project more oriented towards a tech audience or the general population? Over a few hundred people, especially tech people, you're very likely to get a very, very low IE user base compared to the average user ;)
There's some bias, no doubt, and the sample is barely representative. But the thing is that there's lots of new hardware, like Thinkpads, that ships pre-configured with higher DPI. Drop by drop, but this will tilt the scales... if it hasn't already.
I'd argue that it's more akin to supporting IE 11 immediately after its launch - only a small % of your users use it currently, but you can expect the number to grow over time.
> For three - the dialog layout. If your dialogs have text that's longer than 3-4 words, the chances are that it will either overflow, underflow or wrap differently under different DPIs. This in turn means that you need to test dialog appearance with at least 4 different font sizes and Tahoma 8px for Windows XP. Do you know how hard it is to word a longish sentence so that it would fill about the same space with all 5 combinations? Really damn hard and very time consuming.
I don't understand that one. If the text is going to be a different size because it's scaled with the DPI, then hasn't the dialog itself scaled equally as well?
All sizes in dialog templates are indeed specified in "dialog units", which are based on the metrics of the dialog's font. So in theory, if you were to double the font size, all dialog elements would just double in size and all would be well.
In practice, this is not sufficiently accurate to accommodate text size differences when going from one font/size to another. It's just too crude.
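For reference, this is roughly how those dialog units turn into pixels - a sketch using GetDialogBaseUnits, which is based on the system font (a real dialog's own font is what MapDialogRect applies instead). The divisors 4 and 8 are the documented ones:

    #include <windows.h>

    // Convert dialog units (DLUs) to pixels. One horizontal DLU is 1/4 of the
    // average character width; one vertical DLU is 1/8 of the character height.
    void DialogUnitsToPixels(int dluX, int dluY, int* pxX, int* pxY)
    {
        LONG base = GetDialogBaseUnits();   // low word: base width, high word: base height
        int baseX = LOWORD(base);
        int baseY = HIWORD(base);
        *pxX = MulDiv(dluX, baseX, 4);
        *pxY = MulDiv(dluY, baseY, 8);
    }

Because the base units are whole pixels and rendered text metrics don't scale linearly with point size (hinting, rounding), doubling the font does not exactly double the text, which is precisely the crudeness being described.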
On Windows, requesting a particular font size is just a hope and a prayer. The OS will draw things at the size it thinks is best, with heavy-handed hinting, so the size of a rendered piece of text is quite non-linear with respect to font size: http://damieng.com/blog/2007/06/13/font-rendering-philosophi...
This makes it quite impossible for an application to implement a general solution to the problem. You pretty much have to treat HiDPI as another translation of the GUI with respect to sizing things.
One area where Apple has been incredibly aggressive — and Microsoft has been sorely lacking — is in actually producing and preparing high resolution content well in advance of the display technology.
Apple knew this was coming and for years had demanded that icons, images, and artwork be provided in high resolution. They developed assets for their own software in high resolution before retina displays ever hit the market.
Contrast this with the article, where Microsoft's own Visual Studio (one of the "best cases" mentioned by the article) doesn't even have Hi-DPI icons. Microsoft's own software displays blurry icons, and it's listed as a "best case."
Apple also pushed Intel very hard on driver support for their integrated graphics, rewriting the driver-level image scaling to ensure that discrete and integrated GPUs produced identical images - again, before retina displays ever came on the market - to allow for seamless switching between GPUs on their hardware.
This sort of foresight was necessary to pull off the high density display introduction as smoothly as they did.
Same goes for multiple monitors, which "just worked" when Apple introduced their high density displays. You could connect a regular monitor to your laptop and drag windows across. The DPI and art assets adjusted dynamically. This is something Windows is only just now getting around to fixing in version 8.1.
This is a few years old now, but is still one of the better explanations of the issues involved in modern font rasterization, with emphasis on how Microsoft's likely held back the entire industry by only realistically supporting widget fonts at the standard 96ppi. Really, anything other than the stock Arial 10pt/96ppi is going to cause layout issues.
In an effort to very-aggressively hint that font so it looks consistent, they throw out horizontal accuracy by rounding to pixel boundaries. Per-letter. Ouch.
The best part of the paper, though, is this image that moves the sentence to the right exactly 1/10 pixel each line:
I guess it makes sense to talk about "High DPI", because there is a kind of barrier. "Resolutions", or more precisely pixel counts, kept getting higher and higher for years, until they stagnated at 1920x1080. Display vendors couldn't go higher, because consumers would complain that "everything is too small". The reason is that the majority of software behaves like it's still 1995 and you have 640x480 pixels. Everything is a fixed pixel size, not a fixed physical size.
I'd say that all displays with a resolution (here I mean DPI) so high that it's uncomfortable to use with current desktop programs are "High-DPI". Once the software is changed to allow arbitrary DPI scaling, there will be no barrier, so there is no low or high DPI anymore, just higher DPI. (And ultimately, as soon as you can't see individual dots anymore, it won't make sense to increase the DPI further.)
Also consider the limits of human vision. For most people, at a half-meter viewing distance, something around 300 PPI is the limit of perception. At a meter (where most people view the 27 - 30" big screens) it drops lower than that, probably to 200 PPI.
Eventually it doesn't make sense to make denser panels because you don't see the difference anymore. This next generation of displays definitely isn't there yet, but it is "high" as in "we're getting there".
I personally can't wait until we cap out pixel density, because then we can get variable refresh rate and color accuracy back. I want my 300 PPI, variable-refresh-rate-up-to-120Hz, 10-bit color panel monitor for $300!
I think you are confusing high resolution with high DPI (Dots Per Inch). The first one is a measure of resolution while the second one is a measure of density. Sure they are both related, and I think your point gets across, but they are not the same.
Perhaps it's because we're reaching a point where our eyes can no longer perceive the individual pixels that we really have reached maximum resolution. Is there any benefit from higher resolution displays?
The ~235 ppi of the Retina display probably isn't there yet. The human eye is really a nifty organ. For example, it increases the spatial resolution of perception above the density of the rods/cones by sampling from slightly different viewpoints and integrating over time. See: http://clarkvision.com/articles/eye-resolution.html.
Also, there is the fact that you can move your face closer to the screen to look at something small. The eye can resolve over 800 ppi at 4". Finally, the fact that the eye can resolve say 300 ppi at a particular distance doesn't mean that's the best resolution for the underlying screen. If you want to render without tricks like anti-aliasing, you want to have enough resolution so that you avoid visible artifacts. E.g. imagine rendering two adjacent thin diagonal lines separated by 0.3 arc-minutes (the spatial resolution of the eye). You want enough resolution so that you can render those lines without their touching anywhere.
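A rough sanity check of that last point, assuming the 0.3 arc-minute figure quoted above: the pixel pitch needed to resolve an angle θ at viewing distance d is about d·tan(θ). At d = 0.5 m and θ = 0.3 arc-minutes, that's roughly 0.5 m × 0.000087 rad ≈ 0.044 mm per pixel, i.e. on the order of 580 PPI - well beyond a ~235 PPI panel, and you'd want even more headroom than that to draw such detail without visible artifacts.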
One advantage of high-density displays you sometimes see mentioned is that "we won't need to use anti-aliasing / sub-pixel rendering!"
However at the common "high" density used in cellphones these days, 320+ DPI or so, artifacts in non-AA fonts are still visible (not always, or to everybody, but sometimes anyway).
So, at least, there are some cases where higher-density than today could yield some benefit.
Draw a character in as few pixels as possible. You will probably end up at 8-10px height. 6 or 7 is possible but looks like shit. Make a display with a resolution so high that it becomes totally unfeasible to read this character without a magnifying glass. Then you have reached "high" DPI.
The transition all the way from 320x200 to today was possible because the same character was always readable, even though it previously covered half of the screen.
Search & read some articles on 'retina' DPI. Sure, no one can quite agree on the point at which DPI becomes so high that we can't notice further improvements, but the issue is that displays have recently been getting very near to that level.
'HiDPI' turned up as a MacOS term for a mode where physical resolution is higher than logical resolution. "High DPI" just means higher than 'normal' dots per inch. The terms are often used almost interchangeably.
You can say what you want about Apple in general, but the way they "fixed" this issue with the retina Macs is so much better than what Windows is doing.
Windows has had the DPI selector ever since 3.1 (or even 3.0), but because nobody traditionally tweaked the settings, nobody bothered to make their apps look right; and because nobody bothered to make the apps look right, nobody tweaked the settings, since running higher DPI modes was breaking apps all over the place.
Mac OS did this too - I think in the 10.4 timeframe there was an option to actually switch into a higher DPI mode, and many of the OS-internal UI assets were vector images. They probably noticed that it would never work out, so they opted to go for a hack:
With the exact quadrupling of the resolution, we got a solution that works mostly transparently for the applications. They still think they are drawing with a one-pixel resolution, so the burden of getting this right moved from the app makers to the OS maker.
Even if an application has no modifications for retina displays, it will look mostly right (minus some blurring issues for pixel art). There will be no scaling issues, no texts will be cropped and everything will be scaled by the same factor.
If you want to 'optimize' your app for retina, all you do is provide higher resolution bitmaps and you're mostly fine.
The exception is some text-editors (sublime) and browsers (chrome) that were doing some manual text rendering, not relying on the OS API. These had to be fixed manually, but there really weren't that many applications like that.
Yes, the way Windows does it is probably more "purist". Yes, the way Windows does it allows for arbitrary scaling factors (also sub 200%).
But it doesn't work in practice.
Yes. The Apple solution is a hack. Yes, it doesn't allow scaling to arbitrary factors. Yes, providing 2x bitmaps instead of one vector image is annoying.
But it works in practice.
On a retina mac, you'd never see the issues the OP complained about seeing on their Windows machine. What you get there might be superior from a technical standpoint, but it all boils down to an ugly half-working mess because developers just don't bother to get it right.
I'm not excluding myself here. I've done a few windows apps and I f'ed up high dpi modes as many times as everybody else - also because my development environment (Delphi) made some assumptions that just didn't work well with high-dpi modes.
Getting HDPI right on Windows (Desktop): Really hard and thus not worth it for most developers. Getting HDPI right on the Mac: Trivially easy.
> Even if an application has no modifications for retina displays, it will look mostly right (minus some blurring issues for pixel art). There will be no scaling issues, no texts will be cropped and everything will be scaled to the same factor.
That is how it (should) work in Windows too. All applications that do not explicitly claim to be DPI aware get just bitmap scaled, just like on OSX.
The real problem is that tons of applications claim to be DPI aware, but in reality are not. The reason Apple does not have these kind of issues is that their Retina thing is more recent and thus there are no legacy apps that claim to be DPI aware.
> The reason Apple does not have these kind of issues is that their Retina thing is more recent and thus there are no legacy apps that claim to be DPI aware
Not actually true. Many games are fairly messed up in windowed mode, full-screen mode, or both. Usually it's because of a bad middleware toolkit that either enables the Retina backbuffer or not, but the OpenGL scaling doesn't match the window scale. Especially prominent in games that use the older, Carbon method of exclusive full screen mode.
Aside: full screen games manage to be even worse on OS X than on Windows. That takes effort. (Games as full-screen spaces are a really nice, usable concept, but the performance hit is so nasty nobody wants to support it instead of the invasive and messy Core Video route.)
> All applications that do not explicitly claim to be DPI aware get just bitmap scaled, just like on OSX.
I don't believe that is true. If you render text using the right high-level OS X APIs, your OS-provided vectors are rendered natively without being bitmap scaled.
WPF can do this just fine also, but few apps are written in WPF :p (well, WinRT inherits this nice property). GDI, I'm not sure....
> All applications that do not explicitly claim to be DPI aware get just bitmap scaled, just like on OSX.
That isn't what happens for Cocoa apps on OSX, though. Non-retina-aware apps aren't bitmap scaled; the Cocoa UI stuff is still scaled up properly (XBench, which had its last release in 2006 targeting MacOS 10.3, looks perfectly sharp on an rMBP, for instance). It's just non-standard stuff that may be a problem.
Your understanding of how DPI scaling works on Windows is completely wrong.
> As far as I know there is no magical 'hey, I'm DPI aware' flag in Windows
Yes, there is exactly that kind of flag. Your application either sets the flag in its manifest, or calls the `SetProcessDpiAwareness` function.
> In Windows (I'm talking about the old native-code API here - no idea about Windows.Forms or WPF), when you say "draw me a 1px wide line here", you get a 1px physical line drawn.
No. The default (i.e. if you are not claiming to be DPI aware) is that your draws end up on a 96 DPI "virtualized" surface, which is then scaled up by the compositor (DWM) to the real DPI. Effectively that means that your 1px line becomes a 2px line. You can only draw real 1px lines on hidpi screens if you claim to be DPI aware.
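For what it's worth, a minimal sketch of the programmatic opt-in, assuming the Windows 8.1-era SetProcessDpiAwareness API from ShCore (the manifest route does the same thing declaratively and is what Microsoft recommends):

    #include <windows.h>
    #include <ShellScalingApi.h>          // SetProcessDpiAwareness (Windows 8.1+)
    #pragma comment(lib, "Shcore.lib")

    int APIENTRY wWinMain(HINSTANCE, HINSTANCE, PWSTR, int)
    {
        // Must run before any window is created; otherwise the process stays
        // in the 96-DPI virtualized mode described above and gets bitmap-scaled.
        SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE);

        // ... register the window class, create windows, run the message loop ...
        return 0;
    }

Setting the flag is the easy part; once it's set, everything described in the rest of this thread (asset selection, dialog layout, per-monitor DPI changes) becomes your problem.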
Mac does this even better. It does allow arbitrary scaling - for some definitions of arbitrary.
For example, my MBPR 13" is set to the setting between "Best (Retina)" (which is the exact 2x, 1280x800 I think) and "More space" (which is 1.52x, 1680x1050), so I get 1.77x the space (1440x900). You control the DPI there and everything remains nice and sharp.
Apps aren't even aware of this scaling, so everything works perfectly well.
I get what you're saying, but this is probably not a great way to think about it because it isn't arbitrary scaling, it's arbitrary backbuffer resizing. The way this works internally is to render to a HiDPI buffer at 1680x1050 Cocoa points (mapping pixels:points at 2:1). It's still @2x, the buffer just gets downsampled afterwards and the pixel density is high enough that it's difficult to tell in practice.
Yeah. But the virtual resolutions are fixed depending on the screen. There's no way to arbitrarily set a virtual resolution and the OS only offers this for screens it knows (see the Mac Pro review from AnandTech where they tried to enable HDPI with that 4K display: the OS only offered the 2x mode).
That's why I wouldn't call that arbitrary scaling. If they could offer it using their method, I'm sure they would.
It's not exposed in the settings, but it can scale to any apparent resolution. There are various 3rd party programs that can set any apparent resolution for you.
And of course, you can see this without any 3rd party software if you connect it to a monitor or projector with a weird resolution and enable mirror mode.
> Mac OS did this too - I think in the 10.4 timeframe there was an option to actually switch into a higher DPI mode and many of the OS-internal UI assets were vector images
There was, though the assets were big scalable bitmaps, not vectors (even on modern HiDPI Macs, the UI assets are still way too big). John Siracusa covers a lot of this (and complains about it not being used) in his MacOS reviews.
> Yes. The Apple solution is a hack. Yes, it doesn't allow scaling to arbitrary factors.
It actually does, through a _really_ horrible hack. Say you want 1680x1050 _point_ resolution on a 2880x1800 MBP. You can select that; what the OS then does is draws at 3360x2100, then scales it down to 2880x1800. Sounds horrible, but generally works very nicely.
It's worth noting, by the way, that Metro scaling works much like MacOS scaling; it's only the desktop that uses the older style.
I don't see what's so horrible about how Apple handles resolutions for retina Macs. It lets me have a HiDPI screen that either 1) shows the app in HiDPI the way it was meant to be or 2) (worst case) doubles all the pixels, making it about the same quality as 1x screens before. It also gives me flexibility in getting more space than native retina resolution without having too much blur. Furthermore, something that is great for accessibility reasons, it finally allows lower resolutions without having a blurry picture. A killer feature for my father - the upgrade in screen quality from his old Lenovo running at 2/3 native resolution to his 15" Retina is insane.
This hack does require a high DPI screen though. If you have a normal monitor on OS X, and just want to make stuff a bit bigger, it's going to look like crap.
Apple did have a solution in place for several releases that supported arbitrary scaling. It could be enabled via a developer tool. I'm guessing the 2x solution came about once they realised getting all of their developers to rewrite for arbitrary scaling was impractical.
I think it was much more urgent than that. They wanted to ship actual retina displays on actual Macs on a defined product cycle. They had a hard date by which this HAD to work. Getting developers to rewrite eventually is one thing, getting them to rewrite in time for the release of a specific product in 12 months time is another.
If Sony is planning to release a HiDPI Windows laptop, whether it works well is a problem for who? Sony? Microsoft? Every Windows app developer ever?
If Apple releases a laptop with a Retina display, it's Apple's problem. No ifs or buts. Everyone knows exactly where the buck stops.
I think Apple also realised that the 2x solution is all they will need. There is very little point going higher density than the current 15" rMBP display. So there is little point in supporting completely arbitrary resolutions — Apple are betting on this density as the standard for the future.
I think they also realise that our current "2x" solution is our future software's "1x" scale factor. When Apple's line of hardware is retina-only, we'll probably see them drop support for the distinction between 1x and 2x. Everything will simply become designed for the current density.
To me this was the key insight. There was this idea of a slowly, ever-increasing display density that we'd have to handle by making everything be vector based, since bitmaps don't scale -- which has a grain of truth, and a certain aesthetic beauty.
But in practice, the human eye has limits so density is not going to increase forever, and we only have a known, finite set of displays to support. So just provide bitmaps for the display densities that matter. Boom. Problem solved.
(Probably forever, depending on how much you believe the "retina" marketing claim -- the difference between the type on e.g. a retina display mac and a magazine seems small enough to me that I could easily imagine a push to "super-retina" not happening in my lifetime.)
I think it'll stick around for a long time, due to external displays: projectors, TVs, etc... which either don't benefit a great deal or won't be upgraded to hiDPI for a long time.
I posted a separate comment to the effect that the Apple solution also allows using both hi- and regular-DPI displays simultaneously with no issues.
Given the push to 4K on TVs and other video viewing devices, it seems like they'll catch up fast enough. 4K at the typical viewing distances is pretty close to retina in terms of the number of degrees of your vision each pixel covers.
One cool thing about the way that Apple do drawing is that (at least on iOS, I think on OS X too (CG* APIs) - I'm making the assumption this is a PostScript/Display PostScript holdover) the drawing APIs use floats for positioning instead of integers. So a retina pixel is just a whole number pixel+0.5. This was a pain before retina screens appeared, since anything not positioned on a whole number would render blurry. But then retina displays appeared and the reason for it clicked.
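A tiny illustration of that half-pixel business (Core Graphics is a plain C API; this sketch just assumes you already have a CGContextRef for the view being drawn): a 1-point hairline stroked at an integer y straddles two pixel rows on a 1x screen and comes out grey and blurry, while offsetting by 0.5 centers it on a row; on a 2x backing store that same .5 coordinate lands exactly on a device pixel.

    #include <CoreGraphics/CoreGraphics.h>

    // Stroke a 1pt horizontal hairline. The +0.5 keeps the stroke centered on a
    // pixel row at 1x; at 2x (retina) the half-point falls on a real device pixel.
    void DrawHairline(CGContextRef ctx, CGFloat y)
    {
        CGContextSetLineWidth(ctx, 1.0);
        CGContextMoveToPoint(ctx, 10.0, y + 0.5);
        CGContextAddLineToPoint(ctx, 200.0, y + 0.5);
        CGContextStrokePath(ctx);
    }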
I doubt it's because of HiDPI. I expect it comes more from the GL heritage inherent in the system, where the pixel/texel ratio can be anything you want.
No, he got it right the first time: it's due to the PostScript heritage. Cocoa predates OpenGL. Back in the '80s, HiDPI was called print resolution, and applications used the same commands to draw to screen or paper years before TrueType was available. (Though NeXT systems did typically use raster fonts for most on-screen use because PS wasn't very good at antialiasing or hinting.)
I was actually shocked at how badly Apple handled Retina, and even more shocked and disappointed that Windows did it worse.
Apple basically made everyone rewrite their apps instead of doing something sane - though admittedly they never had real support for multi-dpi, so perhaps that's an at least understandable solution.
Microsoft have always had support for changing the DPI in the operating system, but in Windows 8.1, instead of reusing that support and improving the code so that you don't need to log out and back in to change DPI, they started doing Retina-style scaling - but far worse.
High-DPI apps (edit: non-Metro, I should clarify) in Windows 8.1 actually don't look good on a low-DPI screen. (You need an e.g. Retina main screen and a standard external monitor to experience this.) Sure, they look a lot better than the pixelly stuff on the main screen, but instead of rendering at standard DPI Windows instead renders at high DPI and linearly scales down the resulting image in the window manager, resulting in text that looks strangely off because it has been rendered at high DPI with appropriate hinting and then scaled down.
This is not to mention what happens when you have a program half on one monitor, half on the other... it's either tiny on one screen or humungous on the other. Retina at least does this beautifully.
> Apple basically made everyone rewrite their apps instead of doing something sane - though admittedly they never had real support for multi-dpi, so perhaps that's an at least understandable solution.
They did not make everyone rewrite their apps. The only apps that actually needed to change were the ones that were rendering text without using the Text rendering APIs of the OS (a small minority of apps).
Everything else just worked totally fine.
Yes, some icons might have been blurry, but functionality-wise everything was fine.
This is in contrast to Windows where switching to HiDPI mode usually means cut-off texts, invisible buttons (pushed out of the dialogs they are in) and, as noted by the OP, sometimes a mish-mash of correct and incorrect rendering within the same window.
You tell us you were shocked by how it was handled by Apple, so what would you have done differently? How would you have solved the issue in a way that requires even less work by app developers?
Honest question. I really can't imagine a better solution than what Apple has done, but then again, the bulk of my knowledge is in databases, web servers, web applications and devops, so I'm sure I miss something that you can see. Hence I'm asking.
In fairness, high-DPI screens on Windows are a new thing. If (and I realize it's a big if) apps get fixed over the next couple of years to handle different DPI settings, then the end result will be a lot more usable for e.g. vision-impaired users who want to use a higher DPI setting than the default.
Not a new thing at all. I had an Acer laptop back in 2006 that ran XP and actually came configured with high DPI. Took me a while to find how to switch it back to the default, but the point is that even XP was DPI-aware.
Sure. I ran windows (presumably 2000?) with high DPI back in 2002 (on a good ol' 2048x1536 CRT - it's pathetic how long it's taken vertical resolution to catch up to where we were. But I digress). But apps weren't written to support it, and many of them had the kind of problems you see in this post.
Not to mention, on a Mac you can have a combination of hiDPI (internal display on rMBP) and regularDPI (external display) and it works seamlessly.
I'm going to take a guess and say the same isn't true for Windows?
Good to know. I have no interest in upgrading to Mavericks and even less interest in full screen mode, but at least now if I ever end up there by accident, I'll be able to get back out!
Yes, the complaints about full screen mode not working on multiple monitors always seemed idiotic to me. Full screen mode is designed for small laptop screens, to free up some extra screen real-estate. Not for multiple displays.
Well at least Adobe is consistent. The Flash Installer is awful on OS X as well. For example, say you've disabled the translucent menu bar - you'll notice it's drawn translucent when the Flash Installer is the foreground app. (Nevermind the fact that the installer is a just an annoying wrapper around the OS X installer that seems to do nothing more than force you to quit your open browsers before it will continue. With some digging you can actually find where it's downloaded the real installer pkg, double-click that pkg to install it, then quit the wrapper.)
Is the Flash installer running as a different user? Menu bar translucency can be set per-user and translucent is the default, so if, e.g., you run an OS X GUI app as root using sudo, and haven't disabled translucency for root, you'll see this behavior.
Wow... night and day compared with Safari. Can't for the life of me find details about the different approaches via search. Any links to why the browsers differ?
One of the most depressing areas of Windows to look at is the various control panel screens. Some of them, I kid you not, will show different bits of text in up to three different sizes. If Windows itself doesn't deal with different DPIs correctly, what kind of example does that set for application developers?
Another application that definitely deserves mentioning is Chrome. It looks absolutely terrible. It's really blurry and browsing the web for even a couple of minutes really strains my eyes. If you run high DPI Windows, you're going to need a different browser than Chrome (Firefox looks fine but unsurprisingly does not perform so well at such a high resolution. Internet Explorer is actually not a bad choice - high res and fast)
In Windows 8.1 you can override the scaling for individual apps through their properties dialogs. Set Chrome to not scale, then set the zoom within Chrome to be > 100%.
The url bar and tabs will be a bit small but the page content will look good.
TIL. I upgraded to Windows 8.1 but I didn't know about this. Anything Java-based doesn't handle high DPI and will simply be doubled into a blurry mess. I actually turned off DPI scaling just because of Java, but if I can control it on a per-application basis that would be preferable.
Awesome tip, thank you. I have an 8.1 license but have to install Windows 7 first, and hadn't bothered with the extra time to upgrade it yet. I know what I will be doing this weekend :)
If you can stand to have tiny tabs, set the compatibility mode for Chrome to disable high-DPI scaling and set the default zoom in the browser options to something like 175%. It's a trick that works with many apps.
Eh, I have a Yoga 2 Pro like the article's author and Chrome is working OK (well, it broke so badly on one upgrade that I had to reinstall, but that aside)... but it has a lot of weird artifacts you notice from time to time, like the drop-down for the URL bar splitting in the middle, with the left side ~2em lower than the right side.
Fonts specified in points are rendered using the DPI setting correctly, as they always have been. So, as far as 99% of my work is concerned (terminals and text editors), there is no problem.
However, GUI elements of most programs are still specified using pixels, so buttons, menus and some icons might look "tight" around the text if the theme is sized using pixels. Recent GTK/Qt versions do support point units, so it mostly depends on the theme you choose. These toolkits have also always (historically) had resizable dialogs, contrary to Windows and Mac, so even with "bad" support for high DPI, the text will be readable and the interface will be scalable.
Legacy apps, though, such as WindowMaker dockapps (which are historically sized at 64x64), will be unreadable. Frankly, I already can't read them anymore.
Given the configurability of most programs, I would say it's really not a problem. Most of the other programs will be adapted very quickly.
Would some smart window manager additions be in order here? I could envision a titlebar context menu option to force a window (or all of an app's windows, probably) to render at 2x (composited by the WM) so that we have control over all the edge cases.
With composition, this would be easily done (ie: upscale the target window). A plugin for compiz would be very easy to do and control from the user perspective, but that actually would be half resolution.
KDE looked like this in the late '90s. Then they added settings for the default font size (or did I just find them by then?) and started improving. Now, with KDE 4, everything scales all the time; there aren't even many bitmaps anymore, because they break on any kind of display.
Gnome 3.10 has it mostly right. I find there are fewer issues on Gnome than on Windows.
But then there's the problem when you open an app which is not native GTK/Gnome, for example Libre Office, where the fonts look OK but the icons are minuscule.
The Desktop itself and the Gnome applications look awesome on high DPIs.
Also Firefox works flawlessly, though on Linux you have to manually set the option "layout.css.devPixelsPerPx" on about:config.
Well... X has known about DPI for ages. Unfortunately, people have been using pixel-based shortcuts for ages too, for almost everything. The way Apple used to "fix" it is curious considering NeXT was, AFAIK, DPI-aware from the start.
Not so curious. As numerous comments here show, just about every display subsystem (be it X, Mac OS or Windows) has been DPI-aware for some time, but the 3rd party software that runs on these systems has made woeful use of it. Apple's fix gave consistent, reliable results.
Not very well. Chromium/Chrome is almost unusable, ditto for Skype. Firefox is ok, but many of the icons are quite small and hard to decipher, and you have to use NoSquint to make the pages readable. The Dev Tools are even worse.
Gnome terminal looks good though :) Many Gnome apps are fine, but Gnome 3 itself has problems.
I can't even use my Windows 8.0 on my 32-inch 1080p TV sitting a couple of metres away, even while keeping the resolution and boosting all the accessibility settings to the max.
Even the metro Apps I can't use.
I'd need a 50ft display! It's another facet of display/accessibility issues.
I had high hopes after Media Centre that Windows would get this right, and bought Windows 8.0 specifically for this purpose. Disappointing. I'm forever going over to the TV and my neck hates it.
Personally I think part of the solution is to change the menuing system, breaking it out from the window. If I can at least use the menus/controls on apps I have a good chance of using them.
Weird, I've got my Windows 8.0 laptop hooked up to my 40" Samsung 1080p telly (bought in 2007) and it all looks good, even from ~3m away... and that's using just the VGA input (Dell didn't stretch to a DVI port on their cheapo Vostro 1720 range back in the day).
Well, your eyesight is way better than mine. Metro apps are almost usable; the desktop is barely usable (I have to walk over to the set, and that's with the text size increased). I have a very small (not titchy) living room.
I've had the pleasure of using a high-DPI display on Windows 7 for a while, now. There are some tricks that help significantly:
1. Calibrate ClearType to use as little color as possible (use the system magnifier to help); this way, when apps get scaled the text is just blurry rather than blurry with odd color-fringing.
2. For apps that are incompatible but suitably configurable, set their compatibility mode to disable HiDPI scaling and then set their font sizes (or default zoom) to be larger. This works well for Chrome and Skype, at least.
3. For those times when you momentarily have trouble, remember that the windows key and the plus key will zoom your whole desktop.
Remote desktop is really annoying on high DPI screens - the client can't scale up (although it can scale down), so you end up with a tiny view of your server.
Isn't the credit due to the choice by Apple (NeXT, actually) to go with Display Postscript?
I'm sure they figured that, as amazing as the resolution of the original NeXT monitor was (1120×832 in 1988), in the future it would be greatly exceeded. Not to mention they wanted to be able to use the same routines to draw to 300 or 600 dpi printed paper as to the 72 dpi (or whatever) monitors of the time.
OS X has never used Display Postscript (partly because they probably didn't want to pay license fees to Adobe and partly because in a composited environment with client-side rendering there's no need to marshal drawing commands into text that looks like Forth). The OS X UI is full of bitmaps that had to be redrawn for retina.
This is one of the biggest reasons I just can't move away from my regular-DPI laptop. Every time I have tried a HiDPI display with Windows it is painful. Text looks great but icons and widgets look like shit. Things get better with every new version of Windows, but Microsoft have been promising true DPI independence since Vista, yet things are still crap.
Back in Windows 3.1 and Windows 95 I remember this stuff actually working quite well, a lot of people I knew were running Windows with non-default DPI settings. As newer versions came out, and also as people started using third party GUI toolkits more and more, things started getting less consistent.
I think those old settings only concern fonts. Icons and other pixel-measured things would still look ridiculously small (making it a mess of small-things-with-big-fonts).
Changing your DPI settings on Windows can cause all kinds of strange issues. I had a problem a few years back where my Windows 7 gaming theatre PC thing I had set up and connected to my 1080p TV wouldn't work quite right; specifically Star Wars: The Old Republic wouldn't launch, and would crash on loading.
I couldn't figure out what the problem was, and googling was no help until I stumbled across the dumbest advice I'd ever seen: change the text scaling. Once I set it back to 100%, the game loaded fine. Frustrating, because at 100% I couldn't read any of the other text, which meant that if I wanted to play the game I basically had to navigate by icon and give up on reading dialogue boxes unless I wanted to sit on my coffee table instead of my couch.
I'm actually impressed that windows apps support it at all. I didn't realize it could actually work properly if programs took the time to do it right. With High-DPI monitors finally coming out I'd expect dramatically improved support moving forward.
As someone who prefers high-DPI screens for workflow management across programs (spreadsheets, web layout, graphic design, etc.), this scaling problem in programs is a frequent frustration.
Particularly working with media (DSLR images or 1080p video), it is difficult to scale GUI to a readable level while viewing the media itself at an un-scaled 1:1 pixel level. Do any editing suites for photo/video do this well? I'm talking about an un-scaled video window with a GUI that scales/wraps to an arbitrary window/screen size...
I always liked what SGI Irix did. From the get go the desktop was all vector based. So you could run at 800x600 or 1280x1024 and either way things would look normal.
How did icons and stuff look, then? I thought vectors aren't the silver bullet they are made out to be, especially for small/low-res graphics. Case in point: all the craziness with fonts and hinting etc. to make sure they look good at 12px.
The icons were vectors, but typically drawn at an isometric view and combined with other components (like the "sheet of paper"/"magic carpet" underneath apps). When you click once to select the app, the paper turns yellow. When you double-click, the paper moves back and up behind the icon.
It's funny that, in an article describing issues with a high-DPI display, the author never explicitly says the size of the screen (e.g. in inches), or the DPI. He just tells us it's 3200x1800. In case you were wondering, the display on the Lenovo Yoga 2 Pro is a 13.3", 16:9, 276 DPI display. Personally, I find this description more immediately useful than saying that it's a 13.3", 3200x1800 display.
I guess the best way to solve this would be to (partially) ignore the DPI awareness flag in applications and instead apply a series of heuristics and white/blacklists (with user override) to determine if applications truly are hidpi capable. That would be a bit of a pita for the applications that actually do support hidpi properly now and might still get bitmap scaled, but that would be just a temporary issue.
Windows does let you disable high dpi scaling on an app by app basis (if you right click an app icon, pull up properties and go to the Compatibility tab).
In Windows versions prior to 8.1 the UI wouldn't let you do this for 64-bit executables, only 32-bit ones. For 64-bit apps the checkbox to disable dpi scaling was greyed out, even though if you set the key in the registry (using the same key that the UI would set for 32-bit apps) it would work just fine for 64-bit apps too.
In 8.1 they fixed this stupidity and the Compatibility tab UI will work for any app regardless of it being 32 or 64 bit.
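For the curious, the setting appears to live in a per-user AppCompatFlags key; the exact path and token in this sketch are my assumption (they match what the Compatibility tab is commonly reported to write), not something confirmed in this thread:

    #include <windows.h>
    #include <wchar.h>
    #pragma comment(lib, "Advapi32.lib")

    // Assumed key/value: HKCU\...\AppCompatFlags\Layers, value name = full exe path,
    // value data = the "HIGHDPIAWARE" layer token that disables DPI virtualization.
    void DisableDpiScalingForApp(const wchar_t* exePath)
    {
        const wchar_t* subKey =
            L"Software\\Microsoft\\Windows NT\\CurrentVersion\\AppCompatFlags\\Layers";
        const wchar_t* layer = L"HIGHDPIAWARE";
        RegSetKeyValueW(HKEY_CURRENT_USER, subKey, exePath, REG_SZ,
                        layer, (DWORD)((wcslen(layer) + 1) * sizeof(wchar_t)));
    }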
What bothers me is that this crisis has been obvious since well before Apple started releasing their Retina screens on mobile devices. X has 75 dpi and 100 dpi fonts for a reason and we had 2D acceleration on every decent computer for what? 15 years?
We've been using pixels as size units for no reason other than laziness.
Vector rendering is still not advanced enough to completely (or even mostly) replace bitmaps. Until they are, we'll have to redline our bitmaps for each resolution manually. It's not just a matter of laziness.
> Vector rendering is still not advanced enough to completely (or even mostly) replace bitmaps.
Wait, what?!? Vector rendering matured before bitmaps were commonplace. Vectors are faster and use less memory; that's why programmers of the '70s and '80s loved to use them.
Bitmaps are easier. That's the only reason people use them. And only in a world where everybody gets images on the same size and resolution.
My wife does a lot of pixel redlining at her job: the artifacts they produce are all done in Illustrator, a vector program. And believe me, they would love to leave them as vectors...less work for them! However, they must be converted to bitmaps and manually redlined to eliminate sub-pixel alignment problems.
Bitmaps are not easier, otherwise the designers would be cranking out pixels directly in Photoshop and not Illustrator. The fact that NO ONE has produced a decent general purpose vector renderer that avoids pixel artifacts, even a non real-time one, means that my wife will be redlining for quite a few more years. Even font rendering (arguably simpler than icons) requires substantial hinting to be effectively resolution independent.
The high density displays we are starting to see may end up solving the artifact problem - we won't be able to discern individual pixels for much longer.
The other problems: elements that vanish, such as lines and thin structural rectangles, would require some pixel-related restrictions in their definition. I'm not sure vector programs are able to deal with them right now, but it would be useful.
Right. I was replying to the parent in that we don't still do bitmaps because we are lazy. Quite the contrary: we have moved to vector-based tools for producing application art assets and would love it if we could just output to SVG or something similar and be done with it. However, the current reality is that we still have to convert our vectors manually to bitmaps, even those for high-res screens. Rasterization is just still not good enough to do the job automatically (it works fine for objects in motion, where the imperfections aren't very noticeable, but not for those that don't move).
I understand there is a need for some pixel-tuning, but most of it can be expressed as rendering constraints on the vector images, such as "this rectangle has to be, at least, five pixels wide".
Every time I've upgraded my monitor, I've found it easier. If you don't want a window to be wider than a certain point, don't make it wider. Myself, whenever I see apple users, I see them with these tiny narrow windows with nothing else on screen, which I have never understood.
That said, I always value vertical resolution - I have never bought a 1920x1080 monitor as 1600x1200 is so much nicer, or 1920x1200 for the wider aspect ratio.
He is not complaining about the windows being too wide; he is complaining about the scaling that Windows 8 applies when it detects a high-DPI screen, which makes every element inside a window either too large or too small, or sometimes both, breaking the original design and fluidity of the application and rendering it unusable.
And your point about Apple users... well, I don't know what your point about Apple users was.
His point about Mac users is that they tend to divide their screens into tiles, while Windows users tend to maximize the windows and switch between them. I found myself doing that on a Mac too (I used to have a machine of each OS on my desktop), and it was mostly because I never got used to the dock or Exposé or even the way MacOS maximizes windows.
I don't think that is what he wrote, but anyway, what you say has to do with how each OS works and neither is inherently bad. I do think however that the way Windows maximizes applications is more consistent, which is always better. Maybe that is why on a Mac one tends not to maximize a window as often as on Windows.