Not exactly breaking news, unless you've been living in a closet since widescreen panels came out.
It is nice that some manufacturers (especially Microsoft) are putting out great aspect ratios for productivity.
For productivity, I've found that the sweet spot seems to be either an ultrawide (for side-by-side applications) or multiple screens at 3:2 / 4:3 / 5:4.
16:10 is much better than 16:9 though, especially on a small device.
I'm getting ready to sell my 16:9 ThinkPad X1 Carbon for the 16:10 X1 Nano or the 2021 X1 Carbon. I'm OK with my 43" 16:9 monitor because it's so big I still get the vertical real estate, but my ThinkPad feels as though it's really only useful for watching 16:9 videos, which I don't do.
If 2021 is the year of 16:10 for the ThinkPad, then I'm hoping 2025 is the year of 3:2, and maybe by 2030 we'll have come full circle back to 4:3!
I suspect it's due to competition from the 2020 Dell XPS 17. I prefer ThinkPads for their keyboards, and I'm a TrackPoint user, but if I really needed 16:10 I would get something from Dell's XPS line. With the LG I'd be taking a risk on ergonomics like the keyboard and trackpad, whereas I know what the XPS feels like.
I picked up a 17" Dell XPS with the updated display. It is absolutely glorious. I was able to add an M.2 drive and upgrade the memory. I had to do a bit of work to get the Killer NIC working with Linux, but it's a solid machine, and remarkably small for such a large screen.
The keyboard/trackpad on the LG is IMHO better, and having a 17" screen dense enough that you can't make out individual pixels, plus the added vertical height, makes it no contest for productivity.
I took a risk on it as well, and it's served really well as a daily driver. The XPS, OTOH, is a (badly executed, non-Retina) MacBook clone.
LG brought out the first lightweight 17" laptop two years ago, when everyone was still on 15", but frankly, if these machines don't get support, I'm afraid LG will stop releasing them.
I have an ultrawide LG display and it has really improved my life, especially for video calls. I can keep the call window big enough in one half and use the other to read code or documentation, look at a graph, etc. Before, I was constantly switching the focused window or not seeing the other person(s).
For writing, I have a 27" 16:9 monitor rotated 90°, i.e. 1080x1920. It's absolutely unusable for programming, since it barely fits 80 characters at a usable scale, but it's like having a giant page in front of me when using a word processor/text editor.
In the office I have two 1080x1920 panels, which can each display two columns of code with room to spare. I use the X11 6x13 “fixed” font, and emacs with fringes but without scrollbars, so each column is a few more than 80 characters wide.
At home I have two 1200x1920 panels, a bit more horizontal breathing room, but my code is still less than 80 wide :-)
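For the curious, the arithmetic works out roughly like this (a quick sketch in Python; it ignores the handful of pixels that fringes and window borders eat):

    # Columns of code on a portrait panel with the X11 6x13 "fixed"
    # font, i.e. 6 px per character cell.
    for width_px in (1080, 1200):       # 16:9 and 16:10 panels in portrait
        chars = width_px // 6           # character cells across the screen
        print(f"{width_px} px -> {chars} chars -> {chars // 2} per column of two")
    # 1080 px -> 180 chars -> 90 per column
    # 1200 px -> 200 chars -> 100 per column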
Soft-wrapped code tends to be pretty difficult to read, though; certainly much more difficult than wrapped prose. Personally, I pretty much always prefer horizontal scrolling over wrapping code (at least on a Mac, where horizontal scrolling is easy).
Xcode used to be really good at softwrapping code. It was so useful with the assistant editor on a laptop screen. It got worse at some point around Xcode 8 or 9, which was a shame.
Word processors and typesetting tools usually have word wrap enabled; that's undesirable for source code. The 9:16 screen may still be useful for programming, though, because it's nice to have some documentation out on another screen.
No, 4K is called 4K because it's based on cinematic definitions, while the older resolution names are based on TV definitions.
For TV you count the height, as that's the only thing that matters: you can always add more high-frequency signal to increase the width. That's why TV signals, e.g. on DVD, use a 480-line raster for both 4:3 and 16:9; you just stretch the content, no problem on an analog TV. (In fact, the "p" doesn't mean pixels; it means the content is progressive rather than interlaced. 1080i is interlaced, carrying 540 lines per field, so each refresh has the same number of lines as 540p.)
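To make the stretching concrete, here's a minimal sketch, assuming the usual ITU-R BT.601 approximations (a ~704-sample active width and the 10:11 / 40:33 pixel aspect ratios):

    from fractions import Fraction

    # One stored 480-line raster, two display shapes: only the pixel
    # aspect ratio (PAR) changes, never the line count.
    def display_aspect(active_width, lines, par):
        return Fraction(active_width, lines) * par

    print(display_aspect(704, 480, Fraction(10, 11)))  # 4/3  -> 4:3 TV
    print(display_aspect(704, 480, Fraction(40, 33)))  # 16/9 -> widescreen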
While for cinema you count the width: what's more commonly known as 1080p is just 2K in cinema terms, as it's 1920 (or, in the DCI spec, 2048) pixels wide. And that's why 4K is 4K, as cinema content is actually rendered 4096 pixels wide. In a cinema you can always add or remove height (just don't project onto that part of the screen; some cinemas even have screens that can change their height), while the width is fixed (you usually can't make the screen wider). Historically this also held for film: except for IMAX (70mm stock run horizontally, so the 70mm measures the frame's height), you measure the width of the stock, e.g. 8mm, 16mm, 35mm, 70mm, since the frame's other dimension can always grow along the length of the film.
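Written out as data (a rough summary; the TV dimensions are the common broadcast ones, the 2K/4K widths are from the DCI spec, and "consumer 4K" is the marketing borrow covered below):

    # TV names count lines (height); cinema names count width.
    tv_names     = {"480p": (720, 480), "720p": (1280, 720), "1080p": (1920, 1080)}
    cinema_names = {"2K": (2048, 1080), "4K": (4096, 2160)}   # DCI
    consumer     = {"UHD, marketed as 4K": (3840, 2160)}      # TV-land "4K"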
It's true that manufacturers quickly started calling their UHD content 4K for marketing reasons, but 4K doesn't have anything to do with your hypothesis.
Now, why was it used as a marketing term? When HD and Full HD rolled out for consumers, cinemas started upgrading their projectors as well, offering content in 4K and advertising it as such. Consumers got used to 4K meaning high resolution and high quality. And when consumer devices started reaching similar quality, the manufacturers naturally used the already established term instead of the official "UHD" branding.
At least they got away from the alphabet soup nonsense. Nobody wants to try to remember what WQHD or WQSXGA mean. I also hope that they eventually come back around to just listing the two dimensions again.
Related to this, they never did find a good term for 1080p; well, other than 1080p.
720p was branded HD and 4K was branded UHD, but 1080p never got a designation like that. This put Sky in an interesting position: they only recently added 1080p support to their NowTV streaming subscription service, as an optional upgrade to their usual 720p. Having long promoted their service as delivering HD, [0] they were clearly at a bit of a loss as to how to describe 1080p. They ended up calling it full HD. [1]
Actually, when the HD standards were introduced, 720p/i was officially designated as "HD Ready", and 1080p/i as "Full HD", with the respective acronyms HD and FHD. Ultra HD obviously got the UHD acronym later on.
Well today I learned. I always thought they called it 4K because doubling the dimensions of a rectangle causes the area to quadruple. No wonder "2K" display resolutions confuse me...
It's 4k because it's an HD resolution (1080) doubled horizontally and vertically.
If this is HD:
┏━━━━┓
┃ ┃
┗━━━━┛
Then 4 of these tiled become 4k:
┏━━━━┳━━━━┓
┃ ┃ ┃
┣━━━━╋━━━━┫
┃ ┃ ┃
┗━━━━┻━━━━┛
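In numbers, as a quick sanity check:

    hd = (1920, 1080)
    uhd = (2 * hd[0], 2 * hd[1])                  # double both dimensions
    print(uhd)                                    # (3840, 2160)
    print((uhd[0] * uhd[1]) // (hd[0] * hd[1]))   # 4 -> four HD panels of pixels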
I used to have 4 separate HD monitors laid out horizontally on my desk, and in terms of pixel real estate, a 4k effectively replaced them... until retina/high-density displays became affordable. Now my 5k monitor gives me effectively the real estate of 4 HD monitors, but nice and smooth.
640x480 (VGA), 800x600 (SVGA), etc. came from the computer industry, since they had 4:3 CRTs.
720p and 1080p came from the TV industry, since they always counted video in "lines" (NTSC is 525 lines).
4K came from the movie industry (the 2005 Digital Cinema specification), since movies have wildly different aspect ratios so counting the vertical doesn't make sense (specifically, it's very rare for movies to be 16:9!)