
100% agreed. Modern software has thrown out the design idioms that were so common throughout the '90s and early '00s in Windows (and Macintosh!) software. I wrote a blog post about what happened: https://loeber.substack.com/p/4-bring-back-idiomatic-design



The decline in UI drives me crazy, too. There's some irony in using a screenshot from MS Word in that blog post, given that back in the day Office was notorious for having subtly non-standard UI elements (menus which remove little-used items after a while, destroying muscle memory, custom file dialogs...) - though they were at least using familiar idioms, albeit slightly jarring implementations of them. Of course in typical MS fashion rather than fix it, they moved from subtly non-standard to overtly non-standard UI elements!

Personally I blame the shift to mobile as much as the shift to web for this - that's what drove the much-hated Windows 8 interface (leveraging the desktop computer monopoly to try and give the Windows Phone offerings a familiarity advantage), all coinciding with Gnome jumping the shark and Ubuntu shifting to something new. Ironic that at the end of the 'noughties' Apple were the pioneers of touchscreen UI on commodity devices, but seemed to be the only player who understood that the desktop was a completely separate space, and should remain so.


> There's some irony in using a screenshot from MS Word in that blog post, given that back in the day Office was notorious for having subtly non-standard UI elements (menus which remove little-used items after a while, destroying muscle memory, custom file dialogs...)

This is word/office 97 though, peak office, which came before all of that.

Menus removing "unused" items must be one of the worst UI decisions of all time. Imagine how many user stories, thousands of tests and interviews resulted in that abomination.

Sometimes you just can't beat common sense. The problem is knowing when you can and when you cannot.


> This is word/office 97 though, peak office, which came before all of that.

At the time, people complained about the menus popping out button-like borders on hover[1], which indeed no other menus did at the time, and about the un-buttony buttons on toolbars[2], which indeed directly contravened the Windows 95 HIG (unlike those in Word 95).

Not all moments of peak Office happened at the same time. Word 95 was the last to be mostly HIG-compliant. The macro functionality only matured with Word 97—but once you were trying to do moderately fancy things like use Microsoft Equation more than a couple of times, it crashed multiple times per hour (and of course ME’s typesetting was absolutely awful). That gradually improved until it became stable somewhere about 2003 (the weird blue UI) or 2007 (the OOXML/Ribbon release)—at the cost of no longer being usable from the rest of the system via OLE (IIRC).

What makes me genuinely conflicted is the contemporary rant against the Win95-style file dialogs[3]. I’m very used to them, to the point of finding it difficult to imagine anything else (I’ve encountered the vestiges of the old two-listboxes dialogs from Win95 on, but that’s all they were, vestiges). And yet I can agree with most of the criticisms! I just can’t see how to resolve them.

(Office never left those dialogs intact, though, not in any version, although they tried damn hard to make their custom versions look like business as usual.)

[1] http://hallofshame.gp.co.at/visual.html#VISUAL36

[2] http://hallofshame.gp.co.at/visual.html#VISUAL38

[3] http://hallofshame.gp.co.at/file95.html


> This is word/office 97 though, peak office, which came before all of that.

I believe that's Word 2002 in the article.


But these days macOS and iOS are as similar as ever. I’m actually hoping Apple releases an iPad that runs macOS or a MacBook with a touch screen. I think they will do one or both eventually.


It feels like you jumped past something that I remember being a big deal - The Ribbon. I remember when Office introduced it, and people hated it. It goes against your comment on using words instead of icons. But it seems like a useful solution for programs as feature lists grow and menus (and even sub-menus) become unwieldy, so I've made my peace with it.

Personally, I think the Windows OS itself is one of the most problematic offenders. How can you expect better from third party developers, when MS uses at least 3 different UI paradigms for system settings? And F1 doesn't bring up help - it brings up a web browser!


Since the ribbon arrived it has been difficult to find what you are looking for, which slows down use of the software. To this day I have not made my peace with it.


In Excel, I love the ribbon, as someone who looks to make use of some of the more advanced data manipulation and display features it has. If all I wanted to do was handle some basic lists with perhaps a formula here or there, perhaps I'd feel differently.

And this is why I DO feel differently about Word, which does take up way too much space with a ribbon full of tools that really aren't of interest. In Word, I'm usually writing something, and then spending a small percentage of the time changing formatting. Quite different from Excel, where it's more often than not working with large datasets pulled either from a CSV online or directly from a SQL server, and the various tools and functions are central to the task at hand.

I guess the point is, it really depends on the use case whether the ribbon makes sense, and it should be left to the user to decide between the old menu-based system and the new ribbon-based system based on how much time is spent composing vs. manipulating existing data. Right tool for the job.


I never thought about that Word vs Excel ribbon dichotomy before, but it makes sense and mirrors how I feel. The good news is we can hide it, at least. In whatever version I have, selecting text pops up a mini toolbar which usually has exactly what I need anyway.


Good OC. Thanks.

The Browser Software Era designs are idiomatic. They're based on (CD-ROM era) multimedia and adventure games, not desktop apps.

The earliest, most influential web designers had been doing multimedia. The earliest creation tools were tailored for them.

That aesthetic has been carried forward, for better or worse.


Multimedia and games were designed to be standalone, they interacted with themselves and that's it.

Browsers, OTOH, used to display content meant to be copied and pasted everywhere. Nowadays there are silos everywhere, and interoperability sucks.


Show me one site that still looks/feels like Flash.

(Actually, the more the merrier!)


Flash is such a terrible choice for this analogy. Probably the one thing Flash sites all had in common was that they had nothing in common.


> We are in an era of individually well-designed, useful web applications, and they’re all unique.

This sounds pretty great to me. If the applications are well designed, they can do their own thing. I don't see why we need to conform to a centralized idiom. It's great that you can immediately tell which app you're in just from a glance. And also it's more interesting when applications have their own character.

I would say idioms should maybe play a role at a company level, i.e. it makes sense that Google products will share design principles. But I wouldn't impose the same principles to all applications just so it will be slightly more familiar to some people.


> This sounds pretty great to me. If the applications are well designed, they can do their own thing.

They are almost never well designed. Devs, particularly those who get excited about “clever” UX, don’t care about usability.

> I don't see why we need to conform to a centralized idiom.

Because I don’t want to learn dozens of keyboard shortcuts to do the same thing just because a random dev thought it was clever to have their own thing. Because I don’t want to spend hours navigating “clever” menu layouts and figure out whether the settings I am after are behind a hamburger, in a sub-sub-submenu, in a contextual menu, or hidden behind arcane commands (hello, Chrome). I don’t want some stupid software forcing down on me metaphors from another stupid OS, either.

I do want text fields to support the same operations everywhere; I want menus to be used the same way as well, and these are the most egregious.

OS provided widgets offer baseline functionality the users expect to find. Re-implementing your own widgets is fucking stupid: you are spending a lot of time to re-invent the wheel, and in the end it does not even work.

It’s just like cryptography. Don’t roll your own. Your users will thank you.


> I don't see why we need to conform to a centralized idiom.

Exactly for the same reason why humanity has come up with an idea of standards. Same set of reasons I would even say.

> But I wouldn't impose the same principles to all applications just so it will be slightly more familiar to some people.

This is because design choices are more important to you than users' comfort and productivity. Some argue that it should be the other way around, though.


> same reason why humanity has come up with an idea of standards

and the same reasons explain why humanity has come up with the idea of breaking or ignoring bad standards

You can't resolve these types of issues in the abstract: some older design paradigms are better than what we have now, but plenty of awful decisions were "standardized" as well


That’s not “breaking”, that’s “ignorant disregard without sufficient understanding of what the standard actually is”.


Nope, "smart disregard for bad UI practices standardized by ignorant people of the past" is also a thing


Nope, it is not. The only reason you can say the things I quoted below is that you hate and disrespect your own users. You should not "innovate" just for the sake of innovation. Any change in the design and UX should be the least painful for the user. If you do it the other way around, it's nothing but narcissism.

> I don't see why we need to conform to a centralized idiom.

> But I wouldn't impose the same principles to all applications just so it will be slightly more familiar to some people.


Another reason is that you respect the users and want to offer them a superior UI even though something bad is more familiar

> You should not "innovate" just for the sake of innovation

But you can innovate to fix "bad UI practices standardized by ignorant people of the past"


Also you can reiterate the same line over and over again without thinking. Also works for some.



