Hacker News
The Decline of Usability (2020) (datagubbe.se)
60 points by zephyrfalcon 61 days ago | 45 comments



I was just setting up new phones for my parents last week. And seriously, y'all (making the software) have ruined computers.

Modern apps are so obnoxious, distracting, and unusable it's getting ridiculous. Tapping a verify link in an email would slide the email screen out, then slide in a browser, wait a second or two, then slide out the browser, then slide in the Play Store, then wait a moment, then slide in the app itself, which then had to update the screen after another second or two. Multiple screen changes flashing before my eyes made even me feel dizzy and confused. My mother raised her eyebrows and shook her head like she had just gotten lost.

Then once you get into the apps, there are so many distractions that make the UI unusable. "Hey did you know about this?" "Tap this to do this thing!" "Look over here!" Not to mention various notification and permission prompts. The permission prompts are especially concerning: there are now so many of them just to start using a messaging app, for example, that you train yourself to click through them instead of scrutinizing them.

We've essentially gotten rid of text labels on icons, making it anyone's guess as to what the icon with the 3 colored circles does, or an icon with a shape, two lines, and another shape that I can't even describe succinctly. So many gestures make navigating the software a guessing game. My poor mother was like, "So to go to the last app I do this" only to gesture "back" in the same app, "Oh, I meant this", only to bring up the app selector. Nothing is discoverable anymore. Those little "learn how to use your phone" tutorials do nothing for you in "the real world." My father, who used to teach Commodore computers to his community, could no more figure out how to use a phone than his students could a Commodore.

We were in much better times when computers came with 100 page manuals and you had to go to a class or ask a techie how to use them.

I'm not even going to talk about the privacy problems in all this software.

I had to keep apologizing to my parents for my industry making the experience so bad and downright confusing.


>Then once you get into the apps, there are so many distractions that make the UI unusable. "Hey did you know about this?" "Tap this to do this thing!" "Look over here!" not to mention various notification and permission prompts.

I feel a lot of modern UI/UX design isn't to serve the user, but rather the company itself. It's no surprise that it's unusable garbage because the user is merely a resource to exploit, not a human to deliver value to. See: Windows 11, smart TVs, and growth hacking.


Yes, it's strange: outside of a couple of really great features (maps navigation is what comes to mind), my mobile phone is more and more something I avoid rather than something I reach for to help with the routine of my life.

At this point, 90% of what I do on my phone is probably text messages and maps. The rest of it is too inconsistent, changes too often, and is too demanding. I have to spend so much time turning off notifications and permissions for apps that I almost never bother installing new ones anymore.


> "Hey did you know about this?" "Tap this to do this thing!" "Look over here!" not to mention various notification and permission prompts.

This kind of thing is a pet peeve of mine. I think those things are attempts to mitigate some of the fallout from the fact that UIs no longer think discoverability is an important thing. So, instead of the UI itself incorporating discoverability, we're presented with these nearly-useless nags.


That is in part because of the users, because so many of them don't want to explore the UI or try to figure out how it works. Instead, they just want to know what they have to click on to do the thing they want to do. I've tried to teach so many people how to use these things who just refuse to actually look around, and who look at me helplessly instead of reading what the screen says. Bear in mind, that is when the UI is made to be discoverable.

I just lump it in with all the other changes that signal that companies don't really want people to think about what they're doing on the devices anymore. They just want them to look where they want them to.


> because so many of them don't want to explore the UI or try to figure out how it works.

I've become one of them. It used to be that it was safe to experiment, because anything really damaging could be undone (which is an important part of discoverability). At some point, that stopped being a thing I could count on, so I've come to view experimentation as a risky thing.

But those nags don't solve the problem. They're really annoying, get in the way, and are rarely actually helpful. Nowadays, when they come up, I just close them without reading and get a bit irritated.


A huge problem with touch UIs is the lack of mouse-hover discoverability. At least with desktop applications that don't have labels on UI elements, usually a hover will raise a tooltip description.


Touchscreens are just incredibly restricted control interfaces in general

They're trying to replace physical buttons with actions like swipes and in my experience swipes, pinches, double taps, all that stuff only works some of the time

A right click on a physical mouse works every time


Fashion and special-snowflaking UIs have been a disaster for the human race.

FlatUI/minimalism has wrecked web usability for a decade now, making buttons look like tags look like inputs look like boxes, everything is a box. Very few people have been able to stand up against this trend without being ignored or shouted down.

My theory is that this is due to the confluence of two things:

First, purity spiraling in a western elite class, as happened with architecture (see "From Bauhaus to Our House" and the Yale Box, now we have the FlatUI Box).

Perhaps more importantly, this trend is cheap: you don't need skilled laborers to produce a Yale Box and you don't need skilled designers to produce a passable FlatUI, whereas other options take time & skill (read: money). This dovetails with the corporate ethos of cost minimization.

These two trends in harmonic resonance with one another make minimalism/flat the default mode, which forces designers into increasingly bizarre UX patterns to express themselves.

This is not unlike the post-modern movement in architecture.


There has been a dramatic shift in the way that we think about UIs over my lifetime. I want to blame my generation (and younger) who grew up with GUIs failing to learn the lessons of history, but I'm sure there's more to it than that.

In the 80s and 90s, there was a concerted, industry-wide effort to make computers easier to use. There was a large research effort into how to make UIs with a gentle learning curve, and this relied on making UIs that were consistent and easy to understand at a glance. People my age (older millennial) and younger, even people in the industry, often don't know this. We may have played around in DOS as youngsters, but by and large we grew up with GUIs. We didn't really see what came before, we were too young to appreciate the usability leap forward, and unless we have a particular interest, we don't know computer history.

We never learned the lesson that GUIs are all about usability. We saw GUIs evolve over our lifetimes and collectively came to the conclusion that GUIs are about fashion. Mac OS X was a better OS because look how slick it looks. Ooo, Vista has basically the same UI, but everything is glossy. Everything needs to have a hamburger menu now because menu bars are so last decade, and we don't want applications to look old.


Yeah, companies would get entire studies done to determine what sort of UI should be used, for what reasons, etc.

A few years ago I was working at a company that still used software originating in the 90s. This was in an industrial setting where all sorts of information is displayed to operators on clusters of screens. What amazed me the most was coming across a binder containing the 5th revision of a document that effectively described the entire UI language of that application. In detail, it covered all the elements, where they should be used, and where they shouldn't. Not only that: for each decision it had citations pointing to research by universities as well as by Microsoft and other software companies.

Basically, think the Material Design website (https://m3.material.io/components) but on steroids. Now, this was in a safety-critical environment where operators must not get confused. But it is telling that by the time I was there, most people seemed to have forgotten about that document and the design lessons gained over time. So at some point they needed a new bit of functionality, it got a random icon with no justification, and of course during testing it became very clear that it stood out for all the wrong reasons.

It does indeed look like there has been a shift in how UI is thought about. From where I am standing, the focus these days is on making it "feel right" first. In the early 2000s the joke was that you shouldn't have designers make interfaces, exemplified by the era's very pretty, elaborate Flash-driven websites that looked great but were a nightmare to navigate. It seems to me that in the modern era the designers have gained more ground.

Which in itself wouldn't be that bad, if they also had not thrown out all the usability lessons learned along the way.


Not sure if you've seen this already but you might like it: https://ics.uci.edu/~kobsa/courses/ICS104/course-notes/Micro...


Oh man, it has been a while since I have seen it. But yeah, I think that is at least one of the documents (or similar) that was cited as part of the background research for the document I mentioned.

What's telling is that this is the guide made available to software developers looking to create software for Windows. So this isn't some internal document either.

> This guide is intended for those who are designing and developing Windows-based software. It may also be appropriate for those interested in a better understanding of the Windows environment and the human-computer interface principles it supports.

This section in particular is interesting:

> # How to Apply the Guidelines

> This guide promotes visual and functional consistency within and across the Windows operating system. Although following these guidelines is encouraged, you are free to adopt the guidelines that best suit your software.

> However, you and your customers will benefit if, by following these guidelines, you enable users to transfer their skills and experience from one task to the next and to learn new tasks easily. The data-centered design environment begins to break down the lines between traditional application domains.

> Inconsistencies in the interface become more obvious and more distracting to users. Conversely, adhering to the design guidelines does not guarantee usability. The guidelines are valuable tools, but they must be combined with other factors as part of an effective software design process, such factors as application of design principles, task analysis, prototyping, and usability evaluation.

> You may extend these guidelines, provided that you do so in the spirit of the principles on which they are based, and maintain a reasonable level of consistency with the visual and behavioral aspects of the Windows interface. In general, avoid adding new elements or behaviors unless the interface does not otherwise support them. More importantly, avoid changing an existing behavior for common elements. A user builds up expectations about the workings of an interface

We are talking about a document that is over 300 pages long here. I suppose that if I had to let the cynic in me answer the question about what happened, the answer will probably be costs.


As someone with a CS background who specialized in human-computer interaction, I was always a bit disappointed by how little companies value good UI/UX. I was constantly told how easy and irrelevant things such as usability are.

You have to switch to software development to get any form of credibility, career development and good salary.


This article needs an update and more complaints. It's only gotten worse since. I so dearly miss actual Windows programs with sane user interface conventions... Oh look! Microsoft actually still documents good rules to follow but they don't follow any of them!!! https://learn.microsoft.com/en-us/windows/win32/appuistart/-...


Actually, there is an update (2023) here: https://www.datagubbe.se/usab2/


Good call! Thank you!


Yeah. Great timing for this HN submission, as I've just been doing a fair bit of nostalgia reading on the Windows 2000 UI. Apparently, some nice folks have extended the Win2k kernel to make it usable in 2024 (as I understand, this should run XP era apps and also maintained community forks of Firefox/Chrome): https://w2k.phreaknet.org/

Obviously, this is "fringe" stuff from many angles. But, after 15 years of hand-tailoring that perfect Linux desktop, I've found myself seriously trying out open source DOS systems as robust, "distraction free" environments for doing basic things. The Win2k UI creates similar itches. It was a damn good, really solid UI with very good defaults and just about the right amount of configurability; ditto for classic style XP with some tweaks.

A good quote from that linked Win2k site, explaining the "whys":

"These individuals have nothing against progress, but they do against progress for progress’ sake. /.../

Planned obsolescence is a big deal today — and so is the backlash against it. Products and software are no longer made to last. They're designed to be disposable, meant to be replaced or upgraded within a few years. Today, sustainability is more important than ever and newer versions of Windows are simply not sustainable anymore with their numerous useless flashy features and high system requirements. In addition, modern versions of Windows are hyper-dependent upon and expect an Internet connection. Everything from activation to updates is designed with that in mind.

So, where do older operating systems like Windows XP and Windows 2000 come in? Simple. They have everything we really need and expect from a computer today — and nothing we don't."

Obviously, I did research on getting that XP-era compatible Win2k kernel running on our new-ish Thinkpad, too. Most of my crucial apps would be there: stuff I've done audio montages and audio editing with professionally, IrfanView (no contender for this one on Linux!), MuPDF, a terminal, apparently also busybox-w32, Notepad, etc. But... the younger generation needs Windows 10 for gaming, so I might be too tired to fiddle with alternatives.

Echoing something I remember from an old HN thread also: despite being pretty much just another Geeky End-User, I, too, would gladly pay a decent price for a maintained, official Windows version without telemetry and with the Classic UI and just a handful of the most essential system applications. A blazing fast Classic Windows -- pretty sure this would make me truly loyal to Microsoft forever, despite 15-or-so years of daily Linux usage. (EDIT: So, yeah... this would be more or less that same maintained version of Win2k, with wireless and contemporary drivers.)

I feel silly asking this on HN, but as a dilettante, I can: why hasn't Microsoft actually implemented anything like this for us casual users? Desperate projects like that Win2k kernel modification or ReactOS sure must be fun to follow, but I suppose much more than a handful of people would be interested in this kind of official alternative.


> Missing Menu Bars

It's lucky that MacOS chose to fix its per-app menu bar at the top of the screen. As a result most apps still have a (mostly) usable one. Windows users are not so lucky.


That's one of the aspects of the Mac that drives me crazy. I hate it so much. You're right that it does resolve the issue of applications deciding to do them wrong, but for me, it's not worth the cost.


It made sense on the original Macintosh with its postcard-sized screen: the idea was that a short move of the mouse upward will always get you to the application menu instead of requiring you to find it on screen.


To anyone who is going to disagree with the article before having finished it: please do yourself a favor and read the "wrapping up" section before you reply. I feel like half of the responses I expect will be posted here are covered by it.

The author did a good job articulating something I have noticed as well.

My personal pet peeve which very much is related to this is readability. Typography and typesetting are not obscure crafts that are poorly understood. They have decades of research behind them and generally are not even that difficult to implement following some ground rules. Like using sans-serif fonts* for larger blocks of text, having a line length of around 60 characters, balancing line height and making sure the text contrasts well with the background.

Yet many people seem hell-bent on reinventing the wheel there as well. Resulting in a plethora of blogs, knowledge documents, etc. out there with valuable information in a less than ideal format.

Which is why I am happy to see that the person writing about the decline of usability has done all of these things right, which made this a very pleasant reading experience :)

As a last note, I don't think readability was always better in the past. Certainly not when you take personal websites into consideration (I still remember the starry sky background Geocities pages). But for a while it did seem that most people were in agreement over readability and most blogs, news articles, etc did follow the basic principles I outlined.

I, too, might be old and angry.

* No, monospaced fonts are not better for readability. They might look cooler on your tech blog, and many people who spend many hours in IDEs might not have an issue with them, but when you are targeting any audience slightly larger than developers you should reconsider. There is some nuance here: for people with dyslexia, monospaced fonts might actually work better. A serif font on a modern high-density display also does not impact readability; however, even today, not everyone will read your text on a high-density display. So overall, sans-serif is still the best choice to offer readability to the widest range of people.

Edit: For anyone who wants to know a bit more than just my unsourced statements. For a modern deep dive into fonts and readability, I recently came across this medium post which does a pretty good job tackling the font type: https://medium.com/@pvermaer/down-the-font-legibility-rabbit...
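The ~60-character measure from the comment above is easy to apply even outside a browser. Here is a toy sketch in Python (the function name and the 60-character default are my own illustration, not something from the thread or the article):

```python
import textwrap

def set_measure(text: str, measure: int = 60) -> str:
    """Re-wrap a block of body text so no line exceeds the given
    measure (characters per line), roughly the range typography
    guides recommend for comfortable reading."""
    # Collapse existing whitespace/line breaks into one paragraph,
    # then re-wrap it to the requested width.
    paragraph = " ".join(text.split())
    return "\n".join(textwrap.wrap(paragraph, width=measure))

sample = (
    "Typography and typesetting are not obscure crafts that are "
    "poorly understood. They have decades of research behind them."
)
wrapped = set_measure(sample)
```

The same idea applies on the web via a `max-width` on the text column; the point is simply that the measure is a number you pick deliberately, not whatever the viewport happens to be.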


Where in the title bar do you grab to move a window? Title bars are filled with utter crap nowadays.


Don't get me started. Putting crap in the title bar is one of the things that I curse about on a near-daily basis.


It's gotten worse since 2020. One of my most irritating examples is Chrome's changes in the last year or two to the bookmark manager.

To add a bookmark folder directly, like that's your only intended action, here is what you must do:

Click a tiny hamburger menu at the far top right of the browser. Hover over "Bookmarks and lists", which expands another menu. Then click on "Bookmark manager." That takes you to a screen with all your bookmarks. Excellent! Now I'm looking for a tiny + sign or some kind of obvious button where I can edit my folders. Nope. Screw you. It's in another tiny hamburger menu that's hard to see; click on that, and you can finally select "add new folder."

Is all that really necessary? Like, why? I'm well aware that you can add a folder from the "star" icon, but I don't always want to do that. This is a very obvious and common action that should take obvious and common steps to perform. This kind of thing is everywhere. Whoever the modern UI/UX people and product managers driving these decisions are, they have completely lost the plot.

Even something that should be stupid easy, like buying tickets to an effing baseball game, is impossibly obnoxious. Go to StubHub to buy a ticket. Cool, I can do it as a guest. Where are my tickets now? Oh, I have to use the StubHub app for (reasons) to do that. Have to use some kind of Google or Facebook account to access the app - fine. Whatever. You have me this far already and I already spent the money. View my tickets - oh, OK, I need to go back to my email (the one I used when I bought the tickets on StubHub as a guest, not necessarily the one I logged into the StubHub app with) to find the code that allows me to view my ticket, which is typically a QR code scanned at the gate. Great! Almost done. Nope. This QR code cannot be screenshotted because it has a rotating one-time token associated with it, so you've got to do this whole song and dance in line at the gate with an impatient crowd behind you.

How is a computer illiterate person supposed to navigate all of this? Technology is supposed to make things easier.


What you say is true but there is a keyboard shortcut which makes it all much easier: Ctrl+D


Of course, these standard and transferable keyboard shortcuts only seem to exist where a feature predates the current trend of garbage UI. I find that new features in software often don't even get a keyboard shortcut, or that the default set of shortcuts covers the entire keyboard so there's nowhere to bind anything else.


Ok but how bout on Mac? I've been using a computer since before I could walk and I don't even have the slightest clue how to do that. Maybe I'm old and this stuff isn't designed for me, but as of writing this post I don't know how I'd know what that hotkey is without googling it. It's not even discoverable via a pop up on hover like most hotkeys are! Atrocious design. Almost maliciously bad.


Imagine if every car had a different set of controls. Steering wheel, joysticks, rudder pedals, etc..

Used to be, the operating system controlled the title bars. You could see which window was active, and the behavior for moving, resizing, etc was consistent.

Add in classic menus vs. ribbons vs. hamburger menus vs. whatever wet dream some UX expert has imagined.

The inconsistencies in UIs are infuriating.


> Imagine if every car had a different set of controls. Steering wheel, joysticks, rudder pedals, etc..

Sadly the tech industry influence is bleeding into cars so this is no longer as far fetched as it seems.


TLDR: this is a 2020 era take on the need for a consistent design system for free software. Can't object to that, particularly.

However, I think usability as a whole is about to get very weird.

I've just done a workshop with a bunch of people. I recorded it, and want to write up some notes about what was discussed, perhaps make detailed notes about each person's perspective and contribution in order to understand how to best help them in the project we're discussing.

Expired: listen to a recording, write notes, maybe even transcribe some quotes, then look at my notes and make some summary notes.

Tired: Use some transcription software to automatically turn an audio recording into a diarised text output. Copy paste that into a document and then edit it down into summarised notes.

Wired: "Hey LLM, provide a diarised transcript of this audio file, labelling speakers based on the names they give in the introductions. Then provide me with a summary of key points discussed, and for each speaker provide a summary of each speaker's perspective and contributions. Provide me with a list of todos to address each action item, and also to plan a mitigation for each risk or concern raised throughout the meeting."

In the new World, I don't care about applications. I care about jobs to be done. Yes, there's a lot of absolute junk Kool-Aid all over X/Twitter and we're definitely in a hype bubble, but the point remains that the interface with the computer is about to change dramatically, and worrying about what items appear in a menu and how to move files between applications is going to look pretty quaint in a few years time.


> Yes, there's a lot of absolute junk Kool-Aid all over X/Twitter and we're definitely in a hype bubble, but the point remains that the interface with the computer is about to change dramatically, and worrying about what items appear in a menu and how to move files between applications is going to look pretty quaint in a few years time.

I feel like I've been hearing this every few years for a while now and it never arrives. Meanwhile desktop software usability keeps getting worse. It feels like the IT equivalent of the hyperloop: "Why bother building infrastructure when the next gadgetbahn[1] will arrive any day now and solve all our transportation problems?"

[1] https://www.youtube.com/watch?v=O1-32Y2wADo&pp=ygUKZ2FkZ2V0Y...


> the interface with the computer is about to change dramatically

I don't think this is true outside of certain niches.


Wait until everyone's contributions to the discussion are generated by LLMs as well. "Hey LLM, attend this zoom meeting for me, chime in with a few appropriate comments, and then send me a summary"

It will be LLMs talking to LLMs.


A little unrelated, but the information density in modern UIs is making me feel old. Was it always that way? Sometimes there's way too many links/too much content presented all at once.

I know people have been talking about the trend towards mobile for some time and how that's affected density. The other day I was browsing Blizzard's battle.net website to purchase the old school Diablo 1. I was expecting it to be the easiest thing in the world, but it actually took me a few minutes to figure out how to navigate their website. I'm not sure why...

The other example I was thinking of was Amazon. The home page is packed with product recommendations. I'm currently getting pet wellness product recommendations, even though I don't own any pets? If you accidentally hover your mouse over a menu bar, you get these giant drop-downs which cause the rest of the screen to darken. Some drop-downs are text (such as the one for your account in the top right). Others are just drop-downs covered with large icons. I think the drop-down for your account is egregious. 99% of the time I go to Amazon to buy a single product. Music Library? Start a Selling Account? Kindle Unlimited? Wtf? None of this is relevant to me.


I have the opposite problem with modern UIs. They tend to be too sparse in terms of information density.


I think the main problem is the lack of color and contrast. It makes it very difficult to present information in a way that makes sense. White space is one option to add structure to the information, but it's arguably not a very good one.


> I think the main problem is the lack of color and contrast.

That's a real issue, yes. Modern UIs seem to be actively antagonistic toward users in many ways, and that's certainly one of them.


Yes, it was always quite dense, but screens were physically smaller and pixels were physically larger, so the overall amount of information displayed on a computer monitor has increased a fair amount.


It's fake information density. We have so many pixels available but it's just not used correctly anymore.


Shitty UI? That's what they pay me at work to use.


I've been saying this for years.

In the late 90s/early 2000s, we did user testing to figure out how to make software easy to use. We learned that discoverability was king.

In the last 10 years, we've been throwing it all away in favor of UIs that look "sleek" and "clean".

Functionality is hidden and we got rid of discoverability. Everything is flat, so it's not obvious what you can interact with. What's simply a status icon in one program could be a button that leads you to more information in another.

I really REALLY don't understand the fascination with flatness. Flat is ugly, boring, plain, and hides interactability.

We reached a period where we have more graphics processing capability than ever before, and we made everything simple flat shapes.

Windows 7 with the Classic theme was peak UI and you'll never change my mind.


I am convinced that this is happening because UI/UX designers need to somehow justify their salaries. Working in software development I consistently see changes _just for the sake of them_. It's quite infuriating.


For everyone who jumps in to comment despite not reading the article: a more appropriate title might be “the death of desktop usability” - this is about desktop operating systems, and specifically carves out mobile UIs from the argument.


Mobile UIs are terrible in their own way, too.



