Ask Slashdot: A Point of Contention - Modern User Interfaces
Reader Artem Tashkinov writes: Here are the staples of the modern user interface (which, to varying degrees, apply to the modern web and most operating systems such as Windows 10, iOS and even Android):
- Too much white space, huge margins, too little information
- Text is indistinguishable from controls
- Text in full-CAPS
- Certain controls cannot be easily understood (like on/off states for check boxes or elements like tabs)
- Everything presented in shades of gray or using a severely and artificially limited palette
- Often awful fonts suitable only for HiDPI devices (Windows 10 modern apps are a prime example)
- Cannot be controlled by keyboard
- Very little customizability if any
How would Slashdotters explain the proliferation and existence of such unusable user interfaces and design choices? And also, do you agree?
Easy answer (Score:5, Insightful)
Phones and tablets.
Re:Easy answer (Score:5, Insightful)
Bingo, we have a winner. At work, if we have a choice, the developers use Linux and customize the UI the way they want it... usually Gnome 2, KDE (like Windows, with the menu and apps pinned to the bottom) or a similar interface. I realize all the hipsters think this minimalist UI with a very small, dull color palette is cool, but it isn't. It's very limiting and very boring, and 99% of users are not hipsters... so we are not impressed.
Make UIs Great Again!!!!
Re:Easy answer (Score:4, Interesting)
But apparently the 'designers' are too lazy or clueless to a) know the difference, and b) build two different interfaces.
Re:Easy answer (Score:4, Interesting)
I don't think it's laziness, more like cluelessness. There was a big push for several years after smartphones and tablets took off to merge UIs across platforms. I suppose part of the justification was to try to draw in developers from both the smart device and desktop worlds to do more cross-platform work, and part of it was likely just to simplify (read: make less expensive) maintaining and developing features where an OS might be on everything from multiple screen desktops to 5" phones.
At the end of the day, I doubt even a smart UI abstraction layer will ever make these variant UI scenarios fit under one hood. Many web developers have known this for a while, which is why you often have mobile and desktop versions of sites, but I still think Microsoft and Apple need to fully accept the reality that what works on little may look like shit on big (and vice versa, as my experience with an 8" Windows 10 tablet, even in tablet mode, informs me).
Re:Easy answer (Score:5, Insightful)
It all went to hell from there...
Re:Easy answer (Score:5, Insightful)
Funny, but I actually agree with this. I still hate the ribbon. A menu is a reasonably nicely categorised index of functionality - easy to read, properly aligned text, room for descriptiveness as required, sub-categories where appropriate (but highlighted in a consistent manner), and it has hints like underlines for keyboard shortcuts. And when not in use it neatly vanishes. The ribbon takes the menu, hurls it across the screen with a bunch of apparently random icons with no thought to readability, alignment, sorting or descriptiveness, actively hides some information in a non-standard way, and thoroughly confuses the distinction between a toolbar (a small set of tools kept visible for ease of access) and a menu.
I really, really wish the ribbon would just go away.
Not just "mobile first", but lazy/cheap web devs (Score:5, Insightful)
"Mobile first" is partly to blame, but lazy/cheap teams are more so.
Take a look at what's popular in trendy web app design today: flat everything, big rectangular colour blocks, lines and rounded corners, text. Look at the boxy, side-by-side layouts, almost invariably collapsing into increasingly linear formats for narrower screens until it's just a single column.
Now look at what you can do easily and portably with CSS. In particular, look at what you can achieve by just slapping Bootstrap or the like on your site, without spending much time or money considering the design and layout, and certainly without hiring any sort of designer or, $DEITY forbid, a digital artist to create custom graphics that fit the style of your product/service and build any sort of distinctive branding.
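For what it's worth, the generic flat look really is that cheap to produce, which goes a long way toward explaining its popularity. A rough sketch of how little CSS it takes (all class names and values here are invented for illustration, not taken from Bootstrap or any real site):

```css
/* The whole "design": one flat brand colour, hairline borders,
   rounded corners, and generous padding. No artwork required. */
.card {
  background: #fff;
  border: 1px solid #e0e0e0;  /* the obligatory barely-visible hairline */
  border-radius: 4px;
  padding: 2rem;              /* whitespace instead of visual structure */
}
.button-primary {
  background: #2196f3;        /* a single flat accent colour */
  color: #fff;
  border: none;
  border-radius: 4px;
  padding: 0.75rem 1.5rem;
}
```

Ten minutes of work, no designer, no artist, and the result is indistinguishable from thousands of other sites.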
There was, at the time, some justification for this in that downloading lots of large images on the mobile networks of a few years ago really could significantly slow down loading a page, with resulting poor user experience and app/site performance. But for most of us, our target markets are on faster networks today, and CDNs are much more developed now as well. And certainly you don't get any allowance for this if your site includes megabytes of JS frameworks, ad content, or auto-playing hero video.
Likewise, there is some justification for minimal UI chrome on small screen devices where every pixel is precious, but you don't get any allowance for this if you replace a simple hairline with half an inch of whitespace because your visual style is so generic and unguided that the user can't actually tell how the UI works otherwise.
Frankly, Microsoft, Google and Apple are amateurs when it comes to nerfing design by being flat and bland. Web developers have been moving in this direction for at least as long as smartphones and tablets have been around, and people with actual UI design skills have been criticising them and pointing out the obvious and horrible usability flaws for just as long.
Re:Easy answer (Score:4, Insightful)
Properly building two interfaces requires way too much effort. It is much easier if you stop trying to make all things for all people on every device. The solution is much simpler than these convoluted designs.
There are three parts to every design, each separated from the others: 1) Content, 2) Design, 3) Structure.
Content: The actual bits that matter. Articles, pictures, code snips, whatever it is that "counts".
Design: The flowery bits that distinguish content from other content. Fonts, styles, artwork and logos. These are the bits that identify the content's brand.
Structure: This is how the content and design bits are displayed. Two Columns or Three. Header above, in the middle, or below. Left / right. "Layout"
If you break the UI into these three aspects, it becomes much easier to modify, replace or customize. You can skin the layout to make it look unique, and you can adjust the structure to work better for different workflows (small, medium or large screens; programmers vs. graphic designers).
The problem is, we have people trying to shoehorn touch interfaces onto desktops that don't need touch. You have people trying to make something look good on small screens, which on large screens gives way too much "white space" or ends up looking "cluttered".
The reality is that everyone's needs are different, and the whole "one size fits all" thing doesn't work. You end up making very few people really happy in the process.
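The three-way split above maps fairly directly onto how the web stack was supposed to work anyway: content in the markup, design and structure in stylesheets. A minimal sketch of keeping the structure layer separate from the design layer, so either can be swapped without touching the content (selectors and values are hypothetical):

```css
/* Design: the "flowery bits" -- fonts, colours, branding. */
.article { font-family: Georgia, serif; color: #222; }
.article h1 { color: #8b0000; }

/* Structure: layout only. Adjusted per device class without
   touching the content or the design rules above. */
.page { display: grid; grid-template-columns: 1fr; }  /* small screens: one column */
@media (min-width: 900px) {
  .page { grid-template-columns: 2fr 1fr; }           /* large screens: two columns */
}
```

Because the three concerns live in separate rules, a small-screen workflow and a large-screen workflow get different structure while sharing the same content and brand.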
Re: (Score:3)
I agree with much of what you say, particularly the inappropriateness of trying to shoehorn interfaces from one type of device into another type of device with very different attributes. However, I also think your argument as written is fatally flawed, because content, design and structure aren't really independent.
"A picture is worth a thousand words" is as true as ever. The difference in effectiveness between a table of raw data and a standard but well-chosen chart can be dramatic. Sometimes the differenc
Re: (Score:3)
When they bring out a new model of car they don't mess with the pedals and steering wheel, because that would be stupid. About as stupid as changing an ingrained UI just to make it "NEW!!!" Almost as bad as the use of the "white it out and spread it out" interface in Windows is that so many websites are now "updating" their
Re:Missing features (Score:5, Informative)
The automobile has gone through quite a few control redesigns, and they are continuing. If you jumped into a Model T (maybe only the early ones), you'd find it hard to figure out. Besides the controls that have since been automated away (choke and ignition advance), the parking brake and ignition advance were operated by your left hand, the throttle by your right hand, and the gears by the pedals, with the brake being the right pedal.
The steering wheel started out as a tiller; the wheel was introduced in America as late as 1899. Since then, various controls have migrated to the wheel or right beside it. The turn signal stalk, operated by your left hand, has acquired more and more functionality, such as operating the lights, wipers and high beams. On the right, there was the gear shifter for quite a while before it mostly migrated to the floor. And then there are all the various controls found on a modern steering wheel; even my old truck has the cruise control buttons on the wheel. The shifter pattern has also changed at times. I had an early 5-speed where reverse was where 1st usually is.
Speaking of my 25-year-old truck: while most of the pedals are standard, on the left there's the parking brake release, and the high-low headlight dimmer button is on the floor. The turn signal only operates the turn signals; a knob on the dash pulls out to turn on the lights and turns to dim the dash lights and turn on the interior light. The wiper switch is beside it: turn one way for normal wiper operation, further for high speed, turn the other way for intermittent operation, push to squirt cleaner (you still have to turn the wipers on manually).
Another set of controls that seemed somewhat standardized for a long time and now are in flux are the climate controls and radio/sound system where automakers keep screwing around with stupid touch controls. Stupid due to breaking the paradigm that the driver should be able to operate everything by feel while watching the road.
It took close to 50 years to standardize just the pedals in the car UI, while the modern computer UI is at most 30 years old. Hopefully in another 100 years things will have mostly settled down, but as the automobile has shown, new tech such as touch screens still puts the basic interface into flux, often with stupid design decisions such as trading easy-to-feel buttons for hard-to-use, changeable touch screens.
Re:Missing features (Score:4, Informative)
And touch controls are stupid because knobs and buttons allow you to rest your hand on them while you use them. This means your hand does not leave the control when you hit a bump in the road. With touch controls, you have to keep your hand floating in front of the screen, where every bump and jiggle causes it to shake around relative to the screen. It's actually worse than just having to take your eyes off the road to use them. You also have to concentrate on keeping your hand aligned with the screen while the car bumps along.
Re: (Score:3)
Some of us hipsters agree with many of the complaints in the summary.
I like unobtrusive, post-shiny user interfaces. I really prefer flat UIs. Still, when z-order is a fundamental feature of a UI (a windowed desktop), it makes sense to provide an intuitive mechanism for conveying z-order information (e.g. shadows, highlighting focused windows).
Too much white space, huge margins, too little information
This is true for UIs whose purpose is to disseminate information. Charts, graphics, grids, and such things need to give the user more information while requiring fewer clic
Limited colours and flat look are the best though. (Score:3)
I actually really like most of Material Design. I often have to design HMI displays (user interfaces for industrial automation). There are good reasons for much of the design:
* colours should be limited and subdued for user interface elements so as to focus attention on content. Bright colours and animation are intended to call attention to important information.
* textures, gradients, transparency and drop shadow effects for the sake of visual flare cause visual confusion and eyestrain. Important elements g
Re:Easy answer (Score:4, Insightful)
Agree.
Over the past year I've used Mac OS X on my laptop for the first time, and I find it much less useful, and frankly much less user-friendly, than Gnome 3 (and even Gnome 3 hides too much information, because it assumes its users are technophobes).
One can understand Microsoft and Apple designing user interfaces primarily for technophobes, because in the modern world the majority of their users are people for whom the full power of a computer system is too complex for them to understand, much less use; and, seeing that they have in effect a duopoly, the fact that their more technically able users are not well served by their user interfaces doesn't matter, because there aren't enough of us to be a significant market, and most of us will be told what to use at work in any case.
But I really don't understand the Gnome designers' reasons for hiding so much, for making even moderately technical things so awkward. In practice, almost everyone who chooses to use Gnome is a geek. Having said that, if it really annoyed me I could either switch to something else or get under the hood and modify it, and I don't.
For me, Gnome 3 works, with niggles. Mac OS X is really annoying, but I can use it. Windows 7 is tolerable. Windows 10? Let's just not go there.
Oh lord, the pain (Score:5, Insightful)
Some of this is web design (I use the word "design" very loosely) and some is application design:
o the "designer" mindset has gifted us with extremely low-contrast backdrops and fonts - STOP THAT
o bloody pop-up/over dialogs that were not asked for are constantly used - THIS IS HOW TO MAKE ME GO AWAY
o menus drop without being requested because mouse went over them - WAIT FOR A BLOODY CLICK!
o Videos autoplay just because I've arrived, or because the mouse pointer went over them. Ever think *I* might want to control what damned noise comes out of my computer, or what data I want to stream on my phone? You should. Because while I'm desperately trying to figure out how to shut up / stop your video abortion, I am hating on you and everything you represent, and vowing to NEVER come back to your site, which I promptly implement via my hosts file because you SUCK.
o Do NOT change the web or application UI: NEVER make a modal UI. Present a consistent interface that can be learned and incorporated into muscle memory. Enable/disable elements as appropriate. IOW, if a document isn't NEW or Loaded, Save should be disabled - not GONE. This is so everything in the interface remains where it was. We want to work, not read your damn interface over and over and over and over just to see where we're at.
o Make ALL keyboard commands configurable. In some apps, some of the things I do most often have no shortcuts and no way to add one. How annoying. How stupid.
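The "disabled, not gone" rule above is trivial to honour in a web UI, for what it's worth: style the unavailable state instead of removing the element, so every control stays where muscle memory expects it. A sketch (selectors are generic, not from any particular framework):

```css
/* Keep unavailable controls in place; just signal their state. */
button:disabled {
  opacity: 0.4;          /* visibly greyed out... */
  cursor: not-allowed;   /* ...and obviously non-interactive */
}

/* The anti-pattern complained about above: the control vanishes
   and the whole layout reflows around the gap. Don't do this. */
/* button.unavailable { display: none; } */
```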
I swear, there are days when I'd like to hunt down these so-called "designers" and yell at them until my voice gave out.
All of the above is effete nonsense that designers engage in an attempt (which is actually abject failure) to justify their title; stop all that, and just do it right. Don't even try to be "fancy" unless you're writing a game.
Also, if you say "UX", I just want you to know you've made me work to suppress an urge to slap your face. Hard.
Re:Oh lord, the pain (Score:4, Insightful)
Certainly doesn't stop them from transparently redirecting twice, so that when you hit the back button you just go back to the redirect page, which then puts you back on the page you were trying to leave. You have to either hit back two (or sometimes even more) times very quickly, or right-click on the back button and choose the page you want to go back to. It's almost as annoying as not being able to hit back at all.
This, by the way, is also a side effect of SSO where you are redirected through the authentication system before arriving at the page. It's pretty aggravating.
Re:Oh lord, the pain (Score:4, Insightful)
But they are UX. I don't know what you have against calling it that: UI is how it looks (and in UI the look is functional, yes). Behaviour such as popups auto-opening as soon as you open a page is clearly UX, because it has nothing to do with the design of how it looks and everything to do with how it behaves.
Nonsense. It's all UI. A command line is a UI. My car's steering wheel and pedals are a UI. A remote control is a UI. A UI where things slide around and hide themselves and whatnot is still a UI. I'm the user. It's the interface.
Re: (Score:3)
Another answer is: giving "style" more preference than "usability". In no less than TWO of my streaming video apps, I've seen progress bars implemented as light grey on white, making them more difficult to distinguish, but looking much more sophisticated than using some "garish", easy to see color scheme like, say, green on white.
Amazon recently re-designed their video streaming app on Xbox One, and while the rest of the app seems reasonably well thought out, they screwed up their transport controls. If y
Re:Easy answer (Score:5, Informative)
I have used numerous candy bar and flip phones. I used an Android phone. When I was handed an iPhone to do something, I was absolutely baffled at how to do certain basic operations. I would even consider that this is because I could be an ignorant idiot, but I don't think that is so. I could make certain fits of progress, but then get stuck at some basic operation. (I don't remember the details; it was a few years ago.)
I'm sure I could learn how to use an iPhone / iPad just fine. Look at some of the things I have had to learn. Back in the day you had to memorize a stack of computer manuals that you could not remove from the computer room because they were physically bolted to a table. And it was uphill both ways. I practically brain-downloaded the entire Common Lisp: The Language (1st edition, and partially the 2nd) in the very early 1990s.
What bothered me was that I was a huge Apple fanboy back in the day. Apple was all about user friendly. Human Interface. Things should be intuitive. What you can do should be directly recognizable from what you can see. Even if what you can see is a control that reveals more possible operations. There weren't hidden gestures. Magic handshakes. Etc.
That's just one person's experience. It is not a generalization to everyone. But you did ask.
Re: (Score:3)
But why should you have to? It's sort of like the proliferation of different single-knob bathroom shower temperature/volume controls I've seen in the past 50 years. The best of them work no better than the hot and cold knobs of my distant youth, and many don't work nearly that well. What's the point?
It's sort of like emacs. Lots of nifty stuff there, but learning to use it is such a monumental pain that most of us don't bother.
Re:Easy answer (Score:5, Insightful)
A couple decades ago there seemed to be a much more rational UX philosophy where controls were obviously controls, text was obviously text, window frames and borders were -good- things because they help the user's mental model of the UI match the software, and on-screen affordances were designed to give the user a clue as to what does what. We've gone backwards.
Re: (Score:2)
I have noticed this as well.
Also, visit apple.com to see the current state of "modern" web page layout and design, which is inevitably copied by every other web developer.
Not only (Score:5, Insightful)
It's arrogant designers who think they know better than the generation before, who want to be seen as different and "edgy" and "new", and so chuck out all the lessons learned and fuck things up royally. So we end up with an OS in 2017 that looks more primitive than Win 3.0.
Re:Not only (Score:5, Interesting)
This
As part owner and lead engineer and developer for an online GPS tracking platform I experience this on a weekly basis.
Just fired a guy 1/3 my age and with 1/10 my experience for telling me I am too old school and think I know everything.
Fucking guy insisted on using microscopic fonts and all grays with almost zero contrast ratio.
I could not even read that shit on a 28" monitor.
Re: (Score:3)
So we end up with an OS in 2017 that looks more primitive than Win3.0.
This is very much to the point.
In fact, I think someone needs to make a list of UI features that were in Windows 3 and Windows 2000/XP/7 (whichever we think is most usable), and for newer interfaces like Windows 10, count what percentage of those UI features still exist. Based on that, calculate an "age" for the new UI, i.e. "$OS has regressed to a 1997 level of interface".
Re:Easy answer (Score:5, Insightful)
Studies done over 100 years ago found that the best line length for human reading is around 4 inches at most. The extra width that modern screens provide doesn't give much benefit, but at least with a tablet it's easy to rotate to portrait mode, where the added vertical space means less scrolling. Otherwise there aren't a lot of useful things to do with a UI other than add more tool palettes, but for a non-professional tool it's typically better to avoid throwing too much at users, so we've got all this extra space that provides no benefit. Websites fill the void by throwing in a side column of ads, but that's worse than just empty space as far as I'm concerned.
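That roughly-4-inch optimum is easy to respect on the web without resorting to a column of ads, for whatever it's worth. The `ch` unit sizes against the font itself, so one rule caps the measure on any screen (the selector is hypothetical):

```css
/* Cap the line length of body text regardless of screen width.
   65ch is roughly 65 characters per line, within the 45-75 range
   usually cited as comfortable for reading. */
article p {
  max-width: 65ch;
  margin-left: auto;   /* centre the column instead of pinning */
  margin-right: auto;  /* it to one edge of a wide screen */
}
```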
The touch model of phones and tablets also makes it more complicated to have a universal UI. Web pages with context menus, or anything that interacts with a mouse hover, are horribly clunky on touch screens, and optimizing for different platforms is often time-consuming or doesn't even make business sense, depending on how much traffic you get from each platform. The same goes for applications that could run on either a tablet or a PC: the interaction models are different enough that a one-size-fits-all approach often degrades the experience for both sets of users. Using an application with the big buttons necessary for touch targets with a mouse and keyboard feels like the UI has wasted a lot of space, and trying to touch small targets designed for mouse use can be exceptionally frustrating.
Re: (Score:3)
So WTF is the word for the ?????'s that invented auto-maximising? I mean, seriously, you are comparing two documents, side by side (or cutting and pasting) - then one of them touches the edge of the screen, and BANG you have only one!
There has to be NO possible use case for this feature other than to make the user select an alternative UI!
Re:Easy answer (Score:5, Interesting)
I can explain the proliferation of unusable user interfaces in two words: Graphic Artists
I saw this trend start in the 1980s. We were designing a new version of a successful Macintosh product and were working on the user interface. The graphic designers could make things look good, but had no grasp of principles. The big eye-opener for all of the developers (but none of the graphic artists) came when an artist, describing an operation, indicated that a certain button did something very different from what it had been described as doing earlier. Something unworkable. Something that revealed the entire mindset was about how good it looks aesthetically.
In the ensuing discussion we recognized how a lot of consumer electronics at that point (late 1980s) looked fantastic on the shelf but had horrible user interfaces.
Back in the day Apple had Human Interface Guidelines. And I understand that Microsoft did too.
Today all of that has gone out the window. I'll give just one example: Google's Material Design. Not that I'm criticizing it; just criticizing the NAME. The name screams that it is all about the aesthetics and not how well it interacts with human beings.
And we wonder why things have such badly thought-out UIs. You have to start with basic principles. Get a good book like The Design of Everyday Things. It explains the user interfaces of things like door handles, faucets, and things you would never think about. It describes a lot of principles you wouldn't otherwise notice, yet suddenly recognize. Once you read the book, you can answer what an affordance is when designing a UI.
Re:Easy answer (Score:5, Informative)
IBM had "Common User Access" (CUA), and Microsoft had "Consistent User Interface" (CUI) guidelines, which were roughly comparable to Apple's. Following those guidelines might not be as visually attractive as some of the crap being designed today, but at least it meant that people could get acclimated to your product quickly and with a minimum of confusion. In the world of UIs today, there's way too much frosting and not nearly enough cake.
Re:Easy answer (Score:4, Insightful)
IBM had "Common User Access" (CUA), and Microsoft had "Consistent User Interface" (CUI) guidelines, which were roughly comparable to Apple's.
IBM's standard could only have come from IBM. Save was F12, Save As was Shift+F12, and Print was (IIRC) Ctrl+Shift+F12. Cut/Copy/Paste? Shift+Del/Ctrl+Insert/Shift+Insert. Arrgh.
When Microsoft was trying to be a corporate partner of IBM, they followed the above standard for a while... and then they rebelled and implemented Ctrl+S for Save, Ctrl+P for Print, and Ctrl+X/Ctrl+C/Ctrl+V for Cut/Copy/Paste. And left the CUA ones working because why not. I haven't checked but I imagine the CUA ones still work today; it's not like the UI designers are falling all over themselves wanting to use Ctrl+Shift+F12 or Shift+Del for anything.
In the world of UIs today, there's way too much frosting and not nearly enough cake.
I agree completely.
CUA (Score:3)
Whatever. I find the CUA keys for copying and pasting more memorable than the Microsoft ones. And I feel bad when they are not implemented.
Re: (Score:3, Informative)
Today all of that has gone out the window. I'll just give one example. Google's Material Design. Not that I'm criticizing it. But just criticizing the NAME. The name screams it is all about the aesthetics and not how well it interacts with human beings.
Ah. Well, then, you might want to read more about Material Design than the name. It actually has quite a bit about human interactions. Even if it were just about aesthetics, a lot of visual design is about how humans interact with colors, shapes, fonts. No visual designer I've ever worked with picks colors purely because they like that shade of blue.
Re: (Score:3)
Re: (Score:3)
To me, anyway, "form follows function" does not mean that beauty is an afterthought, but rather that properly executed functionality is aesthetically pleasing.
Re: (Score:3)
Depends on the perspective.
It's a lot like dating. The flashy appearance makes you drool so you buy in. It's only after you've had time to get over the initial euphoria that you realize exactly what you got yourself into...
The people who drive businesses want your buy-in; it's their sole reason for existing. They don't care if the product is actually any good.
White space (Score:4, Insightful)
On web pages, at least, the excessive white space is an obnoxious side-effect of current "responsive design" practices.
Re: (Score:2)
You mean on paper? Huh... why would someone do that?
Re:White space (Score:5, Funny)
You mean on paper? Huh... why would someone do that?
Looks like you have a promising future in web design!
Re: (Score:2)
This is because most people aren't designing for paper; if they were, they would include stylesheets for print media. People who print out websites are a small percentage of users. I own a printer, and the last time I used it to print something other than a return label for Amazon was years ago...
Re:White space (Score:4, Interesting)
On web pages, at least, the excessive white space is an obnoxious side-effect of current "responsive design" practices.
More specifically, it seems that the idea that 'content is like water' results in having just enough content to fill the small screen of a mobile device, and then presenting that same content on a larger screen by introducing huge amounts of white space to pad it out.
It should have been glaringly obvious that this was going to be the result from looking at the pic on wikipedia:
https://en.wikipedia.org/wiki/... [wikipedia.org]
How could designers not have seen this coming?
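"Content is like water" only produces oceans of padding if the large-screen layout is just the phone layout stretched. The alternative is to let extra width carry more information instead of more whitespace, e.g. by adding columns at wider breakpoints (a sketch with made-up selectors and breakpoints):

```css
/* Phone: a single column, as designed. */
.items {
  display: grid;
  grid-template-columns: 1fr;
  gap: 1rem;
}

/* Wider screens: show more items side by side rather than
   centring one narrow column in a sea of white space. */
@media (min-width: 700px)  { .items { grid-template-columns: repeat(2, 1fr); } }
@media (min-width: 1100px) { .items { grid-template-columns: repeat(3, 1fr); } }
```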
Re: (Score:3)
Agreed. I'm fighting^H^H^H^H working with a web designer on this point right now.
"Responsive" doesn't mean take a design and make it work on all devices, it means change the design so it is optimal on (ideally) all devices.
Re:White space (Score:5, Insightful)
"Responsive" doesn't mean take a design and make it work on all devices, it means
Unfortunately, that IS what the term currently means among that group. They generally (and wrongly) believe they control all aspects of the web page display and that all devices are equally powerful and can run an unbounded amount of scripting; they often see no difference between a picture of text and actual text, and don't bother to learn anything about the medium they are designing for.
Aside: more than once I've had to convince a web designer that their pictures of text were the biggest reason things weren't showing up in search engines; they kept claiming the hidden meta tags, text recognition, and image search would handle all that. Frighteningly, some were never convinced, even after I showed them with Google's own tools how Google interpreted their pages. Some were absolutely convinced that Google reads all text in all images and indexes pages based on image content. They could not fathom that there is a difference between text and fancy-rendered images of text.
Many wrongly assume the web browser displays the same thing on all screens, no matter what. Often they design for a few patterns they think are common, 1024x768 or 1080p, and try to force that on everyone else.
Got a Super HD display showing 7680x4320? Too bad, we'll just upscale the fonts and add some whitespace.
Got an old smartphone with a 480x640 portrait screen? We'll downscale and do an ENORMOUS amount of JavaScript processing on these devices least suited for the processing.
It seems these are the same designers with the first-world problems that their disposable $800 smartphone is more than 18 months old and their $2000 MacBook is more than three years old and ready for replacement.
Re: (Score:3)
Then it's poor responsive design.
Seriously, there is a limit to the width of a column of text that is comfortable to read, so for continuous text on a large screen there may be reasons for having large amounts of whitespace. Again, for continuous text, having a proportion of white space around the text is easier on the eyes. There can be good ergonomic reasons for using significant whitespace in a design.
Good responsive design is hard; to have the same page layout on a two inch wide mobile phone scre
Re: (Score:3)
On the PC if I manage to hit that limit -- and currently I'm not at that limit with a large widescreen monitor -- I can resize my window to something narrower.
I certainly won't hit that limit on my phone or tablet, and if I did, I could rotate to portrait mode.
Don't take away my choices. Just because one person happens to prefer a width doesn't mean everyone does. I hate the news sites that give you a fixed panel a
Re: (Score:3)
Then it's poor responsive design.
The problem is that roughly 90% of all the "responsive design" sites I have encountered have been poor. At some point, it becomes reasonable to say that the problem is responsive design itself. If the majority of implementations of something cannot get it right, perhaps the problem is the something.
Re: (Score:2)
"Responsive" may work on a screen somewhere, but it sure is not a Samsung one, or a Cyanogen one - I have tablets and phones.
I don't think there is much wrong with LXDE, and I find MATE quite to my taste - but my BT modem interface is crap with both (and with Gnome 3/Unity - but I find those painful anyway).
I had a blog that Google made me convert to "responsive", but I was then unable to read it myself, so I abandoned it.
Re:White space (Score:4, Insightful)
It's expensive to create 3 different interfaces
Then don't. That is foolish, yet it is common among people who wrongly believe they have control over how a web page looks.
One premise of the markup language was that all rendering would be agnostic of the display. The web was not meant to be, and should not be treated as, a pixel-perfect medium.
Yet that is exactly what most "responsive" systems are trying to do. Enormous amounts of calculations to figure out how to precisely organize the display, doing the most processing on the mobile devices least capable of doing it.
Web designers need to let go of their fascination with precisely scripted layouts. Let the browser handle it. Whether the browser is on a 480x640 phone or a 7680x4320 ultra-high-density monitor, designers should allow the web browser to do what it was designed for rather than jumping through enormous hurdles to force it into the web designer's vision -- which is usually limited to a 1024x768 or 1280x720 design.
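To make the parent's point concrete: a sketch of the "let the browser handle it" approach can be nothing more than a few lines of CSS, with no layout scripting at all (the `.article` class name here is hypothetical):

```css
/* Cap the text column at a readable measure; the browser's own
   layout engine reflows it for any screen, from a 480x640 phone
   to a 7680x4320 monitor, with no JavaScript involved. */
.article {
  max-width: 70ch;      /* roughly 70 characters per line */
  margin-left: auto;    /* center the column on wide screens */
  margin-right: auto;
  padding: 0 1em;       /* small gutter on narrow screens */
}
.article img {
  max-width: 100%;      /* images never overflow the column */
  height: auto;
}
```

Everything else falls out of normal text flow; there is nothing here for a slow phone CPU to recalculate.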
Too much (Score:3)
Re: (Score:2)
Also known as "UX engineers".
Re: (Score:2)
Hipsters.
Re: (Score:2)
I grew up programming my Atari computer to do lots of cool things, but all inputs were hard coded into the source code. Then when I took a college course, I learned that most software was less about doing cool things than about making it look pretty, and I always got dinged for this.
Re: (Score:3)
See : Carnegie Mellon University for the source of so much hipster insanity with UX. Their motto is apparently... "Lack of features equals ease of use."
How would Slashdotters explain the proliferation? (Score:4, Insightful)
Re: (Score:2)
Very much so. You see advocates of the new, ugly paradigm disparaging older interfaces as not being "modern."
Forgot one (Score:3)
One of my biggest beefs is with those apps whose windows can't be resized, and you're forced to scroll all over the place -horizontally as well as vertically- in a window barely the size of a post-it note.
Re:Forgot one (Score:5, Informative)
Scott Meyers calls this The Keyhole Problem [aristeia.com] and has a paper with a bunch of good examples.
My "favorite" modern example of the problem is Chrome's omnibox auto-completion: you get six results at maximum, and they don't even give you a scroll bar or a "Show more" link. Six results only. There used to be a command-line option to increase it, but they removed it some years ago; it's now a hardcoded constant [stackoverflow.com] in the source code.
Re: (Score:2)
Re: (Score:2)
Remedy.... we're all looking at you, you BMC piece of garbage...
Large Print (Score:2)
Re: (Score:2)
Hate flat GUIs (Score:3, Insightful)
I agree, I cannot stand this push to flat GUIs. Give me a button that looks like a button, that way I know I can push it.
heh, captcha: condemns
Rebellion (Score:4, Insightful)
New generations always rebel against the ways of the previous generation. It's human nature.
During the Renaissance we had visually brilliant works of art created. Later generations shunned this and decided that a canvas painted a solid color had just as much merit. Which is "right"? Neither. They just are.
And so it goes for UI design. From my perspective, we had a very consistent standard for UIs for a good 20 or so years. This was in part driven by technological limitations, but it worked well. The barriers are gone now, anything can be done. Therefore anything will be done. I've actually worked with people who are "UX Specialists" and completely disagreed with what they thought was intuitive. I also regularly have to look up how to do things on modern gadgets because they don't include manuals anymore and they most certainly are NOT intuitive. To me. I'm probably just old. And so is the submitter. :-)
Re:Rebellion (Score:4, Informative)
It's a little sharper than that: the current generation of interface designs was a direct reaction to the previous decade's tradition of absurd skeuomorphism. The moment Steve Jobs died, Apple did an about-face and started following Microsoft's Win8/Modern/Metro UI lead. It may look like a step backward to those who come from the Windows 2000 and Gnome 2 era, since there's a loss of visual cues, but the flatness of current interfaces is way better than what the classics became [tumblr.com] in the post-Windows XP era: bloated, overdesigned, pseudo-real objects cluttered with mismatched shadows and conflicting perspective angles. You couldn't tell what was a button there, either! At least now there's consistency and a return to the actual use of design guidelines.
That said, there are still a lot of cases where literacy in idioms dominates: for example, the largely inexplicable convention of swiping sideways on a list to reveal 'delete' or 'edit' buttons in mobile apps. That's probably where you and the UX designers run into the most difficulty. But two decades ago, every "how-to-use-a-computer" class targeted at seniors started with how to operate a mouse—so, as I think you've already recognized, it's important to try to take these things with a grain of salt, and recognize that no one is completely objective when it comes to understanding the culture of computer operation.
Re:Rebellion (Score:4, Insightful)
During the Renaissance we had visually brilliant works of art created. Later generations shunned this and decided that a canvas painted a solid color had just as much merit. Which is "right"? Neither.
No, Raphael's works have much more merit than a canvas painted in solid color. That isn't even a question. The canvas painted in solid color can be interesting, but it's on a lower level.
Modern Software (Score:4, Insightful)
Oh. And get off my lawn...
Re: (Score:3)
God yes. Autodesk, I'm looking at you. Apple, I'm tired of looking at you. You're both ugly and your mothers wear army boots.
Jeez, Autodesk - you can't be bothered to decide on something resembling a consistent user interface between products and life cycles? OK, fine. Then goddamn document it somewhere besides a YouTube video hidden in somebody's blog. The answer to "what does the squiggle with the line on the side icon mean" should not take an hour of searching. You could, even, like put it in a menu on
I apologize for the large size of this interface (Score:2)
My manager never allots enough time to make one smaller!
Ignorance, inexperience, prejudice, expedience (Score:4, Interesting)
I fear that many of the issues listed in TFS are the result of decisions made when the OS UI conventions are defined. Then, apps follow these conventions without regard to what it means for their product.
That is not to say that the original conventions are always bad; they were designed around a certain feature set to provide for defined functionality. The problem comes when they are applied, without thought, in third-party applications. The decision to follow the OS conventions is either made by executives who feel the application needs to be a "seamless" part of the system (and Microsoft, Apple, Google, etc. spent millions on the UI conventions, so let's just copy their work) or by designers who don't know any better or are just trying to get their product out quickly.
I have never seen a great set of tests for UI developers to self-evaluate the end product. We've all been there when after working with a product for a while, everything you've done seems to make sense and you develop mental shortcuts that allow you to fly through the UI.
The only real solution is, as part of the development process, to set aside time for third-party user testing with feedback sessions. I've been through a number of them; they're humbling, surprising, and educational - then there's the fun part where you need to take the results and tell your boss(es) that they're wrong.
simple (Score:2)
Common and old. (Score:2)
At least there's some attempt to combat this. For example, if you screenshot something and paste it into LibreOffice, the image is autosized from margin-to-margin. For bonus points, if it's in "web view", the autosizing is to window width.
Of course, this is a slightly old version of LibreOffice, and I'm having trouble updating it on that computer. Then again, I should update that computer entirely; it's several years old.
But still, it's universa
/. is an offender... (Score:2)
On iOS, Slashdot doesn't let me resize /. to a comfortable size. They made a decision that actually being able to read the ant-sized writing was less important than having rabid control of the layout.
As the user base ages (We're not all in our teens and 20s any more, and I suspect the majority of /.ers are in our 40s) being able to resize the font matters.
Re: (Score:2)
This might be more of a browser UI problem. Locking the regular pinch zoom is required to make responsive designs work, due to how the viewport is set up. However, the browser doesn't replace that with a standard viewport zoom. The desktop browsers actually have a proper zoom for this type of thing, where adjusting the zoom gives you responsive feedback from the web site (zoom in far enough on desktop, you get the mobile view of the site in large print).
This same type of zoom needs to be implemented on m
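For reference, and as a sketch of what the parent describes (this is the common convention, not any particular site's markup): responsive pages pin the layout viewport and opt out of pinch zoom with a meta tag like this:

```html
<!-- width=device-width ties CSS pixels to the screen so media queries
     behave, but user-scalable=no also removes the user's pinch zoom,
     and mobile browsers offer no reflowing page zoom in its place. -->
<meta name="viewport"
      content="width=device-width, initial-scale=1, user-scalable=no">
```

Desktop browsers compensate with Ctrl+/- page zoom, which reflows the layout; the parent's point is that mobile browsers lack an equivalent once the page disables pinch zoom.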
Touch capability (Score:2)
Part of the huge white-space and big button modern trend comes from the advent of touch screens. Remember Windows 8 and how it practically forced users into touch with gestures and "charm zones"?
I appreciate some of these new features. For example, in Siebel's database Open UI, buttons and selection targets are now easier to hit. The downside is less information in the same screen space. (Also, the new interface does not require IE and ActiveX, a positive but not related to the UI's functionality).
I sus
Forgot Some... (Score:5, Insightful)
Hate:
White text on a bright yellow background, on Galaxy Note 3 Android.
Where the fuck have the icons gone? Windows.
Why can't I cut and paste information from your dialog?
Why are things still not resolution-independent? Adobe, and most music production applications.
Don't think you need files and folders? Think again, and that includes you, Firefox mobile bookmarks.
The creator of "material design" needs to be shot. There's a difference between not being limited by the physical world and needlessly disconnecting us from what we have already learned.
In the battle between KDE, Gnome, and Unity, Cinnamon won.
Love:
Rounded corners rule!
Shadows show us what's on top!
Maxims:
Just because Apple did it, doesn't make it right. Remember, they had a bad year last year.
People need to work, more than you need to masturbate over your own art work.
Most serious file management takes place in two windows.
Clean means that you are too lazy to update the functionality in your program, so you are leaving useful stuff off.
Those who think that the command line and a GUI cannot coexist have never seen a 3D CAD or design program.
Re:Forgot Some... (Score:5, Insightful)
You forgot a maxim:
Just because it's old doesn't mean it's bad, and just because it's new doesn't mean it's better.
I'd add another staple of modern UIs... (Score:3)
It's the "Me too!" approach to UI design (Score:5, Informative)
This is not only the case with the current "flat" design epidemic ("Apple went flat and look at how successful they are! If we go flat we'll look modern and we'll be successful too!") but in many other elements that have been taken to an extreme at the cost of usability and accessibility:
- The use of razor thin fonts
- White text on monochrome, pastel backgrounds
- The loss of critical UI elements like scroll bars and button outlines, because apparently they just clutter things up
- The use of "hamburger" mystery meat menus
- Loss of status bars (which attempted to at least give some idea of percentage completion of a task) in favor of things like dots that twirl, spin, and dance in circles
Re:It's the "Me too!" approach to UI design (Score:4, Insightful)
I'd like to take it and throw it off a mountain somewhere. Uses *tons* more memory than a paged layout, and makes it damn near impossible to find anything that's more than a few hours old without scrolling your hand off.
This is what happens when art drives UIs (Score:5, Insightful)
... and every idiot in the world thinks he's an artist.
People associate lots of white space with "modern" and "clean", but in fact the key is to use white space intelligently to help guide the user's attention. The question isn't whether you have a lot or a little, the question is how much mental work does it take for a user to accomplish his task?
It's easy to ape interfaces that work well, but that's cargo-cult design. Design should be as much evidence-driven as it is fashion-driven. First (design) principles are only a starting point.
Recently I was using a smart TV app and when the content I requested took too long to buffer I decided to quit the app. I was presented with a dialog warning me that I was leaving the app, and asking me whether I wanted to "cancel" or "continue". This gave me a moment's pause, because I didn't want to "continue" waiting for the content to load. However as a developer myself I understood the programmer's mindset: "cancel" and "continue" referred to the event the dialog was responding to: a request to exit the app.
This division of responsibilities is backwards: the user shouldn't have to get into the mind of the designer, the designer needs to get into the mind of the user. And that's hard. UI guidelines help, but there's no substitute for watching actual users struggle with your design. Any time you find something that makes them pause, even for a moment, you should file that bump down. That'd catch problems like confusion between text and controls, or inscrutable state widgets.
It's called "a fad" (Score:3)
If you look back through software history, you'll notice developers have *always* copied each other's styles, in ways no different than a fashionista would. In the 90s it was pseudo-3D buttons, because people wanted buttons that looked like buttons. (Personally, I *still* do.) When the WWW got popular, people started making *everything* look like hyperlinks. Thank god that didn't last long.
But now... I just don't even know what to say. Style has completely overridden any semblance of usability. Google started the three-parallel-bars-equals-menu thing, and now everyone is doing it. 2D flat everything is all the rage wherever you look, and people think they're being cool if they use obscure icons that may or may not bear a passing resemblance to the function they're trying to perform. Intuitiveness has basically been thrown out the window.
Case in point: whoever came up with the UI for Snapchat should be tarred, feathered, and shot, multiple times. While I eventually figured out how to use it, it took *effort* because it made about as much sense as Trump walking into a soup kitchen.
I can only hope that sanity will return soon.
UX gone wrong (Score:5, Informative)
Developers traditionally make efficient, functional, ugly interfaces. They did this by using standardized UI controls. They were largely constrained. Today, without those constraints, those same developers make inefficient, semi-functional, pretty interfaces. And with the focus on form over function, they are pushed in this direction by management. (Thanks, Apple, for telling me that I want to get rid of all the jacks in my laptop so that it can be 0.00001 inches thinner.)
A good UX person -- not the kind of BS "UX" that I see lambasted here -- but a real one -- can improve the look and feel of an application, optimize the workflow, and make it pretty too. I work with a UX engineer who uses statistics on the average hand size of our target demographic, and can quote the average size and resolution of the displays they are using. On touch-screen apps, our UX team optimizes for right handedness, and organized the screen so your hand doesn't cover the things you are looking at and so you make minimal movements. A few years ago we even created a mock-up, and had actual users go through a workflow and timed them, counted number of clicks, etc. This is good UX. It's human factors engineering + graphic design.
A sad anecdote: a few years ago I had the pain of designing a UI with a bunch of managers. It was a screen to add/edit/delete users who had access to an account. I drew up a typical text box with a list and add/edit/delete buttons at the bottom. You could fit 50 users on a typical screen, quite readably. They HATED it. Their design fit about 10 users on the screen. Big margins all around. Each row had a separate add, edit, and delete button and a large single-color icon of a person. All the icons were the same, so they communicated nothing. The text was so large that long names needed an ellipsis to fit. The add/edit/delete buttons were tiny icons without text. It was pretty, wasteful, and slow. They loved it.
On another project, which was an industrial machine, they wanted icon buttons. The previous version used 16-color EGA graphics, so it needed an update. So I used actual 3D renderings of the parts as icons. Initially everyone loved it, because it was clear what the icons did. Three years later, it was laughed at for being too "realistic." So on the next project they replaced the realistic icons with single-color conceptual representational icons. Unless you were on the project, you had no idea what an icon meant. The customers came up with names for the icons: the "one-eyed cat" let you search. The "disney castle" was to load a tray into the device. The "laser broom" was the barcode scanner. This interface is loved by development because it is so pretty, and it is the new standard moving forward. The customers (and the training department) complain that unless someone uses the device regularly, they forget whether they should start the workflow by clicking the "one-eyed cat" or the "laser broom."
And with the next project, they are using text under the icons again, so users know what they are.
From a developer standpoint (Score:3)
Caveat: I am not a designer, but I do some programming for various apps/websites as a side job (though I focus on more behind-the-scenes stuff in my day job).
I do not understand all the design decisions, especially the proliferation of interfaces with generic icons that could be mistaken for Ikea instructions. It is frustrating when you run into an icon that could be interpreted as either "light phone on fire" or "turbo mode" but you really don't know for sure which it is. Do you try it???
That being said, if I create an app or website that has nice instructions on it, the end user's first impression is to hate it. They say it does not look modern enough. However, it is intuitive to use and they can figure it out quickly. On the other hand, if I create an Apple/Material design type app, customers love it and accept it immediately. Of course, the UI is impossible for their customers, but hey, at least the company that requested the app likes it.
I think a lot of this stems from the Instagram/Pinterest world we live in. Everyone wants to be blown away by the beauty of the app when they casually glance at it. Of course, that beauty greatly limits the possibilities to make an app intuitive and easy to use.
As a developer, I find that I try to balance these things. But as someone who generally likes to get paid for my work, I will often say screw it, and give the "artsy focused" people what they want because they are the ones that sign the checks. The quicker I make them happy, the sooner I can get paid and move on to the next gig.
Non-Discoverable Interfaces (Score:5, Insightful)
For me, the #1 modern UI sin, which wasn't included in the list here -- Non-discoverable interfaces. Interfaces based on some "gesture" which is never explained, and for which one cannot find an explanation (unless you already know the gesture to get there, if it exists). Pinch-zoom, hover in a magic corner, drag from edge, press screen for short vs. long time, invisible menu bars, etc., etc. In the 1984-2010 era I could follow the words in the menus and discover new features in any piece of software (and so could anyone, assuming they weren't illiterate). The last few years have brought my first experiences with software that I just couldn't begin to figure out how to do anything with.
Leaky Abstractions (Score:3)
My biggest complaint is the constant attempt to "abstract" simple concepts such as directories. For example, "My Libraries" layered on top of the easy-to-comprehend directory file system is an abomination. Ask the average user how to go to a directory hanging off the user directory (c:/user/$user) and they don't know how. You click on "my documents" but there is no clear way to go up one level of the hierarchy or even understand where "My Libraries" really resides. Of course this was even worse in the "my documents and settings" days. People readily understand a hierarchical directory and file system. Why they attempt to further abstract directories and files is beyond me. This is why Gates could not find the "downloads" directory in the antitrust trial - where the hell is it? Even he didn't know! Gnome3 makes the same mistake IMHO.
Lack of customizability = good. (Score:4, Interesting)
I was with you until the very last point. Lack of customizability is a good thing. It creates standardisation. It means when people pick up a device of same or similar model to their own they know how to use it without any guess work. It makes support and training easier, though admittedly at the expense of finely tuned specific tasks.
Too much magic in modern UI (Score:5, Informative)
Our screens are way bigger than they were back in the old days, so we have plenty of room for things like menus and toolbars. Yet the trend in modern UI design is to make things magical and non-discoverable.
Just yesterday I helped my father with a problem: the menus and toolbar in Thunderbird were gone. I was on the phone with him for a while. The task was to find the one magic part of the Thunderbird window where he could right-click and find the context menu with the checkboxes for hiding/displaying the main menu and toolbar. Thank goodness I have him running MATE so every window has a title bar... "Find the blue bar at the top that says 'Inbox - Mozilla Thunderbird'. Now right-click in the dark grey area underneath that, to the right of the tab that says 'Inbox'..." "It didn't work." I'll spare you the back-and-forth; he had multiple tabs and was clicking in a tab to the right of "Inbox". Once I got him over to the correct magic spot, he found the context menu and restored his menu and toolbar. (The stupid hamburger menu is part of the toolbar, and hides with the toolbar... which means it's possible to hide all the menus! And my dad somehow did so by accident!)
The original UI spec for the Macintosh required menus all the time for every app, and the menus had to be in the same place. And I learned very quickly that I could browse the menu, find the command I wanted, and the keyboard shortcut was documented right there in the menu. Hidden menus are far too magical, and if you are going to have them, the very least you should do is to make every context menu have the ability to unhide them, rather than requiring the mouse pointer to be hovering over a particular magical few pixels of your screen.
I also remember the 45 minutes it took to help my dad un-mute YouTube videos. First I had him use the MATE sound preferences dialog to test his speakers, which just took a couple of minutes. Then I had to walk him through moving the mouse pointer over the YouTube video window to make the controls un-hide... (he wasn't full-screen, why do the controls hide when there is plenty of screen real estate available?) Then he had to move the mouse pointer to touch the audio control (and a slider pops out when you get it right) and click to un-mute... and when it's un-muted it says "MUTE". Because when it's un-muted the button becomes the "MUTE" button, and when it's muted the same button becomes the "Un-mute" button. The old-school solution would be a checkbox labelled "MUTE" that's checked when it's muted; the newer way would be a GUI toggle that slides left for un-mute and slides right for mute. There's plenty of screen real estate for either of these.
I know, I know, on mobile devices these magical hiding tricks are not so pointless because screens are smaller. But desktops are not mobile devices and trying to treat them the same is a bad idea.
My dad is not stupid and I don't want to sound like I'm making fun of him. I'm just annoyed over the modern trend in UI design where everything is so magical that it's tricky and weird.
Re:children and old people (Score:5, Insightful)
You're right that people's motor and vision skills are not what they used to be, but I find that primarily to be because it's not the same people.
Things have been dumbed down for about a decade now, and young users expect things to be simplified, not having experience with anything else.
40-70 year olds have computer experience, and handle cascading menus, middle mouse buttons and overlapping windows just fine - it's the young generation that requires a single application on the screen with simplified controls. And not too many words they have to read.
tl;dr: It's dumbing down for a dumber generation.
Re: (Score:3)
You're right that people's motor and vision skills are not what they used to be, but I find that primarily to be because it's not the same people. Things have been dumbed down for about a decade now, and young users expect things to be simplified, not having experience with anything else.
40-70 year olds have computer experience, and handle cascading menus, middle mouse buttons and overlapping windows just fine - it's the young generation that requires a single application on the screen with simplified controls. And not too many words they have to read.
tl;dr: It's dumbing down for a dumber generation.
I never thought of this before, but now I will have a hard time not thinking of it! That was damned insightful!
Re: (Score:3)
Have a little grace. I use dozens of programs and web sites every day. Very few of them are so important to me that I'm willing to invest a lot of time learning the site or program's quirks and tools.
The solution isn't to make the interfaces simpler, but to standardize them. Make them compatible in function to what users are familiar with.
Interfaces have existed for long enough that time has proven what's effective and what actually gets used. Presuming you know better than the UI designers of yore, whose work was vetted by time and user choice, will likely lead to dead ends like Gnome 3.
Don't put the steering wheel under the seat and replace the gas pedal with auto-acceleration even if it's more aesthetically pleasing.
Re: (Score:3)
I want to amen this: "The currently focused app is not sufficiently highlighted." This abomination started with MsOffice 2010 or so.
For the record, Winaero Tweaker (http://winaero.com/comment.php?comment.news.1836) helps you fix this. It isn't perfect, and there may be apps that resist it, but it can go a long ways towards improving this problem (and a few others).
Re: (Score:3)
I agree that if you must use the mouse, the UI has failed. But this has been true for a lot longer than modern UIs have been around.
The way that modern UIs tend to implement keyboard shortcuts has a serious discoverability problem, though. It used to be that you could pop open a menu and see what the keyboard shortcuts are. That's becoming impossible, forcing you to leave the application to google or open a help page (if you're lucky) to learn what the shortcuts are.
Re: (Score:3)
And how come no one, and I mean *no one*, is working on a project where the computer bends itself to the user's expectations and not the other way around?
Hear, hear!
I'm old enough to remember when one of the design ideals centered around the notion that the computer should learn how you work and accommodate that, not the other way around. I wonder where we as an industry lost that plot?