Ask Slashdot: A Point of Contention - Modern User Interfaces 489

Reader Artem Tashkinov writes: Here are the staples of the modern user interface (which, to varying degrees, apply to the modern web and to most operating systems such as Windows 10, iOS and even Android):
  • Too much white space, huge margins, too little information
  • Text is indistinguishable from controls
  • Text in full-CAPS
  • Certain controls cannot be easily understood (like on/off states for check boxes or elements like tabs)
  • Everything presented in shades of gray or using a severely and artificially limited palette
  • Often awful fonts suitable only for HiDPI devices (Windows 10 modern apps are a prime example)
  • Cannot be controlled by keyboard
  • Very little customizability if any

How would Slashdotters explain the proliferation and existence of such unusable user interfaces and design choices? And do you agree?

This discussion has been archived. No new comments can be posted.

  • Easy answer (Score:5, Insightful)

    by YrWrstNtmr ( 564987 ) on Friday January 27, 2017 @11:42AM (#53748835)
    How would Slashdotters explain the proliferation and existence of such unusable user interfaces and design choices?

    Phones and tablets.
    • Re:Easy answer (Score:5, Insightful)

      by houstonbofh ( 602064 ) on Friday January 27, 2017 @11:46AM (#53748853)
      This... People are designing for one medium used one way. All of the large-data workers I know (programmers, accountants, graphic designers, architects...) HATE these new UIs and use Windows 7 / Gnome 2 style interfaces. (And they often have dual monitors.) I suspect it will not be long before things start to shift back...
      • Re:Easy answer (Score:5, Insightful)

        by big-giant-head ( 148077 ) on Friday January 27, 2017 @11:57AM (#53748961)

        Bingo, we have a winner. At work, if we have a choice, the developers use Linux and they customize the UI the way they want it... usually Gnome 2, KDE (like Windows with the menu and apps pinned to the bottom), or a similar interface. I realize all the hipsters think this minimalist UI with a very small, dull color palette is cool, but it isn't. It's very limiting and very boring, and 99% of the users are not hipsters... so we are not impressed.

        Make UIs Great Again!!!!

        • Re:Easy answer (Score:4, Interesting)

          by YrWrstNtmr ( 564987 ) on Friday January 27, 2017 @12:14PM (#53749113)
          Right. The way I interact with my PC, with mouse/kbd, and 2 x 24" screens, is totally different than interacting with a phone or tablet.
          But apparently the 'designers' are too lazy or clueless to a) know the difference, and b) build two different interfaces.
          • Re:Easy answer (Score:4, Interesting)

            by MightyMartian ( 840721 ) on Friday January 27, 2017 @12:24PM (#53749197) Journal

            I don't think it's laziness, more like cluelessness. There was a big push for several years after smartphones and tablets took off to merge UIs across platforms. I suppose part of the justification was to try to draw in developers from both the smart device and desktop worlds to do more cross-platform work, and part of it was likely just to simplify (read: make less expensive) maintaining and developing features where an OS might be on everything from multiple screen desktops to 5" phones.

            At the end of the day, I doubt even a smart UI abstraction layer will ever make these variant UI scenarios fit under one hood. Many web developers have known this for a while, which is why you have mobile and desktop versions of sites in many cases, but I still think Microsoft and Apple need to fully accept the reality that what works on little may look like shit on big (and vice versa, as my experience with an 8" Windows 10 tablet even in tablet mode informs me).

            • Re:Easy answer (Score:5, Insightful)

              by cayenne8 ( 626475 ) on Friday January 27, 2017 @01:48PM (#53750019) Homepage Journal
              It all started with the "Ribbon Interface" on the MS products....

              It all went to hell from there...

              ;)

              • Re:Easy answer (Score:5, Insightful)

                by Anonymous Coward on Friday January 27, 2017 @03:41PM (#53750957)

                Funny, but I actually agree with this. I still hate the ribbon. A menu is a reasonably nicely categorised index of functionality - easy to read, properly aligned text, room for descriptiveness as required, sub-categories where appropriate (but highlighted in a consistent manner), and it has hints like underlines for keyboard shortcuts. And when not in use it neatly vanishes. The ribbon takes the menu, hurls it across the screen with a bunch of apparently random icons with no thought to readability, alignment, sorting or descriptiveness, actively hides some information in a non-standard way, and thoroughly confuses the distinction between a toolbar (a small set of tools kept visible for ease of access) and a menu.

                I really, really wish the ribbon would just go away.

          • Re:Easy answer (Score:4, Insightful)

            by Archangel Michael ( 180766 ) on Friday January 27, 2017 @12:45PM (#53749417) Journal

            Properly building two interfaces requires way too much effort. It is much easier if you stop trying to make all things for all people on every device. The solution is much simpler than these convoluted designs.

            Three parts to every design, each separated from the others: 1) Content, 2) Design, 3) Structure.

            Content: The actual bits that matter. Articles, pictures, code snippets, whatever it is that "counts".

            Design: The flowery bits that distinguish content from other content. Fonts, styles, artwork and logos. These are the bits that identify the content's brand.

            Structure: How the content and design bits are displayed. Two columns or three. Header above, in the middle, or below. Left or right. In a word, "layout".

            If you break up the UI into these three aspects, it becomes much easier to modify, replace, or customize. You can skin the layout to make it look unique, and you can adjust the structure to work better for different workflows (small, medium, or large screens; programmers vs. graphic designers).
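
            As a toy sketch of that three-way split (TypeScript, all names hypothetical, assuming a web front end), just to make the idea concrete:

              // Content: the bits that actually matter, kept as plain data.
              interface Article {
                title: string;
                body: string;
              }

              // Structure: how the content is laid out, expressed only through
              // markup and class names, with no fonts or colours baked in.
              function renderArticle(a: Article): string {
                return `<article class="article">
                  <h1 class="article__title">${a.title}</h1>
                  <p class="article__body">${a.body}</p>
                </article>`;
              }

              // Design: fonts, colours and spacing live in a stylesheet keyed off
              // those class names, so a reskin or a small-screen layout touches only CSS.
              const post: Article = { title: "Hello", body: "Each part can change on its own." };
              document.body.insertAdjacentHTML("beforeend", renderArticle(post));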

            The problem is, we have people trying to shoehorn touch interfaces onto desktops that don't need touch. You have people trying to make something look good on small screens, and on large screens it ends up with way too much "white space" or looks "cluttered".

            The reality is that everyone's needs are different, and the whole "one size fits all" thing doesn't work for everyone. You end up making very few people really happy in the process.

            • I agree with much of what you say, particularly the inappropriateness of trying to shoehorn interfaces from one type of device into another type of device with very different attributes. However, I also think your argument as written is fatally flawed, because content, design and structure aren't really independent.

              "A picture is worth a thousand words" is as true as ever. The difference in effectiveness between a table of raw data and a standard but well-chosen chart can be dramatic. Sometimes the differenc

        • by bondsbw ( 888959 )

          Some of us hipsters agree with many of the complaints in the summary.

          I like unobtrusive, post-shiny user interfaces. I really prefer flat UIs. Still, when z-order is a fundamental feature of a UI (a windowed desktop), it makes sense to provide an intuitive mechanism for conveying z-order information (e.g. shadows, highlighting focused windows).

          Too much white space, huge margins, too little information

          This is true for UIs whose purpose is to disseminate information. Charts, graphics, grids, and such things need to give the user more information while requiring fewer clic

        • I actually really like most of Material Design. I often have to design HMI displays (user interfaces for industrial automation). There are good reasons for much of the design:

          * colours should be limited and subdued for user interface elements so as to focus attention on content. Bright colours and animation are intended to call attention to important information.

          * textures, gradients, transparency and drop-shadow effects for the sake of visual flair cause visual confusion and eyestrain. Important elements g

      • Re:Easy answer (Score:4, Insightful)

        by Simon Brooke ( 45012 ) <stillyet@googlemail.com> on Friday January 27, 2017 @12:28PM (#53749239) Homepage Journal

        Agree.

        Over the past year I've (for the first time) used Mac OS X on my laptop. I find it much less useful, and frankly much less user-friendly, than Gnome 3 (and even Gnome 3 hides too much information because it assumes its users are technophobes).

        One can understand Microsoft and Apple designing user interfaces primarily for technophobes, because in the modern world the majority of their users are people for whom the full power of a computer system is too complex for them to understand, much less use; and, seeing that they have in effect a duopoly, the fact that their more technically able users are not well served by their user interfaces doesn't matter, because there aren't enough of us to be a significant market, and most of us will be told what to use at work in any case.

        But I really don't understand the Gnome designers' reasons for hiding so much, for making even moderately technical things so awkward. In practice, almost everyone who chooses to use Gnome is a geek. Having said that, if it really annoyed me I could either switch to something else or get under the hood and modify it, and I don't.

        For me, Gnome 3 works, with niggles. Mac OS X is really annoying, but I can use it. Windows 7 is tolerable. Windows 10? Let's just not go there.

      • Oh lord, the pain (Score:5, Insightful)

        by fyngyrz ( 762201 ) on Friday January 27, 2017 @12:37PM (#53749329) Homepage Journal

        Some of this is web design (I use the word "design" very loosely) and some is application design:

        o the "designer" mindset has gifted us with extreme low contrast backdrops and fonts - STOP THAT

        o bloody pop-up/over dialogs that were not asked for are constantly used - THIS IS HOW TO MAKE ME GO AWAY

        o menus drop without being requested because mouse went over them - WAIT FOR A BLOODY CLICK!

        o Videos autoplay just because I've arrived, or because the mouse pointer went over them. Ever think *I* might want to control what damned noise comes out of my computer, or what data I want to stream on my phone? You should. Because while I'm desperately trying to figure out how to shut up / stop your video abortion, I am hating on you and everything you represent, and vowing to NEVER come back to your site, which I promptly implement via my hosts file because you SUCK.

        o Do NOT change the web or application UI: NEVER make a modal UI. Present a consistent interface that can be learned and incorporated into muscle memory. Enable/disable elements as appropriate. IOW, if a document isn't NEW or Loaded, Save should be disabled - not GONE. This is so everything in the interface remains where it was. We want to work, not read your damn interface over and over and over and over just to see where we're at.

        o Make ALL keyboard commands configurable. In some apps, some of the things I do most often have no shortcuts and no way to add one. How annoying. How stupid.

        I swear, there are days when I'd like to hunt down these so-called "designers" and yell at them until my voice gave out.

        All of the above is effete nonsense that designers engage in as an attempt (which actually fails abjectly) to justify their title; stop all that, and just do it right. Don't even try to be "fancy" unless you're writing a game.

        Also, if you say "UX", I just want you to know you've made me work to suppress an urge to slap your face. Hard.

      • Another answer is: giving "style" more preference than "usability". In no less than TWO of my streaming video apps, I've seen progress bars implemented as light grey on white, making them more difficult to distinguish, but looking much more sophisticated than using some "garish", easy to see color scheme like, say, green on white.

        Amazon recently re-designed their video streaming app on Xbox One, and while the rest of the app seems reasonably well thought out, they screwed up their transport controls. If y

    • Re:Easy answer (Score:5, Informative)

      by naris ( 830549 ) on Friday January 27, 2017 @11:52AM (#53748921)
      It's really the proliferation of the horrid iPhone UI. iOS has a user interface that is really difficult to use, and everyone seems to be very quick to copy its least usable portions. :/
      • What's so difficult about iOS?
        • Different implementations of what on Android is a "back" button.
        • Re:Easy answer (Score:5, Informative)

          by DickBreath ( 207180 ) on Friday January 27, 2017 @12:21PM (#53749177) Homepage
          I will relate my own experience. I have used technology products since Decwriters and CRT terminals on big computers behind glass walls. And everything since then.

          I have used numerous candy-bar and flip phones. I used an Android phone. When I was handed an iPhone to do something, I was absolutely baffled at how to do certain basic operations. I would even consider that this is because I could be an ignorant idiot, but I don't think that is so. I could make certain fits of progress, but then get stuck at some basic operation. (I don't remember the details; it was a few years ago.)

          I'm sure I could learn how to use an iPhone / iPad just fine. I look at some of the things I have had to learn. Back in the day you had to memorize a stack of computer manuals that you could not remove from the computer room because they were physically bolted to a table. And it was uphill both ways. I practically brain downloaded the entire Common Lisp The Language (1, and partially 2) in the very early 1990's.

          What bothered me was that I was a huge Apple fanboy back in the day. Apple was all about user friendly. Human Interface. Things should be intuitive. What you can do should be directly recognizable from what you can see. Even if what you can see is a control that reveals more possible operations. There weren't hidden gestures. Magic handshakes. Etc.

          That's just one person's experience. It is not a generalization to everyone. But you did ask.
          • I'm sure I could learn how to use an iPhone / iPad just fine

            But why should you have to? It's sort of like the proliferation of different single knob bathroom shower temperature/volume controls I've seen in the past 50 years. The best of them work no better than the hot and cold knobs of my distant youth and many don't work nearly that well. What's the point?

          It's sort of like emacs. Lots of nifty stuff there, but learning to use it is such a monumental pain that most of us don't bother.

        • Re:Easy answer (Score:5, Insightful)

          by RoverDaddy ( 869116 ) on Friday January 27, 2017 @12:33PM (#53749293) Homepage
          This isn't specific to iOS, but there's this 'modern UX' philosophy that functions should be completely hidden until needed, which does seem to flow from a 'mobile first' attitude. One example: I've been baffled about how to delete entries from a list, because there's no edit mode for the list, and even if there is, there's still no 'affordance' to suggest this is what you click to delete. Why? Because 'delete' is obviously a swipe left or right (depending on the app). Then and only then do you get to see a nice big red 'DELETE' box. The user should just 'know'. It's similar to how Windows 8 introduced those awful 'hot corners' that made charm controls spring up if you left your mouse (or touch) there. But of course this isn't universal. The iOS Podcasts app uses an edit mode for lists, with check boxes that look like radio buttons (another minor gripe) to indicate which items in a list should be deleted.

          A couple decades ago there seemed to be a much more rational UX philosophy where controls were obviously controls, text was obviously text, window frames and borders were -good- things because they help the user's mental model of the UI match the software, and on-screen affordances were designed to give the user a clue as to what does what. We've gone backwards.
      • I have noticed this as well.

        Also, visit apple.com to see the current state of "modern" web page layout and design that is inevitably copied by every other web developer.

    • Not only (Score:5, Insightful)

      by Viol8 ( 599362 ) on Friday January 27, 2017 @12:00PM (#53748993) Homepage

      It's arrogant designers who think they know better than the generation before, want to be seen as different and "edgy" and "new", and so chuck out all the lessons learned and fuck things up royally. So we end up with an OS in 2017 that looks more primitive than Win3.0.

      • Re:Not only (Score:5, Interesting)

        by Anonymous Coward on Friday January 27, 2017 @12:15PM (#53749121)

        This

        As part owner and lead engineer and developer for an online GPS tracking platform I experience this on a weekly basis.

        Just fired a guy 1/3 my age and 1/10 my experience for telling me I am too old school and think I know everything.

        Fucking guy insisted on using microscopic fonts and all grays with almost zero contrast ratio.

        I could not even read that shit on a 28" monitor.

      • So we end up with an OS in 2017 that looks more primitive than Win3.0.

        This is very much to the point.

        In fact, I think someone needs to make a list of UI features that were in Windows 3 and Windows 2000/XP/7 (whichever we think is most usable). And for newer interfaces like Windows 10, count what percentage of the UI features still exist. And based on that, calculate an "age" for the new UI, i.e. "$OS has regressed to a 1997 level of interface".
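
        As a back-of-the-envelope sketch of that calculation (TypeScript, made-up feature lists and an arbitrary "year zero"):

          // Baseline features from the UI we consider most usable (say Windows 2000/XP/7).
          const baseline = ["menu bar", "keyboard shortcuts", "scroll bars",
            "resizable windows", "status bar", "visible button borders"];

          // Features that survive in the newer UI being scored.
          const survivors = new Set(["menu bar", "keyboard shortcuts", "scroll bars"]);

          const retained = baseline.filter(f => survivors.has(f)).length;
          const ratio = retained / baseline.length;            // 3 of 6 -> 0.5

          // Map the retention ratio onto a year between the arbitrary floor and the baseline's era.
          const regressedYear = Math.round(1985 + ratio * (2009 - 1985));
          console.log(`$OS has regressed to a ${regressedYear} level of interface`);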

    • Re:Easy answer (Score:5, Insightful)

      by alvinrod ( 889928 ) on Friday January 27, 2017 @12:09PM (#53749057)
      I think part of the problem is that the basic screen shape has changed. Traditionally, monitors used a 4:3 format that worked reasonably well for most sites, and the resolution was low enough that letters had to be sizable in order to be readable. However, most monitors have moved to a 16:9 format, which adds horizontal space, often at the expense of vertical space; that extra width is utterly useless for most things beyond watching movies filmed in 16:9.

      Studies done over 100 years ago found that the best line length for human reading was around 4 inches at most. The extra width that modern screens provide doesn't give much benefit, but at least with a tablet it's much easier to switch to portrait mode, where the added vertical space means less scrolling. Otherwise there aren't a lot of useful things to do with a UI other than add more tool palettes, but for a non-professional tool it's typically better to avoid throwing too much at users, so we've got all this extra space that provides no benefit. So websites fill the void by throwing in a side column of ads, but that's worse than just empty space as far as I'm concerned.

      The touch model of phones and tablets also makes it more complicated to have a universal UI. Web pages with context menus or anything that interacts with a mouse hover are horribly clunky on touch screens, and optimizing for different platforms is often time-consuming or doesn't even make business sense, depending on how much traffic you get from each platform. The same goes for applications that could run on either a tablet or a PC, as the interaction models are different enough that a one-size-fits-all approach often degrades the experience for both sets of users. Using an application with the big buttons necessary for touch targets with a mouse and keyboard just feels like the UI has wasted a lot of space, and trying to touch small targets designed for mouse use can be exceptionally frustrating.
    • Re:Easy answer (Score:5, Interesting)

      by DickBreath ( 207180 ) on Friday January 27, 2017 @12:12PM (#53749095) Homepage
      That doesn't explain it.

      I can explain the proliferation of unusable user interfaces in two words: Graphic Artists

      I saw this trend start in the 1980s. We were designing a new version of a successful Macintosh product, and we were working on the user interface. The graphic designers could make things look good, but had no grasp of principles. The big eye-opener for all of the developers (but none of the graphic artists) came when an artist, describing an operation, indicated that a certain button would do something very different from what it had been described as doing earlier. Something unworkable. Something that revealed that the entire mindset was about how good it looks aesthetically.

      In our ensuing discussion it was recognized how a lot of consumer electronics at that point (late 1980s) looked fantastic on the shelf, but had horrible user interfaces.

      Back in the day Apple had Human Interface Guidelines. And I understand that Microsoft did too.

      Today all of that has gone out the window. I'll just give one example. Google's Material Design. Not that I'm criticizing it. But just criticizing the NAME. The name screams it is all about the aesthetics and not how well it interacts with human beings.

      And we wonder why things have such badly thought out UIs. You have to start with basic principles. Get a good book like The Design Of Everyday Things. It explains the user interfaces of things like Door Handles, Faucets, and things you would never think about. It describes a lot of principles that you wouldn't think about, yet suddenly recognize. Once you read the book, you can answer what an affordance is when designing a UI.
      • Re:Easy answer (Score:5, Informative)

        by NormalVisual ( 565491 ) on Friday January 27, 2017 @12:39PM (#53749343)
        Back in the day Apple had Human Interface Guidelines. And I understand that Microsoft did too.

        IBM had "Common User Access" (CUA), and Microsoft had "Consistent User Interface" (CUI) guidelines, which were roughly comparable to Apple's. Following those guidelines might not be as visually attractive as some of the crap being designed today, but at least it meant that people could get acclimated to your product quickly and with a minimum of confusion. In the world of UIs today, there's way too much frosting and not nearly enough cake.
        • Re:Easy answer (Score:4, Insightful)

          by steveha ( 103154 ) on Friday January 27, 2017 @02:18PM (#53750327) Homepage

          IBM had "Common User Access" (CUA), and Microsoft had "Consistent User Interface" (CUI) guidelines, which were roughly comparable to Apple's.

          IBM's standard could only have come from IBM. Save was F12, Save As was Shift+F12, and Print was (IIRC) Ctrl+Shift+F12. Cut/Copy/Paste? Shift+Del/Ctrl+Insert/Shift+Insert. Arrgh.

          When Microsoft was trying to be a corporate partner of IBM, they followed the above standard for a while... and then they rebelled and implemented Ctrl+S for Save, Ctrl+P for Print, and Ctrl+X/Ctrl+C/Ctrl+V for Cut/Copy/Paste. And left the CUA ones working because why not. I haven't checked but I imagine the CUA ones still work today; it's not like the UI designers are falling all over themselves wanting to use Ctrl+Shift+F12 or Shift+Del for anything.
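
          A minimal sketch of honouring both sets of bindings (TypeScript, browser keydown handler, stub actions), in the spirit of "leave the old ones working":

            function save(): void { console.log("saved"); }        // stub action
            function paste(): void { console.log("pasted"); }      // stub action

            document.addEventListener("keydown", (e: KeyboardEvent) => {
              const key = e.key.toLowerCase();

              // Save: Ctrl+S (the later Microsoft binding) or plain F12 (CUA).
              if ((e.ctrlKey && key === "s") || (key === "f12" && !e.ctrlKey && !e.shiftKey)) {
                e.preventDefault();
                save();
              }
              // Paste: Ctrl+V (later binding) or Shift+Insert (CUA).
              else if ((e.ctrlKey && key === "v") || (e.shiftKey && key === "insert")) {
                e.preventDefault();
                paste();
              }
            });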

          In the world of UIs today, there's way too much frosting and not nearly enough cake.

          I agree completely.

          • Whatever. I find the CUA keys for copying and pasting more memorable than the Microsoft ones. And I feel bad when they are not implemented.

      • Re: (Score:3, Informative)

        by Pete Smoot ( 4289807 )

        Today all of that has gone out the window. I'll just give one example. Google's Material Design. Not that I'm criticizing it. But just criticizing the NAME. The name screams it is all about the aesthetics and not how well it interacts with human beings.

        Ah. Well, then, you might want to read more about Material Design than the name. It actually has quite a bit about human interactions. Even if it were just about aesthetics, a lot of visual design is about how humans interact with colors, shapes, fonts. No visual designer I've ever worked with picks colors purely because they like that shade of blue.

        • In the same way that "form follows function," beauty must be an afterthought of usability. A system that puts appearance above usability fails.
          • by jbengt ( 874751 )

            In the same way that "form follows function," beauty must be an afterthought of usability.

            To me, anyway, "form follows function" does not mean that beauty is an afterthought, but rather that properly executed functionality is aesthetically pleasing.

          • Depends on the perspective.

            It's a lot like dating. The flashy appearance makes you drool so you buy in. It's only after you've had time to get over the initial euphoria that you realize exactly what you got yourself into...

            The people who drive businesses want your buy-in; it's their sole reason for existing. They don't care if the product is actually any good.

  • White space (Score:4, Insightful)

    by 93 Escort Wagon ( 326346 ) on Friday January 27, 2017 @11:44AM (#53748841)

    On web pages, at least, the excessive white space is an obnoxious side-effect of current "responsive design" practices.

    • Re:White space (Score:4, Informative)

      by houstonbofh ( 602064 ) on Friday January 27, 2017 @11:46AM (#53748861)
      You think it looks bad on screen, try printing it!
      • You mean on paper? Huh... why would someone do that?

      • This is because most people aren't designing for paper; if they were, they would include stylesheets for the print medium. People who print out websites are a small percentage of users. I own a printer, and the last time I used it to print something other than a return label for Amazon was years ago...
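
        For what it's worth, adding one isn't much work; a minimal sketch (TypeScript, hypothetical stylesheet path) that attaches a print-only stylesheet:

          // Attach a stylesheet the browser applies only when printing
          // (or in print preview), leaving the screen layout untouched.
          const link = document.createElement("link");
          link.rel = "stylesheet";
          link.media = "print";               // applied for the print medium only
          link.href = "/styles/print.css";    // hypothetical path
          document.head.appendChild(link);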

    • Re:White space (Score:4, Interesting)

      by myowntrueself ( 607117 ) on Friday January 27, 2017 @12:00PM (#53748989)

      On web pages, at least, the excessive white space is an obnoxious side-effect of current "responsive design" practices.

      More specifically, it seems that the idea that 'content is like water' results in having just enough content to fill the small screen of a mobile device and then presenting that same content on a larger screen by introducing huge amounts of white space to pad that small amount of content out.

      It should have been glaringly obvious that this was going to be the result from looking at the pic on wikipedia:

      https://en.wikipedia.org/wiki/... [wikipedia.org]

      how could designers not have seen this coming?

    • Agreed. I'm fighting^H^H^H^H working with a web designer on this point right now.

      "Responsive" doesn't mean take a design and make it work on all devices, it means change the design so it is optimal on (ideally) all devices.

      • Re:White space (Score:5, Insightful)

        by Frobnicator ( 565869 ) on Friday January 27, 2017 @12:26PM (#53749219) Journal

        "Responsive" doesn't mean take a design and make it work on all devices, it means

        Unfortunately that IS what the term currently means among that group. Generally they (wrongly) believe they control all aspects of the web page display, that all devices are equally powerful and can run an unbounded amount of scripting, they often see no difference between a picture of text versus actual text, and don't bother to learn anything about the media they are designing for.

        Aside: More than once I've had to convince a web designer that their pictures of text were the biggest reason things weren't showing up in search engines; they kept claiming that hidden meta tags, text recognition, and image search would handle all that. Frighteningly, some were never convinced, even after I showed them with Google's own tools how Google interpreted their pages. Some were absolutely convinced that Google reads all text in all images and indexes pages based on image content. They could not fathom that there was a difference between text and fancy rendered images of text.

        Many wrongly assume the web browser displays the same thing on all screens, no matter what. Often they design for a few patterns they think are common, 1024x768 or 1080p, and try to force it on everyone else.

        Got a Super HD display showing 7680x4320? Too bad, we'll just upscale the fonts and add some whitespace.

        Got an old smartphone with a 480x640 portrait screen? We'll downscale and do an ENORMOUS amount of JavaScript processing on these devices least suited for the processing.

        It seems these are the same designers with the first-world problems of a disposable $800 smartphone that is more than 18 months old, and a $2000 MacBook that is more than three years old and ready for replacement.

    • Then it's poor responsive design.

      Seriously, there is a limit to the width of a column of text that is comfortable to read, so for continuous text on a large screen there may be reasons for having large amounts of whitespace. And, again, for continuous text, having a proportion of white space around the text is easier on the eyes. There can be good ergonomic reasons for using significant whitespace in design.

      Good responsive design is hard; to have the same page layout on a two inch wide mobile phone scre

      • Seriously, there is a limit to the width of a column of text that it's comfortable to read

        On the PC if I manage to hit that limit -- and currently I'm not at that limit with a large widescreen monitor -- I can resize my window to something narrower.

        I certainly won't hit that limit on my phone or tablet, and if I did, I could rotate to portrait mode.

        Don't take away my choices. Just because one person happens to prefer a width doesn't mean everyone does. I hate the news sites that give you a fixed panel a

      • Then it's poor responsive design.

        The problem is that roughly 90% of all the "responsive design" sites I have encountered have been poor. At some point, it becomes reasonable to say that the problem is responsive design itself. If the majority of implementations of something cannot get it right, perhaps the problem is the something.

  • by slazzy ( 864185 ) on Friday January 27, 2017 @11:44AM (#53748845) Homepage Journal
    Fashion over function.
    • by OhPlz ( 168413 )

      Also known as "UX engineers".

      • Hipsters.

      • In the 1980s I coined the term "cosmetic programming".

        I grew up programming my Atari computer to do lots of cool things, but all inputs were hard-coded into the source code. Then when I took a college course, I learned that most software was less about doing cool things than about making it look pretty, and I always got dinged for this.

    • See: Carnegie Mellon University for the source of so much hipster insanity with UX. Their motto is apparently... "Lack of features equals ease of use."

    • by steveg ( 55825 )

      Very much so. You see advocates of the new, ugly paradigm disparaging older interfaces as not being "modern."

  • by cyberchondriac ( 456626 ) on Friday January 27, 2017 @11:46AM (#53748863) Journal

    One of my biggest beefs is with those apps whose windows can't be resized, so you're forced to scroll all over the place (horizontally as well as vertically) in a window barely the size of a Post-it note.

    • Re:Forgot one (Score:5, Informative)

      by Ingo Ruhnke ( 3575189 ) on Friday January 27, 2017 @12:09PM (#53749059)

      Scott Meyers calls this The Keyhole Problem [aristeia.com] and has a paper with a bunch of good examples.

      My "favorite" modern example of the problem is Chrome's omnibox auto-completion, you get six results at maximum, they don't even give you a scroll bar or a "Show more" link, six results only. There used to be a command line option to increase it, but they removed it some years ago, it's now a hardcoded constant [stackoverflow.com] in the source code.

    • I hate having 4-5 scroll bars on the screen at one time and having to try to figure out which one I need to move to center the text I'm interested in.
    • Remedy.... we're all looking at you, you BMC piece of garbage...

  • Font sizes are at least 50% bigger than they should be.
    • by naris ( 830549 )
      No, font sizes are 50% smaller than they should be. Nobody can read 2 point fonts!
  • Hate flat GUIs (Score:3, Insightful)

    by Anonymous Coward on Friday January 27, 2017 @11:48AM (#53748873)

    I agree, I cannot stand this push to flat GUIs. Give me a button that looks like a button, that way I know I can push it.

    heh, captch: condemns

  • Rebellion (Score:4, Insightful)

    by Anonymous Coward on Friday January 27, 2017 @11:48AM (#53748881)

    New generations always rebel against the ways of the previous generation. It's human nature.

    During the Renaissance we had visually brilliant works of art created. Later generations shunned this and decided that a canvas painted a solid color had just as much merit. Which is "right"? Neither. They just are.

    And so it goes for UI design. From my perspective, we had a very consistent standard for UIs for a good 20 or so years. This was in part driven by technological limitations, but it worked well. The barriers are gone now, anything can be done. Therefore anything will be done. I've actually worked with people who are "UX Specialists" and completely disagreed with what they thought was intuitive. I also regularly have to look up how to do things on modern gadgets because they don't include manuals anymore and they most certainly are NOT intuitive. To me. I'm probably just old. And so is the submitter. :-)

    • Re:Rebellion (Score:4, Informative)

      by Samantha Wright ( 1324923 ) on Friday January 27, 2017 @12:08PM (#53749053) Homepage Journal

      It's a little sharper than that—the current generation of interface designs was a direct reaction to the previous decade's tradition of absurd skeuomorphism. The moment Steve Jobs died, Apple did an about-face and started following Microsoft's Win8/Modern/Metro UI lead. It may look like a step backward to those from the Windows 2000 and Gnome 2 era, since there's a loss of visual cues, but the flatness of current interfaces is way better than what the classics became [tumblr.com] in the post-Windows XP era: bloated, overdesigned pseudo-real objects cluttered with mismatched shadows and conflicting perspective angles. You couldn't tell what was a button there, either! At least now there's consistency and a return to the actual use of design guidelines.

      That said, there are still a lot of cases where literacy in idioms dominates: for example, the largely inexplicable convention of swiping sideways on a list to reveal 'delete' or 'edit' buttons in mobile apps. That's probably where you and the UX designers run into the most difficulty. But two decades ago, every "how-to-use-a-computer" class targeted at seniors started with how to operate a mouse—so, as I think you've already recognized, it's important to try to take these things with a grain of salt, and recognize that no one is completely objective when it comes to understanding the culture of computer operation.

    • Re:Rebellion (Score:4, Insightful)

      by phantomfive ( 622387 ) on Friday January 27, 2017 @01:43PM (#53749955) Journal

      During the Renaissance we had visually brilliant works of art created. Later generations shunned this and decided that a canvas painted a solid color had just as much merit. Which is "right"? Neither.

      No, Raphael's works have much more merit than a canvas painted in solid color. That isn't even a question. The canvas painted in solid color can be interesting, but it's on a lower level.

  • Modern Software (Score:4, Insightful)

    by Oxygen99 ( 634999 ) on Friday January 27, 2017 @11:50AM (#53748893)
    I'd extrapolate this to modern software in general. It seems acceptable now to leave things broken, unsupported and undocumented so that six months after purchase or download things no longer work and can't be fixed. I appreciate things become more complex over time but the number of boneheaded things I see on a day-to-day basis is extraordinary.

    Oh. And get off my lawn...
    • God yes. Autodesk, I'm looking at you. Apple, I'm tired of looking at you. You're both ugly and your mothers wear army boots.

      Jeez, Autodesk - you can't be bothered to decide on something resembling a consistent user interface between products and life cycles? OK, fine. Then goddamn document it somewhere besides a YouTube video hidden in somebody's blog. The answer to 'what does the squiggle with the line on the side icon mean' should not take an hour of searching. You could, even, like put it in a menu on

  • My manager never allots enough time to make one smaller!

  • by mykepredko ( 40154 ) on Friday January 27, 2017 @11:58AM (#53748973) Homepage

    I fear that many of the issues listed in TFS are the result of decisions made when the OS UI conventions are defined. Then apps follow these conventions without regard to what it means for their product.

    That is not to say that the original conventions are always bad; they were designed for a certain feature set to provide defined functionality. The problem comes when they are applied, without thought, in third-party applications. The decision to follow the OS conventions is either made by executives who feel the application needs to be a "seamless" part of the system (and Microsoft, Apple, Google, etc. spent millions on the UI conventions, so let's just copy their work) or by designers who don't know any better or are just trying to get their product out quickly.

    I have never seen a great set of tests for UI developers to self-evaluate the end product. We've all been there when after working with a product for a while, everything you've done seems to make sense and you develop mental shortcuts that allow you to fly through the UI.

    The only real solution is, as part of the development process, to set aside time for third-party user testing with feedback sessions. I've been through a number of them; they're humbling, surprising and educational - and then there's the fun part where you need to take the results and tell your boss(es) that they're wrong.

  • It's designers who need work and want to stand out and differ from others; doing a proper user interface is something that has been done before and would not stand out. To me, most modern UIs are really bad, and I'm one who wants a lot of information in one view, not having to go through hundreds of pages because the designers thought it was cool to have 3 lines on one screen instead of 30 or 60.
  • Too much white space, huge margins, too little information

    At least there's some attempt to combat this. For example, if you screenshot something and paste it into LibreOffice, the image is autosized from margin-to-margin. For bonus points, if it's in "web view", the autosizing is to window width.

    Of course, this is a slightly old version of LibreOffice, and I'm having trouble updating it on that computer. Then again, I should update that computer entirely; it's several years old.

    But still, it's universa

  • On iOS, Slashdot doesn't let me resize /. to a comfortable size. They made a decision that actually being able to read the ant-sized writing was less important than having rabid control of the layout.

    As the user base ages (we're not all in our teens and 20s any more, and I suspect the majority of /.ers are in their 40s), being able to resize the font matters.

    • This might be more of a browser UI problem. Locking the regular pinch zoom is required to make responsive designs work, due to how the viewport is set up. However, the browser doesn't replace that with a standard viewport zoom. The desktop browsers actually have a proper zoom for this type of thing, where adjusting the zoom gives you responsive feedback from the web site (zoom in far enough on desktop, you get the mobile view of the site in large print).

      This same type of zoom needs to be implemented on m

  • Part of the huge white-space and big button modern trend comes from the advent of touch screens. Remember Windows 8 and how it practically forced users into touch with gestures and "charm zones"?

    I appreciate some of these new features. For example, in Siebel's database Open UI, buttons and selection targets are now easier to hit. The downside is less information in the same screen space. (Also, the new interface does not require IE and ActiveX, a positive but not related to the UI's functionality).

    I sus

  • Forgot Some... (Score:5, Insightful)

    by BrendaEM ( 871664 ) on Friday January 27, 2017 @12:07PM (#53749045) Homepage

    Hate:
    White text on a bright yellow background, on Galaxy Note 3 Android.
    Where the fuck have the icons gone? Windows.
    Why can't I cut and paste information from your dialog?
    Why are things still not resolution-independent? Adobe, and most music production applications.
    Don't think you need files and folders? Think again, and that includes you, Firefox mobile bookmarks.

    The creators of "material design" need to be shot. There's a difference between not being limited by the physical world and needlessly disconnecting us from what we have already learned.

    In the battle between KDE, Gnome, and Unity, Cinnamon won.

    Love:
    Rounded corners rule!
    Shadows show us what's on top!

    Maxims:
    Just because Apple did it, doesn't make it right. Remember, they had a bad year last year.
    People need to work, more than you need to masturbate over your own art work.
    Most serious file management takes place in two windows.
    Clean means that you are too lazy to update the functionality in your program, so you are leaving useful stuff off.
    Those who think that the command line and a GUI cannot coexist have never seen a 3D CAD or design program.

  • by QuietLagoon ( 813062 ) on Friday January 27, 2017 @12:14PM (#53749115)
    Text with such low contrast that it is unreadable in all but the best of lighting conditions
  • by dfm3 ( 830843 ) on Friday January 27, 2017 @12:16PM (#53749131) Journal
    The problem is that there's a glut of "UX" designers convinced that if someone else has been successful and you copy the superficial hallmarks of their design, you'll be successful too. Take Facebook's "infinitely scrolling" page design, for example - suddenly you have every damn app and website using an infinitely scrolling layout, even things like weather apps where the information is finite and is best presented using another paradigm such as tabs. Combine this with the prevailing attitude that if less is more, then even less must be even more, and you get the mess we're in now.

    This is not only the case with the current "flat" design epidemic ("Apple went flat and look at how successful they are! If we go flat we'll look modern and we'll be successful too!") but in many other elements that have been taken to an extreme at the cost of usability and accessibility:

    - The use of razor thin fonts
    - White text on monochrome, pastel backgrounds
    - The loss of critical UI elements like scroll bars and button outlines, because apparently they just clutter things up
    - The use of "hamburger" mystery meat menus
    - Loss of status bars (which attempted to at least give some idea of percentage completion of a task) in favor of things like dots that twirl, spin, and dance in circles
  • by hey! ( 33014 ) on Friday January 27, 2017 @12:20PM (#53749161) Homepage Journal

    ... and every idiot in the world thinks he's an artist.

    People associate lots of white space with "modern" and "clean", but in fact the key is to use white space intelligently to help guide the user's attention. The question isn't whether you have a lot or a little, the question is how much mental work does it take for a user to accomplish his task?

    It's easy to ape interfaces that work well, but that's cargo-cult design. Design should be as much evidence-driven as it is fashion-driven. First (design) principles are only a starting point.

    Recently I was using a smart TV app and when the content I requested took too long to buffer I decided to quit the app. I was presented with a dialog warning me that I was leaving the app, and asking me whether I wanted to "cancel" or "continue". This gave me a moment's pause, because I didn't want to "continue" waiting for the content to load. However as a developer myself I understood the programmer's mindset: "cancel" and "continue" referred to the event the dialog was responding to: a request to exit the app.

    This division of responsibilities is backwards: the user shouldn't have to get into the mind of the designer, the designer needs to get into the mind of the user. And that's hard. UI guidelines help, but there's no substitute for watching actual users struggle with your design. Any time you find something that makes them pause, even for a moment, you should file that bump down. That'd catch problems like confusion between text and controls, or inscrutable state widgets.
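
    A small sketch of the fix for that particular bump (TypeScript, hypothetical dialog shape): label the buttons with the user's outcomes rather than the program's view of the event it is handling.

      interface ExitPrompt {
        message: string;
        confirmLabel: string;   // what happens if the user goes ahead
        dismissLabel: string;   // what happens if the user stays
      }

      const leavingApp: ExitPrompt = {
        message: "The video is still buffering.",
        // "Continue"/"Cancel" is ambiguous (continue what?); name the outcomes instead.
        confirmLabel: "Exit app",
        dismissLabel: "Keep waiting",
      };

      // Stand-in for a real dialog: window.confirm only offers OK/Cancel,
      // which is exactly the ambiguity being avoided, so just log the intent.
      console.log(leavingApp.message, `[${leavingApp.dismissLabel}] [${leavingApp.confirmLabel}]`);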

  • by ilsaloving ( 1534307 ) on Friday January 27, 2017 @12:20PM (#53749163)

    If you look throughout software history, developers have *always* copied each other's styles, in ways no different than a fashionista would. In the 90s it was pseudo-3D buttons, because people wanted buttons that looked like buttons. (Personally, I *still* do.) When the WWW got popular, people started making *everything* look like hyperlinks. Thank god that didn't last long.

    But now... I just don't even know what to say. Style has completely overridden any semblance of usability. Google started the "3 parallel bars = menu" thing, and now everyone is doing it. 2D flat everything is now all the rage wherever you look, and people think they're being cool if they use obscure icons for things that may or may not have a passing resemblance to the function they're trying to perform. Intuitiveness has basically been thrown out the window.

    Case in point: Whoever came up with the UI for snapshot should be tarred, feathered, and shot, multiple times. While I eventually figured out how to use it, it took *effort* to figure it out, because it made about as much sense as Trump walking into a soup kitchen.

    I can only hope that sanity will return soon.

  • UX gone wrong (Score:5, Informative)

    by MobyDisk ( 75490 ) on Friday January 27, 2017 @12:21PM (#53749171) Homepage

    Developers traditionally make efficient, functional, ugly interfaces. They did this by using standardized UI controls; they were largely constrained. Today, without those constraints, those same developers make inefficient, semi-functional, pretty interfaces. And with the focus on form over function, they are pushed in this direction by management. (Thanks, Apple, for telling me that I want to get rid of all the jacks in my laptop so that it can be 0.00001 inches thinner.)

    A good UX person -- not the kind of BS "UX" that I see lambasted here, but a real one -- can improve the look and feel of an application, optimize the workflow, and make it pretty too. I work with a UX engineer who uses statistics on the average hand size of our target demographic, and can quote the average size and resolution of the displays they are using. On touch-screen apps, our UX team optimizes for right-handedness and organizes the screen so your hand doesn't cover the things you are looking at and so you make minimal movements. A few years ago we even created a mock-up, had actual users go through a workflow, and timed them, counted the number of clicks, etc. This is good UX. It's human factors engineering + graphic design.

    A sad anecdote: A few years ago I had the pain of designing a UI with a bunch of managers. It was a screen to add/edit/delete users who had access to an account. I drew up a typical text box with a list, and an add/edit/delete button at the bottom. You could fit 50 users on a typical screen, quite readably. They HATED it. Their design fit about 10 users on the screen. Big margins all around. Each row had separate add, edit, and delete buttons and a large single-color icon of a person. All the icons were the same, so they communicated nothing. The text was so large that long names needed an ellipsis to fit. The add/edit/delete buttons were tiny icons without text. It was pretty, wasteful, and slow. They loved it.

    On another project, which was an industrial machine, they wanted icon buttons. The previous version used 16-color EGA graphics, so it needed an update. So I used actual 3D renderings of the parts as icons. Initially everyone loved it because it was clear what the icons did. Three years later, it was laughed at because it was too "realistic." So on the next project they replaced the realistic icons with single-color conceptual representational icons. Unless you were on the project, you had no idea what an icon meant. The customers came up with names for the icons: the "one-eyed cat" let you search. The "Disney castle" was to load a tray into the device. The "laser broom" was the barcode scanner. This interface is loved by development because it is so pretty, and it is the new standard moving forward. The customers (and the training department) complain that unless someone uses the device regularly, they forget whether they should start the workflow by clicking the "one-eyed cat" or the "laser broom."

    And with the next project, they are using text under the icons again, so users know what they are.

  • by NotARealUser ( 4083383 ) on Friday January 27, 2017 @12:36PM (#53749321)

    Caveat: I am not a designer, but I do some programming for various apps/websites as a side job (though I focus on more behind-the-scenes stuff in my day job).

    I do not understand all the design decisions, especially the proliferation of interfaces with generic icons that could be mistaken for Ikea instructions. It is frustrating when you run into an icon that could be interpreted as "light phone on fire" or "turbo mode" and you really don't know for sure which it is. Do you try it???

    That being said, if I create an app or website that has nice instructions on it, the end user's first impression is to hate it. They say it does not look modern enough. However, it is intuitive to use and they can figure it out quickly. On the other hand, if I create an Apple/Material design type app, customers love it and accept it immediately. Of course, the UI is impossible for their customers, but hey, at least the company that requested the app likes it.

    I think a lot of this stems from the Instagram/Pinterest world we live in. Everyone wants to be blown away by the beauty of the app when they casually glance at it. Of course, that beauty greatly limits the possibilities to make an app intuitive and easy to use.

    As a developer, I find that I try to balance these things. But as someone who generally likes to get paid for my work, I will often say screw it, and give the "artsy focused" people what they want because they are the ones that sign the checks. The quicker I make them happy, the sooner I can get paid and move on to the next gig.

  • by dcollins ( 135727 ) on Friday January 27, 2017 @01:06PM (#53749617) Homepage

    For me, the #1 modern UI sin, which wasn't included in the list here -- Non-discoverable interfaces. Interfaces based on some "gesture" which is never explained, and for which one cannot find an explanation (unless you already know the gesture to get there, if it exists). Pinch-zoom, hover in a magic corner, drag from edge, press screen for short vs. long time, invisible menu bars, etc., etc. In the 1984-2010 era I could follow the words in the menus and discover new features in any piece of software (and so could anyone, assuming they weren't illiterate). The last few years have brought my first experiences with software that I just couldn't begin to figure out how to do anything with.

  • by Camel Pilot ( 78781 ) on Friday January 27, 2017 @01:47PM (#53749997) Homepage Journal

    My biggest complaint is the constant attempt to "abstract" simple concepts such as directories. For example, "My Libraries" abstracted on top of the easy-to-comprehend directory file system is an abomination. Ask the average user how to go to a directory hanging off the user directory (c:/user/$user) and they don't know how. You click on "my documents", but there is no clear way to go up one level of the hierarchy or even to understand where "My Libraries" really resides. Of course, this was even worse in the "my documents and settings" days. People readily understand a hierarchical directory and file system. Why they attempt to further abstract directories and files is beyond me. This is why Gates could not find the "downloads" directory in the anti-trust trial - where the hell is it? Even he didn't know! Gnome 3 makes the same mistake, IMHO.

  • by thegarbz ( 1787294 ) on Friday January 27, 2017 @01:49PM (#53750023)

    I was with you until the very last point. Lack of customizability is a good thing. It creates standardisation. It means when people pick up a device of same or similar model to their own they know how to use it without any guess work. It makes support and training easier, though admittedly at the expense of finely tuned specific tasks.

  • by steveha ( 103154 ) on Friday January 27, 2017 @02:48PM (#53750565) Homepage

    Our screens are way bigger than they were back in the old days, so we have plenty of room for things like menus and toolbars. Yet the trend in modern UI design is to make things magical and non-discoverable.

    Just yesterday I helped my father with a problem: the menus and toolbar in Thunderbird were gone. I was on the phone with him for a while. The task was to find the one magic part of the Thunderbird window where he could right-click and find the context menu with the checkboxes for hiding/displaying the main menu and toolbar. Thank goodness I have him running MATE, so every window has a title bar... "Find the blue bar at the top that says 'Inbox - Mozilla Thunderbird'. Now right-click in the dark grey area underneath that, to the right of the tab that says 'Inbox'..." "It didn't work." I'll spare you the back-and-forth; he had multiple tabs and was clicking in a tab to the right of "Inbox". Once I got him over to the correct magic spot, he found the context menu and restored his menu and toolbar. (The stupid hamburger menu is part of the toolbar, and hides with the toolbar... which means it's possible to hide all the menus! And my dad somehow did so by accident!)

    The original UI spec for the Macintosh required menus all the time for every app, and the menus had to be in the same place. And I learned very quickly that I could browse the menu, find the command I wanted, and the keyboard shortcut was documented right there in the menu. Hidden menus are far too magical, and if you are going to have them, the very least you should do is to make every context menu have the ability to unhide them, rather than requiring the mouse pointer to be hovering over a particular magical few pixels of your screen.

    I also remember the 45 minutes it took to help my dad un-mute YouTube videos. First I had him use the MATE sound preferences dialog to test his speakers, which just took a couple of minutes. Then I had to walk him through moving the mouse pointer over the YouTube video window to make the controls un-hide... (he wasn't full-screen, why do the controls hide when there is plenty of screen real estate available?) Then he had to move the mouse pointer to touch the audio control (and a slider pops out when you get it right) and click to un-mute... and when it's un-muted it says "MUTE". Because when it's un-muted the button becomes the "MUTE" button, and when it's muted the same button becomes the "Un-mute" button. The old-school solution would be a checkbox labelled "MUTE" that's checked when it's muted; the newer way would be a GUI toggle that slides left for un-mute and slides right for mute. There's plenty of screen real estate for either of these.

    I know, I know, on mobile devices these magical hiding tricks are not so pointless because screens are smaller. But desktops are not mobile devices and trying to treat them the same is a bad idea.

    My dad is not stupid and I don't want to sound like I'm making fun of him. I'm just annoyed over the modern trend in UI design where everything is so magical that it's tricky and weird.
