
Take This GUI and Shove It

Posted by Soulskill
from the i-ain't-clickin'-here-no-more dept.
snydeq writes "Deep End's Paul Venezia speaks out against the overemphasis on GUIs in today's admin tools, saying that GUIs are fine and necessary in many cases, but only after a complete CLI is in place, and that they cannot interfere with the use of the CLI, only complement it. Otherwise, the GUI simply makes easy things easy and hard things much harder. He writes, 'If you have to make significant, identical changes to a bunch of Linux servers, is it easier to log into them one-by-one and run through a GUI or text-menu tool, or write a quick shell script that hits each box and either makes the changes or simply pulls down a few new config files and restarts some services? And it's not just about conservation of effort — it's also about accuracy. If you write a script, you're certain that the changes made will be identical on each box. If you're doing them all by hand, you aren't.'"
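The "quick shell script" the summary describes might be sketched like this. The host names, paths, and remote command are made-up examples, and `SSH=echo` keeps it a dry run that only prints what would be executed:

```shell
#!/bin/sh
# Fan one change out to many servers. HOSTS and the remote command are
# hypothetical; substitute your own inventory and change.
push_config() {
    for h in $HOSTS; do
        # ${SSH:-ssh} lets SSH=echo turn this into a harmless dry run.
        ${SSH:-ssh} "$h" "$1" || echo "FAILED on $h" >&2
    done
}

HOSTS="web1 web2 db1"
SSH=echo    # dry run; remove this line to actually connect
push_config "cp /srv/staging/app.conf /etc/app/app.conf && service app restart"
```

Because the same command string is sent to every host, the change is identical everywhere, which is exactly the accuracy argument being made.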


  • Better test! (Score:5, Insightful)

    by AnonymousClown (1788472) on Monday October 04, 2010 @07:14PM (#33789376)

    If you write a script, you're certain that the changes made will be identical on each box.

    One little mistake in the script and you fuck up the whole organization.

  • by pedantic bore (740196) on Monday October 04, 2010 @07:14PM (#33789388)

    I think the author might not fully understand who most admins are. They're people who couldn't write a shell script if their lives depended on it, because they've never had to. GUI-dependent users become GUI-dependent admins.

    As a percentage of computer users, people who can actually navigate a CLI are an ever-diminishing group.

  • by maxwell demon (590494) on Monday October 04, 2010 @07:17PM (#33789408) Journal

    What would be nice is if the GUI could automatically create a shell script doing the change. That way you could (a) learn about how to do it per CLI by looking at the generated shell script, and (b) apply the generated shell script (after proper inspection, of course) to other computers.

  • Re:Better test! (Score:4, Insightful)

    by Steve Max (1235710) on Monday October 04, 2010 @07:18PM (#33789420) Journal
    So you test your script offline? You know, exactly like you test the changes you will make through a GUI on an offline server before going to the live one?
  • Re:Better test! (Score:5, Insightful)

    by snspdaarf (1314399) on Monday October 04, 2010 @07:21PM (#33789442)
    Ah, but with a script you have a record of what was done. The GUI does not provide that, unless the author had the sense to write the changes to a log file.
  • More and more... (Score:5, Insightful)

    by Darkness404 (1287218) on Monday October 04, 2010 @07:21PM (#33789446)
    There are more and more small businesses (5, 10 or so employees) realizing that they can get things done more easily if they have a server. Because the business can't really afford to hire a sysadmin or a full-time tech person, it's generally the employee who "knows computers" (you know, the person who has to help the boss check his e-mail every day, etc.) who ends up running it, and since they don't have the knowledge of a skilled *Nix admin, a GUI makes their administration a lot easier.

    So with the increasing use of servers among non-admins, it only makes sense for a growth in GUI-based solutions.
  • Config files. (Score:3, Insightful)

    by Timmmm (636430) on Monday October 04, 2010 @07:22PM (#33789450)

    Config files are one of the problems with Linux. Most of them are far too hard to parse and modify, so there aren't any GUI tools to do so. For example, how do you change the PATH in Linux through the GUI? As far as I know there is no way. In Windows it is (fairly) simple.

    Of course there's no reason why you can't have the best of both worlds - every program can be abstracted into a CLI and GUI on top of the same base library.
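As an aside on the PATH example above, from the CLI it comes down to appending to a shell startup file (assuming a Bourne-style shell that reads ~/.profile; some setups use ~/.bash_profile instead):

```shell
# Add $HOME/bin to PATH for this and future login shells.
echo 'export PATH="$PATH:$HOME/bin"' >> "$HOME/.profile"
. "$HOME/.profile"                                 # re-read it in the current shell
printf '%s\n' "$PATH" | tr ':' '\n' | tail -n 1    # show the newly added entry
```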

  • by arth1 (260657) on Monday October 04, 2010 @07:26PM (#33789512) Homepage Journal

    But a great GUI is one that teaches a new user to eventually graduate to using CLI.

    I disagree. The "great" GUIs teach new users that they don't need a CLI, and many of them will never proceed past the point-and-drool stage, but expect a GUI app for anything they do.

    When push comes to shove, a single line of awk or perl can easily do what half an hour of GUI clicking can't. And there's no way for a GUI to prepare you for perl or awk, only distract you from learning them.

  • Yes (Score:3, Insightful)

    by prichardson (603676) on Monday October 04, 2010 @07:27PM (#33789520) Journal

    This is also a problem with Mac OS X Server. Apple builds their services from open source products and adds a GUI for configuration to make it all clickable and easy to set up. However, many options that can be set on the command line can't be set in the GUI. Even worse, making CLI changes to services can break the GUI entirely.

    The hardware and software are both super stable and run really smoothly, so once everything gets set up, it's awesome. Still, it's hard for a guy who would rather make changes on the CLI to get used to.

  • by Sycraft-fu (314770) on Monday October 04, 2010 @07:37PM (#33789618)

    I find it rather disturbing the UNIX ideal that sysadmins should be programmers. The opinion seems to be that it is perfectly ok for someone to need to do a fair bit of programming work to solve a system problem. Ok but the thing is programming and systems administration are not identical skills any more than say being a musician and being a recording engineer are. They are related, but proficiency in one is not the same as the other. I know more than a few programmers that are abysmal at system administration, and I know sysadmins that can't program. There is nothing wrong with this.

    While I realize a simple (emphasis on simple) script isn't quite the same thing, this attitude smacks of the "People should just get down and code what they need," thing. No, not really. Not everyone should have to learn that skill, and you could well be excluding people you want by requiring it.

    Also there's the simple matter that GUIs work better for unfamiliar situations. While it might be easy to just say "Well a good admin should know about this," that is rather stupid. Nobody knows everything, you never get someone with limitless experience. Part of systems administration is being able to solve novel problems. Ok well GUIs help in that regard, at least when well designed. They show you your options, and how they flow, what ones exclude and influence others and so on. That can make it much faster to deal with something you are not familiar with. This is important and useful in real IT work.

    They also can help prevent errors. For example I can't count the number of times our DNS has been temporarily broken by a student messing up the file. If you do the formatting incorrect, screw up the serial number, etc and suddenly things stop working (we have it in a versioning system so it can be undone easily, of course). In Windows? Not a problem. The GUI keeps you from screwing things up. You can still make a bad entry or whatever, but you can't go and break the entire server.

    I'm not saying there's anything wrong with the command line, or that it should go away. However the idea that everything should be CLI based is silly.

  • by oatworm (969674) on Monday October 04, 2010 @07:38PM (#33789624) Homepage
    Bingo. Realistically, if you're a company with fewer than 100 employees (read: most companies), you're only going to have a handful of servers in house and they're each going to be dedicated to particular roles. You're not going to have 100 clustered fileservers - instead, you're going to have one or maybe two. You're not going to have a dozen e-mail servers - instead, you're going to have one or two. Consequently, the office admin's focus isn't going to be scalability; it just won't matter to the admin if they can script, say, creating a mailbox for 100 new users instead of just one. Instead, said office admin is going to be more focused on finding ways to do semi-unusual things (e.g. "create a VPN between this office and our new branch office", "promote this new server as a domain controller", "install SQL", etc.) that they might do, oh, once a year.

    The trouble with Linux, and I'm speaking as someone who's used YaST in precisely this context, is that you have to make a choice: do you let the GUI manage it, or do you CLI it? If you try to do both, there will be inconsistencies, because the grammar of the config files is too ambiguous; consequently, the GUI config file parser will probably just overwrite whatever manual changes it thinks are "invalid", whether they really are or not. If you let the GUI manage it, you had better hope the GUI has the flexibility necessary to meet your needs. If, for example, YaST doesn't understand named Apache virtual hosts, well, good luck figuring out where it's hiding all of the various config files that it was sensibly spreading out in multiple locations for you, and don't you dare use YaST to manage Apache again or it'll delete your Apache-legal but YaST-"invalid" directive.

    The only solution I really see is for manual config file support with optional XML (or some other machine-friendly but still human-readable format) linkages. For example, if you want to hand-edit your resolv.conf, that's fine, but if the GUI is going to take over, it'll toss a directive on line 1 that says "#import resolv.conf.xml" and immediately overrides (but does not overwrite) everything following that. Then, if you still want to use the GUI but need to hand-edit something, you can edit the XML file using the appropriate syntax and know that your change will be reflected on the GUI.

    That's my take. Your mileage, of course, may vary.
  • by amRadioHed (463061) on Monday October 04, 2010 @07:39PM (#33789636)

    I'm a bit curious: could you explain, for those of us who've never had to deal with that beast, how PowerShell encourages screen-scraping?

  • by maxwell demon (590494) on Monday October 04, 2010 @07:42PM (#33789662) Journal

    GUI's are better for reporting and displaying information

    In my experience, GUIs tend to display less information (probably so as not to "confuse" users). But in terms of the basic ability to provide useful information, I don't see why one should have an advantage over the other. After all, the information is just text; whether that text is shown on the console or in a window with an "OK" button doesn't matter. What does matter is whether the text is informative (e.g. "foo.cfg: file not found") or uninformative (e.g. "unable to change configuration" as the only error message).

  • by maotx (765127) <maotx@@@yahoo...com> on Monday October 04, 2010 @07:42PM (#33789666)
    That's one thing Microsoft did right with Exchange 2007. They built it entirely around their new PowerShell CLI and then built a GUI for it. The GUI is limited compared to what you can do with the CLI, but you can get most things done. The CLI becomes extremely handy for batch jobs and exporting statistics to CSV files. I'd say it's really up there with bash in terms of scripting, data manipulation, and integration (not just Exchange but WMI, SQL, etc.)

    They tried to do something similar with Windows 2008 and their Core [petri.co.il] feature, but they still have to load a GUI to present a prompt...
  • Re:Better test! (Score:2, Insightful)

    by fruviad (5032) on Monday October 04, 2010 @07:43PM (#33789668)

    If you write a script, you're certain that the changes made will be identical on each box.

    One little mistake in the script and you fuck up the whole organization.

    Perhaps so, but the scripted mistake is easily fixed because every single machine exhibits the same symptoms. Easy to debug. Once debugged, easy to fix.

    Do you think it better to have a half-dozen different mistakes on a half-dozen different servers in a pool of 40?

  • by jandrese (485) <kensama@vt.edu> on Monday October 04, 2010 @07:45PM (#33789704) Homepage Journal
    Here's something you might want to try: Next time you're on a Windows box, open up a cmd prompt and type "netsh". You might be surprised what you can accomplish from the commandline, at least if you want to mess with the network settings.
  • by arth1 (260657) on Monday October 04, 2010 @07:46PM (#33789726) Homepage Journal

    Why would I want to read a bunch of documentation, mess with command line options, then read whole block of text to see what it did?

    I'd much rather sit back in my chair, click something, and then see if it worked. Don't make me read a bunch of man pages just to do a simple task.

    Because then you'll be stuck at doing simple tasks, and will never be able to do more advanced tasks. Without hiring a team to write an app for you instead of doing it yourself in two minutes, that is.

    The time you spend reading man pages is an investment which pays off in the long run. But if you belong to the instant gratification generation, that may not be what you want, no...

  • by jpate (1356395) on Monday October 04, 2010 @07:47PM (#33789730) Homepage

    learn about how to do it per CLI by looking at the generated shell script

    Have you ever seen generated code? You do not want to learn shell scripting from generated code...

  • by fishbowl (7759) on Monday October 04, 2010 @07:52PM (#33789770)

    In a directory hierarchy, find files with the extension ".ftl" that have been modified since the last SVN revision, and containing the expression "${inventoryItem.qoh}", and for each of these, substitute "${inventoryItem.atp}".

    I'd love to see GUIs that can facilitate the kinds of work I do on a constant basis, but since there's no such thing, I've become something of a Unix shell power user over the past couple of decades.
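For the record, that task scripts out to something like the following sketch (GNU sed assumed for `-i`; in real use the "modified since the last SVN revision" filter would come from `svn status`, for which plain `find` stands in here):

```shell
# Find .ftl files containing the old expression and rewrite it in place.
find . -name '*.ftl' -exec grep -lF '${inventoryItem.qoh}' {} + |
while IFS= read -r f; do
    sed -i 's/${inventoryItem\.qoh}/${inventoryItem.atp}/g' "$f"
done
```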

  • by GumphMaster (772693) on Monday October 04, 2010 @08:00PM (#33789822)
    Ditto, pretty well executed I thought.
  • by petermgreen (876956) <plugwash@@@p10link...net> on Monday October 04, 2010 @08:02PM (#33789836) Homepage

    Have you ever seen generated code?
    Yes

    You do not want to learn shell scripting from generated code...
    IMO the generation process should be limited to taking the user's input and "plugging it in" to a "template" command or short sequence of commands. If a process that is simple in the GUI is complex in the CLI, then your system has a design fault.

    It's not about teaching the user how to write complex scripts with lots of conditionals (manuals and tutorials are better for that). It's about teaching the users the command line equivalents of their GUI actions and hence creating a bridge between the "discoverability" of a GUI and the power and repeatability of a CLI.
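A minimal sketch of that bridge, with made-up field names: the tool takes the user's input, plugs it into a fixed command template, and shows the CLI equivalent before anything runs.

```shell
# Values the GUI would collect from form fields (hypothetical names):
NEW_USER=alice
LOGIN_SHELL=/bin/bash

# Plug them into the template and show the user the exact CLI equivalent.
CMD="useradd -m -s $LOGIN_SHELL $NEW_USER"
echo "Equivalent command: $CMD"
# A real tool would now run it (and log it): eval "$CMD"
```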

  • by petes_PoV (912422) on Monday October 04, 2010 @08:07PM (#33789870)
    Oh there's much more to it than merely the O/S. It's all the applications and third party management tools, too. They all provide GUIs (sometimes only GUIs) as they think it makes their stuff look easy to use. In fact all it does is make it easier to sell to decision makers who don't have the background to distinguish "friendly" from repeatable.
  • by IICV (652597) on Monday October 04, 2010 @08:25PM (#33789996)

    And to drive the point of this article home: how long would your post have been if you'd had to describe how to do this through the GUI? Would it have even been possible without screenshots?

  • by rlh100 (695725) on Monday October 04, 2010 @08:25PM (#33789998) Homepage

    Duhh... He does show a keen grasp of the obvious. And for people who use command lines he is preaching to the choir.

    On the other hand, the GUI-based people will miss it entirely. They will talk about add-on GUI replay tools that allow one set of mouse clicks to be replayed to many different servers, or configuration management tools that do the work for you. I believe they truly do not understand that someone could get 20 mouse clicks on 40 different servers wrong. "Why would someone ever click the wrong check box?" They also believe that screen shots are a valid way to store configuration information offline.

    Only half in jest.

    RLH

  • by maxwell demon (590494) on Monday October 04, 2010 @08:29PM (#33790030) Journal

    OTOH, I'd say if there's an important security setting, and it's not set to the secure value by default (or, if that is not possible, gives an error when not set explicitly), that's a design error in the application.
    Also, empty configuration files IME are rare nowadays; usually they are pre-filled with (mostly commented-out) example settings, with explanations in comments. That often allows you to just uncomment the settings you want, instead of writing the complete command by hand.

    BTW, comments are another advantage of config files vs. GUIs. Not only because you can state the reason why you put a certain setting right at the setting itself, but also because when you change a setting you can just comment out the previous setting, and therefore easily undo whatever you changed (another option for that is, of course, to make a backup copy of the original config file). I don't see how you can do that with a GUI.
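For instance (sshd_config picked arbitrarily as the example), a commented config change carries its own history and its own undo:

```
#Port 22       # previous value, kept so undoing the change is one comment swap
Port 2222      # 2010-10-04: moved off the default to cut scanner log noise
```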

  • by Anonymous Coward on Monday October 04, 2010 @08:35PM (#33790084)

    I find it rather disturbing the UNIX ideal that sysadmins should be programmers. The opinion seems to be that it is perfectly ok for someone to need to do a fair bit of programming work to solve a system problem. Ok but the thing is programming and systems administration are not identical skills any more than say being a musician and being a recording engineer are.

    A recording engineer should be musically literate, though. He should know keys, tempos, styles, and so on, in order to be able to have a productive session with the recording artist. He may not be the best player in the world but he had better know his shit.

  • by value_added (719364) on Monday October 04, 2010 @08:42PM (#33790130)

    But what happens if the system stores those values opaquely, so that you can't ever know what they are, or refer to them as something distinct from the system itself?

    The Windows registry.

    Apparently, it's fully documented, and much easier to understand, parse, manipulate, backup and restore than those nasty "config files" that need to be "edited manually" using "cryptic commands".

    Or so I'm told. ;-)

    Your comment on the "rote memorization" aspect of using GUIs I found particularly astute. When I see official documentation that consists of little other than a series of screenshots, I wonder how it came to be that behaving like a monkey, or more charitably, aspiring to ignorance, became not only widespread, but acceptable.

  • by Charles Dodgeson (248492) <jeffrey@goldmark.org> on Monday October 04, 2010 @08:51PM (#33790206) Homepage Journal

    Probably Debian would have been OK, but I was finding admin of most Linux distros a pain for exactly these reasons. I couldn't find a layer where I could do everything that I needed to do without worrying about one thing stepping on another. No doubt there are ways that I could manage a Linux system without running into different layers of management tools stepping on each other, but it was a struggle.

    There were other reasons as well (although there is a lot that I miss about Linux), but I think that this was one of the leading reasons.

    (NB: I realize that this is flamebait (I've got karma to burn), but that isn't my intention here.)

  • by skids (119237) on Monday October 04, 2010 @08:55PM (#33790242) Homepage

    I think this is a stronger point than the OP: GUIs do not lead to good documentation. In fact, GUIs pretty much are limited to procedural documentation like the example you gave.

    The best they can do as far as actual documentation, where the precise effect of all the widgets is explained, is a screenshot with little quote bubbles pointing to each doodad. That's a ridiculous way to document.

    This is as opposed to a command reference which can organize, usually in a pretty sensible fashion, exact descriptions of what each command does.

    Moreover, the GUI authors seem to have a penchant for finding new names for existing CLI concepts. Even worse, those names are usually inappropriate vagaries, quickly cobbled together as an off-the-cuff afterthought, and they do not actually tell you where the doodad resides in the menu system. With a CLI, the name of the command or feature set is its location.

    Not that even good command references are mandatory by today's pathetic standards. Even the big boys like Cisco have shown major degradation in the quality of their documentation during the last decade.

  • by arth1 (260657) on Monday October 04, 2010 @08:58PM (#33790268) Homepage Journal

    A pipe to xargs isn't always more efficient.

    Consider a script that has to be portable and operate on files with spaces in the name. Portability precludes the -print0 option to find and -0 option to xargs. Nothing is more inefficient than a command that doesn't work correctly, and I've seen plenty of examples of "find ... | xargs ..." that don't work, where an -exec would.

    Also consider the situation where you only want to act on a small number out of a large number of files. Then the pipe to xargs will delay execution until you've exhausted the search, while the -exec will perform the job as the files are found. It won't return any quicker, but the load will be spread out, with the first few tasks happening sooner.
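The two forms side by side: the first is portable and safe with spaces in names, the second assumes GNU find and xargs for `-print0`/`-0`:

```shell
# Portable: find quotes each name itself and starts work as files turn up.
find . -name '*.tmp' -type f -exec rm -f {} \;

# GNU batching: NUL-separated names, many files per rm invocation.
find . -name '*.tmp' -type f -print0 | xargs -0 -r rm -f
```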

  • by MaskedSlacker (911878) on Monday October 04, 2010 @08:58PM (#33790272)

    Your entire argument is specious--it boils down to thinking that because people don't already know it, it's too hard. With logic like that we'd still be living in the Stone Age.

    Much 'easier' than typing it out on the CLI

    No. Marginally easier. And that's the bottom line--GUIs make already easy tasks marginally easier, and they make hard tasks (like one of your other repliers' suggestions) vastly harder. Unless you never use your computer to do anything complicated, the GUI is a step back.

  • by Tanktalus (794810) on Monday October 04, 2010 @09:01PM (#33790306) Journal

    I have seen generated code. And I've written code generators. And really, the quality of the generated code is completely dependent on whether the developer of the generation tool was merely doing whatever was required to get working generated code, or trying to provide a useful tool for users to learn from and expand upon.

    Too often I see devs handed a task who do the bare minimum to get that one specific task completed, instead of looking at the bigger picture and seeing if they can get more out of it with little to no extra effort. Because, really, getting your output to be readable is not nearly as complex as creating the UI for it anyway. It took me on the order of hours to tweak my output to look right, and not even 50% of that time again to tweak it the way feedback recommended (some of which I disagreed with and responded as such; the rest I agreed with or was ambivalent about, and thus changed). The rest of the project took months of effort. Skimping on readable output is a false savings here.

    It's not hard to do. A good template tool (in perl, I use Template Toolkit), and 90% of the problem of readable output is solved.

    The only times I don't care about readable output have been generating xml and xhtml. Even when I generate C++ and Java code that is going directly from there to the compiler and not read by any user, I try to get the output reasonably readable - which makes it much easier to deal with compile errors when they point to a line number that has only a single statement on it, or when I bring the resultant app up in a debugger and want to step through the code for one reason or another. The time saved in debugging compile or runtime errors alone can pay for the effort in readable output over and over again.

  • by turbidostato (878842) on Monday October 04, 2010 @09:07PM (#33790358)

    "I find it rather disturbing the UNIX ideal that sysadmins should be programmers."

    That should be the case, but it isn't: firstly because becoming a good sysadmin is a full-time activity, just as being a good programmer is; and secondly because of subtle character differences between the people who choose one role or the other.

    "I know more than a few programmers that are abysmal at system administration, and I know sysadmins that can't program. There is nothing wrong with this."

    Yes, there is, and very wrong. Current IT systems are still far from mature enough for people to work in silos. A programmer doesn't need to be a top-notch sysadmin, nor the other way around, but they both need to have very clear ideas about the other's trade: you need to understand where your program is going to run and how, and what proper practices look like for accommodating programs within a wider and partially peculiar local environment (and to tell properly engineered programs from lame attempts).

    "Not everyone should have to learn that skill, and you could well be excluding people you want by requiring it."

    And, in fact, not everyone needs to learn that skill; it's only sysadmins who need it. And take it for granted that you are not excluding anyone interesting for a sysadmin role if you require at least clear foundations in programming.

    "Also there's the simple matter that GUIs work better for unfamiliar situations."

    Quite true (but proper man pages with examples and tutorials work almost as well).

    "Part of systems administration is being able to solve novel problems. Ok well GUIs help in that regard, at least when well designed. "

    But don't forget a *very* critical point: a new thing is only a novelty the first time you do it. Do not let a bit of ease on your first time get in the way of the subsequent 10,000 times you will do it again from then on.

    And that's exactly why GUIs sell so well. When you are "buying" something new (it might mean literally exchanging money, but more generally it means committing yourself to the effort of working with the new thing), it will usually be to do new things, which is exactly the kind of situation where GUIs (and "wizards", for that matter) help, so the GUI by itself is a very valuable agent for selling the app/service. By the time you understand that the shiny GUI gets in the way, you have invested too much in the app (money, time and effort) to get away from it. Microsoft, for instance, has learned that lesson very, very well.

    "They also can help prevent errors. For example I can't count the number of times our DNS has been temporarily broken by a student messing up the file."

    That's not an argument, it only seems to be one. While "manual handling" is prone to syntax failures, GUIs are prone to knowledge failures, which are, by the way, much, much more difficult to debug. For each time you had a student making a severe syntax error in a DNS zone file, I can show you a self-styled sysadmin making horrible design choices that led to situations difficult to repair and problems difficult to debug, because the GUI allowed the action on Windows environments (not a failure of the GUI itself, because the action was correct under the proper circumstances, but because the GUI allowed someone without enough knowledge of the consequences of their acts to do "something" resulting in an "OK" message: a case of "garbage in, garbage out").

    So it's a stalemate on this.

    "In Windows? Not a problem"

    Now that you mention text config files vs. GUIs in an environment with multiple admins (some of them students), here comes a huge problem with the vast majority of GUIs:

    You get at work early in the morning and something is not working properly. You summon your minions and tell them "something is broken; what have you messed up since yesterday?"

    On a text-file-based environment the answer is easy, and you already advanced it: "we have it in a versioning system so it can be undone easily".

  • by 10101001 10101001 (732688) on Monday October 04, 2010 @09:09PM (#33790380) Journal

    I find it rather disturbing the UNIX ideal that sysadmins should be programmers. The opinion seems to be that it is perfectly ok for someone to need to do a fair bit of programming work to solve a system problem.

    I'd presume this comes about from the fact that [administration] software used to be very expensive, so it was normally cheaper to hire a sysadmin/programmer than to hire a sysadmin and buy separate software. The fact that most sysadmins used to be at least minimally programmers (i.e., they could write a shell script) certainly helped in that.

    Ok but the thing is programming and systems administration are not identical skills any more than say being a musician and being a recording engineer are. They are related, but proficiency in one is not the same as the other. I know more than a few programmers that are abysmal at system administration, and I know sysadmins that can't program. There is nothing wrong with this.

    Quite true. Meanwhile, most companies demand the equivalent of musician/recording engineers for the price of slightly more than a musician. And that tends to drive down the wages of those who are the equivalent of just recording engineers. That's just economics at play.

    While I realize a simple (emphasis on simple) script isn't quite the same thing, this attitude smacks of the "People should just get down and code what they need," thing. No, not really. Not everyone should have to learn that skill, and you could well be excluding people you want by requiring it.

    Yes, and that's why Windows System Administrators tend to be paid less than UNIX/Linux System Administrators. And where Windows can be said to excel is in providing for common tasks for very small businesses (ie, they only have one or two IT staff) in a GUI format so no programmer is needed.

    Also there's the simple matter that GUIs work better for unfamiliar situations. While it might be easy to just say "Well a good admin should know about this," that is rather stupid. Nobody knows everything, you never get someone with limitless experience.

    No, I think it would be said that a good admin should know how to learn in an unfamiliar situation. GUIs can certainly facilitate this, but GUIs don't magically remove the need to understand.

    Part of systems administration is being able to solve novel problems. Ok well GUIs help in that regard, at least when well designed. They show you your options, and how they flow, what ones exclude and influence others and so on. That can make it much faster to deal with something you are not familiar with. This is important and useful in real IT work.

    I think that heavily depends on your use of the word "novel". Most problems in system administration aren't really novel. They're merely new to the system administrator. In that regard, GUIs are great for helping provide the mechanism to do useful IT work. But, once you step into areas which are actually novel, GUIs by definition can rarely be of help. But, then, CLIs may be of little help either. At that point, you really do need to program, be it a script or an actual C/Java/whatever program.

    They also can help prevent errors. For example I can't count the number of times our DNS has been temporarily broken by a student messing up the file. If you do the formatting incorrect, screw up the serial number, etc and suddenly things stop working (we have it in a versioning system so it can be undone easily, of course). In Windows? Not a problem. The GUI keeps you from screwing things up. You can still make a bad entry or whatever, but you can't go and break the entire server.

    Very true, much like how client-side javascript can aid people in validating their input before it's actually used. That's certainly a good strength of a GUI when it's dealing with well-understood, common content.
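
    The same kind of input validation is available on the CLI side too. A minimal sketch, assuming BIND's named-checkzone is installed; the zone name "example.com" and the file contents below are hypothetical stand-ins:

    ```shell
    #!/bin/sh
    # Sketch: validate a zone file before it ever reaches a production server.
    # "example.com" and db.example.com are made-up examples.
    cat > db.example.com <<'EOF'
    $TTL 3600
    @   IN SOA ns1.example.com. hostmaster.example.com. (
            2010100401 ; serial
            7200 3600 1209600 3600 )
        IN NS  ns1.example.com.
    ns1 IN A   192.0.2.1
    EOF

    # named-checkzone (part of BIND) catches syntax and serial mistakes
    # before a reload, much like a GUI's input validation would.
    if command -v named-checkzone >/dev/null 2>&1; then
        named-checkzone example.com db.example.com || echo "zone rejected"
    else
        echo "named-checkzone not installed; skipping syntax check"
    fi
    ```

    Run as a pre-commit hook or in the push script, this gives the text-file workflow the same "can't save a broken entry" property the GUI is being credited with.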

  • Re:Better test! (Score:3, Insightful)

    by dbIII (701233) on Monday October 04, 2010 @09:10PM (#33790390)
    One little mistake in a global GUI and you not only fuck up the whole organization but can't really be sure what you did unless you had something capturing the screen while you were making the changes.
    Scripts can be debugged, working out which pictures were pointed at is a bit more tricky.
    Communicating with computers can be compared to communicating with people. You can get somewhere by pointing at things but you get better results faster if you can use words as well.
  • by ToasterMonkey (467067) on Monday October 04, 2010 @09:24PM (#33790470) Homepage

    Providing a great GUI for complex routers or Linux admin is hard. Of course there has to be a CLI, that's how pros get the job done. But a great GUI is one that teaches a new user to eventually graduate to using CLI.

    Just about everyone reading this is heavily biased one way or another, and there is too much presumption that a CLI is this or a GUI is that.

    Why can't we break this down into what makes any type of interface good or bad, and keep open to the possibility of new types of interfaces or better ways of implementing existing ones?
    If we can bitch and moan about CLI vs. GUI with little choice in the matter, I think the floor's open to made-up interfaces too. These are qualities I think any computer interface should have.

    Learning curve no steeper than the underlying concepts. Probably even lower.
    Consistent, and predictable.
    Expressive, and concise.
    Integrity. Um.. as in, the state of the machine vs. what's conveyed to the user. Accurate?
    Available over a network.
    Can be automated.
    Online documentation. Of the interface. If you have trouble describing the concepts in your native language, maybe it sucks.
    Efficient, in the sense of labor involved, but also in the sense of learning, like my first rule.

    That's a start anyway. I don't see why any kind of interface can't shoot for those. For sure, if you're going to deliver more than one kind of interface, they should relate to each other as much as possible. A CLI is dead simple to turn into a script, but a GUI could also be. A CLI should map so closely to a corresponding GUI that it effectively IS a script of the GUI. A GUI should convey state information just by looking at it. Duh. If it doesn't HAVE to be an either/or situation, why make it into one?


  • by arth1 (260657) on Monday October 04, 2010 @09:28PM (#33790502) Homepage Journal

    "They also can help prevent errors. For example I can't count the number of times our DNS has been temporarily broken by a student messing up the file."

    That's not an argument; it only seems like one. While "manual handling" is prone to syntax failures, GUIs are prone to knowledge failures, which are, by the way, much, much more difficult to debug.

    Never mind the things that the GUI just can't do, because the GUI is limited by the knowledge of the person who wrote it. Try entering an rdata_44 record in the DNS through a GUI -- chances are it won't let you. Or set up logging for a single client. Or remove all entries from the zone that has an expiration of a certain number of minutes (which is something you might want to do when replacing a DHCP server or similar). The limitations are endless, unless the UI simply integrates a text editor, in which case the question is why bother?

  • by MightyMartian (840721) on Monday October 04, 2010 @09:38PM (#33790544) Journal

    What would be REALLY nice is for all you router manufacturers using Linux under the hood to give us shell access, so that we could gain full access to iptables, VPN and routing. Just about every one of these Linux-based routers has all that power locked up in crappy web-based configuration tools that render them all but brain-dead. Yeah, I know, there's DD-WRT and its various iterations, but those only work on a subset of Linux-based routers.

  • by turbidostato (878842) on Monday October 04, 2010 @09:48PM (#33790624)

    "Realistically, if you're a company with less than a 100 employees [...] it just won't matter to the admin if they can script, say, creating a mailbox for 100 new users instead of just one."

    Well... it won't matter till the day your single server dies and you learn that, being a little drop in the sea, you don't have a "gold support contract" and it takes you one week to get a new spec'ed server (if only I weren't such a cheap ass and I'd buy two...). And when you turn on the new server you learn that you don't have the slightest idea about how it was exactly configured with all the tweaks you added through the years.
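
    One cheap insurance policy against exactly that scenario is a nightly snapshot of the config tree, shipped off-box. A minimal sketch; the directory names are placeholders (in practice CONF_DIR would be /etc and the backup directory would live on another machine):

    ```shell
    #!/bin/sh
    # Sketch: snapshot a config tree so a dead server can be rebuilt.
    # CONF_DIR and BACKUP_DIR are hypothetical; rsync/scp the tarball off-box.
    CONF_DIR=${CONF_DIR:-./demo-etc}
    BACKUP_DIR=${BACKUP_DIR:-./backups}

    mkdir -p "$CONF_DIR" "$BACKUP_DIR"
    echo "tweak=42" > "$CONF_DIR/app.conf"   # stand-in for years of hand tweaks

    # One dated tarball per day; cron this and the "how was it configured?"
    # question answers itself when the replacement hardware arrives.
    STAMP=$(date +%Y%m%d)
    tar -czf "$BACKUP_DIR/etc-$STAMP.tar.gz" \
        -C "$(dirname "$CONF_DIR")" "$(basename "$CONF_DIR")"
    ls "$BACKUP_DIR"
    ```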

    "The trouble with Linux, and I'm speaking as someone who's used YaST in precisely this context, is that you have to make a choice - do you let the GUI manage it or do you CLI it?"

    Your problem is YaST, not your approach.

    "The only solution I really see is for manual config file support with optional XML"

    I'll tell you a better one (XML for configurations is almost never the answer).

    1) Proper engineering practices. Most of them are quite simple, once you get them. Just an example that would be quite apt to this case: Debian favours, as much as possible, keeping configurations for complex services in their own directory, '/etc/complexservice.d/', where you drop "config snippets" instead of maintaining one big /etc/complexservice.conf file. If Suse did the same, YaST could tweak the config snippets you allow it to, and you could manage by hand the ones you wanted/needed.

    But ask yourself why you are still using Suse, and you will probably have to answer "because of YaST". Suse has no interest in making YaST a proper configuration tool, because YaST is a lock-in tool instead. For the "easy" things you go with YaST, so you don't learn how to do them "by hand" in the cases where it makes more sense, the easy ones; by the time YaST comes up short, you don't know how to fully do it "by hand", and you can't manage it partly "by hand" because YaST will overwrite it. Net result: you end up locked into Suse.

    2) Today, almost any "workgroup-graded" server has enough horsepower to handle virtualization. Fire up a virtual guest, test your new configs in the test environment, making use of the GUI tools if needed, and then go analyze the resultant configuration and replicate it by hand in your production environment. This will bring you the best of both worlds: fast entry path for new things by means of the GUI while retaining full customization abilities on production (and the analysis part will make you learn a lot at a very fast pace).
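
    The drop-in directory pattern from point 1 can be sketched in a few lines; "complexservice" here is the post's own placeholder, not a real service:

    ```shell
    #!/bin/sh
    # Sketch of the drop-in pattern: tool-managed and hand-managed snippets
    # coexist in one directory instead of fighting over a single file.
    mkdir -p complexservice.d
    echo "port = 8080"  > complexservice.d/10-yast-generated.conf  # the GUI owns this one
    echo "workers = 16" > complexservice.d/50-local-tuning.conf    # you own this one

    # The service (or an init wrapper) assembles the effective config;
    # the numeric prefixes give a predictable load order.
    cat complexservice.d/*.conf > complexservice.conf
    cat complexservice.conf
    ```

    Because each tool only touches its own snippet, the GUI can never clobber the settings you manage by hand.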

  • by turbidostato (878842) on Monday October 04, 2010 @10:24PM (#33790852)

    "If your GUI dosn't offer the same advantages than a console/terminal text interface, then your GUI is lacking some features and is a GUI for some but not all your needs, very common in the nix world btw"

    Now, please, tell me how well a GUI-oriented environment copes with:
    * Repeatability (I want exactly the same you did yesterday on server A, now on server B)
    * Auditing (what changed on this server since yesterday, and who did it?)
    * Automation (remember what you did yesterday on server A? I want you to do it again on server B; but do it tonight at 2AM -no, no extra hours payment allowed)
    * Conformance (now that we know how exactly it has to be done from our working on the test environment, let's do it -without mistakes, in our 100 production servers)
    * Orchestration (let's do changes a, b and c on servers A, B and C as fast as possible, without mistakes, in proper order and each step only after being sure the previous one is properly working -oh, and let's do it at 2AM -why do you ask it again? No, no extra hours payment allowed)

    As far as I know, all of them are virtually trivial on text-based, CLI-oriented systems.
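
    A minimal sketch of the repeatability/automation points, with made-up host names and a made-up change. The DRY_RUN guard just prints what would run, which doubles as an audit trail; dropping it and letting ssh fire gives the real rollout:

    ```shell
    #!/bin/sh
    # Sketch: the same change, byte for byte, across a list of servers.
    # HOSTS and the command are hypothetical.
    HOSTS="serverA serverB serverC"
    DRY_RUN=1
    : > rollout.log

    run() {
        if [ "$DRY_RUN" = 1 ]; then
            echo "WOULD RUN on $1: $2" | tee -a rollout.log
        else
            ssh "$1" "$2"    # identical command on every box: repeatability
        fi
    }

    for h in $HOSTS; do
        run "$h" "cp /etc/app.conf /etc/app.conf.bak && service app restart"
    done
    # Schedule it for 2 AM (no extra hours payment) with cron or:
    #   echo ./rollout.sh | at 02:00
    ```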

  • by starfishsystems (834319) on Monday October 04, 2010 @10:54PM (#33791048) Homepage
    It wouldn't have been so bad if Microsoft had just supplied a generalized configuration parser for application developers to use. Then it could be applied to files, streams, digitally signed blobs, whatever.

    But no, they had to build another way to lock you in, didn't they? So you don't get the security controls or network transparency of the file system. You're not reusing the file system, so you have added another source of complexity and another attack surface for security. You don't get composability or scope control where one subsystem can feed live configuration data to another.

    And if the Registry breaks, you're kind of hooped. Sure, there's regedit, but it's off to the side, not part of the main paradigm. Mostly it serves to remind you of what you're missing. It's far from common practice to track changes to the Registry in order to monitor how an application has been configured. And that's because the application has not necessarily used the Registry. So tracking it is in general neither necessary nor sufficient.

    Ooh. Don't get me going.
  • In Google's World (Score:2, Insightful)

    by ALimoges (870872) on Monday October 04, 2010 @11:30PM (#33791266) Homepage
    If you were to read a how-to on a blog about configuring a router using a GUI, it would take a few pages. If it were about configuring a router using a CLI, it would fit in one page. Bottom line: in today's world where a forgotten command can be remembered quickly using our friend Google, I much prefer CLI because it makes me read less, and do more.
  • by Anonymous Coward on Monday October 04, 2010 @11:33PM (#33791282)

    > * It requires a sysadmin with a clue
    > * You need to not be a mouth-breather to configure it

    Still wonder why you're not taken seriously, eh?

    I like Samba. It could use less advocates like you.

  • by sjames (1099) on Tuesday October 05, 2010 @12:02AM (#33791404) Homepage

    If you type a question mark in the IOS CLI, it will show you all of your possible options from that point. Tab completion is another way to accomplish that; some CLIs use it.

    In Unix, you have the tab completion, man and apropos to help you out.

    Shell languages are designed with an emphasis on very simple "programming" tasks, such that a sysadmin can easily use them for simple string substitutions and such, while remaining Turing-complete (meaning anything that can be expressed to a computer can be expressed in the shell language).

    It's really sad watching some poor schlep clicking and typing all day when a simple "for i in" one-liner could have done the job in 30 seconds.
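
    A sketch of what that "for i in" one-liner looks like in practice. The file names and the substitution are made up, and the in-place edit assumes GNU sed (BSD sed wants `-i ''`):

    ```shell
    #!/bin/sh
    # Sketch: a bulk config edit that would take all day by clicking.
    # conf/*.conf and the loglevel setting are hypothetical.
    mkdir -p conf
    for i in web1 web2 web3; do echo "loglevel=debug" > "conf/$i.conf"; done

    # The 30-second version of a day of clicking (GNU sed assumed):
    for i in conf/*.conf; do sed -i 's/loglevel=debug/loglevel=warn/' "$i"; done

    grep -h loglevel conf/*.conf
    ```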

    Now, you have a problem that has brought the network to its knees. What do you want swimming upstream to your terminal so you can fix the problem, a GUI desktop pixel by pixel or a # character? (We'll presume at this point you have a senior admin rather than an intern working the problem.)

    BTW, if you do DNS right, you will at most break one zone. The log output will tell you what is wrong and you can fix it quickly. If it's a big change, try it out on a test box first, then just push the debugged files to production when ready. BTW, it was possible to put your DNS under change management because it was in the form of text files rather than an undocumented binary blob produced from a GUI app.

    You can set up a really simple ability to back a change out as well. cp pz/example.com back/; vi pz/example.com. If it really messes up and going back is your best bet, cp back/example.com pz.

    If you MUST have a GUI, why not an app that loads in the text file, presents it as a GUI and allows editing, then writes it back out as the text file again?

    A key takeaway from this is exactly what TFA was talking about. File- and CLI-based configuration can easily have a GUI, change management, etc. overlaid on top of it, but if you start with a GUI, you're pretty much stuck there.

    Meanwhile, why is a student making changes directly to a production box? Let them mess around on the test boxes for a while until they become comfortable with the process; then they'll make a lot fewer mistakes.

  • by Anonymous Coward on Tuesday October 05, 2010 @12:16AM (#33791466)

    There are a number of features in powershell that unix CLI environments would do well to adopt. It's okay to admit ignorance, but please stop trying to cover for it with pontification.

  • by Gadget_Guy (627405) * on Tuesday October 05, 2010 @12:21AM (#33791476)

    The last Microsoft product I bought was Windows 98, so I have mercifully missed the whole disaster since then. All my clients are just now starting to switch from XP to Windows 7, because I advised against Vista.

    I was like you. I advised people to avoid Vista without having tried it too. Then I did try it and found that most of the bad things people said about it were just outright lies. There were some problems, but it was nowhere near as bad as everyone claimed.

    And in all these years of supporting dozens of computers I have never heard of PowerShell until this article.

    I'm not surprised since your clients were taking your advice and you hadn't heard of it. It stands to reason that they wouldn't be using it. Powershell has been around for 4 or 5 years now, and it does appear in everyone's Windows Update list as an optional download. No offense, but I'm not sure how you support Windows users with only knowledge of Windows 98.

    Too bad they did not build on Xenix, and save everyone much grief. Imagine where Apple could have been in the 90s, had they switched to Unix a decade earlier.

    The problem with Windows isn't with the core technology. The NT kernel is quite solid. The problem is what they did on top of that. ActiveX in a browser was just asking for trouble. Leaving ports open and services running by default might make it easy for plebs to run programs and network services without having to configure them, but it is suicide for security. Even if it had Xenix underneath, they still would have been up the creek with idiotic decisions like that.

  • by starfishsystems (834319) on Tuesday October 05, 2010 @12:53AM (#33791596) Homepage
    For sure if you're going to deliver more than one kind if interface they should relate to each other as much as possible.

    Under this expanded mandate, let's also compare the programmatic interface to the system with the command and graphical interfaces. I've always been bothered that there isn't a tight relationship between the standard Unix command set and their programmatic equivalents. It's pretty accidental, and that's a shame. As long as the coupling is not rigorous, that means two learning curves as well as two compatibility surfaces to maintain.

    I used to ponder this a lot, so it was very instructive to work with the Symbolics Lisp Machine for a few years. Here we had an environment which was Lisp from the microcode upward. It was a completely fresh start, a chance to do it right. The operating system was very open. And with Lisp being an interpreted language, you'd think it would be dead easy to provide a rich CLI that gave direct access to both the system and the GUI.

    Well, no, that's not how it worked out. Part of that is due to our natural fondness for line breaks as command delimiters. Lisp wants things to be delimited by parentheses. But that's not an insurmountable problem. Unfortunately, the Symbolics developers thought it was, and they instead went off and built a CLI that was only vaguely coupled with the corresponding system methods. It was heartbreaking to see such a basic advantage thrown away.

    To make matters worse, when the CLI ended up throwing an exception (which was not all that uncommon in such an experimental environment) what you saw on the stack bore little resemblance to the documented system methods. See, most of those were not really methods at all but wrapper macros of one kind or another. So, you'd execute a shell command, which would invoke one of these macros, which would push some undocumented methods on the stack. And since the CLI didn't relate to the system documentation, you'd have to guess where to go to find the documented methods.

    It could have been the best. Instead it was kind of the worst. Well, I guess it beat looking at IBM core dumps.
  • by keith_nt4 (612247) on Tuesday October 05, 2010 @01:06AM (#33791632) Journal

    I've never been in a decision-making position for choosing a server for a company, but over the last seven or eight years I've come to the conclusion (I'm sure some will correct me if I'm wrong) that there aren't really meetings in which a group of managers balances the pluses and minuses of Linux/Samba versus whichever Windows server. They look at the entire apparent cost of ownership, in particular the support contract.

    In other words the main draw is the service contract: if any hardware fails on any server an 800 number is called and less than 24 hours later either the replacement part is delivered or a tech of some sort is there ready to install the part that needs replacing.

    I don't think most companies, at least the large ones, really care about brands that much. I mean, I don't think they're loyal to Windows so much as to whichever Dell or HP of the world wins the bid for phone support/hardware replacement for three or five or whatever number of years at a time. And that Dell contract will extend to an MS support contract along with server/desktop licensing, etc. Companies pay huge amounts of money for things like always having someone who can be called and a replacement part in under 24 hours.

    So, am I way off?

  • by Johnny Mnemonic (176043) <mdinsmore@gmail.3.14com minus pi> on Tuesday October 05, 2010 @01:36AM (#33791744) Homepage Journal

    GUIs can make unknown operations significantly easier

    IMO, this is the biggest advantage that GUIs have over CLIs: they allow you to see all of the legal choices at once, and as you make a change in one field you can see which other fields are now made illegal. Good "help" in a CLI can help, but it's much more trial and error, especially if the help isn't good and you combine illegal choices.
  • by Anonymous Coward on Tuesday October 05, 2010 @01:39AM (#33791758)

    You forgot:

    Seems to depend on OpenLDAP, which is in itself generally too unstable for production use once you pass a few hundred users.

  • by TheLink (130905) on Tuesday October 05, 2010 @02:22AM (#33791878) Journal
    In my experience perl is on most unix machines, and works quite well for cross-platform tasks.

    Can't depend on Java or GNU tools to be present everywhere, but so far I've found perl on OSX, Solaris, OpenSolaris, AIX, most Linux distros and FreeBSD.

    So if I ever had to write a cross platform unix/linux virus I'd write it in perl :).
  • by jschottm (317343) on Tuesday October 05, 2010 @02:34AM (#33791940)

    I'm a little puzzled why your clients pay you for advice on Windows when you don't use or pay attention to it. PowerShell has close to 4 million hits with a Google search and was much discussed on slashdot.

    NT derived Windows have some amazingly powerful command line tools that in some cases are far better than *nix tools. Check out Ed Skoudis' many articles and podcasts on command line kung fu.

    Too bad they did not build on Xenix, and save everyone much grief.

    As someone else noted, the NT kernel is really pretty good. It was buggy third party drivers and bad non-kernel decisions that created the vast majority of the problems. Any OS that allows a wide range of hardware is going to be vulnerable to buggy drivers.

    Imagine where Apple could have been in the 90s, had they switched to Unix a decade earlier.

    They did.

    http://en.wikipedia.org/wiki/A/UX [wikipedia.org]

  • by Fackamato (913248) on Tuesday October 05, 2010 @02:42AM (#33791968)

    I've seen this in a few places, why is pf preferred over iptables? What can it do that iptables cannot? Or is it performance related? Security?

  • by lidocaineus (661282) on Tuesday October 05, 2010 @03:05AM (#33792042)

    Ridiculous. I am all for Unix tools and prefer the Unix way for most server-related tasks and apps, but Samba4 doesn't even come close to being able to deal with an ADSI install. Even doing something basic like rolling out GPOs is either a giant pain in the ass, requires Windows-based tools still, or is impossible. As a generic SMB server, Samba is excellent. As a domain controller/active directory store, it has a LONG way to go before it's even close to viable as a replacement for AD.

  • by SpaghettiPattern (609814) on Tuesday October 05, 2010 @04:06AM (#33792242)
    GUIs usually are a management requirement. The common misconception is that sysadmin GUIs make you more productive. Well, they don't. On the contrary, they slow you down and cause repetitive, boring activities where human error thrives (e.g., copy-pasting from a spreadsheet). GUIs look good in presentations but are crap to operate.

    But it's not only the god forsaken Windows platform that has them. Ever tried configuring network interfaces and the DHCP server on OpenSolaris? Ever tried getting a readable manual on how to do either by editing files or through command line? That for me was the practical reason for discarding OpenSolaris and continuing with FreeBSD.

    In an ideal world, configuration files adhere to a specific syntax, libraries for that syntax are available, and convenience tools and utilities (CLI and GUI) are based on them, making both equally effective.

    Cheerfully ignore comments stating that scripts for changing configuration files can screw up production. Hobbyists stating such manure apparently don't know you should run unit and integration tests before deploying any change whatsoever to a production environment. Nowadays any half-professional organisation can afford multiple test environments to minimise production failures caused by untested software.
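
    A minimal sketch of that test-before-deploy idea: run the change script against a staging copy and refuse to promote it unless the result matches what was expected. All names are made up, and the in-place edit assumes GNU sed:

    ```shell
    #!/bin/sh
    # Sketch: gate a config change behind a trivial "unit test".
    # staging/app.conf and the max_clients setting are hypothetical.
    mkdir -p staging
    printf 'max_clients 64\n' > staging/app.conf

    # The change script under test (GNU sed assumed):
    sed -i 's/max_clients 64/max_clients 256/' staging/app.conf

    # Refuse to deploy unless the output is exactly what we expected.
    printf 'max_clients 256\n' > expected.conf
    if diff -q staging/app.conf expected.conf >/dev/null; then
        echo "OK to deploy"
    else
        echo "change script failed; not deploying" >&2
    fi
    ```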
  • by tehcyder (746570) on Tuesday October 05, 2010 @06:02AM (#33792598) Journal

    The last Microsoft product I bought was Windows 98, so I have mercifully missed the whole disaster since then. All my clients are just now starting to switch from XP to Windows 7, because I advised against Vista.

    You are not qualified to advise people on using Windows XP, Vista or 7 if your knowledge stops at Windows 98.

    If I said Linux was a shitty desktop OS, because when I used it in 1998 the sound didn't work properly, everyone would just laugh.

  • by jedidiah (1196) on Tuesday October 05, 2010 @07:18AM (#33792796) Homepage

    This would sound a bit convincing if you actually mentioned a feature by name rather than spouting something that sounds like vague marketing propaganda.

  • by Junta (36770) on Tuesday October 05, 2010 @08:20AM (#33793032)

    YaST over SSH is not CLI, it's TUI. The important difference is that CLI is scriptable; TUI has the lack of scriptability of a GUI, but the bandwidth characteristics of CLI.

    I think it's a nice example of how something running in a text-only mode is not necessarily a sufficient improvement over GUI.

  • Grandpa, Really? (Score:3, Insightful)

    by TheNetAvenger (624455) on Tuesday October 05, 2010 @08:25AM (#33793060)

    Wow, I can't believe that this is just accepted.

    With the advances in GUI design and beyond-GUI design technology, a CLI should be obsolete, even if it is not yet obsolete in practice in the specific examples given.

    There is no reason a written script should be necessary when an object-constructed visual script could be generated that is just as specific and functional. Again, just because such tools and technology are not yet common does not mean the current standard will always hold true.

    There was a time that writing software required 'writing code' as well, and today we have technologies that let graphic designers put together robust applications without writing a single line of code. (MS Blend for a simple example of XAML based GUI development.)

    By nature, Unix-based OS models are CLI-dependent (textual pipes and generic I/O constructs); however, this is not true of all OS models. (NT is an object-based OS, where a CLI is counterintuitive, which makes PowerShell a brilliant CLI model that came years after the OS and uses the object nature of the OS design.)

    Even with some imagination, this isn't a 'truth' in a UNIX OS model either. There is no reason that all constructs have to derive from or remain at a CLI level with a GUI strapped onto the CLI. Replace the CLI constructs with GUI-based interactions, and instead of textual pipes, object and graphical piping could be the model that replaces the CLI nature of UNIX.

    This line of thinking is a failure of imagination, and factually incorrect when viewed from an object-based OS design like NT, where the CLI (PowerShell) was an achievement in harnessing objects at a regressed CLI level.

  • by TheNetAvenger (624455) on Tuesday October 05, 2010 @10:59PM (#33803494)

    Sadly, you don't even understand how silly your response is.

    NT and piping are opposites in how the OS is designed. NT deals with objects, object passing, and referencing, not generic I/O and piping.

    NT is specifically not designed like UNIX, which was my point and what you do not understand.

    An OS that deals with an object model instead of generic I/O has no need for textual passing or piping.

    More Slashdot peeps truly should take a minute to learn what makes NT different and unique from standard UNIX OS models.

    *posted from my droid...

"And do you think (fop that I am) that I could be the Scarlet Pumpernickel?" -- Looney Tunes, The Scarlet Pumpernickel (1950, Chuck Jones)
