How To Adopt 10 'Good' Unix Habits 360
An anonymous reader writes to mention an article at the IBM site from earlier this week, which purports to offer good Unix 'habits' to learn. The ten simple suggestions may be common sense to the seasoned admin, but users with less experience may find some helpful hints here. From the article: "Quote variables with caution - Always be careful with shell expansion and variable names. It is generally a good idea to enclose variable calls in double quotation marks, unless you have a good reason not to. Similarly, if you are directly following a variable name with alphanumeric text, be sure also to enclose the variable name in curly braces ({}) to distinguish it from the surrounding text. Otherwise, the shell interprets the trailing text as part of your variable name -- and most likely returns a null value."
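The brace rule is easy to see in a two-line experiment; a minimal sketch, assuming a POSIX-style shell (the variable names are made up for illustration):

```shell
#!/bin/sh
file=report
# Without braces the shell reads the whole run of letters and
# underscores as one name, "file_old", which is unset here:
echo "$file_old"        # prints an empty line
# Braces mark where the variable name ends:
echo "${file}_old"      # prints "report_old"
```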
Re:Don't use shell (Score:4, Insightful)
mkdir (Score:3, Insightful)
$ cd tmp/a/b/c || mkdir -p tmp/a/b/c
If the directory exists, you end up in it; if it does not, the command creates it but leaves you where you started. Hence you don't know which directory you will be in after the command is executed!
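A minimal sketch of the ordering that avoids this ambiguity, assuming the goal is simply "make sure the directory exists, then go there" (the path is just an example):

```shell
#!/bin/sh
# mkdir -p succeeds whether or not the directory already exists,
# so the cd only runs once the directory is guaranteed to be there:
mkdir -p tmp/a/b/c && cd tmp/a/b/c
```

Either way this version ends in a known place, which the || version does not guarantee.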
Unix is more than just a shell (Score:2, Insightful)
(plus he didn't mention my favourite shortcut: shell history)
How about being more inclusive and expanding this to deal with security features (surely the single biggest benefit?) and the ease of working on remote boxes?
Re:Don't use shell (Score:5, Insightful)
Or a concise shell script that opens up a text file of URLs, and fetches each file listed in it:
#!/bin/sh
# Read one URL per line; quoting "$url" keeps each line intact.
# (The original awk trick of wrapping lines in single quotes does
# not work: quotes produced by command substitution are never
# re-parsed as shell quoting.)
while IFS= read -r url ; do
wget "$url"
done < file
Python has its place, and is far better for medium to large projects, and projects where the code needs to be maintainable. Shell, however, works a lot better for automating UNIX tasks than Python does. Not to mention embedded systems: I can compile Busybox to have both a good shell and all of the commands that one would run from shell scripts (including grep, cut, sed, and, yes, awk) in only about 300k. A Python binary is about a megabyte in size, and you need about ten megabytes to fit all of the libraries Python 2.4 comes with.
Re:welll.. (Score:5, Insightful)
I.e., there is NOTHING bad about piping cats. While you might indeed get a ~30% performance increase if you skip the cat, the complexity increases. We often sacrifice performance in order to increase abstraction and understanding.
What makes Unix so powerful is its modularity: the fact that you can pipe the output of any application to any other application's stdin. This makes it possible to combine common tools: app1 | app2, app1longoutput | grep thingsIwant. It is this ability to mix and match common elements that (arguably) makes Unix powerful.
Advice that says "stop piping cats" is akin to saying "stop using helper functions, they overload the stack; do everything in one function instead".
--
A better-articulated article on programmers' intellectual ability vs. proper abstraction techniques:
http://www.acm.org/classics/oct95/ [acm.org] - Dijkstra, Edsger - "Go To Statement Considered Harmful"
Re:Don't use shell (Score:3, Insightful)
No, don't mod up anybody in this thread. Perl and Python are abominations. Pure, unadulterated Bourne shell is for the true, seasoned *nix user. Just like Java is an answer to a question nobody asked in the GUI world, so too are Perl and Python in the command line world.
Re:mkdir (Score:5, Insightful)
- scripts
- commands that take long enough that you go have a coffee.
This makes sense:
make install && lilo && reboot
This doesn't:
cd tmp/a/b/c || mkdir -p tmp/a/b/c
If you fail the first part, well, you typed " || " instead of pressing enter.
If you succeed the first part, you typed " || mkdir -p tmp/a/b/c" without a bloody reason.
Type first part. Press enter. Observe result.
If necessary, type the second part, otherwise correct the first without baggage of the second one hanging around.
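The short-circuit behaviour behind this advice, sketched with the shell builtins true and false:

```shell
#!/bin/sh
# && runs the second command only if the first succeeded:
true && echo "ran"          # prints "ran"
false && echo "skipped"     # prints nothing
# || runs the second command only if the first failed:
false || echo "fallback"    # prints "fallback"
```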
Re:welll.. (Score:3, Insightful)
But he never said you should stop using pipes. He is talking only about a specific situation: cat-ing a file and then piping it to grep. Surely that is a good point he is making, because grep already takes filenames as arguments?
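The specific situation, made concrete (the log file here is fabricated for the demo):

```shell
#!/bin/sh
printf 'GET /a 200\nGET /b 404\n' > access.log
# Redundant: cat reads the file only to feed it to grep's stdin.
cat access.log | grep 404
# Equivalent output, one process fewer: grep opens the file itself.
grep 404 access.log
rm access.log
```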
Re:This article... (Score:1, Insightful)
Ah, you must be a teacher, seeing something that remotely resembles a common error, and immediately assuming that it was an error. The article explains both ; and &&...
Now why don't you go read the entire article and take back some of your criticism. I think it was a fairly nice article. (and sure, there were some errors, but I strongly prefer this over some typo-less corporate soap-opera)
Re:Don't use shell (Score:4, Insightful)
No shit, Sherlock! You have clearly never worked in a large organisation, where - believe it or not - you, as a standard user, do not actually get to insist that the already-overworked IT department jump through bureaucratic hoops to install your favourite bloated scripting language, unless you have a damn good business case for it. And probably not even then.
Hint: if the task you want that scripting language to accomplish is trivial to achieve with a simple shell script, you don't have a good business case.
* This doesn't apply to wget, obviously, but if your platform really has no standard alternative, you are more likely to persuade IT to install something small and simple like wget, fetch, curl, etc. than a complete programming environment like Python.
Re:welll.. (Score:1, Insightful)
Re:welll.. (Score:5, Insightful)
Re:welll.. (Score:5, Insightful)
Further, the assembly-line abstraction of cat as 'input the contents of these files into the beginning of my pipeline' is predictable, simple, and very readable. Using filenames in the commands means you have to be certain each command takes filenames, and if you replace the first step (swapping a grep for an awk, for example), you have to rethink your input semantics all over again.
Any typing-speed and performance gains will probably be lost the first time some command does something unexpected, or to the extra steps of thought.
And if performance really were a serious concern, you probably shouldn't be writing it as a shell script anyway...
Re:Anal Unix Guy (Score:3, Insightful)
Yes -- and habits are what people desperately need. The people I know primarily need three habits: RTFM when they don't understand something; adjusting their behavior based on the FM; and managing their use of the current directory (i.e. you don't have to cd into a directory to use a file which lives there).
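The current-directory habit in miniature; a small sketch with made-up paths:

```shell
#!/bin/sh
mkdir -p demo/logs
printf 'ok\nerror\nerror\n' > demo/logs/messages
# No need for: cd demo/logs; grep -c error messages; cd -
# Naming the file by its path does the same in one step:
grep -c error demo/logs/messages    # prints 2
rm -r demo
```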
Re:Don't use shell (Score:3, Insightful)
2. how about accepting command line arguments in bash? in perl it's just $ARGV[0]. nice and simple and like C++ (except for the offset by one) so i don't want to have to bother learning another one.
Command line args? $1, $2, etc., or "$@" for all of them (prefer "$@" over $*; it keeps arguments containing spaces intact).
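A minimal positional-parameter sketch (save it as args.sh, a name chosen here just for the demo, and run sh args.sh one "two words"):

```shell
#!/bin/sh
echo "first argument:  $1"
echo "second argument: $2"
echo "argument count:  $#"
# "$@" expands to every argument, each kept intact even if it
# contains spaces; unquoted $* would split "two words" in two.
for arg in "$@" ; do
    echo "arg: $arg"
done
```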
No actual habits in the article (Score:5, Insightful)
Re:Don't use shell (Score:5, Insightful)
I always use 2 or more args for grep (Score:3, Insightful)
Re:welll.. (Score:3, Insightful)
Re:If I wanted to upload binaries... (Score:4, Insightful)
1. Never use csh or any derivative thereof.
2. Know the portable behaviour of your Unix tools.
3. Learn to use ed; one day you'll be glad you did. You can also drive ed and ex from scripts or from the command line.
4. A shell command is a small program. If you are unsure about a command, test it first, like you would any program.
5. Learn to use the standard shell on your system.
6. Learn useful nonstandard extensions of utilities, but use them with care.
7. Never rely on an extension to the point that you forget how to do it portably. The definition of "portably" is up to you.
8. Learn to use csh enough that you can make do in an emergency, and learn *why* you shouldn't use it.
9. If your standard shell is Bash, learn Korn too. And vice versa. Learn both, how they differ, and how they differ from your standard shell.
10. Sometimes a real C program or a script in a different language is better than using shell.
-Lasse