You Used Perl to Write WHAT?!
Esther Schindler writes "Developers spend a lot of time telling managers, 'Let me use the tool that's appropriate for the job' (cue the '...everything looks like a nail' meme here). But rarely do we enumerate when a language is the right one for a particular job, and when it's a very, very wrong choice. James Turner, writing for CIO.com, identifies five tasks for which perl is ideally suited, and four that... well, really, shouldn't you choose something else? This is the first article in a series that will examine what each language is good at, and for which tasks it's just plain dumb. Another article is coming RSN about JavaScript, and yet another for PHP... with more promised, should these first articles do well."
Both sides... (Score:5, Interesting)
Having the right tools is great for current productivity, but it's hell on expenses and new recruits. If you use a different tool for every job, you need to maintain all those tools and a task force that's able to use all of them. Sometimes the 'right tool' is one that fits the company as well as the job.
Re:Only Tool (Score:1, Interesting)
Some reasons to prefer one language/IDE over another:
* It will perform better
* It will shorten development time
* It is extensible and/or has a community of developers adding features
* Plenty of developers available
* It will be more maintainable than the alternatives
* It's free or inexpensive
* It's standardized
Some reasons to avoid using a given language/IDE:
* It will break your app
* It will slow your app down
* It will take much longer to develop
* You won't find any developers
* It will make your code unmaintainable
* It's expensive
* It's nonstandard
I suggest that a proper cost/benefit analysis rather than ideology is the best way to decide on a language to use!
I wrote a BASIC interpreter (Score:4, Interesting)
Why? I dunno, but I did learn a whole lot about Perl.
I think that's the best way to learn things... make up a fake project for yourself (say, a database, or a simple flight simulator)...then implement it. Then revise it.
My favorite example (Score:5, Interesting)
It's fun to watch people's reactions when they realize it: "You wrote a perl script that reads the manual and generates the code?" I just respond with something like "Uh, yeah; you got a problem with that?"
Especially fun has been the couple of discussions in which I expressed a great deal of skepticism of various "AI" claims. Then someone brings up the fact that I write perl programs that read English-language docs and generate code from them. They're obviously puzzled by the fact that I do this while looking skeptically at "AI" proposals. It's like they expect me to just shrug and write other impossible things in perl.
Inline C in Perl (Score:3, Interesting)
Quick repair tool (Score:3, Interesting)
I'm currently writing a server-based application in c# (mono). The email class of c# was good... but not flexible enough for the multipart, graphically enriched email I had to send (a report, not spam... mind you). I couldn't properly configure the MIME parts (especially "inline"). If I had had just c#, the only option available would have been a commercial library.
So I ended up with Perl. perl -MCPAN -e shell, install MIME::Lite (if I remember right).
A couple of lines later I had a tool ready to send emails (based on HTML pages written by my c# application). The script is fired up by my c# application with several parameters. It works.
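For flavor, here's a minimal sketch of that kind of script, assuming the module meant is MIME::Lite; the addresses, subject, and Content-Id below are made up for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use MIME::Lite;

# Build a multipart/related message: an HTML body plus an inline
# image, the part that wouldn't configure properly in c#.
# Addresses and content here are invented for this sketch.
my $msg = MIME::Lite->new(
    From    => 'reports@example.com',
    To      => 'boss@example.com',
    Subject => 'Nightly report',
    Type    => 'multipart/related',
);

# the HTML part references the image by its Content-Id
$msg->attach(
    Type => 'text/html',
    Data => '<html><body><img src="cid:chart1"></body></html>',
);

# the inline image part; real code would use Path => 'chart.png'
$msg->attach(
    Type        => 'image/png',
    Id          => '<chart1>',
    Data        => 'not-really-png-bytes',
    Disposition => 'inline',
);

print $msg->as_string;   # $msg->send would hand it off to sendmail
```

A couple of lines, and every part gets its Content-Disposition set the way you asked.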
Re:This is not a news story (Score:2, Interesting)
I actually read the f**king article and came away feeling dumber for having done so. Funny enough, I did not see one point where you could not use Ruby or Python to make the EXACT SAME CASE.
Data manipulation in place? Cripes, if I'm in an Excel file perhaps I could write VBA to do the same thing and a lot easier at that.
This kind of thing is just inane. Perhaps it will get the occasional CIO who reads this rag to name drop a language they neither use nor understand, but it does little in terms of providing anything useful for the actual codewriters - you know the folks who used to frequent this site.
Bah. Jet lag and cold coffee make me bitter.
Bollocks (Score:5, Interesting)
Perl was, and is (IMHO), the first and foremost thing you grab when you write web stuff. CPAN is nothing if not infinite, the web is a text-based thing that perl was designed for, and its speed makes ruby blush. So why?
Why try to write off perl all the time? Is it because they can't seem to
Re:Ray Tracing (Score:4, Interesting)
I was a long-time perl programmer before I made the switch to python. All my headaches with perl went away, and no new headaches of similar magnitude have surfaced. So for me it has been a net improvement.
KISS, DRY, and various other good engineering/development paradigms are embodied in python's development model.
Perl made it easy to shoot yourself in the foot. Python makes it hard to shoot yourself in the foot -- but you can if you want to. That probably best sums up their differences.
Re:When to use Perl? (Score:5, Interesting)
Re:ob (Score:1, Interesting)
I really don't know why more languages aren't written in themselves. A language like Perl or Python or Ruby, for example, is much higher-level than C. Writing a compiler in an HLL like this would be about a million times easier than writing it in C. You could target something like LLVM and still be portable. It would probably be noticeably faster. All the cool toys your users are writing for analyzing Ruby programs, you could turn around and use on Ruby itself.
Most importantly, it would remove the big disconnect between the language implementor and the language user. The biggest program that Matz/Guido/Larry are working on is probably Ruby/Python/Perl, which are all written in C. I'd rather they had first-hand experience writing really big programs in the language they're designing. (I'm not saying they don't have any, but if they do, it takes away from their language-implementation time.)
I know LLVM didn't exist in its current state when these languages were started. Still, if I were running an HLL project, that would be one of my top priorities -- assuming, that is, that the project didn't already have an awesome cross-platform native compiler, like SBCL does.
Re:PHP WTF?! (Score:4, Interesting)
Re:I call bullshit (Score:3, Interesting)
Seriously, you're picking at an example where I say that some small company somewhere might benefit from the faster development time of Ruby over the advantages of Java? Especially when said company probably doesn't need the same level of scalability you're worried about?
Geez. Simmer down, will ya?
Glue and objects (Score:5, Interesting)
The other really odd experience for me was learning object-oriented programming. I had been programming with objects since I was first introduced to them when the first NeXT computer came out. I used Java. And C++ and such. I thought I understood objects.
Then one day I learned to program object-oriented in Perl. And I learned that while I was fluent in object-oriented usage, I really had a pathetic understanding of how objects worked and what was actually possible with them.
Perl objects are sort of like owning a copy of Gray's Anatomy or "The Visible Man". You don't just see that arms connect to torsos from the outside; you see all the sinews and bones and blood.
It's actually amazing how so many things we think of as different concepts in object-oriented programming and databases are actually different reflections of the same trick. And that's the trick perl uses to make objects.
In perl, an object is any variable that has an attribute that can store a list of package names.
Let's see what you can do with that.
Hmmm.... well, that list can be your inheritance hierarchy, so each package is somewhere to search for methods. But notice that since it's a mutable list, a perl object can do something that most object-oriented languages cannot: a variable can change its "inheritance" list after the fact. It can change its own class.
Okay, now this is just a single variable, so where do we get the attributes of the object? Well, if that variable is, say, a hash (dictionary), then we can just use the keys as the attribute names. So where you would write self.foo in C++, you would write $self->{foo} in perl.
More fun: let's say you call a method() or ask for an attribute that does not exist. Well, a perl object can just add more packages to its inheritance list. Or it could write the method on the spot and add it to its own inheritance. "I'm my own grandpa." I've used this trick many times to create tables. I don't write any of the "get" or "set" methods. Instead I just intercept the call to the method setfoo() -- which never existed, because I never wrote it -- then I have perl create an attribute called foo ($self->{foo} = "something"), then I have perl write a subroutine called setfoo, add that subroutine into a package namespace, and put that package in the object's inheritance list (like adding methods to a C++ class outside the declaration). (Programming tip: obviously this could lead to problems with typos, so I also provide the variable with a list of all allowed attribute names -- but of course I can always add to that list later.)
Now something more exotic. The hottest thing in database programming is the realization that sometimes column-centric databases are better than traditional row-centric structures. In perl an object can change which it is, transparently. For example, if I'm a traditional object with a row organization, then all my attributes are stored as $self->{foo1}, $self->{foo2}, $self->{foo3}, and so on, just as you might write self.foo3 in python. But I did not have to do it that way. What if, instead of making the self variable a hash (dictionary), I had made it a simple scalar, say an integer? Well, at first this seems stupid -- where did all the instance variables go? I just store them in the class. The scalar self variable's integer is just an index, and the class keeps the instance variables in arrays -- that is, column-based storage. So, for example, if self = 4, then the attribute foo for this instance becomes self->class->foo[4].
The beauty of this is that si
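The tricks described above can be sketched in plain core Perl. The package names (Row, FancyRow) and the attribute foo are made up for illustration: bless ties a reference to a package, @ISA is the mutable inheritance list, and AUTOLOAD intercepts calls to methods that were never written.

```perl
#!/usr/bin/perl
use strict;
use warnings;

package Row;

sub new {
    my ($class, @allowed) = @_;
    # an object is just a reference blessed into a package;
    # keep a list of allowed attribute names to catch typos
    my $self = { _allowed => { map { $_ => 1 } @allowed } };
    return bless $self, $class;
}

our $AUTOLOAD;
sub AUTOLOAD {
    my $self = shift;
    (my $name = $AUTOLOAD) =~ s/.*:://;
    return if $name eq 'DESTROY';
    if ($name =~ /^(get|set)(\w+)$/ && $self->{_allowed}{$2}) {
        my $attr = $2;
        # write the accessors on the spot and install them in the
        # package, so later calls never even reach AUTOLOAD
        no strict 'refs';
        *{"Row::set$attr"} = sub { $_[0]{$attr} = $_[1]; $_[0] };
        *{"Row::get$attr"} = sub { $_[0]{$attr} };
        return $self->$name(@_);
    }
    die "no such method: $name\n";
}

package FancyRow;
our @ISA = ('Row');    # @ISA is the mutable inheritance list

package main;

my $r = Row->new('foo');    # 'foo' is the only allowed attribute
$r->setfoo('something');    # setfoo never existed until this call
print $r->getfoo, "\n";     # prints "something"

# and an object really can change its own class after the fact:
bless $r, 'FancyRow';
print ref($r), "\n";        # prints "FancyRow"
```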
Re:is your company weak? (Score:5, Interesting)
HOWEVER, I do remember quite well what threads are, what a semaphore is, what a binary tree is, the difference between a bubble/quick/radix sort, the concept of object oriented design, etc. I wish I could say I remembered UML modeling but honestly, I hated that darned part of CS and never paid attention there anyways
one of my proudest moments... (Score:3, Interesting)
Re:language vs library (Score:4, Interesting)
Languages are for the most part trivial. And universal.
How about lazy evaluation and currying... ...time to make another language, Haskell.... Do we want... a pointer?
Please do forgive me (hee hee hee: Please ==> forgive $ me) if I haven't quite gotten your point, but I cannot square your first and last paragraphs, specifically the parts I've quoted.
Perhaps if you'd written "Imperative languages are for the most part trivial and universal"? Perhaps then I could easily equate C and Fortran and Perl and sed, and leave Haskell and Lisp out of the mix. Or perhaps if you'd written "OO languages are for the most part trivial and universal"? Perhaps then I could equate (more or less easily, don't think it's quite NP) Objective-C and Java and C++. (But see below....)
But the bold unqualified "they're all languages, get over it" sort of assertion doesn't parse.
My intro was Fortran, then F77, then Pascal, then C (OMG! Pointers! The Bomb!), and it was evolution, a little more cool, a little more flexibility all along.
Then I learned C++ at work by day and Java for fun at night, and my head hurt, 'cause I liked the imperative style and OO was weirdly different but everyone was swilling kool aid so I stuck it out...
...and every night I'd discover something in Java that was THE BOMB that would solve that day's problem and every next day I'd find that C++ didn't have that feature (does today, AFAIK, but STL was busted then, so everyone rolled their own)... ...but I moved out of real programming before I got my head around OO.
And now I'm learning Haskell, just 'cause I've learned that making my head hurt from time to time is a great way of stretching myself, of getting better at everything....
And I don't understand - at least, I haven't wrapped my head around Monads yet, though I get what they're for. And you know what? No pain.
None. I can feel the approach of enlightenment. SYB and reflect, baby, introspection and lazy evaluation and side-effect free (or reliably and provably constrained side-effect management) via implicit state passing.
Whoa.
I feel like Neo after the roof but just before the hallway - I'm starting to believe.
I've read the "SYB in C++" paper. I get what they're doing. But they admit the gap:
Scrap++ is a great exercise, but how do you get to SYB without those? You don't.
There's a guy out there that /.ers love or hate, no middle ground, so I won't reference him directly, but he's right: Different programming models change how you think of problems, and the right model opens so many doors you didn't even know existed. Doors you couldn't even have described until you knew they were there, but you were unable to find the hallway until you squinted, looked sideways at the world, and watched it shift... ...and were freed from the imperative....
"Hello World" is a cool teaching aid in Fortran and C and even perl (do it 15 different ways, without string literals or character types, preferably with a program one column wide :->)
But Haskell? No. When you learn Haskell, think big. 'Cause its programming model is so way different you have no idea. I'm at the point I almost consider Monads harmful... ...but I'll get the other side of the koan soon.
And when I have a simple repetitive task that I need to automate, I'll stick to bash, 'cause it's clean and readable, and sufficiently fast and sufficiently limited that I have to force myself to be literate, which makes it so much easier 6 months later when I need to tweak $ remember script.
I won't use Haskell for that. No way. But that replacement for scrabble/scribble I've been thinking of? That tool for edit
Re:is your company weak? (Score:3, Interesting)
Sadly it is the brain dead of the HR departments and headhunters who do the hiring/selecting... usually. To them, what is on paper trumps experience. Seems that ever more often these wastes of brain pans either submit for interviews or will outright hire a newly graduated Masters student (based entirely on their piece of paper) with fuck all experience and make them managers or 'senior' developers, or architects, etc. And people who you would like to hire will have their resumes dumped in the waste bin because they don't have the requisite piece of paper that the dumb fuck HR person thinks makes you useful. So you know who is going to be your new boss. Must stop now before meltdown. Yes, some bitterness. Life.
Re:language vs library (Score:3, Interesting)
I don't consider OOP all that revolutionary or different from structured programming. Both are imperative. You can do OOP in C, you don't need C++. You can even do OOP in C without too much trouble-- C has structures and function pointers, and that's really all a class needs. With function pointers, polymorphism is no big deal to implement. I find that with OOP, the technique can get in the way. I have seen people sit there agonizing over what classes are needed and what and where methods and data should exist, and how the inheritance hierarchy should be arranged. They finally make a decision they aren't totally happy about and can't quite put their finger on why, but they need to get something done. Then halfway into writing a program, they think of another way, and spend lots of time shuffling methods and data from the old class hierarchy to a new one. It's the OOP version of renumbering a BASIC program by hand.
Monads? They're just an abbreviation in syntax. Maybe I don't get it, but it seems to me that monads are functional programmers getting really bent out of shape over nothing. Just because the Haskell equivalent of getc can return different data, they freak out. When in fact, the I/O is just concealing one extra parameter implicitly. Add time (or, file position), and just like that, no more monads. Don't say "c = getc()", say "c = getc(t)" where t is a position in a stream or file. Then the function will always get the same return values for the same inputs. Omitting the t is merely a convenient shorthand they've turned into a big deal.
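The claim above can be sketched in a few lines of perl (getc_at is a made-up name for illustration): make the hidden stream position an explicit parameter, and the "impure" read becomes an ordinary pure function.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Instead of "c = getc()" mutating a hidden cursor, pass the
# position $t explicitly: same inputs, same output, every time.
my $stream = "hello";

sub getc_at {
    my ($s, $t) = @_;          # $t is the explicit position
    return substr($s, $t, 1);
}

print getc_at($stream, 1), "\n";   # prints "e"
print getc_at($stream, 1), "\n";   # prints "e" again -- referentially
                                   # transparent, no monad in sight
```

(Whether that really dissolves the need for monads is, of course, exactly what the thread is arguing about.)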
It is good to experience different language paradigms. I found LISP somewhat painful when I first encountered it. Took a while to get used to doing tail end recursion instead of fruitlessly searching the language for a loop construct, and this "lambda function" stuff threw me for a while until I understood that it's just an anonymous function block. The biggest change was getting accustomed to passing functions around like any other parameter, and stuffing them into lists like any other data.
Turing Machines vs. Lambda Calculus (Score:3, Interesting)
What you're really talking about here is the difference between the Turing machine model of computation and the Lambda Calculus model, and you're absolutely right. Even though the two are provably equivalent (try expressing one of your Haskell programs as a while loop with a stack; it works but it sucks having to write it!), the very mentality that you use when programming in a language like Haskell is so totally radically different from how you program in C that it's useless comparing the two.
In college I did a lot of work in cryptography and artificial intelligence. These are tasks for which functional programming is particularly well suited. I remember the day that I learned what a functional fold did. It was like that moment you're talking about, when suddenly I realized that summing all the values in a list didn't require its own function; it was just one succinct statement. Suddenly the evaluation of a perceptron network became a functional fold nested in a functional map, and my AI code shrank from hundreds of lines to dozens, but was still perfectly readable.
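The same collapse even works in perl, where List::Util's reduce is the fold (the weights and the step activation below are made up for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use List::Util qw(reduce);

# summing a list doesn't need its own function -- reduce is the fold
my @xs  = (1, 2, 3, 4);
my $sum = reduce { $a + $b } @xs;
print "$sum\n";                # prints "10"

# evaluating a one-layer perceptron: a fold (the weighted sum)
# nested in a map (one pass per neuron); weights are invented
my @weights = ([0.5, -1, 2, 0], [-1, 0.25, -0.5, 0.1]);
my @outputs = map {
    my $w   = $_;
    my $net = reduce { $a + $b } map { $w->[$_] * $xs[$_] } 0 .. $#xs;
    $net > 0 ? 1 : 0;          # step activation
} @weights;
print "@outputs\n";            # prints "1 0"
```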
People can say that languages are universal... but that's really not the full story, is it? The Fibonacci sequence, for instance. I'd express it like this in Haskell:
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)
How do you explain that line of code to someone who's only ever seen C? And yet it'll compute the 1000th fibonacci number almost instantly on my computer. The infinite list idiom can only exist in a lazy-evaluative language, and that's a concept that doesn't even translate well to other functional languages like ML (unless you cheat and use ML's lazy module, which I've never really figured out).
One of the greatest things that I learned in college was that languages are not universal. The people who say they are are obviously talking about Java vs. C vs. C#, et cetera. Going from imperative to semi-OO like C to C++, you're not really switching entire paradigms. Going from imperative to, say, pure OO (see also: Ruby, Smalltalk), or from imperative to pure functional (see also: Lisp, ML, Haskell) or logic programming (Prolog), is quite a mindfsck.
Anyways I'm glad to hear that it's not just me here who loves the functional languages. Have you decided to do anything cool with it yet?
Re:Ray Tracing (Score:3, Interesting)
Yes, that reminds me a bit about a class I used to teach for a former employer. I was teaching old-time C programmers to use one of these up-and-coming GC-managed, JIT-compiled language/frameworks for some of the newer systems we were building. When we came to the topic of speed and efficiency, the question kept coming up: "Can't we write faster, more efficient code in C?"
Of course, the answer always was, "Yes, you can. In fact, you can write still more efficient code in assembly. The question isn't whether you can, the question is whether you will!"