Perl Programming

You Used Perl to Write WHAT?! 307

Esther Schindler writes "Developers spend a lot of time telling managers, 'Let me use the tool that's appropriate for the job' (cue the '...everything looks like a nail' meme here). But rarely do we enumerate when a language is the right one for a particular job, and when it's a very, very wrong choice. James Turner, writing for CIO.com, identifies five tasks for which Perl is ideally suited, and four for which... well, really, shouldn't you choose something else? This is the first article in a series that will examine what each language is good at, and for which tasks it's just plain dumb. Another article is coming RSN about JavaScript, and yet another for PHP... with more promised, should these first articles do well."
This discussion has been archived. No new comments can be posted.

You Used Perl to Write WHAT?!

Comments Filter:
  • Both sides... (Score:5, Interesting)

    by Aladrin ( 926209 ) on Friday January 25, 2008 @10:53AM (#22181280)
    I always see both sides of the 'right tool for the job' problem.

    Having the right tools is great for current productivity, but it's hell on expenses and new recruits. If you use a different tool for every job, you need to maintain all those tools and a task force that's able to use all of them. Sometimes the 'right tool' is one that fits the company as well as the job.
  • Re:Only Tool (Score:1, Interesting)

    by Anonymous Coward on Friday January 25, 2008 @11:12AM (#22181528)
    Yes, let's be practical here. You should use nails when they work better than the alternatives, or when you have plenty of nails and NO alternatives.

    Some reasons to prefer one language/IDE over another:

    * It will perform better
    * It will shorten development time
    * It is extensible and/or has a community of developers adding features
    * Plenty of developers available
    * It will be more maintainable than the alternatives
    * It's free or inexpensive
    * It's standardized

    Some reasons to avoid using a given language/IDE:

    * It will break your app
    * It will slow your app down
    * It will take much longer to develop
    * You won't find any developers
    * It will make your code unmaintainable
    * It's expensive
    * It's nonstandard

    I suggest that a proper cost/benefit analysis rather than ideology is the best way to decide on a language to use!

  • by thomasdz ( 178114 ) on Friday January 25, 2008 @11:17AM (#22181590)
    A few winters ago I wrote a BASIC interpreter in Perl, which wasn't hard. Then, using the lessons I learned from that, I wrote the same BASIC interpreter in VMS DCL, which made for a really interesting week-long project. (VMS DCL is the C shell of the VAX/VMS world.)
    Why? I dunno, but I did learn a whole lot about Perl.
    I think that's the best way to learn things... make up a fake project for yourself (say, a database, or a simple flight simulator)...then implement it. Then revise it.
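    A toy flavor of that exercise, to make the idea concrete. This is not the poster's program; it's a minimal, invented sketch of a BASIC interpreter in Perl that understands only numbered LET and PRINT statements:

```perl
use strict;
use warnings;

# Minimal BASIC interpreter sketch: supports "NN LET var = expr" and
# "NN PRINT var". Everything here is illustrative, not the poster's code.
sub run_basic {
    my ($src) = @_;
    my (%vars, @out);
    for my $line (split /\n/, $src) {
        next unless $line =~ /\S/;
        $line =~ s/^\s*\d+\s*//;    # strip the BASIC line number
        if ($line =~ /^LET\s+(\w+)\s*=\s*(.+)$/i) {
            my ($var, $expr) = ($1, $2);
            # substitute known variables into the expression, then evaluate
            $expr =~ s{\b([A-Za-z]\w*)\b}{$vars{$1} // 0}ge;
            $vars{$var} = eval $expr;
        }
        elsif ($line =~ /^PRINT\s+(\w+)$/i) {
            push @out, $vars{$1};
        }
    }
    return @out;
}

my @result = run_basic(<<'BASIC');
10 LET A = 2
20 LET B = A * 21
30 PRINT B
BASIC
print "@result\n";    # 42
```

Leaning on Perl's own string eval for arithmetic is what makes the interpreter this short; the DCL version would have to do that part by hand.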

  • My favorite example (Score:5, Interesting)

    by jc42 ( 318812 ) on Friday January 25, 2008 @11:28AM (#22181738) Homepage Journal
    My favorite "You did WHAT in perl?" response is: On several projects, when there were portability problems, I've created a Makefile entry that runs a "man foo" command and pipes the data to a perl script, which generates C files for that system. It's typically just header files, but sometimes also a few .c files with data structures and/or simple functions to intercede with variant library routines.

    It's fun to watch people's reaction when they realize that "You wrote a perl script that reads the manual and generates the code?" I just respond something like "Uh, yeah; you got a problem with that?"

    Especially fun has been the couple of discussions in which I expressed a great deal of skepticism of various "AI" claims. Then someone brings up the fact that I write perl programs that read English-language docs and generate code from them. They're obviously puzzled by the fact that I do this while looking skeptically at "AI" proposals. It's like they expect me to just shrug and write other impossible things in perl.
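    A hypothetical miniature of the man-page trick: scan "man"-style text for error constants of the form "NAME (value)" and emit a C header. The input format, constant names, and function name are all invented for illustration:

```perl
use strict;
use warnings;

# Scan man-page-style text for "ENAME (number)" lines and generate
# matching #define lines for a C header. Purely an invented sketch.
sub manpage_to_header {
    my ($mantext) = @_;
    my @defines;
    while ($mantext =~ /^\s+(E[A-Z]+)\s+\((\d+)\)/mg) {
        push @defines, "#define $1 $2";
    }
    return join("\n", "/* generated -- do not edit */", @defines) . "\n";
}

my $man = <<'MAN';
ERRORS
     EINTR  (4)   Interrupted system call.
     EAGAIN (35)  Resource temporarily unavailable.
MAN
print manpage_to_header($man);
```

The real version would shell out to `man foo`, and the regexes would track whatever layout that system's manual pages actually use, which is exactly where the portability work lives.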

  • Inline C in Perl (Score:3, Interesting)

    by wbav ( 223901 ) <Guardian.Bob+Slashdot@gmail.com> on Friday January 25, 2008 @11:32AM (#22181778) Homepage Journal
    So there was a case where I needed to create a big recursive data structure in Perl. It could be a hex tree about 8 nodes deep or a binary tree about 32 nodes deep (I say "about" because some nodes were rolled up into others based on metrics). Anyway, we had about 100,000 items being stored in these trees, and I was told to use Perl so that the data coming in could be manipulated in a sane way and we could get some stats on how the data structure performed (memory-wise, not speed-wise). It turns out gathering stats on 32*100,000 nodes is very slow in Perl, so I was told to boost performance using Inline C. The difference was well beyond two orders of magnitude. The difficulty? There was very sparse information about following recursive objects in Inline C at the time. Perl had references, but those didn't translate directly to pointers in C. Even so, it was possible, and it makes a great story for later. You know, "Back in my day we didn't have all this processor power. We couldn't just follow the reference down in native Perl, we had to translate them references to pointers by hand, and still we felt blessed."
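    For flavor, here is a sketch of the Perl side only; the field names and the tree shape are invented, and the real fix in the story was moving exactly this kind of recursive walk into Inline C:

```perl
use strict;
use warnings;

# A hash-of-arrayrefs tree and the kind of recursive stats walk that is
# slow in pure Perl at 100,000-item scale. Structure is invented.
sub new_node { return { children => [] } }

sub build {                     # a full binary tree of the given depth
    my ($depth) = @_;
    my $node = new_node();
    if ($depth > 0) {
        push @{ $node->{children} }, build($depth - 1), build($depth - 1);
    }
    return $node;
}

sub count_nodes {               # the hot walk that Inline C would replace
    my ($node) = @_;
    my $n = 1;
    $n += count_nodes($_) for @{ $node->{children} };
    return $n;
}

my $root = build(4);
print count_nodes($root), "\n";   # 31 (a depth-4 binary tree: 2^5 - 1)
```

In the Inline C rewrite, each Perl reference becomes an SV* that the C code must explicitly dereference, which is the by-hand pointer translation the comment describes.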
  • Quick repair tool (Score:3, Interesting)

    by oliderid ( 710055 ) on Friday January 25, 2008 @11:55AM (#22182060) Journal
    The last time I used Perl:
    I'm currently writing a server-based application in C# (Mono). The C# email class was good... but not flexible enough for the multipart, graphically enriched email I had to send (a report, not spam... mind you). I couldn't properly configure the MIME parts (especially "inline"). With C# alone, the only option available would have been a commercial library.

    So I ended up with Perl: perl -MCPAN -e shell, then install MIME::Lite (if I remember correctly).
    A couple of lines later I had a tool ready to send emails (based on HTML pages written by my C# application). The script is fired up by my C# application with several parameters. It works.
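    For the record, the CPAN module is MIME::Lite (not MIME::Light). A minimal sketch of the kind of message described, with a multipart/related body and an inline image; the addresses and filenames are placeholders:

```perl
use strict;
use warnings;
use MIME::Lite;

# Build a multipart/related message: an HTML body that references an
# inline image by Content-ID. Addresses and file paths are placeholders.
my $msg = MIME::Lite->new(
    From    => 'reports@example.com',
    To      => 'boss@example.com',
    Subject => 'Nightly report',
    Type    => 'multipart/related',
);
$msg->attach(
    Type => 'text/html',
    Data => '<html><body><img src="cid:chart1"></body></html>',
);
$msg->attach(
    Type        => 'image/png',
    Path        => 'chart.png',      # placeholder file
    Id          => '<chart1>',       # matches the cid: reference above
    Disposition => 'inline',
);
# $msg->send;    # or $msg->as_string to inspect the raw MIME
```

The Disposition and Id parameters are the knobs the comment found missing in the C# class of the day.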

  • by baldass_newbie ( 136609 ) on Friday January 25, 2008 @12:00PM (#22182128) Homepage Journal
    I don't know what's more disappointing, the fact that this story actually got posted or that the parent was modded down for pointing out the same.
    I actually read the f**king article and came away feeling dumber for actually having done so. Funny enough, I did not see one point that you could not use Ruby or Python to make the EXACT SAME CASE.
    Data manipulation in place? Cripes, if I'm in an Excel file perhaps I could write VBA to do the same thing and a lot easier at that.
    This kind of thing is just inane. Perhaps it will get the occasional CIO who reads this rag to name drop a language they neither use nor understand, but it does little in terms of providing anything useful for the actual codewriters - you know the folks who used to frequent this site.
    Bah. Jet lag and cold coffee make me bitter.
  • Bollocks (Score:5, Interesting)

    by bytesex ( 112972 ) on Friday January 25, 2008 @12:10PM (#22182276) Homepage
    Skipped right down to the stuff that Perl isn't supposed to do: not supposed to be used in high-performance/real-time stuff -- check; as a replacement for shell scripts where shell scripts are shorter -- check (obvious-meter off the scale, though); it isn't supposed to be used in CGI. Eh. Right. Because, according to the author, we should be using Ruby on Rails for that. Eh. Right. Again. Why didn't he just outright say that we should be using J2EE with Struts and beans and XML-based style sheets! Oh, that was 2007! My bad.

    Perl was, and is (IMHO), the first and foremost thing you grab when you write web stuff. CPAN is nothing if not infinite, the web is the text-based thing that Perl was designed for, and its speed makes Ruby blush. So why?

    Why try to write off Perl all the time? Is it because they can't seem to /win/?!
  • Re:Ray Tracing (Score:4, Interesting)

    by Lodragandraoidh ( 639696 ) on Friday January 25, 2008 @12:15PM (#22182336) Journal
    That is why I love python; in most cases there is only one way of doing it - which improves readability, testability, and debugging.

    I was a long-time Perl programmer before I made the switch to Python. All my headaches with Perl went away, and no new headaches of similar magnitude have surfaced. So for me it has been a net improvement.

    KISS, DRY, and various other good engineering/development paradigms are embodied in python's development model.

    Perl made it easy to shoot yourself in the foot. Python makes it hard to shoot yourself in the foot -- but you can if you want to. That probably best sums up their differences.
  • Re:When to use Perl? (Score:5, Interesting)

    by jandrese ( 485 ) <kensama@vt.edu> on Friday January 25, 2008 @01:25PM (#22183434) Homepage Journal
    Ironically, Larry Wall once said that part of the reason he wrote Perl is because he was scared of Awk's parser.
  • Re:ob (Score:1, Interesting)

    by Anonymous Coward on Friday January 25, 2008 @01:45PM (#22183850)
    You're probably trying to be funny w.r.t. the trivial case, but implementing a language in itself is actually a legitimate and useful approach. Read "The Structure and Interpretation of Computer Programs" or "The Art of the Metaobject Protocol" for how to do it, and why you would want to.

    I really don't know why more languages aren't written in themselves. A language like Perl or Python or Ruby, for example, is much higher-level than C. Writing a compiler in an HLL like this would be about a million times easier than writing it in C. You could target something like LLVM and still be portable. It would probably be noticeably faster. All the cool toys your users are writing for analyzing Ruby programs, you could turn around and use on Ruby itself.

    Most importantly, it would remove the big disconnect between the language implementor and the language user. The biggest program that Matz/Guido/Larry are working on is probably Ruby/Python/Perl, which are all written in C. I'd rather they had first-hand experience writing really big programs in the language they're designing. (I'm not saying they don't have any, but if they do, it takes away from their language-implementation time.)

    I know LLVM didn't exist in its current state when these languages were started. Still, if I were running an HLL project, that would be one of my top priorities -- assuming, that is, that the project didn't already have an awesome cross-platform native compiler, like SBCL does.
  • Re:PHP WTF?! (Score:4, Interesting)

    by joggle ( 594025 ) on Friday January 25, 2008 @01:58PM (#22184052) Homepage Journal
    Why do you say that? I write Perl and PHP scripts all the time and don't see any advantage to using Perl for webforms over PHP, at least not the ones I write. It's trivially easy to access data from a database in either scripting language and you can perform Perl-style regular expressions in PHP. The nice thing about PHP is that it's specifically designed for web applications and has simpler syntax in some situations. The downside to PHP is learning all of these functions that don't have a consistent pattern to them but, once you know them, you can accomplish a lot of tasks efficiently.
  • Re:I call bullshit (Score:3, Interesting)

    by AKAImBatman ( 238306 ) <akaimbatman@gmaYEATSil.com minus poet> on Friday January 25, 2008 @02:07PM (#22184176) Homepage Journal
    I call overreaction!

    Seriously, you're picking at an example where I say that some small company somewhere might benefit from the faster development time of Ruby over the advantages of Java? Especially when said company probably doesn't need the same level of scalability you're worried about?

    Geez. Simmer down, will ya? :-/
  • Glue and objects (Score:5, Interesting)

    by goombah99 ( 560566 ) on Friday January 25, 2008 @02:45PM (#22184772)
    The list missed the most important part of Perl: it's a glue language. Python and a few other languages claim they can be glue languages, but that's pretty much a joke to anyone who knows both fluently. Perl is the ultimate glue language for combining diverse output, so that different programs from different sources, written decades apart in different languages, can all work like a well-oiled machine.

    The other really odd experience for me was learning object-oriented programming. I had been programming with objects since I was first introduced to them when the first NeXT computer came out. I used Java and C++ and such. I thought I understood objects.

    Then one day I learned to program object-oriented in Perl. And I learned that while I was fluent in object-oriented usage, I really had a pathetic understanding of how objects worked and what was actually possible with them.

    Perl objects are sort of like owning a copy of Gray's Anatomy or "The Visible Man." You don't just see that arms connect to torsos from the outside; you see all the sinews and bones and blood.

    It's actually amazing how so many things we think of as different concepts in object-oriented programming and databases are actually different reflections of the same trick. And that's the trick Perl uses to make objects.

    In Perl, an object is any variable that has an attribute that can store a list of package names.

    Let's see what you can do with that.

    Hmmm... well, that list can be your inheritance hierarchy, so each package is where you search for methods. But notice that since it's a mutable list, a Perl object can do something that most object-oriented languages cannot: a variable can change its "inheritance" list after the fact. It can change its own class.

    Okay, now this is just a single variable, so where do we get the attributes of the object? Well, if that variable is, say, a hash (dictionary), then we can just use the keys as the attribute names. So where you would write self.foo in C++, you would write $self->{foo} in Perl.

    More fun: let's say you call a method or ask for an attribute that does not exist. Well, a Perl object can just add more packages to its inheritance list. Or it could write the method on the spot and add it to its own inheritance: "I'm my own grandpa." I've used this trick many times to create tables. I don't write any of the "get" or "set" methods. Instead I intercept the call to the method setfoo(), which never existed because I never wrote it; then I have Perl create an attribute called foo ($self->{foo} = "something"), write a subroutine called setfoo, add that subroutine into a package namespace, and put that package in the object's inheritance list (like adding methods to a C++ class outside the declaration). (Programming tip: obviously this could lead to problems with typos, so I also provide the variable with a list of all allowed attribute names -- but of course I can always add to that list later.)

    Now something more exotic. The hottest thing in database programming is the realization that sometimes column-centric databases are better than traditional row-centric database structures. In Perl, an object can change which it is, transparently. For example, if I'm a traditional object with a row organization, then all my attributes are stored as $self->{foo1}, $self->{foo2}, $self->{foo3}, and so on, just as you might write self.foo3 in Python. But I didn't have to do it that way. What if, instead of making the self variable a hash (dictionary), I had made it a simple scalar, say an integer? Well, at first this seems stupid -- where did all the instance variables go? I just store them in the class. I make the scalar self variable's integer an index, and the class keeps the instance variables in arrays -- that is, column-based storage. So, for example, if self = 4, then the attribute foo for this instance becomes $self->class->foo[4].

    The beauty of this is that si
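    A minimal sketch of two of the tricks described above, with invented class and method names: generating a setter into the symbol table the first time it's called (via AUTOLOAD), and an object changing its own class by re-blessing:

```perl
use strict;
use warnings;

package Row;
sub new { my ($class, %attrs) = @_; return bless { %attrs }, $class }

our $AUTOLOAD;
sub AUTOLOAD {
    my $self = shift;
    (my $name = $AUTOLOAD) =~ s/.*:://;
    return if $name eq 'DESTROY';
    if ($name =~ /^set_(\w+)$/) {
        my $attr = $1;
        no strict 'refs';
        # write the method on the spot, then call it
        *{$AUTOLOAD} = sub { my ($s, $v) = @_; $s->{$attr} = $v };
        return $self->$name(@_);
    }
    die "no such method: $name";
}

package FancyRow;
our @ISA = ('Row');

package main;
my $obj = Row->new(foo => 1);
$obj->set_foo(42);            # set_foo is generated on first use
print $obj->{foo}, "\n";      # 42

bless $obj, 'FancyRow';       # an object can change its own class
print ref($obj), "\n";        # FancyRow
```

The @ISA array is the mutable "list of package names" the comment is talking about, and bless is the attribute that stores (one of) them.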
  • by MBGMorden ( 803437 ) on Friday January 25, 2008 @02:55PM (#22184922)
    That's pretty much my point. While I was at college, I worked with Java, C, C++, Fortran, VB, and SPARC Assembly. I have a vague working knowledge of VB and Java syntax. I still remember C and C++ pretty well as I use those still (and I use a lot of PHP as well, but that I picked up after I was out). If you asked me to write something in Fortran or SPARC Asm at this point in time the best I could do without a reference book next to me is a blank stare (The Fortran class I took wasn't even geared towards CS majors. It was just there for Liberal Arts people to get a required computer credit - I took it because for a 3rd year CS student it was like a free A+ to add to your GPA :)). I just haven't used it recently at all and the syntax is lost.

    HOWEVER, I do remember quite well what threads are, what a semaphore is, what a binary tree is, the difference between a bubble/quick/radix sort, the concept of object oriented design, etc. I wish I could say I remembered UML modeling but honestly, I hated that darned part of CS and never paid attention there anyways :P. But, regardless of the percentage, the point still stands: syntax is trivial. The important part is knowing how to think like a programmer. If you can do that the rest just falls into place.
  • by UID30 ( 176734 ) on Friday January 25, 2008 @03:20PM (#22185248)
    was in using Perl to perform an XSL transform, converting XML directly into executable Perl code. Hooray for eval. Surprisingly, it was one of our most stable jobs and ran for years with no problems. I think this was mostly because everybody was afraid to touch it once it got going... Is there anything Perl can't do?
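    The load-bearing piece of that setup is Perl's string eval compiling generated source at runtime. A hypothetical, much-reduced flavor of it (the generated string here is invented; in the real job it came out of the XSL transform):

```perl
use strict;
use warnings;

# Pretend this string was produced by an earlier transform step.
my $generated = 'sub { my ($n) = @_; $n * 2 + 2 }';

# Compile the generated source into a code ref; $@ holds any error.
my $job = eval $generated
    or die "generated code failed to compile: $@";

print $job->(20), "\n";    # 42
```

The same property that makes this powerful (the generated text is fully trusted and executed) is also why everybody was afraid to touch it.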
  • by myvirtualid ( 851756 ) <pwwnow@nOsPaM.gmail.com> on Friday January 25, 2008 @03:28PM (#22185370) Journal

    Languages are for the most part trivial. And universal.

    How about lazy evaluation and currying... ...time to make another language, Haskell.... Do we want... a pointer?

    Please do forgive me (hee hee hee: Please ==> forgive $ me) if I haven't quite gotten your point, but I cannot square your first and last paragraphs, specifically the parts I've quoted.

    Perhaps if you'd written "Imperative languages are for the most part trivial and universal," then I could easily equate C and Fortran and Perl and sed, and leave Haskell and Lisp out of the mix. Or perhaps if you'd written "OO languages are for the most part trivial and universal," then I could equate (more or less easily; don't think it's quite NP) Objective-C and Java and C++. (But see below....)

    But the bold unqualified "they're all languages, get over it" sort of assertion doesn't parse.

    My intro was Fortran, then F77, then Pascal, then C (OMG! Pointers! The Bomb!), and it was evolution, a little more cool, a little more flexibility all along.

    Then I learned C++ at work by day and Java for fun at night, and my head hurt, 'cause I liked the imperative style and OO was weirdly different but everyone was swilling kool aid so I stuck it out...

    ...and every night I'd discover something in Java that was THE BOMB that would solve that day's problem and every next day I'd find that C++ didn't have that feature (does today, AFAIK, but STL was busted then, so everyone rolled their own)... ...but I moved out of real programming before I got my head around OO.

    And now I'm learning Haskell, just 'cause I've learned that making my head hurt from time to time is a great way of stretching myself, of getting better at everything....

    And I don't understand - at least, I haven't wrapped my head around Monads yet, though I get what they're for. And you know what? No pain.

    None. I can feel the approach of enlightenment. SYB and reflect, baby, introspection and lazy evaluation and side-effect free (or reliably and provably constrained side-effect management) via implicit state passing.

    Whoa.

    I feel like Neo after the roof but just before the hallway - I'm starting to believe.

    I've read the "SYB in C++" paper. I get what they're doing. But they admit the gap:

    SYB in Haskell depends on... features... not available in C++, most notably rank-2 types, higher-order functions, and polymorphic type extension....

    Scrap++ is a great exercise, but how do you get to SYB without those? You don't.

    There's a guy out there that /.ers love or hate, no middle ground, so I won't reference him directly, but he's right: Different programming models change how you think of problems, and the right model opens so many doors you didn't even know existed. Doors you couldn't even have described until you knew they were there, but you were unable to find the hallway until you squinted, looked sideways at the world, and watched it shift... ...and were freed from the imperative....

    "Hello World" is a cool teaching aid in Fortran and C and even perl (do it 15 different ways, without string literals or character types, preferably with a program one column wide :->)

    But Haskell? No. When you learn Haskell, think big. 'Cause its programming model is so way different you have no idea. I'm at the point I almost consider Monads harmful... ...but I'll get the other side of the koan soon.

    And when I have a simple repetitive task that I need to automate, I'll stick to bash, 'cause it's clean and readable, and sufficiently fast and sufficiently limited that I have to force myself to be literate, which makes it so much easier 6 months later when I need to tweak $ remember script.

    I won't use Haskell for that. No way. But that replacement for scrabble/scribble I've been thinking of? That tool for edit

  • by theshowmecanuck ( 703852 ) on Friday January 25, 2008 @05:57PM (#22187514) Journal

    I'll take someone with no formal CS training yet is able to think abstractly over employees like him anyday


    Sadly it is the brain dead of the HR departments and headhunters who do the hiring/selecting... usually. To them, what is on paper trumps experience. Seems that ever more often these wastes of brain pans either submit for interviews or will outright hire a newly graduated Masters student (based entirely on their piece of paper) with fuck all experience and make them managers or 'senior' developers, or architects, etc. And people who you would like to hire will have their resumes dumped in the waste bin because they don't have the requisite piece of paper that the dumb fuck HR person thinks makes you useful. So you know who is going to be your new boss. Must stop now before meltdown. Yes, some bitterness. Life. :)
  • by bzipitidoo ( 647217 ) <bzipitidoo@yahoo.com> on Friday January 25, 2008 @08:01PM (#22188786) Journal
    Try this for squaring things. There are times when you really do need a new language to express and use concepts that are cumbersome in existing languages. LISP, Haskell, and Prolog are perhaps different enough to justify this status -- and perhaps only Prolog is really different enough, being declarative while all the rest are imperative. Anyway, it's only cumbersome, not impossible. Most of the time, the "right tool for the job" is trivial hairsplitting among very similar languages, as if there were a big difference between Perl, Java, Python, C/C++, and shell scripts. When making a GUI, it almost doesn't matter what features the language supports; it's GNOME, KDE, or the Windows API that you need to use to drive the thing, and the most important aspect of the language is how easily it can be interfaced with the desired libraries, not whether it supports some aspect of OOP or functional programming or memory management. The same goes for web programming: does the language interface well with Apache and Mozilla? In a sense, Java is just C++ with "web glue": handy means to embed programs in web pages, an object-code halfway point for platform independence (perhaps that's the biggest difference), and calls on some GUI management classes and authentication. And they took the opportunity to throw in automatic garbage collection and clean up a few of the awkward corners of C++. But Java wasn't anything revolutionary in language concepts; it was just a better C++. There is the cookie-cutter Visual Basic approach, which I liken to whipping up a shell script that calls on heavy tools to do a few simple little jobs. You can have the best language in the world, but it's no good if it can't interface. Imagine C without stdio, or any other I/O. Cobol might be better than a lobotomized C that lacks those crucial libraries.

    I don't consider OOP all that revolutionary or different from structured programming. Both are imperative. You can do OOP in C, you don't need C++. You can even do OOP in C without too much trouble -- C has structures and function pointers, and that's really all a class needs. With function pointers, polymorphism is no big deal to implement. I find that with OOP, the technique can get in the way. I have seen people sit there agonizing over what classes are needed, what and where methods and data should exist, and how the inheritance hierarchy should be arranged. They finally make a decision they aren't totally happy about and can't quite put their finger on why, but they need to get something done. Then halfway into writing a program, they think of another way, and spend lots of time shuffling methods and data from the old class hierarchy to a new one. It's the OOP version of renumbering a BASIC program by hand.

    Monads? They're just an abbreviation in syntax. Maybe I don't get it, but it seems to me that monads are functional programmers getting really bent out of shape over nothing. Just because the Haskell equivalent of getc can return different data, they freak out. When in fact, the I/O is just concealing one extra parameter implicitly. Add time (or, file position), and just like that, no more monads. Don't say "c = getc()", say "c = getc(t)" where t is a position in a stream or file. Then the function will always get the same return values for the same inputs. Omitting the t is merely a convenient shorthand they've turned into a big deal.
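    The "add the hidden parameter back" idea is easy to sketch in the thread's own language. A made-up pure getc, where the stream position is an explicit argument and return value rather than state hidden inside a filehandle:

```perl
use strict;
use warnings;

# A "pure" getc: the same (string, position) pair always returns the
# same result, because the position is passed and returned explicitly
# instead of living as hidden state inside a filehandle.
sub getc_at {
    my ($str, $t) = @_;
    return (substr($str, $t, 1), $t + 1);    # (char, next position)
}

my $stream = "hi";
my ($c1, $t1) = getc_at($stream, 0);     # ('h', 1)
my ($c2, $t2) = getc_at($stream, $t1);   # ('i', 2)
print "$c1$c2\n";                        # hi
```

Threading that position through every call by hand is exactly the bookkeeping that Haskell's IO monad automates, which is one way to read the "convenient shorthand" claim above.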

    It is good to experience different language paradigms. I found LISP somewhat painful when I first encountered it. Took a while to get used to doing tail end recursion instead of fruitlessly searching the language for a loop construct, and this "lambda function" stuff threw me for a while until I understood that it's just an anonymous function block. The biggest change was getting accustomed to passing functions around like any other parameter, and stuffing them into lists like any other data.

  • What you're really talking about here is the difference between the Turing machine model of computation and the Lambda Calculus model, and you're absolutely right. Even though the two are provably equivalent (try expressing one of your Haskell programs as a while loop with a stack; it works but it sucks having to write it!), the very mentality that you use when programming in a language like Haskell is so totally radically different from how you program in C that it's useless comparing the two.

    In college I did a lot of work in cryptography and artificial intelligence. These are tasks for which functional programming is particularly well suited. I remember the day that I learned what the functional folding did. It was like that moment you're talking about, when suddenly I realized that summing all the values in a list didn't require its own function and it was just one succinct statement. Suddenly the evaluation of a perceptron network became a functional fold nested in a functional map, and my AI code shrank from hundreds of lines to dozens, but was still perfectly readable.
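    In the spirit of the thread, the fold-nested-in-a-map shape can even be mimicked in Perl with List::Util's reduce. The two-neuron network below is invented purely to show the structure: each neuron is a fold over weighted inputs, and the layer is a map over neurons:

```perl
use strict;
use warnings;
use List::Util qw(reduce);

# One neuron: a fold (reduce) over the weighted inputs, then a step
# activation. Weights and inputs below are made up for illustration.
sub neuron {
    my ($weights, $inputs) = @_;
    my $sum = reduce { $a + $b }
              0, map { $weights->[$_] * $inputs->[$_] } 0 .. $#$weights;
    return $sum > 0 ? 1 : 0;
}

# One layer: a map over the neurons' weight rows.
sub layer {
    my ($weight_rows, $inputs) = @_;
    return [ map { neuron($_, $inputs) } @$weight_rows ];
}

my $inputs        = [1, 0, 1];
my $layer_weights = [ [ 0.5, 0.2, 0.4],    # sums to 0.9: fires
                      [-1.0, 0.3, 0.1] ];  # sums to -0.9: doesn't
my $out = layer($layer_weights, $inputs);
print "@$out\n";    # 1 0
```

It's the same "fold nested in a map" the comment describes; Haskell just makes that shape the default way of thinking rather than an imported utility.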

    People can say that languages are universal . . . but that's really not the full story is it? The fibonacci sequence, for instance. I'd express it like this in Haskell:

    let fibs = 1 : 1 : zipWith (+) fibs (tail fibs)

    How do you explain that line of code to someone who's only ever seen C? And yet it'll compute the 1000th Fibonacci number almost instantly on my computer. The infinite-list idiom can only exist in a lazily evaluated language, and that's a concept that doesn't even translate well to other functional languages like ML (unless you cheat and use ML's lazy module, which I've never really figured out).
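    For what it's worth, the idiom can be approximated in a strict language with explicit thunks. A rough Perl translation of the same self-referential stream (without the sharing Haskell gets for free, so it's exponential and only good for small n):

```perl
use strict;
use warnings;

# A lazy stream as [head, coderef-returning-the-rest].
sub cons { my ($h, $t) = @_; [$h, $t] }
sub head { $_[0][0] }
sub tail { $_[0][1]->() }    # force the thunk

sub zip_with {
    my ($f, $a, $b) = @_;
    cons($f->(head($a), head($b)),
         sub { zip_with($f, tail($a), tail($b)) });
}

# fibs = 1 : 1 : zipWith (+) fibs (tail fibs), hand-translated
my $fibs;
$fibs = cons(1, sub {
    cons(1, sub { zip_with(sub { $_[0] + $_[1] }, $fibs, tail($fibs)) });
});

sub take {
    my ($s, $n) = @_;
    my @out;
    while ($n--) { push @out, head($s); $s = tail($s) }
    return @out;
}

print join(",", take($fibs, 8)), "\n";    # 1,1,2,3,5,8,13,21
```

The contrast makes the Haskell point rather than refuting it: the one-liner is buried under plumbing here, and the memoization that makes the Haskell version fast has to be added by hand.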

    One of the greatest things that I learned in college was that languages are not universal. The people who say they are obviously talking about Java vs. C vs. C# et cetera. Going from imperative to semi-OO like C to C++, you're not really switching entire paradigms. Going from imperative to, say, pure OO (see also: Ruby, Smalltalk), or going from imperative to pure functional (see also: lisp, ML, Haskell, Prolog), is quite a mindfsck.

    Anyways I'm glad to hear that it's not just me here who loves the functional languages. Have you decided to do anything cool with it yet?

  • Re:Ray Tracing (Score:3, Interesting)

    by tyler_larson ( 558763 ) on Sunday January 27, 2008 @09:00PM (#22203716) Homepage

    The only perfect score went to an emulator written in Perl. The built-in hash tables and some smart programming, combined with the ease of parsing the microcode and program data, created not only the fastest emulator (some classmates used C, C++, Lisp, or Java to write theirs) but also the easiest to read of the group.

    It's the programmer that creates slow, unreadable code, not the language.

    Yes, that reminds me a bit about a class I used to teach for a former employer. I was teaching old-time C programmers to use one of these up-and-coming GC-managed, JIT-compiled language/frameworks for some of the newer systems we were building. When we came to the topic of speed and efficiency, the question kept coming up: "Can't we write faster, more efficient code in C?"

    Of course, the answer always was, "Yes, you can. In fact, you can write still more efficient code in assembly. The question isn't whether you can, the question is whether you will!"
