Software

John W. Backus Dies at 82; Developed FORTRAN

A number of readers let us know of the passing of John W. Backus, who assembled a team to develop FORTRAN at IBM in the 1950s. It was the first widely used high-level language. Backus later worked on a "function-level" programming language, FP, which was described in his Turing Award lecture "Can Programming be Liberated from the von Neumann Style?" and is viewed as Backus's apology for creating FORTRAN. He received the 1977 ACM Turing Award "for profound, influential, and lasting contributions to the design of practical high-level programming systems, notably through his work on FORTRAN, and for seminal publication of formal procedures for the specification of programming languages."

  • Re:Wow. (Score:2, Interesting)

    by MichaelSmith ( 789609 ) on Tuesday March 20, 2007 @05:09AM (#18411751) Homepage Journal

    I'm surprised he even lived to 82 without being killed by a rabid programmer. ;)

    I am inclined to blame him for Basic as well, because it started out as a kind of simplified Fortran.

  • Re:Also known for... (Score:3, Interesting)

    by tcopeland ( 32225 ) <tom AT thomasleecopeland DOT com> on Tuesday March 20, 2007 @06:04AM (#18411949) Homepage
    > Many times I have edited lex and yacc code, but
    > never have I understood what the hell I was doing.

    So true. I'm writing a JavaCC book [generating...javacc.com] and I'm still learning new stuff about it even though I'm almost done with the book.

    The thing that's worked best for me is writing the lexical spec first, then going back and writing the parser spec. At least then you know that the basic tokens of the language are being recognized before you try to shape them into a parse tree.
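
    Not JavaCC, but the same two-phase idea can be sketched in a few lines of Haskell for a toy language of integers and "+" (all the names here are made up for illustration): get the tokens right first, then worry about shaping them into something structured.

    import Data.Char (isDigit, isSpace)

    data Token = TNum Int | TPlus deriving (Show, Eq)

    -- Phase 1: the "lexical spec" -- raw text in, tokens out.
    tokenize :: String -> [Token]
    tokenize [] = []
    tokenize (c:cs)
      | isSpace c = tokenize cs
      | c == '+'  = TPlus : tokenize cs
      | isDigit c = let (ds, rest) = span isDigit (c:cs)
                    in TNum (read ds) : tokenize rest
      | otherwise = error ("unexpected character: " ++ [c])

    -- Phase 2: the "parser spec" -- shape the token stream into a value.
    parseSum :: [Token] -> Int
    parseSum [TNum n]                = n
    parseSum (TNum n : TPlus : rest) = n + parseSum rest
    parseSum _                       = error "parse error"

    -- parseSum (tokenize "1 + 2 + 3")  ==>  6

    Once tokenize works on its own, debugging the parser is a much smaller job.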
  • by Slashamatic ( 553801 ) on Tuesday March 20, 2007 @06:56AM (#18412159)
    To heck with just statistics. Fortran is alive and well at the heart of some major airline reservation, check-in and cargo systems. Yes, they tried to move to newer technologies, but they couldn't handle the load, particularly at points when there is a lot of rescheduling, such as during bad weather.
  • by Edward Kmett ( 123105 ) on Tuesday March 20, 2007 @07:24AM (#18412263) Homepage

    I find it somewhat troubling that in this article John Backus is remembered primarily for the genie that he tried to put back in the bottle.

    FORTRAN was utilitarian and procedural and good at enabling engineers and scientists to get work done. However, the problem with FORTRAN is that the imperative pattern of thought it imposed led us to tell the computer a precise sequence of steps to accomplish each task. It doesn't offer information on dependencies, simply a "go here, do that" sequence of instructions. Imperative programs are inherently hard to reason about in terms of global state and effects, and as written they tend to be subject to off-by-one errors.

    Backus saw this in 1978! See http://www.stanford.edu/class/cs242/readings/backus.pdf [stanford.edu].

    His insight spawned a great deal of the interest in functional programming languages. It has been credited by Paul Hudak of Haskell fame http://portal.acm.org/citation.cfm?doid=72551.72554 [acm.org] (ACM membership required) (summarized here http://lambda-the-ultimate.org/classic/message4172.html [lambda-the-ultimate.org]) and others as really helping to turn the tide and keep functional programming languages from being snuffed out.

    A lot of people don't see the point, having never programmed in a functional programming language like Haskell or ML. However, even those people see dozens of cores on the horizon and wonder how they are going to deal with the debugging issues associated with all of the threads needed to keep those processors churning.

    Functional programming offers an alternative viewpoint that is arguably much better suited to handling multiple CPUs working on large datasets. A case for this was recently reiterated by Tim Sweeney of Epic Megagames fame, who said "in a concurrent world, imperative is the wrong default!" http://www.st.cs.uni-sb.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf [uni-sb.de].
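
    To make that concrete, here's a minimal sketch of that viewpoint using GHC's parallel package (the function and inputs are invented for illustration). Because the mapped function is pure, the runtime can farm the elements out across however many cores you give it, with no locks or thread management in sight.

    import Control.Parallel.Strategies (parMap, rdeepseq)

    -- A stand-in for some expensive, side-effect-free computation.
    work :: Int -> Int
    work n = sum [1..n]

    -- Evaluate the list elements in parallel; run with +RTS -N to use all cores.
    results :: [Int]
    results = parMap rdeepseq work [1000000, 2000000, 3000000]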

    Haskell has brought Software Transactional Memory (STM) into play, offering an alternative to traditional mutexes and locks that is compositional in nature, unlike locking models. This approach isn't readily emulated in an imperative setting because of the lack of guarantees about side effects. http://research.microsoft.com/~simonpj/papers/stm/index.htm [microsoft.com].
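
    For anyone who hasn't seen it, here's a minimal sketch of what that looks like with GHC's stm package (the account names and amounts are made up). The point is that transfer composes: two transfers wrapped in one atomically form a single transaction, with no lock ordering to get wrong.

    import Control.Concurrent.STM

    -- Move an amount between two shared balances inside one transaction.
    transfer :: TVar Int -> TVar Int -> Int -> STM ()
    transfer from to amount = do
      balance <- readTVar from
      writeTVar from (balance - amount)
      modifyTVar' to (+ amount)

    main :: IO ()
    main = do
      alice <- newTVarIO 100
      bob   <- newTVarIO 50
      -- Either both transfers happen or neither does.
      atomically (transfer alice bob 30 >> transfer bob alice 10)
      readTVarIO alice >>= print   -- 80
      readTVarIO bob   >>= print   -- 70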

    These are solutions to real problems that we are experiencing today, not some academic sideshow, and they arise from a school of thought that he helped bring a great deal of attention to.

    If you want to do something to remember Backus, take the time to learn OCaml or Haskell, or even just take the time to learn how to effectively use the map and fold functions in Perl, PHP or Ruby.
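
    A couple of throwaway lines in Haskell, just to show how little there is to the map/fold habit (Perl's map and List::Util's reduce, PHP's array_map/array_reduce and Ruby's map/inject follow the same shape):

    import Data.List (foldl')

    squares :: [Int]
    squares = map (^ 2) [1, 2, 3, 4]    -- [1,4,9,16]

    total :: Int
    total = foldl' (+) 0 squares        -- 1 + 4 + 9 + 16 = 30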

    It is his willingness to turn his back on what was perceived as his greatest work, when confronted with a better idea, for which I will remember him. I am a better programmer today for having learned what I could from his ideas.

  • by MichaelSmith ( 789609 ) on Tuesday March 20, 2007 @07:29AM (#18412279) Homepage Journal

    > What's so wrong with FORTRAN?

    Well, nothing really. It was a good early attempt. Its mistake was in surviving too long.

    > From the sound of things, it's like the guy committed a crime or something

    Yeah, I have been one of the worst offenders in this thread. Sorry about that. He was obviously an accomplished guy and I am sorry he is gone.

    > ...if it was so 'destructive' or whatever then how come it got so popular?

    Hard to explain. It attracted a certain, let's say, blue-collar group of programmers. People who like all their identifiers to be called MODSTR and ARGSET. Lisp was the gay, lower-case language. FORTRAN was for REAL MEN who type REAL CHARACTERS. These were scientists who wanted to knock out quick and dirty code. It was the perl of its day, and like perl, got used for things it should not have been used for.

    > Or did it? Why did so many choose to use it?

    Something about the compilers which were built for it got it entrenched in the performance computing field. To this day people will claim that Fortran code runs faster than anything else for pure numerical applications. I would be surprised if that were still true, though.

  • Backus-Naur Form (Score:3, Interesting)

    by mwvdlee ( 775178 ) on Tuesday March 20, 2007 @07:35AM (#18412303) Homepage
    The funny thing about Backus-Naur Form is that Naur himself says it should be Backus Normal Form, as it was called before Naur used a slightly modified version of it.
  • New meme (Score:2, Interesting)

    by TuringTest ( 533084 ) on Tuesday March 20, 2007 @07:49AM (#18412357) Journal
    I can see a new trend of "Goodbye cruel world" programs replacing the "Hello world" equivalents, as designers of programming languages pass away.
  • Re:rest in peace (Score:1, Interesting)

    by Anonymous Coward on Tuesday March 20, 2007 @07:57AM (#18412407)
    > In scientific code FORTRAN tends to be 20% faster than the best possible C++ implementation because the grammar is so simple that compilers tend to understand the code better and can vectorize or optimize it much further than C.

    I'm losing my mod points on this article, but statements like this just piss me off. This is only true if the C/C++ programmer doesn't know what they are doing. I work with engineers all the time who say the same thing about FORTRAN, but not once has anyone ever been able to show me a piece of FORTRAN code that I couldn't recode in C with the same or better performance. I have done it on numerous occasions to the shock of my fellow workmates (who obviously can't code C/C++ for crap).

    FORTRAN is not all bad and I use it for certain things but to make such statements about performance just shows lack of experience/skill.
  • Re:rest in peace (Score:3, Interesting)

    by dario_moreno ( 263767 ) on Tuesday March 20, 2007 @08:21AM (#18412541) Journal
    There was an article on that in "Computers in Science and Engineering" a few years ago. I run this experiment every year with my students and it still holds. Just have a look at the assembler generated by a commercial Fortran compiler with optimization turned on and compare it to the same code in C, and especially C++: look at register use, the number of FLOPs generated, use of vectorized extensions, complex instructions and so on. It is all the more true with today's hardware optimizations. Of course, if you spend 10 times as much time optimizing formulas and assigning variables to registers, your C code can be as fast as optimized Fortran; we are speaking about straight-to-the-point code. I know from hardware counters that my code runs at maybe 10% of theoretical machine efficiency, but I do not have a few years to optimize everything.
  • by TapeCutter ( 624760 ) on Tuesday March 20, 2007 @09:38AM (#18413273) Journal
    How long a legacy system lasts is decided by accountants.

    The thing with a huge legacy system such as the airlines use is its stability; they go by the rule "if it works, don't touch it". I worked on a large dispatch system for a telco. The back end was HP-UX/C that had been ported from FORTRAN and had been in (limited) production at some other telco for a few years (they bought a snapshot of the source code); the central route planning and dispatch algorithm was similar to that used by the airline systems.

    Our job was to write the client (Win 3.1), the comms (9600 bps on a clear day), and the transaction manglers that connected the back end to the mainframes housing the customer databases, billing, materials, etc. The first live pilot of 12 servicemen took several executives, 4 on-site programmers, a back-end team, numerous phone calls and 3 hours before all 12 servicemen had their first job for the day (they got through less than a third of their normal workload that day). It took ~2 years of piloting with 200 users and then careful ramp-up before the system could compete with the old-fashioned pencil-and-paper worksheet that serviced the other 8000 users.

    The reason legacy systems are surrounded by a fortress of red tape that requires an "act of god" before anything is changed: the telco replaced 600 regional depots with 30 small dispatch centers; a $100M investment in mobile hardware and custom software enabled $600M in real estate sales (circa 1993-2000). After the sell-off we became a legacy system in our own right when the front end was replaced with mobile phones pre-installed with browsers. After installing a transaction mangler to connect our comms to a web server, the back end was wrapped in red tape and the development and maintenance teams were disbanded; AFAIK it is still running.

    The corporate approach to legacy systems is similar to the high-level "functional programming" that Backus advocated, but the driver is profit and predictability; elegance and efficiency are "nice-to-haves". The underlying algorithms for these kinds of systems come from OR [wikipedia.org] and have remained largely unchanged since the Second World War. Where is the profit in replacing what IBM euphemistically calls a "functionally stable application" when competitors with new systems are only marginally more efficient at best?
  • by TwobyTwo ( 588727 ) on Tuesday March 20, 2007 @09:47AM (#18413359)


    When I was in my early 20s and had been programming only a few years, and John was already a legend and IBM Fellow for his work on FORTRAN, I had the pleasure of meeting him informally a few times. You would have thought our positions and experiences were nearly the same. He was always as engaged and delighted with younger people like me as with other giants of the computer field, some of whom were standing right with us at those get-togethers (Jim Gray [wikipedia.org] comes to mind). John was extraordinarily decent, kind, and down-to-earth, and he will be very much missed.



    I think some of the wise guys/gals on this list are missing the point of the FORTRAN team's contributions. It wasn't that FORTRAN was the perfect language. To some degree, that wasn't even the goal. Quoting from an article by Backus [acm.org] (full text is available only to ACM subscribers, unfortunately):

    "To this day I believe that our emphasis on object program efficiency rather than on language design was basically correct. I believe that had we failed to produce efficient programs, the widespread use of languages like FORTRAN would have been seriously delayed.".


    At the time the FORTRAN work was done, people didn't believe that a compiler could produce code that was fast enough. If you go back to the early references on FORTRAN, you'll find that they implemented optimizations that were still considered sophisticated 15 years later. The difference is: the FORTRAN team did it at a time when nobody had done it before. Furthermore, they did it on an IBM 704 [ibm.com] that would be too weak (if not too small!) to power a wristwatch today. Its core storage units [ibm.com] were tens of cubic feet in size, and each held 4K 36-bit words, or about 18K bytes in modern terms. Even the "high speed" drum storage units (like a disk, but with no seeking needed) held only 16K of those 36-bit words. On this machine, they built optimizations that were considered sophisticated even decades later, when machines had gotten much bigger and faster. Quoting from that same article:

    "It is beyond the scope of this paper to go into the details of the analysis which section 2 [I.e. the optimizer] carried out. It will suffice to say that it produced code of such efficiency that its output would startle the programmers who studied it. It moved code out of loops where that was possible; it took advantage of the differences between rowwise and column-wise scans; it took note of special cases to optimize even the exits from loops. The degree of optimization performed by section 2 in its treatment of indexing, array references, and loops was not equalled again until optimizing compilers began to appear in the middle and late sixties."

    The computing field has lost someone very special.

  • Re:Wow. (Score:5, Interesting)

    by mysticgoat ( 582871 ) on Tuesday March 20, 2007 @10:20AM (#18413777) Homepage Journal

    > I am inclined to blame him for Basic as well, because it started out as a kind of simplified Fortran.

    FORTRAN was the first working high-level compiled language; BASIC was the first working interpreted language. Very different underlying structures.

    Now COBOL was the second major high-level compiled language, and it was very much a reaction to FORTRAN, so I suppose, using the parent post's logic, we can blame Backus for COBOL. But then that cheapens the contributions of the Girl Admiral (Grace Hopper), who gave us such wonders as the nanosecond wire and MULTIPLY 2 BY 2 GIVING FOUR.

    For the youngsters out there:

    1. FORTRAN (FORmula TRANslator) was the breakthrough from machine language and assembly to a higher-level language with a compiler. Everything we do now is based on this; I believe that many mission-critical engineering libraries are still in Fortran (they were a few years ago).
    2. COBOL (COmmon Business Oriented Language, Compiles Only By Odd Luck) was the second successful high-level language. Its major improvement over Fortran was getting rid of triphasic logic (branch on <0, =0, or >0) in favor of boolean logic (branch on !0 or 0). Its most noteworthy failing was the requirement to use the period punctuation mark (full stop) to end sections. This was particularly a problem for girl programmers, since at the time getting into trouble because you missed a period had serious consequences. COBOL simply put too much emphasis on a nearly invisible and easily missed period.
    3. BASIC (Beginners All-purpose Symbolic Instruction Code) made two big advances: first, it attempted to span both engineering and business computing (doing each with the same degree of imprecision); second and more important, it introduced the concept of using an interpreter rather than a compiler. Good stuff, that. Yet another baby step toward tomorrow's virtual machines. Most noteworthy program ever written in Basic: Eliza. Most significant long term contribution: the reaction to its spaghetti coding style, from which Pascal and modern procedure based programming arose.

    </drivel>

    It is hard for some of us graybeards to poke fun at Backus. His vision was the inspiration that has taken us all down this road.

  • Re:What do you know? (Score:3, Interesting)

    by Frumious Wombat ( 845680 ) on Tuesday March 20, 2007 @10:41AM (#18414089)
    Its more modern forms, such as F95, aren't bad at all. All the power of C++ as far as numerics goes, a consistent and sane syntax (complex variables and operations are just there, not reimplemented by every programmer in their own unique way), modern control and data structures, and easy to read. (Translation: it doesn't look like line noise.)

    It remains popular due to backwards compatibility, and the ease of writing numeric code with it.

    As for other uses, I've seen cgi-bin scripts written in F77, and a million-line massively parallel quantum-chemistry package written in an object-oriented fashion in F77. In the hands of modern compilers, it is an amazing language.
  • by ajs ( 35943 ) <{ajs} {at} {ajs.com}> on Tuesday March 20, 2007 @10:53AM (#18414279) Homepage Journal
    Linguists, in general, seem to have an interesting way of looking at the world. Chomsky looks at a political situation in a unique way and formulates an opinion that's not widely held (in fact, one that typically annoys all extremes of the political spectrum). Right or wrong, it's an interesting process.

    Larry Wall has a similar outlook (though his politics likely diverge heavily from Chomsky's, I dunno). He has that linguist's way of looking at theoretically opposing points of view and rationalizing them against each other in a very logical way. It's a kind of fun process to watch, and it makes me wish I had that knack.

    Back to Backus: he will be missed. His work in CS was truly ground-breaking, even (especially) where it simply extended the work of others. FORTRAN has a legacy all its own, and the fact that the scientific community continues to use it to this day is testimony to its power and utility, even if much of it is dated now.
