John W. Backus Dies at 82; Developed FORTRAN 271
A number of readers let us know of the passing of John W. Backus, who assembled a team to develop FORTRAN at IBM in the 1950s. It was the first widely used high-level language. Backus later worked on a "function-level" programming language, FP, which was described in his Turing Award lecture "Can Programming be Liberated from the von Neumann Style?" and is viewed as Backus's apology for creating FORTRAN. He received the 1977 ACM Turing Award "for profound, influential, and lasting contributions to the design of practical high-level programming systems, notably through his work on FORTRAN, and for seminal publication of formal procedures for the specification of programming languages."
Re:Wow. (Score:2, Interesting)
I am inclined to blame him for Basic as well, because it started out as a kind of simplified Fortran.
Re:Also known for... (Score:3, Interesting)
> never have I understood what the hell I was doing.
So true. I'm writing a JavaCC book [generating...javacc.com] and I'm still learning new stuff about it even though I'm almost done with the book.
The thing that's worked best for me is writing the lexical spec first, then going back and writing the parser spec. At least then you know that the basic tokens of the language are being recognized before you try to shape them into a parse tree.
Re:We Stand On The Shoulders of Giants (Score:4, Interesting)
Remember him not for FORTRAN (Score:5, Interesting)
I find it somewhat troubling that in this article John Backus is remembered primarily for the genie that he tried to put back in the bottle.
FORTRAN was utilitarian and procedural and good at enabling engineers and scientists to get work done. The problem with FORTRAN, however, is that the imperative pattern of thought it imposed led us to tell the computer a precise sequence of steps to accomplish each task. It doesn't offer information on dependencies, simply a "go here, do that" sequence of instructions. Imperative programs are inherently hard to reason about in terms of global state and effects, and as written tend to be subject to off-by-one errors.
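As a rough illustration of the off-by-one point, here's a sketch in Haskell (the function name and data are made up for the example):

```haskell
-- Imperative pseudocode: total = 0; for i = 1 to n: total = total + a[i]*a[i]
-- The functional version below has no index variable and no loop bounds,
-- so the classic off-by-one mistake has nowhere to hide.
sumSquares :: [Int] -> Int
sumSquares = foldr (\x acc -> x * x + acc) 0

main :: IO ()
main = print (sumSquares [1 .. 10])  -- prints 385
```

The fold states *what* the result is (a combination of the elements), not *how* to walk the array, which is exactly the dependency information the imperative version hides.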
Backus saw this in 1978! See http://www.stanford.edu/class/cs242/readings/backus.pdf [stanford.edu].
His insight spawned a great deal of the interest in functional programming languages. It has been credited by Paul Hudak of Haskell fame http://portal.acm.org/citation.cfm?doid=72551.72554 [acm.org] (ACM membership required) (summarized here http://lambda-the-ultimate.org/classic/message4172.html [lambda-the-ultimate.org]) and others with really helping to turn the tide and keeping functional programming languages from being snuffed out.
A lot of people don't see the point, having never programmed in a functional programming language like Haskell or ML. However, even those people see dozens of cores on the horizon and wonder how they are going to deal with the debugging issues associated with all the threads needed to keep those processors churning.
Functional programming offers an alternative viewpoint that is arguably much better suited to handle multiple CPUs working on large datasets. A case for this was recently reiterated by Tim Sweeney of Epic Megagames fame, who said "in a concurrent world, imperative is the wrong default!" http://www.st.cs.uni-sb.de/edu/seminare/2005/advanced-fp/docs/sweeny.pdf [uni-sb.de].
Haskell has brought Software Transactional Memory (STM) into play, offering an alternative to traditional mutexes and locks that is compositional in nature, unlike locking models. This is an approach that isn't readily emulable in an imperative setting because of the lack of guarantees about side effects. http://research.microsoft.com/~simonpj/papers/stm/index.htm [microsoft.com].
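For the curious, here's a minimal sketch of what that composition looks like using GHC's stm library (the bank-account setup is invented for illustration, not from the paper):

```haskell
import Control.Concurrent.STM

type Account = TVar Int

-- Each action is an STM fragment; neither one takes a lock.
withdraw, deposit :: Account -> Int -> STM ()
withdraw acc n = modifyTVar' acc (subtract n)
deposit  acc n = modifyTVar' acc (+ n)

-- Composition: two small fragments become one atomic transaction,
-- with no lock-ordering rules to get wrong and no deadlock to reason about.
transfer :: Account -> Account -> Int -> IO ()
transfer from to n = atomically (withdraw from n >> deposit to n)

main :: IO ()
main = do
  a <- newTVarIO 100
  b <- newTVarIO 0
  transfer a b 30
  bals <- (,) <$> readTVarIO a <*> readTVarIO b
  print bals  -- prints (70,30)
```

The point is the `atomically` wrapper: with locks, gluing `withdraw` and `deposit` together safely requires knowing which locks each takes internally; with STM the composed transaction is safe by construction.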
These are solutions to real problems that we are experiencing today, not some academic sideshow, and they arise from a school of thought that he helped bring a great deal of attention to.
If you want to do something to remember Backus, take the time to learn OCaml or Haskell, or even just take the time to learn how to effectively use the map and fold functions in Perl, PHP or Ruby.
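The same two functions exist nearly everywhere under various names (Perl's map and List::Util's reduce, Ruby's map and inject). In Haskell the idea looks like this (the function and sample data are made up):

```haskell
-- map rewrites every element; foldr collapses a list to a single value.
-- Together they replace a hand-rolled loop with its own counter variable.
totalLength :: [String] -> Int
totalLength = foldr (+) 0 . map length

main :: IO ()
main = print (totalLength ["map", "and", "fold"])  -- prints 10
```

Once map and fold feel natural, a lot of everyday loop code turns out to be one of the two in disguise.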
It is his willingness to turn his back on what was perceived as his greatest work when confronted with a better idea for which I will remember him, and I am a better programmer today for having learned what I could from his ideas.
Re:What's with the flaming? (Score:3, Interesting)
Well nothing really. It was a good, early attempt. Its mistake was in surviving too long.
Yeah I have been one of the worst offenders in this article. Sorry about that. He was obviously an accomplished guy and I am sorry he is gone.
Hard to explain. It attracted a certain, let's say, blue collar group of programmers. People who like all their identifiers to be called MODSTR and ARGSET. Lisp was the gay, lower case language. FORTRAN was for REAL MEN who type REAL CHARACTERS. These were scientists who wanted to knock out quick and dirty code. It was the perl of its day, and like perl, got used for things it should not have been used for.
Something about the compilers built for it got it entrenched in the performance computing field. To this day people will claim that FORTRAN code runs faster than anything else for pure numerical applications. I would be surprised if it were still true, though.
Backus-Naur Form (Score:3, Interesting)
New meme (Score:2, Interesting)
Re:rest in peace (Score:1, Interesting)
I'm losing my mod points on this article, but statements like this just piss me off. This is only true if the C/C++ programmer doesn't know what they are doing. I work with engineers all the time who say the same thing about FORTRAN, but not once has anyone ever been able to show me a piece of FORTRAN code that I couldn't recode in C with the same or better performance. I have done it on numerous occasions, to the shock of my fellow workmates (who obviously can't code C/C++ for crap).
FORTRAN is not all bad, and I use it for certain things, but to make such statements about performance just shows a lack of experience/skill.
Re:rest in peace (Score:3, Interesting)
Re:We Stand On The Shoulders of Giants (Score:3, Interesting)
The thing with a huge legacy system such as the airlines use is its stability; they go by the rule "if it works, don't touch it". I worked on a large dispatch system for a telco. The back end was HPUX/C that had been ported from FORTRAN and had been in (limited) production at some other telco for a few years (they bought a snapshot of the source code); the central route-planning and dispatch algorithm was similar to that used by the airline systems.
Our job was to write the client (Win 3.1), the comms (9600bps on a clear day), and the transaction manglers that connected the back end to the mainframes housing the customer databases, billing, materials, etc. The first live pilot of 12 servicemen took several executives, 4 on-site programmers, a back-end team, numerous phone calls, and 3 hours before all 12 servicemen had their first job for the day (they got through less than a third of their normal workload that day). It took ~2 years of piloting with 200 users and then a careful ramp-up before the system could compete with the old-fashioned pencil-and-paper worksheet that serviced the other 8000 users.
The reason legacy systems are surrounded by a fortress of red tape that requires an "act of god" before anything is changed: the telco replaced 600 regional depots with 30 small dispatch centers, and a $100M investment in mobile hardware and custom software enabled $600M in real estate sales (circa 1993-2000). After the sell-off we became a legacy system in our own right when the front end was replaced with mobile phones pre-installed with browsers. After installing a transaction mangler to connect our comms to a web server, the back end was wrapped in red tape and the development and maintenance teams were disbanded; AFAIK it is still running.
The corporate approach to legacy systems is similar to the high-level "functional programming" that Backus advocated, but the driver is profit and predictability; elegance and efficiency are nice-to-haves. The underlying algorithms for these kinds of systems come from OR [wikipedia.org] and have remained largely unchanged since the Second World War. Where is the profit in replacing what IBM euphemistically calls a "functionally stable application" when competitors with new systems are only marginally more efficient at best?
We've lost a wonderfully nice guy (Score:5, Interesting)
When I was in my early 20s and had been programming only a few years, and John was already a legend and IBM Fellow for his work on FORTRAN, I had the pleasure of meeting him informally a few times. You would have thought our positions and experiences were nearly the same. He was always as engaged and delighted with younger people like me as with other giants of the computer field, some of whom were standing right with us at those get togethers (Jim Gray [wikipedia.org] comes to mind). John was extraordinarily decent, kind, and down-to-earth, and he will be very much missed.
I think some of the wise guys/gals on this list are missing the point of the FORTRAN team's contributions. It wasn't that FORTRAN was the perfect language. To some degree, that wasn't even the goal. Quoting from an article by Backus [acm.org] (full text is available only to ACM subscribers, unfortunately):
At the time the FORTRAN work was done, people didn't believe that a compiler could produce code that was fast enough. If you go back to the early references on FORTRAN you'll find that they implemented optimizations that were still considered sophisticated 15 years later. The difference is: the FORTRAN team did it at a time when nobody had done it before. Furthermore, they did it on an IBM 704 [ibm.com] that would be too weak (if not too small!) to power a wrist watch today. Its core storage units [ibm.com] were tens of cubic feet in size, and each held 4K 36-bit words, or about 18K bytes in modern terms. Even the "high speed" drum storage units (like a disk, but with no seeking needed) held only 16K of those 36-bit words. On this machine, they built optimizations that held up even decades later, when machines had gotten much bigger and faster.
The computing field has lost someone very special.
Re:Wow. (Score:5, Interesting)
FORTRAN was the first working high level compiler language; BASIC was the first working interpreter language. Very different underlying structures.
Now COBOL was the second major high level compiler language, and it was very much a reaction to FORTRAN, so I suppose using parent post's logic, we can blame Backus for COBOL. But then that cheapens the contributions of the Girl Admiral (Grace Hopper) who gave us such wonders as the nanosecond wire, and MULTIPLY 2 BY 2 GIVING FOUR.
For the youngsters out there:
</drivel>
It is hard for some of us graybeards to poke fun at Backus. His vision was the inspiration that has taken us all down this road.
Re:What do you know? (Score:3, Interesting)
It remains popular due to backwards compatibility, and the ease of writing numeric code with it.
As for other uses, I've seen cgi-bin scripts written in F77, and a million-line massively-parallel quantum-chemistry package in an object-oriented fashion using F77. In the hands of modern compilers, it is an amazing language.
Re:Yup, it's the same Chomsky (Score:3, Interesting)
Larry Wall has a similar outlook (though his politics likely diverge heavily from Chomsky's, I dunno). He has that linguist's way of looking at theoretically opposing points of view and rationalizing them against each other in a very logical way. It's a fun process to watch, and it makes me wish I had that knack.
Back to Backus: he will be missed. His work in CS was truly ground-breaking, even (especially) where it simply extended the work of others. FORTRAN has a legacy all its own, and the fact that the scientific community still continues to use it to this day is testimony to its power and utility, even if much of it is dated today.