John W. Backus Dies at 82; Developed FORTRAN

Posted by kdawson
from the go-to-considered-seminal dept.
A number of readers let us know of the passing of John W. Backus, who assembled the team that developed FORTRAN at IBM in the 1950s. It was the first widely used high-level language. Backus later worked on a "function-level" programming language, FP, which he described in his Turing Award lecture "Can Programming Be Liberated from the von Neumann Style?" and which is viewed as Backus's apology for creating FORTRAN. He received the 1977 ACM Turing Award "for profound, influential, and lasting contributions to the design of practical high-level programming systems, notably through his work on FORTRAN, and for seminal publication of formal procedures for the specification of programming languages."
  • Re:Wow. (Score:5, Insightful)

    by Anonymous Coward on Tuesday March 20, 2007 @05:25AM (#18411815)
    I am inclined to blame him for Basic as well, because it started out as a kind of simplified Fortran.

    I'm more inclined to thank him for all the other high level programming languages.
  • by _Hellfire_ (170113) on Tuesday March 20, 2007 @05:28AM (#18411823) Homepage
    I understand the context in which the word "apology" is being used (as in "justification"), but I had to laugh at the semantics of "apologising for FORTRAN".

    82 is a good innings. No matter what you think of FORTRAN as a language, I think it's safe to say that it, and later some of the other really early languages, greatly advanced computer science during its infancy. We have a lot to thank Backus for.
  • Farewell John (Score:5, Insightful)

    by LizardKing (5245) on Tuesday March 20, 2007 @05:30AM (#18411833)
           PROGRAM FAREWELL_JOHN
           IMPLICIT NONE

           PRINT *, 'Farewell John W. Backus'

           STOP
           END

    *
    * End indeed ...
    *
  • by Zapotek (1032314) <tasos.laskos@nospAm.gmail.com> on Tuesday March 20, 2007 @05:43AM (#18411863) Homepage
    ...you insensitive clods!!

    Show some respect instead of making lame FORTRAN jokes...
  • by vivaoporto (1064484) on Tuesday March 20, 2007 @05:48AM (#18411881)
    With both the lack of interest and the distortion of its original goal, Computer Science as we know it may be dying with the elders. Computer Science originally had nothing to do with computers (as in the personal computer) per se, but with the science of computation, optimal algorithms for pure math problems, etc. Actually, it was nothing but a branch of math. The way computer science is treated nowadays, with disdain, with lack of interest, and with people thinking of it as a tool to put another "screw-tightener" professional on the market, we may soon run out of real breakthroughs like the ones those geniuses created to pave the yellow brick road we travel today.
  • by jandersen (462034) on Tuesday March 20, 2007 @06:08AM (#18411963)
    ...Backus's apology for creating FORTRAN...

    (yes, yes, I know, he's not apologising in the usual sense; this is a play on words, or a pun, as it is also known)

    Still, FORTRAN was and still is one of the great programming languages. There are many languages that offer better features and are much more suitable for general usage, but there's a huge number of programs written in FORTRAN, and many in science still prefer it to C/C++; FORTRAN is very well suited for numerical calculations, which is, after all, what it was made for.
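
    A minimal Fortran 90 sketch of the sort of numerical code it was made for (the program name and data are invented for illustration):

           PROGRAM DOTDEMO
           IMPLICIT NONE
           REAL :: A(3), B(3)
           ! Whole-array assignment and numeric intrinsics keep it terse
           A = (/ 1.0, 2.0, 3.0 /)
           B = (/ 4.0, 5.0, 6.0 /)
           PRINT *, 'A . B =', DOT_PRODUCT(A, B)
           END PROGRAM DOTDEMO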

  • by sosume (680416) on Tuesday March 20, 2007 @06:17AM (#18411997) Journal
    ..to drive a statistical website that was written in the '60s

    Talk about anachronisms ...
  • by Mr_Tulip (639140) on Tuesday March 20, 2007 @06:26AM (#18412033) Homepage
    Old programmers never die, they just GOSUB with no RETURN
  • by Anonymous Coward on Tuesday March 20, 2007 @06:34AM (#18412067)
    I don't mean to be insensitive, but a lot of people die every day. Why should I stop my daily routine for just one?
  • by Wizard052 (1003511) on Tuesday March 20, 2007 @06:41AM (#18412095)
    What's so wrong with FORTRAN? From the sound of things, it's like the guy committed a crime or something... if it was so 'destructive' or whatever, then how come it got so popular? Or did it? Why did so many choose to use it?
    And for that matter, what IS 'constructive'? Maybe C++? And whatever that is, wasn't it influenced in any way by FORTRAN?

    Just evolution, people... the TV scorning the radio as backward!?
  • If Mr. Backus hadn't developed FORTRAN, would we be as advanced scientifically as we are now?
  • by h2g2bob (948006) on Tuesday March 20, 2007 @07:06AM (#18412199) Homepage
    I do; in fact, my main project is not only in FORTRAN but in standards-compliant fixed-form FORTRAN 77, huzzah!

    Compared to more modern languages - by which I mean C - it's bad. There are plenty of things that drive me nuts: the need to define things a million times, the lack of any sane way to group variables.

    But compared to what was around when it was made, it was a leap forward (assembly, anyone?).

    Also, let's not forget what it was made for... yes, that's right: punched cards! It has a maximum line width because of this (even when it's no longer on punched cards). This is, I think, one of the main reasons FORTRAN encourages you to write code as one big dense block (the lack of spaces, the inline looping of variables); there's a fixed-form sketch of the column rules at the end of this comment.

    FORTRAN still sees good use in physics labs, partly because there's a lot of physics-specific code written for it, and partly because everybody's already used to it. And it has been updated (F95) to include all the modern features you could want.

    Still, you'd need to be mad to use it. Which is why I do.
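
    A rough fixed-form sketch of those punched-card rules (columns counted from the fragment's left edge: a 'C' in column 1 marks a comment, statement labels sit in columns 1-5, column 6 is for continuation marks, and statements occupy columns 7-72; anything past 72 was ignored). The program name is invented for the example:

           C     FIXED-FORM FORTRAN 77: 'C' AT THE LEFT EDGE IS A COMMENT
                 PROGRAM COLDEMO
                 INTEGER I
                 DO 10 I = 1, 3
                    PRINT *, 'LINE', I
              10 CONTINUE
                 END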
  • by tinkertim (918832) * on Tuesday March 20, 2007 @07:17AM (#18412241) Homepage

    With both the lack of interest and the distortion of its original goal, Computer Science as we know it may be dying with the elders. Computer Science originally had nothing to do with computers (as in the personal computer) per se, but with the science of computation, optimal algorithms for pure math problems, etc. Actually, it was nothing but a branch of math. The way computer science is treated nowadays, with disdain, with lack of interest, and with people thinking of it as a tool to put another "screw-tightener" professional on the market, we may soon run out of real breakthroughs like the ones those geniuses created to pave the yellow brick road we travel today.


    We're also out of good original movie plots, song lyrics and lots of other stuff too. Has absolutely nothing to do with TFA or your comment, but I figured I'd mention it.

    Give things a little more time and widen your sample before deciding the doomsday of stagnant science is upon us. Developers will always develop what people demand, and right now people are demanding Web 2.0 social-networking sites and other things that more 'serious' users would deem trivial and wasteful. By that token, all development that goes into these trivial things could also be considered trivial.

    I do agree that we'll hit a lull, and I'm also inclined to feel that what yields no productive, lasting result is relatively useless (games, mindless surfing, etc.).

    About 15 years ago, the whole world started to open up to everyone in it. We [humans] are a small world network [wikipedia.org] (as far as the definition goes): consider each person a node, and consider the need for them to begin trusting each other for that network to be efficient and productive.

    The fact that this trust is forming through the very technology-wasting we both bitch about is nothing less than amazing. We will get out of that 'lull' sooner or later :) In effect, while it's a bit maddening, you're watching a network self-improve simply because it must. Nothing trivial about that.

    Relax a little :)
  • Re:rest in peace (Score:5, Insightful)

    by justthinkit (954982) <floyd@just-think-it.com> on Tuesday March 20, 2007 @07:19AM (#18412251) Homepage Journal
    Mod parent up one more; he deserves a +5. As an engineering student in the late '70s and '80s, Fortran was all I knew or cared to know. My one Comp Sci course was beginning Fortran programming -- the whole thing is probably learnable in a few hours today. My final-year thesis was a 6000-line Fortran simulation used to determine the feasibility of building a "Two Stage Spouted Bed Coal Pyrolysis Plant" in China (it was).

    95 percent of the people who programmed in the early years would never have done it without Fortran.

    It is easy to criticize something invented half a century ago, as many other posts have done. Personally, I miss being able to use Fortran (or a procedural BASIC) to solve today's problems -- we've given ourselves over to the machine's favorite language (C) while we pat ourselves on the back for how smart we are now (as we create write-only code).

    I wish this [cminusminus.org] had become more popular. There's still time.
  • by DollyTheSheep (576243) on Tuesday March 20, 2007 @07:39AM (#18412313)

    First there was machine language: you hand-coded all the little ones and zeros to get your machine code. Then came assembler, a great time-saver with its mnemonics, registers and loops.

    The next step was a real higher-level language: FORTRAN. It's estimated that this meant a 10:1 time-saving ratio for programmers over assembler. That rate of improvement was never reached again; all other improvements in programming are only incremental by comparison.
  • by solevita (967690) on Tuesday March 20, 2007 @07:52AM (#18412381)

    I don't mean to be insensitive, but a lot of people die every day. Why should I stop my daily routine for just one?
    Exactly. Mocking FORTRAN is a mark of respect. We're much more insensitive to all those people who die and don't get mentioned, especially when it's the result of something easily fixed, like providing a supply of clean drinking water. A fair few people died of preventable causes whilst I was typing out this post; think about that, you insensitive clod.
  • Re:rest in peace (Score:5, Insightful)

    by Wormholio (729552) on Tuesday March 20, 2007 @08:04AM (#18412433)
    I too still teach my students (in physics and astronomy) to use Fortran, for many of the reasons listed above. While it may also be useful for them to go on to learn other languages, their primary focus is on the physics problems they need to solve and the numerical algorithms needed to help them do that. Fortran makes it easy for them to get started and then focus on the calculations, not on grammar or philosophy.

    Fortran has been criticized because you can write "spaghetti code" or other crap, while other languages supposedly protect you from the mistakes you can make in Fortran. But you can write crappy code in any language (including "spaghetti classes"). I teach my students to write with good style. They know their code has to be clearly understandable not just to the machine but also to someone else who is familiar with the goal of the code but not the details. Trying to enforce good style through grammar is misguided at best, just as it is in writing in general. Developing good style is a personal, ongoing process for writing anything, including good code.
  • by Anonymous Coward on Tuesday March 20, 2007 @08:05AM (#18412437)
    Just because we're not openly weeping into our morning coffee does not mean we do not respect the talent and achievements of the man. I hate this attitude that death has to be seen as a sad or sorrowful time. If anything it should be a time to remember the person and their achievements; exactly what is happening here. The only time grief is called for is if you personally knew John Backus and will miss his company. Anything else is likely false grief, generated by some weird psychological conditioning that modern society has pushed on us that tells us we must grieve for people we do not know (See also: Princess Diana)
  • Re:Wow. (Score:2, Insightful)

    by lbmouse (473316) on Tuesday March 20, 2007 @08:17AM (#18412517) Homepage
    "Psh, he developed FORTRAN. I'm surprised he even lived to 82 without being killed by a rabid programmer. ;)"

    He lived to 82; I doubt there are any modern-day potato-ass programmers who could catch him even in his golden years. We should feel fortunate for his contributions and hope to hell we live that good a life, that long. Now, where did I leave my Cheetos?
  • Re:rest in peace (Score:3, Insightful)

    by Threni (635302) on Tuesday March 20, 2007 @08:43AM (#18412715)
    > You mean there's life after forms, databases, and web 2.0? ;-)

    I'm sure there are millions of HTML hairdressers out there who don't know the first thing about real programming.
  • by dk.r*nger (460754) on Tuesday March 20, 2007 @08:46AM (#18412735)

    With both the lack of interest and the distortion of its original goal, Computer Science as we know it may be dying with the elders. Computer Science originally had nothing to do with computers (as in the personal computer) per se, but with the science of computation, optimal algorithms for pure math problems, etc. Actually, it was nothing but a branch of math. The way computer science is treated nowadays, with disdain, with lack of interest, and with people thinking of it as a tool to put another "screw-tightener" professional on the market, we may soon run out of real breakthroughs like the ones those geniuses created to pave the yellow brick road we travel today.


    Naa. I'm sure that back when cars became widely available, some of the "elders" complained that now everybody is a "driver" who doesn't even know the intricate details of internal combustion.

    Computer science is alive and well; there are merely two things happening that distort the view:
    - "Original" CS has been rolled back into math. You don't do computation-heavy math without computers anymore, so why keep CS as a separate field? Heavy computation is also interesting in lots of other fields, especially medicine and biology.
    - "New" CS is a trade: programmers, developers, project managers, etc.

    There are plenty of novel ideas and innovation out there. Look at Sun's SPARC T1 and IBM's Cell (hardware-wise), and things like virtualization, of both machines (Xen, VMware) and programs (Java, .NET). Web services, the semantic web? Search engines? New language features, like LINQ?

    But if you believe that C++ was the height of evolution, well, then, yes, CS is dead.
  • by Anonymous Brave Guy (457657) on Tuesday March 20, 2007 @09:55AM (#18413477)

    You don't think that the number of people here making and/or understanding the jokes about FORTRAN says more about the significance of Backus's contributions than any fawning obituary column ever could? Contrary to another poster's comment, I think most death really is sad, but since I didn't know Mr Backus personally, I prefer to reflect on what he contributed to society as a whole instead of displaying false grief.

  • by be-fan (61476) on Tuesday March 20, 2007 @11:54AM (#18415573)
    The improvement from FORTRAN to a modern Lisp (circa 1985) is not incremental at all. Relative to assembly, FORTRAN abstracts registers into variables, and branches into loops and functions. Additionally, it automates static storage layout. Relative to FORTRAN, Lisp additionally abstracts memory into objects, machine arithmetic into actual arithmetic*, and simple functions into higher-order, polymorphic functions**. Moreover, it automates dynamic storage management, and abstracts large-scale code patterns with macros. Add all those together, and the delta between Lisp and FORTRAN is easily as large as the delta between FORTRAN and assembly.

    *) I.e., the integer type in FORTRAN or C isn't an integer in the mathematical sense, but an element of a finite ring. Addition of integers isn't real addition but modulo addition; division over integers isn't real division, it's truncation.

    **) In FORTRAN, functions are primitives, while in Lisp functions are first-class values. Moreover, until Fortran 90, functions could not be recursive.
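
    A small Fortran 90 sketch of those two footnotes (program and function names invented; HUGE is the intrinsic that reports the largest representable integer, and the RECURSIVE keyword, new in Fortran 90, must be spelled out explicitly):

           PROGRAM INTDEMO
           IMPLICIT NONE
           ! Machine integers are finite, unlike mathematical integers
           PRINT *, 'LARGEST INTEGER:', HUGE(0)
           ! Integer division truncates rather than dividing exactly
           PRINT *, '7/2 =', 7/2, '  (-7)/2 =', (-7)/2
           ! Recursion only became legal (and explicit) in Fortran 90
           PRINT *, '5! =', FACT(5)
           CONTAINS
              RECURSIVE FUNCTION FACT(N) RESULT(F)
                 INTEGER, INTENT(IN) :: N
                 INTEGER :: F
                 IF (N <= 1) THEN
                    F = 1
                 ELSE
                    F = N * FACT(N - 1)
                 END IF
              END FUNCTION FACT
           END PROGRAM INTDEMO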
