Fortress: The Successor to Fortran?
An anonymous reader writes "A draft specification of the Fortress language was recently released. Developed by Sun Microsystems as part of a DARPA-funded supercomputing initiative, Fortress is intended to be a successor to Fortran. Guy Steele, a co-author of Java and a member of the Fortress development team, hopes that Fortress will 'do for Fortran what Java did for C.' Steele admits that Java probably isn't the best choice for numerical computing, and that 'it's a mistake to try to make a programming language that is all things to all people... because the needs are so diverse.' Fortress has a number of interesting features, including support for Unicode characters in code, enabling code to look more like formal mathematical expressions. More information about Fortress is given in an interview with Steele, and in a talk by Steele. There's also some interesting commentary on Fortress, including some commentary by a member of the Fortress development team, in response to two stories at the programming languages weblog Lambda the Ultimate."
Better than Java? (Score:5, Funny)
Re:Better than Java? (Score:4, Interesting)
Re:Better than Java? (Score:5, Interesting)
There are many reasons why Java is an important programming language. (1) It is probably the first really mainstream, widely used general-purpose language with garbage collection. (2) It was specifically designed for safety from the start, with things like exception handling, bytecode validation and security managers. (3) It is the first mainstream language to run mostly on a VM, so you get portability not just at the source code level, but at the level of compiled programs. (4) It was designed from the start to handle multiple threads safely.
Java is certainly not the most exciting language for developers to use, but that is not its point. Java is not a language of clever tricks and obscure code, like C++ can be. However, it is a very practical language that has learned from many of the mistakes of earlier designs.
If you have been developing for decades like me, switching to Java and finding the ability to write a program, compile it and then decide where it should be deployed (and have this work almost perfectly most of the time) is pretty amazing.
Re:Better than Java? (Score:5, Interesting)
(1) You give up speed for a marginal increase in features. (1b) If speed is not a factor, languages such as OCaml have many more features suitable for high-level programming. OCaml is also slightly faster than Java in general. Thus Java is both more primitive and slower than a language that came out in a similar time frame.
(2) You give up source portability for binary portability. Almost every platform has an ANSI C compiler, yet only a handful support Java, especially if you use a recent library. There are more platforms that support OpenGL than Java3D, for example.
(3) A company controls your language. The future of Java is at the whim of a single for-profit entity. Furthermore, this entity has displayed that it wants to control the Java language and the Java platform to the greatest extent possible.
(4) It's one of the most difficult languages to interface with C, and it pushes 100% of the glue required to the native language. It is easier to interface Lisp and Haskell with C than Java to C through JNI. Given the large difference between the former pairs, and the small differences between the latter pair, this is pretty ironic.
Re:Better than Java? (Score:3, Informative)
No you don't. Recent Linpack benchmarks have shown Java can match C/C++ in terms of math performance in many benchmarks. There is no speed disadvantage for this kind of work.
(2) You give up source portability for binary portability. Almost every platform has an ANSI C compiler, yet only a handful support Java, especially if you use a recent library.
There are few platforms that now don't have Java; the statement 'only a handful' is nonsense. T
Re:Better than Java? (Score:5, Informative)
Do we really, really have to go over this well-trodden ground again?
OK, here's the short version:
If you write your Java like C, restricting the functionality you use to almost entirely low-level operations on primitives, then you'll probably get comparable performance, modulo a bit of range-checking on arrays and the like. That's not surprising, and should be true in most programming languages; any language that doesn't ultimately generate basically the same machine code to execute in this case is seriously deficient in performance terms.
Now try writing anything beyond a relatively contrived and self-contained benchmark in Java -- something that uses more involved data types than built-in doubles and arrays where bounds checking can mostly be optimised away, for example -- and see how far you get.
Sorry, JIT really helps and modern Java implementations do have some pretty good optimisations, but the design of Java fundamentally means that it will only ever approach the performance of elementary C or C++ as a limit, and there will always be a certain amount of overhead at some stage in the proceedings. You simply cannot avoid this, while still having the bounds checking, still missing value types, etc.
You could encounter an exceptionally fortunate set of conditions, such that Java has a chance of outperforming C or C++ code. You'd probably need code that ran often enough with similar enough data for dynamic optimisation by the VM to make up for the overhead of monitoring run-time performance in the first place, and then to generate better performing code on that sort of data than C or C++ code run through a good optimiser/profiler combination to produce generic output. Yes, it's theoretically possible. No, I've never, ever seen it.
Re:Better than Java? (Score:3, Insightful)
We obviously do, as the landscape has changed since last you walked it:
Sorry, JIT really helps and modern Java implementations do have some pretty good optimisations, but the design of Java fundamentally means that it will only ever approach the performance of elementary C or C++ as a limit, and there will always be a certain amount of overhead at some stage in the proceedings. You simply cannot avoid this, while still having the bounds
Re:Better than Java? (Score:3, Interesting)
No, there is no speed disadvantage, there is, however, a programming disadvantage: in order to get speed in Java, you can't use any abstractions. Java is a less convenient language for numerical programming than C or Fortran 66.
And (surprise) that's why Sun has been working on Fortress.
On the contrary, almost everything about the
Re:Better than Java? (Score:4, Informative)
Actually Basic was the first. It used garbage collection for strings.
"(3) It is the first mainstream language to run mostly on a VM, so you get portability not just at the source code level, but at the level of compiled programs."
Nope, that would have been Pascal. At least it was the first I had heard of. UCSD Pascal was very popular back in the late '70s and early '80s. Its bytecode was called p-code. I think there were even some chips that ran p-code directly. Another early virtual machine was used by Infocom to run their text adventure games. If you go by number of users, it was extremely popular.
Re:Better than Java? (Score:3, Insightful)
It is my impression that LISP was fairly mainstream and widely used back in the day. Admittedly, that was probably before programming itself was mainstream.
Strange. I was certain that LISP has exception handling too. As for bytecode validation, that seriou
Re:Better than Java? (Score:4, Informative)
That's absurd. There are dozens of languages that do that running on i386. BASIC, Ada, Java, Pascal, Fortran in some cases: they all check the bounds of arrays and prevent the use of arbitrary addresses in the general case. You don't check the bounds of arrays by some hardware magic; you check them by storing the array bounds with the array. C can't do that, since it's legal to pass a pointer to the middle of an array as if it were a pointer to an array.
In fact, Java is the least portable language I have ever seen. It only runs on one single architecture: The JVM.
Really. I guess gcj, the GNU Java frontend to GCC, doesn't exist then. On the flip side, you haven't been around computers much if you haven't seen languages that run on only one architecture; QBasic comes to mind.
Re:Better than Java? (Score:4, Informative)
My ass, try ADA for that one
If Java was truly "designed for safety at the start", you just wouldn't need that kind of project
Sorry, that is false. That is a bug-finding tool, not a safety checker. Similar bug-finders are required for Ada, although they work with source code.
Java includes major safety features (like the SecurityManager) that Ada doesn't. Ada is relatively safe because it is a Pascal-like language with features, such as bounds checking and very strong type safety, that C etc. lack. Java has such bounds-checking features, but goes further than Ada, as it does not require manual memory management.
Re:Better than Java? (Score:2)
Which is clearly tosh. The Java,
This year I rewrote all the old Pascal programs that we use for them in VB
For the love of god man, why? Java may not be perfect, but there are worse things. VB is one of them. Stop before it damage
AC Flaimbait? Well why not.... (Score:2)
Simply put, many people's violent reactions are produced by their own insecurities. While I don't pretend to fully understand much of the world around me, I would say it is fairly certain that through time I have learned a humble amount, and that as time progresses and I experience more I'll become more aware. Your attack on my sexual activity and merits is simply a product of your own fear. I truly hope that some day you will be comfortable enough with yourself to be able to enter a discussion with p
Whitespace (Score:3, Funny)
Also at least one whitespace rule that will make Python's syntax look uncontroversial. ;-)
<Obligatory> Don't you just hate getting a story rejected and then seeing it posted from an AC several days later? :-( </Obligatory>
Re:Whitespace (Score:2)
But beware: Good times, good times.
The dubious whitespace rule is... (Score:3, Informative)
No, I'm afraid it's worse than that.
They use a series of identifiers separated by spaces to represent either function calls or multiplication, depending on context. The dependence is relatively subtle, too: uses of () for function calls seem to depend on how many parameters the function takes, for example...
Unlike the April Fools proposal for C++, this is actually serious, BTW.
Here's what's cool (Score:3, Interesting)
Anyhow, here are a few of the cool things. First, all variable transactions will be atomic, so you can write parallel code with no locking or synchronization.
Parallelism is in it from the start. The assumption will be thousands of threads running on multiple processors. All loops will be done in parallel--you actually will have to request serial loop execution if you want it.
data types can be contain di
Re:Here's what's cool (Score:3, Interesting)
Thanks for the knee jerk, but if you'd read a little further into the discussion, you'd have found that I not only read the material but also submitted an article on this subject several days ago.
In any case, using whitespace for multiplication is all very nice, until you start using it for function calling/composition as well. Then, suddenly, you see f g h 2, and you have no idea what this represents until
'do for Fortran what Java did for C.' (Score:5, Funny)
Re:'do for Fortran what Java did for C.' (Score:3, Informative)
Given the phenomenal number of weird bugs that both C and Fortran developers produce because of the nature of those languages and their lack of memory management and safety, I'm sure a large number of Fortran developers would be very interested, even if you aren't.
Re:'do for Fortran what Java did for C.' (Score:3, Interesting)
People don't pay me because I do something easy and safe. They pay me because I do something that most people cannot do, no matter how much time they may dedicate to it.
Progressively easier and safer languages will bring "toy" coding closer to the mainstream (though never quite to the mainstream, since coding takes some real thought, and people dislike having to think). But forcing people who can actually code to use castrated languages just wastes time a
Re:'do for Fortran what Java did for C.' (Score:3, Interesting)
Don't blame the hammer when the homeowner insists you finish the job in half the time and under budget, then the house falls down a few years later.
And if you believe the same (via different mechanisms) won't happen under Java, or Fortress, or any supposedly "safe" language - I have a bridge to sell you. Buffer overruns occur because of sloppy coding - In 90% of cases I've personally had to deal with,
Re:'do for Fortran what Java did for C.' (Score:3, Funny)
Re:'do for Fortran what Java did for C.' (Score:3, Insightful)
Math++ (Score:5, Insightful)
Re:Math++ (Score:2)
Re:Math++ (Score:4, Informative)
Mathematica is (hopefully) mostly used for symbolic computations. In numeric computing, MATLAB and its extensions are quite popular (maybe even GNU Octave for those who rightly fear that proprietary software undermines freedom of research). I have no idea why the folks at Sun think that Fortran is their competitor. Maybe MATLAB suffers from a stigma similar to Visual Basic. Certainly someone inside Sun knows that their HPC customers frequently run MATLAB programs on Sun hardware.
Why don't they just improve the Mathematica calc engine for parallel/distributed supercomputing?
Why would they want to improve the product of a competitor on a government grant? Sounds like a stupid plan to me from a business perspective.
Anyway, language design suitable for numeric computing is not Sun's strength [berkeley.edu].
Re:Math++ (Score:2)
Re:Math++ (Score:3, Interesting)
You'd think Sun would have a good JVM on their own hardware, but the reality is that it sucks.
My research group is switching to Linux because 1) the software runs faster and 2) PCs
Re:Math++ (Score:3, Informative)
There is also Scilab [inria.fr], from the French INRIA (*) (let's say it's sort of Caltech, MIT, but French - highly competent, they are - they make great croissant and cheese) and ENPC. It's Libre software (FAIF).
I've seen people use it for real research, and they thought it to be excellent. There aren't as many packages as Matlab, apparently (but this is something that depends on the number of power users, so
Re:Math++ (Score:2)
Re:Math++ (Score:3, Funny)
Re:Math++ (Score:3, Insightful)
*However*, there are ways to drastically speed up one's code in Mathematica (sometimes compiling functions or modules, using functional methods rather than procedural methods, using Range[] arithmetic, etc.).
Re:Math++ (Score:2)
To the Battlements! (Score:3, Funny)
Please Explain (Score:2)
Optimisation and community, perhaps? (Score:5, Insightful)
I suspect it's mostly because FORTRAN has a lot of things built right into the language, rather than added in libraries and such. That means code can be reasonably tidy, but still leave a lot of scope for optimisation. This is particularly true when compared with the state of the art in optimising for difficult languages like C, where even today relatively simple optimisations can be difficult because of aliasing issues and the like.
It's also worth noting that when most of a serious community use the same tool(s), a lot of new work will be done using those tools simply because of familiarity, community and support issues.
Re:Optimisation and community, perhaps? (Score:2, Insightful)
Re:Please Explain (Score:2)
(that's at least one reason).
Re:Please Explain (Score:3, Insightful)
Re:Please Explain (Score:5, Informative)
FORTRAN and C have different semantics. A FORTRAN optimizer knows more about aliasing, function interactions, and I/O. A C optimizer has to infer or compute such information. C bigots typically have neither written such optimizers nor worked with folks who do it for a living, and are prone to dismiss such arguments as being petty and neolithic. FORTRAN programmers are often a bit more in touch with high performance computing, and are unwilling to bet that heavily on compiler wizardry.
There is a vast body of existing FORTRAN code (much of which is publicly available and of high quality). Numerical codes are particularly difficult to "vet", scientific establishments usually do not have large otherwise idle programming staffs, etc., so massive recoding into any new language is typically resisted quite strongly.
Fortran tends to meet some of the needs of scientists better. Most notably, it has built-in support for:
- variable-dimension array arguments in subroutines
- a compiler-supported infix exponentiation operator which is generic with respect to both precision and type, *and* which is generally handled very efficiently for the commonly occurring special case floating-point**small-integer
- complex arithmetic
- generic-precision intrinsic functions
Re:Please Explain (Score:5, Interesting)
With C etc. you cannot know at compile time how much space the data referred to by a pointer will consume, or what it will be. This makes optimising certain routines w/regard to data alignment and packing difficult or impossible compared to FORTRAN.
Various mathematical routines run a hell of a lot faster under FORTRAN than they do under C because the FORTRAN compiler knows ahead of time exactly 'what it is getting', and can thus decide how to feed that data to the CPU to take advantage of its register, cache and instruction scheduling characteristics--but it sacrifices the flexibility of the 'data structure languages' like C.
Implementing complex, dynamic structures of arbitrary 'objects' is child's play with C but something that would drive you batsh*t crazy using FORTRAN.
Re:Please Explain (Score:5, Interesting)
Blitz++ [oonumerics.org] performs very close to or better than Fortran on many numerical calculations.
Re:Please Explain (Score:4, Insightful)
The phrase "Pyrrhic victory" comes to mind.
Re:Please Explain (Score:2)
Yes, but the main problem is pointer aliasing. The restrict keyword in C99 helps with that, but compilers still need to make full use of it (and programmers must actually provide these optimization hints). The array layout issues are less of a problem and can be worked around. Of course, you must not declare a 5x5 matrix as double matrix[5][5] (double matrix[25] and manual indexing
Re:Please Explain (Score:5, Interesting)
Having only this information, the compiler has no way of knowing that 'a' and 'b' do or do not point to the same piece of memory, and thus it cannot optimize this loop (as b might point to a-1 for instance). In Fortran the compiler does have this information and can optimize accordingly. Note that this is only a problem with C, not with Pascal. Pascal can in principle run as fast as Fortran, but is probably even more annoying.
Interestingly enough, C++ should be able to reach Fortran speeds once the C++ compiler writers finally use the leeway they've been given to optimize the hell out of 'valarray'. This class doesn't have aliasing problems and can be used in the same way as Fortran arrays.
For the rest, the freaks and weenies have simply been brought up with Fortran and therefore prefer it.
That's the way they've always done it. (Score:2)
* Especially early on, Fortran's non-stack-based structure gave it a lot of opportunities for optimization. When you don't have recursion you can do a lot of "lifting" of subroutines and loop unrolling, which makes function calls really fast. You can reduce the overhead to literally nothing. That's harder in a stack-based environment, especially when there's the possibility of recursion.
* Some variants of Fortran (including Fortress) have matrices
Re:Please Explain (Score:3, Insightful)
Application vs tool (Score:3, Insightful)
One simple answer is We're not writing a F'ing application!!
A great deal of the scientific programming done by scientists and engineers is NOT to develop an application with a nice GUI and users manual - it is to solve a complex problem ONCE, for him/herself ONLY, and get data/results that can be processed by other standard applications. Rude, crude, and vulgar - that is just fine! I write BASIC, Fortran (for 30 years) and C#, assembler, (all of th
In my opinion (Score:5, Insightful)
Re:In my opinion (Score:4, Informative)
He said: multiple return values. You said: have you seen the tuples library? Two distinct and very different things. Multiple return values means that a function can return several distinct first-class values that have nothing to do with each other. Otherwise, the function could just return a list. Hey, look, multiple return values!
The distinction helps you when you want to return distinct unrelated objects that don't really belong in the same data structure, such as an error code and a value, for example. There are a million other ways in which you can do that, but this is one of the more elegant approaches.
Multiple return values are especially useful in compiler intermediate representations, because they allow you to model certain control constructs (exceptions and continuations) explicitly. That turns out to be surprisingly useful to compiler writers.
Fortran's Longevity (Score:5, Insightful)
I think it will be hard for a single company to generate a successor & sincerely hope Sun will realize that for languages with no VM, early success will depend on openness. I also think a lot of what people want to do is already being done with python + modules compiled from C or Fortran.
Fortress's big new deal: parallel-by-default loops (Score:4, Informative)
Having loops be parallel by default, on the other hand, is going to explode a lot of heads. Can you imagine what it will be like to write a loop which doesn't depend on the effects of any previous iteration?
Granted, this has existed in various supercomputing languages and VLIW/vector processing assembly since the 80s, but trying to push this out to the masses is pretty revolutionary. A lot of people are going to see it as a serious drawback, and either shy away from Fortress or ask for sequential loops explicitly everywhere, unless they can be taught how to parallelize, which is often a very difficult task for all but the simplest loop side-effects. Sometimes it's just hard, and sometimes it's NP-hard, depending on the details of the algorithm considered for parallelization.
Frankly, given that functional style is much less of a stretch from ordinary procedural programming, and given how slowly ML variants and things like Haskell have been catching on, I guess Fortress is destined for permanent niche status, and not even math typography will save it from consignment to the high-priest class of supercomputer programmers. In fact, that may be a disadvantage, because the scientists writing the formulae aren't the engineers translating those formulae for parallel processing, in most cases.
Re:Fortran's Longevity (Score:3, Insightful)
As a scientific coder, I would estimate that 80-90% of the real reason that no one wants to update large FORTRAN libraries, is that the legacy code is
Re:Fortran's Longevity (Score:3, Informative)
On the contrary, they thought Java was so important they have battled to retain control of the language to prevent forking of the specification.
One of the major problems with Fortran is the large range of dialects, with many incompatibilities.
Matlab (Score:3, Insightful)
Well, it seems to me that 90% of scientific computation today is done with Matlab and similar languages/environments (well, mostly Matlab).
Based upon my experiences, within universities it is ONLY in CS departments that Matlab is NOT (yet?) the de facto standard (but it is still taught and used anyway, along with Java and some C++).
Re:Matlab (Score:4, Informative)
I've worked on several mathematical and scientific projects, from high performance libraries to manipulate geometry (applications to CAD/CAM/graphics/etc.) to analysing results from metrology hardware. To date, I've never seen a serious project where the programmers use Matlab; writing custom code in any number of serious programming languages is a better option (mostly because there is more to almost any program than maths, even a mathematical one).
Tools like Matlab are great for working scientists who need to get the job done but don't have access to real programming. For those who do, the latter appears to be a much more popular choice.
Time. (Score:2)
Re:Matlab (Score:5, Insightful)
Think of it like this:
C lets you do X, very fast.
Matlab lets you do X very fast, and Y fairly slowly.
Inclusion of any elements of Y into a predominantly X piece of code will bring the whole lot down to Y speed.
There's no inherent advantage of C (as far as I can tell and I've got years of experience in both), it's just that people tend to roll X and Y together in a Matlab program.
Admittedly, this is the fault of Matlab, in letting you do this without warnings, but there is documentation provided that tells you what the set of Y is and gives some hints for recoding bits of Y as X.
So basically, people who program in C are denying themselves the joys of MATLAB's high-level functionality, and in return are still having to code everything in terms of X.
I have given up on C completely, if I can't write it in MATLAB and have it run fast, it means I don't understand the algorithm well enough and should get back to the drawing board.
Btw, your attempt to divide programming into categories of "serious" and not is laughable.
Re:Matlab (Score:3, Insightful)
The cost? 6k dollars seems like a significant disadvantage to me.
Re:Matlab (Score:2)
Re:Matlab (Score:3, Informative)
Matlab Syntactic Salt and Performance Sludge (Score:2, Interesting)
I myself have switched over to using R [r-project.org] for statistical c
Re:Matlab (Score:2)
If you mean programs developed, or man-hours spent coding, you might be right.
But no one with his brain intact would run REAL scientific computing (like all that stuff burning away TFlop-years on the big clusters) on Matlab or "similar languages/environments". As soon as CPU hours start to cost money, it's worth porting the solutions to something a bit more suited to the task.
So they finally admit Java was broken? (Score:4, Informative)
So they finally admit that what Java did was break the IEEE floating-point specification that C got right, as Professor William Kahan of Berkeley (see How JAVA's Floating-Point Hurts Everyone Everywhere [berkeley.edu]) had been shouting to deaf ears all this time?
Re:So they finally admit Java was broken? (Score:4, Informative)
A much shorter version of the paper is here [berkeley.edu], and a good Java floating-point paper is also over here [ibm.com]
oh
Re:So they finally admit Java was broken? (Score:3, Informative)
Oh, as if. There are lots of compilers and platforms that don't have correct IEEE floating-point at all.
E.g. Try running the following on GCC on linux/x86:
illogical name (Score:5, Funny)
Re:illogical name (Score:2)
Re:illogical name (Score:3, Insightful)
FORTRESS AGAINST REALITY (Score:2, Funny)
Why not APL++? (Score:3, Interesting)
I'm not saying that APL does not have its faults (the original version was weak on control structures and data structures other than arrays), but its core syntax and native handling of multi-dimensional arrays make it ideal for scientific computing.
Re:Why not APL++? (Score:2)
Fortran 90 and later do just this.
On a side note, (I've mentioned in ot
Solving the wrong problem (Score:3, Interesting)
This is simply not a problem most mathematicians, physicists or chemists have. I've never said, "Damnit, why doesn't the FORTRAN code for this thing look more like mathematics!?" Neither has anyone I know.
The best high-level mathematical language in the world--Mathematica--has input that looks very little like mathematics. Integral[Exp[x],{x,0,1}] expresses the mathematics very elegantly in a pure ASCII, standard, portable, form. But it looks nothing like what you'd write on a piece of paper, if that's what "looking like mathematics" means.
Furthermore, there has been a language that looks a great deal like (parts of) mathematics: APL. No one uses it, and part of the reason is that the statements are far too compact--i.e. "mathematics like"--to be readable.
And finally, what does "mathematics" look like? Different fields use radically different notations and conventions. This is particularly true when you start looking across math, engineering and physics. Even different branches of physics are apt to use different notations for the same thing, and worse yet the notation changes over time--go look at any pre-war book on quantum mechanics and you'll see all these "Sp" things where today you'll see "Tr". And things like vectors are typically typeset in bold, but have over-scored arrows when you write them by hand. Which of these "looks like mathematics"?
Locking any of this down in a programming language is just not useful.
--Tom
Do they understand why Fortran is liked ? (Score:5, Insightful)
Indeed a search of the spec says "no attempt at backward compatibility/this is a new language with little relation to fortran".
Nothing to see here. Somebody with new ideas he thinks are nifty, who has lost sight of why Fortran is still used now.
Re:Do they understand why Fortran is liked ? (Score:2)
There are two issues:
(1) can Fortress link to Fortran libraries---and today this means Fortran 95 and Fortran 2000?
One obvious issue is using natural Fortran memory layout for (single processor non-distributed) arrays.
(2) is the Fortress programming language *source code* compatible with Fortran?
They are saying the answer to (2) is "no". The answer to (1) ought to be "yes".
I think that just getting Fortran 95 up to wide prevalance, and people realizing it's defin
Okay, so (Score:2)
Is this language compatible with Java? Can it / is it designed to live on a Java virtual machine, or interact with Java?
What with Sun's general unhelpfulness with getting languages alternate to Java running in/on the Java runtime, I find it potentially very interesting to see them at least in some small way admitting you might need more than one programming language, especially if they eventually wind up admitting java programmers might
'do for Fortran what Java did for C.' (Score:4, Insightful)
"Fuck it up."
FORTRAN won't be replaced. (Score:2, Insightful)
Besides, as the "Real Programmers" phrase goes: "Real Programmers can program FORTRAN in any language."
(I've always liked that; I prefer FORTRAN to other high level languages. But then, I am a dinosaur....)
Couldn't help but notice (Score:5, Funny)
Copyright Infringement! (Score:4, Funny)
This programming language's name is obviously derived from my Slashdot nick. My SWAT team of highly-paid lawyers is examining a satellite photo [google.com] of Sun's corporate headquarters, planning their legal assault.
Re:Copyright Infringement! (Score:2)
Do for Fortran what Java did for C? (Score:2, Insightful)
Re:Do for Fortran what Java did for C? (Score:3, Informative)
Actually, I think we need more people to learn that programming is only a part of the job, and worrying more about doing what needs to be done and less about how l33t they are.
Personally, I think I have better things to do than chase down memory leaks. Java (for example; not my favourite language) saves me from worrying about that and allows me to concentrate on the important things.
Of course, I don't get to go wakka wakka wakka on s
Guy Steele, eh? (Score:2)
</wishing for the return of Crunchly>
Unicode Operators? (Score:3, Interesting)
The Perl 6 "spec" calls for at least one Unicode operator, as a way of wading into those waters for more general-purpose use.
Python to the rescue (Score:3, Informative)
For signal and image processing and more, in a nice Python syntax. Nice readable code, no learning curve!
From the website:
SciPy is an open source library of scientific tools for Python. SciPy supplements the popular Numeric module, gathering a variety of high level science and engineering modules together as a single package.
SciPy includes modules for graphics and plotting, optimization, integration, special functions, signal and image processing, genetic algorithms, ODE solvers, and others.
SciPy is developed concurrently on both Linux and Windows. It has also been compiled successfully on Sun and Mac, and should port to most other platforms where Python is available.
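As a flavour of what the parent is describing (a minimal sketch, assuming SciPy is installed), here is the kind of one-liner numerical work SciPy makes readable:

```python
from scipy import integrate
import numpy as np

# Numerically integrate sin(x) from 0 to pi.
# The exact answer is 2; quad also reports an error estimate.
result, abs_error = integrate.quad(np.sin, 0, np.pi)
print(result)
```

The same task in raw Fortran or C means writing (or linking) a quadrature routine yourself; here it's one library call.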
I don't think so--not with Sun involved (Score:3, Interesting)
If anybody can do in Fortran, it's Guy Steele (Score:5, Informative)
Primary author of Common Lisp the Language, the community-generated pre-spec for Common Lisp
The other half of Steele and Sussman, co-inventors of Scheme
Co-author of C: A Reference Manual (Harbison and Steele), which codified many of the techniques that made portable C code possible
As co-author of The Java Language Specification, he reportedly brokered many design compromises between Bill Joy and James Gosling
Given his track record, I wouldn't bet against him if he says he's going to create a worthy successor to Fortran.
Give it up, Java Weenies! (Score:5, Informative)
Specifically, it's F77 -> F90 -> F95 -> F2K. There have been enough attempts to replace Fortran, and the only result so far is that they've kept computer scientists entertained. All of these ideas are driven by one common thread: formally trained computer scientists can't stand Fortran 77's control structures, static memory allocation, etc., and demand that it be replaced for religious reasons. F90/F95 have already fixed those problems, but it's still called Fortran, and so it simply *MUST* be replaced.
Let's see, we had PL/I (a merger of Fortran, COBOL, and Algol), RATFOR, Ada, Matlab, C++, and the late, and rather lamented, Sather. None of them has the performance of Fortran, the ease of programming, the extensive and validated libraries, complex numbers as a fundamental data type, or the solidity of compilers.
It's the cockroach of computer languages; you can keep spraying, and it will keep sneaking out at night.
Sun has competition (Score:3, Informative)
Sun's not the only one working on a language to better support HPC (i.e. massively parallel) programming. IBM's working on a language called X10 [aurorasoft.net], and Cray is working on one called Chapel [washington.edu]. All three companies are being funded by the DARPA High Productivity Computing Systems [highproductivity.org] project.
Will any of these replace the dreaded MPI+(C/Fortran)? Only time will tell...
Re:Dear God (Score:2, Insightful)
Worst language ever?! (Score:5, Interesting)
Hardly. In fact, as I read the introductory sections of the spec, I found a lot of it was exactly the ideas I would have designed into a language myself, as someone who writes mathematical code for a living.
I took a bit of a sideswipe at the whitespace rules in a post below, but aside from those (which I think will die long before the final language is released, "natural" notation or not) a lot of the features look good. Things like first-class functions and multiple dispatch suggest much stronger handling of functions than in any mainstream language today, which is always good for a language that's going to take maths seriously. The consideration given to issues of parallel processing is also well beyond anything else in common usage at present, and that's surely one of the key directions serious programming languages will go in over the next decade as hardware becomes more and more about multi-processing rather than just Bigger And Faster(TM).
I must admit, though, that I did start to get bogged down towards the end of the section on the basics, and found it difficult to get stuck into the more advanced stuff at all, even with my CS language theory hat on.
Re:Dear God (Score:5, Interesting)
I wouldn't call Fortran the worst programming language ever; COBOL takes the cake (all of those long words for everything, geez!). Fortran is actually still used heavily in scientific computing, and even though it started out looking like the monstrosities of COBOL and BASIC (goto statements everywhere, forced indentation, verbosity, and so on), the latest standards of Fortran look decent, have a lot of the features that languages such as C have, and make it a much better language. For example, Fortran now supports dynamic memory allocation, structured control flow (such as if...else statements and loops), recursion, arrays, operator overloading, records, and more. The features of the language aren't bad.
Fortran's niche is in scientific computing and numerical computing, since not too many languages come close. It's not the best language for every application, but it works well for scientists and mathematicians.
Re:Dear God (Score:3, Insightful)
Yes, and so does C++. What are Fortran's advantages? In the days of the VAX, Fortran compilers often generated faster code than C compilers, but this advantage of Fortran for numerical computation disappeared long ago.
OTOH, C and C++ have a cleaner syntax, no need for line labels, no need to start in column 7 or end by column 72, no need to use a
Re:Dear God (Score:3, Insightful)
Use the best tool for the job?
Fortran == FORmula TRANslation, not a general-purpose bloated compiler. It's meant to convert mathematical operations to machine code as efficiently as possible. It just gives us a portable way to do more abacus/slide rule/calculator operations than assembly does.
You can put a new marketing spin on anything. The heart of the matter is that Fortran has evolved to suit coding mathematical systems, not designing applications. The mythical CS land wh
Optimisation and language syntax (Score:4, Insightful)
I can't let you get away with that. Fortran was designed with a syntax and structure that make compiler optimisation easy. 'WHERE' constructs can be parallelised, pointer aliasing is tightly controlled (unlike C/C++), etc. It does make a difference.
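For readers who haven't seen it: Fortran's WHERE performs a masked element-wise assignment with no explicit loop, which is exactly the kind of construct a compiler can vectorise or parallelise. A rough NumPy analogue (my example, not from the comment):

```python
import numpy as np

# Fortran:  WHERE (a > 0) b = SQRT(a)
# The masked assignment below is the NumPy analogue: no explicit
# loop order is specified, so the operation can be vectorised.
a = np.array([-1.0, 4.0, 9.0, -16.0])
b = np.zeros_like(a)
mask = a > 0
b[mask] = np.sqrt(a[mask])
print(b)   # [0. 2. 3. 0.]
```

In C, an equivalent loop over pointers gives the compiler far less freedom, because it must assume the arrays might alias unless told otherwise (e.g. via C99 restrict).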
Re:Dear God (Score:4, Insightful)
Re:Please no... (Score:4, Funny)
biscuits?