Why Apple Silicon Needs an Open Source Fortran Compiler (walkingrandomly.com)
"Earlier this week Apple announced their new, ARM-based 'Apple Silicon' machines to the world in a slick marketing event that had many of us reaching for our credit cards," writes Mike Croucher, technical evangelist at The Numerical Algorithms Group.
"Simultaneously, The Numerical Algorithms Group announced that they had ported their Fortran Compiler to the new platform. At the time of writing this is the only Fortran compiler publicly available for Apple Silicon although that will likely change soon as open source Fortran compilers get updated."
An anonymous Slashdot reader offers this analysis: Apple Silicon currently has no open source Fortran compiler and Apple themselves are one of the few silicon manufacturers who don't have their own Fortran compiler. You could be forgiven for thinking that this doesn't matter to most users... if it wasn't for the fact that sizeable percentages of foundational data science platforms such as R and SciPy are written in Fortran.
Croucher argues that "More modern systems, such as R, make direct use of a lot of this code because it is highly performant and, perhaps more importantly, has been battle tested in production for decades. Numerical computing is hard (even when all of your instincts suggest otherwise) and when someone demonstrably does it right, it makes good sense to reuse rather than reinvent..."
"The community needs and will demand open source (or at least free) Fortran compilers if data scientists are ever going to realise the full potential of Apple's new hardware and I have no doubt that these are on the way. Other major silicon providers (e.g. Intel, AMD, NEC and NVIDIA/PGI) have their own Fortran compiler that co-exist with the open ones. Perhaps Apple should join the club..."
"Simultaneously, The Numerical Algorithms Group announced that they had ported their Fortran Compiler to the new platform. At the time of writing this is the only Fortran compiler publicly available for Apple Silicon although that will likely change soon as open source Fortran compilers get updated."
An anonymous Slashdot reader offers this analysis: Apple Silicon currently has no open source Fortran compiler and Apple themselves are one of the few silicon manufacturers who don't have their own Fortran compiler. You could be forgiven for thinking that this doesn't matter to most users... if it wasn't for the fact that sizeable percentages of foundational data science platforms such as R and SciPy are written in Fortran.
Croucher argues that "More modern systems, such as R, make direct use of a lot of this code because it is highly performant and, perhaps more importantly, has been battle tested in production for decades. Numerical computing is hard (even when all of your instincts suggest otherwise) and when someone demonstrably does it right, it makes good sense to reuse rather than reinvent..."
"The community needs and will demand open source (or at least free) Fortran compilers if data scientists are ever going to realise the full potential of Apple's new hardware and I have no doubt that these are on the way. Other major silicon providers (e.g. Intel, AMD, NEC and NVIDIA/PGI) have their own Fortran compiler that co-exist with the open ones. Perhaps Apple should join the club..."
too new (Score:2)
Bring back ALGOL!
Re: (Score:2)
ALGOL 58 or 60?
Re: (Score:2)
68. Van Wijngaarden grammars are da bomb.
Re:too new (Score:4, Funny)
You fancy kids with your MACRO assemblers and automatic program loaders! I bet you can't even toggle in a machine code in 36-bit words anymore! *waves cane*
Re: (Score:3)
36-bit words? Hell, real programmers toggled 12-bit words into the front of their PDP-8s....
Re: (Score:2)
You could be forgiven for thinking that this doesn't matter to most users...
Is that because you'd be correct?
Re: (Score:2)
ALGOL has little use in modern times. Fortran is still very valuable. Remember, the age of something does not indicate its usefulness or quality.
Don't worry. (Score:1, Insightful)
It doesn't matter because Apple hardware isn't for science, it's for art. If you buy a mac for scientific computing you're just throwing money away. All the software will run on a cheaper platform where you can get much more performance for your money.
Re: (Score:2)
I am as quick to jump on the anti-Apple bandwagon as anyone else, but it's not always good to keep additional devices around just because they're more efficient or more powerful for a subset of one's tasks. Sometimes a jack-of-all-trades (but master-of-none) device makes the most sense.
Re: (Score:1)
A CHEAP jack-of-all-trades (but master-of-none) device ALWAYS beats an OVERPRICED vanity device aimed at non-technical people who have more money than brains.
Performance per watt (Score:4, Interesting)
If Apple Silicon's performance per watt is as good as Apple claims it is, then someone accounting for the power and cooling bills might conclude that it's cheaper than AMD and Intel in the long run.
Re:Performance per watt (Score:5, Insightful)
If Apple Silicon's performance per watt is as good as Apple claims it is, then someone accounting for the power and cooling bills might conclude that it's cheaper than AMD and Intel in the long run.
The problem with that is that the big power consumption problem isn't in workstations, and Apple has ceded the server market completely. Nothing prevents them from returning to it, except that it will be hard to take them seriously when they abandoned the people who trusted them last time.
Re: (Score:2)
That doesn't seem to impact Google or Microsoft.
Re: (Score:3)
it will be hard to take them seriously when they abandoned the people who trusted them last time.
That doesn't seem to impact Google or Microsoft.
Very different circumstances.
I'm one to seldom pass up an opportunity to give Google crap for everything in the Google Graveyard, but the closest they have to things relevant to devs and sysadmins are GCP and G Suite (or Workspace or whatever-the-hell they are calling their cloudy groupware service this week). Both of those things remain pretty steady as far as I'm aware, and I can't remember a "last time" they had a similar offering that is in the Graveyard. If you can point to it, let me know, but to my
Re: (Score:2)
Microsoft dropped "virtual server" - their initial vmware competitor, and there was a several year gap between virtual server being dropped and hyper-v becoming available. Microsoft tends to drop products which aren't as successful as they hope, and using such products is generally a monumental pain even while they are supported - for the common products people know how to work around the bugs, for the less common ones people don't.
Apple didn't so much drop their server option, as bake its features into the
Re: (Score:2)
Microsoft dropped "virtual server" - their initial vmware competitor, and there was a several year gap between virtual server being dropped and hyper-v becoming available. Microsoft tends to drop products which aren't as successful as they hope, and using such products is generally a monumental pain even while they are supported
It's interesting to note that virtually all such packages were developed someplace else. Then Microsoft gets hold of them and strangles the life out of them, like they did with e.g. WolfPack, and then they suck and people don't want them. Virtual Server was developed by Connectix, from whence they also got Virtual PC — which they used for XP Mode on Windows 7. XP Mode was garbage, several programs I tried to run on it didn't work and none of them even used D3D. I presume that Virtual Server was also g
Re: (Score:2)
In the interim I have come up with a way for Apple to make money with Apple servers. They could build their own cloud that their OS provides a simple interface to so that processing tasks could be sped up by offloading to a cluster. This will only work well for people with high quality (and speed) internet access, but I would guess that most of the Apple users are among that group anyway.
Re: (Score:3)
Nobody is going to be using the current M1 for actual scientific computing for one simple reason: All the current M1 Macs cap out at 16GB of RAM.
The "starting at" prices are all for 8GB of RAM, but for an extra $200 you can up it to 16GB. Yep, you read that right: $200 for 8GB of RAM. In 2020.
And that only gets you to 16GB.
The fact that even the "Pro" model caps out at 16GB suggests there's some underlying limitation of the M1 chip that means it can't handle large amounts of RAM.
It's hard to say how much RA
Re: (Score:2)
As the RAM is all in the SoC, I would expect the M1 *could* be outfitted with more than 16GB but it would need to be integrated into the package on manufacture. As the M1 was explicitly optimised for low power entry-level systems I would expect the M1X, M2 or whatever they call it to have 32, 64 or even more as an option...
Re:Performance per watt (Score:5, Insightful)
So the problem is that you're comparing memory requirements across platforms.
Just for one example: Android devices typically require twice as much memory as iOS devices to get comparable performance and utility. A good portion of that comes from the need to support JVM garbage collection schemes whose pool-to-pool copying mechanisms mean you NEED twice as much memory. (The from pool and the to pool.)
Further, IIRC Apple claimed to have doubled the throughput from processor to SSD storage, which means that paging isn't as much of a hit as it might have been.
I think they also mentioned that they can hand off a pointer to a graphics bitmap or texture directly to the GPU for processing/display, which means that again you don't need the same object taking up CPU space and GPU space.
I don't think 16GB is enough for me either (currently i9 with 64GB) but I'm really interested to see what the next generation of this technology is going to bring.
Re: (Score:2)
It always amazes me how spoiled we get with advances in technology. 16GB would have been a dream machine 20 years ago and lots of scientific computing was done then.
Scientific computing doesn't always require lots of memory. Maybe some things do, but I strongly doubt that all projects require lots of memory. Also, development is often done on one machine with perhaps a small subset of what needs to be done, and then the actual "real run" can be done on a larger machine (that c
Re: Performance per watt (Score:2)
Unless your electricity price is something like 100x more expensive than average, this is likely not true for typical home/office use where the CPU is idle most of the time.
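As a back-of-envelope illustration of that point (all numbers assumed, not measured):

# Hypothetical figures: a desktop averaging 120 W vs. a laptop averaging 20 W,
# both used 8 hours a day, at a typical $0.13/kWh.
HOURS_PER_YEAR = 8 * 365
RATE = 0.13  # $/kWh, assumption

def annual_cost(watts):
    return watts / 1000 * HOURS_PER_YEAR * RATE

desktop, laptop = annual_cost(120), annual_cost(20)
print(f"desktop ${desktop:.0f}/yr, laptop ${laptop:.0f}/yr, saving ${desktop - laptop:.0f}/yr")
# Roughly $46 vs. $8 a year: tens of dollars, which won't close a four-figure
# hardware price gap unless electricity rates are dramatically higher.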
Re: (Score:2)
I'm sure the difference in cost between a MacBook and a box under my desk to crunch numbers would buy a lot of electricity.
Re: Don't worry. (Score:2, Informative)
I hate Apple and don't own one. However, my niece has an 11-year-old Mac which still works. It runs R, Python, Excel, and ported Unix research tools developed long ago, originally for SunOS, for her neurology degree.
Windows at the time didn't have R, and Python for Windows was more of a special port with the real stuff on Unix back in 2010. No, she doesn't have time to learn Linux or deal with all the upgrade problems due to the lack of a driver ABI and Xorg being a piece of shit for ea
Re: (Score:2)
A CHEAP jack-of-all-trades (but master-of-none) device ALWAYS beats an OVERPRICED vanity device aimed at non-technical people who have more money than brains.
Right, hater.
That's why Apple's examples in their Apple Silicon Macs Event were almost all showing Software Development and high-end Content Creation, like 8k video in DaVinci Resolve.
In fact, the entire presentation was fairly focused on technical details and capabilities (for something intended for general audiences).
https://www.apple.com/apple-ev... [apple.com]
Re: (Score:2)
I am as quick to jump on the anti-Apple bandwagon as anyone else,
Doesn't look like it.
but it's not always good to keep additional devices around just because they're more efficient or more powerful for a subset of one's tasks.
What does that have to do with anything?
Sometimes a jack-of-all-trades (but master-of-none) device makes the most sense.
I still don't see what you think you're saying here. Macs, Windows PCs, and Linux machines all are "a jack-of-all-trades (but master-of-none) device".
Re: (Score:2)
I invite you to look at my recent comments, particularly in the context of the M1 benchmark results, and decide whether I'm pro- or anti-Apple.
You noted that Apple competitors provide better performance-per-dollar than Apple does. That is certainly true, but it is not the only relevant metric. For example, I have a desktop running Linux with Windows in a VM. I don't keep another desktop for playing games, even though that would give much better
Re: (Score:1)
So you claim Apple computers are faster per $ than generic AMD or Intel computers?
Facts to back that up or you're lying.
Re: (Score:2)
Believers don't deal in facts.
Like Dubbya, he "knows in his heart". You know... where the brain is. --- ;)
Re:Don't worry. (Score:5, Informative)
Actual scientist here
Where? All I see is an actual coward.
Taxpayer money paid for all my Macs, so why would I give a flying fuck about how much they cost? I'm not paying for it.
No, everyone else is. But I guess you don't care about anyone else?
Besides, my Macs are faster.
No, they aren't.
I have a TensorFlow-based adaptive training system that runs in 10 minutes on my Mac vs. over an hour on the Linux-based Haswell nodes at NERSC.
Haswell is a has-been, and is irrelevant to this conversation. But there are no currently or formerly shipping macs which have the fastest possible processors or GPUs, it has literally always been possible to buy a machine with a faster processor for less from someone else.
And that doesn't even include how long I have to wait for my job to sit in the queue before it even starts.
Also irrelevant to your claims.
With the new arm macs the Mac will be even faster with the neural engine.
We'll see. Since Apple used a worthless benchmark, the jury is still out.
Re: (Score:2)
You think Haswell is irrelevant to the topic of scientific computing?
It is irrelevant to the claim being made, and therefore to this discussion. The coward in question offered a specious claim of their code running faster on their hardware than someone else's hardware as evidence that PC hardware was not faster than Macintosh hardware. Try to keep up.
Re: (Score:1)
He was comparing to a computer that has 2,388 nodes of 32 physical cores per node. But obviously with the right acceleration a simpler AMD system would work as well as the Mac.
I do feel there is some hate at Apple for being Apple, especially from OSS folks. That said, their hardware is still top notch regardless of the status of the software.
Re: (Score:1)
Try to keep up faggot.
If I wanted to keep a homosexual "up", I'm sure I could do that. I don't, but I still don't think "faggot" is an insult. If I were gay, though, you'd know.
The same code runs 6x faster on my five-year-old Mac than at NERSC.
Who the fuck are you? There's no evidence that anything you say is true. AC comments aren't worth the name they're associated with.
Re: (Score:2)
Aaand... disqualified.
If you want anyone to actually read your comments, I suggest not starting with personal attacks. Nobody wants their time wasted with useless statements.
Re: (Score:1)
If you want anyone to actually read your comments, I suggest not starting with personal attacks.
If they want to not be taken as a coward and a troll, then I suggest they associate their comments with an identity. The primary use of the anonymous posting functionality is trolling.
You also make personal attacks constantly, so you can fuck right off like the hypocrite that you are.
Re: (Score:2)
If you're running TensorFlow, what do you care about Fortran? Fortran doesn't even have a mainstream TensorFlow API.
Fortran will just run on the normal cores, maybe slightly more efficiently than on whatever cluster/cloud/supercomputer you're using, but in the big scheme of things it's still kinda silly to run numerical code locally on a laptop unless it's trivial. If it's trivial, there's a Fortran-to-WebAssembly compiler, I think.
For quick results for your research into how to make mass surveillance and consumer dat
Re: (Score:2)
The dependency on TensorFlow could be the opposite. TensorFlow itself (or plugins to TensorFlow) might depend upon FORTRAN libraries and not that you want to call TensorFlow from FORTRAN. (I am just guessing as I don't use TensorFlow.)
I think too many people on this thread make too many assumptions about the kind of code that counts as "scientific". There are all sorts of scientific code that can run without huge amounts of memory. (Indeed I would bet that 95%+ of the FORTRAN code people want to use was creat
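For anyone curious how deep the Fortran dependency runs in the Python stack mentioned here, a quick way to check is to ask NumPy and SciPy which BLAS/LAPACK they were built against; the exact output depends on the install (OpenBLAS, MKL, Accelerate, ...):

# Prints the BLAS/LAPACK build configuration of the local NumPy/SciPy install.
import numpy as np
import scipy

np.show_config()     # which BLAS/LAPACK NumPy is linked against
scipy.show_config()  # SciPy wraps many more LAPACK (Fortran) routines on top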
Re: (Score:2)
Actual scientist here.
Nope.
Re: Don't worry. (Score:1)
The CPU is great for scientific calculation. The os is just a thing that manages running your programs, it isn't a religion.
Re: Don't worry. (Score:5, Informative)
The CPU is great for scientific calculation.
No Apple platform has ever had the fastest shipping Intel CPU. The only time a Mac has been faster than PCs has been when the G4 came out, which was briefly faster than available Intel offerings. The G5 was faster at some tasks, slower at others.
These days the real horsepower is in GPUs, and Apple's ecosystem doesn't allow you to have the latest and greatest GPUs, so they're not faster at that type of calculation either.
The os is just a thing that manages running your programs, it isn't a religion.
That's bullshit through and through. First, the OS is a religion to the average Apple user. They'll tell you it's better than anything else, but they can't tell you why, they're just taking it on faith. When you point out real and concrete examples of how it is inferior, they come back with irrelevant responses. Second, the OS does a lot more than "manage running your programs". It includes a lot of other functionality. If you really think all it does is that, you are utterly unqualified to comment. If you don't, then why did you say so? Apple Religion?
Re: (Score:2)
I don't think they even know what the OS is. If the UI stayed the same but the underlying OS switched out, how many of the Apple faithful would notice, care, or understand? And anyway, with a walled garden like they have, isn't most/all of the OS access through Apple's APIs/frameworks?
Re: (Score:2)
"When you point out real and concrete examples of how it is inferior, they come back with irrelevant responses."
So... it's a Unix-based OS with a Mach kernel. I guess I'd like some real and concrete examples.
Re: (Score:2)
It's not the underpinnings, it's the UI and the stuff on top like the outdated libraries.
Re: (Score:2)
I guess I'm not seeing all of the "outdated" libraries.
And the UI is pretty consistent, unlike that, say, of Windows where all too often you're likely to drop in a Windows XP-style settings dialog that comes straight from the 90's. Or even, for matter, Linux, where you're lucky to see the same UI on two different machines, as it all depends on personal preferences and whatever the flavor-of-the-week UI/UX library the owner wanted to install. (Deepin, KDE, Pantheon, Budgie, Cinnamon, LXDE... I could go on, b
Re: Don't worry. (Score:1)
You are just foolish. Scientific computing does not require the fastest CPU or the largest computer. And in addition to computing stuff, many people like to use visualization tools to analyze the results. The Mac is convenient for both. I generally use a Ryzen CPU with Nvidia graphics, but lots of people use a Mac for the same functions, using the exact same programs I use on Linux.
And the function of an os is indeed to manage running programs and organize files.
Re: (Score:2)
And the function of an os is indeed to manage running programs and organize files.
I guess it's not to enable applications' access to hardware, huh? Like, the primary function of an operating system?
Re: (Score:2)
The CPU is great for scientific calculation.
No Apple platform has ever had the fastest shipping Intel CPU. The only time a Mac has been faster than PCs has been when the G4 came out, which was briefly faster than available Intel offerings. The G5 was faster at some tasks, slower at others.
These days the real horsepower is in GPUs, and Apple's ecosystem doesn't allow you to have the latest and greatest GPUs, so they're not faster at that type of calculation either.
The os is just a thing that manages running your programs, it isn't a religion.
That's bullshit through and through. First, the OS is a religion to the average Apple user. They'll tell you it's better than anything else, but they can't tell you why, they're just taking it on faith. When you point out real and concrete examples of how it is inferior, they come back with irrelevant responses. Second, the OS does a lot more than "manage running your programs". It includes a lot of other functionality. If you really think all it does is that, you are utterly unqualified to comment. If you don't, then why did you say so? Apple Religion?
One reason I like OSX is that it is designed around Fitts's Law. It's also nice having a Unixey environment with a nice UI.
Re: Don't worry. (Score:2)
Apple's UI is garbage, though. It is nigh-impossible to reconfigure, it has inadequate contrast, last I checked Apple apps alone used three different widget sets, and the dock moves around as apps are opened and closed so it defeats the use of muscle memory.
Re: (Score:2)
You must be new here!
Re: (Score:1)
Precisely. This is a lame attempt by pathetic people to try and justify buying computers for their shininess by running calculations on them.
Stop wasting your research dollars on VANITY, you w%$nkers.
Re:Don't worry. (Score:4, Insightful)
You're doing the math wrong. A computer costs roughly 1% of my salary. If I can be 1% more efficient in a year, due to personal preferences as to how I use a computer, the _entire_ cost of the computer is recovered in one year. It's a great investment. So, no matter the difference in cost, it's just worth it to use a computer that you like. I am a very heavy scientific computer user (experimental control, data analysis, everything else). I have written a few hundred thousand lines of C++ code in my life, along with maybe a comparable amount of Python. I am sure I could get a nice little Linux box and do OK with it. However, I like Apple Mail, and Keynote for presentations. They work well.
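The parent's argument, reduced to arithmetic (salary figure hypothetical):

# If the machine costs ~1% of salary and buys ~1% extra productivity,
# it pays for itself in about a year regardless of the absolute numbers.
salary = 150_000                      # $/yr, assumed for illustration
machine_cost = 0.01 * salary          # "roughly 1% of my salary"
value_per_year = 0.01 * salary        # 1% efficiency gain, valued at salary
print(machine_cost / value_per_year)  # 1.0 year to pay back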
Re: (Score:2)
You can get a nice high end threadripper box for about 10K that will smoke anything apple has. You can put really good GPUs in it and push it up to 20K. It would be amazing for scientific computing. It would also be more than 1% of your salary unless you are making 2M/year. It would still likely be a good investment.
Macs are nice pieces of jewelry but if you need serious number crunching there are FAR better options.
Re: (Score:2)
You can get a nice high end threadripper box for about 10K that will smoke anything apple has.
Who cares? No sane person buys a laptop to use as a compute server.
I use my laptop for prototyping, editing, compiling, testing.
But when I am done, the code will be deployed to the cloud.
The important criteria for a laptop are a nice keyboard, screen resolution, the UX, and having a brand that impresses the cute barista at Starbucks.
Re: (Score:3)
does it run LaTeX?
Re: (Score:2)
You're not losing any scientific development efficiency on a platform that's not common for scientific development?
No. Most scientific computing is done on Linux. Linux and MacOS are 99% API compatible, and the other 1% is easy to avoid. It is rare to have any OS compatibility issues when moving code developed on a Mac to servers running Linux.
Re: (Score:3)
It doesn't matter because Apple hardware isn't for science, it's for art.
So what you are saying is, Apple silicon needs a COBOL compiler?
Re: (Score:2)
So what you are saying is, Apple silicon needs a COBOL compiler?
There are COBOL compilers available for macOS and they run on the M1.
For instance, there is GnuCOBOL [wikipedia.org].
Re: Don't worry. (Score:2)
My niece, finishing her PhD in neuroscience, disagrees. She uses Python, Excel, and Java software for her data modelling.
That's odd. (Score:4, Interesting)
Re:That's odd. (Score:5, Interesting)
It may not generate optimal code for Apple silicon in every case, but gcc does damn well in most cases. And the gcc Fortran compiler is as battle-tested as BLAS is. Certainly a person wanting to do scientific computing on Apple's hardware can't use the lack of a Fortran compiler as an excuse.
Re: (Score:3)
Not a problem. Just use Apple's optimized version of LLVM and stick GNU F77 in front of it via DragonEgg. Problem solved.
Re: (Score:3)
The 600+ instruction scoreboard/scheduler in the M1 (much larger than the 250-ish instruction window in latest AMD and Intel cores, and those seem fairly adequate anyway) means that compiler microarchitecture cost functions aren't as important as they were in the old days, or on in-order processors like the Cortex-A53 "little" cores, or embedded processors. The processor is effectively dynamically recompiling and instruction-scheduling your code as it runs, anyway.
So: as long as the compiler does a decent
Re: (Score:2)
I think the real point here is that Apple's microarchitecture is different enough that a generic ARM target is not going to be very good
There have been benchmarks out showing that an M1, automatically translating Intel to ARM code, beats all the Intel CPUs with up to four cores. So I suggest there can't be that much difference.
Only thing that Apple tells you: 1. Tell every thread what performance it needs (so you can have threads running automatically on the low power cores, or in the background). 2. Use the graphics formats that are fastest. 3. If you use CoreML, don't tell it to use CPU or Graphics card only, but everything (or it will
Not any OpenSrc, BSD or lighter (Score:2)
Apple wants what Apple has always wanted: open source without strings, that can be closed down. Like when they took BSD code (largely FreeBSD) for Mac OS X. The GPL, any version, is toxic for them because they would have to deliver source on request to anyone who receives binaries.
As per the side-thread, there are perfectly good open-source Fortran compilers (gfortran, flang), but gfortran is under the GPL and hence undesirable for Apple.
Re:Or you could, you know... (Score:4, Informative)
Oh and modern Fortran isn't the mess that Fortran 77 was. So you can use your ancient Fortran libraries and code in modern Fortran if you like.
But what is right about fortran? (Score:2)
What exactly is wrong with Fortran? 99% of what I write is C but Fortran certainly is an appropriate language for scientific computing.
Sincere question...Why do we care? I remember fortran classes in school, but I never have come across it in my professional life. I don't know of a single famous program I use that was written using it. Why is that?
Am I missing some core applications or libraries I used daily that just don't advertise that they're written in Fortran? Mathematical computing is very important for many industries.
Why is not used in image editing software or 3D games if it is so much better than C or C++ or any of the
Re: (Score:3)
I remember fortran classes in school, but I never have come across it in my professional life. I don't know of a single famous program I use that was written using it. Why is that?
What is your field? In e.g. astronomy, Fortran is still actively used to write new software.
Am I missing some core applications or libraries I used daily that just don't advertise that they're written in Fortran? Mathematical computing is very important for many industries.
The summary mentions R and SciPy, which are used by many scientists. Here is a list of Fortran libraries [fortranwiki.org], most having wrappers for use in other programming languages. Depending on what you do, some of those might be vital because they are very robust and do their calculations very fast.
To come back to your first question: the LINPACK Benchmark, used to rank the TOP-500 supercomputers [top500.org], is written in Fortran and use
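As a small illustration of those wrappers, SciPy exposes the Fortran BLAS routine DGEMM directly (example data made up):

# Matrix multiply via the Fortran BLAS routine DGEMM, through SciPy's wrapper.
import numpy as np
from scipy.linalg.blas import dgemm

a = np.random.rand(200, 300)
b = np.random.rand(300, 100)
c = dgemm(1.0, a, b)          # C = 1.0 * A @ B, computed by the Fortran routine

assert np.allclose(c, a @ b)  # matches NumPy's own result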
Modernizing legacy scientific code ... (Score:3)
Sincere question...Why do we care? I remember fortran classes in school, but I never have come across it in my professional life. I don't know of a single famous program I use that was written using it. Why is that?
It may not be about the software you use but about the software other people use. People who you buy things from, probably non-software things. Like things made from plastics?
I too had a Fortran class and thought I'd never touch it again. I eventually worked on some molecular modeling and visualization software for Windows and Mac. All C and C++. Dow Chemical contacted us and said they'd like to use our software as a front end to some legacy computational chemistry code. Which, unsurprisingly, was in Fortran
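A minimal sketch of that pattern (routine and module names hypothetical, and it assumes a working Fortran compiler such as gfortran is installed - the very thing the article says is missing on Apple Silicon): wrap the legacy fixed-form Fortran with NumPy's f2py and let a modern Python front end drive it.

# Build a Python extension module from legacy fixed-form Fortran via f2py.
import numpy as np
import numpy.f2py

fortran_src = """
      subroutine scale(x, n, factor)
      integer n
      double precision x(n), factor
Cf2py intent(in,out) x
Cf2py intent(hide) n
      do 10 i = 1, n
         x(i) = x(i) * factor
   10 continue
      end
"""

# Compiles the source into an importable module named "legacy" (hypothetical name).
numpy.f2py.compile(fortran_src, modulename="legacy", extension=".f")

import legacy  # the freshly built extension, loaded from the current directory
print(legacy.scale(np.arange(5.0), 2.0))  # legacy Fortran, called from Python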
Re: (Score:2)
Or run it through f2c, which has been around for years... but apparently may or may not work well on anything past F77. The real problem is that until recently C was inherently slower than Fortran due to pointer aliasing assumptions. I don't know how well current C compilers support the restrict keyword that resolves that issue. Code converters are never that great anyway, and jumping ship to a new language isn't something you just do on a lark.
A FORTRAN compiler would save an awful lot of work, and get you hig
Re: (Score:2)
Writing mathematical code in modern Fortran is not at all painful. It is what the language is designed for, and it does the job very well, and as noted in the TFS those Fortran libraries are not practically replaceable. Updating of older Fortran software to run under new standards is not difficult.
"I don't use it" is not a good reason for other people to go to massive expense for no benefit.
Re: (Score:2)
Updating of older Fortran software to run under new standards is not difficult.
Updating some of Dow Chemical's older Fortran code from a mainframe console environment to a Windows/Mac GUI environment involved a GUI dialog box that replaced a handful of console input prompts. And of course C and Fortran compilers that were ABI compatible so their respective code could link.
Not sure I understand. (Score:2)
Not sure I understand, vendors have to make their own compilers for new architectures. Aren't existing ARM compilers fine for Apple Silicon? Are there any new extensions specific to Apple that I have not heard about?
Re: (Score:3)
For the most part it shouldn't matter but compilers can sometimes do specific optimisations for specific processors.
Fortran is these days used mostly for numerical programs where performance matters, which is what makes this an issue.
Two processors with the same ISA can have different issue-widths and be able to run different instructions in parallel. The same instruction can have different latency. Processors can also have different memory subsystems which together with everything else could have an effect
What about FLang? (Score:2)
And if you haven't noticed, the new Macs are all _low end_ machines. They are bloody fast for low end, somewhere between 2017 Intel 8-core chips and 2020 Intel 8-core chips, but they are low end. So if scientists can't use low end computers...
Modern low end computer may be OK for legacy s/w (Score:2)
So if scientists can't use low end computers...
They often can. The legacy software may originally have been running on quite dated systems. What once needed a Sun or SGI workstation can now run comfortably on a laptop. Some Fortran code I ported from a mainframe ran fine on regular PC desktops that the scientists happened to have for writing documentation. They were quite happy with a GUI front end pasted on top of the legacy Fortran code running on these PCs.
Just port flang (Score:2)
LLVM (which is what Apple uses on OSX and iOS and which will almost certainly run great on the new ARM Macs) has a Fortran frontend called flang. Just get that running and you have your Fortran compiler.
I would assume that gfortran will be along shortly? (Score:2)
I know that gcc does not currently support arm-darwin, but ... I would expect it to support the arm macs either before they're released (if the developers can get access to them) or shortly afterwards (if they can't).
I mean, the hardware itself isn't due to be released until late 2020 (right?) so there's still lots of time.
Re: (Score:2)
Support may not currently exist because the hardware is very new and not yet widely available, but it is being worked on.
Support for darwin-x86_64/ppc64 exists, as does support for arm64-linux/freebsd/etc so the os and cpu are already supported separately with the only thing missing being the specific combination of darwin and arm64.
Market? (Score:1)
Seriously, we're talking 5 guys in the whole world that want Fortran for a Mac?
Am I wrong?
I'd have to wonder what in the world they are doing that would still need Fortran. Just update it to a new language already.