Why Scientists Are Still Using FORTRAN in 2014
New submitter InfoJunkie777 (1435969) writes "When you go to any place where 'cutting edge' scientific research is going on, strangely the computer language of choice is FORTRAN, the first computer language commonly used, invented in the 1950s. Its name means FORmula TRANslation, and no language since has been able to match its speed. But three new contenders are explored here. Your thoughts?"
Q: Why Are Scientists Still Using FORTRAN in 2014? (Score:5, Insightful)
A: Legacy code.
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Informative)
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Interesting)
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:4, Informative)
Yeah, I used to hear that argument a lot in 1978...
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Interesting)
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Informative)
Blabbing on and on about vector-based GPUs is idiocy, because not everything uses trig where vector-based processing is beneficial. I have no confidence you have ever seen math-intensive code, based on what you are talking about. Nile, from its own page, is "The Nile Programming Language: Declarative Stream Processing for Media Applications" and NOT a language for Math.
I find this view quite amusing, given that the whole scope of the VPRI project (of which Nile has been one of the intermediate results) is to reduce everything in common personal computing to mathematics. And Nile in particular was designed precisely and explicitly to let the VPRI people express as wide an array of graphical operations with as short a high-level description as possible: a mathematical description, in equational form, compact enough to express the majority of Cairo (or any other Cairo-like 2D library) in a few hundred lines of equations. Furthermore, nowhere have I claimed that the semantics of Nile in its current form is a perfect replacement for any scientific computation language, as opposed to the thought that there could be some lessons to be learned.
And why don't you log in? There seem to be quite a few anonymous psychotic individuals running around here recently. It makes the conversation feel quite disingenuous.
Re: Q: Why Are Scientists Still Using FORTRAN in 2 (Score:3)
Pretty much all scientific computing benefits from vectorization. All that's needed to use it is that the code performs the same operation on multiple values of data (SIMD).
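To illustrate (a minimal sketch, not from the parent poster; the arrays are invented): in Fortran the pattern is often a single whole-array expression with no dependence between elements, which is exactly the shape a vectorizer wants.

    program simd_friendly
      implicit none
      real :: a(1024), b(1024), c(1024)
      call random_number(a)
      call random_number(b)
      ! One elementwise expression over whole arrays; each element is
      ! independent, so it maps directly onto SIMD lanes.
      c = a * b + 0.5 * a
      print *, maxval(c)
    end program simd_friendly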
Re: (Score:3)
APL-family languages should be even more popular than Fortran!
Probably would be if it weren't a write-only language.
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Informative)
Fortran has been an "APL-style language" since Fortran 95, with most of the APL operations present. That was done both for optimization and for convenience. And other APL-style languages are very popular as well, foremost MATLAB.
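For instance, a minimal sketch of the APL-flavored operations standard Fortran has had since F95 (example values made up):

    program apl_style
      implicit none
      real :: x(5)
      x = (/ 1.0, -2.0, 3.0, -4.0, 5.0 /)
      print *, sum(x)            ! reduction, like APL's +/
      print *, maxval(abs(x))    ! elemental abs composed with a reduction
      where (x < 0.0) x = 0.0    ! masked assignment, no explicit loop
      print *, count(x > 0.0)    ! count under a predicate
    end program apl_style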
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:4, Informative)
Re:Popular has a lot to do with installed base... (Score:5, Interesting)
Isn't the main performance benefit that Fortran has always claimed over C/C++ the fact that an array is guaranteed to only be used from one thread at a time, and thus you don't have to re-read the data from memory into registers each time you want to do something with the data in the array? A capability that was formally added to C in C99 (and pretty much universally, informally, added to C++) with the restrict keyword?
Correct me if I'm wrong here, as I'm not a Fortran programmer.
Re: (Score:3, Informative)
The restrict keyword is not related to threading. C/C++ compilers have always assumed that data is not accessed from several threads without synchronization; it just wasn't standardized until the new memory model in C11 and C++11. So if you don't use mutexes, memory barriers, etc., the compiler is allowed to assume a single thread of execution.
What restrict does is guarantee that two pointers do not point to the same area of memory (aliasing). Say a function takes two pointers (char* a, char* b). If you write through a, the compiler normally has to assume b might point to the same memory and reload it after every store; restrict promises that can't happen.
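For contrast, a minimal Fortran sketch (the routine name is made up): the language's argument rules already forbid the aliasing that restrict has to rule out explicitly in C, so the compiler is free to keep array values in registers across the stores.

    subroutine axpy(n, s, a, b)
      implicit none
      integer, intent(in)    :: n
      real,    intent(in)    :: s, a(n)
      real,    intent(inout) :: b(n)
      integer :: i
      ! Calling this with overlapping a and b is illegal Fortran, so the
      ! compiler never needs to re-read a(i) after the store to b(i).
      do i = 1, n
         b(i) = b(i) + s * a(i)
      end do
    end subroutine axpy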
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Insightful)
Also "legacy training". Student learns from prof. Student becomes prof. Cycle repeats.
Also Fortran didn't stagnate in the 60s, it's been evolving over time.
Other languages are highly optimizable too. However, most of the new and "cool" languages I've seen in the last ten years are all basic scripting languages, great for the web or IT work but awful for doing lots of work in a short period of time. It's no mystery why Fortran, C/C++, and Ada are still surviving in areas where no just-in-time wannabe will flourish.
Legacy Programmers (Score:5, Interesting)
Also "legacy training". Student learns from prof. Student becomes prof. Cycle repeats.
Not really - even when I was a student we ditched F77 whenever we possibly could and used C or C++. The issue is more legacy programmers. Often the person in charge of a project is an older person who knows FORTRAN and does not want to spend the time learning a new language like C (or even C++!). Hence they fall back on something more comfortable.
However, by now even this is not the case. The software in particle physics is almost exclusively C++ and/or Python. The only things I am aware of which are still FORTRAN are some Monte Carlo event generators written by theorists. My guess is that, as experimentalists, even older colleagues have to learn C++ and Python to use and program modern hardware. Theorists can get by using any language they want and so are slower to change. It has probably been at least 15 years since I wrote any FORTRAN myself, and even then what I wrote was the code needed to test the F77 interface to a rapid C I/O framework for events, which was ~1-200 times faster than the F77 code it replaced.
Re: (Score:3)
Excuse me, but you're going to have to explain how your model is different from the one you say is wrong.
The point is that it is not the training provided by us professors; it is the people in charge of the projects who used to require Fortran. However, this has almost completely changed. I can't speak for other disciplines, but in Particle Physics the code migration IS happening at scale, I think. All the major programs used in collider physics have migrated to C++: GEANT, PYTHIA and ROOT are all now C++. There are still some generators written by theorists which are Fortran, but these are becoming increasingly rare.
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Insightful)
People using existing Fortran code are interested in the RESULTS of the computation, not whether the code is modern or has the latest bells and whistles. Programmers forget that the ultimate goal is for someone to USE the program. I wrote a program in CDC Fortran 77 in 1978 that's still being used. Why? Because it does the job.
Re: (Score:3, Insightful)
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Insightful)
He handed it over to a computer science graduate (i.e. a non-physicist) who really liked all the modern trends in CS. Now, five years later:
1. the tarball is an order of magnitude larger
2. the input files are now all impenetrable
3. the code requires access to the outside (not possible on many superclusters)
4. he re-indented everything for no apparent reason
5. the variable names were changed, made into combined types and are much longer
6. as a result, the code is basically unreadable and nearly impossible to compare to the original formulae
7. code is duplicated all over the place
8. it now depends on unnecessary libraries (like the ones required to parse the new input files)
9. it's about four times slower and crashes randomly
10. it generates wrong results in basic cases
To quote Linus Torvalds: "I've come to the conclusion that any programmer that would prefer the project to be in C++ over C is likely a programmer that I really *would* prefer to piss off, so that he doesn't come and screw up any project I'm involved with."
Fortran, apart from being a brilliant language for numerical math, has the added benefit of keeping CS graduates at bay. I'd rather have a physicist who can't program, than a CS type who can.
(Apologies to any mathematically competent computer scientists out there)
Re: (Score:3)
None of that is good programming, is the thing.
I really wish people would stop blaming the tools when the problem is people who are tools. Maybe that's endemic to "CS types"? But those of us who code for a living in the real world recognize what you describe as a noob stunt, not a language problem.
The main reason stuff stays in fortran is the general best practice of not messing with working shipped code. If the code needs regular work, for goodness sake use a maintainable language. But lots of fortran code is exactly that: shipped, working, and best left alone.
I have worked with a lot of CS people (Score:4, Insightful)
No. The main reason we program in fortran is because the libraries are known, have known error bars, known behavior, and are "provable". We *DO* reprogram every time we come to a new problem which needs to be translated. Chances are there is no standard code for what you want to simulate for your own specific problem. There are some rare cases, like QM programs (Gaussian, Molpro, etc.) or some engineering programs, but those are the exceptions, not the rule.
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Insightful)
He handed it over to a computer science graduate (i.e. a non-physicist) who really liked all the modern trends in CS.
Why was a graduate fresh out of university put in charge of architecture decisions? You wouldn't put an apprentice in charge of a mechanical workshop and expect them to keep it tidy and efficient; this is no different.
It's my general experience that it takes 5-10 years of commercial experience before someone is capable of making wise architecture choices about small standalone apps, and 15+ before they'll have a hope in hell of doing anything non-destructive with a large legacy application.
Re: (Score:3)
Your problem began when you turned the code over to an inexperienced CS graduate. You don't need to be a good software engineer or even a good programmer to get a degree in computer science. I wish people would stop conflating the two, especially the people in the HR department. :P
You needed a SOFTWARE ENGINEER. Worse, you needed an experienced software engineer familiar with the domain. Instead you handed it over to a fresh graduate who maybe had one or two courses on engineering. What exactly did you expect?
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Insightful)
Precision is important in scientific discourse. Latin isn't a language with creeping grammar and jargon. It's sorta what Esperanto only wished it could ever be.
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Insightful)
I would also hazard a guess that Fortran tends to be a tad easier to read than C... Especially for scientists...
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Informative)
Arrays! (Score:5, Informative)
The big thing Fortran has over C is proper support for multidimensional arrays, with powerful slicing operations built into the language. It was the inspiration for numpy arrays. My first languages were C++ and C, but when I do scientific programming, my languages of choice are now python and fortran (with f2py making it very easy to glue them together). Fortran is horrible at text processing, and has an almost absent standard library, but for scientific use, good arrays make up for that - especially when you can use python in the non-performance-critical parts.
C++ has some multidimensional array classes, but none of them are as convenient as fortran arrays. Especially when it comes to slicing. At least that's how it was the last time I checked.
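A minimal sketch of what that looks like in practice (the sizes and the stencil are invented for illustration):

    program slices
      implicit none
      real :: grid(100, 100), col(100), sub(50, 50)
      call random_number(grid)
      col = grid(:, 7)              ! one whole column, no loop
      sub = grid(1:50, 51:100)      ! a rectangular sub-block
      ! Jacobi-style averaging: the right-hand side is evaluated before
      ! the assignment, so the overlapping slices are safe.
      grid(2:99, 2:99) = 0.25 * (grid(1:98, 2:99) + grid(3:100, 2:99) &
                               + grid(2:99, 1:98) + grid(2:99, 3:100))
      print *, sum(grid)
    end program slices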
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Funny)
Re: (Score:3)
I would also hazard a guess that Fortran tends to be a tad easier to read than C... Especially for scientists...
The way scientists write code, it doesn't matter what language they use; it will still come out an undecipherable mess. You'll have a quintuply nested loop populating an array called tlnb1 in a function called abn that takes 8 arguments named t1 to t8. The only documentation will be a single comment just before the loop that says "This should work now".
If you're going to be working with scientific code you'll need at least a Master's degree in software archaeology and software anthropology.
Re: (Score:3)
Latin was the one language that all academics shared.
...you mean, those who didn't speak Hebrew, Greek, or Arabic?
Re: (Score:3)
Re: Q: Why Are Scientists Still Using FORTRAN in 2 (Score:3)
Doctors don't use Latin. They use their native languages and a bunch of proper nouns that happen to be Latin (or Greek) words or phrases.
Latin (or Greek) used to be the language of academia because Greek is what the Greeks used, and Latin is what the Romans translated the Greeks into. And the church loved Aristotle, and ironically the church pretty much defined western academia.
Re: (Score:3)
Latin is even more terse than English, and the words can be placed wherever the fuck you want. It can be ambiguous enough, all right! (though maybe that's because I studied poetic Latin a bit in high school). Declensions on every word save it, but they also mean you can really abuse it. And, it being a dead language, no one knows how to say "yes", "no", "hello", "thanks", "how are you doing?" and such little things.
In the 17th/18th century French replaced it, with fewer declensions and more grammar (and you had other such artificial national languages).
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Informative)
ALL CAPS has been optional since 1990, at least.
Fortran has had modularisation and structured code since 1990, classes and object orientation since 2003. Please update your prejudices.
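A minimal sketch of post-1990 Fortran, with a Fortran 2003 type-bound procedure (the type and names are invented):

    module particle_mod
      implicit none
      type :: particle
         real :: x(3), v(3)
       contains
         procedure :: advance
      end type particle
    contains
      subroutine advance(self, dt)
        class(particle), intent(inout) :: self
        real, intent(in) :: dt
        self%x = self%x + dt * self%v   ! whole-array update, no loop
      end subroutine advance
    end module particle_mod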
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:4, Interesting)
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Funny)
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Interesting)
What amused me about the article was actually the Fortran versions they spoke about. F95? F03? F08? Let's be real: just about every Fortran codebase I've heard of is still limited to F77 (with some F90 if you're lucky). It just won't work on later versions, and it's deemed not worth porting over, so the entire codebase is stuck on almost-40-year-old code.
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:4, Interesting)
F77+extensions, usually DEC extensions. Very very few people ever used strict F77 with no extensions.
Some of the issues this causes are irritating, bordering on unnerving. This week we discovered that g77 didn't care for treating INTEGER as LOGICAL. There used to be no other way to specify bit operations; now it is precluded. Everybody's code has that, and there's really nothing intrinsically wrong or difficult to understand about it, but it was technically non-standard (although everyone's extensions permitted it) and it won't work on g77 - maybe only with the infamous -fugly flag.
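For anyone who hasn't hit this: a minimal sketch of the old extension versus the standard intrinsics (values made up).

    program bitops
      implicit none
      integer :: iword, mask, iflag
      iword = 12
      mask  = 10
    ! iflag = iword .and. mask     ! the DEC-style extension; not standard
      iflag = iand(iword, mask)    ! standard since F90 (MIL-STD-1753 before that)
      print *, iflag               ! prints 8
    end program bitops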
Re: (Score:3)
Reproducibility is part of science. So is identifying and fixing errors. But perhaps the most important aspect of science is being able to continue it.
I've worked in science labs where non-software engineers write code. They fall victim to the same problems software engineers fall victim to when they work without version control: they lose it, they overwrite it, they make mistakes and want to go backwards, they end up with 50 copies and can't remember which one was used to compile their postdoc work.
Re: (Score:3)
Fortunately, (1) more and more scientists (also non-CS) are using GitHub repos for development, as most of our stuff is public anyway; and (2) it is becoming quite common to release a GitHub page as part of a publication, which is also a selling point to the editors/reviewers.
Re: (Score:2, Interesting)
Seconded. And the legacy isn't necessarily just the source code. Many of the engineering industries using such codes have a relatively low turnover rate, meaning an older group of engineers and researchers with the most experience stick around for decades. Most of these folks have used Fortran since college. It works for them, and they aren't concerned with any "new-fangled" languages that offer more features. Another reason I hear from these folks is that Fortran has powerful array slicing and indexing syntax.
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Informative)
A: Legacy code, and because Fortran 2003+ is a very good modern language for scientific computation that maps very naturally to problems. As it turns out, the language semantics (both legacy and modern constructs) make it very amenable to parallelization. And it runs fast; as in, equalling C++ levels of performance is considered a weak showing.
If you haven't seen or used modern Fortran and think it's anything like Fortran 66/77 then you're mistaken. Except for I/O, which still tends to suck.
In addition there are still some seemingly trivial but actually important features which make it better than many alternatives (starting from Fortran 90).
There are some boneheaded clunkers in other languages which Fortran does right: obviously, built-in multi-dimensional arrays, AND arrays whose indices can start at 0, 1, or any other value, and which of course know their own size. Some algorithms are written (on paper) with 0-based indexing and others with 1-based, and allowing either one to be expressed naturally lowers the chance of bugs.
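A minimal sketch (the arrays are invented examples):

    program bounds
      implicit none
      real :: c(0:7)    ! 0-based, to match, say, an FFT written with 0-based indices
      real :: t(12)     ! 1-based, to match, say, a monthly formula
      integer :: k
      do k = 0, 7
         c(k) = 2.0**(-k)   ! subscripts read exactly like the paper
      end do
      t = 1.0
      print *, lbound(c), ubound(c), size(t)
    end program bounds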
Another one is that Fortran distinguishes between dynamically allocatable storage and pointers/references. The history of C has constrained/brain-damaged people into thinking that to get the first, you must necessarily take the second. That doesn't happen in Fortran: you have ALLOCATABLE arrays (or other things) for run-time allocation of storage, and if you need a pointer (rarer) you can get that too. And Fortran provides the "TARGET" attribute to indicate that something *may be pointed to/referenced*; by default this is not allowed. No making pointers/references to things which aren't designed to be referred to multiple times. This also means that aliasing potential is highly controlled, and the language semantics are constructed so that Fortran can make very aggressive, and safe, optimization assumptions.
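A minimal sketch of the distinction (names invented):

    program alloc_demo
      implicit none
      real, allocatable         :: work(:)    ! run-time sized; can never be aliased
      real, allocatable, target :: field(:)   ! may be pointed at, and says so
      real, pointer             :: view(:)
      integer :: n
      n = 1000
      allocate(work(n), field(n))
      work  = 0.0
      field = 1.0
      view => field(1:n:2)   ! legal only because field carries TARGET
      view = 2.0             ! writes every other element of field
    end program alloc_demo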
The more parallel you want to go, the more of these assumptions you need to get fast code, and naturally written Fortran code comes this way out of the box more than in most other languages.
Re: (Score:3)
A: Legacy code, and because Fortran 2003+ is a very good modern language for scientific computation and maps very naturally to problems
See.... Fortran 2003 is more modern than ISO 1999 C.... Now that that's settled... How come people are still programming in languages like C/C++/Java, when Fortran2003 is available?
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Insightful)
Legacy code that has been carefully checked to give correct results under a wide range of conditions.
Workers still use shovels in 2014!!!!! (Score:5, Insightful)
A: Legacy code.
AKA battle hardened libraries that work as advertised.
Re:Workers still use shovels in 2014!!!!! (Score:5, Funny)
Do not so lightly cast aside a tool which has proven its worth many times over.
not in the field, eh? (Score:5, Informative)
no, it's used because Fortran is the high-level language that produces the fastest code for numeric computation; it is by far the most optimizable. Yes, it blows away C.
Re: (Score:3, Insightful)
They both generate machine code, but they get there in different ways and produce very different output. It would be more correct to say FORTRAN compilers blow away any C compiler (esp. gcc).
Re:not in the field, eh? (Score:4, Informative)
Re: (Score:3)
The reason this is true is that Fortran compiler output, when appropriate optimizations are turned on, is not actually concerned with producing correct output... and although it is certainly faster, the resulting code will be less robust than code output by most modern C compilers. Modern compilers generally focus on producing correct output at all times, and may make compromises in efficiency for correctness.
The main differences involve pointer aliasing... and a C or C++ compiler with the appropriate restrict annotations can make many of the same assumptions.
Re: (Score:2)
A: Legacy code.
Most people who learned FORTRAN did so in a class intended to teach FORTRAN.
Most people these days aren't learning FORTRAN; they're learning other languages, and they're doing so in a context that teaches them about applying the tool, rather than making them really, really good with the tool itself.
To use an analogy, it's the difference between learning how to use all the tools in the wood shop before being sent in to make a chair (learn the tools) vs. being taught all about "chairness" and then being sent into the shop to figure out the tools as you go.
Re:Q: Why Are Scientists Still Using FORTRAN in 20 (Score:5, Interesting)
Wow, faster AND more accurate. They must use some mystical floating-point instructions that only Fortran compiler writers know about.
On PPC implementations, head-tail floating point is typically used for "long double"; this leads to inaccuracies in calculations. 80-bit Intel floating point is also inaccurate. So are SSE "vector" instructions, since denormals, NaNs, INFs, and -0 are always suspect unless your compiler emits an extra instruction in order to trigger the "next instruction after" signalling of the condition, and for NaNs you are still somewhat suspect there.
If it isn't IEEE-754 compliant, you pretty much can't trust it. FORTRAN goes way the heck out of its way, including issuing additional instructions and introducing pipeline stalls, in order to force IEEE-754 compliance.
Pretty much, this accuracy only matters if you are doing Science(tm); if you are doing graphics, you are generally willing to eat the occasional FP-induced artifact, because what you typically care about is the frame rate in your game, rather than being 100% accurate.
So, in closing, they're not using "some mystical floating-point instructions", they are just using accurate floating point, rather than approximate floating point.
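(Since Fortran 2003 this is even exposed in the language through the intrinsic IEEE modules; a minimal sketch:)

    program ieee_demo
      use, intrinsic :: ieee_arithmetic
      implicit none
      real :: x
      if (ieee_support_datatype(x)) then
         x = ieee_value(x, ieee_negative_inf)
         print *, ieee_is_finite(x)    ! F
         print *, ieee_is_nan(x - x)   ! T: (-inf) - (-inf) is NaN
      end if
    end program ieee_demo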
Re: (Score:3)
So, Fortran issues extra instructions and pipeline stalls for accuracy, yet manages to be faster.
That is amazing!
It's faster in areas not involving floating point, and in floating point on hardware that has a good floating point implementation.
It's easier to branch-predict fortran code, and the lack of pointer support makes the boolean algebra a lot simpler for the compiler. Given the calling conventions and limits, it's actually a lot easier to optimize fortran, and given that most matrix math involves linear loops, it's easier to autovectorize things like the Berkeley Physics package.
It's the right tool for the job (Score:5, Insightful)
Scientists work in formulas. Fortran was designed to express naturally things that don't fit into C/C++, Python, whatever.
Re:It's the right tool for the job (Score:5, Insightful)
In other words... (Score:5, Insightful)
If it ain't broke - don't fix it.
Re: (Score:3)
If it ain't broke - don't fix it.
No, it goes beyond that. Scientific calculations are just that - calculations. You don't need services, facades, annotations, etc. that mostly obscure what the code is doing. You just need a sequence of calculations.
Re: (Score:3)
Of course hindsight is always of the 20/20 variety - if we could find a DeLorean, we could time travel back to the early 50's and present The good doctor [wikipedia.org] with a technical demo of CUDA running on a modern $150 video card. Do we think the boffins of today would be using C++ CUDA kernels, or FORTRAN?
Aside from the hindsight fantasy, what's wrong with using FORTRAN in the way it was intended to be used? Old certainly doesn't mean obsolete.
Re: (Score:3)
Please choose your examples well.
Although the son of a well-to-do farmer, Newton was not nobility, and could only study because his talent was recognized. Gauss was the son of really poor people.
Re: (Score:3, Interesting)
The python code I tried ran at half the speed of my C++ code for machine learning (mostly matrix crunching). The situation got worse for python when I could push C++ compute steps into compile time. Scientific modeling seems to need a lot of number crunching.
Wrong question (Score:5, Insightful)
Why not?
Actually that is a serious question, for these sorts of applications there seems to be no significant downside.
Re:Wrong question (Score:5, Insightful)
Re: (Score:3)
Re: (Score:2)
FORTRAN and Lisp (and BASIC and C, only somewhat later) made programmers about as productive, within a reasonably small constant factor, as anything since.
I recently switched from C to Python, and my productivity shot up ~100 times. For example, I created a tree of hashes to perform pattern matching on large data sets in linear time. It took me 2 hours from concept to production run. Also factor in a huge library that does everything needed on this planet, the ease of maintaining 100x fewer lines of code, and 0 memory management hurdles.
The downside is that it runs 100 times slower than C, but since it is the programmer's productivity you are talking about, you are very wrong.
Re: (Score:3)
The downside is that it runs 100 times slower than C, but since it is the programmer's productivity you are talking about, you are very wrong.
You do realize that CPU limited problems are not uncommon in physics and engineering?
Re:Wrong question (Score:5, Insightful)
There's actually significant upside.
Ever debugged a memory error in C? Ever done it when it is timing dependent? How about on 1024 nodes at once? Good luck opening that many gdb windows.
I TA'd the parallel programming class. I told the students (largely engineering & science, not CS): use fortran. The lack of pointers is actually a feature here.
Why not? (Score:4, Insightful)
At work in the recent past (the 2000's) we were still supporting FORTRAN on the SGI machines we had running. The SGI compilers would optimize the hell out of the code and get it all parallelized up, ready to eat up all the CPUs.
Newer isn't always better.
Comment removed (Score:5, Informative)
Re:Ten Reasons to use Modern Fortran (Score:5, Insightful)
you left out the massive gigabytes of well-tested and respected numeric libraries for all the major fields of science and engineering (that are free for use too).....oh, and much of that written in F77, the most optimizable language for numeric computation on planet earth. that's why supercomputer companies always sell ForTran compilers
Gnu killed fortran (Score:5, Insightful)
For years and years and years the Gnu G95 compiler was only a partial implementation of the language. This made it impossible to use without buying a compiler from intel or absoft or some other vendor. That chokes the life out of a language for casual use.
Personally I really like the combination of F77 and python. What's cool about it is that F77 compiles so damn fast that you can have python spit out optimized F77 for your specific case sizes. Then for the human interface, dynamic memory allocation, and glue to other libraries you can use python.
Re: (Score:2)
Fortran, the language, has evolved very significantly, with little annoying cruft hurting current design on account of legacy compatibility.
The comparison vs C and C++ is instructive.
Strangely? (Score:5, Insightful)
When you go to any place where 'cutting edge' scientific research is going on, strangely the computer language of choice is FORTRAN, the first computer language commonly used, invented in the 1950s.
Perhaps it's still the best tool for the job. Why is that strange? Old(er) doesn't necessarily mean obsolete -- and new(er) doesn't necessarily mean better.
Re: (Score:3)
Re:Strangely? (Score:5, Insightful)
Agreed. My thought at reading the summary was "Do older languages have some sort of expiration date I don't know about?" What's odd about it? Also, it's not like the language has been stagnant. English is an old "legacy" human language with lots of cruft and inconsistent rules, but it works well enough for us that it's not worth jumping ship for Esperanto.
A large part of it is probably the simple inertia of legacy, in code, systems, and personnel. However, legacy systems tend to be replaced eventually if a demonstrably superior product can improve performance in some way. Any significant change, even one for the better, causes pain and friction, so the change typically has to be worth the pain involved. Obviously, in the eyes of many science-focused projects, it hasn't been worth switching to a new language. There's also value in having a body of work in an older and very well understood and documented language, as it means new team members are much more likely to already be proficient with it than with a newer and less popular language.
I can also understand not wanting to switch to some "flavor of the month" language when you're not sure how long it will be actively supported. FORTRAN has credibility simply based on its incredible longevity. No, it's not new and sexy, but you can bet it will probably be around for another half-century.
Language-limited clients and DRY (Score:3)
My thought at reading the summary was "Do older languages have some sort of expiration date I don't know about?"
That is because you aren't a hipster or fad brogrammer. These idiots probably expect them to be using Node.js or some such bullshit.
In some cases, the client side is language-limited, and everything has to be translated to one language before it can be deployed. In the case of iPhone OS (now iOS) during the second quarter of 2010, this was Objective-C++. In the case of Windows Phone 7 apps and Xbox Live Indie Games, this is the subset of verifiably type-safe CIL accepted by the .NET Compact Framework (which in practice means C#). In the case of web applications, this is JavaScript. In order to ensure that the client-side prevalidation stays consistent with the server's rules without repeating yourself, you end up pushed toward the client's language everywhere.
Fortran is NOT the language of choice (Score:4, Informative)
I never thought about engineering and Fortran (Score:5, Insightful)
Large scale models handling huge arrays, though - like climate or weather modeling - I think that's where Fortran has always been king of the roost.
The whole point is speed. No one's working in Python if they're interested in speed.
Memory management (Score:2)
As managers of High Performance Computing platforms, we generally take an a-religious approach and deliver whatever compilers and libraries our users need.
Re: (Score:2)
Still a big hit in Vietnam (Score:2, Insightful)
Re: (Score:2)
As others have said... why not? (Score:5, Insightful)
If the language accomplishes the task efficiently and effectively with no apparent downside then why attempt to switch languages simply for the sake of switching?
Furthermore, the ability to run legacy code should be sustained, especially in science, where being able to use that code again after many years might save scientists from having to reverse engineer past discoveries.
Key Reason (Score:5, Interesting)
Huge libraries of FORTRAN code have been formally proven, and new FORTRAN code can be formally proven. Due to the limitations of the language, it is possible to put the code through formal processes to prove the code is correct. In addition, again as a benefit of those limitations, it is very easy to auto-parallelize FORTRAN code.
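A minimal sketch of why: Fortran 2008's DO CONCURRENT lets the programmer assert the independence an auto-parallelizer would otherwise have to prove (sizes invented).

    program saxpy_conc
      implicit none
      integer, parameter :: n = 100000
      real :: x(n), y(n)
      integer :: i
      call random_number(x)
      y = 1.0
      ! The construct promises the iterations are independent, so the
      ! compiler may vectorize or parallelize them freely.
      do concurrent (i = 1:n)
         y(i) = y(i) + 2.0 * x(i)
      end do
      print *, sum(y) / n
    end program saxpy_conc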
We're Not (Score:2, Interesting)
I saw this link bait the other day...
We're NOT using Fortran anymore...
Many of us at the National Labs do modern, object-oriented C/C++... Like the project I'm in charge of: http://www.mooseframework.org/ [mooseframework.org]
There are whole labs that have completely expunged Fortran in favor of C++... like Sandia (http://trilinos.sandia.gov), who actually went through a period in the late 90s and early 2000s where they systematically replaced all of their largest Fortran computational science codes with C++.
Those places that don't are becoming the exception.
Re: (Score:2, Interesting)
If you're using C++ for scientific math, then you deserve to have whatever credentials you may possess revoked immediately. No language should be used for scientific math that can produce different results based upon the version of the library or platform it is compiled against.
You also cannot prove C++ code is good. You just can't. C++ is not deterministic, again, because the outcome depends on platform/library versions, compiler options, time of day, alignment of the planets, and many other factors.
Re:We're Not (Score:5, Insightful)
Firstly... 10^-15 is WAY beyond what most scientific codes care about. Most nonlinear finite-element codes generally shoot for convergence tolerances between 1e-5 and 1e-8. Most of the problems are just too hard (read: incredibly nonlinear) to solve to anything beyond that. Further, 1e-8 is generally WAY beyond the physical engineering parameters for the problem. Beyond that level we either can't measure the inputs, have uncertainty about material properties, can't perfectly represent the geometry, have discretization error etc., etc. Who cares if you can reproduce the exact same numbers down to 1e-15 when your inputs have uncertainty above 1e-3??
Secondly... lots of the best computational scientists in the world would disagree:
http://www.openfoam.org/docs/u... [openfoam.org]
http://libmesh.sourceforge.net... [sourceforge.net]
http://www.dealii.org/ [dealii.org]
http://eigen.tuxfamily.org/ind... [tuxfamily.org]
http://trilinos.sandia.gov/ [sandia.gov]
I could go on... but you're just VERY wrong... and there's no reason to spend more time on you...
Postscript (Score:2)
Today someone told me about how he once wasn't allowed to disturb a printer - because someone was using it to run a job doing an FFT written in PostScript. Apparently the large amount of memory available in the printer was paramount.
As a Social Science Ph.d. (Score:5, Funny)
Because C and C++ multidimensional arrays suck (Score:5, Insightful)
A big problem is that C and C++ don't have real multidimensional arrays. There are arrays of arrays, and fixed-sized multidimensional arrays, but not general multidimensional arrays.
FORTRAN was designed from the beginning to support multidimensional arrays efficiently. They can be declared, passed to subroutines, and iterated over efficiently along any axis. The compilers know a lot about the properties of arrays, allowing efficient vectorization, parallelization, and subscript optimization.
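A minimal sketch (the routine is invented): an assumed-shape dummy argument carries its rank and extents with it, so one routine handles any rectangular matrix.

    module mat_mod
      implicit none
    contains
      function col_sums(a) result(s)
        real, intent(in) :: a(:,:)     ! assumed shape: rank fixed, extents travel with a
        real :: s(size(a, 2))
        integer :: j
        do j = 1, size(a, 2)
           s(j) = sum(a(:, j))         ! whole-column reduction along the first axis
        end do
      end function col_sums
    end module mat_mod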
C people do not get this. There have been a few attempts to bolt multidimensional arrays as parameters or local variables onto C (mostly in C99), but they were incompatible with C++, Microsoft refused to implement them, and they were made optional in the latest revision of C.
Go isn't any better. I spent some time trying to convince the Go crowd to support multidimensional arrays properly, but the idea got talked to death and lost under a pile of little-used nice features.
Re:Because C and C++ multidimensional arrays suck (Score:4, Interesting)
Easily fixed with libraries like Eigen ( http://eigen.tuxfamily.org/ind [tuxfamily.org]... ) and many others.
That's the problem. There's no one way to represent a multidimensional array in C++. There are many ways. Which means math libraries using different ones are incompatible with each other. The last time I did a big number-crunching job in C++, I had four different array representations forced on me by different libraries.
Because the compiler has no clue what those array libraries are doing, you don't get basic loop optimizations that FORTRAN has had for 50 years.
Re: (Score:3)
There's no one way to represent a multidimensional array in C++.
Actually, there is. Underneath it all, it's a pointer, a size per dimension, and (dimensions - 1) strides. I do not believe I've ever encountered a multidimensional array system (as distinct from arrays of arrays) which did not do it that way.
In practice, this means you can write a bit of glue code in about 10 minutes to translate multidimensional arrays from one "incompatible" system to another. It's a minor pain, but hardly a showstopper.
Its not strange (Score:4, Informative)
Most scientists like to use tools that work, and they are proficient in.
FORTRAN falls under both categories.
Perl of the timesharing age, a real Adventure! (Score:3)
FORTRAN was -- for some still is -- the 'Perl' of scientific computing. Get it in and get it done... it doesn't always compile down very tight, but it's always fast, because for mainframe vendors getting this language optimized on a new architecture was the first priority.
At 15, the first real structured program I ever de-constructed completely while teaching myself the language, was the FORTRAN IV source for Crowther and Woods Colossal Cave Adventure [std.com], widely regarded as 'the' original interactive text adventure, a genre which would later go multi-user to become the MUD. Read about it here [rickadams.org], or play it in Javascript [xenoveritas.org].
Crowther's PDP-10 version was running on the 36-bit GE-600 mainframes [wikipedia.org] of GEISCO (General Electric Information Services) Mark III Foreground timesharing system... this was in the golden age of timesharing, and no one did it better than GE. It took HOURS at 300bps and two rolls of thermal paper to print out the source and data files, and I laid it out on the floor and traced the program mentally, keeping a notebook of what was stored in what variable... I had far more fun doing this than playing the game itself.
FORTRAN IV and Dartmouth BASIC (I'll toss in RPG II also) were the 'flat' GOTO-based languages of an era of hand-built rather than syntactic nesting -- a time in which high-level functions were available to use or define, but humans needed to plan and implement the actual structure of programs mentally, using conditional statements and numeric labels to JUMP over blocks of code. Sort of "assembly language with benefits".
When real conditional nesting and completely symbolic labeling appeared on the scene, with good string handling, it was a walk in the park.
Why are scientists still using LATIN in 2014? (Score:5, Funny)
Choice of language is secondary (Score:5, Informative)
Yes, FORTRAN sucks, but it is stable, fast and well understood. It runs on a number of supercomputer architectures. It is way easier to program in FORTRAN than in C for non-CS people. So what is the issue? Oh, maybe that it is not a "modern" language? Here is news for you: Java sucks a lot more than FORTRAN.
Only one problem with Fortran... (Score:3)
No one knows how to spell it. In these threads I've seen Fortran, FORTRAN, ForTran, etc. Who the hell can keep track? That's why C is winning! One letter, upper case, C. No muss, no fuss. Not to worry, PL\I had the same issue (Forward or backward slash? Better go look it up!) Same with LISP, LisP, Lisp (or, these days, Scheme, Racket, Clojure) - too many letters, too many ways to misspell them. D avoided falling into this trap, but which would you rather have on a paper, a D or a C? And C++ doesn't even look like a real grade. Python? Are we naming a language or a comedy troupe? Same with Ruby - I don't need a bunch of freaking geologists telling me how to use a computer.
Jeez, everyone, C just got it right where it counted - its name. Now can we just all agree to use that and move on?
Because you are a naive nanny? (Score:3)
Fortran is still used because it works. Because it is fast. Because libraries are optimized and well understood. Fortran is still used because *gasp* it has evolved since FORTRAN 66 and FORTRAN IV. Maybe you and the other language nannies, always forcing the latest greatest buzz on the rest of us, should take the time to actually read about some of the most recent [wikipedia.org] versions?
Re: (Score:3)
yes really (Score:2)
haha, maybe you better look at what language huge parts of the cores of your PETSc and Trilinos are written in. hint, starts with an F
Re: (Score:2)
yep. along with all the rest: BLAS, EISPACK, CERNLIB, MINPACK, SOFA, ATLAS, EIGEN, ... and even the comparatively more recent bioinformatics cores... BLAST, BLAT, ...
I really don't understand where the "scientific computing... almost all new software is written in C++" claim comes from. It's all become Python (and Perl before that) calling old libraries at the scientific meetings I've attended. (but I suppose YMMV)
Re: (Score:2)
Not everyone needs to know all of the quirks of C++ to use it. My project ( http://mooseframework.org/ [mooseframework.org] ) does all of the nasty C++ stuff under the hood so that we can expose a very straightforward interface to non-computer-scientists.
It's working out well so far.
Object-oriented is still a good paradigm until the functional language people get everything figured out and there are enough computational science libraries written in functional languages. And if you want to do object-oriented and you still want performance, C++ is hard to beat.