Why Scientists Are Still Using FORTRAN in 2014

New submitter InfoJunkie777 (1435969) writes "When you go to any place where 'cutting edge' scientific research is going on, strangely the computer language of choice is FORTRAN, the first computer language commonly used, invented in the 1950s. Its name means FORmula TRANslation, and no language since has been able to match its speed. But three new contenders are explored here. Your thoughts?"
  • by dtmos ( 447842 ) * on Friday May 09, 2014 @08:46PM (#46963867)

    A: Legacy code.

  • by Balial ( 39889 ) on Friday May 09, 2014 @08:53PM (#46963893) Homepage

    Scientists work in formulas. Fortran was designed to express naturally the things that don't fit well into C/C++, Python, whatever.

  • Wrong question (Score:5, Insightful)

    by Brett Buck ( 811747 ) on Friday May 09, 2014 @08:54PM (#46963895)

    Why not?

    Actually, that is a serious question; for these sorts of applications there seems to be no significant downside.

  • Why not? (Score:4, Insightful)

    by grub ( 11606 ) <slashdot@grub.net> on Friday May 09, 2014 @08:56PM (#46963899) Homepage Journal

    At work in the recent past (the 2000s) we were still supporting FORTRAN on the SGI machines we had running. The SGI compilers would optimize the hell out of the code and get it all parallelized, ready to eat up all the CPUs. (A sketch of that kind of loop follows this comment.)

    Newer isn't always better.
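
    A minimal illustrative sketch of the kind of dependence-free Fortran loop such compilers spread across CPUs -- here an explicit OpenMP directive stands in for SGI's auto-parallelizer (compiler flag assumed, e.g. gfortran -fopenmp; without it the directive is just a comment and the loop runs serially):

        program par_loop
          implicit none
          integer, parameter :: n = 1000000
          double precision :: a(n), b(n)
          integer :: i
          b = 1.0d0
          !$omp parallel do
          do i = 1, n
             a(i) = 3.0d0*b(i) + 2.0d0   ! no cross-iteration dependence, safe to split across CPUs
          end do
          !$omp end parallel do
          print *, a(1), a(n)
        end program par_loop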
  • Strangely? (Score:5, Insightful)

    by fahrbot-bot ( 874524 ) on Friday May 09, 2014 @08:56PM (#46963905)

    When you go to any place where 'cutting edge' scientific research is going on, strangely the computer language of choice is FORTRAN, the first computer language commonly used, invented in the 1950s.

    Perhaps it's still the best tool for the job. Why is that strange? Old(er) doesn't necessarily mean obsolete -- and new(er) doesn't necessarily mean better.

  • Re:Wrong question (Score:5, Insightful)

    by jythie ( 914043 ) on Friday May 09, 2014 @09:00PM (#46963921)
    That is what tends to bother me about these 'wow, people are not using what we in another field are using!' type questions. FORTRAN does its job well, has libraries relevant to what people are using it for, and experience in it is common within that community. Why shouldn't they use it?
  • by smittyoneeach ( 243267 ) * on Friday May 09, 2014 @09:04PM (#46963947) Homepage Journal
    After all, it was "For Tran".
  • by smoothnorman ( 1670542 ) on Friday May 09, 2014 @09:04PM (#46963951)
    mod the above up please (i'm fresh out of mod points), because that's it in a nutshell. Fortran was designed for science/engineering work. And here's something that a majority of computer-science mavens never seem to grasp: in academia, at least, the use of a program is often relatively ad hoc, and for the life of the publication. They need lots of numerical work done by easily referenced libraries, then handed off to their (poor) post-docs/grad-students to study for their own one-off programming purposes. That is, the next vital program will have little to do with the previous one except for those same well-referenced, peer-reviewed numerical libraries. Does that sound like a perfect use (model) of Clojure or Haskell to you? (yes, yes, you in the back, i know you brush your teeth with monads, but you're the rare case). Haskell and friends force you to think a lot up front for gains at the rear end, but with much of academic programming there's no rear end.
  • by Karmashock ( 2415832 ) on Friday May 09, 2014 @09:06PM (#46963963)

    If the language accomplishes the task efficiently and effectively, with no apparent downside, then why attempt to switch languages simply for the sake of switching?

    Furthermore, the ability to run legacy code should be maintained, especially in science, where being able to use that code again after many years might save scientists from having to reverse-engineer past discoveries.

  • by the eric conspiracy ( 20178 ) on Friday May 09, 2014 @09:09PM (#46963979)

    Legacy code that has been carefully checked to give correct results under a wide range of conditions.

  • In other words... (Score:5, Insightful)

    by 93 Escort Wagon ( 326346 ) on Friday May 09, 2014 @09:11PM (#46963985)

    If it ain't broke - don't fix it.

  • by TapeCutter ( 624760 ) on Friday May 09, 2014 @09:18PM (#46964027) Journal
    Prospectors did not stop using shovels when bulldozers were invented. FORTRAN is the scientist's shovel; visualization software is the bulldozer.

    A: Legacy code.

    AKA battle-hardened libraries that work as advertised.

  • by 93 Escort Wagon ( 326346 ) on Friday May 09, 2014 @09:19PM (#46964033)

    Large-scale models handling huge arrays, though -- like climate or weather modeling -- that's where Fortran has always been king of the roost.

    The whole point is speed. No one's working in Python if they're interested in speed.

  • Re:Wrong question (Score:5, Insightful)

    by Rhys ( 96510 ) on Friday May 09, 2014 @09:30PM (#46964091)

    There's actually significant upside.

    Ever debugged a memory error in C? Ever done it when it is timing-dependent? How about on 1024 nodes at once? Good luck opening that many gdb windows.

    I TA'd the parallel programming class. I told the students (largely engineering and science majors, not CS) -- use Fortran. Lack of pointers is actually a feature here.

  • Re:Strangely? (Score:5, Insightful)

    by Dutch Gun ( 899105 ) on Friday May 09, 2014 @09:31PM (#46964093)

    Agreed. My thought on reading the summary was "Do older languages have some sort of expiration date I don't know about?" What's odd about it? Also, it's not like the language has been stagnant. English is an old "legacy" human language with lots of cruft and inconsistent rules, but it works well enough for us that it's not worth jumping ship for Esperanto.

    A large part of it is probably the simple inertia of legacy, in code, systems, and personnel. However, legacy systems do tend to be replaced eventually if a demonstrably superior product can improve performance in some way. Any significant change, even one for the better, causes pain and friction, so the change typically has to be worth the pain involved. Obviously, in the eyes of many science-focused projects, it hasn't been worth switching to a new language. There's also value in having a body of work in an older, very well understood and documented language, as it means new team members are much more likely to already be proficient with it than with a newer, less popular language.

    I can also understand not wanting to switch to some "flavor of the month" language when you're not sure how long it will be actively supported. FORTRAN has credibility simply based on its incredible longevity. No, it's not new and sexy, but you can bet it will probably be around for another half-century.

  • by Bing Tsher E ( 943915 ) on Friday May 09, 2014 @09:32PM (#46964097) Journal

    Precision is important in scientific discourse. Latin isn't a language with creeping grammar and jargon. It's sorta what Esperanto wishes it could be.

  • by rubycodez ( 864176 ) on Friday May 09, 2014 @09:37PM (#46964117)

    You left out the massive gigabytes of well-tested and respected numeric libraries for all the major fields of science and engineering (that are free to use, too)... oh, and much of that is written in F77, the most optimizable language for numeric computation on planet Earth. That's why supercomputer companies always sell Fortran compilers.

  • by Animats ( 122034 ) on Friday May 09, 2014 @09:56PM (#46964179) Homepage

    A big problem is that C and C++ don't have real multidimensional arrays. There are arrays of arrays, and fixed-sized multidimensional arrays, but not general multidimensional arrays.

    FORTRAN was designed from the beginning to support multidimensional arrays efficiently. They can be declared, passed to subroutines, and iterated over efficiently along any axis. The compilers know a lot about the properties of arrays, allowing efficient vectorization, parallelization, and subscript optimization. (A short sketch follows this comment.)

    C people do not get this. There have been a few attempts to bolt multidimensional arrays onto C as parameters or local variables (mostly C99's variable-length arrays), but they were incompatible with C++, Microsoft refused to implement them, and they were made optional in the latest revision of C.

    Go isn't any better. I spent some time trying to convince the Go crowd to support multidimensional arrays properly, but the idea got talked to death and lost under a pile of little-used nice features.
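
    For illustration, a minimal sketch (names invented here) of an assumed-shape 2-D array passed to a subroutine -- the compiler still knows the rank and the extent along each axis, which is exactly what C's pointer-to-pointer idioms throw away:

        module array_demo
          implicit none
        contains
          subroutine scale_rows(a, s)
            ! Assumed-shape dummy arguments: rank and extents travel with the array.
            double precision, intent(inout) :: a(:,:)
            double precision, intent(in)    :: s(:)
            integer :: i
            do i = 1, size(a, 1)
               a(i, :) = a(i, :) * s(i)   ! whole-row operation, no pointer arithmetic
            end do
          end subroutine scale_rows
        end module array_demo

        program main
          use array_demo
          implicit none
          double precision :: m(3, 4)
          m = 1.0d0
          call scale_rows(m, [2.0d0, 3.0d0, 4.0d0])
          print *, m(3, :)   ! every element of row 3 scaled by 4.0
        end program main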

  • Re:We're Not (Score:5, Insightful)

    by friedmud ( 512466 ) on Friday May 09, 2014 @10:02PM (#46964201)

    Firstly... 10^-15 is WAY beyond what most scientific codes care about. Most nonlinear finite-element codes shoot for convergence tolerances between 1e-5 and 1e-8. Most of the problems are just too hard (read: incredibly nonlinear) to solve to anything beyond that. Further, 1e-8 is generally WAY beyond the physical engineering parameters for the problem. Beyond that level we either can't measure the inputs, have uncertainty about material properties, can't perfectly represent the geometry, have discretization error, etc., etc. Who cares if you can reproduce the exact same numbers down to 1e-15 when your inputs have uncertainty above 1e-3?? (A sketch of such a tolerance check follows this comment.)

    Secondly... lots of the best computational scientists in the world would disagree:

    http://www.openfoam.org/docs/u... [openfoam.org]
    http://libmesh.sourceforge.net... [sourceforge.net]
    http://www.dealii.org/ [dealii.org]
    http://eigen.tuxfamily.org/ind... [tuxfamily.org]
    http://trilinos.sandia.gov/ [sandia.gov]

    I could go on... but you're just VERY wrong... and there's no reason to spend more time on you...
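
    To make the tolerance point concrete, a hypothetical scalar Newton iteration (not taken from any of the codes linked above) with the kind of relative convergence test such solvers use:

        program newton_tol
          implicit none
          double precision :: x, f, df, r0, tol
          integer :: it
          tol = 1.0d-8                    ! typical nonlinear FE tolerance, far above 1e-15
          x  = 2.0d0                      ! initial guess for f(x) = x**2 - 3
          r0 = abs(x*x - 3.0d0)           ! initial residual
          do it = 1, 50
             f  = x*x - 3.0d0
             df = 2.0d0*x
             if (abs(f) <= tol*r0) exit   ! converged relative to the initial residual
             x = x - f/df                 ! Newton update
          end do
          print *, 'root ~', x
        end program newton_tol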

  • by Darinbob ( 1142669 ) on Friday May 09, 2014 @10:07PM (#46964227)

    Also "legacy training". Student learns from prof. Student becomes prof. Cycle repeats.

    Also, Fortran didn't stagnate in the '60s; it's been evolving over time.

    Other languages are highly optimizable too. However, most of the new and "cool" languages I've seen in the last ten years are basic scripting languages, great for the web or IT work but awful for doing lots of work in a short period of time. It's no mystery why Fortran, C/C++, and Ada are still surviving in areas where no just-in-time wannabe will flourish.

  • by Cramer ( 69040 ) on Friday May 09, 2014 @10:11PM (#46964235) Homepage

    They both generate machine code, but they get there in different ways and produce very different output. It would be more correct to say FORTRAN compilers blow away any C compiler (esp. gcc).

  • by AchilleTalon ( 540925 ) on Friday May 09, 2014 @10:20PM (#46964283) Homepage
    Fortran is a scientific programming language. You are an engineer; it seems clear enough why you are not using Fortran. Any explanation needed?
  • by Anonymous Coward on Friday May 09, 2014 @10:46PM (#46964407)

    People using existing Fortran code are interested in the RESULTS of the computation, not whether the code is modern or has the latest bells and whistles. Programmers forget that the ultimate goal is for someone to USE the program. I wrote a program in CDC Fortran 77 in 1978 that's still being used. Why? Because it does the job.

  • GNU killed Fortran (Score:5, Insightful)

    by goombah99 ( 560566 ) on Friday May 09, 2014 @11:07PM (#46964503)

    For years and years and years the GNU G95 compiler was only a partial implementation of the language. This made it impossible to use without buying a compiler from Intel or Absoft or some other vendor. That choked the life out of it for casual use.

    Personally I really like a combination of F77 and Python. What's cool about it is that F77 compiles so damn fast that you can have Python spit out optimized F77 for your specific case sizes. Then for the human interface, dynamic memory allocation, and glue to other libraries, you can use Python.
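
    A sketch of the kind of kernel that workflow might emit (hypothetical; the Python templating side is omitted): the case size, 64 here, gets substituted into the F77 source at generation time, so the compiler can optimize for the exact size. Compile with something like gfortran -O2 -std=legacy kernel.f:

        C     Hypothetical generated kernel: the size 64 was baked in
        C     by the Python script that wrote this file.
              SUBROUTINE DAXPY64(ALPHA, X, Y)
              DOUBLE PRECISION ALPHA, X(64), Y(64)
              INTEGER I
              DO 10 I = 1, 64
                 Y(I) = Y(I) + ALPHA*X(I)
           10 CONTINUE
              RETURN
              END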

  • by David_Hart ( 1184661 ) on Friday May 09, 2014 @11:14PM (#46964541)

    I would also hazard a guess that Fortran tends to be a tad easier to read than C... Especially for scientists...

  • by poodlediagram ( 1944244 ) on Saturday May 10, 2014 @01:23AM (#46964951)
    My previous supervisor decided to fork our Fortran code for performing quantum mechanical calculations. We'd worked on it for more than half a decade and it was world-class.

    He handed it over to a computer science graduate (i.e. a non-physicist) who really liked all the modern trends in CS. Now, five years later:

    1. the tarball is an order of magnitude larger
    2. the input files are now all impenetrable .xml
    3. the code requires access to the outside (not possible on many superclusters)
    4. he re-indented everything for no apparent reason
    5. the variable names were changed, combined into derived types, and made much longer
    6. as a result, the code is basically unreadable and nearly impossible to compare to the original formulae
    7. code is duplicated all over the place
    8. it now depends on unnecessary libraries (like the ones required to parse .xml), and it only compiles after a lot of work
    9. it's about four times slower and crashes randomly
    10. it generates wrong results in basic cases

    To quote Linus Torvalds: "I've come to the conclusion that any programmer that would prefer the project to be in C++ over C is likely a programmer that I really *would* prefer to piss off, so that he doesn't come and screw up any project I'm involved with." ... and I feel the same way about CS graduates and Fortran. They have no idea about the physics or maths involved (which is the difficult part), so they do the only thing they know, which is to 'modernize' everything, making it into an incomprehensible, ungodly mess.

    Fortran, apart from being a brilliant language for numerical math, has the added benefit of keeping CS graduates at bay. I'd rather have a physicist who can't program, than a CS type who can.

    (Apologies to any mathematically competent computer scientists out there)
  • by aepervius ( 535155 ) on Saturday May 10, 2014 @04:06AM (#46965313)
    And even the "pro" make noob error from time to time out of various reason. I could list them, but let us say that even expert are not perfect programming turing complete automaton. They are human. PLus more often than not they suffer from the NIH syndrom, and from the "it must obey my standard rule" syndrom making them rewrite code or change indent variable name etc...

    The main reason stuff stays in Fortran is the general best practice of not messing with working, shipped code. If the code needs regular work, for goodness' sake use a maintainable language. But lots of Fortran code has been stable for decades, and only a madman would go changing it.

    No. The main reason we program in Fortran is that the libraries are known, have known error bars, known behavior, and are "provable". We *DO* reprogram every time we come up against a new problem which needs to be translated. Chances are there is no standard code for what you want to simulate for your own specific problem. There are some rare cases, like QM programs (Gaussian, Molpro, etc.) or some engineering programs, but those are the exception, not the rule.

  • by fractoid ( 1076465 ) on Saturday May 10, 2014 @07:01AM (#46965647) Homepage

    He handed it over to a computer science graduate (i.e. a non-physicist) who really liked all the modern trends in CS.

    Why was a graduate fresh out of university put in charge of architecture decisions? You wouldn't put an apprentice in charge of a mechanical workshop and expect them to keep it tidy and efficient; this is no different.

    In my experience it generally takes 5-10 years of commercial experience before someone is capable of making wise architecture choices about small standalone apps, and 15+ before they'll have a hope in hell of doing anything non-destructive with a large legacy application.

  • by petrus4 ( 213815 ) on Sunday May 11, 2014 @02:05PM (#46973757) Homepage Journal

    And that was a shame, because many new generations of scientific programmers did not get exposed to new languages with new expressive power (such as OO) that could solve new problems.

    I've only ever seen two groups of people who advocated OO as some sort of inherent virtue in itself.

    a} Psychopathic, buzzword-obsessed, clueless IT managers.

    b} Elitist, equally clueless programmers, who mainly advocate OO and related languages (such as C++) because they enjoy ego-tripping about the fact that they can write code that nobody else is able to read, rather than actually getting real work done.

    The main argument that both groups use to advocate OO is the appeal-to-modernity fallacy [wikipedia.org], i.e., the idea that "modernity" is an inherent virtue, purely for its own sake.

"May your future be limited only by your dreams." -- Christa McAuliffe

Working...