
Why Scientists Are Still Using FORTRAN in 2014

New submitter InfoJunkie777 (1435969) writes "When you go to any place where 'cutting edge' scientific research is going on, strangely the computer language of choice is FORTRAN, the first commonly used computer language, invented in the 1950s. Its name means FORmula TRANslation, and no language since has been able to match its speed. But three new contenders are explored here. Your thoughts?"
  • by dtmos ( 447842 ) * on Friday May 09, 2014 @07:46PM (#46963867)

    A: Legacy code.

    • by Anonymous Coward on Friday May 09, 2014 @07:58PM (#46963911)
      No, not just "legacy code." Fortran (yes, that's how it's spelt now, not "FORTRAN") was designed to be highly optimizable. Because of the way Fortran handles such things as aliasing, it's compilers can optimize expressions a lot better than other languages.
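
      For instance, in a routine like the following, Fortran's rules let the compiler assume the array arguments don't overlap, so it can vectorize without runtime alias checks; the equivalent C needs "restrict" to grant the same license. (A minimal sketch; the names are mine, not from the parent post.)

          subroutine axpy(n, a, x, y)
            ! Dummy arguments may not alias in Fortran, so this loop
            ! can be vectorized without checking whether x overlaps y.
            integer, intent(in)    :: n
            real,    intent(in)    :: a, x(n)
            real,    intent(inout) :: y(n)
            integer :: i
            do i = 1, n
               y(i) = y(i) + a * x(i)
            end do
          end subroutine axpy
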
      • by K. S. Kyosuke ( 729550 ) on Friday May 09, 2014 @09:04PM (#46964217)
        APL-style languages should be even more optimizable, since they use higher-order array operators that make the control flow and data flow highly explicit without the need to recover information from loopy code using auto-vectorizers, and easily yield parallel code. By this logic, in our era of cheap vector/GPU hardware, APL-family languages should be even more popular than Fortran!
        • by mbone ( 558574 ) on Friday May 09, 2014 @09:18PM (#46964273)

          Yeah, I used to hear that argument a lot in 1978...

          • by K. S. Kyosuke ( 729550 ) on Friday May 09, 2014 @09:37PM (#46964371)
            Well, we live in a somewhat different world today, given that suitable HW for that is virtually everywhere. But just to be clear, I'm not suggesting anyone should adopt APL's "syntax". It's more about the array language design principles. Syntax-wise, I'd personally like something along the lines of Nile [githubusercontent.com], with math operators where suitable, and with some type inference and general "in-language intelligence" thrown into the mix to make it concise. I realize that depriving people of their beloved imperative loops might seem cruel, but designing the language in a way that makes obvious coding styles easy to execute on vector machines seems a bit saner to me than letting people write random loops and then either hoping the vectorizer will sort it out (vectorizers are still very finicky about their input) or providing examples of what they should and shouldn't write if they want it to run fast.
        • APL-family languages should be even more popular than Fortran!

          It probably would be, if it weren't a write-only language.

        • by stenvar ( 2789879 ) on Saturday May 10, 2014 @02:29AM (#46965217)

          Fortran has been an "APL-style language" since Fortran 95, with most of the APL operations present. That was done both for optimization and for convenience. And other APL-style languages are very popular as well, most notably MATLAB.
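
          To make that concrete, the whole-array operations standard since Fortran 90/95 look like this (an illustrative sketch, not from the parent post):

              program array_ops
                ! Elementwise expressions, masked assignment, and
                ! reductions replace explicit loops, APL-style.
                real :: a(1000), b(1000)
                call random_number(a)
                b = 2.0*a + 1.0                ! whole-array expression
                where (b > 2.0) b = 0.0        ! masked assignment
                print *, sum(b), maxval(b), count(b > 0.5)
              end program array_ops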

      • by Darinbob ( 1142669 ) on Friday May 09, 2014 @09:07PM (#46964227)

        Also "legacy training". Student learns from prof. Student becomes prof. Cycle repeats.

        Also, Fortran didn't stagnate in the '60s; it's been evolving over time.

        Other languages are highly optimizable too. However, most of the new and "cool" languages I've seen in the last ten years are all basic scripting languages, great for the web or IT work but awful for doing lots of work in a short period of time. It's no mystery why Fortran, C/C++, and Ada are still surviving in areas where no just-in-time wannabe will flourish.

        • Legacy Programmers (Score:5, Interesting)

          by Roger W Moore ( 538166 ) on Friday May 09, 2014 @11:01PM (#46964717) Journal

          Also "legacy training". Student learns from prof. Student becomes prof. Cycle repeats.

          Not really - even when I was a student we ditched F77 whenever we possibly could and used C or C++. The issue is more legacy programmers. Often the person in charge of a project is an older person who knows FORTRAN and does not want to spend the time to learn a new language like C (or even C++!). Hence they fall back on something more comfortable.

          However, by now even this is not the case. The software in particle physics is almost exclusively C++ and/or Python. The only things that I am aware of which are still FORTRAN are some Monte Carlo event generators written by theorists. My guess is that, as experimentalists, even older colleagues have to learn C++ and Python to use and program modern hardware. Theorists can get by using any language they want and so are slower to change. Certainly it has been at least 15 years since I wrote any FORTRAN myself, and even then what I wrote was the code needed to test the F77 interface to a rapid C I/O framework for events, which was ~1-200 times faster than the F77 code it replaced.

      • by Anonymous Coward on Friday May 09, 2014 @09:46PM (#46964407)

        People using existing Fortran code are interested in the RESULTS of the computation, not whether the code is modern or has the latest bells and whistles. Programmers forget that the ultimate goal is for someone to USE the program. I wrote a program in CDC Fortran 77 in 1978 that's still being used. Why? Because it does the job.

      • by poodlediagram ( 1944244 ) on Saturday May 10, 2014 @12:23AM (#46964951)
        My previous supervisor decided to fork our Fortran code for performing quantum mechanical calculations. We'd worked on it for more than half a decade and it was world-class.

        He handed it over to a computer science graduate (i.e. a non-physicist) who really liked all the modern trends in CS. Now, five years later:

        1. the tarball is an order of magnitude larger
        2. the input files are now all impenetrable .xml
        3. the code requires access to the outside (not possible on many superclusters)
        4. he re-indented everything for no apparent reason
        5. the variable names were changed, made into combined types and are much longer
        6. as a result, the code is basically unreadable and nearly impossible to compare to the original formulae
        7. code is duplicated all over the place
        8. it now depends on unnecessary libraries (like the ones required to parse .xml), and it only compiles after a lot of work
        9. it's about four times slower and crashes randomly
        10. it generates wrong results in basic cases

        To quote Linus Torvalds: "I've come to the conclusion that any programmer that would prefer the project to be in C++ over C is likely a programmer that I really *would* prefer to piss off, so that he doesn't come and screw up any project I'm involved with." ... and I feel the same way about CS graduates and Fortran. They have no idea about the physics or maths involved (which is the difficult part), so they do the only thing they know, which is to 'modernize' everything, making it into an incomprehensible, ungodly mess.

        Fortran, apart from being a brilliant language for numerical math, has the added benefit of keeping CS graduates at bay. I'd rather have a physicist who can't program, than a CS type who can.

        (Apologies to any mathematically competent computer scientists out there)
        • by lgw ( 121541 )

          None of that is good programming, is the thing.

          I really wish people would stop blaming the tools when the problem is people who are tools. Maybe that's endemic to "CS types"? But those of us who code for a living in the real world recognize what you describe as a noob stunt, not a language problem.

          The main reason stuff stays in fortran is the general best practice of not messing with working shipped code. If the code needs regular work, for goodness sake use a maintainable language. But lots of fortran code has been stable for decades, and only a madman would go changing it.

          • by aepervius ( 535155 ) on Saturday May 10, 2014 @03:06AM (#46965313)
            And even the "pro" make noob error from time to time out of various reason. I could list them, but let us say that even expert are not perfect programming turing complete automaton. They are human. PLus more often than not they suffer from the NIH syndrom, and from the "it must obey my standard rule" syndrom making them rewrite code or change indent variable name etc...

            The main reason stuff stays in fortran is the general best practice of not messing with working shipped code. If the code needs regular work, for goodness sake use a maintainable language. But lots of fortran code has been stable for decades, and only a madman would go changing it.

            No. The main reason we program in Fortran is that the libraries are known, have known error bars, known behavior, and are "provable". We *DO* reprogram every time we come to a new problem which needs to be translated. Chances are there is no standard code for what you want to simulate for your own specific problem. There are some rare cases, like QM programs (Gaussian, Molpro, etc.) or some engineering programs, but those are the exception, not the rule.

        • by fractoid ( 1076465 ) on Saturday May 10, 2014 @06:01AM (#46965647) Homepage

          He handed it over to a computer science graduate (i.e. a non-physicist) who really liked all the modern trends in CS.

          Why was a graduate fresh out of university put in charge of architecture decisions? You wouldn't put an apprentice in charge of a mechanical workshop and expect them to keep it tidy and efficient; this is no different.

          It's my general experience that it takes 5-10 years of commercial experience before someone is capable of making wise architecture choices about small standalone apps, and 15+ before they'll have a hope in hell of doing anything non-destructive with a large legacy application.

        • by Xyrus ( 755017 )

          Your problem began when you turned the code over to an inexperienced CS graduate. You don't need to be a good software engineer or even a good programmer to get a degree in computer science. I wish people would stop conflating the two, especially the people in the HR department. :P

          You needed a SOFTWARE ENGINEER. Worse, you needed an experienced software engineer familiar with the domain. Instead you handed it over to a fresh graduate who maybe had one or two courses on engineering. What exactly did you expect?

    • by PPH ( 736903 ) on Friday May 09, 2014 @08:03PM (#46963939)
      At least Slashdot seems to encourage re-use of commonly used responses when a question is asked.
    • by Nemyst ( 1383049 ) on Friday May 09, 2014 @08:05PM (#46963955) Homepage
      This. I have many friends in the physics dept and the reason they're doing Fortran at all is that they're basing their own stuff off of existing Fortran stuff.

      What amused me about the article was actually the Fortran versions it spoke about. F95? F03? F08? Let's be real: just about every Fortran codebase I've heard of is still limited to F77 (with some F90 if you're lucky). It just won't work on later versions, and it's deemed not worth porting over, so the entire codebase is stuck on almost-40-year-old code.
      • by Brett Buck ( 811747 ) on Friday May 09, 2014 @09:12PM (#46964237)

        F77+extensions, usually DEC extensions. Very very few people ever used strict F77 with no extensions.

        Some of the issues this causes are irritating, bordering on unnerving. This week we discovered that g77 doesn't care for treating INTEGER as LOGICAL. It used to be that there was no other way to specify bit operations; now it is precluded. Everybody's code has that, and there's really nothing intrinsically wrong or difficult to understand about it, but it was technically non-standard (although everyone's extensions permitted it) and it won't work on g77 - maybe only with the infamous -fugly flag.
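
        For the curious, a minimal sketch of the old idiom and its standard replacement (illustrative code, not the poster's actual program):

            C     Old DEC-extension style treated INTEGER as LOGICAL:
            C        IF (FLAGS .AND. MASK) GOTO 10
            C     g77 rejects that; the standard route is the MIL-STD-1753
            C     bit intrinsics, later adopted into Fortran 90:
                  PROGRAM BITS
                  INTEGER FLAGS, MASK
                  LOGICAL SET
                  FLAGS = 12
                  MASK = 4
                  SET = IAND(FLAGS, MASK) .NE. 0
                  FLAGS = IOR(FLAGS, MASK)
                  PRINT *, SET, FLAGS
                  END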


    • Re: (Score:2, Interesting)

      by Anonymous Coward

      Seconded. And the legacy isn't necessarily just the source code. Many of the engineering industries using such codes have a relatively low turnover rate, meaning an older group of engineers and researchers with the most experience stick around for decades. Most of these folks have used Fortran since college. It works for them, and they aren't concerned with any "new-fangled" languages that offer more features. Another reason I hear from these folks is that Fortran has powerful array slicing and indexing syntax.

    • by mbkennel ( 97636 ) on Friday May 09, 2014 @08:09PM (#46963977)

      A: Legacy code, and because Fortran 2003+ is a very good modern language for scientific computation and maps very naturally to problems. As it turns out, the language semantics (both legacy and modern constructs) make it very good to parallelize. And it runs fast, as in, equalling C++ level of performance is considered a weak showing.

      If you haven't seen or used modern Fortran and think it's anything like Fortran 66/77 then you're mistaken. Except for I/O, which still tends to suck.

      In addition there are still some seemingly trivial but actually important features which make it better than many alternatives (starting from Fortran 90).

      There are some boneheaded clunkers in other languages which Fortran gets right: obviously, built-in multi-dimensional arrays, AND arrays whose indices can start at 0, 1 (or any other value) and of course know their size. Some algorithms are written (on paper) with 0-based indexing and others with 1-based, and allowing either one to be expressed naturally lowers the chance of bugs.
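
      A quick sketch of the index-bounds point (my illustration, not mbkennel's code):

          program bounds
            ! Pick whatever lower bound matches the paper's notation;
            ! the array always knows its own extent.
            real :: f(0:127)         ! 0-based, as in FFT pseudocode
            real :: t(1:12, -3:3)    ! halo cells at negative indices
            f = 0.0
            t = 1.0
            print *, lbound(t), ubound(t), size(f)
          end program bounds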

      Another one is that Fortran distinguishes between dynamically allocatable storage and pointers/references. The history of C has constrained/brain-damaged people to think that to get the first, you must necessarily take the second. That doesn't happen in Fortran: you have ALLOCATABLE arrays (or other things) for run-time allocation of storage, and if you need a pointer (rarer) you can get that too. And Fortran provides the TARGET attribute to indicate that something *may be pointed to/referenced*; by default this is not allowed. No making pointers/references to things which aren't designed to be referred to multiple times. This also means that the aliasing potential is highly controlled, and the language semantics are constructed so that Fortran can make very aggressive, and safe, optimization assumptions.
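
      And a minimal sketch of the ALLOCATABLE/POINTER/TARGET distinction (again my example, not from the comment):

          program alloc_demo
            ! Run-time sizing without pointers: ALLOCATABLE cannot alias.
            real, allocatable :: work(:)
            ! Pointers are opt-in and may only associate with TARGETs,
            ! so any aliasing stays visible to the compiler.
            real, target  :: grid(100)
            real, pointer :: view(:)
            allocate(work(10000))
            work = 0.0
            grid = 0.0
            view => grid(1:50)
            view = 1.0
            print *, sum(work), sum(grid)
          end program alloc_demo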

      The more parallel you want to go, the more of these assumptions you need to get fast code, and naturally written Fortran code comes this way out of the box more than in most other languages.
      • by mysidia ( 191772 )

        A: Legacy code, and because Fortran 2003+ is a very good modern language for scientific computation and maps very naturally to problems

        See.... Fortran 2003 is more modern than ISO C99.... Now that that's settled... how come people are still programming in languages like C/C++/Java when Fortran 2003 is available?

    • by the eric conspiracy ( 20178 ) on Friday May 09, 2014 @08:09PM (#46963979)

      Legacy code that has been carefully checked to give correct results under a wide range of conditions.

    • by TapeCutter ( 624760 ) on Friday May 09, 2014 @08:18PM (#46964027) Journal
      Prospectors did not stop using shovels when bulldozers were invented. FORTRAN is the scientist's shovel, visualization software is the bulldozer.

      A: Legacy code.

      AKA battle hardened libraries that work as advertised.

    • by rubycodez ( 864176 ) on Friday May 09, 2014 @08:28PM (#46964079)

      No, it's used because Fortran is the high-level language that produces the fastest code for numeric computation; it is by far the most optimizable. Yes, it blows away C.

    • A: Legacy code.

      Most people who learned FORTRAN did so in a class intended to teach FORTRAN.

      Most people these days aren't learning FORTRAN, they're learning other languages, and they're doing so in a context that teaches them about applying the tool, rather than making them really, really good with the tool itself.

      To use an analogy, it's the difference between learning how to use all the tools in the wood shop before being sent in to make a chair (learn the tools) vs. being taught all about "chairness", and then being sent in to make one.

  • by Balial ( 39889 ) on Friday May 09, 2014 @07:53PM (#46963893) Homepage

    Scientists work in formulas. Fortran was designed to do things naturally that don't fit into C/C++, Python, whatever.

    • by smoothnorman ( 1670542 ) on Friday May 09, 2014 @08:04PM (#46963951)
      Mod the above up please (I'm fresh out of mod points), because that's it in a nutshell. Fortran was designed for science/engineering work. And here's something that a majority of computer-science mavens never seem to grasp: in academia, at least, the use of a program is often relatively ad hoc, and for the life of the publication. They need lots of numerical work done by easily referenced libraries, then handed off to their (poor) post-docs/grad-students to study for their own one-off programming purposes. That is, the next vital program will have little to do with the previous one except for those same well-referenced, peer-reviewed, linked-to numerical libraries. Does that sound like a perfect use (model) of Clojure or Haskell to you? (Yes, yes, you in the back, I know you brush your teeth with monads, but you're the rare case.) Haskell and friends force you to think a lot up front for gains at the rear end, but with much of academic programming there's no rear end.
• Formulas fit "naturally" into modern "languages" such as Mathematica, MATLAB, etc.

      Of course hindsight is always of the 20/20 variety: if we could find a DeLorean, we could time-travel back to the early '50s and present the good doctor [wikipedia.org] with a technical demo of CUDA running on a modern $150 video card. Do we think the boffins of today would be using C++ CUDA kernels, or FORTRAN?

      Aside from the hindsight fantasy, what's wrong with using FORTRAN in the way it was intended to be used? Old certainly doesn't mean obsolete.
      • by chthon ( 580889 )

        Please choose your examples well.

        Although the son of a well-to-do farmer, Newton was not nobility and could only study because his talent was recognized. Gauss was the son of really poor people.

  • Wrong question (Score:5, Insightful)

    by Brett Buck ( 811747 ) on Friday May 09, 2014 @07:54PM (#46963895)

    Why not?

          Actually, that is a serious question; for these sorts of applications there seems to be no significant downside.

    • Re:Wrong question (Score:5, Insightful)

      by jythie ( 914043 ) on Friday May 09, 2014 @08:00PM (#46963921)
      That is what tends to bother me about these "wow, people are not using what we in another field are using!" type questions. FORTRAN does its job well, has libraries relevant to what people use it for, and experience with it is common within that community. Why shouldn't they use it?
    • Computer languages turned out to be one of those things that seem very deep and significant, but actually aren't. FORTRAN and Lisp (and BASIC and C, only somewhat later) made programmers about as productive, within a reasonably small constant factor, as anything since. (And before you hit me with "Citation Needed," remember it cuts both ways!)
      • FORTRAN and Lisp (and BASIC and C, only somewhat later) made programmers about as productive, within a reasonably small constant factor, as anything since.

        I recently switched from C to Python, and my productivity shot up ~100 times. For example, I created a tree of hashes to perform pattern matching on large data sets in linear time. It took me 2 hours from concept to production run. Also factor in a huge library that does everything needed on this planet, ease of maintaining 100x fewer lines of code and 0 memory management hurdles.

        The downside is that it runs 100 times slower than C, but since it is the programmer's productivity you are talking about, you are very wrong.

        • by mbone ( 558574 )

          The downside is that it runs 100 times slower than C, but since it is the programmer's productivity you are talking about, you are very wrong.

          You do realize that CPU limited problems are not uncommon in physics and engineering?

    • Re:Wrong question (Score:5, Insightful)

      by Rhys ( 96510 ) on Friday May 09, 2014 @08:30PM (#46964091)

      There's actually significant upside.

      Ever debugged a memory error in C? Ever done it when it is timing dependent? How about on 1024 nodes at once? Good luck opening that many gdb windows.

      I TA'd the parallel programming class. I told the students (largely engineering and science majors, not CS): use Fortran. Lack of pointers is actually a feature here.

  • Why not? (Score:4, Insightful)

    by grub ( 11606 ) <slashdot@grub.net> on Friday May 09, 2014 @07:56PM (#46963899) Homepage Journal

    At work in the recent past (2000s) we were still supporting FORTRAN on the SGI machines we had running. The SGI compilers would optimize the hell out of the code and get it all parallelized, ready to eat up all the CPUs.

    Newer isn't always better.
  • Comment removed (Score:5, Informative)

    by account_deleted ( 4530225 ) on Friday May 09, 2014 @07:56PM (#46963903)
    Comment removed based on user account deletion
    • by rubycodez ( 864176 ) on Friday May 09, 2014 @08:37PM (#46964117)

      You left out the massive gigabytes of well-tested and respected numeric libraries for all the major fields of science and engineering (that are free to use, too)... oh, and much of that is written in F77, the most optimizable language for numeric computation on planet Earth. That's why supercomputer companies always sell Fortran compilers.

    • Gnu killed fortran (Score:5, Insightful)

      by goombah99 ( 560566 ) on Friday May 09, 2014 @10:07PM (#46964503)

      For years and years and years the Gnu G95 compiler was only a partial implementation of the language. This made it impossible to use without buying a compiler from Intel or Absoft or some other vendor. That choked the life out of it for casual use.

      Personally, I really like a combination of F77 and Python. What's cool about it is that F77 compiles so damn fast that you can have Python spit out optimized F77 for your specific case sizes. Then, for the human interface, dynamic memory allocation, and glue to other libraries, you can use Python.

  • Strangely? (Score:5, Insightful)

    by fahrbot-bot ( 874524 ) on Friday May 09, 2014 @07:56PM (#46963905)

    When you go to any place where 'cutting edge' scientific research is going on, strangely the computer language of choice is FORTRAN, the first commonly used computer language, invented in the 1950s.

    Perhaps it's still the best tool for the job. Why is that strange? Old(er) doesn't necessarily mean obsolete -- and new(er) doesn't necessarily mean better.

    • by jythie ( 914043 )
      "tool" is the key word. Within the type of research that uses it, they want a tool for getting their actual goals done. CompSci and such tend to see languages and such as points of focus unto themselves.
    • Re:Strangely? (Score:5, Insightful)

      by Dutch Gun ( 899105 ) on Friday May 09, 2014 @08:31PM (#46964093)

      Agreed. My thought at reading the summary was "Do older languages have some sort of expiration date I don't know about?" What's odd about it? Also, it's not like the language has been stagnant. English is an old "legacy" human language with lots of cruft and inconsistent rules, but it works well enough for us that it's not worth jumping ship for Esperanto.

      A large part of it is probably the simple inertia of legacy code, systems, and personnel. However, legacy systems tend to be replaced eventually if a demonstrably superior product can improve performance in some way. Any significant change, even one for the better, causes pain and friction, so the change typically has to be worth the pain involved. Obviously, in the eyes of many science-focused projects, it hasn't been worth switching to a new language. There's also value in having a body of work in an older and very well understood and documented language, as it means new team members are much more likely to already be proficient with the language than with a newer and less popular one.

      I can also understand not wanting to switch to some "flavor of the month" language when you're not sure how long it will be actively supported. FORTRAN has credibility simply based on its incredible longevity. No, it's not new and sexy, but you can bet it will probably be around for another half-century.

  • by Rostin ( 691447 ) on Friday May 09, 2014 @08:00PM (#46963923)
    I have a PhD in engineering, and my dissertation involved writing lots of code. Now I work at a national lab in the US, and I and nearly all of my coworkers work on scientific or engineering codes of some sort. Although there are significant amounts of legacy Fortran code lying around (a project I work on uses a Fortran library written in 1973), very little development is done in that language. It's all C++ or Python.
  • The biggest reason is that it helps non-computer-scientist researchers write computational codes without having to devote excessive amounts of time to memory management or deviate from the classic imperative programming model. It is also important for a purely non-technical reason: a generation of domain experts in engineering and scientific fields were trained on FORTRAN codes.

    As managers of High Performance Computing platforms, we generally take an a-religious approach and deliver
    • Good point. Sometimes we forget how hard it is to adapt to the object-oriented programming style, since it comes so naturally to us. But if all you want to do is get the computer to implement your formula, OOP doesn't give you much.
  • After all, it was "For Tran".
  • by Karmashock ( 2415832 ) on Friday May 09, 2014 @08:06PM (#46963963)

    If the language accomplishes the task efficiently and effectively with no apparent downside then why attempt to switch languages simply for the sake of switching?

    Furthermore, an ability to run legacy code should be sustained especially in science where being able to use that code again after many years might save scientists from having to reverse engineer past discoveries.

  • Key Reason (Score:5, Interesting)

    by stox ( 131684 ) on Friday May 09, 2014 @08:16PM (#46964015) Homepage

    Huge libraries of FORTRAN code have been formally proven, and new FORTRAN code can be formally proven. Due to the limitations of the language, it is possible to put the code through formal processes to prove it is correct. In addition, again as a benefit of those limitations, it is very easy to auto-parallelize FORTRAN code.
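
    As one modern illustration of the auto-parallelization point (a sketch assuming a Fortran 2008 compiler, not taken from the parent post), DO CONCURRENT asserts that iterations are independent, so the compiler may parallelize or vectorize them freely:

        program saxpy_conc
          integer, parameter :: n = 100000
          real :: x(n), y(n)
          integer :: i
          call random_number(x)
          y = 0.0
          ! The programmer asserts independence; the compiler exploits it.
          do concurrent (i = 1:n)
             y(i) = y(i) + 2.0*x(i)
          end do
          print *, y(1), y(n)
        end program saxpy_conc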

  • We're Not (Score:2, Interesting)

    by friedmud ( 512466 )

    I saw this link bait the other day...

    We're NOT using Fortran anymore...

    Many of us at the National Labs do modern, object-oriented C/C++... Like the project I'm in charge of: http://www.mooseframework.org/ [mooseframework.org]

    There are whole labs that have completely expunged Fortran in favor of C++... like Sandia (http://trilinos.sandia.gov), which went through a period in the late '90s and early 2000s where they systematically replaced all of their largest Fortran computational science codes with C++.

    Those places that don'

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      If you're using C++ for scientific math, then you deserve to have whatever credentials you may possess revoked immediately. No language should be used for scientific math that can produce different results based upon the version of library or platform it is compiled against.

      You also cannot prove C++ code is good. You just can't. C++ is not deterministic, again, because the outcome depends on platform/library versions, compiler options, time of day, alignment of the planets, and many other factors. The

      • Re:We're Not (Score:5, Insightful)

        by friedmud ( 512466 ) on Friday May 09, 2014 @09:02PM (#46964201)

        Firstly... 10^-15 is WAY beyond what most scientific codes care about. Most nonlinear finite-element codes generally shoot for convergence tolerances between 1e-5 and 1e-8. Most of the problems are just too hard (read: incredibly nonlinear) to solve to anything beyond that. Further, 1e-8 is generally WAY beyond the physical engineering parameters for the problem. Beyond that level we either can't measure the inputs, have uncertainty about material properties, can't perfectly represent the geometry, have discretization error etc., etc. Who cares if you can reproduce the exact same numbers down to 1e-15 when your inputs have uncertainty above 1e-3??

        Secondly... lots of the best computational scientists in the world would disagree:

        http://www.openfoam.org/docs/u... [openfoam.org]
        http://libmesh.sourceforge.net... [sourceforge.net]
        http://www.dealii.org/ [dealii.org]
        http://eigen.tuxfamily.org/ind... [tuxfamily.org]
        http://trilinos.sandia.gov/ [sandia.gov]

        I could go on... but you're just VERY wrong... and there's no reason to spend more time on you...

  • Today someone told me about how he once wasn't allowed to disturb a printer, because someone was using it to run a job doing an FFT written in PostScript. Apparently the large amount of memory available in the printer was paramount.

  • by robbiedo ( 553308 ) on Friday May 09, 2014 @08:55PM (#46964177)
    I am sticking with Visual Basic 6
  • by Animats ( 122034 ) on Friday May 09, 2014 @08:56PM (#46964179) Homepage

    A big problem is that C and C++ don't have real multidimensional arrays. There are arrays of arrays, and fixed-sized multidimensional arrays, but not general multidimensional arrays.

    FORTRAN was designed from the beginning to support multidimensional arrays efficiently. They can be declared, passed to subroutines, and iterated over efficiently along any axis. The compilers know a lot about the properties of arrays, allowing efficient vectorization, parallelization, and subscript optimization.
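
    For instance, a minimal sketch (my code, not Animats') of an assumed-shape array that carries its own bounds into a subroutine:

        module grid_mod
        contains
          subroutine scale_grid(a, s)
            ! The dummy argument knows its full shape, so callers pass
            ! any rectangular array and loops can walk either axis.
            real, intent(inout) :: a(:,:)
            real, intent(in)    :: s
            integer :: j
            do j = 1, size(a, 2)       ! column-major: stride-1 inner axis
               a(:, j) = s * a(:, j)
            end do
          end subroutine scale_grid
        end module grid_mod

        program use_grid
          use grid_mod
          real :: grid(200, 300)
          grid = 1.0
          call scale_grid(grid, 0.5)
          print *, grid(1,1), sum(grid)
        end program use_grid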

    C people do not get this. There have been a few attempts to bolt multidimensional arrays as parameters or local variables onto C (mostly in C99), but they were incompatible with C++, Microsoft refused to implement them, and they were made optional in the latest revision of C.

    Go isn't any better. I spent some time trying to convince the Go crowd to support multidimensional arrays properly, but the idea got talked to death and lost under a pile of little-used nice features.

  • Its not strange (Score:4, Informative)

    by nurb432 ( 527695 ) on Saturday May 10, 2014 @07:20AM (#46965865) Homepage Journal

    Most scientists like to use tools that work and that they are proficient in.

    FORTRAN falls under both categories.

  • FORTRAN was -- for some, still is -- the 'Perl' of scientific computing. Get it in and get it done... it doesn't always compile down very tight, but it's always fast, because for mainframe developers getting this language optimized for a new architecture was the first priority.

    At 15, the first real structured program I ever deconstructed completely, while teaching myself the language, was the FORTRAN IV source for Crowther and Woods' Colossal Cave Adventure [std.com], widely regarded as 'the' original interactive text adventure, a genre which would later go multi-user to become the MUD. Read about it here [rickadams.org], or play it in Javascript [xenoveritas.org].

    Crowther's PDP-11 version was running on the 36-bit GE-600 mainframes [wikipedia.org] of GEISCO's (General Electric Information Services) Mark III Foreground timesharing system... this was in the golden age of timesharing, and no one did it better than GE. It took HOURS at 300 bps and two rolls of thermal paper to print out the source and data files, and I laid it out on the floor and traced the program mentally, keeping a notebook of what was stored in which variable... I had far more fun doing this than playing the game itself.

    FORTRAN IV and Dartmouth BASIC (I'll toss in RPG II also) were the 'flat' GOTO-based languages, an era of explicit rather than implicit nesting -- a time in which high level functions were available to use or define but humans needed to plan and implement the actual structure in programs mentally by using conditional statements and numeric labels to JUMP over blocks of code. Sort of "assembly language with benefits".

    When real conditional nesting and completely symbolic labeling appeared on the scene, with good string handling, it was a walk in the park.

  • by biodata ( 1981610 ) on Saturday May 10, 2014 @07:29AM (#46965893)
    In the cutting-edge world of biological research, scientists are still using the ancient language Latin to name, classify, and describe species of organisms. The thing is, why would they change?
  • by gweihir ( 88907 ) on Saturday May 10, 2014 @09:20AM (#46966369)

    Yes, FORTRAN sucks, but it is stable, fast and well understood. It runs on a number of supercomputer architectures. It is way easier to program in FORTRAN than in C for non-CS people. So what is the issue? Oh, maybe that this is not a "modern" language? Here is news for you: Java sucks a lot more than FORTRAN.

  • by frank_adrian314159 ( 469671 ) on Saturday May 10, 2014 @09:26AM (#46966387) Homepage

    No one knows how to spell it. In these threads I've seen Fortran, FORTRAN, ForTran, etc. Who the hell can keep track? That's why C is winning! One letter, upper case: C. No muss, no fuss. Not to worry, PL\I had the same issue (forward or backward slash? Better go look it up!). Same with LISP, LisP, Lisp (or, these days, Scheme, Racket, Clojure) - too many letters, too many ways to misspell them. D avoided falling into this trap, but which would you rather have on a paper, a D or a C? And C++ doesn't even look like a real grade. Python? Are we naming a language or a comedy troupe? Same with Ruby - I don't need a bunch of freaking geologists telling me how to use a computer.

    Jeez, everyone, C just got it right where it counted - its name. Now can we just all agree to use that and move on?

  • by Lawrence_Bird ( 67278 ) on Saturday May 10, 2014 @10:24AM (#46966741) Homepage

    Fortran is still used because it works. Because it is fast. Because libraries are optimized and well understood. Fortran is still used because (gasp) it has evolved since FORTRAN 66 and FORTRAN IV. Maybe you and the other language nannies, always forcing the latest and greatest buzz on the rest of us, should take the time to actually read about some of the most recent [wikipedia.org] versions?
