
Comparing the Size, Speed, and Dependability of Programming Languages

In this blog post, the author plots the results of 19 different benchmark tests across 72 programming languages to create a quantitative comparison between them. The resulting visualizations give insight into how the languages perform across a variety of tasks, and also how some languages perform in relation to others. "If you drew the benchmark results on an XY chart you could name the four corners. The fast but verbose languages would cluster at the top left. Let's call them system languages. The elegantly concise but sluggish languages would cluster at the bottom right. Let's call them script languages. On the top right you would find the obsolete languages. That is, languages which have since been outclassed by newer languages, unless they offer some quirky attraction that is not captured by the data here. And finally, in the bottom left corner you would find probably nothing, since this is the space of the ideal language, the one which is at the same time fast and short and a joy to use."
This discussion has been archived. No new comments can be posted.

  • Scala (Score:3, Interesting)

    by Laz10 ( 708792 ) on Sunday May 31, 2009 @11:01AM (#28158307)
    I am surprised how they manage to get scala to perform so much worse than pure java.

    Scala compiles to pure Java .class files and uses static typing, and the claim is that the bytecodes are almost identical.

    I wonder if the benchmarks are executed in the same environment.
    http://shootout.alioth.debian.org/ has a Gentoo label behind the java benchmarks, but not the Scala one.

  • by jackb_guppy ( 204733 ) on Sunday May 31, 2009 @11:16AM (#28158405)

    Where are Cobol and RPG, the languages that run business?

  • by Timothy Brownawell ( 627747 ) <tbrownaw@prjek.net> on Sunday May 31, 2009 @11:18AM (#28158425) Homepage Journal

    I think verbosity in moderation is necessary. I have read many an article with developers arguing that they don't need to document their code when their code is self-documenting. Do you make all of your variable and class/function/method names a single character for the sake of brevity? I hope not. And I would think that reading and maintaining that code would be far less than a joy.

    Long meaningful identifiers are useful. Needing 5 lines of setup for each API call is annoying, particularly if those 5 lines are usually the same. Requiring lots of redundant long keywords to "look more like English" is annoying. Large standard libraries that let you remove most of the tedious parts from your code are useful.

  • Ocaml (Score:4, Interesting)

    by LaminatorX ( 410794 ) <sabotage@praeca n t a t o r . com> on Sunday May 31, 2009 @11:39AM (#28158575) Homepage

    Every time I see one of these things, OCaml always rocks it. I wonder why it never caught on to a greater degree?

  • by ledow ( 319597 ) on Sunday May 31, 2009 @11:52AM (#28158667) Homepage

    This kind of fits in with my thinking.

    When I was starting out in programming, I just wanted results. I wasn't concerned about performance because the computer was a million times faster than me. I was most concerned about how many "non-vital" keywords were necessary to describe what I wanted the machine to do (e.g. "void main(...)" isn't *vital* because it's just boilerplate. However "if", "for", "while" etc. would be vital - and even for/while are just cousins), and how many of the vital keywords (i.e. those that specifically interfered with the way my program would *actually* operate... a "static" here or there would hardly matter in the course of most programs) were "obvious". Java failed miserably at this... I mean, come on: System.out.println() and the standard wrapping take up too much room.

    So, BASIC was an *ideal* first language (sorry, but it was, and the reason nobody uses it much now is because EVERYONE has used it and moved on to something else - doesn't mean it "breaks" people). In this regard, even things like C aren't too bad - 30-50 keywords / operators depending on the flavour, all quite simple - you could memorise them perfectly in an afternoon. However things like Forth and Perl can be hideous.

    And even C++ is tending towards the stupid. Believe it or not, even things like bash scripting come out quite well under that test. And, to me, that correlates with the amount of effort I have to put in to write in a particular language. If I just want to automate something, bash scripting is fast and easy. Most of the stuff I write is a "one-job program" that will never be reused. If I want to write a program to work something out or show somebody how something is done programmatically, BASIC is a *perfect* prototyping language (no standard boilerplate, no guessing obscure keywords, etc.). If I want to write a program that does things fast, or accurately, or precisely, or for something else to build upon, C is perfect.

    I see no real need to learn other languages in depth past what I'm required to know for my work. I have *zero* interest in spending weeks and weeks and weeks learning YAPL (Yet Another Programming Language) just to spend 90% of that time memorising obscure keywords, boilerplate and the language's shortcuts to things like vectors, string parsing, etc. If I was going to do that, I'd just learn a C library or similar.

    I think that these graphs correlate quite well with that thinking. Let's be honest, 99% of programming is reusing other code or shortcuts - short of programming a Turing machine, C is one of the simplest languages to learn because it *doesn't* have a million shortcuts... if you want to iterate over an array or create a hash, linked list, etc., you have to do it yourself from basic elements. In modern programming, that means a one-line include of a well-written library. As far as I was concerned when learning it, even the "pointer++ advances by the size of the pointed-to type" rule was far too smarty-pants for me, but incredibly useful.

    But with C++, I instantly lost interest because it's just too damn verbose to do a simple job. Java OOP is slightly better but still nasty once things get complicated and the underlying "functional" language is basically a C-a-like.

    I'm a fuddy-duddy. Old fashioned. If I write a program, the damn computer will damn well do instruction 1 followed by instruction 2 with the minimum of flying off into libraries and class systems. If I want 4 bytes of memory to change type, then I will damn well have them change type. And I'll even get to specify *what* 4 bytes of RAM if I want, and I'll clean up after them if it's necessary. That's how I think, so things like C match perfectly when I want to code. The fact that C is damn powerful, fast, low-level and so common also adds to its appeal.

    I worry about what will happen when people *only* code in OOP languages. The abstraction is so large that people forget that they are still telling a computer to handle bits and bytes, and suddenly they get lazy.

  • by nametaken ( 610866 ) on Sunday May 31, 2009 @11:55AM (#28158695)

    It didn't seem to me like being concise or verbose was, in itself, a help or a hindrance, aside from the author's own comment. Per those graphs I could say I want something as fast as Java (how often have you heard that on /.), but a little less verbose... "oh, csharp might be worth a look".

    I found it interesting.

  • by Anonymous Coward on Sunday May 31, 2009 @12:07PM (#28158781)

    Regina (Rexx) is a pretty fast and clear language. My only issue with it is how other functionality, such as SQL and network connections, has been added: the implementation (or maybe the documentation, since there seems to be very little) doesn't seem quite as clear as using, say, PHP or Ruby. If they'd get more than a couple of developers working on the project, it could easily be transformed into something far more useful.

    That said, for 2 years I ran a website entirely off of Regina (or maybe it was ooRexx... basically the same thing) and despite the limitations of the machine it was running on, it performed faster than any other scripted site I ever visited. These days, with the heavy AJAX and Java implementations, it'd run circles around them.

  • by bcrowell ( 177657 ) on Sunday May 31, 2009 @12:11PM (#28158809) Homepage

    First off, he presents the big chart twice. The second version is meant to compare functional languages with imperative languages, but it's also small enough to fit on my screen, so if you're browsing the article, you might want to look at that one first.

    His "obsolete" sector is really more like a special-purpose sector. For instance, Erlang shows up in the obsolete sector, but that's because Erlang wasn't designed to be especially terse or fast. Erlang was designed to be fault-tolerant and automatically parallelizable. Io also ends up looking lousy, but Io also wast designed to be terse and fast; it was designed to be small and simple.

    The biggest surprise for me was the high performance of some of the implementations of functional programming languages, even in cases where the particular languages aren't generally known for being implementable in a very efficient way. Two of the best-performing languages are stalin (an implementation of scheme/lisp) and mlton (an implementation of ml). However, as the author notes, it's common to find that if you aren't sufficiently wizardly with fp techniques, you may write fp code that performs much, much worse than the optimal; that was my own experience with ocaml, for instance.

    The choice of a linear scale for performance can be a little misleading. For instance, csharp comes out looking like it's not such a great performer, and yet its performance is never worse than the best-performing language by more than a factor of 2 on any task. Typically, if two languages differ by only a factor of 2 in speed, then speed isn't an important factor for choosing between them. The real thing to look out for is that some of the languages seem to have performance that's a gazillion times worse than normal on certain specific tasks.

    Many of the languages are hard to find, because they're listed by the names of their implementations. In particular, g95 is an implementation of fortran.

  • Re:Scala (Score:3, Interesting)

    by kipton ( 135584 ) on Sunday May 31, 2009 @12:12PM (#28158819)

    I am surprised how they manage to get scala to perform so much worse than pure java.

    Scala does generate optimized Java byte code. Pretty much any Java code can be directly ported to Scala with nearly identical performance.

    The Scala benchmarks perform worse than Java's, on average, for two main reasons. The first is that some of the tasks have been implemented using higher-level code (think memory allocation and closure generation), trading performance for conciseness. The second is that the Scala benchmarks haven't been tuned and tweaked to the extent that the Java ones have.

    Then there are a couple benchmarks where Scala's performance is hugely worse than Java. This seems to be because the Java benchmark was implemented using optimized native libraries (big integers as I recall) or using a better algorithm. Again, Scala could achieve equivalent performance in principle, but someone needs to invest the time to update the benchmark implementations.

  • by wonkavader ( 605434 ) on Sunday May 31, 2009 @12:14PM (#28158839)

    If the code size attribute is measured in number of lines, I suspect that Forth, which is practically an assembly language, will rank poorly on conciseness (near the top of the graph, if not at the very top), though it ought to be very fast (near the left). It depends so much on stack operations that I suspect its left-to-right ranking would depend a great deal on the processor it's running on.

    I love forth. I learned it many years ago. But I've never been in a position to use it for anything, which is a shame.

  • Re:Ocaml (Score:4, Interesting)

    by Daniel Dvorkin ( 106857 ) * on Sunday May 31, 2009 @12:17PM (#28158867) Homepage Journal

    I semi-agree -- procedural languages are much closer to the way most people think, and that's why pure functional languages have remained a niche. OTOH, the power and expressiveness of functional programming is really amazing. It seems to me that this is a major reason for Python's success: it's basically a procedural language (of which OO is a subset) that makes a functional style easily available.

  • by FlyingGuy ( 989135 ) <.flyingguy. .at. .gmail.com.> on Sunday May 31, 2009 @12:21PM (#28158899)

    These sorts of things never fail to amaze me.

    The verbs, nouns, semantics and such used in a given programming language have nothing, I repeat... NOTHING to do with performance!

    What does have to do with performance is the talent of the compiler / interpreter author, nothing more, nothing less.

    C implements ++ and so forth and so on. Pascal does not, you have to express it as var := var + x or in some implementations as inc(var) or inc(var,100). The smart compiler / interpreter author would implement those in the fastest possible way regardless of the particular language.
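
    OCaml, for what it's worth, spells the same operation yet another way, through a reference cell; a minimal sketch (the name counter is made up), and a decent compiler should produce much the same add in every case:

    (* Increment in OCaml: no ++, just a mutable reference. *)
    let counter = ref 0
    let bump () = counter := !counter + 1   (* the var := var + 1 form *)
    let bump' () = incr counter             (* the inc(var) form *)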

    The one metric that has real meaning is programmer enjoyment. Do you prefer terseness over verbosity, or something in between? Does the language's flow make you truly appreciate working with it?

    The only other real metric that has any true meaning is, again, the talent of the compiler / interpreter author. Was the language parser built so that it can unfold the complex statements that are often required to express certain ideas and perform certain operations? Does the language implement your favorite expression, e.g. ++, or something like that, which again harkens back to programmer enjoyment.

    So what it really leaves us with is, "Do you enjoy using that language?" and only you, the programmer, can answer that question.

  • Re:Ocaml (Score:4, Interesting)

    by Anonymous Coward on Sunday May 31, 2009 @12:30PM (#28158953)

    To be honest, I have the same reaction. I love OCaml, and am frustrated by the fact it hasn't been adopted more. It seems like a no-brainer to me.

    I've been following it over time, and I think the reasons for it have become clearer to me.

    1. I think the biggest reason is hard to articulate well, but basically the OCaml developers are somewhat unresponsive to the community, which has been embryonic as it is, and as a result there's a sense that the language isn't moving in the directions it should. Numerical applications, for example, are one domain where OCaml interest has been really high, but the developers have been fairly unresponsive to concerns about matrix implementations. Comments by the developers on concurrency have also puzzled a lot of people. I don't mean to sound like a troll, but I have gotten the sense over time that while OCaml is wonderful as it is, it's not changing or moving anywhere, at least not as quickly as it should be. Haskell is a good comparison in this regard: it is also a functional language that is less useful in applications than it is in theory, but Haskell has a wonderfully strong community, and it is almost a model of how a language and its implementations can grow in a productive way.

    2. OCaml occupies a difficult position, in that people who want cold, hard performance with stable implementations will go somewhere else (e.g., C); people who want theoretically elegant implementations will go somewhere else (e.g., Lisp, Haskell); and people who want something simple will go somewhere else still (e.g., Python, Ruby). OCaml in many ways occupies all three places, but not quite as well as any one of them.

    Having said all of that, it's important to keep in mind that Microsoft is sort of pushing F# for 2010. F# is essentially a Microsoft implementation of OCaml for the CLR--very very close to OCaml--so maybe if F# takes off, there will be a big push toward OCaml.

    It's one of the few times I could say that Microsoft is really on target technically, and might actually move the numerical computing field forward.

  • Re:Ocaml (Score:1, Interesting)

    by Anonymous Coward on Sunday May 31, 2009 @12:50PM (#28159099)

    OCaml can be used as an imperative language. Try that. Try to avoid using any "map" or anything like that. You will end up with clean, elegant and *fast* code anyway.

    OCaml is a functional, imperative and object-oriented language. The OO style of OCaml is impressive: the type is driven by the available methods, not by the name of a class. This makes writing object-oriented imperative programs in OCaml a fun and quick task.

    If you *also* know how to take advantage of functional programming, you still can, with one of the most advanced functional programming languages available.

    If you don't know how to avoid stack overflows, it's better to stick with imperative code, but using a language such as OCaml will make you more productive in any case.
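
    For example, a minimal sketch in plain imperative style, using nothing but a reference, an array and a for loop (the function name and sample data are made up):

    let sum_of_squares arr =
      let total = ref 0 in                          (* mutable accumulator *)
      for i = 0 to Array.length arr - 1 do
        total := !total + arr.(i) * arr.(i)
      done;
      !total

    let () = Printf.printf "%d\n" (sum_of_squares [| 1; 2; 3; 4 |])   (* prints 30 *)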

  • by mdmkolbe ( 944892 ) on Sunday May 31, 2009 @12:57PM (#28159155)

    Ha ha. Good joke. You've forgotten the most important feature "needed in a great programming lang": higher-order [wikipedia.org] and first-class [wikipedia.org] functions with proper closures [wikipedia.org]. Oh wait, C doesn't have that.

    Any truly great statically typed language will also have at least algebraic data types [wikipedia.org], parametric polymorphism [wikipedia.org] (even C++ only has ad-hoc polymorphism), type constructors and functions, maybe even a Turing complete type system (heh). C doesn't have any of those.

    Even aside from types, great languages should include tail-call optimization [wikipedia.org], pattern matching [wikipedia.org] and hygienic macros [wikipedia.org] (CPP macros are a bad joke).

    Now don't get me wrong. C is a great portable assembly language. It's close to the metal, widely known and easy to read. But as far as programming languages go, C is feature poor.
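
    To make that list concrete, here is a minimal OCaml sketch (all names made up) that packs a parametrically polymorphic algebraic data type, pattern matching, a higher-order first-class function, a closure and a tail-recursive loop into a dozen lines:

    (* The type parameter 'a gives parametric polymorphism; the constructors form an algebraic data type. *)
    type 'a tree = Leaf | Node of 'a tree * 'a * 'a tree

    (* A higher-order function: f is passed in as a first-class value and the tree is taken apart by pattern matching. *)
    let rec fold f acc t =
      match t with
      | Leaf -> acc
      | Node (l, x, r) -> fold f (f (fold f acc l) x) r

    (* A closure capturing step, driving a tail-recursive loop. *)
    let counter step =
      let rec loop n acc = if n = 0 then acc else loop (n - 1) (acc + step) in
      loop

    let () =
      let t = Node (Node (Leaf, 1, Leaf), 2, Node (Leaf, 3, Leaf)) in
      Printf.printf "%d %d\n" (fold (+) 0 t) (counter 5 4 0)   (* prints 6 20 *)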

  • Re:Related site... (Score:5, Interesting)

    by Anonymous Coward on Sunday May 31, 2009 @01:01PM (#28159187)

    Show me a language impossible to write ugly code in, and I'll show you a language which is unnecessarily restrictive.

    If you can't recognize the beauty in near infinite flexibility and the associated amount of power provided, you're not qualified to participate in such a discussion. Come back when you've gotten to know some actual talented programmers. One way to identify those programmers is that they don't blame their tools for their own incompetence.

  • I wrote a little "literate" FORTH tutorial if any readers of the above comment are interested in it: jonesforth [annexia.org].
  • Re:Ocaml (Score:4, Interesting)

    by david.given ( 6740 ) <dg@cowlark.com> on Sunday May 31, 2009 @02:44PM (#28160047) Homepage Journal

    A while back I tried implementing a project in OCaml, because it looked awesome, and I wanted to explore it.

    I found a lot to like about it; it's fast, reasonably clear, has got lots of awesome functional programming features that I actually found myself using while also having a powerful set of imperative programming features for the rest of the logic, and in general worked rather well.

    However, I kept finding myself running again and again into problems caused by the same conceptual design issue, which eventually led to me giving up in frustration.

    The problem was this: an OCaml program is structured as a series of statements, each of which mutates the program in some manner (usually by adding a definition to it). The last statement in the program typically invokes something (such as a 'main' function). The compiler ensures that your program is consistent after every statement. This means that you can't do forward declarations.

    There is an 'and' construct that allows you to define several items of the same kind in a single statement, which is typically used to implement mutually recursive functions, but that doesn't help if you need two items of different kinds to be mutually recursive: for example, a class and an algebraic data type (which I needed to do a lot, in order to implement 'object or None' semantics).
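
    A minimal sketch of what the 'and' construct does allow (all names made up): mutually recursive types of one kind, and mutually recursive functions, each grouped in a single statement:

    (* Two ordinary type definitions can be mutually recursive in one group... *)
    type expr = Num of int | Call of call
    and call = { fn : string; args : expr list }

    (* ...and so can two functions, via let rec ... and ... *)
    let rec eval e =
      match e with
      | Num n -> n
      | Call c -> eval_call c
    and eval_call c = List.fold_left (fun acc a -> acc + eval a) 0 c.args

    (* There is no corresponding group that ties a class definition and a variant
       type together, which is the restriction described above. *)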

    I found this issue everywhere I looked in OCaml; the language semantics require your code dependency graph to be a tree with no cycles, and you have to structure your program strictly in bottom-up form. So not only can you not have two mutually dependent modules, you can't even put your 'main' at the top of the file for clarity.

    Now, my OCaml skills aren't particularly great, so there may be a way round this, but I spent a lot of time looking for a solution to this and didn't find one. This, and other lesser issues to do with code style and namespacing, smell unpleasantly like an academic language that hasn't really learnt to scale gracefully, and I wonder whether it would be a comfortable choice for implementing large projects. These days I'd be inclined to consider Haskell or Clean instead.

  • Re:Java (Score:3, Interesting)

    by Timmmm ( 636430 ) on Sunday May 31, 2009 @03:46PM (#28160511)

    "Oh but Java is a plodding, stumbling, lumbering, slug of slowness."

    Java does well in these kinds of synthetic tests because it doesn't have to invoke the garbage collector. All the *real life* Java programs I use are significantly slower than roughly equivalent C++ programs. E.g. compare NetBeans to Visual Studio, or Azureus to uTorrent. I tried Eclipse once but it was unusably slow.

    Find me a speedy desktop Java program and I'll change my mind about it.

  • by Kjella ( 173770 ) on Sunday May 31, 2009 @04:02PM (#28160675) Homepage

    I don't think that's the primary scale. I think my primary scale is:

    What level of detail control is the programmer required to and permitted to have?
    How easy is it for a developer to shoot himself (or others) in the foot?
    How often must the developer work around poor or missing language features?

    For example, there's no lower level than assembly, but it has no high-level features. Java is high level but has no low-level memory and byte manipulation to speak of. I prefer C++/Qt: I have all the practical high-level functions for the quick gains, yet I can dive down to detailed C/C++ management.

    In terms of "how easy is it to shoot himself in the foot?" I think C/C++ could do much better. Doing all sorts of pointer manipulation is error prone, and zero-terminated char arrays are the work of the devil. And while C++ has exceptions, I've never seen them applied as consistently and well as in Java.

    Finally, I think there should only be the need for ONE standard library, not a five-library mashup that's kinda standard. Everything else should be modules, usually building on top of that library. Java and C# have done it - I emulate it with Qt in C++ because I think STL/boost/whatever is not very good.

  • Re:what about APL (Score:3, Interesting)

    by Hucko ( 998827 ) on Sunday May 31, 2009 @05:21PM (#28161263)

    Snarky et al are ancient words used up to the 60s; their resurgence can only make me hope that we are potentially seeing the return to precise use of language. This would be a fantastic event; a reversal of the trend to the dilution of semantics and language in general.

    Of course my use of 'et al' is symptomatic of this, as is the common use of acronyms for everything... Ah weel, it was a nice idea while it lasted

  • by Estanislao Martínez ( 203477 ) on Sunday May 31, 2009 @08:03PM (#28162367) Homepage

    The biggest surprise for me was the high performance of some of the implementations of functional programming languages, even in cases where the particular languages aren't generally known for being implementable in a very efficient way. Two of the best-performing languages are stalin (an implementation of scheme/lisp) and mlton (an implementation of ml).

    You should not draw too many conclusions about the results of those two without taking into account the fact that both are whole-program optimizing compilers. Those two systems just do not support separate compilation of individual source files; if you change one line of one file in your program, every file must be recompiled.

    This type of compiler has a performance advantage over the more common separate compilation systems, simply because it can inline anything anywhere, and thus optimize far more aggressively. But it's next to useless for developing large software systems, and thus mostly really useful only for writing smallish programs, in very high-level style, that perform some really expensive computations really fast.

  • by Foofoobar ( 318279 ) on Sunday May 31, 2009 @10:16PM (#28163289)
    Funny, isn't this what Twitter thought too before dumping RUBY entirely? Wasn't this what Twitter thought as they threw more and more hardware at the problem and still could not solve it? Didn't Twitter end up spending more on IT to administer 2-3 times the number of servers that it would take to do the same thing in Python, PHP or Java?

    Yeah, throw hardware at it. That's a viable solution for a company. As long as you aren't thinking about who has to maintain all those servers and the fact that RUBY STILL DOESN'T SCALE.
  • by helixcode123 ( 514493 ) on Monday June 01, 2009 @01:08AM (#28164471) Homepage Journal

    Or just:
    print "@list";

    Perl. It puts the "fun" in your functions!

  • by ygslash ( 893445 ) on Monday June 01, 2009 @05:58AM (#28165687) Journal

    If this were the case then Perl 6 would have stuck with the Pugs implementation.

    That's silly. Pugs was not designed as a production implementation of Perl 6 - it was a proof of concept for the new syntax. The fact that Pugs was so surprisingly easy to implement, and so surprisingly performant before any effort at all was put into its performance, was a stunning demonstration of the power of functional programming.

    GHC would have stuck with Darcs and not gone to GIT.

    GHC did stick with Darcs. There was a time when a move to Git was considered, but it had nothing to do with Darcs being written in a functional language. At the time, the Darcs team was not big enough or well enough organized to keep up with the heavy support demands of being the RCS for a large and growing compiler project. That has changed. Darcs has become one of the best run (and most fun) open source projects, with a large team of active and talented developers. Darcs is now an excellent choice of RCS even for very large projects.

    That is typical of the process that has been happening with functional programming during the past few years - a quick transition from the theoretical to the best-of-breed in practice.

    So if you want to continue this conversation start reading the various performance related discussions. There are 2 decades of papers on trying to resolve specific examples of this problem.

    If you're interested in history, you are the one who ought to have a good look at that literature. Learn to tell the difference between knotty problems that were identified twenty years ago, and the astonishing continuous progress that has been made since then.

    But if you want to continue this conversation, you should look at what is happening in the present. One thing that has happened is that you no longer need to read academic papers to learn and use functional languages in practice. There are books, online tutorials and resources, many developer-friendly tools, and a huge and super-friendly community.

    The shootout is a nice demonstration that speed optimization is another aspect of the increasing strength of modern functional compilers and functional programming techniques. No one will claim that a functional language can beat C at speed right now, but it says a lot that such high level languages can now compete well in the same league as C. As hardware architectures continue to move farther and farther away from the classical imperative model, watch for this trend to continue.

    I love functional languages but putting your head in the sand regarding where the problems are and considering the evaluations FUD is not going to advance the cause.

    I called the great-grandparent post FUD because it is FUD. These are common misconceptions, caused by lingering impressions of where functional programming was decades ago. Open your eyes, look at the facts, see what is happening now. Don't let the FUD lull you into apathy.

  • by adamkennedy ( 121032 ) <adamk@c[ ].org ['pan' in gap]> on Monday June 01, 2009 @08:39AM (#28166533) Homepage

    FYI, the new Padre Perl IDE is itself written in Perl.

    http://padre.perlide.org/wiki/Screenshots [perlide.org]

  • by skeeto ( 1138903 ) on Monday June 01, 2009 @03:06PM (#28171657)
    If you like Forth, you should check out Factor [factorcode.org], which is basically a modernized version of Forth (dynamically typed, without the *very* low-level filesystem junk that Forth has). I've recently started playing with it.
  • Re:island paradise (Score:3, Interesting)

    by Kazoo the Clown ( 644526 ) on Monday June 01, 2009 @07:52PM (#28175751)
    Nice post. It is true that APL sends you to places you never go with any other language. And it is also true that this isn't necessarily a good thing.

    My first language was APL on an IBM System/360 in about 1973. I recall one of the lab assistants had a workspace of text functions he'd created that some of us were looking at. One in particular was designed to take a text string and reduce occurrences of multiple spaces down to single spaces. The program was a one-liner, of about 120 characters. Several of us looking at it could see that it could be simplified from this 120-character monstrosity, and of course that it *should* be, so we set ourselves to the task. We divided up into a few groups, and a buddy and I worked on it for a while and got it down to 14 characters. We concluded that was as good as it could get and were firmly convinced we would win the informal contest we were having. But then one of the other students showed us his result, which was almost identical to ours but was 13 characters because he had noticed a logical not that could be combined with an operator in order to eliminate it. We were crushed because we had worked so hard on it and were sure we had it aced...

    But that was pretty typical with APL, you could spend a huge amount of time juggling array elements to be just so, in order to evaluate all the answers in parallel, when in any other language you would have just written a for loop with a few statements and been done with it. Not as challenging as APL, but as I said, that could very well be a Good Thing(TM)...
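
    For comparison, the plain for-loop version, sketched here in OCaml (the function name and test string are made up), is roughly:

    let squeeze_spaces s =
      let buf = Buffer.create (String.length s) in
      let prev_space = ref false in
      for i = 0 to String.length s - 1 do
        let is_space = (s.[i] = ' ') in
        if not (is_space && !prev_space) then Buffer.add_char buf s.[i];
        prev_space := is_space
      done;
      Buffer.contents buf

    let () = print_endline (squeeze_spaces "reduce   multiple    spaces")   (* prints "reduce multiple spaces" *)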

    Plus, the varying quality of APL programmers meant that you might be looking at a 120-character monstrosity of obscure and unnecessary logic that could have been done more concisely in 13 characters...

