Rosetta Code Study Weighs In On the Programming Language Debate
An anonymous reader writes: Rosetta Code is a popular resource for programming language enthusiasts to learn from each other, thanks to its vast collection of idiomatic solutions to clearly defined tasks in many different programming languages. The Rosetta Code wiki is now linking to a new study that compares programming language features based on the programs available in Rosetta Code. The study targets the languages C, C#, F#, Go, Haskell, Java, Python, and Ruby on features such as succinctness and performance. It reveals, among other things, that: "functional and scripting languages are more concise than procedural and object-oriented languages; C is hard to beat when it comes to raw speed on large inputs, but performance differences over inputs of moderate size are less pronounced; compiled strongly-typed languages, where more defects can be caught at compile time, are less prone to runtime failures than interpreted or weakly-typed languages."
Machine specific (Score:1)
If you are characterizing the performance of "large inputs" without quantifying machine behaviors (cache, TLB, RAM), you're doing it wrong.
Re:Machine specific (Score:4, Interesting)
Pragmatically, almost no one actually codes software with that aspect of the target platform in mind. Unless you're writing drivers, OSes or something else that might need to know EXACTLY how many cycles an op is going to take, your cache behavior, e.g., is never going to be part of what you're building your code around.
And RAM sizes are large enough that a "large" input is easily contained entirely within even smallish RAM.
As long as they used a consistent testbed between languages, it's an excellent heuristic for language effects on performance in the real world.
Re: (Score:3)
And RAM sizes are large enough that a "large" input is easily contained entirely within even smallish RAM.
/old-man mode: ...and this is EXACTLY what's wrong with programmers today! No eye for efficiency! :old-man mode/
Cue endless round of arguments about how so-and-so wrote an app that only used one bit of RAM to execute four jobs, yeah - but in all seriousness, the attitude of 'RAM is cheap' has done more to create bloat than anything else.
Re: (Score:2)
I'd love to agree with you, there, but when I got started programming, VAXen were all the rage, and programmers were still taught to program as if CPU and memory were unlimited resources. Sorry, but this is one philosophy you can't blame on "programmers today."
Re: (Score:3)
Re: (Score:3)
Size isn't the only thing; locality of reference matters just as much or more, and languages like C/C++ allow a programmer to think about and control such things. With everything being powered by batteries now, being efficient isn't something we always want to just throw more hardware at, although such a phone might be more resistant to bending.
Re: (Score:2)
/Oblg.
Tony Albrecht's excellent Pitfalls of Object Oriented Programming [cat-v.org] presentation.
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
And? (Score:2)
Re:And? (Score:5, Funny)
That's the job of consultants.
Re: (Score:2)
This is not funny! This is sad. And all too true.
Re: (Score:2)
Sometimes advice carries more weight to the listener if they paid an arm and a leg for it.
Re: (Score:2)
Really? I didn't know until now that Ruby was slightly faster than Java or that Python is only slightly slower than C#. This is referring to the graph in the "Which programming languages have the best runtime performance?" Crap study.
For three decades or more. (Score:2)
So it's telling us just what we already knew? Interesting.
For three or more decades. (Before that some of the classes of things they're comparing didn't exist, with enough deployment, to characterize.)
On the other hand, it's nice to have it confirmed with some rigor and measures.
Who cares about succinctness .... (Score:5, Informative)
Re: (Score:3, Informative)
The problem is that OOP languages rarely have more readable code. Instead, they typically just have code with more boilerplate.
Re: (Score:2)
Depends on the OO language, I would say:
o Smalltalk
o Groovy
o Simula
all easy to read, with no boilerplate code.
Re: (Score:3)
I think it depends more on the coder. Spaghetti code can be written in any language. Good, readable code can be written in almost any language (not sure about Brainfuck). Some languages go farther to force the user to write good code than others but it's still a choice.
Re: (Score:2)
I guess the parent was more concerned about languages like Java/C#, where you have to "repeat" public/private all over the place at every definition and have to "manually" write getters and setters (if you believe in them).
But yes: obviously you can write Spaghetti code in every language.
Re: (Score:2)
My point was more than just that you can write spaghetti code in every language. It was also that you can write good code in any language (except for obvious exceptions that are designed to be difficult... I didn't want a 'what about Brainfuck' reply).
I think sometimes new languages are presented as a 'cure' for bad programming. New language constructs and rules are designed to try to force a bad coder to be a good one. As a result we get an endless churning of languages obsoleting old code and making m
Re: (Score:2)
I did not raise the topic 'Brainfuck'; if it was not you, it was one of our parents.
Regarding 'bad programming', there are many ways to program badly.
I don't really think new languages address that; what do you have in mind there? Imho Objective-C is an ugly language, and Swift, as its successor, is much less ugly.
Java and C# are easy languages ... but their strength is the platform/VM, not the language itself. Hence people 'invented' Scala or Groovy etc.
Bad programmers still find a way to write bad code. Even 'g
Re: (Score:3)
Verilog vs. VHDL. I find that the verboseness of VHDL (which requires like 3 times as much typing) actually impedes readability. Sure, there are situations where VHDL can catch a bug at synthesis time that Verilog can't, but the rest of the time, it just makes VHDL unwieldy.
Re: (Score:2)
VHDL has a lot of (useless) metaprogramming. A lot of the standard operators and values are actually defined in header files and stuff. Even things like high and low, if I recall correctly. It's like the designers wanted a language that could describe anything, given the right boilerplates and templates, but didn't care to implement the actual things you need for FPGA programming as built-ins.
Re: (Score:3)
Re: (Score:3)
I was surprised at how many instructions that developers previously spread out over multiple lines are now packed into highly idiomatic one-liners.
As with many things, Python one-liners can be good or bad. When done correctly they are awesome.
Consider this code:
bad = any(is_bad_word(word) for word in words_in(message))
If words_in() is a generator that yields up one word at a time from the message, and is_bad_word() is a function that detects profanity or other banned words, then this one-liner checks to se
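A minimal sketch of the pieces that one-liner assumes. `BANNED`, `words_in()` and `is_bad_word()` here are hypothetical stand-ins for whatever the real implementations would be:

```python
# Hypothetical stand-ins for the helpers the one-liner assumes
BANNED = {"heck", "darn"}

def words_in(message):
    """Generator: yield one lowercase word at a time from the message."""
    for word in message.split():
        yield word.lower()

def is_bad_word(word):
    return word in BANNED

# any() short-circuits, so scanning stops at the first banned word found
bad = any(is_bad_word(word) for word in words_in("well HECK that escalated"))
```

Because both `words_in()` and the argument to `any()` are lazy, the whole message is never materialized as a word list; the scan stops as soon as one banned word appears.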
Re: (Score:1)
I think, perhaps, there might be a difference between syntactic succinctness and semantic succinctness.
Perl and APL let you express a lot of operations in a few characters. Syntactically succinct, hard to read.
A lot of functional languages give you (or give you the ability to write) more powerful operators, so that fewer operations are required to accomplish a task. Semantic succinctness: easy to read, if you know what the operators do (and they aren't buggy or inconsistent).
I'm just pulling this out of my
Re: (Score:2)
Perl [lets] you express a lot of operations in a few characters. Syntactically succinct, hard to read.
That really depends on your experience level, like in anything. Reading a wiring diagram is arcane to the uninitiated, but once you know what symbols represent what types of circuit pieces (including resistors, capacitors, diodes, FETs, etc.), they are both syntactically succinct and easy to read because you can tell what goes where at a glance, you don't need to read and parse a lot of text.
Same thing in Perl. Once you actually learn it, it becomes easier and faster to read than, say, Java, because there
Re: (Score:2)
Re: (Score:3)
Poorly done, cryptic succinctness can indeed make code impenetrable. Yet overdone verbosity can destroy readability just as thoroughly. When the language is naturally succinct, it's easy to ensure that it contains enough context to be readable. When the language is overly verbose, you generally can't slim it back down to readable conciseness.
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
You're conflating "succinct" with "terse"...
F#, Swift and other functional inspired languages let you omit the wordy boilerplate that gets in the way of readability.
For instance, algebraic data types (a.k.a. discriminated unions, or enums in Swift) are less wordy than declaring an inheritance tree like you would in Java/C#, and pattern matching is a shorter, more readable way to deconstruct the data than virtual methods.
Re: (Score:2)
especially if it makes the code unreadable. Give me the verbose, easy to read code any time.
Interesting. When I first hit the Internet in the 80's back in the Usenet era, there were lots of folks using this exact argument to promote using platform-native assemblers over high-level programming languages. I didn't think any of you guys were still around...
Re: (Score:2)
Give me the verbose, easy to read code any time.
There are different ways to interpret simplicity and "easy to read", especially local vs. global.
I've come across this during code reviews, where a few lines of "complex" code (eg. a fold) can replace multiple screens of "simple" code ("doThingA(); doThingB(); doThingC(); .."). Each "simple" line is straightforward and clear, but it's difficult to see the big picture of *what we're trying to achieve*. In contrast, it's obvious what the "complex" solution is trying to do, although it may take a few minutes t
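A toy Python illustration of that local-vs-global trade-off, assuming a made-up task of totalling order quantities; the fold states the whole intent in one expression, while the step-by-step version spreads it out:

```python
from functools import reduce

orders = [("apples", 3), ("pears", 2), ("apples", 1)]

# "Simple" style: each line is individually clear, but the overall
# goal (sum quantities per item) is spread across several statements
totals = {}
for name, qty in orders:
    totals[name] = totals.get(name, 0) + qty

# "Complex" style: a single fold carries the same computation;
# denser per line, but the whole intent sits in one place
def add_order(acc, order):
    name, qty = order
    return {**acc, name: acc.get(name, 0) + qty}

folded = reduce(add_order, orders, {})
```

Both produce `{"apples": 4, "pears": 2}`; which one reads better depends on whether the reader is scanning locally or trying to grasp the big picture.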
Re: (Score:2)
So I can surmise that you program in Ada?
Re: (Score:2)
Re:Who cares about succinctness .... (Score:4, Interesting)
You jest, but the majority of large, mission critical systems...payroll, point of sale, etc. are in COBOL...
When it's got to work, work all the time, handle millions and millions of records, and can be maintained by college graduates, COBOL is the way to go.
Re: (Score:2)
You jest, but the majority of large, mission critical systems...payroll, point of sale, etc. are in COBOL...
Not true. There are legacy systems written in COBOL, but nowhere near the "majority". In business applications, Java is far more common.
When it's got to work, work all the time, handle millions and millions of records, and can be maintained by college graduates, COBOL is the way to go.
No one uses COBOL for these reasons. COBOL is not particularly reliable, and is not easy to maintain. The reason it is used is because the decision to use it was made back in the 1950s or 1960s, and now the transition cost is too high to switch to something more sensible.
Re: (Score:3)
So that's a great big Nuh Uh! from you then?
Perhaps in small business applications you may be right. But the Big Guys...National Banks, large aerospace, etc. COBOL runs the core systems.
And I'd choose COBOL for reliability and maintainability anytime.
Re: (Score:2)
You jest, but the majority of large, mission critical systems...payroll, point of sale, etc. are in COBOL...
When it's got to work, work all the time, handle millions and millions of records, and can be maintained by college graduates, COBOL is the way to go.
How many colleges are teaching COBOL these days?
I doubt that your average Java or C++ jockey CS grad would be able to maintain a business critical COBOL code base.
Re: (Score:2)
Given today's job market, I'll bet a lot of CS grads would be happy to maintain a business critical COBOL code base.
Re: (Score:2)
Re: (Score:3, Insightful)
Depends on the reader. Anyhow, coding is the art of balancing trade-offs among many factors, but readability by an average programmer (future maintainer) should be among the top priorities. It's best to learn what code style is the easiest for a random programmer to digest.
Too much abstraction and factoring can sometimes throw readers off, I hate to say. Those who personally enjoy making "symbolically optimized" code don't like to hear that, but one is not coding on an island, but rather in the City of Fung
Re: (Score:3)
Functional code can't be maintained immediately by the random programmer. They need to learn concepts first. That's also true of languages which are tightly coupled to hardware, languages which are tied tightly to databases or BI tools or languages which uses other conceptual frameworks like logic programming. Everyone learns: do X then do Y when they are a toddler.
Re: (Score:2)
but readability by an average programmer (future maintainer) should be among the top priorities
Especially since you rarely lose anything by making your code readable, and often gain flexibility and efficiency.
Re: (Score:3)
You're saying you only write raw pedestrian code with NO layering? I have only ever seen that in CS 101.
No, the point is that (pure) functional programming forces a separation between "actions", which may affect something in the program (eg. a global), in the computer (eg. a file) or in the outside world (eg. play a sound), from "computations", which can *only* return a value based on the given arguments.
This separation makes it easy to understand computations: arguments go in, return value comes out and *that's it*. In turn, this makes it safe to work at very high levels of abstraction (Monoids, Monads, Cate
Re: (Score:1)
Now, onwards to teaching category theory in first grades everywhere!
Um, yeah ... (Score:2)
Isn't that kind of the point?
Is this supposed to be something we didn't know? Or just confirming something we did?
Re: (Score:1)
if you've ever talked with a python dev^H^H^Hzealot, you would think it doesn't matter and you must be a bad developer to rely on strong types.
Re: (Score:1)
Funny, the first python zealot I ever met was a bad developer who didn't do ANY error checking.
So, are python developers all overconfident morons?
Then again, what do you expect from a language which makes whitespace actually part of the syntax.
I've never been able to look at python as anything other than a lazy man's language since.
Re: (Score:2)
Python is strongly-typed. It is not statically-typed, however.
Re: (Score:1)
Neither, because it is incorrect. Python, for example, is a strongly-typed language that will not catch the overwhelming majority of type errors at compile-time.
Static typing is what can catch type errors at compile time.
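A short sketch of that distinction in Python: values carry types and mixing them fails loudly (strong typing), but nothing in the language flags the error before the offending line actually runs (no static typing):

```python
# Strong typing: mixing str and int raises TypeError at run time
try:
    "1" + 2
    caught = False
except TypeError:
    caught = True

# Dynamic (not static) typing: *defining* this function is perfectly
# fine; the type error only surfaces when broken() is actually called
def broken():
    return "1" + 2
```

A statically typed language would reject `broken()` at compile time; Python happily ships it and defers the failure to whoever calls it.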
Re: (Score:2)
Python has a single type, which includes integers, bignums, floats, lists, strings, classes, functions, etc. *and errors*. In Python, a run-time error is a value which can be manipulated just like anything else. For example, this code passes errors into functions, adds together a couple of errors and prints some errors out:
def myFunction(x):
    print "CALLED"
    print x + ' world'

myFunction('hello')
try:
    myFunction(100)
except TypeError as e:
    print 'Second: ' + str(e)
Re: (Score:2)
That's basically the definition of dynamic typing, looked at from a static implementation.
Exactly: when we're comparing static typing to dynamic typing we have to use some common ground for comparison. That common ground is static typing, since dynamic typing is just a special case ( http://existentialtype.wordpre... [wordpress.com] )
The name of a type, and its lineage, while sometimes checked, are usually considered unimportant. Having certain attributes, and methods, of certain types, are important. It allows for similar uses as compositional inheritance, but without actually having that.
No, that has nothing to do with types. That's just a (rather limited) way of treating functions as values. For example, we can store first-class functions in dictionaries:
def c1():
    s = {"p": "foo"}
    return {"f": lambda *_: s["p"]}

def c2():
    s
Re: (Score:2)
I won't reply to your rant, since it doesn't seem to have anything to do with the question of "Does Python have type errors?".
The 2nd try doesn't work.
Yes it does, it gives me an "Exception" value which I call "e", prints out 'Second: ' and a string derived from "e". Not only does the program carry on executing, but that's the expected behaviour; just like executing code in an "else" branch doesn't mean that an "if" statement "doesn't work", or has a "type error". It's just control flow, which is orthogonal to typing.
Why? Because the data is an integer, which is not a type that can be sliced.
No, the data i
Re: (Score:2)
Isn't that kind of the point?
Is this supposed to be something we didn't know? Or just confirming something we did?
Mostly confirmation. It's good to have empirical evidence: https://news.ycombinator.com/i... [ycombinator.com]
Compiled Strongly-typed Languages -vs- Scripts (Score:5, Insightful)
The difference, as the summary noted, is that when using a scripting language, you are trading all your compile-time errors (build breaks) for runtime errors that your users will see.
If you write 'C' code, would you declare all your input and output return types as 'void*'?
If you write 'Java' code, would you declare all your input and output return types as 'Object'?
Why someone would willingly give up the function of a compiler is beyond me. Sure, use scripts for little tasks / prototyping etc. Any long-term project should be using a proper language that provides type-checking (at compile time) and proper encapsulation, so that 'private' means 'private' (looking at you, Groovy). I don't want to be forced to read every line of your crappy code just to try to figure out what object-type your method supports because you are too damn lazy to define it in the method's interface.
When you change the behavior of the method and assume different input/output object-types, I want that to be a BUILD-BREAK instead of me once again having to reverse engineer your code.
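A hedged Python illustration of that trade, using a made-up `area()` function: nothing flags the bad call before it runs, and here it does not even fail at run time; it just silently computes the wrong thing:

```python
def area(width, height):
    return width * height

good = area(3, 4)    # 12, as intended

# A static type checker would reject this call at build time.
# Python runs it anyway: "3" * 4 is string repetition, so the
# function "succeeds" with a nonsense answer instead of breaking
oops = area("3", 4)
```

This is the class of bug that turns a would-be build break into a latent runtime surprise for users.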
Re: (Score:2)
I think the test-driven advocates would say that relying on the compiler is OK for that one particular kind of error, but you really should be writing tests to catch that kind of error along with many others.
The reality is probably, as you kind of imply, sometimes you have a task that is more suited to one approach or another.
Re: (Score:2)
Writing tests for 100% code coverage takes a good amount of effort to do. If you don't do this, and use a weakly typed language then you risk not seeing the issue until that rare case that was programmed for is covered.
Re: (Score:2)
relying on the compiler is OK for that one particular kind of error, but you really should be writing tests to catch that kind of error along with many others.
Tests *only* make sense if the compiler is trustworthy. If you can't trust the compiler, you can't trust anything it produces, including your tests! Test suites are *far* less trustworthy in comparison, so anything that *can* be handled by the compiler *should* be handled by the compiler.
Anything that's handled by the compiler doesn't need to be tested, since the tests wouldn't compile!
-- Our function
boolAnd :: Bool -> Bool -> Bool
boolAnd True True = True
boolAnd _ _ = False

-- A redundant test
testBoolA
Re: (Score:2)
I think the test-driven advocates would say that relying on the compiler is OK for that one particular kind of error, but you really should be writing tests to catch that kind of error along with many others.
The reality is probably, as you kind of imply, sometimes you have a task that is more suited to one approach or another.
The nature of testing is that complete coverage grows combinatorially with state. What you're saying is you don't want to eliminate the possibility of an entire class of errors, but rather rest this (rather significant) burden on testing. From my point of view that's like abandoning DRI in a database and saying tests can detect foreign key constraint violations and all the other things DRI can check. While technically true, it just doesn't make any practical sense.
Re: (Score:2)
Do your test cases cover every possible object type that can be passed into every single method
I find it quite interesting that dynamically-typed languages, which must rely completely on tests, are the languages worst-suited to testing!
With a decent type system I can statically *prove* that my code satisfies certain properties, *exhaustively* test it for other properties (eg. those involving small enumerations like the booleans), then fuzz-test whatever's remaining with help from the language ( http://www.reddit.com/r/haskel... [reddit.com] ).
In dynamic languages I get *no* static guarantees (not even syntactic c
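A crude stdlib-only sketch of the fuzz-testing idea the parent links to. QuickCheck-style tools generate the inputs and shrink failures automatically; this hand-rolls the generation with `random` for an example property (reversing a list twice is the identity):

```python
import random

def prop_reverse_twice(xs):
    """Property under test: reversing a list twice returns the original."""
    return list(reversed(list(reversed(xs)))) == xs

random.seed(0)  # fixed seed so the fuzz run is reproducible
cases = [[random.randint(-10, 10) for _ in range(random.randint(0, 8))]
         for _ in range(100)]
failures = [xs for xs in cases if not prop_reverse_twice(xs)]
```

An empty `failures` list means the property held for all 100 generated inputs; it is evidence, not the proof a type system would give.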
Re: (Score:2)
(I believe he was referring to Haskell.)
Java's boilerplate isn't a great advertisement for static typing. For a more modern perspective running on a virtual machine, F# and Scala provide OO semantics atop a Hindley-Milner-Damas inspired type inference but with perhaps too much esoteric functionalness for the average mortal!
One wonders if RoR would have existed if someone had managed to craft a killer web framework on, say, OCaml a decade ago.
Re: (Score:2)
Have you looked at any Java lately? ... Even if it compiles there's no way to guarantee anything (except null pointer exceptions).
That's because Java's type system is *unsound*. According to the Curry-Howard correspondence, types are logical statements and values/programs of that type are proofs of that statement. Since Java's "null" is a valid value for any type, it can therefore be used to prove anything, which makes the logic inconsistent, unsound and pretty much useless.
For any statement (eg. "1 + 1 = 2") we can encode it as a Java type and prove it with null. Then we can negate it (eg. "1 + 1 /= 2"), encode that as a Java type an
Re: (Score:2)
Java was one of the poorest at actually running successfully.
AHK doesn't obey all the script-lang "rules" (Score:2)
This is not true for all scripting languages. AutoHotkey, for instance, gained a #warn flag that, among other things:
1) Tells you when you have a local variable with the same name as a global. The local trumps the global, unl
Grand Tradeoff [Re:Compiled Strongly-typed Languag (Score:2)
In my opinion the basic trade-off is that "scriptish" languages can be written to be closer to pseudo-code and thus easier to read and grok. Strong/heavy typing tends to be verbose and redundant, slowing down reading.
Better grokkability often means less "conceptual" errors, but at the expense of more "technical" errors, such as type mismatches. There's no free lunch, only trade-offs.
In some projects the conceptual side overpowers the technical-error side, and vice versa. It also depends on the personality o
"Scripting" langs are functional, OO, procedural (Score:5, Informative)
Simply because a language is billed as a "scripting" language (by which people tend to mean one distributed as source code and partially compiled for each execution, rather than compiled once and distributed as object code; not one actually used primarily to script other programs) doesn't mean there's no programming paradigm associated with it. Such languages can support procedural, functional, actor-based, object-oriented, logical, dataflow, reactive, late binding, iteration, recursion, concurrency, and whatever other paradigms and methods people want. Some of them support mixing and matching even in the same program.
Languages that are typically fully compiled can even be run in an interpreter. C-- comes to mind. Often languages known for interpretation (actually most of which are partially compiled rather than interpreted line-by-line) have support for compiling at least portions of a program up front, too. Examples include the .pyc files of Python, LuaJIT, Facebook's HHVM, Steel Bank Common Lisp, and Reini Urban's work on perlcc.
People making claims about one type of language vs. another should really keep straight what types they are talking about.
C++ = Clear Language Choice. (Score:2)
It always comes down to personal preference.
All i care about is performance. If i want performance, i will learn how to use C++, regardless of what new writing methods i need to learn.
People who cant/dont want to learn a "better language" will always try to brickwall every other language with an excuse that suits their "locked in" writing ability.
Funny how C++ is missing from this "let's try and justify programming languages using a graph made by a 2 year old with crayons" test.
Re: (Score:2)
All i care about is performance. If i want performance, i will learn how to use C++,
Actually, if all you care about is performance, generally Fortran is the language for you. It's when you start caring about something like toolchain support and your own established codebase that C++ starts looking like a much better choice to most folks. But if you "don't care" about those things, here's your compiler [intel.com].
Re: (Score:2)
Re:C++ = Clear Language Choice. (Score:4, Interesting)
Sincere question - I've heard that Fortran blows away (or at least beats) C++ for scientific/calculation programming... can you lend any insight into what accounts for that, specifically?
http://stackoverflow.com/quest... [stackoverflow.com]
Re: (Score:2)
Two things.
1) Fortran has low level mathematical data operators that are more powerful than those in C. I.e. in practice Fortran code compiles to faster code because the programmer is more aware of what will be faster.
2) ALUs evolved around running Fortran code fast, the same way modern CPUs evolved around running compiled C code fast.
Re: (Score:2)
Re: (Score:2)
compiled COBOL and Forth also will beat the ass of C/C++
Re: (Score:2)
Why? C++ raises the probability of making bugs, spews compiler messages that are hard to read, and, to add insult to injury, will generate slower code. There is a reason the core libraries of the big math packages are still in Fortran. And after making a statement like that, someone will post a link to some self-touting Python / C project, failing to notice it uses the same huge mass of Fortran libraries everyone else uses.
Re: (Score:2)
The paper justifies their choices, and hence why C++ wasn't included.
Anyway, if all you care about is performance then why use C++? FORTRAN is faster.
Re: (Score:1)
Seconded.
C++ (and C) are the only languages I've ever used while working on runtime code for games (and really, C hasn't been the primary language for the past 20 years or so).
But I've used all sorts of languages for non-runtime code (Python, C# and DOS-Batch being the 3 most commonly used).
For instance, our GUI tools are mostly written in C# (with some legacy C++ still in use for some older tools). But I prefer python (or sometimes DOS-Batch) for faceless tools (build farm, installers, miscellaneous script
Re: (Score:2)
For a "Hello World" application, C/C++ win. For a team project involving significant complexity the whole life
Dual Typing? (Score:2)
Why aren't there more languages that allow strong typing where desired but weak typing where not? One can kind of emulate such with generic "variant" or "object" types in some languages, but you have to keep declaring everything "object". If I want dynamic parameters, I shouldn't have to put any type-related keyword in the parameter definition.
For example, one should be able to type: function foo(x, y)... versus function foo(x:int, y:string)... for weak and strong typing, respectively. And types could be con
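Modern Python's optional annotations are one sketch of exactly this dual style. Caveats: the annotations are only metadata at run time (a separate checker such as mypy enforces them), and `foo_dynamic`/`foo_typed` are made-up names for illustration:

```python
def foo_dynamic(x, y):           # "weak"/dynamic: any types accepted
    return x, y

def foo_typed(x: int, y: str):   # annotated: a static checker can verify call sites
    return x, y

# At run time both behave identically; the annotations are just metadata
meta = foo_typed.__annotations__
```

So per-parameter opt-in typing does exist; the catch is that the "strong" half is enforced by external tooling rather than the interpreter itself.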
Re: (Score:1)
Re: (Score:2)
define a new method that takes as input a String, and declares that it WILL THROW a NumberFormatException (or equivalent) if the String that is passed in is not parseable as a Number.
Screw that -- make the caller responsible for handling the two possibilities by declaring an `Either` return type in its interface. That way the compiler will enforce that the caller handles both cases.
Without these protections, you are just forcing your users to become programmers and have to debug your crappy scripts, reverse engineering what the call stack was (or what Exception they need to handle).
Re: (Score:2)
Type-inferred languages often allow you to sidestep type inference by explicitly declaring the types using a colon following the declaration.
Microsoft's TypeScript follows that convention when augmenting ecmascript, e.g.
function add(left: number, right: number): number {
return left + right;
}
huh (Score:2)
My /. account still works. Haven't used it in... err... a few years.
Just here to bump this article for my friend Mike and his fantastic work on RC.
Mickey mouse tests (Score:3)
If I understand the statistics correctly, the average program has 71 lines of code. Those are mickey mouse tests for which scripting languages shine. All the verbosity of imperative languages becomes handy when you have tasks that are a few hundred KLOC long.
This is a lesson Perl learned the hard way: once your program is long enough, you beg on your knees for a strong static type checking system.
Data from snippets, not real programs. (Score:2)
The problem with programming language evaluations is that they tend to be based on small snippets of code, like this one, or data from novice student programmers, or worse, popularity. Yet what really tends to matter is how much trouble a language causes in large systems and in later years. That's where high costs are incurred because changes in module A affect something way over in module Z. Undetected cross-module bugs, high costs of changing something because too much has to be recompiled, that sort of
Go Go (Score:2)
The biggest surprise for me is how well Go does:
"Go is the runner-up but still significantly slower with medium effect size: the average Go program is 18.7 times slower than the average C program. Programs in other languages are much slower than Go programs, with medium to large effect size (4.6–13.7 times slower than Go on average)."
My only objection is that they classify Go as "procedural" along with C, Ada, PL/1 and FORTRAN. It may not have inheritance (a good thing in my book!) but it has many OO f
Re: (Score:1)
Re: (Score:2)
A study recently found that "Duh" is far more succinct than "You're telling us something we already knew", and "No shit, Sherlock" conveys greater annoyance at the repetition of redundant information.
Re: (Score:3)
Not obvious at all that C is hard to beat on raw speed on large inputs. Fortran and COBOL and Forth do that.
Re: (Score:2)
Average Go programmers.
Maybe it tells us something about the maturity of the language (wide knowledge of best practices, or even the existence of them) and the availability of skilled programmers, rather than runtime performance.
And in the real world, it might be a lot more important how fast/readable/maintainable the code written by the people you can hire will be, rather than how fast/readable/maintainable it could possibly be in the most idealized situation.
Re: (Score:1)
And yet, Python is one of the most succinct languages in the study...
Re: (Score:2)
If I wrote a C program using one line and lots of ;s, it would be the most concise program possible.
Rosetta Code solutions were chosen precisely because they're idiomatic, and hence not tuned to these benchmarks.
Poor python, where newlines have syntactic effect!
There are loads of Python solutions posted on http://codegolf.stackexchange.... [stackexchange.com] which *are* tuned to similar benchmarks.
It's remarkable how much can be achieved with a single list comprehension.
Re: (Score:2)
You can strip out all C newlines (after removing escaped newlines) and replace them with spaces, except for ones right before a #, and the exactly identical code will score much higher on succinctness.
Did you do this to a bunch of examples on Rosetta Code before the database dump was taken that this study is based on? No? Then it doesn't matter, because as I said Rosetta Code represents idiomatic solutions.
Code Golf already takes this into account (counting bytes).
Re: (Score:2)
Poor python, where newlines have syntactic effect!
So do semicolons. For example:
>>> print "this";print "that"
this
that
>>>
works.
Re: (Score:2)
Take that you neigh-sayers! ;-)
We are no longer the programmers that say neigh, we are now the programmers that say Ekki-ekki-ekki-ekki-PTANG. Zoom-Boing, z'nourrwringmm