Comparing the Size, Speed, and Dependability of Programming Languages 491
In this blog post, the author plots the results of 19 different benchmark tests across 72 programming languages to create a quantitative comparison between them. The resulting visualizations give insight into how the languages perform across a variety of tasks, and also how some languages perform in relation to others.
"If you drew the benchmark results on an XY chart you could name the four corners. The fast but verbose languages would cluster at the top left. Let's call them system languages. The elegantly concise but sluggish languages would cluster at the bottom right. Let's call them script languages. On the top right you would find the obsolete languages. That is, languages which have since been outclassed by newer languages, unless they offer some quirky attraction that is not captured by the data here. And finally, in the bottom left corner you would find probably nothing, since this is the space of the ideal language, the one which is at the same time fast and short and a joy to use."
what about APL (Score:2, Insightful)
APL was faster than C and there has never been a more terse language.
Why is Verbosity Bad? (Score:5, Insightful)
And finally, in the bottom left corner you would find probably nothing, since this is the space of the ideal language, the one which is at the same time fast and short and a joy to use.
I must ask why the author assumes that verbosity is bad and why lack thereof makes it a "joy to use."
I think verbosity in moderation is necessary. I have read many an article with developers arguing that they don't need to document their code when their code is self-documenting. Do you make all of your variables and class/function/methods a single character for the sake of verbosity? I hope not. And I would think that reading and maintaining that code would be far less than a joy.
I don't even need to argue this; according to his graphs we should all be using Regina, MLton or Stalin (a Scheme implementation). But instead languages like Java and Perl and C++ prevail. And I would guess that support and a moderate level of verbosity are what cause that.
Great work in these graphs! But in my opinion, verbosity used in moderation--like a lot of things--is far better than either extreme.
Re:Why is Verbosity Bad? (Score:5, Insightful)
Pet peeve (Score:5, Insightful)
Programming languages don't have attributes like size and speed: implementations of these languages do. Take Common Lisp for example: SBCL is blazing fast, while CLISP is rather pudgy (albeit smaller). Any conforming Common Lisp program will run on both. Or consider Python --- IronPython and CPython have different performance characteristics. (I'm too lazy to link these now.)
Point being, describing a programming language as "fast" makes about as much sense as describing a natural, human language as "smart".
Verbosity is bad because (Score:3, Insightful)
Verbosity = ( 1 / Expressiveness )
Re:Why is Verbosity Bad? (Score:5, Insightful)
Look where those languages are on the chart.
Java mostly against the left wall (i.e. mostly as fast as C). Perl right against the bottom (i.e. very small code).
It appears to me what this showed was that people like the walls.
Re:Scala (Score:5, Insightful)
Functional languages in practice often implement n log n algorithms in quadratic time or memory. One of those in a benchmark is devastating. We really understand how to optimize imperative languages well; we don't have the same level of knowledge/experience regarding functional ones.
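In Python terms, a minimal sketch of how the same n log n intent can silently go quadratic (the classic repeated-concatenation trap; the function names here are made up for illustration):

```python
def build_quadratic(items):
    # res = res + [x] copies the whole list every iteration:
    # total work is 1 + 2 + ... + n, i.e. O(n^2)
    res = []
    for x in items:
        res = res + [x]
    return res

def build_linear(items):
    # append mutates in place: amortized O(1) per element, O(n) overall
    res = []
    for x in items:
        res.append(x)
    return res

# Same result, wildly different asymptotics
print(build_quadratic([3, 1, 2]) == build_linear([3, 1, 2]))  # True
```

The functional-style version is the more "natural" spelling in many languages; the cost only shows up in a benchmark.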
I agree this is a pity, but there doesn't seem to be any easy solution. Hopefully this gets fixed over the next generation.
Re:Related site... (Score:5, Insightful)
What's so special about it? It looks just like a regular perl program to me. /ducks
Re:Pet peeve (Score:4, Insightful)
Re:Related site... (Score:2, Insightful)
I think that any language that allows such obfuscation is NOT a good language. A programming language should be brief, powerful but still readable in all circumstances.
Re:Why is Verbosity Bad? (Score:5, Insightful)
The author is not talking about verbosity in bytes. He's talking about verbosity in code points. Talked about in this way, a thirty character variable name is no more verbose than a single character variable name.
Re:Related site... (Score:5, Insightful)
Re:Why is Verbosity Bad? (Score:3, Insightful)
Overloading symbols is often a negative aspect. C would be a much easier language to read if "*" had a single meaning.
Ruby (Score:5, Insightful)
On the plus side, both versions of Python can claim many of the smallest programs in the collection. Ruby (8, 1) might also compete for titles, but unfortunately its performance is so bad its star falls off the performance chart.
Then why the fuck is the Ruby community hyping it so much, and drawing naive young developers into a trap?
Not flamebait.
Why can't they make a language, or extend a language like Ruby, such that one can program it as a scripting language, but then optionally add verbosity (i.e. declaring the data types and their sizes, private / static etc. & whatever the hell makes a program lightweight and fast)? It's my hope that if I stick with Ruby, one day I won't be forced to learn Python because performance won't be "Ruby's big issue" in every discussion, but really, that is *just* a hope. I hope this isn't a mistake.
Re:Ocaml (Score:4, Insightful)
Be aware of your contexts. (Score:3, Insightful)
Contexts can be deceiving.
Be careful not to use these charts to decide what language to learn or what language is better for a given solution.
Let's remember the web server ecosystems: cgi, c#, perl, java, python, php, ruby.
A given algorithm implemented in your language of choice can give you the upper hand and instant notoriety; but running the whole operation (labor/maintenance/testing) goes far beyond controlled environment testing.
Lately I've been thinking that the more powerful solution (language-wise) is the one that you can build and tear down from scratch in less time/effort. That gives you more confidence to try new/innovative solutions.
My 2 cents.
Comparison of C and CAS (Score:3, Insightful)
With a computer algebra system like Mathematica, you hardly need documentation. The code itself shows what is done. Here is an example which takes two pictures and produces a GIF movie interpolating them:
A=Import["image1.jpg"]; B=Import["image2.jpg"];
width=Length[A[[1,1]]]; height=Length[A[[1]]];
ImageInterpolate[t_]:=Image[(t A[[1]]+B[[1]] (1-t)),Byte,ColorSpace->RGB,ImageSize->{width,height}];
Export["mix.gif",Table[ImageInterpolate[k/50],{k,0,50}],"GIF"]
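For comparison, a rough Python sketch of the same per-pixel blend (t*A + (1-t)*B) on bare nested lists; there is no stdlib equivalent of Import/Export, so the image data here is made up:

```python
def blend(a, b, t):
    # Linear interpolation per channel: t*a + (1-t)*b, clamped to a byte
    return [[[min(255, max(0, round(t * ca + (1 - t) * cb)))
              for ca, cb in zip(pa, pb)]
             for pa, pb in zip(ra, rb)]
            for ra, rb in zip(a, b)]

A = [[[255, 0, 0]]]   # a single red pixel, standing in for image1.jpg
B = [[[0, 0, 255]]]   # a single blue pixel, standing in for image2.jpg

# 51 frames from pure B (t=0) to pure A (t=1), like the Table[..., {k,0,50}]
frames = [blend(A, B, k / 50) for k in range(51)]
print(frames[0][0][0], frames[50][0][0])  # [0, 0, 255] [255, 0, 0]
```

Real image loading and GIF writing would need a library like Pillow, which is where the C-style bookkeeping below comes back in.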
It takes over a minute to process. A simple C program doing the same is several times larger, but also needs several times less time to process. But it needs to be documented, because even simple things like reading in a picture take a lot of code:
fgets(buffer,1025,in);
if(strncmp(buffer,"P6",2)){
fprintf(stderr,"Unsupported file format (need PPM raw)\n");
exit(1);
}
do fgets(buffer,1025,in); while(*buffer == '#'); // get picture dimension
x_size = atoi(strtok(buffer," "));
y_size = atoi(strtok(NULL," "));
fgets(buffer,1025,in); // get color map size
c_size = atoi(buffer);
if((image = (char *) malloc(3*x_size*y_size*sizeof(char)))==NULL){
fprintf(stderr,"Memory allocation error while loading picture\n");
exit(1);
}
i = 0;
ptr = image;
while(!feof(in) && i<3*x_size*y_size){ *ptr++ = fgetc(in); i++;}
fclose(in);
But C is worth the effort. For more advanced image manipulation tasks, for example, Mathematica often can no longer be used, due to memory or just because it takes too long. (MathLink does not help very much here, since objects like a movie, a vector of images, can just not be fed into a computer algebra system without running into memory problems when it deals with the movie as a whole. For computer vision, for example, one needs to deal with large chunks of the entire movie.)
While the simplicity of programming with high level programming languages is compelling, speed often matters.
There is another nice benefit of a simple language like C: the code will still work in 20 years. Computer algebra systems evolve very fast, and much of what is done today no longer works tomorrow in a new version. Higher-level languages also evolve faster. And large chunks of internal CAS code are a "black box", invisible to the user. Both worlds make sense: the primitive, transparent and fast low-level language, and the slower but extremely elegant high-level language.
Re:Ruby (Score:1, Insightful)
In fairness, and as one of the commenters pointed out, YARV performs better than the standard Ruby interpreter it'll be replacing.
OTOH, Lua/LuaJIT beats Ruby and Python on all fronts; a lightweight, simple, speedy and expressive language. If we had to pick one scripting language on its merits rather than the personal prejudices of "I know language x" or hype, Lua would be it.
Re:Why is Verbosity Bad? (Score:4, Insightful)
There's good verbosity and bad verbosity. Take for example Java's System.out.println. This is bad because it is such a common function that you are bound to use it over and over again, but good, of course, because you know it is not a keyword (it comes from the standard library's System.out, not the language itself). The result is something twice as long as it needs to be that chews up screen space. For a human reader of the source, this becomes cumbersome and makes the full program harder to read.
Languages that allow you to pull in identifiers from other scopes can solve this issue while reducing the bad kind of verbosity. For example, in Perl there is a module called Exporter which allows symbols defined by the source package to be imported into the current scope through syntax like:
use Some::Package qw( someFunction );
Now instead of Some::Package::someFunction, for the rest of the file I can just do someFunction since I've already declared I'm importing it at the top. If the reader is interested in knowing where someFunction comes from, searching from the top of the file will reveal the use Some::Package qw( someFunction ); line.
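For what it's worth, the same scope-importing move exists in Python; here is a sketch using a real stdlib function so it runs:

```python
# Fully qualified: verbose at every call site
import os.path
print(os.path.basename("/tmp/report.txt"))  # report.txt

# Imported into the current scope, analogous to Perl's Exporter
from os.path import basename
print(basename("/tmp/report.txt"))  # report.txt
```

As with Perl, searching for the `from os.path import basename` line at the top of the file tells the reader where the short name came from.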
Re:Scala (Score:3, Insightful)
Re:Why is Verbosity Bad? (Score:3, Insightful)
(*) Assuming you don't "cheat" by not breaking your lines where normal programmers would.
And that's why I've often used "ELOC," or Executable lines of code.
An :)
ELOC
metric
for
this
would
count
this
as
a
single
sentence.
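A toy ELOC-style counter along these lines is easy to sketch in Python: counting statements via the parse tree means a statement broken across many physical lines counts once, which is the point of the vertical joke above. A real tool would do considerably more:

```python
import ast

def eloc(source):
    # Count executable statements in the parse tree, so a statement
    # split across many physical lines still counts as one.
    tree = ast.parse(source)
    return sum(isinstance(node, ast.stmt) for node in ast.walk(tree))

one_line = "x = 1 + 2"
many_lines = "x = (1 +\n     2 +\n     3)"
print(eloc(one_line), eloc(many_lines))  # 1 1
```

Both snippets score 1, even though the second spans three physical lines.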
Re:Related site... (Score:4, Insightful)
Re:Verbosity is bad because (Score:5, Insightful)
Nowhere near true. Objective-C is more verbose than C, yet still more expressive, and largely self-documenting to boot.
Re:What kind of verbosity? (Score:3, Insightful)
Verbose standard libraries can help immensely with keeping code self-documenting.
Re:Why is Verbosity Bad? (Score:3, Insightful)
The justice or injustice of your comment depends strongly on what metric they're using for size, and a lot of the replies to your comment have been speculating about what the metric really is. Okay, if you start with the game's FAQ [debian.org], it looks like there are three different metrics they actually computed:
Obviously these are going to give completely different results. Java will come out looking awful by measure #3, for instance. The difference between #1 and #2 has more to do with the distinction between terseness and expressiveness. You wrote "Do you make all of your variables and class/function/methods a single character for the sake of verbosity? I hope not." If someone did that, it would have a huge effect on #1, but little or no effect on #2. Actually #2 seems to me like a very reasonable measure of expressiveness.
So now the question is which one the author of the article actually used for his plots. He doesn't say explicitly which one he used. However, he has links to the source code he used to generate the plots, and to the data file he used as input. The data file only has metrics #1 and #3, so it looks like he didn't actually use metric #2, which IMO is the only one that measures expressiveness in the way he obviously intended. If you look through his lisp code, it looks to me like he actually used #1. (See the line that reads "(normalize-accross (normalize-accross data 'cpu) 'size))". My lisp is pretty weak, but I believe he's extracting the columns whose names begin with the strings "cpu" and "size.")
So it looks to me like he picked the wrong metric, which makes your criticism valid, but that just means he should have picked the right metric.
Even the right metric, #2, still wouldn't be a perfect proxy for expressiveness, of course. For instance, a language with static typing and no type inference will require the programmer to expend a lot of characters on declarations. However, that's a matter of taste; some people feel safer with static typing.
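If the three metrics are raw character count, gzipped size, and line count (my guess at #1-#3; the FAQ is the authority here), they are all trivial to compute:

```python
import gzip

def size_metrics(source):
    raw = source.encode("utf-8")
    return {
        "chars": len(source),                   # guess at metric #1: raw size
        "gzip_bytes": len(gzip.compress(raw)),  # guess at metric #2: compressed size
        "lines": source.count("\n") + 1,        # guess at metric #3: line count
    }

short_names = "def f(a):\n    return a + a\n"
long_names = "def duplicate(value):\n    return value + value\n"
m1, m2 = size_metrics(short_names), size_metrics(long_names)

# Renaming identifiers moves the raw size far more than the gzipped size,
# which is why a gzip-based metric tracks structure rather than name length.
print(m1["chars"] < m2["chars"])  # True
```

That asymmetry is exactly why a compressed-size metric would be the better proxy for expressiveness.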
Re:Forth, the RPN notational programming language (Score:5, Insightful)
I fell hard for FORTH in the mid-80's - it was the programming language of some instrumentation I had to work with. I never did program the instrumentation, but got all the books on the language and interpreters/compilers (the distinction blurs in FORTH...) for it for a number of computers. I did some useful utilities for the PC using "HS/FORTH". I benchmarked it against some other solutions for a course in comparative computer languages about 1985, and for heavily stack-based (i.e. very recursive) programs, nothing could touch it. I'm talking QuickSort faster than C.
But I have to admit: though I busted butt, re-writing and re-re-writing programs to make them as clear and readable as possible, though I re-read the remarkable "Thinking FORTH" by Leo Brodie many times (and would recommend it to anybody who wants to understand the craft of programming, whether they want to learn FORTH or not)....despite it all, most of my FORTH programs were hard to read a few months later.
There were things it was really good at - that is, the programs WERE readable later, the problem and language were a match - and others where I could just not seem to decompose the problem into FORTH words that fitted well together without a lot of "glue" to manipulate the parameters on the stack.
That was the most frequent problem with my FORTH programming - one ends up trying to manage several parameters on the stack and doing a lot of stack "DUP SWAP ROTATE" - actions to line up the parameters to hand to the next word. I would re-compose the words and the parameters they wanted to clean up some code, and find I'd made some other code all hard to read.
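For readers who never juggled a Forth stack, the words being described can be modeled in a few lines of Python (a toy model of the stack effects, not real Forth):

```python
def DUP(s):  s.append(s[-1])            # duplicate top of stack
def SWAP(s): s[-1], s[-2] = s[-2], s[-1]  # exchange top two items
def ROT(s):  s.append(s.pop(-3))        # rotate third item to the top

stack = [1, 2, 3]
ROT(stack)    # [2, 3, 1]
SWAP(stack)   # [2, 1, 3]
DUP(stack)    # [2, 1, 3, 3]
print(stack)  # [2, 1, 3, 3]
```

Three words just to line up parameters for the next call: this is the "glue" the parent is complaining about.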
FORTH was also profoundly command-line oriented, and when the world went all GUI-centric on me with no good "Visual FORTH" or "Tk/FORTH" on the horizon, I slipped from grace. I can't see getting back to it now, either; let's face it, a huge bonus for any programming language choice is its popularity, so that others will maintain your code, and so that you can get help and code fragments with a quick google.
But I still think that FORTH should be a completely MANDATORY learning experience in all University and Tech CompSci degrees. You can jump from C to Perl to Python far more easily than to FORTH - it really comes at problems from another angle and working with it for years has been an enormous asset to my non-linear thinking when it comes to problem-solving.
And perhaps if more students learned it, FORTH would rise in popularity for some problems, out of its decades-long sinecure in embedded systems (it started off programming radio telescopes, and undoubtedly still does...) Since it is inherently object-oriented (yes, an assembler-sized OO language from the 1970's, you heard that correctly) it would be an excellent interpretive, experimentation-friendly scripting language for applications. I'm currently needing to do a lot of VBA in Excel at work, and I have a strong suspicion I'd be twice as productive in "FORTH for Applications". It's a tragedy Bill Gates fell for BASIC (of all languages) instead.
Re:Why is Verbosity Bad? (Score:5, Insightful)
True but not very useful (Score:3, Insightful)
I think your statement is strictly speaking true but not useful in practice.
Here's what I mean: strictly speaking, with unlimited intelligence on the compiler's part, the compiler can understand what a program does and rewrite it completely as it wishes, as long as it conforms to the same behavior. This means any Turing-complete language can have the same performance, given a sufficiently intelligent compiler.
In practice and in current times, however, a language's features determine how well the state of the art in compilers can optimize a program. To give a very simple example: you don't see compilers inserting statements to free memory in Java programs, even though that would sometimes make them faster than running them with a garbage collector, as happens in practice.
Re:Why is Verbosity Bad? (Score:2, Insightful)
There's "verbosity" and there's "verbosity".
One form - which is I think how the original article used it - is that individual statements accomplish little work. For example, they move a byte into a particular location in contrast to languages which have statements that do things like apply filters to lists' content. These are languages which require verbosity on the part of the author to accomplish work.
These are more work to write because one must break tasks down further. They're also more work to read because the reader must assimilate a greater number of statements to grasp the work being performed from a higher level perspective.
Another form of verbosity is intended to provide additional context for subsequent readers of code. APL can provide the counter-example of this, where code can be so terse that the original intent is lost.
Too many still forget that the time of the reader of the code is more valuable than the time of the writer in most cases, for the simple reason that there's one writer but many readers (including the writer him/herself a few days/weeks/months/years later *grin*). Making code that's easier to read is important for maintenance and extension.
That kind of verbosity is a Good Thing.
island paradise (Score:5, Insightful)
When the first 4K TRS-80 showed up at my high school, I had the option to learn BASIC, Z80 machine code, or APL up the street at the local university.
One of my early exercises with BASIC was writing a set of nested for loops which called a subroutine (gosub). I put the next statement that controlled the for loop iteration inside the subroutine, and the return statement for the subroutine inside the nested for loops. It still worked! At that point I understood that there were mechanistic languages and languages with a solid conceptual basis.
APL's reputation for inscrutability was only halfway deserved. Often the problems arose when you were trying to shoe-horn a data structure that didn't want to be an array into an array, because that was your only hammer. Later APL supported nested arrays, which increased the data structuring options, but I think by then the PR battle was lost.
In the original APL, it was kind of painful to pass more than two arguments to an APL function. This led to programmers passing in flags to the function encoded in the array's rank, which were extracted to the tune of rho rho rho, while imagining the knapsack folding problem in Colossal Cave Adventure. As brutal as any language I've used. But you have to give APL a bit of a pass in some respects. Like vi, it was designed in 1963 to work well within the constraints of a paper teletype.
The next level of inscrutability arose because the APL primitives could often be combined in novel ways to yield surprisingly powerful algorithms. IIRC, the IBM 370 APL included a JIT compiler for certain common APL idioms. A one line program I wrote in APL to find primes (sieve of Eratosthenes) ran ten times faster than the compiled PASCAL program by the CS student sitting next to me.
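For reference, a Python sketch of the same sieve idea in the APL flag-vector style (nothing like a one-liner, but it shows the algorithm being compiled):

```python
def sieve(n):
    # Boolean mask over 0..n, the APL-style "flag vector"
    flags = [True] * (n + 1)
    flags[0:2] = [False, False]          # 0 and 1 are not prime
    for p in range(2, int(n ** 0.5) + 1):
        if flags[p]:
            # Knock out every multiple of p starting at p*p
            flags[p * p :: p] = [False] * len(flags[p * p :: p])
    return [i for i, f in enumerate(flags) if f]

print(sieve(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

The whole-array slice assignment is the closest Python gets to APL's bulk operations, and it is exactly the kind of idiom a JIT can recognize.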
Understanding APL was a lot easier if you were familiar with functional programming languages, but these hadn't been invented yet. Hey, I didn't know this: the Wikipedia page credits APL as a direct influence on FP, which I first heard of in 1982. Father knows best.
So you encounter this unfamiliar pattern of 15 familiar symbols for the first time, and your brain is polluted with horrible iterative solutions from BASIC or PASCAL, and the beauty of the expression is denied to your limited frame of consciousness.
Like solving a Sudoku? Hardly. It takes me twenty minutes to solve a typical five star Sudoku. It used to take me about the same amount of time to puzzle out an unfamiliar APL one liner, which might be anywhere from 10 to 40 characters. There is one small difference: after decoding the APL algorithm, I usually slapped myself across the head and moaned to myself, "I am unworthy to drool on the shoe laces of the grand designer, but I will learn!" Never got that feeling from Sudoku.
Wrestling with the higher art of APL was like giving your ignorance a root canal. Sometimes the root canal made me barf up my milk: when the highest art of APL was applied to shoe horn a data structure unsuitable to array representation into an array representation anyway, like the Beethoven scene in Clockwork Orange.
The third case is where the one liner isn't all that difficult, but it's doing it in more dimensions than the brain wishes to visualize. This is a case where a picture is worth a thousand words. Your 20 character APL function would have been better presented as a caption on a one page UML diagram. Never figured out how to embed a UML diagram in an APL lamp statement on my VT100 terminal.
Another problem APL suffered was too much kinship with Forth. To thrive in APL, you needed to create hundreds of tiny APL functions, which the implementations of the day mashed together into a single unmaintainable workspace.
And the system interface tended to suck.
But other than that, what's not to like?
I could have composed instead a tedious, but germane post on Shannon's first law: concision is a function of preconception. It's a rare breed of programmer who thrives in a language which provides su
Re:This is totaly stupid (Score:3, Insightful)
The verbs, nouns, semantics and such used in a given programming language have nothing, I repeat... NOTHING to do with performance!
What does have to do with performance is the talent of the compiler / interpreter author, nothing more, nothing less.
The empirical evidence then tells us that the authors of assemblers are more talented than those of C compilers, who are in turn more talented than the authors of JIT compilers and dynamic-language interpreters. Put another way, you're wrong.
Re:Why is Verbosity Bad? (Score:3, Insightful)
For example, I consider Scheme more expressive than, e.g., COBOL (at least the older standards, haven't seen the newer ones), since COBOL has no notion of lexical closures, first-class functions etc. On the other hand, Bourne shell does not have these notions either, yet shell scripts tend to be quite compact when applied to tasks for which shell is a suitable tool. Scheme actually has to go out of its way if you want to write equivalent scripts of comparable length in it, which is why projects like scsh have emerged. I find it really difficult to correlate expressiveness and verbosity in any simple way.
A language does not have performance (Score:4, Insightful)
Performance is created by the compiler, not the language. A C program compiled with a shitty compiler is going to run slower than a Ruby one in a good VM, even though C is running native on the CPU. For that matter, what if I take the C code and compile it with the CLR as a VM target?
I wish people would stop trying to compare languages by performance, it does not make any sense. The only language it makes any sense for at all is assembler.
Re:Ocaml (Score:3, Insightful)
It's closer to how people who have programmed in imperative/OO languages for many years think. Do you really think for example (python-ish metacode):
res = emptylist()
for i in aList:
    if i.isFnord():
        res += i
is simpler than (haskell-ish):
filter isFnord aList
I find it tends to be a case of exactly what you are trying to do. Some problems, such as filtering a list, tend to get expressed functionally (though I like python's list comprehension syntax "[i for i in aList if i.isFnord()]"), and the procedural approach looks and feels clunky by comparison. On the other hand there are other problems that are expressed naturally in a more procedural way (often because we think of them in terms of a logical sequence of state based steps) that take some work to wrap up in a functional way. I don't think it is clear that one approach is the right one for all problems, and personally I find I prefer languages that provide expressiveness in both approaches so I can use whatever is more clear.
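Here is a runnable Python version of both snippets above, with Item and isFnord filled in as dummy placeholders from the parent post:

```python
class Item:
    def __init__(self, fnord):
        self.fnord = fnord
    def isFnord(self):
        return self.fnord

aList = [Item(True), Item(False), Item(True)]

# Imperative version: explicit loop and accumulator
res = []
for i in aList:
    if i.isFnord():
        res.append(i)

# Functional and comprehension versions
res2 = list(filter(Item.isFnord, aList))
res3 = [i for i in aList if i.isFnord()]

print(res == res2 == res3)  # True
```

All three spellings are available in the same language, which is the parent's point about preferring languages expressive in both styles.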
These benchmarks suck (Score:3, Insightful)
I'm sure there are others, but I have work to do.
Re:Why is Verbosity Bad? (Score:4, Insightful)
Re:Pet peeve (Score:4, Insightful)
Yes, of course you are right. And more recently we actually have languages with coherent and consistent behavior across implementations (somewhat of a novelty; just look at C implementations). There are several Ruby interpreters that run the Rails framework now. The fastest one (or so it is often claimed) is JRuby, which runs on top of the Java virtual machine, which in turn has many implementations on a wide variety of hardware, optimized for anything from embedded hardware to CPUs with thousands of cores, with impressive levels of compatibility given these differences. So saying language X is faster than language Y is a quite meaningless statement these days. Faster on what, under what conditions, at what required level of stability, and with what kind of benchmark? Most C programs are fast, except they have all but one core on the CPU idling, because threading and concurrency are very hard to bolt onto a C program. Which is why some performance-critical messaging servers are done in languages like Erlang.
Most C programmers believe it is the closest thing to native aside from assembler. Probably correct if you ignore 40 years of progress in the hardware world, but downright naive in light of modern-day x86 processors. Technically, x86 is not a native instruction set anymore but a virtual machine language that happens to have an in-hardware translation to the 'real' instruction set. Like all such translations, it comes at a cost of lost flexibility, and indeed performance. But it is worth avoiding having to rewrite all those pesky compilers. So rather than shatter the assumptions C programmers make, chip makers actually support them by sacrificing transistors. The only real difference between the Java and C execution models (aside from better defined semantics in Java) is that Java does the translation in software, at run time, taking into account performance characteristics of both the running program and the underlying hardware. That, in a nutshell, is why the LLVM project exists to do the same for C programs (and indeed many other languages, including Java).
Of course you have to take into account the levels of abstraction and indirection provided in application frameworks as well. Those come at a cost, and that cost is typically high in scripting languages (but you get so much in return). Java is often unfairly compared to C. I say unfairly here because it is usually a comparison of application frameworks rather than of the language implementations. Java is, in 'laboratory' conditions, quite fast, even comparable to C, and applies all the same performance optimizations (and then some). Except nobody programs Java that way (except some dude at work who managed to produce the suckiest Java class ever, but that's a different story). Similarly, C mostly lacks the rich frameworks that are common in the Java world. Transcode a C program to Java and you end up with a code base that is still pretty fast (e.g. Quake 2 has a Java port). Stupid people in both camps believe that it is an either-or type decision between the two and that what you end up with is somehow inherent to the language. Clever engineers know that 95% of their code is not performance critical at all (i.e. the CPU is idling most of the time), and that it makes a hell of a lot of sense to do whatever is necessary to get that 5% performing as well as possible, if the cost in terms of stability, productivity, security, portability, etc. is low enough. That's why server-side C is not a commonly required job skill.
That's why Apple used the LLVM platform to port Mac OS X to mobile phones, and why Google chose to implement a heavily customized Java VM on top of familiar C/C++ components to emulate their success. Good engineers know when to choose what. Both the iPhone and Android are excellent achievements that move beyond the stupid status quo of wondering which language is better.
Re:island paradise (Score:3, Insightful)
Lisp, arguably the grandfather of a lot of programming languages, was specified around 1958. While the first efficient implementations were a long time coming from that, I don't think you can claim APL was a forerunner to FP languages.
Nice post otherwise :-)
Re:Why is Verbosity Bad? (Score:5, Insightful)
verbosity != self-documented
Ruby example:
do_something() if some_time > 7.days.ago
To me, this line of Ruby code is perfectly clear and self-documented. It is also about as short as you could make it. If I had to write the same code in Java, it would be long enough that I would feel it required a comment.
This is where so many programmers go wrong. The code is self-documenting in the respect that you know it's checking the age of something to see if it was 7 days ago, but you have no idea WHY it's checking that. Why not 6 days ago? What if it's 8 days ago?
Comments should say WHAT and WHY you are doing something, not HOW you are doing it. The HOW is the code itself. If someone were to look at that code, would they know what your intent was? Were you looking for something that was exactly 7 days old, or something 7 days old or older?
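For comparison, the closest plain-Python spelling of the same check, with the WHY pulled into a comment as suggested (the 7-day staleness policy is just the example's, and the function name is made up):

```python
from datetime import datetime, timedelta

def needs_refresh(some_time, now=None):
    # WHY: entries older than 7 days are considered stale (hypothetical policy)
    now = now or datetime.now()
    return some_time < now - timedelta(days=7)

now = datetime(2024, 1, 15)
print(needs_refresh(datetime(2024, 1, 1), now))   # True: 14 days old
print(needs_refresh(datetime(2024, 1, 10), now))  # False: 5 days old
```

The comparison itself is as self-documenting as the Ruby one-liner; the comment carries the intent the code cannot.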
Re:Related site... (Score:1, Insightful)
One way to identify those programmers is that they don't blame their tools for their own incompetence.
I was with you until you said this. I've heard "it's a poor workman who blames his tools" too often. It's a worse one who can't tell good tools from bad.
Re:What kind of verbosity? (Score:3, Insightful)
for(object o in list) {
Item item = (Item) o;
System.Out.Println(item);
}
What language is this? If it's Java, maybe:
for(Item o : list) {
System.Out.Println(o);
}
Although Java 1.4 would have been more painful...
http://leepoint.net/notes-java/flow/loops/foreach.html [leepoint.net]
Also I guess there are times when strong typing is considered expressive and when it is considered overly verbose.
In a large application, weak typing sometimes is a PITA, in a short script it is lovely.
Re:What kind of verbosity? (Score:3, Insightful)
One suspects I'd have found that bug pretty rapidly.
If you can point me at a language that guarantees bug-free programs, I'd like to see it.
Re:What kind of verbosity? (Score:4, Insightful)
I prefer :)
print foreach @list;
though. Perl FTW as usual
Re:Why is Verbosity Bad? (Score:5, Insightful)
Vermont ski areas have more words for "snow" than the Inuit.
But they all mean "ice".
Re:what about APL (Score:3, Insightful)
Re:Related site... (Score:3, Insightful)
I'd prefer a language which restricted my thinking in beneficial ways to one which left me entirely my own devices.
"Restricted [...] thinking"? So instead of doing more with the same tool, you want to do less?? And buy more other tools to do the rest???
Honestly, I see "restricted thinking" as applicable only to developers who are not mature yet or refuse to improve themselves. Just like the guilds of the past vs. the industry of today: in the past there were skillful craftsmen; now we have some random folks off the street who simply punch a button for 8 hours with a lunch break.
I hope that llama-isation of software development happens after my retirement....
Re:what about APL (Score:2, Insightful)
Some of us have submitted programs in APL, or its younger sibling J, to the shootout (see http://www.mail-archive.com/general@jsoftware.com/msg02859.html [mail-archive.com]). However, since the rules of the shootout specify the algorithm you have to use, you end up writing C programs in APL or J which is no way to take advantage of the power and expressiveness of these languages.
It's like taking part in a poetry contest where you're only allowed baby-talk.