Why New Programming Languages Succeed Or Fail

snydeq writes "Fatal Exception's Neil McAllister discusses the proliferation of programming languages and what separates the successful ones from obscurity. 'Some people say we don't need any more programming languages at all. I disagree. But it seems clear that the mainstream won't accept just any language. To be successful, a new language has to be both familiar and innovative — and it shouldn't try to bite off more than it can chew. ... At least part of the formula for success seems to be pure luck, like a band getting its big break. But it also seems much easier for a language to shoot itself in the foot than to skyrocket to stardom.'"

  • Smalltalk? (Score:2, Informative)

    by Anonymous Coward on Friday March 16, 2012 @09:15AM (#39376149)

    No debugging/syntax checking IDE for dynamic languages? Not if you count Smalltalk as dynamic. Smalltalk arguably works better for debugging, syntax checking, and more than static languages do, because of the VM concept. There can be a slight delay, especially when connecting to a VM remotely (GemStone, for instance), but Smalltalk's IDE even lets you program in the debugger. You haven't seen real TDD until you've watched someone write an entire app in the debugger of their Smalltalk of choice. It also supports things like real-time unit tests that run as you type, direct debugging from web pages, runtime code changes, etc. The idea is that it is always runtime.

    Unfortunately, Smalltalk is too "weird" for some people, while others don't get it (example: power via simplicity). And then of course there was the vendor greed factor, incompatible dialects, etc.

    I for one love working in Smalltalk and think files are a stupid concept, particularly for software development. OO source control? Yes, please. People are just too set in their ways, and that's why so many half-measures offering a comfortable middle ground have picked up momentum instead. I think a lot of people really do want something like Smalltalk and don't realize it.

    I work with C, C++, Objective-C, F#, C#, Java, Scala, Ruby, Python, Smalltalk, and a few others. I find it really funny that people are so excited over Ruby, for example, when it just feels like a crippled, inconsistent, buggy Smalltalk. I do like Ruby, but it feels like a toy in so many ways, and I end up using Scala or, gasp, Java, Objective-C, or C# instead for anything real. I am happy to see newer projects like Pharo, but without a large war chest behind it, I am sure Smalltalk will remain a dirty secret, an unheralded language. At least the ideas from Smalltalk have made computing as a whole much better overall. It took too long, though, and we still haven't caught up.

  • by master_p ( 608214 ) on Friday March 16, 2012 @09:34AM (#39376303)

    Ada did not fail at all. It is used for exactly what it was designed for: mission critical defense applications.

    Ada was not designed for intranet, web, mobile, or desktop applications, although it can do those things really well.

  • by Animats ( 122034 ) on Friday March 16, 2012 @12:02PM (#39378667) Homepage

    Java succeeded because Sun 1) gave it away, and 2) threw money at giving it away. Remember "applets"? Java was supposed to be the programming language of the Web. That didn't work out. It ended up being the new COBOL, which was not Sun's intent.

    Some languages fail, or get stuck, because the designer is in love with their own implementation. That happened to Pascal and Python. Wirth's own Pascal implementation was a cute little recursive-descent compiler that generated RPN byte codes, much like a Java compiler does. Wirth resisted changes to the language that would have allowed programming in the large, and ISO Pascal reflects his biases. So Pascal became stuck in an educational niche. The original Macintosh software was all written in an extended Pascal, as was much '80s software. But everybody had a different dialect: there were Turbo Pascal, Clascal, and a few others. They never merged.
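
    For what it's worth, here's a toy sketch of that compiler style (my own illustration, nothing like Wirth's actual code): a recursive-descent parser in C with one function per grammar rule, emitting RPN stack operations in a single left-to-right pass:

        /* Toy recursive-descent compiler: parses single-digit
           expressions with + and -, emitting RPN "byte codes"
           (printed as text here) as it parses. */
        #include <stdio.h>
        #include <ctype.h>

        static const char *p;   /* cursor into the source text */

        static void skipws(void) {
            while (isspace((unsigned char)*p)) p++;
        }

        /* term := digit */
        static void term(void) {
            skipws();
            if (isdigit((unsigned char)*p))
                printf("PUSH %c\n", *p++);
        }

        /* expr := term (('+' | '-') term)* */
        static void expr(void) {
            term();
            skipws();
            while (*p == '+' || *p == '-') {
                char op = *p++;
                term();
                skipws();
                printf("%s\n", op == '+' ? "ADD" : "SUB");
            }
        }

        int main(void) {
            p = "1 + 2 - 3";
            expr();   /* emits: PUSH 1, PUSH 2, ADD, PUSH 3, SUB */
            return 0;
        }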

    Modula, Wirth's second try, was also crippled in certain ways. Modula-2 was better. Modula-3 was good enough to be used to write an operating system kernel. Unfortunately, Modula-3 was only used within DEC, which died after being acquired by Compaq.

    Python has some of the same problems. The feature set of Python reflects what is easy to implement in a naive interpreter like van Rossum's CPython. Internally, everything is an object, even integers and floats, and attribute access involves dictionary lookups. This makes CPython slow. Every attempt to speed up Python substantially has hit a wall, including Google's "Unladen Swallow" effort. (PyPy is making progress, but it has taken a decade and requires an incredibly complex internal combination of interpreters and compilers.)
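
    To make the cost concrete, here's a minimal C sketch (my own illustration, not actual CPython source) of the difference between dynamic, dictionary-style attribute access and a static field at a fixed offset:

        #include <stdio.h>
        #include <string.h>

        /* Stand-in for a boxed object whose attributes live in a
           string-keyed table, roughly how a naive interpreter
           represents values. */
        typedef struct { const char *key; double value; } Slot;
        typedef struct { Slot slots[8]; int nslots; } Object;

        /* Dynamic lookup: compare string keys on every access.
           Real dictionaries hash rather than scan, but the
           per-access indirection is the point. */
        static double get_attr(const Object *o, const char *name) {
            for (int i = 0; i < o->nslots; i++)
                if (strcmp(o->slots[i].key, name) == 0)
                    return o->slots[i].value;
            return 0.0;
        }

        /* Static layout: 'x' is a compile-time offset, one load. */
        struct Point { double x, y; };

        int main(void) {
            Object p = { { { "x", 1.0 }, { "y", 2.0 } }, 2 };
            struct Point q = { 1.0, 2.0 };
            printf("dynamic x = %g\n", get_attr(&p, "x"));
            printf("static  x = %g\n", q.x);
            return 0;
        }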

    The biggest disappointment to me has been that we're still stuck with C. C has two killer bad design decisions: the language doesn't know how big arrays are, and the "pointer = array" thing lies to the language. Both reflect how things are done in assembler, and the fact that the original compiler had to fit in a 128K PDP-11. Most of the millions of buffer overflows and crashes that occur daily can be traced to those two design decisions. (C++, as I point out occasionally, tries to paper over these problems with collection classes. But the mold usually seeps through the wallpaper, since most operating system and library calls want raw C pointers.)
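
    A short C program (my own example) shows both decisions at once: the array decays to a bare pointer at the call boundary, the callee has no idea how big the buffer is, and the deliberate overflow compiles without a peep:

        #include <stdio.h>
        #include <string.h>

        /* The parameter is really 'char *buf'; the array's size
           is gone by the time we get here. */
        static void fill(char *buf) {
            printf("callee sizeof(buf)   = %zu\n", sizeof(buf)); /* pointer size */
            /* Deliberate overflow: nothing in the language can stop it. */
            strcpy(buf, "far longer than eight bytes");
        }

        int main(void) {
            char small[8];
            printf("caller sizeof(small) = %zu\n", sizeof(small)); /* 8 */
            fill(small);   /* undefined behavior: writes past the end */
            return 0;
        }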
