Programming Languages You'll Need Next Year (and Beyond)
Nerval's Lobster writes: Over at Dice, there's a breakdown of the programming languages that could prove most popular over the next year or two, including Apple's Swift, JavaScript, CSS3, and PHP. But perhaps the most interesting entry on the list is Erlang, an older language invented in 1986 by engineers at Ericsson. It was originally intended specifically for telecommunications needs, but has since evolved into a general-purpose language, and found a home in cloud-based, high-performance computing where concurrency is needed. "There aren't a lot of Erlang jobs out there," writes developer Jeff Cogswell. "However, if you do master it (and I mean master it, not just learn a bit about it), then you'll probably land a really good job. That's the trade-off: You'll have to devote a lot of energy into it. But if you do, the payoffs could be high." And while the rest of the featured languages are no-brainers with regard to popularity, it's an open question how long it might take Swift to become popular, given how hard Apple will push it as the language for developing on iOS.
Over at Dice? (Score:5, Insightful)
Over at Dice
But we are at Dice, sir [whois.net]:
Pros: Today's article has more content than the usual Dice front page linkage. Great article if you're not a programmer but feel stymied by the wide assortment of languages out there. Although instead of hemming and hawing before starting your first project, you're better off listening to Winston Churchill and sticking your feet in the mud: "The maxim 'Nothing avails but perfection' may be spelt shorter -- 'Paralysis.'"
...
Cons: It barely scratches the surface of an incredibly deep topic with unlimited facets. And when one is considering investing potential technical debt in a technology, this probably wouldn't even suffice as an introduction, let alone a table of contents. Words spent on anecdotes ("In 2004, a coworker of mine referred to it as a 'toy language.'" like, lol no way bro!) could have been better spent on things like lambdas in Java 8. Most interesting on the list is Erlang? Seems more like a random addition that could just as easily have been Scala, Ruby, Groovy, Clojure, or Dart -- whatever cool hip thing we're playing with today that doesn't seem to quite pan out on a massive scale.
Repeat after me... (Score:5, Insightful)
CSS3 is not a programming language. No more than HTML is.
If you want to earn big bucks... (Score:5, Insightful)
Learn C++, Java or C# and get yourself a job at a big corporate.
But hey, if you want to be a hipster coder and dick about all day doing "groovy" websites at some here-today-gone-tomorrow startup, earning fuck all, then by all means go down the web development route along with every other 14-year-old school kid.
Erlang? Nice language, but too niche. It never really got momentum outside telecoms, and it's probably too late for it now.
I don't... (Score:2, Insightful)
Re:Repeat after me... (Score:3, Insightful)
Re:Repeat after me... (Score:5, Insightful)
I've been writing software for a good 18 years now and I've never been limited by not knowing CSS. However, if I reach that limit I'm pretty sure I can pick it up like every other programming or markup language that I've needed.
Web = Garbage (Score:5, Insightful)
Next year, the languages you'll need will still be C, C++ and Java. Maybe some C#, Python or Bash. The year after that, you'll still be using C, C++, and Java. Maybe some C#, Python or Bash.
By 2020, the main difference is that you'll be working with machine-learning DSLs and libraries to program/train memristor based devices. But you'll still be using C, C++, and Java. Maybe some C#, Python or Bash.
Re:Over at Dice? (Score:5, Insightful)
Re:Erlang is overrated crap (Score:2, Insightful)
We rewrote this 9 months of Erlang development in 3 weeks (!) using one senior Java developer. It worked like a charm and still runs flawlessly in production today.
Then your project was a very poor fit for Erlang in the first place.
Fundamentals of Comp Sci (Score:4, Insightful)
Re:We'll "need" Swift? (Score:5, Insightful)
Need? No. You can still use Objective C if you want to code iOS/OS X. Want? Yes.
And while the rest of the featured languages are no-brainers with regard to popularity, it's an open question how long it might take Swift to become popular, given how hard Apple will push it as the language for developing on iOS.
Apple does not have to push very hard. After looking at it and Objective C, it doesn't take a genius to see why programmers would prefer it over Objective C.
Re:Repeat after me... (Score:5, Insightful)
Sure, but a programmer that doesn't know CSS is pretty limited!
The fact that you think not knowing CSS will make a programmer limited showcases that your programming experience is limited to front-end development. And that is sad.
The programming language for the next 20 years... (Score:5, Insightful)
C. Plain old C.
Entire Operating Systems are written in it. Userland tools for those operating systems are usually written in it. Any self-respecting developer knows at least C. The rest is just like fashion tips: next year they're outdated.
Although, as much as I hate to admit it, the same could be said for Java...
Re:If you want to earn big bucks... (Score:5, Insightful)
Re:Author thinks strong typing == static typing? (Score:3, Insightful)
Re:So much Fail. Ignore. (Score:2, Insightful)
Heh, even if you have a variable that *IS* final, you can *STILL* change its value at runtime.
Java doesn't enforce final at the bytecode level.... It's a compile-time hint and the values can still be modified at runtime.
It's one of the main differing qualities of C++ vs Java: const is enforced much more strictly than Java's final.
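As a sketch of that claim: on a standard HotSpot JVM, a non-static final field of an ordinary class can still be overwritten via reflection after setAccessible(true) (the Config/mutateFinal names here are made up for the example; this is illustrative, not a recommended technique):

```java
import java.lang.reflect.Field;

public class FinalDemo {
    static class Config {
        final int limit;                 // not a compile-time constant,
        Config(int limit) { this.limit = limit; } // so reads aren't inlined by javac
    }

    static int mutateFinal() throws Exception {
        Config c = new Config(42);
        Field f = Config.class.getDeclaredField("limit");
        f.setAccessible(true);           // lift the access + final check
        f.setInt(c, 99);                 // overwrite the "final" field
        return c.limit;                  // reads back the new value
    }

    public static void main(String[] args) throws Exception {
        System.out.println(mutateFinal()); // 99
    }
}
```

Note that if limit were initialized with a constant expression, javac would inline its reads at compile time and the mutation would appear invisible -- which is exactly the kind of half-enforcement the comment above is complaining about.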
Also C++ templates aren't by-default type-erased like Java's generics. But in C++ type-erasure is a pattern that I can choose to use or not.
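The erasure point is easy to see directly: two lists with different generic type arguments share one runtime class, because the type arguments exist only at compile time (class and method names here are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    static boolean sameRuntimeClass() {
        List<String> strings = new ArrayList<>();
        List<Integer> ints = new ArrayList<>();
        // Generic type arguments are erased during compilation, so at
        // runtime both objects are instances of the same raw ArrayList class.
        return strings.getClass() == ints.getClass();
    }

    public static void main(String[] args) {
        System.out.println(sameRuntimeClass()); // true
    }
}
```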
Lately I do work that garbage collection simply doesn't work well for. I much prefer handling unique_ptr/shared_ptr myself, choosing when, where, and in what order stuff gets freed. Big-data-type stuff with lots of CPUs and massively parallel algorithms using in-memory working sets over 1 TB.
The older I get, the more I realize that C and C++ were the only languages I ever really needed, despite learning easier languages first. (BASIC/QBASIC/VB/Perl/Python/Java/C++/C)
Now I wish I'd started with C++. Seriously. It's all I really need: Python for quick prototyping crap, C++11 for the rest.
Re:My thoughts on these selections. (Score:4, Insightful)
Trust me, from a guy who's dealt with COBOL and Java: they're nothing alike in either corporate philosophy or boat-anchor coding style. For better or worse, Java and C# are essentially analogs in terms of what you can 'do' with them. Java sucks more at UIs and lacks some syntactic sugar that makes your life easier, while C#/.NET lacks the trillion toolkits used in Java for pretty much any common need. Many popular Java libs are ported to .NET, but there's still a boatload you'll only find in Java land for now. Let's not belabor the point. A million fanboys will jump on it, but from a language standpoint they're so close it shouldn't matter.
PHP is a simple language for beginners, and it got its entrenched status because some novice PHP devs wrote some great sites and tools that people organically grew around. It's a lousy language with a very specific use case. I've never used RoR, but it sounds about the same, just wrapped in a sexier buzzword.
Erlang, like all functional languages, is very useful in the limited number of business areas where it rocks, and inevitably the evangelists of these languages trumpet how they're great for everything and the kitchen sink. But we all know they aren't, and they will continually be relegated to the areas where they shine. Hybrids like Scala have a chance, but frankly I'd hate to sit in on a dev lead's meeting at a Scala shop laying down the law on when to be strictly functional no matter how broken it makes the code, and when to just use other paradigms that work better, simpler, and faster to develop.
Re:Swift maybe, Erlang, really? (Score:4, Insightful)
the iOS development market is dried up for most developers, unless you can get a job at some company that wants an iOS app.
Umm, duh? I don't understand the point of this. That's like saying the nursing market is pretty dried up, unless you can get a job in healthcare.
Re:So much Fail. Ignore. (Score:5, Insightful)
GC is not about forgetting to free memory. It's about higher level abstraction removing the need for the programmer to do the bookkeeping that the machine can do. Why don't we still program in assembler? Because it's less productive. It's about productivity. As data structures become extremely complex, and get modified over time, keeping track of the ownership responsibility of who is supposed to dispose of what becomes difficult to impossible, and is the source of memory leak bugs. In complex enough programs, you end up re-inventing a poor GC when you could have used one that is the product of decades of research.
The article fails to understand that you can also run out of memory in a program using GC. Just keep allocating objects while keeping references to everything you allocate.
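That failure mode is the classic "leak under GC": objects stay reachable, so the collector must keep them. A minimal sketch (the cache/handleRequest names are invented for the example):

```java
import java.util.HashMap;
import java.util.Map;

public class LeakDemo {
    // Entries added here are never removed, so even a perfect GC
    // can never reclaim them -- the map keeps them reachable.
    static final Map<Integer, byte[]> cache = new HashMap<>();

    static int fillCache(int n) {
        for (int i = 0; i < n; i++) {
            cache.put(i, new byte[1024]); // "cached" forever -> grows without bound
        }
        return cache.size();
    }

    public static void main(String[] args) {
        System.out.println(fillCache(10_000)); // 10000 live objects the GC must retain
    }
}
```

Run this with a large enough n (or a small enough -Xmx) and you get an OutOfMemoryError despite the GC, which is exactly the point above.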
Reference counting is not real GC. Cyclic data structures will never get freed using reference counting alone.
One of the major, but under-recognized benefits of GC, which the article fails to mention, is that GC allows much simpler 'contracts' in APIs. No longer is memory management part of the 'contract' of an API. It doesn't matter which library or function created an object, nobody needs to worry about who is responsible for disposing of that object. When nobody references the object any more, the GC can gobble it up.
On the subject of virtual machines, the article could mention some of the highly aggressive compilation techniques used in JIT compilers. For example, every instance method call in Java is virtual by default. But a JIT compiler knows when only one loaded subclass implements a particular method, and can make all calls to that method non-virtual. If another subclass is loaded (or dynamically created on the fly), the JIT can recompile all callers so those calls become virtual again. Even then, the JIT may be able to prove that certain call sites only ever reach a specific subclass, so those can stay non-virtual.
The JIT compiler in the JVM can aggressively inline small methods. But if a class gets reloaded on the fly such that the body of an inlined method changes, the JIT will know to recompile every other method that inlined it. Depending on the changes, it may or may not still make sense to inline the method -- so the inlining decision can change based on actual need.
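The shape HotSpot exploits can be sketched as a monomorphic call site: an interface with only one loaded implementation, where the JIT can typically devirtualize and inline the call (the Shape/Square names are made up for illustration; the devirtualization itself happens inside the VM and isn't observable from this code):

```java
public class DevirtDemo {
    interface Shape { double area(); }

    // As long as Square is the only Shape implementation ever loaded,
    // HotSpot can treat calls to area() as non-virtual and inline them.
    static final class Square implements Shape {
        final double side;
        Square(double side) { this.side = side; }
        public double area() { return side * side; }
    }

    static double total(Shape[] shapes) {
        double sum = 0;
        for (Shape s : shapes) sum += s.area(); // monomorphic call site
        return sum;
    }

    public static void main(String[] args) {
        Shape[] shapes = { new Square(2), new Square(3) };
        System.out.println(total(shapes)); // 13.0
    }
}
```

If a second Shape implementation were loaded later, the JIT would deoptimize and recompile total() with a real virtual dispatch -- the recompilation behavior described above.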
The HotSpot JVM dynamically profiles code and doesn't waste time and memory compiling methods that do not have any significant effect on the system's overall performance. The profiling can vary depending on factors that vary from system to system, and could not be predicted in advance when using a static compiler. The JIT compiler can compile your method using instructions that happen to exist on the current microprocessor at runtime -- something that could not be determined in advance with a static compiler.
All of this may seem very complex. But it's why big Java systems run so darn fast. Not very many languages can have tens or even hundreds of gigabytes (yes GB) of heap with GC pause times of 10 ms. Yes, it may need six times the amount of memory, but for the overall benefits of speed, the cost of memory is cheap.
Functional Programming? (Score:4, Insightful)
A functional language is one whereby the functions themselves can be stored in variables and passed around as parameters to other languages.
What in the actual fuck. That may be the worst definition of a functional language I've ever heard. Even if I try to interpret it as something that could make any sort of sense, I just get that storing functions in variables makes a language functional, which the author goes on to debunk by pointing out that C++ isn't a functional language. Why bother even trying to describe them if you have no idea what the hell they are?
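For what it's worth, the property the article fumbles -- functions stored in variables and passed as arguments -- is just first-class functions, and Java 8 itself has them without anyone calling Java a functional language. A minimal sketch (applyTwice is an invented helper for the example):

```java
import java.util.function.Function;

public class FirstClassDemo {
    // A function received as a value and applied like any other argument.
    static int applyTwice(Function<Integer, Integer> f, int x) {
        return f.apply(f.apply(x));
    }

    public static void main(String[] args) {
        // Store a function in a variable, then pass it around.
        Function<Integer, Integer> inc = n -> n + 1;
        System.out.println(applyTwice(inc, 40)); // 42
    }
}
```

First-class functions are necessary for functional programming, but nowhere near sufficient -- which is why the article's definition also fails its own C++ counterexample.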
Re:The programming language for the next 20 years. (Score:4, Insightful)
But C is a low level language. Not the best tool for writing applications.
Higher-level languages and managed runtime systems have gained so much traction for a reason. They are very productive to use. They protect you from simple mistakes. They relieve the burden of memory management. GC simplifies library APIs by making the question of who should dispose of what irrelevant. We could still be programming in assembly language instead of C. Why aren't we? Why aren't OSes written in assembly? Because C is more productive and higher level. Similarly, there are languages higher level than C, and they have their place. C is not the be-all and end-all of abstraction.
Re:Repeat after me... (Score:2, Insightful)
Speak for yourself. I know plenty of coders who would rather use CSS or HTML than a lot of "actual" programming languages. And they could code circles around either of us. The more pointlessly negative you are about HTML and CSS the more you're setting yourself back these days. Being an exceptional web programmer can be just as valuable as being an exceptional C++ programmer. It's just that lots of coders are mediocre web devs and choose to blame their tools instead of owning up to the fact that they're doing it to themselves at this point.