The Technologies Changing What It Means To Be a Programmer 294
snydeq writes Modern programming bears little resemblance to the days of assembly code and toggles. Worse, or perhaps better, it markedly differs from what it meant to be a programmer just five years ago. While the technologies and tools underlying this transformation can make development work more powerful and efficient, they also make developers increasingly responsible for facets of computing beyond their traditional domain, thereby concentrating a wider range of roles and responsibilities into leaner, more overworked staff.
Re:Some of us do still assemble, even now (Score:5, Interesting)
Just because you can doesn't mean you should. 30 years ago, applications were built with long life spans in mind, so dropping into very low-level code could make financial sense. Today, programs are generally designed for adaptability and compatibility, and the target is constantly moving for the vast majority of applications out there. Dropping into assembly rarely makes sense anymore; the mantra is "good enough" rather than "best," since even the "best" won't stay that way for long.
Of course, different industries will have different mileage. If you do most of your work for embedded devices or industry-niche things like robotics or satellites, then by all means dive into the 1's and 0's.
Re:Not changed much (Score:5, Interesting)
I'm not sure I agree 100%.
As a senior engineer today, I'm responsible not only for knowing the primary back end language (C#) for my web application, but also HTML5, CSS3, JavaScript, and the UI toolkit library. I'm also expected to understand database design, including being able to have an intelligent conversation about the physical schema, and I'm expected to be able to design tables, indexes, and optimize queries. I also have to know security, build tools, the automation scripting language for my platform (now PowerShell), and the deployment system (WiX to create MSIs). I'm also the source control administrator (Subversion).
I also have to read requirements that are maybe 5% of the value of those I got 15 years ago, and be a business analyst to figure out the rest.
Now, fifteen years ago I could probably have gotten away with knowing C/C++, doing a little scripting in Perl/Bash, and being decent at a CLI. I could still probably write a multithreaded TCP/IP server in C without needing Google, which I haven't done in at least 12 years, yet I have to constantly Google things for what I now do daily.
I don't think things have changed fundamentally, but they haven't stayed the same. We're getting shallower, broader, and less efficient for it.
Somebody is projecting their delusions... (Score:5, Interesting)
Quite frankly, I just finished a large project for a customer, and what I did strongly resembles what I would have done 30 years ago: define what the solution must do, do an architecture, do a design, define interfaces, and then code the thing. The only things that are different are that the C code was a web-server module and that the in-memory database may go up to 10 GB instead of the 100 kB or so it would have 30 years ago.
Really, nothing has changed much, unless you are at the very-low skill end of things, where you can now produce bad code in ways you could never before.
Re:COBOL was better than JavaScript. (Score:3, Interesting)
I don't think that the 4-days claim is a valid excuse. Eich should have known better. It was the middle of the 1990s, for crying out loud! Tcl was well established as an embeddable scripting/extension language by that time. He should have just embedded that. Fuck, he could have also gone with Perl, which was well established at that point, too. Sonofabitch, he could have even gone with one of the newcomers like Lua or Python. Even going with a simple Scheme implementation, like every undergraduate Comp Sci student will develop at one point or another, would have been better than JavaScript. No matter how you spin it, JavaScript is a humongous screw up. It's perhaps the worst thing ever to have happened to the computing industry, the worst thing to have happened to the Web, and the worst thing to have happened to the charge level of batteries in mobile devices. JavaScript is a total disaster, and it never should have even happened in the first place!
Re:what a load of utter bullshit (Score:5, Interesting)
The hardest part is trying to get a web browser to act like a desktop GUI, which is what customers want. We have to glue together a jillion frameworks and libraries, creating a big fat-client Frankenstein with versioning snakes ready to bite your tush. Great job security, perhaps, but also an Excedrin magnet. What use is lining your pockets if you die too early to spend it?
It's time for a new browser standard that is desktop-GUI-like friendly. The HTML/DOM stack is not up to the job.
Dynamic languages (JavaScript) are fine as glue languages and for small event handling, but trying to turn them into, or use them as, a full-fledged virtual OS or GUI engine pushes dynamic languages beyond their comfort zone. Static typing is better for base platform tools/libraries. You don't write operating systems in dynamic languages.
Somebody please stab and kill the HTML/DOM stack so we can move on to a better GUI fit.
Re:Some of us do still assemble, even now (Score:4, Interesting)
Just because you can doesn't mean you should. 30 years ago, applications were built with long life-spans in mind, so dropping into very low-level code could make financial sense.
No, it was utter necessity. 30 years ago I hadn't even gotten my C64 yet, with all of 65,536 bytes of RAM, 38,911 of which would be left available once BASIC was loaded. Store one uncompressed screenshot of the 320x240, 4-bit (16 color) screen and 38,400 of those were gone. If you tried storing the Lord of the Rings trilogy in memory, compressed to ~12 bits/word, you'd get about halfway into the Fellowship of the Ring. True, today we waste space, but back then we lacked essential space. Every corner was cut, every optimization taken, to avoid using any of our precious bytes. See the Y2K problem.
For a more modern but similar issue, I used to have the same attitude toward my bandwidth. It used to be slow and costly (pay per minute), and it was important to treat it as a precious resource. Today I might end up downloading a whole season of a show after watching the first episode, get bored after three, and delete the other twenty. It's there to be used, not to sit idle. I've got 16 GB of RAM; there's no point in wasting it, but there's nothing to be gained by hiding in a 1 GB corner either. If it makes life easier for developers to pull in a 10 MB library rather than write a 100 kB function, do it. If you can get it working, working well, and working fast, then maybe we'll get around to the resource usage... someday. It's not a big deal.
Not that simple (Score:5, Interesting)
There's another factor, too; the industry really wants young programmers. The costs are less for remuneration, insurance, and vacation; the families are smaller or non-existent, and these people will work much longer hours based on nothing more than back patting and (often empty) promises. One of the consequences here is that some of the deeper skill sets are being lost because they simply aren't around the workplace any longer.
I think there is no question that all of this has changed the face of coding. An interesting exercise is to ask yourself: when was the last time you saw a huge project hit the market? Now ask yourself how many little does-a-couple-of-things projects you've seen hit the market in the same time frame. My contention is that very few of the larger projects are being undertaken at this point, or at least, being finished.
Just one (retired) guy's opinion. :)
The tower of babel was already present back then. (Score:4, Interesting)
My experience reaches back to the toggle-and-punch-card days, and I don't want to bore anyone with stories about that.
But one thing I have noticed in all those years is that I cannot recall a single year in which it wasn't proclaimed by someone that software engineering would be dead as a career path within a few years.
I go back that far, as well.
And the proliferation of languages, each with advocates claiming it to be the be-all and end-all, was well established by the early '60s.
(I recall the cover of the January 1961 Communications of the ACM [thecomputerboys.com], which had artwork showing the Tower of Babel, with various bricks labeled with a different programming language name. There were well over seventy of them.)