The Technologies Changing What It Means To Be a Programmer

snydeq writes: Modern programming bears little resemblance to the days of assembly code and toggles. Worse, or perhaps better, it markedly differs from what it meant to be a programmer just five years ago. While the technologies and tools underlying this transformation can make development work more powerful and efficient, they also make developers increasingly responsible for facets of computing beyond their traditional domain, thereby concentrating a wider range of roles and responsibilities in leaner, more overworked staff.

  • by asmkm22 ( 1902712 ) on Monday August 11, 2014 @06:00PM (#47651097)

    Just because you can doesn't mean you should. 30 years ago, applications were built with long life-spans in mind, so dropping into very low-level code could make financial sense. Today, programs are generally designed for adaptability and compatibility, and the target is constantly moving for the vast majority of applications out there. Dropping into assembly rarely makes sense anymore; the mantra is "good enough" rather than "best," because even the "best" won't stay that way for long.

    Of course, your mileage will vary by industry. If you do most of your work on embedded devices or industry-niche things like robotics or satellites, then by all means dive into the 1's and 0's.
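
    For readers who have never had to drop that low, here is a minimal, illustrative sketch of what "diving into the 1's and 0's" can look like today, assuming x86-64 and GCC-style extended inline assembly (the RDTSC cycle counter is my own pick for the example, not anything from the post above):

        #include <stdint.h>
        #include <stdio.h>

        /* Read the x86-64 time-stamp counter via inline assembly.
           RDTSC leaves the low 32 bits in EAX and the high 32 bits in EDX. */
        static inline uint64_t read_tsc(void)
        {
            uint32_t lo, hi;
            __asm__ volatile ("rdtsc" : "=a"(lo), "=d"(hi));
            return ((uint64_t)hi << 32) | lo;
        }

        int main(void)
        {
            uint64_t start = read_tsc();
            /* ... the hot code being measured would go here ... */
            uint64_t end = read_tsc();
            printf("elapsed cycles: %llu\n", (unsigned long long)(end - start));
            return 0;
        }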

  • Re:Not changed much (Score:5, Interesting)

    by digsbo ( 1292334 ) on Monday August 11, 2014 @06:11PM (#47651167)

    I'm not sure I agree 100%.

    As a senior engineer today, I'm responsible not only for knowing the primary back-end language (C#) for my web application, but also HTML5, CSS3, JavaScript, and the UI toolkit library. I'm also expected to understand database design, including being able to have an intelligent conversation about the physical schema, design tables and indexes, and optimize queries. I also have to know security, build tools, the automation scripting language for my platform (now PowerShell), and the deployment system (WiX to create MSIs). I'm also the source control administrator (Subversion).

    I also have to read requirements that are maybe 5% of the value of those I got 15 years ago, and be a business analyst to figure out the rest.

    Fifteen years ago, I could probably have gotten away with knowing C/C++, doing a little scripting in Perl/Bash, and being decent at a CLI. I can still probably write a multithreaded TCP/IP server in C without needing Google (roughly the kind of thing sketched below), something I haven't done in at least 12 years, yet I have to constantly Google things for what I now do daily.

    I don't think things have changed fundamentally, but they haven't stayed the same. We're getting shallower, broader, and less efficient for it.
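
    Since the parent mentions it, here is a from-memory sketch of the sort of multithreaded TCP/IP server in C being referred to: the classic thread-per-connection echo server. POSIX sockets and pthreads are assumed, error handling is trimmed, the port number is arbitrary, and it should build with something like cc server.c -pthread.

        #include <arpa/inet.h>
        #include <netinet/in.h>
        #include <pthread.h>
        #include <stdint.h>
        #include <stdio.h>
        #include <string.h>
        #include <sys/socket.h>
        #include <sys/types.h>
        #include <unistd.h>

        /* Each connection gets its own thread, which echoes bytes back until EOF. */
        static void *handle_client(void *arg)
        {
            int fd = (int)(intptr_t)arg;
            char buf[4096];
            ssize_t n;
            while ((n = read(fd, buf, sizeof buf)) > 0)
                write(fd, buf, (size_t)n);
            close(fd);
            return NULL;
        }

        int main(void)
        {
            int srv = socket(AF_INET, SOCK_STREAM, 0);
            int one = 1;
            setsockopt(srv, SOL_SOCKET, SO_REUSEADDR, &one, sizeof one);

            struct sockaddr_in addr;
            memset(&addr, 0, sizeof addr);
            addr.sin_family = AF_INET;
            addr.sin_addr.s_addr = htonl(INADDR_ANY);
            addr.sin_port = htons(7777);          /* arbitrary port for the sketch */

            if (bind(srv, (struct sockaddr *)&addr, sizeof addr) < 0 || listen(srv, 16) < 0) {
                perror("bind/listen");
                return 1;
            }

            for (;;) {
                int client = accept(srv, NULL, NULL);
                if (client < 0)
                    continue;
                pthread_t tid;
                pthread_create(&tid, NULL, handle_client, (void *)(intptr_t)client);
                pthread_detach(tid);              /* thread cleans itself up on exit */
            }
        }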

  • by gweihir ( 88907 ) on Monday August 11, 2014 @06:20PM (#47651237)

    Quite frankly, I just finished a larger project for a customer, and what I did strongly resembles what I would have done 30 years ago: define what the solution must do, work out an architecture and a design, define interfaces, and then code the thing. The only real differences are that the C code is a web-server module and that the in-memory database may grow to 10GB instead of the 100KB or so it would have been 30 years ago.

    Really, nothing has changed much, unless you are at the very low-skill end of things, where you can now produce bad code in ways you never could before.

  • by Anonymous Coward on Monday August 11, 2014 @06:37PM (#47651371)

    I don't think that the 4-days claim is a valid excuse. Eich should have known better. It was the middle of the 1990s, for crying out loud! Tcl was well established as an embeddable scripting/extension language by that time. He should have just embedded that. Fuck, he could have also gone with Perl, which was well established at that point, too. Sonofabitch, he could have even gone with one of the newcomers like Lua or Python. Even going with a simple Scheme implementation, like every undergraduate Comp Sci student will develop at one point or another, would have been better than JavaScript. No matter how you spin it, JavaScript is a humongous screw up. It's perhaps the worst thing ever to have happened to the computing industry, the worst thing to have happened to the Web, and the worst thing to have happened to the charge level of batteries in mobile devices. JavaScript is a total disaster, and it never should have even happened in the first place!
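
    To make the "should have just embedded an existing language" point concrete, here is a hedged sketch of what embedding Lua in a C host looks like using the stock Lua C API; the script string is invented for the example, and the build line varies by system (typically something like cc host.c -llua -lm):

        #include <lauxlib.h>
        #include <lua.h>
        #include <lualib.h>
        #include <stdio.h>

        int main(void)
        {
            lua_State *L = luaL_newstate();   /* create an interpreter instance    */
            luaL_openlibs(L);                 /* expose the standard Lua libraries */

            /* Run a line of script supplied by the host application. */
            if (luaL_dostring(L, "print('hello from embedded Lua, 2 + 2 = ' .. 2 + 2)") != 0)
                fprintf(stderr, "lua error: %s\n", lua_tostring(L, -1));

            lua_close(L);
            return 0;
        }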

  • by Tablizer ( 95088 ) on Monday August 11, 2014 @07:59PM (#47651795) Journal

    The hardest part is trying to get a web browser to act like a desktop GUI, which is what customers want. We have to glue together a jillion frameworks and libraries, creating a big fat-client Frankenstein with versioning snakes ready to bite your tush. Great job security, perhaps, but also an Excedrin magnet. What use is lining your pockets if you die too early to spend it?

    It's time for a new browser standard that is desktop-GUI-like friendly. The HTML/DOM stack is not up to the job.

    Dynamic languages (JavaScript) are fine as glue languages and for small event handling, but trying to turn them into, or use them for, a full-fledged virtual OS or GUI engine pushes them beyond their comfort zone. Static typing is better for base platform tools and libraries. You don't write operating systems in dynamic languages.

    Somebody please stab and kill the HTML/DOM stack so we can move on to a better GUI fit.

  • by Kjella ( 173770 ) on Monday August 11, 2014 @08:19PM (#47651897) Homepage

    Just because you can doesn't mean you should. 30 years ago, applications were built with long life-spans in mind, so dropping into very low-level code could make financial sense.

    No, it was utter necessity. Thirty years ago I hadn't even gotten my C64 yet, with all of 65536 bytes of RAM, 38911 of which would be left available once BASIC was loaded. Store one uncompressed screenshot of the 320x200, 16-color (4 bits/pixel) screen and 32000 of those were gone (see the quick check after this comment). If you tried storing the Lord of the Rings trilogy in memory, compressed to ~12 bits/word, you'd get about halfway into the Fellowship of the Ring. True, today we waste space, but back then we lacked essential space. Every corner was cut, every optimization taken, to avoid using any of our precious bytes. See the Y2K problem.

    For a more modern but similar issue, I used to have the same attitude toward my bandwidth. It used to be slow and costly (pay per minute), and it was important to treat it as a precious resource. Today I might end up downloading a whole season of a show after watching the first episode, get bored after three, and delete the other twenty. It's there to be used, not to sit idle. I've got 16GB of RAM; there's no point in wasting it, but there's also nothing to be gained by hiding in a 1GB corner. If it makes life easier for developers to pull in a 10MB library than to write a 100kB function, do it. If you can get it working, working well, and working fast, then maybe we'll get around to the resource usage... someday. It's not a big deal.
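
    A quick check of the parent's memory arithmetic (using the commonly cited figure of 38911 BASIC bytes free and a hypothetical raw 320x200, 4-bits-per-pixel screen dump):

        #include <stdio.h>

        int main(void)
        {
            const int free_bytes   = 38911;             /* "38911 BASIC BYTES FREE"     */
            const int screen_bytes = 320 * 200 * 4 / 8; /* 320x200 pixels, 4 bits/pixel */

            printf("one raw screen: %d bytes, %.0f%% of the %d bytes free\n",
                   screen_bytes, 100.0 * screen_bytes / free_bytes, free_bytes);
            return 0;
        }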

  • Not that simple (Score:5, Interesting)

    by fyngyrz ( 762201 ) on Monday August 11, 2014 @08:34PM (#47651971) Homepage Journal

    While the technologies and tools underlying this transformation can make development work more powerful and efficient

    ...and they can also bury them in irrelevancy. It can make them depend on debuggers instead of the good coding practices, skills, and self-checking that tend to make the debugger an uncommon go-to. It can isolate them further from the hardware, so that the difference between what is efficient and what can merely be said to work becomes a mystery to the new-style programmer. It can turn what should really be a one-programmer project into a team effort, where "team" should carry the same negative connotations as "committee." It can move critical portions of projects into the black boxes of libraries and objects sourced from outside the primary development effort, and in so doing reduce both the maintainability and the transparency of the overall result. Languages with garbage collection can create a much looser coupling between performance and system capacity, reducing the range of what can actually be done with them. Worst of all, with all the wheel-spinning, the checking of code in and out, and the testing methodology of the month, it can make them feel like they're really doing something worthwhile in terms of time spent and results obtained, when what it really boils down to is something far less efficient and effective overall.

    There's another factor, too: the industry really wants young programmers. The costs are lower for remuneration, insurance, and vacation; the families are smaller or non-existent; and these people will work much longer hours based on nothing more than back-patting and (often empty) promises. One of the consequences is that some of the deeper skill sets are being lost, because they simply aren't around the workplace any longer.

    I think there is no question that all of this has changed the face of coding. An interesting exercise is to ask yourself: when was the last time you saw a huge project hit the market? Now ask yourself how many little does-a-couple-of-things projects you've seen hit the market in the same time frame. My contention is that very few of the larger projects are being undertaken at this point, or at least being finished.

    Just one (retired) guy's opinion. :)

  • by Ungrounded Lightning ( 62228 ) on Monday August 11, 2014 @08:56PM (#47652035) Journal

    My experience reaches back to the toggle-and-punch-card days, and I don't want to bore anyone with stories about that.

    But one thing I have noticed in all those years is that I cannot recall a single year when it wasn't proclaimed by someone that software engineering would be dead as a career path within a few years.

    I go back that far, as well.

    And the proliferation of languages, each with advocates claiming it to be the be-all and end-all, was well established by the early '60s.

    (I recall the cover of the January 1961 Communications of the ACM [thecomputerboys.com], which had artwork showing the Tower of Babel, with bricks each labeled with a different programming language name. There were well over seventy of them.)

"The only way I can lose this election is if I'm caught in bed with a dead girl or a live boy." -- Louisiana governor Edwin Edwards

Working...