The Technologies Changing What It Means To Be a Programmer 294

snydeq writes: Modern programming bears little resemblance to the days of assembly code and toggles. Worse, or perhaps better, it markedly differs from what it meant to be a programmer just five years ago. While the technologies and tools underlying this transformation can make development work more powerful and efficient, they also make developers increasingly responsible for facets of computing beyond their traditional domain, thereby concentrating a wider range of roles and responsibilities into a leaner, more overworked staff.
This discussion has been archived. No new comments can be posted.

  • Modern programming bears little resemblance to the days of assembly code

    What's not modern about using assembler where it's appropriate to do so?

    Sometimes I do it just because I like it...

    • How about the 200,000 more H-1Bs showing up each year? I question their impact upon this great nation.
    • by asmkm22 ( 1902712 ) on Monday August 11, 2014 @05:00PM (#47651097)

      Just because you can doesn't mean you should. 30 years ago, applications were built with long life-spans in mind, so dropping into very low-level code could make financial sense. Today, programs are generally designed for adaptability and compatibility. The target is constantly moving for the vast majority of applications out there. Dropping into assembly rarely makes sense anymore; the mantra is "good enough" rather than "best," because even the "best" won't stay that way for long.

      Of course, different industries will have different mileage. If you do most of your work for embedded devices or industry-niche things like robotics or satellites, then by all means dive into the 1's and 0's.

      • by Kjella ( 173770 ) on Monday August 11, 2014 @07:19PM (#47651897) Homepage

        Just because you can doesn't mean you should. 30 years ago, applications were built with long life-spans in mind, so dropping into very low-level code could make financial sense.

        No, it was utter necessity. Thirty years ago I hadn't even gotten my C64 with all of 65536 bytes of RAM, 38911 of which would be left available once BASIC was loaded. Store one uncompressed screenshot of the 320x240x4-bit (16-color) screen and 38,400 of those were gone. If you tried storing the Lord of the Rings trilogy in memory, compressed to ~12 bits/word, you'd get about halfway into the Fellowship of the Ring. True, today we waste space, but back then we lacked essential space. Every corner was cut, every optimization taken, to avoid using any of our precious bytes. See the Y2K problem.
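
        To put numbers on that screenshot claim (taking the 320x240, 4-bit figure at face value):

        \[
        320 \times 240 \,\text{pixels} \times 4 \,\text{bits/pixel} = 307{,}200 \,\text{bits} = 38{,}400 \,\text{bytes},
        \]

        which leaves just 511 of the 38,911 free bytes for everything else.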

        For a more modern but similar issue, I used to have the same attitude toward my bandwidth. It used to be slow and costly (pay per minute), and it was important to treat it as a precious resource. Today I might end up downloading a whole season of a show after watching the first episode, get bored after three and delete the other twenty. It's there to be used, not to sit idle. I've got 16GB of RAM; there's no point in wasting it, but there's nothing to be gained by hiding in a 1GB corner either. If it makes life easier for developers to pull in a 10MB library rather than write a 100kB function, do it. If you can get it working, working well and working fast, then maybe we'll get around to the resource usage... someday. It's not a big deal.

        • ... get bored after three and delete the other twenty.

          Don't worry. Soon permanent storage space will be so big that nobody deletes anymore. After a while, rm will be removed from Linux distros. You'll just revision the data on your drive, never actually deleting anything. Ha-ha, only serious.

          • Don't worry. Soon permanent storage space will be so big that nobody deletes anymore. After a while, rm will be removed from Linux distros.

            LOL. I'm laughing with the vague worry that you're accurately predicting the future.

      • Just because you can doesn't mean you should.

        Just because you shouldn't always doesn't mean you shouldn't ever.

    • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Monday August 11, 2014 @05:11PM (#47651161)

      What's not modern about using assembler where it's appropriate to do so?

      Because it is InfoWorld. Seriously.

      Here's item # 3.

      Developer tool No. 3: Libraries

      Do you remember the first time you used a library? Apparently they're new, because programmers 5 years ago did not have libraries.

      It gets better:

      Developer tool No. 4: APIs

      Yeah. That's a radical new concept there.

      Fuck it.

      Developer tool No. 6: Browsers

      Tonight we're gonna party like it's 1995.

      And, finally:

      The work involved in telling computers what to do is markedly different than it was even five years ago, and it's quite possible that any Rip Van Winkle-like developer who slept through the past 10 years would be unable to function in the today's computing world.

      No, it is not. No, they would not. Windows XP was released in 2001 and there are still people using it. That's 13 years ago.

      InfoWorld sucks.

      • by Darinbob ( 1142669 ) on Monday August 11, 2014 @05:38PM (#47651375)

        There is a very tiny overlap between software developers and journalists, and yet the number of software-development journalists greatly exceeds the size of that overlap. The only explanation is that these articles are written by people who don't know what they're talking about.

        • The problem is not that they don't know what they're talking about. The problem is that they think they do. There's no excuse for it. Journalists aren't expected to be experts in the things they write about. They're just expected to not be full of shit.
      • Yeah, I skimmed the list and there were only a few things newer than 5 years old. Docker, definitely; Node.js is maybe 5 years old already (but that looks like it peaked; the cool kids have already moved on). Chef is already about 5 years old and Puppet is much older (and that's ignoring much older config management tools).

      • by Dutch Gun ( 899105 ) on Monday August 11, 2014 @06:54PM (#47651769)

        It's pretty obvious this is written through the lens of a JavaScript-focused web programmer. Seriously, libraries are a hot new trend? That's hilarious stuff. Read each item in this list as "From the viewpoint of a JavaScript/web developer...", and it seems to make a bit more sense.

        It's also pretty clear he has only a vague notion of game development (my profession), and he gets some basic facts wrong. He calls Unity a library (it's a game engine, better categorized as a framework). In a different article, he claims that game frameworks are in while native development is out. The first part is true, but the second part certainly is not. C++ is still used almost exclusively for large-scale AAA game development. Unless by "native development" he meant "roll your own game engine", in which case he's using the wrong terminology.

        • Unless by "native development" he meant "roll your own game engine", in which case he's using the wrong terminology.

          Maybe he means that, because of outsourcing, more and more applications are being developed in India by the "natives" rather than in the US by H-1Bs.

    • by gweihir ( 88907 )

      The problem is that not that many people are able to use assembler these days, because they do not get it. It is still needed in numerous places and if you cannot even do small stuff with it, then you are not a professional programmer.

      • Sorry,
        that is a misconception. "They do not get it" is certainly wrong.
        Assembler is the simplest programming "language" in existence; everyone gets it.
        ... and if you cannot even do small stuff with it, then you are not a professional programmer.
        Like you, I guess. Post your last assembly code; otherwise I'd say you are a liar ;D And regarding your post, you are an idiot anyway, so who cares.

    • C, assembler, VHDL: it all still gets done. But the people who do that sort of work don't have much overlap with the sorts of people who write blogs or articles about how programming has changed over the years.

      Reading this article, it's just stupid. I.e., some of these "new" technologies changing how we program: libraries, APIs, virtual machines, developing tools to help development, and performance monitoring. That stuff has existed since the first decade of programming. "New" stuff includes social media portals.

      • "New"s stuff includes social media portals. Wow, I don't even know what that is, but it sounds web-ish.

        I think that means that modern programming is done via crowd-sourcing, kickstarter, and mechanical turk.

        I seem to recall having these modern things called "libraries" when I was programming in Small-C on CP/M.

  • Not changed much (Score:5, Insightful)

    by jmyers ( 208878 ) on Monday August 11, 2014 @04:39PM (#47650955)

    I don't see many changes. Vendors, managers, and salespeople change the buzzwords every few years and talk of great paradigm shifts. Programmers continue to write code and produce actual results. In a perfect world the programmers would get to choose their own tools. In the real world we have to use whatever buzzword-compliant tools are thrown in the mix each year. They never actually live up to the hype, and you have to dig in, find the code buried within, and build stuff that works. I remember when the salespeople were touting dBase II and how programming would be completely changed. Right.

    • Re:Not changed much (Score:5, Interesting)

      by digsbo ( 1292334 ) on Monday August 11, 2014 @05:11PM (#47651167)

      I'm not sure I agree 100%.

      As a senior engineer today, I'm responsible not only for knowing the primary back-end language (C#) for my web application, but also HTML5, CSS3, JavaScript, and the UI toolkit library. I'm also expected to understand database design, including being able to have an intelligent conversation about the physical schema, and to be able to design tables and indexes and optimize queries. I also have to know security, build tools, the automation scripting language for my platform (now PowerShell), and the deployment system (WiX to create MSIs). I'm also the source control administrator (Subversion).

      I also have to read requirements that are maybe 5% of the value of those I got 15 years ago, and be a business analyst to figure out the rest.

      Now, fifteen years ago, I could probably have gotten away with knowing C/C++, a little scripting in Perl/Bash, and being decent at a CLI. I can probably still write a multithreaded TCP/IP server in C without needing Google, which I haven't done in at least 12 years, yet I have to constantly Google things for what I now do daily.

      I don't think things have changed fundamentally, but they haven't stayed the same. We're getting shallower, broader, and less efficient for it.

      • As a senior engineer today, ... those I got 15 years ago ...

        That's because you are now holding the position of a senior engineer with 15 years of experience.

        Look at what someone who is just starting needs to know. How much different is it than what you needed to know when you started 15 years ago?

      • So when I worked for BT in the '80s, we had to do all that.
      • As a senior engineer today, I'm responsible not only for knowing ...

        I thought the exact same thing when I started as a software engineer in 1996, almost 20 years ago. Only back then we actually generated HTML in C++ on the server (with home-brewed "html template" processors.) And we had to deal with things like COM, and browsers were far less standardized. We also had to deal with database design, only we had to make home-brewed Object-Relational Mappings, because the frameworks for that weren't that good either. Marshalling an object from the database through the business

      • So basically you just learned how to do a full application (web server and GUI via HTML) and you think it's new. Funny, that just happens to be 20 years old as well, and it's certainly not the first time someone wrote client-server applications.

        Nothing you're doing is new other than you're doing it now instead of someone else.

        You just learned to do the 'full stack'. I've been doing it for over 20, before the web existed. Different languages, different layout engines, different libraries, same process.

      • If you go back 30 years you'd probably have had to write the language you were going to use and an interpreter or compiler for it. You would absolutely have had to know about the database design, although the database would have been made up of byzantine interrelated B-trees rather than intuitive SQL tables. You'd have had to write everything from scratch for each task rather than leveraging huge libraries for everything. Things are many times easier these days, yet you still find articles like this that bemoan how hard it all is.
    • by mrchaotica ( 681592 ) * on Monday August 11, 2014 @05:28PM (#47651325)

      In a perfect world the programmers would get to choose their own tools. In the real world we have to use whatever buzzword-compliant tools are thrown in the mix each year.

      In hobby programming, which is not the real world, the programmers get to choose their own tools. In the Silicon Valley startup bubble, which is also not the real world, programmers have to use whatever buzzword-compliant tools are thrown in the mix each year.

      In the real real world, programmers use C, C++, .NET, Java, or some other constantly-claimed-by-idiots-to-be-dead language. (And they usually use it to write boring, vertical-market billing software.)

      • I'm not sure I get this need to bash hobby programming. I believe Microsoft and Apple started with hobby-level programming. You could say they were 'professional' at the beginning because they started a company, but seriously, the only difference at the beginning was the fact that they incorporated a name. Some very big things, some very big concepts, can come out of "hobby programming". Many people's next big business ventures come out of it. I think it is poor thinking to outright slag people's small-scale projects.
        • Where was I "bashing" hobby programming?

          If anything, you're the one "bashing" hobby programming by implying that it is at a lower "level" or at a "small scale" compared to professional programming. As far as I'm concerned, the only difference between it and "professional" programming is profit motive (which is also what makes it not "real world," because "real world" refers to the need to earn a living).

          • by tepples ( 727027 )

            As far as I'm concerned, the only difference between it and "professional" programming is profit motive

            That and on which platforms people are allowed to do it. The video game console makers have made it clear that their platforms are not intended for hobbyists.

            • by tlhIngan ( 30335 )

              The video game console makers have made it clear that their platforms are not intended for hobbyists.

              Sony maybe, but Microsoft has long had Xbox Live Indie Games, where for $99 a year you can code up something, play it on an Xbox 360, and even sell it on the Xbox 360.

              And Sony did at one point too when the PS3 ran Linux.

              Microsoft has/had plans for bringing the program onto the Xbone as well.

    • I even remember the days before the word "paradigm" existed.

      (yes it's sarcasm you annoying downmodders!)

    • My list (Score:5, Informative)

      by Dutch Gun ( 899105 ) on Monday August 11, 2014 @07:42PM (#47651999)

      How about we make a list of the technologies that have actually impacted us in a real way over... hmm, let's say the past ten or fifteen years? I assume that everyone will have slightly different items, because we all work in different areas. I'm a game developer and use C++, so my perspective will reflect that. Listed in no particular order of importance:

      1) C++11/14 - It has transformed the language in a fairly dramatic way, making it much safer and more convenient to use, while leaving legacy code completely compatible. Modern C++ code feels a lot more like C# at times, just a whole lot uglier. (A short sketch follows after this list.)
      2) Mobile Platforms - The rise of mobile platforms (smartphones and tablets) has caused a fundamental shift in the balance of power among platforms.
      3) Online Gaming and Integration - MMOs and other games are taking advantage of the ubiquitous connectivity to the internet most of us now enjoy.
      4) Distributed Version Control Systems - Modern source control systems such as Git and Mercurial (my favorite) are a boon not only to large distributed projects but even to smaller developers. Traditional development houses, for the most part, still use Perforce, though, which works much better for asset management.
      5) Online distribution - The ability to quickly and easily download and update games from vendors like Steam, GOG, and Origin is opening up the market to indie and traditional developers alike, and will eventually kill physical distribution channels.
      6) Online resources - Better search, pioneered by Google, teams up with incredibly knowledge-rich sites like StackExchange.com. The result is that damn near any question you have is likely to have already been asked and answered. If not, ask away, and you have a good chance of getting some real help.
      7) GPU programming - More and more visual programming is being off-loaded to the GPU, and shader languages have developed into full-blown programming languages of their own.
      8) Parallel programming - With the advent of ubiquitous multi-core / multi-threaded processors in the past decade, game developers had to start getting serious about multi-threaded programming, making an already demanding job even tougher.

      That's about all I can think of offhand that's really changed over the last fifteen years. Libraries, frameworks, and APIs are not some new phenomenon. They've been around since I started programming professionally, so it's ridiculous to include them. You might as well add "source code", "compilers/linkers", and "editors" to the list if you're going there.
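
      Since item 1 is easy to under-appreciate in the abstract, here is a minimal sketch (mine, not from TFA; all names are illustrative) of what C++11/14 buys you in practice. It also shows the task-based parallelism from item 8:

        // C++11/14 sketch: auto, lambdas, smart pointers, std::async.
        #include <algorithm>
        #include <future>
        #include <iostream>
        #include <memory>
        #include <numeric>
        #include <vector>

        int main() {
            auto values = std::vector<int>{5, 3, 1, 4, 2};   // auto + brace initialization

            // Lambdas replace the one-off functor classes of C++98.
            std::sort(values.begin(), values.end(),
                      [](int a, int b) { return a < b; });

            // unique_ptr gives ownership without a manual delete (make_unique is C++14).
            auto total = std::make_unique<int>(0);

            // std::async gives portable task-based parallelism (see item 8).
            auto fut = std::async(std::launch::async, [&values] {
                return std::accumulate(values.begin(), values.end(), 0);
            });
            *total = fut.get();

            std::cout << "sum = " << *total << '\n';   // prints: sum = 15
        }

      In C++98 every piece of that needed more ceremony: hand-written functors, explicit iterator types, raw pointers, and platform-specific threads.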

      What about in other professions?

  • by gregmac ( 629064 ) on Monday August 11, 2014 @04:44PM (#47650985) Homepage

    I'm struggling to understand the point of this article. May as well have titled it "You won't believe these 15 new tricks for programmers. The shocking truth the devops guys don't want you to know"

    Some quotes:

    * "Back to work, slave, the continuous build machine has new tasks for you."
    * "You're not a craftsman -- you're a framework-tweaker."
    * "It's so much easier, but these IaaS administration Web pages won't buy you a drink after work."

    • I'm struggling to understand the point of this article.

      It's Infoworld.

      The point of the article is twofold:
      - To convince Pointy Haired Bosses that they understand what's going on and are riding the cutting edge.
      - To sell them new products, implementing new buzzwords, which they can then inflict on their hapless subordinates, making their life hell as they try to get the stuff working and into production.

      Those are the first two lines of the four-line Slashdot meme. The remaining two are:

      (3. Bill

    • * "Back to work, slave, the continuous build machine has new tasks for you."

      Has anyone ever, anywhere, gotten a task assigned to them by the continuous build machine?

  • by Ozoner ( 1406169 ) on Monday August 11, 2014 @04:46PM (#47650999)

    I think it's exactly the opposite.
    The modern programming environment is trying hard to lock the programmer into a box where he can't do much harm...

    No one has more control over the computer than an Assembler language programmer.

    And there's still lots of Assembly programming going on today.

    • Yup. Everyone's amazed at the exciting new worlds of mobile phone apps. And yet, assembler exists underneath that. Someone wrote it. Maybe not the people who responded to the "be a web app developer and earn pennies from your own home!" advertisements. But it exists and those web apps would not exist without it and the people who understand it.

      But this is nothing new. Go back 30 years. The vast majority of Unix programmers didn't know assembler either. They were just your 9 to 5 programmers getting

  • by jd ( 1658 ) <imipakNO@SPAMyahoo.com> on Monday August 11, 2014 @04:57PM (#47651073) Homepage Journal

    Modern programming languages are a fusion of older programming languages, with chunks taken out. Often, it's the useful chunks.

    There is no table, that I know of, that lists all the features ("significant" depends on the problem and who cares about solved problems?) versus all the paradigms versus all the languages. (Almost nothing is pure in terms of paradigm, so you need a 3D spreadsheet.)

    Without that, you cannot know to what extent the programming language has affected things, although it will have done.

    Nor is there anything similar for programming methodology, core skills, operating systems or computer hardware.

    Without these tables, all conclusions are idle guesses. There's no data to work with, nothing substantial to base a conclusion on, nothing to derive a hypothesis or experiments from.

    However, I can give you my worthless judgement on this matter:

    1) Modern methodologies, with the exception of tandem/test first, are crap.
    2) Weakly-typed languages are crap.
    3) Programmers who can't do maths or basic research (or, indeed, program) are crap.
    4) Managers who fire the rest of the staff then hire their girlfriends are... ethically subnormal.
    5) Managers who fire hardware engineers for engineering hardware are crap.
    6) Managers who sabotage projects that might expose incompetence are normal but still crap.
    7) If you can't write it in assembly, you don't understand the problem.
    8) An ounce of comprehension has greater value than a tonne of program listing.
    9) Never trust an engineer who violates contracts they don't like.

    • 1) Modern methodologies, with the exception of tandem/test first, are crap.
      2) Weakly-typed languages are crap.

      I would only add, "All Languages are Crap. Programmer skill makes the difference."

  • Sensationalist BS? (Score:3, Insightful)

    by Anonymous Coward on Monday August 11, 2014 @04:59PM (#47651093)
    I don't see a single point in the article that would represent any profound change in how programmers work. In fact, points like 1 or 4 are laughable simply because, even though they are true, they happened decades ago. Point 9 is exactly like programming in Lisp (do everything in a single generic language), the only difference being the use of JavaScript instead of Lisp. Others are of interest as deployment techniques, not as a programming workflow change. Etc.
    • Re: (Score:2, Informative)

      by Anonymous Coward

      If containers, continuous integration, IaaS, PaaS, and other "deployment techniques" aren't dramatically impacting your workflow as a developer, then you're almost certainly wasting a lot of time.

      Vagrant + Puppet/Chef = clean new test machine, configured & installed exactly the same way... every time.
      Docker = "all my dependencies always travel with my application."
      IaaS + PaaS = "if I need a bunch of test nodes to work with, or prototype something, I push a few buttons and wait 20 minutes, at which point
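
      To make the Docker line concrete, here's a minimal sketch (assuming a hypothetical Python app whose dependencies are pinned in a requirements.txt; the file names are illustrative):

        # Dependencies are baked into the image, so they travel with the app.
        FROM python:3
        WORKDIR /app
        COPY requirements.txt .
        RUN pip install -r requirements.txt
        COPY . .
        CMD ["python", "app.py"]

      Build it once with "docker build -t myapp ." and the resulting image carries the app and every dependency with it, wherever it runs.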

  • by bobbied ( 2522392 ) on Monday August 11, 2014 @05:07PM (#47651135)

    Here we go again with another silver bullet. It seems that every generation of noobs comes to this same conclusion, and they are just as wrong as we were when we said the same thing. It's almost a rite of passage, just like the rebellious teenager or the terrible twos kids go through.

    Yes, programming has changed some since I started doing it. However, in the long run, nothing has really changed. Programming is programming; the same skills I needed when doing assembly are useful when I dabble in Java. What HAS changed is the programming model and the languages we use. Yeah, we can automatically generate a boatload of code and come up with stuff in a few hours that would have taken years to do in assembly, but nothing is really new. When we went from assembly to C, we could do things in C so much faster than in assembly, but programs only got bigger and slower. C to C++ bumped that up again, but not that much. Java bumped it up again, adding multi-platform capability, slowing the programs down, and making them take more memory. That's how this goes: new tools and bigger programs that run slower, but it still requires a programmer to make useful things with those tools.

    Truly there is nothing really new for programmers; the job still requires the same kinds of skills and still requires that you know the programming model. Yeah, we can pull ready-built stuff off the shelf more easily, but as before, new advances really only make programs bigger and slower, and they still require programmers who know how to design and implement. We keep trying, but this will not change.

    So, nice try there, snydeq, but I think you are wrong. My generation of programmers thought we had achieved the same things you are claiming when we were noobs. We were wrong too. There may be new tools, but you still need a skilled craftsman to use those tools or you get garbage for a program.

    I strongly recommend you go get and read "The Mythical Man-Month" by Frederick P. Brooks, Jr. In my day his experience and insights were eye-opening for us, and they will be for you too. You don't have to make the same mistakes we did. I've met some of you guys/gals; you can do better. Just take my advice.

    • by gweihir ( 88907 )

      I agree. Brooks is as relevant today as when the book was written. People are still making the same stupid mistakes and still believe that technology can fix their inadequacies. It cannot.

    • by lgw ( 121541 )

      There's nothing really new for programmers because, if you're doing it right, everything is really new for programmers all the time. Once anything becomes routine, we turn around and automate it, so the work is always the set of tasks we haven't figured out how to automate yet.

        • There's nothing really new for programmers because, if you're doing it right, everything is really new for programmers all the time. Once anything becomes routine, we turn around and automate it, so the work is always the set of tasks we haven't figured out how to automate yet.

        If you look at the effects of automation tools, you discover that they didn't really fix anything; they just changed the specifics of the problem. You are still going to need a programmer to learn the new tool and make your desired program work. That is the point of Chapters 16 and 17 of "The Mythical Man-Month" I told you to read...

        • by lgw ( 121541 )

          That's just what I was saying. The programmer's task is always "automate what was new 3 years ago, and routine 1 year ago, using what got automated last time to help." It's a never-ending cycle, as automating X just allows you to do Y, which eventually becomes straightforward enough to automate.

  • by sribe ( 304414 ) on Monday August 11, 2014 @05:13PM (#47651181)

    I've been doing this full-time since 1985, and the most distressing part is how little real change there has been in all that time!

    • by Tablizer ( 95088 ) on Monday August 11, 2014 @06:59PM (#47651795) Journal

      The hardest part is trying to get a web browser to act like a desktop GUI, which is what customers want. We have to glue together a jillion frameworks and libraries, creating a big fat-client Frankenstein with versioning snakes ready to bite your tush. Great job security, perhaps, but also an Excedrin magnet. What use is lining your pockets if you die too early to spend it?

      It's time for a new browser standard that is friendly to desktop-style GUIs. The HTML/DOM stack is not up to the job.

      Dynamic languages (JavaScript) are fine as glue languages and for small event handling, but trying to turn them into, or use them for, a full-fledged virtual OS or GUI engine pushes dynamic languages beyond their comfort zone. Static typing is better for base platform tools/libraries. You don't write operating systems in dynamic languages.

      Somebody please stab and kill the HTML/DOM stack so we can move on to a better GUI fit.

      • by sribe ( 304414 )

        Yes, writing desktop apps in web browsers is a nightmare. I agree with that. It's just that it's not all that different from the nightmare of gluing together incompatible libraries and various GUI/desktop managers long ago. No matter what decade you talk about, there were always a bunch of idiots pushing a new "paradigm" that was extremely poorly thought out and a huge pain to deal with ;-)

      • by Jeremi ( 14640 )

        Somebody please stab and kill the HTML/DOM stack so we can move on to a better GUI fit.

        Hmm, perhaps Qt running in a NaCl environment? The only fundamental limitation would be that it's Intel-only, but then again so are most desktops these days.

    • I've been doing this full-time since 1977, and the most distressing part is how little real change there has been in all that time!

  • by gweihir ( 88907 ) on Monday August 11, 2014 @05:20PM (#47651237)

    Quite frankly, I just finished a larger project for a customer, and what I did strongly resembles what I would have done 30 years ago: define what the solution must do, do an architecture and a design, define interfaces, and then code the thing. The only real differences are that the C code was a web-server module and that the in-memory database may go up to 10GB instead of the 100kB or so it would have been 30 years ago.

    Really, nothing has changed much, unless you are at the very-low skill end of things, where you can now produce bad code in ways you could never before.

  • by 0xdeadbeef ( 28836 ) on Monday August 11, 2014 @05:23PM (#47651265) Homepage Journal

    The work involved in telling computers what to do is markedly different than it was even five years ago, and it's quite possible that any Rip Van Winkle-like developer who slept through the past 10 years would be unable to function in the today's computing world.

    This is quite possibly the stupidest article ever posted to Slashdot.

    Ok, this month.

    • The work involved in telling computers what to do is markedly different than it was even five years ago, and it's quite possible that any Rip Van Winkle-like developer who slept through the past 10 years would be unable to function in the today's computing world.

      This is quite possibly the stupidest article ever posted to Slashdot.

      Ok, this month.

      I hate it when I have mod points and comments like this are already at 5.

  • The only differences I've seen over the last 20 years are:
    1. VMs
    2. Average developer skill getting worse

  • My experience reaches back to the toggle-and-punch-card days and I don't want to bore anyone with stories about that.

    But one thing I have noticed in all those years: I cannot recall a single year when it wasn't proclaimed by someone that software engineering would be dead as a career path within a few years.

    Academia and industry are actually pretty good at coming up with new and better ways to program. Hundreds if not thousands of new languages, frameworks and tools have appeared over the years and

    • by Ungrounded Lightning ( 62228 ) on Monday August 11, 2014 @07:56PM (#47652035) Journal

      My experience reaches back to the toggle-and-punch-card days and I don't want to bore anyone with stories about that.

      But one thing I have noticed in all those years: I cannot recall a single year when it wasn't proclaimed by someone that software engineering would be dead as a career path within a few years.

      I go back that far, as well.

      And the proliferation of languages, each with advocates claiming it to be the be-all and end-all, was well established by the early '60s.

      (I recall the cover of the January 1961 Communications of the ACM [thecomputerboys.com], which had artwork showing the Tower of Babel, with various bricks labeled with a different programming language name. There were well over seventy of them.)

  • In the embedded world I often use assembly and C, and even on the desktop it's not rare for me to use C in order to avoid pointless overhead from weighty languages like C# or Java. A real programmer interfaces at the hardware level and tells a computer how to do its job without having to use bulky objects, interfaces and abstraction. "Modern" programming bears little resemblance to programming because modern programming isn't real programming; it's falling back on managed, bulky overhead that does all the work for you.
