Programming

Programmers: It's OK To Grow Up

Nemo the Magnificent writes: " Everybody knows software development is a young man's game, right? Here's a guy who hires and manages programmers, and he says it's not about age at all — it's about skills, period. 'It's each individual's responsibility to stay fresh in the field and maintain a modern-day skillset that gives any 28-year-old a run for his or her money. ... Although the ability to learn those skills is usually unlimited, the available time to learn often is not. "Little" things like family dinners, Little League, and home improvement projects often get in the way. As a result, we do find that we face a shortage of older, more seasoned developers. And it's not because we don't want older candidates. It's often because the older candidates haven't successfully modernized their developer skills.' A company that actively works to offer all employees the chance to learn and to engage with modern technologies is a company that good people are going to work for, and to stay at."
  • by Anonymous Coward on Friday May 16, 2014 @06:31PM (#47022243)

    they just happened to have learned the most recent stuff (which all too frequently is all the managers care about)

    The experienced developer will know when not to use a new fad, because they will have seen a prior version of that fad.

    • by Anonymous Brave Guy ( 457657 ) on Friday May 16, 2014 @07:40PM (#47022617)

      Exactly. I read this and actually laughed:

      2. Embrace new technologies. Many mature developers have found themselves with an outdated skillset because their employers stuck with what works, rather than encouraging modern technologies. Employers need to embrace the latest open-source tools, languages, and frameworks, in order to grow and retain the best talent.

      Yeah, those crazy employers, sticking with things that work! What were they thinking?!

      Perhaps if this guy hired a few more experienced developers, they could have explained the relative value of the terms "tried and tested" and "unproven and risky". Good older programmers are just as capable of learning useful new technologies as good younger programmers. The real difference is that the experienced ones tend not to waste their time learning five different [JS frameworks]* that they know will all be obsolete long before the project built on them is finished, because they were too busy building something that would actually get the job done using [jQuery]*.

      *Please substitute respectively an overhyped but underperforming technology and an established reliable technology in your fields of choice.

      • by lgw ( 121541 ) on Friday May 16, 2014 @08:55PM (#47022895) Journal

        The thing about technology is: the absolute best thing today, really the best, is utter crap in 20 years. And there are too damn many developers, old and young, who stick with what was the best at one point in history but just isn't any more.

        It's wise to reject 90% of new ideas as silly fads, but the problem is when you reject 100%. And it's not just older guys like me with the problem; it just matters more as what you settled on ages. If you combine rejecting all new ideas as fads with age, you can easily become unemployable.

        For example, look at all the /.ers who still dismiss "the cloud" as a passing fad, mistaking "I have less obsessive-geek control over my precious" for business judgement. Guys? It's not going away, and it keeps getting cheaper and more reliable. There are many areas today where you just can't put stuff in the cloud for compliance reasons, but the cloud guys have checkbooks and senators' phone numbers, and that last barrier won't last long. Not every new idea is a fad.

        Heck, I see people here that still think using an IDE is some sort of scam. "VI was good enough for grandpappy and it's good enough for me". Code review tools still get resistance in some quarters, but thinking you don't need a Review Board-like system is like thinking you don't need version control: it will end in tears.

        Sure, don't run off with every fad, but this is a poor industry to cross the line from change-averse to change-resistant in.

        • Re: (Score:2, Insightful)

          by Anonymous Coward

          The cloud is a cool idea... but let's be real. Until ISPs actually start laying fiber instead of wringing their hands in front of Congress to demand fee hikes, the cloud is at best a great place to store archives or spin up machines for peak load. Until that happens, it will keep hitting a barrier.

          Oh, and those servers have to be paid for by someone, so better to have them in-house with physical security rather than in some location that can be easily breached.

          Change-resistance is good. Things should be tested and regression tested.

          • In many cases, it's best not to have the servers at your physical location but at a site with good disaster ratings and better security than your typical company can provide.

            And even then, your secure, disaster-proof site can get hit (as happened in Boulder, Colorado, with the 100-year floods, though I hear it made it through pretty well).

            Most people live on the coasts and anywhere within 100 miles of the coast is really not a good place to have a data center. Also probably not any place that ever gets 6.

        • by mwvdlee ( 775178 )

          The remedy is easy.
          Don't jump onto new technology.
          Wait 5 years, then move to whatever of those new technologies is still around.
          Even in this day and age, 5 years isn't enough for a decent piece of software to go horribly bad; it'll survive lagging on new technology.
          It won't survive changing direction whenever a new technology has to be chased.

        • by sjames ( 1099 ) on Saturday May 17, 2014 @02:15AM (#47023893) Homepage Journal

          The cloud is a marketing term that covers a wide variety of things, some occasionally very useful, some nearly always a bad idea. You'll need to specify what you mean by cloud.

          That is the real issue here. Some of us remember when management wanted everything, including the potted plant in reception, to be CORBA-compliant. Anyone remember CORBA? When did it ever do anything for us that didn't already exist? Then it was XML. Everything had to be XML because XML would automagically make everything merge together and work in harmony... or not.

          OTOH, Ajax actually works, as does LAMP. Ajax especially works well when you use it with JSON or HTML rather than XML.

          An IDE is a matter of preference. Personally, I find Eclipse useful for Java because there is so damn much boilerplate in Java that Eclipse can take care of. It's not so useful for Python, especially compared to vim with syntax highlighting.

          When an older developer pushes back, it is important to determine whether it is actually because he is a dinosaur, or because he has seen the same thing twice before under another name and it failed both times at great expense. This industry sorely needs more of the latter.

        • The thing about technology is: the absolute best thing today, really the best, is utter crap in 20 years.

          Actually (to take an example from one particular area, languages), the "absolute best things today, really the best" are probably Haskell, Lisp, and Smalltalk. Out of these three, two of them already existed twenty years ago. Twenty years from now, they'll still be around, and there's very little in sight that could match their qualities, if anything, at least in the middle and upper layers of software ecosystems (which is where probably most programmers work today).

          Heck, I see people here that still think using an IDE is some sort of scam.

          It's not a scam, IDEs proved th

        • by Kjella ( 173770 )

          I think part of it is realizing what comes in addition to and what comes instead of; personally, I work primarily with databases. Traditional, ACID-compliant monolithic databases where the primary concern is integrity and availability, not scalability. Could I jump on the NoSQL bandwagon? Probably, I'd hardly call what Google and Facebook are doing a "fad". I think it's a fair bet to say that there'll be plenty of work for me anyway though, sure a few positively ancient languages like COBOL are mostly gone but I doubt

        • by pla ( 258480 )
          The thing about technology is: the absolute best thing today, really the best, is utter crap in 20 years.

          I know, right? Fourier transforms? Meh, sooo 1820s! No one except dinosaurs use them anymore... Certainly not the entirety of digital compressed audio and video, nope nope nope!

          All the cool kids today use DCT. No, wait, 40 years old? Can't trust anything over 40! Quick, port everything to wavelets! What? The HWT could legally drink alcohol in the US this year? Fuck, someone come up with a ne
        • by 605dave ( 722736 )

          Yeah, Unix is total crap by now.

      • Speaking for myself, I've been through six different frameworks/versions of "data binding", starting with VB3, now all the way through AngularJS. I've got 20 years of similar examples in DBMSs, distributed protocols, GUI design, testing, requirements, etc.

        It's not that I refuse to learn new technologies, because I've taken on new things every year that I've worked in this field. jQuery? Love it. HTML5, CSS transitions? B-E-A-utiful. Bootstrap? You betcha.

        I do, however, refuse to make all the same mistakes and work through the same leaky abstractions and other problems just to try the new hotness. A great example is the NoSQL movement - now that Postgres supports JSON documents (and has had great K-V support for a while now), I'll be very happy to exploit those features without wrestling MongoDB or Firebase to the ground.
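
        To make the Postgres point concrete, here is a minimal sketch of storing and querying a JSON document through libpq; the table name (events), column name (doc), and connection string are illustrative assumptions of mine, not anything from the post above.

        /* Hypothetical demo: a JSON "document store" in plain Postgres.
         * Build with something like: cc pg_json_demo.c -lpq */
        #include <stdio.h>
        #include <stdlib.h>
        #include <libpq-fe.h>

        int main(void)
        {
            PGconn *conn = PQconnectdb("dbname=demo");   /* connection string is made up */
            if (PQstatus(conn) != CONNECTION_OK) {
                fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
                PQfinish(conn);
                return EXIT_FAILURE;
            }

            /* An ordinary table with a json column serves as the document store. */
            PQclear(PQexec(conn, "CREATE TABLE IF NOT EXISTS events (id serial, doc json)"));
            PQclear(PQexec(conn, "INSERT INTO events (doc) VALUES ('{\"user\":\"alice\",\"action\":\"login\"}')"));

            /* ->> extracts a field as text, so regular SQL can filter on it. */
            PGresult *res = PQexec(conn,
                "SELECT doc->>'user' FROM events WHERE doc->>'action' = 'login'");
            if (PQresultStatus(res) == PGRES_TUPLES_OK) {
                for (int i = 0; i < PQntuples(res); i++)
                    printf("user: %s\n", PQgetvalue(res, i, 0));
            }
            PQclear(res);
            PQfinish(conn);
            return EXIT_SUCCESS;
        }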

      • New technologies are pretty childish at times though. On the other hand I get paid good money to do C and assembler and read schematics, and it's really hard to find twenty somethings who are even capable of understanding the basics anymore.

      • by jafac ( 1449 ) on Saturday May 17, 2014 @02:33AM (#47023955) Homepage

        This is really about how older people are experienced enough to know a boondoggle when they see one. (Example: the cloud, and how it's basically about trying to take control from the user and seeking rent.) Older people don't buy into the bullshit and get off my lawn, and thus are seen as not wanting to embrace new technology. It's not that you can't teach an old dog new tricks, it's that the old dog knows that it's all a bunch of crap.

    • by gweihir ( 88907 ) on Saturday May 17, 2014 @03:27AM (#47024105)

      Matches my experience. Young coders may know the latest language-hype, but usually they cannot generate clean, robust, secure and fast code at all. Using young coders may be at the very root of the problems we have been facing with software development for the last few decades. Even people that have the talent need significant experience before they become any good.

      Seems to be another case where what "everybody" knows is wrong.

      • by spd_rcr ( 537511 )

        Matches my experience. Young coders may know the latest language-hype, but usually they cannot generate clean, robust, secure and fast code at all.

        Ask the employers what skills they really value in an employee, and the answer will be all the soft skills. I'm sure well-written code ranks far higher than any technology buzzword. We had a night on this at our IEEE Com-Sec meetings.

        What I get out of this article is that it's the older employees' fault for not training in new tech on their own time. God forbid a technology company invest in their employees on the company dime and company time. This is the disconnect of the tech industry, and increasingly other mod

  • by Anonymous Coward on Friday May 16, 2014 @06:32PM (#47022251)

    We want people to spend their own time and money to train the skills that we need. There's no way we would invest in such things -- it hurts the bottom line!

    • by erice ( 13380 )

      We want people to spend their own time and money to train the skills that we need. There's no way we would invest in such things -- it hurts the bottom line!

      How daft! We do not want people who have trained themselves. If we wanted someone who learned technology outside of a corporate setting, we would hire someone straight out of college and we don't do that. We want other companies to train you.

    • You think Software Development is bad for this? At least the equipment is inexpensive and the material accessible.

      In aviation, you'll pay > $60,000 of your own money to get your ATPL, all to start on a wage of $25,000.

      What about medical school or law school? That's pretty expensive and comes out of your pocket.

      Many serious professions require you to spend money on your training. It just comes with the territory.

  • by bunyip ( 17018 ) on Friday May 16, 2014 @06:35PM (#47022263)

    One of my colleagues is in his mid-60s, and happily puttering around in modern technologies and adapting what he knows about systems to the latest tools. Writing prototype code in Clojure, using graph databases (Neo4j), doing interesting data modeling and generally just making stuff happen. He's learning new stuff every day, having fun - and getting to say no to job offers on a regular basis. I've been in this industry for more than 30 years and I'm currently mucking around with Hadoop, cloud computing and a bunch of the new things.

    People talk about time to learn, but it's a question of making time. Would you want to visit a doctor that hasn't updated their skills in 20 years?

    Alan.

    • Re:Yes, and No. (Score:5, Insightful)

      by Jane Q. Public ( 1010737 ) on Friday May 16, 2014 @07:03PM (#47022393)
      We've hashed this out on Slashdot before, more than once. OP is just wrong that older programmers in general don't keep up.

      Study after study has shown that older programmers are generally more productive, even after adjusting for the higher salary they tend to expect.

      While he appears to be genuinely sympathetic, his personal theories don't quite qualify as statistics.
      • Comment removed (Score:5, Interesting)

        by account_deleted ( 4530225 ) on Friday May 16, 2014 @08:24PM (#47022787)
        Comment removed based on user account deletion
      • Actually for me, keeping up would be learning Java. Most things newer than that are irrelevant for most things I do. I used to know Java, then the language changed, then it changed again, and now knowing the language is irrelevant as you have to know the frameworks instead.

        Keeping up with skills is sort of irrelevant when all the skills you learn are learned on your own. If someone can and has written their own language or OS, is it really necessary for them to know some temporarily fashionable language?

      • Well articulated, Ms. P., very well articulated, but /.'s parent company has been offshoring jobs for quite some time; no doubt that's the agenda behind this repetition!
    • by TapeCutter ( 624760 ) on Friday May 16, 2014 @07:46PM (#47022637) Journal

      I've been in this industry for more than 30 years and I'm currently mucking around with Hadoop

      I'm 55 with 25 years' experience. I picked up NIS scripting for work earlier this year and am currently playing with CUDA, and at least 3/4 of the developers I work with on a daily basis are over 40. My dear old dad is 80; he's a retired engineer who started programming as a hobby at 70.

      I have never been discriminated against because of my age, nor have I seen it happen to anyone else. If such practices exist (in Australia), I think they are limited to small outfits run by cheapskates and crooks. Shitty companies in any industry will always want to hire young people simply because they are cheaper and more easily manipulated. If you're so old that you can't learn a new technology, then it's time to retire and get your Alzheimer's problem looked at.

      • by gnasher719 ( 869701 ) on Saturday May 17, 2014 @03:54AM (#47024191)

        I have never been discriminated against because of my age, nor have I seen it happen to anyone else. If such practices exist (in Australia), I think they are limited to small outfits run by cheapskates and crooks. Shitty companies in any industry will always want to hire young people simply because they are cheaper and more easily manipulated. If you're so old that you can't learn a new technology, then it's time to retire and get your Alzheimer's problem looked at.

        Case 1: Idiot manager thinks that people should work 60 or 80 hours a week (obviously without compensation). Young, inexperienced developer might do it. Experienced developer tells him to shove it.

        Case 2: Shitty company runs out of money. Young, inexperienced developer can be tricked into accepting empty promises instead of payment. Experienced developer tells them to shove it.

        So that would be two kinds of situations where a young, inexperienced developer would be preferred.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      The thing is that most new tools are pretty much the same shit in a different package, yet most employers think their particular infatuation of the moment is unique and you have to have experience with it. I used Hadoop in a project and adapting to it was such a non-event that I had to double-check just now that I actually did use it. Yet if an employer wants someone to do some work using Hadoop, you can bet the vast majority of them will consider it a hard requirement.

      • Uhuh. All languages are Algol to me. I haven't seen anything really new in decades. Mostly, it is the same old stuff warmed over with new names for old concepts.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Would you want to visit a doctor that hasn't updated their skills in 20 years?

      Alan.

      The difference is that software companies won't bother to spend time trying to teach their employees new technologies. Doctors always have pharmaceutical companies banging on their doors to tell them about the newest drugs they can prescribe to their patients.

  • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Friday May 16, 2014 @06:36PM (#47022267)

    For developers, it's skills like big data, cloud computing, and HTML5.

    Buzzword, buzzword, markup language.

    As a result, we do find that we face a shortage of older, more seasoned developers. And it's not because we don't want older candidates. It's often because the older candidates haven't successfully modernized their developer skills.

    I find it difficult to believe that a developer would NOT be able to pick up HTML5 in a weekend.

    • When I went back to school for my Master's degree everything was being taught in Java as the new teaching language. It took me less than a day to pick up enough to do the assignments competently. Admittedly jumping from a C background into Java is not a huge leap, but in the end it's all just syntax. Programming principles never change.
      • by Anonymous Coward

        Wellllll... In that particular case, I'm guessing your Java code -- at least initially -- was essentially C code with slightly different grammar and syntax. Although I definitely think people should hire for the talent and experience and not the specific skill, I also think a programmer needs to have had enough experience or education with various programming models (e.g., imperative, OO, functional) before they really grok each of them and are able to use them and their relatives as a "native" language rat

        • I taught C to 2nd-year uni students in the early '90s, but it wasn't until later that I realised that virtually every example in K&R is very elegant object-oriented code that was written well before the term "object-oriented" came into use. I looked at Java when it came out; its main claim to fame at the time was "portability". I thought to myself "reinvented p-code?" and pretty much ignored it. A good grounding in C will make it easier to jump to any language, you just need to picture how the "new"
          • A good grounding in C? It is all Algol to me. Nothing new has been invented in comp sci since about 1970. Changing the name of a concept doesn't make it a new concept.
        • He never said he didn't know those programming models. Remember that OO existed years before C++ or Java, and can be, and has been, done in C and Pascal (see the sketch below).

          I think I got lucky because as an undergrad I took a comparative study of programming languages class; optional but I wanted to take all the classes. Really learned how to do things in many different ways and also more subtleties and names of the various features (sad to run across professionals who have a blank look when I mention lvalues or call-by-reference
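
          For readers who haven't seen it, here is a minimal sketch of "OO in C" as described above; the shape/circle names and the single-function "vtable" are purely illustrative, not from any of the posts.

          #include <stdio.h>

          /* The "class" interface: a struct of function pointers. */
          struct shape {
              double (*area)(const struct shape *self);   /* a "virtual method" */
          };

          /* "Inheritance" by embedding the base struct as the first member. */
          struct circle {
              struct shape base;
              double radius;
          };

          static double circle_area(const struct shape *self)
          {
              const struct circle *c = (const struct circle *)self;   /* downcast */
              return 3.141592653589793 * c->radius * c->radius;
          }

          int main(void)
          {
              struct circle c = { { circle_area }, 2.0 };
              struct shape *s = &c.base;            /* handle it through the base type */
              printf("area = %f\n", s->area(s));    /* dynamic dispatch, C style       */
              return 0;
          }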

      • by mjr167 ( 2477430 )

        Admittedly jumping from a C background into Java is not a huge leap, but in the end it's all just syntax. Programming principles never change.

        Apparently people have trouble going from Java to C...

        I'm currently involved in migrating a large legacy C/C++ project to new hardware and updating our external interfaces. They gave us a bunch of Java programmers to help us out, and they can't seem to wrap their heads around the fact that if the system isn't behaving the way legacy did, they are supposed to read the code and figure it out on their own. Apparently if Eclipse won't highlight the line or an error message doesn't get printed to the screen exp

        • by pauljlucas ( 529435 ) on Friday May 16, 2014 @08:03PM (#47022711) Homepage Journal

          Apparently people have trouble going from Java to C...

          You're not the first to notice this [joelonsoftware.com].

          • Started reading, because I'm usually happy to read a well-written rant about why Java sucks. I'm not exactly a fan myself.

            So he starts off with stuff about how he's feeling old and the surest sign of it is bitching about "kids these days". He's wrong. That's not the surest sign. This was:

            Instead what I'd like to claim is that Java is not, generally, a hard enough programming language that it can be used to discriminate between great programmers and mediocre programmers.

            Got to that point and decided that it's an obviously unsupportable premise. Read a little bit more, and my takeaway is that Joel doesn't know how to spot a good programmer unless they're working in C.

            • C has been described as a wrapper for assembly language, and as such it requires that you really understand how the computer processor works to do anything non-trivial. C++ allows you to do that as well, but C really enforces it - and makes you think about building your own libraries of routines to do the higher order abstractions yourself.

              This is valuable because most higher abstraction entry level languages today don't give you that experience (e.g. Java) - which really is what is important when design

              • by seebs ( 15766 ) on Saturday May 17, 2014 @01:45AM (#47023781) Homepage

                I hear that a lot, but I genuinely don't buy it.

                I'm a pretty good C programmer, by most accounts. I have a reasonable track record producing code that solves interesting problems, and very good reliability.

                And this absolutely does not require me to understand how the processor works. In fact, it's sort of the opposite; the reason I'm good at C is that I mostly ignore the processor question and focus on how the language spec works. So I write code that's correct without guessing at what CPUs will do with it.

                I've been writing C for >20 years. I've probably looked at assembly output maybe a dozen times in that time, maybe a little more but not much. I've tried to modify assembly code maybe twice tops. I don't know any assembly languages well enough to follow code in them without looking things up, and I generally can't tell you off the top of my head much of anything about a machine's addressing models or registers or whatever, unless the question came up as trivia. And I do just fine in C.

                • And [C] absolutely does not require me to understand how the processor works.

                  Well, then you're not writing code that has to run fast. Consider this talk [msdn.com]. Yes, it's on C++, but the point is that, at least for code that's used a lot (like the page-display code at Facebook), shaving 1% off the running time saves the company an "engineer's salary for 10 years."

                  In order to achieve that level of performance, you really do need to understand what's going on at the CPU level.

              • by sourcerror ( 1718066 ) on Saturday May 17, 2014 @06:38AM (#47024579)

                So you had a developer who doesn't know about Java's garbage collection, and the solution is to teach him C?

            • So he starts off with stuff about how he's feeling old and the surest sign of it is bitching about "kids these days".

              You need to have read more of Joel's writing. That's just his irreverent style.

              Got to that point and decided that it's an obviously unsupportable premise. Read a little bit more, and my takeaway is that Joel doesn't know how to spot a good programmer unless they're working in C.

              His premise is that, in order to be a good programmer, you need the right kind of mental aptitude which is a you-

        • I had a group of students who wanted more extension time on their project (8 weeks into the project) because all the previous classes used C++ and they weren't used to using C. I suspect they were just giving a lame excuse, though they weren't bright enough to realize how lame an excuse it really was.

      • I learned C++ in a weekend since I was a teaching assistant for the class teaching C++ and had to learn it before the students showed up. And this was as an undergrad. And it was not just C with some changed syntax; it was actual OO stuff, like when I learned Lisp I actually programmed with it in a functional style.

        • You probably meant that you've learned a small but useful subset of C++ in a weekend. Otherwise you'd be a true Heinrich Schliemann of programming languages.
      • "jumping from a C background into Java is not a huge leap, but in the end it's all just syntax"

        Going from procedural to object-oriented is NOT all just syntax.

        Of course you can implement some OO concepts in C but then, you need to know about them beforehand, since C doesn't naturally lead to them.

        If you were just spouting procedural code in Java syntax (I've seen that before) then, well, it's only syntax, but you sorely missed the point.

    • "So, we're looking for someone with 5+ years programming in buzzword for buzzword using a buzzword methodology. Oh, and must work cheap. We play hard and work harder, so you should be glad just to join us even without being paid."

    • by sjames ( 1099 )

      Funny thing about big data. It bears a remarkable resemblance to the old mainframe days when data processing was a matter of streaming through tape after tape outputting successively more processed and more condensed data to a scratch tape which would be the input for the next step. In its day, that was big data because there was no way to make it all fit in core. 'Core' is orders of magnitude larger these days, and so is the data. Instead of mounting tapes, it tends to be other machines on the network con

  • Short Sighted (Score:5, Insightful)

    by ImprovOmega ( 744717 ) on Friday May 16, 2014 @06:40PM (#47022287)

    When you go to hire a developer you're not just looking to hire someone who can code in the latest fad language/API/SDK. You need someone who knows software development like a captain knows his ship. I promise you that 20+ years of software development will be worth way more than the 22 year old kid who knows Ruby on Rails because he learned it while studying in college. That experienced developer can pick up whatever tool your company standardized on and yeah, it may be three months before he's all the way up to speed on it, but then the years of experience will begin to make themselves tellingly felt vs. a kid who happens to know the tool already.

    Hiring for the tool is stupid. It would be like looking for a columnist who specifically has Microsoft Office 2013 experience and filtering out all the applicants who only used Google Docs in their previous jobs. Either one of them can write copy.

    • by Anonymous Coward on Friday May 16, 2014 @06:46PM (#47022315)

      Even better would be the 20 year veteran who can take those fresh out of school enthusiastic newbies and get high quality software out of them on a predictable schedule, without the "back in the day, we coded with patch cords on EAM equipment". Or the 20 year vet who is doing the new stuff and the old stuff, and can help the inexperienced new stuff guys and gals avoid the traps.

      Face it, on a large project, there aren't enough skilled veterans on the market to get the job done, you MUST do it with average or below average folks. The challenge is seeding the crowd with just enough experience so that all those contributors are net positive, no matter how small.

      • So what about the 20 year veteran who knows how to program her way out of a paper bag, AND knows the latest web technologies?

        Some people seem to think those are mutually exclusive.
      • Even better would be the 20 year veteran who can take those fresh out of school enthusiastic newbies and get high quality software out of them on a predictable schedule

        Yeah, that one might be mythical.

    • by ysth ( 1368415 )

      This. Though that three months sounds exceedingly generous to me. It takes very little time to get up to speed enough to start working with a new fad/language/API/SDK, especially if you are willing to bare your ignorance by asking questions where needed.

      • by samantha ( 68231 ) *

        It very much depends on what it is you are learning. There is no way you are going to be a reasonably proficient Scala programmer in less than 3 months. Frankly I find that until I work with a language full time for a year I certainly cannot claim to be an expert in it. Also there is time needed to learn the new gig's software stack and its history, which is non-zero. It usually takes 1-2 months depending on body of code to have some idea what one is talking about. People that say they can do it faster a

    • Yup. In 1999 I was almost 57 and got laid off. One company I sent my resume to completely refused to talk to me because my resume showed no Visual Basic experience. The fact they told me so was phenomenally unusual.

      Then a former boss snapped me up at his new company when he heard I was available. The first day on the job, I was helping a young developer write some test code in Visual Basic. While I had never tried to use Visual Basic before, the issues being dealt with were matters of logic and algorithm
  • by turkeydance ( 1266624 ) on Friday May 16, 2014 @06:42PM (#47022295)
    It's about the money. Same with age.
    • by Guppy06 ( 410832 )
      Even age is about money. Benefits issues aside, older workers are more likely to know when they're being underpaid.
  • by pooh666 ( 624584 ) on Friday May 16, 2014 @06:51PM (#47022335)
    I am so sick of this same FA reposted more or less every week or two.
  • by kye4u ( 2686257 ) on Friday May 16, 2014 @06:55PM (#47022357)
    Companies often prefer younger developers because they are cheaper. It is as simple as that.
    That older, incompetent developer was probably just as incompetent when he/she was in their 20s.
    • by samantha ( 68231 ) *

      I have actually had hiring managers try to claim they want people with no more than 5 years' experience because that codes for how up to date their skills are. No, it doesn't. Some colleges teach little but Java, for instance. If the first job or two after was mandating an existing Java stack, then it is guaranteed the developer is no more up to date than a more seasoned developer that has seen more environments and has had to learn many more new things. With greater breadth learning new languages and API

    • Yup, they want cheap workers on their virtual assembly line.

      • by gweihir ( 88907 )

        Indeed. Only problem is that it is not an assembly line and viewing it as such kills productivity, quality, security, etc.

    • by gweihir ( 88907 )

      And on the other side, these companies routinely fail Capitalism 101, because it is not about cost, it is about the cost-to-benefit ratio. The problem they have is that they are unable to recognize that it is not "programmer, one unit", but that productivity varies wildly, and especially inexperienced or non-talented people can easily have negative productivity.

  • by Tablizer ( 95088 ) on Friday May 16, 2014 @07:29PM (#47022547) Journal

    make me!

  • by MacTO ( 1161105 ) on Friday May 16, 2014 @07:29PM (#47022555)

    I've found that young vs. old is a trade-off.

    Older workers frequently have a better work ethic in the workplace, and have more experience to draw upon. Younger workers have a better work ethic in terms of the amount of time they are willing to dedicate to work and frequently (but not always) contribute new ideas.

    What it seems to come down to is: do you want experienced workers who will contribute more per hour, but who will also draw a firm line between their work and personal life, or a young worker who is willing to put in the extra time, even though a lot of their time will be spent relearning what a more experienced worker already knows?

    I suppose software development also has other factors. Some products depend upon experienced developers (e.g. anything considered mission critical) while other products depend upon fresh ideas (e.g. most software targeted at consumers).

    • You didn't say what industry you were describing. In every industry, experience generally leads towards higher productivity. But in software development, experience often leads to productivity that is orders of magnitude higher. A competent older software engineer can run circles around a younger worker, even if that younger worker puts in lots of hours.

      • It depends on how you define productivity though. It's a strange thing to try to measure, but when they do, they often use silly metrics like lines of code written or bug tickets closed, etc. If the product is dead in the water and not shipping until a bug is fixed it is almost always the experienced person who figures out the problem, often while seeming to stare off into space.

  • I've watched a dozen, or so, "new, cool" methodologies, languages, and tools come and go over the years, mostly because some screwball "consultants" or publishers needed to sell books and training, or because managers needed to look useful to their organizational superiors. If a person has actual programming skills (understand a problem to be solved, state a solution in a form that a computer can understand and a human can maintain, choose an appropriate language/tool set in which to implement her specific compo

  • To just tell the prospective employer that you have the skill, and learn it if you get the job.

    That is tech. It takes a lot of time and effort to get good at programming; No one can know all languages, but it only takes about a week to be moderately proficient at any single one. When you are hiring a new programmer, do you really want to hire some JS code monkey (even if that is the only language you currently need developers for), or do you want to hire an experienced software developer who has the ability

    • To just tell the prospective employer that you have the skill, and learn it if you get the job.

      Ooooooo. No.

      When I interview candidates, I get people all the time who claim to have a certain skill on their resume, even answering in the affirmative when asked directly if they have that skill. A simple question or two about the technology is usually all it takes to determine if they're lying. Some of them then actually admit it, saying "the recruiter told me to put that on my resume". I don't really care

      • I honestly don't know why this happens. Seriously, if someone puts something on a resume, don't they realize that the interviewer might actually ask a question about that thing? I'm not just talking about fresh-out-of-school kids who put down everything they ever did in a class, but people who claim to have actual experience do this too.

        The other thing I see a lot is the person who writes down all the stuff their company or team did, without writing what they did themselves. It's sort of like name dropping i

    • You also want the employees to know _more_ than just the language. The language is the simple stuff. I have interviewed people who can't even describe in broad terms what the basic functional blocks in the product they worked on were (and not even using an excuse about not having an NDA). Knowing what you are doing with the programming language is more important than the language. If someone says they have 5+ years doing "embedded Linux" and yet they haven't the first idea about how to start to write a

  • One of the best developers I know is over 50 years old; the second-best programmer I know (not me) is 25 years old. Age means nothing. What matters is natural talent: some programmers can sit down and write great firmware in a night and some can't write it in a year (substitute "program" for "firmware" as needed).
    • by gweihir ( 88907 )

      Indeed. The issue most hiring managers do not get is that mediocre or bad programmers may never produce an acceptable result at all, no matter how much time they are given. I have seen that several times, where failure after failure happened and those in charge were too stupid to see that the "cheap" people they hired just could not hack it. Well, I guess they hired people similar to themselves...

  • by Fear the Clam ( 230933 ) on Friday May 16, 2014 @09:01PM (#47022923)

    As programmers get older they simply get less excited about the idea of pulling all-nighters and doing "code sprints" because they have spouses and families they enjoy, responsibilities to others outside of work, and they know that this isn't a good process for long-term success. All-nighters are fun and adventurous when you're in college or just out of school, but after a few decades in the working world you've seen it all before and simply refuse to get caught up in another "emergency" caused by poor planning, unrealistic expectations, and marketing promises.

    I'm not saying that programming is a young person's game--far from it. However, inexperienced workers are not only cheaper, but also far more likely to put up with bullshit and bad management.

  • by Maxo-Texas ( 864189 ) on Friday May 16, 2014 @09:30PM (#47023025)

    Who is a friend of mine that said to me casually, "Yeah, I wanted to build a team of young people that I could hang out with, so I didn't hire anyone old." Old here being over 35!

    In IT, age discrimination is blatant. It starts at 45. You should always keep your skillset up, but it probably won't help much, because many young 28-year-old managers are just flat-out not going to hire an "old geezer" who is 45 unless they are the only viable candidate.

  • Young MAN's game? (Score:5, Insightful)

    by Malkin ( 133793 ) on Saturday May 17, 2014 @12:06AM (#47023493)

    Everybody knows software development is a "young man's game"? Did you seriously say that?

    HELLS no, man.

    First off: I've been programming since I was 8, but I was never a man, and I will never be a man, and I have never suffered under the idiotic delusion that this was ever exclusively a man's game -- young or otherwise. This is my game.

    I am still programming at 40, and I assure you that youth offers no advantages over experience, either.

    But, that doesn't stop me from mentoring. My interns may not be able to program like I do, but I'll give 'em every advantage I can. It's great to teach them some of those intrinsics that they don't get in school. That gives them some of the advantages of an experienced developer, even if they're younger. This isn't a zero-sum game. We all need good devs, so we should try to make everyone who is working with us better -- whether they are young or old. We all get better software, that way.

  • My advice is: train your analytic skills and understand where things go right, go wrong, or just turn out differently. This can only be done by experience. While I learn programming languages more slowly than I did at 25, I have learned to analyze code. Having seen code written by many very different people (everything from physics professors to psychologists), I understand the idea of most code better than the authors (since I see the limitations the author is placed under). If you apply your analytic experience and skills to the prob

  • "Grow Up"? Seriously? Some of us started coding at single digit ages. I "grew up" at age 17, when I was homeless and fending for myself on the streets. Patronize someone else, moron, you don't even know what life is. Ever seen someone's skull stomped in? You learn real quick what's actually important in life once yours is on the line. I learned real quick to have a plan B: Always have a contingency plan. Idiots without one are not, "grown up."

    I've forgotten more languages than most have learned, but I'd be fine with folks not being considered programmer material at age 40 if they would hire from within for management positions. Instead of employing middle-management drones with unrelated "business" (Secretary++) degrees, give the folks with actual hands-on experience the job of managing the people in the job they actually know how to do. Face it: those HR goons are morons, they can't tell good from evil -- or else explain how the odd Napoleon complexes and micro-dictators in management even got there? If HR wasn't dumb as rocks they'd require demonstrations of skill, a coding test, not accreditations: degree mills exist, fools, and this is especially true overseas. Ah, but that's getting to the real issue: skill sets aren't what's really important to upper management... TFA's author isn't as "grown up" as they think.

    The new platforms will keep coming, but the solutions will largely be the same. Now I can undercut competition by barging onto any new platform with my whole codebase faster than the other guy can tell you why the new language is "More Expressive". I just have to implement a new "platform runtime" for my meta compiler and then I can check off that platform as a capability and deploy ALL of my existing solutions on the platform, since they're written in an even higher-level language, and compile down into other languages. Sometimes this means we have to implement features the target language doesn't have -- if I need a coroutine in C: when returning to the trampoline, record the exit point; when calling into the coroutine, specify the entry point to resume at. I generate a switch with GOTOs to resume from the next point of the operations (GOTO is very valuable, only idiots say otherwise; look at any VM). Lambda-lifting mutable persistent state out of the function scope has the benefit of thread safety. Since I treat comments as first-class lexical tokens, the compiler-compiler's output is fully readable and commented code in the target output language, following whatever style guide you want. (LOL @ brace placement arguments, what noobs)
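
    For anyone who hasn't seen the trick, here is a minimal sketch of the switch-and-goto coroutine described above; the names step() and co_state are mine, purely illustrative, and real meta-compiler output would of course be generated rather than hand-written.

    #include <stdio.h>

    struct co_state {
        int resume_point;   /* 0 = start from the top */
        int i;              /* persistent state lifted out of the function scope */
    };

    /* Yields 0, 1, 2 one call at a time, then -1 when finished. */
    static int step(struct co_state *co)
    {
        switch (co->resume_point) {   /* jump back to where we left off */
        case 0: goto entry0;
        case 1: goto entry1;
        }

    entry0:
        for (co->i = 0; co->i < 3; co->i++) {
            co->resume_point = 1;     /* record the exit point...            */
            return co->i;             /* ...and yield back to the trampoline */
    entry1:;                          /* the next call resumes right here    */
        }
        co->resume_point = 0;
        return -1;
    }

    int main(void)
    {
        struct co_state co = {0};
        int v;
        while ((v = step(&co)) != -1)   /* the "trampoline": keep calling in */
            printf("yielded %d\n", v);
        return 0;
    }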

    See, experienced coders understand languages so well they aren't even languages to us; they're just problem-solving APIs: the problem-space is independent of the implementation's solution-space. When we pick up new languages or platforms we're asking, "How does this language let me solve problem $X?", but more importantly our experience lets us identify which problems the platform lends itself to solving. Just because a new platform comes out doesn't mean it's more capable of solving every problem. Do this long enough and you'll get tired of re-inventing all your wheels in each new platform and just create a meta compiler, as I've done.

    Fortunately I've always crossed off (and initialed) that employment contract paragraph that said everything I would create (even off the clock) would belong to the company: "I don't watch TV. I have several hobby projects I do instead and they need to remain mine. If you want me to give up my hobbies while working here you're going to have to pay me a lot more. Would you sign a contract to work somewhere that said you couldn't ever watch TV?" Protip: Most places have another employment contract without that clause, just tell them you make indie games or have a robotics / embedded systems project, contribute to open source in your spare time, etc. Make your hobby profitable. That way you can always have a plan B, and you'll have more leverage in any salary negotiations: "

  • If a fellow gets married, *then* his creativity and productivity plummet. His time is no longer his own. When I regard the fellows I work with, the guys over 40 who avoided marriage, or have been divorced for a while, are the "top guns", to put it in U.S. terms. They have both creativity *and* a lot of experience, which makes them almost impossible to beat over the course of many months.
    • You are tagging a group of people based on no logic, and grossly insufficient evidence. This is no different from racial discrimination.

    • because all the premier engineers and scientists weren't married when they did their greatest works?

      Let's see: Einstein married in 1903, then in 1905 produced the Annus Mirabilis papers, which laid down the foundations of modern physics, both quantum and relativistic.

      Linus Pauling married in 1923, then from 1927 to 1932 created Pauling's rules and wrote fifty papers on quantum mechanics explaining the chemical properties of atoms and molecules. After that, he invented the concept of electronegativity.

      James C. Maxwell, married 1858, and

  • I constantly see memes like "It's often because the older candidates haven't successfully modernized their developer skills," but I have never found that to be even remotely true. I have worked in IT for over 30 years, for several companies, big and small.

    Nobody claims that sort of crap about doctors, lawyers, accountants, even most engineers, or scientists.

    If all of those professionals can keep up with changes in their field of work, and study, then why not software developer

  • My boss and I routinely look at new tools and technology with an eye to solving our company's problems and build cool new stuff. Our goal is not to embrace flavour-of-the-month technology. It's to identify better solutions to old problems, or find good solutions to new problems. Tools have to work, or they serve no purpose. Everything else follows from there.

    We do most of our development in C on Linux, but have incorporated virtualization and cloud computing, new technologies that provide better solutions
