What Today's Coders Don't Know and Why It Matters 368
jfruhlinger writes "Today's programmers have much more advanced languages and more forgiving hardware to play with — but it seems many have forgotten some of the lessons their predecessors picked up in a more resource-constrained era. Newer programmers are less adept at identifying hardware constraints and errors, developing thorough specifications before coding, and low-level skills like programming in assembly language. You never know when a seemingly obsolete skill will come in handy. For instance, Web developers who cut their teeth in the days of 14.4 Kbps modems have a leg up in writing apps for laggy wireless networks."
Re:It doesn't matter. (Score:4, Interesting)
Is it truly cheaper to be sloppy? Hardware keeps getting cheaper and faster, sure, but it isn't matching the pace at which code is getting slower.
Just look at your average web server. Today's machines are over a hundred times faster than they were 10 years ago, and we're not doing anything significantly different. Serving up text and graphics, processing forms, same old b.s. So then, why aren't we serving 100 times more pages per second? Apache keeps getting fatter, PHP seems to do a "sleep(100)" after each line, and don't even get me started on Ruby.
There was a time, not so long ago, when I would spend an hour tweaking an oft-used assembler loop, and the end result was a 3x speedup or more. I'm not saying we should rewrite everything in assembler, but I think we've become so far removed from the actual machine, relying on the compiler to "do the right thing", that people don't even have the slightest clue how to distinguish fast code from slow. How often do we use benchmarks to test different solutions to the same problem? Almost never! People bust out the profiler only when things go wrong, and even then they might say "just add CPU/RAM/SSD" and call it a day.
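Benchmarking two solutions to the same problem doesn't have to be a big production. Here's a minimal sketch using Python's standard timeit module (the task and the numbers are made up for illustration):

```python
import timeit

# Two ways to build a comma-separated string of 10,000 numbers.
def concat_loop():
    s = ""
    for i in range(10_000):
        s += str(i) + ","
    return s

def concat_join():
    return ",".join(str(i) for i in range(10_000)) + ","

# Measure both against the same workload instead of guessing.
loop_time = timeit.timeit(concat_loop, number=100)
join_time = timeit.timeit(concat_join, number=100)
print(f"loop: {loop_time:.3f}s  join: {join_time:.3f}s")
```

Five minutes of this tells you more than any amount of arguing about which approach "feels" faster.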
Or, if we must motivate the hippies, call it "green optimisation". Yes, faster code finishes quicker, using less power to complete the same job. If we're dealing with web servers, faster code would require fewer cluster nodes, or maybe free up some CPU time for another VM on the host, and those 60A circuits aren't cheap either. If spending an extra day optimizing my code could save me $2000 / mo off my colo bill, I'd be a fool not to invest that time.
Re:It doesn't matter. (Score:4, Interesting)
As an example, consider a database with very poorly designed primary and secondary keys. This choice will either:
a) not matter in the least, because the tables are so small and the queries so infrequent that the other latencies (e.g. network, hard disk) dwarf the poor key choice, or
b) quickly make the entire database unusable, as the time it takes for the database software to search through every record for rows matching the given query takes forever and just kills the disk.
I've seen plenty of (b), largely caused by total ignorance of how databases, and the hard disks they live on, work. The indices are there not only for data modeling purposes, but also to minimize the amount of uber expensive disk I/O necessary to perform most queries. And while you may be in situation (a) today, if your product is worth a damn it may very well get bigger and bigger until you end up in (b), and by the time you are in (b) it may be very, very expensive to rework the application to fix the performance issues (if they can even be fixed without a total re-write).
Anyone who codes should have at least a fundamental grasp of computer architecture, and realize what computer hardware is good at, and bad at. That isn't "premature optimization" as you seem to think it is; it is a fundamental part of software design. "Premature optimization" in this context means things like changing your code to avoid an extra integer comparison or two, that kind of thing. It is not "let's just pretend the computer is a magic device and ignore performance concerns because it's easier".
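To make the index point concrete, here's a minimal sketch using Python's built-in sqlite3 module (the schema and data are invented for illustration): the exact same query goes from a full-table scan to an index search once an index exists on the filtered column.

```python
import sqlite3

# Invented schema, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, i % 1000, i * 1.5) for i in range(100_000)],
)
query = "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"

# Without an index, the planner has no choice but to scan every row.
plan_before = conn.execute(query).fetchone()[-1]
print(plan_before)  # plan text contains "SCAN"

# With an index on the filtered column, the same query becomes a search.
conn.execute("CREATE INDEX idx_customer ON orders (customer_id)")
plan_after = conn.execute(query).fetchone()[-1]
print(plan_after)  # plan text contains "SEARCH ... USING INDEX idx_customer"
```

On a table this small both plans return quickly; on disk, with millions of rows, the scan is exactly the "search through every record" disk-killer described above.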
Re:This... (Score:5, Interesting)
The worse code, yes, but code that runs fast on old / not-so-fast hardware.
Example: http://en.wikipedia.org/wiki/Fast_inverse_square_root [wikipedia.org]
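The trick behind that link can even be sketched in Python, reinterpreting the float's bits with the standard struct module (this uses the classic magic constant from the Quake III source, plus one Newton-Raphson step):

```python
import struct

def fast_inv_sqrt(x: float) -> float:
    """Approximate 1/sqrt(x) via the famous bit-level trick."""
    # Reinterpret the 32-bit float's bits as an unsigned integer.
    i = struct.unpack("<I", struct.pack("<f", x))[0]
    # The "magic" constant turns that integer into a rough first guess.
    i = 0x5F3759DF - (i >> 1)
    y = struct.unpack("<f", struct.pack("<I", i))[0]
    # One Newton-Raphson iteration refines the guess.
    y *= 1.5 - 0.5 * x * y * y
    return y

print(fast_inv_sqrt(4.0))  # roughly 0.5, within about 0.2%
```

In Python this is a curiosity, not an optimization; the point is that the technique only makes sense if you understand how floats are laid out in memory, which is exactly the kind of knowledge the article says is fading.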
Where are they finding these "new coders?" (Score:2, Interesting)
Every once in a while I read an article on Slashdot about how the current generation of programmers churn out only the shittiest of code; how they have no idea how a computer works; how they could never program without a fully-featured IDE with Intellisense that renders instantly. As an undergrad in a CS/ECE discipline, this has always surprised me: I can only speak for the curriculum at my school, but I can assure you, these mythical 'lost programming skills' are alive and well. Most of the supposed missing skills are addressed in mandatory courses required for graduation from CS/ECE at my school.
Yeah, sure, some folks at the bottom of the class may be shitty programmers who scraped by with a "barely passed" in the mandatory courses. But surely those existed 10, 20, 30, 40 years ago as well. I'm not sure where people are getting this idea that new coders have no skills. What do they think we do for four years, set up MySQL and write PHP for it?
Re:The problem is (Score:5, Interesting)
There aren't good engineering practices in software. This is why I abjectly refuse to call myself an engineer (that and I'm *not* an engineer). Can you tell me with a known degree of certainty the probability that a software component will fail? What procedures would you put into place to give you that number (keeping in mind the halting problem)? What procedures would you put into place to mitigate those failures? Because I'm drawing a big gigantic blank here.
Look, I'm all for using "Software Engineering" practices. Personally, in my career I have championed TDD, peer review, acceptance tests written in advance by someone other than the programmers, etc, etc, etc. But this isn't engineering. The best I can tell anyone is, "Hey, it doesn't break on *my* machine. Yeah, I think the probability of it breaking on your machine is 'low'. No, I wouldn't like to specify a number, thank you very much." Why do you think software never comes with a warranty?
I often wonder what we could do to actually make this an engineering discipline. For one thing, I think we really need to invest in developing stochastic testing techniques. We need to be able to characterise all the inputs a program can take and to test them automatically in a variety of different ways. But this is the stuff of research. There are some things we can do now, but it's all fairly nascent technology. Maybe in 20 years... :-P
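A crude version of that idea is possible today with nothing but the standard library: generate random inputs and check properties that must always hold, rather than hand-picking a few examples. A hypothetical sketch (the function names and property set here are made up):

```python
import random

def check_sort_properties(sort_fn, trials=1000):
    """Feed random inputs to sort_fn and check invariants every sort must satisfy."""
    for _ in range(trials):
        data = [random.randint(-100, 100) for _ in range(random.randint(0, 50))]
        out = sort_fn(list(data))
        # Invariant 1: the output is ordered.
        assert all(a <= b for a, b in zip(out, out[1:])), f"not sorted: {out}"
        # Invariant 2: the output is a permutation of the input.
        assert sorted(out) == sorted(data), f"elements lost/added: {data} -> {out}"
    return True

print(check_sort_properties(sorted))  # True for a correct sort
```

It's not a failure probability with error bars, but characterising the input space and sampling it automatically is at least a step toward one.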
Yesterday's coders aren't necessarily relevant (Score:5, Interesting)
Some old-school developers prematurely optimize for things we no longer need to optimize for (and shouldn't). From an older post of mine [slashdot.org]:
He was optimizing for resources that were no longer constrained, and consequently pessimizing for the resources we actually cared about. RAM? Dirt cheap, at least for the dataset sizes involved in that project. Much more expensive was all the extra disk and CPU load he was creating on the company-wide database server (which is sufficiently powerful to serve the entire company when it's not being deliberately assaulted).
I'm not "anything goes" by any means, and I'm the guy responsible for making sure that lots of processes can peacefully coexist on an efficiently small number of servers. But for all intents and purposes, most of our apps have unlimited resources available to them. If they want to use 100% of a CPU core for 5 minutes or 2GB of RAM for half an hour a day, so be it. I'd much rather run simple, testable, maintainable code that happens to use a lot of server power than lovingly hand-mangled code that no one but the original programmer can understand and which interacts with the rest of the network in entertainingly unpredictable ways.
Whither specs? (Score:2, Interesting)
We have more code verification, static typing, contracts, tests and assertions than ever. Somehow I doubt the above.
And if by 'specification', you mean a spec on paper separate from the code, that's because the lessons of the past have taught us that specs are never current, so what's the point? With more expressive, higher-level languages, the programming language becomes the specification language, and a specification in such a language is then naturally executable, i.e. the code is the specification with sophisticated types, or an executable program can be extracted from the specification a la Coq.
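Even without Coq, Python gives a small taste of this: type annotations plus doctests put the specification right next to the code and keep it executable, so it can't silently drift out of date the way a paper spec does. A minimal sketch:

```python
def gcd(a: int, b: int) -> int:
    """Greatest common divisor; the examples below are the executable spec.

    >>> gcd(12, 18)
    6
    >>> gcd(7, 0)
    7
    """
    while b:
        a, b = b, a % b
    return abs(a)

if __name__ == "__main__":
    import doctest
    # Re-running the spec is as easy as running the module.
    doctest.testmod()
```

If the implementation and the examples disagree, the "spec" fails loudly instead of quietly rotting in a document nobody reads.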
Re:tl;dr (Score:4, Interesting)
No, software right now is turning from a craft into an industry, which means that artisans are being replaced by minimum-wage drones and automated code generators. Of course quality is going to suffer as a result.
Re:tl;dr (Score:5, Interesting)
Having developed on old platforms, then new ones, and then back to old ones, I can say you actually get much better quality out of the new stuff. We look back at the old software with pride at how well it runs, forgetting the decades of errors and problems it had in the past, and the years of effort it took to get it running at that level.
The old stuff seems to run faster, but not really. Old WordPerfect: no realtime spell check, no fonts; bold and italics were the big features, along with multi-column layouts and margins. The display was 80x25, and still the app crashed and you lost all your work. And it took minutes to load off your floppy.
Can new developers write tighter code? Sure, they are not stupid. But how much can you lose by doing it? Getting to market too late? Loss of multi-platform support? Code that's hard to maintain? Vulnerable to hacking?
I just heard a report that high-frequency trading systems are easy to hack into because they have been developed purely for speed, leaving room for attacks. As anyone who has done coding knows, it takes 10% of the effort to get a program to do what it needs to do, and the rest to make sure humans don't cause it to do something else.