What Today's Coders Don't Know and Why It Matters
jfruhlinger writes "Today's programmers have much more advanced languages and more forgiving hardware to play with — but it seems many have forgotten some of the lessons their predecessors picked up in a more resource-constrained era. Newer programmers are less adept at identifying hardware constraints and errors, developing thorough specifications before coding, and low-level skills like programming in assembly language. You never know when a seemingly obsolete skill will come in handy. For instance, Web developers who cut their teeth in the days of 14.4 Kbps modems have a leg up in writing apps for laggy wireless networks."
Newsflash (Score:2, Insightful)
Experienced people have experience in things they have experienced
Re: (Score:3)
newsflash - many schools no longer teach low level programming.
If you want to get good, truly good, you'll have to learn some of this on your own. And in the good old days all you had to do was count up clock cycles and you were done. Modern processors with multi-stage pipelines and out-of-order execution are much harder to hand-optimise. When a pipeline stall is very expensive, you don't worry so much about clock counts.
Fashion (Score:4, Informative)
Re: (Score:3)
So your particular skillset has fallen out of vogue for a while; it happens. If this stuff is useful, it'll come back. For instance, a lot of the hardware-related skills mentioned are still around; they're just considered a specialisation these days. In most situations it's safe to assume that the hardware either performs within spec or that a lower layer (the OS, etc.) is dealing with any irregularities.
I'm actually a youngin' who took interest in the lower layers, and developed my skill set around that. I'm kind of waiting for the oldies to retire/kick the bucket and open up more positions for people like me. I anticipate that I'll be one of the hawt shite programmers once the population of systems programmers starts dwindling...
Re: (Score:3)
Pretty much. There are very few comp sci programs these days that teach anything relevant to systems programming, and kernel jobs still pay nicely as a result. I stay away from the kernel stuff myself, as I find it quite tedious, but there are plenty of user-mode systems jobs around in Silly Valley - not so much elsewhere. (Stay away from embedded though; that field pays crap for some reason I've never grasped.)
We're re-inventing the mainframe all over again, and that promises to be a lot of work.
those young whippersnappers (Score:2)
They don't know that old trick from liblawn
Lawn::GetOffLawn(kid);
Re: (Score:2)
This obviously calls for a LawnFactoryFactorySingletonFactory pattern
Re: (Score:3, Funny)
No, it clearly demands a combination of the observer pattern with the command pattern: You observe your lawn, and if you see kids, you command them to get off it.
Re:those young whippersnappers (Score:5, Funny)
That lib requires cooperative event handling in the kid class. I much prefer the longer, but deterministic form:
if ( $myLawn->getContents()->filter({type: kid})->count() > 0 ) {
$myShotgun = new Shotgun();
$myShotgun->loadAmmo();
$myLawn->getOwner()->wieldObject($myShotgun);
for( $i = 5; $i > 0; $i-- ) { sleep(1); }
while ( $myLawn->getContents()->filter({type: kid})->count() > 0 ) {
$myShotgun->fire();
}
}
Re: (Score:2)
What?
it should be get_off(my_lawn), none of this modern 'object orientation' nonsense
or maybe
lea ax,[my_lawn]
call get_off
The problem is (Score:5, Insightful)
they aren't trained to engineer software, and the industry hasn't incorporated good engineering practices.
Re: (Score:3)
Coming from a legacy modernization project, I can tell you that just because people wrote programs 10, 20, 30 years ago doesn't mean the code was good, or that the developers knew what they were doing. One would hope that decades of development experience would teach a well-rounded set of skills, and often it does.
To sum up, there's no reason a brat five years out of school learning technology X today is any less capable than a brat five years out of school learning technology Y was in the '80s/'90s.
Re:The problem is (Score:5, Interesting)
There aren't good engineering practices in software. This is why I abjectly refuse to call myself an engineer (that and I'm *not* an engineer). Can you tell me with a known degree of certainty the probability that a software component will fail? What procedures would you put into place to give you that number (keeping in mind the halting problem)? What procedures would you put into place to mitigate those failures? Because I'm drawing a big gigantic blank here.
Look, I'm all for using "Software Engineering" practices. Personally, in my career I have championed TDD, peer review, acceptance tests written in advance by someone other than the programmers, etc, etc, etc. But this isn't engineering. The best I can tell anyone is, "Hey, it doesn't break on *my* machine. Yeah, I think the probability of it breaking on your machine is 'low'. No, I wouldn't like to specify a number, thank you very much." Why do you think software never comes with a warranty?
I often wonder what we could do to actually make this an engineering discipline. For one thing, I think we really need to invest in developing stochastic testing techniques. We need to be able to characterise all the inputs a program can take and to test them automatically in a variety of different ways. But this is the stuff of research. There are some things we can do now, but it's all fairly nascent technology. Maybe in 20 years... :-P
Re:The problem is (Score:4, Insightful)
The problem is many people (bosses, project managers, developers, etc) don't understand the big difference between "Software Engineering" and say Civil Engineering.
In Civil Engineering, creating all the necessary blueprints and plastic models is typically 1/10th the cost of building the "Real Thing", and makes up a smaller portion of the total cost.
For software, creating the source code (the drafts, the blueprints) costs more than 100 times the cost of "make all" (building the "Real Thing"), and forms a large portion of the total cost.
So if "stuff happens" and you need to spend 50% more to fix the design, with Civil Engineering the bosses are more likely to agree (unhappily) to "fix the design", because nobody can afford to build the whole building a few times till you get the design right...
Whereas with "Software Engineering", the bosses are more likely to go "Ship and sell it, we'll fix it in the next release!".
And if you're a boss, you'd likely do the same thing
So even if you could work out the probabilities of some software component failing, nobody would care. Because all you need to work out is: which bugs need to be fixed first, out of the hundreds or thousands of bugs.
That changes if you are willing to spend 10x the cost (in $$$ and time) of creating each internal "release". By the time the 10th (final) release is written and tested (specs remaining the same - no features added) the stuff should work rather reliably. But you'd be 5 years behind everyone else...
Re: (Score:3)
Mostly that is not in there because the software in question is not 'critical' and nobody wants to pay for the testing.
I do work in the oil/gas industry and for ESD (emergency shutdown) systems we now have to have SIL Certified controllers and software. This means libraries have to be SIL tested and certified along with all hardware used.
http://en.wikipedia.org/wiki/Safety_Integrity_Level [wikipedia.org]
Is this the kind of failure rate and safety factoring you are referring to? :)
Re: (Score:3)
It's there. I had to study this stuff as an undergrad and then was in a room full of people doing it in more detail for my PhD. If you really need software that is provably correct, then you can get it. Of course, that doesn't mean any more than it does in engineering: remember Tacoma Narrows? Specifications missing an important detail can still mean a disaster, even if the engineering is perfect.
The problem is cost. Several of the people in the verification group during my PhD were working with ind
This... (Score:2)
would explain why Runescape runs perfectly well over dialup or EDGE(4KB/sec) speeds, while most other games do not: The original creators probably had some experience that way.
Re: (Score:2)
Game development is where you see the worst code ever.
Re:This... (Score:5, Interesting)
the worst code, yes
But that runs fast on old/not so fast hardware
Example http://en.wikipedia.org/wiki/Fast_inverse_square_root [wikipedia.org]
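For anyone who hasn't seen it, the trick behind that link looks roughly like this (a cleaned-up version of the widely circulated Quake III snippet; the magic constant and the single Newton step are as documented on that Wikipedia page, while the function name and the memcpy idiom are mine):

```c
#include <stdint.h>
#include <string.h>

/* Approximate 1/sqrt(x); one Newton-Raphson step gives
   roughly 0.2% worst-case error for normal positive inputs. */
float fast_inv_sqrt(float x)
{
    float half = 0.5f * x;
    float y = x;
    uint32_t i;

    memcpy(&i, &y, sizeof i);      /* reinterpret the float's bits as an
                                      integer; avoids the original's
                                      strict-aliasing pointer cast */
    i = 0x5f3759df - (i >> 1);     /* the "magic" initial guess */
    memcpy(&y, &i, sizeof y);
    y = y * (1.5f - half * y * y); /* one Newton-Raphson iteration */
    return y;
}
```

On the hardware of the day, this beat a divide plus a library sqrt by a wide margin; on modern CPUs a dedicated rsqrt instruction usually wins, which is rather the thread's point about constraints shifting.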
Re: (Score:2)
It doesn't matter. (Score:5, Insightful)
"We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil" - Donald Knuth
Most developers will never need their apps to run in constrained environments, and most employers don't want to spend money to eke out performance when average performance is perfectly fine.
Too many programmers get caught up in trying to make something the fastest, or most memory efficient, or makes the best use of bandwidth. When most of the time, it just doesn't matter. Such things are expensive, and in the long run it's cheaper to be fast and sloppy than slow and lean.
Re:It doesn't matter. (Score:5, Informative)
I love D. Knuth and have read his sorting and searching book many times over, always finding something new.
SPEED does matter, and so does SIZE and BANDWIDTH. It is important to design things right the first time, rather than the endless loops of after-the-fact optimization that most code written today goes through. An understanding of record locks, index optimization and other multiplexing methods is still needed today. I see too much of sucking 1+ million pieces of data into memory to find 1 item, "Because it is fast".
Yes, this sounds like "get off my grass", but "fast and sloppy" wastes everyone's resources, not just a single computer's.
Re: (Score:2)
Those are important... *IF* they are important. Duh. Most of the time, they are not. Thus, complaining that developers who don't need to write lean apps aren't writing lean apps is kind of pointless.
Re: (Score:2, Insightful)
The thing is, people use that quote to be *LAZY*. Yeah, most of the time it doesn't matter. But guess what: when it does, you may find yourself rebuilding an entire framework because you made just plain stupid mistakes, and LOTS of them.
Between someone who understands what optimizations are available and your code jockey who just whips up some code, the difference is miles wide.
I used to think the same way. "It doesnt matter much" but then I realized it does matter. It matters a lot. Think if all of your prog
Re: (Score:3)
Re: (Score:3, Insightful)
Nobody is suggesting you optimise to shave off the odd byte or machine cycle.
However, you should optimise for the task in the sense of picking appropriate algorithms and structures.
What I've seen in industry is that a lot of developers pay attention neither to elegance nor efficiency. And this really bites you in the pants when the code meets real data.
Anyway, once you decide to ignore resource constraints in your engineering, what on earth is left that's challenging? Honestly, you might as well flip burgers.
Re: (Score:2)
Re:It doesn't matter. (Score:5, Insightful)
And no, that's not an excuse to be sloppy. "Back in the ancient days" it was important to write good code for the limited resources. Now you still need to write good code, but the constraints are relaxed. But we still need code that is maintainable, dependable, extendable, flexible, understandable, etc.
Re: (Score:2, Insightful)
And all those users sitting waiting 5 minutes for the page to load, for the data to completely draw, or whatever?
You do read thedailywtf.com, don't you? Plenty of stories where a 'quick app' becomes mission critical, and can't handle slightly (?) larger datasets.
Well, those aren't _our_ users.. their time is _free_. And they can _leave_ if they don't like it. ....
Re: (Score:2)
Re: (Score:2)
You seem to have this mental model that more efficient code must take longer to develop. But not making bad decisions may take up exactly zero time if you are in the habit of making good decisions.
A simple example is the ordering of loops. Exchanging the order of loops after the fact may take extra time, but writing the loops in the right order from the start doesn't take more time than writing them in the wrong order.
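To make that concrete: C stores 2-D arrays row-major, so the order of the two loops decides whether memory is walked sequentially or in big strides. A sketch (the array size and function names are mine):

```c
#include <stddef.h>

#define ROWS 1024
#define COLS 1024

double grid[ROWS][COLS];

/* cache-friendly: the inner loop walks consecutive addresses */
double sum_row_major(void)
{
    double total = 0.0;
    for (size_t r = 0; r < ROWS; r++)
        for (size_t c = 0; c < COLS; c++)
            total += grid[r][c];
    return total;
}

/* same result, but each inner step jumps COLS * sizeof(double)
   bytes, defeating the cache */
double sum_col_major(void)
{
    double total = 0.0;
    for (size_t c = 0; c < COLS; c++)
        for (size_t r = 0; r < ROWS; r++)
            total += grid[r][c];
    return total;
}
```

Both functions type the same number of characters; only the habit of writing the first one by default costs nothing.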
Re: (Score:2)
You seem to have this mental model that more efficient code must take longer to develop.
I did not say, claim, or imply that. I was talking about factoring in developer time as a resource.
Re: (Score:3)
So to save one developer day we sacrifice 5 user minutes per day X 250 days per year X 100000 users? I know I have been the victim of this.
Re: (Score:3)
While i'm not disagreeing with you, I think it's a lot more complex than that. If you're talking commands in an app, then certainly there are times where effectively instant actions are beneficial, but when moving from app to app most people take a few moments to readjust themselves to a different UI and figure out how they're going to go about doing their task.
Let me give you an example. A company I worked at was complaining that their terminal based app (running on a Unix system) was too slow, and that
Also in terms of CPU or RAM (Score:4, Insightful)
You need to wait until the shit is done, then profile it. You suck at knowing what needs to be optimized, no I don't care how good you think you are. Ask the experts, like Knuth or Abrash.
So if the speed matters, or the RAM usage, or whatever you write the program, then you profile it, in real usage, and see what happens. You then find where spending your time is worth it.
For example, suppose you find a single function uses 95% of all the execution time. Until you've optimized that, it is stupid to spend time optimizing anything else, because even a small gain in that function will outweigh a massive gain elsewhere. You need to find those problem spots, those areas of high usage, and optimize them first, and you can't do that until the program is written and profiled.
That "couple of areas that use all the resources" situation is pretty common, too. It is unusual for the time to be spread evenly across all the code. So you need a profiler to identify those areas, and then you need to focus your effort. You can then break out the ASM hackery for those few lines that have to be super fast, if needed, and you'll achieve most if not all of the result you would from doing the whole thing low level.
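Short of a full profiler, even a crude timer makes the point. A minimal C sketch (both workload functions are invented stand-ins for parts of a real program):

```c
#include <time.h>

/* invented stand-in for the hotspot that eats ~95% of the time */
long slow_part(void)
{
    long s = 0;
    for (long i = 0; i < 50000000L; i++)
        s += i % 7;
    return s;
}

/* invented stand-in for everything else */
long fast_part(void)
{
    long s = 0;
    for (long i = 0; i < 500000L; i++)
        s += i % 7;
    return s;
}

/* run fn once, store its result, return elapsed CPU seconds */
double time_it(long (*fn)(void), long *out)
{
    clock_t t0 = clock();
    *out = fn();
    return (double)(clock() - t0) / CLOCKS_PER_SEC;
}
```

Timing both with time_it makes the imbalance obvious; only then is it worth reaching for low-level tricks, and only on slow_part.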
Re: (Score:3)
I've seen many times over where a convention is used, and a res
Re: (Score:3)
Hell, even working for low wage, a person is expensive. Thus the most effort should be put in having them do the least effort.
Yeah, but here's where it gets weird: Software is highly valuable because it allows a single person to do the same task a very large number of times.
Because of this, it's not like stacking logs or gluing tile. If you have sufficient leverage, you can spend a stupid amount of money getting everything just right and profit immensely from it. (See: Apple)
On the other hand, most softwar
Re: (Score:3)
Sadly the person signing the developer's time-sheet isn't usually the one incurring the cost down the line...
That kind of disconnect makes for shitty code and shitty efficiency overall :(
Re:It doesn't matter. (Score:4, Interesting)
Is it truly cheaper to be sloppy? Hardware keeps getting cheaper and faster, sure, but not at a pace that matches how fast code is getting slower.
Just look at your average web server. Today's machines are over a hundred times faster than they were 10 years ago, and we're not doing anything significantly different. Serving up text and graphics, processing forms, same old b.s. So then, why aren't we serving 100 times more pages per second? Apache keeps getting fatter, PHP seems to do a "sleep(100)" after each line, and don't even get me started on Ruby.
There was a time, not so long ago, when I would spend an hour tweaking an oft-used assembler loop, and the end result was a 3x speedup or more. I'm not saying we should rewrite everything in assembler, but I think we've become so far removed from the actual machine, relying on the compiler to "do the right thing", that people don't even have the slightest clue how to distinguish fast code from slow. How often do we use benchmarks to test different solutions to the same problem? Almost never! People bust out the profiler only when things go wrong, and even then they might say "just add CPU/RAM/SSD" and call it a day.
Or, if we must motivate the hippies, call it "green optimisation". Yes, faster code finishes quicker, using less power to complete the same job. If we're dealing with web servers, faster code would require fewer cluster nodes, or maybe free up some CPU time for another VM on the host, and those 60A circuits aren't cheap either. If spending an extra day optimizing my code could save me $2000 / mo off my colo bill, I'd be a fool not to invest that time.
Re:It doesn't matter. (Score:5, Insightful)
Today's machines are over a hundred times faster than they were 10 years ago
The raw CPU power times the number of cores is 100 times greater. How much faster is the I/O? Serving up web pages is mostly about I/O. I/O from the memory, I/O from the database, I/O to the end user. The CPU is usually a small part of it.
You actually sound like a perfect example of what the article is talking about. People who don't understand where the bottlenecks lie. Hell, it even mentioned the misunderstanding of the I/O bottleneck that exists today.
Re:It doesn't matter. (Score:4, Interesting)
As an example, consider a database with very poorly designed primary and secondary keys. This choice will either:
a) not matter in the least because the tables are so small and queries so infrequent that the other latencies(e.g network, hard disk etc) will dwarf the poor key choice or
b) Will quickly make the entire database unusable as the time it takes for the database software to search through every record for things matching the given query takes forever and just kills the disk.
I've seen plenty of (b), largely caused by total ignorance of how databases, and the hard disks they live on, work. The indices are there not only for data modeling purposes, but also to minimize the amount of uber expensive disk I/O necessary to perform most queries. And while you may be in situation (a) today, if your product is worth a damn it may very well get bigger and bigger until you end up in (b), and by the time you are in (b) it may end up being very, very expensive to re-write the application to fix the performance issues(if they can be fixed without a total re-write)
Anyone who codes should have at least a fundamental grasp of computer architecture, and realize what computer hardware is good, and bad at. That isn't "premature optimization" as you seem to think it is, it is a fundamental part of software design. "Premature optimization" in this context means things like changing your code to avoid an extra integer comparison or two, that kind of thing. It is not,"lets just pretend the computer is a magic device and ignore performance concerns because it's easier".
Optimization vs Standard Performance (Score:2)
There's a big difference between "optimization", and "designed with standard performance considerations in mind" though. The latter should always be done. The former, only when there's a genuine need to run As Fast As Possible (tm).
Too many new coders today have no grasp of EITHER. They don't understand even the basics of algorithmic efficiency ("big O" notation, the difference between a linear and constant time lookup, etc), the cost of memory allocation (doing things like unnecessarily creating a new i
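The linear-versus-constant distinction above is easy to demonstrate even in C; a sketch (names mine) contrasting an O(n) scan with an O(log n) binary search over sorted data:

```c
#include <stdlib.h>

/* O(n): may have to inspect every element */
int linear_contains(const int *a, size_t n, int key)
{
    for (size_t i = 0; i < n; i++)
        if (a[i] == key)
            return 1;
    return 0;
}

/* comparator contract required by bsearch/qsort */
int cmp_int(const void *p, const void *q)
{
    int x = *(const int *)p, y = *(const int *)q;
    return (x > y) - (x < y);
}

/* O(log n): halves the search space each step,
   but requires the array to already be sorted */
int sorted_contains(const int *a, size_t n, int key)
{
    return bsearch(&key, a, n, sizeof *a, cmp_int) != NULL;
}
```

On five elements the difference is invisible; on five million records hit per request, it is the difference between situation (a) and situation (b) in the database comment above.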
Re: (Score:3)
When most of the time, it just doesn't matter. Such things are expensive, and in the long run it's cheaper to be fast and sloppy than slow and lean.
You have GOT to be in management. The vast majority of the cost of a software project goes into support. If it costs twice as much to do it right up front, you will save 10 times that amount in support costs down the road. You can then spend that money evolving and improving your software rather than being forced to spend it trying to keep it going day to day. Attitudes like yours are the number one thing wrong with the software industry (well, besides software patents, but that's another argument).
Hi, my name is Bob! (Score:2)
Maybe having a conservative mindset helps.. (Score:2)
Re: (Score:2)
Fewer people master it; therefore it is more rewarding, both intellectually and financially.
Would you rather be a replaceable coding monkey, doing what anyone else could do, or be an expert software architect that your company relies on?
Good link (Score:2)
Stone Carving Skills (Score:2)
Computational systems used these frames so that sometimes you had to go through several of them to get to your goal (These were the early AND gates), and other times you allowed users to pick which one to pass through (OR gates). When
Re: (Score:3)
Oblig.: A Bunch of Rocks [xkcd.com]. :D
Lessons from the Old and Wise (Score:4, Funny)
Re: (Score:3)
Or go the other way - toward Church instead of Turing [wikipedia.org] - check out Erlang [erlang.org] or Haskell [haskell.org] or Ocaml [inria.fr]. In fact I recommend learning at least one of the above (I personally like Erlang, but that's just me) just to get a different perspective on computation than any of the 'classic' imperative, memory-location-oriented languages.
Excuse me while I laugh my head off... (Score:4, Insightful)
I'm new to programming myself, but I've always felt the need to learn more about the computer than just the high-level language. That's why I want to take up PERL.
(emphasis mine)
This is possibly the most hysterically, unintentionally funny thing I've read in a long time.
IDE debugging really isn't that bad (Score:2)
Re: (Score:2)
One of those interviewed in the article complained about the fact that modern day programmers try to solve the problem through an IDE or debugger, instead of putting in statements which change the output of the program. They wanted printf debugging. While I do value a good tracing subsystem, I for one, am grateful for modern debuggers which let me view the state of the system without having to modify/redeploy the code.
Oooh, I remember printing out debug statements. When I was at Uni.
Tried doing it once while working on a massive program when I got a job after Uni, it was near useless due to the scale of the system. Figured out how to use a decent debugger properly (we might have been taught how to use a basic one at Uni) and haven't looked back.
Recently found out with the debugger I am using that I can change variable values mid execution - can't do that with print statements. You're right - modern debuggers are great.
Re: (Score:2)
Not only can you change variables during execution, you can manually move the execution pointer around, you can recover from unhandled exceptions, and you can edit the source code during a breakpoint and then continue without having to restart your application.
You can also still direct things to the Output window in the IDE if you fancy the printf style statements.
Re: (Score:3)
I didn't read TFA, but here are some thoughts to consider. printf debugging is only useful if you have an uncomplicated subsystem. You have to be able to quickly find the area where the problem may lie. You have to be able to easily identify and set up the scenario that is causing the problem. You have to easily be able to modify the code, redeploy and run it. I find that a lot of people will say, "My system is too complex for that". What they don't quite grasp is that their system is broken.
If you ne
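For what it's worth, the printf camp usually wraps its statements in a macro so they can be compiled out of release builds; a common pattern (this exact macro is a sketch, not something from TFA):

```c
#include <stdio.h>

/* compile with -DDEBUG to enable tracing; zero cost otherwise */
#ifdef DEBUG
#define DBG(fmt, ...) \
    fprintf(stderr, "%s:%d: " fmt "\n", __FILE__, __LINE__, ##__VA_ARGS__)
#else
#define DBG(fmt, ...) ((void)0)
#endif

/* toy function showing the macro in use */
int divide(int a, int b)
{
    DBG("divide(%d, %d)", a, b);
    if (b == 0) {
        DBG("division by zero, returning 0");
        return 0;
    }
    return a / b;
}
```

(The `##__VA_ARGS__` form is a GNU extension, though most modern compilers accept it.) This doesn't replace a debugger, but it does give the "good tracing subsystem" the parent values without leaving printf calls in shipped code.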
In my day (Score:5, Funny)
we had to code uphill in 10 feet of snow on an abacus using roman numerals.
Re: (Score:3)
You had Roman numerals? You lucky, jammy, bastard! I'd have killed for Roman numerals. We had to use tally marks. And if we didn't put the slash to mark the fifth, the compiler got confused and core dumped on us!
Re: (Score:2)
Roman numerals? Then how did you terminate your strings?!
'Resource constrained era' (Score:2)
Comments/phrases like these completely fail to grasp that things like this are RELATIVE. What is 'resource constrained' today isn't what was seen as 'resource constrained' 20 years ago. Likewise, many young programmers _today_ (including myself) DID in fact learn to code in what would be seen as resource constrained environments compared to today's machines. I cut my teeth on an 8MB Win95 machine and later a 32MB machine. Sure, that amount of RAM to play with is an insane luxury if we're thinking back to ea
One question... (Score:2, Funny)
How the fuck do you forget something you were never taught in the first place?
This article should really read: "Crotchety old programmers fail to pass on tricks of the trade, then complain anyways"
Duplicate story (Score:2)
Eh, didn't someone remind us of this a couple of months ago? Seems like someone really has an axe to grind with modern coders. Get a life, you suspicious person!
Hardware issues less common (Score:3)
One of the first things I learnt when troubleshooting problems is that it is probably a problem with your code, and not the hardware or external software libraries (apart from rare cases).
Embedded software development (Score:3)
We still use straight C, sometimes even intermixed with assembly. We know all about resource constraints, given that our microcontrollers sometimes only support a few kB for instructions and data RAM. As far as debugging goes, I'll see your debug-with-printf and I'll raise you a debug-with-oscilloscope-and-GPIO-toggling.
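The oscilloscope trick deserves spelling out: bracket the code under test with pin toggles and read the pulse width on the scope as its runtime. In this sketch a plain volatile variable stands in for the port register so it runs anywhere; on a real target DEBUG_PORT would be the device's memory-mapped GPIO output register (any address you'd use there is device-specific, not shown here):

```c
#include <stdint.h>

/* stand-in for a memory-mapped GPIO output register */
volatile uint8_t DEBUG_PORT;

#define DEBUG_PIN   (1u << 3)   /* pin choice is arbitrary */

#define SCOPE_HIGH() (DEBUG_PORT |= DEBUG_PIN)
#define SCOPE_LOW()  (DEBUG_PORT &= (uint8_t)~DEBUG_PIN)

/* the routine being timed on the oscilloscope */
int checksum(const uint8_t *buf, int len)
{
    int sum = 0;
    SCOPE_HIGH();               /* rising edge: work starts */
    for (int i = 0; i < len; i++)
        sum += buf[i];
    SCOPE_LOW();                /* falling edge: pulse width = runtime */
    return sum;
}
```

Two register writes cost a handful of cycles, which is why this works even on parts too small for a JTAG probe, let alone printf.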
Re: (Score:2)
Oh, the things I didn't know when I started doing embedded software development. I learned a rather frustrating lesson: Microchip's PIC16 series only has a 7-level call stack. Ironically, I was only overflowing it when I called a routine to output debugging information.
To further muddy the waters, the device transmits data to a server where everything is written in C#. My work can be frustrating at times. Much less so since we switched development to the PIC32 series (128k ram? Heaven!)
Re: (Score:2)
Even on slightly fancier processors you can have limited JTAG debugging support. Severe limitations on the number of instruction breakpoints and data breakpoints can limit the usefulness of the debugger for everyday work. Even in not-particularly-time-critical software, single-stepping through the code can be impossible - either the bug is time dependent (e.g. errors in hardware drivers or race conditions) or normal execution relies on timing (e.g. communications). The debugger is useful only for a very narr
go for the faster hardware (Score:2)
it will last longer and be less hassle
Next article should be about pointless titles (Score:2)
"I see poor understanding of the performance ranges of various components," says Bernard Hayes, PMP (PMI Project Mgmt Professional), CSM (certified Scrum Master), and CSPO (certified Scrum Product Owner).
There's such a thing as a certified Scrum Product Owner? Am I now being encouraged to go get management trained on how to be certified at owning Scrum Products?
I'm not sure if I can take what someone with a set of certifications this ridiculous says seriously.
Where are they finding these "new coders?" (Score:2, Interesting)
Every once in a while I read an article on Slashdot about how the current generation of programmers churn out only the shittiest of code; how they have no idea how a computer works; how they could never program without a fully-featured IDE with Intellisense that renders instantly. As an undergrad in a CS/ECE discipline, this has always surprised me-- I can only speak for the curriculum at my school, but I can assure you, these mythical 'lost programming skills' are alive and well. Most of the supposed mis
Yesterday's coders aren't necessarily relevant (Score:5, Interesting)
Some old-school developers prematurely optimize for things we no longer need to optimize for (and shouldn't). From an older post of mine [slashdot.org]:
He was optimizing for resources that were no longer constrained, and consequently pessimizing for the resources we actually cared about. RAM? Dirt cheap, at least for the dataset sizes involved in that project. Much more expensive was all the extra disk and CPU load he was creating on the company-wide database server (which is sufficiently powerful to serve the entire company when it's not being deliberately assaulted).
I'm not "anything goes" by any means, and I'm the guy responsible for making sure that lots of processes can peacefully coexist on an efficiently small number of servers. But for all intents and purposes, most of our apps have unlimited resources available to them. If they want to use 100% of a CPU core for 5 minutes or 2GB of RAM for half an hour a day, so be it. I'd much rather run simple, testable, maintainable code that happens to use a lot of server power than lovingly hand-mangled code that no one but the original programmer can understand and which interacts with the rest of the network in entertainingly unpredictable ways.
No what we need is programmers that can execute (Score:3)
Over the last few years the most disturbing trend I have seen is programmers that do not have the ability to ship a product. You can talk all the shit you want about architecture, cute technology, methodology but if you don't ship product you don't count.
Economy/elegance is a green issue too (Score:3)
That said, there are a great many good reasons for doing things economically, concisely and elegantly: Occam's razor, optimal use of resources [a financial matter too], no constant upgrades and server/desktop refreshes, no endless addition of extra storage because stuff is duplicated everywhere. By the way, if someone starts a project to de-duplicate the web down to a 'safe' level [only ten copies of each thing, not 2K], I'll sign up.
I also resent senders, mainly government, who send me half a dozen self-congratulatory jpegs with each email. You're wasting bandwidth and blocking up the pipes, you folks.
So, if we pay a little attention, we won't need the sprawling power-hungry data centres that all the big players seem to be building. We won't need the constant hardware refreshes [admittedly a lot of those are Windows 'upgrades'].
Anyways, don't listen to me, I'm old and grumpy, but take a look at this: http://en.wikipedia.org/wiki/Green_computing [wikipedia.org] especially Product Longevity.
Re: (Score:2)
Get off my lawn because it is resource constrained you bloated juveniles!
Re:tl;dr (Score:5, Interesting)
Having developed on old platforms and new ones and back to old ones, I can say you actually get much better quality out of the new stuff. We look back at the old software with pride at how well it runs, forgetting the decades of errors and problems it had in the past, and the years of effort it took to get it running at that level.
The old stuff seems to run faster, but not really. Old WordPerfect: no realtime spell check, no fonts; bold and italics were the big features, along with multi-column and margins. The display was 80x25, and still the app crashed and you lost all your work. And it took minutes to load off your floppy.
Can new developers write tighter code? Sure, they are not stupid. But how much can you lose by doing it? Getting to market too late? Loss of multi-platform support? Hard to maintain? Vulnerable to hacking?
I just heard a report that high-frequency trading systems are easy to hack into because they have been developed for all-out speed... leaving room for hacking. As anyone who has done coding knows, it takes 10% of the effort to get the program to do what it needs to do, and the rest to make sure humans don't cause it to do something else.
Re: (Score:3)
I don't miss the days of my PDP-8 programming. OTOH, I do a lot of embedded stuff as a hobby these days, and it's incredible to think that a lot of the newer programmers think you need huge resources to do even minor tasks. I sometimes choke at the megabytes of dependencies sucked into a small piece of code... Anyway, it's a bit of "Get off my lawn" and a bit of "when the car starts skidding, you really do need to understand oversteer, understeer, and torque steer to keep from getting wrapped around that
Re: (Score:3)
I prefer "less code"*.
I find that less code
- is usually easier to manage
- usually has fewer dependencies
- is usually easier to debug
- is usually more efficient
I have a feeling some people just don't even check anymore what dependencies they suck in.
Just recently, I had to look at Microsoft Exchange; a database had a problem, and the command-line tool to fix it didn't work anymore either.
I took depends.exe and looked at it. The command-line tool to fix the Exchange database files indirectly depends on Internet E
Re: (Score:2)
While I'm making lists, I should have added:
- easier-to-read code
- easier-to-understand code
- code you can get through faster when reading
Re: (Score:3)
I'm glad you added "not to the extreme". Some people like less code, so they have everything in one big function with variables such as "q" and "mm".
I prefer more code: many small functions with readable names. Then again, I use auto-completion so I don't get too tired.
Re:tl;dr (Score:4, Insightful)
Less code does NOT mean smaller file sizes.
I write (in general) smaller amounts of code than any of the other 4 developers I have to work with, but I also document my code as I write it: the doxygen info for a function or method is fully written out before I start writing the function. I document, then write. My files are always larger because my documentation and commenting are far more complete.
Using shorter variable names does not mean you're writing less code, you're just using less text to do the same thing. That also generally means you're writing unreadable code. Disk space and IO for files used in a compile is irrelevant for any reasonably sized project. If your project is so big that the size of your variable names makes a noticeable difference in your build times ... then you need to reorganize your project.
Good code is self documenting, but being that most of us can't write code that beautiful, proper documentation makes the code suck less for the next guy.
Re: (Score:3)
The ability to consistently code cleanly with small functions is great. You basically build applications on well-proven code; over time you have most likely blended functions together and optimized them. What you are doing is what we did back in the days of index cards and floppies: you had a stack of code that you used over and over again, made smaller and better. It's the quality of your functional libraries that makes you a success in the industry and keeps you employed.
Re: (Score:3)
You're missing the fact that the development machine used to build this code probably had 64K or less of RAM - "pointer_to_the_structure_containing_the_information_about_xy" could result in the code not compiling due to memory constraints on the build machine.
I agree with what you're saying - a variable named "q" is probably a good sign of somebody who's not a good coder, and it could have been renamed to something like "ptr_xyInfo" without blowing up the build machine.
myke
Re: (Score:3)
When the critical mass of programmers just doesn't worry about efficiency, you get the sort of behavior that makes other engineers cringe. Software right now is sort of like the auto industry in the '50s: we're all about adding cool-looking tail fins while sticking a cast-iron engine block up front.
I have seen people recite the mantra of "don't optimize prematurely" when I can tell that they think it means to never optimize.
Re:tl;dr (Score:4, Interesting)
No, software right now is turning from a craft into an industry, which means that artisans are being replaced by minimum-wage drones and automated code generators. Of course quality is going to suffer as a result.
Re: (Score:3)
I think "don't optimize prematurely" implies that the code should have reasonable efficiency in the first place, and "optimization" work that sacrifices clarity for further efficiency should not be done prematurely. Some people write code that is very ugly and slow, and such code needs more optimization that tends to make it even uglier.
When optimization is premature (Score:3)
That's often what it means, and not by accident. "Don't optimize prematurely" means don't optimize until there is evidence that you need to optimize, and then only optimize what there is evidence that you need to optimize. In practice, that means (for many components) never optimizing many aspects that could, in theory, be optimized, because optimization never turns out to be the c
Re: (Score:3)
Yeah, that was my first thought. I'm pretty long in the tooth myself, but for the most part, if something has been forgotten, it's because it is no longer of use.
Sure things come back (the mobile app market is a good example), and this becomes a great opportunity for the older generation to pass on information to the younger. But I'm not going to miss the days where you had to figure out how to handle your data set when you couldn't use more than 64k of consecutive memory.
Thing is, a lot of the stuff that was required knowledge then and isn't now is unlikely to come up very often any more. So knowing it cold isn't as important when it can just be looked up for a one-off use. Sure, a basic overview of some of the potential issues may be useful, but the details of something you might use once and never again - something that can be looked up - aren't.
I think my phone has a bit over 200MB of memory, so any code optimisations for a computing device with less than th
Re: (Score:2)
Depends what you mean by "double digit".
I cut my teeth on a computer with 5K of RAM. If you want to go true single digit, I think there were some with 16 bytes.
Re: (Score:2)
Re: (Score:2)
(computers did have double digit RAM at some point, right? My history of computer hardware isn't that great)
*sob*
Re:tl;dr (Score:5, Insightful)
Your phone has 200MB of memory, so when a stupid app uses all 200MB ... then what? You don't run your other apps? Being wasteful is stupid regardless of where you do it, in your home, or in your code.
(computers did have double digit RAM at some point, right? My history of computer hardware isn't that great)
http://www.atmel.com/dyn/products/product_card.asp?part_id=4605&category_id=163&family_id=607&subfamily_id=791 [atmel.com]
If you're too lazy to follow the link:
The high-performance, low-power Atmel 8-bit AVR RISC-based microcontroller combines 512B ISP flash memory, 32B SRAM
That '32B SRAM' is actually shared with the 16 general-purpose registers, so if you take those out, you have 16 bytes of RAM. With those 16 bytes and the rest of the built-in IO and other functions, you can easily control 3 servos from a single input line that takes its feed from a larger motor. That chip is capable of driving stepper motors with interpolation for a CNC machine, again taking position information from somewhere else rather than processing the command tree itself. The chip is also used to handle input from several glass-breakage detectors, performing false-positive checks so it doesn't trigger just because the cat knocked something off that didn't actually break an external window.
In short, the modern world is built on devices with tiny-ass amounts of RAM. Do you wear a digital watch? They're getting rarer nowadays, but that's another example.
It's not history where that stuff mattered; it's right now, today - you just aren't aware of it. Nothing about the concepts used then is bad today; they STILL provide massive benefits if you know them and follow them. You're basically saying 'gasoline is cheap, just burn more to get more done' and ignoring the fact that there are clear physical limits to computing: a given amount of mass, regardless of how it's configured, can only store so much data and perform so many computations. Eventually you'll have devices which simply can't meet the demand being put on them ... because people thought RAM/processing power was cheap and we no longer needed to worry about those old guys.
Locality of reference, cache hits, pipeline stalls ... all these things that you know nothing about, yet they drastically affect how your shitty little app runs. Sigh, I swear at some point in the past people actually took pride in 'doing it right'.
Re:Cheap microcontrollers (Score:3)
Yes, there really are micros much smaller and cheaper than that. $2 in production quantities can get you 128k of Flash and 16k of SRAM on something with a CPU decent enough to run sloppy C code. You're thinking much too large; think $0.30-$0.40 for the cheapies. Google around for the Atmel Tiny4, Tiny12, or the Microchip 10F family for starters...
You are correct about the quantities of scale, but remember that if that electric toothbrush is $40 at retail, the company probably didn't spend more than $8-$1
Re: (Score:2)
In addition, most app programmers test their application on their comparatively fast machine, while not running other stuff in the background.
If developers were forced to test their apps on 4-year-old hardware, general performance would improve a lot.
Re: (Score:3)
But these are still in use. Forget mobile phones, those are massive computers. When you're on a device with a max limit of 1M RAM then you need to worry, or even if one of the processors only has 256 BYTES of RAM, then you seriously need to worry. When the CPU runs at 1MHz or less with one instruction per cycle then you have to worry about what's efficient code and what isn't. This is stuff people have had to worry about this year, not in the 1980s.
This applies on the high end too. Maybe you've got 4GB
Re:tl;dr (Score:5, Insightful)
I'd submit the game development industry is still one place where tight, fast-running code really matters. But the trick is, writing efficient code is extremely slow and expensive. At my company, our game engine is written in C/C++ like most places, but we still write most of our tools in C# or Python (and from what I understand, that's becoming the norm). We well understand that we're trading efficiency of execution for efficiency of development in those cases. Both types of coding certainly have their place.
I get a little annoyed at zealots on both sides who claim that "C++ is dead" or "C# is too slow", blah, blah... If you ask a carpenter what his "best tool" is, he might ask you "for doing what?" Programming is no different. People who cling to a single language or paradigm are just missing the big picture, IMO.
Re: (Score:2)
Yep and posts like this one [slashdot.org] are fucking hilarious with such obviously made up shit like:
I'm old enough to remember when it wasn't like that. You'd run your program and it was ready in a second, you'd exit and it left no trace. Crashes were virtually unheard of. We have people where I work who only do data entry, and they still use wordperfect 4.2 on 386 hardware. I've seen their workflow and how fast it works for them and I can see if they "modernized" it would cripple their productivity.
Hahah lolwut? Crashes were virtually unheard of? Back in DOS, crashes were basically a way of life, with all the buggy software that each had its own hand-rolled version of pseudothreading, video output, etc. Secondly, WordPerfect 4.2? That was seriously buggy shit. Yep, this is just a bunch of old-timers remembering a romanticized version of history that never really existed.
Re: (Score:2)
Yes, that was kind of the point. Since he was holding up WordPerfect as an example, which was *gasp* wait for it *gasp* a DOS program, he was clearly attempting to claim that the DOS era was something free of crashes which is completely absurd. Secondly, the claim that software older than DOS wasn't buggy and crash prone is also bullshit. Some of the most classic bugs that get pointed out when you read up on the history of C come from this supposed "golden age" where all programmers were apparently wizar
Re: (Score:2)
A pointer dereferencing to NULL would be a pointer pointing to a null pointer.
That would be "a pointer dereferencing to a NULL pointer". A "pointer dereferencing to NULL" would be a pointer to a NULL value.
I learned 68000 and VAX assembly code when I learned C. The best way to learn what a C statement is actually doing (with all those & and * and [] and stuff) is to have the compiler dump the assembly and look at it. It's also a good way to see what optimizations the compiler is trying to do.
I have a feeling the main best use for people who know how hardware acts these days ar
Re: (Score:2)
Less is definitely more, at least if you ask me.
Re: (Score:2)
# man sex
Re: (Score:3)
Even modern operating systems are largely written in HLLs
I guess you consider C a high-level language. I think a LOT of people would seriously disagree with you.
I don't care what OS you are using. Those are written in C and Assembler. Look HERE [linux.no] for yourself.
You are not going to find any php, python, java, perl, ruby etc. etc. there, not a line of it.
Mostly your comments are pointless. Most really "good" HTML is still written by hand. It may get spewed from some content-management system, but who do you think wrote all the HTML and CSS for those? Do you thi