How to Keep Your Code From Destroying You 486
An anonymous reader writes "IBM DeveloperWorks has a few quick tips on how to write maintainable code that won't leech your most valuable resource — time. These six tips are guaranteed to save you time and frustration: one minute spent writing comments can save you an hour of anguish. Bad code gets written all the time, but it doesn't have to be that way. It's time to ask yourself whether it's time for you to convert to the clean code religion."
Summary: Beginners need tips too. (Score:5, Informative)
1. Comment smartly.
2. Name your constants ("use #defines").
3. Descriptive variable names, but not too long.
4. Handle errors.
5. Avoid premature optimization.
6. Clarity is better than cleverness.
The author may not be a beginning programmer, but it appears that he might be a beginning writer on programming.
Re:The whole article is -1 redundant. (Score:3, Informative)
Re:Who wrote that article? (Score:5, Informative)
Maybe it was the note at the top of the article that says, "Level: Introductory."
Maybe it was the author's comment at the end that said, "At this point, you may be thinking, 'Wow. That was a big waste of time. All of this stuff is obvious and everyone knows it. Why did anyone write all of this?' I hope this is what you're thinking. Then you're already smart. Good for you."
But somewhere along the course of reading the article, I got the impression that he wasn't writing it for professional developers (at least, smart ones), but for people relatively new to programming.
But then, maybe I'm just stating the obvious, Cap'n...
Re:Mostly agreed (Score:3, Informative)
I guess if you're forced to use pure C, it could be an issue, but I would personally just compile it as C++ (remember, you don't have to use all the C++ features if you don't want to.)
Re:I thought #defines were deprecated in C++ (Score:3, Informative)
Using const int instead of define can save you a whole lot of time when it comes to debugging. When something has gone wrong and you need to step into your code with a debugger, const int will have created a symbol that your debugger can use and inform you of, while define will make the debugger tell you the value of the constant, but not where it comes from or what it's called.
Re:Who wrote that article? (Score:3, Informative)
Re:Mostly agreed (Score:3, Informative)
You're right, though - that's one of the more annoying "features" of assert. Luckily assert() is not a particularly hard thing to rewrite.
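A sketch of one way to do that rewrite: an assert that stays active in release builds and gives the application a chance to save its state before dying (the hook name is invented):

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical hook: give the application a chance to save the user's
   work before going down. */
static void save_work_before_crash(void)
{
    fprintf(stderr, "emergency save...\n");
}

/* Unlike assert(), this is not compiled out by NDEBUG. */
#define MY_ASSERT(cond)                                          \
    do {                                                         \
        if (!(cond)) {                                           \
            fprintf(stderr, "%s:%d: assertion failed: %s\n",     \
                    __FILE__, __LINE__, #cond);                  \
            save_work_before_crash();                            \
            abort();                                             \
        }                                                        \
    } while (0)
```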
Re:Mostly agreed (Score:3, Informative)
I suspect they only do this in release mode, of course.
Re:I thought #defines were deprecated in C++ (Score:2, Informative)
Not necessarily. That's implementation dependent, and as long as you're compiling with optimization, you're probably wrong most of the time.
Neither offers any advantage over the other, since changing either will only affect the program after it's been recompiled.
#defines have a number of well-known drawbacks that consts don't.
1. CPP macros stomp all over your namespace. For instance, Linux defines a macro "current". At one point I tried to make a local variable in a function named "current". I spent the next 15 minutes swearing at GCC for not compiling my code and giving error messages that made no sense. You can reduce this problem by naming conventions (for instance all caps), but consts don't have this problem at all.
2. consts don't suffer from order-of-operations issues, so you can define them without worrying about that. The classic example is #define SIX 1 + 5 and #define NINE 8 + 1, after which SIX * NINE helpfully prints "42" instead of 54. Again, there are ways to work around this (put parens around the expressions), but you don't NEED to work around it if you use consts.
3. consts are typesafe
Of course, you can't always use 'const' in C. For instance, you can't use a const int as an array bound. But if you don't have to worry about C compatibility, I can't think of a reason to use #define instead of const for defining constant expressions.
Re:Felt I should point out (Score:3, Informative)
First off, it's not a side-effect. It's intended.
I know it's nothing more than syntactic sugar, but making a temp variable, worrying about scope and naming clashes, messing up your code, and having to pull stuff out of or put stuff into a for loop are just a few of the million ways it comes in handy; avoiding it anyway is cutting off your nose to spite your face. I have to reiterate: if this is too complicated for you, you shouldn't be writing C. And when we start to talk about C++, forget about it. Template magic, function and operator overloading, virtual functions, etc. are all far more complicated. For that matter, pointers in C are just one thing I can think of that's far more complicated than the difference between pre- and post-increment.
Re:It was fine... (Score:2, Informative)
Personally, I don't think it's such a big problem. Nice, little articles like this one help and, honestly, most bright people get with the program once they see how the post-college world works. Those who don't tend to get converted the first time they have to deal with someone else's messy code.
Re:Mostly agreed (Score:3, Informative)
Obviously it's best if none of your asserts ever trigger, but I'd rather an assert trigger and save my work before crashing than have my document get increasingly corrupted until I can't even load it to save whatever is left.
A Note From the Author (Score:5, Informative)
* To those who think it is all so obvious that I shouldn't have written about it:
No. You are wrong. Just wrong. Good programming practices do not just appear in people's heads as if by magic.
It's an introductory article. It's sold as an introductory article. And I am far more interested in being Right than I am scared of being Obvious.
* To those who have problems with suggesting using #define instead of const int
Meh. Yeah, you're probably right. But the important thing here is the CONCEPT of having your constants being defined in one central, easy to find place. Once a person has that down, he or she can define them however is desired.
* To those who accuse me of being a (gasp) inexperienced programming writer.
Yeah. So what? I never said I wasn't. I'm a game author. I've written over a dozen games. They all made money. That doesn't mean I am mister programming advice god.
But one, if you have a problem with it, yell at IBM, not me. They're the ones who had me write the piece.
Two. This is kind of ad hominem. Who cares how experienced I am or am not? I'm still right.
* I didn't read the article, but I'll say bad things about you because it means I'm awesome.
R0ck 0n d00d. First post!
* I liked the article. It might tell beginning programmers something actually useful.
If anyone says this, thanks in advance.
Re:Who wrote that article? (Score:5, Informative)
The indenting in the selected code was not mine. It got screwed up somewhere between my machine and being posted on their site. I'll drop them a note and ask them to fix it.
No, I am not insane.
Re:Who wrote that article? (Score:4, Informative)
Very little apart from failing to respect scope and not encoding any type information?
Re:Comments are a code smell. (Score:1, Informative)
The only time I feel comments are useful is to explain WHY a particular technique was chosen: the rationale behind the approach taken.
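For example (the scenario is invented):

```c
/* Binary search is deliberately NOT used here: the list is almost
   always under ten elements, and at that size a linear scan is both
   simpler and faster in practice. */
int find_id(const int *ids, int count, int wanted)
{
    for (int i = 0; i < count; i++) {
        if (ids[i] == wanted)
            return i;
    }
    return -1;  /* not found */
}
```

The code already says WHAT it does; the comment records the decision a future maintainer would otherwise be tempted to "fix".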
2 cents
Re:But C doesn't have classes or methods. (Score:2, Informative)
Please note the term "C pre-processor" can be very misleading in this context; it is not the best term to use when referring to Cfront.
Cfront [wikipedia.org] parsed, analyzed & type-checked C++ code and generated the equivalent C code. It fits the formal description of a compiler [wikipedia.org] and should be referred to as such.
I understand that "C pre-processor" can mean "a compiler whose target language is C", but usually it means "the program (or a component of a C or C++ front end) that processes directives beginning with '#' and resolves macro expansions."
And calling Cfront "just" a C pre-processor is even worse. It was as complicated as any of the native-code-generating C++ compilers that later appeared on the market, except that it generated C code instead of native assembly.