
How to Keep Your Code From Destroying You

An anonymous reader writes "IBM DeveloperWorks has a few quick tips on how to write maintainable code that won't leech your most valuable resource — time. These six tips on how to write maintainable code are guaranteed to save you time and frustration: one minute spent writing comments can save you an hour of anguish. Bad code gets written all the time. But it doesn't have to be that way. It's time to ask yourself whether it's time for you to convert to the clean code religion."
  • by Palmyst ( 1065142 ) on Wednesday May 30, 2007 @03:21PM (#19325883)
    The article is suited for beginning programmers, I guess. Here is the summary of the tips.

    1. Comment smartly.
    2. Name your constants ("use #defines").
    3. Descriptive variable names, but not too long.
    4. Handle errors.
    5. Avoid premature optimization.
    6. Clarity is better than cleverness.

    The author may not be a beginning programmer, but it appears that he might be a beginning writer on programming.
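
    For anyone new enough that the list above looks unfamiliar, here is a minimal C++ sketch of tips 2 through 4; the function names and the retry scenario are invented purely for illustration:

      #include <cstdio>
      #include <cstdlib>

      const int kMaxRetries = 3;              // tip 2: a named constant, not a magic number

      // tip 3: a descriptive but not bloated name; stubbed so the sketch compiles
      bool sendRequest(int /*attempt*/) { return false; }

      int main() {
          // tip 4: handle the failure case instead of assuming success
          for (int attempt = 0; attempt < kMaxRetries; ++attempt) {
              if (sendRequest(attempt)) {
                  return EXIT_SUCCESS;
              }
              std::fprintf(stderr, "attempt %d failed, retrying\n", attempt);
          }
          std::fprintf(stderr, "giving up after %d attempts\n", kMaxRetries);
          return EXIT_FAILURE;
      }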
  • by Anonymous Coward on Wednesday May 30, 2007 @03:28PM (#19325989)
    I am trying to maintain code written by a senior designer (logic code). This developer did not believe these rules. It is hell. This is not redundant.
  • by KingSkippus ( 799657 ) * on Wednesday May 30, 2007 @03:36PM (#19326117) Homepage Journal

    Maybe it was the note at the top of the article that says, "Level: Introductory."

    Maybe it was the author's comment at the end that said, "At this point, you may be thinking, 'Wow. That was a big waste of time. All of this stuff is obvious and everyone knows it. Why did anyone write all of this?' I hope this is what you're thinking. Then you're already smart. Good for you."

    But somewhere along the course of reading the article, I got the impression that he wasn't writing it for professional developers (at least, smart ones), but for people relatively new to programming.

    But then, maybe I'm just stating the obvious, Cap'n...

  • Re:Mostly agreed (Score:3, Informative)

    by ZorbaTHut ( 126196 ) on Wednesday May 30, 2007 @03:37PM (#19326131) Homepage
    Fair enough. His post was talking about C++, and I'm a C++ coder myself, so I wasn't aware of the weird C problem.

    I guess if you're forced to use pure C, it could be an issue, but I would personally just compile it as C++ (remember, you don't have to use all the C++ features if you don't want to).
  • by Phisbut ( 761268 ) on Wednesday May 30, 2007 @03:42PM (#19326229)

    Neither offers any advantage over the other, since changing either will only affect the program after it's been recompiled; it's just that slight disadvantage for a distinct data type.

    Using const int instead of #define can save you a whole lot of time when it comes to debugging. When something has gone wrong and you need to step through your code with a debugger, const int will have created a symbol that your debugger can use and show you, while a #define leaves the debugger with only the raw value of the constant, not where it comes from or what it's called.
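
    A tiny, made-up illustration of the two forms (the names are invented):

      #include <cstdio>

      #define BUFFER_SIZE 512          // the debugger only ever sees the raw 512

      const int kBufferSize = 512;     // a real symbol: the debugger can show its
                                       // name, type, and where it was declared

      int main() {
          std::printf("%d %d\n", BUFFER_SIZE, kBufferSize);
          return 0;
      }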

  • by jjrockman ( 802957 ) on Wednesday May 30, 2007 @03:46PM (#19326291) Homepage
    Not to nitpick, but it's not "by Microsoft". It's published by Microsoft, but written by Steve McConnell of Construx.
  • Re:Mostly agreed (Score:3, Informative)

    by ZorbaTHut ( 126196 ) on Wednesday May 30, 2007 @03:51PM (#19326371) Homepage
    In fairness, I don't actually use assert() - I have my own macro, which is turned on in both debug and release mode. But in effect it's an assert() that always runs.

    You're right, though - that's one of the more annoying "features" of assert. Luckily assert() is not a particularly hard thing to rewrite.
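
    A minimal sketch of what such an always-on check might look like; the macro name and the abort-on-failure behavior are assumptions, not the grandparent's actual code:

      #include <cstdio>
      #include <cstdlib>

      // Unlike assert(), this is not compiled out when NDEBUG is defined,
      // so it still fires in release builds.
      #define VERIFY(cond)                                                   \
          do {                                                               \
              if (!(cond)) {                                                 \
                  std::fprintf(stderr, "VERIFY failed: %s (%s:%d)\n",        \
                               #cond, __FILE__, __LINE__);                   \
                  std::abort();                                              \
              }                                                              \
          } while (0)

      int main() {
          int openFiles = 0;
          VERIFY(openFiles >= 0);   // passes silently
          return 0;
      }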
  • Re:Mostly agreed (Score:3, Informative)

    by ZorbaTHut ( 126196 ) on Wednesday May 30, 2007 @03:54PM (#19326439) Homepage
    Since the compiler knows that the int can't ever change, the compiler can easily inline it, giving you exactly the same performance as #define. I believe that most compilers only bother allocating space for the int if you try to take the address of it in some way.

    I suspect they only do this in release mode, of course.
  • by EvanED ( 569694 ) <{evaned} {at} {gmail.com}> on Wednesday May 30, 2007 @03:56PM (#19326461)
    That is bad in a way because the const ints are being stored in memory instead of hardcoded at compile time.

    Not necessarily. That's implementation dependent, and as long as you're compiling with optimization, you're probably wrong most of the time.

    Neither offers any advantage over the other since changing either will only affect the program after it's been recompiled

    #defines have a number of well-known drawbacks that consts don't.

    1. CPP macros stomp all over your namespace. For instance, Linux defines a macro "current". At one point I tried to make a local variable named "current" in a function. I spent the next 15 minutes swearing at GCC for not compiling my code and giving error messages that made no sense. You can reduce this problem with naming conventions (for instance, all caps for macros), but consts don't have this problem at all.

    2. consts don't suffer from order-of-operations issues, so you can define them without worrying about it. The classic example is

      #define SIX 1 + 5
      #define NINE 8 + 1
      printf("%d", SIX * NINE);

    which helpfully prints "42" (the product expands to 1 + 5 * 8 + 1) instead of 54. Again, there are ways to work around this (put parens around the expressions), but you don't NEED to work around it if you use consts.

    3. consts are type-safe

    Of course, you can't always use 'const' in C. For instance, you can't use a const int as an array bound. But if you don't have to worry about C compatibility, I can't think of a reason to use #define instead of const for defining constant expressions.
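
    A small, runnable sketch of points 1 and 3; the names (get_current_task, kMaxUsers) are invented, and the clash mirrors the Linux "current" example above:

      #include <cstdio>

      int get_current_task() { return 7; }    // stand-in for a real kernel call

      // Point 1: imagine this macro lives in a header you did not write.
      #define current get_current_task()

      // Uncommenting the next line fails with a baffling error, because the
      // preprocessor rewrites it to "int get_current_task() = 0;".
      // int current = 0;

      // Point 3: a const has a real type that the compiler checks.
      const unsigned int kMaxUsers = 100;

      int main() {
          std::printf("task %d, max users %u\n", current, kMaxUsers);
          return 0;
      }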
  • First off, it's not a side-effect. It's intended.

    I know it's nothing more than syntactic sugar, but giving it up means making a temp variable, worrying about scope and naming clashes, cluttering your code, and pulling stuff out of or shoving stuff into a for loop; given the million other ways it comes in handy, refusing to use it is cutting off your nose to spite your face. I have to reiterate: if this is too complicated for you, you shouldn't be writing C. Then, when we start to talk about C++, forget about it. Template magic, function and operator overloading, virtual functions, etc. are all way more complicated. For that matter, pointers in C alone are far more complicated than the difference between pre- and post-increment.
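
    For readers who have not run into it, a tiny illustration of the pre/post-increment difference mentioned above:

      #include <cstdio>

      int main() {
          int i = 5;
          int a = i++;   // post-increment: a gets the old value (5), then i becomes 6
          int b = ++i;   // pre-increment:  i becomes 7 first, then b gets 7
          std::printf("a=%d b=%d i=%d\n", a, b, i);   // prints a=5 b=7 i=7
          return 0;
      }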

  • Re:It was fine... (Score:2, Informative)

    by Erasmus ( 32516 ) on Wednesday May 30, 2007 @05:01PM (#19327523)
    I see where you are coming from. These are simple rules, and they really are important to professional programmers. But at least in my experience, a lot of the kids just entering the field are coming out of their college comp sci programs with chiefly academic programming experience. They know how to knock out programs that solve very specific problems to meet the requirements of their courses but are never used again. Whether that's tragic or not, I don't know. But that is what I've seen in the past.

    Personally, I don't think it's such a big problem. Nice little articles like this one help and, honestly, most bright people get with the program once they see how the post-college world works. Those who don't tend to get converted the first time they have to deal with someone else's messy code.
  • Re:Mostly agreed (Score:3, Informative)

    by ZorbaTHut ( 126196 ) on Wednesday May 30, 2007 @05:12PM (#19327729) Homepage
    I disagree. Anything that you expect to happen shouldn't be caught by assert(), of course - hard drive out of space, timeout connecting to website, etc etc etc. But anything you don't expect to happen you obviously can't plan for, and that means you can't guarantee that your program will be in a consistent state. The best thing to do there is to save any user data you can and kill the program.

    Obviously it's best if none of your asserts ever trigger, but I'd rather an assert trigger and save my work before crashing than have my document get increasingly corrupted until I can't even load it to save whatever is left.
  • by spidweb ( 134146 ) on Wednesday May 30, 2007 @06:06PM (#19328655) Homepage
    As the person who actually wrote the article in question, I'd like to thank you for your comments and respond with a few of my own.

    * To those who think it is all so obvious that I shouldn't have written about it:

    No. You are wrong. Just wrong. Good programming practices do not just appear in people's heads as if by magic.

    It's an introductory article. It's sold as an introductory article. And I am far more interested in being Right than I am scared of being Obvious.

    * To those who have problems with suggesting using #define instead of const int

    Meh. Yeah, you're probably right. But the important thing here is the CONCEPT of having your constants defined in one central, easy-to-find place. Once a person has that down, they can define them however they like.

    * To those who accuse me of being a (gasp) inexperienced programming writer.

    Yeah. So what? I never said I wasn't. I'm a game author. I've written over a dozen games. They all made money. That doesn't mean I am mister programming advice god.

    But one, if you have a problem with it, yell at IBM, not me. They're the ones who had me write the piece.

    Two. This is kind of ad hominem. Who cares how experienced I am or am not? I'm still right.

    * I didn't read the article, but I'll say bad things about you because it means I'm awesome.

    R0ck 0n d00d. First post!

    * I liked the article. It might tell beginning programmers something actually useful.

    If anyone says this, thanks in advance.

  • by spidweb ( 134146 ) on Wednesday May 30, 2007 @06:10PM (#19328727) Homepage
    A brief defense from the person who wrote the article.

    The indenting in the selected code was not mine. It got screwed up somewhere between my machine and being posted on their site. I'll drop them a note and ask them to fix it.

    No, I am not insane. :-)
  • by Anonymous Brave Guy ( 457657 ) on Wednesday May 30, 2007 @07:05PM (#19329681)

    Very little apart from failing to respect scope and not encoding any type information?

  • by Anonymous Coward on Wednesday May 30, 2007 @09:05PM (#19330933)
    If a method doesn't convey intent well, then it is too complex. If a comment is needed, then that's a smell suggesting the code should convey intent better. If breaking something complex out into intent-conveying methods leaves a lot of methods lying around, that's a sign you might be doing too much in this class, so look at your design. As an exercise, before you write a comment on WHAT a snippet of code is doing, see if you can make a method whose name is what the comment says. Of course, if you were practicing good TDD/TFD, then you would most likely be refactoring little complexities as they come about, which naturally yields intent-revealing code.

    The only time I feel comments are useful is to explain WHY a particular technique was chosen, or the rationale behind the approach taken.

    2 cents
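
    A toy sketch of the exercise described above; the validation rule and the names are invented for illustration:

      #include <string>

      // Before: a comment explains WHAT the condition checks.
      //     // valid if at least 3 chars and no spaces
      //     if (name.size() >= 3 && name.find(' ') == std::string::npos) { ... }

      // After: the check is pulled into a function whose name says the same
      // thing, so the WHAT comment is no longer needed.
      bool isValidUserName(const std::string& name) {
          return name.size() >= 3 && name.find(' ') == std::string::npos;
      }

      bool registerUser(const std::string& name) {
          if (!isValidUserName(name)) {
              return false;
          }
          // ... create the account ...
          return true;
      }

      int main() { return registerUser("alice") ? 0 : 1; }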
  • by neutralstone ( 121350 ) on Wednesday May 30, 2007 @09:31PM (#19331139)
    "Heck, C++ was just a C pre-processor."

    Please note the term "C pre-processor" can be very misleading in this context; it is not the best term to use when referring to Cfront.

    Cfront [wikipedia.org] parsed, analyzed & type-checked C++ code and generated the equivalent C code. It fits the formal description of a compiler [wikipedia.org] and should be referred to as such.

    I understand that "C pre-processor" can mean "a compiler whose target language is C", but usually it means "the program (or a component of a C or C++ front end) that processes directives beginning with '#' and resolves macro expansions."

    And calling Cfront "just" a C pre-processor is even worse. It was as complicated as any native-code-generating C++ compiler of its later years (when such compilers started to appear on the market), with the exception that it generated C code instead of native assembler code.
