How to Keep Your Code From Destroying You

An anonymous reader writes "IBM DeveloperWorks has a few quick tips on how to write maintainable code that won't leech your most valuable resource: time. These six tips on writing maintainable code are guaranteed to save you time and frustration: one minute spent writing comments can save you an hour of anguish. Bad code gets written all the time. But it doesn't have to be that way. It's time to ask yourself whether it's time for you to convert to the clean code religion."
  • by Anonymous Coward
    Captain Obvious?
    • by KingSkippus ( 799657 ) * on Wednesday May 30, 2007 @02:36PM (#19326117) Homepage Journal

      Maybe it was the note at the top of the article that says, "Level: Introductory."

      Maybe it was the author's comment at the end that said, "At this point, you may be thinking, 'Wow. That was a big waste of time. All of this stuff is obvious and everyone knows it. Why did anyone write all of this?' I hope this is what you're thinking. Then you're already smart. Good for you."

      But somewhere along the course of reading the article, I got the impression that he wasn't writing it for professional developers (at least, smart ones), but for people relatively new to programming.

      But then, maybe I'm just stating the obvious, Cap'n...

      • Re: (Score:3, Insightful)

        by seaturnip ( 1068078 )
        Somewhere along the course of reading the article, I also got the impression that he wasn't a professional developer himself (at least, a smart one).
        • by Anonymous Brave Guy ( 457657 ) on Wednesday May 30, 2007 @05:07PM (#19328675)

          That's a bit harsh. Apart from writing comments that are a maintenance liability, using C++ macros when constants would be better, mentioning the use of Hungarian notation that is a liability without mentioning the use that can actually be useful, advocating silent failure in the case of failed preconditions, misquoting Knuth and, to add insult to injury, citing a Wikipedia article in support when that article is currently tagged as having dubious citations (I know; I put the tag there a few weeks ago), failing to understand that games development is one of the few areas where early optimisation is basically a fact of life for some genres, and arguing that you shouldn't rely on programmers knowing basic language facilities like the pre- and post-increment operators in the C family, what was wrong with it? :-)

          I am, of course, being facetious. As the author himself points out at the end, much of this stuff isn't obvious to newbies, and it's better if someone tells them earlier rather than later, so kudos to him for taking the time to write it up. I do wish people volunteering such material would get some peer review if they can, though, because the only thing worse for inquisitive newbies than no information is bad information.

    • by Enselic ( 933809 ) on Wednesday May 30, 2007 @02:36PM (#19326121) Homepage
      I wonder what mushrooms he was on when he came up with that coding style... (yes, this is the actual indentation he used):

      void change_score(short num_points)
      {
          if (num_points < 0)
      {
      // maybe some error message
              return;
      }

          score += num_points;

          if (num_points > 0)
      make_sparkles_on_score();
      }
      • Re: (Score:3, Interesting)

        Agreed, especially considering that this article was not so much about commenting as it was about writing self-documenting code. To leave indentation out of that discussion is to be quite remiss.

        On the other hand, I work with many CS degree holders who could greatly benefit from this article. So while to some it's obvious stuff, just because it's obvious advice doesn't mean that it's always followed.

        I did kind of cringe though, at the bit about good var naming. I have been known to name vars things like
      • by spidweb ( 134146 ) on Wednesday May 30, 2007 @05:10PM (#19328727) Homepage
        A brief defense from the person who wrote the article.

        The indenting in the selected code was not mine. It got screwed up somewhere between my machine and being posted on their site. I'll drop them a note and ask them to fix it.

        No, I am not insane. :-)
    • by dvice_null ( 981029 ) on Wednesday May 30, 2007 @03:37PM (#19327087)
      I think all Slashdot readers should read that article. After all it tells you how to write good _comments_.
  • Damn (Score:5, Funny)

    by ReidMaynard ( 161608 ) * on Wednesday May 30, 2007 @02:11PM (#19325735) Homepage
    Damn, that game looks sweet. Anyone know what it is?
  • by rah1420 ( 234198 ) <rah1420@gmail.com> on Wednesday May 30, 2007 @02:14PM (#19325781)
    Comments, clarity, constants. If you're not doing this in your daily coding exertions, you deserve to have to maintain your own stuff 10 years from now.

    I have. It ain't fun. Not that I'm bragging on myself, but I've now had people from the support group stop me in the hall and compliment me on the quality of the code I've written and deployed.

    • I agree. It's the old concept of investment. Investing a little now is worth a lot more later. Investing one more minute now in commenting your code saves hours of puzzling later when you need to edit it.
      • Re: (Score:3, Funny)

        by jstretch78 ( 1102633 )
        I agree, but could all those hours of puzzling actually improve your ability to understand poorly written code? I've been using comments sparsely for years and have spent much time frustrated. But I've found that I can 'See the Code' like in The Matrix and bend spoons and shit.
    • Re: (Score:3, Informative)

      by Anonymous Coward
      I am trying to maintain code written by a senior designer (logic code). This developer did not believe these rules. It is hell. This is not redundant.
      • absolutely. I work for a large corporation, with a huge IT department. You'd be surprised to see how many "well educated" people write horrid code, and for a variety of reasons. Some do it for job security (I say let em support it forever then). Some do it to be "clever". Some do it to "separate the wheat from the chaff" but that's a lot like making jokes about things that have happened in your life that only you'd get and expecting others to laugh. Nobody else knows what you are thinking when you wri
    • Basics (Score:2, Insightful)

      Ok, yeah, it's redundant. But is there anything wrong with going back to the basics? Look at sports figures (and I chose sports for an analogy because most pro sports are very good at filtering out all but the best, and the very best get to the top by proving that they are better than the second best in head-to-head competition, something that programmers almost never do). The best often are known for continually going back to basics.

      Lombardi was known for his comment that if you block and tackle bette
    • by mstahl ( 701501 )

      The whole article is -1 redundant.

      You'd think so, but I think there are enough examples [worsethanfailure.com] out there to show that a lot of people are writing sub-standard code and thinking there's nothing at all wrong with it. There have been a lot of times when I was tutoring computer science in college where I'd ask a student, "Just what in the hell were you doing here?" and point to some ridiculous incantation in their code.

      In particular I think people ignore that last two when they're in school, because everybody wants to look impressiv

  • by FortKnox ( 169099 ) * on Wednesday May 30, 2007 @02:15PM (#19325801) Homepage Journal
    That has to be the worst written article on cleaning up your code I've ever read.
    This looks like it was written for (and BY) freshmen CS majors.

    Comment your code smartly? No shit?
    Use #defines everywhere? Honestly, I find that having a config file (or DB table) is a lot better, as I can change global variables without even a recompile...

    I'm not saying it's BAD advice, it's just advice that anyone in the real world already knows.
    How about something new?
    1.) Use test driven development
    2.) Write complete unit tests, including bad input (quick sketch below)
    3.) If any piece of code is complex enough to require a comment, make it its own function and comment the function. I believe the only things that REQUIRE comments are classes and methods. Not pieces of code...
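
    To make tip 2 concrete, here's roughly what a unit test that also feeds in bad input might look like for the article's change_score() (a plain assert-based sketch, not any particular test framework):

    #include <cassert>

    static int score = 0;

    // Simplified version of the article's function: negative input is ignored.
    void change_score(short num_points)
    {
        if (num_points < 0)
            return;
        score += num_points;
    }

    void test_change_score()
    {
        score = 0;
        change_score(10);
        assert(score == 10);   // normal input is added to the score

        change_score(-5);
        assert(score == 10);   // bad input must leave the score untouched
    }

    int main()
    {
        test_change_score();
        return 0;
    }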

    I could go on, but I'm not a writer...
    And neither is the author of that pile of trash...
    • It was fine... (Score:5, Insightful)

      by Erasmus ( 32516 ) on Wednesday May 30, 2007 @02:27PM (#19325981)
      People who are just starting their careers as programmers are allowed to read articles too. Just because something is aimed at a population less experienced than you doesn't mean that it's crap!

      I'm not sure if it really called for a Slashdot entry, but I've been on a few projects with new coders where a quick read of something like this on their parts would have saved everyone a lot of grief.
    • by wiredog ( 43288 ) on Wednesday May 30, 2007 @02:30PM (#19326017) Journal
      It has functions...
    • by Sax Maniac ( 88550 ) on Wednesday May 30, 2007 @02:31PM (#19326029) Homepage Journal
      Right, I laughed at that #define remark, it's so green.

      The real thing is to use named constants where it makes sense. #define is the crudest approximation of that, C can use enums, C++ can use "const" for compile-time values, etc.

      In a real project, you have to scope your constants otherwise you'll have a billion of them in "everything.h" and every time you touch it, the world will rebuild. So nix the "centrally located" file theory.

      In a real project, your constants will often have interdependencies with other bits of code, so changing one will frequently affect the others. Heck, maybe changing one will cause it not to compile. This example makes them all trivial to the point of uselessness. Shuttling a constant off miles away in an #include file can frequently give the impression that it can be changed with no effect on anything else.
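
      Something like this sketch is closer to what I mean by scoping (the names are invented for illustration):

      // alien.h -- constants live next to the code that uses them,
      // so touching one doesn't rebuild the world.
      namespace alien {
          const int kMaxOnScreen = 5;
          const int kPointValue = 10;
      }

      // elsewhere, e.g. in alien.cpp:
      // score += alien::kPointValue;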

      • by hobo sapiens ( 893427 ) <MOSCOW minus city> on Wednesday May 30, 2007 @03:21PM (#19326855) Journal
        "In a real project, you have to scope your constants otherwise you'll have a billion of them in "everything.h" and every time you touch it, the world will rebuild. So nix the "centrally located" file theory."

        Agreed, but some of the things in this article can be applied conceptually if not literally. Don't want a 2MB config file or header file? Right, me either. But break your program down into smaller pieces and declare stuff at that level and group it together at that level. Conceptually the same as what he is recommending, just done according to your actual implementation.

        I too thought the article was very basic, but that doesn't mean that the principles don't apply well to systems larger than a simple game.
      • by swillden ( 191260 ) * <shawn-ds@willden.org> on Wednesday May 30, 2007 @04:38PM (#19328183) Journal

        The real thing is to use named constants where it makes sense. #define is the crudest approximation of that, C can use enums, C++ can use "const" for compile-time values, etc.

        That was my thought, too. When writing C++, you should *avoid* #define like the plague. In fact, avoid using the preprocessor for anything except including (and guarding) headers and, occasionally, conditional compilation. One of the best things about enums is that they create not just values, but *types*. So, if you define:

        enum AlienCount { MAX_NUM_ALIENS = 20 };
        enum PointValue { POINT_VALUE_FOR_ALIEN = 10, POINT_VALUE_FOR_SPACESHIP = 30 };

        void givePlayerSomePoints(PointValue points);

        The compiler will barf if you accidentally type:

        givePlayerSomePoints(MAX_NUM_ALIENS)

        ... or something equally silly, but perhaps less obviously wrong.

        Smart C++ programmers find ways to get the compiler to point out their mistakes. One of the most powerful (and most clever) examples of this is in Barton and Nackman's book on C++ for scientists and engineers. They make use of templates to define types for representing physical values that not only have units attached, but which allow the compiler to statically check the units. Given:

        Mass m = 4 * kg;
        Acceleration g = 9.8 * meter / (second * second);
        Force f;
        f = m; // Generates a compile error!
        f = m * g; // No error here

        The compiler would take care of all of the unit checking at compile time, and, assuming we got rid of the erroneous line, generate machine code equivalent to:

        double m = 4;
        double g = 9.8;
        double f = 39.2; // No need to delay the multiplication to runtime.

        And if m or g aren't actually used except to calculate f, the compiler will optimize them away completely.

        I used the verbose version of their syntax, BTW. You can also write:

        Mass m(4);
        Acceleration g(9.8);
        Force f = m*g;

        which will apply default units to the numeric values. Of course, good code would also define a "const Acceleration g(9.8)" in a header somewhere, etc., rather than using numeric constants directly in the code, and it would use better variable names.

        Of course, such usage is well beyond the "Introductory" level of this article, but I think even an introductory article on C++ should recommend using enums to define constants, not #define. More advanced C++ users should devote a little time to writing classes that idiot-proof the code (because we're *all* idiots, at least some of the time), ideally without sacrificing performance.

    • by AuMatar ( 183847 ) on Wednesday May 30, 2007 @02:31PM (#19326041)
      Not every program uses a db. In fact the majority of programs don't. And unless a constant is going to change frequently, or needs to be configured per client, putting it in a configuration file or db table is a bad idea. It makes it fairly likely it will be changed by accident. The only things that should be in configuration files are things you actually expect the user to configure per install.

      As for your advice

      1)Thinking about testing early- good. Writing unit tests-good. The test driven development mentality (write tests instead of design, write unit tests before coding)- bad. It leads to a lot of wasted time, completely rewritten test suites, and throw away work. Thinking about testing early is useful, it may cause you to think about corner cases. But writing them first causes 2 problems- you end up writing the code to solve the tests (rather than solving the problem) and/or you end up throwing away half the test suite in the middle when you refactor the design.

      3) Disagree. The purpose of comments is to make sure that maintainers know what the code is trying to do. Any block of code that's more than 5 or 6 lines deserves a comment. Breaking all of those into independent functions leaves you with hundreds of 5- or 6-line functions, which makes it even harder to understand how they interact. Frequently the correct thing to do is not break it into a function and just write a 1-line comment.
      • I accept most of your arguments, except for TDD, which I have used to success...

        ...The test driven development mentality (write tests instead of design, write unit tests before coding)- bad. It leads to a lot of wasted time, completely rewritten test suites, and throw away work. Thinking about testing early is useful, it may cause you to think about corner cases. But writing them first causes 2 problems- you end up writing the code to solve the tests (rather than solving the problem) and/or you end up thr
      • by cliffski ( 65094 )
        Or in the case of games (what the author writes, and what I write, too), stuff that the user may want to fiddle with. I've been using data-driven code for a while, but after people were so keen to mod my last 2 games, my new one (www.rocklegendgame.com) had pretty much every notable variable placed in an external text file (a fully commented one called config.txt in the data dir) with the express purpose of letting the techy gamer play with it. There's the usual disclaimer that says the game has been balanced
    • by Coryoth ( 254751 ) on Wednesday May 30, 2007 @02:43PM (#19326251) Homepage Journal

      1.) Use test driven development
      I'll go you one better. Use specification-driven development. That is, use a combination of contracts and unit tests. If your method has general constraints, or your object has an invariant, write it into the code using contracts (and ideally use a system that will let your subclasses inherit contracts, and allow contracts to be sucked up and included in the API documentation); if your method's specification is only easily expressed as a set of mappings from input to expected output, write a unit test instead. When you run your unit tests the contracts will automatically get tested too. Better yet, by using contracts you can help yourself on:

      2.) Write complete unit tests, including bad input
      by using the contracts as a test oracle and passing in randomly generated data to really flesh out the corner cases. In some cases you can do this in a purely automated fashion [inf.ethz.ch] at the push of a button. Contracts also have the benefit of: (1) not requiring the boilerplate code of unit tests, so they're faster to write; (2) respecting inheritance, which can save you a lot of extra test writing. You can't always easily write contracts for methods, and in those cases unit tests make sense, but you may as well take full advantage of contracts for the parts that can be handled in that manner.
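
      To make the idea concrete without a real contract system, plain asserts can stand in for preconditions, postconditions, and invariants (a rough, made-up sketch):

      #include <cassert>

      class Score {
      public:
          Score() : total_(0) {}
          void add(int points) {
              assert(points >= 0);                    // precondition
              int old_total = total_;
              total_ += points;
              assert(total_ == old_total + points);   // postcondition
              assert(invariant());                    // class invariant
          }
      private:
          bool invariant() const { return total_ >= 0; }
          int total_;
      };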
    • I'd suggest that commenting the function calls may also be needed at times, such as when calling a legacy function that has tricky (or just plain inappropriate) calling conventions or side effects. I see plenty of places where a comment is useful for 2-3 lines of code, and it would not be appropriate to make a function out of all of those instances.
    • by Nyh ( 55741 ) on Wednesday May 30, 2007 @03:37PM (#19327093)
      If any piece of code is complex enough to require a comment, make it its own function and comment the function.

      That is just hiding your complexity. A massive tree of functions, each called one time, is as complex as all the code laid out sequentially in one function. Plowing through the massive tree of functions will cost you more time than reading sequentially through your code with comments at the places where you would have created a function.

      Nyh
  • by mkcmkc ( 197982 ) on Wednesday May 30, 2007 @02:17PM (#19325809)
    It's amazing how much simpler life is if your language will check errors (esp. I/O errors) by default. That is, if you do a write and it fails (e.g., because the disk is full), an exception gets thrown, and even if you haven't written any error handling code at all, you get a nice explanatory error message.

    C, C++, and Perl are not "safe" in this sense. Python is. Not sure about other common languages.
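
    (For what it's worth, C++ streams stay silent by default, but you can opt in to exceptions per stream. A minimal sketch; the file name is made up:)

    #include <fstream>
    #include <iostream>

    int main()
    {
        std::ofstream out;
        // Ask the stream to throw instead of silently setting its error bits.
        out.exceptions(std::ofstream::failbit | std::ofstream::badbit);
        try {
            out.open("scores.txt");
            out << "high score: 9000\n";
        } catch (const std::ios_base::failure& e) {
            std::cerr << "write failed: " << e.what() << '\n';
            return 1;
        }
        return 0;
    }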

    • Relying on the environment to do all cleanup leads to bad code.
      Every time you create something, destroy it afterwards.
      Assume every action will fail and handle it appropriately.

      I have seen 'developers' assume everything will be taken care of, then when the software gets into the users system their usage patterns make it explode.

      Simple management needn't make development time longer or harder, and it allows you to migrate things to other applications/systems with ease.
      • by mkcmkc ( 197982 ) on Wednesday May 30, 2007 @04:26PM (#19327967)

        Assume every action will fail and handle it appropriately.

        True enough, but this misses my point. The question is: What happens when a programmer fails to properly handle errors?

        This happens all the time, either because the programmer is not sufficiently competent, or simply misses a check, or because the program in question is a prototype that got pushed into production without being reworked.

        Having the language produce useful error messages by default does not preclude any other strategy regarding error handling, resource deallocation, etc. It wouldn't necessarily even need to be done via exceptions. It just needs to change the default strategy from fail-silently to fail-safe, which is what you really want if you care at all about reliability and correctness.

    • C, C++, and Perl are not "safe" in this sense. Python is. Not sure about other common languages.

      Please check your errors. I see far too many projects coded in Python that just barf stacktraces to the console if anything unexpected happens.

      Java has the advantage that it *makes* you handle your errors, even if you choose to do so poorly.
    • Perl has it [cpan.org].

      use Fatal qw(open close etc...);
    This relates to my biggest gripe about VB.Net. Any function can throw any exception, and you have no way of knowing, apart from looking at source code or documentation, what kind of errors any specific function might throw. So basically all errors become runtime errors. This is one place where I wish VB.Net would copy from Java. If the function is going to throw an exception, it should be a compilation error if you aren't catching it. Also, related to this, when using ASP.Net, the debugger never breaks
    • Are you writing a quick tool or an application? For a quick tool, sure, you want a nice explanatory error message. But for an application, you might want to do something different. You might want to ignore the message or do your own error handling.
  • by Palmyst ( 1065142 ) on Wednesday May 30, 2007 @02:21PM (#19325883)
    The article is suited for beginning programmers, I guess. Here is the summary of the tips.

    1. Comment smartly.
    2. Name your constants ("use #defines").
    3. Descriptive variable names, but not too long.
    4. Handle errors.
    5. Avoid premature optimization.
    6. Clarity is better than cleverness.

    The author may not be a beginning programmer, but it appears that he might be a beginning writer on programming.
      Yep, now you need to include the "Back To School" basic programming techniques, like Rodney Dangerfield might suggest.

      1. Use comment syntax to obfuscate the actual running code
      2. Don't indent or "pretty format" your code
      3. Use the same variable name over and over, capitalizing different letters to make them unique throughout the program
      4. Use variable names that are incredibly offensive in Hindi, so any common "outsource" programmer will refuse to work on the code.
      You get the point.
    • For the rest of us I recommend "The Pragmatic Programmer" by Andrew Hunt and David Thomas.
  • In Soviet Russia, code destroys YOU!
    • Noooooooooo (Score:2, Funny)

      by Anonymous Coward
      In Soviet Russia, Code comments YOU!
  • by dcollins ( 135727 ) on Wednesday May 30, 2007 @02:22PM (#19325901) Homepage
    "One minute spent writing comments can save you an hour of anguish."

    However, what's the probability that the savings actually goes to *you* and not a coworker competing with you for a promotion, or someone who replaced you in a later year? If you work in an office with 100 staff, let's say 1%. So expected savings to you is EV = 1% x 60 minutes = 0.6 minute, less than the minute it takes to write the comment. (Even assuming the payoff is correct, and then helping competing coworkers doesn't do any damage to you.)

    This is what I consider to be the "tragedy of the commons" for software engineering jobs. When I was a programmer, the people who did the least documentation were the fastest, and often the only folks who could approach certain parts of code, and so held in the highest esteem by the executives. Now I only write code for my own projects.
    • Re: (Score:3, Insightful)

      by jfengel ( 409917 )
      This is also one of those "Check mark: $1 Knowing where to put it: $49,999" problems. It takes you a minute to comment a function... times 50 methods a day... times the 1% of comments you ever actually need to go back and read.

      Suddenly that "one minute" is a lot of hours spent writing comments that you'll never read, cluttering up your code and getting wronger as you don't maintain them.

      If I knew which comment to write, sure, I'd write it. And I do, when I think it's appropriate. There are plenty of time
      • Re: (Score:3, Insightful)

        I'm with you. Almost every time I commit to commenting more, I can't find things to comment that aren't obvious in the code. In 90% of the cases, the code should either comment itself or be rewritten.
    • Good point. In my very long experience in this industry, the star programmers cobble something together as fast as possible without worrying about maintenance or documentation. What documentation there is is just the minimum that is needed for various members of the team to work together to make release 1.0, not something that will help future maintainers of the code very much. After release 1.0, they cash in their chips and move on to The New New Thing (yes, I am specifically referring to certain people in
  • Mostly agreed (Score:5, Insightful)

    by ZorbaTHut ( 126196 ) on Wednesday May 30, 2007 @02:23PM (#19325913) Homepage
    I thought I'd make two comments on things that I think he got a bit wrong.

    Tip 2: Don't use #define. Avoid it as best as you can. Use const int. That's what it's for. It will be typechecked by the compiler, it's much harder to produce bizarre errors, and 99% of the time it's better.

    const int NUM_ALIENS_TO_KILL_TO_END_WAVE = 20;

    Tip 4: Warning messages don't work. Don't bother with them. Use assert() - if it triggers, your program will crash with a useful error message. Now that's an incentive to make things work!

    In my current project, out of 25,000 lines of code, I have almost 1100 asserts. And the first number counts whitespace. Any bugs I have get found and squashed pretty much instantly.
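
    (For anyone newer to this, typical usage looks something like the following; the function and limits are made up:)

    #include <cassert>

    void spawn_alien(int wave, int aliens_on_screen)
    {
        // If a caller hands us garbage, crash right here with a readable message.
        assert(wave >= 1 && "wave numbers start at 1");
        assert(aliens_on_screen < 20 && "too many aliens already on screen");
        // ... actual spawning code ...
    }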
    • by Sciros ( 986030 )
      I'm a bit rusty on my C++, but assert() statements in Java are discouraged in anything besides testing, since enabling them is a runtime flag (-ea) and they're off by default. So, production code shouldn't rely on asserts if it's Java.
      • Re: (Score:3, Informative)

        by ZorbaTHut ( 126196 )
        In fairness, I don't actually use assert() - I have my own macro that I use that is turned on in both debug and release mode. But in effect it's an assert() that always runs.

        You're right, though - that's one of the more annoying "features" of assert. Luckily assert() is not a particularly hard thing to rewrite.
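
        A macro like that might look roughly like this (the name and details are invented):

        #include <cstdio>
        #include <cstdlib>

        // A "release assert": never compiled out, reports where it fired, then aborts.
        #define ALWAYS_ASSERT(cond) \
            do { \
                if (!(cond)) { \
                    std::fprintf(stderr, "Assertion failed: %s (%s:%d)\n", \
                                 #cond, __FILE__, __LINE__); \
                    std::abort(); \
                } \
            } while (0)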
      • Re: (Score:3, Interesting)

        by AuMatar ( 183847 )
        In most C++ libraries, assert is conditionally compiled to a no-op in non-debug builds (when NDEBUG is defined). So the same issue applies. That doesn't mean don't use asserts, it means that assert statements should not have side effects.
    • Don't use #define. Avoid it as best as you can. Use const int.

      Wouldn't that be slightly less efficient? Wouldn't that allocate memory to hold the value rather than having a literal in the program code? It would be disastrous if it caused a cache miss too; you could lose a lot of efficiency there. Anyone know how various compilers handle const ints?
      • Re: (Score:3, Informative)

        by ZorbaTHut ( 126196 )
        Since the compiler knows that the int can't ever change, the compiler can easily inline it, giving you exactly the same performance as #define. I believe that most compilers only bother allocating space for the int if you try to take the address of it in some way.

        I suspect they only do this in release mode, of course.
      • "Wouldn't that be slightly less efficient?"

        No.

        "Wouldn't that allocate memory to hold the value rather than having a literal in the program code?"

        No.

        "Would be disaserterous if it caused a cache-miss too, you coul lose a lot of efficency there. Anyone know how various compilers handle const ints?"

        No, and you can't really know where a cache miss will happen unless you know exactly what computer will run that code.

        Your compiler knows that a const is constant, and won't throw memory (and lookup time) away with it.

        • Your compiler knows that a const is constant, and won't throw memory (and lookup time) away with it. Unless, of course, you want to debug. And you shouldn't be thinking about cache size while targeting the (quite heterogeneous) PC.

          Yeah, I know that in most cases it wouldn't matter, but I can think of quite a few worst-case scenarios with a lot of lookups. In any case, I know that you can modify const values by getting a pointer, casting to non-const, and then modifying it. But it appears that will only work i
    Does the use of const int vs #define compile any differently? I work mostly with small embedded projects where every byte counts, though I've always trusted the compilers to handle the optimization, and the standard seems to be #defines rather than const int across several companies I've worked at. Though IIRC #defines are resolved by the preprocessor rather than the compiler, which may result in smaller executable size since there would be fewer symbols overall in the symbol table. Since it may be that all
      • Re: (Score:3, Insightful)

        by drxenos ( 573895 )
        I use C++ for embedded systems all the time (along with C and Ada). Compilers will "inline" constants just as well as they do #defines. By "symbol table" I assume you are talking about the global symbol table used by the dynamic link-loader. Since constants are local by default I would not worry about this, as they will not appear in the table. And don't listen to the commenter saying you should use C instead of C++ for embedded systems. That's just know-nothing nonsense.
  • Um... well, duh.

    But aside from simply commenting, *updating* comments is also important, and possibly more so. Looking at the history of the TCP/IP source in OSX, for example, you see wonderful instances of complex code that's been updated as new operating systems have been built on top of it, but comments reflect the old behavior rather than the new.
  • by dpbsmith ( 263124 ) on Wednesday May 30, 2007 @02:26PM (#19325937) Homepage
    Not that I don't use them a lot myself, but I thought that in C++ you were supposed to try to avoid the use of macros altogether, and in particular were supposed to use consts for, well, defining constants.

    I.e. not

    #define PIXEL_WIDTH_OF_PLAY_AREA 800
    #define PIXEL_HEIGHT_OF_PLAY_AREA 600

    but

    const int PIXEL_WIDTH_OF_PLAY_AREA=800;
    const int PIXEL_HEIGHT_OF_PLAY_AREA=600;
  • Invariants (Score:2, Insightful)

    by ljw1004 ( 764174 )
    The comments I like best are correctness invariants or induction hypotheses. Like the ones you'd use to prove that your algorithm is correct. For example:

    // Invariant: if an object is ReadOnly, then everything that points to it is ReadOnly
    // Invariant: if an object is writeable, then everything it points to is writeable
    // Invariant: when you call CopyOnWrite(), you get back a writeable object
    // Invariant: The ReadOnly flag starts false, and will change to true during the object's lifetime, but can neve
  • What about structured coding? I mean really, this stuff is so obvious, and then to not even mention structured coding?
  • by dfuhry ( 973158 ) <dfuhry@cs.kent.edu> on Wednesday May 30, 2007 @02:27PM (#19325959)
    Wow. That was a big waste of time. All of this stuff is obvious and everyone knows it. Why did anyone write all of this?
  • Actually (Score:5, Funny)

    by Dachannien ( 617929 ) on Wednesday May 30, 2007 @02:27PM (#19325979)
    Actually, it's not necessarily a bad thing for your code to destroy you. Just make sure you don't dereference any old pointers to you afterwards.

    • Actually, it's not necessarily a bad thing for your code to destroy you. Just make sure you don't dereference any old pointers to you afterwards.

      Geez, and I thought Skynet was just doing garbage collection.
  • So where can I download "Kill Bad Aliens"?
  • It is notable that Tip 1, "Comment like a smart person", and Tip 2, "Do error checking. You make errors. Yes, you" can be sensibly combined into "Use contracts". Sure, you'll still need some extra comments to describe the function, but any comments about the parameters a function receives and what it returns can be usefully written as contracts which provide both documentation and error checking. It also helps you to think about exactly what you intend a function to do, and assists greatly down the line whe
  • by Ambitwistor ( 1041236 ) on Wednesday May 30, 2007 @02:32PM (#19326055)
    To keep my code from destroying me, I shouldn't #define MAX_ALIENS_ON_SCREEN_AT_ONCE to equal 100. That's way too many aliens to survive.
  • ... if your C code requires you to know the difference between i++ and ++i, it is too complicated.

    Advice on good comments--great--but really, it's just obvious. Anyone that doesn't get how to comment well doesn't want to comment well. And the above quote made me want to wring his neck. If you don't know the difference between those two operators, you should stick to VB.

    • by loqi ( 754476 ) on Wednesday May 30, 2007 @02:45PM (#19326271)
      ... if your C code requires you to know the difference between i++ and ++i, it is too complicated

      It's not a matter of knowing the difference, it's a matter of the code depending on the difference. If you need to increment beforehand, do it on the previous line. Afterward, do it on the next line. Expressions are like sex: they're better without side-effects.
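
      (A trivial made-up example of what I mean:)

      // Instead of letting the expression depend on pre- vs. post-increment:
      //     total += values[i++];
      // spell it out, so nobody has to remember which one you meant:
      int sum_first(const int values[], int count)
      {
          int total = 0;
          int i = 0;
          while (i < count) {
              total += values[i];
              i++;               // the increment gets its own line
          }
          return total;
      }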
      • Re: (Score:3, Informative)

        First off, it's not a side-effect. It's intended.

        I know it's nothing more than syntactic sugar, but avoiding it means making a temp variable, worrying about scope and naming clashes, messing up your code, and having to pull stuff out of or put stuff into a for loop; given all the other million ways it comes in handy, that's cutting off your nose to spite your face. I have to reiterate: if this is too complicated for you, you shouldn't be writing C. Then, when we start to talk about C++, forget about it. Template magic, function

  • /* insert meaningful and descriptive comment here */
  • The foremost concern of any programmer used to be time. CPU time. How many cycles for this operation, versus how many cycles for that one. Now it seems like we have shifted our time concerns to programming time - the programmers' salary, the time to market, etc. Funny though how this tends to result in bloated, sluggish, bug-ridden programs that need at LEAST a patch or two before they even reasonably deliver what was promised on the box.

    I guess we are lucky that the software "consume
  • by Kalzus ( 86795 ) on Wednesday May 30, 2007 @02:47PM (#19326305)
    "Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it." - Brian W. Kernighan
  • by kabdib ( 81955 ) on Wednesday May 30, 2007 @03:04PM (#19326579) Homepage
    Rule #1 of Systems Programming: Never check for an error you don't know how to handle.

    But, if you simply MUST, then:

    Rule #2: If you have to blow up, arrange it to be in someone else's code.

    That way, when you're (say) deep in your file system update locking and you realize that something's gone truly plonkered, you stealthily return something that causes X Windows to blow chunks long after you've returned.

    "It's the file system."

    "No, it's not. It's the bloody clipping code in X. Remember when release 10.5.08A came out? It's just gotten worse from that. Did I ever tell you about the time that 9.02 was released? Let me just say, you're lucky, man . . ."

    Rule #3: When necessary, distract, distract, distract. Everything is on the table, and "Look, the Goodyear Blimp!" [points excitedly] is just for starters...

    Good systems programmers know these tricks, and all the others you haven't learned about yet, which is why they're curmudgeons with level 70 pallies and tier-2 gear and you're shovelling Java and XML around trying to make a rent check.

    Cheers!

  • Tip 3 is crap. (Score:3, Insightful)

    by geekoid ( 135745 ) <dadinportland@yaFREEBSDhoo.com minus bsd> on Wednesday May 30, 2007 @03:11PM (#19326679) Homepage Journal
    " For example, in real life, I probably wouldn't give constants names as long as I did in the previous section. I just did that so that you, the reader, would totally understand what they meant without any context. In the context of the program itself, instead of:

    #define MAX_ALIENS_ON_SCREEN_AT_ONCE 5
    I would almost undoubtedly write:

    #define MAX_NUM_ALIENS 5

    Any confusion caused by the shorter name would be cleared up very quickly, and the shorter name would lead to much more readable code. "

    Cleared up very quickly? No, it can only be cleared up after searching your code to realize it is the max number of aliens on the screen, not the max per level, or per game, or whatever the hell you were thinking 2 years ago when you wrote it.
    Bad Bad Bad.
    sloppy.
  • These are ok for intro tips. Though if the goal is maintainable code I don't know if I'd be doing trivial stuff in C++.

    Saying use "#define" everywhere is pretty much like saying use globals everywhere. It's better than hard-coding magic numbers, but sometimes not by much. Keeping variables scoped as tightly as possible can help save some pain; globals can add salt to the wound.
    • by geekoid ( 135745 )
      Wow, you really, really do not understand #define, or the power they give the programmer.

      If you are using C or C++ and not using defines, I guarantee you your code is a disaster waiting to happen. Also, it's a maintenance nightmare.

      Sorry, but it's true.

  • Although Java-centric, one must include information on how to write UNmaintainable code as well: the original [mindprod.com] unmaintainable code site (AFAIK).
  • My favorite part is where he says: "if your C code requires you to know the difference between i++ and ++i, it is too complicated," because it makes me remember that being covered in my very first programming course, and that all code should perforce be understandable to someone who has finished maybe 1/3 semester of intro programming.
  • Hungarian Notation (Score:4, Insightful)

    by PhoenixRising ( 36999 ) <ngroot+slashdotNO@SPAMlo-cal.org> on Wednesday May 30, 2007 @04:11PM (#19327699) Homepage

    For example, there is something called Hungarian Notation. There are lots of flavors of this, but the basic idea is that you put a tag at the beginning of a variable name saying what type it is.

    I wish he'd included a link to the Wikipedia article on Hungarian notation [wikipedia.org] and specifically referenced "Apps Hungarian". Hungarian notation is essentially a cheap way to create programmer-enforced "types". When these are truly new types ("dirty string", "null-terminated string", etc.) not known to the compiler/interpreter, it might be reasonable; this is "Apps Hungarian". However, prefixing an unsigned int with "ul" (i.e., "Systems Hungarian") is silly; your compiler should warn you/error out if you're trying to do something inappropriate with it, since it knows what an unsigned int is. Hungarian notation will be a useful thing until it's as easy to define new types in common programming languages as it is in, say, Haskell, but it should be used judiciously.
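
    A quick made-up illustration of the difference:

    #include <string>

    // Hypothetical sanitizer, purely to show the naming convention.
    static std::string sanitize(const std::string& us) { return us; }

    void handle_form(const std::string& raw_input)
    {
        // "Apps Hungarian": the prefix encodes something the compiler can't check.
        std::string usName = raw_input;         // us = unsafe (unsanitized) string
        std::string sName = sanitize(usName);   // s = sanitized string

        // "Systems Hungarian": the prefix merely restates the declared type.
        unsigned long ulCount = 0;              // the compiler already knows this
        ulCount += sName.size();
    }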

  • by spidweb ( 134146 ) on Wednesday May 30, 2007 @05:06PM (#19328655) Homepage
    As the person who actually wrote the article in question, I'd like to thank you for your comments and respond with a few of my own.

    * To those who think it is all so obvious that I shouldn't have written about it:

    No. You are wrong. Just wrong. Good programming practices do not just appear in peoples' heads as if by magic.

    It's an introductory article. It's sold as an introductory article. And I am far more interested in being Right than I am scared of being Obvious.

    * To those who have problems with suggesting using #define instead of const int

    Meh. Yeah, you're probably right. But the important thing here is the CONCEPT of having your constants defined in one central, easy-to-find place. Once a person has that down, he or she can define them however desired.

    * To those who accuse me of being a (gasp) inexperienced programming writer.

    Yeah. So what? I never said I wasn't. I'm a game author. I've written over a dozen games. They all made money. That doesn't mean I am mister programming advice god.

    But one, if you have a problem with it, yell at IBM, not me. They're the ones who had me write the piece.

    Two. This is kind of ad hominem. Who cares how experienced I am or am not? I'm still right.

    * I didn't read the article, but I'll say bad things about you because it means I'm awesome.

    R0ck 0n d00d. First post!

    * I liked the article. It might tell beginning programmers something actually useful.

    If anyone says this, thanks in advance.

"Consequences, Schmonsequences, as long as I'm rich." -- Looney Tunes, Ali Baba Bunny (1957, Chuck Jones)

Working...