Programming

Programming Mistakes To Avoid 394

snydeq writes "InfoWorld's Peter Wayner outlines some of the most common programming mistakes and how to avoid them. 'Certain programming practices send the majority of developers reaching for their hair upon opening a file that has been exhibiting too much "character." Spend some time in a bar near any tech company, and you'll hear the howls: Why did the programmer use that antiquated structure? Where was the mechanism for defending against attacks from the Web? Wasn't any thought given to what a noob would do with the program?' Wayner writes. From playing it fast and loose, to delegating too much to frameworks, to relying too heavily on magic boxes, to overdetermining the user experience — each programming pitfall is accompanied by its opposing pair, lending further proof that 'programming may in fact be transforming into an art, one that requires a skilled hand and a creative mind to achieve a happy medium between problematic extremes.'" What common mistakes do you frequently have to deal with?
This discussion has been archived. No new comments can be posted.

Programming Mistakes To Avoid

  • by Barryke ( 772876 ) on Tuesday December 07, 2010 @06:28AM (#34471378) Homepage

    What common mistakes do you frequently have to deal with?
    - Software tested only by the programmer.
    - Manual written only by the programmer.
    - Support can't get through a day without the programmer.

    A good programmer should know when to delegate. Or their boss should. That depends on the office culture, perhaps.

  • by eagleyes ( 737026 ) on Tuesday December 07, 2010 @06:43AM (#34471438)
    The most common programming mistake to avoid: Reading badly written articles about "what programming mistakes to avoid".
  • by Chrisq ( 894406 ) on Tuesday December 07, 2010 @06:52AM (#34471492)

    Doesn't mistake number 2 contradict number 1? Or am I missing something?

    The whole lot is full of contradictions:

    4: Delegating too much to frameworks vs. 8: Reinventing the wheel
    9: Opening up too much to the user vs. 10: Overdetermining the user experience
    5: Trusting the client vs. 6: Not trusting the client enough

    I think that there is a meta-message, akin to Buddha's middle way. Don't take any rule to extremes.

  • by Voulnet ( 1630793 ) on Tuesday December 07, 2010 @06:57AM (#34471504)
    Programming mistake #0: Believing that your computer degree (Computer Engineering or Computer Science alike) automatically makes your code high quality.

    Not to start an academia-vs-industry argument, but many students miss the point of a computer degree with programming courses in it: the degree intentionally doesn't go into detail, because it needs to give you a background in a broader set of subjects. Industry needs you to be very attentive to detail in the one thing you're doing at the moment.
  • "Common" mistakes (Score:5, Insightful)

    by Alex Belits ( 437 ) * on Tuesday December 07, 2010 @06:57AM (#34471506) Homepage

    The only common mistake I see is not firing the programmer who makes any of those "common" mistakes. There is absolutely no reason for any of this shit to be "common" unless the "programmers" who make them are uneducated dumbasses who should never be allowed anywhere near software development.

    Now, please, give me the list of "common mistakes" made by surgeons and aircraft engineers, and compare them with this list of amateurish crap.

  • by digitig ( 1056110 ) on Tuesday December 07, 2010 @07:09AM (#34471546)
    True enough. And since every rule has to have a complement: 0a: Assuming that you don't need to learn any of that theory: algorithms, data structures, normalisation, and so on.
  • Pointer typedefs (Score:5, Insightful)

    by QuoteMstr ( 55051 ) <dan.colascione@gmail.com> on Tuesday December 07, 2010 @07:14AM (#34471564)

    Pointer typedefs were a bad idea in the 1980s. They're just terrible today. One pet peeve of mine is this:

    typedef struct _FOO { int Blah; } FOO, *PFOO;

    void
    SomeFunction(const PFOO);

    That const doesn't do what you think it does. There was never a good reason to use pointer typedefs. There is certainly no good reason to do so today. Just say no. If your coding convention disagrees, damn the coding convention.
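
    To spell out the pitfall, here's a minimal sketch (the second function name is invented for illustration): the const binds to the pointer itself, not to the struct it points to.

    typedef struct _FOO { int Blah; } FOO, *PFOO;

    /* "const PFOO" means "FOO * const": the pointer is const,
       but the struct it points to is freely mutable. */
    void SomeFunction(const PFOO foo)
    {
        foo->Blah = 42;   /* compiles fine -- the struct is NOT protected */
        /* foo = NULL; */ /* this is all the const actually forbids */
    }

    /* What the author almost certainly wanted: pointer to const FOO. */
    void BetterFunction(const FOO *foo)
    {
        /* foo->Blah = 42; */ /* now a compile error, as intended */
    }

    Spelling the pointer out at the call site makes the intent visible, which is exactly what the typedef hides.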

  • Is this real? (Score:5, Insightful)

    by Psychotria ( 953670 ) on Tuesday December 07, 2010 @07:38AM (#34471684)

    I've not worked as a programmer for, hmm, maybe 15 years, and all of this was known way back even before I "retired" from that line of work. Perhaps all these levels of abstraction upon abstraction make things harder to understand. Back in my day these "pitfalls" were obvious, because we all (well, not all, but a lot of us) knew ASM and actually used it regularly (even inline, *shudder*).

    Someone above mentioned pointer typedefs and gave the example of typedef struct { int Blah; } FOO, *PFOO; (yes, I left off the bit before the opening brace deliberately) and then suggested that people don't know that void SomeFunction (const PFOO) {} doesn't behave as expected. Now this could, I suppose, be seen as a failure of the language. But, shit, any idiot who understands the underlying logic can see why that causes problems. Which goes back to my point: maybe all these modern levels of abstraction, and all this getting away from the machine, are in some ways detrimental.

    Now, get off my lawn. Umm, except I don't have a lawn because I sprayed the growth inducing hormone RoundUp all over it, but that is beside the point. I think.

  • by CountBrass ( 590228 ) on Tuesday December 07, 2010 @07:56AM (#34471766)
    'programming may in fact be transforming into an art, one that requires a skilled hand and a creative mind to achieve a happy medium between problematic extremes.' Bullshit. Programming has always been an art that required skill and a creative mind. The only people who have claimed otherwise have been managers, who would prefer all techies to be interchangeable cogs, and crap programmers: the gimps and muppets of our trade.
  • by commodore64_love ( 1445365 ) on Tuesday December 07, 2010 @08:04AM (#34471818) Journal

    >>>Programming has always been an art that required skill and a creative mind

    plus logical thinking (like the machine you're programming). It always surprised me that my Professor/Director of Engineering said programming should not be considered a "science" or "engineering". He said programmers were the equivalent of bus drivers: just human beings running a machine.

    At first I thought, 'Well, maybe he has a point,' but no, not really. Driving a machine is a skill that can be learned in a day or two. Programming a machine takes years: the same amount of time needed to learn any engineering discipline.

     

  • by BiggerIsBetter ( 682164 ) on Tuesday December 07, 2010 @08:09AM (#34471842)

    There will, however, always be BAD code by bad programmers. I've taken over Java projects where everything was OOP'ed into hell (as in a bazillion more classes than the application needed) and PHP projects which should have been OOP'ed but consisted of about 500 files that included each other in a huge, confusing net.

    I see this one as a lack-of-experience problem. People have good intentions and want to build scalable, extensible, maintainable code. This is good. Unfortunately, however, they're wrong. The apps they're building are small regardless of the amount of thought they put into them, and they won't have to scale and extend the way they think they might. You don't need interfaces and impls and arbitrary inheritance for everything when the webapp is 4 screens of Spring WebFlow! Sure, if you're building something that warrants it, this is the way to go, but most of us aren't building apps that big or flexible. It seems to take time to learn this, and to know when to apply the patterns and when to just build the thing.

    As a smarter man than I once said, "Make things as simple as possible, but no simpler." If you do that, your code will work, it'll be understandable by the next guy, and you'll have a fighting chance of meeting your deadlines.

  • by Bobakitoo ( 1814374 ) on Tuesday December 07, 2010 @08:20AM (#34471896)
    If you had cared to check the link, you would have seen the IBM advertisement on the side. Also, the print layout is a feature of the site; the OP didn't invent it or hack it in. Or maybe it's you who runs an ad-blocker and steals their labor? If you get modded down, it will only be because you're speaking bullshit.
  • by petes_PoV ( 912422 ) on Tuesday December 07, 2010 @08:28AM (#34471944)
    By the time the coding starts, most projects are already doomed. The basic mistakes that occur before any code is written have a far greater effect on the project. While these are almost all outside the control of the programmer, he/she always gets the blame due to the "last person who touched it, broke it" principle. My short list of favourites would be:

    Allowing too many options / features in the design. The classic example being unable to decide whether feature A or B is best, and ducking the issue by including them both

    Assuming 5 working-days of effort can be achieved in a working week. Conveniently forgetting about all the office overheads such as "progress" meetings, timesheet administration, interrupted work, all the other concurrent projects. Even the most efficient, single-threaded operation needs half a working-day per week just for the trivia.

    Following on from that, conveniently forgetting about annual leave commitments, national holidays and the possibility of sickness. If 5 working-days per week is impractical, 12 working-months in a year is downright negligent (rough numbers below).

    The tacit assumption that testing will inevitably be followed by release, rather than by bug-fixing.

    Holding the end-date constant while delaying the start, or presuming that all delays in the specification, design and approval stages can somehow be reclaimed during coding (how: by thinking faster?).
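
    To put rough numbers on the working-week point (illustrative figures, not from the original comment): 52 weeks minus, say, 5 weeks of annual leave, 2 weeks of national holidays and a week of sickness leaves about 44 weeks; at 4.5 productive days per week (after the half-day of weekly trivia) that's roughly 198 days, or about 9 effective working-months out of the naive 12.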

  • by rgravina ( 520410 ) on Tuesday December 07, 2010 @08:36AM (#34471978)

    The biggest programming mistakes I've had the displeasure of making, or of discovering in others' code, almost always centre on one of these two problems:

    1. The code is over-engineered
    2. The code was abstracted before there was even a need for the abstraction.

    I remember, when I was less experienced, how thrilled I'd be by code that was clever, solved many problems aside from the one I was trying to solve, and had some clear reusability built in. What a work of art, I thought... until I eventually realised that much of the extra code I had written didn't get used, and the abstracted code was never reused; or even if it was, I couldn't predict how it would be reused, and the abstraction was clumsy at best, useless at worst.

    It's sad when this happens: good intentions, but the end result is a lot of waste. I'm embarrassed to look over my earlier code which is like this... I like to think I do it less now, but the temptation is always there... I'm going to need to do this later anyway... I can just abstract this bit here and reuse it some day in the future...

    My advice now... Don't do it! Just wait until the reuse case comes along, or the new feature request comes along, and *then* do it. You'll know so much more about the problem domain by then, and you might avoid days (weeks!) of wasted effort.
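
    A toy illustration of the pattern (a C sketch with made-up names, since the thread never shows one): an "interface" with exactly one implementation, built before any second implementation exists.

    #include <stdio.h>

    /* Over-abstracted: a function-pointer "interface" for a thing
       that only ever has one implementation. */
    struct greeter_ops {
        void (*greet)(const char *name);
    };

    static void greet_stdout(const char *name)
    {
        printf("Hello, %s\n", name);
    }

    static const struct greeter_ops stdout_greeter = { greet_stdout };

    /* The simple version the abstraction buys nothing over (yet): */
    static void greet(const char *name)
    {
        printf("Hello, %s\n", name);
    }

    int main(void)
    {
        stdout_greeter.greet("world"); /* indirection with no payoff */
        greet("world");                /* just as easy to generalise later */
        return 0;
    }

    The indirection costs readability today against a payoff that may never come; adding it when the second implementation actually arrives is usually a mechanical refactoring.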

  • Re:GOTO... (Score:4, Insightful)

    by ledow ( 319597 ) on Tuesday December 07, 2010 @08:56AM (#34472060) Homepage

    I happen to agree with the general idea of that thread: goto is powerful, even in good code, but it is easily misused to create spaghetti code. The choices then available are: remove goto from the language / never use goto, or carefully audit each use of goto to make sure it provides sufficient advantages and *doesn't* make the silly mistakes possible.

    The languages that remove the things that trip up inept programmers (e.g. pointers, goto, etc.) tend to be the ones in which it is hardest to program with predictable efficiency.

    There's nothing wrong with goto. Just don't lob it into code without thinking about it.
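
    The canonical defensible use in C is the single-exit cleanup path; a minimal sketch (the function name and buffer size are made up):

    #include <stdio.h>
    #include <stdlib.h>

    /* One goto target collects all cleanup, so every early exit
       releases exactly the resources acquired so far. */
    int process_file(const char *path)
    {
        int ret = -1;
        char *buf = NULL;
        FILE *f = fopen(path, "rb");

        if (f == NULL)
            goto out;

        buf = malloc(4096);
        if (buf == NULL)
            goto out;

        if (fread(buf, 1, 4096, f) == 0)
            goto out; /* empty file or read error */

        ret = 0; /* success */
    out:
        free(buf); /* free(NULL) is a no-op */
        if (f != NULL)
            fclose(f);
        return ret;
    }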

  • by Tridus ( 79566 ) on Tuesday December 07, 2010 @09:24AM (#34472270) Homepage

    I don't think that's the fault of the programmer in a lot of cases. I'd love to have someone in my office to write a manual and do proper QA. But the budget doesn't include those things, and I don't get to set the budget.

    Sometimes the reality is that you either get a program that doesn't have those things, or you try to do those things and don't have enough money to build anything.

  • by Tridus ( 79566 ) on Tuesday December 07, 2010 @09:30AM (#34472370) Homepage

    That, and being able to figure out what people actually want in the first place.

    Quite a few projects fail simply because the people requesting it have no idea what they actually want. They can't articulate their needs, or why they need it, or even where the idea came from. If you can't nail that down, the rest of it is a crapshoot.

    (The only thing worse is when they DO know what they want, and it's for entirely irrational reasons.
    "We want Sharepoint!"
    "Okay, why?"
    "Umm... because we do!"
    "What does Sharepoint do that you want?"
    "Documents, and stuff!"
    "Sigh...")

  • by mrjb ( 547783 ) on Tuesday December 07, 2010 @09:48AM (#34472578)
    Don't get me started on preventing programming mistakes. If I were to address the most common programming mistakes I've run into in the wild and write an article about each of them in turn, I would end up with a whole book on the matter, and would probably call it "Growing Better Software".

    I find the given top 12 list of mistakes a bit weak; I'd be able to avoid all of these and still write horrible code. My personal recommendation for a top 12 of programming mistakes to avoid would be:

    1. Failing to check function parameters before using them: null pointers, limits, lengths, etc. This will make your program unstable and/or unpredictable. (See the sketch after this list.)

    2. Spending too little time thinking about and designing the data structure of the application. This will make you get stuck when maintaining/extending your application.

    3. Following every market hype - When the marketing bubble bursts, you'll have to start over again.

    4. Designing user interfaces without actually involving users - You'll be surprised how easy it is to confuse users.

    5. Infinitely deeply nested if/else statements - This will make code absolutely unreadable.

    6. No documentation whatsoever - Who's going to maintain your code after you change jobs?

    7. Ignoring existing, universally accepted standards - you'll cause interoperability issues or be doomed to reinvent the wheel.

    8. Hard-coded values/magic numbers - as a result, any change must be made in code rather than allowing power users to configure their own system.

    9. Littering code with global variables - this implies statefulness of code, making it pretty near impossible to predict how a function will behave next time it is called.

    10. Being unaware of the "Big O" order of your algorithms, causing code to be unnecessarily inefficient.

    11. Strong platform dependency: This can shorten the lifetime of your application to whenever the next platform upgrade takes place, or keep you stuck at the current version of the current platform forever.

    12. Thinking you can figure out everything by yourself - In learning by doing, experience can only follow from making mistakes. By getting yourself a mentor or an education, you can actually learn from the mistakes that thousands have made before you.

    13. Stopping at 12.
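
    A minimal C sketch of items 1 and 8 together (copy_name and MAX_NAME_LEN are invented names):

    #include <stddef.h>
    #include <string.h>

    /* Mistake 8 avoided: name the limit instead of scattering "64". */
    #define MAX_NAME_LEN 64

    /* Mistake 1 avoided: validate parameters before using them.
       Returns 0 on success, -1 on bad input, instead of crashing
       on a null pointer or overflowing the destination. */
    int copy_name(char *dst, size_t dst_size, const char *src)
    {
        if (dst == NULL || src == NULL)
            return -1; /* null-pointer check */
        if (dst_size == 0 || strlen(src) >= dst_size)
            return -1; /* length/limit check */
        strcpy(dst, src); /* now known to fit */
        return 0;
    }

    /* Usage:
       char name[MAX_NAME_LEN];
       if (copy_name(name, sizeof name, user_input) != 0)
           ... reject the input ...
     */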
  • by Jimmy King ( 828214 ) on Tuesday December 07, 2010 @09:49AM (#34472588) Homepage Journal

    I agree. This is a very big problem for me as management keeps cutting back the tech staff where I work.

    I write the code, I test the code, I write the docs for people. I've tried explaining over and over again that this is terrible. I shouldn't be the final tester. I already think it works or I wouldn't have written it that way and wouldn't be at the point of testing. This affects my testing and causes me to miss things. I also know how it's supposed to work too well and so don't even think to try stupid stuff that users inevitably do which breaks stuff.

    Documentation is similar. I know how it works and how it was intended to work. At times this makes things obvious to me that are not necessarily obvious to someone else and so they may not find their way into the documentation.

  • by Savage-Rabbit ( 308260 ) on Tuesday December 07, 2010 @10:49AM (#34473356)

    There will, however, always be BAD code by bad programmers. I've taken over Java projects where everything was OOP'ed into hell (as in a bazillion more classes than the application needed) and PHP projects which should have been OOP'ed but consisted of about 500 files that included each other in a huge, confusing net.

    Taking over projects fitting those descriptions is never a good idea. They are nothing but pain; it's impossible to resolve the problems with the app and the code unless you opt for a complete rewrite. If, however, you go that route, the remaining developers will be pissed off, because they wrote the crappy code and you are basically saying that their ugly baby is ... well ... UGLY! What's worse, you are saying it out loud for everybody, including the PHBs, to hear. Eventually you end up frustrated, and your PHB either caves in to complaints about you and puts you in your place, or you get laid off. Unless, of course, you anticipate this and quit before he gets the chance.

    There is no substitute for writing code properly and designing and planning your application properly, no matter how insignificant the application seems, because you never know which piece-of-shit app will take off and scale into something much, much bigger. Myself, I learned this from a friendly lecture my boss gave me after I handed in my first project on my very first job. He made me rewrite the thing entirely, claiming it was better that I learn the value of things like database abstraction and MVC separation right away. He was right.

  • by olau ( 314197 ) on Tuesday December 07, 2010 @12:31PM (#34474920) Homepage

    That's because your professor learnt it in a day or two and now thinks he's a star programmer, while the sad truth is that he's years away from producing anything better than the crap beginners crank out. As far as I can tell, most professors are at that level when it comes to programming, so probably none of his fellow professors have corrected him. I've heard a Ph.D. refuse our explanation of why our SML student project, which processed strings as lists, was slower than gzip (written in C), with "but they have the same time complexity".

    It has the depressing side effect that people in college are taught principles that the real star programmers shy away from, while nobody works on hammering the really important ones into them.

    Of course, there are exceptions to this generalization.
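
    The point of the gzip anecdote, sketched in C (an illustrative comparison, not the actual student project): two loops with the same O(n) time complexity can differ enormously in constant factors because of memory layout.

    #include <stddef.h>

    /* An SML-style char list: one heap node per character. */
    struct node { char c; struct node *next; };

    long sum_array(const char *s, size_t n)
    {
        long total = 0;
        for (size_t i = 0; i < n; i++)
            total += s[i]; /* contiguous, cache-friendly */
        return total;
    }

    long sum_list(const struct node *p)
    {
        long total = 0;
        for (; p != NULL; p = p->next)
            total += p->c; /* one pointer chase (likely a cache miss) per character */
        return total;
    }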

  • by jekewa ( 751500 ) on Tuesday December 07, 2010 @01:34PM (#34475980) Homepage Journal

    I know you're being facetious, but software compiled with "acceptable warnings" may also lead to runtime failures.

    I once had a job writing C++ software where the lead made us write code with zero warnings, with the compiler turned to its strictest settings. Justifiable suppression was allowed (for example, in cases of ambiguous type-casting from library headers). While I thought this was overkill, we had been internally hired to help a group whose software was woefully unreliable; it turned out they had gone the other way, turning off all compiler warnings and suppressing some "acceptable" errors. Correcting their errors and compromising on some of their warnings brought the quality of their software to at least a stable level.

    There is a middle ground, but I've chosen to go the zero-tolerance route on warnings; they're easy to get rid of, and doing so encourages careful and thoughtful use of the language (and flushes out its abuses).

    A good rule of thumb is that if your IDE or compiler is complaining about it, you probably left yourself open for a failure.

    Of course, not having any warnings doesn't prevent errors due to bad logic...that's a whole other ball of flame-bait.
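
    A classic example of the kind of bug a strict setting flags (a C sketch; with gcc or clang, -Wall reports the assignment-as-condition below):

    #include <stdio.h>

    int main(void)
    {
        int errors = 5;

        /* Typo: "=" instead of "==". This compiles, silently sets
           errors to 0, and skips the branch. With -Wall, gcc and
           clang both warn about the assignment used as a truth
           value. */
        if (errors = 0)
            printf("no errors\n");

        /* What was meant: */
        if (errors == 0)
            printf("no errors\n");

        return 0;
    }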

  • by Anonymous Coward on Tuesday December 07, 2010 @02:33PM (#34476934)

    I think you overestimate the predictability of the engineering disciplines.
