Programming Mistakes To Avoid
snydeq writes "InfoWorld's Peter Wayner outlines some of the most common programming mistakes and how to avoid them. 'Certain programming practices send the majority of developers reaching for their hair upon opening a file that has been exhibiting too much "character." Spend some time in a bar near any tech company, and you'll hear the howls: Why did the programmer use that antiquated structure? Where was the mechanism for defending against attacks from the Web? Wasn't any thought given to what a noob would do with the program?' Wayner writes. From playing it fast and loose, to delegating too much to frameworks, to relying too heavily on magic boxes, to overdetermining the user experience — each programming pitfall is accompanied by its opposing pair, lending further proof that 'programming may in fact be transforming into an art, one that requires a skilled hand and a creative mind to achieve a happy medium between problematic extremes.'"
What common mistakes do you frequently have to deal with?
Printable version - All on one page (Score:5, Informative)
And now for the printable version with all the tips on one page:
http://infoworld.com/print/145292 [infoworld.com]
Re:Printable version - All on one page (Score:5, Insightful)
>>>Programming has always been an art that required skill and a creative mind
plus logical thinking (like the machine you're programming). It always surprised me when my Professor/Director of Engineering said programming should not be considered a "science" or "engineering". He said programmers were the equivalent of bus drivers - just human beings running a machine.
At first I thought, 'Well, maybe he has a point,' but no, not really. Driving a machine is a skill that can be learned in a day or two. Programming a machine requires years - the same amount of time needed to learn any engineering discipline.
Re: (Score:3)
I think driving is pretty similar. You can learn it in a few days of intensive training but if you want to be a really good driver you need to stick to best practices and learn how the car itself works and what its limits are.
Re:Printable version - All on one page (Score:4, Insightful)
That's because your professor learnt it in a day or two and now thinks he's a star programmer, while the sad truth is more like he's years from producing anything better than the crap beginners can crank out. As far as I can tell, most professors are at that level when it comes to programming. So none of his co-professors have probably corrected him. I've heard a Ph.D. refuse our explanation of why our SML student project processing strings as lists was slower than gzip (written in C) with "but they have the same time complexity".
It has the depressing side-effect that people in college are being taught principles that the real star programmers shy away from, while nobody is working on hammering the really important ones into them.
Of course, there are exceptions to this generalization.
Re: (Score:2)
And a lot of graphic/physical artists. I've seen many who do graphic arts, drawing and painting, who seem to think that if you are in any computer programming or science field, you have no creativity.
Re: (Score:3)
I had a friend in college [harvard.edu] who got a double major in art and electrical engineering. When I asked her why she was working in such different fields, she responded that to her art and electrical engineering were both forms of art.
For some reason this has stuck with me for decades (yeah, I'm an old guy).
Re: (Score:2)
What advertisement? I haven't seen ads on websites in years, between my hosts file and adblock.
Re: (Score:2)
Did you just not bother to look, or did you forget to turn off your adblock before looking?
Tests, Manual, Support by programmer. (Score:5, Insightful)
What common mistakes do you frequently have to deal with?
- Software only tested by programmer.
- Manual only written by programmer.
- Support can't do a day without programmer.
A good programmer should know when to delegate. Or their boss should. Depends on office culture perhaps.
Re:Tests, Manual, Support by programmer. (Score:5, Insightful)
Don't think that's the fault of the programmer in a lot of cases. I'd love to have someone in my office to write a manual and do proper QA. But the budget doesn't include those things, and I don't get to set the budget.
Sometimes the reality is that you either get a program that doesn't have those things, or you try to do those things and don't have enough money to build anything.
Re:Tests, Manual, Support by programmer. (Score:5, Insightful)
I agree. This is a very big problem for me as management keeps cutting back the tech staff where I work.
I write the code, I test the code, I write the docs for people. I've tried explaining over and over again that this is terrible. I shouldn't be the final tester. I already think it works or I wouldn't have written it that way and wouldn't be at the point of testing. This affects my testing and causes me to miss things. I also know how it's supposed to work too well and so don't even think to try stupid stuff that users inevitably do which breaks stuff.
Documentation is similar. I know how it works and how it was intended to work. At times this makes things obvious to me that are not necessarily obvious to someone else and so they may not find their way into the documentation.
Re: (Score:2)
Unrealistic timetables... your testing prototype ends up as the shipped code.
Senior programmer helps on the project... yay, we get old code we have to rewrite for him.
Engineering gives us a craptastic design and we have to program around the limitations or problems in the platform, causing a kludge that we pray will not unravel after it ships.
Those three I see nearly monthly.
Re: (Score:2)
One modern solution is to write both from the specifications, then run the program according to the documentation to attempt to generate the screen shots...
Re: (Score:2)
The only problem with that is that specifications vary wildly in the level of detail.
I've seen them be a very brief overview of what the program should do, leaving out all the important details that would belong in a manual. I've also seen specifications that specify the smallest details of the inner workings of the code, making it little better to base the manual on than the source itself.
Nevertheless, for most applications the best specification would be somewhere in between, and probably be well suited a
Re: (Score:3)
Ah, I see. All small-time open source software (software written by a single person in their own time) is a "mistake." Nice.
Oh please, that's not what he means and you know it. There are a lot of companies that think the person who wrote the code should also test it ("They know it better than anybody else!") and write the manual ("They know it better than anyone else!").
This is not a knock against "one person operations" such as OSS, freeware, shareware, etc. They are forced to deliver quality code, testing and documentation because it is their name on the product. When it's a company that is just trying to save money (or the
Programming Mistakes To Avoid... (Score:3, Funny)
...just try to avoid errors and you should be set.
Re:Programming Mistakes To Avoid... (Score:4, Funny)
...just try to avoid errors and you should be set.
But warnings are OK. Just no errors. Warnings still compile.
Re:Programming Mistakes To Avoid... (Score:4, Insightful)
I know you're being facetious, but software compiled with "acceptable warnings" may also lead to runtime failures.
I once had a job writing C++ software where the lead made us write code with zero warnings, with the compiler turned to its strictest settings. Justifiable suppression was allowed (for example, in cases of ambiguous type-casting from library headers). While I thought this was overkill, we were internally hired to help a group whose software was woefully unreliable; it turns out they had gone the other way, turning off all compiler warnings and suppressing some "acceptable" errors. Correcting their errors and compromising on some of their warnings brought the quality of their software to at least a stable level.
There is a middle ground, but I've chosen to go with the zero-tolerance route on warnings; they're easy to get rid of, and encourage careful and thoughtful use (and even abuses).
A good rule of thumb is that if your IDE or compiler is complaining about it, you probably left yourself open for a failure.
Of course, not having any warnings doesn't prevent errors due to bad logic...that's a whole other ball of flame-bait.
Maintaining code by others is always a nightmare (Score:5, Interesting)
As I see it, most projects start out with a good structure and the best of intentions; then come deadlines, the developer has to juggle several projects at once, and a shortcut is taken here, then there. And suddenly you end up with an undocumented project where the only person who knows how it works is the original developer.
There will, however, always be BAD code by bad programmers. I've taken over Java projects where everything was OOP'ed into hell (as in a bazillion more classes than the application needed) and PHP projects which should have been OOP'ed but consisted of about 500 files that included each other in a huge confusing net.
I've also had to take over projects where the original developer was using new technology because he thought it would be fun (at the expense of the customer). Having a huge website in PHP/MySQL and then having crucial parts of it in Ruby/PostgreSQL is just a maintenance nightmare.
Re: (Score:3)
All true. My personal favorites are larger, long-running projects where all of the above is true and all kinds of undocumented business logic is embedded in the code, making a rewrite unfeasible, and you have to decide which parts of the code are outright sloppy or bad, which parts are workable, and which parts aren't actually being used anymore. Top that off with the original developers being unavailable (either dead or fled) and you'd be painting a pretty accurate picture of the run-of-the-mill software environment.
Re:Maintaining code by others is always a nightma (Score:5, Insightful)
There will, however, always be BAD code by bad programmers. I've taken over Java projects where everything was OOP'ed into hell (as in a bazillion more classes than the application needed) and PHP projects which should have been OOP'ed but consisted of about 500 files that included each other in a huge confusing net.
I see this one as a lack-of-experience problem. People have good intentions and want to build scalable, extensible, maintainable code. This is good. Unfortunately, however, they're wrong. The apps they're building are small regardless of the amount of thought they put into them, and they won't have to scale and extend the way they think they might - you don't need interfaces and impls and arbitrary inheritance for everything when the webapp is 4 screens of Spring WebFlow! Sure, if you're building something that warrants it, this is the way to go, but most of us aren't building apps that big or flexible. It seems to take time to learn this, and to know when to apply the patterns and when to just build it.
As a smarter man than I once said, Make things as simple as possible, but no simpler. If you do that, your code will work, it'll be understandable by the next guy, and you'll have a fighting chance of meeting your deadlines.
Re:Maintaining code by others is always a nightma (Score:5, Insightful)
There will, however, always be BAD code by bad programmers. I've taken over Java projects where everything was OOP'ed into hell (as in a bazillion more classes than the application needed) and PHP projects which should have been OOP'ed but consisted of about 500 files that included each other in a huge confusing net.
Taking over projects fitting those descriptions is never a good idea. They are nothing but pain; it's impossible to resolve the problems with the app and the code unless you opt for a complete rewrite. If, however, you go that route, the remaining developers will be pissed off, because they wrote the crappy code and you are basically saying that their ugly baby is ... well ... UGLY! What's worse, you are saying it out loud for everybody, including the PHBs, to hear. Eventually you end up being frustrated, and your PHB either caves in to complaints about you and puts you in your place, or you get laid off. Unless, of course, you anticipate this and quit before he gets the chance.

There is no substitute for writing code properly and designing and planning your application properly, no matter how insignificant the application seems to be, because you will never know which piece of shit app will take off and scale into something much, much bigger. Myself, I learned this from a friendly lecture I was given by my boss after I handed in my first project on my very first job. He made me rewrite the thing entirely, claiming it was better that I learned the value of things like database abstraction and MVC separation right away. He was right.
Only one programming mistake to avoid: (Score:2)
Learing how to do it in the first place.
Re: (Score:2)
Or even 'learning'.
Tchoh.
do x but not too much! (Score:2)
Programming mistake No. 1: Playing it fast and loose
Failing to shore up the basics is the easiest way to undercut your code. Often this means overlooking how arbitrary user behavior will affect your program. Will the input of a zero find its way into a division operation? Will submitted text be the right length? Have date formats been vetted? Is the username verified against the database? Mistakes in the smallest places cause software to fail.
Fair enough. So debug while you code. Seems like good advice.
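As a minimal sketch of those basic checks in C (the function and its parameters are invented for illustration, not from TFA):

#include <stddef.h>
#include <string.h>

/* Guards against two of the slips TFA lists: a zero divisor and
 * missing/over-long submitted text. Returns 0 on success and writes
 * the result through 'out'; returns -1 when the input is unsafe. */
int average_per_user(int total, int user_count,
                     const char *username, int *out)
{
    if (out == NULL || username == NULL)
        return -1;
    if (strlen(username) > 64)      /* arbitrary length limit */
        return -1;
    if (user_count == 0)            /* the divide-by-zero case */
        return -1;
    *out = total / user_count;
    return 0;
}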
Programming mistake No. 2: Overcommitting to details
On the flip side, overly buttoned-up software can slow to a crawl. Checking a few null pointers may not make much difference, but some software is written to be like an obsessive-compulsive who must check that the doors are locked again and again so that sleep never comes.
Doesn't mistake number 2 contradict number 1? Or am I missing something? I guess he's saying debug while you code, but not too much. After reading the rest, I see that that was his algorithm for writing the whole article. Rule 1: do x; Rule 2: but not too much! I didn't really find the article all that useful.
All programming should be assembly language programming anyway, and a lot of his rules don't seem to apply to assembly language programming.
accompanied by its opposing pair (Score:4, Informative)
Doesn't mistake number 2 contradict number 1? Or am I missing something?
Yup. FTA:
Below you will find the most common programming pitfalls, each of which is accompanied by its opposing pair, lending further proof that programming may in fact be transforming into an art -- ...
Re:do x but not too much! (Score:5, Insightful)
Doesn't mistake number 2 contradict number 1? Or am I missing something?
The whole lot is full of contradictions:
4: Delegating too much to frameworks vs. 8: Reinventing the wheel
9: Opening up too much to the user vs. 10: Overdetermining the user experience
5: Trusting the client vs. 6: Not trusting the client enough
I think that there is a meta-message, akin to Buddha's middle way. Don't take any rule to extremes.
Re:do x but not too much! (Score:5, Informative)
"The whole lot is full of contradictions"
No, it isn't. It goes "don't do that... but don't fall into the other extreme".
That's in line with his central idea that programming is "an art, one that requires a skilled hand and a creative mind to achieve a happy medium between problematic extremes".
Re: (Score:2)
Doesn't mistake number 2 contradict number 1? Or am I missing something?
The whole lot is full of contradictions:
4: Delegating too much to frameworks vs. 8: Reinventing the wheel
9: Opening up too much to the user vs. 10: Overdetermining the user experience
5: Trusting the client vs. 6: Not trusting the client enough
I think that there is a meta-message, akin to Buddha's middle way. Don't take any rule to extremes.
Confucius say: This one skitzo mutherfucker
Re: (Score:3)
Fair enough. So debug while you code. Seems like good advice.
s/debug/test/
Re: (Score:3)
Yes it does. The difficult part is knowing the balance, as indicated by the summary: "programming may in fact be transforming into an art, one that requires a skilled hand and a creative mind [...]"
Personally, I believe we'd be better off if professional programming transformed from an art into an engineering discipline. IMHO, building robust and efficient applications should be a boring and repetitive exercise in design and implementation of prescribed design patterns... maybe then we'd turn our industry's abysmal success rates around.
Re: (Score:2)
Sounds like a plan. Are you willing to go and tell HR and the executives that we now need to be paid ENGINEER pay scales?
Those that control the money don't want programming to become elevated to Engineering status; they would have to pay us all more across the board.
Re:do x but not too much! (Score:4, Insightful)
That, and being able to figure out what people actually want in the first place.
Quite a few projects fail simply because the people requesting it have no idea what they actually want. They can't articulate their needs, or why they need it, or even where the idea came from. If you can't nail that down, the rest of it is a crapshoot.
(The only thing worse is when they DO know what they want, and it's for entirely irrational reasons.
"We want Sharepoint!"
"Okay, why?"
"Umm... because we do!"
"What does Sharepoint do that you want?"
"Documents, and stuff!"
"Sigh...")
Mistakes? (Score:2)
I very rarely see programming mistakes. There seem to be two kinds of programmers:
- Those who care about what they do and try hard.
- Those who don't care about what they do and don't try hard.
The latter write terrible code, but it is just because they are either lazy or aren't suited to the profession and can't get enthused. Very rarely do you see someone who cares about their work make a big mistake (and if so, they are probably just starting out).
Re:Mistakes? (Score:5, Funny)
I very rarely see programming mistakes.
Neither do the bad programmers!
Re: (Score:2)
Very rarely do you see someone who cares about their work make a big mistake (and if so, they are probably just starting out).
Really? I see it all the time from very intelligent but inexperienced programmers. I've seen a few spin their wheels and end up depressed about their whole job.
Programming Mistake #0 (Score:5, Insightful)
Not to bring up the academia-vs-industry argument, but many students miss the idea of a computer science degree with programming courses in it: the degree intentionally doesn't go into details, because it needs to give you a background in a broader set of subjects. Industry needs one to be very attentive to the details of the one thing he's doing at the moment.
"Common" mistakes (Score:5, Insightful)
The only common mistake I see is not firing the programmer who makes any of those "common" mistakes. There is absolutely no reason for any of this shit to be "common" unless "programmers" who make them are uneducated dumbasses who should never be allowed anywhere near software development.
Now, please, give me the list of "common mistakes" made by surgeons and aircraft engineers, and compare them with this list of amateurish crap.
Re:"Common" mistakes (Score:5, Informative)
UK doctors leave 722 objects inside patients in 1 year [sify.com]
Re: (Score:3)
I really could not find much data, but the total number of surgical procedures performed in the U.S. per year is around 70 million. I'd expect the UK to have at least 10% of that. That means about 700 lost objects for 7,000,000 procedures, or 1 in 10,000. Pretty good track record, although these are not the only mistakes to happen, of course.
Re: (Score:2)
UK doctors leave 722 objects inside patients in 1 year [sify.com]
I would have that patient(*) arrested for trying to steal a complete surgery unit...
As per TFA (I only read them once in a while; when I read it, I quote it), it just states 6 issues (security, user freedom) and says "don't do it too much, don't do it too little". Useless, because the point is knowing exactly what is too much or too little.
*: I know it says "patients"... but the joke is worth it.
Re: (Score:2)
It's not about "too much" or "too little", it's about decisions that make absolutely no sense.
Re:"Common" mistakes (Score:5, Informative)
UK doctors leave 722 objects inside patients in 1 year
That's actually not the fault of the doctor, except in the "it's his O.R. so anything that happens in it is his responsibility" sense.
The "circulating" tech or nurse is the non-sterile person who fetches stuff out of cabinets, opens packages, and makes notes like "opened a package of 10 sponges" (typically by making a row of checkmarks on a pre-printed form).
The "scrub" tech or nurse is the sterile-gowned-and-gloved person standing next to the surgeon who passes instruments, puts knife blades on the scalpel handles, loads the needle drivers, and keeps track of the gazillion tiny pieces to everything. There are so many removable parts because everything has to be able to be broken down into pieces small enough to clean, sterilize, and package, and part of preparing for a surgery is re-assembling all the stuff so it'll be ready if the surgeon needs it.
The circulators and scrubs work together as a team. The circulator will say stuff like "here's the 10-pack of sponges", and the scrub will relay messages like "I counted them and there are 10 sponges there" or "I opened a package of 5 needles and there are actually 5 needles". The circulator will check off "10 sponges" or "5 needles" or "bolt and wingnut for the retractor" to build a list of everything that has been opened in the room which could possibly fit inside someone.
At some point, the surgeon will say, "OK, I'm getting ready to close". At this point, "the count" begins. The circulator will ask how many needles the scrub has, and the scrub will answer (including the one that the surgeon is actively using at that moment). If the counts match, the circulator will check off "needles" and move on to sponges, or knife blades, or wingnuts, or whatever else they'd opened earlier. When they're done, the circulator will announce that the count is correct and the surgeon will finish closing, which they're already well into by this point because the count is pretty much always correct.
Except when it's not.
The biggest ass-chewing I've ever received in my life was when I was in the Navy and scrubbing for some captain and we couldn't reconcile the number of sponges. One was missing, and the presumption was that it was still inside the patient. After a few minutes of pissed-off-high-ranking-officer-screaming, they wheeled the patient out anyway and prepared to X-ray them to find the missing sponge. Ideally, everyone would stop what they're doing and stand around while we searched, but the realities of surgery are that the anesthesiologist plans the sleeping and waking cycles and you really don't want to start putting them back down into deep anesthesia or keep them down longer than absolutely necessary.
So, we tore the room apart. We moved cabinets. We dismantled the surgical table. We dumped all the trash - clean and hazardous - onto the floor to dig through it. The captain would periodically stick his head in to ask why the hell we hadn't found the f'ing sponge yet and what the hell was wrong with us and did we know whether this was a court-martial offense.
Finally, the anesthesia resident - a much lower-ranking officer fresh from med school - sheepishly asked what a sponge looked like. Turns out, one had fallen on the floor during the case and he'd "helped" us keep the room clean by throwing it in the anesthesia trash that he was responsible for.
As an enlisted person, that was the one time in my career that I actually yelled at an officer (who had the good grace to accept that he'd screwed up and had it coming to him). He went and told the surgeon what happened, X-rays were avoided, courts-martial were cancelled, and we scrubbed the room down from ceiling to floor because we'd strewn bloody trash all over the place while digging through it.
Anyway, so yeah. The counts are ultimately the responsibility of the surgeon, but the surgeon is not the person who actually does the counting - nor could they possibly be expected to without dramatically lengthening the time a patient would have to spend under anesthesia. Behind every object left inside a patient is a scrub and/or circulator who accidentally miscounted or who lied on the count sheet to hide their screwup.
Re: (Score:3)
While I agree with what you say, the problem is that bad programming is largely the result of a cultural personality shift.
Lately, I have been taking some classes on web development... yeah, I don't know Photoshop, Illustrator or Flash particularly well -- I like text editors, though Dreamweaver is growing on me. There is one person in the class who is all about shortcuts. He doesn't want to understand anything; he just wants "results." He's a businessman who runs a company that is all about outsourcing. H
Pointer typedefs (Score:5, Insightful)
Pointer typedefs were a bad idea in the 1980s. They're just terrible today. One pet peeve of mine is this:
typedef struct _FOO { int Blah; } FOO, *PFOO;
void
SomeFunction(const PFOO);
That const doesn't do what you think it does. There was never a good reason to use pointer typedefs. There is certainly no good reason to do so today. Just say no. If your coding convention disagrees, damn the coding convention.
Re: (Score:2)
People DO THAT???
Re: (Score:2)
Ever see Windows code?
Re: (Score:2)
It never does.
Re: (Score:2)
It never does.
Fundamental rules of programming -
Re: (Score:2)
Pointer typedefs were a bad idea in the 1980s. They're just terrible today. One pet peeve of mine is this: typedef struct _FOO { int Blah; } FOO, *PFOO;
void SomeFunction(const PFOO);
That const doesn't do what you think it does. There was never a good reason to use pointer typedefs. There is certainly no good reason to do so today. Just say no. If your coding convention disagrees, damn the coding convention.
Care to elaborate (on pointer typedefs and the const PFOO usage)? Honest question from someone who hasn't touched C/C++ for the last 12 years and is trying to clear the cobwebs.
Re:Pointer typedefs (Score:4, Informative)
Pointer typedefs were a bad idea in the 1980s. They're just terrible today. One pet peeve of mine is this:
typedef struct _FOO { int Blah; } FOO, *PFOO;
void
SomeFunction(const PFOO);
That const doesn't do what you think it does. There was never a good reason to use pointer typedefs. There is certainly no good reason to do so today. Just say no. If your coding convention disagrees, damn the coding convention.
Care to elaborate (on pointer typedefs and the const PFOO usage)? Honest question from someone who hasn't touched C/C++ for the last 12 years and is trying to clear the cobwebs.
The pointer is constant... not what it "points to" and the typedef "hides" that
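To make that concrete, here's a compilable sketch along the lines of the example above (the struct tag is renamed here to avoid the reserved _FOO identifier):

typedef struct FOO { int Blah; } FOO, *PFOO;

/* 'const PFOO' means 'FOO * const': the pointer itself is const,
 * but the struct it points to is freely writable. */
void SomeFunction(const PFOO p)
{
    p->Blah = 42;    /* compiles fine: the pointed-to data is NOT protected */
    /* p = NULL; */  /* this is the only thing that const actually forbids */
}

/* What a signature like that almost always intended: */
void SaferFunction(const FOO *p);    /* pointer to const FOO */

Spelling the pointer out, as in the second declaration, keeps the const where readers expect it; the typedef is precisely what hides the distinction.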
Re: (Score:3)
In short - pointer typedefs are good for the times when you really want to say "this is data of some type, maybe a pointer, maybe a
Re: (Score:2)
My pet peeve is the unnecessary use of the underscore in struct _FOO. I prefer this:
typedef struct foo { int blah; } foo;
I hate caps too, except for constants.
Best way to handle nulls? (Score:2)
He gives one example of an attempt to avoid null pointer errors in Java:
public String getFirstName(Person person) {
return person?.getName()?.getGivenName();
}
But is it a good idea to use null to mean "no value specified"? What would be better, and what are the tradeoffs? Storing 0 or ""? Storing a special (constant/static) instance object nullValue?
Re: (Score:2)
He gives one example of an attempt to avoid null pointer errors in Java next:
public String getFirstName(Person person) { return person?.getName()?.getGivenName(); }
But is it a good idea to use null to mean "no value specified"? What would be better, and what are the tradeoffs? Storing 0 or ""? Storing a special (constant/static) instance object nullValue?
I think it is useful to collect the NULLs together to deal with at a higher level - a quick way to deal with a null person, name, or given name identically.
Scala has a type "None [sanaulla.info]" that does mean no value specified. I haven't got my head around the trade-offs, pros and cons, but it seems to work nicely in case statements.
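For comparison, here's that guarded chaining hand-rolled in C, which is where most of this thread's examples live (the Person/Name types and fields below are hypothetical stand-ins for the article's):

#include <stddef.h>

typedef struct Name   { const char *given; } Name;
typedef struct Person { const Name *name;  } Person;

/* Equivalent of person?.getName()?.getGivenName(): each step
 * short-circuits to NULL, so the caller can handle the "no value"
 * case once, at a higher level. */
const char *get_first_name(const Person *person)
{
    if (person == NULL || person->name == NULL)
        return NULL;
    return person->name->given;    /* may itself be NULL */
}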
Re: (Score:2)
Anything as long as it's consistent.
And "fixing invalid values" is a completely retarded idea to begin with.
Programming Mistakes To Avoid (Score:5, Funny)
2) Perl
3) Silver bullets
4) Writing your own "framework".
5) Using somebody else's "framework".
Re: (Score:2)
Whenever you hear "framework", run like hell.
Re: (Score:2)
I don't see many problems using a well-defined, well-supported framework you are familiar with.
Of course, becoming familiar with it is an overhead that can be significant. If you are just going to use it once, maybe it is not worth the time (if it is optional). But once you get to know it, many of them are good at solving the issues they were created for...
If we follow the trend of "frameworks don't serve any purpose", we'll be back to programming in assembly soon.
Re: (Score:2)
If you need a framework, your language is not suitable - Ruby is a scripting language not really designed for web transactions; that's why Rails was invented.
If your language allows you to make any of these mistakes, ask yourself if you need the power it gives you. If you do, then it's your responsibility not to make them; if you can't/won't avoid them, use a language that manages it for you...
GOTO... (Score:4, Funny)
Is it indecent of me to reminisce on the days of olde when such a topic would simply turn into a lengthy discussion mocking BASIC programmers?
Re: (Score:2)
You won't believe how many GOTOs I see regularly in our C/C++ sources :(
Re: (Score:2)
Worst commercial code snippet I've seen looked something like:
a *= 0;
b *= 1;
Single letter variable names, and code that multiplies one variable by 0 and the other by 1 indicate the programmer was either bored or making a pathetic attempt to keep their job by obfuscating the code.
Re:GOTO... (Score:4, Insightful)
I happen to agree with the general idea of that thread - goto is powerful, even in good code, but it is easily misused to create spaghetti code. The choices then available are: remove goto from the language / never use goto, or carefully audit each use of goto to make sure it provides sufficient advantages and *doesn't* make the silly mistakes possible.
The languages that remove the things that trip up inept programmers (e.g. pointers, goto, etc.) tend to be the ones that are hardest to program for with predictable efficiency.
There's nothing wrong with goto. Just don't lob it into code without thinking about it.
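For what it's worth, the widely accepted legitimate goto in C is the single cleanup path for error handling. A minimal sketch (the function and its names are invented):

#include <stdio.h>
#include <stdlib.h>

/* Reads exactly 'size' bytes from 'path'. Every failure funnels
 * through one cleanup path instead of duplicating free()/fclose()
 * before each early return. */
int slurp_file(const char *path, size_t size)
{
    int rc = -1;
    char *buf = NULL;
    FILE *fp = fopen(path, "rb");
    if (fp == NULL)
        goto out;
    buf = malloc(size);
    if (buf == NULL)
        goto out_close;
    if (fread(buf, 1, size, fp) != size)
        goto out_free;
    rc = 0;    /* success: use 'buf' here */
out_free:
    free(buf);
out_close:
    fclose(fp);
out:
    return rc;
}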
Is this real? (Score:5, Insightful)
I've not worked as a programmer for, hmm, maybe 15 years, and all of this was known way back even before I "retired" from that line of work. Perhaps all these levels of abstraction upon abstraction make things harder to understand. Back in my day these "pitfalls" were obvious because we all (well, not all, but a lot of us) knew ASM and actually even used it regularly (even inline, *shudder*).
Someone above mentioned pointer typedefs, gave the example of typedef struct { int Blah; } FOO, *PFOO; (yes, I left off the bit before the opening brace deliberately), and then suggested that people don't know that void SomeFunction (const PFOO) {} doesn't behave as expected. Now this could, I suppose, be seen as a failure of the language. But, shit, any idiot who understands the underlying logic can see why that causes problems. Which goes back to my point that maybe all these modern levels of abstraction and getting away from the machine are, in some ways, detrimental.
Now, get off my lawn. Umm, except I don't have a lawn because I sprayed the growth inducing hormone RoundUp all over it, but that is beside the point. I think.
Re: (Score:2)
"I've not worked as a programmer for, hmm, maybe 15 years and all of this was known way back even before I "retired" from that line of work."
Yes: there's an obvious problem with programming, and it's that "we" as a guild are not building upon past experience. For the most part, the current generation of programmers is making the same kinds of mistakes that were common - and that we learnt how to avoid - even 20 years ago. Can you imagine, say, aviation if you had to engineer an Airbus 380 all the way from Otto Lili
Philosophy for Software Designers (Score:2)
I did go through the list in TFA, and while their "programming mistakes" list is sound, it's all over the place and often doesn't dig into why you should or shouldn't do a certain thing.
So I decided to put down a list of low-level principles and concerns to consider when doing software. Given my own level of experience and the fact that I'm getting tired of maintaining code by people who have never managed to cross the threshold from junior/medior software designer to senior designer, that is the target audi
Mistake Number 1 (Score:2)
public String getFirstName(Person person) {
return person?.getName()?.getGivenName();
}
WTF?
Re: (Score:2)
My two questions when reading that:
1. If there is a null pointer in there, what would the return be?
2. Would the return be any more useful than the default mode (which would be crash + burn)?
Meh it's not all programming (Score:2)
Half of these are design mistakes
Two Major Mistakes (Score:5, Funny)
My two most common mistakes:
Re: (Score:2)
Variable scoping -- in javascript especially!
I'd add, "errors in the error handling"
The PRE programming mistakes (Score:5, Insightful)
Allowing too many options / features in the design. The classic example: being unable to decide whether feature A or B is best, and ducking the issue by including them both.
Assuming 5 working-days of effort can be achieved in a working week. Conveniently forgetting about all the office overheads such as "progress" meetings, timesheet administration, interrupted work, all the other concurrent projects. Even the most efficient, single-threaded operation needs half a working-day per week just for the trivia.
Following on from that, conveniently forgetting about annual leave commitments, national holidays and the possibility of sickness. If 5 working-days per week is impractical, 12 working-months in a year is downright negligent.
The tacit assumption that testing will inevitably be followed by release - rather than bug-fixing.
Holding the end-date constant while delaying the start, or presuming that all delays in the specification, design, and approval stages can somehow be reclaimed during coding (how: by thinking faster?).
Re: (Score:2)
Just out of interest, do you work at IBM?
Avoid over engineering and over generalising (Score:3, Insightful)
The biggest programming mistakes I've had the displeasure of making, or discovering in others' code, almost always centre around one of these two problems:
1. The code is over-engineered
2. The code was abstracted before there was even a need for the abstraction.
I remember, when I was less experienced, how thrilled I'd be over code that was clever, solved many problems aside from the one I was trying to solve, and had some clear reusability built in. What a work of art, I thought... until I eventually realised that much of the extra code I had written didn't get used, and the abstracted code was never reused - or even if it was, I couldn't predict how it would be reused, and the abstraction was clumsy at best, useless at worst.
It's sad when this happens - good intentions, but the end result is a lot of waste. I'm embarrassed to look over my earlier code which is like this. I like to think I do it less now, but the temptation is always there... I'm going to need to do this later anyway... I can just abstract this bit here and reuse it some day in the future...
My advice now... Don't do it! Just wait until the reuse case comes along, or the new feature request comes along, and *then* do it. You'll know so much more about the problem domain then, and you might avoid days (weeks!) of wasted effort.
It's always been the case (Score:2)
>> 'programming may in fact be transforming into an art, one that requires a skilled hand and a creative mind to achieve a happy medium between problematic extremes.'"
It's not transforming into it; it's always been an art. And that has got nothing to do with whether it's web programming or not.
The reason that this is even news to some people is that most managers fight hard to bury that fact, because the vast majority of them are one-trick ponies that incorrectly think that everything can and should be
Don't get me started (Score:5, Insightful)
I find the given top 12 list of mistakes a bit weak - I'd be able to avoid all of these and yet write horrible code. My personal recommendation for a top 12 of programming mistakes to avoid would be:
1. Failing to check function parameters before using them: null pointers, limits, lengths, etc. This will make your program unstable and/or unpredictable.
2. Spending too little time thinking about and designing the data structure of the application. This will make you get stuck when maintaining/extending your application.
3. Following every market hype - When the marketing bubble bursts, you'll have to start over again.
4. Designing user interfaces without actually involving users - You'll be surprised how easy it is to confuse users.
5. Infinitely deeply nested if/else statements - This will make code absolutely unreadable.
6. No documentation whatsoever - Who's going to maintain your code after you change jobs?
7. Ignoring existing, universally accepted standards - you'll cause interoperability issues or be doomed to reinvent the wheel.
8. Hard-coded values/magic numbers - as a result, any change must be made in code rather than allowing power users to configure their own system.
9. Littering code with global variables - this implies statefulness of code, making it pretty near impossible to predict how a function will behave next time it is called.
10. Being unaware of the "Big O" order of your algorithms, causing code to be unnecessarily inefficient (see the example just after this list).
11. Strong platform dependency: This can shorten the lifetime of your application to whenever the next platform upgrade takes place, or keep you stuck at the current version of the current platform forever.
12. Thinking you can figure out everything by yourself - In learning by doing, experience can only follow from making mistakes. By getting yourself a mentor or an education, you can actually learn from the mistakes that thousands have made before you.
13. Stopping at 12.
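To illustrate No. 10 above: the classic accidental quadratic in C is calling strlen() in a loop condition, which rescans the whole string on every iteration. A minimal sketch:

#include <string.h>

/* O(n^2): strlen() walks the entire string on every pass. */
size_t count_spaces_slow(const char *s)
{
    size_t count = 0;
    for (size_t i = 0; i < strlen(s); i++)
        if (s[i] == ' ')
            count++;
    return count;
}

/* O(n): compute the length once, outside the loop. */
size_t count_spaces_fast(const char *s)
{
    size_t count = 0;
    size_t n = strlen(s);
    for (size_t i = 0; i < n; i++)
        if (s[i] == ' ')
            count++;
    return count;
}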
Documentation (Score:3)
When we migrated to C++ a while back, my biggest gripe became the number of projects, libraries, et al. that weren't documented. I won't name the very popular library, but when I contacted the developers (I was still new to C++ at the time), they told me to "read the headers." Your code is not documentation, no matter how well you comment your functions. There's a subculture out there that I don't get, one with the mindset that "it was hard to write, it should be hard to use" (and that's almost a direct quote from a library author). I don't know if it's job security, elitism, nepotism, or what. But with some projects there's a cold disregard (borderline hostility) towards the people who will actually be using the product.
Re: (Score:2)
Unless you're coding in machine opcodes like a previous poster suggested, you're already losing some optimization potential by relying on the compiler/interpreter to do it for you. Even in C you're relying heavily on machine generated code. So why is garbage collection so bad?
It seems to me like the structured programming debate all over again.
Re: (Score:2)
I was recently working on a horribly written .NET app that created and de-referenced tons of objects in its main loop. The net result was memory usage going from 200MB one second to 500MB the next. Based on how quickly overall system memory was freed, the .NET GC must kick in right away. I'm not talking about objects lingering around for minutes or even tens of seconds; I'm talking about just a few seconds.
If you're going to gripe about using a language that uses a GC, then it's perfectly valid
Re:#1 - Not managing the pointers and memory yours (Score:5, Informative)
#1 - If you are a programmer, BE A PROGRAMMER and manage the pointers and memory allocations yourself. Garbage collection is for little boys. Men deal with it on their own with techniques that work and are efficient.
So mega-strongly disagree, dude. Not saying you shouldn't do heavy lifting when necessary -- just that you should only do it when necessary. Don't reinvent the wheel every time. Frameworks exist that do work for you for a reason. Choose your frameworks well, understand them in depth, and you can do good things. If you "start from first principles" every time, you end up with a humongous fucking surface of new code -- which is bound to have a nasty bug or three. It comes down to choosing the best tools for the job.
#2 - Initialize all variables to known values. int i; doesn't cut it. int i=0; does.
True dat. Lots of security pitfalls here too -- not just garden-variety bugs.
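A tiny illustration of why "int i;" doesn't cut it (hypothetical function, C):

#include <stdio.h>

void example(void)
{
    int i;        /* indeterminate: reading it before assignment is
                   * undefined behavior, and on the stack it may expose
                   * whatever bytes were left there - the security angle */
    int j = 0;    /* known value: every later read is well-defined */

    printf("%d\n", j);    /* fine: prints 0 */
    /* printf("%d\n", i); -- would be undefined behavior */
    (void)i;
}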
#3 - Use descriptive variable names
So true. A corollary to that: just because a variable name is descriptive, don't make wanton assumptions about it.
#4 - you shouldn't be allowed to program anything new until you've been a maintenance programmer for a few years and seen the crap code that others puke into the world. Your crap code stinks too, BTW.
I'd modify this to say "always, always, always have a peer-review process". Junior devs are prevented from checking in crap because it gets caught by senior devs. The junior devs also learn quality habits from reviewing senior devs' code. Multiple reviewers is always a good thing. Review your design among the entire team before anyone writes a single line of code. Remember to keep security in mind when reviewing code. Use static analyzers when you're done with the "human" aspect of the review. Apply every imaginable quality bar to your code, and only check it in once it has passed scrutiny.
Re: (Score:2)
If you are a programmer, BE A PROGRAMMER and manage the pointers and memory allocations yourself
I look at the problem like this:
I've solved a fair number of sudoku puzzles in my time. You build a mental set of rules and patterns you look for, and you repeatedly apply them to the remaining squares. On a really hard puzzle you'll often reach a point where you get stuck. You keep scanning for patterns in the remaining numbers but you just can't see the one clue that causes the rest of the solution to fall out.
But a computer program? Once you tell it how to find something, it never forgets to apply a ru
Re: (Score:3)
#1 - If you are a programmer, BE A PROGRAMMER and manage the pointers and memory allocations yourself.
And output formatting. printf is for wussies. Also, networking: if you can't whip up your own application-tailored TCP stack, then you should go back to playing with VB. And GUI toolkits? TOOLKITS?!? What are you doing, building a footstool? Hell, no! Manly programmers don't use toolkits; they use the library of macros they built while apprenticing to Knuth.
You're completely right. All this namby-pamby resource management crap is for kindergartners and Excel users. Real Programmers flip bits with soldering
Re: (Score:2)
I often deliberately choose a string manipulation that involves strcpy() and even strcat(), just to make a point that those are perfectly valid and useful functions, despite some morons writing insecure code with them.
The only really unsafe function is gets().
Re: (Score:2)
I often deliberately choose a string manipulation that involves strcpy() and even strcat(), just to make a point that those are perfectly valid and useful functions, despite some morons writing insecure code with them.
The only really unsafe function is gets().
That's just stubbornness for its own sake. These are violently unsafe functions. People do manage to make asses of themselves even with so-called 'safe' string APIs, but there's no good reason not to use them. strcpy() and strcat() have pitfalls that are really esoteric, and if you keep using them you'll eventually make a mistake and end up with some absolute motherfucker of a bug with security ramifications you wouldn't even have imagined.
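For what it's worth, a sketch of the failure mode and one bounded alternative (the names and buffer size are arbitrary):

#include <stdio.h>

int main(void)
{
    const char *first = "Jonathan";
    const char *last  = "Featherstonehaugh";
    char name[16];

    /* The pitfall: strcpy/strcat copy until the terminating NUL,
     * so nothing stops the combined string overflowing name[16]:
     *   strcpy(name, first); strcat(name, " "); strcat(name, last);
     */

    /* Bounded alternative: snprintf never writes past sizeof(name)
     * and always NUL-terminates; its return value reveals truncation. */
    int n = snprintf(name, sizeof(name), "%s %s", first, last);
    if (n < 0 || (size_t)n >= sizeof(name))
        fprintf(stderr, "name truncated\n");
    printf("%s\n", name);
    return 0;
}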
Re: (Score:2)
These are violently unsafe functions.
If those are unsafe, then dereferencing a pointer or using an array is unsafe, too -- and that means, a programmer is unable to write safe code no matter what.
People do manage to make asses of themselves even with so-called 'safe' string APIs, but there's no good reason to not use them.
What?
The only kind of "safe" string handling that I am aware of is operations on strings that are combined with allocation (in an object-oriented or almost-object-oriented way). Their purpose is to simplify common operations; any "safety" is at best a side effect that shouldn't matter if the programmer is not a moron in the first place.
Strcpy()
strcpy()
Case-sensiti
Re: (Score:2)
To avoid working code of course.