Developer Argues For 'Forgotten Code Constructs' Like GOTO and Eval (techbeacon.com) 600
mikeatTB quotes TechBeacon:
Some things in the programming world are so easy to misuse that most people prefer to never use them at all. These are the programming equivalent of a flamethrower... [But] creative use of features such as goto, multiple inheritance, eval, and recursion may be just the right solution for experienced developers when used in the right situation. Is it time to resurrect these four forgotten code constructs?
The article notes that the Linux kernel uses goto statements, and links to Linus Torvalds' defense of them. ("Any if-statement is a goto. As are all structured loops...") And it points out that eval statements are supported by JavaScript, Python, PHP, and Ruby. But when the article describes recursion as "more forgotten than forbidden," it raises the inevitable question. Are you using these "forgotten code constructs" -- and should you be?
Doing it wrong? (Score:5, Insightful)
Re:Doing it wrong? (Score:5, Insightful)
I'm just as baffled by this. I wasn't aware that recursion went out of style. Just another tool in the algorithm and design pattern toolbox. Did I miss the memo that it was as taboo as GOTO?
Re: Doing it wrong? (Score:5, Interesting)
Re: Doing it wrong? (Score:5, Informative)
Re: Doing it wrong? (Score:4, Insightful)
They can make the code look neat but that also hides horrible performance bombs and security issues.
Then they're lousy abstractions. Abstractions are here to hide irrelevant details that you'd get wrong if you had to do stuff by hand identically in many places and correctly all the time. Doing something right just once is the exact opposite of having a security issue.
Regarding stack allocation, that ought to be O(1), anyway.
Re: Doing it wrong? (Score:5, Insightful)
Some algorithms are naturally recursive. For example, in-order traversal of a binary tree is most easily described as: deal with the left child, deal with this node, deal with the right child. Tower of Hanoi is easily solved with: move all disks above the one you want to move onto the spare peg, move the disk itself to the destination peg, then move the disks you set aside on top of it.
In these cases, if you use loops, you end up reimplementing everything recursion gives you for free, including maintaining your own stack. There's no advantage to doing this rather than using recursion: if the recursive version would recurse indefinitely, the loop version will push onto your hand-rolled stack indefinitely.
Re: Doing it wrong? (Score:5, Funny)
There are people who think such things are beyond the ken of mortal man. Mere programmers should use libraries and only the gods themselves write the libraries.
Re: Doing it wrong? (Score:5, Informative)
Stack frame allocation costs exactly: nothing.
So why do you care?
You fear an endless loop and running out of stack space?
Well, most compilers convert most recursions into loops anyway: it is called tail recursion optimization.
I don't remember the last time I saw recursion in production code, likely decades ago, and you care about the rare cases where a manually written loop is "better" than recursion ... wow.
Re: Doing it wrong? (Score:5, Insightful)
Correction: *some* compilers will convert *some specifically structured* tail-recursive function calls into loops.
There are lots of ways to make recursive function calls not tail-calls which renders them ineligible for compiler optimization.
Re: (Score:3)
Most modern C compilers, given certain constraints about return values.
Re: (Score:3)
I don't remember when I saw recursion the last time in production code, [...]
See, I've worked in compilers, and software that has compiler-like components (e.g. input languages which resemble programming languages in some way), for a long time, and I see recursion all the time.
Recursion is all over production code, you just don't work in those spaces.
Re: (Score:3)
Who runs out of stack space on modern desktop computer platforms anyway unless you messed up your break condition or you've made very, VERY large data structures?
People programming in languages that do not offer tail recursion optimization. For example, Guido van Rossum decided explicitly to exclude tail recursion optimization from Python because it would make the output from functions in the traceback module less clear.
Re: (Score:3)
Re: Doing it wrong? (Score:5, Funny)
Whenever recursing, have a recursion counter and a termination condition that stops infinite recursions.
Re: (Score:3)
Nope. You need the computer scientist's point of view: have a termination condition. And this is usually NOT a counter, but a condition which needs to be fulfilled, e.g. you have visited every node of a tree. This takes a million steps? Yep, then it does. Your counter is arbitrary and/or hard to calculate, and just a crutch for people who don't trust their code to have the right condition. Which is made easier by recursion, because recursive functions tend to be more understandable, as they are similar to their math equivalents.
Re: Doing it wrong? (Score:5, Insightful)
That's a termination condition: If you ever visit a node, which was visited before, stop.
You do not follow the symlink 1000 times and then abort. You follow it once, and the next time you see a "followed it" mark and stop -- without error, as successful termination of a directory traversal.
You might think such a link would be an invalid condition, but it actually is not. And it isn't even a special case for the algorithm, which has the invariant "always take a node which wasn't visited yet, until there are no such nodes".
Re: (Score:3)
A tree has nodes and leaves. A leaf is where the algorithm terminates; a node has children, which can be followed.
A search visits each node once, where links between nodes may only affect which node is visited first. When you visit a node, you mark it as visited and add the children (and links) to a queue or stack (breadth-first or depth-first search, respectively).
I do not see any need to visit a node twice, as you get no new information on the second visit.
Re: Doing it wrong? (Score:5, Insightful)
Every time you make a function call, some amount of bookkeeping data has to be stored on the stack
Not necessarily, really, these days. But yes, if your compiler is really dumb, like in the 1970s, the difference can be significant.
If you do "manual recursion", with a loop and a resizable container, then you can achieve lower overhead.
Chances are that if you can do that easily, you should have done it in the first place, and if you can't, there was a reason for that.
Re: Doing it wrong? (Score:5, Informative)
#include <stdio.h>

int recurse(int x) {
    if (x <= 0) return 42;
    return recurse(x - 1);
}

int main() {
    return printf("%d\n", recurse(6502));
}
Then we compile it, adding -S so we can look at the assembly output:
$ clang -S -O3 test.c

The compiler recognizes that the printf() call is a tail call and uses a jmp (which places nothing on the stack on x86), but it also recognizes that recurse() evaluates to a constant and simply returns that. I was going to post the assembly output for you to look at, but Slashdot said it had too many junk characters.
There is a lot to complain about in terms of efficiency in clang and gcc, but tail recursion is a well-understood problem with a lot of research behind it, and they both do it well.
Re: (Score:3, Insightful)
Elegance.
Re: (Score:2)
Uhm then you're using the wrong language.
Lua and scheme have tail call optimization.
Python is the wrong language (Score:3)
By your criteria, Python is the wrong language, and this is intentional. See Guido van Rossum's explanation: part 1 [blogspot.com.au] and part 2 [blogspot.com.au].
Re: (Score:3)
And if you're always doing a tail call, the recursion can be replaced by an iteration without too much trouble.
Well, the good thing is that the compiler can do this automatically. If you're writing a program from scratch and already know what it will do (which is often the case), these manual decisions are fine; but if you already have a working program and the desired logic changes a bit, switching between iterations and recursions by hand is much more involved than just letting the compiler figure out the changed tail-call positions.
Re: (Score:3)
Re: Doing it wrong? (Score:5, Insightful)
So you consider it easier to change perfectly running code (for what reason?) instead of fixing the compiler settings?
Sometimes, yes. The project was for a medical application, with 99% of the code written by others. I'm not going to change global compiler settings and risk exposing some optimization error, when I can simply change a few lines of my code.
Re: (Score:3)
Re: (Score:3)
Yep. Embedded people avoid recursion like the plague because they have limited resources. Sometimes desktop guys use it, and it works okay until someone opens a big file and it needs 18GB of RAM...
Re: Doing it wrong? (Score:5, Interesting)
I am an embedded software engineer.
You have to be extremely careful with recursion, because often you have a very small stack. We also try to avoid any dynamic memory allocation when possible, only using automatic variables. With some types of firmware, even automatics are not allowed, but I don't go that far.
The danger with recursion is that even if you try to limit it, if you screw up it can be very bad. It can be much worse than, say, an infinite loop. The loop can be caught by a watchdog and if designed carefully shouldn't start corrupting other stuff. Recursion that grows the stack or allocates memory could end up overwriting things and causing all kinds of unpredictable behaviour. Remember that embedded systems typically don't have any memory protection or stack guarding, and only extremely simple memory management.
Of course it does depend on the type of firmware too. My stuff often has to run for 5+ years without a reset, and is sealed so you can't just reset it if something does go wrong. If resetting is an option, you can take more risks. One option we considered was to do one reset every day by design, so that any issue which took longer than a day to emerge would never affect us.
Re: (Score:3)
Or just don't use recursion
A non-recursive algorithm to walk a tree structure still requires memory proportional to the maximum depth of the tree, and still requires more memory for an unbalanced tree. You may need less memory than a call stack, since you don't need to store the return address or saved frame pointer (the BP register on x86), but that is only a linear improvement.
Re: Doing it wrong? (Score:5, Informative)
Recursion is undesirable because it doesn't scale - you run out of stack pretty quickly. There isn't really ever any need for recursion anyway as there's nothing you can do recursively that you can't do non-recursively.
While that's basically true, as a former LISP programmer, I can attest that recursion can be simpler and more elegant to code, understand and maintain. It's really good for prototyping and proof-of-concept work, where speed and scaling may not matter. For example, coding a tree search is about 3 lines of recursive code vs. 2 pages of non-recursive code. I sometimes even use a recursive version of a function to verify the operation of a non-recursive function.
Re: (Score:3)
There are languages that do not optimise tail-recursion, such as C.
All modern C compilers optimize tail-recursion.
computing the sum of the elements of a large array is efficiently done with a for-loop, whereas recursion would use excessive stack space.
Only a total idiot would use recursion to sum an array. There are many other tasks, such as searching a tree, where a recursive algorithm makes much more sense.
Anything that can be done iteratively, can be done recursively
Anything that can be built with a hammer can be built with a rock.
Poor article? (Score:4, Informative)
Recursion is an easy way to implement solutions to a number of problems. But if you don't have a clearly finite depth then it can be dangerous. There is often a way to use a loop that doesn't pile on the stack the way recursion can.
That said, it doesn't seem like it belongs in this list.
Frankly, it doesn't seem like a great article. Yup, those things can be misused. Yup, if something can be misused, it will be. I use ruby, so I have access to at least 3/4 of these dark techs. Whatever.
Re:Poor article? (Score:5, Interesting)
Recursion is an easy way to implement solutions to a number of problems. But if you don't have a clearly finite depth then it can be dangerous.
In '88 (90?) I had a copy of Unix Sort for PC (MS-DOS), compiled with, I believe, a Lattice C compiler from LifeBoat. It worked fine but ran slow as a dog, and this was back when IBM ATs were fast. So I found the routine that did the actual in-memory sort and made it recursive. It easily ran over 5x as fast, but had the slight problem of ABENDing when it ran out of stack space, which the old version didn't.
So I fixed it: I left the recursive sort in place but did a free space stack check on entry. If there was less than 4K (4K!) left I switched to the slower non-recursive routine. I was able to keep sort speed around 4x of the original slower program but still have the program always successfully complete.
It was a simple fix, but I have to admit I was impressed with myself for implementing that.
EVERYTHING can be misused. Add meaningful comments so they are not misunderstood. Write everything for your peers and their less-experienced colleagues. If you're a genius who writes working code that no one else understands, you're not a genius. But if the person following you really is a blithering idiot, then nothing you do will help.
Re:Poor article? (Score:5, Informative)
Re: (Score:3)
Does this horrible misrepresentation mean there are coders who don't realize that they are using a recursive algorithm? This lack of understanding could be a result of how some languages hide all the details of data structure implementation. Specifically, I'm thin
Re: (Score:3)
You seem to have a poor understanding of programming.
1) Hiding implementation, including algorithm selection, is a large part of what objects are for.
2) It's possible to write any traversal algorithm using loops, without any recursion.
Re:Doing it wrong? (Score:5, Insightful)
2) It's possible to write any traversal algorithm using loops, without any recursion.
Sure, but if that requires building your own stack, you haven't really gained anything.
Re: (Score:2)
The fact is that recursion does not perform well on general-purpose silicon. The performance degradation can be so significant that modern compilers are advertised as being able to perform an optimization
Re: (Score:2)
The article is a bit light; the summary on /. is worse. The article itself specifically mentions your use case: data structures where you want to convert a list of data into a list of actions are examples where recursion is often helpful.
Re:Doing it wrong? (Score:5, Funny)
Honest question: Am I not supposed to use recursion?
It depends. See https://developers.slashdot.or... [slashdot.org]
Re:Doing it wrong? (Score:4, Funny)
In the book "C Traps and Pitfalls", on page 146 of the index it says "recursion 88, 146".
The index also contains “pitfalls, see traps” and “traps, see pitfalls.”
Re: (Score:2)
I generally advise against it.
This doesn't mean it should never be used.
But a lot of the time, it doesn't NEED to be recursive, and making it recursive complicates things.
I'll give a practical example. I worked on router firmware when I first graduated and got assigned this bug where a router in South Korea kept crashing. Really hard to debug it. After a lot of debugging, I found out it was related to the number of ACLs applied to a policy. Went through the code and the section that applies those ACLs was re
Honest answer (Score:2)
Re: Doing it wrong? (Score:3)
Read this post and its replies. It is a good explanation of the issues with recursion.
http://slashdot.org/comments.p... [slashdot.org]
Comment removed (Score:4, Informative)
Intent (Score:2)
Just keep in mind that familiarity is not the same as clarity.
Combining two of them (Score:2)
Not sure about the idea of recursion being forgotten, but I used to work on a system (GEC 4000 series) which had no stack, making recursion slightly more difficult. The neat way of achieving it was by using a goto, back to the beginning of the function.
Obvious answer (Score:4, Insightful)
It depends.
In some cases, you want to allow goto statements, for instance because they help manage failure handling without adding condition or exception constructs.
In some case, you want none of these gotos, because you are using processes or tools which are (partly or entirely) not compatible with them, and you need these tools to work more than you need gotos.
In some cases, you don't want recursion because the context does not favor it (think embedded SW with restricted stack size).
In some cases you want recursion because it makes code simpler and closer to the principles behind it, thus more maintainable.
In some cases, you want class-like constructs in C but don't want C++, because the legacy code, the people involved, the time allotted, or the general context just does not allow rewriting the whole thing.
Etc.
Re: (Score:2)
I don't agree with you. I use recursion all the time on infinite data structures, where obviously the inner recursion can't impose a limit itself. That has to be handled by the outer loops creating the evaluation context.
Some things are better left alone (Score:2)
Do what you want with your GOTOs, but do not bring ALTER back.
I know what will happen one day. (Score:5, Insightful)
Re:I know what will happen one day. (Score:4, Interesting)
That already happened; apparently some people now feel inheritance is bad. I've seen a few of their arguments (rants, really), and it seems to boil down to "you can get confused", "some inheritance trees are too deep", and a whole bunch of irrational ranting besides.
I do agree that we've(*) suffered from an overload of policy factory manager producer singletons (which seems to be an important part of many of those rants), but actually inheritance is a tool that serves me well in many, many cases. It's certainly way better than having type-specific switch statements all over the place...
(*) I say "we", but actually it seems like a fairly typical Java affliction, more than a general OO thing...
Re:I know what will happen one day. (Score:5, Insightful)
IMO class inheritance is useless. Interfaces and properties are a good idea though.
Re: (Score:3)
As someone who has worked with complicated C++ software for something like fifteen years, inheritance is very valuable. It keeps some of our core concepts understandable. It can clearly be overused, of course, but that's true of all language constructs.
Re: (Score:3)
Beware of arguments that seem rational and are not based on experience.
What the article says (Score:5, Interesting)
The article talks about 4 features: goto, eval (run code from a string), multiple inheritance, and recursion. It discusses why the 4 get attacked by simplicity advocates:
goto -- incomprehensible logic in programs
eval -- security risks
multiple inheritance -- breaks single responsibility, since one module can have subtle impacts on how other modules act in this context
recursion -- the article isn't clear, though the comments above are mostly correct. In languages without tail calls, recursion usually creates algorithms that are O(n) in memory. Even in languages with tail-call optimization this can happen (and in fact in those languages, because more complex recursions are encouraged, O(n^2) isn't uncommon when recursion isn't used carefully / well understood).
It then mentions that these things should be used to avoid complexity in certain situations.
goto -- error handling
multiple inheritance -- is generally too useful to give up. implement with interfaces and be careful
eval -- JSON, HTML, math...
recursion -- trees, some list algorithms... recommend to implement imperative style mostly though (article assumes the language can't handle recursion)
Now my opinion:
Recursion is obviously the best understood of the 4. It is easily provable that there exist recursive algorithms which are both important and not implementable as plain loops. Recursion classification is still an active research problem. Most imperative programmers don't even bother to think deeply about their algorithms, and not using the design patterns that come from recursive features means the same bugs are introduced over and over again in code. IMHO there is no reason not to abstract loops away using built-in functional design patterns.
Multiple inheritance is too powerful to give up. Java was wrong here. Better safety than the C++ style seems to be needed though. For OO languages this should be an active area of experimentation.
goto is today rarely used and when it is it often avoids complexity. I think we hit the right level of compromise here decades ago and this is a dead issue.
Eval: I think history has shown that without explicit evals, developers end up creating implicit evals where the code acts in complex ways on input. The code/data duality is not dead; complex evaluation of input and layering aren't going away. Perl's concept of taint checking is likely the best approach: make it explicit and let the compiler check for accidental security risks.
Re: (Score:3)
This is recursion theory. A tail-recursive function f is expressible in loop form f' and vice versa (https://en.wikipedia.org/wiki/Tail_call#Relation_to_while_construct). Given any simple recursion that's not tail-recursive, you can use an explicit stack. So for example quicksort is much easier to implement recursively, but it can be implemented with an explicit stack to make it into a loop. To get something that you can't avoid, you need something for which the complexity / size of the explicit stack can't be bounded.
Re: (Score:3)
As for multiple inheritance being bonkers. My car is a Honda and a vehicle registered in NJ. That's two hierarchies.
That's not an argument that multiple inheritance is needed, that's an argument that subtyping rarely fits real-world problem domains.
Avoid as a rule, apply with good reasoning (Score:2)
Any "dirty" style should be avoided as a rule and applied with good reasoning.
My guilty pleasure is to use multiple return statements. The reasoning behind this is that it sometimes makes code more readable: less nesting, fewer methods. Translating "avoid multiple return statements" to loops would mean avoiding break and continue statements.
If one is self-critical enough, code will get cleaned up eventually. Cleanly avoiding multiple return statements - and too many break and continue statements -
Re: (Score:3)
There's absolutely nothing wrong with multiple returns, continue and breaks. Don't let some purist that got this added to your favourite "code checker" tool fool you that just because the rule is there, it must be good.
Goto has its uses (Score:3)
goto is useful in some situations (Score:3)
GOTO is useful ... when you have nothing else (Score:3)
In my opinion there are valid uses of GOTO where it could be used, but nowhere else, especially in C.
Modern languages should preferably have constructs especially for those cases so that you wouldn't have to use GOTO, but many languages don't.
The article already mentions error handling in languages without RAII or exceptions, where resources have to be deallocated. Note also that not all resources are objects on the heap: The function may need to close a file or a network connection etc.
I find goto most useful for breaking out of loops. Many languages have constructs especially for breaking out of nested loops to a scope that encloses the enclosing loop, but there is one class of loop that is rarely supported: loops for looking up an item in a data structure, where you want to take a different code path when an item has been found than when the loop has run its course without a result.
The only major language I know of that supports this with its own construct is Python, in the form: for: ... break ... else: ..., where the else-block is taken only if you did not break out of the loop.
In languages that allow breaking out of nested loops you could emulate that by enclosing the loop within an outer loop and breaking out of that, but IMHO that would be even less readable than using GOTOs.
Some of these things are not like the others (Score:3)
Goto: A way to enable lousy programmers to write impenetrable code. Are there extremely unusual circumstances, where a superstar might use a Goto in a good way? Yes, but the price - encouraging use by the incompetent - is not worth it.
Multiple inheritance: Middle ground. In a few circumstances useful, but the conceptual complexity is too high for many programmers. On the other hand, those will not be the ones designing your architecture. Mixed feelings about this one.
Recursion: Many algorithms can be implemented more cleanly with recursion than with iteration. If recursion were better supported, it would be more widely used. Unfortunately, the most widely used languages have poor implementations (C# and Java, to name two), making recursion horribly inefficient. Optimizing for tail-recursion is not hard (Scala does it on the JVM), so it's weird that this isn't done in all modern languages.
Re:Some of these things are not like the others (Score:4, Insightful)
goto is vital to safe C code. You want to be able to jump to your clean-up code from each place something might go wrong. The alternative is to add another layer of indentation under an "if" for each place something might go wrong, the stuff of nightmares.
Re:Some of these things are not like the others (Score:4, Informative)
"I don't let lousy programmers touch my code. Problem solved."
Nice thought, but that's not real life. As a cynical estimate, at least half of the people working as programmers are lousy. Companies hire them because they're cheap, or because the company can't find anyone better, or because the company has no clue about programmer quality. There's more code to write than there are good programmers to write it, and that's not going to change any time soon.
You're probably using GOTO every day (Score:3)
Re: (Score:3, Insightful)
Goto is considered harmful for humans. Who cares what happens under the covers. Hot liquids are harmful for humans. The fact that there are hot liquids under the covers when you drive your car is irrelevant.
Stay away from the Knives, and the Stove too! (Score:3, Insightful)
I'm tired of this mentality. I'd rather we favored those with skill rather than those with a lack of it. We as a people would go much farther.
Do you cripple the use of bicycles by forcing everyone to ride with training wheels? Or do we in fact favor those who can ride, and instead burden the newcomer with the difficulties of obtaining and installing training wheels on very poor low-end bicycles?
Why should coding be any different? Sometimes people craft very complex and difficult pieces of software that tie together more than 20 libraries, all of which have their own quirks. I need the ability to share raw pointers, I need the ability to avoid ref-counting or shared_ptrs. I need to sometimes work with systems that have their own scheduler (Erlang, cough) and then bind C libraries into that ecosystem which doesn't allow blocking for more than 1ms. So I need crazy thread logic sometimes, and odd code to support linking two separate mutex idioms from 2 different libraries so the lock works across the boundary....
Sometimes I just wish to be left alone in a complex space where another soul's mere presence is essentially proof of their abilities and understanding of logic. Similar to how adults sometimes wish to leave behind children and mingle only with other mature adults, I desire this of a programming language. Something to scare away all the posers and poor misguided (but righteous and well meaning) individuals.... It's not elitist thinking just like Adults aren't really being rude when trying to mingle with other like-minded adults.... it's more of a time-saver for people who find that many "adults" are actually children in disguise and only after 30 minutes of talking can you determine they are fake. I grow tired of wasting my time and eventually wish to move to a place where it's harder for the fake to blend in. It was amazing going from finding 1-2 good people every 50 to instead finding 1-2 good every 10.
For me that language has been C++, and simply put it's the most amazing thing I've ever discovered in my programming career. I also love how everyone is still scared to death of it and clamoring for its deprecation while simultaneously using programs written in C++ to post these complaints.
The Recursion Cult (Score:3)
I think some time in the misty past (1970s?) recursion went through a fad phase, when it was hailed as the solution to every programming woe, not to mention the secret key to artificial intelligence. I can remember studying Logo (which is a variant of LISP) at one time. Logo composed every function call recursively: when it hit a keyword that required arguments, it would put that on hold and go looking for those arguments, some of which might be keywords that required their own arguments, etc. That's not unique among programming languages, but the syntax provided no clues or organization: no parentheses, no brackets, no braces, just a string of words, and the only way to figure out which was an argument to what was if you already knew (or stopped to look up!) how many arguments each word took. But supposedly you wouldn't need help reading it, because it's recursive, and recursion is wonderful magic.
Incidentally, Forth suffered from a similar readability problem, but at least it executed way way way faster.
The other thing I remember about Logo and recursion was the textbooks and tutorials trying to teach me how every loop could be done using recursion -- and should be! Why would you do that? Because it's the Logo Way, of course. And because recursion is wonderful magic.
It was overly complex and inefficient, to be sure. However. . . I happily use recursion for actually recursive tasks, such as traversing various kinds of tree structures.
GOTO and the Wild West (Score:3)
To understand the revulsion some hold toward GOTO, you have to mentally turn back the clock to a time when it was used for almost everything. Back in the wild west days of computing, there were no conventions for organizing program code. There was no Structured Programming. Early languages provided simple branching tools (like IF-GOTO) but no guidance. A good programmer would soon figure out his own way of organizing his code, and he could become quite productive. The problem was, everyone had their own individual, eccentric methods, and looking at somebody else's code was often confusing. Then structured programming came along, and it provided (or, some might say, imposed at sword point) a common organizational methodology and a common vocabulary. Two programmers who were trained in the doctrine of structured programming could read one another's code much more easily.
If you see the keywords and indentation of a WHILE-REPEAT loop, or a REPEAT-UNTIL loop, or an IF-THEN-ELSE condition, then you already have a clue, you already have a starting point to understand what the code is doing. If you see GOTO, then it communicates almost nothing. Then you have to look at the context. There may also be some code comments. It may not be a problem, and in today's environments there's no reason why it should be. This isn't the wild west anymore, and we don't use GOTO for everything. If it's there, somebody presumably had some reason for it.
Sorry, but "creative use" of any feature ... (Score:3)
(And I'm not saying 'goto' is a bad thing. Using it uncreatively to break out of multiple nested loops or do error handling is easier to understand than the alternative. Also, in just about every programming language, there are pretty much always several ways to achieve a certain behavior. The one that is easiest to understand should be chosen unless there are pressing reasons for one of the other ways.)
Disregard my rant about maintainability if you code one-shot things that no one - including you - will look at again once you're done.
GOTO is dead. We live in the COMEFROM age. (Score:4, Insightful)
But the entire modern GUI API is based on "event driven" programming, replete with "OnRightButtonDown()", "OnWindowClose()", and so on. These are nothing but COMEFROM statements. COMEFROM could be as harmful or even more harmful than GOTO. With a good design based on valid state machines and object-oriented code, we not only handle these with ease, we are successfully developing incredibly complex code.
So, no. We did not forget GOTO just because some authority figure railed against it. We replaced it with better concepts like the event loop, event dispatching, and object orientation.
Re: (Score:2)
I use JMP's if I'm messing around on some old micros, does that count?
Re: (Score:2)
People seem happy to hate on goto while using other things that are goto in all but name (break, continue, return, throw). Throw/catch is particularly useful as a non-local goto in places where you want to catch some kind of condition (usually an error) that can be detected in lots of places. C++ RAII eliminates most of the need for local gotos though - you can make local classes with destructors for cleanup, with the added benefit that it's exception-safe.
Re:Recursion is dead! (Score:5, Interesting)
Just like Linus, you seem to fail to understand the problem. Dijkstra argued against *unstructured* jumping around, since this made programs very hard to understand (look up some source from that era to get an idea of what he was arguing against: it wasn't just a single goto here or there, it was using goto for everything we now use structured constructs for, like loops, switch statements, etc.). Dijkstra argued for replacing those gotos with structured jumps as much as possible. And guess what? By and large, the software world has done so, and become much better for it.
I very much doubt he meant for his statement to become dogma in the way it has, and he certainly wasn't arguing for the complete removal of all forms of flow control, structured or not (as you and Linus seem to think). Goto, like everything else, is a tool. It has its place. You should not use it if a better tool is available, but you should also feel free to use it if it is the best you have. And the fact that assembly _only_ has goto is immaterial. The whole point is to allow reasoning over the language in the language itself.
Dijkstra always struck me as a sensible, practical man. He wrote about an argument he had about driving printers. In his era, printers could only accept a character once every so often (because they were slow, mechanical beasts, without much in the way of buffering), so his colleagues wanted to intersperse printing code with other processing. Dijkstra didn't like this, and wanted to print using an interrupt that would signal when the printer needed a new character. His colleagues fought against that: not only were interrupts more costly than just interleaving printer output with normal code, but Dijkstra was 'throwing away' valuable information about printer timing that could be used to improve efficiency!
His colleagues were, of course, completely right - right up until the moment when the hardware changed, and their programs no longer worked, that is...
Re: (Score:2)
> I very much doubt he meant for his statement to become dogma in the way it has
The original title Dijkstra gave his paper is 'A Case Against the Goto Statement'. It was a CACM editor who came up with the Considered Harmful version.
Re: (Score:2)
You've missed my point. The people I have a problem with are people who will avoid a goto at all costs, even doing things like this so they can say they didn't use a goto:
do {
    ...
    if (exit_cond) break;
    ...
} while (0);
I really have seen that done in a popular piece of open source software. If a goto really is the cleanest way to achieve what you need to do, then just use a goto. Don't write code that does a goto while using some convoluted means to avoid the goto keyword.
That s
Re: (Score:2)
How is goto return better than just return?
Re: (Score:3, Informative)
Also forgot to add to this, memory cleanup is another big one. Instead of having to free in every single possible 'if error return' block, you can have it always do a check and free at the goto label.
Re:Recursion is dead! (Score:4, Interesting)
History lesson.
https://en.wikipedia.org/wiki/Considered_harmful
"GOTO considered harmful" arose out of a generation of BASIC programmers who knew only too well how horrible GOTOs were to deal with. Early micros had RENUM so you could move line numbers around and attempt to preserve GOTOs. They were awful, but that was on 8-bit micros.
Later, C used them as local jumps to a LABEL:, which wasn't even remotely as bad as BASIC. But everyone was allergic to GOTO from BASIC, so the whole idea got canned along with it - baby out with the bathwater. This is why we say "GOTO considered harmful, considered harmful": the idea that a code construct is so repulsive that we've condemned it to never be used again.
GOTO is useful. Certain forms of C exception handling code benefit from GOTO immensely. They make the code both more readable and more performant. Unfortunately we can't submit this code because in a code review...."GOTO considered harmful" circa 1990. Brainless dogma has won over thought. I've seen generations of programmers that would never consider GOTO to be a valid keyword. They won't consider it on the basis of a decades-old argument that was meant for a different language in a different age. As much as I might be right I won't pass a code review, so I don't use it.
Re: (Score:2)
Yes, goto is useful for exception handling, in languages that lack specific exception handling constructs, but have goto. This is a good reason to use goto in such languages, but I think it ultimately points to a deficiency in the language, rather than an argument in favour of goto as a language construct.
IMHO, to argue for goto as a language construct, it is not enough to argue that there are a small number of specific
Re:Recursion is dead! (Score:4, Insightful)
You should read "GOTO considered harmful" before you bash it.
"Most programmers have heard the adage "Never use goto statements", but few of today's computer science students have the benefit of the historical context in which Dijkstra made his declaration against them. Modern programming dogma has embraced the myth that the goto statement is evil, but it is enlightening to read the original tract and realize that this dogmatic belief entirely misses the point."
http://david.tribble.com/text/... [tribble.com]
In the bad old days, all you had was goto, and every program looked like spaghetti. Now that we have if...then...else, loops, switch-case statements,
goto should only be used as a last resort (and every use should be justified). I've been a professional programmer for twenty years; last year I used goto *twice*.
And never forget https://xkcd.com/292/ [xkcd.com]
Re:Recursion is dead! (Score:5, Informative)
I was there.
Circa 1980, GOTOs in early BASIC and also 6502 Assembly were appropriately used to maximize the limited resources of early desktop computers. A particularly elegant technique on the Apple II was to POKE instruction codes into the keyboard buffer and GOTO it (the Lamb technique IIRC). While the KB buffer was only something like 128 bytes, it was long enough that a GOTO to a computed destination could be built in it and, wowsa, suddenly Applesoft BASIC had a very powerful CASE emulation.
Naked GOTOs were no longer needed when disk drives replaced tape drives, and RAM grew from 4, 8, or 16 kilobytes to the incredible size of 640 kilobytes. We still used GOTOs that were clothed within Structured Programming constructs (IF-THEN, DO-UNTIL, WHILE-DO, etc) but those were tamed GOTOs. The wild, naked GOTOs became much more rare and good programmers charged with maintaining legacy software would savagely hunt them down and destroy them.
Meanwhile, Gee-Whiz BASIC (arguably the only really good thing to ever come out of Microsoft) let us replace line numbers with labels and brought about the Business BASIC revolution circa 1985.
Dijkstra first used the phrase "GOTO considered harmful" in 1968, only 3 years after BASIC was written and about 7 years before BASIC was widely used (the costs associated with moving from Big Iron using centralized card and tape readers to minicomputers with networks of remote terminals slowed BASIC's adoption.) He was talking about FORTRAN and COBOL practices. His work was part of the slowly dawning recognition that it was not sufficient to write a program that solved the problem; that you also had to write it in such a way that you could maintain it or repurpose it next month or next year. That was the dawning of what became known as structured programming practice.
Bringing this back to the present, using recursion makes a great deal of sense when time to production, long term costs of code maintenance, or repurposing are things that need to be considered.
Obviously if the code is one-off throw-away, like a tool that will be used in converting the accounting system database from warehouse inventory to just in time purchasing, then maintenance is not a consideration but neither is efficiency. Slap together whatever will work and get on to something else asap; don't take time to rework a recursion into something faster or more robust unless the software breaks on a pre-production trial run. And then look for a quick and dirty fix.
But if the code is likely to still be in use five years in the future, then write it so the poor bastard whose got to maintain it can understand it as quickly as possible. That could well mean using recursion. The same goes if chunks of the code might be re-used in some other way, say for example taking chunks from an inventory application to build a library system for maintenance manuals.
Also keep in mind that today's hardware limitations will not apply to tomorrow's problems. It is perfectly acceptable to use a recursion that you know will fail on the 20th iteration if you also are assured that there will never be a need for more than 19 iterations in the next 5 years. In other words, don't waste yourself trying to fix tomorrow's problem, which may no longer be a problem when tomorrow rolls around.
Re: (Score:3, Insightful)
How is goto return better than just return?
In C in particular, which is the ONLY place I'll use goto... i might have a pattern like something like...
{
a = malloc(something)
if malloc failed goto e1
b = malloc(something2)
if malloc failed goto e2
c = malloc(onemorething)
if malloc failed goto e3
open a file... if error happened goto e3
some other error happened goto e3
e3:
free (a)
e2:
free(b)
e3:
return;
}
The goto sequence cleanly handles the memory free. Obviously you wouldn't want to just return after t
Re: (Score:2)
char *a = NULL;
a = malloc(something);
if (a) free(a);
and so on.
Re: (Score:3)
The point is that you have several mallocs to deal with, and more than one error condition (not just failing the malloc itself).
So if you make 3 mallocs, and then there is an error opening the file... you still have to free all 3 mallocs.
the use of 3 if(x) free(x) instead of multiple goto labels is fine, but you still need goto to get to the cleanup block from the various error points.
Re: (Score:2)
Gotos in C are like operator overloading in C++. It's a handy mechanism when used judiciously, in very specific circumstances. If GOTO is abused, used to jump all over the place (which it can literally do), yeah, of course it's bad. But then again, you can always name a function:
int AddTwoNumbers(int x, int y);
when it actually multiplies two numbers. Does this mean named functions are bad, because they could be misused? Why would you purposefully abuse the language like that?
Generally speaking, the rul
Re: (Score:3)
{
    char *a = NULL, *b = NULL, *c = NULL;
    FILE *f = NULL;

    if (!(a = malloc(something)))
        goto fail;
    if (!(b = malloc(something)))
        goto fail;
    if (!(c = malloc(something)))
        goto fail;
    if (!(f = fopen(path, "r")))
        goto fail;

    /* some code */

    return 0;

fail:
    if (f)
        fclose(f);
    free(c);
    free(b);
    free(a);
    return -1;
}
FTFY. mkay?
Iffy resources (Score:3)
Here's a much better pattern for you, in a c-ish form:
How to manage iffy resources in a structured manner [ourtimelines.com]
You can generalize that pattern into almost any situation and it will work well. If you need details, then instead of true/false, pass can be a value or a bit mask, etc. and then the check at the end can be verbose about what exactly went wrong. Essentially still the same pattern.
Re: (Score:2)
lol. on the upside it won't compile so no one will ever suffer them. :p
Re:Recursion is dead! (Score:4, Informative)
It gives you a chance to unwind something (like taking a lock) before the return.
Re: (Score:2)
Well yeah, the complex stuff should be in user land, as with Minix. But C is just an assembler language. There can't be many things you need to do in a kernel which can't be done in C.
Re: (Score:2)
You aren't going to win that debate easily. You are arguing for micro kernels. And of course you are right that micro kernels do allow for extensions much more easily. The problem is that they have notoriously horrible performance for operations that repeat too frequently because the wrapping and unwrapping of calls gets too expensive. Same thing that happens in user code at a high level. Linear efficiency matters. Speeding up every computer on the planet 20% is worth a lot of hassle to OS developer
Re: (Score:2)
Re: (Score:2)
The whole point of recursion is to use the stack. If not, some sort of while loop is probably called for.
Re: (Score:2)
You are wrong here. The whole point of recursion is to abstract away the details of how a loop executes and focus on what the loop does. Then later you use a recursion design pattern to abstract away the naive recursion. In a looping structure you end up having to decide immediately on how to handle boundary cases.
Most languages that handle recursion well just translate most simple recursions into iterative loops during compilation anyway.
Re: (Score:2)
I believe you are missing the greater-than character, which also looks a bit like a letter "V" with the point facing to the right. Here's one now: >
Oh, and the last time I worked with an 80-column screen was probably 30 years ago. My current screen comfortably displays at least double that number. I really don't see why we stick with that tired old convention.
Re: (Score:2)
That's not a reason to outlaw their use. If all you have are mediocre developers then no amount of coding standards will help you.
Less experienced developers can learn from good code. Sure they will get it wrong as they find their feet - we have code reviews to deal with that - but we shouldn't seek to handicap them by taking away tools they can use to express themselves.
Re: (Score:3)
It is rarely "the right thing," typically because it is super inefficient.
In compiling implementations of programming languages, it's anything but inefficient.
But my biggest concern is that it is a huge security hole, especially when the expression to be evaluated comes from the user.
One of my favourite quotes in programming: "If you don't want to do something, just don't do that."
at which point you might as well write the mini-interpreter you really need,
Will you also rewrite the compiler, and the memory manager, and write a set of interfaces between your two separate kingdoms?