
Old-School Coding Techniques You May Not Miss

CWmike writes "Despite its complexity, the software development process has gotten better over the years. 'Mature' programmers remember manual intervention and hand-tuning. Today's dev tools automatically perform complex functions that once had to be written explicitly. And most developers are glad of it. Yet, young whippersnappers may not even be aware that we old fogies had to do these things manually. Esther Schindler asked several longtime developers for their top old-school programming headaches and added many of her own to boot. Working with punch cards? Hungarian notation?"
This discussion has been archived. No new comments can be posted.

Old-School Coding Techniques You May Not Miss

  • Some, not all... (Score:5, Insightful)

    by bsDaemon ( 87307 ) on Thursday April 30, 2009 @12:17AM (#27768223)

    Some of those are obnoxious, and it's good to see them gone. Others, not so much. For instance, sorting/searching algorithms, data structures, etc. Don't they still make you code these things in school? Isn't it good to know how they work and why?

    On the other hand, yeah... fuck punch cards.

  • by NotQuiteReal ( 608241 ) on Thursday April 30, 2009 @12:17AM (#27768229) Journal
    Heh, I had to turn in a punched card assignment in college (probably the last year THAT was ever required)... but I was smart enough to use an interactive CRT session to debug everything first... then simply send the corrected program to the card punch.

    I was an early adopter of the "let the machine do as much work as possible" school of thought.
  • Hungarian Notation (Score:2, Insightful)

    by masdog ( 794316 ) <{moc.liamg} {ta} {godsam}> on Thursday April 30, 2009 @12:18AM (#27768239)

    I don't get what the big deal is with Hungarian Notation. Why do people consider it a bad thing?

    Modern IDEs might reduce the need for it, but not everyone uses an IDE to read or write code.

  • by AuMatar ( 183847 ) on Thursday April 30, 2009 @12:27AM (#27768279)

    It's absolutely essential to know how those work and why. If not, you'll use the wrong one and send your performance right down the crapper. While you shouldn't have to code one from scratch anymore, any programmer who can't do a list, hash table, bubble sort, or btree at the drop of a hat ought to be kicked out of the industry.

  • What a retard! (Score:5, Insightful)

    by Alex Belits ( 437 ) * on Thursday April 30, 2009 @12:30AM (#27768297) Homepage

    First of all, most of the actual practices mentioned are alive and well today -- it's just that most programmers don't have to care about them because someone else already did the work. And some (systems and library developers) actually specialize in doing just those things. Just recently I had a project that almost entirely consisted of x86 assembly (though at least 80% of it was in assembly because it was based on very old code -- similar projects started now would be mostly in C).

    Second, things like spaghetti code and Hungarian notation are not "old"; they were just as stupid 20 years ago as they are now. There never was a shortage of stupidity, and I don't expect one any time soon.

  • by Brett Buck ( 811747 ) on Thursday April 30, 2009 @12:31AM (#27768307)

    Actually, the worst spaghetti code I have ever seen (in 30+ years, most of it in life-critical systems) is OO C++. It doesn't have to be that way, but I have seen examples that would embarrass the most hackish FORTRAN programmers.

    I am alarmed at the religious fervor and non-functional dogma associated with modern programming practices. Even GOTOs have good applications -- yes, you can always come up with some other way of doing it, but why, and with how much extra futzing? But it's heresy.

    Brett

  • by Dunx ( 23729 ) on Thursday April 30, 2009 @12:32AM (#27768321) Homepage

    Hungarian notation is bad because you are encoding type and scope information into the name, which makes it harder to change things later.

    The fact that it is also one of the ugliest naming conventions is merely a secondary issue.

  • by AuMatar ( 183847 ) on Thursday April 30, 2009 @12:33AM (#27768323)

    Three reasons.

    1) Variables change type, and then you have to rename everything. It's a pain.
    2) The extra information it gives you is minimal. I want to know what data is in a variable, not the language type used to hold it. If the name of the variable is firstName, I don't need it to be called lpcstrzFirstName; I know it's a string. And the language type is rarely interesting -- I want to know that the variable outsideTemp holds degrees Fahrenheit, not that it's an integer. But Hungarian doesn't tell me that. (It also doesn't work even if I make a typedef for temperature -- it'll still start with 'i'; a short sketch of this point follows after this list.)
    3) It makes searching the code for a variable that much more annoying, because they all start with freaking 'i' and 'p'.
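
    A minimal C sketch of the typedef point in (2), with hypothetical names -- the Systems Hungarian prefix records the machine representation, which the compiler already knows, rather than what the value means:

    #include <stdio.h>

    typedef int temperature_f;  /* degrees Fahrenheit, but still just an int underneath */

    int main(void)
    {
        /* The 'i' prefix tells you "int", not "this is an outside temperature in Fahrenheit". */
        int iOutsideTemp = 72;

        /* Even with the typedef, the convention still says 'i': the prefix
           tracks the representation, not the meaning. */
        temperature_f iRoomTemp = 68;

        /* A plain descriptive name carries the information that actually matters. */
        temperature_f outsideTempF = 72;

        printf("%d %d %d\n", iOutsideTemp, iRoomTemp, outsideTempF);
        return 0;
    }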

  • by snookums ( 48954 ) on Thursday April 30, 2009 @12:34AM (#27768335)

    Really it has nothing to do with IDEs, but more with compilers, good coding practices, and OO principles. A few cons:

    • The code should be simple enough that you can easily track a variable from declaration through use, or imply the type from the context and name.
    • Since most (all?) compilers and interpreters ignore the Hungarian prefix, there's no way of knowing that iFoo is really an integer. This is particularly true of weakly typed languages that are popular in a lot of modern programming environments.
    • In a large OO project you might have hundreds of types. Creating meaningful prefixes for all of them is going to be next to impossible, and having obj at the front of everything is redundant.

    For a succinct summary: Hungarian Notation Considered Harmful [erngui.com]

  • Yes, I'm old (Score:5, Insightful)

    by QuantumG ( 50515 ) * <qg@biodome.org> on Thursday April 30, 2009 @12:42AM (#27768381) Homepage Journal

    * Sorting algorithms

    If you don't know them, you're not a programmer. If you don't ever implement them, you're likely shipping more library code than application code.

    * Creating your own GUIs

    Umm.. well actually..

    * GO TO and spaghetti code

    goto is considered harmful, but that doesn't mean it isn't useful. Spaghetti code, yeah, that's the norm.

    * Manual multithreading

    All the time. select() is your friend, learn it. (A minimal sketch follows at the end of this comment.)

    * Self-modifying code

    Yup, I actually write asm code.. plus he mentions "modifying the code while it's running".. if you can't do that, you shouldn't be wielding a debugger, edit and continue, my ass.

    * Memory management

    Yeah, garbage collection is cheap and ubiquitous, and I'm one of the few people that has used C++ garbage collection libraries in serious projects.. that said, I've written my own implementations of malloc/free/realloc and gotten better memory performance. It's what real programmers do to make 64 gig of RAM enough for anyone.

    * Working with punch cards

    Meh, I'm not that old. But when I was a kid I wrote a lot of:

    100 DATA 96,72,34,87,232,37,49,82,35,47,236,71,231,234,207,102,37,85,43,78,45,26,58,35,3
    110 DATA 32,154,136,72,131,134,207,102,37,185,43,78,45,26,58,35,3,82,207,34,78,23,68,127

    on the C64.

    * Math and date conversions

    Every day.

    * Hungarian notation

    Every day. How about we throw in some reverse polish notation too.. get a Polka going.

    * Making code run faster

    Every fucking day. If you don't do this then you're a dweeb who might as well be coding in php.

    * Being patient

    "Hey, we had a crash 42 hours into the run, can you take a look?"
    "Sure, it'll take me about 120 hours to get to it with a debug build."

  • Memory Management (Score:3, Insightful)

    by Rob Riepel ( 30303 ) on Thursday April 30, 2009 @12:49AM (#27768407)

    Try overlays...

    Back in the day we had to do all the memory management by hand. Programs (FORTRAN) had a basic main "kernel" that controlled the overall flow, and we grouped subprograms (subroutines and functions) into "overlays" that were swapped in as needed. I spent hours grouping subprograms into roughly equal-sized chunks just to fit into core, all the while trying to minimize the number of swaps necessary. All the data was stored in huge COMMON blocks so it was available to the subprograms in every overlay. You'd be fired if you produced such code today.

    Virtual memory is more valuable than full screen editors and garbage collection is just icing on a very tall layer cake...

  • by Nakoruru ( 199332 ) on Thursday April 30, 2009 @12:52AM (#27768435)

    You will never find a programming language that frees you from the burden of clarifying your thoughts.

    http://www.xkcd.com/568/

  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Thursday April 30, 2009 @12:58AM (#27768459) Journal

    any programmer who can't do a list, hash table, bubble sort, or btree at the drop of a hat ought to be kicked out of the industry.

    Why?

    Lists, hash tables, and sorting are already built into many languages, including my language of choice. The rest, I can easily find in a library.

    When performance starts to matter, and my profiling tool indicates that the sorting algorithm is to blame, then I'll consider using an alternate algorithm. But even then, there's a fair chance I'll leave it alone and buy more hardware -- see, the built-in sorting algorithm is in C. Therefore, to beat it, it has to be really inappropriate, or I have to also write that algorithm in C.

    It's far more important that I know the performance quirks of my language of choice -- for instance, string interpolation is faster than any sort of string concatenator, which is faster than straight-up string concatenation ('foo' + 'bar').

    And it's far more important that I know when to optimize.

    Now, any programmer who couldn't do these at all should be kicked out of the industry. I could very likely code one quickly from the Wikipedia article on the subject. But by and large, the article is right -- there's a vast majority of places where these just don't matter anymore.

    Not that there's nowhere they matter at all -- there are still places where asm is required. They're just a tiny minority these days.

  • by AuMatar ( 183847 ) on Thursday April 30, 2009 @01:03AM (#27768493)

    Because they're dead simple, and if you don't know how they work you won't write good code. I didn't say you had to do so regularly (or ever, after college), but you need to be capable of it. If you aren't, you're not qualified to program. Period.

  • by Anonymous Coward on Thursday April 30, 2009 @01:19AM (#27768583)

    Lemme just review your *ahem* "arguments":

    Because they're dead simple

    Your first key point is that programmers must understand them because they're simple??? Um...

    and if you don't know how they work you won't write good code.

    Now you're asserting that it's impossible to write good code unless you understand these things. So all of good programming hinges on this? That's incredible! </sarcasm>

    If you aren't, you're not qualified to program. Period.

    Heehee!

    Try this one: thinking logically is critical to being qualified to program.

  • by Ruie ( 30480 ) on Thursday April 30, 2009 @01:23AM (#27768591) Homepage

    any programmer who can't do a list, hash table, bubble sort, or btree at the drop of a hat ought to be kicked out of the industry.

    Why?

    Because if these well-known tasks are difficult for them, their job title is really typist, not programmer. The challenge is not to write bubble sort day in and day out, but to be several levels above that, so it is as easy as computing six times seven or reading road signs.

  • by drolli ( 522659 ) on Thursday April 30, 2009 @01:23AM (#27768593) Journal

    mod parent up.

    If you are so uninterested in computers that these algorithms are uninteresting to you, you should leave. Moreover, there can be *extremely* tricky performance things to consider about your cache/physical RAM size (just write a loop which covers more and more memory...). There are algorithms which work extremely well until you exceed the available RAM, then break down suddenly (I have the feeling that something of this class happens on my Nokia E61 with the bundled e-mail client: there was a day when my IMAP folders exceeded a certain number of e-mails, and suddenly the time to process things grew by a factor of approx. 20-100).

  • by Max Littlemore ( 1001285 ) on Thursday April 30, 2009 @01:29AM (#27768619)

    The worst I saw in my ~25 years, and I include old COBOL and BASIC crap, was not spaghetti in the strict sense of the word. It was a 10,000-line Java method written by a VB developer. There were no gotos, but the entire thing was ifs, switches, and for loops nested over 10 layers deep. Oh, and you did read that right, it was a method -- the entire class had a solitary static method full of copy-and-pasted chunks. He explained that it was OO because it was Java. I might forgive him if it were a gigantic nested unrolled loop that ran like stink, but it was slow and crash prone.

    A bunch of gotos and gosubs are a pleasure to debug compared to that kind of poo, seriously.

    No matter how nice a new paradigm that comes along, there is always some idiot who can make it suck far, far more than the last paradigm.

    Rewrote that as 10 classes of ~20 lines each; it ran faster and never died until it was told to.

  • by AuMatar ( 183847 ) on Thursday April 30, 2009 @01:32AM (#27768639)

    I'm thinking perfectly logically. If you don't understand and can't replicate the concepts that underpin your craft, you aren't qualified to practice it. It's like a physicist who can't understand force, or a mathematician who doesn't understand the first fundamental theorem of calculus. They aren't capable of doing their job. Apparently this includes you.

  • by mark-t ( 151149 ) <markt AT nerdflat DOT com> on Thursday April 30, 2009 @01:42AM (#27768709) Journal
    A feature like IntelliSense isn't a feature to save typing time... its primary benefit is to save looking things up in a manual if one happens to not remember the exact spelling of some class member or function. If one knows exactly what one wants to type in the first place, it doesn't stop you, nor should it even slow you down, unless it's implemented poorly.
  • by Darinbob ( 1142669 ) on Thursday April 30, 2009 @01:55AM (#27768763)

    One problem I've seen with some programmers is that they use the built-in libraries to solve all problems. I've seen C++ maps (i.e., red-black trees) used to implement something a trivial array could do (i.e., the keys were an 8-value enumeration). They've got a hammer, and all problems look like nails.

    It's not difficult to whip up data structures or algorithms that can beat the one-size-fits-all versions in language libraries. Of course, some people say "don't reinvent the wheel", but then there are applications where size and performance really do matter. Or maybe limited memory and limited CPU systems are considered too old school for some.

  • by Toonol ( 1057698 ) on Thursday April 30, 2009 @01:56AM (#27768773)
    The basic fundamentals of programming will be reused in a number of different contexts and variations. Yes, a "fastsort()" API call can sort arrays, and can help you skate past the evidently difficult chore of learning something new. It won't help you realize that you could bring the principles of a quicksort to bear on a complicated problem.

    Another example: the computer can do exponentiation for you too, but actually understanding it will occasionally let you vastly improve the quality of your code.
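
    Toonol doesn't give a concrete exponentiation example, but the classic one is exponentiation by squaring, which needs O(log n) multiplications instead of O(n). A minimal C sketch (integer powers, overflow ignored for brevity):

    #include <stdio.h>
    #include <stdint.h>

    /* Raise base to the exp-th power using O(log exp) multiplications. */
    static uint64_t ipow(uint64_t base, unsigned exp)
    {
        uint64_t result = 1;
        while (exp > 0) {
            if (exp & 1u)       /* odd exponent: fold one factor into the result */
                result *= base;
            base *= base;       /* square the base */
            exp >>= 1;          /* halve the exponent */
        }
        return result;
    }

    int main(void)
    {
        printf("3^13 = %llu\n", (unsigned long long) ipow(3, 13));   /* 1594323 */
        return 0;
    }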
  • by Unoriginal_Nickname ( 1248894 ) on Thursday April 30, 2009 @02:01AM (#27768799)

    I'll agree with you for most of what you said, but I disagree that programmers should learn to implement sorting algorithms. Unless they're doing serious research on the subject it's doubtful that Joe Programmer is going to be whipping up a sorting algorithm that's better than the one provided.

    What you're suggesting here isn't like a mathematician not understanding calculus. It's more like a mathematician only having pi memorized to the 8th decimal. I see zero value in learning to parrot quicksort, especially since the information is easily obtained and the implementation you use is almost certainly as fast as is possible (assuming you aren't Abrash).

  • by jd ( 1658 ) <imipak@yahoGINSBERGo.com minus poet> on Thursday April 30, 2009 @02:23AM (#27768927) Homepage Journal

    There is no practical difference between OO code and structured code. The article assumed structured code means goto and gosub, but any Real Programmer knows that procedures (which are just gosubs by name rather than address) are still structured programming.

    So what's OO? Each class is just a bunch of functions and procedures, with one entry point and one exit point for each - your standard structured programming methodology. The fact that there are different classes makes no difference. Calls between classes don't change the nature of a class any more than pipes between programs change the nature of programs.

    I wasn't impressed by other claims, either. Garbage collection is still a major headache in coding, which is why there are so many debugging mallocs and so many re-implementations of malloc() for specialist purposes. Memory leaks are still far, far too common - indeed they're probably the number 1 cause of crashes these days.

    Pointer arithmetic? Still very very common. If you want to access data in an internal database quickly, you don't use SQL. You use a hash lookup and offset your pointer.

    Sorts? Who the hell uses a sort library? Sort libraries are necessarily generic, but applications often need to be efficient. Particularly if they're real-time or HPC. Even mundane programmers would not dream of using a generic library that includes sorts they'll never refer to in, say, an e-mail client or a game. They'll write their own.

    One of the reasons people will choose a malloc() like hoard, or an accelerated library like liboil is that the standard stuff is crappy for anything but doing standard stuff. This isn't the fault of the glibc folks, it's the fault of computers for not being infinitely fast and the fault of code not being absolutely identical between tasks.

    The reason a lot of these rules were developed was that you needed to be able to write reusable code that also had a high degree of correctness. Today, you STILL need to be able to write reusable code that also has a high degree of correctness. If anything, the need for correctness has increased as security flaws become all the more easily exploited, and the need for reusability has increased as code bases are often just too large to be refactored on every version. (Reusability is just as important between versions as it is between programs - a thing coders often forget, forcing horrible API and ABI breakages.)

    The reason that software today is really no better, stability-wise, than it was 15-30 years ago is that new coders think they can ignore the old lessons because they're "doing something different", only to learn later on that really they aren't.

  • New Headaches (Score:4, Insightful)

    by Tablizer ( 95088 ) on Thursday April 30, 2009 @02:26AM (#27768949) Journal

    The biggest "new" headache that will probably end up in such an article 20 years from now is web "GUIs", A.K.A. HTML-based interfaces. Just when I was starting to perfect the art of GUI design in the late 90's, the web came along and changed all the rules and added arbitrary limits. Things easy and natural in desktop GUI's are now awkward and backassward in a browser-based equivalent.

    Yes, there are proprietary solutions, but the problem is that they are proprietary solutions and require users to keep Flash or Active-X-net-silver-fuckwhat or Crashlets or hacky JimiHavaScript up-to-date, making custom desktop app installs almost seem pleasant in comparison, even with the ol' DLL hell.

    On a side note, I also came into the industry at the tail end of punched cards (at slower shops). Once, the card copy machine punched the holes about 1/3 mm off, making them not read properly -- but on *different* cards each pass through. It's like including 0.5 with binary numbers, or 0.4999 and 0.5001 with a quantum jiggle.

    Good Times
       

  • by Pinckney ( 1098477 ) on Thursday April 30, 2009 @02:39AM (#27769033)

    You're missing the point---it breaks when one of the variables is a reference to the other.

    It's a neat algorithm, but the case in which it fails just goes to show that these skills aren't irrelevant. Yes, you should know what a reference is. Using your compiler and libraries as a crutch for your lack of understanding leads to unpleasant bugs.

  • by Garridan ( 597129 ) on Thursday April 30, 2009 @02:44AM (#27769059)

    Recently, I had a colleague ask me what sorting algorithm he should use in the inner loop of some algorithm he was implementing. Most CS majors I've talked to just blurt out "QUICKSORT!" without thought. OK, that's got an average runtime of O(n lg n). After about an hour of discussion and analysis, we came up with an algorithm that ran in sub-linear time. Now's the time for the CS kids to blurt, "ZOMG, but you can't sort in less than O(n lg n)!" Ah, but you can, if you know what your input is going to look like.

    When a function gets executed billions or trillions of times, it's worth optimizing. Often times, doubling the speed of a deep internal function does nothing -- other times, it can cut the runtime of your program in half. I come from a mathematical background, and I do lots of computation. Often times, it can take a year or more to solve a problem with a quick implementation. Spend a few weeks optimizing it, and you might be able to solve the problem in a few hours.

    There is no substitute for analyzing your code. And I do mean, sitting down with a writing implement and a blank surface, and tracing through the algorithm. Then, profiling the code and hammering down hotspots. And then, take a page out of Knuth's book -- throw the code away, and write it again.
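
    Garridan doesn't say what his colleague's input looked like, but the general point -- that knowledge of the input can beat a generic comparison sort -- is easy to illustrate with counting sort, which runs in O(n + k) time (linear, not the O(n lg n) comparison bound) when the keys are known to be small integers in a fixed range. A minimal C sketch under that assumption:

    #include <stdio.h>
    #include <string.h>

    #define MAX_KEY 255   /* assumption: keys are known to lie in 0..MAX_KEY */

    /* Sort n small integer keys in O(n + MAX_KEY) time: no comparisons,
       just count occurrences and write them back out in key order. */
    static void counting_sort(unsigned char *a, size_t n)
    {
        size_t count[MAX_KEY + 1];
        size_t out = 0;

        memset(count, 0, sizeof count);
        for (size_t i = 0; i < n; i++)
            count[a[i]]++;

        for (size_t key = 0; key <= MAX_KEY; key++)
            for (size_t c = count[key]; c > 0; c--)
                a[out++] = (unsigned char) key;
    }

    int main(void)
    {
        unsigned char data[] = { 96, 72, 34, 87, 232, 37, 49, 82 };
        size_t n = sizeof data / sizeof data[0];

        counting_sort(data, n);
        for (size_t i = 0; i < n; i++)
            printf("%u ", data[i]);
        printf("\n");
        return 0;
    }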

  • by Sycraft-fu ( 314770 ) on Thursday April 30, 2009 @02:49AM (#27769089)

    You don't need long division in normal life. Regardless of whether you are in a math-heavy career or not, you aren't going to waste your time doing it by hand; you'll use a calculator, which is faster and more accurate. However, you need to learn it. You need to understand how division works, how it's done. Once you learn it, you can leave it behind and automate it, but it is still important to learn. An understanding of higher-level math will likely be flawed if basic concepts aren't learned properly.

  • by bcboy ( 4794 ) on Thursday April 30, 2009 @02:49AM (#27769091) Homepage

    The one application of "goto" that I swear by is for cleaning up allocations on failure when coding in C.

    Maintaining a huge library of legacy C code, one of the most common bugs we see is leaks due to people using multiple "return" statements and failing to clean up allocations. You can fairly reliably pick such a function at random and find a memory leak: people always get it wrong.

    "goto cleanup;" however, is hard to mess up.

    I've seen any number of clever tricks to avoid the "goto". Using "break" statements in a do {} while (0) loop, for example. All of them merely obfuscate the code, and make it more likely for bugs to appear.
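
    A minimal C sketch of the single-exit cleanup pattern bcboy describes (the function and names are hypothetical, not from his codebase):

    #include <stdlib.h>
    #include <string.h>

    /* Build an object from two heap allocations; every failure path
       funnels through the same cleanup label, so nothing leaks. */
    int make_widget(const char *name, size_t payload_len, char **out)
    {
        int rc = -1;                /* assume failure until everything succeeds */
        char *copy = NULL;
        char *payload = NULL;

        copy = malloc(strlen(name) + 1);
        if (copy == NULL)
            goto cleanup;
        strcpy(copy, name);

        payload = malloc(payload_len);
        if (payload == NULL)
            goto cleanup;
        memset(payload, 0, payload_len);

        /* ... more steps, each ending in "if (failed) goto cleanup;" ... */

        *out = payload;             /* hand ownership of payload to the caller */
        payload = NULL;             /* so cleanup won't free what we just returned */
        rc = 0;

    cleanup:
        free(payload);              /* free(NULL) is a no-op, so this is always safe */
        free(copy);
        return rc;
    }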

  • by Anonymous Coward on Thursday April 30, 2009 @03:01AM (#27769167)

    *different* AC

    Why, why, why do people get SO offended when you tell them they have to learn computers to be good at computers?

    People expect to learn engineering to be a mechanic. To study biology if they want to be a surgeon. Hell, to read a cookbook just to learn how to cook.

    But answer "How do I program?" with "Well, there's this manual, see..." and you get "Elitist! Elitist! Hey everybody, come see the arrogant condescending elitist who's persecuting me! Come and see the violence inherent in the sys-teeeem!"

    Fine then, let's start telling everybody it's magic. Get some chicken bones and some goat's blood and some black candles...

  • by Fulcrum of Evil ( 560260 ) on Thursday April 30, 2009 @03:02AM (#27769179)

    but I disagree that programmers should learn to implement sorting algorithms.

    Dead wrong. Every programmer worthy of the name must be able to implement the basic data structures and algorithms, understand Big-O notation, and be able to do fault isolation (this last one is tricky). This is the lowest bar.

  • by Fulcrum of Evil ( 560260 ) on Thursday April 30, 2009 @03:04AM (#27769193)

    I've seen C++ maps (ie, red-black trees) be used to implement something a trivial array could do (ie, they keys were an 8 value enumeration). They've got a hammer, and all problems look like nails.

    Sure, it's excessive, but unless it's performance critical, I'd do the same thing - simpler and clearer usually trump faster these days.

  • by fractoid ( 1076465 ) on Thursday April 30, 2009 @03:05AM (#27769213) Homepage
    Yeah, I went through a phase of using Hungarian-esque notation not long after I started programming, mostly because I taught myself from Charles Petzold's excellent Programming Windows book, combined with various MS examples, all of which used Hungarian notation. Then I started realising that 90% of my variables were integers anyway, and started dropping notation where it was obvious, to the point where I now only give decoration to member/static/global variables and some pointers where it's not blatantly obvious by usage.

    I think that with a clearer coding style, Hungarian notation becomes less helpful, possibly even unnecessary. It's useful when, for instance, you have a 500-line function and you can't remember all the variables and their types, but this is far better solved by simply having non-awful code structuring in the first place rather than by any kind of variable-name decoration.
  • by syousef ( 465911 ) on Thursday April 30, 2009 @03:06AM (#27769219) Journal

    Good Hungarian notation does exactly that, actually. Check out Apps Hungarian, which encodes the semantic type of the data, rather than the language-level data type.

    Good, explicitly long-where-appropriate variable names that don't conform to a complex set of rules that need to be memorized are ALWAYS a better solution.

    # rwPosition : variable represents a row ("rw");

    Awful! For one extra character you get rowPosition which is unambiguous and doesn't need to be looked up.

    # usName : variable represents an unsafe string ("us"), which needs to be "sanitized" before it is used (e.g. see code injection and cross-site scripting for examples of attacks that can be caused by using raw user input)

    unsafeName or unsafeNameString would be much better. It doesn't imply this name only applies in the U.S.A.

  • by fractoid ( 1076465 ) on Thursday April 30, 2009 @03:21AM (#27769291) Homepage

    If you don't understand and can't replicate the concepts that underpin your craft, you aren't qualified to practice it.

    Well put. Saying that sorting algorithms are readily available in libraries for virtually all platforms, and thus that modern programmers need not learn them, is just wrong. It's like saying that an engineer need not know about moments of inertia when designing a beam, because he can click a button on his design software to tell him the rigidity. Or like a mechanic not knowing how to use a spanner because he has an air gun available.

    I probably couldn't code a particularly efficient quicksort, for example, off the top of my head - but I certainly understand how it works. Contrary to what Unoriginal_Nickname says below, it's not like a mathematician not memorising Pi past 8dp, it's more like a mathematician not ever learning what Pi is because he has a computer program that he can use to calculate the circumference of a circle.

  • Re:True story (Score:5, Insightful)

    by fractoid ( 1076465 ) on Thursday April 30, 2009 @03:29AM (#27769341) Homepage
    My mother, who was programming before a fair few of us (including me) were born, once told me this: If you think you've found a bug in a compiler, or an operating system, or a programming language, or a well-known commonly used library... you're wrong.

    Of course, this doesn't hold true 100% of the time, especially when you're pushing the limits of new versions of large 3rd party libraries, but when one is just starting to program (and hence using very well known, well tested libraries and code) it's true 99.99% of the time.

    (Oh, btw, I love your sig. Makes me laugh every time. :)
  • Re:True story (Score:3, Insightful)

    by arotenbe ( 1203922 ) on Thursday April 30, 2009 @03:30AM (#27769351) Journal

    Even more puzzling to me is how someone could decide to use a data structure without understanding its behavior (and without at least checking the Java APIs or simply Googling).

    Easy. They learned that they should use *insert class here* in Intro to Programming 1 or 2 and never thought about it again since then. Horrendous overuse of StringBuilders is probably the most common example of this, but it can apply to just about anything.

  • by Unoriginal_Nickname ( 1248894 ) on Thursday April 30, 2009 @04:13AM (#27769581)

    I don't think you read my post very thoroughly, because I'm pretty sure I didn't say what you clearly think I said.

    My argument is that learning to implement a sorting algorithm will not impart special knowledge beyond the experience that can be attained by completing virtually any other task. Like I said above, I see absolutely zero value in the ability to recite a particular solution from memory.

    By restating my argument from memory I have successfully completed a similar task to the one you are challenging your contemporaries with. Literally anybody could read the Wikipedia article about the Bubble Sort and write their own implementation based on it. In a similar vein, I am reiterating my previous point. Both the aforementioned hypothetical programmer and I could have accomplished the same task by copying and pasting earlier efforts.

  • by Nocturnal Deviant ( 974688 ) on Thursday April 30, 2009 @04:19AM (#27769613) Homepage

    If you DON'T look into everything, including stuff considered out of date, then in my opinion you're not a programmer.

    A true programmer, in my opinion, is somebody who LIKES to learn and solve problems. The history of programming is a really good place to learn. I've sometimes found myself downloading the source of programs I'll never use, more often than not out of date, just to look at them and learn, see what I personally would do differently, and say, hmm, well, that could have been done better -- and if the project is still being worked on, I'll either fix it or send it along to somebody.

    Classes teach you the basics; it is you who must take it further, and if it's a passion, you take it as far as you can.

  • by Aceticon ( 140883 ) on Thursday April 30, 2009 @04:46AM (#27769753)

    Maybe you guys are frozen in time - or maybe you're some kind of elitist-coder types.

    From where I stand, the most relevant optimizations have to do with optimizing the data flows between systems -- the most typical of which are appServer-database, GUI-appServer, and storage-memory. We're talking about shaving hundreds of milliseconds, maybe even seconds, per operation: not nanoseconds.

    Even if you work on standalone, small-sized applications, where knowing the basic principles of algorithms can be more important, hand-coding your own is not only useless (there are plenty of libraries out there with good implementations), it's actually counterproductive (it introduces a complex piece of code which is often not properly tested and might be even slower than the library ones).

    Understanding the basic principles = important.
    Being able to code your own = only important for those who never evolved beyond just-a-coder.

  • by sgbett ( 739519 ) <slashdot@remailer.org> on Thursday April 30, 2009 @04:50AM (#27769781) Homepage

    while we are being pedantic...

    From TFA: "most text editors instantly tell you the variable type"

    a *text editor* should do no such thing.

  • by shutdown -p now ( 807394 ) on Thursday April 30, 2009 @04:52AM (#27769799) Journal

    you get far better results from the garbage collector if you null out your references properly, which does matter if your app needs to scale.

    You don't get any difference at all if you null out local variables. In fact, you may even confuse the JIT into thinking that the variable lifetime is larger than it actually has to be (normally, it is determined by actual usage, not by lexical scope).

  • by owlstead ( 636356 ) on Thursday April 30, 2009 @04:54AM (#27769823)

    I would have to think about bubble sort and btree (I could do them though, but not at the drop of a hat). I would never program them myself. It's hard to write collection frameworks, and I'll gladly leave it to specialists.

    More to the point: it's important to know *what* these algorithms do, much more than to be able to program them, at least for most programmers. It is also very important to know how and when to use the right classes.

    I've already seen the Java "string" + "string" example, pointing out that it is slow. SO WHAT? Unless you're putting megabytes together (which you probably shouldn't), who will notice? Same with bubble sort and quick sort. Bubble sort is faster when doing small sorts? Who cares about small sorts? I've seen programmers clobber other programmers because they were using a linked list when sorting. Not very interesting if the list only grows to 10 elements.

    After 8 years in the industry, I've only written a few slightly generic collections (what they called "data structures" in the old days). They were either on a smart card (which misses all these structures) or mash-ups of other data structures (a hash map with a backing linked list is a favorite for storing multiple elements of the same type to be referenced by a key).

    All in all, there's simply no need to know how to implement these things anymore. As said, you *should* know the tools in your collections framework though. And you *should* know when to use them.

  • by SmallFurryCreature ( 593017 ) on Thursday April 30, 2009 @05:08AM (#27769905) Journal

    It is well known that Michael Schumacher is NOT much of a car nut when it comes to the mechanics. How many world championships did he win? Oh, more than ANYONE ELSE?

    You need to know about the network stack if it is your job to know about the network stack. If it isn't, you don't need to know about it. What good is it for someone who writes a music codec to know about the network? Parallel programming for Notepad?

  • Re:True story (Score:5, Insightful)

    by ShakaUVM ( 157947 ) on Thursday April 30, 2009 @05:23AM (#27770023) Homepage Journal

    >>If you think you've found a bug in a compiler, or an operating system, or a programming language, or a well-known commonly used library... you're wrong.

    You apparently never tried doing template coding in C++ ten years ago. =)

  • by BrokenHalo ( 565198 ) on Thursday April 30, 2009 @05:31AM (#27770057)
    if you are that uninterested in computers that these algorithms are uninteresting for you, you should leave.

    Yeah. TFA has relegated to its second page a couple of the techniques that used to be important to me nearly 30 years ago: self-modifying code and patching code while it was actually running. These came under the heading of "hazardous duty," since if you got it wrong you could fuck things up quite spectacularly, and EVERYONE would know about it and give you a hard time. This was the sort of thing that was guaranteed to get the adrenaline going. ;-)

    I had almost forgotten about the old DEADBEEF filler in core-dumps, though. I pulled that trick out of my hat in the '80s when I was showing a new CompSci graduate a way to fix an issue with an old COBOL program. I had a hex dump of about 500 pages 11x14" fanfold, and I just flicked the edges of the paper to locate the likely culprit. She cracked up laughing when I showed her the "secret".
  • Re:True story (Score:5, Insightful)

    by Moraelin ( 679338 ) on Thursday April 30, 2009 @05:43AM (#27770097) Journal

    Well, obviously all 3 above knew how to use a Hashtable or HashMap, but none knew what they really do, and all ended up trying to fix what's not broken.

    But the real answer I'm tempted to give is more along the lines of the old wisecrack: In theory there's no difference between theory and practice. In practice there is.

    In theory, people shouldn't need to know more than what collection to use, and they'll be perfectly productive without more than a Java For Dummies course. In practice I find that the people who understand the underlying machine produce better code. Basically, you don't need to actually program anything in ASM nowadays, but if you did once, you'll produce better code ever after. You don't need to chase your own pointers in Java any more, but you _can_ tell the difference between people who once understood them in C++ and those who still struggle with when "x=y" is a copy and when it holds only a reference to the actual object. You theoretically don't need to really know the code that javac generates for string concatenation, but in practice you can tell the difference between the code of those who know that "string1=string2+string3" spawns a StringBuffer too, and those who think that spawning their own StringBuffer is some magical optimization. Etc.

    And then there are those who are living proof that just a little knowledge is a dangerous thing. I see people all the time who still run into something that was true in Java 1.0 times, but they don't understand why or why that isn't so any more.

    As a simple example, I run into people who think that to rewrite:

    for (int i = 0; i < someArray.length; i++) {
        doSomething(someArray[i]);
    }

    as

    try {
        for (int i = 0; ; i++) {
            doSomething(someArray[i]);
        }
    }
    catch (ArrayIndexOutOfBoundsException e) {
    // do nothing
    }

    ... is some clever optimization, and it speeds things up because Java doesn't have to check the extra bounds on i any more.

    In reality it's dumb and actually slower, instead of being an optimization. Any modern JIT (meaning since at least Java 1.2) will see that the bound was already checked, and optimize out the checking in the array indexing. So you have exactly one bounds check per iteration, not two. But in the "optimized" version, it doesn't detect an existing check, so it leaves in the one at the array indexing. So you _still_ have one bounds check per iteration. It didn't actually save anything. But this time the exit is done via an exception, which is a much more expensive thing.

    For bonus points, it introduces the potential for another bug: what if at some point in the future the doSomething() method throws its own ArrayIndexOutOfBoundsException? Well, they'll get a clean exit out of the loop without processing all values, and without any indication that an exception has occurred.

    Such stuff happens precisely to people who don't understand the underlying machine, virtual or not.

  • by Have Brain Will Rent ( 1031664 ) on Thursday April 30, 2009 @06:14AM (#27770277)
    Division by hand? Don't you just do it in your head?
  • Re:True story (Score:5, Insightful)

    by Have Brain Will Rent ( 1031664 ) on Thursday April 30, 2009 @06:23AM (#27770329)
    I think you've got the bar a little high there. I'd settle for not continuing to run into bugs that result because people wrote code that copies a string into a buffer without knowing if the buffer was big enough to hold the string. Or, not quite a bug, people who place arbitrary, and small, limits on the size of strings (or numbers) - cause god forbid that anyone have a name longer than 12 characters, or a feedback comment longer than 40 characters, or ...
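
    The bug class described here is usually an unchecked strcpy into a fixed-size buffer. A hedged C sketch of one checked alternative -- measure first, report when it doesn't fit, and let the caller decide what to do about overly long input:

    #include <stdio.h>
    #include <string.h>

    /* Copy a name into a caller-supplied buffer only if it fits,
       including the terminating NUL. Returns 0 on success, -1 otherwise. */
    int copy_name(char *dst, size_t dst_size, const char *name)
    {
        size_t len = strlen(name);
        if (len + 1 > dst_size)
            return -1;              /* caller decides: grow the buffer, reject, or report */
        memcpy(dst, name, len + 1);
        return 0;
    }

    int main(void)
    {
        char buf[12];
        if (copy_name(buf, sizeof buf, "Ada Lovelace") != 0)
            fprintf(stderr, "name does not fit in %zu bytes\n", sizeof buf);
        else
            printf("Hello, %s\n", buf);
        return 0;
    }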
  • by robthebloke ( 1308483 ) on Thursday April 30, 2009 @06:30AM (#27770369)
    In my experience the problem is that normally you end up seeing iCount, szName, bEnabled, fPercent, etc. None of those variable names are improved in any way by hungarian notation - you could easily drop the sz from szName and know what type it is.
  • by javaxjb ( 931766 ) on Thursday April 30, 2009 @07:11AM (#27770587)
    If you really want to understand how division works, iterative subtraction would be better -- literally count the number of times the divisor gets used. Then, to optimize (vs iterative subtraction) and improve accuracy (over long division) learn double division [doubledivision.org]. Long division belongs in the same category as these old school coding techniques -- I'm tempted to hedge a bit on that point since long division is faster, but I can't even remember the last time I used it (probably best measured in decades).
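
    A minimal C sketch of the iterative-subtraction view of division that javaxjb mentions (non-negative operands, nonzero divisor; it's an explanation of what division means, not a fast algorithm):

    #include <stdio.h>

    /* Count how many times the divisor can be subtracted from the dividend.
       Runs in O(quotient) steps -- slow, but it is literally the definition. */
    unsigned divide_by_subtraction(unsigned dividend, unsigned divisor,
                                   unsigned *remainder)
    {
        unsigned quotient = 0;
        while (dividend >= divisor) {
            dividend -= divisor;
            quotient++;
        }
        *remainder = dividend;
        return quotient;
    }

    int main(void)
    {
        unsigned r;
        unsigned q = divide_by_subtraction(47, 5, &r);
        printf("47 / 5 = %u remainder %u\n", q, r);   /* 9 remainder 2 */
        return 0;
    }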
  • by Rockoon ( 1252108 ) on Thursday April 30, 2009 @07:17AM (#27770619)

    When performance starts to matter, and my profiling tool indicates that the sorting algorithm is to blame, then I'll consider using an alternate algorithm.

    Sure, that profiler might say that you are taking n% of your time in it, but how are you going to objectively know that that n% can be reduced significantly? Is your profiler an artificial intelligence?

    That, my friend, is the problem with canned solutions. You never really know if the implementation is decent, and in some cases you don't even know what the algorithm used is. Still further, if you are a clueless canned solution leverager, you probably don't know the pitfalls associated with a given algorithm.

    Do you know what triggers quicksort's worst-case behavior?

    Do you know why a Boyer-Moore string search performs fairly badly when either string is short?

    Do you know the worst-case behavior of that hash table implementation? Do you know what the alternatives are? What is its memory overhead?

    Are any of the canned solutions you use cache oblivious?


    Now lets get into something serious. Algorithmic performance deficiencies are often used in Denial of Service attacks, and any time you use a canned solution you are setting yourself up as an easy target. Your profiler will never tell you why your customer is experiencing major problems, because the attack isn't on your development machine(s.)

    ...and finally, being ignorant is not something to be proud of. Seriously. Your answer to discovering that the canned solution isn't acceptable is to "buy more hardware." Developers don't get to make that decision. Customers do... and that's assuming the hardware exists. If I was your boss, I would fire you immediately for being a willfully ignorant bad programmer.

  • Re:True story (Score:4, Insightful)

    by _merlin ( 160982 ) on Thursday April 30, 2009 @07:20AM (#27770645) Homepage Journal

    LOL, I used to believe that, but I can now reliably make SunPRO, GCC and MSVC miscompile things. SunPRO has a bug where it always considers children of friends to be friends. SunPRO occasionally constructs locals when an exception should have caused flow control to leave the block earlier. GCC insists on copying temporaries passed by const reference. SunPRO outright crashes when you try templating on a member function pointer type. MSVC incorrectly mangles names of symbols in anonymous namespaces contained within other namespaces. GCC won't find global operators inside a namespace that contains operators, even for completely unrelated types. Giving GCC the same specific register constraint for an input and output of an inline assembly block will cause miscompilation - you need to use numeric constraints. People say that I only find this stuff because I'm digging around in the dark corners of the language where no-one else goes. It still sucks to be tearing my hair out over it, though.

  • by anothy ( 83176 ) on Thursday April 30, 2009 @07:57AM (#27770869) Homepage
    You're missing the point, which is understandable since this thread has gone totally stupid. But hey, I'm up early.

    My argument is that learning to implement a sorting algorithm will not impart special knowledge beyond the experience that can be attained by completing virtually any other task. Like I said above, I see absolutely zero value in the ability to recite a particular solution from memory.

    The problem is that you're conflating two different things. The "ability to recite a particular solution from memory" is largely, I'd agree, useless in most cases. But that's not really what this is about. The process of learning imparts special knowledge beyond what is learned. You begin to understand the "whys" of things in ways that you simply cannot if you've never learned the thing.

    In most ways, statements of the form "you must know X" are really proxies for statements of the form "you must have learned X" (even current retention is less important), mostly because they're so much easier to verify.

  • by Anonymous Coward on Thursday April 30, 2009 @08:38AM (#27771201)

    Wrong. The lowest bar is understanding the theory behind Big-O notation.

    Frankly, if you can make it through a couple semesters of college level CS & Math 300 & 400 level classes you should be able to pick up the implementation techniques pretty easily.

    I agree with you that the data structures and algorithms are important to programmers but I would suggest that you are not a real programmer unless you understand what you are implementing.

  • by bziman ( 223162 ) on Thursday April 30, 2009 @08:46AM (#27771287) Homepage Journal
    This might be okay if you are SO constrained you can't afford one register's worth of temp space, but if you're into performance, this is 4-8x slower than using a temp variable in every language I've tried it on. Run your own benchmarks and see what I mean. Also, don't obfuscate your code just to be "clever".
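
    The parent comment isn't shown in this excerpt, but this reads like the XOR-swap trick (the same one Pinckney notes breaks when both names refer to the same variable). A hedged C sketch of both versions, purely for comparison:

    #include <stdio.h>

    int main(void)
    {
        int a = 3, b = 5;

        /* The "clever" version: no temporary, but three serially dependent
           operations, and it zeroes the value if a and b are the same object. */
        a ^= b;
        b ^= a;
        a ^= b;

        /* The boring version: a temporary the compiler will almost certainly
           keep in a register; typically the faster of the two in practice. */
        int tmp = a;
        a = b;
        b = tmp;

        printf("%d %d\n", a, b);    /* back to 3 5 after swapping twice */
        return 0;
    }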
  • Re:True story (Score:3, Insightful)

    by banana fiend ( 611664 ) on Thursday April 30, 2009 @08:56AM (#27771405)
    You may not be wrong, but you should exhaust all other possibilities first. I was working at a company where we found a bug in the floating-point calculation on the Intel chip. http://en.wikipedia.org/wiki/Pentium_FDIV_bug [wikipedia.org]

    Lots of people also found it. You can't even assume that your hardware is right :)

    (Oh, btw, damn your sig! I'm singing that song now!)
  • by gerglion ( 1264634 ) on Thursday April 30, 2009 @09:00AM (#27771437)
    You are mixing up 'programmer' and 'computer scientist'. They aren't necessarily one and the same. Computer science is largely the mathematics of computing, it just so happens that to physically show it often one has to write code to do it. This doesn't mean that everyone who writes code automatically is a CS major/graduate.

    As an aside, you could also argue that programmers should have a good grasp on design patterns, requirements, planning, etc... Which seems to fall under the title of software engineer now. My CS department to date has required me to take a single SE course since I've been here and it'll be the only one I'll take.

    'Programmer' is too vague a description, as it is just one who programs, regardless of how they learned, why they are programming, what they are programming, etc... It could be someone writing Lisp for their Masters/PhD research, some web designer writing javascript for their new website, or a CE/EE writing assembler for a new driver/BIOS for hardware.
  • by shabble ( 90296 ) <metnysr_slashdot@shabble.co.uk> on Thursday April 30, 2009 @09:11AM (#27771569)

    Full Hungarian notation is a bit redundant, precisely because everyone (for reasonable values of 'everyone') DOES use some form of IDE to code, and any non-epic-fail IDE will at the least tell you variable types when you mouse over them, or pop up a member list for a class/struct when you go to type them.

    Um - Hungarian notation is for coding what the variable represents, not the type of variable it's represented by.

    Anyone using iVariable or sVariable to indicate that the former is an int and the latter is a string is doing it wrong.

    It's this misunderstanding that's resulted in HN's 'bad' reputation.

    See http://www.joelonsoftware.com/articles/Wrong.html [joelonsoftware.com] for an example of how HN should be used.
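
    A minimal C-flavored sketch of the Apps Hungarian idea from the article shabble links: the prefix encodes the semantic kind ('us' = unsafe, 's' = sanitized), so a mismatch is visible right at the call site. The names and the toy sanitizer are hypothetical:

    #include <stdio.h>
    #include <ctype.h>

    /* Toy sanitizer: keeps only alphanumerics and spaces. The point is
       the naming discipline, not this particular filter. */
    static void sanitize(const char *usInput, char *sOutput, size_t outSize)
    {
        size_t j = 0;
        for (size_t i = 0; usInput[i] != '\0' && j + 1 < outSize; i++) {
            unsigned char c = (unsigned char) usInput[i];
            if (isalnum(c) || c == ' ')
                sOutput[j++] = (char) c;
        }
        sOutput[j] = '\0';
    }

    /* By convention, anything rendered to the page must be an 's' string. */
    static void writeToPage(const char *sHtml)
    {
        printf("%s\n", sHtml);
    }

    int main(void)
    {
        const char *usName = "<script>alert(1)</script>Bob";
        char sName[64];

        sanitize(usName, sName, sizeof sName);

        writeToPage(sName);         /* reads correctly: s -> s */
        /* writeToPage(usName);        would "read wrong" in review: us -> s mismatch */
        return 0;
    }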

  • by 16Chapel ( 998683 ) on Thursday April 30, 2009 @09:23AM (#27771693)
    Number of times I had to implement sorting algorithms for my degree: 3

    Number of times I've had to implement sorting algorithms in my 10 year career: 0

    They make good teaching exercises, but any programmer on my team who wasted time building their own sorting algorithm rather than using a library function would get a few sharp words about efficiency.
  • by mdwh2 ( 535323 ) on Thursday April 30, 2009 @09:54AM (#27772055) Journal

    I'm a mathematician who works as a programmer. My apologies for not fitting into your simplistic argument.

    (My job requires plenty of mathematical knowledge, and a maths background was more appropriate for my job than computer science, despite being a programmer.)

    Perhaps we should take it further -- surely by your reasoning, only a Bubble Sortist needs to know how to hand-code a bubble sort under exam conditions, but other kinds of programmers don't? After all, it's surely not possible that different fields may cross over, and that different people have different experiences.

    What I would value far more is not someone who can regurgitate his college days, where he memorized line by line an algorithm that you shouldn't be using anyway, but rather someone who can hand-code any given algorithm as and when he needs to, when he hasn't previously memorized it -- that could be a bubble sort if he hasn't previously learnt it, but it's even better to test that with other things.

    Furthermore, for standard algorithms I would value someone who reads up about the algorithm, and preferably uses a standard version, to ensure optimized usage, no bugs, and awareness of its flaws (as is obviously the case with bubble sort), or of whether they should be using it at all. Far better that than someone who only shows off his skills by hacking together a quick version from memory without doing any checks.

  • Re:Quicksort (Score:5, Insightful)

    by fractoid ( 1076465 ) on Thursday April 30, 2009 @10:07AM (#27772217) Homepage
    Um, I'm pretty sure quicksort is still the go-to sort simply because it's the implementation that's built into almost every single programming environment. Then again, honestly, I'd say that from the point of view of a pragmatic programmer... it doesn't matter. There's a built-in function (whether it's qsort() in the C standard library, or Arrays.sort() in Java, or whatever) that will take your array and return it sorted (a usage sketch follows after this comment). If your app runs too slow and you profile it and it turns out the speed problem is in the sorting AND you can't find a better algorithm that doesn't depend so much on sorting... THEN you look at optimising it. Never forget the two cardinal rules of optimising:

    1) Don't optimise.
    2) (Experts only:) Optimise later.

    Or as I once read it eloquently expressed:
    1) Make it work.
    2) Make it work right.
    3) Make it work fast.
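
    Since fractoid names qsort(), here is a minimal usage sketch of that standard-library call -- the comparator is normally the only part you write yourself:

    #include <stdio.h>
    #include <stdlib.h>

    /* Comparator for qsort: negative / zero / positive, like strcmp.
       Written this way to avoid the overflow that "return a - b" can cause. */
    static int cmp_int(const void *pa, const void *pb)
    {
        int a = *(const int *) pa;
        int b = *(const int *) pb;
        return (a > b) - (a < b);
    }

    int main(void)
    {
        int values[] = { 42, 7, 19, 3, 88, 7 };
        size_t n = sizeof values / sizeof values[0];

        qsort(values, n, sizeof values[0], cmp_int);

        for (size_t i = 0; i < n; i++)
            printf("%d ", values[i]);   /* 3 7 7 19 42 88 */
        printf("\n");
        return 0;
    }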
  • by Moridineas ( 213502 ) on Thursday April 30, 2009 @10:54AM (#27772949) Journal

    I'll agree with you for most of what you said, but I disagree that programmers should learn to implement sorting algorithms. Unless they're doing serious research on the subject it's doubtful that Joe Programmer is going to be whipping up a sorting algorithm that's better than the one provided.

    That is completely missing the point. Nobody is expecting the average programmer out there to write a superior sorting library.

    Your example of a sorting algorithm being related to decimals in pi is completely bonkers. You're really comparing learning and comprehending an ALGORITHM to memorizing digits? If you (and I don't mean you specifically) don't understand how sort algorithms work, if you don't understand linked lists, etc. -- and furthermore, think learning them is unnecessary rote memorization -- I'd bet a year's pay that you're producing shit code and probably not even realizing it.

  • by david_thornley ( 598059 ) on Thursday April 30, 2009 @11:00AM (#27773059)

    In almost all cases, all I need to know about sorting is how to use the built-in function, and a general idea of how efficient it is. In the other cases, I can consult Knuth or Wikipedia.

    I don't happen to remember right now how quicksort works in enough detail to write it, and I don't even know what algorithm the C++ STL sort I use uses. Last time I needed to know something like that, it was heapsort, and that was years ago. I have understood quicksort in the past, and wouldn't require more than a few minutes to brush up on it, but as long as I know what it does and what its performance is like, I don't need to know the details.

    Similarly, I've forgotten a lot of my parsing theory, and don't remember offhand what separates LALR(1) from a more general LR(1) parser, but I seem to be able to use compilers and come up with a usable YACC grammar.

    It's certainly worth learning this stuff at some point, and you do need to understand the implications of what you're doing, but I find that what I've forgotten about algorithms that are built into every programming language I use has never been a problem.

  • Re:True story (Score:3, Insightful)

    by BotnetZombie ( 1174935 ) on Thursday April 30, 2009 @11:03AM (#27773101)

    Horrendous overuse of StringBuilders is probably the most common example of this

    If you have a more efficient way to concat large strings of unknown length, I'd like to hear about it.

  • by Myria ( 562655 ) on Thursday April 30, 2009 @11:08AM (#27773179)

    Although it's generally true that what I initially think is a compiler bug is almost always my fault, I definitely run into actual compiler or library bugs on occasion.

    - I was working on some low-level boot loader assembly code, and found that "call esp" was not working as documented. It turns out that most modern Intel CPUs have a bug they haven't bothered fixing where it jumps to the value of ESP *after* pushing the return address. AMD processors don't do that, which is why it worked for me >_<

    - I found that a memcpy() in our Win32 "vectored exception handler" was corrupting memory. The code that caused the exception was a memmove() that had decided to go in reverse to ensure a proper copy. It turns out that Win32's exception handler stub wasn't clearing the x86 direction flag before calling exception handlers, a violation of the Win32 calling convention. Compilers assume that the direction flag is clear at the start of a function, so a constant-sized memcpy() will frequently get inlined as simply "rep movsd".

    - We found an extremely obscure bug in Visual C++'s compiler where it incorrectly zero-extended a pointer to 64 bits when it should have sign-extended as per the C standard. A global declaration like this was being used:

    int var;
    long long extended = (long long) &var;

    If the base address of a 32-bit program is >= 0x80000000, as occurs in NT device drivers, the extension to 64-bit will be zero-extended instead of sign-extended. This differs from what happens in local variables.

    In fact, getting this right is impossible with the relocations available to Win32 images and object files. The compiler should've thrown an error because it can't do the requested operation. (In C++ it could, but would have to make it a constructor.)

  • Re:True story (Score:2, Insightful)

    by UnknownSoldier ( 67820 ) on Thursday April 30, 2009 @11:36AM (#27773615)

    > My mother, who was programming before a fair few of us (including me) were born, once told me this: If you think you've found a bug in a compiler, or an operating system, or a programming language, or a well-known commonly used library... you're wrong.

    So when MSVC prints "Internal Compiler Error" and stops compiling my code, I'm wrong? :)

    Five years ago, it was easy to cause MSVC to crash & burn - lately their back-end compiler has gotten much better dealing with C/C++ code.

  • by n00854180t ( 866096 ) on Thursday April 30, 2009 @01:13PM (#27775195)
    Even if you're writing a game (and I'm a game dev), there is very little use in writing a sorting algorithm. In fact, I know my boss would give me a big "WTF!?" if I started writing one. The reason? It's going to be less efficient, and also a waste of time that I could use to be doing something that actually moves tasks forward. Now, obviously it is important to understand the various uses of different sorting algorithms, their performance implications, etc. But actually knowing, from memory, how to write any particular sorting algorithm is pretty pointless IMO.
  • by The End Of Days ( 1243248 ) on Thursday April 30, 2009 @01:55PM (#27775969)

    That's not the case at all. The programming field is extremely large and there is room for people who don't know the underpinnings at all, and don't need to.

    Placing arbitrary bars on what constitutes a "programmer worthy of the name" is just get-off-my-lawn elitism spawned by people who are watching their craft steadily devalue under attacks from all angles.

    Sorry to let you guys in on this secret: it gets easier over time, and it will continue to do so until your arcane knowledge is as meaningless to society in general as pokemon trivia.

  • Why, why, why do people get SO offended when you tell them they have to learn computers to be good at computers?

    Because you're essentially attacking them.

    What if I responded to you by saying: "I'm sorry, but if you don't understand how flatly attacking people's qualifications for their job is insulting and threatening, you shouldn't be having this discussion. You simply don't have the interpersonal skills to articulate this kind of thing in a manner that would be productive, let alone persuasive."

    Get your dander up at all?

    And please don't hide behind the "I was just stating a fact, if the shoe fits, wear it." There are lots of good ways to say what you're trying to get at that are probably even closer to the truth.

    Which is that you really don't have to learn *everything* about computers in order to be good at computers. It is certainly an underlying truth that the more you know, the better you are as a developer. But it's entirely possible to be a reasonably productive developer without knowing everything... as long as your abilities are matched to what you need to accomplish. And there's more or less a curve of task difficulty to go along with a curve of developer abilities.

    I don't know very much about building compilers. Some people would say that makes me a mere dilettante of a software developer. That's a rash overstatement. It's absolutely true I would be a *better* developer if I knew more about these things, and certain problem domains would be more open to me, but there's a huge problem space that really doesn't require this knowledge. This works the other way, too: I probably know more about Linear Algebra and Discrete Mathematics than many developers and even some CS majors (studied Math in school) and I'm familiar with the Logic Programming paradigm (written full programs in Prolog). These things make me a better developer, particularly for some problem domains, but it certainly doesn't mean anyone who doesn't know these things is a simple hack.

    I think implementing hashes and other primitives that are now part of libraries/languages falls into this category. Being able to implement them is certainly a *demonstration* that you've mastered certain skills. The inverse doesn't necessarily follow: not ever having implemented them -- in particular because you've never had to -- doesn't necessarily imply that you lack the ability to solve that class of problem.

    And in fact, it might demonstrate a certain stripe of wisdom: there's a limited amount of time and a pretty much infinite supply of problems. What do you spend time learning how to do?

  • by JasterBobaMereel ( 1102861 ) on Friday May 01, 2009 @09:11AM (#27785845)

    The number of times you have had to pick a different sorting/searching algorithm by choosing a different library or routine is, I hope, many -- and how did you decide which to pick? Did you try each by trial and error, or did you remember (or look up) which were likely to be the best candidates in the current situation and try just them?

    It's not what you know from memory, and it's not just what you can look up; it's knowing (or remembering) where and how to look up the answer. Learning how to use a variety of basic algorithms will help you, in the future, to use the most appropriate one.
