
The Value of BASIC As a First Programming Language 548

Mirk writes "Computer-science legend Edsger W. Dijkstra famously wrote: 'It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.' The Reinvigorated Programmer argues that the world is full of excellent programmers who cut their teeth on BASIC, and suggests it could even be because they started out with BASIC."
  • by syousef ( 465911 ) on Wednesday March 10, 2010 @03:26AM (#31423650) Journal

    A good programmer has experienced many languages and done things in many ways. A good programmer has compared all these various experiences and understands the advantages and disadvantages of each language and programming technique. A good programmer doesn't get bogged down in line numbers and GOTO statements, unable to move beyond them. If someone does get bogged down, they never had the attitude to be a good programmer.

  • Time heals (Score:3, Insightful)

    by Quiet_Desperation ( 858215 ) on Wednesday March 10, 2010 @03:31AM (#31423662)

    Depends on the BASIC. I use RealBasic at work as an alternative to LabView. It's object oriented, multithreaded and completely "Visual".

    as potential programmers they are mentally mutilated beyond hope of regeneration

    Am I the only person on the Earth who just writes off hysterical, panty-wetting stuff like this? When did he say this? In the 1960s or something? It's 2010 now, right?

  • by julesh ( 229690 ) on Wednesday March 10, 2010 @03:33AM (#31423672)

    ...and I'm wondering: what's with the random sushi pictures?

  • Funny argument (Score:5, Insightful)

    by phantomfive ( 622387 ) on Wednesday March 10, 2010 @03:36AM (#31423694) Journal
    His argument is kind of funny. He says people who've learned in BASIC have learned what NOT to do when programming. I have to admit he has a point... I learned exactly why spaghetti code was a bad idea after doing it for a couple of years. Some people think they know what spaghetti code is, but unless they've written code with line numbers, they probably don't.
  • by VirtualUK ( 121855 ) on Wednesday March 10, 2010 @03:37AM (#31423714) Homepage

    I disagree with the premise that BASIC teaches bad habits. I stick with the old adage that a bad workman blames his tools. BASIC teaches kids (like I was over 30 years ago) from the ages of 5-6 how to put together simple logic, and gives them the very basics of language constructs like variables, loops, subroutines, etc., without their having to grasp structures, classes, polymorphism, OO, and other things that plenty of grown-ups who have been writing code for quite some time can have issues with.

  • by black3d ( 1648913 ) on Wednesday March 10, 2010 @03:41AM (#31423732)

    If you don't recognise the relevance of BASIC as described in the article, it's possible you're a decade or two younger than the individuals the article is referring to.

  • Re:Funny argument (Score:3, Insightful)

    by 0123456 ( 636235 ) on Wednesday March 10, 2010 @03:42AM (#31423744)

    Some people think they know what spaghetti code is, but unless they've written code with line numbers, they probably don't.

    And the good old days of 'LET A = NOT PI' to save three bytes of RAM :).

    (ex-Sinclair BASIC programmers will understand why such arcane constructs were beneficial when you were low on RAM: a numeric literal in a program line dragged along a hidden five-byte binary copy of itself, while NOT and PI are single-byte tokens and NOT PI evaluates to 0)

  • by Katatsumuri ( 1137173 ) on Wednesday March 10, 2010 @03:45AM (#31423764)

    As a programmer who started with old-school BASIC (numbered lines, etc.), I was overjoyed by the structured-programming features of Turbo Basic, and totally excited by C when I learned it. It felt like having my hands untied. So I would state the contrary: you cannot fully appreciate structured programming unless you have been through the GOTO hell.

    I hear a lot of similar FUD from some people, like "you can't grok OOP if you started with C", or "anyone who touched .NET or Java is lost for C++..." It boils down to "people are idiots, they can't possibly learn anything new, they are either indoctrinated at birth in My True Way, or lost and hopeless." Who in their right mind would take that seriously?

  • by Dahamma ( 304068 ) on Wednesday March 10, 2010 @03:52AM (#31423790)

    Absolutely. It was drummed into me (and apparently most coworkers), and I can't believe how liberating it is once you realize, yes, in appropriate contexts "goto" can result in code (especially in error handling cases) that is both more readable and more efficient. Dogma is rarely the answer.

  • by syousef ( 465911 ) on Wednesday March 10, 2010 @03:57AM (#31423816) Journal

    The irony is that under the covers, it's all done with jump instructions anyway.

  • Bah (Score:5, Insightful)

    by tsotha ( 720379 ) on Wednesday March 10, 2010 @03:59AM (#31423834)

    Computer-science legend Edsger W. Dijkstra famously wrote: 'It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration'

    Like all eminently quotable people, Dijkstra tended to hyperbole and oversimplification.

  • by Anonymous Coward on Wednesday March 10, 2010 @04:00AM (#31423838)
    Programming is about being rigorous. That should be taught at college, not some sloppy language where anything goes, with the magical ON ERROR RESUME NEXT that "fixes" all of your bugs. You know what is propaedeutic for real programming to freshmen? Fighting segmentation faults. Learning to trace stack overflows. Checking that what they are doing does what they think it should do, even if that means checking the return value of each printf.

    We don't really need another generation of people who'd like to be programmers but don't want to learn the intricacies behind it. We have enough problems as it is, with SQL injections and stuff.
  • by Anonymous Coward on Wednesday March 10, 2010 @04:05AM (#31423862)

    I maintain that Python can do this much more easily, because it has far fewer ridiculous syntax requirements and a simpler-to-read format. The things you learn in BASIC can be taught at a much more significant level in Python (inline AND block comments, order of instructions, basic control structures, variables and their use) without jumping through the hoops that you will in BASIC. Python's use of whitespace and its extensibility allow beginners with an aptitude to expand on their knowledge easily by making simple inferences about how some control structures work (Python has an excellent for loop implementation).

    Python does not require classes, can be used as an imperative language, and can provide subroutines in the form of plain functions, simply by defining them in the environment (which is not a difficult concept to grasp; "the computer needs to be told that it can use it" will suffice for beginners, and is true enough).

    Basically, Python can provide all the same benefits as BASIC without the stupid unnecessary crap (Explicit Line numbers? Really? Are we still using punch cards?) that always annoyed me.

    I will give BASIC one thing: its basic geometric drawing library was really easy to use. Problem is, we NEVER USED IT except as a "hey, so this is how you use it, now we're never going to talk about it again." Might as well have learned LOGO (which I had taken as well; turtles were awesome and provided a good intro to iteration, even if it was never described as such in class).

  • by smash ( 1351 ) on Wednesday March 10, 2010 @04:11AM (#31423900) Homepage Journal
    I'd argue that goto IS harmful - however like all harmful things, there are use-cases where it is either necessary or useful to accomplish a particular objective.

    Much like a chain saw not being an all purpose cutting tool....

    So... avoid using it when not strictly necessary, but if it is the only sane/high performance way of getting things done in a special use case, by all means...

  • I think the author is mostly on the mark. He's aware Dijkstra was exaggerating for effect, but he's also completely correct that if you started programming in the early home computing era, you probably started with a BASIC. I was lucky enough to get some varied exposure to other languages early on (LOGO and some shallow assembly), but until I was 15, it was pretty much Basic.

    And none of my programming habits now resemble anything close to the BASIC I wrote in when I was that age. Except, occasionally, for the rare cases where global state seems to make sense, and even then, I try to namespace things in one way or another. But by and large, I picked up structured programming, I picked up object-oriented programming, I picked up logic programming, and I'm learning to enjoy functional programming.

    I will say... there was a time when I was probably close to being "ruined." It was when I was learning C++, and I only really had Pascal, basic C, and Basic under my belt. And I had a pretty solidly structured-imperative mindset, and really hadn't seen any other way of doing things. C++ married data structures and methods in an interesting way, but it didn't seem like more than a stylistic practice to me. I was pretty sure most languages were alike, you just had syntax and typing differences.

    But there was one thing: I'd had to learn Prolog for a very specific job. We were teaching it to high school students in a CS summer camp I worked at for a few years. The first year, I just thought "Man, this is weird," more or less got through all the exercises, left it behind, and did what most people do: dismissed it as an odd research toy. The second year, I thought "this is weird, but interesting." The third year, I thought "Wow. There are all kinds of intriguing ideas here."

    And there are, and I still think it could stand to see more usage in mainstream software, but more importantly, I think I'm pretty lucky I got repeated exposure to a language that forced me to think differently before I got very far into actually working in the software industry.

    Because I now think there's either a critical period [wikipedia.org] (or possibly, at a minimum, a critical attitude of some kind) after which a lot of programmers tend to lose either the humility or the curiosity that drives people to think about different programming constructs and habits. I think if a programmer has been minimally exposed before they reach it, they'll keep just enough of one or both of those attributes that they'll be interested in what they don't already know, rather than arriving at the point where "they've already learned the last programming language they'll ever need." [blogspot.com]

    And if they don't get so exposed, they become Blub programmers [paulgraham.com], where generally $Blub is some industry-leading language that does enough that you don't easily bump up against tasks that are near impossible in it.

    To tie this back in with a point I think the author missed, I suspect that some of the difficulties with Basic are actually part of the reason why it didn't end up ruining more programmers. Almost everybody who really came to grips with it as a tool probably realized that it couldn't possibly be the last programming language you'd ever need (if it weren't enough that any effort to look into working as a programmer revealed that Basic was clearly not the strongest payroll ticket).

  • by Eraesr ( 1629799 ) on Wednesday March 10, 2010 @04:50AM (#31424056) Homepage
    At college we started off in Pascal but quickly moved on to C and a bit of Java. I'd say that if anyone is considering BASIC as a first language, they should choose Pascal instead. But to be completely honest, these days OO programming has become so important that it's probably better to start off in Java or C# from the start.
  • by pedestrian crossing ( 802349 ) on Wednesday March 10, 2010 @04:58AM (#31424092) Homepage Journal
    I also started in Fortran, and by the time I got through my second semester I had an epiphany: highly structured code is very important if you are going to maintainably do anything of significant complexity. I don't think the language matters so much; once you get to a certain point, you realize that you have to modularize your code if you are going to create anything beyond a simple classroom assignment.

    Languages like BASIC and Perl are great introductions for beginners because they are interpreted and present a low barrier to entry. The focus can be on basic programming concepts, and they provide instant gratification. That said, if one is to go further and "become a programmer", you have to understand the need for structure, typing, scope, etc., and take things to the next level. If you have the aptitude to be a good programmer, this will become clear to you as you take on more complex tasks. If you don't have the aptitude, well, you are going to be a shitty programmer no matter what language you started with.

    GOTO in a high-level language is bad in that it is a crutch that is tempting for the beginner to reach for, and overuse makes code difficult to maintain. Making a rule of avoiding it forces you to think through your flow/structure/logic.
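
    A minimal Java sketch of the structured alternative the comment above argues for: the retry that GOTO would express as a jump back to a line number becomes a bounded loop whose exit condition sits in the loop header. The names (RetryLoop, MAX_ATTEMPTS) are illustrative, not anything from the thread.

        import java.util.Scanner;

        public class RetryLoop {
            static final int MAX_ATTEMPTS = 3;

            public static void main(String[] args) {
                Scanner in = new Scanner(System.in);
                Integer value = null;
                // Structured retry: the loop header states exactly when we re-prompt,
                // where a BASIC-style "ON ERROR GOTO 10" would hide it in a line number.
                for (int attempt = 0; attempt < MAX_ATTEMPTS && value == null; attempt++) {
                    System.out.print("Enter a number: ");
                    try {
                        value = Integer.parseInt(in.nextLine().trim());
                    } catch (NumberFormatException e) {
                        System.out.println("Not a number, try again.");
                    }
                }
                System.out.println(value == null ? "Gave up." : "Got " + value);
            }
        }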
  • by rsidd ( 6328 ) on Wednesday March 10, 2010 @05:05AM (#31424128)

    Thanks for the link. However, I'm not sure you read beyond the title. On page 2, Knuth foresees two types of reactions based on reading the title alone, and yours sounds like the first type. In fact Knuth does not disagree with Dijkstra, and he quotes Dijkstra to show that Dijkstra was not dogmatic about GOTO either. Knuth's purpose is to explore where GOTO has a place and where it is better to eliminate it.

  • Re:Time heals (Score:3, Insightful)

    by jandersen ( 462034 ) on Wednesday March 10, 2010 @05:35AM (#31424258)

    Depends on the BASIC. I use RealBasic at work as an alternative to LabView. It's object oriented, multithreaded and completely "Visual".

    Not really; the BASIC Dijkstra talks about is not the modern development tool, which is barely BASIC any longer. Regarded as a programming language, modern 'BASIC' is not really much different from C++ or what can loosely be called "Object Pascal"; it only resembles BASIC because it has kept many of the same keywords.

    No, BASIC was and still is, as far as I am concerned, the line-by-line interpreted language that was meant to be a simplified FORTRAN, and like most languages from that time it has a number of features that codify some of the things you can do in assembler (but really shouldn't). I still shudder at the thought of COBOL's PERFORM SECT1 THROUGH SECT5 VARYING ..., a sort of combination of a FOR loop and a GOSUB that allows overlapping subroutines with no explicit returns.

    BASIC allows practices that are bad because they can make programs unmaintainable, and the limitations of the language mean that you are more or less forced to code that way. This is of course not because those who designed the language were idiots; it was never meant to be used for serious programming. Beginner's All-purpose Symbolic Instruction Code: the name says it all. It was only an introduction to things like FORTRAN (for numerical computing) or COBOL (for I/O manipulation).

    Am I the only person on the Earth who just writes off hysterical, panty-wetting stuff like this?

    Probably not, but it seems to me that you are the one hysterically wetting your pants. A person with a sense of humour would see that this kind of radical statement was meant as a provocative input into a sometimes heated discussion. If one were to interpret your words in a positive spirit, one could say the same: that you are probably just emphasizing a viewpoint by exaggerating.

  • by bluescreenbert ( 1185323 ) on Wednesday March 10, 2010 @06:20AM (#31424382)

    And the idea of mixing "data" and "executable code"??? Really? Damn. Sounds like injectable code execution exploits to me. When I started, we knew to keep those two things separate from the very beginning. Object orientation mixes them up and probably does more to lead to exploitable code than anything else.

    Your post pretty much proves Dijkstra's point. You did not manage to let go of your old thinking habits. You do not bother to think about how an object-oriented compiler works, and it sounds to me like you are stuck in 80's-style programming. For your information, object orientation does not mix up data and code. It merely gives the programmer a paradigm for accessing data. Code is related to classes, data is related to objects.
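
    A tiny Java sketch of the class/object split described above, with Counter as an invented name: the method body is compiled once per class, while each object carries only its own fields, so the source-level "mixing" of data and code never puts executable code inside the data.

        public class Counter {
            private int count;            // data: one copy per object

            void increment() {            // code: compiled once, shared by every Counter
                count++;
            }

            public static void main(String[] args) {
                Counter a = new Counter();
                Counter b = new Counter();
                a.increment();
                a.increment();
                b.increment();
                // Same method, separate state:
                System.out.println(a.count + " " + b.count);   // prints "2 1"
            }
        }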

  • by Nerdfest ( 867930 ) on Wednesday March 10, 2010 @06:39AM (#31424488)
    When programming in Java, I still resist breaking out of loops, or multiple returns, even in small methods. These things seem to be normal acceptable practice with most developers, and really are fine when used in the right places. Because of early exposure to BASIC (and badly written C) I avoid them more than most.

    I think the BASIC of today is JavaScript. You see more badly written JavaScript than any other language.
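
    A short Java sketch of the two constructs mentioned above, an early return and a labeled break, in the kind of small method where they are arguably clearer than flag variables; indexOf and the sample data are made up for illustration.

        public class EarlyExit {
            // Multiple returns in a small method: the guard clause and the hit
            // both exit as soon as the answer is known.
            static int indexOf(int[] xs, int target) {
                if (xs == null) return -1;
                for (int i = 0; i < xs.length; i++) {
                    if (xs[i] == target) return i;
                }
                return -1;
            }

            public static void main(String[] args) {
                int[][] rows = { {1, 2}, {3, 4}, {5, 6} };
                // A labeled break exits both loops at once, with no flag variable.
                search:
                for (int[] row : rows) {
                    for (int x : row) {
                        if (x == 4) {
                            System.out.println("found 4");
                            break search;
                        }
                    }
                }
                System.out.println(indexOf(new int[] {3, 4}, 4));   // prints 1
            }
        }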
  • by Moraelin ( 679338 ) on Wednesday March 10, 2010 @06:43AM (#31424504) Journal

    Well it IS done with jump instructions, but as few as possible, because the branch penalty is usually high (especially on x86). If you don't use the goto statement then your program is more abstract, and structures like loops can be more easily optimised by the compiler to use as few branches as possible. Not to mention architecture-specific things like ARM's condition codes, which can turn a loop with multiple if-else statements into a block of code with only one branch instruction.

    Point taken, but in my experience people who have even a marginal idea of what happens under the covers tend to write better code than those for whom the underlying machine is a complete mystery. I'm not talking about premature optimization; merely knowing in the back of your head what a pointer is, or _why_ this operation is O(log n) and thus better than O(n), can save one from a lot of awfully wrong guesses and awful code.

    My canonical example is a team whose architect (!) finally read somewhere that when passing an object to a Java method, only the pointer is passed on the stack. So he actually decreed -- and none of the lemmings knew better -- that they should use parameters like the wrapper object Integer instead of the primitive int. (We're also talking Java 1.3 times, so no automatic boxing/unboxing either.) Because, I quote, "If you use Integer Java copies only a pointer to it, not the whole int."

    Maybe knowing how much space an int takes under the covers would have helped.

    Another time I hear my now ex-colleague Wally (not his real name, but pretty accurate ;)) repeatedly going, "That can't be true!" and the like. Curiosity gets the better of me and I ask what's the problem.

    "Java has a bug!" he goes, "I put a new key/value with the same hash code in a HashMap and it just replaced my old value!"

    "Oh, yeah, we've had the same bug at the old company, " Wally 2 chimes in. "We had to manually set the capacity so it goes in another bucket."

    (I clench my teeth to avoid screaming at the notion that there's any way to pick the right capacity to avoid collisions for keys that are random strings.)

    I go and look at what he's doing, and sure enough he's got the debugger open and is looking at the bucket array of a HashMap. "Look! There! I had a different value and it replaced it!"

    "Aha, " I try to be diplomatic, "can you please expand that 'next' variable there?"

    "No, you don't understand! My value was there and now it replaced it!"

    "Yes, I get it. But I want to see what's in that 'next' variable."

    He clicks and goes, "Oh... there it is."

    The whole concept of a linked list was new to him, obviously.

    And if you think that's an isolated case: in the meantime I've run into two different teams whose "architect" actually made it mandatory to plaster his broken replacement for the hash-code method everywhere, because of that supposed "bug in Java." Supposedly they can hash a long-ish random String into a 32-bit int without ever having collisions. (OK, 31 bits, since Java doesn't use the sign.) Consulting can be a depressing business, you know?

    I could go on with more such WTF examples, but basically let's just say I wish more people knew exactly what happens behind those high-level constructs and libraries. Because otherwise I see them making their own guesses anyway, and guessing wrong. I wish they'd know what a pointer really means, and why a LinkedList does _not_ use less memory than an ArrayList, and, yes, what kinds of things will cause jumps. Or what kinds of things will be optimized into a tail recursion instead of a plain recursion, as a trivial example of where it pays to know the difference between a JUMP and the bunch of PUSHes and a CALL generated by the compiler.
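
    As a sketch of why the Integer-instead-of-int decree above was backwards (method names invented): an int argument is the 32-bit value itself, copied cheaply, while an Integer argument is a copied reference to a separately allocated heap object that is strictly larger than the int it wraps. (Java 1.3, as in the story, also had no autoboxing, so every call site would have needed explicit wrapping.)

        public class WrapperCost {
            // The 32-bit value itself is copied into the parameter slots.
            static int addPrimitive(int a, int b) {
                return a + b;
            }

            // Only references are copied, but each points at a heap object
            // with an object header, strictly larger than the int it wraps.
            static Integer addBoxed(Integer a, Integer b) {
                return Integer.valueOf(a.intValue() + b.intValue());
            }

            public static void main(String[] args) {
                System.out.println(addPrimitive(1, 2));                               // no allocation at all
                System.out.println(addBoxed(Integer.valueOf(1), Integer.valueOf(2)));
            }
        }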
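
    And a sketch of the supposed HashMap "bug": two distinct keys with the same hash code land in the same bucket and get chained, which is exactly what the 'next' field in the debugger was showing. Key is an invented class that forces the collision.

        import java.util.HashMap;
        import java.util.Map;

        public class CollisionDemo {
            // Invented key type: every instance hashes into the same bucket.
            static final class Key {
                final String name;
                Key(String name) { this.name = name; }
                @Override public int hashCode() { return 42; }   // deliberate collision
                @Override public boolean equals(Object o) {
                    return o instanceof Key && ((Key) o).name.equals(this.name);
                }
            }

            public static void main(String[] args) {
                Map<Key, String> map = new HashMap<>();
                map.put(new Key("a"), "first");
                map.put(new Key("b"), "second");   // same hash, different key: chained, not replaced
                System.out.println(map.get(new Key("a")));   // prints "first"
                System.out.println(map.get(new Key("b")));   // prints "second"
                System.out.println(map.size());              // prints 2
            }
        }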

  • by r00t ( 33219 ) on Wednesday March 10, 2010 @07:06AM (#31424572) Journal

    First of all, you're clearly not an article reader. The point of initially learning with a painfully unstructured language is that you end up appreciating structure more, while being able to tolerate code that has awful structure. IMHO a better choice is assembly, but BASIC does have the advantage of providing faster rewards.

    Second of all, many of us dispute the bit about Python not being a toy language. If you build your skills around Python, you'll hit serious trouble if you ever end up needing decent performance or unwrapped OS functionality. If you build your skills around C, whole new possibilities open up to you. You could even write a non-toy OS if you were so inclined.

  • by wileypob ( 446033 ) on Wednesday March 10, 2010 @07:28AM (#31424646)

    First you complain about people not understanding how machines actually work, then you make it perfectly clear you've no idea how object oriented languages are implemented. In C++, code still goes in .text sections, and data is still on the heap (or possibly .data or .rodata), so the mixing of code and data is an abstraction.

    "Object orientation mixes them up and probably does more to lead to exploitable code than anything else."
    Wow, what BS.

    BTW, I also grew up in the 80's on microcomputer ROM BASICs, and while I had to unlearn bad habits when I went to real languages, I did learn a lot about how to solve problems.

  • by Anonymous Coward on Wednesday March 10, 2010 @07:55AM (#31424732)

    And the BASIC of today is Python.

    easy != bad
    Complicated conventions and workarounds don't make a good programming language.
    Teaching people to program in C is so much more complicated than in Java/Python, because you have to explain all the conventions (read: main, NULL) and workarounds (read: arglists, macros) people came up with.

  • by AVee ( 557523 ) <slashdotNO@SPAMavee.org> on Wednesday March 10, 2010 @08:18AM (#31424818) Homepage
    ' Emit the last value seen for each run of equal ids:
    ' the classic read-ahead loop, no goto needed.
    beforeEOF = receive(id, val)
    previousId = id
    previousVal = val
    while (beforeEOF)
        beforeEOF = receive(id, val)
        if (not beforeEOF or id <> previousId)
            output previousId, previousVal
        end if
        previousId = id
        previousVal = val
    end while
  • by Jurily ( 900488 ) <jurily&gmail,com> on Wednesday March 10, 2010 @09:02AM (#31425004)

    Most importantly it's not a toy language.

    That's why it should be kept far away from beginners. What if they accidentally type "import skynet", huh?

  • by beakerMeep ( 716990 ) on Wednesday March 10, 2010 @09:12AM (#31425040)
    That's just nonsense. JavaScript is great for learning simply because it's good at giving immediate results that users can see, in a format they can all relate to (web pages). It's also a very nice language that doesn't throw too much I/O or memory addressing at the beginner. It adds a bit of "fun" to the coding process, something that was sorely lacking when I wrote my first hello world in Fortran.

    A lot of the older coders hate it, though, because it's easy and flexible and gives the programmer plenty of rope to hang themselves, but that's just "get off my lawn" nonsense. The reality is, and I'm surprised no slashdotter has mentioned this yet, that we all feel like we cut our teeth on the best programming language. And those other guys who do stuff differently must be wrong.

    I swear I hear Foghorn Leghorn's voice when reading some of these comments. "I say, I say, a those new-fangled curly braces languages are an abominations boy! An abominable snowman, I say!"

    This attitude is pervasive in the developer world, and if you don't believe me, re-read all these comments. Developers need a bit more self-reflection about what it means to be a good developer. It's not just about what crazy hard language you started with, or what perfect cohesion or design-pattern mastery you have. Nowadays it's also about teamwork, problem solving, readability, modularity, and user experience. I know a few guys who might be masters of the command line, wizards at writing regexes, and zen-like in their ability to do bit-shifting math. But for most projects, I wouldn't want them anywhere near my team/code.

    People forget that learning isn't all about what the "right way" is. Learning is about accumulating knowledge in steps, and then retaining it. Making the process fun means better retention. I'd hope that more of the developer world groks this soon, as the machismo bullshit that comes out of these conversations is what drives smart and nice people away from developing.
  • by OeLeWaPpErKe ( 412765 ) on Wednesday March 10, 2010 @10:18AM (#31425702) Homepage

    I still love Delphi. Talk about simply starting a program and getting it running: Delphi was heaven on earth. Yes, it can't do everything (though its WinAPI support is good enough to write even things like filesystem drivers or firewalls in Delphi), but it was such a pleasure to get a small application running quickly.

    I still don't understand why academics have such hate for that language. And the language they mostly replace it with, Java, is not better; it's a lot worse.

    The Object Pascal compiler was a pleasure to use compared to today's clunky, slow-as-hell compilers. People underestimate the advantages of a hugely fast compiler.
