The Value of BASIC As a First Programming Language
Mirk writes "Computer-science legend Edsger W. Dijkstra famously wrote: 'It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.' The Reinvigorated Programmer argues that the world is full of excellent programmers who cut their teeth on BASIC, and suggests it could even be because they started out with BASIC."
Good programmers aren't easily ruined (Score:5, Insightful)
A good programmer has experienced many languages and done things in many ways. A good programmer has compared all these experiences and understands the advantages and disadvantages of each language and programming technique. A good programmer doesn't get bogged down in line numbers and GOTO statements, never moving beyond them. Anyone who does get bogged down never had the attitude to be a good programmer in the first place.
Time heals (Score:3, Insightful)
Depends on the BASIC. I use RealBasic at work as an alternative to LabView. It's object oriented, multithreaded and completely "Visual".
as potential programmers they are mentally mutilated beyond hope of regeneration
Am I the only person on the Earth who just writes off hysterical, panty-wetting stuff like this? When did he say this? In the 1960s or something? It's 2010 now, right?
Second story from this blog this week... (Score:3, Insightful)
...and I'm wondering: what's with the random sushi pictures?
Funny argument (Score:5, Insightful)
BASIC is great for kids (Score:5, Insightful)
I disagree with the premise that BASIC teaches bad habits. I stick with the old adage that a bad workman blames his tools. BASIC teaches kids (like I was, over 30 years ago) from the ages of 5-6 how to put together simple logic, and gives them the very basics of language constructs like variables, loops, subroutines, etc., without them having to grasp structures, classes, polymorphism, OO, and other concepts that plenty of grown-ups who have been writing code for quite some time still have issues with.
Re:BASIC is irrelevant (Score:5, Insightful)
If you don't recognise the relevance of BASIC as described in the article, it's possible you're a decade or two younger than the individuals the article is referring to.
Re:Funny argument (Score:3, Insightful)
Some people think they know what spaghetti code is, but unless they've written code with line numbers, they probably don't.
And the good old days of 'LET A = NOT PI' to save three bytes of RAM :).
(ex-Sinclair BASIC programmers will understand why such arcane constructs were beneficial when you were low on RAM)
Appreciate the difference (Score:5, Insightful)
As a programmer who started with old-school BASIC (numbered lines, etc.), I was overjoyed by the elements of structured programming in Turbo Basic, and totally excited by C when I learned it. It felt like having my hands untied. So I would state the contrary: you cannot fully appreciate structured programming unless you have been through GOTO hell.
I hear a lot of similar FUD from some people, like "you can't grok OOP if you started with C", or "anyone who touched .NET or Java is lost for C++..." It boils down to "people are idiots, they can't possibly learn anything new, they are either indoctrinated at birth in My True Way, or lost and hopeless." Who in their right mind would take that seriously?
Re:Good programmers aren't easily ruined (Score:4, Insightful)
Absolutely. The "never use goto" rule was drummed into me (and apparently most of my coworkers), and I can't believe how liberating it is once you realize that, yes, in appropriate contexts "goto" can result in code (especially in error-handling cases) that is both more readable and more efficient. Dogma is rarely the answer.
Re:Good programmers aren't easily ruined (Score:5, Insightful)
The irony is that under the covers, it's all done with jump instructions anyway.
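You can see this for yourself with a minimal sketch using Python's dis module (a toy function of my own; the exact opcode names vary between interpreter versions, but on CPython a structured loop still compiles to conditional jumps):

```python
import dis

def countdown(n):
    # A perfectly structured while loop, no goto in sight...
    while n > 0:
        n -= 1
    return n

# ...yet the compiled bytecode is built from jump opcodes.
ops = {ins.opname for ins in dis.get_instructions(countdown)}
print(any("JUMP" in op for op in ops))
```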
Bah (Score:5, Insightful)
Like all eminently quotable people, Dijkstra tended to hyperbole and oversimplification.
Re:BASIC is irrelevant (Score:1, Insightful)
We don't really need another generation of people who'd like to be programmers but don't want to learn the intricacies behind it. We've enough problems as it is, with SQL injections and stuff.
Re:BASIC is great for kids (Score:3, Insightful)
I maintain that Python can do this much more easily, because it has far fewer ridiculous syntax requirements and a simpler-to-read format. The things you learn in BASIC can be taught at a much more significant level in Python (inline AND block comments, order of instructions, basic control structures, variables and their use) without jumping through the hoops that you will in BASIC. Python's use of whitespace and its extensibility let beginners with an aptitude expand on their knowledge easily by making simple inferences about how some control structures work (Python has an excellent for loop implementation).
Python does not require classes, can be used as an imperative language, and can provide subroutines in the form of functions simply by defining them in the environment (which is not a difficult concept to grasp; "the computer needs to be told that it can use it" will suffice for beginners, and is true enough).
Basically, Python can provide all the same benefits as BASIC without the stupid unnecessary crap (explicit line numbers? Really? Are we still using punch cards?) that always annoyed me.
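The kind of first lesson I mean needs nothing but a few lines. A made-up beginner example (not from any curriculum) covering a variable, a loop, and a subroutine, with no line numbers or class boilerplate:

```python
def times_table(n):
    """Build the times table for n as a list of printable lines."""
    lines = []
    for i in range(1, 11):            # the for loop walks the range directly
        lines.append(f"{n} x {i} = {n * i}")
    return lines

# Call the subroutine and print its result, one line per product.
for line in times_table(7):
    print(line)
```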
I will give BASIC one thing: its basic geometric drawing library was really easy to use. Problem is, we NEVER USED IT except as a "hey, so this is how you use it, now we're never going to talk about it again." Might as well have learned LOGO (which I had taken as well; turtles were awesome and provided a good intro to iteration, even if it was never described as such in class).
Re:Good programmers aren't easily ruined (Score:3, Insightful)
Much like a chain saw not being an all purpose cutting tool....
So... avoid using it when not strictly necessary, but if it is the only sane/high performance way of getting things done in a special use case, by all means...
I'd guess there's a critical period & an attitude (Score:4, Insightful)
I think the author is mostly on point. He's aware Dijkstra was exaggerating for effect, but also that there's truth in it... if you started programming in the early home computing era, you probably started with a BASIC. I was lucky enough to get some varied exposure to other languages earlier (LOGO and some shallow assembly), but until I was 15, it was pretty much Basic.
And none of my programming habits now resemble anything close to the BASIC I wrote in when I was that age. Except, occasionally, for the rare cases where global state seems to make sense, and even then, I try to namespace things in one way or another. But by and large, I picked up structured programming, I picked up object-oriented programming, I picked up logic programming, and I'm learning to enjoy functional programming.
I will say... there was a time when I was probably close to being "ruined." It was when I was learning C++, and I only really had Pascal, basic C, and Basic under my belt. And I had a pretty solidly structured-imperative mindset, and really hadn't seen any other way of doing things. C++ married data structures and methods in an interesting way, but it didn't seem like more than a stylistic practice to me. I was pretty sure most languages were alike, you just had syntax and typing differences.
But there was one thing: I'd had to learn Prolog for a very specific job. We were teaching it to high school students in a CS summercamp I worked at for a few years. The first year, I just thought "Man, this is weird," more or less got through all the exercises, and left it behind, and did what most people do: dismiss it as an odd research toy. The second year, I thought "this is weird, but interesting." The third year, I thought "Wow. There are all kinds of intriguing ideas here."
And there are, and I still think it could stand to see more usage in mainstream software, but more importantly, I think I'm pretty lucky I got repeated exposure to a language that forced me to think differently before I got very far into actually working in the software industry.
Because I now think there's either a critical period [wikipedia.org] (or possibly, at a minimum, a critical attitude of some kind) after which a lot of programmers tend to lose either the humility or the curiosity that drives people to think about different programming constructs and habits. I think if a programmer has been minimally exposed before they reach it, they'll keep just enough of one or both of those attributes that they'll be interested in what they don't already know, rather than arriving at the point where "they've already learned the last programming language they'll ever need." [blogspot.com]
And if they don't get so exposed, they become Blub programmers [paulgraham.com], where generally $Blub is some industry-leading language that does enough you don't easily bump up against tasks that are near impossible in it.
To tie this back in with a point I think the author missed, I suspect that some of the difficulties with Basic are actually part of the reason why it didn't end up ruining more programmers. Almost everybody who really came to grips with it as a tool probably realized that it couldn't possibly be the last programming language you'd ever need (if it weren't enough that any effort to look into working as a programmer revealed that Basic was clearly not the strongest payroll ticket).
Re:BASIC is irrelevant (Score:4, Insightful)
Re:Good programmers aren't easily ruined (Score:5, Insightful)
Re:Good programmers aren't easily ruined (Score:3, Insightful)
Thanks for the link. However, I'm not sure you read beyond the title. On page 2, Knuth foresees two types of reaction based on reading the title alone, and yours sounds like the first type. In fact Knuth does not disagree with Dijkstra, and he quotes Dijkstra to show that Dijkstra was not dogmatic about GOTO either. Knuth's purpose is to explore where GOTO has a place and where it is better eliminated.
Re:Time heals (Score:3, Insightful)
Depends on the BASIC. I use RealBasic at work as an alternative to LabView. It's object oriented, multithreaded and completely "Visual".
Not really; the BASIC Dijkstra talks about is not the modern development tool that is barely BASIC any longer. Regarded as a programming language, modern 'BASIC' is not really much different from C++ or what can loosely be called "Object Pascal"; it only resembles BASIC because it has kept many of the same keywords.
No, BASIC was, and still is as far as I am concerned, the line-by-line interpreted language that was meant to be a simplified FORTRAN, and like most of the languages from that time it has a number of features that codify some of the things you can do in assembler (but really shouldn't). I still shudder at the thought of COBOL's PERFORM SECT1 THROUGH SECT5 VARYING ...: a sort of combination of a FOR loop and a GOSUB that allows overlapping subroutines with no explicit returns.
BASIC allows practices that are bad because they can make programs unmaintainable, and the limitations of the language mean that you are more or less forced to code that way. This is of course not because the language's designers were idiots; it was never meant to be used for serious programming. Beginner's All-purpose Symbolic Instruction Code: the name says it all. It was only an introduction to things like FORTRAN (for numerical computing) or COBOL (for I/O manipulation).
Am I the only person on the Earth who just writes off hysterical, panty-wetting stuff like this?
Probably not, but it seems to me that you are the one hysterically wetting your pants. A person with a sense of humour would see that this kind of radical statement was meant as provocative input into a sometimes heated discussion. If one were to interpret your words in a positive spirit, one could say the same: that you are probably just emphasizing a viewpoint by exaggerating.
Re:Good programmers aren't easily ruined (Score:3, Insightful)
And the idea of mixing "data" and "executable code"??? Really? Damn. Sounds like injectable code execution exploits to me. When I started, we knew to keep those two things separate from the very beginning. Object orientation mixes them up and probably does more to lead to exploitable code than anything else.
Your post pretty much proves Dijkstra's point: you did not manage to shake off your old thinking habits. You do not bother to think about how an object-oriented compiler works, and it sounds to me like you are stuck in 80's-style programming. For your information, object orientation does not mix up data and code. It merely gives the programmer a paradigm for accessing data: code is related to classes, data is related to objects.
Re:BASIC is irrelevant (Score:4, Insightful)
I think the BASIC of today is JavaScript. You see more badly written JavaScript than any other language.
Point taken, but... (Score:5, Insightful)
Point taken, but in my experience people who have even a marginal idea of what happens under the covers tend to write better code than those for whom the underlying machine is a complete mystery. I'm not talking premature optimization; merely knowing in the back of your head what a pointer is, or _why_ this operation is O(log n) and thus better than O(n), can save one from a lot of awfully wrong guesses and awful code.
My canonical example is a team whose architect (!) finally read somewhere that when passing an object to a Java method, only the pointer is passed on the stack. So he actually decreed -- and none of the lemmings knew better -- that they should use parameters like the wrapper object Integer instead of the primitive int. (We're also talking Java 1.3 times, so no automatic boxing/unboxing either.) Because, I quote, "If you use Integer, Java copies only a pointer to it, not the whole int."
Maybe knowing how much space an int takes under the covers would have helped.
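On the O(n) versus O(log n) point, a made-up toy comparison (my own function names; using Python's stdlib bisect for the binary search) shows why the distinction is worth keeping in the back of your head:

```python
from bisect import bisect_left

def linear_search(items, target):
    """O(n): may have to inspect every element."""
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def binary_search(sorted_items, target):
    """O(log n): halves the search space each step (requires sorted input)."""
    i = bisect_left(sorted_items, target)
    if i < len(sorted_items) and sorted_items[i] == target:
        return i
    return -1

data = list(range(1_000_000))
# Both find the same index; the linear scan does ~1,000,000 comparisons,
# the binary search about 20.
print(linear_search(data, 999_999), binary_search(data, 999_999))
```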
Another time I hear my now ex-colleague Wally (not his real name, but pretty accurate ;)) repeatedly going, "That can't be true!" and the like. Curiosity gets the better of me and I ask what the problem is.
"Java has a bug!" he goes, "I put a new key/value with the same hash code in a HashMap and it just replaced my old value!"
"Oh, yeah, we've had the same bug at the old company, " Wally 2 chimes in. "We had to manually set the capacity so it goes in another bucket."
(I clench my teeth to avoid screaming at the notion that there's any way to pick the right capacity to avoid collisions for keys that are random strings.)
I go and look at what he's doing, and sure enough he's got the debugger open and is looking at the bucket array of a HashMap. "Look! There! I had a different value and it replaced it!"
"Aha, " I try to be diplomatic, "can you please expand that 'next' variable there?"
"No, you don't understand! My value was there and now it replaced it!"
"Yes, I get it. But I want to see what's in that 'next' variable."
He clicks and goes, "Oh... there it is."
The whole concept of a linked list was new to him, obviously.
And if you think that's an isolated case, in the meantime I've run into two different teams whose "architect" actually made it mandatory to plaster his broken replacement for the hash-code method everywhere, because of that supposed "bug in Java." Supposedly they can hash a long-ish random String into a 32 bit int without ever having collisions. (Ok, 31 since Java doesn't use the sign.) Consulting can be depressing business, you know?
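What Wally saw was no bug: equal hash codes put entries in the same bucket, but the keys are still compared for equality, so both entries survive. A minimal sketch in Python (contrived class and names are my own; Python's dict resolves collisions by probing rather than a linked list, but the principle, comparing keys on collision, is the same):

```python
class BadKey:
    """A key whose hash always collides, like colliding random strings."""
    def __init__(self, name):
        self.name = name
    def __hash__(self):
        return 42                      # every BadKey lands in the same bucket
    def __eq__(self, other):
        return isinstance(other, BadKey) and self.name == other.name

m = {}
m[BadKey("a")] = 1
m[BadKey("b")] = 2                     # same hash, different key: no overwrite

print(len(m))                          # -> 2: both entries are still there
print(m[BadKey("a")], m[BadKey("b")])  # -> 1 2
```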
I could go on with more such WTF examples, but basically let's just say I wish more people knew exactly what happens behind those high-level constructs and libraries. Because otherwise I see them take their own guesses anyway, and guess wrong. I wish they knew what a pointer really means, and why a LinkedList does _not_ use less memory than an ArrayList, and, yes, what kinds of things will cause jumps. Or which calls will be optimized into a tail call instead of a plain recursion, as a trivial example of where it pays to know the difference between a JUMP and the bunch of PUSHes and a CALL generated by the compiler.
no, Python is not the language to start with (Score:3, Insightful)
First of all, you're clearly not an article reader. The point of initially learning with a painfully unstructured language is that you end up appreciating structure more, while still being able to tolerate code with awful structure. IMHO a better choice is assembly, but BASIC does have the advantage of providing faster rewards.
Second of all, many of us dispute the bit about Python not being a toy language. If you build your skills around Python, you'll hit serious trouble if you ever end up needing decent performance or raw, unwrapped OS functionality. If you build your skills around C, whole new possibilities open up to you. You could even write a non-toy OS if you were so inclined.
Re:Good programmers aren't easily ruined (Score:3, Insightful)
First you complain about people not understanding how machines actually work, then you make it perfectly clear you've no idea how object oriented languages are implemented. In C++, code still goes in .text sections, and data is still on the heap (or possibly .data or .rodata), so the mixing of code and data is an abstraction.
"Object orientation mixes them up and probably does more to lead to exploitable code than anything else."
Wow, what BS.
BTW, I also grew up in the 80's on microcomputer ROM BASICs, and while I had to unlearn bad habits when I moved to real languages, I did learn a lot about how to solve problems.
Re:BASIC is irrelevant (Score:1, Insightful)
And the BASIC of today is Python.
easy != bad
Complicated conventions and workarounds don't make a good programming language.
Teaching people to program in C is so much more complicated than in Java/Python, because you have to explain all the conventions (read: main, NULL) and workarounds (read: arg lists, macros) people came up with.
Re:Good programmers aren't easily ruined (Score:4, Insightful)
previousId = id
previousVal = val
beforeEOF = true
while (beforeEOF)
    beforeEOF = receive(id, val)
    if (not beforeEOF or id <> previousId)
        output previousId, previousVal
    end if
    previousId = id
    previousVal = val
end while
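What the pseudocode above is driving at (emit the last value of each run of equal ids) can be written as a runnable Python sketch; receive is simulated from a list of (id, val) records, and the names are my own:

```python
def last_val_per_id(records):
    """Emit (id, val) for the last value seen in each run of equal ids."""
    output = []
    previous = None                    # plays the role of previousId/previousVal
    for rec in records:                # each rec stands in for one receive(id, val)
        if previous is not None and rec[0] != previous[0]:
            output.append(previous)    # id changed: flush the finished group
        previous = rec
    if previous is not None:
        output.append(previous)        # end of input: flush the last group
    return output

print(last_val_per_id([(1, "a"), (1, "b"), (2, "c"), (2, "d")]))
# -> [(1, 'b'), (2, 'd')]
```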
Re:BASIC is irrelevant (Score:3, Insightful)
Most importantly it's not a toy language.
That's why it should be kept far away from beginners. What if they accidentally type "import skynet", huh?
Re:Start with scripting (Score:3, Insightful)
A lot of the older coders hate it, though, because it's easy and flexible and gives the programmer plenty of rope to hang themselves, but that's just "get off my lawn" nonsense. The reality is, and I'm surprised no slashdotter has mentioned this yet, that we all feel like we cut our teeth on the best programming language. And those other guys who do stuff differently must be wrong.
I swear I hear Foghorn Leghorn's voice when reading some of these comments. "I say, I say, those new-fangled curly-brace languages are an abomination, boy! An abominable snowman, I say!"
This attitude is pervasive in the developer world, and if you don't believe me, re-read all these comments. Developers need a bit more self-reflection about what it means to be a good developer. It's not just about what crazy hard language you started with, or what perfect cohesion or design-pattern mastery you have. Nowadays it's also about teamwork, problem solving, readability, modularity, and user experience. I know a few guys who might be masters of the command line, wizards at writing regexes, and zen-like in their ability to do bit-shifting math. But for most projects, I wouldn't want them anywhere near my team/code.
People forget that learning isn't all about what the "right way" is. Learning is about accumulating knowledge in steps, and then retaining it. Making the process fun means better retention. I'd hope that more of the developer world groks this soon, as the machismo bullshit that comes out of these conversations is what drives smart and nice people away from developing.
Re:Good programmers aren't easily ruined (Score:2, Insightful)
I still love Delphi. Talk about simply starting a program and getting it running: Delphi was heaven on earth. Yes, it can't do everything (though its WinAPI support is good enough to write even things like filesystem drivers or firewalls in Delphi), but it was such a pleasure to get a small application running quickly.
I still don't understand why academics have such hate for that language. And the language they mostly replace it with, Java, is not better; it's a lot worse.
The Object Pascal compiler was a pleasure to use compared to today's clunky and slow-as-hell compilers. People underestimate the advantages of a hugely fast compiler.