The Value of BASIC As a First Programming Language
Mirk writes "Computer-science legend Edsger W. Dijkstra famously wrote: 'It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.' The Reinvigorated Programmer argues that the world is full of excellent programmers who cut their teeth on BASIC, and suggests it could even be because they started out with BASIC."
Good programmers aren't easily ruined (Score:5, Insightful)
A good programmer has experienced many languages and done things in many ways. A good programmer has compared all these various experiences and understands the advantages and disadvantages of each language and programming technique. A good programmer doesn't get bogged down in line numbers and GOTO statements, never moving beyond them. If someone does get bogged down, they never had the attitude to be a good programmer in the first place.
Re: (Score:2)
I agree completely. My first language was QBASIC, and I didn't bring any of its habits with me, but I could still go back and write it just fine.
That said, if I were to teach someone how to program now, I would probably use JavaScript.
Re:Good programmers aren't easily ruined (Score:5, Interesting)
I learned in Fortran (I should qualify this by pointing out that I'm not a particularly good programmer) but it seems to me that writing logical code that uses GOTO statements would be a good introduction to computer logic. A complex program may become unreadable, but as a learning tool I could see that it might have merit. Good coding is about understanding logical procedure (and comments).
Re:Good programmers aren't easily ruined (Score:5, Insightful)
Re: (Score:3, Interesting)
it seems to me that writing logical code that uses GOTO statements would be a good introduction to computer logic.
I'll agree with Dijkstra to a point -- BASIC makes it harder to learn modern high level languages like C or Java. BASIC is more like assembly than it is like C; there's little difference between JMP FF37 and GOTO 100.
Now, if they're talking about Visual Basic, I'll agree with Dijkstra, that language is an abomination and should never be foisted on anyone.
I do databases at work, and used to use d
Re:Good programmers aren't easily ruined (Score:5, Informative)
Indeed. Dijkstra was frequently wrong, especially when he made grand sweeping statements.
GOTO is a good example, 'GOTO considered harmful' is practically biblical law amongst many programmers, but it's worth remembering that he made that statement in the context of an argument with Donald Knuth. Knuth won: (http://pplab.snu.ac.kr/courses/adv_pl05/papers/p261-knuth.pdf)
Re:Good programmers aren't easily ruined (Score:4, Insightful)
Absolutely. It was drummed into me (and apparently most coworkers), and I can't believe how liberating it is once you realize, yes, in appropriate contexts "goto" can result in code (especially in error handling cases) that is both more readable and more efficient. Dogma is rarely the answer.
Re:Good programmers aren't easily ruined (Score:5, Informative)
All of that discussion is past us now; most of us have been writing software without using goto for the last two decades. Goto has been replaced with try/catch constructs, labeled breaks, switch statements, etc. None of the examples Knuth provides in that paper are still relevant in any modern language. By that measure, Dijkstra won.
It's not surprising either. Dijkstra was always in utopia, talking about how things would be if he built the world himself (which doesn't mean he was wrong). Knuth has always been about how to deal with the current reality (including the state of programming languages), and not so much about changing that reality.
Re: (Score:3, Funny)
So far you've proven either that you suck at programming, or that you suck at cut and paste. But that is not a program that uses a goto. You could stick with your current proof, or if you try again you may remove all doubt.
Re:Good programmers aren't easily ruined (Score:4, Insightful)
previousId = id
previousVal = val
beforeEOF = true
while (beforeEOF)
    beforeEOF = receive(id, val)
    if (not beforeEOF || id <> previousId)
        output previousId, previousVal
    end if
    previousId = id
    previousVal = val
end while
Re:Good programmers aren't easily ruined (Score:5, Insightful)
The irony is that under the covers, it's all done with jump instructions anyway.
Point taken, but... (Score:5, Insightful)
Point taken, but in my experience people who have even marginal idea of what happens under the covers, tend to write better code than those for whom the underlying machine is a complete mystery. I'm not talking premature optimization, but merely knowing in the back of your head what a pointer is, or _why_ this operation is O(log n) and better thus than O(n), can save one from a lot of awfully wrong guesses and writing awful code.
My canonical example is a team whose architect (!) finally read somewhere that when passing an object to a Java method, only the pointer is passed on the stack. So he actually decreed -- and none of the lemmings knew better -- that they should use parameters like the wrapper object Integer instead of the primitive int. (We're also talking Java 1.3 times, so no automatic boxing/unboxing either.) Because, I quote, "If you use Integer Java copies only a pointer to it, not the whole int."
Maybe knowing how much space an int takes under the covers would have helped.
Another time I hear my now ex-colleague Wally (not his real name, but pretty accurate ;)) repeatedly going, "That can't be true!" and the like. Curiosity gets the better of me and I ask what's the problem.
"Java has a bug!" he goes, "I put a new key/value with the same hash code in a HashMap and it just replaced my old value!"
"Oh, yeah, we've had the same bug at the old company, " Wally 2 chimes in. "We had to manually set the capacity so it goes in another bucket."
(I clench my teeth to avoid screaming at the notion that there's any way to pick the right capacity to avoid collisions for keys that are random strings.)
I go and look at what he's doing, and sure enough he's got the debugger open and is looking at the bucket array of a HashMap. "Look! There! I had a different value and it replaced it!"
"Aha, " I try to be diplomatic, "can you please expand that 'next' variable there?"
"No, you don't understand! My value was there and now it replaced it!"
"Yes, I get it. But I want to see what's in that 'next' variable."
He clicks and goes, "Oh... there it is."
The whole concept of a linked list was new to him, obviously.
And if you think that's an isolated case, in the meantime I've run into two different teams whose "architect" actually made it mandatory to plaster his broken replacement for the hash-code method everywhere, because of that supposed "bug in Java." Supposedly they can hash a long-ish random String into a 32 bit int without ever having collisions. (Ok, 31 since Java doesn't use the sign.) Consulting can be a depressing business, you know?
I could go on with more such WTF examples, but basically let's just say I wish more people knew exactly what happens behind those high-level constructs and libraries. Because otherwise I see them take their own guesses anyway, and guess wrong. I wish they knew what a pointer really means, and why a LinkedList does _not_ use less memory than an ArrayList, and, yes, what kinds of things will cause jumps. Or what kinds of things will be optimized into a tail call instead of a plain recursive call, as a trivial example of where it pays to know the difference between a JMP and the bunch of PUSHes and a CALL generated by the compiler.
Re: (Score:3, Insightful)
Much like a chain saw not being an all purpose cutting tool....
So... avoid using it when not strictly necessary, but if it is the only sane/high performance way of getting things done in a special use case, by all means...
break 2; // electric boogaloo (Score:3, Informative)
Sometimes I wished `break` could take an argument saying how many levels it should break out of
It can in PHP [php.net]. Java [sun.com] and Perl [perl.org] have a different solution: label the start of a loop and then use that label as the argument of break. In C, it's just a matter of discipline to use goto only to replace a throw or labeled break.
Re: (Score:3, Insightful)
Thanks for the link. However, I'm not sure you read beyond the title. On page 2, Knuth foresees two types of reactions based on reading the title alone, and yours sounds like the first type. In fact Knuth does not disagree with Dijkstra, and he quotes Dijkstra to show that Dijkstra was not dogmatic about GOTO either. Knuth's purpose is to explore where GOTO has a place and where it is better to eliminate it.
Re: (Score:2)
Heh......this is what happens when you rush super-fast to comment without reading the article, post something unrelated, and end up getting modded down. You end up thinking slashdot moderation is awful. :)
Heh? Stop pontificating for 5 minutes would you? I haven't been modded down and my comment relates to the summary and is not meant to say anything negative about the article. Are you even aware of the irony of you rushing to criticize me without provocation? Pipe down.
Re: (Score:2)
Re: (Score:2, Offtopic)
Re: (Score:3, Interesting)
I too started out with a range of BASIC varieties. I coded on the Commodore PET and 64, Apple II and Color Computer and all sorts of things like those. I then moved on to assembly language for Motorola processors. Those two language experiences had a common lack -- functions: creating them, linking to libraries and all that. This meant I learned to do it ALL myself, whatever it was, and my code wasn't all that portable except for copy, rename and modify to suit. Everything was GOTO, GOSU
Re: (Score:3, Insightful)
And the idea of mixing "data" and "executable code"??? Really? Damn. Sounds like injectable code execution exploits to me. When I started, we knew to keep those two things separate from the very beginning. Object orientation mixes them up and probably does more to lead to exploitable code than anything else.
Your post pretty much proves Dijkstra's point. You did not manage to lay off your old thinking habits. You do not bother to think how an object oriented compiler works and to me it sounds that you are stuck in 80's style programming. For your information, object orientation does not mix up data and code. It merely gives the programmer a paradigm to access data. Code is related to classes, data is related to objects.
Re:Good programmers aren't easily ruined (Score:5, Funny)
"Your post pretty much proves Dijkstra's point. You did not manage to lay off your old thinking habits. You do not bother to think how an object oriented compiler works and to me it sounds that you are stuck in 80's style programming."
'Object-oriented programming is an exceptionally bad idea which could only have originated in California.' - Dijkstra
Re: (Score:3, Insightful)
First you complain about people not understanding how machines actually work, then you make it perfectly clear you've no idea how object oriented languages are implemented. In C++, code still goes in .text sections, and data is still on the heap (or possibly .data or .rodata), so the mixing of code and data is an abstraction.
"Object orientation mixes them up and probably does more to lead to exploitable code than anything else."
Wow, what BS.
BTW, I also grew up in the 80's on microcomputer ROM BASICs, and wh
Fuck him. (Score:4, Informative)
I was on Basic from 1986 to 1993, and those were the most meaningful years of my life.
Seconded! (Score:4, Interesting)
QBasic allowed me to take my learned lessons to the 8086 and add much, much more visually appealing graphical interfaces (still mostly games, but also editors, etc.)
QuickBasic introduced me to libraries and compilation; I've built some great hardware monitoring interfaces with sensors and relay switches.
Visual Basic allowed me to explore the Win32 API and libraries; I've built some of my greatest applications with it, ranging from editors and filtering proxies to a graphical music collection interface I could control with a remote.
I loved BASIC; it taught me so much, but most of all it taught me to love programming... The days of fun little programs in BASIC are over and I have no intention of ever going back... but there is definitely meaning there, and I would recommend that anyone try programming with BASIC... as a self-taught programmer I can say you will learn a lot from it.
Re:Seconded! (Score:4, Interesting)
Commodore Basic taught me to love programming, have fun making little games, and learn the all-important binary operations and encoding for sprites and fonts.
Preach it. My first exposure to binary was working through the examples in the Commodore 64 Programmers Reference Guide [commodore.ca] (still the best manual I've ever read). Programmable characters were 8x8 images stored as 8 bytes, each byte representing a row, and each bit representing a pixel. I probably went through that section 15 times before I truly understood and believed that binary math works, but the lesson paid off in spades. Back then, it was figuring out how to peek and poke a memory location to make a single pixel blink. Now it's loading and storing a memory location to toggle a serial control line on an embedded controller. That little machine gave me a good start.
Time heals (Score:3, Insightful)
Depends on the BASIC. I use RealBasic at work as an alternative to LabView. It's object oriented, multithreaded and completely "Visual".
as potential programmers they are mentally mutilated beyond hope of regeneration
Am I the only person on the Earth who just writes off hysterical, panty-wetting stuff like this? When did he say this? in the 1960s or something? It's 2010 now, right?
Re: (Score:2)
Re:Time heals (Score:4, Funny)
and... (Score:2)
Re: (Score:3, Informative)
Depends on the BASIC. I use RealBasic at work as an alternative to LabView.
RTFA. The author is quite clearly talking about non-block structured BASICs of the MS-BASIC kind.
Am I the only person on the Earth who just writes off hysterical, panty-wetting stuff like this?
Again, read for context. Dijkstra was being intentionally hyperbolic in a joke article when he wrote this. He did intend the point behind it, though.
Re: (Score:3, Insightful)
Depends on the BASIC. I use RealBasic at work as an alternative to LabView. It's object oriented, multithreaded and completely "Visual".
Not really; the BASIC Dijkstra talks about is not the modern development tool that is barely BASIC any longer. Regarded as a programming language, modern 'BASIC' is not really much different from C++ or what can loosely be called "Object Pascal" - it only resembles BASIC because it has kept many of the same keywords.
No, BASIC was and still is, as far as I am concerned, the line-by-line interpreted language that was meant to be a simplified FORTRAN, and like most of the languages from that time, it has a numb
Simplicity (Score:5, Informative)
There's something to it. I recently downloaded a ZX Spectrum+ manual from worldofspectrum.org (the colorful one), and was amazed by how simple the language is. The complete reference takes like 10 pages? And it can draw lines and circles..
Now compare it with any modern language, such as Java or Python. The language description itself takes 10x more than that, and the libraries available are vast. I am not arguing it's a bad thing; I am just arguing that simplicity may be a key here.
Re: (Score:2)
I started out on a little basic-in-rom interpreter and it was okay for a while. If I needed performance I had to write 6502 machine code. Eventually my dad built up a CP/M system and I got my hands on pascal.
But I knew people of my father's generation who grew up on Fortran and still wrote Fortran in newer languages (Pascal, C, etc.). IMHO Fortran was more harmful because it went further than BASIC, and the coding style it imprinted on people went deeper.
Meh (Score:2, Interesting)
I started out with QBasic. It was absolutely horrific - the language itself and the code I wrote as well. QB convinced me to never, ever try VB.
So yes, starting out with BASIC helped me tremendously :)
Re: (Score:3, Interesting)
I don't remember the language being bad.
It was free of line numbers and allowed proper structured programming.
Second story from this blog this week... (Score:3, Insightful)
...and I'm wondering: what's with the random sushi pictures?
Re:Second story from this blog this week... (Score:5, Funny)
Clearly there's something fishy going on there...
Re:Second story from this blog this week... (Score:5, Funny)
Or, more on-topic:
Haiku Overflow
It was programmed in BASIC
System halting now
Funny argument (Score:5, Insightful)
Re: (Score:3, Insightful)
Some people think they know what spaghetti code is, but unless they've written code with line numbers, they probably don't.
And the good old days of 'LET A = NOT PI' to save three bytes of RAM :).
(ex-Sinclair BASIC programmers will understand why such arcane constructs were beneficial when you were low on RAM)
Re: (Score:2)
I agree. I started on QBASIC, so I didn't have to cope with line numbers... but the code was still spaghetti. I remember the joy I felt upon discovering subroutines.
It's like a child burning themselves for the first time. Sometimes it's the best way to learn.
Re:Funny argument (Score:4, Interesting)
Compare that to the guy I met when I was tutoring CS, who said, "functions, why do I have to use functions to write this program? I know how to use functions, it's such a waste of time." The idiot could have finished the program in the time he spent complaining about it, but he certainly did not feel the joy of discovery.
Re: (Score:2)
It should also be said, though, that for people who programmed on 8-bit computers (I had a ZX Spectrum), there was hardly any alternative to spaghetti code at the time. OOP was non-existent and Lisp too complex. Back then, global variables, fixed-length arrays and GOTOs were the way to actually program effectively (both in BASIC and assembler).
Re: (Score:2)
In praise of...BBC BASIC (Score:4, Informative)
I never owned a Beeb, though I had several friends who did. I used them at school a lot too, and their BASIC was extraordinarily advanced. The ELSE statement was there, as was the standard(ish) GOSUB, but you could also define true procedures which returned values, etc. (DIM PROC), and there was a clean way of dropping down to the 'OS' proper (OSCLI statements).
In addition, it also solved the line number problem you mentioned: it had a renumber command so that everything would become properly spaced out again. I remember the style of coding you're describing from my C64 efforts - the C64's BASIC was actually MS BASIC and it was dreadful; anyone wanting to do decent high-level coding used to get the Simon's BASIC [wikipedia.org] cartridge.
As a whole though, the BBC simply had the best BASIC of any 8-bit I encountered. That's not too surprising given its background and use as a teaching tool, but they did it very well indeed.
Cheers,
Ian
Re: (Score:2)
Argh. DEF PROC, not DIM PROC.
Cheers,
Ian
BASIC is great for kids (Score:5, Insightful)
I disagree with the premise that BASIC teaches bad habits. I stick with the old adage that a bad workman blames his tools. BASIC teaches kids (like I was over 30 years ago) from the ages of 5-6 how to put together simple logic, and gives them the very basics of language constructs like variables, loops, subroutines, etc., without them having to grasp structures, classes, polymorphism, OO, etc., which a lot of grown-ups who have been writing code for quite some time can have issues with.
Re:BASIC is great for kids (Score:4, Informative)
Re: (Score:3, Insightful)
I maintain that Python can do this much more easily, because there are simply fewer ridiculous syntax requirements in a simpler-to-read format. The things you learn in BASIC can be taught at a much more significant level in Python (inline AND block comments, order of instructions, basic control structures, variables and their use) without jumping through the hoops that you will in BASIC. Python's use of whitespace and extensibility allow beginners with an aptitude to expand on their knowledge ea
Re: (Score:2)
If you can't get your head around basic data structures, then perhaps programming isn't an ideal career path. To do anything useful at all, you ARE going to have to deal with data structures.
Pascal or even C is a much better first language, imho.
I started off with TRS-80 Color Basic on a Coco 2.
Simple, but genuine rewards (Score:2)
Bah! (Score:2)
Worthless? Hardly!
Re: (Score:2)
Appreciate the difference (Score:5, Insightful)
As a programmer who started with old-school BASIC (numbered lines, etc.), I was overjoyed by the better elements of structured programming in Turbo Basic, and totally excited by C when I learned it. It felt like having my hands untied. So I would state the contrary: you cannot fully appreciate structured programming unless you went through the GOTO hell.
I hear a lot of similar FUD from some people, like "you can't grok OOP if you started with C", or "anyone who touched .NET or Java is lost for C++..." It boils down to "people are idiots, they can't possibly learn anything new, they are either indoctrinated at birth in My True Way, or lost and hopeless." Who in their right mind would take that seriously?
Re:Appreciate the difference (Score:4, Interesting)
It boils down to "people are idiots, they can't possibly learn anything new, they are either indoctrinated at birth in My True Way, or lost and hopeless." Who in their right mind would take that seriously?
That quote reminds me of a number of religions who have exactly that attitude. Makes me suspect that it must be a recurrent (though no less foolish) theme to human thought...
Basic and Basic... (Score:2)
Whoever hasn't coded with GfA-Basic or Omikron Basic never experienced what a fun and versatile language it was.
Variety (Score:5, Interesting)
I've gone from MSX Basic to Turbo Basic to Turbo C. Now I can code in all kinds of languages: assembly, PHP, Ruby, JavaScript, etc.
I do think that BASIC has value as a first language because it gives back results immediately. Sure, nowadays there are other script languages, so you don't have to go through compiling and all the other complexity. BASIC is valuable because it's just that: basic. You don't have to worry as a first-timer about libraries, include files, functions and everything else. You get down to the very basics like variables and program flow.
And after a lot of years of BASIC programming I knew the limitations of the language (which largely depend on the interpreter). That's when I switched over to Turbo C. And to be honest it didn't take me long at all to learn C, because I was a pretty reasonable BASIC programmer.
What I _do_ object to is stuff like Visual Basic. That's taking a limited language which is simple and jamming it into a place where it doesn't belong. To make Visual Basic work, they stuffed in all kinds of non-original BASIC features which make it more complex than something like Visual C. Their idea was "let's make building real applications easy with Basic, because Basic is easy, right?" It doesn't work like that.
I also think that Java is not a language people should start programming in, to be honest. Object-oriented programming is NOT something people should learn before they've had a taste of procedural programming. Fun fact: I went back to my old school to see about taking some night classes to get my CS degree. (I dropped out at the time, and I've learned a LOT more on the job than what they were teaching.) At their open house I asked about procedural programming and whether they still taught it. They scoffed and said nobody uses that anymore. This when I've been a Linux kernel developer for 10+ years now, which is 100% procedural ANSI C. It's all Java they teach nowadays.
In closing: I think a good programmer is somebody who explores. If I have a Windows application that does something cool, I take it through a disassembler to see what makes it tick. I look up .NET C# code snippets to see what it's all about. I look through COBOL and ALGOL source code to see what constructs people used in the past. I patch ARM assembly code to fix bugs. I do all those things and don't rigidly stick to a single programming environment. Being a good programmer is a state of mind, not the language you work in.
Re: (Score:2)
PS..
GOTO's have their place, even in modern programming languages. Fuck C for not having a 'break [n]'. :)
I started with BASIC (Score:4, Interesting)
Years before I took any formal course, I was looking at the manual of a BASIC computer and making circles on the screen by programming a dot that kept a constant distance from another point and rotated around it. I still remember the emotion of seeing a real circle emerge on screen. I don't know if BASIC helped me much to program, but the immediacy of the thing certainly did much to keep my interest alive.
Re: (Score:3, Interesting)
I too started with BASIC. In my case it was freshman year of high school, back in 1999, on QBASIC. I had dabbled a little with HTML, but that was nothing like an actual language with logic. It was there that I learned the basic blocks that made me the programmer I am today. I went from simple logic to loops to graphics and sound.
By the end of the year I had a full animation of a house with Christmas lights and music. I even had a very primitive text based RPG working with the ending taken straight out of
It's not the language, it's the teacher (Score:4, Interesting)
Most people aren't very good at teaching themselves. I've seen this a lot with people trying to learn Morse code and giving up in frustration.
You can pick up a lot of bad habits without someone to guide you.
Bah (Score:5, Insightful)
Like all eminently quotable people, Dijkstra tended to hyperbole and oversimplification.
There's never anything wrong with any language (Score:2)
It wasn't all that great... (Score:4, Interesting)
...but it was BASIC. And the expectations were so low. 10 PRINT "Hello, World!", 20 GOTO 10, and it started doing something. The programming manual was well worn by the time I was 10; would that have happened with any other language? I doubt it. Things like the lack of scoping make the easy things easier and the hard things harder. The point isn't to learn everything from your first language, the point is to get started and interested at all. Moving to DOS was sorta OK, but moving to Windows killed my interest. C/C++ was just horribly complicated; I remember trying to get a window up in the Win32 API and it was like, wtf, how hard can this be? MFC was even worse, Java (really early Java, on hardware of the time) was slow and unresponsive as fuck, Javascript was a toy language for websites, and I never really liked Pascal or VB much either. I didn't regain my interest in programming until I went with C++/Qt, or maybe more Qt than C++ really. QMainWindow *mw = new QMainWindow(); mw->show(). The hard stuff is still hard, but I very rarely find I'm writing "overhead" code that I shouldn't have to.
Re: (Score:2)
I lost interest in window systems as well; the amount of bullshit needed to get anything on the screen just killed it. I've gotten into Cocoa with Xcode and Interface Builder recently, and it's fun again :)
I definitely agree, its the "overhe
Easily gotten over (Score:5, Funny)
easy language first your get-over is
FORTH started I at-all me affected not and
Basic is, well, basic. (Score:2)
My son is currently at the age where he wants to start learning to program. The thing is, other than Basic or similar entry-level languages, he just can't wrap his 11-year-old mind around C++ or other more complex languages to start. And I can't exactly drop him straight into SQL or Linux either. He has to start somewhere, and simple languages fill this gap very well for the young. It's also the same reason why I hope they never stop making those ###-in-one electronic kits. The basics may be old
Dijkstra ? Legend ? (Score:4, Informative)
Dijkstra, who taught at Eindhoven Technical University - which is how I superficially came to know him - was mostly a self-declared legend. He cultivated his own myth, even going as far as publishing a little book with his own quotes.
Re: (Score:3, Interesting)
Re: (Score:3, Interesting)
I think the second part of that definition is more relevant. Dijkstra was definitely arrogant, no matter how brilliant he was and no matter how much he contributed to CS. Or maybe you can say he was just an ass. But it certainly wasn't that "others are just miffy".
Re: (Score:3, Interesting)
I think you sort of have to cultivate your own myth if you want to become famous. There are a lot of brilliant people out there who most people have never heard of. Dijkstra made great contributions to computer science and programming, but to say that that is what made him famous belittles the work of all those others who did so, too. He was famous because he made great contributions _and_ worked on his visibility.
Being famous is not one of my goals, so I don't engage in a lot of activities that would raise
Re: (Score:3, Interesting)
I think you sort of have to cultivate your own myth if you want to become famous.
ehhhhh..... I think you have to cultivate your own myth if you don't have a great number of concrete accomplishments to cement your reputation. Richard Feynman was a pretty unassuming guy, and he ended up famous anyway. Dijkstra probably didn't need to sell himself, so he really doesn't fit the category. He was just kind of a loudmouth by nature.
Dyslexic programmer (Score:2)
Back in the day I used basic a LOT, arguably like a sophisticated calculator programming language to do engineering calculations.
Show me a page of PHP even and everything goes blurry and out of focus.
That's dyslexia for you.
Who learns BASIC anymore, anyway? (Score:2)
Are people still teaching or learning BASIC?
I know JavaScript, PHP, and AppleScript, and I learned them so I could script the Web browser, the Web server, and the Mac desktop, respectively. With just a few simple lines of each you can make really practical and productive things happen that seem to me like they would reward the beginner. My background is publishing, though. Maybe I'm biased towards programming that makes documents.
What does BASIC actually do?
I'd guess there's a critical period & an attit (Score:4, Insightful)
I think the author is mostly on the mark. He's aware Dijkstra was exaggerating for effect, but also basically correct... if you started programming in the early home computing era, you probably started with a BASIC. I was lucky enough to get some varied exposure to other languages earlier (LOGO and some shallow assembly), but until I was 15, it was pretty much Basic.
And none of my programming habits now resemble anything close to the BASIC I wrote in when I was that age. Except, occasionally, for the rare cases where global state seems to make sense, and even then, I try to namespace things in one way or another. But by and large, I picked up structured programming, I picked up object-oriented programming, I picked up logic programming, and I'm learning to enjoy functional programming.
I will say... there was a time when I was probably close to being "ruined." It was when I was learning C++, and I only really had Pascal, basic C, and Basic under my belt. And I had a pretty solidly structured-imperative mindset, and really hadn't seen any other way of doing things. C++ married data structures and methods in an interesting way, but it didn't seem like more than a stylistic practice to me. I was pretty sure most languages were alike, you just had syntax and typing differences.
But there was one thing: I'd had to learn Prolog for a very specific job. We were teaching it to high school students at a CS summer camp I worked at for a few years. The first year, I just thought "Man, this is weird," more or less got through all the exercises, left it behind, and did what most people do: dismissed it as an odd research toy. The second year, I thought "this is weird, but interesting." The third year, I thought "Wow. There are all kinds of intriguing ideas here."
And there are, and I still think it could stand to see more usage in mainstream software, but more importantly, I think I'm pretty lucky I got repeated exposure to a language that forced me to think differently before I got very far into actually working in the software industry.
Because I now think there's either a critical period [wikipedia.org] (or possibly, at a minimum, a critical attitude of some kind) after which a lot of programmers tend to lose either the humility or the curiosity that drives people to think about different programming constructs and habits. I think if a programmer has been minimally exposed before they reach it, they'll keep just enough of one or both of those attributes that they'll be interested in what they don't already know, rather than arriving at the point where "they've already learned the last programming language they'll ever need." [blogspot.com]
And if they don't get so exposed, they become Blub programmers [paulgraham.com], where generally $Blub is some industry-leading language that does enough you don't easily bump up against tasks that are near impossible in it.
To tie this back in with a point I think the author missed, I suspect that some of the difficulties with Basic are actually part of the reason why it didn't end up ruining more programmers. Almost everybody who really came to grips with it as a tool probably realized that it couldn't possibly be the last programming language you'd ever need (if it weren't enough that any effort to look into working as a programmer revealed that Basic was clearly not the strongest payroll ticket).
Better than nothing (Score:3, Interesting)
When I was first learning to "program" I had nothing more than an old computer with DOS. The internet was something I had heard about, but had never experienced myself, and I didn't know what Linux or even Unix was. The only way I had to learn was from some books I found at the library. At first, it was just .bat files. When I discovered BASIC (I think it was GW-BASIC), I was excited to have it. Later, I discovered QB.
There are some advantages. First, I didn't have to set anything up or worry about what includes I needed. A simple PRINT "Hello world" was enough. What was better with QB was that with the press of F1 I could browse the list of commands. Also, it came with Gorillas.bas and Nibbles.bas. I spent hours injecting lines of code into those games.
Sure, if you have a full Linux environment with gcc, man pages, and web access, then BASIC is just some lame toy, but if it's all you have, it's a start.
BBC BASIC (Score:5, Interesting)
I cut my teeth on BBC BASIC back in the 80's. It was simple, powerful, let you do pretty much anything, and best of all came with a built-in assembler. Now that was really neat, and it just worked. It was easy to optimise individual subroutines in assembler. This was at age 10. At my simple state school with a couple of BBC Model Bs in the corner, I wasn't the only one doing that either.
I make a living writing C++ now and seem to do fairly well at it. The kids coming out of university that I interview these days haven't touched BASIC, or C++ for that matter. We want them to write good C++ when they come and work for us. The intelligent ones adapt easily to working with pointers etc. The less able ones that have somehow made it through the interview process struggle.
Not Basic, but restrictions (Score:2)
Bullshit. You know what may be a reason? The extreme restrictions. In the 80s I used a fairly good Basic, BBC Basic, but I mostly used 6502 assembler, on a machine with very limited memory. What that does is force you to do things as efficiently as possible; it makes you think about how best to represent the data in memory, etc.
Ever see BBC Basic? (Score:3, Interesting)
The BBC Model B was released in 1981. It had a nice dialect of BASIC with named procedures, named functions, dynamic memory allocation, typed variables, proper pointer indirection and a cleverly integrated assembler.
It had pretty much the full suite of structured programming tools.
Maybe the comment had some value before 1981, though I don't think it did. For the last 29 years, however, it has been somewhat out of date.
Re:BASIC is irrelevant (Score:5, Insightful)
If you don't recognise the relevance of BASIC as described in the article, it's possible you're a decade or two younger than the individuals the article is referring to.
Re:BASIC is irrelevant (Score:4, Interesting)
It's not necessarily a bad thing to have done things in BASIC, but the language can instill bad habits. On the other hand, if you first see the bad sides of BASIC, then you can recognize the good sides of other languages.
And the BASIC of today is Python.
Re:BASIC is irrelevant (Score:4, Insightful)
I think the BASIC of today is JavaScript. You see more badly written JavaScript than any other language.
Re: (Score:3)
I agree on all counts. Python has its moments -- especially when the student moves on to lists and classes, and when they fall afoul of the unexpected consequences of = being a binding operation rather than an assignment. But in general, the simple stuff is simple. And it works. I think that's really what you want in a tutorial language. And it's probably not all that bad in many real applications either.
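The binding-vs-assignment gotcha mentioned here is easy to demonstrate in a few lines (a minimal sketch; the variable names are made up for illustration):

```python
# In Python, "=" binds a name to an object; it does not copy the object.
a = [1, 2, 3]
b = a          # b now refers to the SAME list object as a
b.append(4)    # mutating through b is visible through a
print(a)       # [1, 2, 3, 4] -- often a surprise to BASIC/C beginners

# Rebinding, by contrast, only changes which object the name points to:
b = [9, 9]
print(a)       # [1, 2, 3, 4] -- a is unaffected
```

Students coming from BASIC tend to expect `b = a` to copy the value; with mutable objects it merely aliases them.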
If Python has a problem as a language for tutorial programming, it is probably in its ra
Re: (Score:3)
Re: (Score:3)
I'm curious: what kind of "bad habits" can be learned using modern dialects of BASIC today?
Then again, the current incarnation of Visual BASIC and other similar implementations of the language are such an abomination that you can hardly call them BASIC either. To be clear, I don't mean the earlier implementations of Visual BASIC, but rather what happened when some C programmers got ahold of the language and threw out some of the exceptional power that BASIC holds over other languages.
I'll admit that the traditio
no, Python is not the language to start with (Score:3, Insightful)
First of all, you're clearly not an article reader. The point of initially learning with a painfully unstructured language is that you end up appreciating structure more, while being able to tolerate code that has awful structure. IMHO a better choice is assembly, but BASIC does have the advantage of providing faster rewards.
Second of all, many of us dispute the bit about Python not being a toy language. If you build your skills around Python, you'll hit serious trouble if you ever end up needing decent per
Re: (Score:3, Interesting)
One of the reasons I like Python is that it is actually quite good at accessing unwrappered OS functionality - the standard libraries provide the ability to pack structs, etc, so if you want you can access ioctls without having to write native code (the inability to do that was one of the pain points I found in Java when I last programmed it - but that was 7 years ago, so I daresay things will have moved on!). When you do have to write "native" code, some combination of C (the native module API is fairly n
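The struct-packing approach the parent describes can be sketched like this. The field layout below happens to match the kernel's `winsize` struct, but treat the commented-out ioctl part as an assumption about your platform; a real call needs an open tty file descriptor and the right request code:

```python
import struct

# Pack a C-style struct { unsigned short rows, cols, xpixel, ypixel; }
# in native byte order -- the layout of the kernel's winsize struct.
buf = struct.pack("HHHH", 24, 80, 0, 0)
print(len(buf))  # 8 bytes on typical platforms

# Unpacking recovers the original fields:
rows, cols, xp, yp = struct.unpack("HHHH", buf)
print(rows, cols)  # 24 80

# With an open tty fd you could then do, e.g. (platform-specific):
#   import fcntl, termios
#   fcntl.ioctl(fd, termios.TIOCGWINSZ, buf)
# ...all without writing any native code.
```

This is the sense in which Python reaches "unwrappered" OS functionality: you build the byte layout yourself and hand it to the raw syscall wrapper.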
Re:no, Python is not the language to start with (Score:4, Interesting)
If you build your skills around Python, you'll hit serious trouble if you ever end up needing decent performance or unwrappered OS functionality. If you build your skills around C, whole new possibilities open up to you.
That's the dumbest thing I will have read today. I built my skills around 6502 assembler, written inside a monitor because I didn't know that real assemblers existed. I did my senior thesis on interfacing hardware to an embedded controller. I'm content writing memory-managing, bit-twiddling [sourceforge.net] software in C. At the end of the day, though, I'd much rather write complicated stuff in Python than in anything else. Furthermore, I don't have any problem getting great performance out of it. The fact that you do says a lot more about the way you tried to write software in Python than it does about the language itself.
Re: (Score:3)
Surely Python is the language to start with these days? It's straightforward and doesn't force any particular model, i.e., you can use it in a procedural, OOP, or functional style.
Most importantly it's not a toy language.
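The multi-paradigm claim is easy to demonstrate; here is the same toy computation (summing squares, my own example) in each of the three styles:

```python
from functools import reduce

nums = [1, 2, 3, 4]

# Procedural: explicit loop and accumulator
total = 0
for n in nums:
    total += n * n

# Functional: fold over the list
total_fn = reduce(lambda acc, n: acc + n * n, nums, 0)

# Object-oriented: behaviour bundled with the data
class SquareSummer:
    def __init__(self, xs):
        self.xs = xs
    def total(self):
        return sum(x * x for x in self.xs)

print(total, total_fn, SquareSummer(nums).total())  # 30 30 30
```

A beginner can start with the procedural version and grow into the others without changing languages.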
I've been struggling to teach programming to my kids using Python (and other languages), and have never been completely satisfied. I'm thinking that I need to start by teaching a restricted subset that looks a lot like BASIC. Two-character variable names are missing, but OTOH Python doesn't use '$' to indicate string variables. The big thing is to include a GOTO statement, such as http://entrian.com/goto/ [entrian.com], to be used until other flow control mechanisms are taught. I wonder if I can extend Entrian's code to
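Short of patching the interpreter the way Entrian's module does, a BASIC-style GOTO can be approximated in plain Python with a line-number dispatch loop. This is purely a teaching sketch; the numbered "lines" and the `run` helper are my own invention, not part of any library:

```python
# A tiny BASIC-flavoured dispatch loop: each "line number" maps to a
# function that returns the next line number to jump to (None halts).
def run(program, start):
    pc = start
    while pc is not None:
        pc = program[pc]()          # execute the line, follow its "GOTO"

state = {"i": 1, "out": []}

program = {
    10: lambda: (state["out"].append(state["i"]), 20)[1],   # PRINT i
    20: lambda: (state.update(i=state["i"] + 1), 30)[1],    # i = i + 1
    30: lambda: 10 if state["i"] <= 3 else None,            # IF i <= 3 GOTO 10
}

run(program, 10)
print(state["out"])   # [1, 2, 3]
```

The dispatch table makes GOTO's spaghetti potential visible to the student, which is arguably the pedagogical point.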
Re:BASIC is irrelevant (Score:4, Funny)
Perl as introduction to programming for kids?
You will be lucky if they only put you in prison for life for child abuse.
If they don't, you'll have the misfortune to spend the rest of your miserable and short life in the grim world you will have created, filled with monstrosities from your worst nightmares.
If BASIC was mutilating young minds, Perl is on the order of summoning Cthulhu with minors.
Re: (Score:3, Insightful)
Most importantly it's not a toy language.
That's why it should be kept far away from beginners. What if they accidentally type "import skynet", huh?
Re: (Score:3, Interesting)
just great, more sushi, I should block images from that site
I started with Apple BASIC, and I was playing around with peek and poke before the other students even knew how to properly misuse goto.
But that was when I was 13. The first year of college should not teach BASIC.
But why are we not introducing BASIC, or the very beginner-friendly Ruby, to students when they're 13 anymore?
Re:BASIC is irrelevant (Score:4, Interesting)
When was anyone introducing languages to students at 13? There's bound to be a significant chunk of us out there who poked around our computers and went to find these things out ourselves. BASIC just kinda sold itself with the name - you knew it was a good starting point. Self-discovery and a curious mind are probably outdated in the corporatised world, in spite of the roots of many of us who learnt by screwing up an autoexec.bat file or two on Dad's computer. Nothing teaches you to pre-check your logic like having to explain to Dad why his computer doesn't work any more :)
Re: (Score:3, Interesting)
Oh really? And university students should have no prior knowledge of programming? We have whole universities full of such students; they get a degree and go to work for some brainless company doing minor tasks. Or they could continue studying until they're 40; then they'll probably be able to code.
I started at 12, and BASIC gave me some idea of how a computer actually works. It's probably time for new languages, but you just can't learn programming at college in a few years. You have to grow up with it.
Re: (Score:2)
What a bunch of horseshit. Most modern intro-to-programming classes are taught in C and/or C++.
Really? I've looked at a *lot* of courses, and I've hardly ever seen either C or C++ taught as a first language. When I started doing this, Pascal was the most common, with a smattering of Modula-2, ML and LISP. These days, it's almost always Java. If it isn't, it's OCaml, Modula-3 or LISP.
What intro-level courses use C or C++?
Re: (Score:3, Informative)
What intro-level courses use C or C++?
Florida State University -- COP3330, "Intro to Computer Programming". 100% C++
Re:BASIC is irrelevant (Score:4, Interesting)
Depends on your age when you start learning. I was 12 when I first met "programming"; it was a TI programmable calculator. It was fun to squeeze as much functionality as possible out of less than 1K of BASIC. Then came the C64, with its ~38K BASIC. Its BASIC was very weak, but I learned how to read disassembly from the code of various BASIC extensions and from books that contained snippets on extending C64 BASIC. Eventually I made my own BASIC extension, cracked games to create trainers, and made my own turbo loader that had half of its code on the floppy drive. So, by the time I went to middle school, I was programming device drivers in machine code!
I learned PL/I and other archaic languages in middle school, even punch cards! I learned C on my own and was only formally taught it by the time I went to high school.
Re:BASIC is irrelevant (Score:4, Insightful)
Re: (Score:3, Interesting)
The claim "It forced us to think around corners. It made us think through what the control structures really were, and how they were implemented" is moot - assuming he's not joking, if you really want to train that way of thinking, you're much better off learning assembler.
Most of the experienced assembly programmers I know started with an old-school BASIC (GW-BASIC, BASICA, one of the ROM BASICs) and today also program in a BASIC derivative (VB, PowerBasic, TrueBasic, etc.).
None of them enjoy C++.
I often hear C++ programmers declare that learning assembly is stupid, that processors are too complicated now to write efficient assembly, and so on.
I do not think that these things are a coincidence. Basic programmers from the 70's and 80's turned into tinkerers, while C pro
Re: (Score:3, Insightful)
A lot of the older coders hate it though because it's easy and flexible and gives the programmer mu