Programming Education

Programming the Commodore 64: the Definitive Guide 245

Mirk writes "Back in 1985 it was possible to understand the whole computer, from the hardware up through device drivers and the kernel through to the high-level language that came burned into the ROMs (even if it was only Microsoft BASIC). The Reinvigorated Programmer revisits R. C. West's classic and exhaustive book Programming the Commodore 64 and laments the decline of that sort of comprehensive Deep Knowing."
This discussion has been archived. No new comments can be posted.
  • Frist psot! (Score:4, Funny)

    by heffel ( 83440 ) <dheffelfingerNO@SPAMensode.net> on Saturday March 13, 2010 @06:47PM (#31467912) Homepage Journal

    Atari 800 rules!

  • Oh what fun raster interrupts were.
    • Re:Indeed (Score:5, Interesting)

      by davester666 ( 731373 ) on Saturday March 13, 2010 @07:19PM (#31468172) Journal

There doesn't appear to be any section on custom high-speed communication with the external floppy drive unit. IIRC, you could upload a small program to the drive and then, in particular, read data from it a lot faster than the 'OS' normally supported. This technique was also used for copy protection on a bunch of titles, primarily by stepping the drive head half a track between tracks and then doing reads. Production disk duplication could write both on the track and between tracks [or could write a track wide enough to cover the whole area], but regular floppy drives couldn't write both [you could either write on the track, or between tracks].

      Not that I was interested in this stuff or anything.

      • by headkase ( 533448 ) on Saturday March 13, 2010 @08:00PM (#31468490)
Fast loaders worked because the kernel ROM software didn't fully take advantage of the hardware. Between the C64 and the 1541 floppy drive, the connector cable had four wires for carrying information, but the kernel routines built into ROM used only one of those lines to signal from the drive to the computer. The fast loaders simply uploaded a program to the drive that used all four lines to signal information; they weren't magic, they just removed a deficiency in the kernel ROM routines. (I'm not sure of the exact number of lines between the computer and the drive, but this is the principle the fast loaders worked by.) Tape-based fast loaders worked differently: the kernel routines would save a copy of the information to tape and then immediately save a complete second copy to compare against for error correction on load. The tape fast loaders just skipped saving and comparing the redundant copy to get the speed. Disk fast loaders didn't compromise the integrity of the information the way tape fast loaders potentially did, though. Remember, computers back then were full of noise, especially when you were talking to tape drives.
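The arithmetic behind that speedup can be sketched with a toy model: if the stock ROM routines clock one bit per software handshake over a single line, then a loader that moves several bits per handshake cuts the handshake count proportionally. The figures below are illustrative assumptions, not measured 1541 timings.

```python
# Toy model of why C64 "fast loaders" sped up disk transfers: the stock
# ROM routines clocked one bit at a time over a single serial line, with
# a software handshake per bit, while fast loaders reprogrammed the
# drive to push several bits per handshake. All numbers are assumptions
# for illustration, not real 1541 timings.

def transfer_time(n_bytes, data_lines, handshake_us):
    """Microseconds to move n_bytes when each handshake moves
    `data_lines` bits in parallel."""
    bits = n_bytes * 8
    handshakes = bits // data_lines
    return handshakes * handshake_us

slow = transfer_time(64 * 1024, data_lines=1, handshake_us=20)
fast = transfer_time(64 * 1024, data_lines=4, handshake_us=20)
print(slow // fast)  # prints 4: the 4-line scheme needs 1/4 the handshakes
```

Under this model the speedup is exactly the ratio of lines used, which matches the comment's point: the "fast" part was just undoing a self-imposed bottleneck.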
        • A disk-drive fast loader could fit into 256 bytes, IIRC.
          The trick was to shift bits in an unrolled loop. I remember I stored some values on the stack by using PHA in this loop.

I did write a fast loader for a few games I wrote on the C64 (which was one of the computers I was writing games on in the 80s, along with the Oric, Thomson TO7 and MO6, and Amstrad CPC). The Atari ST appeared in 1987.

          • Re: (Score:2, Interesting)

            by mindstrm ( 20013 )

The 1541 drives were purposely slowed down to maintain compatibility with some other Commodore hardware (I forget at the moment exactly which)... so they weren't so much fast loaders as they were "doing it the way the engineers designed it to work, not the way the boss made them change it so he could claim compatibility."

      • by mrmeval ( 662166 )

That's in an exhaustive 1541 ROM disassembly book. I don't now recall who published or wrote it, but it had exhaustive detail on how the drive's ROM worked. Using that and some machine code, I found out you could store a file on the floppy that on power-on would boot no-knock code. I used that for several years. Then tech got more difficult and less accessible for a time.

Now, with small devices like the Arduino and Make Magazine's controllers, many people are learning and creating things with rea

  • Sweet! (Score:5, Funny)

    by Junior J. Junior III ( 192702 ) on Saturday March 13, 2010 @06:49PM (#31467922) Homepage

    Now, I can finally stop waiting and get to programming my Commodore 64!

    • Re:Sweet! (Score:5, Funny)

      by ae1294 ( 1547521 ) on Saturday March 13, 2010 @07:16PM (#31468148) Journal

      Now, I can finally stop waiting and get to programming my Commodore 64!

I know, right! I finally got my Beowulf cluster of C128s running GNU/Linux doing SETI@home work!!! Sure, it's drawing about 200 amps, but damn it's a sweet setup and only takes about 24 days per unit!

  • by headkase ( 533448 ) on Saturday March 13, 2010 @06:50PM (#31467930)
I started on a Commodore 64 (well, a Commodore 128 that ran exclusively in 64 mode...) and learned machine language by breaking the copy protections of the day. Many of the things that were legal back then, such as copying DRM'd games for backup, have now gone the way of the dodo. I honestly expect that twenty years from now a debugger will in itself be seen as a "tool of crime" or whatever wordage they use to keep them out of the general public's hands, just like lock-picks today. Hope you like high-level, because the day is coming when it will be illegal to be low-level without a government (or, more likely, Apple) license.
    • Relax (Score:4, Funny)

      by sakdoctor ( 1087155 ) on Saturday March 13, 2010 @06:59PM (#31468032) Homepage

      Relax. You've obviously read too many kdawson stories recently, and have been trolled into a heightened state of paranoia. Don't worry, it happens to the best of us.

      Also, why have you switched off your iphone citizen 533448?

      • Re:Relax (Score:5, Interesting)

        by Anonymous Coward on Saturday March 13, 2010 @07:21PM (#31468180)
I was playing a game with some DRM (either StarForce or SecuROM) and it wouldn't run if I had a debugger present. I asked them why and they were all like "Anyone who has a debugger and is playing the game is a hacker." That's a RIGHTLY earned state of paranoia.
        • I believe that it's StarForce [simhq.com].
        • My response was to kick their copy protection out of the game.

What? It's legal! My country's copyright law explicitly states that disassembly and alteration of compiled code is allowed if necessary to make a system or program interoperate with another one. No kidding. They pretty much prompted me to invoke that portion of our copyright code and crack their game. Without the crack, it is not able to cooperate with the disassembler or the CD-drive emulator I have running that I simply cannot turn off (sorr

      • Re:Relax (Score:4, Interesting)

        by fuzzyfuzzyfungus ( 1223518 ) on Saturday March 13, 2010 @08:17PM (#31468600) Journal
Given that Germany has already gone and adopted an absurdly vague and overbroad law aimed at "hacking tools" [schneier.com], I wouldn't really describe somebody hypothesizing that other jurisdictions might do so in the future as "paranoid".

Perhaps ultimately more dangerous (because they tend to be subtler) are situations where no law ever bans something, per se; but some quiet mixture of contractual, legal, and technical pressure effectively prevents it anyway. Consider SDI [wikipedia.org] for an instance of that. A digital video transmission standard, available well in advance of HDMI, that was frozen out of the "Consumer" market entirely. It's not like possession was illegal or anything; but most people never even heard of it, nor was it available on any broadly affordable hardware.

In the case of something like debuggers, I'd be very surprised to see any sort of legal ban; but the technological/private-sector contractual de facto neutralization is an eminently plausible scenario. Already, in recent versions of Windows, any media application that requires the "Protected Video Path" will throw a fit if there are any unsigned drivers loaded that could compromise that path. An analogous "Protected Execution Path", provided by the OS for programs that didn't want anybody else debugging them or looking at their memory, hardly seems implausible. Not to mention, of course, the increasing percentage of consumer-level computing activity that is occurring on devices where being able to run arbitrary programs isn't even an expectation. Not much debugging going on on Xbox 360s, and debuggers don't have to be illegal to not be available through the App Store.

        There will always be gaps, of course, for the sufficiently knowledgeable, motivated, and well equipped; but a largely opaque consumer level computing environment seems like an unpleasantly plausible prediction.
    • V-Max (Score:5, Interesting)

      by headkase ( 533448 ) on Saturday March 13, 2010 @07:07PM (#31468092)
One of the most comprehensive protections at the time was called "V-Max!", which stood for Verify Maximum. The "nibbler" disk-copy programs couldn't touch it, even though nibblers represented the ultimate in disk-copy technology of the day. There were two ways to copy V-Max. The first was to get a dedicated hardware copying unit. The second was to apply a bit of knowledge with a debugger cartridge. V-Max was a turn-key system: you gave them files, and they wrapped the protection around them and provided a fast loader at the same time. So what you would do is fill all of memory (the whole 64K) with a value you knew, say $AF. Then you would load a V-Max file from the disc; its loader would automatically take over, and while it was loading you would enter your debugger cartridge and change its exit point to point to itself, so instead of $0800: RTS you would make it $0800: JMP $0800. Then you would wait for the V-Max loader to fully load the file, hit the button on your debugger cartridge, and use the memory monitor to find where the file loaded by seeing what memory was NOT $AF. Then, from the debugger cartridge, you'd save that memory block out again: a completely de-protected file. Since V-Max used the standard kernel load vectors, the program itself needed no further modification; the protection was completely gone, you just lost the fast-loader function, which you then re-added yourself into a chunk of memory the game didn't use (relocatable code was best for that). Later versions of V-Max also did on-the-fly decompression of files, so occasionally while stripping the protection you would run into a situation where your destination disk ran out of space versus the original protected disk. Again, that was worked around by inserting your own custom loader, which also did decompression, into the kernel load vectors.
V-Max was impossible for the copy software of the day to copy, but with a little bit of knowledge and a debugger cartridge it was absolutely trivial to defeat.
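The fill-with-a-sentinel trick described above can be sketched in a few lines of modern code. This is a loose simulation of the idea, not C64 code: fill RAM with a known value, let the protected loader deposit the file, then dump whatever no longer matches. Addresses and sizes below are illustrative.

```python
# Sketch of the memory-diff deprotection trick: fill RAM with a known
# sentinel ($AF in the comment above), let the protected loader run,
# then find the region that no longer matches the sentinel. The 4K
# "file" at $0800 is a made-up example.

SENTINEL = 0xAF

def find_loaded_region(memory):
    """Return (start, end) of the block that is no longer the sentinel,
    i.e. where the loader put the file, or None if nothing changed.
    Interior bytes that happen to equal the sentinel don't matter as
    long as the block's first and last bytes differ from it."""
    changed = [i for i, b in enumerate(memory) if b != SENTINEL]
    if not changed:
        return None
    return changed[0], changed[-1] + 1

# Simulate: 64K of sentinel, then a "loader" writes a 4K file at $0800.
ram = bytearray([SENTINEL] * 65536)
ram[0x0800:0x0800 + 4096] = bytes(range(256)) * 16

start, end = find_loaded_region(ram)
print(hex(start), hex(end))  # prints 0x800 0x1800
```

The real-world version was done by eye in a debugger cartridge's memory monitor rather than in code, but the logic is the same: anything that isn't $AF must be the loaded file.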
      • Uphill Both Ways (Score:5, Interesting)

        by headkase ( 533448 ) on Saturday March 13, 2010 @07:24PM (#31468204)
And why did I spend time removing protection systems? The funny part is: I owned an MSD floppy drive, which was completely incompatible at the machine-language level with the 1541 drives that everyone else owned and that all the game makers wrote their protection systems for. So my floppy drive wouldn't load any of the software of the day. I literally bought a game, had to hack away the protection, and then I could play it on my computer. Of course no one will believe me when I say this, but damnit, it's the truth! Now get off my lawn.
  • by jmorris42 ( 1458 ) * <jmorris@@@beau...org> on Saturday March 13, 2010 @07:02PM (#31468048)

It was possible to fully understand all of the old 8 and 16 but machines. Now you could spend months trying to fully understand one video card, which would be replaced by something more complex by the time you finished understanding it.

And that was a big part of it, the stability of the platforms during that era. A C64 was exactly the same as every other one; a Tandy CoCo was identical to the million others of its kind. Later models tended to retain as close to 100% backward compatibility as possible, so knowledge and software tools retained value. Now you buy a lot of PCs with the understanding that a year from now you won't be able to buy more of the exact model, even if you stick to Optiplexes and such that promote the relative stability of the platform. Something will be slightly different. So, understanding being impossible, we abstract it all away to the greatest extent possible.

    If you want to reconnect with low level look at AVR microcontrollers. If you are really frugal you can get going for $20.

    • It was possible to fully understand all of the old 8 and 16 but machines.

      Personally I prefer my butts to be over 18, but to each his own I guess.

It's not really exact, since the C64 existed in two forms: one for the US and one for Europe.

      I vaguely remember that it introduced a difference in the fast disk-loading routine mentioned in a message above, because there was one cycle of difference (yes, simply a NOP).

      If somebody is interested, I can dig in my very old source code to retrieve this information (I coded several games for the C64 in the years 1985-1988).

    • It was possible to fully understand all of the old 8 and 16 but machines.

      Well, a lot of the segment-based memory management features introduced in the 80286 seemed to be so complex and hard to understand that nobody really used them to the extent Intel envisioned. Once the much simpler page-based virtual memory was added to the 32-bit 80386, people tried to forget that those 286 features ever existed.

  • by v1 ( 525388 ) on Saturday March 13, 2010 @07:03PM (#31468054) Homepage Journal

Though my experience was on the Apple II, not the Commodore. Little things like writing your own device drivers, drawing graphics via direct access to interleaved VRAM (oh, the maths!), direct read-latch access to the floppy drives, writing hybrid assembly/BASIC apps. It was grand.

    It's downright depressing to compare my present-day knowledge of computers, classify myself as somewhere in the upper 2%, and still wish I knew a quarter as much (percentage-wise) about my current computer as I did about my //c.

    *sigh*

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      I think it's there if you want it. A couple of weeks ago I was writing in assembly language specifically for one of the top ten fastest computers in the world. Sure, there's a shitload I don't know about that cluster -- no idea if there's an RJ-45 port, much less how I'd access it via the assembler, but I could probably find out if I cared and got clearance for it.

      It's easy to romanticize simpler times, and there is some truth to them being simpler, but I'm damn excited about today. I mean, really, you can

    • It's downright depressing to compare my present-day knowledge of computers, classify myself as somewhere in the upper 2%, and still wish I knew a quarter as much (percentage-wise) about my current computer as I did about my //c.

      Can you get more done with a program today than you could with the Apple II?

      • One of the usability tests that Jef Raskin proposed was the time from turning a computer on to having written a program for adding two numbers together. On the C-64, a reasonably fast typist could do this in around 20-30 seconds. On a modern computer, you might finish the boot / resume process in that time, then you need to launch an app, then you can start writing. If you use something like Visual Studio / Eclipse / XCode, it will take a lot longer than if you use a terminal (expr $1 + $2 will do it on
        • Err okay... so are you rebooting and launching Visual Studio enough times that it takes longer these days to write a program than it did back in the C64 days?

          I must admit I am disappointed. I thought I was going to hear about how all the abstraction and use of libraries/modules/etc meant that we can create useful apps in a fraction of the time that we could in the 80's. I had no idea it actually took longer!

        • That's right; one of the usability tests was boot-to-use time. I'm sure it was far from the most important, and there are changes in the way computers are used now that make it almost obsolete. For instance, my MacBook Pro might take over a minute from cold boot to an open terminal window, but I reboot it only rarely. Most of the time I just close the lid and let it sleep, so wake-to-use time is under 5 seconds; less if I already had a terminal window open.

          The answer to the grandparent's question is, "ye

        • > One of the usability tests that Jef Raskin proposed was the time from turning a computer on to having written a program for adding two numbers together.

So the OpenOffice spreadsheet is totally unusable.

        • by jandrese ( 485 )
          Heck, on some of the really old IBM PCs we had back in school, there was a ROM BASIC that would load if you didn't have a bootable floppy in the drive. You could have an adding program cranked out in about 5-10 seconds after hitting the power switch. I'm not sure what this proves though. How often do you crank on your computer to add two numbers together?
    • by hoggoth ( 414195 )

      Those were the good old days.
      I could make beautiful music on my Apple II.

      No, really, I made music. Like this:

      10 POKE -16336,0
      20 REM TOGGLE SPEAKER
      30 FOR I=0 TO PDL(0)
      35 REM DELAY BASED ON POSITION OF GAME PADDLE
      40 J=J+1
      50 NEXT I
      60 GOTO 10

      Man... I was so productive back then...

      • by ashitaka ( 27544 )

        I remember writing that exact same program. The rising and falling of the speaker tones as I twisted the paddle knob back and forth.

        Good times.

    • by hoggoth ( 414195 )

      So, top 2%, what are you doing now?
      I also place myself in the top 2% and did very well as a freelance programmer for many years. Then I started competing with offshore programmers.
      Now I'm not a programmer any more. I miss it, but I need to make a living.

      • by v1 ( 525388 )

        So, top 2%, what are you doing now?

Today I do warranty repair work for Apple at an AASP. Our shop is small, but I have the satisfaction of knowing we're good and well-respected. From time to time the local (50 miles away!) Apple Store actually sends customers here because they can't cut it. :)

        Hence the wish that I could have anywhere near the mastery level on the mac as I did on the II. But for now I solve problems, I fix things, and I help people. That sums up what I enjoy doing, and usually involves t

  • by Anonymous Coward on Saturday March 13, 2010 @07:05PM (#31468070)

    ...buy yourself some Atmel microcontrollers (ATMega8 is a good choice). This will be instantly familiar to anyone who programmed assembly language on the C64. There are some differences, the Atmels aren't Von-Neumann architecture but Harvard architecture (separate program and data address space) and the CPU has more registers, but there is excellent hardware documentation, the complete command set and detailed register descriptions in the data sheet. There are lots of interesting application notes (IR decoding, interfacing to PS/2 keyboards, LCD output, ...). The Arduino system is based on an Atmel microcontroller, so there is also a big application oriented community in addition to the people coming from the electronics side.

    It's not a toy either. These controllers are everywhere. Have fun and learn a useful skill...

  • by ickleberry ( 864871 ) <web@pineapple.vg> on Saturday March 13, 2010 @07:05PM (#31468080) Homepage
To know a computer from the ground up, as it were. It's not some long-lost dream or anything; after all, your average disposable Crapple netbook with "clout computing" or Octocore Core i13-and-a-half is just a fancy C64 with more CPU instructions, more memory, and more peripherals that runs faster.

It's just that unless you start off at the low level, learning about transistors and that sort of shizzle and learning assembly language, you probably will never bother to learn it. A lot of programmers now think about functions, objects, and arrays as if they actually exist, not just as a convenient way of presenting blocks of code and data that makes them easy to understand. Fuck it, a lot of people fairly high up on the IT scene have no clue in the wide world about TCP or UDP, but they sure as hell know how to write a 'Web App' using JSON and the latest Web 2.5 gimmick, completely oblivious to any of the lower levels.

The problem is, when you have nearly everyone going for the latest abstraction layer and the easy low-hanging fruit (at the expense of efficiency and everything else, rabble rabble rabble), there might come a day in 2110 when they're still using HTTP plus more abstraction layers, but quantum computers aren't getting any faster for what they need and nobody knows any of the low-level stuff anymore. If you are the one kid on the block who knows how to write assembly in 2110, you'll be a rich man by 2111, just in time for the end of the world, 'cause the Mayans were off by 100 years.
    • by TheRaven64 ( 641858 ) on Saturday March 13, 2010 @08:08PM (#31468534) Journal

      Octocore Core i13 and a half is just a fancy C64 with more CPU instructions, more memory, more peripherals that runs faster

      Possible, but nowhere near as easy. I've read most of volume 3A of Intel's architecture reference while doing background reading for my Xen book, but the complete architecture reference is well over 3,000 pages. The GPU reference - if you can get it - is a similar length, and that's before you get to the OS. The Design and Implementation of the FreeBSD Operating System is 720 pages. It's a good book, but it skips over a lot of details. The copy of the X11 protocol reference that I read was several hundred pages, and it's a few revisions old. The OpenGL reference was a similar length. But now you can do 2D and 3D graphics and, once you've read the C spec (not so bad, only a couple of hundred pages) and spent some time familiarising yourself with your C compiler and standard library you can draw things.

      To get the level of understanding that the original poster is talking about, on a modern computer, means reading and remembering around 10,000 pages of reference books, and gaining familiarity with the source code that they mention. And that's just going to give you one CPU architecture and the core bits of the OS.

    • Re: (Score:3, Interesting)

      by phantomfive ( 622387 )
I don't know, man, it's a lot of work. On my computer I have Ruby, Python, Perl, GCC, and who knows what else installed. There are tons of APIs that I don't know what they do, even in the languages that I do know. I also have a web server, an FTP server, and probably several other servers. They aren't running right now, but they came with the system.

      On top of that, I have postfix config files, a mach_kernel file, and a bunch of other weird files that are either quite complex (this book about Postfix [oreilly.com]
  • by GaryPatterson ( 852699 ) on Saturday March 13, 2010 @07:05PM (#31468082)

    It's lovely to remember what was, but not so great to forget what we have today.

    Sure, we generally don't know the whole widget from top to bottom, but it's a hell of a lot easier to get a program up and running. It's not just frameworks either - the choice of languages we have today beats the crappy BASIC we had then, or the assembly language tools we had.

    The first machine I knew inside-out was the ZX-Spectrum. While I like to remember it fondly, I would never want a return to those primitive times.

    It's a bit like object-oriented programming - we hide the details of an object and only deal with the interface. It's more scalable and leads to faster development.

    • by xtal ( 49134 )

If you want to know a platform inside out, there's a fully documented open-source Linux kernel staring you in the face.

      Go get any of the dozens of embedded arm kits, any of the GREAT bits of documentation, and dig in. You want to get dirty with the hardware? U-boot is right there.

Want to play with SRAM and gates? A $100 FPGA will get you all you need, including a VGA out. We made a fully-HDL Pong game, including the VGA DAC, out of a $20 part.

      Hell, for ALL that matter, go get a Gameboy (any one) for

Nostalgia, and attention spans. Our attention spans, that is: us old farts'!

      We do not know the whole widget anymore from top to bottom like we did with the 8-bit machines of our childhoods... but if I talk to kids with an interest in computers today, they know a great deal more about the nitty-gritty of modern machines than I do. Sure they aren't taking a soldering iron to the motherboard anymore like I used to do with the C64 to make some sort of interface, instead they stick some components on a
Why "lament the decline of that kind of deep knowing"? Shouldn't we just encourage teens, students, hobbyists, computer science majors (e.g., anyone with an interest in this kind of thing) to get out there and buy a C64 or a kit or an open source game machine or an embedded device or any of the other numerous projects in which we could pursue "deep knowing"?

    Frankly, it's a great time to be interested in computers: absurd amounts of power for cheap, _along with_ easy access (thank you Internet) to kits, i
  • by gklinger ( 571901 ) on Saturday March 13, 2010 @07:12PM (#31468122)
    Should anyone wish to download an electronic copy (PDF) of Programming the Commodore 64 by R. C. West they may do so from DLH's Commodore Archive [bombjack.org]. It's a community supported archive of Commodore-related printed materials (books, magazines, newsletters, manuals etc.) and it could use your support. Enjoy.
  • I credit going through elementary school with a Commodore 64, one of the few in my school that couldn't actually afford one, for my advanced engineer position I have now. I spent so much time hacking away basic programs and stuff that I ended up learning so much computer science without even realizing.

It's the only explanation I have for how I've been a software engineer for my entire post-school career (the past dozen-plus years) while my undergrad degree was a BA in English.

    • Re: (Score:3, Funny)

      by PDG ( 100516 )
      So much for my English background that I can't even proofread my post properly. Should have said "one of the few in my school that could actually afford one"
  • I've been waiting for this guide... like... forever. Now I can finally finish that work project
Typing in programs line by line from a book or from Compute!'s Gazette to animate a moon landing, or play Basketball Sam and Ed (I think that's what it was called).

    Anyway, even Mad Magazine eventually published a program, but it never worked. :(
  • by rbrander ( 73222 ) on Saturday March 13, 2010 @08:12PM (#31468568) Homepage

    http://www.salon.com/21st/feature/1998/05/cov_12feature.html [salon.com]

    Ellen Ullman was a programmer for a full career before she discovered she was also a talented writer. The above link is to a Salon.com article that was basically an excerpt from her excellent book, "Close to the Machine".

    She writes about getting a PC and stripping off Windows, DOS, everything, until the (old even for 1998) BIOS is saying "Basic Not Loaded", then building Linux on it.

    Her conclusions do sound a smidge "kids these days" when she writes about modern programmers that only know libraries and IDEs, but I know the /. gang will love it:

    "Most of the programming team consisted of programmers who had great facility with Windows, Microsoft Visual C++ and the Foundation Classes. In no time at all, it seemed, they had generated many screenfuls of windows and toolbars and dialogs, all with connections to networks and data sources, thousands and thousands of lines of code. But when the inevitable difficulties of debugging came, they seemed at sea. In the face of the usual weird and unexplainable outcomes, they stood a bit agog. It was left to the UNIX-trained programmers to fix things. The UNIX team members were accustomed to having to know. Their view of programming as language-as-text gave them the patience to look slowly through the code. In the end, the overall "productivity" of the system, the fact that it came into being at all, was the handiwork not of tools that sought to make programming seem easy, but the work of engineers who had no fear of "hard."
    ---

    I do recall some /. (or maybe it's in Salon) commenter at the time who replied, "Yeah, and your Dad thinks you're a weenie because you don't know how to wire transistors on a circuit board, and his Dad thinks he's a weenie because he can't wind the copper wire around his own inductors". Which is fair enough. Even log cabins can't be made without manufactured tools unless you can mold a kiln from clay and smelt iron for the axe yourself.

    Still, the point of the desire is to have *maximum* control of the level of tool you are able to work directly with. The philosophy was echoed by Neal Stephenson in his essay, "In the Beginning Was the Command Line", the googling of which I will leave to the student. It's on-line.

  • The deep knowledge is still there - the well is just a LOT deeper, and more complex.

    In the days of the C64 - it was reasonable for a skilled and/or curious programmer to get to the bottom of things and learn how everything worked, exactly. It was also potentially USEFUL for him to do this.... it was the only direction you could go, short of inventing a new language.

So - today we still find deep knowledge out there - but it just may not be as useful for even a very good programmer to go ALL the way down.

    Yes, a

  • I had a 1 line game for the commodore series of computers.
Something like the below. You could do it in less than 40 characters with the shortcuts. It even worked on the TRS-80 line with slight modifications.
I liked walking into their sales rooms, with their "don't touch the serious stuff" feel, and having a game going in 30 seconds.

    0 poke 32788+loc,65; loc=loc+peek(151)*2-1; print tab(rand(37)),"XXX"; if peek(32788+loc) == 32 GOTO 0

    You had to clear the screen and start it with RUN at the bottom for it to work

    poke 3
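The one-liner's core loop can be sketched loosely in modern code. This is my own reconstruction of the idea, not a translation of the BASIC (the names, layout, and collision rule below are assumptions): each PRINT of an obstacle row scrolls the screen up, the paddle/joystick read nudges the player's column, and the game ends when the player's cell is no longer blank.

```python
# Loose model of the one-line scroller: rows of "XXX" obstacles are
# printed at random columns, the screen scrolls up one line per frame,
# and the player dodges left/right. All names and the collision rule
# are my own illustration, not the original BASIC's memory layout.
import random

WIDTH = 40

def step(screen, player_col, move):
    """Advance one frame. `screen` is a list of row strings, newest at
    the bottom; printing a row scrolls everything up one line.
    Returns (new_screen, new_col, alive)."""
    col = max(0, min(WIDTH - 1, player_col + move))  # clamp to screen
    row = [" "] * WIDTH
    at = random.randrange(WIDTH - 3)
    row[at:at + 3] = "XXX"                  # the obstacle the PRINT draws
    screen = screen[1:] + ["".join(row)]    # scroll up one line
    alive = screen[0][col] == " "           # collide when a row reaches you
    return screen, col, alive
```

Even in this toy form, the whole game is one state transition per frame, which is why the original fit in a single BASIC line.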

I had the monochrome green-on-black monitor that came with the setup. I remember the huge boxes that the computer and the monitor came in (compared to the slim box my MacBook Pro came in). I loved peeking and poking my way through programming in AppleSoft BASIC. I even cut my teeth learning assembly. It was fun! It was what I loved about learning to program. This article reminded me of those long-gone days. I still enjoy learning new languages but I miss my Apple ][e in so many w
  • by billakay ( 1607221 ) on Saturday March 13, 2010 @10:13PM (#31469354)
A few years ago, when I was an undergrad, I did a class project on the C64 just for the hell of it. The assignment was for my Theory of Computation class, and I happened to be taking an embedded-systems class at the same time. I ended up implementing a Turing machine simulator on the C64. I used a C cross-compiler on my PC to develop it, tested it on an emulator, and eventually burned it onto a ROM chip, which I put into an actual cartridge that ran on a real C64. It was a REALLY cool project that involved quite a few different aspects of CS, and I ended up taking first place at an undergrad research poster competition at a CS conference.
  • What the hell? The Commodore's BASIC interpreter was not Microsoft BASIC. It was just the Commodore's variant of the BASIC language. Microsoft had not a bloody thing to do with it.
    • by Spit ( 23158 )

      Commodore originally licensed BASIC from Microsoft. They wrangled some deal which gave them full rights with a one-off payment. MS eventually felt hard done by and you see the Microsoft copyright message on the c128 boot screen.

  • by LodCrappo ( 705968 ) on Saturday March 13, 2010 @10:30PM (#31469444)

    I've recently rediscovered the joy of small computers that can be fully understood by one person. The 8 bit machines from the 80s provide opportunities for learning and experimentation that are not present in today's computers. "Retro computing" is growing as a hobby amongst both people who remember these machines fondly from past days and younger folks who just find them interesting. It is strictly a hobby of course, very little "useful" stuff can be done with these boxes beyond the education they can provide.

    My favorite retro system is OS-9, a real time multitasking operating system that you can fit in your head. There is an open source version called NitrOS-9 [nitros9.org] which has excellent documentation and most of the code well commented. It runs on 6809 based computers like the Tandy Color Computer and the Tano Dragon.

    You can learn a tremendous amount about process scheduling, IPC, memory management, device drivers and low level I/O, etc from playing with this system.

  • Comment removed (Score:3, Interesting)

    by account_deleted ( 4530225 ) on Sunday March 14, 2010 @12:12AM (#31469938)
    Comment removed based on user account deletion
