Programming Education

Programming the Commodore 64: the Definitive Guide 245

Mirk writes "Back in 1985 it was possible to understand the whole computer, from the hardware up through device drivers and the kernel through to the high-level language that came burned into the ROMs (even if it was only Microsoft BASIC). The Reinvigorated Programmer revisits R. C. West's classic and exhaustive book Programming the Commodore 64 and laments the decline of that sort of comprehensive Deep Knowing."
This discussion has been archived. No new comments can be posted.

  • by headkase ( 533448 ) on Saturday March 13, 2010 @07:50PM (#31467930)
    I started on a Commodore 64 (well, a Commodore 128 that ran exclusively in 64 mode...) and learned machine language by breaking the protections of the day. Many of the things that were legal back then, such as copying DRM'd games, are now gone the way of the dodo. I honestly expect that twenty years from now a debugger will in itself be seen as a "tool of crime" or whatever wordage they use to keep them out of the general public's hands, just like lock-picks today. Hope you like high-level, because the day is coming when it will be illegal to be low-level without a government (or, more likely, Apple) license.
  • by jmorris42 ( 1458 ) * <jmorris&beau,org> on Saturday March 13, 2010 @08:02PM (#31468048)

    It was possible to fully understand all of the old 8- and 16-bit machines. Now you could spend months trying to fully understand one video card, which would be replaced by something more complex by the time you finished understanding it.

    And that was a big part of it, the stability of the platforms during that era. A C64 was exactly the same as every other one; a Tandy CoCo was identical to the million others of its kind. Later models tended to retain as close to 100% backward compatibility as possible, so knowledge and software tools retained value. Now you buy a lot of PCs with the understanding that a year from now you won't be able to buy more of the exact model, even if you stick to Optiplexes and such that promote the relative stability of the platform. Something will be slightly different. So, understanding being impossible, we abstract it all away to the greatest extent possible.

    If you want to reconnect with low level look at AVR microcontrollers. If you are really frugal you can get going for $20.

  • by phantomfive ( 622387 ) on Saturday March 13, 2010 @08:04PM (#31468062) Journal
    But... anyone who would have trouble going without garbage collection and couldn't learn how to code on a C64 probably shouldn't have a job. And anyone who can code on a C64 should have no problem adjusting to garbage collection.
  • V-Max (Score:5, Interesting)

    by headkase ( 533448 ) on Saturday March 13, 2010 @08:07PM (#31468092)
    One of the most comprehensive protections at that time was called "V-Max!", which stood for Verify Maximum. The "nibblers" (disk-copy software) couldn't touch it, even though those nibblers represented the ultimate in disk-copy technology at the time. There were two ways to copy V-Max. The first was to get a dedicated hardware copying unit. The second was to apply a bit of knowledge with a debugger cartridge. V-Max was a turn-key system: you gave them files and they wrapped the protection around them, providing a fast-loader at the same time.

    So what you would do is fill all of memory (the whole 64K) with a value you knew, say $AF. Then you would load a V-Max file from the disk; its loader would automatically take over, and while it was loading you would enter your debugger cartridge and change its exit point to point to itself. So instead of $0800: RTS you would make it $0800: JMP $0800. Then you would wait for the V-Max loader to fully load the file, press the button on your debugger cartridge, and use the memory monitor to find where the file loaded by seeing what memory was NOT $AF. Then from the debugger cartridge you would save that memory block out again: a completely de-protected file. Since V-Max used standard kernel load vectors, the program itself needed no further modification; the protection was completely gone, you just lost the fast-loader function, which you then re-added yourself into a chunk of memory wherever the game didn't use it. Relocatable code was best for that.

    Later versions of V-Max also did on-the-fly decompression of files, so occasionally while stripping the protection you would run into a situation where your destination disk ran out of space versus the original protected disk. Again, that was worked around by inserting your own custom loader, which also did decompression, into the kernel load vectors.

    V-Max was impossible for copy software of the day to copy, but with a little bit of knowledge and a debugger cartridge it was absolutely trivial to defeat.
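    The sentinel-fill trick described above can be sketched in modern Python. This is purely illustrative: the sentinel value, load address, and payload bytes are assumptions for the demo, not real V-Max data, and the real work was done with a memory monitor on actual hardware.

```python
# Sketch of the sentinel-fill technique for locating a loaded file in RAM.
# Fill all memory with a known value, let the loader run, then find the
# region that is no longer the sentinel. Addresses/bytes are illustrative.
SENTINEL = 0xAF

def find_loaded_region(memory):
    """Return (start, end) addresses of the span holding non-sentinel bytes."""
    start = end = None
    for addr, byte in enumerate(memory):
        if byte != SENTINEL:
            if start is None:
                start = addr
            end = addr
    return start, end

# Simulate: fill 64K with the sentinel, then "load" a 256-byte file at $0800.
ram = [SENTINEL] * 65536
payload = [0x4C, 0x00, 0x08] + [0x60] * 253   # pretend program bytes
ram[0x0800:0x0800 + len(payload)] = payload

start, end = find_loaded_region(ram)
print(hex(start), hex(end))   # the block to save back out with the monitor
```

    Tracking the first and last non-sentinel address (rather than stopping at the first run) mirrors what you would do with the monitor: save the whole modified span even if the loaded program happens to contain a few $AF bytes of its own.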
  • Re:Indeed (Score:5, Interesting)

    by davester666 ( 731373 ) on Saturday March 13, 2010 @08:19PM (#31468172) Journal

    There doesn't appear to be any section on custom high-speed communication with the external floppy drive unit. IIRC, you could upload a small program to the drive and then, in particular, read data from the drive a lot faster than the 'OS' normally supported. This technique was also used to implement copy protection for a bunch of titles, primarily by stepping the drive head half a track between tracks and then doing reads. Production disk duplication could write to both the track and between tracks [or could write a wide enough track to cover the whole area], but regular floppy drives couldn't write both [you could either write on the track, or between tracks].

    Not that I was interested in this stuff or anything.

  • Re:Relax (Score:5, Interesting)

    by Anonymous Coward on Saturday March 13, 2010 @08:21PM (#31468180)
    I was playing a game with some DRM (either StarForce or SecuROM) and it wouldn't run if I had a debugger present. I asked them why and they were all like "Anyone who has a debugger and is playing the game is a hacker." That's a RIGHTLY earned state of paranoia.
  • Uphill Both Ways (Score:5, Interesting)

    by headkase ( 533448 ) on Saturday March 13, 2010 @08:24PM (#31468204)
    And why did I spend time removing protection systems? The funny part is: I owned an MSD floppy drive which was completely incompatible at the machine-language level with the 1541 drives everyone else owned and that all the game-makers wrote their protection systems for. So my floppy drive wouldn't load any of the software of the day. I literally bought a game, had to hack away the protection, and then I could play it on my computer. Of course no one will believe me when I say this, but damnit, it's the truth! Now get off my lawn.
  • by Anonymous Coward on Saturday March 13, 2010 @08:28PM (#31468234)

    I think it's there if you want it. A couple of weeks ago I was writing in assembly language specifically for one of the top ten fastest computers in the world. Sure, there's a shitload I don't know about that cluster -- no idea if there's an RJ-45 port, much less how I'd access it via the assembler, but I could probably find out if I cared and got clearance for it.

    It's easy to romanticize simpler times, and there is some truth to them being simpler, but I'm damn excited about today. I mean, really, you can grab the guide for ARM assembler, pick up a phone or other portable, and have a ridiculous amount of computing power that you can bend to your will. OS and application bloat is either there for a purpose (e.g., to make a computer easier to program for, or to add more features for less work) or a result of laziness, but on most systems you can get down to ground level if you really want. It's just that most people do not.

  • by TheRaven64 ( 641858 ) on Saturday March 13, 2010 @08:49PM (#31468408) Journal

    Absolutely. This is the point of, for example, the STEPS project at VPRI, which aims to build an entire working system (kernel, GUI, dev tools, apps) in under 20,000 lines of code. Some of the stuff they've produced, like OMeta and COLA, is really impressive.

    Adding complexity to a system is easy. Adding features without increasing the complexity is much harder, but much more useful and rewarding.

  • by headkase ( 533448 ) on Saturday March 13, 2010 @09:00PM (#31468490)
    Fast loaders worked because the kernel ROM software didn't fully take advantage of the hardware. Between the C64 and the 1541 floppy drive, the connector cable had several lines for carrying information (four, if I remember right; I'm not sure of the exact number, but this is the principle the fast loaders worked by). The kernel routines built into ROM used only one of those lines to signal from the drive to the computer. The "fast loaders" simply uploaded a program to the drive which used all of the lines to signal information. They weren't fast magic; they just removed a deficiency in the kernel ROM routines. And tape-based fast loaders worked because the kernel routines would save a copy of the information to tape and then immediately save a complete second copy to compare against for error correction on load. The tape fast loaders just skipped saving and comparing the redundant copy to get the speed. Disk fast loaders didn't compromise the integrity of the information the way tape fast loaders had the potential to, though. Remember, computers back then were full of noise, especially when you were talking to tape drives.
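    The principle above is easy to put numbers on: if each handshake moves one bit per line, using four lines instead of one moves four bits per handshake and roughly quadruples raw throughput. The cycle count, clock rate, and file size below are illustrative assumptions, not measured C64/1541 figures.

```python
# Back-of-the-envelope throughput gain from using more serial-bus lines.
# All timing figures are assumed round numbers for illustration only.
def transfer_time(file_bytes, lines, cycles_per_transfer, clock_hz=1_000_000):
    """Seconds to move a file when each transfer shifts `lines` bits."""
    bits = file_bytes * 8
    transfers = bits // lines            # each handshake moves `lines` bits
    return transfers * cycles_per_transfer / clock_hz

slow = transfer_time(32 * 1024, lines=1, cycles_per_transfer=20)
fast = transfer_time(32 * 1024, lines=4, cycles_per_transfer=20)
print(f"1-line: {slow:.2f}s  4-line: {fast:.2f}s  speedup: {slow / fast:.0f}x")
```

    Real fast loaders gained even more than this naive ratio suggests, since the ROM routine's per-byte handshaking overhead was also eliminated; the sketch only models the width of the data path.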
  • by Jah-Wren Ryel ( 80510 ) on Saturday March 13, 2010 @09:10PM (#31468542)

    You are only partly right. All of those things are knowable and learnable within a reasonable length of time - the problem is getting the documentation to know them. Too much documentation is locked up as proprietary info, either behind a paywall or an NDA wall.

  • by phantomfive ( 622387 ) on Saturday March 13, 2010 @09:17PM (#31468598) Journal
    I don't know, man, it's a lot of work. On my computer I have Ruby, Python, Perl, GCC, and who knows what else installed. There are tons of APIs that I don't know what they do, even in the languages that I do know. I also have a web server, an FTP server, and probably several other servers. They aren't running right now, but they came with the system.

    On top of that, I have postfix config files, a mach_kernel file, and a bunch of other weird files that are either quite complex (this book about Postfix is 288 pages [oreilly.com]), or I have no idea what they do, or they are binary and I have no hope of ever figuring them out. Even if I switch to my Linux partition, where I have the source code to everything, it's a lot of work to understand every single file in even just the kernel. I'm not sure anyone even understands the kernel itself completely. I haven't talked about hardware yet, but Intel processors do some tricky out-of-order operations and pipelining and such; it's not always easy to predict what is going to happen on one of those things. It is a lot of knowledge, and I am not sure anyone actually does understand it all today, even if it is possible. No one I know makes that claim.

    This is really different from the days of the C64, where the entire thing was only 64K (actually a bit more with paging). You could read the entire memory contents of 64K in an afternoon, literally everything on the computer. You could definitely understand all the 'source code' (except the source code was in assembly) to the entire system. Predicting what the processor would do and how long it would take wasn't hard. You could fit everything about the system (even schematics of the hardware) in a single book.
  • Re:Relax (Score:4, Interesting)

    by fuzzyfuzzyfungus ( 1223518 ) on Saturday March 13, 2010 @09:17PM (#31468600) Journal
    Given that Germany has already gone and adopted an absurdly vague and overbroad law aimed at "hacking tools" [schneier.com], I wouldn't really describe somebody hypothesizing that other jurisdictions might do so in the future as "paranoid".

    Perhaps ultimately more dangerous (because they tend to be subtler) are situations where no law ever bans something, per se, but some quiet mixture of contractual, legal, and technical pressure effectively prevents it anyway. Consider SDI [wikipedia.org] for an instance of that: a digital video transmission standard, available well in advance of HDMI, that was frozen out of the "consumer" market entirely. It's not like possession was illegal or anything; but most people never even heard of it, nor was it available on any broadly affordable hardware.

    In the case of something like debuggers, I'd be very surprised to see any sort of legal ban; but the technological/private-sector contractual de facto neutralization is an eminently plausible scenario. Already, in recent versions of Windows, any media application that requires the "Protected Video Path" will throw a fit if there are any unsigned drivers loaded that could compromise that path. An analogous "Protected Execution Path", provided by the OS for programs that didn't want anybody else debugging them or looking at their memory, hardly seems implausible. Not to mention, of course, the increasing percentage of consumer-level computer activity that is occurring on devices where being able to run arbitrary programs isn't even an expectation. Not much debugging going on on Xbox 360s, and debuggers don't have to be illegal to not be available through the App Store.

    There will always be gaps, of course, for the sufficiently knowledgeable, motivated, and well equipped; but a largely opaque consumer level computing environment seems like an unpleasantly plausible prediction.
  • by mindstrm ( 20013 ) on Saturday March 13, 2010 @09:41PM (#31468760)

    The 1541 drives were purposefully slowed down to maintain compatibility with some other Commodore hardware (I forget at the moment exactly which)... so they weren't so much fast-loaders as they were "doing it the way the engineers designed it to work, not the way the boss made them change it so he could claim compatibility".

  • by RichMan ( 8097 ) on Saturday March 13, 2010 @10:05PM (#31468930)

    I had a 1-line game for the Commodore series of computers.
    Something like the below. You could do it in less than 40 characters with the shortcuts. It even worked on the TRS-80 line with slight modifications.
    I liked walking into their sales rooms with the "don't touch, serious stuff" feel and having a game going in 30 seconds.

    0 poke 32788+loc,65:loc=loc+peek(151)*2-1:print tab(int(rnd(1)*37));"XXX":if peek(32788+loc)=32 then 0

    You had to clear the screen and start it with RUN at the bottom for it to work

    poke 32788+loc,65 # display your lander "A" at the middle top of the screen + offset
    loc=loc+peek(151)*2-1 # update your location to the right if shift is pressed, otherwise to the left. No going straight.
    print "XXX" # put a block XXX at the bottom of the screen and scroll the previous display of your lander A off the top;
                # the blocks would scroll up towards the top of the screen and you had to dodge them
    if peek() # loop if you are not going to hit a block

    It ran nice and fast.

  • by billakay ( 1607221 ) on Saturday March 13, 2010 @11:13PM (#31469354)
    A few years ago when I was an undergrad, I did a class project on the C64 just for the hell of it... the assignment was for my Theory of Computation class, and I happened to be taking an embedded systems class at the same time. I ended up implementing a Turing Machine simulator on the C64. I used a C cross-compiler on my PC to develop it, tested it on an emulator, and eventually burned it onto a ROM chip which I put into an actual cartridge that ran on a real C64. It was a REALLY cool project that involved quite a few different aspects of CS, and I ended up taking first place at an undergrad research poster competition at a CS conference.
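    For anyone curious what such a project boils down to, here is a minimal Turing machine simulator in Python. This is an illustrative sketch, not the poster's C cartridge code; the transition-table encoding and the example bit-flipping machine are assumptions for the demo.

```python
# Minimal Turing machine: a transition table maps (state, symbol) to
# (new_state, symbol_to_write, head_move). The tape is a sparse dict so it
# is unbounded in both directions, defaulting to the blank symbol.
from collections import defaultdict

def run(program, tape, state="start", blank="_", max_steps=10_000):
    cells = defaultdict(lambda: blank, enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        state, cells[head], move = program[(state, cells[head])]
        head += 1 if move == "R" else -1
    used = sorted(k for k in cells if cells[k] != blank)
    return "".join(cells[i] for i in range(used[0], used[-1] + 1)) if used else ""

# Example machine: walk right, inverting 0 <-> 1, halting at the first blank.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run(flip, "10110"))   # -> 01001
```

    On a real C64 the fun constraints are exactly the ones the simulator hides: the tape has to live in a fixed slice of the 64K address space, and the transition table has to be packed into whatever ROM is left on the cartridge.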
  • by LodCrappo ( 705968 ) on Saturday March 13, 2010 @11:30PM (#31469444)

    I've recently rediscovered the joy of small computers that can be fully understood by one person. The 8 bit machines from the 80s provide opportunities for learning and experimentation that are not present in today's computers. "Retro computing" is growing as a hobby amongst both people who remember these machines fondly from past days and younger folks who just find them interesting. It is strictly a hobby of course, very little "useful" stuff can be done with these boxes beyond the education they can provide.

    My favorite retro system is OS-9, a real time multitasking operating system that you can fit in your head. There is an open source version called NitrOS-9 [nitros9.org] which has excellent documentation and most of the code well commented. It runs on 6809 based computers like the Tandy Color Computer and the Tano Dragon.

    You can learn a tremendous amount about process scheduling, IPC, memory management, device drivers and low level I/O, etc from playing with this system.

  • by BuR4N ( 512430 ) on Sunday March 14, 2010 @01:12AM (#31469938) Journal
    This is a must-read for anyone interested in Commodore and its products, a great historical account of how, among other things, the C64 came to be.

    http://www.amazon.com/Edge-Spectacular-Rise-Fall-Commodore/dp/0973864907 [amazon.com]
  • Re:Indeed (Score:5, Interesting)

    by fwarren ( 579763 ) on Sunday March 14, 2010 @02:00PM (#31473304) Homepage

    Yet every day, I put young pups to shame. It does not matter if it is troubleshooting hardware or software. It does not matter if it is dealing with programming or configuring. My mental map of the problem is different from theirs.

    The skills I learned back in the '80s on a computer that you could understand, I still use today. My "concept" of what a computer is was formed by understanding the whole: RAM, ROM, interrupts, I/O, how the CPU works. All from a machine with 64K of RAM and 20K of ROM.

    Under the hood, beneath all of that abstraction, is a PC that is very much like a C64. With the C64, people learned mastery of their system. With the PC, so much hardware is out there that it is impossible to learn it all inside out and take advantage of every feature. So greater power has always been obtained in the PC world by moving to faster hardware, not by utilizing the current hardware better. It is all abstraction running on very fast, underutilized hardware.

    The techs coming out of college for the last 20 years do not understand a computer conceptually like those who learned this stuff in the '70s or '80s. When it comes to troubleshooting all of this abstraction, many folks have no idea that there is anything beneath the abstraction.

    I recently attended a college programming class as a requirement for a degree. The instructor gave us a quiz at one point and only 5 students out of 60 passed. Why? Because most students did not know how to write a program on a piece of paper. Without IntelliSense holding their hand they could not code. I learned to program from the manual that came with my C64. I learned to program better by typing in programs from Compute! magazine. I have written hundreds of pages of code on paper and typed them into a computer at a later time. It is a skill I take for granted. Without the abstraction of IntelliSense, most of the class was rendered useless.

    Something has been gained, but something has also been lost. When I was a kid I dreamed of computers that could do a tenth of what they do now. I learned everything I could about them. Lived, dreamed, ate, slept computers, computers, computers. Nowadays, my kids can buy a laptop with 3 gigs of RAM for the price of a C64 and a 1541 drive. And what do they do with it? Program? Nope. They talk to their friends on Facebook, not learn programming. They play games, not write them. They pirate music, not create it.

    It looks like those who understand a computer and really make it do something will always remain a small, elite Priesthood.
