
Want To Get Kids Interested In Programming? Teach Them Computer History

An anonymous reader writes "With poor IT teaching putting kids off pursuing a career in computing, it is time to look for a new approach. Taking kids back to the time of computing pioneers like John von Neumann and the first machines, the likes of the Z3, the ENIAC and the Colossus, would both inspire them and help get across the fundamentals of how computers work, argues silicon.com."
  • Exactly what I did (Score:5, Interesting)

    by Tx-0 ( 572768 ) on Saturday January 07, 2012 @10:43AM (#38621294)
    Given the opportunity to teach Informatics to Diagnostic Radiology Imaging students, almost all in their 20s, I decided to start with a first lesson on the history of computing, beginning in ancient times, when the most sophisticated calculator was the abacus. Guess what? Almost all of them listened, interested in something that has little to do with their own field.
  • by polymeris ( 902231 ) on Saturday January 07, 2012 @10:47AM (#38621304)

    Why?

    Last week I put together a one-hour implementation of Reversi [wikipedia.org] and showed it to my little brother (he's 12). I expected his reaction to be "meh": the board was a boring 2D array of dots, with Xs and Os representing the pieces, and input was in numerical coordinates (roughly as in the sketch below this comment), while of course he's used to cinematic 3D games and mouse input.
    To my surprise, he not only had fun playing, he wanted to know how I had done it, what could be improved, pointed out bugs, and keeps nagging me to fix them.

    Still don't know what the best way to start programming would be for him, but motivation is not the hard part.
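
    As a rough illustration of the bare-bones interface polymeris describes: the dots-Xs-and-Os board and numeric-coordinate input come from the comment, but the code itself is a guess at one way to do it, with legal-move checking and piece flipping left out entirely:

        # A toy Reversi-style board: a 2D array of dots, with Xs and Os for
        # the pieces, and a move entered as numeric coordinates. The rules
        # are omitted; this only shows the "boring" text representation.
        EMPTY, X, O = ".", "X", "O"
        board = [[EMPTY] * 8 for _ in range(8)]
        board[3][3], board[4][4] = O, O        # standard starting position
        board[3][4], board[4][3] = X, X

        def show(board):
            print("  " + " ".join(str(c) for c in range(8)))
            for r, row in enumerate(board):
                print(r, " ".join(row))

        show(board)
        r, c = (int(n) for n in input("row col> ").split())
        board[r][c] = X                        # no legality check here
        show(board)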

  • Re:Riiiiiight. (Score:3, Interesting)

    by Tx-0 ( 572768 ) on Saturday January 07, 2012 @10:48AM (#38621316)
    I honestly think the interest a kid will show is proportional to the passion he hears in the teacher's words.
  • by nurb432 ( 527695 ) on Saturday January 07, 2012 @11:11AM (#38621448) Homepage Journal

    We need to stop this belief that people have, that computers are appliances.

    Computers should be appliances for the consumer. Stuff should 'just work', and consumers should be insulated from the 'magic'. Otherwise it would be like expecting a person who drives a car to be a mechanic. Those days are long gone too.

  • by ledow ( 319597 ) on Saturday January 07, 2012 @11:26AM (#38621536) Homepage

    Most kids, especially boys, when given a programmable game that they can play, will spend hours changing it so they get a million points, or so their character looks like a penis, or so there are a million things on screen, etc.

    It's actually *not* a bad way for them to learn at all: see, understand, experiment, see the results of the experiment, tinker more. It's how I got into programming at first - sure, there was a lot of typing in listings, etc., but the coolest bit was being able to tweak the QBASIC Gorillas game and things like that.

    In my early teens, I was ripping the graphics resources out of games like Castle of the Winds and trying to create my own version, and cracking the copy protection on Desert Strike for myself so I didn't need to keep swapping the fecking disks (I did that using MS-DOS debug and a copy of Ralf Brown's Interrupt List).

    But the greatest initial impetus, that hooked my entire maths class on silly graphical-calculator games I was writing, was for them to see the code, tweak it and start to understand how it worked.

    I spent hours explaining how to program to a teenager who was working under me for his work experience (two weeks in a "real" job while still at school). The most interesting thing was that he couldn't see how, e.g., 3D, sound or joysticks could be thought of in the same way as the numbers he manipulated in a basic dice game. Once he realised that everything from networking to 3D to AI to physics was just a matter of manipulating numbers, the "magic trick" of his console games was revealed and he wanted to replicate them.
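
    The everything-is-just-numbers point is easy to demonstrate in a few lines. A tiny illustration (mine, not ledow's; the specific examples are arbitrary):

        # The same arithmetic underlies a dice game, a sound and a 3D rotation.
        import math, random

        die = random.randint(1, 6)                     # a dice roll: one number

        # One second of an A440 tone at 8000 samples/sec: a list of numbers.
        tone = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8000)]

        # Rotating a 3D point 90 degrees about the z-axis: more arithmetic.
        x, y = 1.0, 0.0
        angle = math.pi / 2
        x, y = (x * math.cos(angle) - y * math.sin(angle),
                x * math.sin(angle) + y * math.cos(angle))

        print(die, round(tone[10], 3), round(x, 3), round(y, 3))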

  • by JPLemme ( 106723 ) on Saturday January 07, 2012 @11:30AM (#38621562)
    Last year I wrote a simulator of a VERY simple computer. It had four instructions, 16 bytes of memory, and 2 registers. There were no branch instructions; literally the only thing you could do was write a program to add two (8-bit) numbers together. (And it would set the error bit if the result was bigger than 255.) I gave it an interface of nothing but (simulated) LED lights for the registers and memory, and then (simulated) push buttons to select a memory address and poke a value into it. It looked like a relic from 1956.

    I then explained it to my then 9- and 11-year-old sons (both of whom are teaching themselves to program), explained base-2 math, explained how the "computer" worked and the four instructions they had available, gave them a whiteboard, and tasked them with writing a program to add two numbers.

    They went NUTS! They were discussing theories, pointing out errors in each other's ideas, and getting excited when they fixed bugs. And they were doing it with a maturity level way beyond their years. They loved it. And I think that part of it was because it was simple enough that they felt in control of it. I also had the memory lights turn green as the instruction pointer advanced, so they could watch the program running. (It was slow enough that they could follow it and watch the registers change.) Granted, my boys love history, so that may have sweetened the deal for them a bit. But I was shocked at how easily they picked it up and how much they enjoyed it.

    I'd like to expand it to the point where they can watch a stack operating and see pointers and offsets being used, but I just haven't had the time to follow up on it. But it confirms (for me) that the idea of starting at the beginning might be the most effective way to teach programming. (I also taught programming at a local trade college for a few years, and I noticed how much harder it was for the students to pick up--say--OO programming concepts when they had never had to deal with the problems that OO concepts were designed to solve. Trying to simplify it even more for elementary school students seemed misguided.)

    The very best part of the story was, six months later, touring the Mercury-Redstone program blockhouse at Kennedy Space Center (I know it's not technically on the KSC property, save your breath). They had an old Sperry Rand computer with a console full of lights, and both boys lit up and told the (confused) tour guide, "I KNOW THIS! I KNOW HOW TO PROGRAM IT!" It nearly brought a nerdy tear to my eye.

    P.S. If anyone is curious for more information I'd be happy to share. It wasn't very complicated, but I think it has a lot of potential.
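
    A sketch of what such a machine might look like in code. The 16 bytes of memory, the two registers, the four branch-free instructions and the error bit for results over 255 all come from the post; the opcode names (LDA, LDB, ADD, STA), their numbering and the opcode-plus-address instruction format are guesses, since the post doesn't spell them out:

        # Toy simulator in the spirit of the machine described above. Memory
        # holds both program and data; the instruction pointer only walks
        # forward, since there are no branch instructions.
        MEM_SIZE = 16
        LDA, LDB, ADD, STA = 1, 2, 3, 4    # assumed opcodes; anything else halts

        def run(mem):
            a = b = 0
            error = False                  # the error bit for 8-bit overflow
            ip = 0
            while ip + 1 < MEM_SIZE:
                op, arg = mem[ip], mem[ip + 1]
                ip += 2
                if op == LDA:
                    a = mem[arg]
                elif op == LDB:
                    b = mem[arg]
                elif op == ADD:            # arg is unused by ADD in this encoding
                    a += b
                    if a > 255:
                        a &= 0xFF
                        error = True
                elif op == STA:
                    mem[arg] = a
                else:
                    break                  # unknown opcode: the machine stops
            return error

        # "Write a program to add two numbers": operands at addresses 12 and
        # 13, result stored at 14. The values 200 + 100 overflow eight bits.
        mem = [LDA, 12, LDB, 13, ADD, 0, STA, 14, 0, 0, 0, 0, 200, 100, 0, 0]
        error = run(mem)
        print(mem[14], error)              # 44 True: 300 wrapped to eight bits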
  • Re:Frankly... (Score:4, Interesting)

    by TheRaven64 ( 641858 ) on Saturday January 07, 2012 @11:58AM (#38621796) Journal

    For 8-bit processors with 64K memory (or 128K +banked memory), instructions were only between 1 and 3 bytes in size. Compare that to 16-bit processors with 640K memory where instructions are between 2 and 8 bytes in size or 32-bit processors with 2 MBytes where instructions are between 4 and 32 bytes in size. You still get around 32,000 instructions. It's just the sizes that change.

    I'm not sure where you're getting these numbers from. On ARM, instructions are 2-4 bytes in size for ARMv7 (32-bit) and 4 bytes for ARMv8 in 64-bit mode. On x86-64, the smallest instructions are 1 byte and the common ones average about 3-4 bytes. And these instructions do a hell of a lot more than 6502 or Z80 instructions. One instruction on pretty much any modern CPU can take two vectors of four 32-bit floating point values and do a fused multiply-add (for example). If you could do that in under 100 Z80 instructions, then I'd be very impressed.

    The instruction size has nothing to do with the increase in memory usage. The biggest difference is the increase in data size. The images on this page alone are several MBs when uncompressed. Just the text is more than would fit into the memory of any 8-bit system. Try writing a book on an 8-bit system: you'll end up having to save each chapter to disk / tape separately, because it can't store the entire thing in memory even as plain 7-bit ASCII text. And, with the massive increase in data size, we've also seen an increase in the things people do with it. High-level programming languages mean that one line of code can now be hundreds of instructions, rather than just one or two. Code reuse means that writing a million-instruction program may only mean writing a few tens of thousands of instructions of new code.
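
    For concreteness, here is the work done by that single vector FMA instruction, written out in plain Python. This shows the arithmetic only; a real FMA also fuses the multiply and add into a single rounding step, which the expression below does not reproduce:

        # Four 32-bit lanes, each computing a*b + c; one modern vector FMA
        # instruction does all four at once, while an 8-bit CPU would need a
        # software floating-point routine for every lane.
        a = [1.0, 2.0, 3.0, 4.0]
        b = [0.5, 0.25, 0.125, 0.0625]
        c = [10.0, 20.0, 30.0, 40.0]
        fma = [ai * bi + ci for ai, bi, ci in zip(a, b, c)]
        print(fma)   # [10.5, 20.5, 30.375, 40.25]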

  • Re:Legos (Score:1, Interesting)

    by matthiasvegh ( 1800634 ) on Saturday January 07, 2012 @12:10PM (#38621882)
    Unfortunately, as a child who enjoyed playing with LEGO (singular, not plural) I always built what was in the instructions first, and then started building totally different things, combining kits etc. to achieve what had never been done before. I always thought that this was what LEGO was all about. Turns out, everyone else I knew who had LEGO built it once (according to the manual) and then left it on the shelf to collect dust. Ultimately what I'm trying to say is: yes, there is a lot of potential in everything (LEGO, game programming, etc.), but no one except geeks will ever actually use any of it. And for geeks, getting them interested is kind of moot.
  • RCA 1802 (Score:5, Interesting)

    by Kupfernigk ( 1190345 ) on Saturday January 07, 2012 @12:12PM (#38621910)
    In ancient history, if you needed a really low-power microprocessor-based system, there was a processor made by RCA called the 1802. It was CMOS, and I once demonstrated a development board running off two lemon halves, each with a copper and a zinc rod in it. It was very slow but, and here is the point, it could be clocked down to zero. You could single-step not just instructions but the entire CPU fetch/execute cycle: see the memory address go out on the bus, see the data. Although it had a stack pointer, you had to manipulate it programmatically and write your own code to handle subroutine calls, nesting and returns.

    It struck me then that it would be the perfect teaching tool, because you could do anything with it, from teaching the von Neumann architecture to running BASIC on a terminal. The processor and its support chips are long dead (I'm writing about the late 70s), and there doesn't seem to be any modern equivalent.
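
    A rough sketch of what single-stepping the whole fetch/execute cycle can look like, with the address and data "bus" exposed at each step. This is an illustration only; the opcodes are invented and nothing here models the real 1802:

        # Toy single-steppable CPU: each step advances one phase of the
        # fetch/execute cycle, so you can watch the address go out on the
        # bus and the data come back, as with the 1802 clocked down to zero.
        def cpu(mem):
            pc = 0
            acc = 0
            while True:
                yield "fetch addr", pc
                op = mem[pc]
                yield "fetch data", op
                pc += 1
                if op == 0x01:             # LOAD: next byte -> accumulator
                    yield "exec  addr", pc
                    acc = mem[pc]
                    yield "exec  data", acc
                    pc += 1
                elif op == 0x02:           # INC: accumulator + 1
                    acc = (acc + 1) & 0xFF
                    yield "exec  acc", acc
                else:                      # anything else halts
                    return

        for phase, value in cpu([0x01, 0x41, 0x02, 0x00]):  # LOAD 0x41; INC; halt
            input("step> ")                # press Enter to advance one phase
            print(phase, format(value, "#04x"))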
