Learning Computer Science via Assembly Language

johnnyb writes " A new book was just released which is based on a new concept - teaching computer science through assembly language (Linux x86 assembly language, to be exact). This book teaches how the machine itself operates, rather than just the language. I've found that the key difference between mediocre and excellent programmers is whether or not they know assembly language. Those that do tend to understand computers themselves at a much deeper level. Although unheard of today, this concept isn't really all that new -- there used to not be much choice in years past. Apple computers came with only BASIC and assembly language, and there were books available on assembly language for kids. This is why the old-timers are often viewed as 'wizards': they had to know assembly language programming. Perhaps this current obsession with learning using 'easy' languages is the wrong way to do things. High-level languages are great, but learning them will never teach you about computers. Perhaps it's time that computer science curriculums start teaching assembly language first."
  • by agm ( 467017 ) * on Thursday February 05, 2004 @06:46PM (#8195896)
    Is "Linux x86 assembly" any different to any other kind of "x86 assembly"?
    • by shaitand ( 626655 ) on Thursday February 05, 2004 @06:48PM (#8195919) Journal
      It is in the same fashion that win32 asm is different from linux asm. The core is the same, but knowing just the core of x86 assembler is only going to get you so far if what you want to do is talk to the kernel.
      • Not the point! (Score:5, Insightful)

        by www.sorehands.com ( 142825 ) on Thursday February 05, 2004 @07:17PM (#8196316) Homepage
        The point is the understanding of the workings of the machine. When I was in school, we had to take a computer architecture class which included using AND gates to make counters and such.


        My first IBM PC job was C, but I had to learn 8086 so that I could debug since there was no source level debugging when using overlays.

        Anyways, how do you find a compiler bug, if you can't read the code the compiler generates?

        • by EugeneK ( 50783 ) on Thursday February 05, 2004 @07:49PM (#8196612) Homepage Journal
          Luxury! In my day we had to make our own AND gates out of OR and XOR gates!
          • Re:Not the point! (Score:5, Informative)

            by tomstdenis ( 446163 ) <<moc.liamg> <ta> <sinedtsmot>> on Thursday February 05, 2004 @07:58PM (#8196740) Homepage
            Correct me if I'm wrong, but isn't the most primitive CMOS gate a NAND gate? So I highly doubt you would make an AND out of an OR and an XOR gate [XOR being the most costly of the three].

            Tom
            • Just to clarify (Score:5, Informative)

              by Raul654 ( 453029 ) on Thursday February 05, 2004 @10:41PM (#8197903) Homepage
              The correct answers are down there, but just to collect them and clarify - you can build anything using nothing but NANDS. Alternatively, you can build anything using nothing but XORS. You can prove this easily using De Morgan's theorem.

              However, in the real world, NANDS are cheap (2-3 transistors), so that's what everyone uses.
              • by BigBlockMopar ( 191202 ) on Friday February 06, 2004 @12:12AM (#8198428) Homepage

                However, in the real world, NANDS are cheap (2-3 transistors), so that's what everyone uses.

                Well, NANDs are easy to make with MOSFETs or vacuum tubes.

                But I suggest that, in order to simplify the learning of digital logic and avoid this whole nastiness of DeMorgan, we should adopt relays as our primary logic device.

                Think about it: two relays with their contacts in parallel = OR. Two relays with their contacts in series = AND. A relay with normally-closed contacts = NOT.

                In this way, all design work can be done with natural logic (AND, OR, NOT) rather than "efficient" NAND, NOR, etc.

                On top of that, your computer would make satisfying clicking sounds reminiscent of a pinball machine's scorekeeping system or an old elevator controller, while you're crunching SETI@Home units.

                I'm building a 4-bit binary full adder with nothing but relays in order to demonstrate their sheer computing power, and was hoping that someone could write me drivers to allow it to have practical uses.

              • in reality... (Score:5, Informative)

                by slew ( 2918 ) on Friday February 06, 2004 @02:51AM (#8199100)
                For what it's worth, they don't use just NANDs in cmos chip design in the real world. The primary primitive is the AND-OR-INVERT (AOI) structure.

                In the cmos world, pass-gates are much cheaper than amplifying gates (in the size vs speed vs power tradeoff), although you can't put too many pass gates in a row (signal degradation). So in fact MUX (multiplexor to pass one of the two inputs using the control of a third) and XORS (use input A to pass either !B or B) are used quite a bit.

                Some background might be helpful to think about the more complicated AOI structure, though...

                In a cmos NAND-gate, the pull-up side is two p-type pass gates in parallel from the output to Vdd (the positive rail), so that if either of the two inputs is low, the output is pulled high. For the pull-down side, two n-type pass gates are in series to ground, so both inputs have to be high before the output is pulled to ground. This gives us a total of 4 transistors for a cmos-nand where the longest pass gate depth is 2 (the pull-down). The pull-up is restricted to be the complement function of the pull-down in CMOS (otherwise either the pull-up and pull-down will fight, or nobody will pull, causing the output to float and/or oscillate).

                A 2-input NOR gate has the p-type in series and the n-type in parallel (for the same # of transistors).

                Due to a quirk of semiconductor technology, n-type transistors are easier to make more powerful than p-type, so a NAND is usually slightly faster than a NOR (the two series n-types in a NAND gate are better at pulling down than the two series p-types are at pulling up in a NOR gate). However, this isn't the end of the story...

                Notice that you can build a 3-input NAND by just adding more p-type transistors in parallel to the pull-up and more n-type in series to the pull-down. You can make even more complicated logic by putting the pull-up and pull-down transistors in combinations of series and parallel configurations. The most interesting cmos configurations are called AOI (and-or-invert) since they are the ones you can make with simple series/parallel chains of transistors for the pull-up and pull-down.

                For most cmos semi-conductor technologies, you are limited to about 4 pass gates in series or parallel before the noise margin starts to kill you and you need to stop using pass gates and just start a new amplifying "gate". Thus most chips are designed to use 4 input AOI gates where possible and smaller gates to finish out the logic implementation.

                Thus "everyone" really uses lots of different types of gates (including simple NAND and XORS as well as more complicated AOI).
    • I think what is meant is programming in assembly under Linux. Programming in assembly under Linux is different from, say, programming in assembly under DOS/Windows.
    • by Cryptnotic ( 154382 ) on Thursday February 05, 2004 @06:57PM (#8196035)
      Well, for starters the syntax for assemblers is different. There are two standards, the AT&T standard (which is used by the GNU assembler) and the other one that is more familiar to DOS/Windows x86 assembly programmers (which is used by the NASM assembler).

      Second, OS interfaces for making system calls (e.g., to read files, open network connections, etc.) are different in Linux versus DOS or Windows.
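
      A rough sketch of what that Linux-specific interface looks like, assuming the 32-bit int $0x80 convention (system call number in %eax, arguments in %ebx/%ecx/%edx) and GNU as/AT&T syntax; the labels are only illustrative:

          .section .data
          msg:     .ascii "hello\n"
          msg_len = . - msg

          .section .text
          .globl _start
          _start:
              movl $4, %eax          # __NR_write on 32-bit Linux
              movl $1, %ebx          # fd 1 = stdout
              movl $msg, %ecx        # buffer to write
              movl $msg_len, %edx    # number of bytes
              int  $0x80             # trap into the kernel

              movl $1, %eax          # __NR_exit
              movl $0, %ebx          # exit status 0
              int  $0x80

      The same program under DOS would go through int 21h with completely different register conventions, which is exactly the difference being described.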

      • by Anonymous Coward on Thursday February 05, 2004 @07:14PM (#8196269)
        There are two standards, the AT&T ... and the other one

        Incorrect. There are at least four different assemblers and standards:

        AS - GNU Assembler. AT&T standard, as commonly used on Linux. The syntax hasn't changed since the 60's - which is both very good and very bad. I personally think it should be retired.

        MASM - Microsoft Assembler. Intel standard assembly. The syntax is nice, but there are some ambiguous operators (is [] address of or address by value? - the meaning changes depending on the context). This is typically what the commercial Windows world uses. MASM itself is mostly obsolete - the Visual C compiler can now do everything that it could and supports all modern CPU instructions (even on Visual C++ 6 if you install the latest CPU pack).

        NASM - Netwide Assembler. An assembler that set out to put right all the things that were wrong with MASM. The syntax is excellent, ambiguous operators are cleared up, documentation is also excellent, it interoperates beautifully with Visual C on Windows and GNU C on Linux. Ideally NASM would replace AS as the standard now that it's open source.

        TASM - Borland Turbo Assembler. Based around the Intel standards, but does things slightly differently. Has extensions which allow for easy object-oriented assembly programming - which can make for some very nice code. Had a MASM compatibility mode, but nobody in their right mind used that if they could help it. I had version 5, but I don't believe they've kept it up to date, so it's obsolete now.

        There are a couple of others as well, most notably AS86 (which was the leading independent solution for writing assembler back in the DOS days).
    • by pla ( 258480 ) on Thursday February 05, 2004 @06:58PM (#8196049) Journal
      Is "Linux x86 assembly" any different to any other kind of "x86 assembly"?

      Yes. Although it requires understanding the CPU's native capabilities to the same degree, Linux uses AT&T syntax, whereas most of the Wintel world uses (unsurprisingly) Intel/Microsoft syntax.

      Personally, although I far prefer coding C under Linux, I prefer Intel syntax assembly. Even with many years of coding experience, I find AT&T syntax unnecessarily convoluted and somewhat difficult to quickly read through.

      The larger idea holds, however, regardless of what assembler you use. I wholeheartedly agree with the FP - People who know assembly produce better code by almost any measurement except "object-oriented-ness", which assembly makes difficult to an extreme. On that same note, I consider that as one of the better arguments against OO code - It simply does not map well to real-world CPUs, thus introducing inefficiencies in the translation to something the CPU does handle natively.
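
      For readers who have only seen one of the two, a small side-by-side sketch of the same operations (GNU as/AT&T syntax, with rough NASM-style Intel equivalents in the comments) shows what the complaint is about: operand order is reversed, registers carry a % prefix, and immediates a $ prefix.

          movl $4, %eax          # Intel: mov eax, 4
          addl %ebx, %eax        # Intel: add eax, ebx
          movl 8(%ebp), %ecx     # Intel: mov ecx, [ebp+8]
          movl %ecx, counter     # Intel: mov [counter], ecx   ('counter' is a hypothetical label)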
      • by perky ( 106880 ) on Thursday February 05, 2004 @07:16PM (#8196294)
        On that same note, I consider that as one of the better arguments against OO code - It simply does not map well to real-world CPUs, thus introducing inefficiencies in the translation to something the CPU does handle natively

        maxim: cycles are cheap, people are expensive. For the *vast majority* of software it is significantly better value to design and build a well architected OO solution than to optimise for performance in languages and methodologies that are more difficult to implement and maintain. Who cares if it's not very efficient - it'll run twice as fast in 18 months, and will be a lot cheaper to change when the client figures out what they actually wanted in the first place. But I guess you already knew that.
        • by pla ( 258480 ) on Thursday February 05, 2004 @07:31PM (#8196434) Journal
          maxim: cycles are cheap, people are expensive.

          True. This topic, however, goes beyond mere maximizing of program performance. Put simply, if you know assembler, you can take the CPU's strengths and weaknesses into consideration while still writing readable, maintainable, "good" code. If you do not know assembly, you might produce simply beautiful code, but then have no clue why it runs like a three-legged dog.


          it is significantly better value to design and build a well architected OO solution

          Key phrase there, "well-architected". In practice, the entire idea of "object reuse" counts as a complete myth (I would say "lie", but since it seems like more of a self-deception, I won't go that far). I have yet to see a project where more than a handful of objects from older code would provide any benefit at all, and even those that did required subclassing them to add and/or modify over half of their existing functionality. On the other hand, I have literally hundreds of vanilla-C functions I've written over the years from which I draw with almost every program I write, and that require no modification to work correctly (in honesty, the second time I use them, I usually need to modify them to generalize better, but after that, c'est fini).


          Who cares if it's not very efficient - it'll run twice as fast in 18 months

          Y'know, I once heard an amusing joke about that... "How can you tell a CS guy from a programmer?" "The CS guy writes code that either won't run on any machine you can fit on a single planet, or will run too slowly to serve its purpose until technology catches up with it in a few decades". Something like that - I killed the joke, but you get the idea.

          Yeah, computers constantly improve. But the clients want their shiny new software to run this year (if not last year, or at least on 5-year old equipment), not two years hence.
          • by afidel ( 530433 ) on Thursday February 05, 2004 @08:11PM (#8196889)
            True. This topic, however, goes beyond mere maximizing of program performance. Put simply, if you know assembler, you can take the CPU's strengths and weaknesses into consideration while still writing readable, maintainable, "good" code. If you do not know assembly, you might produce simply beautiful code, but then have no clue why it runs like a three-legged dog.

            About .1% of code needs to be so optimized that CPU architecture matters. For the other 99.9% speed improvements are much more likely to come from algorithmic improvements. Not only that but real world experience shows that code written in ASM is NOT maintainable, the in-depth knowledge of a specific architecture is fleeting while knowledge of most high level languages lasts a LONG time.
            • by pla ( 258480 ) on Thursday February 05, 2004 @08:44PM (#8197133) Journal
              For the other 99.9% speed improvements are much more likely to come from algorithmic improvements.

              Gack! I perhaps have phrased myself rather poorly. Throughout this entire thread, I have not meant to refer to writing even a single line of actual assembly code. I don't mean that humans can do it better than compilers (though often true, for small sections of code), I don't mean that asm always runs faster than the comparable C (again, often true), and I don't in any way mean that asm reads more clearly than a high-level language (about as false as they come).

              Perhaps an example would help...

              In C, I can make a 10-dimensional array (if the compiler will let me) as a nice, easily-readable organization of... Well, of something having 10 dimensions (superstrings?). I can make a pointer to a structure that contains an array of pointers to linked lists (which sounds obscure, but I can imagine it as a straightforward way to implement, say, a collection of variable-length metadata on a set of files). I can choose to have my loop indices run in row-major or column-major order, with no high-level reason to choose either way.

              From an assembly point of view, I realize exactly the hellish task involved in dereferencing the first two examples. I realize that row-major vs column-major ordering has a significant impact on the quantity of dereferencing needed. Even further, I realize that by choosing row-major or column-major indexing, I can ensure cache integrity, or obliterate it.

              The specific examples I just gave perhaps seem absurdly obvious to any decent programmer. But countless other, more subtle differences in how I would choose to lay out my code come from an understanding of what the compiler will likely do with that code, and how the CPU will eventually have to deal with it. Lacking any superficially obvious relation to the CPU, such choices would look more like stylistic preferences than careful decisions with significant implications for performance.

              How about the size of an array, for example? Sometimes using a power of two will help immensely (if it allows a constant shift vs a multiply), and sometimes it will hurt immensely (if you plan to use it such that almost every access competes for the same cache line). Things like that, which a high-level-only programmer simply will not know without experiential (i.e., programming in assembly) knowledge of the underlying architecture.
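
              To make the power-of-two point concrete, here is a hedged sketch (hypothetical 'matrix' label, 32-bit AT&T syntax) of the addressing arithmetic a compiler can use for a 16-byte row stride versus a 12-byte one:

                  # stride of 16: one shift does the scaling
                  movl  %edi, %eax
                  sall  $4, %eax              # row * 16
                  addl  %esi, %eax            # + column
                  movb  matrix(%eax), %bl

                  # stride of 12: a multiply is needed instead
                  imull $12, %edi, %eax       # row * 12
                  addl  %esi, %eax
                  movb  matrix(%eax), %bl

              The cache-line half of the argument never shows up in the instructions at all, which is rather the point: it only bites at run time.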
            • by gweihir ( 88907 ) on Thursday February 05, 2004 @09:52PM (#8197620)
              Not only that but real world experience shows that code written in ASM is NOT maintainable, the in-depth knowledge of a specific architecture is fleeting while knowledge of most high level languages lasts a LONG time.

              That is not the point. The point is that knowing one assembly language gives far more insight into what higher level languages actually do. It is, e.g., very difficult to explain the actual workings of a buffer-overflow exploit to somebody without any assembly knowledge. Or what a pointer is. Or what paging does. Or what an interrupt is. Or what impact the stack has and how it is being used for function arguments. Or how much memory a variable needs....

              The only processor I know that actually made assembly programming almost a C-like experience was the Motorola 68xxx family. On the Atari ST, e.g., there were complex applications written entirely in assembly. Today it would indeed be foolish to do a larger project in assembly language, but that is not the point of the book at all.

              Bottom line: You need to understand the basic tools well. You don't need to restrict yourself to their use or even use them often. But there is no substitute for this understanding.
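
              As one concrete illustration of the stack point, here is a minimal sketch of a two-argument function under the 32-bit cdecl convention (the function name is made up; the caller pushes the arguments right to left before the call):

                  add_two:                     # int add_two(int a, int b)
                      pushl %ebp
                      movl  %esp, %ebp         # set up the frame pointer
                      movl  8(%ebp), %eax      # a: just above the saved %ebp and return address
                      addl  12(%ebp), %eax     # b: four bytes further up the stack
                      popl  %ebp
                      ret                      # the caller removes the pushed arguments

              Once someone has seen that layout, both how C passes arguments and why overrunning a local buffer can clobber a return address stop being magic.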
          • by YouHaveSnail ( 202852 ) on Thursday February 05, 2004 @11:25PM (#8198185)
            I have yet to see a project where more than a handful of objects from older code would provide any benefit at all, and even those that did required subclassing them to add and/or modify over half of their existing functionality.

            Have you never used a decent class library? Writing reusable classes requires a much more careful approach to design and implementation than writing classes for one time use, and most people can't afford to spend that kind of time. The power of reusability lies in the fact that I can go out and buy a library of useful classes and feel pretty good that the code therein has already been well tested, usually at much lower cost and higher quality than I could produce myself.

            Whether it's building a user interface with PowerPlant, Cocoa, or MFC, or manipulating data with STL, the amount of code that I reuse far exceeds the amount that I write myself.
          • by DonGar ( 204570 ) on Thursday February 05, 2004 @11:42PM (#8198278) Homepage
            I have to say that trying to program in low level languages, or worrying about the details of the machine architecture, has usually been (in my experience) counterproductive in terms of efficiency.

            I'm not saying that there aren't places where low level details are critical, but for the most part they just draw attention away from the thing that has the most impact on performance:

            Application Architecture.

            The choices of algorithms and data structures are far more important than any low level details. But low level details are more fun, and tend to make us feel more manly or guruly or something, so we tend to focus on them instead. In practice I find that using low level languages or super optimized tools makes it hard to worry about high level structure, so the structure gets ignored.

            I once worked on a project in which people were seriously freaking out over the performance hit in using virtual functions while parsing the configuration file.

            At the same time, the application (a firewall) was performing multiple linear searches through linked lists of several hundred items per packet. These searches were very carefully optimized, so they had to be fast... (sigh). When I switched the system to use STL dictionaries (and later hashes), total throughput jumped threefold, yet some of the developers were worried about the cost of the templates and virtual functions used.

            The fact that the algorithm is more important than the details of implementation is a lesson that everyone (myself included) needs to keep getting pounded into them, because it's so easy to forget.

            There are places where assembler and hardware details matter a great deal. But they are usually places that contain a lot of repetition that can't be removed algorithmically. Graphics are the obvious example.

            A recent example:

            My brother-in-law gave me one of those boards with pegs in which you try to jump your way down to a single peg remaining. I have no idea what it's called, but anyway....

            I decided to be cute, and wrote a 100-line Python script over lunch to find all possible solutions. I was surprised when it hadn't found a single solution by the time I was finished eating. I was a lot more surprised when it hadn't found anything by the end of the day.

            So I killed it and started in optimizing for performance and tweaking and trying different things. This kept me occupied over lunch for a couple of weeks, but didn't produce anything else. Finally I started doing some analysis of the problem. The first thing I found was that the search space (for the board I had) was roughly 10**18.

            It didn't matter how much I tweaked the details of my search, it wasn't going to find very many solutions in less than a century (actually, it looks like a naive full search will take several thousand years).

            So, after wasting several weeks of lunch breaks, I have redefined the problem (find *a* solution) and rewritten my search to use a heuristic. I finished everything but the heuristic at lunch a couple of days ago. The new system will take 100 or even 1,000 times as long to perform a jump, but I'm expecting to find a solution before I'm dead.

            So, don't get bogged down in the details of an implementation. They won't usually take you very far.
      • by Ungrounded Lightning ( 62228 ) on Thursday February 05, 2004 @08:09PM (#8196861) Journal
        People who know assembly produce better code by almost any measurement except "object-oriented-ness", which assembly makes difficult to an extreme.

        Actually, they don't.

        A study was done, some decades ago, on the issue of whether compilers were approaching the abilities of a good assembly programmer. The results were surprising:

        While a good assembly programmer could usually beat the compiler if he really hunkered down and applied himself to the particular piece of code, on the average his code would be worse - because he didn't maintain that focus on every line of every program.

        The programmer might know all the tricks. But the compiler knew MOST of the tricks, and applied them EVERYWHERE, ALL THE TIME.

        Potentially the programmer could still beat the compiler in reasonable time by focusing on the code that gets most of the execution. But the second part of Knuth's Law applies: "95% of the processor time is spent in 5% of the code - and it's NOT the 5% you THOUGHT it was." You have to do extra tuning passes AFTER the code is working to find and improve the REAL critical 5%. This typically was unnecessary in applications (though it would sometimes get done in OSes and some servers).

        This discovery led directly to two things:

        1) Because a programmer can get so much more done and working right with a given time and effort using a compiler than using an assembler, and the compiler was emitting better assembly on the average, assembler was abandoned for anything where it wasn't really necessary. That typically means:

        - A little bit in the kernel where it can't be avoided (typically bootup, the very start of the interrupt handling, and maybe context switching). (The Unix Version 6 kernel was 10k lines, of which 1.5k was assembler - and the assembly fraction got squeezed down from then on.)

        - A little bit in the libraries (typically the very start of a program and the system call subroutines)

        - Maybe a few tiny bits embedded in compiler code, to optimize the core of something slow.

        2) The replacement of microcoded CISC processors (e.g. PDP-11, VAX, 68K) with RISC processors (e.g. SPARC, MIPS). (x86 was CISC but hung in there due to inertia and cheapness.)

        Who cares if it takes three instructions instead of one to do some complex function, or if execution near jumps isn't straightforward? The compiler will crank out the three instructions and keep track of the funny execution sequence. Meanwhile you can shrink the processor and run the instructions at the microcode engine's speed - which can be increased further by reducing the number of gates and length of wiring, and end up with a smaller chip (which means higher yields, which means making use of the next, faster, fab technology sooner.)

        CISC pushed RISC out of general purpose processors again once the die sizes got big: You can use those extra gates for pipelining, branch prediction, and other stuff that lets you gain back more by parallelism than you lost by expanding the execution units. But RISC is still alive and well in embedded cores (where you need SOME crunch but want to use most of the silicon for other stuff) and in systems that don't need the absolute cutting-edge of speed or DO need a very low power-per-computation figure.

        The compiler advantage over an assembly programmer is extreme both with RISC and with a poorly-designed CISC instruction set (like the early x86es). Well-designed CISC instruction sets (like PDP11, VAX, and 68k) are tuned to simplify the compilers' work - which makes them understandable enough that the tricks are fewer and good code is easier for a human to write. This puts an assembly programmer back in the running. But on the average the compiler still wins.

        (But understanding how assembly instruction sets work, and how compilers work, are both useful for writing better code at the compiler level. Less so now that optimizers are really good - but the understanding is still helpful.)
    • by skurk ( 78980 ) on Thursday February 05, 2004 @07:07PM (#8196170) Homepage Journal
      You have a good point. I code assembly on many different CPUs, and I can only see a minor difference between them.

      For example, on the 6502 family (like the 6510 from the C64), you have only three registers: X, Y and A. These registers can only hold a byte each. Most of the variables you have are stored in the zero page, a 256-byte range from address $00-$FF.

      Then on the 68k CPU (as in the Amiga, Atari, etc.) you have several more registers which can be used more freely. You have D0-D7 data registers and A0-A7 address registers. These can be operated on as bytes, words or longwords as you wish, from wherever you want.

      The x86 assembly is written the "wrong way", and is pretty confusing at times. Where I would say "move.l 4,a6" on the 68k, I have to say "mov dx,4" on the x86. Takes a few minutes to adjust each time.

      Once you master assembly language on one CPU, it's pretty easy to switch to another.

      I still think the 680x0 series are the best.
  • Not So New Concept (Score:5, Insightful)

    by andyrut ( 300890 ) on Thursday February 05, 2004 @06:47PM (#8195902) Homepage Journal
    Although unheard of today, this concept isn't really all that new -- there used to not be much choice in years past.

    While starting Computer Science students off with assembly (without first introducing them to a high-level language) may be a relatively new concept these days, the idea of teaching low-level languages to Computer Science students is not a revolutionary technique whatsoever. Every decent Computer Science curriculum includes several semesters of courses in which assembly language is required, so that students can demonstrate their knowledge of basic computer processes.

    That reminds me of a great fortune:

    "The C Programming Language -- A language which combines the
    flexibility of assembly language with the power of assembly language."
    • by gid13 ( 620803 )
      I'm not even a CS student (I'm Engineering Physics), and I still had to learn some microcontroller assembly language.

      While I admit that it helps you understand the device more, I have to say it's much less intuitive and enjoyable than high-level programming (not that I'm the type to find scripting fun, but you know what I mean).
    • by rblancarte ( 213492 ) on Thursday February 05, 2004 @06:58PM (#8196051) Homepage
      I was about to say the same thing. I don't think that it is some new "mystical" idea of teaching assembly to students. I am currently taking my THIRD assembly class at the University of Texas. And I know that there are others to take.

      I will agree with the parent post, this is not a new concept. Now teaching assembly to beginners, that might be new.

      And I don't know if "great" coders know assembly, but I think knowing assembly is a useful tool in being able to program efficient code. If you understand concepts like division, how bad it is, what the computer is actually doing when your C/C++ or whatever language (that is not interpreted) is compiled, then you are well on your way to being able to produce efficient code.
    • by tealover ( 187148 ) on Thursday February 05, 2004 @07:05PM (#8196151)
      Since this submission is nothing more than an attempt to hawk his own book, on principle I refuse to buy it. I don't like dishonesty in the submission process. He should have come out and directly admitted that it was his book.

      • by johnnyb ( 4816 ) <jonathan@bartlettpublishing.com> on Friday February 06, 2004 @01:12AM (#8198753) Homepage
        "I don't like dishonesty in the submission process."

        What's dishonest? It would have been dishonest had I registered a new account to make the submission. However, the way that you know that I was the author was because I _did not_ resort to dishonest tactics. I simply wrote from the third person, which is exactly how I wrote the back cover text for the book, the press releases, etc.

        I don't even spamguard my email address, because I want people to know who I am and be able to reach me easily.
    • by Jester99 ( 23135 ) on Thursday February 05, 2004 @07:39PM (#8196515) Homepage
      "The C Programming Language -- A language which combines the
      flexibility of assembly language with the power of assembly language."


      The way I heard it was far drier humor: "C: The language combining the power of assembly with the ease of use of assembly." :)
    • by ca1v1n ( 135902 ) <snook@g u a notronic.com> on Thursday February 05, 2004 @08:28PM (#8197009)
      Starting CS students in assembler isn't a new idea either. It's an old idea that some guy recycled, wrote a book about, and advertised on slashdot. Why are the old guys who started on assembler "wizards"? It's easy. Back then there weren't that many people doing it, and it was really hard. Anyone who stuck with it had to be really good.

      Now for my own rant on the topic:

      My first programming classes at the University of Virginia had me programming in VHDL and m68k assembler. This wasn't the intent of the curriculum or the CS faculty at all, but rather a result of some schedule conflicts and a first year advisor who wasn't in the CS department and didn't know any better. It was a disaster. Normally students here get their first taste of assembly in a course that works its way down to it from the high-level languages they'd been introduced to first. My pain with assembly distracted from the course material on architecture. I learned more about efficient programming during two lectures on C# than I did while banging my head against the desk writing m68k assembler.

      Anyone who has benchmarked the C++ standard template library extensively will tell you that its fairly complicated, safely implemented data structures are incredibly fast. Using an STL deque as an array, without any deque operations, is actually faster than using an array, which holds in basically any language that compiles to machine code, be it C++ or assembly. The STL vector is even faster.

      Moral of the story? Compilers are amazing. This has not always been the case, but it is now. Writing code in assembly results in a product that takes forever to create, is less likely to correctly handle special cases, is impossible to debug, is unreadable to those who did not write it, and often is slower than compiled high-level code. In fact, JIT compilers are getting good enough that bytecode languages are nearly as fast as well.

      It's true that assembly can often be used to get a bit of a speedup. This is why game developers will often write everything in C or C++ except for 3 or 4 functions that they'll implement in assembly. The advantages that high-level languages give for correctness vastly outweigh any miniscule performance gains in almost all circumstances. Assembly's advantages in compactness are becoming moot too, as embedded devices are now capable enough to run full-fledged operating systems.

      Aside from teaching architecture, though, assembly does teach an important skill to CS majors, and that is staying up all night to find a one-line bug that is making everything go wrong.
  • Knuth (Score:5, Informative)

    by red floyd ( 220712 ) on Thursday February 05, 2004 @06:48PM (#8195909)
    Isn't that what Knuth did with his ASM language? I believe it was a synthetic assembler for a hypothetical stack machine -- hence the name ASM - Abstract Stack Machine.
    • MIXAL (Score:3, Informative)

      by texchanchan ( 471739 )
      MIXAL, MIX assembly language. MIX was the virtual machine I learned assembly on in 1975. Googling reveals that MIX was, in fact, the Knuth virtual computer. The book came with a little cue card with a picture of Tom Mix [old-time.com] on it. MIX has 1 K of memory. Amazing what can be done in 1 K.
  • by shakamojo ( 518620 ) * on Thursday February 05, 2004 @06:48PM (#8195912)
    My Grandfather worked for IBM in the 70's and 80's. He did all his coding in assembly and machine language. His motto is "Anyone who doesn't know machine language has no business using a computer."

    There has to be a happy medium IMHO, and I think this is a great start. While my Grandfather was on the cutting edge of the PC revolution, he now has trouble figuring out email, etc, because he operates at too LOW a level (and I feel that he now has no business being online!). Then you have the users who have the same problems because they operate at too HIGH a level (AOL, etc...). The majority of programmers nowadays fall smack in the middle of these two groups, but I'd argue they should be a little closer to the lower levels than they currently are.

    I learned LOGO and BASIC as a kid, then grew into Cobol and C, and learned a little assembly in the process. I now use C++, Perl, and (shudder) Visual Basic (when the need arises). My introduction to programming at a young age through very simple languages really helped to whet my appetite, but I think that my intermediate experiences with low level languages helps me to write code that is a lot tighter than some of my peers. Let's hope this starts a trend, it would be great if more young (and current) programmers appreciated the nuts and bolts!
    • by blixel ( 158224 ) on Thursday February 05, 2004 @07:06PM (#8196160)
      His motto is "Anyone who doesn't know machine language has no business using a computer."

      Just say to him "Well Grandpa, my motto is anyone who can't describe, with exacting detail, all the functions of every organ in the human body doesn't deserve to live."
    • by Saven Marek ( 739395 ) on Thursday February 05, 2004 @07:09PM (#8196209)
      I learned LOGO and BASIC as a kid, then grew into Cobol and C, and learned a little assembly in the process. I now use C++, Perl, and (shudder) Visual Basic (when the need arises). My introduction to programming at a young age through very simple languages really helped to whet my appetite, but I think that my intermediate experiences with low level languages helps me to write code that is a lot tighter than some of my peers.

      I'm with you there. I learned C, C++ and assembler while at university, and came out with the ability to jump into anything. Give me any language and I can guarantee I'll be churning out useful code in a VERY short amount of time.

      Compare this to my brother, 12 years younger than me, who has just completed the same comp.sci course at the same uni, and knows only one language: Java. Things change, not always for the better. I know many courses haven't gone to the dogs as much as that, but many have. I'm not surprised the idea of teaching coders how the computer works is considered 'novel'.

      I can see a great benefit for humanity the closer computers move to 'thinking' like people, for people. But that's just not done at the hardware level, it's done higher. The people who can bring that to the world are coders, and as far as I'm concerned thinking in the same way as the hardware works is absolutely essential for comp.sci. Less so for IT.
    • by Lord Ender ( 156273 ) on Thursday February 05, 2004 @08:47PM (#8197155) Homepage
      This was posted to USENET by its author, Ed Nather, on May 21, 1983.

      A recent article devoted to the *macho* side of programming
      made the bald and unvarnished statement:

      Real Programmers write in FORTRAN.

      Maybe they do now,
      in this decadent era of
      Lite beer, hand calculators, and "user-friendly" software
      but back in the Good Old Days,
      when the term "software" sounded funny
      and Real Computers were made out of drums and vacuum tubes,
      Real Programmers wrote in machine code.
      Not FORTRAN. Not RATFOR. Not, even, assembly language.
      Machine Code.
      Raw, unadorned, inscrutable hexadecimal numbers.

      Lest a whole new generation of programmers
      grow up in ignorance of this glorious past,
      I feel duty-bound to describe,
      as best I can through the generation gap,
      how a Real Programmer wrote code.
      I'll call him Mel,
      because that was his name.

      I first met Mel when I went to work for Royal McBee Computer Corp.,
      a now-defunct subsidiary of the typewriter company.
      The firm manufactured the LGP-30,
      a small, cheap (by the standards of the day)
      drum-memory computer,
      and had just started to manufacture
      the RPC-4000, a much-improved,
      bigger, better, faster --- drum-memory computer.
      Cores cost too much,
      and weren't here to stay, anyway.
      (That's why you haven't heard of the company, or the computer.)

      I had been hired to write a FORTRAN compiler.
      Mel didn't approve of compilers.

      "If a program can't rewrite its own code",
      he asked, "what good is it?"

      Mel had written,
      in hexadecimal,
      the most popular computer program the company owned.
      It ran on the LGP-30
      and played blackjack with potential customers
      at computer shows.
      Its effect was always dramatic.
      The LGP-30 booth was packed at every show,
      and the IBM salesmen stood around
      talking to each other.
      Whether or not this actually sold computers
      was a question we never discussed.

      Mel's job was to re-write
      the blackjack program for the RPC-4000.
      (Port? What does that mean?)
      The new computer had a one-plus-one
      addressing scheme,
      in which each machine instruction,
      in addition to the operation code
      and the address of the needed operand,
      had a second address that indicated where, on the revolving drum,
      the next instruction was located.

      In modern parlance,
      every single instruction was followed by a GO TO!
      Put *that* in Pascal's pipe and smoke it.

      Mel loved the RPC-4000
      because he could optimize his code:
      that is, locate instructions on the drum
      so that just as one finished its job,
      the next would be just arriving at the "read head"
      and available for immediate execution.
      There was a program to do that job,
      an "optimizing assembler",
      but Mel refused to use it.

      "You never know where it's going to put things",
      he explained, "so you'd have to use separate constants".
      It was a long time before I understood that remark.
      Since Mel knew the numerical value
      of every operation code,
      and assigned his own drum addresses,
      every instruction he wrote could also be considered
      a numerical constant.
      He could pick up an earlier "add" instruction, say,
      and multiply by it,
      if it had the right numeric value.
      His code was not easy for someone else to modify.

      I compared Mel's hand-optimized programs
      with the same code massaged by the optimizing assembler program,
      and Mel's always ran faster.
      That was because the "top-down" method of program design
      hadn't been invented yet,
      and Mel wouldn't have used it anyway.
      He wrote the innermost parts of his program loops first,
      so they would get first choice
      of the optimum address locations on the drum.
  • by Sebastopol ( 189276 ) on Thursday February 05, 2004 @06:48PM (#8195918) Homepage
    Sounds more like a programming book than a compsci book.

    writing an RB tree or an A* search in assembly would be a huge pain in the ass, if you ask me.

    compsci is in large part about data structures: how to choose the right data structure, how to get the most out of an algorithm by picking the best data structure, etc...

    but i didn't read the book, so i'll just go back to my websurfing now...

    • It's about optimal instruction usage, language design, automata, and a lot more. It's about optimal computing all the way around.
    • by dilettante ( 91064 ) on Thursday February 05, 2004 @07:12PM (#8196254)
      I think it's a good idea in comp. sci. *because* it's a pain in the ass. I certainly agree that assembly language is not the easiest or most efficient language for implementing certain algorithms and data structures. I also think it's very instructive to understand why.

      I did learn assembly language first. I wouldn't claim to be a wizard (although i'm certainly an old-timer); but i concur with the premise that learning assembly language makes you a better programmer *and* computer scientist. Assembly language exposes you to the basic architecture of the computers that most of us work with, and i believe that helps one to understand everything from why certain data structures are preferable in certain situations to basic computational complexity.

  • Not new (Score:3, Informative)

    by El Cabri ( 13930 ) on Thursday February 05, 2004 @06:49PM (#8195922) Journal
    Knuth's The Art of Computer Programming illustrated its algorithms in an imaginary assembly language.
  • by lake2112 ( 748837 ) on Thursday February 05, 2004 @06:49PM (#8195927)
    Good Idea: First teaching simple programming fundamentals through a simple-to-understand language. Then, confuse the hell out of a student with assembly.
    Bad Idea: Teaching CS by starting with one of the most cryptic languages around, and then trying to teach basic CS fundamentals.
    There are already problems with people interested in CS getting turned off by intro/intermediate programming classes. Imagine the retention rates once my CS100 class is taught in assembly.
    • Not to be a prick (well, not too much of one) but that would be a -good- thing. Less retention means we are shaking out the chaff faster, getting down to only those people who want to be in CS for the art's sake, not the Big Buck$(tm). As recent economic events have shown, too many of the latter is a Bad Thing(tm).
    • by AvitarX ( 172628 )
      Except more and more jobs are being outsourced because schools have churned out commodity programmers and businesses can get them anywhere.

      What we need if we want to stay at the forefront is for there to be fewer programmers of a higher quality. For the past decade we have had too many people not getting turned off by CS. So they enter the workforce as commodity disposable workers, and they tarnish the reputation of you guys who are good too. Now our jobs go to India and either they have better or equal work
    • High retention rates in higher education are not necessarily a good thing. Not everyone can benefit from such an education, and the sooner they find that out the better.

      The problem with your idea is that the "simple to use" languages that are most often taught are too simple and are too many generations removed from what the computer was actually doing. Years ago, even using BASIC on a TRS-80 would teach you a lot about how the machine worked. To get the best performance out of the thing you had to at least

    • by RAMMS+EIN ( 578166 ) on Thursday February 05, 2004 @07:10PM (#8196217) Homepage Journal
      ``Bad Idea: Teaching CS by starting with one of the most cryptic languages around, and then trying to teach basic CS fundamentals.''

      I completely disagree. Assembly is actually one of the simplest languages around. There is little syntax, and hardly any magic words that have to be memorized. Assembly makes an excellent tool for learning basic CS fundamentals; you get a very direct feeling for how CPUs work, how data structures can be implemented, and why they behave the way they do. I wouldn't recommend assembly for serious programming, but for getting an understanding of the fundamentals, it's hard to beat.
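
      As a small illustration of that direct feeling, here is a minimal sketch (assuming a hypothetical node layout of a 32-bit value at offset 0 and a next pointer at offset 4, with the head pointer passed in %eax) of summing a singly linked list in AT&T syntax:

          sum_list:
              xorl  %edx, %edx          # running total = 0
          next_node:
              testl %eax, %eax          # a NULL pointer ends the list
              je    done
              addl  (%eax), %edx        # add the value at offset 0
              movl  4(%eax), %eax       # follow the next pointer at offset 4
              jmp   next_node
          done:
              movl  %edx, %eax          # return the total in %eax
              ret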
    • by pla ( 258480 ) on Thursday February 05, 2004 @07:17PM (#8196302) Journal
      Then, confuse the hell out of a student with assembly

      I disagree. Personally, I learned Basic, then x86 asm, then C (then quite a few more, but irrelevant to my point). Although I considered assembly radically different from the Basic I started with, it made the entire concept of "how the hell does that Hello World program actually work?" make a whole lot more sense.

      From the complexity aspect, yeah, optimizing your code for a modern CPU takes a hell of a lot of time, effort and research into the behavior of the CPU itself. But to learn the fundamental skill of coding in assembler, I would consider it far less complex than any high-level language. You have a few hundred instructions (of which under a dozen make up 99% of your code). Compare that to C, where you have literally thousands of standard library functions, a good portion of which you need to understand to write any non-trivial program.


      There are already problems with people interested in CS getting turned off by intro/intermediate programming classes.

      You write that as though you consider it a bad idea...

      We have quite enough mediocre high-level hacks (which I don't mean in the good sense, here) flooding the market. If they decide to switch to English or Art History in their first semester, all the better for those of us who can deal with the physical reality of a modern computer. I don't say that as an "elitist" - I fully support those with the mindset to become "good" programmers (hint: If you consider "CS" to have an "S" in it, you've already missed the boat) in their efforts to learn. But it has grown increasingly common for IT-centric companies to have a handful of gods, with dozens or even hundreds of complete wastes-of-budget who those gods need to spend most of their time cleaning up after. We would do better to get rid of the driftwood. Unfortunately, most HR departments consider the highly-paid gods as the driftwood, then wonder why they can't produce anything decent.

      Hmm, okay, rant over.
  • New? (Score:5, Insightful)

    by Sloppy ( 14984 ) * on Thursday February 05, 2004 @06:50PM (#8195938) Homepage Journal
    A new book was just released which is based on a new concept - teaching computer science through assembly language
    Uh, assembly language for teaching isn't exactly a new idea. Knuth's AoCP books used MIX, a "fake" assembly language, even though easy-to-read languages (e.g. ALGOL) were already around at the time. And he wasn't even trying to teach fundamentals about how computers work -- he was teaching higher-level stuff, algorithms, in those books. Think about just how weird that is.

    Perhaps this current obsession with learning using 'easy' languages is the wrong way to do things.
    That depends on what you're trying to learn. I think someone with a CS degree should have a deep understanding of things, and should have at least some experience working in assembly language, managing memory, writing compilers, etc. But that doesn't mean that high-level languages are a bad idea when they're learning higher-level concepts. Do you want someone wasting their time remembering what is being stored in what register, when they're learning how to write a web browser? Of course not: you want them to be thinking about the real issues at hand.
  • Wussies (Score:5, Funny)

    by mikeophile ( 647318 ) on Thursday February 05, 2004 @06:51PM (#8195947)
    Real programmers learn machine code.
  • Your book? (Score:5, Informative)

    by Tet ( 2721 ) * <slashdot@astra[ ]e.co.uk ['dyn' in gap]> on Thursday February 05, 2004 @06:51PM (#8195948) Homepage Journal
    A new book was just released

    What you meant to say was that your new book has just been released. If you're going to pimp your wares on Slashdot, at least put an appropriate disclaimer on. That said, I completely agree with the premise of the book. I've met a lot of mediocre programmers, and a few good ones. But I've never yet met a real star that didn't have some background in assembly language programming. Personally, I haven't written anything in assembly in well over a decade. But that fact that I can do so if needed makes me a better programmer, and I'd recommend it to any aspiring coder as a key skill to learn. I wouldn't say IA32 is a particularly nice introduction (I'd start with a cleaner, simpler architecture, such as 6502), but it is at least widely available to anyone that wants to study it...

    • Re:Your book? (Score:5, Interesting)

      by FreshFunk510 ( 526493 ) on Thursday February 05, 2004 @06:58PM (#8196052)
      Ugh. Johnnyb should've written a disclaimer that he was promoting his own book. This type of action turns me away from wanting to support such an individual. Sorry, nothing personal.
  • by Anonymous Coward on Thursday February 05, 2004 @06:51PM (#8195955)
    I think the concepts of registers and memory locations and stack pointers and branching are easier to understand in assembly. You can teach a simple subset of instructions. It was the way I started back in the day. I scratched my head more later learning C, etc. I guess it's just the opposite for kids these days.
  • by Geeyzus ( 99967 ) <(moc.oohay) (ta) (jedam_kram)> on Thursday February 05, 2004 @06:53PM (#8195979)
    I think this is a bad idea, for a couple reasons.

    1) Difficulty - Assembly is harder to learn (and create meaningful programs with) than C++, or Java, which is replacing C++ in a lot of college curriculums. This means that students will be spending more time learning assembly, and less time learning about complicated algorithms and the things you really should be learning about (since languages change but algorithms are standard).

    2) Job practicality - 99% of CS grads aren't going to use assembly in their day to day jobs. They will most likely be programming in Java, or VB, or some web language (PHP/ASP/etc). Maybe some C++. But unless you are doing something that requires the control that assembly can provide, like real-time software or game engine development, you simply aren't using assembly at work.

    If it's harder to learn/teach, and you won't use it after you graduate, I can't see the point in teaching it at universities.

    Mark
  • by JoshuaDFranklin ( 147726 ) * <joshuadfranklin...NOSPAM@@@yahoo...com> on Thursday February 05, 2004 @06:53PM (#8195987) Homepage
    I don't know why he didn't mention that this is a free documentation project:

    http://savannah.nongnu.org/projects/pgubook/ [nongnu.org]

    It's also being used at Princeton [princeton.edu]

  • by WankersRevenge ( 452399 ) on Thursday February 05, 2004 @06:54PM (#8195994)
    assembly is the great monster that requires fresh blood every year, or the great darkness will fall upon the land. i myself have never dabbled in assembly because i don't like living in an hp lovecraft nightmare.

    For those of you insane enough to take the plunge, check out this FREE online introduction course [ccsu.edu] (no reg, don't ya love it). The guy who wrote it is pretty wacky. I took his java introductory course [ctstateu.edu] and it was hip as well as very educational.
  • by LineNoiz ( 616971 ) <kal_durak@ya3.1415926hoo.com minus pi> on Thursday February 05, 2004 @06:55PM (#8196001)
    Get it to your Valentine on time! Choose UPS 2 DAY and pay the price of Ground.

    Yeah. Give my GF a book on Linux Assembly programming. That should get those panties off in a hurry.
  • by pointzero ( 707900 ) on Thursday February 05, 2004 @06:55PM (#8196009) Homepage
    Perhaps it's time that computer science curriculums start teaching assembly language first.
    Here at the University of New Brunswick (Canada), they may not teach assembly language "first", but we do have a second year course dedicated to assembly and the inner workings of computers. My only problem, though, is that we learn the Motorola M68HC11 CPU and not current ones. Sure it's easier to learn and understand, but most computers we work on today are x86 based.

    My 2 cents.
  • A good idea (Score:3, Insightful)

    by swtaarrs ( 640506 ) <swtaarrsNO@SPAMcomcast.net> on Thursday February 05, 2004 @06:56PM (#8196014)
    I agree that assembly language should be taught, but not necessarily as the first language. BASIC is a good tool for teaching higher level programming ideas like conditional statements, loops, etc... Once those concepts are understood, C should be taught. I was a fairly proficient C programmer before I learned assembly (68k), but even then assembly helped me understand much more about the inner workings of C and what lines of code do. Instead of just knowing that code I write works, I now know why it works and I am able to do much more advanced things with a low-level knowledge of programming.
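
    As a small example of that "why it works" payoff, here is a rough, unoptimized sketch (not actual compiler output) of how a counted C-style loop maps onto x86, in AT&T syntax:

        # roughly:  for (i = 0; i < 10; i++) total += i;
        xorl  %eax, %eax          # total = 0
        xorl  %ecx, %ecx          # i = 0
    loop_top:
        cmpl  $10, %ecx           # i < 10 ?
        jge   loop_done
        addl  %ecx, %eax          # total += i
        incl  %ecx                # i++
        jmp   loop_top
    loop_done: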
  • For the record (Score:5, Interesting)

    by pclminion ( 145572 ) on Thursday February 05, 2004 @06:56PM (#8196025)
    The CS program at Portland State University starts with assembly in their introductory classes. At the time I was there, it was x86 assembly, but I've heard that some professors are using Sparc assembly as well -- not a good idea in my opinion, simply because of 1) the delay slot and 2) the sethi instruction, both of which are a little confusing for someone who's never coded before, let alone never coded in assembly language.

    I think it's a little weird to call this "Learning Computer Science via Assembly Language." It's programming, not computer science. Computer science is really only marginally about computers. It has to do more with algorithms, logic, and mathematics.

    You can study computer science, and produce new knowledge in the field, without ever touching a computer.

    This misunderstanding is, I think, part of the reason so many students drop out of CompSci. They head into it thinking it's about programming, and are startled to find that computation and programming are not equivalent.

    That's why the Compilers course at PSU is considered the "filter" which kills off the students who aren't really interested in computer science. They really need to spin off a separate "software engineering" school for those students, since what they really want to study is programming.

  • by Wise Dragon ( 71071 ) on Thursday February 05, 2004 @06:57PM (#8196041) Homepage
    I think the article should have disclosed that the submitter (johnnyb) is also the author of the book, Jonathan Bartlett. So rather than saying "A new book was just released", I would rather see something like "I wrote this new book." Here is johnnyb's website. http://www.eskimo.com/~johnnyb/

    • by FreshFunk510 ( 526493 ) on Thursday February 05, 2004 @07:22PM (#8196363)
      Furthermore, Slashdot should make it a policy that people who submit their own books/publications disclose that they are the author, so there is no hidden conflict of interest (much as news channels reporting on their parent company or a subsidiary always say so explicitly). That's only fair to the readers of Slashdot, and it keeps us from feeling we're being scammed into buying someone's book.
  • Great concept. (Score:5, Insightful)

    by shaitand ( 626655 ) on Thursday February 05, 2004 @06:59PM (#8196060) Journal
    I started out learning to code in asm on my C64, and I'd have to say it was a very rewarding experience.

    Anyone who disagrees with this probably doesn't have much experience coding in assembler to begin with. Asm really is fairly easy; the trick is that most people who teach it spend too much time on computer-architecture concepts and not enough time on actual coding. Understanding how the machine works is wonderful, and necessary for writing good assembler, but you should start with the two pages of background needed to "get" asm at all.

    Then teach language basics, and THEN teach about the machine using actual programs (a text editor, other simple things), explaining in small chunks why they are coded the way they are. Instead of handing out a chart of BIOS calls and a tutorial on basic assembler, introduce BIOS calls where they actually do something in a program; most of them are simple enough that, shown in use, they are quite clear and anyone can understand them.

    After all, assembler -- pretty much any assembler -- is composed of VERY simple pieces. What teaches someone programming is understanding how those pieces fit together to form a simple construct, how those constructs combine into a simple function, and how those functions combine into a simple yet powerful program. Learning to program this way keeps things easy, but still yields a wealth of knowledge about the system.

    It also means that for the rest of your coding life you'll understand what one form of loop or another actually does in C (insert language here), and why one of them is going to be faster, when simply looking at the C-level concepts shows no benefit of one over the other (see the sketch below).
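
    For instance, here is a minimal C sketch of what I mean (the function names and the counting trick are made up for illustration, and a modern optimizer may well emit identical code for both). At the source level the two loops look interchangeable; only the generated assembly shows whether the compiler needs a separate compare on every pass:

        #include <stddef.h>

        /* Counting up: typically an increment, a compare against n,
           and a conditional branch on every iteration. */
        long sum_up(const long *a, size_t n)
        {
            long total = 0;
            for (size_t i = 0; i < n; i++)
                total += a[i];
            return total;
        }

        /* Counting down toward zero: on many machines the decrement
           already sets the zero flag, so the explicit compare can
           drop out of the generated code. */
        long sum_down(const long *a, size_t n)
        {
            long total = 0;
            while (n--)
                total += a[n];
            return total;
        }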

  • by walt-sjc ( 145127 ) on Thursday February 05, 2004 @07:03PM (#8196124)
    Of all the processors out there, yes, the x86 is common, but it has to be one of the WORST instruction sets -- one of the most difficult to work with.

    Is it just me???

    I DO think it's a good idea to be teaching assembly, though I'm not so sure about making it the core of a comp sci program. I started playing with assembly fairly early, on the 6502 and Z80, and later with the 68000 and IBM 370. It's good to know, but I wouldn't do major stuff in it anymore. That's what high-level languages are for. You only drop to assembly when you have to, for speed or space.
  • Not a good idea (Score:3, Interesting)

    by El ( 94934 ) on Thursday February 05, 2004 @07:08PM (#8196192)
    I think one would be much better off writing in C without optimization, then stepping through the execution in a disassembler to see how the resulting machine code operates. Yes, it helps to write more efficient high-level code if you know how it is converted to machine code. For example, I had a coworker who made a habit of declaring initialized arrays local to his functions. I had to point out to him: "You do know that this causes the array to be copied onto the stack every time you enter the function, thus really slowing down program execution, don't you?" Apparently this had never occurred to him, because he had never actually watched the code execute -- roughly the situation sketched below.
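
    To make that concrete, here is a hedged sketch of the difference (the names and numbers are made up, and whether the copy actually happens depends on the compiler and optimization level -- which is exactly why stepping through the disassembly is instructive):

        /* The initializer forces the array to be rebuilt (or copied)
           on the stack every time the function is entered. */
        int lookup_slow(int i)
        {
            int table[8] = { 1, 1, 2, 3, 5, 8, 13, 21 };
            return table[i & 7];
        }

        /* 'static const' places the data in read-only storage once,
           so entering the function costs nothing extra. */
        int lookup_fast(int i)
        {
            static const int table[8] = { 1, 1, 2, 3, 5, 8, 13, 21 };
            return table[i & 7];
        }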
  • emulator (Score:4, Insightful)

    by bcrowell ( 177657 ) on Thursday February 05, 2004 @07:08PM (#8196195) Homepage
    When I was a kid, I first learned BASIC on a TRS-80, and then learned Z80 assembler. (There were no compilers available.) Thirteen-year-old me had a really hard time with assembler at first. For example, I thought preprocessor defines were like floating-point variables that you could modify at runtime.

    Assembler-first might work with beginners if it was on an emulator where they could see exactly what was happening, and there was no way to crash it. Otherwise, I just don't see the point of making things harder.

    Of course if you really want to make it hard, you hand every twelve-year-old kid a copy of Knuth and a hardware implementation of Knuth's hypothetical processor. Then our generation could be completely assured of job security.

  • by use_compress ( 627082 ) on Thursday February 05, 2004 @07:10PM (#8196220) Journal
    There are a million fields in CS -- you can view them as points on a line that stretches from engineering to mathematics. The people who work in architecture are at the extreme engineering end. If you want to go into systems programming or architecture, I can see how you would want to base everything on asm. But if you specialize in AI, or algorithms, or theory, you really don't encounter assembly that often... for the most part, the need isn't there to develop extremely high-performance, system-dependent apps. In those fields, you could do a CS curriculum (through graduate school) entirely in Matlab, Prolog, and ML. The emphasis is on the mathematical structures a program represents rather than on how the computer actually deals with them.
  • by Cryptnotic ( 154382 ) on Thursday February 05, 2004 @07:11PM (#8196226)
    A real computer science program will teach generic principles of programming and systems development, with projects that delve into a variety of actual implementations of systems.

    For example, a B-tree data structure is fundamentally the same thing whether you implement it in 32-bit ARM assembly language, 16-bit x86 assembly language, C, or Java (see the sketch at the end of this comment).

    To understand how assembly language works, you need to understand how a processor works: how instruction decoding works, how register transfer language works, how clocking a processor makes it accomplish things. To understand how registers hold values electrically and transfer values between each other, you need to understand some physics and electronics.

    To understand how a compiler takes a source language and translates it into a target language, you need to understand a little about the kinds of languages computers can recognize (context-free languages, described by context-free grammars) and how they can be parsed. Delving into that field leads to the core theory of computer science: what is possible with machines in general, and what is impossible.

    A real computer science program at a university will take you through all of these subjects over several years, allowing for some excursions into other things like databases and cryptography. A real computer science program is always theory with projects that are applied to actual implementations.
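
    To illustrate the B-tree point with a minimal sketch (the order and field names here are made up, and a real implementation still needs search/insert/split routines on top): the node layout is the same whether you declare it in C or Java or lay the fields out by hand in assembly -- only the notation changes.

        #include <stdbool.h>

        #define BTREE_ORDER 4   /* illustrative order only */

        struct btree_node {
            int                keys[BTREE_ORDER - 1];   /* sorted keys in use */
            struct btree_node *children[BTREE_ORDER];   /* subtree pointers */
            int                nkeys;                   /* how many keys are filled */
            bool               leaf;                    /* true if no children */
        };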

  • by man_ls ( 248470 ) on Thursday February 05, 2004 @07:13PM (#8196257)
    Just learn C instead. It combines the readability of assembly with the ease of use of assembly.
  • Ugh... x86 (Score:3, Insightful)

    by iamdrscience ( 541136 ) on Thursday February 05, 2004 @07:14PM (#8196267) Homepage
    For anybody on here thinking about broadening their CS horizons, I would recommend not learning x86, at least not first. A simpler, RISC(-ish) instruction set is really the way to go; it's a lot more enjoyable to program in. Some good choices are the Z80 or 68k (program your TI calculator, maybe?) or Microchip's PICmicro microcontrollers (most models have under 35 instructions to learn).

    Learning x86 isn't a bad idea, and for most nerds programming assembly that's probably where it will be most useful, but I just think it's a better idea to start off programming ASM on something a little more enjoyable so that you can really learn to appreciate it before diving in with x86.
  • by John Jorsett ( 171560 ) on Thursday February 05, 2004 @07:16PM (#8196295)
    I understand C much better than I would have had I not learned assembly language first. I think of C as a somewhat-more-abstract version of assembly. It has that "down to the bare metal" aspect in much of what you can do with it, particularly pointers.
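    For example (purely illustrative), the classic pointer-walking string copy maps almost line-for-line onto the loads, stores, increments, and conditional branch you would write in assembly:

        /* Copy src into dst, including the terminating '\0'.
           dst must point to a buffer large enough to hold src. */
        void copy_string(char *dst, const char *src)
        {
            while ((*dst++ = *src++) != '\0')
                ;   /* each pass: load, store, bump two pointers, test */
        }
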
  • by PissingInTheWind ( 573929 ) on Thursday February 05, 2004 @07:36PM (#8196489)
    From the book's presentation page:
    To be a programmer without ever learning assembly language is like being a professional race car driver without understanding how your carburetor (sic) works.

    To which I reply: To be a book writer without ever learning how to spell properly is like trying to teach programming by starting with assembly languages.
  • by John Whitley ( 6067 ) on Thursday February 05, 2004 @07:57PM (#8196725) Homepage
    Perhaps it's time that computer science curriculums start teaching assembly language first.

    Having taught an assembly/intro computer architecture class, I agree with the sentiment that students who get "under the hood" gain valuable knowledge and working skills -- not just pounding out ASM, but learning how the machine works. Point agreed.

    Also having taught first year computer science students, and seen how some of academia's transitions in pedagogy affected students... I have to say that the idea of teaching first year students in assembly is friggin' daft.

    My reasoning is the same as why I strongly advocated an objects-first teaching model. It is increasingly critical for students to build a strong sense of software design and abstraction early on. This foundation makes students much better prepared to solve problems of many different scales (asm to component-systems) in the long run.

    There's evidence from a paper in one of the Empirical Studies of Programmers workshops that this approach does trade off design skills for purely algorithmic reasoning for students at the end of their first year. But my own experience, as well as that of some prominent Comp Sci Education (CSE) folks seems to indicate that this is far more than compensated for as a student's skills grow.

    Here's my theory as to why this is the case:
    The details of debugging, algorithmic thinking, and problem solving are very much skill-building exercises that simply require time and exposure to improve. But in my experience it is much more difficult for students to build good design sense on their own. Once the framework for thinking in terms of good abstractions is laid down, it provides much stronger support for later filling in all of those gory low-level details.

    Historical perspective: Ironically, this same reasoning is much of why I believe that academia's switch to C++ from languages like Pascal, Modula-2, etc. was an educational disaster for many years. The astute reader is now thinking: "hey, you just said you like objects-first; what up?" In the Procedural Era, many schools wouldn't expose students to C in the first year, as it had too many pitfalls that distracted from learning the basics of algorithmic thinking and important abstraction skills. Once that foundation was in place, it was okay to switch 'em to C for the rest of the program.

    When C++ and the early object boom really hit, this put big pressure on schools to teach first-year students using C++. At one point in the mid-'90s, upwards of 75% of 4-year institutions were teaching their first year in C++. Thus a language with even more pitfalls than C, previously shunned for its pedagogical failings, entered the classroom. Combined with a lack of proper OO mental retooling on the part of first-year instructors and faculty, this made for something of a skills disaster on a broad scale. At best, students learned "Modula-C" instead of good OO style. At worst, they were so confused by the melange of one-instance classes and sloppy hybrid typing that they didn't get a cohesive foundation whatsoever.
  • This book (Score:5, Informative)

    by voodoo1man ( 594237 ) on Thursday February 05, 2004 @08:43PM (#8197123)
    has been available [nongnu.org] for some time under the GNU Free Documentation License. I tried to use it a while back when I decided to learn assembler, but I found Paul Carter's PC Assembly Language [drpaulcarter.com] to be a much better introduction.
  • by smoon ( 16873 ) on Thursday February 05, 2004 @09:40PM (#8197536) Homepage
    The problem is that computer scientists don't make good programmers, and vice versa. If you're good with code and hunker down to write lots of programs, you tend to clash with the all-theory-no-code camp that delights in big-O notation and graph theory. Of course there is a lot of middle ground, but in general the PhD professor types who staff the CompSci departments I've been in tend to have stopped learning about computers as soon as they finished their doctorate, concentrating instead on internecine politics, incomprehensible papers, and teaching the occasional class (leaving most of that to T.A.s, who actually teach the class and understand how to compile programs).

    Meanwhile the coder types graduate with a B.S. or maybe a master's, then go into commercial development shops and crank out code, forgetting as much as they can about red-black trees and other subtle CompSci concepts.

    So if you want to crank out programmers, then assembly is probably a good thing. God knows I learned a lot from the assembly classes I took.

    If you're trying to scare students away, then assembly is also a good tactic. Nothing like a good hex dump to make some non-CompSci students' eyes glaze over. Sort of like making people take biology or physics, but instead of teaching about cells and Newtonian motion, jumping right into the finer points of quantum mechanics or amino-acid chemistry.

    On the other hand, for second-year CompSci students, assembly is probably a good thing to get out of the way. It really sucks, for example, to take economics for four years only to learn at the end, "just kidding, reality is too complex to model, so these are all just gross oversimplifications." Sort of like thinking programming == Java and then finding out how it all _really_ works.

  • by Senior Frac ( 110715 ) on Thursday February 05, 2004 @09:59PM (#8197670) Homepage

    Perhaps it's time that computer science curriculums start teaching assembly language first.

    It's more critical that they actually teach computer science first, instead of programming. A new CS hire, assuming their school was worth a damn, can learn a new language. I want to know whether they have the math background to understand the problems that will be handed to them, and whether they have the ability to self-learn.

  • by Jerk City Troll ( 661616 ) on Thursday February 05, 2004 @11:37PM (#8198243) Homepage

    Computer science isn't "knowing computers on a deeper level." Computer science is algorithms and lots of math. Computer scientists don't care about how a computer works. They don't care about the language either. They are interested in data structures and how to work with them. What language is in use is really unimportant, be it Java or Assembly.

  • by rcpitt ( 711863 ) * on Friday February 06, 2004 @12:18AM (#8198466) Homepage Journal
    The first piece of digital gear I ever had my hands on was the size of a small desk and about 5' high. It had just enough "logic" on it to create an eight bit ring counter (0 1 10 11 100...)

    That was in 1965 and the unit was the only one for our whole school district and worth about $30,000 (CDN - about $35000 US at the time)

    From there to today's Object Oriented Programming languages has been an interesting time. I wouldn't have missed it for anything, and I honestly think that living through it has given me a perspective that many more recent programmers don't have and IMHO need, sometimes.

    Where "brute force and ignorance" solutions are practical, there is no gain in knowing enough about the underlying hardware and bit twiddling to make things run 1000% faster after spending 6 months re-programming to manually optimize. In fact, since (C and other) compilers have become easily architecture tuned, there really are few areas where speed gains from hardware knowledge can be had, let alone made cost effective. Most are at the hardware interface level - the drivers - most recently USB for example.

    If you're happy programming Visual Basic, and your employer can afford the hardware cost of ramping up your single-CPU "solution" to deal with millions of visitors instead of hundreds, then you don't need to know anything about the underlying hardware at the bit level.

    On the other hand, if you need to wring the most "visits" out of off-the-shelf hardware, then you need to know enough to calculate the theoretical CPU cycles per visit (a rough example is at the end of this comment).

    Somewhere between these two extremes lies reality.

    Today I use my hardware knowledge mostly as a "bullshit filter" when dealing with claims and statistics from various vendors. I have an empirical understanding of why (and under what circumstances) a processor with 512K of level-2 cache and a 500MHz front-side bus might be faster than a processor with 256K of L2 cache and an 800MHz FSB, and vice versa. Same thing for the cost-effectiveness of SCSI vs. IDE when dealing with a database app vs. access to large images in a file system (something that came up today with a customer when spec'ing a pair of file servers, one for each type of application).

    Back in the mid 70s I dealt with people who optimized applications on million $ machines capable of about 100 online users at one time. Today I deal with optimization on $1000-$3000 machines with thousands of 'active' sessions and millions of 'hits' per day. Different situations but similar problems. Major difference is in the cost of "throwing hardware at the problem" (and throwing the operating systems to go with the HW - but then I use Linux so not much of a difference ;)

    Bottom line is that understanding the underlying hardware helps me quite a bit - but only as a specialist in optimization and cost-effectiveness now, not in getting things to work at all as in the past.
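
    As a back-of-the-envelope illustration of the "cycles per visit" arithmetic mentioned above (all numbers made up): a 2GHz CPU gives you roughly 2 billion cycles per second. If a dynamic page view costs on the order of 20 million cycles, that is about 100 views per second per CPU, or roughly 8.6 million per day -- before you account for I/O waits, which in practice usually dominate.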
