Programming Education IT Technology

Learning Computer Science via Assembly Language 1328

johnnyb writes "A new book was just released which is based on a new concept - teaching computer science through assembly language (Linux x86 assembly language, to be exact). This book teaches how the machine itself operates, rather than just the language. I've found that the key difference between mediocre and excellent programmers is whether or not they know assembly language. Those that do tend to understand computers themselves at a much deeper level. Although unheard of today, this concept isn't really all that new -- in years past there wasn't much choice. Apple computers came with only BASIC and assembly language, and there were books available on assembly language for kids. This is why the old-timers are often viewed as 'wizards': they had to know assembly language programming. Perhaps this current obsession with learning using 'easy' languages is the wrong way to do things. High-level languages are great, but learning them will never teach you about computers. Perhaps it's time that computer science curriculums start teaching assembly language first."
This discussion has been archived. No new comments can be posted.

  • Knuth (Score:5, Informative)

    by red floyd ( 220712 ) on Thursday February 05, 2004 @07:48PM (#8195909)
    Isn't that what Knuth did with his ASM language? I believe it was a synthetic assembler for a hypothetical stack machine -- hence the name ASM - Abstract Stack Machine.
  • by shaitand ( 626655 ) on Thursday February 05, 2004 @07:48PM (#8195919) Journal
    It is in the same fashion that win32 asm is different from linux asm. The core is the same but knowing the core of x86 assembler is going to get you far if what you are wanting to do is talk to the kernel.
  • Not new (Score:3, Informative)

    by El Cabri ( 13930 ) on Thursday February 05, 2004 @07:49PM (#8195922) Journal
    Knuth's The Art of Computer Programming illustrated its algorithms in an imaginary assembly language.
  • Your book? (Score:5, Informative)

    by Tet ( 2721 ) * <.ku.oc.enydartsa. .ta. .todhsals.> on Thursday February 05, 2004 @07:51PM (#8195948) Homepage Journal
    A new book was just released

    What you meant to say was that your new book has just been released. If you're going to pimp your wares on Slashdot, at least put an appropriate disclaimer on. That said, I completely agree with the premise of the book. I've met a lot of mediocre programmers, and a few good ones. But I've never yet met a real star that didn't have some background in assembly language programming. Personally, I haven't written anything in assembly in well over a decade. But the fact that I can do so if needed makes me a better programmer, and I'd recommend it to any aspiring coder as a key skill to learn. I wouldn't say IA32 is a particularly nice introduction (I'd start with a cleaner, simpler architecture, such as the 6502), but it is at least widely available to anyone that wants to study it...

  • by gid13 ( 620803 ) on Thursday February 05, 2004 @07:51PM (#8195950)
    I'm not even a CS student (I'm Engineering Physics), and I still had to learn some microcontroller assembly language.

    While I admit that it helps you understand the device more, I have to say it's much less intuitive and enjoyable than high-level programming (not that I'm the type to find scripting fun, but you know what I mean).
  • by JoshuaDFranklin ( 147726 ) * <joshuadfranklin.NOSPAM@ya h o o .com> on Thursday February 05, 2004 @07:53PM (#8195987) Homepage
    I don't know why he didn't mention that this is a free documentation project:

    http://savannah.nongnu.org/projects/pgubook/ [nongnu.org]

    It's also being used at Princeton [princeton.edu]

  • by WankersRevenge ( 452399 ) on Thursday February 05, 2004 @07:54PM (#8195994)
    assembly is the great monster that requires fresh blood every year, or the great darkness will fall upon the land. i myself have never dabbled in assembly because i don't like living in an hp lovecraft nightmare.

    For those of you insane enough to take the plunge, check out this FREE online introduction course [ccsu.edu] (no reg, don't ya love it). The guy who wrote it is pretty wacky. I took his java introductory course [ctstateu.edu] and it was hip as well as very educational.
  • by Wise Dragon ( 71071 ) on Thursday February 05, 2004 @07:57PM (#8196041) Homepage
    I think the article should have disclosed that the submitter (johnnyb) is also the author of the book, Jonathan Bartlett. So rather than saying "A new book was just released", I would rather see something like "I wrote this new book." Here is johnnyb's website. http://www.eskimo.com/~johnnyb/

  • by AstynaxX ( 217139 ) on Thursday February 05, 2004 @08:02PM (#8196099) Homepage
    Not to be a prick (well, not too much of one) but that would be a -good- thing. Less retention means we are shaking out the chaff faster, getting down to only those people who want to be in CS for the art's sake, not the Big Buck$(tm). As recent economic events have shown, too many of the latter is a Bad Thing(tm).
  • Am I that old? (Score:1, Informative)

    by Anonymous Coward on Thursday February 05, 2004 @08:02PM (#8196104)
    Have things changed that much since I graduated in 1999? Can you really get a degree in computer science without at least one course dealing with assembly?

    I know we had one course on computer hardware and assembly. There were at least one or two others that touched on the subject. It wasn't much but it was something.

    Knowing assembler doesn't help me too much at work where I am forced to write in VB6. It does help a lot with my projects at home that use microcontrollers. In that environment, even if you can use C, you still need to know a lot about the hardware you are working with.

  • Loughborough Uni UK (Score:1, Informative)

    by Anonymous Coward on Thursday February 05, 2004 @08:02PM (#8196105)
    Here in Loughborough, on my course, it is one of the first things that you learn; after that you are introduced to C and Java. Of course I had knowledge of those before, but some had their first taste of programming with assembler.

    Only straight Computer Science learnt assembler, however. Computer Science with Business were learning VB, and didn't even look at C or Java, and I think that is wrong! Do you? :(

    Jimbob
  • by Anonymous Coward on Thursday February 05, 2004 @08:02PM (#8196107)
    http://www.cse.msu.edu/~miscisi2/security/pages/exploits/overflows.htm
  • MIXAL (Score:3, Informative)

    by texchanchan ( 471739 ) <ccrowley@@@gmail...com> on Thursday February 05, 2004 @08:03PM (#8196116)
    MIXAL, MIX assembly language. MIX was the virtual machine I learned assembly on in 1975. Googling reveals that MIX was, in fact, the Knuth virtual computer. The book came with a little cue card with a picture of Tom Mix [old-time.com] on it. MIX has 1 K of memory. Amazing what can be done in 1 K.
  • by tealover ( 187148 ) on Thursday February 05, 2004 @08:05PM (#8196151)
    Since this submission is nothing more than an attempt to hawk his own book, on principle I refuse to buy it. I don't like dishonesty in the submission process. He should have come out and directly admitted that it was his book.

  • Re:No, but... (Score:5, Informative)

    by Alan Shutko ( 5101 ) on Thursday February 05, 2004 @08:06PM (#8196155) Homepage
    Except that things like "i = i + 1" vs. "i++" vs "i+=1" are mostly irrelevant today, since that's a very easy thing for compilers to optimize. And they've been optimizing stuff like that for years.
    Try looking at the asm output from GCC at -O2 on those two statements.
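
    For instance, here is a quick (purely illustrative; the file and function names are made up) way to check that claim yourself -- compile the file below with "gcc -O2 -S increment.c" and compare the three functions in the generated assembly; with any recent GCC they should come out identical:

        /* increment.c -- minimal sketch of the three increment styles */
        int inc_a(int i) { return i + 1; }      /* the "i = i + 1" style */
        int inc_b(int i) { i++; return i; }     /* the "i++" style */
        int inc_c(int i) { i += 1; return i; }  /* the "i += 1" style */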

    Knuth had reasons for using ASM that were a lot better than that. It does give you a better idea of how things are laid out in memory, because you have to do it yourself. It's easier to do detailed performance analysis of algorithms, because you can get exact cycle counts. (Which in turn helps train your intuition, and tell you how to find out from a CPU's instruction set how it does at various things to tune algorithms.) You can look at how cache affects things.

    Take a look at his reasons [stanford.edu].
  • by shaitand ( 626655 ) on Thursday February 05, 2004 @08:08PM (#8196188) Journal
    Your post makes no sense unless you were confused by my mistype; I meant to say "the x86 core ISN'T going to get you far if what you're wanting to do is talk to the kernel". Parts of the kernel ARE in assembler, and the bootloader is largely in ASM.

    So in truth, the kernel is the car. Asm can be the road, it can be the engine, it can be the passengers, it can be the wind resistance, it can be virtually any component. But nonetheless, if you're writing an application sitting on top of the kernel you are going to need to speak to the kernel's API at some point (or the API of a layer sitting on top of it), just as if you're writing a Windows application in asm or C or VB, you need to be speaking to the Win32 API.

    Asm is no different than any other language: knowing the language is great and all, but it's worthless without learning the proper APIs you'll need to actually write a program that does something. That's a major flaw in most programming tutorials. They'll teach C or another language and not mention a single word about the APIs one needs to know to actually write a program that does more than calculate pi.
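
    To make that concrete, here is a minimal sketch (for Linux, using glibc's generic syscall() wrapper; the file name is just illustrative) of a program that talks to the kernel's API directly instead of going through the usual library calls:

        /* hello_syscall.c -- even the smallest useful program ends up going
           through the kernel's interface; here the write system call is
           invoked explicitly via syscall(). */
        #include <unistd.h>
        #include <sys/syscall.h>

        int main(void)
        {
            const char msg[] = "hello via the kernel API\n";
            syscall(SYS_write, 1, msg, sizeof msg - 1);  /* fd 1 = stdout */
            return 0;
        }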
  • by Cryptnotic ( 154382 ) on Thursday February 05, 2004 @08:11PM (#8196226)
    A real computer science program will teach generic principles of programming and systems development, with projects that delve into a variety of actual implementations of systems.

    For example, a b-tree data structure is fundamentally the same thing whether you implement it in 32-bit ARM assembly language or 16-bit x86 assembly language or C or Java.
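
    As a small illustration of that language-independence (using an ordinary binary search tree rather than a full B-tree, purely to keep the sketch short; names are illustrative), the structure and the insert logic below would look essentially the same whether written in C, Java, or assembly:

        /* bst_sketch.c -- illustrative only */
        #include <stdio.h>
        #include <stdlib.h>

        struct node { int key; struct node *left, *right; };

        static struct node *insert(struct node *root, int key)
        {
            if (root == NULL) {                      /* empty spot: allocate */
                root = malloc(sizeof *root);
                root->key = key;
                root->left = root->right = NULL;
            } else if (key < root->key) {
                root->left = insert(root->left, key);
            } else if (key > root->key) {
                root->right = insert(root->right, key);
            }
            return root;
        }

        static void inorder(const struct node *n)
        {
            if (n) { inorder(n->left); printf("%d ", n->key); inorder(n->right); }
        }

        int main(void)
        {
            struct node *root = NULL;
            int keys[] = { 42, 7, 99, 23 };
            for (unsigned i = 0; i < sizeof keys / sizeof keys[0]; i++)
                root = insert(root, keys[i]);
            inorder(root);   /* prints: 7 23 42 99 */
            putchar('\n');
            return 0;
        }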

    To understand how assembly language works, you need to understand how a processor works, how instruction decoding works, how register transfer language works, how clocking a processor makes it accomplish things. To understand how registers hold values electrically and transfer values between registers you need to understand some physics and electronics.

    To understand how a compiler takes a source language and translates it into a target language, you need to understand a little about the kinds of languages computers can understand (Context-Free Languages) and how they can parse them (Context-Free Grammars). Delving into that field will lead to the core theory of computer science, what is possible with machines in general and what is impossible.

    A real computer science program at a university will take you through all of these subjects over several years, allowing for some excursions into other things like databases and cryptography. A real computer science program is always theory with projects that are applied to actual implementations.

  • by Anonymous Coward on Thursday February 05, 2004 @08:14PM (#8196269)
    There are two standards, the AT&T ... and the other one

    Incorrect. There are at least four different assemblers and standards:

    AS (GAS) - GNU Assembler. AT&T standard, as commonly used on Linux. The syntax hasn't changed since the 60's - which is both very good and very bad. I personally think it should be retired.

    MASM - Microsoft Assembler. Intel standard assembly. The syntax is nice, but there are some ambiguous operators (is [] address of or address by value? - the meaning changes depending on the context). This is typically what the commercial Windows world uses. MASM itself is mostly obsolete - the Visual C compiler can now do everything that it could and supports all modern CPU instructions (even on Visual C++ 6 if you install the latest CPU pack).

    NASM - Netwide Assembler. An assembler that set out to put right all the things that were wrong with MASM. The syntax is excellent, ambiguous operators are cleared up, documentation is also excellent, it interoperates beautifully with Visual C on Windows and GNU C on Linux. Ideally NASM would replace AS as the standard now that it's open source.

    TASM - Borland Turbo Assembler. Based around the Intel standards, but does things slightly differently. Has extensions which allow for easy object-oriented assembly programming - which can make for some very nice code. Had a MASM compatibility mode, but nobody in their right mind used that if they could help it. I had version 5, but I don't believe they've kept it up to date, so it's obsolete now.

    There are a couple of others as well, most notably AS86 (which was the leading independent solution for writing assembler back in the DOS days).
  • by Tom7 ( 102298 ) on Thursday February 05, 2004 @08:23PM (#8196379) Homepage Journal
    Assembly is an impediment to understanding high-level issues. While I agree 100% that any good CS program should include some discussion of assembly and systems programming (at least a few semesters), there is so much more to CS than systems hacking. There is no way that a whole modern CS curriculum should be taught in assembly language.

    As a starting language it's really a matter of preference: the bottom-up method gets you really understanding the machine (and really yearning for more convenient tools), but it is a slow, painful start. The things you learn are less general and will be less applicable years down the line. On the other hand, learning a high level language can leave you in the dark about what's going on under the hood, which means that some aspects of what you're doing will seem like "magic". For you efficiency addicts, this can mean less efficient code. On the other hand, the world will have fewer efficiency addicts!

    However, I think that assembly will turn a lot of people off of programming who could otherwise be interested in the subject and become productive programmers, if not cowboy kernel hackers.
  • by ncc74656 ( 45571 ) * <scott@alfter.us> on Thursday February 05, 2004 @08:29PM (#8196419) Homepage Journal
    Is "Linux x86 assembly" any different to any other kind of "x86 assembly"?

    Given that most Linux-based assemblers use AT&T syntax and most other x86 assemblers use Intel syntax, yes. This page [codeproject.com] summarizes some of the most significant differences (operand order, operand prefixes, etc.).

    There's also the small matter of talking to the host OS. The difference between x86 assembly coding on Linux and x86 assembly coding on Win32 is comparable to the difference between 6502 assembly coding on the Apple II and 6502 assembly coding on the Commodore 64. Just as JSR $FDED does one thing on the Apple II (print the character in the accumulator) and does something completely different on the Commodore (crash?), making Linux calls to Windows is not likely to work too well.

  • by CleverDan ( 728966 ) on Thursday February 05, 2004 @08:30PM (#8196429)
    The Art of Assembly Language, by Randy Hyde at Univ California - Riverside. To learn about assembly on 80x86 processors, check out the printed book [nostarch.com], or download the text with a Linux [ucr.edu] or Windows [ucr.edu] point-of-view. It's written in a style that's not overwhelming to the novice.
  • by Wolfier ( 94144 ) on Thursday February 05, 2004 @08:39PM (#8196509)
    It is not assembly language. It is the way we think - and in the world of computers there are 5 types of languages that will let you take on anything very easily.

    1. Procedural (Pascal/C/BASIC)
    2. Object-Oriented (Eiffel/Smalltalk/Java/C++)
    3. Functional (Scheme/Lisp/Logo)
    4. Declarative (Prolog/Forth)
    5. Assembly (x86 etc.)

  • by vontrotsky ( 667853 ) on Thursday February 05, 2004 @08:40PM (#8196519)
    A* is like Dijkstra's Algorithm, but instead of expanding paths blindly, you use a heuristic function to guess at which paths to take.

    Jeff
    who thinks that you should ask google
  • by pclminion ( 145572 ) on Thursday February 05, 2004 @08:48PM (#8196600)
    I would organize those differently:

    1. Imperative
    -- 1a. Procedural (Pascal/C/BASIC)
    -- 1b. Object-Oriented (Eiffel/Smalltalk/Java/C++)
    -- 1c. Assembly language
    2. Functional-Type
    -- 2a. Pseudo-functional (Scheme/Lisp)
    -- 2b. Pure functional (Haskell/ML/Pure lambda calculus)
    3. Declarative (Prolog)

    Imperative languages are based on the execution of individual commands. Fundamentally they are based on the concept of assignment -- moving data from one place to another. Functional languages are based on the evaluation of expressions and the absence of side-effects. Pseudo-functional languages have variables, loops, and side-effects but are mainly based on functional concepts. Declarative languages are based on the concept of goals, and the recursive description of how those goals should be achieved, or the definition of what constitutes achievement of the goals.

    I'm not sure why you consider Forth a declarative language. To me it seems more like an imperative language with an unusual syntax.

  • by larry bagina ( 561269 ) on Thursday February 05, 2004 @08:50PM (#8196638) Journal
    those are implementations, not standards.

    x86 instructions that take two operands can be written in two ways:

    instr src,dest
    instr dest,src

    The Intel standard (used by nasm, tasm, masm) is dest,src. The AT&T standard (used by gas) is src,dest.
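
    A tiny C sketch of the same point (the file and function names are just illustrative): GCC's inline assembler takes AT&T syntax by default, so the source operand comes first and the destination last, with "%" register prefixes and "$" immediates; the equivalent Intel-syntax form is shown in the comment.

        /* operand_order.c -- compile with gcc; prints 42 */
        #include <stdio.h>

        static int add_five(int x)
        {
            /* AT&T (gas):  addl $5, %reg      Intel (nasm/masm):  add reg, 5 */
            __asm__("addl $5, %0" : "+r"(x));
            return x;
        }

        int main(void)
        {
            printf("%d\n", add_five(37));
            return 0;
        }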

  • by Endive4Ever ( 742304 ) on Thursday February 05, 2004 @08:53PM (#8196662)
    Well, some of us code assembly on bare hardware. We have to roll our own 'api' and include it in there with the rest of the code.

    I've worked before with programmers who had little experience in programming 'bare hardware' - they do really foolish things like failing to init timers, set up stack pointers, and the like.

    Writing bare ASM code for a processor (where it boots up out of your own EPROM or on an emulator) is good experience in minimalism. It can give you a good feeling when the project is all done and you can say you did it all yourself.

    For those interested in getting into this kind of thing, start with a PIC embedded controller and a cheap programmer. You can get PIC assembly language tools for free, and build a programmer, or buy a kit for a programmer, that plugs into your serial or parallel port. Your first PIC machine can be the CPU, a clock crystal, a few resistors and capacitors, and the LED you want to blink, or whatever else intrigues you. If you're not into complex soldering, and/or layout and complex schematics, you can buy pre-etched boards you just plug the PIC into.

    Another easy-start processor would be the 68HC11. It has a bootstrap built into ROM. Basically, you can jumper the chip so it wakes up listening on the serial port for code you send down the wire at it, and burns it into the EEPROM memory in the 'HC11 chip itself. Move the jumper and reboot the chip, and it's running your code.

    I think this is far more interesting than just writing apps that run on an Operating System you didn't roll yourself.
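
    For a flavour of what "rolling your own" looks like at the C level, here is a bare-metal blink sketch. The register addresses are hypothetical -- on a real PIC or 68HC11 you would pull the actual locations from the vendor's header -- but the shape of the program (set up the hardware yourself, then loop forever) is the point:

        /* blink.c -- illustrative bare-metal sketch, not for any real part */
        #define PORT_DIR   (*(volatile unsigned char *)0x1000)  /* data-direction register (made up) */
        #define PORT_DATA  (*(volatile unsigned char *)0x1001)  /* output latch (made up) */

        void main(void)                     /* freestanding environment: no OS, no argv */
        {
            PORT_DIR |= 0x01;               /* make bit 0 an output */
            for (;;) {
                PORT_DATA ^= 0x01;          /* toggle the LED */
                for (volatile long i = 0; i < 50000; i++)
                    ;                       /* crude software delay */
            }
        }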

  • by John Whitley ( 6067 ) on Thursday February 05, 2004 @08:57PM (#8196725) Homepage
    Perhaps it's time that computer science curriculums start teaching assembly language first.

    Having taught an assembly/intro computer architecture class, I agree with the sentiment that students who get "under the hood" gain valuable knowledge and working skills. Not just pounding ASM, but in learning how the machine works. Point agreed.

    Also having taught first year computer science students, and seen how some of academia's transitions in pedagogy affected students... I have to say that the idea of teaching first year students in assembly is friggin' daft.

    My reasoning is the same as why I strongly advocated an objects-first teaching model. It is increasingly critical for students to build a strong sense of software design and abstraction early on. This foundation makes students much better prepared to solve problems of many different scales (asm to component-systems) in the long run.

    There's evidence from a paper in one of the Empirical Studies of Programmers workshops that this approach does trade off design skills for purely algorithmic reasoning for students at the end of their first year. But my own experience, as well as that of some prominent Comp Sci Education (CSE) folks seems to indicate that this is far more than compensated for as a student's skills grow.

    Here's my theory as to why this is the case:
    The details of debugging, algorithmic thinking, and problem solving are very much skill-building exercises that really require time of exposure to improve. But it is much more difficult in my experience for students to build good design sense on their own. Once the framework for thinking in terms of good abstractions is laid down, it provides much stronger support for later filling in all of those gory low-level details.

    Historical perspective: Ironically, this same reasoning is much of why I believe that academia's switch to C++ from languages like Pascal, Modula-2, etc. was an educational disaster for many years. The astute reader is now thinking: "hey, you just said you like objects-first; what up?" In the Procedural Era, many schools wouldn't expose students to C in the first year, as it had too many pitfalls that distracted from learning the basics of algorithmic thinking and important abstraction skills. Once the foundation was put in place, it was okay to switch 'em to C for the rest of the program.

    When C++ and the early object boom really hit, this put big pressure on departments to teach first year students using C++. At one point in the mid-90's, upwards of 75% of 4-year institutions were teaching their first year in C++. Thus a language that had plenty more pitfalls than C, previously shunned for its pedagogical failings, entered the classroom. Combined with a lack of proper OO mental retooling on the part of first year instructors and faculty, this made for something of a skills disaster on a broad scale. At best, students learned "Modula-C" instead of good OO style. At worst, they were so confused by this melange of one-instance classes and sloppy hybrid typing that they didn't get a cohesive foundation whatsoever.
  • Re:Not the point! (Score:5, Informative)

    by tomstdenis ( 446163 ) <tomstdenis@gma[ ]com ['il.' in gap]> on Thursday February 05, 2004 @08:58PM (#8196740) Homepage
    Correct me if I'm wrong, but isn't the most primitive CMOS gate a NAND gate? So I highly doubt you would make an AND out of an XOR gate [XOR being the more costly of the three].

    Tom
  • by milesw ( 91604 ) on Thursday February 05, 2004 @09:11PM (#8196885) Homepage
    The book was released under the GNU Free Documentation License, and it can be downloaded for free (in PDF format) from: http://savannah.nongnu.org/projects/pgubook/ .
  • by Perl-Pusher ( 555592 ) on Thursday February 05, 2004 @09:13PM (#8196906)
    The CS and CE programs I went to in Virginia only differed by 3-4 classes. CE had 2 semesters of microcontrollers, Linear Algebra, Ordinary Diff EQs and Chemistry, while CS required Algorithms and Operating Systems. CS could opt for matrix algebra for business or the engineering-based Linear Algebra. Other than that, they were identical. Many students opt for dual degrees. Both required physics, calculus and what they called the core curriculum, which was digital design, a lot of programming, computer architecture, etc. Both taught assembly, C++ and UML, and Java was a popular elective.
  • by Venner ( 59051 ) on Thursday February 05, 2004 @09:29PM (#8197028)
    Eh, it isn't really unheard of today.
    In the old days, computer scientists didn't really exist. You basically had groups of electrical engineers, mathematicians, etc, developing what is today computer science.

    As a graduate of an accredited Computer Engineering curriculum, my take is this: computer scientists develop software, algorithms, etc. Computer engineers design the underlying digital circuits, logic, and such. Software guys vs. hardware guys.

    As such, you'll find computer engineers use assembly a heck of a lot more than computer scientists. I've worked with MIPS, x86, motorola's, and several others. And when you get down to it, I like to work with C more than I do with languages like C++ or Java. I enjoy the low-level nitty-gritty.

    I'm making a generalization of course; there's no great schism between the two groups and our work often overlaps. We just each use the tools most appropriate for our jobs.
  • Re:MIXAL (Score:3, Informative)

    by Slamtilt ( 17405 ) on Thursday February 05, 2004 @09:35PM (#8197073)
    Well, if we're talking about the same MIX as in The Art of Computer Programming, it's defined there as having 4000 memory locations. Each memory location is made up of 5 bytes and a sign bit. The number of bits in a byte isn't precisely defined, but a byte has to be able to hold at least 64 distinct values, and no more than 100. And programs should never assume more than 64 values for a byte. Odd, but there you are.

    Writing your own MIX machine is an interesting exercise. I remember I finished the instruction set but never got round to actually implementing MIXAL (the assembly language). Which is embarrassing, since Knuth practically gives you the recipe.
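
    If you ever do sit down to write one, the first data type practically writes itself. A sketch in C (names are illustrative, and since MIX "bytes" are defined by value range rather than bit width, the representation below -- 64 values per byte -- is just one legitimate choice):

        /* mix_word.c -- one possible model of a MIX word */
        #include <stdio.h>

        struct mix_word {
            int sign;               /* +1 or -1 */
            unsigned char byte[5];  /* each holds 0..63 in a 64-value emulation */
        };

        /* Interpret the whole word as a signed integer, base 64. */
        static long mix_word_value(const struct mix_word *w)
        {
            long v = 0;
            for (int i = 0; i < 5; i++)
                v = v * 64 + w->byte[i];
            return w->sign * v;
        }

        int main(void)
        {
            struct mix_word w = { -1, { 0, 0, 0, 31, 5 } };   /* -(31*64 + 5) */
            printf("%ld\n", mix_word_value(&w));              /* prints -1989 */
            return 0;
        }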
  • Re:Your book? (Score:2, Informative)

    by Mr. Darl McBride ( 704524 ) on Thursday February 05, 2004 @09:36PM (#8197083)
    I'm put off too.

    I'll download it for free [nongnu.org] myself.

  • This book (Score:5, Informative)

    by voodoo1man ( 594237 ) on Thursday February 05, 2004 @09:43PM (#8197123)
    has been available [nongnu.org] for some time under the GNU Free Documentation License. I tried to use it a while back when I decided to learn assembler, but I found Paul Carter's PC Assembly Language [drpaulcarter.com] to be a much better introduction.
  • by Lord Ender ( 156273 ) on Thursday February 05, 2004 @09:47PM (#8197155) Homepage
    This was posted to USENET by its author, Ed Nather, on May 21, 1983.

    A recent article devoted to the *macho* side of programming
    made the bald and unvarnished statement:

    Real Programmers write in FORTRAN.

    Maybe they do now,
    in this decadent era of
    Lite beer, hand calculators, and "user-friendly" software
    but back in the Good Old Days,
    when the term "software" sounded funny
    and Real Computers were made out of drums and vacuum tubes,
    Real Programmers wrote in machine code.
    Not FORTRAN. Not RATFOR. Not, even, assembly language.
    Machine Code.
    Raw, unadorned, inscrutable hexadecimal numbers.

    Lest a whole new generation of programmers
    grow up in ignorance of this glorious past,
    I feel duty-bound to describe,
    as best I can through the generation gap,
    how a Real Programmer wrote code.
    I'll call him Mel,
    because that was his name.

    I first met Mel when I went to work for Royal McBee Computer Corp.,
    a now-defunct subsidiary of the typewriter company.
    The firm manufactured the LGP-30,
    a small, cheap (by the standards of the day)
    drum-memory computer,
    and had just started to manufacture
    the RPC-4000, a much-improved,
    bigger, better, faster --- drum-memory computer.
    Cores cost too much,
    and weren't here to stay, anyway.
    (That's why you haven't heard of the company, or the computer.)

    I had been hired to write a FORTRAN compiler
    for this new marvel and Mel was my guide to its wonders.
    Mel didn't approve of compilers.

    "If a program can't rewrite its own code",
    he asked, "what good is it?"

    Mel had written,
    in hexadecimal,
    the most popular computer program the company owned.
    It ran on the LGP-30
    and played blackjack with potential customers
    at computer shows.
    Its effect was always dramatic.
    The LGP-30 booth was packed at every show,
    and the IBM salesmen stood around
    talking to each other.
    Whether or not this actually sold computers
    was a question we never discussed.

    Mel's job was to re-write
    the blackjack program for the RPC-4000.
    (Port? What does that mean?)
    The new computer had a one-plus-one
    addressing scheme,
    in which each machine instruction,
    in addition to the operation code
    and the address of the needed operand,
    had a second address that indicated where, on the revolving drum,
    the next instruction was located.

    In modern parlance,
    every single instruction was followed by a GO TO!
    Put *that* in Pascal's pipe and smoke it.

    Mel loved the RPC-4000
    because he could optimize his code:
    that is, locate instructions on the drum
    so that just as one finished its job,
    the next would be just arriving at the "read head"
    and available for immediate execution.
    There was a program to do that job,
    an "optimizing assembler",
    but Mel refused to use it.

    "You never know where it's going to put things",
    he explained, "so you'd have to use separate constants".
    It was a long time before I understood that remark.
    Since Mel knew the numerical value
    of every operation code,
    and assigned his own drum addresses,
    every instruction he wrote could also be considered
    a numerical constant.
    He could pick up an earlier "add" instruction, say,
    and multiply by it,
    if it had the right numeric value.
    His code was not easy for someone else to modify.

    I compared Mel's hand-optimized programs
    with the same code massaged by the optimizing assembler program,
    and Mel's always ran faster.
    That was because the "top-down" method of program design
    hadn't been invented yet,
    and Mel wouldn't have used it anyway.
    He wrote the innermost parts of his program loops first,
    so they would get first choice
    of the optimum address locations on the drum.
  • by steveg ( 55825 ) on Thursday February 05, 2004 @09:56PM (#8197217)
    (But understanding how assembly instruction sets work, and how compilers work, are both useful for writing better code at the compiler level. Less so now that optimizers are really good - but the understanding is still helpful.)

    My understanding of the parent post was that this is exactly what he was saying. I don't think he was claiming that programs written in assembly were better, but that programmers who knew assembly were better programmers.

    I think you were agreeing with him.
  • by Anonymous Coward on Thursday February 05, 2004 @10:12PM (#8197314)
    I used VAX, PDP11 and 68K assembly, and they are actually quite easy to use, because they are orthogonal: all the operands of all the instructions use the same addressing modes, and there are very few exceptions that you need to remember. And the addressing modes themselves are quite obvious and sensible: each meets a real need. The fact that any operand could be in memory was not a problem.

    It was a real shock when this new IBM PC thingy came along and I started dipping into x86 ASM. NOT orthogonal. Not all instructions could use each addressing mode. Downright ugly.

    Moving huge amounts of data is not unique to the VAX et al. Even the x86 can do it. And the page faults, etc., add no complexity at all because the memory management hardware is at a lower layer, and thus invisible to the assembly language programmer.

    --An old shell-back
  • by Daytona955i ( 448665 ) <flynnguy24@@@yahoo...com> on Thursday February 05, 2004 @10:13PM (#8197320)
    That all depends on what you are doing... if you are doing it for fun then yes, I agree with you... however, if you are a programmer who picked up "Learn C++ in 24 Hours" and now call yourself a coder, you have a lot to learn, and x86 asm might be the place to start.
  • Re:MIXAL (Score:3, Informative)

    by bunratty ( 545641 ) on Thursday February 05, 2004 @10:22PM (#8197390)
    You might be interested to know that MMIX [stanford.edu] is the new 64-bit RISC processor version of MIX. There's even an assembler and simulator available so you can run code, and gcc even generates code for the processor.
  • by revery ( 456516 ) * <charles@NoSpam.cac2.net> on Thursday February 05, 2004 @10:30PM (#8197453) Homepage
    Sorry if this has already been mentioned, but the book is available for download from this [nongnu.org] site, under the filelist link. Here is a direct link [nongnu.org] to the pdf.

    --

    Was it the sheep climbing onto the altar, or the cattle lowing to be slain,
    or the Son of God hanging dead and bloodied on a cross that told me this was a world condemned, but loved and bought with blood.
  • Just to clarify (Score:5, Informative)

    by Raul654 ( 453029 ) on Thursday February 05, 2004 @11:41PM (#8197903) Homepage
    The correct answers are down there, but just to collect them and clarify - you can build anything using nothing but NANDs. Alternatively, you can build anything using nothing but NORs. You can prove this easily using De Morgan's theorem.

    However, in the real world, NANDs are cheap (a handful of transistors), so that's what everyone uses.
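
    If you want to convince yourself of the NAND-universality claim without a breadboard, here is a small C sketch (function names are just illustrative) in which every gate is built only from NAND; running it prints the full truth table:

        /* nand_universality.c */
        #include <stdio.h>

        static unsigned nand(unsigned a, unsigned b) { return ~(a & b) & 1u; }
        static unsigned not_(unsigned a)             { return nand(a, a); }
        static unsigned and_(unsigned a, unsigned b) { return not_(nand(a, b)); }
        static unsigned or_(unsigned a, unsigned b)  { return nand(not_(a), not_(b)); }
        static unsigned xor_(unsigned a, unsigned b)
        {
            unsigned n = nand(a, b);                 /* classic 4-NAND XOR */
            return nand(nand(a, n), nand(b, n));
        }

        int main(void)
        {
            for (unsigned a = 0; a < 2; a++)
                for (unsigned b = 0; b < 2; b++)
                    printf("a=%u b=%u  NOT a=%u  AND=%u  OR=%u  XOR=%u\n",
                           a, b, not_(a), and_(a, b), or_(a, b), xor_(a, b));
            return 0;
        }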
  • by RallyDriver ( 49641 ) on Friday February 06, 2004 @12:44AM (#8198291) Homepage
    Speaking from some experience (CS undergrad TA while in grad school)....

    A few thoughts:

    It's essential to teach some assembly at some point in a CS undergrad - A CS course should give full insight into the workings of a real CPU, and should give as wide a variety as possible.

    At Edinburgh [ed.ac.uk] the first year CS course included assembly, C, and ... wait for it ... PostScript. PS sounds wacky but it's the only stack based language widely used on modern computers (APL and Forth have died out).

    When I was a CS undergrad we had practical classes in no fewer than 17 languages, covering the range of imperative, declarative, functional and stack based, plus specialist toys like theorem provers and SQL.

    The best starting point for a university level course is the good old procedural language - in my day it was Pascal, C++ and Modula-3, these days I'd use Java (and many CS departments do).

    Also, when you do get to assembler, I don't think using a real assembler is the best teaching tool - assemblers are intended for developing real low level code, or as back end targets for compilers. For teaching at Edinburgh, we used an X11 based tool called xspim which simulated a MIPS R2000 (we actually ran it on Sun Sparc-II's, not that it matters), and it let you single step and examine registers without the complexity of adding a debugger, and had a window where you could see the registers, CPU pipeline etc. displayed.

    For introducing programming concepts to a younger audience I think an interpreted language which will execute command lines, allowing them to experiment while avoiding the edit-compile-run cycle, is very important. Some are better than others; when I was a kid the 8 bit micros (Apple, Commodore, Atari, ...) had BASIC interpreters in ROM, and they were mostly OK, though the only one with a really good BASIC language (proper procedures, not GOSUB) was the Acorn BBC [nvg.ntnu.no].

    I don't like Pilot or Comal for teaching (failed experiments of the 1980's) but I think LOGO [mit.edu] is a very commendable way to make concepts accessible to the young.

    A perhaps unexpected place I was made to learn with an interpreted environment was as an undergrad at Cambridge University [cam.ac.uk], where the first programming language taught is ML [ed.ac.uk], which, for the CS people who haven't heard of it, is an implementation of lambda calculus with a sane syntax.
  • by tjb ( 226873 ) on Friday February 06, 2004 @12:51AM (#8198327)
    That's not really true - Windows, for instance, runs in protected mode, where the chip translates every memory access through page tables that the OS (or the old DOS memory managers, if you remember them) has set up. This way each application sees a flat memory model and gets its own little memory sandbox - in theory, I shouldn't be able to corrupt the memory of another program unless I have both cunning and malicious intent.

    Now, what this means is that if I write a Win32 application-layer program that attempts to directly modify, say, the interrupt vector table, I'm not writing to the real interrupt vector table that the hardware uses when it receives physical interrupt signals, I'm writing to a version of it in my slice of memory, and what happens next is anybody's guess.

    In other words, access to memory-mapped system resources like screen buffers and interrupt vectors is arbitrated by the OS memory manager. What I want to do may work, it may not. I imagine Linux is more friendly in this regard, but I really don't know. DOS, on the other hand, was anarchy - there was no memory manager (unless you intentionally ran one with your application) so all requests from the application layer were carried out without any intervention.
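
    A minimal illustration of the point (file name is made up; on a typical protected-mode OS the attempt below never reaches the hardware's real vector table -- expect the process to be killed with a segmentation fault or access violation):

        /* ivt_poke.c -- illustrative only; do not expect it to "work" */
        int main(void)
        {
            /* The real-mode interrupt vector table lives at physical address 0,
               but in a protected-mode process this is just an unmapped virtual
               address, so the store below faults instead of touching hardware. */
            volatile unsigned long *ivt = (unsigned long *)0x0;
            ivt[8] = 0xdeadbeef;   /* try to overwrite the timer vector */
            return 0;
        }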

    Tim
  • That's an old link - outdated version. Go to

    http://savannah.nongnu.org/projects/pgubook/
  • by efti ( 568624 ) on Friday February 06, 2004 @01:28AM (#8198501)
    What, you don't build your own processors? What fun is that?

    Chip development, and particularly processor development, is a hobby few people can afford. But you can roll your own CPU on a budget -- use an FPGA (Field-Programmable Gate Array), where you can program the connections between the components. And there are people who do this kind of thing. You can find some of their work here [opencores.org]

  • by ashot ( 599110 ) <ashot@noSpAm.molsoft.com> on Friday February 06, 2004 @01:44AM (#8198582) Homepage
    This strategy generally works better for Computer Engineers rather than Computer Scientists, because what you are really learning with assembly tends to be how the hardware functions. In fact, if you are going to learn assembly, you could spend just another month and learn the basics of computer architecture. I feel like that is what really helped me de-mystify the computing process: the ability to trace back all the way to elementary physical processes, and see the computer as almost a physical entity rather than a magical black box.

    For an example of this, you can see Patt and Patel's "Introduction to Computing Systems: From bits and gates, to C and beyond." I took, and later TA'd, a course that uses this book.
  • by rblancarte ( 213492 ) on Friday February 06, 2004 @01:55AM (#8198653) Homepage
    If you continue on the curriculum, CS 352 - Computer Architecture also teaches assembly. Depending on the section, you will either learn some IA32 or MIPS. I was also an EE undergrad about 10 years ago; back then we were also taught 68000 assembly.

    LC-2 assembly is not bad, but that class is just a light class compared to what you will learn in 352. I would say pay close attention in that class and EE 316, very useful for 352.
  • by snStarter ( 212765 ) on Friday February 06, 2004 @02:04AM (#8198707)
    No one would really call the PDP-11 a CISC machine. You might call it a RISC VAX however (pause for audience laughter).

    Also, many PDP-11's were random logic and not micro-coded. The later 11's were microcoded, of course, the 11/60 being the extreme because it had a writeable control store that let you define your own micro-coded instructions.

    It's important to remember that the entire RT-11 operating system was written entirely in MACRO-11 by some amazing software engineers who knew the PDP-11 instruction set inside and out. The result was an operating system that ran very nicely in a 4K word footprint.

    The VAX had a terrific compiler, BLISS-32, which created amazingly efficient code; code no human being would ever create, but fantastic nonetheless.

  • by Weird O'Puns ( 749505 ) on Friday February 06, 2004 @02:07AM (#8198723)
    You don't have to buy the book. You can download it from here [nongnu.org] for free in pdf format.
  • by johnnyb ( 4816 ) <jonathan@bartlettpublishing.com> on Friday February 06, 2004 @02:12AM (#8198753) Homepage
    "I don't like dishonesty in the submission process."

    What's dishonest? It would have been dishonest had I registered a new account to make the submission. However, the way you know that I was the author is precisely because I _did not_ resort to dishonest tactics. I simply wrote in the third person, which is exactly how I wrote the back cover text for the book, the press releases, etc.

    I don't even spamguard my email address, because I want people to know who I am and be able to reach me easily.
  • by Anonymous Coward on Friday February 06, 2004 @02:33AM (#8198856)
    You missed the point.

    This:

    (But understanding how assembly instruction sets work, and how compilers work, are both useful for writing better code at the compiler level. Less so now that optimizers are really good - but the understanding is still helpful.)

    Was the exact point of the post you spent forever "refuting". He didn't say software written in assembler was better, he said people who knew assembler produce better code in high level languages. Maybe you ought to spend more time learning the English language, I've heard it's a useful skill to have.
  • in reality... (Score:5, Informative)

    by slew ( 2918 ) on Friday February 06, 2004 @03:51AM (#8199100)
    For what it's worth, they don't use just NANDs in cmos chip design in the real world. The primary primitive is the AND-OR-INVERT (AOI) structure.

    In the cmos world, pass-gates are much cheaper than amplifying gates (in the size vs speed vs power tradeoff), although you can't put too many pass gates in a row (signal degradation). So in fact MUX (multiplexor to pass one of the two inputs using the control of a third) and XORS (use input A to pass either !B or B) are used quite a bit.

    Some background might be helpful to think about the more complicated AOI structure, though...

    In a cmos NAND-gate, the pull-up side is two p-type pass gates in parallel from the output to Vdd (the positive rail) so that if either of the two p-type gates is low, the output is pulled high. For the pull-down side, two n-type pass gates are in series to ground, so both n-type gates have to be high before the output is pulled to ground. This gives us a total of 4 transistors for a cmos-nand where the longest pass gate depth is 2 (the pull-down). The pull-up is restricted to be the complement function of the pull-down in CMOS (otherwise either the pull-up and pull-down will fight, or nobody will pull, causing the output to float and/or oscillate).

    A 2-input NOR gate has the p-type in series and the n-type in parallel (for the same # of transistors).

    Due to a quirk of semiconductor technology, n-type transistors are easier to make more powerful than p-type, so a NAND is often slightly faster than a NOR (the two series n-types in a NAND gate are better at pulling down than the two series p-types are at pulling up in a NOR gate). However, this isn't the end of the story...

    Notice that you can build a 3-input NAND by just adding more p-type transistors in parallel to the pull-up and more n-type in series to the pull-down. You can make even more complicated logic by putting the pull-up and pull-down transistor in combinations of series and parallel configurations. The most interesting cmos configurations are called AOI (and-or-invert) since they are the ones you can make with simple parallel chains of pass transistors in series for pull-up and pull-down.

    For most cmos semiconductor technologies, you are limited to about 4 pass gates in series or parallel before the noise margin starts to kill you and you need to stop using pass gates and just start a new amplifying "gate". Thus most chips are designed to use 4-input AOI gates where possible and smaller gates to finish out the logic implementation.

    Thus "everyone" really uses lots of different types of gates (including simple NAND and XORS as well as more complicated AOI).
  • Re:Question (Score:2, Informative)

    by craigtay ( 638170 ) on Friday February 06, 2004 @04:07AM (#8199141) Journal
    Switching a task in an OS is called context switching. This means that the OS saves all of the registers used by the application, and then loads up the saved registers of the task you are switching to. When the next context switch occurs, that task's registers are saved in turn and another task's saved registers are reloaded.

    If you go to www.google.com and look up context switching you should be able to find all the information you'll ever need!
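
    You can play with the save-registers/load-registers idea entirely in user space. Here is a small sketch using the POSIX ucontext API (cooperative switching between two contexts -- not a real kernel scheduler, but the mechanism is the same idea; the file name is just illustrative):

        /* switch_demo.c -- compile with gcc on Linux */
        #include <stdio.h>
        #include <ucontext.h>

        static ucontext_t main_ctx, task_ctx;

        static void task(void)
        {
            puts("task: running with its own saved register state");
            swapcontext(&task_ctx, &main_ctx);   /* save ours, resume main */
            puts("task: resumed exactly where it left off");
        }

        int main(void)
        {
            static char stack[64 * 1024];        /* stack for the second context */

            getcontext(&task_ctx);
            task_ctx.uc_stack.ss_sp   = stack;
            task_ctx.uc_stack.ss_size = sizeof stack;
            task_ctx.uc_link          = &main_ctx;   /* where to go when task() returns */
            makecontext(&task_ctx, task, 0);

            puts("main: switching to task");
            swapcontext(&main_ctx, &task_ctx);   /* save main's registers, load task's */
            puts("main: back in main");
            swapcontext(&main_ctx, &task_ctx);   /* let task finish */
            puts("main: done");
            return 0;
        }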
  • Re:Not the point! (Score:2, Informative)

    by weg ( 196564 ) on Friday February 06, 2004 @04:48AM (#8199215)
    All logical formulae can be expressed in terms of negation and either disjunction or conjunction... construction of logical negation from a nand gate is trivial. Why should XOR be the champion??
  • by zeath ( 624023 ) on Friday February 06, 2004 @04:55AM (#8199228) Homepage
    I think the point being made is that assembly, as you said, offers the programmer the ability to understand the hardware. Through that understanding you can better optimize your code. For example, by writing assembly I know how obscenely difficult it is, relatively speaking of course, for bytes to be thrown on and off the stack. Using that knowledge, along with other basic understandings of how the assembly-level code works, I can then make a better judgement as to whether I should force a function to be inlined or not.
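
    A tiny sketch of that trade-off (file and function names are illustrative): the "static inline" hint below lets the compiler paste the body straight into the caller, avoiding the call/return and argument set-up that would otherwise go through the stack. Comparing "gcc -O2 -S" output with and without an __attribute__((noinline)) on square() makes the difference visible.

        /* inline_hint.c */
        #include <stdio.h>

        static inline int square(int x) { return x * x; }

        int sum_of_squares(int a, int b) { return square(a) + square(b); }

        int main(void)
        {
            printf("%d\n", sum_of_squares(3, 4));   /* prints 25 */
            return 0;
        }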

    In the larger scheme of things, assembly won't permit you to write elaborate user interfaces or hardware-intensive functionality, but it will grant you a better understanding of the code you're writing in the higher level languages for these purposes.

    I know that a lot more of what I was doing made sense when I finally got to a low-level programming course in my junior year. Even while in that class I made frequent comments that the computer architecture course from the year before would have made a lot more sense if we had the low-level class first.

    IMHO this is an excellent concept, and would breed a new set of knowledgeable programmers who are not only able to program efficiently but know why their programs are efficient.
  • by Dahan ( 130247 ) <khym@azeotrope.org> on Friday February 06, 2004 @06:46AM (#8199660)
    So where's the misspelling? Or perhaps it is you who needs to learn how to spell [m-w.com]?
  • by nameer ( 706715 ) on Friday February 06, 2004 @11:16AM (#8201293)
    Ladder Logic!

    I've built machine control cabinets with all of the logic done as relay circuits. Many control-reliable circuits (such as "emergency stop") are still done using relays (albeit solid-state ones). Although PLCs have for the most part replaced relay logic, they are still programmed by the shop-floor electricians using a UI that looks like a relay schematic.

    It's kind of a kick to think through a control problem in ladder logic, build it, and then listen to the clicking as it works through.

  • by RagManX ( 258563 ) <ragmanx@@@gamerdemos...com> on Friday February 06, 2004 @11:18AM (#8201310) Homepage Journal
    As others have pointed out, learning assembly is a way to become more familiar with the low-level stuff that can in turn help you with the high-level stuff. I learned assembly in college (Vax assembly, in fact, hehehe), and put it to use in a later job where we were using Visual Basic (version 3 at the time, on 486 and Pentium class boxen).

    My boss had written a routine for dealing with user input that allowed a user to just start typing from any field on the main input screen and the cursor would go automagically to the text input field and start a search based on the typed text. Today, that would seem trivially easy, but at the time, not many programs were doing this. The problem, though, was his handling of the typing was horribly inefficient. I could guess what was going on behind the VB code, because I know assembly and some compiler construction theory. I was able to improve the performance of his code by 3 orders of magnitude. Since the function worked fine on his system (a then top-of-the-line Pentium), he couldn't understand why I spent time optimizing the routine. For our customers, though, many of whom were using 486s, this made a huge difference. Under his code, a moderately skilled typist could out-type his routine, and the letters would show up in a different order than typed (due to his poor coding and an interaction with how VB handled execution among several routines that got called when the cursor skipped up to the text input field). Under my routine, we could never out-type it, and customer calls about the function not working were eliminated. Since those calls alone made up over 5% of our help-desk calls about that product, that's a significant savings.

    And that was all from knowing enough assembly and compiler construction to intuit how VB was handling the code, and using the info to improve it. I'm not good at assembly, but I know enough to help me optimize my coding in many cases. I've done plenty of stuff like the above (but usually not as significant an improvement, because really, someone has to write some pretty poor code to allow another user to tweak that much), and others who know assembly but work at a higher level probably have similar tales.

    RagManX
