Learning Computer Science via Assembly Language
johnnyb writes "
A new book was just released which is based on a new concept - teaching computer science through assembly language (Linux x86 assembly language, to be exact). This book teaches how the machine itself operates, rather than just the language. I've found that the key difference between mediocre and excellent programmers is whether or not they know assembly language. Those that do tend to understand computers themselves at a much deeper level.
Although rarely seen today, this concept isn't really all that new -- in years past there often wasn't much choice. Apple computers came with only BASIC and assembly language, and there were books on assembly language written for kids.
This is why the old-timers are often viewed as 'wizards': they had to know assembly language programming. Perhaps this current obsession with learning using 'easy' languages is the wrong way to do things. High-level languages are great, but learning them will never teach you about computers. Perhaps it's time that computer science curriculums start teaching assembly language first."
Knuth (Score:5, Informative)
Re:Linux x86 assembly? (Score:5, Informative)
Not new (Score:3, Informative)
Your book? (Score:5, Informative)
What you meant to say was that your new book has just been released. If you're going to pimp your wares on Slashdot, at least put an appropriate disclaimer on. That said, I completely agree with the premise of the book. I've met a lot of mediocre programmers, and a few good ones. But I've never yet met a real star that didn't have some background in assembly language programming. Personally, I haven't written anything in assembly in well over a decade. But the fact that I can do so if needed makes me a better programmer, and I'd recommend it to any aspiring coder as a key skill to learn. I wouldn't say IA32 is a particularly nice introduction (I'd start with a cleaner, simpler architecture, such as the 6502), but it is at least widely available to anyone who wants to study it...
Re:Not So New Concept (Score:3, Informative)
While I admit that it helps you understand the device more, I have to say it's much less intuitive and enjoyable than high-level programming (not that I'm the type to find scripting fun, but you know what I mean).
Available under GNU FDL (Score:5, Informative)
http://savannah.nongnu.org/projects/pgubook/ [nongnu.org]
It's also being used at Princeton [princeton.edu]
assembly? bah - real men program with punchcards (Score:3, Informative)
For those of you insane enough to take the plunge, check out this FREE online introduction course [ccsu.edu] (no reg, don't ya love it). The guy who wrote it is pretty wacky. I took his java introductory course [ctstateu.edu] and it was hip as well as very educational.
Disclosure: The submitter is the Author. (Score:5, Informative)
Re:Good idea, Bad Idea (Score:3, Informative)
Am I that old? (Score:1, Informative)
I know we had one course on computer hardware and assembly. There were at least one or two others that touched on the subject. It wasn't much but it was something.
Knowing assembler doesn't help me too much at work, where I am forced to write in VB6. It does help a lot with my projects at home that use microcontrollers. In that environment, even if you can use C, you still need to know a lot about the hardware you are working with.
Loughborough Uni UK (Score:1, Informative)
Only straight Computer Science learnt assembler, however. Computer Science with Business were learning VB, and didn't even look at C or Java, and I think that is wrong! Do you?
Jimbob
Buffer Overflows Explained (Score:1, Informative)
MIXAL (Score:3, Informative)
Re:Not So New Concept (Score:5, Informative)
Re:No, but... (Score:5, Informative)
Try looking at the asm output from GCC at -O2 on those two statements.
Knuth had reasons for using ASM that were a lot better than that. It does give you a better idea of how things are laid out in memory, because you have to do it yourself. It's easier to do detailed performance analysis of algorithms, because you can get exact cycle counts. (Which in turn helps train your intuition, and teaches you how to read a CPU's instruction set to see how well it handles various operations when tuning algorithms.) You can look at how cache affects things.
Take a look at his reasons [stanford.edu].
Re:Linux x86 assembly? (Score:5, Informative)
So in truth, the kernel is the car. Asm can be the road, it can be the engine, it can be the passengers, it can be the wind resistance, it can be virtually any component. But nonetheless, if you're writing an application sitting on top of the kernel you are going to need to speak to the kernel's API at some point (or the API of a layer sitting on top of it), just as if you're writing a Windows application in asm, C, or VB, you need to be speaking to the Win32 API.
Asm is no different than any other language: knowing the language is great and all, but it's worthless without learning the proper APIs you'll need to actually write a program that does something. That's a major flaw in most programming tutorials. They'll teach C or another language and not mention a single word about the APIs one needs to know to actually write a program that does more than calculate pi.
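To make that concrete, here's a minimal sketch (NASM-style syntax, 32-bit x86 Linux, using the standard sys_write=4 / sys_exit=1 syscall numbers) of an asm program whose *only* way of doing anything visible is calling the kernel's API:

```asm
; Minimal 32-bit Linux program: everything it "does" goes
; through the kernel's API via int 0x80.
section .data
msg:    db  "hello", 10       ; the string, plus a newline
len:    equ $ - msg

section .text
global _start
_start:
        mov eax, 4            ; syscall 4 = sys_write
        mov ebx, 1            ; file descriptor 1 = stdout
        mov ecx, msg          ; pointer to the buffer
        mov edx, len          ; number of bytes
        int 0x80              ; trap into the kernel
        mov eax, 1            ; syscall 1 = sys_exit
        xor ebx, ebx          ; exit status 0
        int 0x80
```

Exactly the same instructions on Windows would mean nothing, because those syscall numbers are the Linux kernel's interface, not the machine's.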
Implementation specific vs. generic... (Score:5, Informative)
For example, a b-tree data structure is fundamentally the same thing whether you implement it in 32-bit ARM assembly language or 16-bit x86 assembly language or C or Java.
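As a smaller illustration of the same point, here's a plain binary search tree (a simpler cousin of the b-tree, not a multiway B-tree) sketched in C; the structure and the insert/search logic would look essentially identical in Java, and line for line (just more verbosely) in assembly:

```c
#include <stdlib.h>

/* A plain binary search tree -- the shape of the data structure is
   the same no matter which language you build it in. */
struct node { int key; struct node *left, *right; };

/* Insert key, returning the (possibly new) subtree root.
   Error handling omitted for brevity. */
static struct node *insert(struct node *t, int key) {
    if (t == NULL) {
        t = malloc(sizeof *t);
        t->key = key;
        t->left = t->right = NULL;
    } else if (key < t->key) {
        t->left = insert(t->left, key);
    } else if (key > t->key) {
        t->right = insert(t->right, key);
    }
    return t;
}

/* Iterative search: walk left or right until found or NULL. */
static int contains(const struct node *t, int key) {
    while (t) {
        if (key == t->key) return 1;
        t = (key < t->key) ? t->left : t->right;
    }
    return 0;
}
```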
To understand how assembly language works, you need to understand how a processor works, how instruction decoding works, how register transfer language works, and how clocking a processor makes it accomplish things. To understand how registers hold values electrically and transfer values between registers, you need to understand some physics and electronics.
To understand how a compiler takes a source language and translates it into a target language, you need to understand a little about the kinds of languages computers can understand (Context-Free Languages) and how they can parse them (Context-Free Grammars). Delving into that field will lead to the core theory of computer science, what is possible with machines in general and what is impossible.
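The parsing idea can be sketched very compactly. Here's a toy recursive-descent recognizer (my own illustrative example, not from any particular compiler text) for the little grammar `expr := term ('+' term)*`, `term := digit`:

```c
#include <ctype.h>

/* Cursor into the input string being parsed. */
static const char *p;

/* term := a single decimal digit */
static int term(void) {
    if (!isdigit((unsigned char)*p)) return 0;
    p++;
    return 1;
}

/* expr := term ('+' term)* -- one production of a context-free grammar,
   implemented directly as a function. */
static int expr(void) {
    if (!term()) return 0;
    while (*p == '+') {
        p++;
        if (!term()) return 0;
    }
    return 1;
}

/* Accept the string only if expr consumes all of it. */
static int parse(const char *s) {
    p = s;
    return expr() && *p == '\0';
}
```

Each grammar rule becomes a function; that correspondence between Context-Free Grammars and code is the gateway into the theory the parent describes.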
A real computer science program at a university will take you through all of these subjects over several years, allowing for some excursions into other things like databases and cryptography. A real computer science program is always theory with projects that are applied to actual implementations.
Re:Syntax, OS interfaces... (Score:5, Informative)
Incorrect. There are at least four different assemblers and standards:
AS - GNU Assembler (GAS). AT&T standard, as commonly used on Linux. The syntax hasn't changed since the 60's -- which is both very good and very bad. I personally think it should be retired.
MASM - Microsoft Assembler. Intel standard assembly. The syntax is nice, but there are some ambiguous operators (does [] mean an address or the value at that address? The meaning changes depending on the context). This is typically what the commercial Windows world uses. MASM itself is mostly obsolete -- the Visual C compiler can now do everything that it could and supports all modern CPU instructions (even on Visual C++ 6 if you install the latest CPU pack).
NASM - Netwide Assembler. An assembler that set out to put right all the things that were wrong with MASM. The syntax is excellent, ambiguous operators are cleared up, documentation is also excellent, it interoperates beautifully with Visual C on Windows and GNU C on Linux. Ideally NASM would replace AS as the standard now that it's open source.
TASM - Borland Turbo Assembler. Based around the Intel standards, but does things slightly differently. Has extensions which allow for easy object-oriented assembly programming - which can make for some very nice code. Had a MASM compatibility mode, but nobody in their right mind used that if they could help it. I had version 5, but I don't believe they've kept it up to date, so it's obsolete now.
There are a couple of others as well, most notably AS86 (which was the leading independent solution for writing assembler back in the DOS days).
Assembly is an impediment (Score:3, Informative)
As a starting language it's really a matter of preference: the bottom-up method gets you really understanding the machine (and really yearning for more convenient tools), but it is a slow, painful start. The things you learn are less general and will be less applicable years down the line. On the other hand, learning a high level language can leave you in the dark about what's going on under the hood, which means that some aspects of what you're doing will seem like "magic". For you efficiency addicts, this can mean less efficient code. Then again, the world will have fewer efficiency addicts!
However, I think that assembly will turn off a lot of people who could otherwise be interested in the subject and perhaps become productive programmers, if not cowboy kernel hackers.
Re:Linux x86 assembly? (Score:3, Informative)
Given that most Linux-based assemblers use AT&T syntax and most other x86 assemblers use Intel syntax, yes. This page [codeproject.com] summarizes some of the most significant differences (operand order, operand prefixes, etc.).
There's also the small matter of talking to the host OS. The difference between x86 assembly coding on Linux and x86 assembly coding on Win32 is comparable to the difference between 6502 assembly coding on the Apple II and 6502 assembly coding on the Commodore 64. Just as JSR $FDED does one thing on the Apple II (print the character in the accumulator) and something completely different on the Commodore (crash?), making Linux system calls on Windows is not likely to work too well.
The Art of Assembly Language Programming (Score:2, Informative)
It is not the language, it is the paradigm. (Score:3, Informative)
1. Procedural (Pascal/C/BASIC)
2. Object-Oriented (Eiffel/Smalltalk/Java/C++)
3. Functional (Scheme/Lisp/Logo)
4. Declarative (Prolog/Forth)
5. Assembly (x86 etc.)
Re:Programming or CompSci (Score:2, Informative)
Jeff
who thinks that you should ask google
Re:It is not the language, it is the paradigm. (Score:5, Informative)
1. Imperative
-- 1a. Procedural (Pascal/C/BASIC)
-- 1b. Object-Oriented (Eiffel/Smalltalk/Java/C++)
-- 1c. Assembly language
2. Functional-Type
-- 2a. Pseudo-functional (Scheme/Lisp)
-- 2b. Pure functional (Haskell/ML/Pure lambda calculus)
3. Declarative (Prolog)
Imperative languages are based on the execution of individual commands. Fundamentally they are based on the concept of assignment -- moving data from one place to another. Functional languages are based on the evaluation of expressions and the absence of side-effects. Pseudo-functional languages have variables, loops, and side-effects but are mainly based on functional concepts. Declarative languages are based on the concept of goals, and the recursive description of how those goals should be achieved, or the definition of what constitutes achievement of the goals.
I'm not sure why you consider Forth a declarative language. To me it seems more like an imperative language with an unusual syntax.
Re:Syntax, OS interfaces... (Score:5, Informative)
x86 instructions that deal with 2 data points can be written 2 ways:
instr src,dest
instr dest,src
The Intel standard (used by NASM, TASM, MASM) is dest,src. The AT&T standard (used by GAS) is src,dest.
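As an illustration (Intel forms in NASM style, AT&T forms in GAS style, 32-bit registers), the same three instructions look like this in each syntax:

```asm
; Intel syntax (NASM/MASM/TASM): destination first
mov  eax, 5          ; eax := 5
add  eax, ebx        ; eax := eax + ebx
mov  eax, [esi]      ; load eax from the address in esi

# AT&T syntax (GAS): source first, % on registers, $ on immediates
movl $5, %eax        # eax := 5
addl %ebx, %eax      # eax := eax + ebx
movl (%esi), %eax    # load eax from the address in esi
```

Note the extra differences beyond operand order: operand-size suffixes (`movl`), register and immediate prefixes, and the memory-reference notation.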
Re:Linux x86 assembly? (Score:5, Informative)
I've worked before with programmers who had little experience in programming 'bare hardware' -- they do really foolish things like not initializing timers, not setting up stack pointers, and the like.
Writing bare ASM code for a processor (where it boots up out of your own EPROM or on an emulator) is good experience in minimalism. It can give you a good feeling when the project is all done and you can say you did it all yourself.
For those interested in getting into this kind of thing, start with a PIC embedded controller and a cheap programmer. You can get PIC assembly language tools for free, and build a programmer, or buy a kit for one, that plugs into your serial or parallel port. Your first PIC machine can be the CPU, a clock crystal, a few resistors and capacitors, and the LED you want to blink, or whatever else intrigues you. If you're not into complex soldering, and/or layout and complex schematics, you can buy pre-etched boards you just plug the PIC into.
Another easy-start processor would be the 68HC11. It has a bootstrap built into ROM. Basically, you can jumper the chip so it wakes up listening on the serial port for code you send down the wire at it, and burns it into the EEPROM memory in the 'HC11 chip itself. Move the jumper and reboot the chip, and it's running your code.
I think this is far more interesting than just writing apps that run on an operating system you didn't roll yourself.
ASM is not the place to start. (Score:5, Informative)
Having taught an assembly/intro computer architecture class, I agree with the sentiment that students who get "under the hood" gain valuable knowledge and working skills. Not just pounding out ASM, but learning how the machine works. Point agreed.
Also having taught first year computer science students, and seen how some of academia's transitions in pedagogy affected students... I have to say that the idea of teaching first year students in assembly is friggin' daft.
My reasoning is the same as why I strongly advocated an objects-first teaching model. It is increasingly critical for students to build a strong sense of software design and abstraction early on. This foundation makes students much better prepared to solve problems of many different scales (asm to component-systems) in the long run.
There's evidence from a paper in one of the Empirical Studies of Programmers workshops that this approach does trade off design skills for purely algorithmic reasoning for students at the end of their first year. But my own experience, as well as that of some prominent Comp Sci Education (CSE) folks seems to indicate that this is far more than compensated for as a student's skills grow.
Here's my theory as to why this is the case:
The details of debugging, algorithmic thinking, and problem solving are very much skill-building exercises that really require exposure over time to improve. But it is much more difficult, in my experience, for students to build good design sense on their own. Once the framework for thinking in terms of good abstractions is laid down, it provides much stronger support for later filling in all of those gory low-level details.
Historical perspective: Ironically, this same reasoning is much of why I believe that academia's switch to C++ from languages like Pascal, Modula-2, etc. was an educational disaster for many years. The astute reader is now thinking: "hey, you just said you like objects-first; what up?" In the Procedural Era, many schools wouldn't expose students to C in the first year, as it had too many pitfalls that distracted from learning the basics of algorithmic thinking and important abstraction skills. Once the foundation was put in place, it was okay to switch 'em to C for the rest of the program.
When C++ and the early object boom really hit, this put big pressure on schools to teach first year students using C++. At one point in the mid-90's, upwards of 75% of 4-year institutions were teaching their first year in C++. Thus a language with plenty more pitfalls than C, previously shunned for its pedagogical failings, entered the classroom. Combined with a lack of proper OO mental retooling on the part of first year instructors and faculty, this made for something of a skills disaster on a broad scale. At best, students learned "Modula-C" instead of good OO style. At worst, they were so confused by this melange of one-instance classes and sloppy hybrid typing that they didn't get a cohesive foundation whatsoever.
Re:Not the point! (Score:5, Informative)
Tom
Download the book for free (Score:3, Informative)
Re:Forget Computer Science! (Score:2, Informative)
Computer Engineering Perhaps? (Score:2, Informative)
In the old days, computer scientists didn't really exist. You basically had groups of electrical engineers, mathematicians, etc, developing what is today computer science.
As a graduate of an accredited Computer Engineering curriculum, my take is this: computer scientists develop software, algorithms, etc. Computer engineers design the underlying digital circuits, logic, and such. Software guys vs. hardware guys.
As such, you'll find computer engineers use assembly a heck of a lot more than computer scientists. I've worked with MIPS, x86, motorola's, and several others. And when you get down to it, I like to work with C more than I do with languages like C++ or Java. I enjoy the low-level nitty-gritty.
I'm making a generalization of course; there's no great schism between the two groups and our work often overlaps. We just each use the tools most appropriate for our jobs.
Re:MIXAL (Score:3, Informative)
Writing your own MIX machine is an interesting exercise. I remember I finished the instruction set but never got round to actually implementing MIXAL (the assembly language). Which is embarrassing, since Knuth practically gives you the recipe.
Re:Your book? (Score:2, Informative)
I'll download it for free [nongnu.org] myself.
This book (Score:5, Informative)
Re:Somewhere in the middle... (Score:5, Informative)
A recent article devoted to the *macho* side of programming
made the bald and unvarnished statement:
Real Programmers write in FORTRAN.
Maybe they do now,
in this decadent era of
Lite beer, hand calculators, and "user-friendly" software
but back in the Good Old Days,
when the term "software" sounded funny
and Real Computers were made out of drums and vacuum tubes,
Real Programmers wrote in machine code.
Not FORTRAN. Not RATFOR. Not, even, assembly language.
Machine Code.
Raw, unadorned, inscrutable hexadecimal numbers.
Lest a whole new generation of programmers
grow up in ignorance of this glorious past,
I feel duty-bound to describe,
as best I can through the generation gap,
how a Real Programmer wrote code.
I'll call him Mel,
because that was his name.
I first met Mel when I went to work for Royal McBee Computer Corp.,
a now-defunct subsidiary of the typewriter company.
The firm manufactured the LGP-30,
a small, cheap (by the standards of the day)
drum-memory computer,
and had just started to manufacture
the RPC-4000, a much-improved,
bigger, better, faster --- drum-memory computer.
Cores cost too much,
and weren't here to stay, anyway.
(That's why you haven't heard of the company, or the computer.)
I had been hired to write a FORTRAN compiler.
Mel didn't approve of compilers.
"If a program can't rewrite its own code",
he asked, "what good is it?"
Mel had written,
in hexadecimal,
the most popular computer program the company owned.
It ran on the LGP-30
and played blackjack with potential customers
at computer shows.
Its effect was always dramatic.
The LGP-30 booth was packed at every show,
and the IBM salesmen stood around
talking to each other.
Whether or not this actually sold computers
was a question we never discussed.
Mel's job was to re-write
the blackjack program for the RPC-4000.
(Port? What does that mean?)
The new computer had a one-plus-one
addressing scheme,
in which each machine instruction,
in addition to the operation code
and the address of the needed operand,
had a second address that indicated where, on the revolving drum,
the next instruction was located.
In modern parlance,
every single instruction was followed by a GO TO!
Put *that* in Pascal's pipe and smoke it.
Mel loved the RPC-4000
because he could optimize his code:
that is, locate instructions on the drum
so that just as one finished its job,
the next would be just arriving at the "read head"
and available for immediate execution.
There was a program to do that job,
an "optimizing assembler",
but Mel refused to use it.
"You never know where it's going to put things",
he explained, "so you'd have to use separate constants".
It was a long time before I understood that remark.
Since Mel knew the numerical value
of every operation code,
and assigned his own drum addresses,
every instruction he wrote could also be considered
a numerical constant.
He could pick up an earlier "add" instruction, say,
and multiply by it,
if it had the right numeric value.
His code was not easy for someone else to modify.
I compared Mel's hand-optimized programs
with the same code massaged by the optimizing assembler program,
and Mel's always ran faster.
That was because the "top-down" method of program design
hadn't been invented yet,
and Mel wouldn't have used it anyway.
He wrote the innermost parts of his program loops first,
so they would get first choice
of the optimum address locations on the drum.
Re:Actually, they DON'T. (Score:2, Informative)
My understanding of the parent post was that this is exactly what he was saying. I don't think he was claiming that programs written in assembly were better, but that programmers who knew assembly were better programmers.
I think you were agreeing with him.
Re:Actually, they DON'T. (Score:2, Informative)
It was a real shock when this new IBM PC thingy came along and I started dipping into x86 ASM. NOT orthogonal. Not all instructions could use each addressing mode. Downright ugly.
Moving huge amounts of data is not unique to the VAX et al. Even the x86 can do it. And the page faults, etc., add no complexity at all, because the memory management hardware is at a lower layer, and thus invisible to the assembly language programmer.
--An old shell-back
Re:Linux x86 assembly? (Score:5, Informative)
Re:MIXAL (Score:3, Informative)
It's available online (Score:3, Informative)
Just to clarify (Score:5, Informative)
However, in the real world, NANDS are cheap (2-3 transistors), so that's what everyone uses.
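The reason cheap NANDs are enough is that NAND is functionally complete: every other gate can be built from it. A quick sketch in C (treating ints as single bits; this is my own illustration, using the standard NAND constructions):

```c
/* NAND: the one primitive we allow ourselves. */
static int nand(int a, int b) { return !(a && b); }

/* Every other basic gate, built from NAND alone. */
static int not_(int a)        { return nand(a, a); }
static int and_(int a, int b) { return nand(nand(a, b), nand(a, b)); }
static int or_(int a, int b)  { return nand(nand(a, a), nand(b, b)); }

/* The classic 4-NAND XOR construction. */
static int xor_(int a, int b) {
    int n = nand(a, b);
    return nand(nand(a, n), nand(b, n));
}
```

So a hardware vendor who can stamp out one cheap gate in volume gets all of Boolean logic for free.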
Good, but not a good starting point (Score:3, Informative)
A few thoughts:
It's essential to teach some assembly at some point in a CS undergrad - A CS course should give full insight into the workings of a real CPU, and should give as wide a variety as possible.
At Edinburgh [ed.ac.uk] the first year CS course included assembly, C, and
When I was a CS undergrad we had practical classes in no fewer than 17 languages, covering the range of imperative, declarative, functional and stack based, plus specialist toys like theorem provers and SQL.
The best starting point for a university level course is the good old procedural language - in my day it was Pascal, C++ and Modula-3, these days I'd use Java (and many CS departments do).
Also, when you do get to assembler, I don't think using a real assembler is the best teaching tool - assemblers are intended for developing real low level code, or as back end targets for compilers. For teaching at Edinburgh, we used an X11 based tool called xspim which simulated a MIPS R2000 (we actually ran it on Sun Sparc-II's, not that it matters), and it let you single step and examine registers without the complexity of adding a debugger, and had a window where you could see the registers, CPU pipeline etc. displayed.
For introducing programming concepts to a younger audience I think an interpreted language which will execute command lines, allowing them to experiment while avoiding the edit-compile-run cycle, is very important. Some are better than others; when I was a kid the 8 bit micros (Apple, Commodore, Atari,
I don't like Pilot or Comal for teaching (failed experiments of the 1980's) but I think LOGO [mit.edu] is a very commendable way to make concepts accessible to the young.
A perhaps unexpected place I was made to learn with an interpreted environment was as an undergrad at Cambridge University [cam.ac.uk], where the first programming language taught is ML [ed.ac.uk], which, for the CS people who haven't heard of it, is an implementation of lambda calculus with a sane syntax.
Re:Linux x86 assembly? (Score:3, Informative)
Now, what this means is that if I write a Win32 application-layer program that attempts to directly modify, say, the interrupt vector table, I'm not writing to the real interrupt vector table that the hardware uses when it receives physical interrupt signals, I'm writing to a version of it in my slice of memory, and what happens next is anybody's guess.
In other words, access to memory-mapped system resources like screen buffers and interrupt vectors is arbitrated by the OS memory manager. What I want to do may work, it may not. I imagine Linux is more friendly in this regard, but I really don't know. DOS, on the other hand, was anarchy -- there was no memory manager (unless you intentionally ran one with your application), so all requests from the application layer were carried out without any intervention.
Tim
Re:For the poor like me here is the download (Score:2, Informative)
http://savannah.nongnu.org/projects/pgubook/
Re:Linux x86 assembly? (Score:3, Informative)
Chip development, and particularly processor development, is a hobby few people can afford. But you can roll your own CPU on a budget -- use an FPGA (Field Programmable Gate Array), where you can program the connections between the components. And there are people who do this kind of thing. You can find some of their work here [opencores.org]
Better for computer engineers (Score:2, Informative)
For an example of this, see Patt and Patel's "Introduction to Computing Systems: From Bits and Gates, to C and Beyond." I took, and TA'd for, a course that uses this book.
Re:Not So New Concept (Score:3, Informative)
LC-2 assembly is not bad, but that class is just a light class compared to what you will learn in 352. I would say pay close attention in that class and EE 316, very useful for 352.
PDP11, VAX, 68K mislabeled (Score:3, Informative)
Also, many PDP-11's were random logic and not micro-coded. The later 11's were microcoded, of course, the 11/60 being the extreme because it had a writeable control store that let you define your own micro-coded instructions.
It's important to remember that the entire RT-11 operating system was written entirely in MACRO-11 by some amazing software engineers who knew the PDP-11 instruction set inside and out. The result was an operating system that ran very nicely in a 4K word footprint.
The VAX had a terrific compiler, BLISS-32, which created amazingly efficient code; code no human being would ever create, but fantastic nonetheless.
Re:Not So New Concept (Score:3, Informative)
Re:Not So New Concept (Score:4, Informative)
What's dishonest? It would have been dishonest had I registered a new account to make the submission. However, the way that you know that I was the author was because I _did not_ resort to dishonest tactics. I simply wrote from the third person, which is exactly how I wrote the back cover text for the book, the press releases, etc.
I don't even spamguard my email address, because I want people to know who I am and be able to reach me easily.
Re:Actually, they DON'T. (Score:1, Informative)
This:
(But understanding how assembly instruction sets work, and how compilers work, are both useful for writing better code at the compiler level. Less so now that optimizers are really good - but the understanding is still helpful.)
Was the exact point of the post you spent forever "refuting". He didn't say software written in assembler was better, he said people who knew assembler produce better code in high level languages. Maybe you ought to spend more time learning the English language, I've heard it's a useful skill to have.
in reality... (Score:5, Informative)
In the cmos world, pass-gates are much cheaper than amplifying gates (in the size vs speed vs power tradeoff), although you can't put too many pass gates in a row (signal degradation). So in fact MUX (multiplexor to pass one of the two inputs using the control of a third) and XORS (use input A to pass either !B or B) are used quite a bit.
Some background might be helpful to think about the more complicated AOI structure, though...
In a cmos NAND-gate, the pull-up side is two p-type pass gates in parallel from the output to Vdd (the positive rail), so that if either of the two p-type gate inputs is low, the output is pulled high. For the pull-down side, two n-type pass gates are in series to ground, so both n-type gate inputs have to be high before the output is pulled to ground. This gives us a total of 4 transistors for a cmos NAND, where the longest pass-gate depth is 2 (the pull-down). The pull-up is restricted to be the complement function of the pull-down in CMOS (otherwise either the pull-up and pull-down will fight, or nobody will pull, causing the output to float and/or oscillate).
A 2-input NOR gate has the p-type in series and the n-type in parallel (for the same # of transistors).
Due to a quirk of semiconductor technology, n-type transistors are easier to make more powerful than p-type, so a NAND is often slightly faster than a NOR (the two series n-types in a NAND gate are better at pulling down than the two series p-types are at pulling up in a NOR gate). However, this isn't the end of the story...
Notice that you can build a 3-input NAND by just adding more p-type transistors in parallel to the pull-up and more n-type in series to the pull-down. You can make even more complicated logic by putting the pull-up and pull-down transistor in combinations of series and parallel configurations. The most interesting cmos configurations are called AOI (and-or-invert) since they are the ones you can make with simple parallel chains of pass transistors in series for pull-up and pull-down.
For most cmos semi-conductor technologies, you are limited to about 4 pass gates in series or parallel before the noise margin starts to kill you and you need to stop using pass gates and just start a new amplifying "gate". Thus most chips are designed to use 4 input AOI gates where possible and smaller gates to finish out the logic implementation.
Thus "everyone" really uses lots of different types of gates (including simple NAND and XORS as well as more complicated AOI).
Re:Question (Score:2, Informative)
If you go to www.google.com and look up context switching you should be able to find all the information you'll ever need!
Re:Not the point! (Score:2, Informative)
Re:Linux x86 assembly? (Score:2, Informative)
In the larger scheme of things, assembly won't permit you to write elaborate user interfaces or hardware-intensive functionality, but it will grant you a better understanding of the code you're writing in the higher level languages for these purposes.
I know that a lot more of what I was doing made sense when I finally got to a low-level programming course in my junior year. Even while in that class I made frequent comments that the computer architecture course from the year before would have made a lot more sense if we had the low-level class first.
IMHO this is an excellent concept, and would breed a new set of knowledgeable programmers, able not only to program efficiently but to know why their programs are efficient.
Re:To be a programmer without ever... (Score:3, Informative)
Re:4-bit Full Adder Using Relays or Vacuum Tubes (Score:2, Informative)
I've built machine control cabinets with all of the logic done as relay circuits. Many control-reliable circuits (such as "emergency-stop") are still done using relays (albeit solid-state ones). Although PLCs have for the most part replaced relay logic, they are still programmed by the shop-floor electricians using a UI that looks like a relay schematic.
It's kind of a kick to think through a control problem in ladder logic, build it, and then listen to the clicking as it works through.
Re:Linux x86 assembly? (Score:3, Informative)
My boss had written a routine for dealing with user input that allowed a user to just start typing from any field on the main input screen; the cursor would go automagically to the text input field and start a search based on the typed text. Today, that would seem trivially easy, but at the time, not many programs were doing this. The problem, though, was that his handling of the typing was horribly inefficient. I could guess what was going on behind the VB code, because I know assembly and some compiler construction theory. I was able to improve the performance of his code by 3 orders of magnitude. Since the function worked fine on his system (a then top-of-the-line Pentium), he couldn't understand why I spent time optimizing the routine. For our customers, though, many of whom were using 486s, this made a huge difference. Under his code, a moderately skilled typist could out-type his routine, and the letters would show up in a different order than typed (due to his poor coding and an interaction with how VB handled execution among the several routines that got called when the cursor skipped up to the text input field). Under my routine, we could never out-type it, and customer calls about the function not working were eliminated. Since those calls alone made up over 5% of our help-desk calls about that product, that's a significant savings.
And that was all from knowing enough assembly and compiler construction to intuit how VB was handling the code, and using the info to improve it. I'm not good at assembly, but I know enough to help me optimize my coding in many cases. I've done plenty of stuff like the above (but usually not as significant an improvement, because really, someone has to write some pretty poor code to allow another user to tweak that much), and others who know assembly but work at a higher level probably have similar tales.
RagManX