Learning Computer Science via Assembly Language

Posted by CowboyNeal
from the unconventional-approaches dept.
johnnyb writes "A new book was just released which is based on a new concept - teaching computer science through assembly language (Linux x86 assembly language, to be exact). This book teaches how the machine itself operates, rather than just the language. I've found that the key difference between mediocre and excellent programmers is whether or not they know assembly language. Those that do tend to understand computers themselves at a much deeper level. Although rare today, this concept isn't really all that new -- in years past there wasn't much choice. Apple computers came with only BASIC and assembly language, and there were books available on assembly language for kids. This is why the old-timers are often viewed as 'wizards': they had to know assembly language programming. Perhaps this current obsession with learning using 'easy' languages is the wrong way to do things. High-level languages are great, but learning them will never teach you about computers. Perhaps it's time that computer science curriculums start teaching assembly language first."
  • by Anonymous Coward on Thursday February 05, 2004 @06:49PM (#8195931)
    AoA [ucr.edu], a worthwhile read
  • by dysprosia (661648) on Thursday February 05, 2004 @06:54PM (#8195989)
    I agree with the sentiment that knowing assembly language maketh a programmer, in some circumstances. It allows a deeper understanding of how certain languages work "under the hood", and how to debug errors that may not be so easily detected just by staring at code. Knowing C and assembler, for instance, is a good match in my eyes.

    It may not be so important though, if you're a database programmer, or if you're dealing with high-level languages such as Java or something.
  • by pointzero (707900) on Thursday February 05, 2004 @06:55PM (#8196009) Homepage
    Perhaps it's time that computer science curriculums start teaching assembly language first.
    Here at the University of New Brunswick (Canada), they may not teach assembly language "first", but we do have a second-year course dedicated to assembly and the inner workings of computers. My only problem with it is that we learn the Motorola M68HC11 CPU and not a current one. Sure, it's easier to learn and understand, but most computers we work on today are x86-based.

    My 2 cents.
  • For the record (Score:5, Interesting)

    by pclminion (145572) on Thursday February 05, 2004 @06:56PM (#8196025)
    The CS program at Portland State University starts with assembly in their introductory classes. At the time I was there, it was x86 assembly, but I've heard that some professors are using SPARC assembly as well -- not a good idea in my opinion, simply because of 1) the delay slot and 2) the sethi instruction, both of which are a little confusing for someone who's never coded before, let alone never coded in assembly language.

    I think it's a little weird to call this "Learning Computer Science via Assembly Language." It's programming, not computer science. Computer science is really only marginally about computers. It has to do more with algorithms, logic, and mathematics.

    You can study computer science, and produce new knowledge in the field, without ever touching a computer.

    This misunderstanding is, I think, part of the reason so many students drop out of CompSci. They head into it thinking it's about programming, and are startled to find that computation and programming are not equivalent.

    That's why the Compilers course at PSU is considered the "filter" which kills all the students who aren't really interested in computer science. They really need to spin off a separate "software engineering" school for these students, since what they really want to study is programming.

  • by Cryptnotic (154382) on Thursday February 05, 2004 @06:57PM (#8196035) Homepage
    Well, for starters, the syntax for assemblers is different. There are two main ones: the AT&T syntax (which is used by the GNU assembler) and the Intel syntax that is more familiar to DOS/Windows x86 assembly programmers (which is used by the NASM assembler).

    Second, OS interfaces for making system calls (e.g., to read files or open network connections) are different in Linux versus DOS or Windows.

  • by pla (258480) on Thursday February 05, 2004 @06:58PM (#8196049) Journal
    Is "Linux x86 assembly" any different to any other kind of "x86 assembly"?

    Yes. Although it requires understanding the CPU's native capabilities to the same degree, Linux uses AT&T syntax, whereas most of the Wintel world uses (unsurprisingly) Intel/Microsoft syntax.

    Personally, although I far prefer coding C under Linux, I prefer Intel syntax assembly. Even with many years of coding experience, I find AT&T syntax unnecessarily convoluted and somewhat difficult to quickly read through.

    The larger idea holds, however, regardless of what assembler you use. I wholeheartedly agree with the FP - People who know assembly produce better code by almost any measurement except "object-oriented-ness", which assembly makes difficult to an extreme. On that same note, I consider that one of the better arguments against OO code - It simply does not map well to real-world CPUs, thus introducing inefficiencies in the translation to something the CPU does handle natively.
  • Re:Your book? (Score:5, Interesting)

    by FreshFunk510 (526493) on Thursday February 05, 2004 @06:58PM (#8196052)
    Ugh. Johnnyb should've written a disclaimer that he was promoting his own book. This type of action turns me away from wanting to support such an individual. Sorry, nothing personal.
  • by Anonymous Coward on Thursday February 05, 2004 @06:58PM (#8196053)
    I've often wondered about this... I'm a meager 14 year old, and I've learned (and a good deal more than just a "hello world..." program) numerous "high level" languages - C/C++, a little bit of Java, and a good deal of PHP... However, I'm somewhat mystified by the concept of low level languages. I've never really taken the time to learn them, namely due to the fact that they are mostly useless in actual practice (as far as I can tell.) Nevertheless, I feel that I have missed out on learning how the computer works below the surface and what is actually happening when the code I write is compiled and run. It sort of produces an artificial barrier to my code - I know what it's doing, but not how and why. I feel that I could code much better if I understood what was going on, and yet I'm not sure if I'll ever really be able to learn a language like assembly due to the fact that I'm used to higher level languages and have been somewhat spoiled by them.

    Does anyone else feel this way, or am I just an oddball? (which is most likely the case ;)
  • by jinxidoru (743428) on Thursday February 05, 2004 @07:01PM (#8196087) Homepage
    In the BYU CS department they've recently gone a little farther than starting with assembly language. The introductory programming class actually begins with digital logic gates. The next section deals with assembly programming using the contrived LC (Little Computer) opcode set. It's a much smaller set of opcodes than x86, so it's simple enough for beginners, but still contains the important elements learned from doing assembly programming. The course ends with an introduction to C++ and OOP. I think the change in curriculum is a very wise idea. I took a class in digital circuitry and it has helped me so much in my programming. I'd like to see more programs move in this direction.
  • by AvitarX (172628) <me@@@brandywinehundred...org> on Thursday February 05, 2004 @07:04PM (#8196128) Journal
    Except more and more jobs are being outsourced because schools have churned out commodity programmers and businesses can get them anywhere.

    What we need if we want to stay at the forefront is for there to be fewer programmers of a higher quality. For the past decade we have had too many people not getting turned off by CS. So they enter the workforce as commodity disposable workers, and they tarnish the reputation of you guys who are good too. Now our jobs go to India and either they have better or equal workers and things will never come back, or they suck and we can keep churning out assembly line (not assembly) programmers.

    Or we can start teaching people to be great programmers and start another tech revolution with actual American (European/wherever you are that is a victim of outsourcing) workers.
  • by skurk (78980) on Thursday February 05, 2004 @07:07PM (#8196170) Homepage Journal
    You have a good point. I code assembly on many different CPUs, and I can only see a minor difference between them.

    For example, on the 6502 family (like the 6510 from the C64), you have only three registers: X, Y and A. These registers can only hold a byte each. Most of the variables you have are stored in the zero page, a 256-byte range from address $00-$FF.

    Then on the 68k CPU (as in the Amiga, Atari, etc.) you have several more registers which can be used more freely. You have D0-D7 data registers and A0-A7 address registers. These can be operated on as bytes, words or longwords as you wish, from wherever you want.

    The x86 assembly is written the "wrong way", and is pretty confusing at times. Where I would say "move.l 4,a6" on the 68k, I have to say "mov dx,4" on the x86. Takes a few minutes to adjust each time.

    Once you master assembly language on one CPU, it's pretty easy to switch to another.

    I still think the 680x0 series are the best.
  • Not a good idea (Score:3, Interesting)

    by El (94934) on Thursday February 05, 2004 @07:08PM (#8196192)
    I think one would be much better off writing in C without optimization, then stepping through the execution in a disassembler to see how the resulting machine code operates. Yes, it helps to write more efficient high level code if you know how it is converted to machine code. For example, I had a coworker who made a habit of declaring initialized arrays local to his functions. I had to point out to him: "You do know that this causes the array to be copied onto the stack every time you enter the function, thus really slowing down program execution, don't you?" Apparently this had never occurred to him, because he had never actually watched the code execute.
  • by CaptainCarrot (84625) on Thursday February 05, 2004 @07:08PM (#8196194)
    High retention rates in higher education are not necessarily a good thing. Not everyone can benefit from such an education, and the sooner they find that out the better.

    The problem with your idea is that the "simple to use" languages that are most often taught are too simple and are too many generations removed from what the computer was actually doing. Years ago, even using BASIC on a TRS-80 would teach you a lot about how the machine worked. To get the best performance out of the thing you had to at least be familiar with the memory map, and that inevitably led you to further investigation.

    I gather that Visual Basic is among the more commonly used introductory languages, but when so many layers are placed between the programming language and the machine how are you supposed to learn what's going on at the bottom?

  • yep (Score:2, Interesting)

    by NemoX (630771) on Thursday February 05, 2004 @07:11PM (#8196228)
    I agree with what the author has to say about assembly. I never felt like I knew enough until I learned assembly, even though I started with LOGO in 1st grade, and have had programming every year since. Then I came to understand a lot of how to prevent security holes in higher-level languages, and how to write tighter code. It also helped with better understanding program security from hackers, i.e. how they can hack my programs, product keys (or other such copy protection measures) and find the exploits. Mostly since they disassemble anything they can't decompile...and of course disassembling something puts it into assembly language of one sort or another. It is a class that I didn't get in college, and went back for after I got my degree. The most useful one IMO.

    An off the shelf book I liked and found useful was "The Art of Assembly Language" by Randall Hyde. I liked it better than my textbooks, anyway :p
  • by betis70 (525817) on Thursday February 05, 2004 @07:11PM (#8196239) Homepage
    From the website

    >>To be a programmer without ever learning assembly language is like being a professional race car driver without understanding how your carburetor works.

    A race car driver is a high-level user of the car (more akin to a financial analyst using Excel). Why would a high-level user care about HOW the carburetor works? All he has to tell the pit crew is how the car is behaving. A professional race car driver's job is to drive the car faster than the guy or gal next to him. End of story.

    Now the mechanic or race car engineer is a different story ...
  • by Ion Berkley (35404) on Thursday February 05, 2004 @07:12PM (#8196255)
    Let's see, 20 years ago this is what I remember being taught in undergrad CS (though I'd been an assembly programmer for 7 years already then):
    1st Year: Pascal, PDP-11 assembly
    2nd Year: 68000 Assembly,
    3rd Year: Ada, Eiffel, Modula-2, Smalltalk, VHDL, EDIF

    Interestingly, despite using mostly UNIX hardware, C was left to be self-taught, though it could be used in place of Pascal.
    The point really was to use appropriate tools for appropriate jobs: thus we learnt OS/kernel basics in assembler, likewise IRQ concepts and HAL stuff. For algorithmic stuff we worked in a mainstream strongly typed high-level language. For trends yet to propagate to the commercial world (some doomed ones!) we learned in exotic academic languages.
    Assembly is as valid today as it was in the 1950s, it's just that it's most appropriate for only some tasks that a minority of programmers do: deeply embedded microcontroller SW, OS kernel work (take a look in our beloved Linux kernel code), hardware bootstrap, hard real time, etc.
  • by ackthpt (218170) * on Thursday February 05, 2004 @07:16PM (#8196290) Homepage Journal
    While starting Computer Science students off with assembly (without first introducing them to a high-level language) may be a relatively new concept these days, the idea of teaching low-level languages to Computer Science students is not a revolutionary technique whatsoever. Every decent Computer Science curriculum includes several semesters of courses in which assembly language is required, to demonstrate their knowledge of basic computer processes.

    I've found I get blank stares sometimes when discussing memory usage, I/O bottlenecks or code optimization in front of PHBs. All they want is crap to run, and if they've got the money they'll throw it at buying more power. Sadly I've seen BSCS people who care more about getting a project done than getting it done well, often with hideous-looking code which can choke the fastest quad-processor servers. A little time spent considering how code may be written to minimize impact or organize I/O more efficiently isn't even encouraged anymore, where it once was a hard rule.

    Look at VB.NET, it's the very embodiment of just code and don't worry about it. So much of the work is buried in libraries/namespaces, you really have no idea what impact calling those routines will have.

    Today's lesson: Screw finesse, just throw more CPU and memory at it.

  • by BigZaphod (12942) on Thursday February 05, 2004 @07:17PM (#8196317) Homepage
    I don't think knowing assembly language is required to become a good programmer, but a good programmer should learn how the computer works at that level at some point in their life. And that's the key. Assembly language isn't a bad way to learn that, but it isn't the only way. And, in many respects, it isn't really the best way, either.

    See, I'm sort of torn. I like this idea since I feel most programmers these days are sorely lacking in the fundamentals. However there are advantages to starting higher. One major advantage I can think of is that when you start so low level, there are certain ideas that may never occur to you. Certain ways of solving problems won't come up because you "know" how the computer works and therefore adapt your thinking to it, seeing things very narrowly. I believe that we should be working towards adapting the computer to us instead. And when students start out learning at higher and higher levels, they start thinking higher and higher level and see connections from their "low level" beginnings to a higher level ideal without being bogged down in the little details. I think that would help drive innovation.
  • This is BS (Score:2, Interesting)

    by mslinux (570958) on Thursday February 05, 2004 @07:18PM (#8196319)
    Real men type the 1's and 0's in directly. Who needs a sissy assembler to do the translation? I wouldn't hire a programmer who did not know what "011001001100" meant to today's x86 procs.

    Ok, let's get real. This is equivalent to saying that unless a farmer gets on his hands and knees to plant 1 seed at a time, he's not a real farmer. Sure, he knows how to operate a $100,000 tractor that can plant several million seeds each day, but his hands-on knowledge of planting seeds isn't near what his great-grandad's was, so he must be less of a farmer... even though his output is thousands of times greater.

    HELLO! It's called technology. It's all about advancement. We no longer have to type in the 1's and 0's ourselves. Hell, very few of us need to use assembly. Why was C written? Because all the fucked-up assembly languages in the world do not work together. A program written in XYZ assembly will only run on a XYZ computer, etc.

    Today, programming is high-level. Write once... run everywhere (Java, Python, C#). Let's leave this assembly crap, and it is crap, where it belongs... back in the dark ages!
  • by MarcoAtWork (28889) on Thursday February 05, 2004 @07:19PM (#8196331)
    I still think the 680x0 series are the best.

    amen! As somebody who in the days of yore coded a Mandelbrot viewer on an Atari ST (68000) using only integer arithmetic and registers (also shifts instead of muls whenever possible, since the 68000's MUL was so slow -- at the time I would've killed for the 68000 to have a barrel shifter), I can certainly sympathize with that: x86 CPUs always seemed to have way too few registers for my taste.
  • by pla (258480) on Thursday February 05, 2004 @07:31PM (#8196434) Journal
    maxim: cycles are cheap, people are expensive.

    True. This topic, however, goes beyond mere maximizing of program performance. Put simply, if you know assembler, you can take the CPU's strengths and weaknesses into consideration while still writing readable, maintainable, "good" code. If you do not know assembly, you might produce simply beautiful code, but then have no clue why it runs like a three-legged dog.


    it is significantly better value to design and build a well architected OO solution

    Key phrase there, "well-architected". In practice, the entire idea of "object reuse" counts as a complete myth (I would say "lie", but since it seems like more of a self-deception, I won't go that far). I have yet to see a project where more than a handful of objects from older code would provide any benefit at all, and even those that did required subclassing them to add and/or modify over half of their existing functionality. On the other hand, I have literally hundreds of vanilla-C functions I've written over the years from which I draw with almost every program I write, and that require no modification to work correctly (in honesty, the second time I use them, I usually need to modify them to generalize better, but after that, c'est fini).


    Who cares if it's not very efficient - it'll run twice as fast in 18 months

    Y'know, I once heard an amusing joke about that... "How can you tell a CS guy from a programmer?" "The CS guy writes code that either won't run on any machine you can fit on a single planet, or will run too slowly to serve its purpose until technology catches up with it in a few decades." Something like that - I killed the joke, but you get the idea.

    Yeah, computers constantly improve. But the clients want their shiny new software to run this year (if not last year, or at least on 5-year old equipment), not two years hence.
  • Re:Your book? (Score:5, Interesting)

    by FreshFunk510 (526493) on Thursday February 05, 2004 @07:39PM (#8196512)
    Oh, most definitely.

    Had he admitted that it was his own book, I might've actually read about the book, read the summary, and seen if I was interested (as I'm a developer and have a degree in CS).

    But when he doesn't admit that and writes obviously biased remarks regarding knowing assembly to be a good programmer, I can't help but view it skeptically. In fact when I saw the response that it was his book I didn't even bother reading it anymore. All the words he posted lose credibility.

    And considering I got Score: 5 quite quickly, I have a feeling other people would agree with me. People don't like being "tricked" into buying stuff. It's the same reason why people don't like vendor lock-in and hate Microshaft.

    If you tell people it's your book and you give an open and honest review/opinion regarding it, people will actually respect that and read about the book. Hey, it's just my guess, but I think I'm right.
  • by Sire Enaique (637079) on Thursday February 05, 2004 @07:40PM (#8196523)
    It's just what somebody said earlier, it's not the language itself that's difficult to learn, it's the APIs.

    20 years ago hp calculators used ASM as their programming language - though it wasn't called asm by hp. I didn't realize it until I started learning Z80 asm and noticed it was almost the same as my hp-15C's language.

    Except for the hp-16C, those calculators weren't targeted at programmers, so I guess there must be lots of people out there who know asm without realizing they do...
  • by cide1 (126814) on Thursday February 05, 2004 @07:41PM (#8196535) Homepage
    I think people who have been out in the real world for a while don't realize how many languages are taught in schools today. I'm in Purdue's computer engineering program, and so far I have had to learn C, C++, Fortran, Java, asm, Python, bash, and ksh. I'm sure there are some others I have forgotten (like the little bit of Perl I know). We also learn ABEL and VHDL, several version control systems and Makefiles. My roommate is taking an AI course right now which is all in Scheme. People talk about the old days as if there is lost knowledge; we still learn all that. I could write Atari games in asm if I wanted to, but why do that when tools are available that let me do so much more? For learning, we don't have to learn assembly first anymore; you can start with any language. I think it is good to take a two-pronged approach: learn C first, and at the same time start learning digital logic. C compilers are forgiving and warn about obvious errors, compared to assembly just doing exactly what you tell it to do. When one is comfortable with both, I think learning assembly is much easier. Without this prior knowledge, you are just doing what you are told to do, you do not really understand it, and when you make mistakes, you have a harder time understanding why. After learning asm, it is easier to think like the machine, and easier to write efficient code in other languages.
  • by bmac (51623) on Thursday February 05, 2004 @07:56PM (#8196709) Journal
    I agree with you in general, but I feel that OO's problems come from forgetting that it should just be a way to organize code. If the OO folks just kept the perspective that an object is just a data definition and a bunch of functions whose first parameter is that data type, the mapping would be no different than C's mapping.

    Because they have had the desire to create layer upon ugly layer of syntax, the language just doesn't map down, as you say. Look how long it took to get a decent C++ compiler (5+ years). As well, the basic idea of data structure + associated functionality has been totally lost in the jumble of inheritance and public/private/protected.

    That said, I do disagree that an assembly programmer will fail the "object-oriented-ness" test. As stated at the start, it's all a matter of organization, and, IMO, that is where success or failure is determined. Sure, organizing a project that uses an OO lang is going to be easier than organizing a project that uses assembler, but that doesn't mean that it can't be done. If you look at my journal you'll see how much I value code organization ~ I say it determines the success of the project, regardless of prog lang choice.

    Peace & Blessings,
    bmac
    for true peace & happiness: www.mihr.com
  • by Anonymous Coward on Thursday February 05, 2004 @08:07PM (#8196849)
    I and others disagree. Here are a few:

    Consider Paul Graham: Beating the Averages [paulgraham.com]. Considering that he's the kind of guy that gets invited to MIT as a language wizard [mit.edu] (along with Guy Steele, David Detlefs, Martin Rinard, Jonathan Rees, and David Moon), and considering that he has a net worth in the (hundreds of) millions, I'll go with his choice: abstraction rules.

    Consider Erann Gat from JPL [nasa.gov] (the guys who get to send robots to Mars and build autonomously controlled space probes), who disagrees [flownet.com] and empirically proves it.

    Diamond Walker emphatically endorses [google.ca] LISP - his story is very compelling.

    The list goes on and on: Dan Friedman, Mitchell Wand, Shriram Krishnamurthi, Christian Quiennec ...

    But, to paraphrase Graham, if you don't know these people I guess I don't have to worry about you.
  • by Ungrounded Lightning (62228) on Thursday February 05, 2004 @08:09PM (#8196861) Journal
    People who know assembly produce better code by almost any measurement except "object-oriented-ness", which assembly makes difficult to an extreme.

    Actually, they don't.

    A study was done, some decades ago, on the issue of whether compilers were approaching the abilities of a good assembly programmer. The results were surprising:

    While a good assembly programmer could usually beat the compiler if he really hunkered down and applied himself to the particular piece of code, on the average his code would be worse - because he didn't maintain that focus on every line of every program.

    The programmer might know all the tricks. But the compiler knew MOST of the tricks, and applied them EVERYWHERE, ALL THE TIME.

    Potentially the programmer could still beat the compiler in reasonable time by focusing on the code that gets most of the execution. But the second part of Knuth's Law applies: "95% of the processor time is spent in 5% of the code - and it's NOT the 5% you THOUGHT it was." You have to do extra tuning passes AFTER the code is working to find and improve the REAL critical 5%. This typically was unnecessary in applications (though it would sometimes get done in OSes and some servers).

    This discovery led directly to two things:

    1) Because a programmer can get so much more done and working right with a given time and effort using a compiler than using an assembler, and the compiler was emitting better assembly on the average, assembler was abandoned for anything where it wasn't really necessary. That typically means:

    - A little bit in the kernel where it can't be avoided (typically bootup, the very start of the interrupt handling, and maybe context switching). (The Unix V6 kernel was 10k lines, of which 1.5k was assembler - and the assembly fraction got squeezed down from then on.)

    - A little bit in the libraries (typically the very start of a program and the system call subroutines)

    - Maybe a few tiny bits embedded in compiler code, to optimize the core of something slow.

    2) The replacement of microcoded CISC processors (e.g., PDP-11, VAX, 68k) with RISC processors (e.g., SPARC, MIPS). (x86 was CISC but hung in there due to inertia and cheapness.)

    Who cares if it takes three instructions instead of one to do some complex function, or if execution near jumps isn't straightforward? The compiler will crank out the three instructions and keep track of the funny execution sequence. Meanwhile you can shrink the processor and run the instructions at the microcode engine's speed - which can be increased further by reducing the number of gates and length of wiring, and end up with a smaller chip (which means higher yields, which means making use of the next, faster, fab technology sooner.)

    CISC pushed RISC out of general purpose processors again once the die sizes got big: You can use those extra gates for pipelining, branch prediction, and other stuff that lets you gain back more by parallelism than you lost by expanding the execution units. But RISC is still alive and well in embedded cores (where you need SOME crunch but want to use most of the silicon for other stuff) and in systems that don't need the absolute cutting edge of speed or DO need a very low power-per-computation figure.

    The compiler advantage over an assembly programmer is extreme both with RISC and with a poorly-designed CISC instruction set (like the early x86es). Well-designed CISC instruction sets (like PDP11, VAX, and 68k) are tuned to simplify the compilers' work - which makes them understandable enough that the tricks are fewer and good code is easier for a human to write. This puts an assembly programmer back in the running. But on the average the compiler still wins.

    (But understanding how assembly instruction sets work, and how compilers work, are both useful for writing better code at the compiler level. Less so now that optimizers are really good - but the understanding is still helpful.)
  • by yintercept (517362) on Thursday February 05, 2004 @08:11PM (#8196876) Homepage Journal
    While the machine code generated might be the same, there are lots of ways to get there.

    Thanks for reminding the forum that assembler is already one step removed from the actual machine language. I threw myself against several assembler languages and never got the result I desired. I loved the logic but hated the syntax.

    Personally, I wish that, in the evolution of computer languages, we had spent a little more time evolving a somewhat more intuitive set of languages at the assembler level. As there is still a great deal of programming that really should be done at the machine level, an assembler rival might help spur on the use of Linux.

    The book is going into my wish list.

  • Re:Not the point! (Score:3, Interesting)

    by EugeneK (50783) on Thursday February 05, 2004 @08:20PM (#8196949) Homepage Journal

    AND(a,b) = XOR(XOR(a,b),OR(a,b))


    You might be right about the cost of gates; I have no idea. Ok, it was a dumb joke. :)
  • by LorenTB (619148) on Thursday February 05, 2004 @08:25PM (#8196981) Homepage
    blech, 6 figures for VBA macros? I guess it's a living... um, good job, but I'd sooner toss my machine out the window than work with Excel all. day. long.

    There's more to life than money.
  • Re:nah (Score:4, Interesting)

    by dubious9 (580994) on Thursday February 05, 2004 @08:26PM (#8196987) Journal
    Yeah because you don't use variables, conditionals and types in assembly. Right. You just have to think harder about it.

    Also, you can use object-oriented principles in assembly, just like you can in C. There aren't any convenient keywords or enforced methodologies (i.e., everything is an object), but you can gather sections of code that tie data and associated functionality together. You learn the advantages of modular programs. You learn how to count in binary. It forces you to not code carelessly. You learn good coding principles that you can apply to higher languages. In general you get another tool in your toolbox. I'd say there are different flavors of programming and they aren't mutually exclusive, e.g.:

    Object Oriented
    Procedural
    Event-Based
    Interpreted
    Managed Memory (garbage collection)
    Assembly

    To leave one of these out of a CS program leaves me wondering how good the program is.
  • by vsprintf (579676) on Thursday February 05, 2004 @08:34PM (#8197069)

    I don't know why, but just saying the words 'assembly language' sends a chill down my spine. I guess I am too weak minded to learn it.

    Maybe individual brains just work in different ways. In school, I knew some people who were good with high-level languages but just couldn't hack assembler. They could not get down to that absolute minimal step-by-step instruction level. I'm not sure what that says about those of us who use assembler. :) BTW, I certainly don't advocate assembler as a first computer language - second, perhaps.

  • by DingoBueno (461129) on Thursday February 05, 2004 @08:44PM (#8197135)
    Am I the only one who went to a school where it was expected that you already had a solid knowledge of programming? From day one I started taking theory classes. Aside from a digital design class where we briefly worked with motorola assembly, the only programming I did in school was to aid in math, physics, and CS theory. I personally find it offensive when somebody assumes that, because I was a CS major, I am nothing but a programmer. Honestly, I was a better programmer before college, because afterward all my time was spent on doing proofs, developing algorithms, studying computability, etc. Implementation was, for the most part, trivial.

    Anyway, that's what I learned about computer science. There are apparently a lot of schools that teach *programming*, and I find that to be a real shame.
  • It's great (Score:2, Interesting)

    by Balthisar (649688) on Thursday February 05, 2004 @08:48PM (#8197158) Homepage
    I'm NOT an awesome programmer by any means. But I bet as far as someone that doesn't do it for a living, I'm damn good and could probably hang with a lot of pros.

    I programmed 8510 assy. on my Commodore-128 as a kid. Yeah, I had to, due to limitations of basic.

    There's NOTHING like knowing how a computer works to make a computer work, ya know?
  • by steveg (55825) on Thursday February 05, 2004 @08:51PM (#8197187)
    Isn't the syntax most naturally determined by the processor?

    Intel specified the syntax they did because their chips implement the instructions in that same order. You can obviously switch things around as part of the 'macro' in 'macro assembler' (and apparently gas did so) but it seems more 'natural' to stick with how the processor works. Anyway, it does to me.

    If you were masochistic enough to look at the numeric code you're producing, it might get confusing if the operands were switched around from what you were expecting. And there have been times when it's been useful, if not fun, to look at the actual bits.

    Maybe if I were better at assembler I wouldn't have to trace it that way, dunno.
  • by prozac79 (651102) on Thursday February 05, 2004 @08:55PM (#8197210)
    I agree that assembly language is an important staple of any computer programmer. However, I don't think it should be taught first, for a psychological reason. Teaching assembly first could be very confusing and might turn off people who could make great programmers. One of the things that attracted me to computer science was that I could create cool things right off the bat. My friends in other engineering disciplines had to go through about two years of coursework before they could even start doing anything remotely cool. I was having a great time writing programs in C and Java from day one. Then, when I did learn assembly, I had a much better background and understanding of how my printf statement was really fed into the computer. I don't think I would have had the respect for assembly that I have now if I had tried to learn it first. It would be like trying to teach someone to drive by first having them put together the engine. Then again, if that were a requirement we would have fewer bad drivers. But I digress.

    Basically, I liked my education where I was taught concepts in the context of a language. You want to learn about memory allocation? Teach pointers and things in C. You want to learn about object oriented models... use Java or C++. You want to learn about how a computer functions at a base level, learn assembly. The language itself should never be the end goal, it should be a means to learn the concepts. The best programmers I've encountered are ones who have ideas, not ones who can't think outside a particular language.

  • Elitist, perhaps? (Score:3, Interesting)

    by mblase (200735) on Thursday February 05, 2004 @09:00PM (#8197244)
    Perhaps it's time that computer science curriculums start teaching assembly language first.

    Yeah, as a mathematics major I've often felt that number theory is glanced over far too often in high schools. Kids really ought to learn why 1+1=2 and how to strictly define multiplication of integers before getting into such oversimplifications of mathematics as the Pythagorean Theorem.

    I think the submitter is proceeding from a false assumption. Anyone who's really pursuing a BS in Computer Science (from a reputable school, anyway) is probably going to get a course in assembly language whether they like it or not, and anyone who hasn't earned a BS shouldn't be calling themselves a "computer scientist."
  • by tjb (226873) on Thursday February 05, 2004 @09:20PM (#8197368)
    First off, while I am not a Linux expert, I imagine that Linux would allow you more freedom to actually do stuff like mess with interrupt vectors and system resources (assuming you are running at the correct privilege level), whereas from my minimal exposure to Win32 driver-layer assembly, it seemed you are rather restricted in some regards.

    IMHO, DOS is the perfect OS to learn assembly-language programming on - no restrictions at all (and fast reboots when you fuck up). But maybe I'm just biased because I program DSPs with no operating system whatsoever :)

    Tim
  • by Phil John (576633) <phil @ w ebstarsltd.com> on Thursday February 05, 2004 @09:21PM (#8197381)
    ...his site is at http://www.grc.com [grc.com], he's got loads of security related info on his page and a shedload of Win32 progs coded entirely in assembler, every last line of em. He also created the very neat ShieldsUp tool to scare people into getting a personal firewall installed (like listing their netbios share names, doing a remote port scan and telling them the gory details of what people could do to their computer etc.).

    Most of the progs are under 30k in size, including a very cool sub-pixel font-rendering demo, and ones to disable messenger, dcom and upnp. A really nice touch is that some of them have sound fx, produced by a simple virtual synth, also coded in assembler...just cause he could (a true geek!)
  • Nah, it's beautiful (Score:1, Interesting)

    by Anonymous Coward on Thursday February 05, 2004 @09:28PM (#8197438)
    It's sort of like comparing Renoir with Picasso.
    All the kinks and weirdosities in x86 make the programs written in it beautiful, like the artist who paints with his toes. I even had a chance to use the JPE instruction once; it was a beauty to behold!
  • by Kosgrove (75723) <jkodroff.mail@com> on Thursday February 05, 2004 @09:32PM (#8197469)
    I disagree. Assembly has little to nothing to do with either programming or computer science anymore. Computer science (IMHO) deals with the study of software engineering and algorithms (network protocols, etc). Computer engineering deals with custom integrated circuit development, including processor architecture.

    I learned assembly as an undergraduate at Penn State (before attending a year of grad school at Drexel), but what I got out of the course had far more to do with understanding architecture (something not relevant for most developers, but much more relevant for hardware engineers).

    Assembly does not have any of the high-level features (OOP, libraries, etc.) that developers need to know these days. It's rarely used, even in embedded programming, since C/C++ compilers have gotten quite efficient and are available even for open-core (similar to open source) processors for use on FPGAs.

    On the other hand, assembly is important to know for computer engineering undergrads and graduates interested in architecture, and having taught in the CompEng department there, I can say that the depth of assembly in the curriculum there is not sufficient.
  • "Assembly" used to send chills down my spine, too, until my job required that I get over it and just do it. I think a big stumbling block is that the first instruction set most potential asm coders see is for the 8086 and it's large set of god-awful address indexing instructions. At least, that's the way it was at CSU, Northridge back in the day.

    However, when my then employer needed a high performance CPU for their missiles, they designed what in later days would have been called a RISC chip. The instruction set was all of maybe 20, so few that many of us found ourselves coding patches directly in hex. There were just a few general purpose registers and four pages of memory to worry about.

    It was elegant, straightforward, and only took a few sessions writing a patch to get the hang of. Once over the hurdle of writing a few lines on an embedded system like that, taking the next step and coding to an API of an open source system ain't too much bigger of a deal.

  • by Proudrooster (580120) on Thursday February 05, 2004 @10:24PM (#8197813) Homepage
    Who is John von Neumann? If you don't know and are planning on programming, you should find out. Worse, if you already have a CS degree and don't know about the "von Neumann" architecture, then you missed an important topic.

    If you want high performance code, you must understand procedural programming and assembly language. You must understand the components of the modern "von Neumann" architecture [princeton.edu] like RAM, registers, L1 cache, L2 cache, the ALU, etc.

    While everyone has gone OOP (Object Oriented) crazy, the "von Neumann" architecture is NOT optimized for OOP programming. Because modern CPUs have lots of cache, the latency between the CPU and memory is reduced. This is called "faking" memory bandwidth; read this article on the von Neumann bottleneck [knozall.com].

    Serious coders should learn ASM, then move to a higher level language like 'C' then see how the 'C' statements compile in ASM and then analyze efficiency.

    Modern wisdom says: be wasteful; vendors will make bigger/faster machines, and we won't have to care that our code is slow, inefficient, and not optimized for the architecture. Keep in mind, you can save substantially on hardware expenditures by hiring good coders who know how to tune and optimize code, but if you don't want to be bothered, just plan on large capital expenditures every couple of years. Also, write everything in Java and make sure you create indexes on every column of every table in your database for faster lookups. (I am joking, don't really do this.)
  • Book is GPL (Score:1, Interesting)

    by Anonymous Coward on Thursday February 05, 2004 @10:25PM (#8197821)
    ...available at ProgrammingGroundUp [eskimo.com]
  • by Anonymous Coward on Thursday February 05, 2004 @10:46PM (#8197931)
    When I started at IBM, in 1962, they sent us to programming class and the first thing they taught us was ... all together, now ... Assembler!

    Of course things were much easier then, we were on the 7090 which is a "direct address" architecture, that is, the actual address (all 15 bits of it!) is part of the instruction. There were index registers, and occasionally you might code an address of 0 and use the index register as a "base" or "segment" register, but that was the exception. The 7090 architecture had its glitches (index registers were subtracted from the address), but by and large it was much simpler than most of today's machines, and much easier to learn assembler for.

    Oh, after we finished Assembler, then we moved to a "High Level" language ... Fortran. Not Fortran 95, nor 77, nor 66, not even Fortran IV, but Fortran II (ugh!).


    By the way, I'm doing this post as AC because I've got lots of copies of my resume out there, and not one so much as hints at my having been working in 1962 ...

  • SICP (Score:3, Interesting)

    by paxil (99137) on Thursday February 05, 2004 @10:46PM (#8197933)
    Sounds like a bad idea to me, for reasons pointed out in other posts.

    For a first course in CS, I think it would be hard to do better than one based on Structure and Interpretation of Computer Programs [mit.edu].

    This book takes one from zero to writing a compiler in a few hundred pages, including a chapter on writing code for register machines which gives the student a good idea of what is going on "under the hood."

    To those who would say that Scheme is useless outside of academics, I would counter that once the concepts in this text are mastered, it is easy to transfer them to other languages.
  • Amen! (Score:2, Interesting)

    by SuperChuck69 (702300) on Thursday February 05, 2004 @11:12PM (#8198118)
    My God, I've only been out of school going on 6 years now and I'm already antiquated. I learned my assembly, I studied my architectures, I shifted bits.

    These days, kids come out of school unable to manipulate simple pointers. Why? Because the sissy languages they use don't even USE pointers, so they were never TAUGHT pointers. God forbid they have to figure out how a machine is going to execute their code!

    What they do know how to do is read. They read what's going to happen in the future. They read about the NEXT version of Java and its speed and other associated wonders. They read about how feature xyz is supposed to work (I read the source code for feature xyz; it doesn't work that way). They read about the flying car they're going to be cruising around in in a few scant years.

    As a fallen Java evangelist (not to pick on Java, but it deserves it), I've learned that there are a few constants in that particular language. It's slow as balls; it has been since I was using it in alpha, 7 or 8 years ago, and every version promises that the next will be faster than C. Container-managed beans suck; for 5 years, they've been saying that the mythical container (read as: someone else) will magically optimize them, which we all know is bollocks. The more fictional something is, the more it's talked about; Java's been talking about Jini for something like 4 years now, with absolutely nothing to show for it.

    If students spent more time learning assembly and less time fluffing around with "references" (which is short for "retarded pointers"), they'd be much more bitter and cynical... Like me.

  • Re:Your book? (Score:3, Interesting)

    by johnnyb (4816) <jonathan@bartlettpublishing.com> on Friday February 06, 2004 @12:01AM (#8198375) Homepage
    I don't see how it's "tricking" you. I believe in moral marketing. However, I don't understand the general opinion that one shouldn't promote one's own stuff.

    Two things:

    1) whether or not I wrote the book is irrelevant as to whether or not it makes a good Slashdot story.

    2) would you be offended if I told you that the back-cover text, which was also written by me, is also written in the third person?

    Anyway, I'm not quite certain where you think the trickery or deception is. The only thing I can see from your post substantiating this is:

    "But when he doesn't admit that and writes obviously biased remarks regarding knowing assembly to be a good programmer, I can't help but view it skeptically."

    I agree that the remarks are biased in the way all remarks are biased. However, I think you are reversing the order of the biases. It is _because_ I believed assembly language essential to being a good programmer that I bothered to write the book in the first place.

    Anyway, I'm sorry if I offended you, but I am not sorry for how I worded my article submission.
  • by AnotherBlackHat (265897) on Friday February 06, 2004 @12:08AM (#8198406) Homepage

    Of all the processors out there, yes the x86 is common but it has to be one of the WORST instruction sets ...


    It only seems that way because you haven't been exposed to the really wretched ones.

    There were processors that had only one word of stack, processors that needed special instructions to cross page boundaries, that had only two registers, that couldn't add or subtract (forget multiply and divide), and even processors that didn't have linear instruction sequences (the program counter was an LFSR).

    The whole x86 family even as far back as the 4040 has a much better instruction set than many of the (now thankfully dead) processors.

    -- this is not a .sig
  • by rcpitt (711863) * on Friday February 06, 2004 @12:18AM (#8198466) Homepage Journal
    The first piece of digital gear I ever had my hands on was the size of a small desk and about 5' high. It had just enough "logic" on it to create an eight bit ring counter (0 1 10 11 100...)

    That was in 1965 and the unit was the only one for our whole school district and worth about $30,000 (CDN - about $35000 US at the time)

    From there to today's Object Oriented Programming languages has been an interesting time. I wouldn't have missed it for anything, and I honestly think that living through it has given me a perspective that many more recent programmers don't have and IMHO need, sometimes.

    Where "brute force and ignorance" solutions are practical, there is no gain in knowing enough about the underlying hardware and bit twiddling to make things run 1000% faster after spending 6 months re-programming to manually optimize. In fact, since (C and other) compilers have become easily architecture tuned, there really are few areas where speed gains from hardware knowledge can be had, let alone made cost effective. Most are at the hardware interface level - the drivers - most recently USB for example.

    If you're happy with programming Visual Basic and your employer can afford the hardware costs of ramping up your single-CPU "solution" to deal with millions of visitors instead of hundreds, then you don't need to know anything about the underlying hardware at the bit level.

    On the other hand, if you need to wring the most "visits" out of off-the-shelf hardware somehow, then you need to know enough to calculate the theoretical CPU cycles per visit.

    Somewhere between these two extremes lies reality.

    Today I use my hardware knowledge mostly as a "bullshit filter" when dealing with claims and statistics from various vendors. I have an empirical understanding of why (and under what circumstances) a processor with 512 Megs of level 1 cache and a front-side bus at 500MHz might be faster than a processor with 256 Megs of L1 cache and an 800MHz FSB, and vice versa. Same thing for the cost effectiveness of SCSI vs IDE when dealing with a database app vs. access to large images in a file system (something that came up today with a customer when spec'ing a pair of file servers, one for each type of application).

    Back in the mid 70s I dealt with people who optimized applications on million $ machines capable of about 100 online users at one time. Today I deal with optimization on $1000-$3000 machines with thousands of 'active' sessions and millions of 'hits' per day. Different situations but similar problems. Major difference is in the cost of "throwing hardware at the problem" (and throwing the operating systems to go with the HW - but then I use Linux so not much of a difference ;)

    Bottom line is that understanding the underlying hardware helps me quite a bit - but only as a specialist in optimization and cost-effectiveness now, not in getting things to work at all as in the past.

  • by DaveAtFraud (460127) on Friday February 06, 2004 @12:18AM (#8198467) Homepage Journal
    I ended up teaching introductory Pascal back in the late '80s and found it to be useful only for teaching programming concepts but useless for teaching the students about how computers actually work. As an alternative to Pascal, I would choose FORTRAN over C because it's possible to introduce students to things like internal representation without getting them tied up in their shorts over things like pass by reference vs. pass by value, pointers, etc. If C is bad, assembler is worse because the students will more likely get bogged down in nuances of assembly syntax. Also, if someone simply wants to learn how to program in C, I would suggest they buy "C for Dummies" and knock themselves out. On the other hand, if someone wants to learn about how a computer works while at the same time being introduced to programming skills, debugging, etc., I'd go with something like FORTRAN as the teaching language.

    The goal of a first "computer science" class should not be to merely teach technical skills (programming, debugging, program design) but to also give the students an understanding of how computers work. Even with today's proliferation of home computers, most people (student or otherwise) have utterly no idea how a computer works internally since their exposure is limited to simply installing software someone else wrote. Assembler is too far on the side of "how the CPU works" without giving the student any better insight into how the whole system actually operates (memory, CPU, storage, peripherals, etc.) than a higher level language while requiring much more effort by the student to accomplish anything useful. This would seem to indicate that a second generation language such as FORTRAN (or C) would be more likely to let the course delve into programming, machine operations, machine organization, etc. It also means that the students can produce "interesting" programs fairly early on which will keep more people interested (which would you rather have: a PHB MBA with no programming classes or one that has at least sat through enough classes to have some understanding of what goes on inside the box and how difficult that can be to accomplish).

    One other complaint against assembler as an introductory teaching language is that, depending on the specific assembler, it is usually difficult to see the overall program structure even after the program is complete. This is primarily due to the low implementation level that leaves simple program controls (if-then-else, do-for, do-while, etc.) buried in the assembler syntax of loading and testing registers and smeared out over what is generally multiple individual statements. The student *may* end up with an appreciation of what it involves to implement these structures but will lose sight of the tree (control structure) and never even notice the forest (overall program design).
  • by johnnyb (4816) <jonathan@bartlettpublishing.com> on Friday February 06, 2004 @12:20AM (#8198471) Homepage
    "Bullshit. Since when did what language you know determine how good a programmer you are?"

    Assembly is different. All languages eventually turn into assembly. Knowing how the stack operates, what a stack frame is, what a register is, what addressing modes are available, etc., all bear down on ALL programming languages.

    For example, in C, where should you place your most-used member of a struct? Why, in the first position, since it can be accessed using indirect addressing rather than base pointer addressing.

    Why do Scheme and C often have trouble interacting? Because Scheme's continuations mean that objects on the stack have to be able to live forever, meaning that the hardware stack is near useless.

    Also, many people who have trouble understanding pointers in C are able to "get it" after learning assembly language because it makes the concept far more concrete.

    Why are OO calls more costly than straight function calls? Because the method has to be looked up in a virtual method table first, before it is called. Not only does the lookup cost cycles, but the indirect jump can kill your pipeline.

    Anyway, these are the types of issues that all programmers in all languages have to contend with. It is easier to understand these types of concepts if the programmer knows assembly language.
  • by efti (568624) on Friday February 06, 2004 @12:31AM (#8198514)

    FPGA = Field Programmable Gate Array.

    Damn, I really have too many acronyms / abbreviations to remember these days.

  • by fingusernames (695699) on Friday February 06, 2004 @12:56AM (#8198659) Homepage
    When I was at Purdue in CS, we had to write a machine emulator, for a VAX I think. It was the companion to a CEE course, which you likely took; I forget the numbers of both. We wrote our emulator in, of all things, Pascal. We had to program in assembly and then hand translate that into binary, and manually create the binary files to feed into the emulator. It would crank, and dump the memory image to disk. We then had to look through the memory image to figure out what happened.

    Later, in a compiler course, we had to compile a subset of C++ to SPARC asm, and also use an emulator to execute that, and also do a memory dump analysis, though we did have pseudo ops to display output too.

    Also, we didn't really have any classes that taught languages and programming, per se, other than the entry level CS courses which taught using Pascal. I learned C in a EE course, FORTRAN when I was still in Aero (switching from AAE to CS, I already knew FORTRAN & C). We learned C++ specifically for a compiler course, for most of us our first time using a language supporting OO at the language level; of course, it was just a cross compiler from C++ to C, at that time. Other languages, specifically for those courses.

    Something also for you to realize is the caliber of the education you receive at Purdue. In the many years since I've left there and worked professionally, I have worked with many fellow professionals from many universities. The CS program, and the EE/CEE programs, at Purdue are among the very top. I worked at Cray Research, back in the day when Seymour was still around. They recruited heavily from Purdue for a reason: they trusted two PU CEE interns enough to be solely responsible for developing the pre-hardware emulator for the first MPP Cray machine, used for porting the OS and testing the architecture. Also at Xerox, and Motorola; also heavy Purdue recruitment. I've been in the position to interview and evaluate a good number of candidates from many universities, and I can say that Purdue is very much top tier.

    Unfortunately, lots of other universities don't have "real" CS/CEE programs. A good number are degree mills, producing graduates narrowly educated, and not well equipped to learn new technologies and methodologies throughout their career, because they were never challenged and never exposed. I have run into "CS" graduates who learned in nothing but a pure MS GUI environment, using IDEs, who never were taught how computers really work, how to develop language translation/compilers, how to develop an operating system. There are CS graduates who never have touched a Un*x command line. It's sad. The Internet driven tech bubble has, perversely, devalued CS as a curriculum it seems.

    Larry
  • Question (Score:2, Interesting)

    by jadavis (473492) on Friday February 06, 2004 @01:02AM (#8198696)
    Ok, I can make my way through a couple different assembly languages, but I'm missing something important.

    Every program you write is run inside the operating system, right? So how does an OS switch between all those tasks if it seems like you're intimately working with the hardware? You put something in a register, well what if a new task comes along and wants to use that register also?

    Is everything virtualized from the kernel, sort of as if the kernel is the machine? How do you differentiate between accessing kernel memory and accessing virtual memory?

    Also, doesn't the operating system implement the RTS and other basic features? If you're programming a kernel, do you not have the RTS available?

    I think that all these questions are related. Would someone please enlighten me?
  • by rs79 (71822) <hostmaster@open-rsc.org> on Friday February 06, 2004 @03:18AM (#8199162) Homepage
    I understand that much of C was inspired by that instruction set.

    I'm not sure inspired would be the right way to say that. C was invented as a shorthand for assembler, in particular PDP-11 assembler. I'm probably just being pedantic, but I think it's an important distinction.

    We owe a lot to those machines, by '74 UNIX and C were available (barely) from Bell Labs but by the late summer of 76 Dave Conroy at Teklogix in Mississauga, Ontario, had written and made work the only C compiler not written by Bell Labs, which ran under RSX-11M. This became DECUS C, and then gcc.

    I worked there between high school and university ; Dave taught me C to test his compiler and must have got all of about $1200 for writing it as it only took him a few weeks. It was of course written entirely in assembler.
  • Knuth (Score:1, Interesting)

    by Anonymous Coward on Friday February 06, 2004 @08:21AM (#8200244)
    Knuth has been teaching CS using ASM for a *Long* time.

    *shrug*
  • Re:Not the point! (Score:2, Interesting)

    by Rick.C (626083) on Friday February 06, 2004 @08:43AM (#8200376)
    isn't the most primitive CMOS gate a NAND gate?

    You used CMOS?? Luxury! You probably had a clean-room, too.

    We had to make our AND gates out of bubble gum and baling wire.

    And cut the wire with our teeth!

    On more serious, but equally off-topic note, back in the mid 1970s I attended an IBM presentation by one of the researchers who built the first IBM disk drive. He described how they coated that very first 14-inch platter:

    The plan was to put the platter on a record turntable, crank the speed up to 78 RPM, and pour the iron oxide slurry onto the center of the spinning platter from a paper Dixie cup. The theory was that centrifugal force would spread the slurry out evenly across the surface and give them a nice, smooth coating.

    Well, they gathered around the turntable, started pouring out the slurry, looked down at the brown streak across all their clean, white lab coats, and said, "Maybe we should turn it down to 33 RPM."

    True story.

  • by nutznboltz (473437) on Friday February 06, 2004 @09:37AM (#8200815) Homepage Journal
    C was invented a shorthand for assembler, in particular PDP-11 assembler.

    Yes, C is basically an abstracted PDP-11.

    Dave Conroy at Teklogix in Mississauga, Ontario, had written and made work the only C compiler not written by Bell Labs

    Is this the same Dave Conroy who does FPGA re-implementations of old DEC computers?

    http://www.spies.com/~dgc/ [spies.com]

  • Use C (Score:2, Interesting)

    by jabber01 (225154) on Friday February 06, 2004 @10:19AM (#8201319)
    Ladies and Gentlemen of the Slashdot community... Code in C.

    If I could offer you only one tip for your future as programmers, C would be it. The long-term benefits of C have been proved by scientists, and academics, and professionals, and corporations all over the world, whereas the rest of my advice has no basis more reliable than a Microsoft marketing report. I will dispense this advice now.

    Enjoy the power and beauty of simple command line tools. Oh never mind, you will not understand the benefits of vi, make, grep and gcc until you're working as a contractor, and that fancy environment you have at home isn't available. But trust me, in 5 years, you'll sit down in front of an expensive IDE, pore over the GUI menus and checkboxes, read the badly written two-inch-thick manual twice, and still not be able to write a program to say "Hello World!" You'll then recall in a way you can't grasp now, the importance of understanding how to get useful work done with the standard, simple tools.

    Don't worry about not using the latest and greatest Microsoft tools or language; or worry, but know that worrying is as effective as watching all your carefully crafted code fail because a registry setting or dll you depend on was changed by the installation of some do-nothing program. The real troubles in your programming career are apt to be things that never crossed your worried mind; like a middle manager who wipes out the /opt directory thinking it to be "optional" while trying to make space for his porn collection, or a lethal bug in your fancy CASE tool.

    Write one script every day to aid you.

    csh!

    Don't be reckless with other people's test procedures, don't put up with people who are reckless with yours.

    Lint.

    Don't waste time on hand-optimizing your code; sometimes you're elegant, sometimes you're a kludge... The optimizing compiler will do it better anyway, and in the end you'll have to maintain the source code - so keep it readable.

    Remember the useful code snippets you receive, forget the cruft. If you succeed in putting together an elegant and efficient source "proverb" library, send it to me.

    Keep a logbook of your programming efforts. Toss your core dumps.

    Top.

    Don't feel guilty about not following a particular development methodology completely. The most useful programs I've seen have been cobbled together ad hoc. Projects I've seen follow the Rational Unified Process, Extreme Programming or Aspect Oriented Programming methodologies to the letter have died over budget and behind schedule.

    Get comfortable with revision control.

    Be kind to your O'Reilly books. You'll miss them when they're gone.

    Maybe you'll become famous, maybe you won't, maybe you'll win the Turing, maybe you won't, maybe your codified soul will be ignored sight unseen, maybe you'll crank out the next Mosaic... Whatever you do, don't congratulate yourself too much or berate yourself either - the commercial success of your program depends more on the whims of lawyers and marketing drones than on the technical merits of your application or the effort you've put in. So does everybody else's.

    Enjoy your home computer, use it every way you can... Don't just use it to hack away at code or to automate your bedroom lights or to run a website, it's the greatest tool for self-expression you'll ever own.

    Learn... even if you have nowhere to do it but from a weblog forum. Read the responses, even if you don't agree with them. Do NOT read Ziff-Davis magazines, they will only make you feel like a long-haired, smelly zealot.

    Get to know your kernel, you never know when you'll need to recompile it yourself. Be prudent about your patches, they are the best trace of development and the means most likely to keep you safe and relevant in the future.

    Understand that CASE, GUI and other developer tools come and go, but for the precious few you should truly know. Work hard to bridge the gaps between convenience and uti
  • Assembly (Score:2, Interesting)

    by fadethepolice (689344) on Friday February 06, 2004 @12:47PM (#8203143) Journal
    I definitely agree with the idea that computer programmers need to learn assembly and other low-level programming skills. I started my education as an Electronics Engineering Technician in D.C. (I used to teach NSA techs how to build FM radios in the electronics lab). Anyway... my career has been mostly writing macros, maintaining networks, user support, light engineering, etc., and I feel that my in-depth knowledge of computers has helped me immeasurably in these higher-level areas of software maintenance. I laugh at "Microsofties" that graduate from colleges here in Northeast PA with their MCSE... I outperform them on a regular basis.
  • Not a new idea (Score:2, Interesting)

    by RCR (79301) on Friday February 06, 2004 @01:31PM (#8203744)
    In 1978 I had a liberal arts degree and no job. I borrowed $2000 and went to a trade school to learn computer programming. That $2000 has paid off big time over the last 26 years and continues to do so. The course started with a simulated, 1000-memory-cell decimal "computer"; we programmed it in assembler. We then went to IBM 360-style assembler (all this was on punched cards), followed by Fortran, RPG, and COBOL.

    I got a job right away for a company that made data comm hardware and, naturally, used assembler. I didn't touch a high-level language till 1984 (and that was C).

    As the years have gone by, the trade (NOT profession!) has become more and more the domain of the "computer scientist". These are the guys who learned how to write compilers in school. Often, they're disappointed when the job involves something more mundane. These are also the guys who have trouble with pointers and other basic concepts because they don't really know what's going on on the machines they program.

    German machinists used to spend a year doing nothing but filing as part of their training. Nothing but filing. These were people who honored their craft. Assembler is the computer equivalent. Without it, you might be a "scientist", but you'll be a lousy craftsman.

    I once worked for a guy who learned programming in the Air Force in the 60's. His motto: "It's all just loads and stores and branches."

  • by Mandatory Default (323388) on Friday February 06, 2004 @02:38PM (#8204796)
    It's clear you weren't writing code 30 years ago and you have no idea what it was like back then. I was there. If you had ever tried to write an entire application in 2K, you'd know that every trick Mel played saved precious memory and/or cycles. Back then, hardware was expensive, people were cheap. 4K words of memory cost more than most people made in a year (and no, 4K words was not 4KB) - and that pricing was long after the drums described in this article. The optimizers were interesting toys, but that was about it.

    Your talking about documented code makes pretty clear how little you know on the subject. He was writing machine code. Not assembly. Not C. Not Perl. Machine code. There's nowhere to put comments. Even if there were, you'd never waste a machine word on them.

    If you've never had to get a forklift to help you lift your computer, you aren't qualified to pass judgment on Mel.
  • by GorillaTest (742268) on Friday February 06, 2004 @08:15PM (#8208682)
    It's a contrived simple assembler they used to use at AT&T to teach programmers assembler. This is not a new concept. This kind of touches on one of my themes, that being that most younger computer science people are mere technicians who are in the field to make money, not because they love it. They don't have a clue about the history of their discipline, nor do they care. I really see this in the 15 to 30 year olds. 99 times out of 100 they really don't have much of a clue. In a way I feel sad for them, because they didn't experience things before the huge monolithic companies (you know, the cube farm sweat shops like Adobe and Apple and Microsoft) got on the scene and pushed lots of wonderful products off the market through business muscle and not survival of the best product. I also feel sad when I try to hire people. Most graduates that I interview know a whole bunch about Visual Basic and SQL, and not much else. (In other words they have no real training at all.)

If a 6600 used paper tape instead of core memory, it would use up tape at about 30 miles/second. -- Grishman, Assembly Language Programming
