Learning Computer Science via Assembly Language
johnnyb writes "
A new book was just released which is based on a new concept - teaching computer science through assembly language (Linux x86 assembly language, to be exact). This book teaches how the machine itself operates, rather than just the language. I've found that the key difference between mediocre and excellent programmers is whether or not they know assembly language. Those who do tend to understand computers themselves at a much deeper level.
Although unheard of today, this concept isn't really all that new -- in years past there often wasn't much choice. Apple computers came with only BASIC and assembly language, and there were books on assembly language written for kids.
This is why the old-timers are often viewed as 'wizards': they had to know assembly language programming. Perhaps this current obsession with learning using 'easy' languages is the wrong way to do things. High-level languages are great, but learning them will never teach you about computers. Perhaps it's time that computer science curriculums start teaching assembly language first."
The Art of Assembly book (Score:1, Interesting)
Knowing assembly language is vital, mostly (Score:2, Interesting)
It may not be so important though, if you're a database programmer, or if you're dealing with high-level languages such as Java or something.
Computer Science Curriculums (Score:3, Interesting)
Here at the University of New Brunswick (Canada), they may not teach assembly language "first", but we do have a second-year course dedicated to assembly and the inner workings of computers. My only problem with it is that we learn the Motorola M68HC11 CPU and not a current one. Sure, it's easier to learn and understand, but most computers we work on today are x86-based.
My 2 cents.
For the record (Score:5, Interesting)
I think it's a little weird to call this "Learning Computer Science via Assembly Language." It's programming, not computer science. Computer science is really only marginally about computers. It has to do more with algorithms, logic, and mathematics.
You can study computer science, and produce new knowledge in the field, without ever touching a computer.
This misunderstanding is, I think, part of the reason so many students drop out of CompSci. They head into it thinking it's about programming, and are startled to find that computation and programming are not equivalent.
That's why the Compilers course at PSU is considered the "filter" which kills all the students who aren't really interested in computer science. They really need to spin off a separate "Software Engineering" school for these students, since what they really want to study is programming.
Syntax, OS interfaces... (Score:5, Interesting)
Second, OS interfaces for making system calls (e.g., to read files, open network connections, etc.) are different in Linux versus DOS or Windows.
Re:Linux x86 assembly? (Score:5, Interesting)
Yes. Although it requires understanding the CPU's native capabilities to the same degree, Linux uses AT&T syntax, whereas most of the Wintel world uses (unsurprisingly) Intel/Microsoft syntax.
Personally, although I far prefer coding C under Linux, I prefer Intel syntax assembly. Even with many years of coding experience, I find AT&T syntax unnecessarily convoluted and somewhat difficult to read through quickly.
The larger idea holds, however, regardless of what assembler you use. I wholeheartedly agree with the FP - People who know assembly produce better code by almost any measurement except "object-oriented-ness", which assembly makes difficult to an extreme. On that same note, I consider that as one of the better arguments against OO code - It simply does not map well to real-world CPUs, thus introducing inefficiencies in the translation to something the CPU does handle natively.
Re:Your book? (Score:5, Interesting)
Re:Not So New Concept (Score:1, Interesting)
Does anyone else feel this way, or am I just an oddball? (which is most likely the case)
Going even farther than just assembly (Score:2, Interesting)
Re:Good idea, Bad Idea (Score:3, Interesting)
What we need, if we want to stay at the forefront, is fewer programmers of a higher quality. For the past decade we have had too many people who never got turned off by CS. So they enter the workforce as commodity disposable workers, and they tarnish the reputation of you guys who are good, too. Now our jobs go to India, and either they have better or equal workers and things will never come back, or they suck and we can keep churning out assembly-line (not assembly) programmers.
Or we can start teaching people to be great programmers and start another tech revolution with actual American (European/wherever you are that is a victim of outsourcing) workers.
Re:Linux x86 assembly? (Score:5, Interesting)
For example, on the 6502 family (like the 6510 from the C64), you have only three registers: X, Y, and A. These registers can only hold a byte each. Most of your variables are stored in the zero page, a 256-byte range from address $00-$FF.
Then on the 68k CPU (as in the Amiga, Atari, etc.) you have several more registers, which can be used more freely. You have D0-D7 data registers and A0-A7 address registers. These can be operated on as bytes, words, or longwords as you wish, from wherever you want.
x86 assembly is written the "wrong way" around, and is pretty confusing at times. Where I would say "move.l 4,a6" on the 68k (source first), I have to say "mov dx,4" on the x86 (destination first). Takes a few minutes to adjust each time.
Once you master assembly language on one CPU, it's pretty easy to switch to another.
I still think the 680x0 series are the best.
Not a good idea (Score:3, Interesting)
Re:Good idea, Bad Idea (Score:3, Interesting)
The problem with your idea is that the "simple to use" languages that are most often taught are too simple and are too many generations removed from what the computer was actually doing. Years ago, even using BASIC on a TRS-80 would teach you a lot about how the machine worked. To get the best performance out of the thing you had to at least be familiar with the memory map, and that inevitably led you to further investigation.
I gather that Visual Basic is among the more commonly used introductory languages, but when so many layers are placed between the programming language and the machine how are you supposed to learn what's going on at the bottom?
yep (Score:2, Interesting)
An off-the-shelf book I liked and found useful was "The Art of Assembly Language" by Randall Hyde. I liked it better than my textbooks, anyway.
Race car driver analogy (Score:2, Interesting)
>>To be a programmer without ever learning assembly language is like being a professional race car driver without understanding how your carburetor works.
A race car driver is a high-level user of the car (more akin to a financial analyst using Excel). Why would a high-level user care about HOW the carburetor works? All he has to tell the pit crew is how the car is behaving. A professional race car driver's job is to drive the car faster than the guy or gal next to him. End of story.
Now the mechanic or race car engineer is a different story.
Languages I studied for my CS degree... (Score:2, Interesting)
1st Year: Pascal, PDP-11 assembly
2nd Year: 68000 Assembly,
3rd Year: Ada, Eiffel, Modula-2, Smalltalk, VHDL, EDIF
Interestingly, despite mostly using UNIX hardware, C was left to be self-taught, though it could be used in place of Pascal.
The point really was to use appropriate tools for appropriate jobs: thus we learnt OS/kernel basics in assembler, likewise IRQ concepts and HAL stuff. For algorithmic work we used a mainstream strongly typed high-level language. For trends yet to propagate to the commercial world (some doomed ones!) we learned exotic academic languages.
Assembly is as valid today as it was in the 1950s; it's just that it's most appropriate for only some tasks that a minority of programmers do: deeply embedded microcontroller SW, OS kernel work (take a look in our beloved Linux kernel code), hardware bootstrap, hard real time, etc.
Good Idea, not appreciated by PHB's, tho (Score:2, Interesting)
I've found I sometimes get blank stares when discussing memory usage, I/O bottlenecks, or code optimization in front of PHBs. All they want is crap that runs, and if they've got the money they'll throw it at buying more power. Sadly, I've seen BSCS people who care more about getting a project done than getting it done well, often with hideous-looking code that can choke the fastest quad-processor servers. A little time spent considering how code might be written to minimize impact or organize I/O more efficiently isn't even encouraged anymore, where it was once a hard rule.
Look at VB.NET: it's the very embodiment of "just code and don't worry about it." So much of the work is buried in libraries/namespaces that you really have no idea what impact calling a routine will have.
Today's lesson: Screw finesse, just throw more CPU and memory at it.
Assembly language not required... (Score:3, Interesting)
See, I'm sort of torn. I like this idea, since I feel most programmers these days are sorely lacking in the fundamentals. However, there are advantages to starting higher. One major one I can think of is that when you start so low-level, there are certain ideas that may never occur to you. Certain ways of solving problems won't come up, because you "know" how the computer works and therefore adapt your thinking to it, seeing things very narrowly. I believe we should be working toward adapting the computer to us instead. And when students start out learning at higher and higher levels, they start thinking at higher and higher levels, and see connections from their "low-level" beginnings to a higher-level ideal without being bogged down in the little details. I think that would help drive innovation.
This is BS (Score:2, Interesting)
Ok, let's get real. This is equivalent to saying that unless a farmer gets on his hands and knees to plant one seed at a time, he's not a real farmer. Sure, he knows how to operate a $100,000 tractor that can plant several million seeds each day, but his hands-on knowledge of planting seeds isn't anywhere near what his great-granddad's was, so he must be less of a farmer... even though his output is thousands of times greater.
HELLO! It's called technology. It's all about advancement. We no longer have to type in the 1's and 0's ourselves. Hell, very few of us need to use assembly. Why was C written? Because all the fucked-up assembly languages in the world do not work together. A program written in XYZ assembly will only run on a XYZ computer, etc.
Today, programming is high-level. Write once... run everywhere (Java, Python, C#). Let's leave this assembly crap, and it is crap, where it belongs... back in the dark ages!
Re:Linux x86 assembly? (Score:3, Interesting)
Amen! As somebody who in the days of yore coded a Mandelbrot viewer on an Atari ST (68000) using only integer arithmetic and registers (also shifts instead of muls whenever possible, as the 68000's MUL was so slow; at the time I would've killed for the 68000 to have a barrel shifter), I can certainly sympathize with that: x86 CPUs always seemed to have way too few registers for my taste.
Re:Linux x86 assembly? (Score:5, Interesting)
True. This topic, however, goes beyond merely maximizing program performance. Put simply, if you know assembler, you can take the CPU's strengths and weaknesses into consideration while still writing readable, maintainable, "good" code. If you do not know assembly, you might produce simply beautiful code, but then have no clue why it runs like a three-legged dog.
it is significantly better value to design and build a well architected OO solution
Key phrase there, "well-architected". In practice, the entire idea of "object reuse" counts as a complete myth (I would say "lie", but since it seems like more of a self-deception, I won't go that far). I have yet to see a project where more than a handful of objects from older code would provide any benefit at all, and even those that did required subclassing them to add and/or modify over half of their existing functionality. On the other hand, I have literally hundreds of vanilla-C functions I've written over the years from which I draw with almost every program I write, and that require no modification to work correctly (in honesty, the second time I use them, I usually need to modify them to generalize better, but after that, c'est fini).
Who cares if it's not very efficient - it'll run twice as fast in 18 months
Y'know, I once heard an amusing joke about that... "How can you tell a CS guy from a programmer?" "The CS guy writes code that either won't run on any machine you can fit on a single planet, or will run too slowly to serve its purpose until technology catches up with it in a few decades." Something like that - I killed the joke, but you get the idea.
Yeah, computers constantly improve. But the clients want their shiny new software to run this year (if not last year, or at least on 5-year old equipment), not two years hence.
Re:Your book? (Score:5, Interesting)
Had he admitted that it was his own book, I might've actually read about the book, read the summary, and seen if I was interested (as I'm a developer and have a degree in CS).
But when he doesn't admit that and writes obviously biased remarks regarding knowing assembly to be a good programmer, I can't help but view it skeptically. In fact when I saw the response that it was his book I didn't even bother reading it anymore. All the words he posted lose credibility.
And considering I got Score:5 quite quickly, I have a feeling other people agree with me. People don't like being "tricked" into buying stuff. It's the same reason people don't like vendor lock-in and hate Microshaft.
If you tell people it's your book and you give an open and honest review/opinion regarding it, people will actually respect that and read about the book. Hey, it's just my guess, but I think I'm right.
Re:Not So New Concept (Score:3, Interesting)
20 years ago, HP calculators used ASM as their programming language - though it wasn't called ASM by HP. I didn't realize it until I started learning Z80 ASM and noticed it was almost the same as my HP-15C's language.
Except for the HP-16C, those calculators weren't targeted at programmers, so I guess there must be lots of people out there who know ASM without realizing they do...
Re:Not So New Concept (Score:5, Interesting)
Re:Linux x86 assembly? (Score:2, Interesting)
Because they have had the desire to create layer upon ugly layer of syntax, the language just doesn't map down, as you say. Look how long it took to get a decent C++ compiler (5+ years). As well, the basic idea of data structure + associated functionality has been totally lost in the jumble of inheritance and public/private/protected.
That said, I do disagree that an assembly programmer will fail the "object-oriented-ness" test. As stated at the start, it's all a matter of organization, and, IMO, that is where success or failure is determined. Sure, organizing a project that uses an OO lang is going to be easier than organizing a project that uses assembler, but that doesn't mean that it can't be done. If you look at my journal you'll see how much I value code organization ~ I say it determines the success of the project, regardless of prog lang choice.
Peace & Blessings,
bmac
for true peace & happiness: www.mihr.com
Really -- the rich and cutting edge disagree (Score:1, Interesting)
Consider Paul Graham: Beating the Averages [paulgraham.com]. Considering that he's the kind of guy that gets invited to MIT as a language wizard [mit.edu] (along with Guy Steele, David Detlefs, Martin Rinard, Jonathan Rees, and David Moon), and considering that he has a net worth in the (hundreds of) millions, I'll go with his choice: abstraction rules.
Erann Gat from JPL [nasa.gov] (the guys who get to send robots to Mars and build autonomously controlled space probes) disagrees [flownet.com] and empirically proves it.
Diamond Walker emphatically endorses [google.ca] LISP - his story is very compelling.
The list goes on and on: Dan Friedman, Mitchell Wand, Shriram Krishnamurthi, Christian Queinnec
But, to paraphrase Graham, if you don't know these people I guess I don't have to worry about you.
Actually, they DON'T. (Score:5, Interesting)
Actually, they don't.
A study was done, some decades ago, on the issue of whether compilers were approaching the abilities of a good assembly programmer. The results were surprising:
While a good assembly programmer could usually beat the compiler if he really hunkered down and applied himself to the particular piece of code, on the average his code would be worse - because he didn't maintain that focus on every line of every program.
The programmer might know all the tricks. But the compiler knew MOST of the tricks, and applied them EVERYWHERE, ALL THE TIME.
Potentially the programmer could still beat the compiler in reasonable time by focusing on the code that gets most of the execution. But the second part of Knuth's Law applies: "95% of the processor time is spent in 5% of the code - and it's NOT the 5% you THOUGHT it was." You have to do extra tuning passes AFTER the code is working to find and improve the REAL critical 5%. This typically was unnecessary in applications (though it would sometimes get done in OSes and some servers).
This discovery led directly to two things:
1) Because a programmer can get so much more done, and working correctly, in a given time using a compiler rather than an assembler, and the compiler was emitting better assembly on average, assembler was abandoned wherever it wasn't really necessary. That typically means:
- A little bit in the kernel where it can't be avoided (typically bootup, the very start of the interrupt handling, and maybe context switching). (The Unix V6 kernel was about 10k lines, of which 1.5k was assembler - and the assembly fraction got squeezed down from then on.)
- A little bit in the libraries (typically the very start of a program and the system call subroutines)
- Maybe a few tiny bits embedded in compiler code, to optimize the core of something slow.
2) The replacement of microcoded CISC processors (e.g., PDP-11, VAX, 68k) with RISC processors (e.g., SPARC, MIPS). (x86 was CISC but hung in there due to inertia and cheapness.)
Who cares if it takes three instructions instead of one to do some complex function, or if execution near jumps isn't straightforward? The compiler will crank out the three instructions and keep track of the funny execution sequence. Meanwhile you can shrink the processor and run the instructions at the microcode engine's speed - which can be increased further by reducing the number of gates and the length of wiring - and end up with a smaller chip (which means higher yields, which means making use of the next, faster fab technology sooner).
CISC pushed RISC out of general-purpose processors again once die sizes got big: you can use those extra gates for pipelining, branch prediction, and other things that let you gain back more through parallelism than you lost by expanding the execution units. But RISC is still alive and well in embedded cores (where you need SOME crunch but want to use most of the silicon for other stuff) and in systems that don't need the absolute cutting edge of speed or DO need a very low power-per-computation figure.
The compiler advantage over an assembly programmer is extreme both with RISC and with a poorly-designed CISC instruction set (like the early x86es). Well-designed CISC instruction sets (like PDP11, VAX, and 68k) are tuned to simplify the compilers' work - which makes them understandable enough that the tricks are fewer and good code is easier for a human to write. This puts an assembly programmer back in the running. But on the average the compiler still wins.
(But understanding how assembly instruction sets work, and how compilers work, are both useful for writing better code at the compiler level. Less so now that optimizers are really good - but the understanding is still helpful.)
A better assembly language (Score:3, Interesting)
Thanks for reminding the forum that assembler is already one step removed from the actual machine language. I threw myself at several assembler languages and never got the result I desired. I loved the logic but hated the syntax.
Personally, I wish that, in the evolution of computer languages, we had spent a little more time evolving a somewhat more intuitive set of languages at the assembler level. As there is still a great deal of programming that really should be done at the machine level, an assembler rival might help spur on the use of Linux.
The book is going into my wish list.
Re:Not the point! (Score:3, Interesting)
AND(a,b) = XOR(XOR(a,b),OR(a,b))
You might be right about the cost of gates; I have no idea. Ok, it was a dumb joke.
Re:I taught myself assembly, and I disagree (Score:2, Interesting)
There's more to life than money.
Re:nah (Score:4, Interesting)
Also, you can use object-oriented principles in assembly, just like you can in C. There aren't any convenient keywords or enforced methodologies (i.e., everything is an object), but you can gather sections of code that tie data and associated functionality together. You learn the advantages of modular programs. You learn how to count in binary. It forces you not to code carelessly. You learn good coding principles that you can apply to higher-level languages. In general, you get another tool in your toolbox. I'd say there are different flavors of programming, and they aren't mutually exclusive, e.g.:
Object Oriented
Procedural
Event-Based
Interpreted
Managed Memory (garbage collection)
Assembly
To leave one of these out of a CS program leaves me wondering how good the program is.
Re:Linux x86 assembly? (Score:5, Interesting)
I don't know why, but just saying the words 'assembly language', sends a chill down my spine. I guess I am too weak minded to learn it.
Maybe individual brains just work in different ways. In school, I knew some people who were good with high-level languages but just couldn't hack assembler. They could not get down to that absolute minimal step-by-step instruction level. I'm not sure what that says about those of us who use assembler. :) BTW, I certainly don't advocate assembler as a first computer language - second, perhaps.
On programming in a CS curriculum (Score:2, Interesting)
Anyway, that's what I learned about computer science. There are apparently a lot of schools that teach *programming*, and I find that to be a real shame.
It's great (Score:2, Interesting)
I programmed 8510 assembly on my Commodore 128 as a kid. Yeah, I had to, due to the limitations of BASIC.
There's NOTHING like knowing how a computer works to make a computer work, ya know?
Re:Syntax, OS interfaces... (Score:2, Interesting)
Intel specified the syntax they did because their chips implement the instructions in that same order. You can obviously switch things around as part of the 'macro' in 'macro assembler' (and apparently gas did so) but it seems more 'natural' to stick with how the processor works. Anyway, it does to me.
If you were masochistic enough to look at the numeric code you're producing, it might get confusing if the operands were switched around from what you were expecting. And there have been times when it's been useful, if not fun, to look at the actual bits.
Maybe if I were better at assembler I wouldn't have to trace it that way, dunno.
Teach easier languages first (Score:2, Interesting)
Basically, I liked my education where I was taught concepts in the context of a language. You want to learn about memory allocation? Teach pointers and things in C. You want to learn about object oriented models... use Java or C++. You want to learn about how a computer functions at a base level, learn assembly. The language itself should never be the end goal, it should be a means to learn the concepts. The best programmers I've encountered are ones who have ideas, not ones who can't think outside a particular language.
Elitist, perhaps? (Score:3, Interesting)
Yeah, as a mathematics major I've often felt that number theory is glossed over far too often in high schools. Kids really ought to learn why 1+1=2 and how to strictly define multiplication of integers before getting into such oversimplifications of mathematics as the Pythagorean Theorem.
I think the submitter is proceeding from a false assumption. Anyone who's really pursuing a BS in Computer Science (from a reputable school, anyway) is probably going to get a course in assembly language whether they like it or not, and anyone who hasn't earned a BS shouldn't be calling themselves a "computer scientist."
Re:Linux x86 assembly? (Score:3, Interesting)
IMHO, DOS is the perfect OS to learn assembly-language programming on - no restrictions at all (and fast reboots when you fuck up). But maybe I'm just biased because I program DSPs with no operating system whatsoever
Tim
Anyone ever seen Steve Gibson's stuff? (Score:3, Interesting)
Most of the progs are under 30k in size, including a very cool sub-pixel font-rendering demo, and ones to disable Messenger, DCOM, and UPnP. A really nice touch is that some of them have sound FX, produced by a simple virtual synth, also coded in assembler... just 'cause he could (a true geek!)
Nah, it's beautiful (Score:1, Interesting)
All the kinks and weirdosities in x86 make programs written in it beautiful, like an artist who paints with his toes. I even had a chance to use the JPE instruction once; it was a beauty to behold!
Re:Linux x86 assembly? (Score:4, Interesting)
I learned assembly as an undergraduate at Penn State (before attending a year of grad school at Drexel), but what I got out of the course had far more to do with understanding architecture (something not relevant for most developers, but much more relevant for hardware engineers).
Assembly does not have any of the high-level features (OOP, libraries, etc.) that developers need to know these days. It's rarely used, even in embedded programming, since C/C++ compilers have gotten quite efficient and are available even for open-core (similar to open source) processors for use on FPGAs.
On the other hand, assembly is important for computer engineering undergrads and graduates interested in architecture, and having taught in the CompEng department there, I can say that the depth of assembly coverage in the curriculum is not sufficient.
Chills? It Ain't That Bad. Just Avoid x86 To Start (Score:3, Interesting)
However, when my then employer needed a high-performance CPU for their missiles, they designed what in later days would have been called a RISC chip. The instruction set was all of maybe 20 instructions, so few that many of us found ourselves coding patches directly in hex. There were just a few general-purpose registers and four pages of memory to worry about.
It was elegant, straightforward, and only took a few sessions writing patches to get the hang of. Once over the hurdle of writing a few lines on an embedded system like that, taking the next step and coding to an API of an open source system ain't too much bigger of a deal.
I think John von Neumann would agree: learn ASM! (Score:3, Interesting)
If you want high-performance code, you must understand procedural programming and assembly language. You must understand the components of the modern "von Neumann" architecture [princeton.edu]: RAM, registers, L1 cache, L2 cache, ALU, etc.
While everyone has gone OOP (Object Oriented) crazy, the "von Neumann" architecture is NOT optimized for OOP programming. Because modern CPUs have lots of cache, the latency between the CPU and memory is reduced. This is called "faking" memory bandwidth; read this article on the von Neumann bottleneck [knozall.com].
Serious coders should learn ASM, then move to a higher-level language like C, then see how the C statements compile to ASM and analyze the efficiency.
Modern wisdom says: be wasteful, vendors will make bigger/faster machines, and we won't have to care that our code is slow, inefficient, and not optimized for the architecture. Keep in mind, you can save substantially on hardware expenditures by hiring good coders who know how to tune and optimize code; but if you don't want to be bothered, just plan on large capital expenditures every couple of years. Also, write everything in Java and make sure you create indexes on every column of every table in your database for faster lookups... (I am joking, don't really do this.)
Book is GPL (Score:1, Interesting)
What Goes Around, Comes Around (Score:1, Interesting)
Of course things were much easier then; we were on the 7090, which is a "direct address" architecture, that is, the actual address (all 15 bits of it!) is part of the instruction. There were index registers, and occasionally you might code an address of 0 and use the index register as a "base" or "segment" register, but that was the exception. The 7090 architecture had its glitches (index registers were subtracted from the address), but by and large it was much simpler than most of today's machines, and much easier to learn assembler for.
Oh, after we finished Assembler, then we moved to a "High Level" language
By the way, I'm doing this post as AC because I've got lots of copies of my resume out there, and not one so much as hints at my having been working in 1962
SICP (Score:3, Interesting)
For a first course in CS, I think it would be hard to do better than one based on Structure and Interpretation of Computer Programs
This book takes one from zero to writing a compiler in a few hundred pages, including a chapter on writing code for register machines which gives the student a good idea of what is going on "under the hood."
To those who would say that Scheme is useless outside of academics, I would counter that once the concepts in this text are mastered, it is easy to transfer them to other languages.
Amen! (Score:2, Interesting)
These days, kids come out of school unable to manipulate simple pointers. Why? Because the sissy languages they use don't even USE pointers, so they were never TAUGHT pointers. God forbid they have to figure out how a machine is going to execute their code!
What they do know how to do is read. They read what's going to happen in the future. They read about the NEXT version of Java and its speed and other associated wonders. They read about how feature xyz is supposed to work (I read the source code for feature xyz; it doesn't work that way). They read about the flying car they're going to be cruising around in in a few scant years.
As a fallen Java evangelist (not to pick on Java, but it deserves it), I've learned that there are a few constants in that particular language. It's slow as balls; it has been since I was using it in alpha 7 or 8 years ago, and every version promises that the next will be faster than C. Container-managed beans suck; for 5 years, they've been saying that the mythical container (read as: someone else) will magically optimize them, which we all know is bollocks. The more fictional something is, the more it's talked about: Java's been talking about Jini for something like 4 years now, with absolutely nothing to show for it.
If students spent more time learning assembly and less time fluffing around with "references" (which is short for "retarded pointers"), they'd be much more bitter and cynical... Like me.
Re:Your book? (Score:3, Interesting)
Two things:
1) whether or not I wrote the book is irrelevant as to whether or not it makes a good Slashdot story.
2) would you be offended if I told you that the back-cover text, which was also written by me, is also written in the third person?
Anyway, I'm not quite certain where you think the trickery or deception is. The only thing I can see from your post substantiating this is:
"But when he doesn't admit that and writes obviously biased remarks regarding knowing assembly to be a good programmer, I can't help but view it skeptically."
I agree that the remarks are biased in the way all remarks are biased. However, I think you are reversing the order of the biases. It is _because_ I believed assembly language essential to being a good programmer that I bothered to write the book in the first place.
Anyway, I'm sorry if I offended you, but I am not sorry for how I worded my article submission.
Re:X86??? OMG that sucks... (Score:3, Interesting)
It only seems that way because you haven't been exposed to the really wretched ones.
There were processors that had only one word of stack, processors that needed special instructions to cross page boundaries, processors that had only two registers, processors that couldn't add or subtract (forget multiply and divide), and even processors that didn't have linear instruction sequences (the program counter was an LFSR).
The whole x86 family even as far back as the 4040 has a much better instruction set than many of the (now thankfully dead) processors.
From a ring counter to OOP (Score:5, Interesting)
That was in 1965, and the unit was the only one for our whole school district, worth about $30,000 (CDN - about $35,000 US at the time).
From there to today's Object Oriented Programming languages has been an interesting time. I wouldn't have missed it for anything, and I honestly think that living through it has given me a perspective that many more recent programmers don't have and IMHO need, sometimes.
Where "brute force and ignorance" solutions are practical, there is no gain in knowing enough about the underlying hardware and bit twiddling to make things run 1000% faster after spending 6 months re-programming to manually optimize. In fact, since (C and other) compilers have become easily architecture tuned, there really are few areas where speed gains from hardware knowledge can be had, let alone made cost effective. Most are at the hardware interface level - the drivers - most recently USB for example.
If you're happy programming in Visual Basic, and your employer can afford the hardware costs of ramping up your single-CPU "solution" to deal with millions of visitors instead of hundreds, then you don't need to know anything about the underlying hardware at the bit level.
On the other hand, if you need to wring the most "visits" out of off-the-shelf hardware somehow, then you need to know enough to calculate the theoretical CPU cycles per visit.
Somewhere between these two extremes lies reality.
Today I use my hardware knowledge mostly as a "bullshit filter" when dealing with claims and statistics from various vendors. I have an empirical understanding of why (and under what circumstances) a processor with 512 Megs of level 1 cache and a front-side bus at 500MHz might be faster than a processor with 256 Megs of L1 cache and an 800MHz FSB, and vice versa. Same thing for the cost effectiveness of SCSI vs. IDE when dealing with a database app vs. access to large images in a file system (something that came up today with a customer when spec'ing a pair of file servers, one for each type of application).
Back in the mid 70s I dealt with people who optimized applications on million $ machines capable of about 100 online users at one time. Today I deal with optimization on $1000-$3000 machines with thousands of 'active' sessions and millions of 'hits' per day. Different situations but similar problems. Major difference is in the cost of "throwing hardware at the problem" (and throwing the operating systems to go with the HW - but then I use Linux so not much of a difference ;)
Bottom line is that understanding the underlying hardware helps me quite a bit - but only as a specialist in optimization and cost-effectiveness now, not in getting things to work at all as in the past.
I Suggest FORTRAN (or C as a second choice) (Score:3, Interesting)
The goal of a first "computer science" class should not be to merely teach technical skills (programming, debugging, program design) but to also give the students an understanding of how computers work. Even with today's proliferation of home computers, most people (student or otherwise) have utterly no idea how a computer works internally since their exposure is limited to simply installing software someone else wrote. Assembler is too far on the side of "how the CPU works" without giving the student any better insight into how the whole system actually operates (memory, CPU, storage, peripherals, etc.) than a higher level language while requiring much more effort by the student to accomplish anything useful. This would seem to indicate that a second generation language such as FORTRAN (or C) would be more likely to let the course delve into programming, machine operations, machine organization, etc. It also means that the students can produce "interesting" programs fairly early on which will keep more people interested (which would you rather have: a PHB MBA with no programming classes or one that has at least sat through enough classes to have some understanding of what goes on inside the box and how difficult that can be to accomplish).
One other complaint against assembler as an introductory teaching language is that, depending on the specific assembler, it is usually difficult to see the overall program structure even after the program is complete. This is primarily due to the low implementation level, which leaves simple control structures (if-then-else, do-for, do-while, etc.) buried in the assembler syntax of loading and testing registers, smeared out over what is generally multiple individual statements. The student *may* end up with an appreciation of what it takes to implement these structures but will lose sight of the trees (control structures) and never even notice the forest (overall program design).
Re:"Tight" code not always necessary (Score:3, Interesting)
Assembly is different. All languages eventually turn into assembly. Knowing how the stack operates, what a stack frame is, what a register is, what addressing modes are available, etc. - all of it bears on ALL programming languages.
For example, in C, where should you place the most-used member of a struct? Why, in the first position, since at offset zero it can be accessed with plain indirect addressing rather than base-plus-displacement addressing.
Why do Scheme and C often have trouble interacting? Because Scheme's continuations mean that objects on the stack have to be able to live forever, meaning that the hardware stack is near useless.
Also, many people who have trouble understanding pointers in C are able to "get it" after learning assembly language because it makes the concept far more concrete.
Why are OO calls more costly than straight function calls? Because the method has to be looked up in a virtual method table first, before it is called. Not only does the lookup cost cycles, but the indirect jump can kill your pipeline.
Anyway, these are the types of issues that all programmers in all languages have to contend with. It is easier to understand these types of concepts if the programmer knows assembly language.
Re:Linux x86 assembly? (Score:2, Interesting)
FPGA = Field Programmable Gate Array.
Damn, I really have too many acronyms / abbreviations to remember these days.
Re:Not So New Concept (Score:2, Interesting)
Later, in a compiler course, we had to compile a subset of C++ to SPARC asm, and also use an emulator to execute that, and also do a memory dump analysis, though we did have pseudo ops to display output too.
Also, we didn't really have any classes that taught languages and programming per se, other than the entry-level CS courses, which taught using Pascal. I learned C in an EE course, and FORTRAN when I was still in Aero (by the time I switched from AAE to CS, I already knew FORTRAN & C). We learned C++ specifically for a compiler course; for most of us it was our first time using a language supporting OO at the language level (of course, it was just a cross-compiler from C++ to C at that time). Other languages we picked up specifically for the courses that needed them.
Something else for you to realize is the caliber of the education you receive at Purdue. In the many years since I left and started working professionally, I have worked with fellow professionals from many universities. The CS program, and the EE/CEE programs, at Purdue are among the very best. I worked at Cray Research, back in the day when Seymour was still around. They recruited heavily from Purdue for a reason: they trusted two PU CEE interns enough to be solely responsible for developing the pre-hardware emulator for the first MPP Cray machine, used for porting the OS and testing the architecture. The same was true at Xerox and Motorola: heavy Purdue recruitment. I've been in a position to interview and evaluate a good number of candidates from many universities, and I can say that Purdue is very much top tier.
Unfortunately, lots of other universities don't have "real" CS/CEE programs. A good number are degree mills, producing narrowly educated graduates who are not well equipped to learn new technologies and methodologies throughout their careers, because they were never challenged and never exposed. I have run into "CS" graduates who learned in nothing but a pure MS GUI environment, using IDEs, and who were never taught how computers really work, how to build language translators/compilers, or how to develop an operating system. There are CS graduates who have never touched a Un*x command line. It's sad. The Internet-driven tech bubble has, perversely, devalued CS as a curriculum, it seems.
Larry
Question (Score:2, Interesting)
Every program you write is run inside the operating system, right? So how does an OS switch between all those tasks if it seems like you're intimately working with the hardware? You put something in a register; well, what if a new task comes along and wants to use that register also?
Is everything virtualized from the kernel, sort of as if the kernel is the machine? How do you differentiate between accessing kernel memory and accessing virtual memory?
Also, doesn't the operating system implement the RTS and other basic features? If you're programming a kernel, do you not have the RTS available?
I think that all these questions are related. Would someone please enlighten me?
PDP-11 C / Origin of gcc (Score:3, Interesting)
I'm not sure "inspired" would be the right way to say that. C was invented as a shorthand for assembler, in particular PDP-11 assembler. I'm probably just being pedantic, but I think it's an important distinction.
We owe a lot to those machines, by '74 UNIX and C were available (barely) from Bell Labs but by the late summer of 76 Dave Conroy at Teklogix in Mississauga, Ontario, had written and made work the only C compiler not written by Bell Labs, which ran under RSX-11M. This became DECUS C, and then gcc.
I worked there between high school and university; Dave taught me C to test his compiler, and he must have got all of about $1200 for writing it, as it only took him a few weeks. It was, of course, written entirely in assembler.
Knuth (Score:1, Interesting)
*shrug*
Re:Not the point! (Score:2, Interesting)
You used CMOS?? Luxury! You probably had a clean-room, too.
We had to make our AND gates out of bubble gum and baling wire. And cut the wire with our teeth!
On a more serious, but equally off-topic, note: back in the mid-1970s I attended an IBM presentation by one of the researchers who built the first IBM disk drive. He described how they coated that very first 14-inch platter: the plan was to put the platter on a record turntable, crank the speed up to 78 RPM, and pour the iron oxide slurry onto the center of the spinning platter from a paper Dixie cup. The theory was that centrifugal force would spread the slurry out evenly across the surface and give them a nice, smooth coating.
Well, they gathered around the turntable, started pouring out the slurry, looked down at the brown streak across all their clean, white lab coats, and said, "Maybe we should turn it down to 33 RPM." True story.
Re:PDP-11 C / Origin of gcc (Score:3, Interesting)
Yes, C is basically an abstracted PDP-11.
Dave Conroy at Teklogix in Mississauga, Ontario, had written and made work the only C compiler not written by Bell Labs
Is this the same Dave Conroy that does FPGA re-implementations of old DEC computers?
http://www.spies.com/~dgc/ [spies.com]
Use C (Score:2, Interesting)
If I could offer you only one tip for your future as programmers, C would be it. The long-term benefits of C have been proved by scientists, and academics, and professionals, and corporations all over the world, whereas the rest of my advice has no basis more reliable than a Microsoft marketing report. I will dispense this advice now.
Enjoy the power and beauty of simple command line tools. Oh never mind, you will not understand the benefits of vi, make, grep and gcc until you're working as a contractor, and that fancy environment you have at home isn't available. But trust me, in 5 years, you'll sit down in front of an expensive IDE, pore over the GUI menus and checkboxes, read the badly written two-inch-thick manual twice, and still not be able to write a program to say "Hello World!" You'll then recall in a way you can't grasp now, the importance of understanding how to get useful work done with the standard, simple tools.
Don't worry about not using the latest and greatest Microsoft tools or language; or worry, but know that worrying is as effective as watching all your carefully crafted code fail because a registry setting or dll you depend on was changed by the installation of some do-nothing program. The real troubles in your programming career are apt to be things that never crossed your worried mind; like a middle manager who wipes out the
Write one script every day to aid you.
csh!
Don't be reckless with other people's test procedures, don't put up with people who are reckless with yours.
Lint.
Don't waste time on hand-optimizing your code; sometimes you're elegant, sometimes you're a kludge... The optimizing compiler will do it better anyway, and in the end you'll have to maintain the source code - so keep it readable.
Remember the useful code snippets you receive, forget the cruft. If you succeed in putting together an elegant and efficient source "proverb" library, send it to me.
Keep a logbook of your programming efforts. Toss your core dumps.
Top.
Don't feel guilty about not following a particular development methodology completely. The most useful programs I've seen were cobbled together ad hoc. Projects I've seen follow the Rational Unified Process, Extreme Programming, or Aspect-Oriented Programming methodologies to the letter died over budget and behind schedule.
Get comfortable with revision control.
Be kind to your O'Reilly books. You'll miss them when they're gone.
Maybe you'll become famous, maybe you won't, maybe you'll win the Turing, maybe you won't, maybe your codified soul will be ignored sight unseen, maybe you'll crank out the next Mosaic... Whatever you do, don't congratulate yourself too much or berate yourself either - the commercial success of your program depends more on the whims of lawyers and marketing drones than on the technical merits of your application or the effort you've put in. So does everybody else's.
Enjoy your home computer, use it every way you can... Don't just use it to hack away at code or to automate your bedroom lights or to run a website, it's the greatest tool for self-expression you'll ever own.
Learn... even if you have nowhere to do it but from a weblog forum. Read the responses, even if you don't agree with them. Do NOT read Ziff-Davis magazines, they will only make you feel like a long-haired, smelly zealot.
Get to know your kernel, you never know when you'll need to recompile it yourself. Be prudent about your patches, they are the best trace of development and the means most likely to keep you safe and relevant in the future.
Understand that CASE, GUI and other developer tools come and go, but for the precious few you should truly know. Work hard to bridge the gaps between convenience and uti
Assembly (Score:2, Interesting)
Not a new idea (Score:2, Interesting)
I got a job right away for a company that made data comm hardware and, naturally, used assembler. I didn't touch a high-level language till 1984 (and that was C).
As the years have gone by, the trade (NOT profession!) has become more and more the domain of the "computer scientist". These are the guys who learned how to write compilers in school. Often, they're disappointed when the job involves something more mundane. These are also the guys who have trouble with pointers and other basic concepts because they don't really know what's going on on the machines they program.
German machinists used to spend a year doing nothing but filing as part of their training. Nothing but filing. These were people who honored their craft. Assembler is the computer equivalent. Without it, you might be a "scientist", but you'll be a lousy craftsman.
I once worked for a guy who learned programming in the Air Force in the 60's. His motto: "It's all just loads and stores and branches."
Re:Somewhere in the middle... (Score:3, Interesting)
Your talking about documented code makes it pretty clear how little you know about the subject. He was writing machine code. Not assembly. Not C. Not Perl. Machine code. There's nowhere to put comments. Even if there were, you'd never waste a machine word on them.
If you've never had to get a forklift to help you lift your computer, you aren't qualified to pass judgment on Mel.
Anyone ever heard of Barbiac? (Score:2, Interesting)