Craig Maloney submitted this review of Addison-Wesley's entry in the tough field of books on C (book title: C), and pulls no punches in comparing it to others. He says it's slightly above average, but that "experienced programmers will likely pass on this book." Read the complete review below for his reasoning.
C (Addison-Wesley Nitty Gritty Programming Series)
author: Klaus Schröder
pages: 400
publisher: Addison-Wesley
rating: 5.5
reviewer: Craig Maloney
ISBN: 0-201-75878-4
summary: A slightly better than average C book with some very good points, but poor delivery.

Lost in the Company of Giants

It's hard not to take a book like C and compare it to such acclaimed and trusted books as K&R, Expert C Programming, and other lesser known, but equally good tomes. Unfortunately C doesn't really compare with many of the other classic books covering the C language. For starters, the writing in this book isn't quite up to the same caliber as the other books. Part of the problem with this book is language. English does not appear to be the author's native language. There are sentences in this book that require a few glances to glean the full meaning. C is difficult enough to present without a language barrier introducing more problems. Another problem is organization. The ideas presented at the beginning of the book are muddled and disjointed, with multiple ideas introduced but not formally explained until later. Beginners will have a terrible time working through this book without becoming quickly confused, and experienced programmers will likely pass on this book in favor of the other well-known books.

Not All Bad

The book is not all bad, however. The examples in the book are plentiful and are based on tried-and-true examples found in books like K&R. There are some idioms that are used in the examples that will irk the more structured programmers (not using braces in certain areas being the biggest example), but most of the examples are pretty good. Also, the explanations of the more advanced topics are relatively good considering how confusing the more basic material is. Memory management is explained well, with clear diagrams (although the programs are a bit confusing without a careful eye).

So What's in it for Me?

Addison-Wesley is clearly marketing this book to the same crowd that purchases quick-learning books. Unfortunately beginners purchasing this book will quickly find themselves lost amid the confusing descriptions in this book. Those who manage to muddle through will find some tasty bits of information locked inside, but the work involved in getting there outweighs the rewards. Most programmers will probably want to leaf through a copy of this book before purchasing it to make sure they'll get the most out of it.

You can purchase C from Fatbrain. Want to see your own review here? Just read the book review guidelines, then use Slashdot's handy submission form.



  • Aaah... (Score:1, Insightful)

    by Anonymous Coward
    I love a review that isn't afraid to say "don't bother"
  • I was just getting into C programming (specifically for a project on my Palm, but also in general using GCC, like for my GBA and Linux)... And I was looking for a good C book the other day but wasn't sure (there are quite a few). I'm an experienced programmer in Java and other languages, so I don't need a basic tutorial on how C works.

    What I need is a book that talks about how to use C in real projects. Gotchas, how to use the STL, etc. Also, I don't really feel like using C++, which seems like too much - even though it's more similar to Java - it looks like the vast majority of apps that I want to explore and use are written just in C...

    Any suggestions?

    • Any suggestions?

      Yes, learn what STL is before you post to Slashdot about C (and then even go on to clarify that you don't want to use C++).

      • Hmmm. Got me.

        This is why I NEED A BOOK you asshole.

        Why don't you put your ego back in your pants and answer the question if you're the fucking expert.


        • Relax! My goodness. He was making a valid point. :-)

          STL (Standard Template Library) is a C++ feature.

          C does not have STL.

          And as far as books go, I would recommend "The C Programming Language"; it is well accepted as "The" book for C. ISBN: 0131103628. The authors are Kernighan and Ritchie, so the book is often referred to as K&R C. They created the language (more or less).

          Combine K&R C with a good C reference manual and you should be able to do anything.

          For a reference manual I recommend "C: A Reference Manual", ISBN: 0133262243. It is written by Samuel P. Harbison with Guy L. Steele. Top-notch minds IMO.

          Enjoy :)

    • how to use the STL, etc

      STL is only available in C++, and is now known as the "C++ Standard Library". While not very OO, it is heavily dependent on templates, a feature not available in C.
      • [cough] STL is part of the standard library. They are not the same thing. [/cough]
    • What I need is a book that talks about how to use C in real projects. Gotchas, how to use the STL, etc. Also, I don't really feel like using C++

      Not to sound arrogant, but this statement belies a need for substantial reading about C and C++, not casual reading.

    • I'm an experienced programmer in Java and other languages, so I don't need a basic tutorial on how C works.

      C is quite different from Java in that it does not garbage collect, so you must allocate your memory by hand. It also makes heavy use of pointers, which is not done in Java. It is a must to learn C from the ground up, in my opinion, so that you can get a firm grasp of how to create and use these lower-level techniques appropriately.
    • "New C Primer Plus" published by The Waite Group. There is no better instructional C book.
    • Not to get you all confused, but Digital Mars is developing a language called D (eleet C haxxors need not comment on languages named D) which is somewhat of a cross between C++ and Java in syntax, and is a migration path away from C/C++. I'm an "enterprise" Java programmer by day, but D really impresses me and seems to be the right step to get *general application development* off the mistake-prone and complicated C/C++ languages.

      No it's not a practical suggestion (AFAIK the compiler is not fully complete and the syntax is still being refined), but it sure is interesting.

      Check it out:
    • One can use C++ for the Palm, but unless one knows the language rather intimately, it's easy to produce a big chunk o' code. That's because, depending what you do, C++ can link big libraries without your acquiescence.

      The Palm requires the Old Virtues, grasshopper. Use simple C. Avoid non-Palm library calls.

      You could do a lot worse than just taking some of the sample Palm code and studying it. The O'Reilly Palm Programming book is pretty good.

  • by Ars-Fartsica ( 166957 ) on Tuesday March 05, 2002 @10:57AM (#3112390)
    K&R and "Advanced C Programming By Example" by John Perry.
    • by theCURE ( 551589 ) on Tuesday March 05, 2002 @11:30AM (#3112585) Homepage
      Yeah because if you don't buy "Advanced C Programming By Example", then K&R might fall over on the shelf.
    • Personally, the only C book necessary for a good, experienced programmer is C: A Reference Manual by Harbison and Steele. Best book on C out there. Skip K&R. It's awkward, poorly organized, and primarily interesting from a historical point of view.
    • I guess it depends what you want. For example, K&R is a fantastically concise, lucid and authoritative reference to the C language. But it isn't something I would give to someone learning to program. I probably wouldn't give it to an experienced programmer who is coming to C as a second language, either. It would at least need a companion "introduction" book as well.

      When I'm writing C I always like to have K&R close by, but there are much better books to give to someone learning C.
  • All in all, you'd give the book a grade about... "C"? :)
  • Rule of Thumb (Score:5, Interesting)

    by pizen ( 178182 ) on Tuesday March 05, 2002 @11:04AM (#3112430)
    One of my CS profs (Georgia Tech people probably know who I'm referring to) gave the class advice on buying a C book: Flip through the book. If malloc isn't covered, put it back. If malloc is covered in an appendix, put it back. If the book contains the line cp=realloc(cp,n); then "burn it! Burn it right there in the store! Burn all of them!"
    • If the book contains the line cp=realloc(cp,n); ...

      I didn't know Microsoft published a C book.


      Just buy more memory. Memory is cheap.

    • by megas ( 1636 ) on Tuesday March 05, 2002 @02:03PM (#3112862) Homepage
      The professor is Professor Jim Greenlee; he has fabulous quotes such as
      "you must code until your fingers are bloody stumps and you wake up with a keyboard imprint on the side of your face"
      and they're collected at []
      • Sheesh. Certainly Prof. Greenlee is not the only exponent of this idea, but still, I find myself wondering in how many of his students he confirmed the fraudulent idea that you do your best work in marathon sessions, on your fifth cup of coffee, after three hours' sleep.

  • by Dephex Twin ( 416238 ) on Tuesday March 05, 2002 @11:06AM (#3112450) Homepage
    He says it's slightly above average

    Actually, "C" is about dead-on average, I'd say!

  • Quick Learning (Score:5, Informative)

    by jeks ( 68 ) on Tuesday March 05, 2002 @11:08AM (#3112455)
    Learning C as your first language, without any prior computer experience may not be the most clever thing to do. Programming C efficiently, correctly and clearly is best achieved by first understanding computer architecture and programming concepts.

    A higher level language provides the abstractions necessary to accommodate "logical thinking" as opposed to a full understanding of say memory management and system I/O. Also, C is quite an orthogonal language in that it supports many awkward combinations of features and constructs. If you are not careful to make your source text clear and readable, debugging even your own code can be oh so cumbersome.

    Hence, perhaps reading a book such as "Computer Architecture: A Quantitative Approach" by J. Hennessy and D. Patterson is a sensible step towards learning C for the beginner.
    • Re:Quick Learning (Score:3, Interesting)

      by yatest5 ( 455123 )
      Learning C as your first language, without any prior computer experience may not be the most clever thing to do. Programming C efficiently, correctly and clearly is best achieved by first understanding computer architecture and programming concepts.

      A higher level language provides the abstractions necessary to accommodate "logical thinking" as opposed to a full understanding of say memory management and system I/O. Also, C is quite an orthogonal language in that it supports many awkward combinations of features and constructs. If you are not careful to make your source text clear and readable, debugging even your own code can be oh so cumbersome.

      I would totally disagree. Learning a higher-level language, like Java, first, will mean you miss out on learning all the basic stuff you would pick up in C which will stand you in good stead if you ever need to pick up other languages quickly.
      • Re:Quick Learning (Score:5, Interesting)

        by Uberminky ( 122220 ) on Tuesday March 05, 2002 @11:43AM (#3112652) Homepage
        I'd tend to agree with you on this one. I understand his points, and perhaps another language should be used at the very beginning. I started out on HyperTalk (the scripting language of HyperCard -- that was the coolest toy I had as a kid), learned Pascal (because that was what they taught)... and then FINALLY learned C. This may sound silly but it was a breath of fresh air to me. Everything made sense, everything could be explained in terms of this. The special cases were all gone. Sure, some things were more work, but I learned so much from C in that 10th grade class. I really feel it has helped me have an edge over many of my college peers. Heck, I would've loved to learn assembly language first. I'm having a blast with it now, in embedded systems. It's doing the same thing that C once did for me, simplifying it down to what's REALLY going on, deep down on the metal, so that I truly understand it. One architecture may go out of date, but you can learn just as much from PDP-11 assembly as you can from Lisp today, they're just in different fields. My fifteen cents.. ;)
    • Re:Quick Learning (Score:2, Interesting)

      by Bobby Orr ( 161598 )
      This is why I like Pascal so much. I really believe I saw a difference between students who learned Pascal as a first language vs. those who were dropped right into C/C++.

      At first, you hate the structure that it forces on you. After looking at some freshman's C++ code that is barely better than spaghetti BASIC, though, you begin to pine away for the Good Old Days of Pascal!
    • I dunno about that. I'm mostly self-taught, gleaning most of my knowledge from books and magazines, and while I didn't start out with C, I also didn't find a lack of knowledge about the inner workings of a PC to be a hindrance either.

      Not to start a flame-war about this language vs. that language, but I started out with QuickBasic 4.5 (or, since it's no longer for sale, QBasic, as it was shipped with MS Windows '9x/SE/ME). It got me started, and I used the very basic book 'QBasic by Example' to learn the ins and outs of how to write programs (I admit, for Basic, some of the code in this book is REALLY old and stale, but for those just starting to learn programming, it's a good start to have IMHO). From that I moved up to Visual Basic for DOS 1.0, the only real difference between it and QBasic was the addition of some reference material on the language (what functions/subroutines do what, how to call them, categorized and indexed for easy perusal).

      After that I moved up to Borland Pascal (IMHO a good cross of the best things of C and Basic), learning more about the hardware involved in a PC and dipping into assembly language for the first time. Borland Pascal (and, for the curious, Borland Delphi today) makes it easier to learn assembler, since it includes a built-in assembler (basm) and doesn't invoke any separate compiler the way some C compilers do when you use __asm; if you're lacking an assembler, your __asm blocks won't compile (again, this depends on whether the C compiler has an internal assembler or uses an external one). From that I learned C and ultimately started working on C++ (a subject I'm still expanding on today).

      I think it's more appropriate to start out with the highest level language you can find, get your feet wet with learning, then work your way down into the internals (this is probably also the best time to start to learn code optimization at the CPU level, for example). For someone jumping into PC programming that has no prior experience, a computer architecture book IMHO would be the wrong way to go.
      • I think it's more appropriate to start out with the highest level language you can find
        I feel the opposite. (I do appreciate all of your points, of course.) While I'd agree that trying to teach a non-programmer to write compact, efficient assembly language for a particular processor would be difficult to say the least, there is something that is, in my mind, absolutely wonderful about assembly, and to a certain degree C: simplicity. Java is great because it will do everything for you. But for a learning language it seems absolutely awful. It's such a hideously complex language. "Hello, World" takes a page of code! Everything is so abstracted away, and there is so much theory behind everything. Whereas with assembly language, C, and to a point Scheme, they all have very small fundamental parts. And once you understand these, you are good to go. You can learn more about the inner workings of them, but you know all you need to know with just some basic knowledge. As I see it, Python has this same problem. It's great in that it'll do everything for you. But nothing is fundamental. There are so many special cases it's ridiculous. So for a more advanced programmer it's wonderful. Just not for learning, IMHO.
        • public class Hello {
              public static void main(String[] args) {
                  System.out.println("Hello !");
              }
          }

          #include <stdio.h>

          int main(void) {
              printf("Hello !\n");
              return 0;
          }

          Roughly the same length. It's a draw.
  • by ForsakenRegex ( 312284 ) on Tuesday March 05, 2002 @11:11AM (#3112477) Homepage
    The best book for C beginners I've ever come across is _A Book on C_, by Al Kelley and Ira Pohl. I've recommended it to quite a few beginners and they've all said it was an easy and very informative read.
  • When people talk about "K&R" as a book, they're referring specifically to "The C Programming Language" by Kernighan and Ritchie, which is, without a doubt, the best reference available for C, and well worth the $40 it's gonna cost ya.

    Mind you, it's probably not the best book to learn C from, but once you have the basics, this book will become your bible. I keep a softbound copy at home, and a nice hardback version I found at a used bookstore at work. Absolutely indispensable.
  • The Art of Writing (Score:3, Insightful)

    by Alien54 ( 180860 ) on Tuesday March 05, 2002 @11:13AM (#3112492) Journal
    The art of writing and education is difficult in its own right, and not everyone knows how to put things together. Often you have to correctly analyse which concepts are fundamental to the understanding of a more complex concept.

    As an example, I can recall a man who came into the store where I was working, and who asked me how much "virtual memory" cost.

    Besides trying not to laugh there was the problem mentioned above.

    The fundamental concepts missing were the concepts of "memory" and "virtual", along with a larger mental model to enable the average person to organise the concepts into something useful when dealing with computers. [The usual mental model I use for beginners is one of a computer = your information factory. Hard drives = warehouse, etc.]

    It is possible to arrange things in the manner of "Gradus ad Parnassum" (graded steps).

    Without proper technique in this area, it is very easy to make a bloody mess of it. It is a skill in its own right, separate from knowledge of the area to be taught in the first place.

  • Tools of the trade. (Score:3, Interesting)

    by DCram ( 459805 ) on Tuesday March 05, 2002 @11:13AM (#3112496)
    When you walk into HQ or Home Depot, have you ever asked yourself why there are so many tools that do about the same job? There is always the right tool for the job. It is the same in the programming world. I have heard people in this thread ask why use C. Because if it's the right tool to get the job done, then use it. Don't get so caught up in one language, even a new one, that you make yourself a dino.

    A good C book is always a great find. I find myself going through kernel code or drivers or, as of late, the net-snmp code, going hmmm... I've seen this before, but what exactly is going on?

    Reference material is always useful, no matter what.

    What I would love to see is a thread on what books most coders have found to be the most useful, what they have on their shelves.

    I love reading the reviews that say the book is good but not really worth my time.
    • Three of the best I've read for programming in C (that are happily ear-marked, bent, and written-in):

      "C Traps and Pitfalls", Andrew Koenig, AT&T, 1988
      - A bit dated in places, but still covers the very fundamental gotchas that a lot of programmers forget/don't know

      "Expert C Programming: Deep C Secrets", Peter Van Der Linden, SunSoft Press, 1994
      - Fun to read, especially how the simple linguistics of a language can cause major ($20 mil) bugs

      "The C Programming Language, second ed", Kernighan and Ritchie, AT&T, 1988
      - They developed C, 'nuff said.

  • by Anonymous Coward
    A C book that doesn't cover a proper coding style is pretty useless.

    Yeah, you might bang out a quick "itch-scratcher" without using a strict coding style, but not a solid piece of software that can be maintained.
  • Are not all other programming languages that are popular today just extensions of the 'C' base? If you look back, 'C' is the first cross-platform language; take JAVA, which is really 'C' with an OOP style and a few syntax changes.

    A quote from the reviewer: "C is difficult enough to present without a language barrier introducing more problems." I think he is confusing 'C' with 'C++', which can be a very confusing extension of 'C'.

    I think a first time 'C' user would be well advised to get one of the 'K&R' programming books. I found it very helpful when I was learning 'C'.
  • I don't know why they even write C books. There has always been and will only ever be one programming book: "The C Programming Language" by Kernighan and Ritchie. Not only a superb C book; simply put, the best programming book ever. Now I'm looking for a similar book on JAVA, but all the books I found are tooooooo big. So far, the best JAVA book I read is "Just JAVA and Beyond" by Peter van der Linden. Chapter 2 is all you need to know about OOP.
  • An old April Fools' joke, this snippet will give you a good chuckle at the expense of C:

    "We stopped when we got a clean compile on the following syntax:

    for(;P("\n"),R--;P("|"))for(e=C;e--;P("_"+(*u++/8)%2))P("| "+(*u/4)%2);

    To think that modern programmers would try to use a language that allowed such a statement was beyond our comprehension!"

    Also check out shooting yourself in the foot in various programming languages.
  • The best C book that I have ever read would be:
    • UNIX Systems Programming for SVR4
    Not only does it address such cryptic issues as UNIX terminals and processes in great detail, but it also covers much of the C standard library. A must-have for any serious C programmer.
  • Ada Anyone (Score:2, Insightful)

    by shaunbaker ( 157313 )
    I am really curious to know people's opinions on Ada. It is highly verbose and slightly annoying, but overall the system seems well thought out. Class-wide programming seems straightforward enough, and the compiler seems to catch just about everything that would cause cryptic C++ bugs. It takes a little longer because quick and dirty hacks aren't allowed, but overall it seems very good.
  • Get rid of C! (Score:5, Insightful)

    by Tom7 ( 102298 ) on Tuesday March 05, 2002 @03:17PM (#3113375) Homepage Journal

    Goodness, this is an awfully empty review. Except for the comment about the author's native language (which humorously is followed up by an awkward if not ungrammatical sentence from the reviewer), this whole review could be applied to practically any programming book! What sets this book apart? If nothing, then why review it?

    Anyway, the real reason I clicked on this article is because I just love a C debate. Since there's hardly anything to talk about with regard to the review, let's get to it!

    Here's what I say: outside of the low-level systems crowd, C should die. We should *not* be teaching beginning programmers C or C++. C should not be the "default language" of computer programming.

    Today, efficiency is no longer the primary concern about software. It has been ousted by robustness and ease/cost of development (modularity and reuse). C is awful for robustness (the language lets you "get away" with anything you want, though those things are seldom what you want in application software), and even worse for modularity and re-use. Modern languages, or even quasi-modern languages like Java, are far better than C for robustness and ease of development. They even win on some points which are typically seen as less important than efficiency: portability, elegance, etc.

    Finally, the efficiency of high-level languages is comparable to (though not as good as) C's. Compiler technology is improving somewhat, as well. But since developing and debugging take less time, you have more time to optimize your program (if necessary), so I am not convinced that this is really a big deal. Yet, even if I need to concede the efficiency issue, I still believe modern languages beat C overall.

    I'll be glad to argue about that stuff, but today I have a different point to make. C is also holding back the progress of other computer fields. Let me give you an example. My friend is working on compilers that compile high-level languages (to them, C is a high-level language) down to circuits. Here, discovering data parallelism in the code is probably the most difficult problem. Of course, the same issues arise in compiling to architectures like the IA-64 or even P4, where in order to get best execution speed, the compiler needs to figure out what instructions can be executed in parallel.

    When they compile C, they need to look for common idioms (certain patterns of for-loop use), then analyze them to extract the parallel algorithm. For instance, this simple C loop adds k to every integer in an array a of size s:

    for (int i = 0; i < s; i++) {
        a[i] += k;
    }

    The idea is that the compiler should be able to produce a circuit that does all of the adding in parallel, on silicon. Since you all probably grew up on C, this seems like the totally natural way to write that code. In fact, it is short and it is straightforward. Unfortunately, it is less straightforward to a compiler. The compiler needs to prove to itself that the for loop can be parallelized -- what if the programmer changed k or a or i in the loop body? The C code actually says to run each loop iteration sequentially.

    Of course, compiler writers have gotten pretty good at catching this particular idiom, but when the code gets more complicated (especially when the compiler needs to do alias analysis), it is not so good.

    The chief problem here is that the programmer is unable to effectively communicate his parallel algorithm to the compiler. The programmer takes something in his mind "add k to all elements in the list", sequentializes it for C "for (int i = 0 ...", and then the compiler has to *undo* this sequentialization to produce the parallel code. In the process of translating, some information is always lost.

    Now look how this code would be written in SML (a so-called "modern language"):

    Array.modify (fn x => x + k) a

    (fn x => x + k) is a function expression (SML lets you pass around functions), and Array.modify simply applies the function to every element in the array. Here, the compiler can very easily tell that my operation is parallel, because I used Array.modify! The code is also a bit shorter, and I also think that this code is a lot clearer. That's subjective, of course. BUT, I hope you will agree that for this example, the SML code is closer to what the programmer means, and easier for the compiler to understand.

    Anyway, perhaps some of you will say that this particular issue is not a problem, or that it is already solved (I would like to hear the solution!). I merely mean to propose an example of a theme I have been observing over the past few years in many areas of computer science. Computer programming is about communicating with the compiler or machine in such a way that it is easy for the human to create code, and easy for the machine to understand it. C was never particularly easy for a human to create (though we have become accustomed to it), and though it was once easy for a compiler to understand, this is becoming less and less true. When the language is neither optimal for humans nor optimal for compilers, doesn't that mean that something needs to change?

    • Re:Get rid of C! (Score:3, Insightful)

      by Jimmy_B ( 129296 )
      Your post is definitely a troll, but since you immediately admit and justify it ("Since there's hardly anything to talk about with regard to the review, let's get to it!") I will respond anyways.

      I'll start my argument with an anecdote. There is one particular C++ program I was writing, not atypical, and not a particularly difficult problem computationally. I was programming on my AMD Athlon 1400MHz, a fast computer by any reasonable standards, and found that it wasn't running as fast as I'd like. I profiled it, and found the functions to blame were operator overloads (called very frequently) on a class coord, which looked something like this:
      friend coord operator+ (coord a, coord b) { return coord(a.x+b.x, a.y+b.y); }
      Spot the error on that line. Now, ask someone who's never programmed in C before to spot the error. Give up? It's the passing by value, which prevents the compiler from inlining the function; the correct way to write that function is:
      friend coord operator+ (const coord& a, const coord& b) { return coord(a.x+b.x, a.y+b.y); }
      If not for my experience programming in C, I never would've realized that.

      It is my observation that people who are taught to ignore C, and start immediately with an object-oriented language such as C++, *never* learn low-level concepts such as inlining or pointers, and never learn to truly understand what it is that they're writing. I say that the best way to learn and properly appreciate the "right way" of doing something is to first do it the *wrong way*, in a project that doesn't matter for anything. I have debugged pointer spaghetti, written code with dozens of meaninglessly named global variables with no comments, written procedural code, and had I not done these things, I wouldn't know why it is important to name variables intelligently, to use object orientation, or to use pointers carefully. You tell someone to comment their code, and you get lines like this:
      a=b; // Copy b into a
      The only reason the comment is there is because they were taught to always comment their code, to comment every line, etc. On the other hand, someone who's dealt with uncommented code before would put useful comments where they need to be.

      I agree with you that C should not be used in production code where it can be avoided (that is, areas other than systems and embedded programming). However, I strongly believe that people should always learn and master C before learning higher-level languages. If the only reason you use classes is because you were taught that that's how to write clean code, then you're not using them correctly. On the other hand, if you're using classes because you wrote in C up to the point where you encountered a problem that required inheritance or polymorphism, then you're using the feature for the right reasons.

      Your example, by the way, in which a loop that increments elements of an array is parallelized in hardware, is actually simpler than you think. The compiler first performs loop unrolling (a very, very old idea), then analyzes the code blocks to see that they don't work with the same data, and parallelizes them. Your particular example implies that the only real way to solve parallelism is to define parallel functions and human-solve them; this clearly violates the distinction between language and library, and doesn't really help. Besides, in your case example of compiling high-level code into hardware, I could come up with far more examples where object orientation hurts rather than helps. OO hides all of the overhead, promoting huge bloat which, while not a problem in software, is fatal in hardware.
      • First of all, let me say that I see where you're coming from, and I agree entirely. I think it is important for people to understand the effects of the code they write, to an appropriate level.

        However, I don't think your examples and conclusions follow from that as well as they might. For example, the inlining issue you mentioned (passing by value in C++) is a quality-of-implementation issue for your C++ compiler. It isn't a flaw in the language. It's a shame that we still need idioms like pass-by-const-reference in a language at all today, since they are effectively nothing but preemptive optimisation and syntactic cod liver oil.
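
        For what it's worth, C has a corresponding idiom: passing a large struct by value copies the whole thing on every call, while passing a pointer-to-const (C's counterpart of pass-by-const-reference) passes only an address. A hypothetical sketch, with invented names:

```c
struct samples { double v[256]; };   /* ~2 KB: expensive to copy */

/* Pass by value: the caller copies the entire struct on every call. */
double sum_by_value(struct samples s)
{
    double total = 0.0;
    for (int i = 0; i < 256; i++)
        total += s.v[i];
    return total;
}

/* Pass by pointer-to-const: only an address is passed, and the
   callee promises not to modify the pointed-to struct. */
double sum_by_ptr(const struct samples *s)
{
    double total = 0.0;
    for (int i = 0; i < 256; i++)
        total += s->v[i];
    return total;
}
```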

        My other suggestion is that, while I do think it's advantageous to know what's going on under the hood, that doesn't have to come before learning a higher level language. On the contrary, those who have programmed C or assembler for years and who then try to pick up a language such as ML seem to be hampered by an intuitive need to understand everything that's going on under the hood, and to map all constructs and ideas in the language onto familiar low-level terms. This is counter-productive. We know that some higher level languages can generate very good code, with speeds and executable sizes comparable even to languages such as C or C++ today. Focussing on the little details -- trying to break down a complex construction in a high level language into low-level concepts -- hides the big picture, and we all know that your performance really depends on the overall job, not the little tweaks.

        As a supporting fact, consider that today, most C or C++ compilers on Intel boxes generate better (smaller, faster) assembly language output than "hand-optimised" assembly written by the programmer. Issues like pipelining and parallel processing have rendered truly low-level programming a specialist art, requiring great skill. Since the people writing compilers specialise in that skill, I'm happy to let them do their job, and get on with mine using the tools they give me.

      • > Your post is definitely a troll

        Just because a post is provocative doesn't mean it's a troll. A troll post usually says something that the author doesn't really believe, just to get people fired up. I actually believe this, and I think it's worthwhile for the C-centric slashdot crowd to think about it.

        > It's the passing by value, which prevents the
        > compiler from inlining the function; the correct
        > way to write that function is:

        I disagree that this "prevents" the compiler from inlining. It might force the compiler to call the copy constructor (if it can't deduce that it has no effect), but there's no reason it "prevents" inlining. It may be that the compiler just doesn't inline it, sure.

        I don't really see why programming in C gives you any more clue about when a C++ compiler will perform inlining, since that's basically an arbitrary choice made by the compiler.

        That said, I agree that there is definitely a place for learning C in a well-rounded programmer's education. I've taught programming enough to know that... Programmers who don't know low-level programming won't be able to do low-level optimizations. BUT, I think the emphasis that slashdot folks place on low-level efficiency is highly overrated. I feel that C actually encourages programming styles that are BAD for programming in general; I think it can do more harm than good for a programmer to learn "efficient" programming in C. So while it's a good idea for people to learn it, I don't necessarily think it's smart to learn C *before* other high-level languages.

        > Your example, by the way, in which a loop that
        > increments elements of an array is parallelized
        > in hardware, is actually simpler than you
        > think.

        Well, if you say so. People do their PhD theses on this kind of thing. From conversations with a friend who actually works on these kinds of compilers, it seems the problem is genuinely difficult. For instance, if there is almost any kind of memory write in the loop, then you need to do alias analysis to verify that it is not breaking the parallelism.
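
        The aliasing problem in miniature, as a hedged C99 sketch: if out overlapped in, the iterations would no longer be independent, and naively parallelizing them would change the result. The restrict qualifier is the programmer's promise that no such overlap exists, sparing the compiler that analysis.

```c
/* C99's restrict asserts that out and in do not overlap, so each
   iteration is independent and safe to run in parallel.  Calling
   this with overlapping pointers would be undefined behavior. */
void add_one(int * restrict out, const int * restrict in, int n)
{
    for (int i = 0; i < n; i++)
        out[i] = in[i] + 1;
}
```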

        > Your particular example implies that the only
        > real way to solve parallelism is to define
        > parallel functions and human-solve them; this
        > clearly violates the distinction between
        > language and library, and doesn't really help

        Well, let me reiterate my actual point: instead of always working in C, languages should be adapted to the task at hand. C gives you an abstract view of a sequential machine -- and that is hardly how computers actually operate today (especially when compiling to hardware!). Having a language (even one like C) with a parallel array-modification construct like SML's would make programs *easier* to write, and compilers better.
        I don't know what you mean by violating the distinction between language and library. Why should parallelism be a library?
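
        For comparison, OpenMP retrofits a parallel-loop construct onto C through a pragma rather than the language proper. A minimal sketch (built with gcc -fopenmp; without OpenMP support the pragma is simply ignored and the loop runs sequentially, with the same result):

```c
/* Each iteration touches a different element, so the pragma lets the
   compiler split the loop across threads.  Removing the pragma leaves
   an ordinary sequential loop with identical results. */
void scale_all(double *a, int n, double k)
{
    #pragma omp parallel for
    for (int i = 0; i < n; i++)
        a[i] *= k;
}
```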

        For the record, I wasn't encouraging Object Oriented programming, which I think is a bit of a sham -- but that's a different argument entirely. ;)
  • _C Programming: A Modern Approach_, by K. N. King.

    What I can say is, everyone I know who's on the C standards committee and has read this book liked it, as do a number of the regulars of comp.lang.c. I read an early printing, found maybe two small errors, and one basically correct thing that isn't the way I'd have done it. I believe he said the errors would be corrected in future printings.

    Nice book.
  • A few years ago I read a book that had a huge "C" on the cover. It wasn't about the language but about conspiracies and espionage. It covered such things as a cannonball price war in Portugal, started by the British, which resulted in the Spanish Armada being equipped with lower-quality armaments. I think it was a better-written book than the one covered by this review.

e-credibility: the non-guaranteeable likelihood that the electronic data you're seeing is genuine rather than somebody's made-up crap. - Karl Lehenbauer