
C

Craig Maloney submitted this review of Addison-Wesley's entry in the tough field of books on C (book title: C), and he pulls no punches in comparing it to others. He says it's slightly above average, but that "experienced programmers will likely pass on this book." Read the complete review below for his reasoning.
C (Addison-Wesley Nitty Gritty Programming Series)
author: Klaus Schröder
pages: 400
publisher: Addison-Wesley
rating: 5.5
reviewer: Craig Maloney
ISBN: 0-201-75878-4
summary: A slightly better than average C book with some very good points, but poor delivery.

Lost in the Company of Giants

It's hard not to take a book like C and compare it to such acclaimed and trusted books as K&R, Expert C Programming, and other lesser-known but equally good tomes. Unfortunately, C doesn't really compare with many of the other classic books covering the C language. For starters, the writing in this book isn't quite up to the caliber of the other books. Part of the problem with this book is language. English does not appear to be the author's native language. There are sentences in this book that require a few glances to glean the full meaning. C is difficult enough to present without a language barrier introducing more problems. Another problem is organization. The ideas presented at the beginning of the book are muddled and disjointed, with multiple ideas introduced but not formally explained until later. Beginners will have a terrible time working through this book without becoming quickly confused, and experienced programmers will likely pass on this book in favor of the other well-known books.

Not All Bad

The book is not all bad, however. The examples in the book are plentiful and are based on tried-and-true examples found in books like K&R. Some idioms used in the examples will irk more structured programmers (omitting braces in certain places being the biggest example), but most of the examples are pretty good. Also, the explanations of the more advanced topics are relatively good considering how confusing the more basic material is. Memory management is explained well, with clear diagrams (although the programs are a bit confusing without a careful eye).

So What's in it for Me?

Addison-Wesley is clearly marketing this book to the same crowd that purchases quick-learning books. Unfortunately, beginners purchasing this book will quickly find themselves lost amid its confusing descriptions. Those who manage to muddle through will find some tasty bits of information locked inside, but the work involved in getting there outweighs the rewards. Most programmers will probably want to leaf through a copy of this book before purchasing it to make sure they'll get the most out of it.


You can purchase C from Fatbrain. Want to see your own review here? Just read the book review guidelines, then use Slashdot's handy submission form.

  • Aaah... (Score:1, Insightful)

    by Anonymous Coward on Tuesday March 05, 2002 @10:50AM (#3112342)
    I love a review that isn't afraid to say "don't bother"
  • by 91degrees ( 207121 ) on Tuesday March 05, 2002 @11:04AM (#3112433) Journal
    C was written in the dark ages of second-generation languages, when the concept of a programming language was quite a new idea. It was all well and good in its time, but it has been superseded by any number of other languages. Quite simply, C lacks any of the benefits of any other language.

    C has no bounded array support. This makes it inherently unsuitable for any security programming tasks. The concepts of pointers, structures and unions are totally confused. The typing can't decide whether it's a strongly or loosely typed language, so it is possible to cast a pointer to an int to a pointer to a float, and get a different result than from casting an integer to a float (sketched after this comment).

    And the support for OO techniques is minimal to non-existent. No templates, no classes. No inheritance. Unlike C++, self-modifying code is impossible in a non-platform-specific manner. Unlike Perl and Haskell, it is next to impossible to extend an array.

    The #include mechanism is a joke. It requires that you type the name of a function at least 3 times, and then you have to deal with circular include paths and multiple includes of the same file. Support for 32-bit architectures had to be added afterwards in the form of a "long" keyword.

    There is no equivalent of the object class. Can you believe that people are still programming in a language that doesn't have this useful concept? Most of the functionality in C is available in MSDOS batch scripts!

    It seems that Dennis Ritchie was so besotted with A and B that he forgot to take out the trash.

    If the computing industry had any sense, it would have switched to Java 10 years ago. Why hasn't it? Inertia!
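    A minimal C sketch of the cast difference claimed above (the variable names are illustrative, and it assumes int and float have the same size, as on most current platforms):

    #include <stdio.h>

    int main(void)
    {
        int i = 42;

        /* Value conversion: the number 42 becomes the float 42.0f. */
        float by_value = (float)i;

        /* Pointer cast: reinterpret the same bits as a float.  (This
           violates C's aliasing rules and is shown only to illustrate
           the point above.) */
        float by_cast = *(float *)&i;

        printf("value conversion:     %f\n", by_value); /* 42.000000 */
        printf("bit reinterpretation: %f\n", by_cast);  /* a tiny denormal, printed as 0.000000 */
        return 0;
    }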
  • by pizen ( 178182 ) on Tuesday March 05, 2002 @11:10AM (#3112467)
    It seems that Dennis Ritchie was so besotted with A and B that he forgot to take out the trash.

    If the computing industry had any sense, it would have switched to Java 10 years ago. Why hasn't it? Inertia!


    Speaking of taking out the trash...I prefer to say when garbage collection occurs. I don't get that control with Java like I do with C.
  • The Art of Writing (Score:3, Insightful)

    by Alien54 ( 180860 ) on Tuesday March 05, 2002 @11:13AM (#3112492) Journal
    The art of writing and education is difficult in its own right, and not everyone knows how to put things together. Often you have to correctly analyse which concepts are fundamental to the understanding of a more complex concept.

    As an example, I can recall a man who came into the store where I was working, and who asked me how much "virtual memory" cost.

    Besides trying not to laugh, there was the problem mentioned above.

    The fundamental concepts missing were the concepts of "memory" and "virtual", along with a larger mental model to enable the average person to organise the concepts into something useful when dealing with computers. [The usual mental model I use for beginners is one of a computer = your information factory. Hard drives = warehouse, etc.]

    It is possible to arrange things in the manner of "Gradus [go.com] ad [karadar.com] Parnassum [alexanderpublishing.com]" (graded steps).

    Without proper technique in this area, it is very easy to make a bloody mess of it. It is a skill in its own right, separate from knowledge of the area to be taught in the first place.

  • by Anonymous Coward on Tuesday March 05, 2002 @11:16AM (#3112512)
    A C book that doesn't cover a proper coding style is pretty useless.

    Yeah, you might bang out a quick "itch-scratcher" without using a strict coding style, but not a solid piece of software that can be maintained.
  • by jaoswald ( 63789 ) on Tuesday March 05, 2002 @01:53PM (#3112785) Homepage
    What makes you think dynamic memory management in C is efficient?

    Just because "malloc" and "free" each fit on one line of code doesn't mean they are fast or efficient. In particular, if your memory arena gets seriously fragmented, the performance of these routines can get worse and worse over time.

    A genuine generational garbage collector, by contrast, delivers relatively consistent performance over time. (A sketch of the fragmentation-prone allocation pattern follows this comment.)
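    A minimal sketch (the sizes and counts are made up) of the kind of interleaved allocation pattern described above; whether it actually slows malloc and free down depends entirely on the allocator, since modern implementations bin and coalesce free blocks:

    #include <stdlib.h>

    #define N 10000

    int main(void)
    {
        void *small_blocks[N];
        void *large_blocks[N];

        /* Interleave small and large allocations... */
        for (int i = 0; i < N; i++) {
            small_blocks[i] = malloc(32);
            large_blocks[i] = malloc(4096);
        }

        /* ...then free only the small ones, leaving 32-byte holes
           scattered between the large blocks. */
        for (int i = 0; i < N; i++)
            free(small_blocks[i]);

        /* Later, slightly larger requests cannot reuse those holes,
           so the allocator has to search further or grow the heap. */
        for (int i = 0; i < N; i++)
            small_blocks[i] = malloc(64);

        /* Clean up (return values left unchecked to keep the sketch short). */
        for (int i = 0; i < N; i++) {
            free(small_blocks[i]);
            free(large_blocks[i]);
        }
        return 0;
    }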
  • by erichbear ( 513532 ) on Tuesday March 05, 2002 @02:27PM (#3113048)
    The best C book that I have ever read would be:
    • UNIX Systems Programming for SVR4
    Not only does it address such cryptic issues as UNIX terminals and processes in great detail, but it also covers much of the C standard library. A must-have for any serious C programmer.
  • Why are people still using a 30 year old language?
    Why are people ignoring 30 years of code maturity?
  • Ada Anyone (Score:2, Insightful)

    by shaunbaker ( 157313 ) on Tuesday March 05, 2002 @02:36PM (#3113118) Homepage
    I am really curious to know people's opinions on Ada. It is highly verbose and slightly annoying, but overall the system seems well thought out. Class-wide programming seems straightforward enough, and the compiler seems to catch just about everything that would cause cryptic C++ bugs. It takes a little longer because quick and dirty hacks aren't allowed, but overall it seems very good.
  • Get rid of C! (Score:5, Insightful)

    by Tom7 ( 102298 ) on Tuesday March 05, 2002 @03:17PM (#3113375) Homepage Journal

    Goodness, this is an awfully empty review. Except for the comment about the author's native language (which humorously is followed up by an awkward if not ungrammatical sentence from the reviewer), this whole review could be applied to practically any programming book! What sets this book apart? If nothing, then why review it?

    Anyway, the real reason I clicked on this article is because I just love a C debate. Since there's hardly anything to talk about with regard to the review, let's get to it!

    Here's what I say: outside of the low-level systems crowd, C should die. We should *not* be teaching beginning programmers C or C++. C should not be the "default language" of computer programming.

    Today, efficiency is no longer the primary concern about software. It has been ousted by robustness and ease/cost of development (modularity and reuse). C is awful for robustness (the language lets you "get away" with anything you want, though those things are seldom what you want in application software), and even worse for modularity and re-use. Modern languages, or even quasi-modern languages like Java, are far better than C for robustness and ease of development. They even win on some points which are typically seen as less important than efficiency: portability, elegance, etc.

    Finally, the efficiency of high-level languages is comparable to (though not as good as) that of C. Compiler technology is improving somewhat, as well. But since developing and debugging take less time, you have more time to optimize your program (if necessary), so I am not convinced that this is really a big deal. Yet, even if I need to concede the efficiency issue, I still believe modern languages beat C overall.

    I'll be glad to argue about that stuff, but today I have a different point to make. C is also holding back the progress of other computer fields. Let me give you an example. My friend is working on compilers that compile high-level languages (to them, C is a high-level language) down to circuits. Here, discovering data parallelism in the code is probably the most difficult problem. Of course, the same issues arise in compiling to architectures like the IA-64 or even P4, where in order to get best execution speed, the compiler needs to figure out what instructions can be executed in parallel.

    When they compile C, they need to look for common idioms (certain patterns of for-loop use), then analyze them to extract the parallel algorithm. For instance, this simple C loop adds k to every integer in an array a of size s:

    for (int i = 0; i < s; i++) {
        a[i] += k;
    }

    The idea is that the compiler should be able to produce a circuit that does all of the adding in parallel, on silicon. Since you all probably grew up on C, this seems like the totally natural way to write that code. In fact, it is short and it is straightforward. Unfortunately, it is less straightforward to a compiler. The compiler needs to prove to itself that the for loop can be parallelized -- what if the programmer changed k or a or i in the loop body? The C code actually says to run each loop iteration sequentially.

    Of course, compiler writers have gotten pretty good at catching this particular idiom, but when the code gets more complicated (especially when the compiler needs to do alias analysis), it is not so good.

    The chief problem here is that the programmer is unable to effectively communicate his parallel algorithm to the compiler. The programmer takes something in his mind ("add k to all elements in the list"), sequentializes it for C ("for (int i = 0 ..."), and then the compiler has to *undo* this sequentialization to produce the parallel code. In the process of translating, some information is always lost.

    Now look how this code would be written in SML (a so-called "modern language"):

    Array.modify (fn x => x + k) a

    (fn x => x + k) is a function expression (SML lets you pass around functions), and Array.modify simply applies the function to every element in the array. Here, the compiler can very easily tell that my operation is parallel, because I used Array.modify! The code is also a bit shorter, and I also think that this code is a lot clearer. That's subjective, of course. BUT, I hope you will agree that for this example, the SML code is closer to what the programmer means, and easier for the compiler to understand.

    Anyway, perhaps some of you will say that this particular issue is not a problem, or that it is already solved (I would like to hear the solution!). I merely mean to propose an example of a theme I have been observing over the past few years in many areas of computer science. Computer programming is about communicating with the compiler or machine in such a way that it is easy for the human to create code, and easy for the machine to understand it. C was never particularly easy for a human to create (though we have become accustomed to it), and though it was once easy for a compiler to understand, this is becoming less and less true. When the language is neither optimal for humans nor optimal for compilers, doesn't that mean that something needs to change? (A rough C counterpart to the Array.modify example is sketched after this comment.)
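    For comparison, a rough C counterpart to the Array.modify example above (array_modify and add_k are made-up names, not standard library functions). It expresses the same "apply this function to every element" idea, but the compiler still sees an indexed loop plus a call through a function pointer, which is exactly the kind of code that is hard to prove parallel:

    #include <stddef.h>
    #include <stdio.h>

    /* Apply f to every element of a, in place. */
    static void array_modify(int *a, size_t s, int (*f)(int))
    {
        for (size_t i = 0; i < s; i++)
            a[i] = f(a[i]);
    }

    /* k must live outside the function: unlike the SML fn expression,
       a C function cannot capture a local variable. */
    static int k = 5;
    static int add_k(int x) { return x + k; }

    int main(void)
    {
        int a[] = { 1, 2, 3, 4 };

        array_modify(a, sizeof a / sizeof a[0], add_k);

        for (size_t i = 0; i < sizeof a / sizeof a[0]; i++)
            printf("%d ", a[i]);   /* prints: 6 7 8 9 */
        printf("\n");
        return 0;
    }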

  • Re:Get rid of C! (Score:3, Insightful)

    by Jimmy_B ( 129296 ) <<gro.hmodnarmij> <ta> <mij>> on Tuesday March 05, 2002 @04:46PM (#3114022) Homepage
    Your post is definitely a troll, but since you immediately admit and justify it ("Since there's hardly anything to talk about with regard to the review, let's get to it!") I will respond anyway.

    I'll start my argument with an anecdote. There is one particular C++ program I was writing, not atypical, and not a particularly difficult problem computationally. I was programming on my AMD Athlon 1400MHz, a fast computer by any reasonable standards, and found that it wasn't running as fast as I'd like. I profiled it, and found the functions to blame were operator overloads (called very frequently) on a class coord, which looked something like this:
    friend coord operator+ (coord a, coord b) { return coord(a.x+b.x, a.y+b.y); }
    Spot the error on that line. Now, ask someone who's never programmed in C before to spot the error. Give up? It's the passing by value, which prevents the compiler from inlining the function; the correct way to write that function is:
    friend coord operator+ (const coord& a, const coord& b) { return coord(a.x+b.x, a.y+b.y); }
    If not for my experience programming in C, I never would've realized that.

    It is my observation that people who are taught to ignore C, and start immediately with an object-oriented language such as C++, *never* learn low-level concepts such as inlining or pointers, and never learn to truly understand what it is that they're writing. I say that the best way to learn and properly appreciate the "right way" of doing something is to first do it the *wrong way*, in a project that doesn't matter for anything. I have debugged pointer spaghetti, written code with dozens of meaninglessly named global variables with no comments, written procedural code, and had I not done these things, I wouldn't know why it is important to name variables intelligently, to use object orientation, or to use pointers carefully. You tell someone to comment their code, and you get lines like this:
    a=b; // Copy b into a
    The only reason the comment is there is because they were taught to always comment their code, to comment every line, etc. On the other hand, someone who's dealt with uncommented code before would put useful comments where they need to be.

    I agree with you that C should not be used in production code where it can be avoided (that is, areas other than systems and embedded programming). However, I strongly believe that people should always learn and master C before learning higher-level languages. If the only reason you use classes is because you were taught that that's how to write clean code, then you're not using them correctly. On the other hand, if you're using classes because you wrote in C up to the point where you encountered a problem that required inheritance or polymorphism, then you're using the feature for the right reasons.

    Your example, by the way, in which a loop that increments elements of an array is parallelized in hardware, is actually simpler than you think. The compiler first performs loop unrolling (a very, very old idea; a sketch of it follows this comment), then analyzes the code blocks to see that they don't work with the same data, and parallelizes them. Your particular example implies that the only real way to solve parallelism is to define parallel functions and human-solve them; this clearly violates the distinction between language and library, and doesn't really help. Besides, in your case example of compiling high-level code into hardware, I could come up with far more examples where object orientation hurts rather than helps. OO hides all of the overhead, promoting huge bloat which, while not a problem in software, is fatal in hardware.
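    A minimal sketch of that unrolling step, applied to the earlier a[i] += k loop (the unroll factor of four is chosen arbitrarily):

    /* Unrolled-by-four version of: for (int i = 0; i < s; i++) a[i] += k;
       The four statements in each iteration touch disjoint elements, so
       once the compiler proves the stores do not overlap anything else
       read in the loop, it can schedule them in parallel. */
    static void add_k_unrolled(int *a, int s, int k)
    {
        int i;
        for (i = 0; i + 3 < s; i += 4) {
            a[i]     += k;
            a[i + 1] += k;
            a[i + 2] += k;
            a[i + 3] += k;
        }
        for (; i < s; i++)   /* leftover elements when s is not a multiple of 4 */
            a[i] += k;
    }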
  • Re:Rule of Thumb (Score:2, Insightful)

    by psamuels ( 64397 ) on Wednesday March 06, 2002 @12:46AM (#3116479) Homepage
    This is of course assuming that you are trying to recover from a memory allocation error. Generally if realloc fails, it means that your program isn't going to execute very far before it totally pukes anyway.

    Many things which are excusable in production should not be tolerated in education. If the author of the C book you are looking at actually teaches you to leak memory ... burn it! You can't trust an author with such sloppy technique to teach good technique.

    I've seen "void main()" in some C books. Same thing. Obviously in some environments it doesn't matter, but bad habits are bad habits. (A sketch of the safe realloc idiom and a proper main signature follows this comment.)
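    A minimal sketch of both points above, the leak-free realloc idiom and a proper main signature (the buffer names are mine):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)                      /* not "void main()" */
    {
        size_t n = 100;
        int *buf = malloc(n * sizeof *buf);
        if (buf == NULL)
            return EXIT_FAILURE;

        /* Grow the buffer.  Assign realloc's result to a temporary first:
           writing "buf = realloc(buf, ...)" would lose the only pointer to
           the old block if realloc fails, i.e. leak exactly as described above. */
        int *tmp = realloc(buf, 2 * n * sizeof *buf);
        if (tmp == NULL) {
            free(buf);                  /* old block is still valid; release it */
            return EXIT_FAILURE;
        }
        buf = tmp;
        n *= 2;

        printf("buffer now holds %zu ints\n", n);
        free(buf);
        return 0;
    }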
