





title | C (Addison-Wesley Nitty Gritty Programming Series) |
author | Klaus Schröder |
pages | 400 |
publisher | Addison-Wesley |
rating | 5.5 |
reviewer | Craig Maloney |
ISBN | 0-201-75878-4 |
summary | A slightly better than average C book with some very good points, but poor delivery. |
Lost in the Company of Giants
It's hard not to take a book like C and compare it to such acclaimed and trusted books as K&R, Expert C Programming, and other lesser-known but equally good tomes. Unfortunately, C doesn't really compare with many of the other classic books covering the C language.
For starters, the writing isn't quite up to the same caliber as those books. Part of the problem is language: English does not appear to be the author's native language, and there are sentences in this book that require a few glances to glean their full meaning. C is difficult enough to present without a language barrier introducing more problems. Another problem is organization. The ideas presented at the beginning of the book are muddled and disjointed, with multiple ideas introduced but not formally explained until later. Beginners will have a terrible time working through this book without quickly becoming confused, and experienced programmers will likely pass on it in favor of the other well-known books.
Not All Bad
The book is not all bad, however. The examples are plentiful and are based on tried-and-true examples found in books like K&R. Some idioms used in the examples will irk the more structured programmers (omitting braces in certain places being the biggest example), but most of the examples are pretty good. Also, the explanations of the more advanced topics are relatively good considering how confusing the more basic material is. Memory management is explained well, with clear diagrams (although the accompanying programs are a bit confusing without a careful eye).
So What's in it for Me?
Addison-Wesley is clearly marketing this book to the same crowd that purchases quick-learning books. Unfortunately, beginners who purchase it will quickly find themselves lost amid its confusing descriptions. Those who manage to muddle through will find some tasty bits of information locked inside, but the work involved in getting there outweighs the rewards. Most programmers will probably want to leaf through a copy before purchasing it to make sure they'll get the most out of it.
You can purchase C from Fatbrain. Want to see your own review here? Just read the book review guidelines, then use Slashdot's handy submission form.
Aaah... (Score:1, Insightful)
I was looking for a C book... (Score:2, Interesting)
What I need is a book that talks about how to use C in real projects. Gotchas, how to use the STL, etc. Also, I don't really feel like using C++, which seems like too much (even though it's more similar to Java). It looks like the vast majority of apps that I want to explore and use are written just in C...
Any suggestions?
-Russ
Re:I was looking for a C book... (Score:1, Troll)
Yes, learn what the STL is before you post to Slashdot about C (and then even go on to clarify that you don't want to use C++).
Re:I was looking for a C book... (Score:1, Flamebait)
Hmmm. Got me.
This is why I NEED A BOOK you asshole.
Why don't you put your ego back in your pants and answer the question if you're the fucking expert.
-Russ
Re:I was looking for a C book... (Score:2)
STL (Standard Template Library) is a C++ feature.
C does not have STL.
And as far as books go, I would recommend "The C Programming Language"; it is well accepted as "the" book for C. ISBN: 0131103628. The authors are Kernighan and Ritchie, so the book is often referred to as K&R C. They created the language (more or less).
Combine K&R C with a good C reference manual and you should be able to do anything.
For a reference manual I recommend "C: A Reference Manual", ISBN: 0133262243. It is written by Samuel P. Harbison with Guy L. Steele. Top-notch minds IMO.
Enjoy
Jeremy
Re:I was looking for a C book... (Score:2)
Re:I was looking for a C book... (Score:2)
STL is only available in C++, and is now known as the "C++ Standard Library". While not very OO, it is heavily dependent on templates, a feature not available in C.
Re:I was looking for a C book... (more pedantry) (Score:1)
Re:(OT) So what are the standard libraries in C++? (Score:2)
Actually, strictly speaking the STL isn't even part of the C++ standard library. The STL (as designed by Alex Stepanov) is an entity in its own right, and the C++ standard library borrows very heavily from its ideas and maps them onto C++. The result of that mapping is commonly known as the STL among the C++ community, but the use of the term is not strictly accurate. There, that should settle the pedantry. ;-)
To answer your question more completely, the ISO C++ standard defines the following sections relating to the library: language support, diagnostics, general utilities, strings, localization, containers, iterators, algorithms, numerics, and input/output.
Additionally, of course, there is what is inherited from C.
That covers the STL-based stuff (containers, iterators, algorithms), string manipulation, i18n issues, iostreams and such, the numeric classes (complex numbers, etc.), and various basic tools to make programming C++ easier.
You need to read more than you know (Score:2)
Not to sound arrogant, but this statement betrays a need for substantial reading about C and C++, not casual reading.
Re:I was looking for a C book... (Score:1)
C is quite different from Java in that it does not garbage collect, so you must allocate your memory by hand. It also makes heavy use of pointers, which is not done in Java. It is a must to learn C from the ground up, in my opinion, so that you can get a firm grasp of how to create and use these lower-level techniques appropriately.
Re:I was looking for a C book... (Score:2)
Re:I was looking for a C book... (Score:2)
No it's not a practical suggestion (AFAIK the compiler is not fully complete and the syntax is still being refined), but it sure is interesting.
Check it out: http://www.digitalmars.com/d/
Re:I was looking for a C book... (Score:2)
Where to look for good C and C++ book reviews (Score:3, Informative)
You might try the Association of C and C++ Users [accu.org] web site.
Don't use C++ for Palm anyway (Score:2)
One can use C++ for the Palm, but unless one knows the language rather intimately, it's easy to produce a big chunk o' code. That's because, depending on what you do, C++ can link in big libraries without your acquiescence.
The Palm requires the Old Virtues, grasshopper. Use simple C. Avoid non-Palm library calls.
You could do a lot worse than just taking some of the sample Palm code and studying it. The O'Reilly Palm Programming book is pretty good.
Re:I was looking for a C book... (Score:2)
Only two C books needed: (Score:3, Interesting)
Re:Only two C books needed: (Score:4, Funny)
Re:Only two C books needed: (Score:2)
Re:Only two C books needed: (Score:2)
When I'm writing C I always like to have K&R close by, but there are much better books to give to someone learning C.
So what you're saying is... (Score:1, Redundant)
Rule of Thumb (Score:5, Interesting)
Re:Rule of Thumb (Score:2)
I didn't know Microsoft published a C book.
...or...
Just buy more memory. Memory is cheap.
Re:Rule of Thumb (Score:4, Funny)
you must code until your fingers are bloody stumps and you wake up with a keyboard imprint on the side of your face
and they're collected at http://swiki.cc.gatech.edu:8080/cs2130/57 [gatech.edu]
"code until your fingers are bloody stumps" (Score:2)
hyacinthus.
Re:Rule of Thumb (Score:3, Informative)
char *temp;
if ((temp = realloc(cp, n)) != NULL)
    cp = temp;
Re:Rule of Thumb (Score:2)
The problem IS with realloc.
Realloc can also move the memory block, ensuring that any pointers into the interior of the original memory block will be FU'ed. Realloc is a great idea when the main thing you deal with is arrays of chars or ints or floats, but most programs these days use data structures that have slightly more linkage than that. Similarly, a straight bit-by-bit copy is almost never proper copy semantics for data structures. Realloc is, in general, a car wreck waiting to happen. The original poster was right.
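To make that hazard concrete, here is a minimal, self-contained sketch (names and sizes are mine, purely illustrative):
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    char *buf = malloc(16);
    if (buf == NULL) return 1;
    strcpy(buf, "hello world");        /* 11 chars + NUL fits in 16 */
    char *word = buf + 6;              /* interior pointer to "world" */

    char *tmp = realloc(buf, 1 << 20); /* big enough to likely force a move */
    if (tmp == NULL) { free(buf); return 1; }
    buf = tmp;                         /* the base pointer gets patched up... */
    /* ...but 'word' still points into the OLD block; dereferencing it now
       is undefined behavior. Every interior pointer must be recomputed: */
    word = buf + 6;
    printf("%s\n", word);
    free(buf);
    return 0;
}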
Re:Rule of Thumb (Score:2)
if realloc fails, it means that your program isn't going to execute very far before it totally pukes anyway.
Very true. If realloc is failing, you probably have bigger problems.
Re:Rule of Thumb (Score:2)
I would personally treat it as a fatal error and just exit the program
True, but you missed an important step: attempt to write all the user's data to disk, so that once he solves the memory problem he can finish his work.
Re:Rule of Thumb (Score:2, Insightful)
Many things which are excusable in production should not be tolerated in education. If the author of the C book you are looking at actually teaches you to leak memory ... burn it! You can't trust an author with such sloppy technique to teach good technique.
I've seen "void main()" in some C books. Same thing. Obviously in some environments it doesn't matter, but bad habits are bad habits...
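For the record, the portable form costs nothing extra; a minimal sketch of what those books should be teaching instead:
#include <stdio.h>

/* In a hosted environment the standard requires main to return int;
   "void main()" is only valid where an implementation documents it. */
int main(void) {
    puts("hello");
    return 0;   /* in C99 and later, falling off main() implies return 0 */
}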
Not quite that low... (Score:2)
malloc() is not *part* of the C language proper, merely a standard library function. Granted, it's about the most essential of library calls, but if your book is about C, and C alone, you can leave malloc() out. I guess.
Re:Not quite that low... (Score:2)
(Of course, malloc is free to not exist in freestanding environments, but they don't necessarily start execution at main(), and there's no requirement that they support any form at all of input or output.)
Re:Not quite that low... (Score:2)
Re:low standard (Score:2)
cheers,
mike
Re:Does Greenlee have his Ph.D. yet ? (Score:2)
He's still just an instructor (AFAIK). I just refer to him as a prof because it's easier than explaining the real title. I also enjoy his teaching style but I'm glad he's the only one. I don't think I could deal with a full load of Greenlee-like profs.
Above average? (Score:4, Funny)
Actually, "C" is about dead-on average, I'd say!
mark
Quick Learning (Score:5, Informative)
A higher level language provides the abstractions necessary to accommodate "logical thinking" as opposed to a full understanding of say memory management and system I/O. Also, C is quite an orthogonal language in that it supports many awkward combinations of features and constructs. If you are not careful to make your source text clear and readable, debugging even your own code can be oh so cumbersome.
Hence, perhaps reading a book such as "Computer Architecture: A Quantitative Approach" by J. Hennessy and D. Patterson is a sensible step towards learning C for the beginner.
Re:Quick Learning (Score:3, Interesting)
A higher level language provides the abstractions necessary to accommodate "logical thinking" as opposed to a full understanding of say memory management and system I/O. Also, C is quite an orthogonal language in that it supports many awkward combinations of features and constructs. If you are not careful to make your source text clear and readable, debugging even your own code can be oh so cumbersome.
I would totally disagree. Learning a higher-level language like Java first will mean you miss out on learning all the basic stuff you would pick up in C, which will stand you in good stead if you ever need to pick up other languages quickly.
Re:Quick Learning (Score:5, Interesting)
Re:Quick Learning (Score:2, Interesting)
At first, you hate the structure that it forces on you. After looking at some freshman's C++ code that is barely better than spaghetti BASIC, though, you begin to pine away for the Good Old Days of Pascal!
Re:Quick Learning (Score:2)
Not to start a flame-war about this language vs. that language, but I started out with QuickBasic 4.5 (or, since it's no longer for sale, QBasic, as it was shipped with MS Windows 9x/SE/ME). It got me started, and I used the very basic book 'QBasic by Example' to learn the ins and outs of how to write programs. (I admit, for Basic, some of the code in this book is REALLY old and stale, but for those just starting to learn programming, it's a good start to have IMHO.) From that I moved up to Visual Basic for DOS 1.0; the only real difference between it and QBasic was the addition of some reference material on the language (which functions/subroutines do what, how to call them, categorized and indexed for easy perusal).
After that I moved up to Borland Pascal (IMHO a good cross of the best things of C and Basic), learning more about the hardware in a PC and dipping into assembly language for the first time. Borland Pascal (and, for the curious, Borland Delphi today) makes it easier to learn assembler, since it includes a built-in assembler (BASM) and doesn't invoke a separate compiler the way some C compilers do when you use __asm: if you're lacking an assembler, your __asm blocks won't compile (depending on whether the C compiler has an internal assembler or uses an external one). From that I learned C and ultimately started working on C++ (a subject I'm still expanding on today).
I think it's more appropriate to start out with the highest-level language you can find, get your feet wet with learning, then work your way down into the internals (this is probably also the best time to start learning code optimization at the CPU level, for example). For someone jumping into PC programming with no prior experience, a computer architecture book would IMHO be the wrong way to go.
Re:Quick Learning (Score:2)
A page of code ? (Score:2)
public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello !");
    }
}
compare:
#include <stdio.h>
int main(void) {
    printf("Hello !\n");
    return 0;
}
5 lines each. It's a draw.
Re:Quick Learning (Score:2)
Re:Quick Learning (Score:2)
For *very* simple cases like string literals, yes. But what about when you actually start wanting to have dynamic values?
const char *foo = "This is";
const char *bar = "intuitive";
printf( "%s %s?", foo, bar );
Oh yeah, very intuitive... Compare with
String foo = "This is";
String bar = "intuitive";
System.out.println( foo + " more " + bar );
%s? What does 's' stand for? String? Why is the variable a char* then? That's the C version of a string? Sheesh! It's a good thing C isn't complex or anything...
Of course, there's always C++:
std::string foo = "this is";
std::string bar = "intuitive";
std::cout << foo << " also more " << bar;
C coders really need to take a step back, look at their "simple" language, and try to remember all of the arcana you have to memorize to use it. And what about those string manipulation functions? strcmp, strcat, strstr, strtok... Why are they only six letters long and cryptic? Oh! Those limited-functionality computers from >30 years ago had only a few K of memory, and symbol tables couldn't handle long names. It's 2002, for god's sake! Long names don't matter in processing time, and modern editors (if you can call emacs modern) can do variable and function name completion for you. It's not a time issue, it's not a performance issue, and it's not simpler.
<bio>I code in C for a living along with a few other languages and I'm tired of constantly writing a new linked list structure or binary tree or any other data structure reminiscent of the wheel.</bio>
Re:Quick Learning (Score:2)
If you want to learn more about recursive Fortran programming, check here [hpc.mil] or here [ibiblio.org].
If your classmates were using a strictly conforming Fortran 77 compiler (like GNU), that might explain why their parsers were more difficult to write (ie, without recursion). Most 77 compilers (and anything recent) will let you write recursion just fine.
Re:Quick Learning (Score:2)
Have a nice day.
here's a good beginner book on C (Score:4, Informative)
One of the best beginner's books on C I've come across is _A Book on C_, by Al Kelley and Ira Pohl. I've recommended it to quite a few beginners and they've all said it was an easy and very informative read.
Agreed. (Score:1)
In case you don't know what they're talking about (Score:1)
Mind you, it's probably not the best book from which to learn C, but once you have the basics, this book will become your bible. I keep a softbound copy at home, and a nice hardback version I found at a used bookstore at work. Absolutely indispensable.
Re:In case you don't know what they're talking abo (Score:2)
Are you kidding? You want a copy of BOTH editions. I still go back to my first-edition K&R book when I need to know.
Maybe you don't have to make your code run on an old VAX as well as a system with an ANSI-C compiler, but I do from time to time. Even when I don't have to, I like to know my programs will work. (Of course there are a lot of ANSI-C tricks that I have to avoid to make it work, and then put back in for modern systems, but I can deal with that.)
Re:In case you don't know what they're talking abo (Score:2)
Hasn't gcc been ported to the VAX? Seemed like it would have to have been, or the NetBSD Port [netbsd.org] would be impossible.
Re:In case you don't know what they're talking abo (Score:2)
Yeah, but we hadn't updated the vax in years. It was only used for testing in a lab, as a network traffic generator. Not internet connected so security wasn't an issue. Never saw a real load, so there was no point in upgrading to something more modern. We have plenty of more modern equipment to play with, so nobody wanted to upgrade the vax, though when you need a few more traffic generators they were there.
The Art of Writing (Score:3, Insightful)
As an example, I can recall a man who came into the store where I was working, and who asked me how much "virtual memory" cost.
Besides trying not to laugh, there was the problem mentioned above.
The fundamental concepts missing were the concepts of "memory" and "virtual", along with a larger mental model to enable the average person to organise the concepts into something useful when dealing with computers. [The usual mental model I use for beginners is one of a computer = your information factory. Hard drives = warehouse, etc.]
It is possible to arrange things in the manner of "Gradus [go.com] ad [karadar.com] Parnassum [alexanderpublishing.com]" (graded steps).
Without proper technique in this area, it is very easy to make a bloody mess of it. It is a skill in its own right, separate from knowledge of the area to be taught in the first place.
Tools of the trade. (Score:3, Interesting)
A good C book is always a great find. I find myself going through kernel code, or drivers, or (as of late) the netsnmp code, thinking: hmmm, I've seen this before, but what exactly is going on?
Reference material is always good, no matter what.
What I would love to see is a thread on what books most coders have found to be the most useful, what they have on their shelves.
I love reading the reviews that say the book is good but not really worth my time.
Re:Tools of the trade. (Score:2, Informative)
"C Traps and Pitfalls", Andrew Koenig, AT&T, 1988
- A bit dated in places, but still covers the very fundamental gotchas that a lot of programmers forget/don't know
"Expert C Programming: Deep C Secrets", Peter Van Der Linden, SunSoft Press, 1994
- Fun to read, especially how the simple linguistics of a language can cause major ($20 mil) bugs
"The C Programming Language, second ed", Kernighan and Ritchie, AT&T, 1988
- They developed C, 'nuff said.
Most importantly, does it cover coding style? (Score:1, Insightful)
Yeah, you might bang out a quick "itch-scratcher" without using a strict coding style, but not a solid piece of software that can be maintained.
Lost in a sea of C's! (Score:1)
A quote from the reviewer: "C is difficult enough to present without a language barrier introducing more problems." I think he is confusing 'C' with 'C++', which can be a very confusing extension of 'C'.
I think a first time 'C' user would be well advised to get one of the 'K&R' programming books. I found it very helpful when I was learning 'C'.
Kernighan Ritchie (Score:1)
Just for fun... C is a hoax? (Score:5, Funny)
"We stopped when we got a clean compile on the following syntax:
for(;P("\n"),R--;P("|"))for(e=C;e--;P("_"+(*u++/8)%2))P("| "+(*u/4)%2);
To think that modern programmers would try to use a language that allowed such a statement was beyond our comprehension!"
Also check out shooting yourself in the foot [noncorporeal.com] in various programming languages.
Re:Just for fun... C is a hoax? (Score:2)
Best C programing book available (Score:2, Insightful)
Ada Anyone (Score:2, Insightful)
Get rid of C! (Score:5, Insightful)
Goodness, this is an awfully empty review. Except for the comment about the author's native language (which, humorously, is followed up by an awkward if not ungrammatical sentence from the reviewer), this whole review could be applied to practically any programming book! What sets this book apart? If nothing, then why review it?
Anyway, the real reason I clicked on this article is because I just love a C debate. Since there's hardly anything to talk about with regard to the review, let's get to it!
Here's what I say: outside of the low-level systems crowd, C should die. We should *not* be teaching beginning programmers C or C++. C should not be the "default language" of computer programming.
Today, efficiency is no longer the primary concern about software. It has been ousted by robustness and ease/cost of development (modularity and reuse). C is awful for robustness (the language lets you "get away" with anything you want, though those things are seldom what you want in application software), and even worse for modularity and re-use. Modern languages, or even quasi-modern languages like Java, are far better than C for robustness and ease of development. They even win on some points which are typically seen as less important than efficiency: portability, elegance, etc.
Finally, the efficiency of high-level languages is comparable to (though not as good as) that of C. Compiler technology is improving somewhat, as well. But since developing and debugging take less time, you have more time to optimize your program (if necessary), so I am not convinced that this is really a big deal. Yet, even if I need to concede the efficiency issue, I still believe modern languages beat C overall.
I'll be glad to argue about that stuff, but today I have a different point to make. C is also holding back the progress of other computer fields. Let me give you an example. My friend is working on compilers that compile high-level languages (to them, C is a high-level language) down to circuits. Here, discovering data parallelism in the code is probably the most difficult problem. Of course, the same issues arise in compiling to architectures like the IA-64 or even P4, where in order to get best execution speed, the compiler needs to figure out what instructions can be executed in parallel.
When they compile C, they need to look for common idioms (certain patterns of for-loop use), then analyze them to extract the parallel algorithm. For instance, this simple C loop adds k to every integer in an array a of size s:
for (int i = 0; i < s; i++) {
a[i] += k;
}
The idea is that the compiler should be able to produce a circuit that does all of the adding in parallel, on silicon. Since you all probably grew up on C, this seems like the totally natural way to write that code. In fact, it is short and it is straightforward. Unfortunately, it is less straightforward to a compiler. The compiler needs to prove to itself that the for loop can be parallelized -- what if the programmer changed k or a or i in the loop body? The C code actually says to run each loop iteration sequentially.
Of course, compiler writers have gotten pretty good at catching this particular idiom, but when the code gets more complicated (especially when the compiler needs to do alias analysis), it is not so good.
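As a concrete illustration (my example, not the parent poster's): as soon as a second pointer shows up, the compiler must prove the two don't overlap before it can parallelize anything.
/* Parallelizable only if the compiler can prove a and b never alias.
   If a == b + 1, iteration i writes the location that iteration i+1
   reads, so running the iterations in parallel changes the result. */
void add_offset(int *a, int *b, int s, int k) {
    for (int i = 0; i < s; i++)
        a[i] = b[i] + k;
}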
The chief problem here is that the programmer is unable to effectively communicate his parallel algorithm to the compiler. The programmer takes something in his mind "add k to all elements in the list", sequentializes it for C "for (int i = 0 ...", and then the compiler has to *undo* this sequentialization to produce the parallel code. In the process of translating, some information is always lost.
Now look how this code would be written in SML (a so-called "modern language"):
Array.modify (fn x => x + k) a
(fn x => x + k) is a function expression (SML lets you pass around functions), and Array.modify simply applies the function to every element in the array. Here, the compiler can very easily tell that my operation is parallel, because I used Array.modify! The code is also a bit shorter, and I also think that this code is a lot clearer. That's subjective, of course. BUT, I hope you will agree that for this example, the SML code is closer to what the programmer means, and easier for the compiler to understand.
Anyway, perhaps some of you will say that this particular issue is not a problem, or that it is already solved (I would like to hear the solution!). I merely mean to propose an example of a theme I have been observing over the past few years in many areas of computer science. Computer programming is about communicating with the compiler or machine in such a way that it is easy for the human to create code, and easy for the machine to understand it. C was never particularly easy for a human to create (though we have become accustomed to it), and though it was once easy for a compiler to understand, this is becoming less and less true. When the language is neither optimal for humans nor optimal for compilers, doesn't that mean that something needs to change?
Re:Get rid of C! (Score:3, Insightful)
I'll start my argument with an anecdote. There is one particular C++ program I was writing, not atypical, and not a particularly difficult problem computationally. I was programming on my AMD Athlon 1400MHz, a fast computer by any reasonable standards, and found that it wasn't running as fast as I'd like. I profiled it, and found the functions to blame were operator overloads (called very frequently) on a class coord, which looked something like this:
friend coord operator+ (coord a, coord b) { return coord(a.x+b.x, a.y+b.y); }
Spot the error on that line. Now, ask someone who's never programmed in C before to spot the error. Give up? It's the passing by value, which prevents the compiler from inlining the function; the correct way to write that function is:
friend coord operator+ (const coord& a, const coord& b) { return coord(a.x+b.x, a.y+b.y); }
If not for my experience programming in C, I never would've realized that.
It is my observation that people who are taught to ignore C, and start immediately with an object-oriented language such as C++, *never* learn low-level concepts such as inlining or pointers, and never learn to truly understand what it is that they're writing. I say that the best way to learn and properly appreciate the "right way" of doing something is to first do it the *wrong way*, in a project that doesn't matter for anything. I have debugged pointer spaghetti, written code with dozens of meaninglessly named global variables and no comments, written procedural code, and had I not done these things, I wouldn't know why it is important to name variables intelligently, to use object orientation, or to use pointers carefully. You tell someone to comment their code, and you get lines like this:
a=b; /* set a to b */
The only reason the comment is there is because they were taught to always comment their code, to comment every line, etc. On the other hand, someone who's dealt with uncommented code before would put useful comments where they need to be.
I agree with you that C should not be used in production code where it can be avoided (that is, areas other than systems and embedded programming). However, I strongly believe that people should always learn and master C before learning higher-level languages. If the only reason you use classes is because you were taught that that's how to write clean code, then you're not using them correctly. On the other hand, if you're using classes because you wrote in C up to the point where you encountered a problem that required inheritance or polymorphism, then you're using the feature for the right reasons.
Your example, by the way, in which a loop that increments elements of an array is parallelized in hardware, is actually simpler than you think. The compiler first performs loop unrolling (a very, very old idea), then analyzes the code blocks to see that they don't work with the same data, and parallelizes them. Your particular example implies that the only real way to solve parallelism is to define parallel functions and human-solve them; this clearly violates the distinction between language and library, and doesn't really help. Besides, in your example of compiling high-level code into hardware, I could come up with far more examples where object orientation hurts rather than helps. OO hides all of the overhead, promoting huge bloat which, while not a problem in software, is fatal in hardware.
Good points, bad conclusions? (Score:2)
First of all, let me say that I see where you're coming from, and I agree entirely. I think it is important for people to understand the effects of the code they write, to an appropriate level.
However, I don't think your examples and conclusions follow from that as well as they might. For example, the inlining issue you mentioned (passing by value in C++) is a quality-of-implementation issue for your C++ compiler. It isn't a flaw in the language. It's a shame that we have to have idioms like pass-by-const-reference in a language at all today, since they are effectively nothing but preemptive optimisation and syntactic cod liver oil.
My other suggestion is that, while I do think it's advantageous to know what's going on under the hood, that doesn't have to come before learning a higher level language. On the contrary, those who have programmed C or assembler for years and who then try to pick up a language such as ML seem to be hampered by an intuitive need to understand everything that's going on under the hood, and to map all constructs and ideas in the language onto familiar low-level terms. This is counter-productive. We know that some higher level languages can generate very good code, with speeds and executable sizes comparable even to languages such as C or C++ today. Focussing on the little details -- trying to break down a complex construction in a high level language into low-level concepts -- hides the big picture, and we all know that your performance really depends on the overall job, not the little tweaks.
As a supporting fact, consider that today, most C or C++ compilers on Intel boxes generate better (smaller, faster) assembly language output than "hand-optimised" assembly written by the programmer. Issues like pipelining and parallel processing have rendered truly low-level programming a specialist art, requiring great skill. Since the people writing compilers specialise in that skill, I'm happy to let them do their job, and get on with mine using the tools they give me.
Re:Get rid of C! (Score:2)
Just because a post is provocative doesn't mean it's a troll. A troll post usually says something that the author doesn't really believe, just to get people fired up. I actually believe this, and I think it's worthwhile for the C-centric slashdot crowd to think about it.
> It's the passing by value, which prevents the
> compiler from inlining the function; the correct
> way to write that function is:
I disagree that this "prevents" the compiler from inlining. It might force the compiler to call the copy constructor (if it can't deduce that it has no effect), but there's no reason it "prevents" inlining. It may be that the compiler just doesn't inline it, sure.
I don't really see why programming in C gives you any more clue about when a C++ compiler will perform inlining, since that's basically an arbitrary choice made by the compiler.
That said, I agree that there is definitely a place for learning C in a well-rounded programmer's education. I've taught programming enough to know that... Programmers that don't know low-level programming won't be able to do low-level optimizations. BUT, I think the emphasis that slashdot folks place on low-level efficiency is highly overrated. I feel that C actually encourages programming styles that are BAD for programming in general; I think it can do more harm than good for a programmer to learn "efficient" programming in C. So while it's a good idea for people to learn it, I don't necessarily think that it's smart to learn C *before* you learn other high-level languages.
> Your example, by the way, in which a loop that
> increments elements of an array is parralelized
> in hardware, is actually simpler than you
> think.
Well, if you say so. People actually do their PhD theses on this kind of thing. In conversations with my friend, who actually works on these kinds of compilers, it seems that the problem is actually difficult. For instance, if there is almost any kind of memory write in the loop, then you need to do alias analysis to verify that it is not breaking the parallelism.
> Your particular example implies that the only
> real way to solve parralelism is to define
> parralel functions and human-solve them; this
> clearly violates the distinction between
> language and library, and doesn't really help
Well, let me reiterate my actual point: instead of always working with C, languages should be adapted to the task at hand. C gives you an abstract view of a sequential machine, and this is hardly how computers actually operate today (especially when compiling to hardware!). Having a language (even a C-like one) with a parallel array modification construct like SML's would make programs *easier* to write, and compilers better.
I don't know what you mean by violating the distinction between language and library. Why should parallelism be a library??
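For reference, here is roughly what the library version would look like in C (a sketch of mine, not anything standard). Nothing in the signature tells the compiler the calls are independent; the parallel intent stays in the programmer's head, which is exactly the problem:
/* A map-style helper: apply f to every element of a. The iterations
   are semantically independent, but C can only express them as a
   sequential loop, so the compiler must rediscover the parallelism. */
void array_modify(int *a, int n, int (*f)(int)) {
    for (int i = 0; i < n; i++)
        a[i] = f(a[i]);
}

/* Usage, mimicking the SML example. C has no closures, so k has to be
   smuggled in as a global (or via an extra void* argument). */
static int k;
static int add_k(int x) { return x + k; }
/* ... array_modify(a, s, add_k); ... */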
For the record, I wasn't encouraging Object Oriented programming, which I think is a bit of a sham -- but that's a different argument entirely.
Re:Get rid of C! (Score:2)
Unlike buildings, we shouldn't start from scratch every time we write a new program; we ought to build up and only program the new stuff that's specific to our program and nothing else. C is terrible at that, which is why God gave us Lisp [lisp.org] and the wisdom [paulgraham.com] to use it. I have yet to hear anyone make a cogent argument as to why Lisp is inferior to C for general-purpose application coding (not systems-level stuff; e.g., why Excel, Word, Notepad, the calculator, Netscape, etc ought to be coded in C or C++ instead of CL or Scheme). And it's not like I haven't asked.
My favorite C book... (Score:2)
http://knking.com/books/c/
What I can say is, everyone I know who's on the C standards committee and has read this book liked it, as do a number of the regulars of comp.lang.c. I read an early printing, found maybe two small errors, and one basically correct thing that isn't the way I'd have done it. I believe he said the errors would be corrected in future printings.
Nice book.
Same title different book? (Score:2)
Re:Experienced programmers (Score:2)
man is good as a specific reference, but since you are already an experienced programmer, I think you aren't looking for a book like this one.
Re:don't waste your time (Score:4, Informative)
Re:don't waste your time (Score:5, Funny)
Why I wasn't using Java 10 years ago (Score:2, Funny)
Also, old does not necessarily mean bad; universities still teach LISP (out in 1958) and quite rightly so.
Re:Why are people still using a 30 year old langua (Score:3, Insightful)
If the computing industry had any sense, it would have switched to Java 10 years ago. Why hasn't it? Inertia!
Speaking of taking out the trash...I prefer to say when garbage collection occurs. I don't get that control with Java like I do with C.
Re:Why are people still using a 30 year old langua (Score:1)
Actually, the precursor to B was BCPL, not A. B is C's precursor.
Re:Why are people still using a 30 year old langua (Score:2)
I guess I'll wait for "D" to come out
Re:Why are people still using a 30 year old langua (Score:4, Informative)
A good programmer can manage memory without the help of the runtime environment. There's a certain pride in being able to program with no memory leaks.
Re:Why are people still using a 30 year old langua (Score:2)
Then where have all the good programmers gone? Or have they all moved over to Lisp?
Informative? More like impudent (Score:2)
We're not talking about "hello world". Try coding up 50k lines of C and see how well your memory management skills stack up.
Re:Why are people still using a 30 year old langua (Score:2)
Storing the __LINE__ and __FILE__ predefined macros along with each allocation in your debug build, and being able to print the current state of all allocations, is a no-brainer. And it points to exactly the point of leakage. For a more complex project, add tags so you can check for memory of a given type left between points in your code.
You should get the idea; adapting this to suit your memory usage mode should be pretty straightforward. (A sketch follows.)
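A minimal sketch of that technique, with hypothetical names of my own (debug builds only; shadowing malloc/free with macros is a common trick, not something the standard blesses):
#include <stdio.h>
#include <stdlib.h>

#define MAX_ALLOCS 1024

/* One slot per live allocation: where it came from. */
static struct { void *ptr; const char *file; int line; } allocs[MAX_ALLOCS];

void *debug_malloc(size_t n, const char *file, int line) {
    void *p = malloc(n);
    if (p != NULL)
        for (int i = 0; i < MAX_ALLOCS; i++)
            if (allocs[i].ptr == NULL) {
                allocs[i].ptr = p;
                allocs[i].file = file;
                allocs[i].line = line;
                break;
            }
    return p;
}

void debug_free(void *p) {
    for (int i = 0; i < MAX_ALLOCS; i++)
        if (allocs[i].ptr == p) { allocs[i].ptr = NULL; break; }
    free(p);
}

/* Call at exit: anything still recorded leaked, and you know from where. */
void debug_report(void) {
    for (int i = 0; i < MAX_ALLOCS; i++)
        if (allocs[i].ptr != NULL)
            fprintf(stderr, "leaked block from %s:%d\n",
                    allocs[i].file, allocs[i].line);
}

/* Route all allocations through the wrappers in the debug build. */
#define malloc(n) debug_malloc((n), __FILE__, __LINE__)
#define free(p)   debug_free(p)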
A decent programmer should be able to add this to his project and find all leaks between reading this and going home tonight. RAM leaks are a sign of a bad programmer, or a programmer too lazy to create good tools for himself.
Quit blaming your tools. Get off your ass and make the ones you're missing.
Re:Why are people still using a 30 year old langua (Score:2)
sorry.
It's just so good to hear someone post about good programming habits instead of about what language makes it easier to code without regard to what they're doing.
The question is: are you in the business to write code, or to write programs?
Re:Why are people still using a 30 year old langua (Score:2)
This assumes that your test cases are always exhaustive enough to cover any possible order of reference creation and destruction processes, and have run times long enough to do so.
I'm not blaming my tools, I'm blaming your tools. When I program in Common Lisp, I don't spend a millisecond of my time debugging/preventing memory leaks. You seem to spend days. Which seems like a better use of programmer time?
Re:Which decent games are written in lisp? (Score:2)
I would suspect that the best breakpoint for run-time efficiency is to have a C or assembly language rendering engine, with Lisp-based game action code. I.e. the dynamic portion that has to be easily changed is in Lisp, the part that has to smash data into registers as fast as possible would be in C or assembly. Of course, commercial Lisps typically have provisions for a Lisp-syntax expression of assembly code, so you can write assembler using Lisp macros...
Re:Why are people still using a 30 year old langua (Score:4, Insightful)
Just because "malloc" and "free" each fit on one line of code doesn't mean they are fast or efficient. In particular, if your memory arena gets seriously fragmented, the performance of these routines can get worse and worse over time.
As opposed to a genuine generational garbage collector, for which the performance stays relatively consistent.
Re:Why are people still using a 30 year old langua (Score:2)
To be fair, this is becoming less and less of an issue, but it is still an issue for highly important or time-sensitive apps. With C-style memory management, you can decide when to free memory if you want.
Re:Why are people still using a 30 year old langua (Score:2)
Most modern garbage-collected languages can use generational garbage-collection, which, although not hard real-time, generally avoids long pauses, and is very efficient when much garbage is being generated quickly.
Malloc and free are generally not hard real-time.
Try reading the Garbage Collection FAQ [memorymanagement.org]
Re:Why are people still using a 30 year old langua (Score:1)
C has no bounded array support. This makes it inherently unsuitable for any security programming task.
*cough* *NIX is written in C *cough*
If the computing industry had any sense, it would have switched to Java 10 years ago. Why hasn't it? Inertia!
Let's just think of how fast x86 Java VMs were 10 years ago. In 1992 I had just purchased a Macintosh Quadra (25MHz), Windows NT wouldn't come out for another year, and Windows 3.1 and OS/2 2.0 had just come out. The Sparc 10 had just been released. Pentiums were not coming out for a whole 'nother year.
The HotJava browser didn't come out until...1994?
Re:Why are people still using a 30 year old langua (Score:2)
*cough* *NIX is written in C *cough*
Yes, but wasn't Windows also originally written in C? I guess it's the programmers and not the language that determine the security.
Re:Why are people still using a 30 year old langua (Score:1)
Good. Now choose an OS with no known buffer overflow exploits.
Re:Why are people still using a 30 year old langua (Score:1)
Memory management complications in C cause problems, but I believe that OpenBSD has no known buffer overflow exploits, at least in the core OS.
Re:Why are people still using a 30 year old langua (Score:2, Funny)
C is a language that combines all the elegance and power of assembly language with all the readability and maintainability of assembly language.
Re:Why are people still using a 30 year old langua (Score:5, Informative)
1. It is the native language of most operating system APIs (*nix, MS-*).
2. It is the language most third-party libraries and code are written in.
3. It is the language a lot of old code is written in.
4. Interfacing other languages with libraries or APIs written for C is never as easy as the documentation says it is and issues may slow down development time significantly.
5. It is the only widely used standardized all purpose language that is available to all platforms.
C++ - is hardly standardized (most compilers still do not come close to the ANSI-C++ standard).
ADA - Is not widely used.(unfortunately)
Java - Is not an all purpose language.
6. It is a very elegant and consistent language (unlike C++).
Re:Which compilers do not come close to ANSI-C++ ? (Score:2)
There is currently no 100% standard-compliant C++ compiler available, AFAIK. Comeau's is close, but even that has been suffering with getting export to work properly, and Comeau C++ seems to be several months ahead of the field in standards compliance.
You're right that most of the problems are with template support, but to the serious C++ programmer, those are serious problems.
Re:Why are people still using a 30 year old langua (Score:3, Insightful)
Re:Why are people still using a 30 year old langua (Score:2)
Show me a 30 year-old C compiler that's still in use.
The language may be mature; the knowledge that the compiler writers possess about the language and how to compile it may be mature; but I seriously doubt the maturity of most compiler code in use today.
By your rationale, we should prefer Fortran and Lisp
OOPS! I do (at least for most things). Never mind...
Learning C before C++? (Score:2)
Interesting. However, many experts do disagree with you, including Bjarne Stroustrup [att.com] and Marshall Cline [parashift.com].
Re:C vs. C++ (Score:2)
It's easy to jump to that conclusion if you've been reading lots of C++ books, and certainly the IOStreams library has advantages over C's approach. However, it is itself fundamentally flawed for many applications, simply because it makes it nigh-on impossible to write code that's well-behaved in multiple (human-spoken) languages.
Re: Does the world need more C books (Score:2)
> side effects, which is a bane to programming.
Hmm. I agree with what you say about C++ -- I think the language is awful -- and I agree that side effects are a bane to programming, but I definitely don't agree that C is good for side-effectless (ie, functional) programming.
For instance, implementing something as simple as binary trees without side-effects almost *requires* a garbage collector in order to get reasonable performance. (Maybe you could do something tricky with lazy reference counting, I dunno.)
In fact, since memory allocation is a side effect, even string manipulation in C is hard to pull off without side effects. (How do you return a variable-sized string from a function?) By contrast, C++ has value-semantics strings that make this possible.
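To make that concrete (a sketch of mine, not from the parent post): the standard C answer is to allocate the result and hand ownership to the caller, and that hidden allocation is precisely the side effect in question.
#include <stdlib.h>
#include <string.h>

/* Returns a freshly malloc'd concatenation of a and b; the caller must
   free it. Pure-looking interface, side-effecting implementation. */
char *concat(const char *a, const char *b) {
    char *s = malloc(strlen(a) + strlen(b) + 1);
    if (s == NULL)
        return NULL;
    strcpy(s, a);
    strcat(s, b);
    return s;
}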
And since C lacks first class functions (really, I should say it lacks nested functions, since you can actually pass around pointers), lots of the functional idioms just don't work.
What did you mean by this?