The IOCCC Competition Is Back
Rui Lopes writes "After a 5-year hiatus, the IOCCC (International Obfuscated C Code Contest) is back! This marks the 20th edition of the contest. Submissions are open between 12-Nov-2011 11:00 UTC and 12-Jan-2012 12:12 UTC. Don't forget to check this year's rules and guidelines."
Re:It'd be nice if ... (Score:5, Interesting)
Re:It'd be nice if ... (Score:5, Insightful)
Re:It'd be nice if ... (Score:5, Insightful)
Most C coders seem to achieve obfuscation without any additional incentive.
Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.
Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year, each with their uninteresting quirks of syntactic sugar, competing on the basis of some uniquely uninteresting difference which can almost always be trivially implemented in any of the alternatives. They are hard to read in the same way that German is hard to read to someone who has only been reading German for a year: skill and speed comes through practice with the language, not from the ego of its authors.
Re:It'd be nice if ... (Score:5, Insightful)
Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.
Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year, each with their uninteresting quirks of syntactic sugar, competing on the basis of some uniquely uninteresting difference which can almost always be trivially implemented in any of the alternatives. They are hard to read in the same way that German is hard to read to someone who has only been reading German for a year: skill and speed comes through practice with the language, not from the ego of its authors.
+1, it all started going downhill when :
- professional language designers abdicated their role, and the void was filled by amateurs
- people who use these languages have no fucking clue what they're doing and we're all paying the price
- corporations hyped languages for their own purposes, and languages stagnated or, worse, were crapified to an absurd level (witness Java).
Re: (Score:2)
professional language designers abdicated their role, and the void was filled by amateurs
I'm not sure how you define a "professional language designer", but I don't think Ritchie was one either way. It's precisely why there are a lot of messy things about the original design of C, such as its declarator syntax, or mixed signed/unsigned arithmetic rules, or implicit int. Some of it, I believe, comes from having the language designed as its compiler was written, and design being tweaked so that it would be easier to implement - it's a hacker's pragmatic approach to making a tool that's needed for
Platforms that can't run C (Score:3)
The kind of crap that would be easier to rewrite than refactor?
How about stuff that needs to be rewritten from scratch because a target platform can't run C? This is true of the web (or at least it was until Emscripten), and it's still true of Xbox Live Indie Games and Windows Phone 7.
Re: (Score:2)
RIGHT the fuck on. +5, bigup, the facts of life.... here gentlemen - you have your answer.
Re: (Score:2)
Re:It'd be nice if ... (Score:4, Funny)
Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.
Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year, each with their uninteresting quirks of syntactic sugar, competing on the basis of some uniquely uninteresting difference which can almost always be trivially implemented in any of the alternatives. They are hard to read in the same way that German is hard to read to someone who has only been reading German for a year: skill and speed comes through practice with the language, not from the ego of its authors.
Wow! Dr. Ritchie, everyone thought you were dead!
Re:It'd be nice if ... (Score:5, Insightful)
Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.
Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year.
This clearly shows you simply don't understand the problem. A good programmer can (and does) write a well structured, clean, DOCUMENTED and maintainable product in any language. The issue has nothing to do with the language used and everything to do with lack of discipline, inexperience, and a slapdash and unprofessional attitude. Usually the worst programmers are the ones who think that once the code is written and compiles clean, the job is done. For most of these people there is little hope of educating them, as they are incapable of seeing the bigger picture.
Re: (Score:2)
Yes, a true communicator switches to any of Earth's languages at will and celebrates the variety, eagerly perfecting his ability in any new language which some committee or group of enthusiasts recently invented. This is a realisable and good use of the copious time every human has available: the sugary topping has always been more important than the meal below.
Re: (Score:3)
Sounds great, but all that nice sounding theory doesn't apply in practice. For example, C and C++ in particular are languages that started out "simple" but became quagmires over time. It's impossible to write portable C/C++ code that meets your requirements of "well structured and clean".
Haven't you noticed how every cross-platform C/C++ library starts out with pages and pages of "MY_LIBRARY_INT32" and "MY_LIBRARY_EXPORT" and other redefinitions of "standard" types, keywords, and functions? That's because C
Re: (Score:3)
Meanwhile, this Java code will work on all platforms, processors, and compilers, forever and ever:
I'll switch to using a language other than C when a language other than C allows me to use wait-free shared-memory multiprocessing algorithms.
Re: (Score:2)
C# and Java both have atomic operations in the standard library. See Interlocked.CompareExchange [microsoft.com], and java.util.concurrent.atomic [oracle.com] for examples.
Multi-threaded programming is particularly easy in those languages, because a lot of their internals are inherently thread-safe. For example, strings are read-only, so they can be passed around risk free. Similarly, mark & sweep garbage collection is thread-safe, and doesn't suffer from the rare but complex to debug memory leaks that occur with reference counting
Re: (Score:2)
I was talking about multiprocessing, not multithreading.
Re: (Score:3)
java.util.concurrent.atomic is a perfect example of why Java is not a viable choice for the work I'm doing. One of the tasks I currently have to handle is multiprocess disjoint set construction (using the wait-free union-find algorithm), on a very large corpus. This algorithm requires each disjoint set tree node to contain two fields: a reference to its superset, and a rank counter. In Java, the only choice I have is to use an array of AtomicStampedReference<V>, which will always occupy at least two p
Re: (Score:3, Interesting)
Well, duh. That's the entire point of C. You're complaining that it's a language close to the metal.
Re: (Score:2)
"char" is probably the most stable type in C
But can't be used to represent characters, because Unicode requires at least 16 bits for the character type. So yeah... that's obvious.
No problem, I'll use wchar_t, which has a nice dependable size of... 8, 16, or 32 bits, and may be either signed or unsigned.
And btw in C you have intXX_t and uintXX_t types now.
Are you sure? They're only present in recent versions of the standard library... and wait for it... while they're defined to be exactly XX bits, they're not guaranteed to exist.
Re: (Score:2)
But can't be used to represent characters, because Unicode requires at least 16 bits for the character type.
Never heard of UTF-8 then, I take it? That works fine in regular old char arrays... (Though, to be fair, it introduces other issues, such as you can no longer depend on the length of the string being the number of characters in it...)
Re:It'd be nice if ... (Score:4, Interesting)
You're making basically the same argument as people were saying back when machine code was what people wrote and C was new. If you have an open mind, you can easily see that C has serious shortcomings by modern language standards.
C offers no abstractions for complex data types. It offers no subtyping. There's no facility for generic programming other than macros, which everyone knows suck. No support for closures or comprehensions. None of these things are "trivially implemented", as you state. Even its syntax sucks, as anyone would agree who's tried to declare a non-trivial function pointer.
Many common programming tasks require extensive pointer manipulation in C. Even the best programmers (I'm one of them, and I concede this point) make occasional mistakes with pointers, and they are the worst kind of bug: silently incorrect or a crash at a random place in the code.
C is perfectly appropriate for some projects, especially with really low-level code (as most C constructs translate directly to assembly). C++ is usually better, as it has a richer typing system and ability to do generic programming, but you need to be an expert as the language is full of pitfalls (which are mostly C's fault). For projects that don't need to be close to the hardware, scripting languages can multiply programmer productivity.
Re: (Score:2)
Re:It'd be nice if ... (Score:5, Informative)
Many common programming tasks require extensive pointer manipulation in C. Even the best programmers (I'm one of them, and I concede this point)
I'm seriously doubting your professed skill here. You don't ever have to do pointer arithmetic in C, unless you are counting parameter passing as 'extensive pointer manipulation,' but you pass parameters as pointers in Java too (that's why you can get an NPE). The most common use of pointer arithmetic is for array processing, but if you want to be safe you can just use the array[] notation and not worry about understanding pointer arithmetic (I usually do, unless I have a compelling need to use pointer arithmetic). Furthermore I don't even know what you are talking about when you say, "non-trivial function pointer." Aren't all function pointers the same, just a bunch of parameters and a return value? Or are you declaring an array of function pointers or something? That might be where your problems are coming from.
From experience I can say by far the thing that takes the most extra time when I am writing in C (compared to Java) is the lack of a good library, with common data structures like hash tables and lists and regular expressions. The number of times I've had to write a generic list library for some random platform, or figure out someone else's nonstandard implementation, is depressing.
Also the generics in Java are a double-edged sword. They allow more flexibility, but also let you get away with writing incredibly confusing code that can be extremely difficult to understand without a debugger. C code (really) tends to be a lot more readable. The downside is that it's usually a lot easier to refactor Java code without needing to rewrite a lot of interfaces (even when the interfaces were poorly written in the first place).
Ultimately though, a good programmer will write good code in any language. A poor programmer will likewise write poor code in any language.
Re: (Score:2)
Also the generics in Java are a double edged sword. They allow more flexibility, but allow you to get away with writing incredibly confusing code, that can be extremely difficult to understand without a debugger.
The design of generics in Java is flawed in more than one way, but can you give an example of "confusing code that can be extremely difficult to understand", especially "without a debugger" - that last part sounds completely nonsensical to me since Java generics are purely compile-time; they don't have any runtime variability by design (due to erasure).
Anyway, there are many better examples of generics done right. My personal favorite are OCaml functors, which could be easily slapped on top of C with only m
Re: (Score:3)
Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.
An upgrade to C++ is a very good idea though.
Re: (Score:2)
Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.
An upgrade to C++ is a very good idea though.
I know, let's call it C# or maybe Java.
Re:It'd be nice if ... (Score:4, Funny)
Java is NOT an upgrade to C++. There was a fork in the road: to the left went C++, to the right went Java. C++ took you through a swamp filled with poisonous snakes, quicksand and man-eating spiders. Java took you through a haunted forest, with werewolves and zombies.
No matter which path you took, you died before reaching your goal.
Re: (Score:2)
Java is NOT an upgrade to C++. There was a fork in the road: to the left went C++, to the right went Java. C++ took you through a swamp filled with poisonous snakes, quicksand and man-eating spiders. Java took you through a haunted forest, with werewolves and zombies.
No matter which path you took, you died before reaching your goal.
Java's got its weaknesses, but there's nothing seriously wrong with the language. The problem is the libraries, especially the enterprise libraries, and I mean both the EJB monstrosity and the consultant hell that passes for lightweight. Misuse of the idea of design patterns is to blame. Let's look for ways to tack on another layer of abstraction we'll never actually use, shall we?
Re: (Score:3)
http://thedailywtf.com/Articles/The-Integer-Cache.aspx
It's a WTF. What, you never saw a C or C++ WTF?
And it seems that you need a string builder class to help you manipulate strings. Sorry, but that tells me that you're doing strings wrong.
It's one of the tradeoffs that you have when you design what your strings look like. In C++, they are mutable, but they are "value types" in Java terms - i.e. copying them around actually copies the buffer, so when you pass a string to a function, it can freely mutate it since it has the pointer. But copying like that ain't cheap - it's why you have to pass by const reference where possible. Often can't return by reference, though. RVO (and, in C++11, move sema
Re:It'd be nice if ... (Score:4, Insightful)
And that is ultimately my main beef with C: it's impossible to write non-trivial code that DOESN'T make use of the pre-processor. Header guards in 2011? Really? C either needs to make an Objective-C-like import statement standard, or else make #pragma once standard and the default, so that in the rare case you actually need to include a file more than once, THEN you have to use a pre-processor command. I think the pre-processor is a really useful feature of C, but it should never be essentially mandatory to use it.
Re: (Score:2)
Re:It'd be nice if ... (Score:4, Insightful)
It's "broken" in a sense that, if you try real hard to break it, you can find ridiculous corner cases that can do that. In the meantime, there are millions of line of C and C++ code written using it, which compile and work just fine.
Re:It'd be nice if ... (Score:5, Funny)
Yeah, Python sure flooded the marketplace in the past year. Now, if you'll excuse me, I've got to check the breaking news about the Lewinsky scandal after buying some hot dot-com stocks while on the way to work at the World Trade Center because apparently it's the late nineties again somehow.
Re:It'd be nice if ... (Score:4, Funny)
Re:It'd be nice if ... (Score:4, Insightful)
Closed source code is the same, only you don't get to see it.
Re: (Score:2, Insightful)
I'm going to say that open source is bad and pre-emptively brand all disagreement as fanboyism so that my opinion is taken as authoritative.
FTFY
Re: (Score:3)
Re:It'd be nice if ... (Score:4, Funny)
Back on track, do you have anything useful to add besides pining for the fjords?
Re: (Score:2)
It is called playing to your strengths.
While producing well structured, well documented, clean and correct code in C would be quite a challenge, it could never approach some of the newer languages in these terms.
Re:It'd be nice if ... (Score:5, Interesting)
Most C coders seem to achieve obfuscation without any additional incentive.
You got it wrong: bad coders create bad code. Good coders know how to create good code. In any language.
When someone knows C well enough to create a truly obfuscated or compressed piece of portable C code that follows the rules of the language to a tee, i.e. that can be compiled strict or linted, and wins the IOCCC, it's a very good sign that this someone can create excellent C code.
I should know, I won the IOCCC years ago, and used it many times on my resume. When would-be employers told me "what's the IOCCC?", I knew they weren't going to be good employers. When they told me "oh, I see you won the IOCCC", they knew I could code good C, and I knew they grokked what I did. Winning the IOCCC helped me land a job a few times.
Re: (Score:3)
You got it wrong: bad coders create bad code. Good coders know how to create good code. In any language.
My favorite programming adage: "You can create bad Fortran in any language."
Re: (Score:2, Insightful)
While your code may be technically correct, compile, and do what it's intended to do, that does not make it good code. It just makes it code that works.
Look at IOCCC examples posted on wikipedia. If the average programmer (ie: your coworker) will need to spend more time thinking about the extra whitespace and the syntactic monstrosity that comprises the competition, then your code design sucks and you've ended up causing more headaches with your "good" code.
Re: (Score:2)
I'm not very familiar with this competition but that seems to be the very point. The winning code should be almost impossible to understand unless you are very good. This isn't good code in the traditional sense but in an ironic sense.
Re:It'd be nice if ... (Score:5, Informative)
Maybe I'm a little older than you think? :)
Re: (Score:2)
Re: (Score:2)
They did. It failed. They are back to what comes naturally, Erlang. :p
Re:It'd be nice if ... (Score:4, Insightful)
Sure, that could be nice as well, but the IOCCC provides great challenges and puzzles, something that a clean code contest wouldn't. And what would you rather see in your newspaper: difficult puzzles or easy ones? Or, for the youngsters here: would you rather play Wordfeud, or type the answer to 1 + 1 over and over again?
Besides that, the IOCCC entries contain mostly well structured and correct code, and afterwards they get documented as well. It's just not readable.
Re: (Score:2)
Go look up the Demoscene.
Re: (Score:3)
Let me tell you, no demo code is ever anywhere near well structured, well documented, clean or correct.
Re: (Score:3)
IOCCC code is hard to read
DemoScene code is efficient
Both are normal code taken to extremes
Re: (Score:2)
Re: (Score:2)
Heh.
Let me tell you, you don't get to 96k by writing clean code. You get there by writing utter unholy messes, and you get there by cheating like hell, and you get there by using every dirty trick in the book.
Also, you often do it in a week or so before the compo, and continue right up to the deadline, in the party hall, and you do it knowing you will never have to maintain or look at that code ever again after you hand it in.
If you think demoscene code is "clean", you have absolutely zero experience with it.
Re: (Score:2)
Considering I have done some demos myself, you'd be wrong.
The software running my entire research facility is in 4K, that's network stack, video feed controls, nutrient/water monitoring, the works.
You don't get good small executables writing crappy code.
Period.
Re: (Score:2)
Considering I have done some demos myself, you'd be wrong.
Links?
The software running my entire research facility is in 4K, that's network stack, video feed controls, nutrient/water monitoring, the works.
That is code that you are going to be maintaining. That is absolutely nothing like demo code.
Re: (Score:2)
The demos I have done are all for specific hardware in embedded platforms, not x86 machines.
And released at which party?
Re: (Score:2)
But no one wants that, and hence, Microsoft.
Underhanded C contest should return (Score:5, Interesting)
The IOCCC is cool, but the Underhanded C Contest [xcott.com] was a lot more valuable.
The entries for the IOCCC can show a lot of cleverness, but nobody in their right mind would accept such code. The beauty of the Underhanded C ones is that the code looks reasonable, but does extremely undesirable things.
Re:Underhanded C contest should return (Score:4, Interesting)
call me paranoid but this contest and the ioccc are the reasons why i don't particularly let anything from s.e.l. touch my systems. i am not a good enough coder to be able to tell if what it's doing is what it says it's doing or something the cia wants it to do..
Re:Underhanded C contest should return (Score:5, Insightful)
Re: (Score:3)
This makes me so happy... (Score:2, Insightful)
I hope there are many submissions... It's things like this that teach you the FULL amount of abuse a language can take while still making something that works. :-D
I would like to see this rule illustrated (Score:3)
The C language is not my thing per se, but I'd like to see simple C program code that illustrates the subtleties of C. Anyone?
Re:I would like to see this rule illustrated (Score:4, Insightful)
Look up "Duff's Device". There's a good example.
Re:Use Duff's Device (Score:5, Interesting)
don't use them anymore, go read that post: http://lkml.indiana.edu/hypermail/linux/kernel/0008.2/0171.html [indiana.edu]
Re:Use Duff's Device (Score:5, Interesting)
Emphasis mine. That's REALLY freaking interesting. Posting this AC before modding you up.
Re:Use Duff's Device (Score:5, Insightful)
Re: (Score:2)
C++ also encourages object-oriented programming techniques, which spread functional responsibility across multiple classes. This naturally leads to more, smaller functions, which may also mess with the instruction cache.
Re: (Score:2)
But then all those small functions inline, and you end up with the same big chunks of code as in C.
Re: (Score:2)
Re: (Score:2)
I haven't ever used a C++ compiler which didn't have some way to control inlining. Some even let you set hard limits on inlining depth or on resulting function size.
And it's not like C can't be inlined just as well. Especially if you enable link-time code generation, where the compiler can peek across translation unit boundaries.
Re: (Score:2)
I haven't ever used a C++ compiler which didn't have some way to control inlining
Yes, I know, I do work on a couple of C++ compilers...
Some even let you set hard limits on levels on resulting function size
... but it's always done at that kind of granularity. That's the problem. In every case, inlining can appear to be a locally correct optimisation. Every function that has something small inlined into it will become faster. You can measure that with microbenchmarks. The problem is that doing this
And it's not like C can't be inlined just as well
Not as easily. The C specification doesn't define ODR linkage, which makes creating inline functions more difficult. C++ templates get expanded in every co
Re: (Score:3)
Re: (Score:2)
This link: http://www.eecs.berkeley.edu/~necula/cil/cil016.html
describes some corner cases of C. You need some prior knowledge of C to appreciate the non-obviousness of the examples, though.
Awesome! (Score:4, Funny)
It's about time I got some more reference code.
The Internet is based on C (Score:5, Insightful)
If C did not get the job done for this kind of computing then it would have been replaced. The fact that C thrives in the systems programming domain is a tribute to its utility.
A proficient C coder can write clear, maintainable, efficient code that runs on many platforms. This requires both skill and practice. Not everyone is capable of doing this. It requires the ability to keep multiple competing abstractions in mind when coding. I think a lot of people try this and find it difficult and then blame the language. Those who persevere and learn this style of working can usually move on to other kinds of programming and also do excellent work.
Some problem domains require different languages and different skill sets. Personally, I like writing code where I know that if I were to look at the assembly code generated by the compiler I can see how it relates to the C code I wrote. I rarely do this, but it's good to know that I can if I want to. I'm not doing any C coding now, because I always use the language appropriate to the task. But I also know that my C coding skills give me a distinct advantage in solving difficult problems, no matter what they are,
Re:The Internet is based on C (Score:5, Interesting)
Without C code there would literally be no Internet.
Because obviously only C is Turing-complete.
Before I stir up any vitriol, I'm just kidding. I think C is under appreciated precisely because is provides only a thin abstraction that (hopefully) maps well to the target architecture, but otherwise stays out of the way. That is to say, when all you have is a hammer, you can easily shoot yourself in the foot.
Re:The Internet is based on C (Score:4, Informative)
uint16_t x = 0;
int32_t y = 0;
Almost every platform I've ever worked on has stdint...
Re:The Internet is based on C (Score:4, Insightful)
stdint.h came in with C99. There were decades where people hand-rolled their own versions so network communications would work...
Re: (Score:2)
Re: (Score:2)
O_o
Oh my god! This is why I'm not ever going back to C/C++ unless forced to at gunpoint.
Re: (Score:3)
The people on the standards committes are either saints, or abject evil scum. Maybe they're in a quantum superposition of both states, as long as you don't open the ISO report and take a look...
Re: (Score:2)
The reason why there's no guarantee is that there are actual real-world architectures on which it would be impossible to uphold such a guarantee (at least in a reasonably efficient way). E.g. SHARC [wikipedia.org] only has 32-bit words, so implementing an int16_t would be kinda tricky - sure, the compiler could do it with bitwise operations, but think about how a pointer to int16_t would look, and how the dereferencing operation would work.
In practice, if you need intN_t, you'll probably just assume that it's available, and
Re:The Internet is based on C (Score:5, Informative)
- I think people who put the "*" of pointer syntax near the variable name and not the type name when declaring pointers should be shot. It should always be int* pointer_to_int, not int *pointer_to_int.
I'm sure my complaints are unwarranted except for the first point.
But that's backwards of what the compiler really does. Consider this:
int* p, q;
What types do p and q have? p is a pointer-to-int; q is an int. By putting the * next to the type name it makes it look like all the things are int*, but they're not. By putting the * with the type (which I did for my first year of C coding) you're making reading the code harder rather than easier. It'd be like writing
a = b * c+d;
and trying to convey that the '+' binds tighter since it doesn't have spaces. That's not what the compiler will do and writing it so only serves to confuse the reader.
In addition, what you see at declaration is representative (modulo the weirdness of array subscripts and pointer dereference) of what you'd do to get the type. That is, int ***p means that you'd have to type ***p to get an int; *p alone would need another ** to get to an int, etc.
Re: (Score:2)
When I write C, I put my splats next to the variable name too; it's just how I've always done it. I think the Pascal way is best, where a pointer would be declared like this: p: ^integer;. This is very clear in my mind, as it reads "p is a pointer to an integer". int* p or int *p both read backwards to me; I just put the splat next to the variable name because that's where it's going to be when I dereference the pointer, so I might as well be consistent in my declarations as well.
Re: (Score:2)
If style issues bother you, run your code through a styler as you check it out of and back into your source-code management system.
Really, style and content in C are as separate as they are in HTML and CSS. If you want a certain way of spacing things, generate a rule that turns everything into your style before you see it and converts back to whatever the agreed style is before you publish it for others. It really makes no difference to the compiler, only the programmer. And the programmer that can't e
Re: (Score:2)
I hate "long long"
This dates back to Algol 68. In Algol, the basic numeric data types were INT and REAL, and you could prepend an arbitrary number of SHORT or LONG to them. Any implementation was guaranteed to support SHORT INT and LONG INT as distinct data types, respectively smaller and larger than INT, but any extra prefixes were conditionally supported - it was always legal to write, but it was implementation-defined whether e.g. LONG LONG INT is larger than LONG INT, or the same. I think it's a fairly interesting conven
Re: (Score:2)
Keyword: cognitive load. Case in point: the hilariously excruciating code example in the Linux man page of snprintf. If you need to jump through all these burning hoops to do something this mundane, imagine how much more your proficient C coder could achieve in a more sensible language with the same amount of effort.
A sensible C coder might use vasprintf instead of the example in that manpage. The fact that all the standard library functions aren't great for all (or sometimes any) use cases is hardly unique to C.
Re: (Score:2)
A sensible C coder might use vasprintf instead of the example in that manpage. The fact that all the standard library functions aren't great for all (or sometimes any) use cases is hardly unique to C.
The *a*printf functions are not in the C standard, so the portable coder would not use them. I guess opinion may vary about what is 'sensible' :)
Re:The Internet is based on C (Score:4, Insightful)
Problems with the C standard library certainly do exist and can expose security issues, but Windows security problems exist because the OS design emphasises user friendliness and backwards compatibility over tight protections.
It's like inside your home: you don't lock all your cupboards, drawers and doors - that would be painful. E.g., to walk from the kitchen to the living room you'd take out your key, unlock the door, go through, shut it, and relock it, each time. To make your home livable you keep it insecure, and that's how Windows was designed from the ground up.
But now suppose there's a magic internet wormhole that opens in your toilet room, and anybody can enter your house. Suddenly it makes sense to have locks on all the doors and cupboards etc, but it's too late. Windows + Internet = insecure.
Unix doesn't have this problem, because Unix was always designed as a hotel (multiuser OS) rather than a home. So there's locks on the rooms and the swimming pool needs an access card etc. If a wormhole opens in the hotel lobby or even in one of the guest rooms, there's limited access to most areas by design.
Re: (Score:2)
Firstly, Windows NT (i.e., contemporary Windows) was designed and built from day one as a multiuser OS.
Secondly, also from day 1, it has had a far more comprehensive and capable security infrastructure than traditional UNIX.
Thirdly, UNIX was originally built as a single-user OS. Multiuser capability was added (soon) afterwards. One rather visible aspect of this is the presence of a superuser (root).
You are wrong about pretty much everything you've written.
Re: (Score:2)
XP SP2 was the first time a (laughable) effort was actually made to enforce some security. With SP2, enforcing security on sensitive API calls meant something trivial like inserting a pop-up dialog box to ask the user for confirmation. All you had to do to bypass it was send a button click message to the dialog.
Re: (Score:2)
You liar! I can't find the swimming pool anywhere in Solaris, BSD, or Linux!
Re: (Score:3)
Actually, during the creation of NT, Microsoft licensed a network stack from Spider Systems, which was based on the BSD TCP/IP stack. Microsoft replaced this with its own stack for NT 3.5, which was the second version, and I believe that was the one that went into Windows 95. Some small userland utilities persisted after the stack was replaced, and who knows how much BSD code Microsoft's own network stack contains even to this day, seeing as we can't review Microsoft's source. Not that it really matters.
Re: (Score:2)
Actually, during the creation of NT, Microsoft licensed a network stack from Spider Systems, which was based on the BSD TCP/IP stack. Microsoft replaced this with its own stack for NT 3.5, which was the second version, and I believe that was the one that went into Windows 95.
More to the point, you can still see the heritage in the C API; it's recognizably similar throughout despite many other parts (e.g., file descriptors) being wildly different between Win and Unix. That's OK too. It means that Microsoft have a properly road-tested API in use. (The code itself may have gone, but that would be No Big Deal. While some C code really does survive for multiple decades, it's not really to be expected in any OS. APIs are much longer-lived.)
Oh no, someone obfuscated IOCCC for 5 years (Score:2)
but I am happy it is back
Because D is such a heavily used language... (Score:5, Insightful)
A number of quick points... Some people just don't know, so here are some practical speaking points...
-C has been around longer than most of the non-C programmers alive. That includes you people on this site, which has some of the smartest people in one of the most divisive areas of civil discourse: the "tech wars".
-D was such a better language... also, C++ because we never hear about C anymore.
-Java is on its way out, being deprecated by the largest company in the world, which also deprecated Flash (on mobile), which Adobe just acquiesced to, replaced by Google's new iteration. Maybe not in the next 5 years, but it can no longer grow... it will have to get smaller, with less support.
-Objective-C, used by Apple Inc., the largest company in the world, is a wholly compatible superset of (ANSI) C. There are no signs of change here. Big surprise: it's all the same hardware components, just in larger capacities, at faster rates, and in smaller form factors. C can't help us with the flux capacitor... but that has not been added to the standard CPU, memory, storage, etc. model.
-Google announced that Android will run a C-like language in the native space that uses the CPU and GPU. Even with Dart coming our way...
-CUDA... C is relevant in other (all) GPU spaces, and is the go-to choice, for the moment, to eke out more performance from a machine.
-And here is where the feelings get hurt: In college, I straddled the EE/CS line while being firmly EE. EEs learn C because it teaches them valuable things about the hardware, being only a very thin layer of abstraction over it. CS departments tend to concentrate on, well, anything else. Flavors of the year, interesting projects, etc. That is their place. My older brother went the CS route, 8 years before I got my turn and went EE. I admire him and his success greatly, but I know, push come to shove, I can talk about certain topics without talking about garbage collectors and universal typing.
So, please, if you've never used C in any significant way, just don't comment. Listen. People, young and old, have something to tell you about the most significant programming language ever invented.
And to bring this all together: When you are trying to eke out CPU cycles so your 3D rendering is above 60 fps on that mobile device, you will know why closeness to hardware and C, in particular, may be your best friend. Or a C-like language...
Another way to look at it: people who know C and have worked with it can't just unlearn it. They know what you non-C people know, but also have other experience. If MOST of them say C is indispensable, then how about you do the one thing some Tech Assholes never do: take someone else's advice. And STFU.
Can we just talk about something else that is awesome and not caught up in this stupid argument?
Re: (Score:3, Funny)
People, young and old, have something to tell you about the most significant programming language ever invented.
You mean LISP? I haven't seen anyone mention it yet.
Re: (Score:2)
Remember that most modern languages had their compilers or interpreters written in C (even if they are now self compiled), or often now Java ... and the JVM and compiler is written in C ...
My favorite IOCCC winning entry (Score:2)
We've all coded a quine, but this [ioccc.org] goes at least three steps further. Documentation here. [ioccc.org]
I am so looking forward to reading the 2012 winning entries!
Perl.. (Score:3)
I can't believe no one has mentioned this yet!
But, before perl, what was Larry Wall famous for?
Winning the IOCCC, not once, but twice.
Makes you think...