Programming News

The IOCCC Competition Is Back

Rui Lopes writes "After a five-year hiatus, the IOCCC (International Obfuscated C Code Contest) is back! This marks the 20th edition of the contest. Submissions are open between 12-Nov-2011 11:00 UTC and 12-Jan-2012 12:12 UTC. Don't forget to check this year's rules and guidelines."

Comments Filter:
  • by phantomfive ( 622387 ) on Sunday November 13, 2011 @02:27PM (#38042170) Journal
    But then we would never have had a piece of code that calculates its own area. Isn't that worth it? (LINK [wikipedia.org]).
  • by vadim_t ( 324782 ) on Sunday November 13, 2011 @02:36PM (#38042240) Homepage

    The IOCCC is cool, but the Underhanded C Contest [xcott.com] was a lot more valuable.

    The entries for the IOCCC can show a lot of cleverness, but nobody in their right mind would accept such code. The beauty of the Underhanded C ones is that the code looks reasonable, but does extremely undesirable things.
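
    For anyone who hasn't seen the style, here is a minimal sketch of the flavor (an illustrative example of my own, not an actual contest entry): a credential check that reads perfectly naturally but accepts an empty password.

        #include <string.h>

        /* Looks like a routine credential check... */
        int check_password(const char *stored, const char *entered)
        {
            /* Underhanded: comparing only strlen(entered) bytes means an
               empty input (length 0) "matches" any stored password, and
               so does any prefix of the stored one. */
            return strncmp(stored, entered, strlen(entered)) == 0;
        }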

  • by Rosco P. Coltrane ( 209368 ) on Sunday November 13, 2011 @02:51PM (#38042356)

    Most C coders seem to achieve obfuscation without any additional incentive.

    You got it wrong: bad coders create bad code. Good coders know how to create good code. In any language.

    When someone knows C well enough to create a truly obfuscated or compressed piece of portable C code that follows the rules of the language to a T, i.e. code that compiles under strict flags or passes lint, and wins the IOCCC, it's a very good sign that this someone can write excellent C code.

    I should know: I won the IOCCC years ago, and cited it many times on my resume. When would-be employers asked me "What's the IOCCC?", I knew they weren't going to be good employers. When they said "Oh, I see you won the IOCCC," they knew I could write good C, and I knew they grokked what I did. Winning the IOCCC helped me land a job a few times.

  • by Truekaiser ( 724672 ) on Sunday November 13, 2011 @02:57PM (#38042380)

    Call me paranoid, but this contest and the IOCCC are the reasons why I don't let anything from S.E.L. touch my systems. I'm not a good enough coder to tell whether what it's doing is what it says it's doing or something the CIA wants it to do.

  • Re:Use Duff's Device (Score:5, Interesting)

    by JonySuede ( 1908576 ) on Sunday November 13, 2011 @03:27PM (#38042578) Journal

    Don't use it anymore; go read this post: http://lkml.indiana.edu/hypermail/linux/kernel/0008.2/0171.html [indiana.edu]

  • Re:Use Duff's Device (Score:5, Interesting)

    by Anonymous Coward on Sunday November 13, 2011 @04:06PM (#38042798)

    Jim Gettys has a wonderful explanation of this effect in the X server. It turns out that with branch predictions and the relative speed of CPU vs. memory changing over the past decade, loop unrolling is pretty much pointless. In fact, by eliminating all instances of Duff's Device from the XFree86 4.0 server, the server shrunk in size by _half_ _a_ _megabyte_ (!!!), and was faster to boot, because the elimination of all that excess code meant that the X server wasn't thrashing the cache lines as much.

    Emphasis mine. That's REALLY freaking interesting. Posting this AC before modding you up.
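
    For reference, Duff's Device is the unrolling trick under discussion: a switch that jumps into the middle of an eight-way unrolled do-while. A minimal sketch of the classic form (assumes count > 0):

        /* Copies count bytes from `from` to `to`, eight at a time; the
           switch dispatches into the loop body to handle the remainder. */
        void duff_copy(char *to, const char *from, int count)
        {
            int n = (count + 7) / 8;
            switch (count % 8) {
            case 0: do { *to++ = *from++;
            case 7:      *to++ = *from++;
            case 6:      *to++ = *from++;
            case 5:      *to++ = *from++;
            case 4:      *to++ = *from++;
            case 3:      *to++ = *from++;
            case 2:      *to++ = *from++;
            case 1:      *to++ = *from++;
                    } while (--n > 0);
            }
        }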

  • by Sduic ( 805226 ) on Sunday November 13, 2011 @05:26PM (#38043228)

    Without C code there would literally be no Internet.

    Because obviously only C is Turing-complete.

    Before I stir up any vitriol: I'm just kidding. I think C is underappreciated precisely because it provides only a thin abstraction that (hopefully) maps well to the target architecture but otherwise stays out of the way. That is to say, when all you have is a hammer, you can easily shoot yourself in the foot.

  • by Anonymous Coward on Sunday November 13, 2011 @05:56PM (#38043352)

    You're making basically the same argument people made back when everyone wrote machine code and C was new. If you have an open mind, you can easily see that C has serious shortcomings by modern language standards.

    C offers no abstractions for complex data types. It offers no subtyping. There's no facility for generic programming other than macros, which everyone knows suck. No support for closures or comprehensions. None of these things is "trivially implemented", as you state. Even its syntax sucks, as anyone who has tried to declare a non-trivial function pointer would agree (see the sketch after this comment).

    Many common programming tasks require extensive pointer manipulation in C. Even the best programmers (I'm one of them, and I concede this point) make occasional mistakes with pointers, and those are the worst kind of bug: silent incorrectness or a crash at a random place in the code.

    C is perfectly appropriate for some projects, especially really low-level code (as most C constructs translate directly to assembly). C++ is usually better, as it has a richer type system and the ability to do generic programming, but you need to be an expert, as the language is full of pitfalls (which are mostly C's fault). For projects that don't need to be close to the hardware, scripting languages can multiply programmer productivity.
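
    The function-pointer complaint above deserves an illustration. The standard library's signal() is the canonical example of opaque C declarator syntax (set_handler below is a hypothetical name used only for contrast):

        /* signal() takes an int and a pointer to a function (int -> void),
           and returns a pointer to a function (int -> void). Without a
           typedef, the declaration is famously hard to read: */
        void (*signal(int sig, void (*handler)(int)))(int);

        /* The same shape with a typedef is far clearer: */
        typedef void (*sighandler_t)(int);
        sighandler_t set_handler(int sig, sighandler_t handler);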

  • by Billly Gates ( 198444 ) on Sunday November 13, 2011 @06:03PM (#38043410) Journal

    No language is perfect, and I like to think of using a particular language for a particular purpose. There are some places C should not be used today; GNOME is a classic example if you want to bind OO languages through GObject.

    One issue with C (I know I'm going to get flamed here) is security. Most of Windows' security flaws come from C, due to buffer and stack overflows, and even attack vectors that languages like Pascal handle better when compiled down to assembly. The Marines used old Macs for years for this reason. I am not a computer science major, so you can flame me all you want for not knowing anything, but XP SP2 and OpenBSD had to introduce special mitigations because a buffer or stack overflow can result in code execution (a sketch of the classic bug follows this comment). Unix had a bad rap with security early on, before Windows 2000, and much of that could be due to C and its libraries. Don't tell me it is the programmer's fault, as he or she has no clue what the resulting assembly code does or how it behaves when a buffer is full.

    I am not a troll here or anti-C, but this is something overlooked by programmers who grew up only with C, C++, or Java/C#, which are all derived from or influenced by C itself.
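
    A minimal sketch of the overflow bug described above (function names are hypothetical): strcpy performs no bounds checking, so oversized input writes past the buffer and can clobber the saved return address, which is exactly the kind of code execution the XP SP2 and OpenBSD mitigations try to contain.

        #include <stdio.h>
        #include <string.h>

        /* Vulnerable: input longer than 15 characters (plus the NUL)
           writes past buf and can overwrite the stack. */
        void greet(const char *name)
        {
            char buf[16];
            strcpy(buf, name);            /* unchecked copy: the bug */
            printf("Hello, %s\n", buf);
        }

        /* Bounds-checked version: snprintf truncates instead of overflowing. */
        void greet_safe(const char *name)
        {
            char buf[16];
            snprintf(buf, sizeof buf, "%s", name);
            printf("Hello, %s\n", buf);
        }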

  • by BenoitRen ( 998927 ) on Monday November 14, 2011 @07:46AM (#38046994)

    Haven't you noticed how every cross-platform C/C++ library starts out with pages and pages of "MY_LIBRARY_INT32" and "MY_LIBRARY_EXPORT" and other redefinitions of "standard" types, keywords, and functions? That's because C is a badly designed language where the behaviour and/or availability of even basic language keywords like "int" is a crap shoot that depends on the compiler and the target processor type.

    Well, duh. That's the entire point of C. You're complaining that it's a language close to the metal.

    Still, since C99 there are fixed-width integer types that define exactly how large they are and whether they're signed or unsigned (see the sketch after this comment). Yes, they are widely available; it's been over ten years since their introduction.

    Meanwhile, this Java code will work on all platforms, processors, and compilers, forever and ever:

    At least until Java decides to deprecate parts of its standard library. Java code written during the 90s no longer works. So much for "write once".
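
    For anyone who hasn't used them, a minimal sketch of the C99 fixed-width types mentioned above:

        #include <inttypes.h>
        #include <stdint.h>
        #include <stdio.h>

        int main(void)
        {
            /* Exactly 32 and 64 bits, signedness explicit, on every
               conforming compiler regardless of the target processor. */
            int32_t  a = -42;
            uint64_t b = 1ULL << 40;

            /* <inttypes.h> supplies matching printf format macros. */
            printf("a = %" PRId32 ", b = %" PRIu64 "\n", a, b);
            return 0;
        }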

