The IOCCC Competition Is Back

Rui Lopes writes "After a 5-year hiatus, the IOCCC (International Obfuscated C Code Contest) is back! This marks the 20th edition of the contest. Submissions are open between 12-Nov-2011 11:00 UTC and 12-Jan-2012 12:12 UTC. Don't forget to check this year's rules and guidelines."
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Sunday November 13, 2011 @02:25PM (#38042160)
    Comment removed based on user account deletion
    • by phantomfive ( 622387 ) on Sunday November 13, 2011 @02:27PM (#38042170) Journal
      But then we never would have had a piece of code that calculates its own area. Isn't that worth it? (LINK [wikipedia.org]).
    • by masternerdguy ( 2468142 ) on Sunday November 13, 2011 @02:35PM (#38042234)
      This is a good competition because it helps exploit the guts of C in new and exciting ways. Go back to your clean and neat database client if you can't play with the cowboys.
    • by Hazel Bergeron ( 2015538 ) on Sunday November 13, 2011 @02:37PM (#38042250) Journal

      Most C coders seem to achieve obfuscation without any additional incentive.

      Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.

      Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year, each with their uninteresting quirks of syntactic sugar, competing on the basis of some uniquely uninteresting difference which can almost always be trivially implemented in any of the alternatives. They are hard to read in the same way that German is hard to read to someone who has only been reading German for a year: skill and speed comes through practice with the language, not from the ego of its authors.

      • by Anonymous Coward on Sunday November 13, 2011 @02:46PM (#38042308)

        Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.

        Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year, each with their uninteresting quirks of syntactic sugar, competing on the basis of some uniquely uninteresting difference which can almost always be trivially implemented in any of the alternatives. They are hard to read in the same way that German is hard to read to someone who has only been reading German for a year: skill and speed comes through practice with the language, not from the ego of its authors.

        +1, it all started going downhill when:
        - professional language designers abdicated their role, and the void was filled by amateurs
        - people who use these languages have no fucking clue what they're doing and we're all paying the price
        - corporations hyped languages for their own purposes, and languages stagnated or, worse, were crapified to an absurd level (witness Java).

        • professional language designers abdicated their role, and the void was filled by amateurs

          I'm not sure how you define a "professional language designer", but I don't think Ritchie was one either way. It's precisely why there are a lot of messy things about the original design of C, such as its declarator syntax, or mixed signed/unsigned arithmetic rules, or implicit int. Some of it, I believe, comes from having the language designed as its compiler was written, and design being tweaked so that it would be easier to implement - it's a hacker's pragmatic approach to making a tool that's needed for

      • by 93 Escort Wagon ( 326346 ) on Sunday November 13, 2011 @02:59PM (#38042390)

        Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.

        Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year, each with their uninteresting quirks of syntactic sugar, competing on the basis of some uniquely uninteresting difference which can almost always be trivially implemented in any of the alternatives. They are hard to read in the same way that German is hard to read to someone who has only been reading German for a year: skill and speed comes through practice with the language, not from the ego of its authors.

        Wow! Dr. Ritchie, everyone thought you were dead!

      • by petes_PoV ( 912422 ) on Sunday November 13, 2011 @04:13PM (#38042842)

        Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.

        Contrast this with the ten dozen other fly-by-night half-baked languages which have flooded the marketplace over the past year.

        This clearly shows you simply don't understand the problem. A good programmer can (and does) write well structured, clean, DOCUMENTED and maintainable product in any language. The issue has nothing to do with the language used and everything to do with lack of discipline, inexperience and a slapdash and unprofessional attitude. Usually the worst programmers are the ones who think that once the code is written and compiles clean, the job is done. For most of these people there is little hope of educating them as they are incapable of seeing the bigger picture.

        • Yes, a true communicator switches to any of Earth's languages at will and celebrates the variety, eagerly perfecting his ability in any new language which some committee or group of enthusiasts recently invented. This is a realisable and good use of the copious time every human has available: the sugary topping has always been more important than the meal below.

        • by bertok ( 226922 )

          Sounds great, but all that nice-sounding theory doesn't apply in practice. For example, C and C++ in particular are languages that started out "simple" but became quagmires over time. It's impossible to write portable C/C++ code that meets your requirements of "well structured and clean".

          Haven't you noticed how every cross-platform C/C++ library starts out with pages and pages of "MY_LIBRARY_INT32" and "MY_LIBRARY_EXPORT" and other redefinitions of "standard" types, keywords, and functions? That's because C

          • Meanwhile, this Java code will work on all platforms, processors, and compilers, forever and ever:

            I'll switch to using a language other than C when a language other than C allows me to use wait-free shared-memory multiprocessing algorithms.
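
            To be concrete, the kind of primitive I mean maps directly onto the new C11 <stdatomic.h> (a minimal sketch with a toy counter, not the real shared-memory code):

              #include <stdatomic.h>
              #include <stdio.h>

              static _Atomic int counter = 0;

              int main(void) {
                  /* One fetch-and-add is a single wait-free step: no lock, no retry loop. */
                  atomic_fetch_add(&counter, 1);

                  /* Compare-and-swap: succeeds only if counter still holds `expected`. */
                  int expected = 1;
                  if (atomic_compare_exchange_strong(&counter, &expected, 2))
                      printf("counter advanced to 2\n");
                  return 0;
              }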

            • by bertok ( 226922 )

              C# and Java both have atomic operations in the standard library. See Interlocked.CompareExchange [microsoft.com], and java.util.concurrent.atomic [oracle.com] for examples.

              Multi-threaded programming is particularly easy in those languages, because a lot of their internals are inherently thread-safe. For example, strings are read-only, so they can be passed around risk free. Similarly, mark & sweep garbage collection is thread-safe, and doesn't suffer from the rare but complex to debug memory leaks that occur with reference counting

              • I was talking about multiprocessing, not multithreading.

              • java.util.concurrent.atomic is a perfect example of why Java is not a viable choice for the work I'm doing. One of the tasks I currently have to handle is multiprocess disjoint set construction (using the wait-free union-find algorithm), on a very large corpus. This algorithm requires each disjoint set tree node to contain two fields: a reference to its superset, and a rank counter. In Java, the only choice I have is to use an array of AtomicStampedReference<V>, which will always occupy at least two p

          • Re: (Score:3, Interesting)

            by BenoitRen ( 998927 )

            Haven't you noticed how every cross-platform C/C++ library starts out with pages and pages of "MY_LIBRARY_INT32" and "MY_LIBRARY_EXPORT" and other redefinitions of "standard" types, keywords, and functions? That's because C is a badly designed language where the behaviour and/or availability of even basic language keywords like "int" is a crap shoot that depends on the compiler and the target processor type

            Well, duh. That's the entire point of C. You're complaining that it's a language close to the metal.
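
            And yes, that preamble really does look like this on most projects (a sketch; the type and macro names are the ones from your post, and the exact conditionals vary per toolchain):

              /* my_library.h -- typical cross-platform boilerplate */
              #if defined(_MSC_VER)
                typedef __int32           MY_LIBRARY_INT32;
                #define MY_LIBRARY_EXPORT __declspec(dllexport)
              #else
                #include <stdint.h>
                typedef int32_t           MY_LIBRARY_INT32;
                #define MY_LIBRARY_EXPORT __attribute__((visibility("default")))
              #endif

              MY_LIBRARY_EXPORT MY_LIBRARY_INT32 my_library_version(void);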


      • by Anonymous Coward on Sunday November 13, 2011 @05:56PM (#38043352)

        You're making basically the same argument people made back when everyone wrote machine code and C was new. If you have an open mind, you can easily see that C has serious shortcomings by modern language standards.

        C offers no abstractions for complex data types. It offers no subtyping. There's no facility for generic programming other than macros, which everyone knows suck. No support for closures or comprehensions. None of these things are "trivially implemented", as you state. Even its syntax sucks, as anyone who's tried to declare a non-trivial function pointer will agree.

        Many common programming tasks require extensive pointer manipulation in C. Even the best programmers (I'm one of them, and I concede this point) make occasional mistakes with pointers, and they are the worst kind of bug: silently incorrect results, or a crash at a random place in the code.

        C is perfectly appropriate for some projects, especially with really low-level code (as most C constructs translate directly to assembly). C++ is usually better, as it has a richer typing system and ability to do generic programming, but you need to be an expert as the language is full of pitfalls (which are mostly C's fault). For projects that don't need to be close to the hardware, scripting languages can multiply programmer productivity.

        • If you occasionally make some mistakes with pointers, it is still 10,000 times better than what an inexperienced C# developer could do with all the neat language features. You simply have no idea how obfuscated his code could become, and how much you begin to like the idea of having revoked the death penalty...
        • by phantomfive ( 622387 ) on Monday November 14, 2011 @01:40AM (#38045566) Journal

          Many common programming tasks require extensive pointer manipulation in C. Even the best programmers (I'm one of them, and I concede this point)

          I'm seriously doubting your professed skill here. You don't ever have to do pointer arithmetic in C, unless you are counting parameter passing as 'extensive pointer manipulation,' but you pass parameters as pointers in Java too (that's why you can get an NPE). The most common use of pointer arithmetic is for array processing, but if you want to be safe you can just use the array[] notation and not worry about understanding pointer arithmetic (I usually do, unless I have a compelling need to use pointer arithmetic). Furthermore I don't even know what you are talking about when you say, "non-trivial function pointer." Aren't all function pointers the same, just a bunch of parameters and a return value? Or are you declaring an array of function pointers or something? That might be where your problems are coming from.
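
          For what it's worth, even the gnarliest case I can think of, an array of function pointers, is tame once you reach for a typedef (a made-up example, not anyone's real API):

            #include <stdio.h>

            /* Without a typedef: an array of 4 pointers to functions
               taking (int, char *) and returning int. */
            int (*handlers[4])(int, char *);

            /* With a typedef the same declaration reads naturally. */
            typedef int (*handler_fn)(int, char *);
            handler_fn handlers2[4];

            static int log_handler(int code, char *msg) {
                return printf("[%d] %s\n", code, msg);
            }

            int main(void) {
                handlers[0] = log_handler;
                handlers2[0] = log_handler;
                return handlers[0](1, "hello") < 0;
            }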

          From experience I can say by far the thing that takes the most extra time when I am writing in C (compared to Java) is the lack of a good library, with common data structures like hash tables and lists and regular expressions. The number of times I've had to write a generic list library for some random platform, or figure out someone else's nonstandard implementation, is depressing.
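
          By "a generic list library" I mean the same few dozen lines that every project seems to grow from scratch, roughly like this (names invented for the sketch):

            #include <stdlib.h>

            /* Minimal void*-based singly linked list, the kind that gets
               rewritten on every platform that lacks a common library. */
            struct node {
                void        *data;
                struct node *next;
            };

            static struct node *list_push(struct node *head, void *data) {
                struct node *n = malloc(sizeof *n);
                if (n == NULL)
                    return head;   /* caller can't even tell; part of why this is tedious */
                n->data = data;
                n->next = head;
                return n;
            }

            static void list_free(struct node *head) {
                while (head != NULL) {
                    struct node *next = head->next;
                    free(head);
                    head = next;
                }
            }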

          Also, the generics in Java are a double-edged sword. They allow more flexibility, but they let you get away with writing incredibly confusing code that can be extremely difficult to understand without a debugger. C code (really) tends to be a lot more readable. The trade-off is that it's usually a lot easier to refactor Java code without needing to rewrite a lot of interfaces (even when the interfaces were poorly written in the first place).

          Ultimately though, a good programmer will write good code in any language. A poor programmer will likewise write poor code in any language.

          • Also, the generics in Java are a double-edged sword. They allow more flexibility, but they let you get away with writing incredibly confusing code that can be extremely difficult to understand without a debugger.

            The design of generics in Java is flawed in more than one way, but can you give an example of "confusing code that can be extremely difficult to understand", especially "without a debugger" - that last part sounds completely nonsensical to me since Java generics are purely compile-time; they don't have any runtime variability by design (due to erasure).

            Anyway, there are many better examples of generics done right. My personal favorite are OCaml functors, which could be easily slapped on top of C with only m

      • Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.

        An upgrade to C++ is a very good idea though.

        • by syousef ( 465911 )

          Nonsense. C is simple and, while some smart programmers think it's necessary to over-use the preprocessor (even the Linux kernel is sometimes guilty), it's a language you can learn once and apply productively for the rest of your life.

          An upgrade to C++ is a very good idea though.

          I know, let's call it C# or maybe Java.

          • by Austerity Empowers ( 669817 ) on Monday November 14, 2011 @12:53AM (#38045372)

            Java is NOT an upgrade to C++. There was a fork in the road, to the left went C++ to the right went Java. C++ took you through a swamp filled with poisonous snakes, quicksand and man eating spiders. Java took you through a haunted forest, with werewolves and zombies.

            No matter which path you took, you died before reaching your goal.

            • by syousef ( 465911 )

              Java is NOT an upgrade to C++. There was a fork in the road, to the left went C++ to the right went Java. C++ took you through a swamp filled with poisonous snakes, quicksand and man eating spiders. Java took you through a haunted forest, with werewolves and zombies.

              No matter which path you took, you died before reaching your goal.

              Java's got its weaknesses, but there's nothing seriously wrong with the language. The problem is the libraries, especially the enterprise libraries, and I mean both the EJB monstrosity and the consultant hell that passes for lightweight. Misuse of the idea of design patterns is to blame. Let's look for ways to tack on another layer of abstraction we'll never actually use, shall we?

      • by antifoidulus ( 807088 ) on Sunday November 13, 2011 @07:23PM (#38043822) Homepage Journal
        while some smart programmers think it's necessary to over-use the preprocessor

        And that is ultimately my main beef with C: it's impossible to write non-trivial code that DOESN'T make use of the pre-processor. Header guards in 2011? Really? C either needs to make an Objective-C-like import statement standard, or else make #pragma once standard and the default, so that in the rare case where you actually do need to include a file more than once, THEN you have to use a pre-processor command. I think the pre-processor is a really useful feature of C, but it should never be essentially mandatory to use it.
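
        For anyone who has been spared it so far, the dance looks like this (util.h is a made-up name):

          /* util.h -- the include-guard boilerplate every C header carries */
          #ifndef UTIL_H
          #define UTIL_H

          int util_add(int a, int b);

          #endif /* UTIL_H */

        The widely supported but non-standard alternative is a single #pragma once at the top of the header, which is exactly the kind of thing that ought to be blessed by the standard.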
    • It is called playing to your strengths.
      While producing well structured, well documented, clean and correct code in C would be quite a challenge, it could never approach some of the newer languages in these terms.

    • by Rosco P. Coltrane ( 209368 ) on Sunday November 13, 2011 @02:51PM (#38042356)

      Most C coders seem to achieve obfuscation without any additional incentive.

      You got it wrong: bad coders create bad code. Good coders know how to create good code. In any language.

      When someone knows C well enough to create a truly obfuscated or compressed piece of portable C code that follows the rules of the language to a tee, i.e. that can be compiled with strict flags or linted cleanly, and wins the IOCCC, it's a very good sign that this someone can create excellent C code.

      I should know: I won the IOCCC years ago, and have mentioned it many times on my resume. When would-be employers asked me "what's the IOCCC?", I knew they weren't going to be good employers. When they told me "oh, I see you won the IOCCC", they knew I could write good C, and I knew they grokked what I did. Winning the IOCCC helped me land a job a few times.

      • You got it wrong: bad coders create bad code. Good coders know how to create good code. In any language.

        My favorite programming adage: "You can create bad Fortran in any language."

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        While your code may be technically correct, compile, and do what it's intended to do, that does not make it good code. It just makes it code that works.

        Look at the IOCCC examples posted on Wikipedia. If the average programmer (i.e. your coworker) needs to spend extra time puzzling over the whitespace tricks and the syntactic monstrosities that make up the competition, then your code design sucks and you've ended up causing more headaches with your "good" code.

        • I'm not very familiar with this competition but that seems to be the very point. The winning code should be almost impossible to understand unless you are very good. This isn't good code in the traditional sense but in an ironic sense.

    • They did. It failed. They are back to what comes naturally, Erlang. :p

    • by tgv ( 254536 ) on Sunday November 13, 2011 @02:54PM (#38042368) Journal

      Sure, that could be nice as well, but the IOCCC provides great challenges and puzzles, something that a clean code contest wouldn't. And what would you rather see in your newspaper: difficult puzzles or easy ones? Or, for the youngsters here: would you rather play Wordfeud, or type the answer to 1 + 1 over and over again?

      Besides that, the IOCCC entries contain mostly well structured and correct code, and afterwards they get documented as well. It's just not readable.

    • by Khyber ( 864651 )

      Go look up the Demoscene.

      • by Goaway ( 82658 )

        Let me tell you, no demo code is ever anywhere near well structured, well documented, clean or correct.

        • IOCCC code is hard to read

          DemoScene code is efficient

          Both are normal code taken to extremes

    • by Surt ( 22457 )

      But no one wants that, and hence, Microsoft.

  • by vadim_t ( 324782 ) on Sunday November 13, 2011 @02:36PM (#38042240) Homepage

    The IOCCC is cool, but the Underhanded C Contest [xcott.com] was a lot more valuable.

    The entries for the IOCCC can show a lot of cleverness, but nobody in their right mind would accept such code. The beauty of the Underhanded C ones is that the code looks reasonable, but does extremely undesirable things.

  • by Anonymous Coward

    I hope there are many submissions... It's things like this that teach you the FULL amount of abuse a language can take while still making something that works. :-D

  • by bogaboga ( 793279 ) on Sunday November 13, 2011 @02:36PM (#38042246)

    To illustrate some of the subtleties of the C language.

    The C language is not my thing per se, but I'd like to see simple C program code that illustrates the subtleties of C. Anyone?

    • by Anonymous Coward on Sunday November 13, 2011 @02:49PM (#38042340)

      Look up "Duff's Device". There's a good example.

    • by Bomazi ( 1875554 )

      This link: http://www.eecs.berkeley.edu/~necula/cil/cil016.html describes some corner cases of C. You need some prior knowledge of C to appreciate the non-obviousness of the examples, though.
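
      Not from that page, but two small examples of the flavor:

        #include <stdio.h>

        int main(void) {
            /* Usual arithmetic conversions: -1 is converted to unsigned and
               becomes a huge value, so this comparison is true. */
            if (-1 > 1U)
                printf("-1 > 1U\n");

            /* a[i] is defined as *(a + i), so indexing is commutative. */
            char s[] = "corner";
            printf("%c\n", 3[s]);   /* prints 'n' */
            return 0;
        }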

  • Awesome! (Score:4, Funny)

    by Anonymous Coward on Sunday November 13, 2011 @03:13PM (#38042478)

    It's about time I got some more reference code.

  • by Required Snark ( 1702878 ) on Sunday November 13, 2011 @04:12PM (#38042830)
    I find all the "C sucks" comments to be both amusing and stupid. Without C code there would literally be no Internet. Every bit you are sending and receiving passes through code written in C. The operating systems running on 99.99% or more of the computers online are built in C, and both the Windows and Linux TCP/IP stacks are written in C.

    If C did not get the job done for this kind of computing, it would have been replaced. The fact that C thrives in the systems programming domain is a tribute to its utility.

    A proficient C coder can write clear, maintainable, efficient code that runs on many platforms. This requires both skill and practice. Not everyone is capable of doing this. It requires the ability to keep multiple competing abstractions in mind when coding. I think a lot of people try this and find it difficult and then blame the language. Those who persevere and learn this style of working can usually move on to other kinds of programming and also do excellent work.

    Some problem domains require different languages and different skill sets. Personally, I like writing code where I know that if I were to look at the assembly generated by the compiler, I could see how it relates to the C code I wrote. I rarely do this, but it's good to know that I can if I want to. I'm not doing any C coding now, because I always use the language appropriate to the task. But I also know that my C coding skills give me a distinct advantage in solving difficult problems, no matter what they are,

    • by Sduic ( 805226 ) on Sunday November 13, 2011 @05:26PM (#38043228)

      Without C code there would literally be no Internet.

      Because obviously only C is Turing-complete.

      Before I stir up any vitriol: I'm just kidding. I think C is underappreciated precisely because it provides only a thin abstraction that (hopefully) maps well to the target architecture, but otherwise stays out of the way. That is to say, when all you have is a hammer, you can easily shoot yourself in the foot.

  • but I am happy it is back

  • by Anonymous Coward on Sunday November 13, 2011 @07:50PM (#38043968)

    A number of quick points... Some people just don't know, so here are some practical speaking points...

    -C has been around longer than most of the non-C programmers alive today. That includes you people on this site, which has some of the smartest people from one of the most divisive areas in the civil space: the "tech wars".

    -D was such a better language... also, C++ because we never hear about C anymore.

    -Java is on its way out, being deprecated by the largest company in the world, which also deprecated Flash (on mobile), which Adobe just acquiesced to, replaced by Google's new iteration. Maybe not in the next 5 years, but it can no longer grow... it will have to get smaller, with less support.

    -Objective-C, used by Apple Inc., the largest company in the world, is a wholly compatible superset of (ANSI) C. There are no signs of change here. Big surprise: it's all the same hardware components, just in larger capacities, at faster rates, and in smaller form factors. C can't help us with the flux capacitor... but that has not been added to the standard CPU, memory, storage, etc. model.

    -Google announced that Android will run a C-like language in the native space that uses the CPU and GPU. Even with Dart coming our way...

    -CUDA... C is relevant in other (all) GPU spaces, which are, for the moment, the go-to option for eking out more performance from a machine.

    -And here is where the feelings get hurt: In college, I straddled the EE/CS line while being firmly EE. EEs learn C because it teaches them valuable things about the hardware, being a very thin abstraction over it. CS departments tend to concentrate on, well, anything else. Flavors of the year, interesting projects, etc. That is their place. My older brother went the CS route, 8 years before I got my turn and went EE. I admire him and his success greatly, but I know that, if push comes to shove, I can talk about certain topics without talking about garbage collectors and universal typing.

    So, please, if you've never used C in any significant way, just don't comment. Listen. People, young and old, have something to tell you about the most significant programming language ever invented.

    And to bring this all together: When you are trying to eke out CPU cycles so your 3D rendering is above 60 fps on that mobile device, you will know why closeness to hardware and C, in particular, may be your best friend. Or a C-like language...

    Another way to look at it: People who know C and have worked with it can't just unknow it. They know what you non-C people know, but they also have other experience. If MOST of them say C is indispensable, then how about you do the one thing some Tech Assholes never do: take someone else's advice. And STFU.

    Can we just talk about something else that is awesome and not caught up in this stupid argument?

    • Re: (Score:3, Funny)

      by Anonymous Coward

      People, young and old, have something to tell you about the most significant programming language ever invented.

      You mean LISP? I haven't seen anyone mention it yet.

    • Remember that most modern languages had their compilers or interpreters written in C (even if they are now self-compiled), or, often now, in Java... and the JVM and its compiler are written in C...

  • We've all coded a quine, but this [ioccc.org] goes at least three steps further. Documentation here. [ioccc.org]
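
    For comparison, a minimal, unobfuscated C quine is only three lines (shown indented here; in the real file they must be flush-left so the output matches the source byte for byte):

      #include <stdio.h>
      const char *s = "#include <stdio.h>%cconst char *s = %c%s%c;%cint main(void) { printf(s, 10, 34, s, 34, 10, 10); return 0; }%c";
      int main(void) { printf(s, 10, 34, s, 34, 10, 10); return 0; }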

    I am so looking forward to reading the 2012 winning entries!

  • by bored ( 40072 ) on Monday November 14, 2011 @11:51AM (#38048790)

    I can't believe no one has mentioned this yet!

    But, before Perl, what was Larry Wall famous for?

    Winning the IOCCC, not once, but twice.

    Makes you think...
