Programming Stats News

C Beats Java As Number One Language According To TIOBE Index 535

mikejuk writes "Every January it is traditional to compare the state of the languages as indicated by the TIOBE index. So what's up and what's down this year? There have been headlines that C# is the language of the year, but this is based on a new language index. What the TIOBE index shows is that Java is no longer number one as it has been beaten by C — yes, C, not C++ or even Objective-C."
This discussion has been archived. No new comments can be posted.

C Beats Java As Number One Language According To TIOBE Index

Comments Filter:
  • Woohoo (Score:2, Interesting)

    by Anonymous Coward

    Go C!

    • Re:Woohoo (Score:5, Funny)

      by K. S. Kyosuke ( 729550 ) on Monday January 07, 2013 @02:17PM (#42508135)

      Go C!

      Actually, Go has some catching up to do on C.

    • Re:Woohoo (Score:5, Funny)

      by Anonymous Coward on Monday January 07, 2013 @02:59PM (#42508747)

      oh no you don't
      my friend told me to visit go c before
      i'm not falling for that one again

    • Re:Woohoo (Score:4, Interesting)

      by ByOhTek ( 1181381 ) on Monday January 07, 2013 @03:12PM (#42508941) Journal

      Not going to complain about that.

      It's depressing that Objective-C, Ruby and VB.Net have gone up, and to see C# go down...
      But nice to see C and Bash go up, as well as Java go down. Then again, Java goes down on everything, that's how much it blows. I need a job that doesn't require me to program so much of that. Oh well, occasionally I can get to use C, Bash or Python...

      • Re: (Score:3, Insightful)

        by Megane ( 129182 )

        It's depressing that Objective-C, Ruby and VB.Net have gone up, and to see C# go down...

        It's depressing to see a Microsoft-proprietary language, one they're not even supporting on one of their most recent platforms, go down in popularity? Really?

  • Dying gasps (Score:3, Funny)

    by OrangeTide ( 124937 ) on Monday January 07, 2013 @02:08PM (#42507993) Homepage Journal

    Doesn't a dying star expand into a giant before it dies?

    • Re:Dying gasps (Score:5, Informative)

      by Anonymous Coward on Monday January 07, 2013 @02:14PM (#42508083)

      You would be surprised how many mission-critical embedded systems are still being written in C.

      • by yurtinus ( 1590157 ) on Monday January 07, 2013 @02:21PM (#42508199)
        ...I think you'd be more surprised at how many are written in C#
        • by Vanderhoth ( 1582661 ) on Monday January 07, 2013 @02:31PM (#42508341)
          If it's more than one, then I'm surprised!!!

          I kid.
          • Re:Dying gasps (Score:4, Interesting)

            by rs79 ( 71822 ) <hostmaster@open-rsc.org> on Monday January 07, 2013 @08:03PM (#42512959) Homepage

            Oh snap.

            See, displaying "Yay! It fit in the memory they gave me!" can't be 7 megs.

            C was only ever a shorthand for PDP-11 machine language (back when C was young we'd routinely look at the compiler output; at that point it was passing arguments in registers, and Dave Conroy sat in the next cube over working on what has today morphed into* gcc -- that's one long-lived piece of code), and in tight spaces and critical loops you want machine language.

            Romable node.js would be the only thing I'd consider other than C for embedded code. I don't mind paying that overhead for the inherent asynchronous I/O advantages; you have to muck around in C a lot to do that, so it's worth the trade-off. Anything else just didn't bring enough to the table to warrant the overhead IMO.

            Contemporary support for C outside of Bell Labs was because of embedded code (a camera gantry project; later, the Halifax postal processing plant) - the RSX11M C compiler written for that became DECUS C, which went public and then went everywhere, including replacing the Bell compiler.

            * Yeah I know the claim is gcc is a clean rewrite, but logging into toad.com in the early 90s I found DGC commented source in the mix. Jon wasn't aware of it apparently...

        • by mysidia ( 191772 )

          Critical systems written by vendors other than Microsoft?

      • by TeknoHog ( 164938 ) on Monday January 07, 2013 @05:12PM (#42510943) Homepage Journal
        Not to mention the number of songs written in C. In particular, C# is a real pain with all those black keys.
    • Re:Dying gasps (Score:5, Insightful)

      by Hentes ( 2461350 ) on Monday January 07, 2013 @02:22PM (#42508225)

      Not really. While I agree that C is a bad language, it has no competition in low-level coding. With embedded systems gaining ground, more and more people will start to use it. Although C++ could take its role, and it even fixes many of C's shortcomings (e.g. namespaces), it's very easy to misuse, so most project leaders don't trust their colleagues with it. What would people switch to? Forth, Pascal?

      • Re:Dying gasps (Score:5, Insightful)

        by localman57 ( 1340533 ) on Monday January 07, 2013 @02:30PM (#42508333)
        Exactly. When you are working in a resource constrained environment, and you want to be able to accurately predict what machine instructions will be generated from your source code, you use C. It's faster to code, and more portable than assembly language, with almost all of the control. Typically maybe 5% or less of your application needs to be really, really fast (interrupt handlers, DSP code, special communications or math or encryption libraries). You might code these in assembly, and the rest in C.

        But if you're starting new big applications for the PC in C, you're probably insane.
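
        A minimal sketch of that split, with every name invented (no real MCU registers or vendor APIs here): the handler stays tiny and deterministic, and ordinary portable C does the rest in the main loop.

        #include <stdint.h>

        /* Illustration only: hypothetical names, no real hardware is touched. */
        static volatile uint8_t  sample_ready  = 0;   /* set by the handler, cleared by main */
        static volatile uint16_t latest_sample = 0;

        /* The "really, really fast" few percent: capture the data and get out. */
        static void timer_isr(uint16_t raw)            /* pretend the hardware hands us a reading */
        {
            latest_sample = raw;
            sample_ready  = 1;
        }

        /* The other ~95%: ordinary, portable C. */
        static void process_sample(uint16_t s)
        {
            (void)s;   /* placeholder for filtering / protocol / logging work */
        }

        int main(void)
        {
            for (;;) {
                timer_isr(42);                 /* stand-in for the interrupt actually firing */
                if (sample_ready) {
                    sample_ready = 0;
                    process_sample(latest_sample);
                }
            }
        }
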
      • You just answered my question on why C could even be considered :)

        Embedded systems would explain the growth; the .NET FW is Windows-only pretty much, NOBODY but MS is going to try to port it to an embedded machine. I'd still say 90% of businesses are .NET shops to varying degrees, but I didn't immediately understand what C was doing up there.

        Java though... makes me doubt the validity of TIOBE heavily, object-C doesn't help either, I get that there's a lot of android/iOS programming going on (I believe this is what object-C is used for mostly nowadays, but... more than 90% of businesses combined using .NET... doubtful). Maybe if TIOBE was based on +/- % changes I'd understand it, but as an overall popularity index, businesses have the $, and businesses use .NET unless they're web based...
        • by Kergan ( 780543 )

          Java though... makes me doubt the validity of TIOBE heavily, object-C doesn't help either, I get that there's a lot of android/iOS programming going on (I believe this is what object-C is used for mostly nowadays, but... more than 90% of businesses combined using .NET... doubtful). Maybe if TIOBE was based on +/- % changes I'd understand it, but as an overall popularity index, businesses have the $, and businesses use .NET unless they're web based...

          And therein lies the rub in your argument. Many companies actually are web based. Many others are into mobile. And a shit ton of stuff is embedded or low level enough that .Net isn't even an option.

          In my own industry (telco/finance), hardly anyone I'm aware of uses C# or VB.Net. It's almost all C for the low level stuff, Java for the enterprisey stuff, and Java/Obj-C for mobile stuff. Oh, and there's some COBOL for legacy stuff, too.

          In my brother's (automation/machine tools), it's mostly C for low level stuf

      • What would people switch to? Forth, Pascal?

        About 25 years ago, working in an embedded product company, I had a friendly little argument with my software colleagues (me design hardware, UGH!) They insisted that there was nothing around that could compete with the C-compiler-that-later-became-Microsoft's for tight compiled code. So we had a little contest: they wrote a chunk of our kind of code in C, and I did it in Modula-2 (Logitech's compiler.) In both cases we were building reusable code with object methods.

        Quite enlightening.

        How the comparison

      • Re:Dying gasps (Score:4, Interesting)

        by UnknownSoldier ( 67820 ) on Monday January 07, 2013 @03:10PM (#42508925)

        > While I agree that C is a bad language, it has no competition in low-level coding.
        Mostly agree. Although I prefer turning all the crap in C++ off to get better compiler support.

        > Although C++ could take its role and it even fixes many of its shortcomings (e.g. namespaces)
        Uh, you don't remember "Embedded C++" from back in the late 90's / early 00's?

        If you think namespaces are part of the problem, you really don't understand the complexity of C++ at _run_time_ ...

        Namely:

        * Exception Handling
        * RTTI
        * dynamic memory allocation and the crappy way new/delete handle out of memory
        * dynamic casts
        * no _standard_ way to specify order of global constructors/destructors

        Embedded systems NEED to be deterministic, otherwise you are just gambling with a time-bomb.

        http://en.wikipedia.org/wiki/Embedded_C%2B%2B [wikipedia.org]

        --
        There are 2 problems with C++. Its design and implementation.

      • Re:Dying gasps (Score:4, Interesting)

        by marcosdumay ( 620877 ) <(marcosdumay) (at) (gmail.com)> on Monday January 07, 2013 @03:17PM (#42509013) Homepage Journal

        After you accept the constraints of an embedded environment and low-level access, C is not a bad language anymore. Any language usable in that kind of environment is at least as bad as C.

        • Re:Dying gasps (Score:5, Interesting)

          by disambiguated ( 1147551 ) on Monday January 07, 2013 @03:32PM (#42509273)
          Maybe you're like me.

          I've been using C for so long that I think I've lost objectivity. C is the first language I learned (other than line numbered basic.) In my mind, C is the language all other languages are judged against.

          But if there's any truth to this (when did the TIOBE index become the official word?) it makes me wonder if it's not C itself that is making a comeback, but good old fashioned procedural style programming.

          All these fancy new languages with their polymorphism, encapsulation, templates and functional features have lost their sparkle. Programmers are rediscovering that there isn't anything you can't do (even runtime polymorphism) with just functions, structs, arrays and pointers. It can be easier to understand, and although it may be more typing, it has the virtue that you know exactly what the compiler is going to do with it.
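
          In case anyone hasn't seen it spelled out, the "runtime polymorphism with just structs and pointers" trick is nothing more than a hand-rolled vtable. A small sketch (every name here is made up for illustration):

          #include <stdio.h>

          /* A hand-rolled "vtable": dynamic dispatch with nothing but structs,
             functions and pointers.  All names are invented for illustration. */
          struct shape {
              double (*area)(const struct shape *self);
          };

          struct circle {
              struct shape base;    /* first member, so a circle can be treated as a shape */
              double radius;
          };

          static double circle_area(const struct shape *self)
          {
              const struct circle *c = (const struct circle *)self;
              return 3.14159265358979 * c->radius * c->radius;
          }

          struct square {
              struct shape base;
              double side;
          };

          static double square_area(const struct shape *self)
          {
              const struct square *s = (const struct square *)self;
              return s->side * s->side;
          }

          int main(void)
          {
              struct circle c = { { circle_area }, 2.0 };
              struct square s = { { square_area }, 3.0 };
              const struct shape *shapes[] = { &c.base, &s.base };

              for (int i = 0; i < 2; i++)                         /* dispatch chosen at runtime */
                  printf("area = %f\n", shapes[i]->area(shapes[i]));
              return 0;
          }
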
          • Re: (Score:3, Insightful)

            by steelfood ( 895457 )

            Object-oriented is good for projects that need to be maintained. But it comes with additional complexity that programmers need to learn and track. It's only feasible for very large projects with multiple developers working on multiple branches of development, and even then, it requires a very good maintainer, or it's too easy to fuck everything up. The infrastructure necessary to do good OO quickly gets as expensive and as complex as the problem the OO paradigm is trying to solve.

            Procedural is the simplies

          • Re:Dying gasps (Score:4, Insightful)

            by RCL ( 891376 ) on Monday January 07, 2013 @06:34PM (#42512037) Homepage
            More importantly, OOP requires that you start with a class hierarchy - the wrong approach. A class hierarchy is de facto a decision tree, a very unstable classifier. If you build it on an insufficient number of observations (which is nearly always the case when you start writing software from scratch), you will need to throw away your decision tree and rebuild it anew. That's what we call "refactoring" in OOP.

            Let me shout it loud: Premature classification is the root of all evil! :) Think about your program primarily as a data-to-data mapping, and then build the abstractions.
        • .NET Micro Framework? For systems with 64K of RAM and 256K of storage. That gives you C# and VB.
          As a bonus you get emulators and debugging built in, and it's open source under an Apache license.

      • by Tarlus ( 1000874 )

        Yeah, I'm of the opinion that a person who cannot properly use C (and understand how memory management works) has no business writing mission-critical software in any language. JVM's garbage collector is for sissies. =P

      • Re:Dying gasps (Score:5, Insightful)

        by Bill_the_Engineer ( 772575 ) on Monday January 07, 2013 @04:06PM (#42509907)

        While I agree that C is a bad language...

        Just because the language doesn't hold your hand and make sure you don't mess up doesn't necessarily make it a "bad" language. C++ could never fully take C's role due to the name mangling that C++ requires. I use and like C++, but there are some things that are best left to C, especially if you're binding compiled code to some interpreted language (e.g. Perl, Python, or Ruby) to speed up some computationally intensive portion of an application.
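
        To make the binding point concrete, here's a hypothetical sketch (file name, function name and build line are all invented, not from the parent): because C does no name mangling, the exported symbol is exactly the function name, which is what makes loading it from the FFI machinery of Perl, Python or Ruby straightforward.

        /* hotloop.c -- hypothetical example of a small C kernel exposed to a scripting language.
         * Build as a shared library, e.g.:  cc -O2 -fPIC -shared hotloop.c -o libhotloop.so
         * The exported symbol is literally "dot_product" (no C++ name mangling), so
         * Python's ctypes, Perl's FFI, etc. can find and call it directly. */

        double dot_product(const double *a, const double *b, long n)
        {
            double sum = 0.0;
            for (long i = 0; i < n; i++)
                sum += a[i] * b[i];
            return sum;
        }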

      • Not really. While I agree that C is a bad language, it has no competition in low-level coding.

        The enduring popularity and success of C is a strong argument that there aren't any really bad languages - that language design just doesn't matter very much. That designing new languages is largely a waste of time. The only real requirement for a language is that it gives you enough control to do whatever you want; from there you can get anywhere, and you can also write progressively higher levels of APIs t

      • Re:Dying gasps (Score:5, Interesting)

        by VortexCortex ( 1117377 ) <VortexCortex AT ... trograde DOT com> on Tuesday January 08, 2013 @01:11AM (#42515299)

        Not really. While I agree that C is a bad language, it has no competition in low-level coding.

        Oh, there's competition, just not much, because it's not wanted or needed. As a hobby I started building an x86 OS from scratch with only a hex editor. From there I created an assembler in machine code, then used it to create a small text editor and to assemble a disassembler, then disassembled the assembler (to save me from re-entering ASM), then I began work on creating my own system-level language from scratch to build the OS with. The thing is, if you want to make the leanest language just barely above the metal, but still be cross platform, guess what you get? You get C. Seriously.

        My syntax was different, but because of the op-codes (e.g. jmp, the movs, push, pop, call, ret, enter, leave) and architecture features (protected mode, the MMU, restricted use of the code & stack registers, etc.), when you add any features (like functions) you end up creating something almost exactly like C in all but name. The architecture is responsible for C; it's a product of its environment. For instance, I wanted to use multiple stacks: a separate stack for the call stack and another one for parameters / local vars / etc. -- in fact I wanted to extend that to support co-routines at the lowest levels possible, all while eliminating stack smashing as a direct exploit vector -- ah, but because of the way Virtual Memory Addressing works, and because there are dedicated operations for doing single-stack-per-thread function calling, there's a huge performance hit to doing things in other ways down at the low level (I figured out a few tricks, but it's still slower than C functions).

        Now, most folks wouldn't tolerate a system-level language that was any more inefficient than it had to be, and slightly contrary to the way things want to work at the hardware level, just to add features globally that many programs don't need (e.g. namespaces, call-stack security, co-routines, etc), so they'll follow the restrictions of the system, and the language produced will come out to be just like C, maybe with slightly different syntax, but all the same idioms. Maybe function calling would be something other than CDECL (instead, for variadics, I pass the number of parameters in a register, and have the callee clean up the stack; reduces code size a bit -- and I have other reasons), but even this is possible to do now in C too at the compiler level. Even when you get to adding OOP to the language, you run into C++ and its problem with diamond inheritance, and dynamic casting (if you do things the most efficient way possible) -- I allow virtual variables as well as functions to eliminate the diamond inheritance issue with shared bases having variables -- just make them "virtual", like you would a function; it's slower, another layer of indirection, but if I did it the fast way I'd just be re-implementing C++!

        There's a fine line I'm walking, a little too far from the architecture / ASM and my language might as well run via VM, a little less and I might as well just use C/C++. So, the space we have to innovate in to squeeze more worth out of a compilable language isn't really that big. Indeed, when I take a look at GoLang disassembled I see all the same familiar C idioms -- They just give you a nicer API for some things like concurrency, and add a few (inefficient) layers of indirection to do things like duck-typing. Great for application level logic, but I'd still write an OS and its drivers in a C like language instead.

        There's a reason why C "has no competition in low-level coding": it's because we don't need a different syntax for low-level coding; it's done. As a language designer / implementer, when I hear folks say "C is a bad language" I chuckle under my breath and think they might be noobs. Maybe you meant the design-by-committee approach sucks, but probably not. The features C has it needs to have, the syntax it has, it needs to have, e.g., pointer to a pointer to a

    • by Anonymous Coward on Monday January 07, 2013 @02:40PM (#42508489)

      Seriously, for the last fucking time, can we stop posting on Slashdot random shit picked up from TIOBE? The TIOBE index is so completely and utterly full of fail that I can't believe people are STILL clinging onto it as evidence of anything whatsoever.

      It shouldn't be traditional to do anything with TIOBE, except perhaps laugh at it or set it on fire.

      So, one last time, one final fucking time, I'll try and explain to the 'tards who think it has any merit whatsoever why it absolutely does not.

      We start here, with the TIOBE index definition, the horse's-mouth explanation of how they kludge together this table of bollocks they call an "index":

      http://www.tiobe.com/index.php/content/paperinfo/tpci/tpci_definition.htm [tiobe.com]

      First, there is their definition of programming language. They require two criteria, these are:

      1) That the language have an entry on Wikipedia

      2) That the language be Turing complete

      This means that if I go and delete the Wikipedia entry on C, right this moment, it is no longer a programming language, and hence no longer beating anything. Apparently.

      The next step is to scroll past the big list of languages, to the ratings section, where we see that they state they take the top 9 sites on Alexa that have a search option, and they execute the search:

      +"<language> programming"

      Then weight the results as follows:

      Google: 30%
      Blogger: 30%
      Wikipedia: 15%
      YouTube: 9%
      Baidu: 6%
      Yahoo!: 3%
      Bing: 3%
      Amazon: 3%
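
      For scale, the whole "index" then boils down to a weighted sum of each language's normalized hit counts per engine. A rough sketch of the arithmetic (the share numbers below are made up; only the weights above are TIOBE's):

      #include <stdio.h>

      /* Rough sketch of the TIOBE-style arithmetic: rating(language) is a weighted sum
       * of that language's share of +"<language> programming" hits on each engine.
       * The share values below are invented; only the weights come from TIOBE. */
      int main(void)
      {
          const double weight[] = { 0.30, 0.30, 0.15, 0.09, 0.06, 0.03, 0.03, 0.03 };
          const double share[]  = { 0.17, 0.12, 0.20, 0.15, 0.18, 0.16, 0.16, 0.14 };

          double rating = 0.0;
          for (int i = 0; i < 8; i++)
              rating += weight[i] * share[i];

          printf("rating = %.2f%%\n", rating * 100.0);  /* one noisy number per language */
          return 0;
      }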

      The first problem here is with search engines like Google, I run this query against C++ and note the following:

      "About 21,500,000 results"

      In other words, Google's figure is hardly anything like a reasonable estimate, because a) most of these results are fucking bollocks, and b) the number is at best a ballpark - and this accounts for 30% of the weighting.

      The next problem is that Blogger, Wikipedia, and YouTube account for 54% of the weighting. These are all sites that have user generated content, as such you could literally, right now, pick one of the lowest languages on the list, and go create a bunch of fake accounts, talking about it, and turn it into the fastest growing language of the moment quite trivially.

      To cite an example, I just ran their query on English Wikipedia for the PILOT programming language and got one result. A few fake or modified Wikipedia entries later and tada, suddenly PILOT has grown massively in popularity.

      The next point is the following:

      "Possible false positives for a query are already filtered out in the definition of "hits(PL,SE)". This is done by using a manually determined confidence factor per query."

      In other words yes, they apply an utterly arbitrary decision to each language about what does and doesn't count. Or to put it simply, they apply a completely arbitrary factor that you can have no confidence is of any actual worth. I say this because further down they have a list of terms they filter out manually, and a list of the confidence factors they use, and it takes little more than a second to realise massive gaps and failings in these confidence factors.

      For example, they have 100% confidence in the language "Scheme" with the exceptions "tv" and "channel" - I mean, really? The word Scheme couldn't possibly be used for anything else? Seriously?

      So can we finally put to bed the idea that TIOBE tells us anything of any value whatsoever? As I've pointed out before, a far better methodology would at least take into account important programming sites like Stack Overflow, but ideally you'd simply refer to job advert listings on job sites across the globe - these will tell you far more about what languages are sought after, what languages are being used, and what languages are growing in popularity than any of this shit.

      Finally I do recall last year stumbling across a competitor to TIOBE that was at least slightly better but still not ap

      • I think someone should mod AC informative. This does sound like a worthless statistic.

      • Thanks a lot for making this write-up. Now I can just post a link to it every time someone submits yet another TIOBE story.

  • ...Bash? (Score:5, Interesting)

    by earlzdotnet ( 2788729 ) on Monday January 07, 2013 @02:10PM (#42508017)
    Am I the only person seriously wondering how Bash went from position 72 to 20? Bash is in the top 20 programming languages... Something is wrong with the programming universe
    • Re:...Bash? (Score:5, Insightful)

      by Tridus ( 79566 ) on Monday January 07, 2013 @02:11PM (#42508043) Homepage

      More like something is wrong with the measuring system being used.

      • Re:...Bash? (Score:5, Informative)

        by i kan reed ( 749298 ) on Monday January 07, 2013 @02:22PM (#42508213) Homepage Journal

        Yep, they use the frequency of internet searches for the language to make the estimate. Which means confusing and easily broken languages like C, and infrequently used (and thus easily forgotten) languages like bash, get a huge leg-up.

        • by mysidia ( 191772 )

          Yep, they use the frequency of internet searches for the language to make the estimate. Which means confusing and easily broken languages like C, and infrequently used (and thus easily forgotten) languages like bash, get a huge leg-up.

          How come BATCH (.BAT) isn't on there, then?

  • a bit of latency (Score:5, Interesting)

    by lorinc ( 2470890 ) on Monday January 07, 2013 @02:10PM (#42508025) Homepage Journal

    Java will come back to number 1 in a few years thanks to Android...

  • definition (Score:5, Informative)

    by mapkinase ( 958129 ) on Monday January 07, 2013 @02:11PM (#42508041) Homepage Journal

    TIOBE programming community index is a measure of popularity of programming languages, calculated from number of search engine results for queries containing the name of the language. [1] The index covers searches in Google, Google Blogs, MSN, Yahoo!, Wikipedia and YouTube.

    thx, bye.

    • This may be a leading edge indicator. C is sufficiently simple that after your first few months you seldom need to consult documentation. I've got nearly 20 years of experience, and I seldom if ever have to google how to achieve something in C. Algorithms, maybe, but not C syntax. As opposed to very heavy library-based languages, such as C# .Net, where I'm constantly googling, because I typically assume there's already a library that does "that" for me, whatever "that" happens to be.
    • by erice ( 13380 )

      TIOBE programming community index is a measure of popularity of programming languages, calculated from number of search engine results for queries containing the name of the language. [1] The index covers searches in Google, Google Blogs, MSN, Yahoo!, Wikipedia and YouTube.

      So it isn't really about usage then.

      Bash gets a lot of hits because it is a popular shell, not because so many people want to program in it.

      C gets some lift because of the many C-like languages and C bindings used by people who are not necessarily programming in C.

    • The index covers searches in Google, Google Blogs, MSN, Yahoo!, Wikipedia and YouTube.

      So its definition of "popularity" is: "I'm trying to use this language, but I don't know how." This may say more about the number of C programs whose original authors have left the field, than the number of new C programs being written.

  • The other one (Score:5, Informative)

    by Tridus ( 79566 ) on Monday January 07, 2013 @02:16PM (#42508119) Homepage

    Is called PYPL (PopularitY of Programming Languages), and it ranked C# as #1 and C down at #5 based on a different methodology. Honestly, they both sound pretty silly to me.

    https://sites.google.com/site/pydatalog/pypl/PyPL-PopularitY-of-Programming-Language [google.com]

  • by notknown86 ( 1190215 ) on Monday January 07, 2013 @02:23PM (#42508233)
    Using the TIOBE methodology, I deduce that the following activities are more popular than C programming:

    - Abduction by alien
    - Going to prison
    - Dying
  • Not surprising (Score:4, Informative)

    by Murdoch5 ( 1563847 ) on Monday January 07, 2013 @02:29PM (#42508313) Homepage
    Is anyone really surprised by this? C is the best overall language: it spans every platform I can think of, it's the most standardized language, and above all of that it's simple to learn and use. C is the language for real programmers; if you can't do it in C then you just can't program.
    • C is the best overall language, it spans every platform I can think of

      I can think of several platforms that C doesn't easily span. Xbox Live Indie Games and Windows Phone 7 only support C#, the Web only supports JavaScript, Flash Player only supports ActionScript and other languages that compile to ActionScript bytecode, and the Java applet environment and MIDP phones only support Java and other languages that compile to JVM bytecode. Or are you counting Emscripten as "C support"?

  • TIOBE algorithms (Score:5, Insightful)

    by sl4shd0rk ( 755837 ) on Monday January 07, 2013 @02:41PM (#42508491)

    Clearly C is more popular as more people complain about it sucking.

    C sucks -- About 321,000,000 results
    bash sucks -- About 7,500,000 results
    Java sucks -- About 5,810,000 results
    c++ sucks -- About 898,000 results
    objective c sucks -- About 293,000 results

  • C Just works (Score:5, Insightful)

    by EmperorOfCanada ( 1332175 ) on Monday January 07, 2013 @02:42PM (#42508497)
    The bulk of my recent programming has been in Objective C, but once I leave API calls my code quickly becomes pretty classic C with elements of C++. Yes, I love the simplicity of a foreach-type structure where it is brain-dead to iterate through some set/hash/array of objects with little or no thought about bounds, but once I start to really hammer the data hard I often find my code "degenerating" into C. Instead of a class I will create a structure. Instead of vectors I use arrays. I find the debugging far simpler, and the attitude to what can be done changes. In fairly raw C I start having thoughts like: I'll mathematically process 500,000 structures every time someone moves their mouse, and then I literally giggle when it not only works but works smoothly. What you largely have in C is: if the machine is theoretically able to do it, then you can program it. Good mathematics can often optimize things significantly, but sometimes you just have brute manipulations that need to be fast.
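
    (A made-up illustration of that "degenerating into C" style, with all sizes and fields invented: half a million plain structs in a flat, statically sized array, no objects anywhere, so you know roughly what the compiler will emit.)

    #include <stddef.h>

    /* Made-up example: no classes, no vectors, just contiguous structs you can
     * hammer on every mouse move.  Fields and counts are invented for illustration. */
    #define N_POINTS 500000

    struct point { float x, y, weight; };

    static struct point points[N_POINTS];

    /* Shift every point and return the weighted mean of x; all bounds are known
     * at compile time, so the generated code is easy to predict. */
    static float recompute(float dx, float dy)
    {
        float sum_wx = 0.0f, sum_w = 0.0f;
        for (size_t i = 0; i < N_POINTS; i++) {
            points[i].x += dx;
            points[i].y += dy;
            sum_wx += points[i].weight * points[i].x;
            sum_w  += points[i].weight;
        }
        return sum_w > 0.0f ? sum_wx / sum_w : 0.0f;
    }

    int main(void)
    {
        for (size_t i = 0; i < N_POINTS; i++)
            points[i] = (struct point){ (float)i, (float)i, 1.0f };
        return recompute(0.5f, -0.5f) > 0.0f ? 0 : 1;
    }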

    But on a whole other level, my claim with most higher-level languages, ranging from PHP to .NET to Java, is that they often make the first 90% of a large project go very quickly. You seem to jump from prototype to 90% in a flash; but then you hit some roadblocks. The garbage collection is kicking in during animations causing stuttering, and the library you are using won't let you entirely stop garbage collection. Or memory isn't being freed quickly enough, resulting in the requirement that all the users' machines be upgraded to 16 GB. Then that remaining 10% ends up taking twice as long as the first 90%. Whereas I find with C (or C++) you start slow and end slow, but the first 90% actually takes 90% of the final time.

    But where C is a project killer is the whole weakest-link-in-the-chain thing. If you have a large project with many programmers, as is typically found in a large business system, working on many different modules that basically operate on the same data set, then a safer language like Java is far, far better. I am pretty sure that if the business programmers working on projects I have seen had used C instead of Java, those server systems would crash more than once a minute. You can still program pretty badly in Java, but a decent programmer shouldn't blow the system apart. Whereas a decent C programmer might not be good enough for a large project.

    So the story is not whether C is better than, say, Java, but what is the best language for any given problem set. I find broad systems, like those found in the typical business, with many programmers of various skill levels, are ideal for Java. But for deep systems where you layer more and more difficulty on a single problem, such as real-time robotic vision, C or C++ is far superior. A simple way to figure out the best language is not to compare strengths and weaknesses generally but to ask how they apply to the problem at hand. In a large business system where horsepower is plentiful, garbage collection is good and pointers are only going to be a liability. But if you are pushing up to the limits of what the machine can do, such as a game, then a crazy pointer dance might be the only possible solution and thus demand C or even ASM.

    Lastly do you want your OS programmed in Java?
  • C and C++ are still the best languages for parallelism, in particular vectorization and shared memory systems.

  • It should be noted that for most if not all programming languages, it is highly likely that the compiler or interpreter and its supporting runtime are written in C. If you're using Java, you're using C code. If you're using Perl, you're using C code. If you're using Python, you're using C code. And so on.
  • by interval1066 ( 668936 ) on Monday January 07, 2013 @03:15PM (#42508981) Journal
    C haters: told ya so.
  • by kbdd ( 823155 ) on Monday January 07, 2013 @04:05PM (#42509883) Homepage
    I find it funny that people are claiming that C is a bad language, yet they use no such words to qualify assembly.

    C is a bad language to the extent that it lets you do what you want, even if that means shooting yourself in the foot. A language that would marry C's strengths while providing safeguards against buffer overruns and other ills would be an oxymoron.

    If you have a section of code that is particularly time critical, you could write it in assembly as many people do. I prefer to actually write it in C but check the assembly output from the compiler, and optimize the C source until I am happy with the result. In all cases I have been able to achieve my objectives this way without actually having to insert a block of assembly (not all compilers let you do that in-line). The resulting code is still very easy to read (for me down the road or anyone else) while being efficient.
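
    The usual loop, for anyone who hasn't tried it (file and function names below are invented): compile with gcc -S -O2, read the .s file (or objdump -d the object file), and reshape the C until the output looks right.

    /* saturate.c -- toy example of the "write C, then read the compiler's assembly" loop.
     * Inspect the output with:  gcc -S -O2 saturate.c   (or: objdump -d saturate.o)
     * File and function names are invented for illustration. */
    #include <stddef.h>
    #include <stdint.h>

    /* Clamp-and-accumulate over a buffer.  Things to look for in the assembly:
     * is 'acc' kept in a register, did the branch become a conditional move,
     * did the loop vectorize?  If not, keep reshaping the C source. */
    uint32_t accumulate_clamped(const uint16_t *buf, size_t n, uint16_t limit)
    {
        uint32_t acc = 0;
        for (size_t i = 0; i < n; i++) {
            uint16_t v = buf[i];
            acc += (v > limit) ? limit : v;
        }
        return acc;
    }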

    C is not the Swiss army knife of software (even though if one language qualified, C would be the closest), but it has its areas of expertise. In the world of small embedded systems, there is simply no alternative worth considering (and few alternatives available.)

    For desktop applications, not so great (and I speak from experience).

    • C could be a better language if it got rid of some idiosyncrasies, like its weird declarator syntax (it's the only language I know of that had tools like cdecl [die.net] written for it), or stillborn features like separate namespaces for structs/enums/unions (that everyone works around by using typedefs), or certain unsafeties in the language itself - e.g. implicit downcast from void* to any pointer type and from double to int, or mixed signed/unsigned arithmetic being perfectly a-ok but not doing what you expect hal
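
      A contrived illustration of the declarator point (nothing here is from the parent; it's just the kind of declaration that keeps cdecl in business, and the typedef escape hatch most people reach for):

      #include <stdlib.h>

      /* "declare handler as pointer to function (int) returning pointer to array 8 of int"
       * -- i.e. exactly the sort of thing you paste into cdecl to be sure. */
      int (*(*handler)(int))[8];

      /* The usual escape hatch: typedefs peel the layers apart. */
      typedef int row_t[8];
      typedef row_t *(*handler_fn)(int);

      static row_t table[4];

      static row_t *pick_row(int i)
      {
          return &table[i % 4];
      }

      int main(void)
      {
          handler = pick_row;          /* same type, spelled two very different ways */
          handler_fn h = pick_row;
          (*handler(2))[0] = 42;
          return (*h(2))[0] == 42 ? EXIT_SUCCESS : EXIT_FAILURE;
      }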

  • Number One? (Score:4, Funny)

    by PPH ( 736903 ) on Monday January 07, 2013 @05:19PM (#42511041)

    Yes but which one is number zero?

  • by gweihir ( 88907 ) on Monday January 07, 2013 @07:25PM (#42512611)

    C programmers have an understanding of the machine they use that Java people will never be able to reach. The only advantage Java programmers have is that they are cheap. Or better, they look cheap to management. In fact they are hugely expensive because most will write code that sucks badly.
