C Programming Language 'Has Completed a Comeback' (infoworld.com)

InfoWorld reports that "the once-declining C language" has "completed a comeback" -- citing its rise to second place in the Tiobe Index of language popularity, the biggest rise of any language in 2017. An anonymous reader quotes their report: Although the language only grew 1.69 percentage points in its rating year over year in the January index, that was enough to beat out runners-up Python (1.21 percent gain) and Erlang (0.98 percent gain). Just five months ago, C was at its lowest-ever rating, at 6.477 percent; this month, its rating is 11.07 percent, once again putting it in second place behind Java (14.215 percent) -- although Java dropped 3.05 percent compared to January 2017. C's revival is possibly being fueled by its popularity in manufacturing and industry, including the automotive market, Tiobe believes...

But promising languages such as Julia, Hack, Rust, and Kotlin were not able to reach the top 20 or even the top 30, Tiobe pointed out. "Becoming part of the top 10 or even the top 20 requires a large ecosystem of communities and evangelists including conferences," said Paul Jansen, Tiobe managing director and compiler of the index. "This is not something that can be developed in one year's time."

For 2017 Tiobe also reports that after Java and C, the most popular programming languages were C++, Python, C#, JavaScript, Visual Basic .Net, R, PHP, and Perl.

The rival Pypl Popularity of Programming Language index calculates that the most popular languages are Java, Python, PHP, JavaScript, C#, C++, C, R, Objective-C, and Swift.
  • by JoeyRox ( 2711699 ) on Sunday January 07, 2018 @07:41PM (#55882177)
    C never went anywhere. Its mindshare was just continually eclipsed by whatever bullshit venture-capital-seeking-paradigm-of-the-month was en vogue for that month.
    • by RightwingNutjob ( 1302813 ) on Sunday January 07, 2018 @07:44PM (#55882193)
      What changed? Did someone let slip that bitcoin mining can be done in C faster than with remote calls to jquery?
    • whatever bullshit venture-capital-seeking-paradigm-of-the-month was en vogue for that month.

      These days, whatevers-of-the-month barely last a week.

    • by Tough Love ( 215404 ) on Sunday January 07, 2018 @10:23PM (#55882775)

      I'm not saying C did or did not come back, or did or did not go away. I am saying you won't know from Tiobe; it's way too random. They count language questions, not language usage, and don't make the slightest attempt to correct for predictable skew like selection bias due to who hangs out there as opposed to, say, Stack Overflow.

      My totally unreliable take on it? C dev population stays about the same: very few, very skilled, and very highly paid. Because of the latter, the number of C wannabes spikes from time to time, but don't worry, they will go away after they ask a few questions and still can't code.

      • Re: (Score:3, Informative)

        by OrangeTide ( 124937 )

        C dev population stays about the same:

        Some of us are getting quite old and have been dying off.

      • C programmers are not highly paid.
        Depending on the region, Java and C# developers are paid highest.
        E.g. in the London area, C# developers with banking/finance experience earn about $1000 per day.
        C developers in Europe don't even make half of that.
        OTOH C# is not very popular in Germany; we mostly do Java here. So you can also earn good money with C# skills in Germany, about $750 a day. (Not popular means: traditionally not many projects were done in C# ... now as those projects exist, the market has a serious demand f

  • Maybe they are trying to say it's more popular now? Either way I'm glad!
  • by iggymanz ( 596061 ) on Sunday January 07, 2018 @08:01PM (#55882263)

    These aren't measures of how much languages are used; they're useless bullshit, like asking which languages generate the most Twitter tweets, Facebook posts, or news articles.

    Number of jobs held would be interesting.

    So would the number of unique job openings for each language.

    • Job adverts are certainly an interesting method, but never the whole story. I've been checking job listings for Python since the 90s, and up till about 3-4 years ago they were rarer than hen's teeth (there's a tonne now, since the management types finally discovered it). But all that time there was a huge number of local devs using Python for personal projects, gluing things together with it in server racks, holding conferences and meets and so on. Hell, one of the founding devs of Django even lives near me. But g

  • by smist08 ( 1059006 ) on Sunday January 07, 2018 @08:02PM (#55882269)
    With all the low-cost ARM computers, perhaps people appreciate the speed of C. Generally you get smaller, faster programs. Or perhaps more people are working on the Linux kernel?
    • by sourcerror ( 1718066 ) on Sunday January 07, 2018 @08:09PM (#55882295)

      You can use more advanced programming languages on the Pi that have decent execution speed too, like Java or C#. I think C's popularity is more driven by Arduino, where other alternatives don't exist.

      • by TypoNAM ( 695420 )

        Except for the fact that the Arduino IDE uses C++, not C.

        • I kind of like Arduino's slightly relaxed version of C++ myself, and use Perl or C/C++ on Pis to do the Linux-type stuff - databases and web servers - for my Arduino-based automation devices (often ESP based). Works for me - use C(++) where you are super time-dependent and need utter control, and some higher-level language for the glue that saves programmer time and effort and catches some of the errors. I personally missed the Java and PHP trains, but I hear I didn't miss much...something about fractals
    • by TheRaven64 ( 641858 ) on Monday January 08, 2018 @06:24AM (#55884033) Journal
      The slowest Raspberry Pi has 512MB of RAM and a 700MHz 32-bit processor. The original Smalltalk-80 implementation ran on a 2MHz 16-bit processor with 512KB of RAM and contained a full graphical user interface and applications written entirely in Smalltalk, a pure object-oriented language that didn't even have concessions to implementation ease like primitive types or intraprocedural flow control[1]. It did use some clever microcode tricks to make things like screen updates faster, but even without these Smalltalk was quite performant on a 20MHz processor.

      The idea that an RPi is too slow for a high-level language to be fast enough is astonishing.

      [1] In Smalltalk, integers are immutable instances of the SmallInt class, which is typically implemented as a tagged pointer. If integer arithmetic overflows, the result is an immutable instance of the BigInt class, which is stored as a pointer to an arbitrary-precision integer object. It's depressing how later dynamic languages, particularly scripting languages, haven't managed to make integers as useful. Smalltalk also had a variety of floating point types. It did not have things like if statements in the language: True and False were singleton subclasses of the Boolean class, which implemented methods like ifTrue: and ifFalse:. These took a block (closure) as an argument and either executed it or didn't, depending on whether the receiver was True or False.
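
      To make the tagged-pointer representation concrete, here is a minimal C sketch of a SmallInt-style immediate integer. It assumes heap pointers are at least 2-byte aligned so bit 0 is free to use as a tag; all names are illustrative, not taken from any real Smalltalk VM:

      #include <stdint.h>

      typedef uintptr_t oop; /* object reference: a real pointer or a tagged integer */
      #define TAG_SMALLINT 1u

      static inline oop make_smallint(intptr_t v) {
          return ((uintptr_t)v << 1) | TAG_SMALLINT; /* value in the high bits, tag in bit 0 */
      }
      static inline int is_smallint(oop o) {
          return (int)(o & TAG_SMALLINT); /* aligned pointers have bit 0 clear */
      }
      static inline intptr_t smallint_value(oop o) {
          return (intptr_t)o >> 1; /* sign-preserving on mainstream compilers */
      }

      An overflow check before arithmetic would then decide whether a result stays a SmallInt or must be boxed as a BigInt object, as described above.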

  • In Standard C, how do you find the least significant set bit of an integer type? Most CPUs these days support a clz [stackoverflow.com] instruction, but Standard C doesn't.
    • Why do you believe that a programming language Standard needs to provide a standard for a specific CPU-level instruction? Can you name another standardised language that does so as part of the language?
      • by Ichijo ( 607641 )

        It isn't specific to a single CPU; this category of instruction is also found on ARM and PowerPC, and even the PDP-10 provided hardware support for counting trailing zeros in an integer. Counting leading/trailing ones or zeros is common in many problems, such as compression algorithms. And on platforms without FPUs, the compiler would emulate floating point instructions in software, so there's precedent for providing C language support for instructions that aren't available on every CPU.

    • A good compiler will convert a loop for finding the bit into the single instruction.
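
      For example, here is a sketch of the kind of loop meant (a hypothetical helper, for non-zero v). Recent GCC and Clang may recognize the idiom, and their __builtin_ctz extension maps to the instruction directly, though neither behavior is guaranteed by the C standard:

      /* Index of the lowest set bit of v; v must be non-zero. */
      unsigned lowest_bit_index(unsigned v)
      {
          unsigned i = 0;
          while (!(v & 1u)) { /* shift until the lowest set bit reaches bit 0 */
              v >>= 1;
              i++;
          }
          return i;
      }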

    • If your C standard library doesn't have ffs(), then... sorry, Windows user. I guess there's always _BitScanForward or __lzcnt.

      Oh, and if your CPU uses clz to count trailing zeroes, you should report that as a bug.

      • if your CPU uses clz to count trailing zeroes, you should report that as a bug.

        Not necessarily. Use s = (s & (s - 1)) ^ s to clear all 1 bits other than the least significant, giving a "one-hot" integer. Then you can count leading zeroes and use that to infer trailing zeroes. It might look like the following (in a generic pseudo-assembly language):

        mov B, A
        sub A, 1 // A differs from B in the lowest 1 bit and all bits below it
        and A, B // A is B with the lowest 1 bit cleared
        xor A, B // A is the lowest 1 bit of B
        clz A // A is the number of leading 0 bits
        rsb A, 31 // A is the number of trailing 0 bits in B
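
        In C, the same sequence might look like the following, assuming 32-bit unsigned ints and a clz intrinsic such as GCC/Clang's __builtin_clz (undefined for an argument of 0):

        unsigned trailing_zeros(unsigned b) /* precondition: b != 0 */
        {
            unsigned lowest = (b & (b - 1u)) ^ b; /* isolate the lowest set bit */
            return 31u - (unsigned)__builtin_clz(lowest); /* bit index == trailing zero count */
        }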

    • by arth1 ( 260657 )

      In Standard C, how do you find the least significant set bit of an integer type?

      Is this a trick question? Something like this in a #define, I suppose?
      for (i=0;i<sizeof(var)<<8 && var % (1<<i); i++);
      Though I can't see why you would need to do that - chances are that you're complicating things when you don't need to, and just need an & operation. If you stumbled upon this when trying to do netmasks, be assured that the problem is already solved, and quite efficiently too.
      C is helpful in making you think and find simpler algorithms, which execute far more efficiently than bloated libraries and language intrinsics that have to deal with ca

      • by arth1 ( 260657 )

        Oops, forgot about slashdot parsing <

        for (i=0;i<sizeof(var)<<8 && var % (1<<i); i++);

        • by AmiMoJo ( 196126 )

          I think you meant & instead of %.

          In any case, there are better ways to do this: http://graphics.stanford.edu/~... [stanford.edu]

          My favourite is the second example, using only AND operations with constants. It's faster and has more predictable execution time, although it's not constant-time.

          I seem to recall that some systems (BSD?) have a function for this in their standard library too, which on some CPUs compiles to a single instruction.
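
          The half-remembered function is probably POSIX ffs() from <strings.h>, which glibc and the BSDs both ship, and which compilers can lower to a single bit-scan instruction:

          #include <stdio.h>
          #include <strings.h> /* POSIX ffs(): 1-based index of the lowest set bit, 0 if none */

          int main(void)
          {
              printf("%d\n", ffs(1280)); /* prints 9: bit 8 is the lowest set bit of 0x500 */
              return 0;
          }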

    • In C or Python, running an unsigned integer through (s & (s - 1)) ^ s will give you only the least significant 1 bit (1, 2, 4, 8, 16, 32, etc.). For 60 (0x3C), it gives 4; for 1280 (0x500), it gives 256 (0x100).
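
      A quick standard-C check of those figures (equivalently, s & -s isolates the same bit for unsigned s):

      #include <assert.h>

      static unsigned lowest_set_bit(unsigned s)
      {
          return (s & (s - 1u)) ^ s; /* clear the lowest 1 bit, then XOR it back out alone */
      }

      int main(void)
      {
          assert(lowest_set_bit(60u) == 4u); /* 0x3C -> 0x04 */
          assert(lowest_set_bit(1280u) == 256u); /* 0x500 -> 0x100 */
          return 0;
      }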

  • by StevenMaurer ( 115071 ) on Sunday January 07, 2018 @08:48PM (#55882445) Homepage

    No one is seriously going to try to use C for front end web development, just as no one is seriously going to try to use Javascript in an embedded microprocessor. So what this study is doing is just pointing out where the current jobs are.

    Trying to compare languages is like asking "which is better, a band saw or a screwdriver?" They're entirely different. And anyone who doesn't understand that simply doesn't have enough experience with other programming tools yet.

    • You are right in general...but wrong here.
      This report is not a beauty contest or any other comparison of tools, but rather statistics of web searches. As such it got reported and misinterpreted as "popularity", as if it were a preference.

    • Mod parent up - I posted so I can't use my points. This is correct.
    • Re: (Score:2, Interesting)

      by Anonymous Coward
      I seriously used C for web development (an old intranet system that was written when CGI was a new thing). It wasn't as hard as it may sound, nor did it take much more time than if I were to use a JavaScript framework of the week; it actually took less effort than having to deal with js / npm / webpack etc. These days I would use Python on the back end if starting a new web project, with just enough front-end scripting libs (jQuery, Bootstrap) to get on with the job, but I'd still prefer to co
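
      For reference, the CGI approach mentioned really is this minimal in C: the web server runs the program, which just writes an HTTP header block and a body to stdout:

      #include <stdio.h>

      int main(void)
      {
          printf("Content-Type: text/html\r\n\r\n"); /* headers, then a blank line */
          printf("<html><body>Hello from C</body></html>\n");
          return 0;
      }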
    • by TheRaven64 ( 641858 ) on Monday January 08, 2018 @06:28AM (#55884045) Journal

      just as no one is seriously going to try to use Javascript in an embedded microprocessor

      I draw your attention to JerryScript, developed by Samsung as a lightweight JavaScript interpreter specifically designed for running in embedded microprocessors.

  • by drolli ( 522659 ) on Sunday January 07, 2018 @09:30PM (#55882603) Journal

    I consider this the transition to a more stable period. For some time it was unclear how roles would be split between programming languages. All kinds of ideas were in the room, with interesting new contenders. Still, the programming community decided that the areas covered by Java, JavaScript, C, and Python are well distributed the way they are, and that "good enough is still good".

    C# (a competitor to C++ and Java in my eyes) seems to be dying. Swift isn't taking over from Objective-C as fast as the latter is going down. So for low-level languages nothing else is left. OOP and data architectures are firmly in the hands of Java, which has a very small overlap and a very good synergy with C. Python coexists in other areas, and hurts neither of the two languages.

    So in some sense the war is over, and Java, (C+C++), Python, and JavaScript have won for now.

  • by Proudrooster ( 580120 ) on Sunday January 07, 2018 @09:55PM (#55882705) Homepage

    Everything on planet freaking earth has firmware in it now, and guess which language makes the smallest binaries that can talk to hardware?

    Yes, that's right: C.

    Python can talk to C, but you aren't going to use it to write firmware, low-level interrupt handling code (like reading IR pulses), or code that monitors an I2C bus.

    Java is C's fat lazy son that lives in the basement and consumes all available resources. He can do things eventually, if you give him enough resources and time, and can tolerate the odor.

    As long as computer architecture remains constant, C will be king. It's fast, small, and gets sh*t done now. Plus, once you write it in C, you can call it from any other modern programming language like Python, Java, Objective-C, Swift, etc ...

    I went to a graduation party for an aerospace engineer last weekend, and his advice was: learn C, learn simulation and solving software like Mathematica, learn how to work in teams, and be multidisciplinary in career focus for great success. He graduated from Purdue.

    • by prefec2 ( 875483 ) on Monday January 08, 2018 @04:21AM (#55883675)

      You cannot write device drivers in Java or Python, as both languages require either an interpreter (i.e. a virtual machine) or a just-in-time compiler. They are also not the right tool for that particular job. However, they are well suited for other tasks.
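
      As a toy illustration of the niche in question, here is a C fragment poking a memory-mapped GPIO register; the address and bit position are invented for the example:

      #include <stdint.h>

      /* Hypothetical memory-mapped GPIO output register. */
      #define GPIO_OUT (*(volatile uint32_t *)0x40020014u)

      void led_on(void)
      {
          GPIO_OUT |= 1u << 5; /* set hypothetical bit 5; volatile forces an actual bus write */
      }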

    • If you learn Java or C++ you automatically have learned C - besides the preprocessor and the linker.
      And ... cough cough cough. C is not much used in the aerospace industry, despite what your "aerospace engineer" thinks.

      • Re: (Score:2, Insightful)

        by Anonymous Coward

        If you learn Java or C++ you automatically have learned C

        I knew some of those Java guys and saw their C. Their initial contracts generally expired without renewal and their projects were scheduled for a complete rewrite. There are worlds of difference between these languages and anyone going at C as if they were writing Java or Java as if they were writing C is going to cause a lot of pain.

      • If you learn Java or C++ you automatically have learned C.

        I can tell that you know nothing about C, and probably not much about Java or C++ as well.

  • The fact that it is searched-for a lot can just as easily indicate that projects in other languages get completed quicker. That would make it a less useful (and **therefore** more used) language. The fact that more C gets used is about as telling as how many KLOCs of one language (vs. another) get written. It just substitutes time spent on one language over another for how many KLOCs are written. Measuring output of work by the effort put in is always a bad measure. It elevates effort
  • So when is Turbo Pascal coming back? ;-)

    • Re: (Score:3, Informative)

      by OrangeTide ( 124937 )

      Lazarus [lazarus-ide.org] gets you the Object Pascal derived syntax from Turbo Pascal and Delphi, and provides an IDE available on multiple platforms. I've had zero problems installing it on several systems, mostly for quick and dirty projects where I didn't necessarily want to use C.

    • Well,
      joking aside, there is a quite complete and portable Pascal: https://www.freepascal.org/ [freepascal.org]

  • by nitehawk214 ( 222219 ) on Monday January 08, 2018 @12:20AM (#55883097)

    Or maybe it's a reflection that nearly every developer out there knows C to some degree and doesn't have to search for help as much?

    Maybe it means there are more older C devs who are more likely to go to a book than Stack Overflow?

    Either way, it's a garbage metric designed to generate lazy clickbait articles, like this one.

  • Really, you can accomplish everything in C, Java, and Bash.

    Build and deploy thousands of machines, and you don't have to throw 30-plus years of systems engineering under the bus because some idiot out of college wants to use Python.
