
C Rival 'Zig' Cracks Tiobe Index Top 50, Go Remains in Top 10 (infoworld.com) 167

InfoWorld reports: Zig, a general purpose programming language that interacts with C/C++ programs and promises to be a modern alternative to C, has made an appearance in the Tiobe index of programming language popularity. Zig entered the top 50 in the April edition of the Tiobe Programming Community Index, ranking 46th, albeit with a rating of just 0.19%. By contrast, the Google-promoted Carbon language, positioned as an experimental successor to C++, ranked just 168th.
Tiobe CEO Paul Jansen argues that high-performance languages "are booming due to the vast amounts of data that needs to be processed nowadays. As a result, C and C++ are doing well in the top 10 and Rust seems to be a keeper in the top 20." Zig has all the nice features of C and C++ (such as explicit memory management enhanced with option types) and has abandoned the not-so-nice features (such as the dreadful preprocessing). Entering the top 50 is no guarantee to become a success, but it is at least a first noteworthy step. Good luck Zig!
Tiobe bases its monthly ranking of programming language popularity on search engine results for courses, third party vendors, and engineers. Here's what they've calculated for the most popular programming languages in April of 2023:
  • Python
  • C
  • Java
  • C++
  • C#
  • Visual Basic
  • JavaScript
  • SQL
  • PHP
  • Go

April's top 10 was nearly identical to the rankings a year ago, but assembly language fell from 2022's #8 position to #12 in 2023. SQL and PHP rose one rank (into 2023's #8 and #9 positions) - and as in March, the rankings now show Go as the 10th most popular programming language.


This discussion has been archived. No new comments can be posted.

C Rival 'Zig' Cracks Tiobe Index Top 50, Go Remains in Top 10

Comments Filter:
  • Zero Wing (Score:4, Funny)

    by Gherald ( 682277 ) on Sunday April 09, 2023 @02:47AM (#63436174) Journal

    > Good luck Zig!

    For great justice.

  • Rival? Really? (Score:5, Insightful)

    by Briareos ( 21163 ) on Sunday April 09, 2023 @03:23AM (#63436200)

    Shouldn't it at least have a similar amount of use in the wild - or at least be within an order of magnitude of C/C++ - before you could start calling it a rival?

    • It's a "rival" to C/C++ in the same sense that my personal web site is a "rival" to Wikipedia. After all, both sites host informative content!

    • Comment removed based on user account deletion
      • by caseih ( 160668 )

        How do you define success? Popularity?

        Clearly Zig meets its creator's expectations, and from what I read it is appealing to some largish companies who are using it in production. Will it ever be popular? Probably not. But that's not a requirement for success.

        Been reading up on Zig a bit, as it's been on my radar for a while. I can't speak to the language itself, but as a build tool that can compile C and C++, it's pretty powerful. Cross-compiling is the easiest I've seen. Just that alone is extremel

        • How do you define success? Popularity?

          For a programming language? Yeah that's pretty much it. Look at just what a pile of shit Java and the unrelated, similarly named language JavaScript are. Yet there are vast ecosystems around each, owing entirely to their popularity.

          • Both Java and JavaScript are fine languages and fine ecosystems.

            If you are butt hurt that JavaScript did something odd in "insert your favourite browser" or that a Java applet loaded slowly: your problem.

            Being ignorant about the two most used languages on the planet, not sure where that puts you.

      • by Briareos ( 21163 )

        Well, if it competes with C it's first and foremost a competitor - a rival in my book is at least somewhat on the same level...

      • by flink ( 18449 )

        Why? Since when does the definition of rival include similar market shares?

        Zig is intended to compete with C. Therefore it is a rival, using the standard English definition. The fact Zig is unsuccessful at this point is neither here nor there.

        Context matters. "Rival" has more than one definition. In this context it usually means "approaching the size or magnitude of, comparable to", e.g. "The largest pteranodons rivaled the wingspan of a Cessna aircraft."

    • Re:Rival? Really? (Score:5, Interesting)

      by tlhIngan ( 30335 ) <slashdot&worf,net> on Sunday April 09, 2023 @04:19PM (#63437244)

      Well, it beat Rust, C, C++ and Java in speed in some artificial benchmarks.

      https://www.youtube.com/watch?... [youtube.com]

      People had a year to optimize a simple prime number sieve and submit it to Dave's GitHub repo. The fastest language was Zig, followed by Rust, then C, then C++ and then... Java. (The most optimized, fastest implementation of each was run and benchmarked.)

      Funny enough, assembly wasn't even up there despite several assembly versions.

      The rules were simple - it had to find the number of prime numbers from 1 to 1,000,000 using the prime sieve operation, it could have no hardcoded output values other than 2, and it had to use bit fields going through the prime sieve - no using bytes or words as flags. (Some languages which didn't support bitmask operations are in the repository, but their results aren't official for not following this rule.)

      So I suppose good job to the Zig and Rust programmers that managed to eke out every last ounce of performance from their code in order to beat straight-up C.
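      To make the constraint concrete, here is a minimal C sketch of a bit-packed sieve of the kind the rules describe. It is not code from the actual repo - the index mapping, names, and output format are illustrative only.

      #include <stdio.h>
      #include <stdlib.h>
      #include <stdint.h>

      #define LIMIT 1000000

      /* One bit per odd candidate >= 3; bit i stands for the number 2*i + 3. */
      static inline int get_bit(const uint8_t *bits, size_t i) {
          return (bits[i >> 3] >> (i & 7)) & 1;
      }

      static inline void set_bit(uint8_t *bits, size_t i) {
          bits[i >> 3] |= (uint8_t)(1u << (i & 7));
      }

      int main(void) {
          size_t nbits = (LIMIT - 3) / 2 + 1;        /* odd candidates 3..LIMIT */
          uint8_t *composite = calloc((nbits + 7) / 8, 1);
          if (!composite) return 1;

          for (size_t i = 0; ; i++) {
              size_t p = 2 * i + 3;
              if (p * p > LIMIT) break;
              if (get_bit(composite, i)) continue;
              /* Mark odd multiples of p, starting at p*p. */
              for (size_t n = p * p; n <= LIMIT; n += 2 * p)
                  set_bit(composite, (n - 3) / 2);
          }

          size_t count = 1;                          /* the prime 2, per the rules */
          for (size_t i = 0; i < nbits; i++)
              if (!get_bit(composite, i)) count++;

          printf("primes up to %d: %zu\n", LIMIT, count);  /* expect 78498 */
          free(composite);
          return 0;
      }

      The actual entries run a loop like this repeatedly for a fixed time and report passes per second, which is the number quoted further down the thread.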

      • Funny enough, assembly wasn't even up there despite several assembly versions.

        That kinda tells you something about the competition though. The assembly version should be the reference version - the one that cannot be beaten, because it's "perfect." As the only language that isn't forced to go through a compiler and have its output reshaped by an optimiser, in an ideal world it should be the theoretical maximum. That it wasn't says you can't rely on any one submission being made by the best of the best in their respective field. Makes comparisons between languages difficult when you

      • Funny enough, assembly wasn't even up there despite several assembly versions.
        Writing *performant* assembly on modern processors *by hand* is not as easy as it used to be in the 1980s or 1990s.

        So I suppose good job to the Zig and Rust programmers that managed to eke out every last ounce of performance from their code in order to beat straight-up C.
        There can be many reasons:
        * C programmers dying out ...
        * Rust and Zig being simpler for expressing complex bit operations
        * Rust and Zig programmers being young programmers,

  • by shoor ( 33382 ) on Sunday April 09, 2023 @03:27AM (#63436208)

    I studied computer science in the 70s and, in school, was exposed to various languages of that era (COBOL, Fortran, APL, Algol, etc.) But working as a programmer, the only languages I was ever paid to program in were assembler and C. So those were the only ones I ever got good at.

    So, writing as somebody with an admittedly limited viewpoint, the only thing I would have modified in C would be to make it less easy to make the mistake of writing something like a = b when you meant a == b. Maybe I would have wanted to discourage depending on priority of operations as well, so you couldn't write a + b * 2 for instance. Either a + (b * 2) or (a + b) * 2. When trying to maintain code left behind by some hotshot, I didn't want to have to deal with long strings of stuff like that looking to see if long gone hotshot had made a mistake. (I also thought varargs was a kind of clunky kluge, but I could live with it.)

    • Re: (Score:2, Insightful)

      by Viol8 ( 599362 )

      "writing something like a = b when you meant a == b"

      Agreed. Allowing an assignment to also be a test was probably a bad idea. All it does is save one line of code at the expense of this sort of problem, e.g.:

      if (a = b) { } shouldn't be allowed. Make it
      a = b;
      if (a) { }

      "so you couldn't write a + b * 2 for instance. Either a + (b * 2) or (a + b) * 2"

      Disagree. Mathematical operator precedence should be well known by anyone with basic maths skills, never mind professional developers.

      • by Entrope ( 68843 ) on Sunday April 09, 2023 @05:50AM (#63436348) Homepage

        Sometimes one wants to write something like that -- usually with a more complicated "b" -- so the syntax should be allowed, unless one replaces it with syntax like C++ borrowed from Go: if (a = b; a) { ... }. Good C compilers will warn about what you wrote, though; gcc suppresses its warning with if ((a = b)) { ... }, which is harder to type by accident.

        C's operator precedence did erroneously make bitwise operators lower-precedence than comparison operators. People have been suffering from that mistake for 50 years now.
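        Both pitfalls are easy to show in a few lines of C. This is just an illustration - the exact warning text, and the flags that enable it (-Wall / -Wparentheses on gcc at least), vary by compiler.

        #include <stdio.h>

        #define READY 0x4

        int main(void) {
            int a = 0, b = 7, flags = 0x1;

            /* Accidental assignment: with warnings enabled, gcc suggests
               parentheses around an assignment used as a truth value. */
            if (a = b)
                printf("taken whenever b is non-zero; a is now %d\n", a);

            /* The doubled parentheses mark the assignment as intentional
               and silence that warning. */
            if ((a = b))
                printf("same behaviour, no warning\n");

            /* Precedence pitfall: == binds tighter than &, so this parses as
               flags & (READY == 0), i.e. flags & 0 -- never true. */
            if (flags & READY == 0)
                printf("never printed\n");

            /* What was almost certainly meant: */
            if ((flags & READY) == 0)
                printf("READY bit not set\n");

            return 0;
        }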

    • the only thing I would have modified in C would be to make it less easy to make the mistake of writing something like a = b when you meant a == b.

      Compilers have done this for years, see e.g.:

      https://godbolt.org/z/dq8n33ad... [godbolt.org]

    • by narcc ( 412956 )

      Maybe I would have wanted to discourage depending on priority of operations as well, so you couldn't write a + b * 2 for instance.

      I'm not sure what you're advocating for here. You don't want any operator precedence independent of grouping? Do you have trouble remembering the rules? I mean, I can see having some trouble when you're mixing a bunch of different operator types if you're not used to it, but just for arithmetic operators?

      The closest thing I could come up with that is likely to be problematic would be the use of unary increment/decrement operators in array indices, function arguments, the right-hand side of an assignment

      • I'm not sure what you're advocating for here. You don't want any operator precedence independent of grouping? Do you have trouble remembering the rules? I mean, I can see having some trouble when you're mixing a bunch of different operator types if you're not used to it, but just for arithmetic operators?

        I dunno, only allow one arithmetic operation at a time? So you'd have to do
        tmp = b * 2;
        result = a + tmp;

          Anyway, I think regardless of the language it's good practice to put anything more complicated than that in parentheses, so that it's unambiguous and no surprises pop up if you add some more exotic operators into the mix.

      • Unfortunately Smalltalk dropped operator precedence; it evaluates strictly from left to right.
        When I pointed out that this is an error trap for me, since I know mathematical precedence, the other guy declared me an idiot: too stupid to parse an expression from left to right ...

    • mistake of writing something like a = b when you meant a == b.

      All modern compilers will warn you about an assignment in a boolean context unless you are foolish and disable warnings.

    • Having written in C/C++ for many years before shifting to C#, I'd like to add a few gripes with C:

      - Header files are a pain, especially when you have to include them in a specific sequence, or when the same standard library function is in one header file on one platform, and another header file on a different platform. This makes porting to multiple platforms a royal pain.
      - It's way too easy to get into trouble with NULL pointers.
      - It's way too hard to keep track of memory allocations. C++ new and delete he

      • by Viol8 ( 599362 )

        "Having written in C/C++ for many years before shifting"

        When, the 1990s?

        "but there's still no automatic release of allocated memory when you no longer have any pointers that reference that memory"

        Huh? shared_ptr has been around since 2011.

        "In C, there is no built-in support for multi-threading"

        Why should there be? Threads are an OS thing. Should it have multi process support too? C++ threads are an anomaly IMO and also lowest common denominator in that they don't support some pthreads functionality because

        • shared_ptr is C++ only, it's not available in C. If you look back to the original post on this thread, the subject at hand is C.

          There is no hard line between what is part of the OS, and what is part of the language. If you think threading should be excluded from the language because it's an "OS thing" then file I/O should also be excluded, because that too is an "OS thing." https://en.wikipedia.org/wiki/... [wikipedia.org] The fact is, when I'm writing an application, I don't care what is an "OS thing" and what is a "Syst

          • File IO is in a library; it's not built into the language itself. You might want to get a clue before you post again.

            • You literally can't write anything useful in C without including multiple libraries. The libraries are so integral to the language that they might as well be part of it. You literally can't even allocate memory without the standard library that comes with the compiler. So to say that file IO isn't part of the language is disingenuous. https://stackoverflow.com/ques... [stackoverflow.com]

              • by Viol8 ( 599362 )

                In that case threading is integral to the language too, so you're being disingenuous complaining that it isn't - what are you complaining about?

                Can't have it both ways, pal. C doesn't pretend to be anything other than a high-level assembler. If you want hand-holding scripting languages then use them, that's what choice is about.

    • I think a = b as an if condition is rather logical, and mathematical operator precedence simply has to be the default, but I agree there are only a few "minor" things that could be fixed.

      Off the top of my head:
      Precedence for &, |, and ^, as has been said by Ritchie himself.
      Unassigned pointers either disabled or NULL by default.
      Maybe integrate the safe printf, scanf versions from the library.

      I also think plenty of issues with C could be solved by a smarter compiler, seeing how that is one of the things that

    • Comment removed (Score:5, Interesting)

      by account_deleted ( 4530225 ) on Sunday April 09, 2023 @11:20AM (#63436704)
      Comment removed based on user account deletion
      • IMO there are multiple different categories of UB in C/C++.

        Some UB just represents pig-headedness on the part of compiler writers and standards organisations. The optimisation benefits are massively outweighed by the fact that they make even the simplest operations (like addition) into potential footguns.

        Some UB is arguable, stuff like the aliasing rules. On the one hand I'm not convinced the optimisation opportunities justify the extra mental load, on the other hand they relate to things (like pointer typ

    • by Calibax ( 151875 )

      As someone who started coding professionally in the '60s (when "computer science" wasn't a subject in any university - if you were interested in computers you were pointed to the Math department) I agree with your comments about confusion of boolean and arithmetic uses of "=".

      But the biggest problem I've seen with neophyte coders using C (and other languages) is memory management. While malloc and free provide everything needed, it's really easy for less practiced coders to end up with a complete mess (thi

    • >the only thing I would have modified...

      Well, Zig makes a lot more changes, many of which I consider to be a good thing.

      #1 on the list being that the build language is the language itself, so your makefile is written in Zig. No messing with a collection of bloated, unreadable, unmaintainable build languages - I'm looking at you, cmake.

      I played with it for a while, but the inline assembly wasn't good at the time, which got in the way of what I wanted to do. I'll get back to it sometime because I really did like the

  • by Viol8 ( 599362 ) on Sunday April 09, 2023 @04:00AM (#63436250) Homepage

    *yawn*

    Another month, another C++ killer. Yeah, whatever, get back to us when a few more people than the author, his dog and his mates use it.

    "such as the dreadful preprocessing"

    Writes someone who's never done low level cross platform development.

    • by AmiMoJo ( 196126 ) on Sunday April 09, 2023 @05:33AM (#63436322) Homepage Journal

      Dave Plummer did a comparison of programming language performance just yesterday: https://youtu.be/pSvSXBorw4A [youtu.be]

      All the code is in a Github repo. The results were surprising, with Zig so far ahead of every other language, including C and C++, that it made many people wonder if the test was flawed.

      The rules are in the GitHub repo, but basically every entry had to use the same algorithm and follow certain constraints, like packing data into bits rather than whatever is most efficient for the Ryzen CPU used in the test. My guess is that Zig ignores some of the bit manipulation stuff, or takes some other shortcut that the programmer is not allowed to use but which the compiler can do via optimization.

      In any case, Zig appears to be an interesting language if it can offer both excellent performance and some nice safety features. Depending on exactly why it is so much faster than C, it may not be any good for embedded/system programming though.

      • by gweihir ( 88907 )

        These "benchmarks" must not only be flawed but intentionally set up to be lies. Well-written C code is generally pretty close to well-written assembler and no language can get better than that. Just not possible.

        • by AmiMoJo ( 196126 )

          The competition has been open for a year, and you can still submit code if you think you can do better. He didn't write most of it, other people did.

          Certainly something is going on here, and I'd be very interested to know what it is.

          • by dfghjk ( 711126 )

            "Certainly something is going on here, and I'd be very interested to know what it is."

            A brief look at source code suggests there certainly is.

            First, the implementations do not "use the same algorithm", they merely all end with the same result. The various codebases implement significant modifications in an attempt to exploit the execution environment. The competition is not between languages, it is between programmers trying to game the rules.

            Worse, the benchmarking is performed using tools that introduce

          • by gweihir ( 88907 )

            That just means nobody competent cared enough. I certainly do not. Of course if somebody wants to push some specific tool they may just invest more effort. But _nothing_ can beat C by a large margin unless the C code is not very good.

            • by AmiMoJo ( 196126 )

              I'm wondering if it was something like the language having built in functions for certain things that were able to skirt the rules. Or perhaps the optimizer decided it would trade memory for speed in a way that the rules didn't allow the programmer to write themselves.

              Clearly there was something wrong with the test.

        • Well-written assembly has not existed for 30 years, outside the ARM platform. That is why ARM got so popular.
          Writing efficient assembly on a modern processor by hand is close to impossible.

          • by dfghjk ( 711126 )

            What a load of crap. All of it. The rise of shitty programmers has not eliminated good programmers OR the need for assembly programming by good programmers, AND ARM is not popular because it is the only architecture with "well written assembly". It's as if SuperKendall wrote this post for you.

            • it is the only architecture with "well written assembly".
              I did not write that.
              Dumbass. Reread what I wrote.

              But if you can write assembly for a PowerPC or MIPS or SPARC, better than the C compiler: I lift my hat.

              As you clearly can not even read: I doubt that.

              You might be on par with a good C compiler if you *can* write ARM assembly. Which you probably can't ...

              Hint: you could look up the different processor architectures and how their assembly languages work. Might take a few weeks though, after you have read t

          • by gweihir ( 88907 )

            1. That is bullshit.
            2. That was not my point. My point was that well-written C code will always be close to what the hardware can do. C just does not add a lot of overhead or translation loss. That is the very reason why it is still around.

    • by ShanghaiBill ( 739463 ) on Sunday April 09, 2023 @05:50AM (#63436346)

      "such as the dreadful preprocessing"

      Writes someone who's never done low level cross platform development.

      Indeed. The preprocessor is one of the best things about C/C++, and is a major reason other languages have failed to displace it.

      • It isn't, not for C++. Historical uses of macros, such as inlining, are much better handled by just letting the compiler do the inlining. Generic code generation is much better done with templates, and with modules it's blazingly fast at compile time too.

        The last common use for macros - a kind of reflection - is done much better with actual reflection. This, coupled with the simplifications to compile-time programming, is much easier to use than macros. Reflection will be in the next C++ standard.
        • by Viol8 ( 599362 )

          Congrats on proving you don't know what you're talking about.

          None of that is any use if your platforms have different function calls/interrupt handling/whatever to do what you need to do, e.g. there's no point having a fork() call in Win32 code.
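          For anyone following along, this is the sort of thing being argued about: a sketch of a tiny, hypothetical cross-platform helper where the preprocessor selects between two completely different platform APIs. The name sleep_ms and the millisecond granularity are my own choices, not from any real project.

          #include <stdio.h>

          #ifdef _WIN32
          #include <windows.h>
          /* Win32 path: Sleep() takes milliseconds. */
          static void sleep_ms(unsigned ms) {
              Sleep(ms);
          }
          #else
          #include <time.h>
          /* POSIX path: nanosleep() takes a timespec. */
          static void sleep_ms(unsigned ms) {
              struct timespec ts;
              ts.tv_sec  = ms / 1000;
              ts.tv_nsec = (long)(ms % 1000) * 1000000L;
              nanosleep(&ts, NULL);
          }
          #endif

          int main(void) {
              printf("waiting...\n");
              sleep_ms(250);
              printf("done\n");
              return 0;
          }

          The counter-argument upthread is that the same split can live in the build system as separate translation units; the preprocessor just keeps it in one file.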

          • You can achieve that without the preprocessor in C++. Especially now that we have modules.

            Congrats on proving that you're stuck in outdated ideas.
            • by Viol8 ( 599362 )

              No, you can't. All it does is shift the conditional compilation to libraries.

              Congrats on proving you're an ignorant idiot.

              • Yes you can. I've done it. The conditional compilation is in the build system.

                No need for macros at all.

                Congrats for proving your absolute incompetence.
                • by Viol8 ( 599362 )

                  "Yes you can. I've done it. The conditional compilation is in the build system."

                  Oh, suddenly it's in the build system, not the language. Nice goal post moving there!

                  Sure, you can do that - make the build horrendously complex and fragile and make your life difficult in order to satisfy some pointless ideological methodology if you want. In fact you could compile up two completely separate code bases that only have the binary name in common depending on the platform, but it's hardly a sane way to proceed.

      • by caseih ( 160668 )

        And yet starting with C++20, it's clear C++ is moving away from the preprocessor. Instead of including header files, for example, you import modules. And there is also constexpr. The preprocessor will be around for decades yet, I'm sure, but over the next 10 years hopefully all new C++ code will be preprocessor-free.

  • Want dramatically higher performance for manipulating lots of data? Rebuild the cpu/memory architecture we have now.

    Computers spend a fuck ton of time waiting for data to move around to be processed, as opposed to actual processing. There are a zillion tricks and hacks, such as NUMA and various multi-threading schemes, to kludge around the architecture, but they don't always work, can make the problem worse in some edge cases, and the improvements aren't that great anyway.

    The cpu memory bus is the problem.

    • by gweihir ( 88907 )

      Nope. You can get what you describe. It has been tried. Turns out that outside of signal processors and other specialized processors with small memory, this idea does not work.

      • Yes of course, it has been tried in the past with previous generation technology and materials so it must be impossible.

        Excuse me while I head out slowly on my horse without stirrups to visit my friend on his farm, where he needs help with planting and harvesting because the plow hasn't been invented yet.

        It is simply impossible to do anything other than what we have today. You also won't need more than 640k.

        Let's just stop trying because some random internet poster said it can't be done.

        • by dfghjk ( 711126 )

          "Let's just stop trying because some random internet posted said it can't be done."

          No, let's try because some random internet poster said it can be done, the random poster naming himself "WaySmarterThanYou" while making comments about 640k being enough.

          This post is pure projection. Consider it an opportunity to show us all how smart you are, the next dominant architecture is in your hands, please wipe the lube off first.

          • Did my name trigger you? Poor nerdling. Cry to momma.

            When you have something better than ad hominem to contribute I'll be here waiting with the other adults.

        • by gweihir ( 88907 )

          Seriously? You seem to have gotten even dumber. The very point of it having not only been tried but being available in production tech _today_ means that the engineers looking at how to squeeze more out of general-purpose CPUs are not only aware of that option, they have fucking working silicon they can evaluate and play with. And that means at this time this idea is a dead-end for large-memory general purpose computing. Deal with it.

          • Ok, great, we can never ever advance beyond current technology. We're just done. Sell your amd and intel stocks. It's a wrap, it's over, the industry has explored and maxed out all revolutionary options and from this moment forward can only make incremental advances mostly based on chip shrinks.

            Absolutely no way anyone could come up with something new or improve upon a previously set aside tech. Nope, impossible!

            Thanks for contributing today.

            • H100 GPUs solve the memory bottleneck problem; they get almost linear scaling with large language models, unlike the A100 and all previous generations of GPUs. They improved the NVLink and NVSwitch technologies to increase bandwidth.
    • I used to have a printed sheet of paper on my wall at work; allegedly it originally came from Google, but who knows? And with the age of it, it was probably referring to processors from a decade or more ago, but I'm not sure that it's entirely irrelevant even at this point.

      Anyway, it listed memory types by their "distance" in CPU Clock cycles from the CPU. Level-1 cache took about 3 CPU cycles to access; L-2 cache about 10, L-3 cache about 30, and Main memory about 100 CPU clock cycles to access. This ba

      • For small data that fits mostly in cache I agree but the subject here was big data. There's no current way to cache very large amounts of data in a reasonable way.

        Figure 99% of your data isn't even local to your machine. It's probably in some storage over the network (be that tcp or some custom thing, doesn't matter much) where to even hit local memory it has to be pulled from memory or disk or cache etc from the storage system / database then transferred over some form of wire (presumably) then finally s

  • If Zig junks it, they had better have a way of doing all the things it could do.

    Also, note to language inventors: Pick an unused name so we can google it easily. I still can't look for C++ jobs on some sites, Zig will give loads of bad hits, and "Go", really?!
    • by Entrope ( 68843 )

      Here's a nickel kid, buy yourself a real search engine. The fraction of times where DuckDuckGo (and thus presumably Bing) confuses "Go" with "go" in my searches is on the order of a percent. I don't imagine that Google is significantly worse.

      • Here's a nickel, kid. Buy yourself a fact.

        I use Duckduckgo and I tried a few searches of "go <word>" with it.
        The Go language comprised about 20% of the results.
    • by gweihir ( 88907 )

      The C preprocessor is an "expert-only" tool. It has stood the test of time and it is quite useful. Of course, all the wannabe "coders" do not qualify so they want languages that cater to their incompetence.

      • The true test of a programmer is not whether they can write code, but whether anyone else can read it.
        • by dfghjk ( 711126 )

          It is unclear how that test should be evaluated, or why the vast majority of computer users (who don't read code) care.

          The true test of a programmer is the quality of his work. If his work is writing code for others to read, then "whether anyone else can read it" has merit. If he works alone, it is irrelevant.

          • Let's just say there's a whole swath of problems that cannot be solved on the scale of a single programmer working alone. Optimizing your skills for that case is limiting. If you aren't working alone, then yes, your code will be read. Many more times than it will be written.
        • The true test of a programmer is not whether they can write code, but whether any other *equally competent* programmer can read it.

          FTFY
      • I have yet to see anyone use preprocessor metaprogramming techniques extensively. There are some things about the preprocessor that are just too expert.

        And I, as someone who has used preprocessor metaprogramming, would not be sad to see most uses of macros go, when there are much better ways for compile-time programming for most cases.
    • I'd argue that the preprocessor is a detriment to good code, not a benefit. This is because it all too often hides what is actually happening.

      Take the simple case of

      #define bool short
      #define true 1
      #define false 0

      This makes it easy to simulate a boolean variable. But does it?

      bool b = 5;
      if (b == true) ...
      if (b) ...

      One would think that the two if statements would yield the same result, but they won't.

      This kind of hiding of what's really happening may be convenient, but it leads to bugs that are hard to diagnos
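      Spelling the example out as a complete program (compiled as C, before C23 made bool a keyword) makes the divergence visible - this is just the comment's own macros dropped into a runnable sketch:

      #include <stdio.h>

      /* The macro "boolean" from the example above. */
      #define bool  short
      #define true  1
      #define false 0

      int main(void) {
          bool b = 5;                    /* any non-zero value is "true"... supposedly */

          if (b == true)                 /* expands to (5 == 1): false */
              printf("b == true\n");
          else
              printf("b != true\n");     /* this branch runs */

          if (b)                         /* 5 is non-zero, so this is taken */
              printf("b is truthy\n");   /* ...and so is this one */

          return 0;
      }

      The two tests disagree because the macro hides the fact that "true" is just the integer 1 - exactly the kind of hidden meaning being complained about.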

      • by Uecker ( 1842596 )

        What I like about the preprocessor compared to other compile time computation techniques is that

        1) it is relatively stupid
        2) you can inspect the output

        • Those are very interesting choices of things you like about preprocessors.

          Sure, you can inspect the output, but who wants to have to do that? That's an extra chore that makes debugging harder than it has to be.

          And "it's relatively stupid"...how is that an advantage?

          • by Uecker ( 1842596 )

            Ever debugged a complicated C++ compile-time issue?

            • Oh, you mean, compile-time issues that are caused by the preprocessor? Yeah, I have.

              In C#, there are no "compile-time issues." If there are no errors listed as errors while you are typing your code, there won't be any issues with the compile. It's literally that simple.

      • You have shown you can write bad code using the preprocessor.

        Nothing more.
  • More like a teensy-weensy wannabe that nobody ever heard of and nobody will again.

  • by The Evil Atheist ( 2484676 ) on Sunday April 09, 2023 @06:52AM (#63436402)
    https://ziglang.org/learn/why_... [ziglang.org]

    It's amazing how many languages that claim to be better than other languages don't do their homework to get their facts right. I know C++ best, but I'm sure people who know the other languages it competes with can find what's wrong with it.

    C++ ... have operator overloading, so the + operator might call a function.

    And? The + operator IS a function. Functions can be inlined. It's no different from function overloading. It's no different from passing a flag that calls a specific version of a function, except with ordinary function overloads the overload resolution is done at compile-time and the choice is optimized away.

    C++ coroutines allocate heap memory in order to call a coroutine.

    Actually a lot of work has been done to allow memory allocations to be elided. Not only can C++ compilers elide the allocations for coroutines, they are allowed to elide ordinary heap allocations as well.

    C++, Rust, and D have such a large number of features that they can be distracting from the actual meaning of the application you are working on. One finds oneself debugging one’s knowledge of the programming language instead of debugging the application itself.

    This is such a nonsense argument. Any useful program will be large enough to require functionality in either the language or libraries. You're going to have to learn a lot of things either way. Libraries can also have a large number of features that distract you from the actual meaning of the application. You can also have too many libraries.

    Programmers who can't wait to debug their programs at runtime are what's wrong with programmers. If the language can do a lot of the debugging for you every time you compile, then you'd be a moron to not let the computer do it.

    Especially stupid is the defer keyword instead of destructors. It completely misses the point of having destructors in the first place. The point is you don't need to write cleanup code everywhere.

    Apparently there is this https://github.com/ziglang/zig... [github.com]

    They seem to have fundamentally misunderstood how and why RAII works.

    • I find a lot to agree with in your comment. I suggest those interested in a better C investigate Ada in addition to C++. It is amazing how many are trying to create a better C and overlook the fact that the alternatives already exist. They are just not as popular because "they are not new".

      I also find it interesting that many of those involved in designing these new solutions have not done a better job of learning about these existing alternatives to understand the features and implications that they are

      • I have a high respect for Ada. I'm known as a C++ guy around these parts, but I wouldn't mind an opportunity to use Ada, if only there were jobs around me that would accept an Ada noob.

        Imagine how much better the world would be if the people who hated C++ just went with Ada, instead of creating abominations like D, Java, C#, Javascript, Python, Rust, etc.
        • Python isn't a C++ alternative, it's a bash alternative. Works great in that context.
          • Even C++ works as a bash and Python alternative these days. If you use bash or Python, you don't care about performance. You can easily write C++ in a script-like style. The best part is, when you inevitably need to use native libraries, C++ can do it without change, since they'll inevitably be written in C. Or when the program gets big, it won't become the unwieldy dynamic typing mess like Python does.

            But still, Python isn't being used as a bash alternative. It's being used in everything from build syst
  • Who cares about some made up language popularity index? Is that thing owned by the same company as Slashdot, or what?
  • I'm not caring much about Zig. But the others: I find it amusing how the order of the top languages varies so widely. Every survey is different.

    Python being the top language - honestly, that's a joke. There may be a lot of amateur programmers playing around with it, but a scripting language that allows runtime redefinition of its data types? Not for any sort of serious software development.

  • I'll wait for the inevitable successor (and/or competitor) language "Zag".

  • Many a programmer knows David Plummer's YouTube channel, "Dave's Garage".

    One of its ongoing threads was to test programming language speed with a prime sieve to determine the fastest language. In the end it became a community effort testing ~90 languages, from mainstream to obscure. In the end, Zig won.

    It is surprising enough that Zig won, but even more surprising is that while the 2nd fastest language (Rust) was able to crank out 5857 passes of the sieve per second, 1st placed Zig was able to do 10

    • by ffkom ( 3519199 )
      There is a reason why serious benchmarks, such as the SPEC suite, do not just measure results for one particular algorithm.

      If you want the result to be that other programming languages are faster than zig, just pick and choose some different algorithm, like this example: https://programming-language-b... [vercel.app]
    • If you are comparing compiled languages, then for most low-level languages a well-written program with a decent compiler ends up running optimized assembly language anyway - I'm not sure how you can gain any significant speed except with a clever algorithm that could be implemented in any other low-level language.

      i.e. either its compiler has a clever trick that you could do in any other language, or the code has a clever trick you could do in any other language

  • Go's libraries are not ready for production use. For instance Go's http.Get(url) does not support important aspects of standards like the fact that DNS can return multiple results. Something like Java's Apache libraries still need to be created to wrap Go's overly simplistic functions in many places.
  • Many languages with static typing, no GC, and compilation to binary code are better choices according to the Tiobe index, and should be tried first as a C/C++ replacement, well before Zig.

    According to the latest Tiobe index, before Zig you should try: D, Objective-C, Ada, Fortran (high-perf distributed co-arrays in Zig?), Rust, and assembly.

    First step for success? Two concurrent compilers! All major languages have several compilers. My top 6 have at least gcc and llvm implementations, including the approaching g

  • While I don't see reason to change from C++ to Zig, I'll give it to zig that their approach of shipping a self-contained cross compiler - even for other languages like C++ - is a nice idea. Want to compile some C++ code into an executable for a different operating system you do not have at hand? Zig can help doing that, without involving a line of zig code: https://andrewkelley.me/post/z... [andrewkelley.me]

"The pathology is to want control, not that you ever get it, because of course you never do." -- Gregory Bateson

Working...