Programming

Why C Isn't a Programming Language Any More (theregister.com) 284

The C programming language has many problems. But now the Register notes that "Aria Beingessner, a member of the teams that implemented both Rust and Swift, has an interesting take... That C isn't a programming language anymore...."

"And it hasn't been for a long time," Beingessner writes in an online essay: This isn't about the fact that C is actually horribly ill-defined due to a billion implementations or its completely failed integer hierarchy. That stuff sucks, but on its own that wouldn't be my problem.

My problem is that C was elevated to a role of prestige and power, its reign so absolute and eternal that it has completely distorted the way we speak to each other. Rust and Swift cannot simply speak their native and comfortable tongues — they must instead wrap themselves in a grotesque simulacra of C's skin and make their flesh undulate in the same ways it does....

Everyone had to learn to speak C to talk to the major operating systems, and then when it came time to talk to each other we suddenly all already spoke C so... why not talk to each other in terms of C too?

Oops! Now C is the lingua franca of programming.

Oops! Now C isn't just a programming language, it's a protocol.

The Register picks up the argument: it's fair (if wildly controversial) to say, as this 2018 Association for Computing Machinery paper puts it, that C is not a low-level programming language. As its subtitle says: "Your computer is not a fast PDP-11."

This is not a relative assessment: that is, it's not saying that there are other programming languages that are lower-level than C. It's an absolute one: C is often praised for being "close to the metal," for being a "portable assembly language." It was, once, but it hasn't been since the 1970s; the underlying computational models of modern computers are nothing like the one that C represents, which was designed for a 1970s 16-bit minicomputer.

The Register summarizes what happens when a language has to interface with an operating system — and thus, that operating system's C code. [I]t has to call C APIs. This is done via Foreign Function Interfaces (FFIs).... In other words, even if you never write any code in C, you have to handle C variables, match C data structures and layouts, link to C functions by name with their symbols....

The real problem is that C was never designed or intended to be an Interface Definition Language, and it isn't very good at it.
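
To make the "C as interface" point concrete, here is a minimal sketch (all names below are hypothetical, not from the essay) of the kind of C declaration that every foreign function interface ultimately has to mirror: the field layout, the string convention, and the symbol name are all defined in C's terms.

    /* widget.h -- a hypothetical C library header. Any language that wants to
     * call this library must reproduce the layout and signature below exactly. */
    #include <stddef.h>
    #include <stdint.h>

    struct widget {
        uint32_t    id;      /* field order, sizes and padding follow the C ABI */
        const char *name;    /* NUL-terminated C string, not a counted string   */
        size_t      length;
    };

    /* Callers in Rust, Swift, Python, etc. link against this symbol by name and
     * must pass a pointer that is laid out like `struct widget` above. */
    int widget_process(const struct widget *w);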


Why C Isn't a Programming Language Any More

  • by Gabest ( 852807 ) on Sunday April 03, 2022 @02:59PM (#62413586)

    Every time I write speed-critical code in C I check the generated asm output, and it is totally possible to fine-tune it to get what you want at a low level.

    • Re: (Score:3, Insightful)

      by rlwinm ( 6158720 )
      ^^^ This is a sign of a seasoned pro. Where it matters (which is often less than 1% of the code) I know all the seasoned guys check the output of the compiler both for sanity and correctness as well as to guide the compiler better.
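
      For anyone who has not tried it, the workflow described above is easy to reproduce; a minimal sketch (the function below is only an illustration):

          /* saxpy.c -- a small hot loop worth inspecting.
           * Emit the assembly with:      gcc -O2 -S saxpy.c
           * or disassemble the object:   gcc -O2 -c saxpy.c && objdump -d saxpy.o
           * then check whether the compiler vectorized the loop as expected. */
          void saxpy(float *restrict y, const float *restrict x, float a, int n)
          {
              for (int i = 0; i < n; i++)
                  y[i] = a * x[i] + y[i];
          }
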
      • by narcc ( 412956 )

        You know, for years the common "wisdom" on Slashdot has been that compilers are magic and that mere humans can't possibly best them. They tell you not to worry about how wretchedly inefficient your code is because the compiler will magically optimize it.

        They'll also tell you that processing power and memory are so cheap that it's not worth your time to even attempt to write more efficient code. Then they use python, as if to prove their point that performance doesn't matter any more, and then they wonder

    • by Aighearach ( 97333 ) on Sunday April 03, 2022 @03:39PM (#62413676)

      Their claim is so ridiculously stupid it really calls into question the future of Rust if these are the sort of idiots on their core team.

      Most of my code runs on embedded systems. Most of the documentation I keep open while writing C is for processor registers. I have absolutely no trouble at all doing this in C. Because, golly, it is "close to the metal."

      Aria Beingessner should burn her IDE and learn how the stuff works.

      Rust and Swift cannot simply speak their native and comfortable tongues — they must instead wrap themselves in a grotesque simulacra of C's skin

      Not C's fault.

      • Absolutely stupid, I agree. Doesn't sound like a programmer wrote this, to be honest; more like a journalist who saw a few books. Of course it's a language, not a protocol. And no, its integer hierarchy isn't broken. And it has standards; it's not a mass of incompatible compilers. Yes, it interfaces to an OS, but it is also used for applications.

        • by ShanghaiBill ( 739463 ) on Sunday April 03, 2022 @04:45PM (#62413858)

          Yes, it interfaces to an OS

          Yes, and it does so through assembly language stubs. Rust does the same and has no problem interfacing with the OS. The fact that the OS is written in C is irrelevant.

          TFA is not just stupid, it is wrong.

      • by phantomfive ( 622387 ) on Sunday April 03, 2022 @04:27PM (#62413814) Journal

        Over the years, people have tried to make a better "low level interface." We've seen LISP machines, we've seen APL, we've seen FORTH machines, we've seen the JVM, we've seen the CLR, and a couple of different iterations of WebAssembly. Most of these have the goal of making it easier for higher level languages to interact with the computer.

        C is popular because it matches what the computer does (fairly well). I am happy to see Aria Beingessner make another attempt along these lines. Maybe she will come up with something impressive. But it isn't an accident that things ended up this way. They ended up this way because C is the best at what it does.

      • by ceoyoyo ( 59147 ) on Sunday April 03, 2022 @04:35PM (#62413832)

        Her problem appears to be that she'd like to use C libraries someone has generously shared with her, but she's astonished that nobody has written a Rust interface for her. But OMG, if she HAS to, she'll write her own. Except C headers are apparently impossible to parse (except for C parsers). And OMG, what even IS a long anyway?

        There's a reason programmers used to learn assembly, even if they never used it again. It teaches you all the work that has to be done so you can have an i32 (32-bit int for Rust people; I assume it's little endian?).

        My own CS professor thought assembly was for ignorant children and insisted we start with transistors and learn about all the work that has to be done before you can have luxuries like mov, jmp and add.

        • by BumboChinelo ( 2527572 ) on Sunday April 03, 2022 @05:15PM (#62413932)
          You lucky bastard! We started with electrons and holes, doping and band gaps
        • My own CS professor thought assembly was for ignorant children and insisted we start with transistors and learn about all the work that has to be done before you can have luxuries like mov, jmp and add.

          That would've been a cool approach; and probably an easier one for my C language prof, who had a background in electronics engineering, rather than straight compsci. As it was, C was somewhat new to him and college level curricula - my age is showing itself, here. Because I'd already spent some time playing with Turbo C, I was more familiar with C than he was, and actually helped him present the material in some cases.

        • by Cbs228 ( 596164 )

          Her problem appears to be that she'd like to use C libraries someone has generously shared with her, but she's astonished that nobody has written a Rust interface for her.

          Rust has bindgen [crates.io] for this purpose. It uses LLVM to dissect C headers and produce an (unsafe) Rust wrapper for them. The programmer is still responsible for checking over it and writing a higher-level, "safe" Rust API if needed, but bindgen does most of the drudge work.

          an i32 (32-bit int for Rust people; I assume it's little endian?).

          An i32 has native endianness by default. You can convert it to/from any endianness you want, but arithmetic still needs to happen natively. Rust arguably has better behavior [github.com] for arithmetic overflow than C.
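
          The C equivalent of that point, as a rough sketch (not from the comment above): an integer in a register simply has the machine's byte order, and endianness only becomes visible when you serialize it; shifting makes the conversion portable.

              #include <stdint.h>

              /* Write v to buf in big-endian order, whatever the host order is. */
              static void put_be32(uint8_t buf[4], uint32_t v)
              {
                  buf[0] = (uint8_t)(v >> 24);
                  buf[1] = (uint8_t)(v >> 16);
                  buf[2] = (uint8_t)(v >> 8);
                  buf[3] = (uint8_t)v;
              }

              /* Read it back; arithmetic on the result happens in native order. */
              static uint32_t get_be32(const uint8_t buf[4])
              {
                  return ((uint32_t)buf[0] << 24) | ((uint32_t)buf[1] << 16) |
                         ((uint32_t)buf[2] << 8)  |  (uint32_t)buf[3];
              }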

          I frequently tell people that "C is th

        • by ljw1004 ( 764174 ) on Monday April 04, 2022 @12:46AM (#62414860)

          There's a reason programmers used to learn assembly, even if they never used it again. It teaches you all the work that has to be done so you can have an i32 (32-bit int for Rust people; I assume it's little endian?). My own CS professor thought assembly was for ignorant children and insisted we start with transistors and learn about all the work that has to be done before you can have luxuries like mov, jmp and add.

          I did loads of assembly back in the 1990s. (first job was writing assembly; did RISC stuff with arm chips; and like your professor I built my own hobby CPU out of logic gates).

          But the world is so different now.

          What has been humbling for me, now as a professional programmer, is that the machine's behavior is so different from the mental model I formed from assembly. If I want to write a hash-table? -- the fastest implementation in practice is now one that scans 256 hash lines in parallel using SIMD. If I want to write a concurrent data-structure? -- the fastest implementation in practice is no longer one that uses atomic compare-and-swap, but instead one that uses shards, and locks an entire shard at a time, so it can use SIMD. If I want to write a compiler? -- the fastest implementation is now one that uses garbage collection (!!!) since AST-walking produces so much junk on the shallow heap that having a garbage collector wipe it out in one go is faster than allocating and deallocating each piece.
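
          A rough C sketch of the SIMD probing idea mentioned above (loosely modeled on the control-byte technique used by Swiss-table-style hash maps; the function name is made up): sixteen one-byte hash tags are compared in a handful of instructions instead of one slot at a time.

              #include <emmintrin.h>   /* SSE2 intrinsics */
              #include <stdint.h>

              /* Return a 16-bit mask with bit i set when group[i] equals tag,
               * so sixteen candidate slots are scanned in parallel. */
              static unsigned match_group(const uint8_t group[16], uint8_t tag)
              {
                  __m128i g  = _mm_loadu_si128((const __m128i *)group);
                  __m128i t  = _mm_set1_epi8((char)tag);
                  __m128i eq = _mm_cmpeq_epi8(g, t);
                  return (unsigned)_mm_movemask_epi8(eq);
              }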

          So when I see someone reaching for assembler or C to solve a problem or just to reason about how to solve a problem? My first thought is that they're probably leaving perf on the table. (unless they themselves are the person who spent the past two years of their lives pursuing this one small area).

      • by Anonymous Coward on Sunday April 03, 2022 @04:42PM (#62413842)

        Their claim is so ridiculously stupid it really calls into question the future of Rust if these are the sort of idiots on their core team.

        You have to ask, don't they know that? Assuming they're halfway competent, apparently enough so to "be on the team that implemented" two compilers. Maybe not, and that's technical demerits on both compilers, since they're then full of code by incompetent programmers. But let's assume this for the sake of argument.

        Then what? Then they're spinning a narrative to justify moving away from C, aimed at middle managers. That's what.

        On another note, I don't really mind if C were to shuffle off this mortal coil, eventually. The thing is that most CPUs are still "C machines", designed to, indeed, be decent simulacra of really fast PDP-11s. Certainly so compared to, say, a lisp machine or a forth machine. Moreover, neither swift nor rust offers abstractions that would lead to efficiency in translating program code to run on hardware other than a C machine. Meaning, neither rust nor swift is capable of being better at this.

        And then there's the tiny detail that rust cannot generate code without llvm, which is written in C++, which in turn is a rather gigantic bastard language based directly on C. If you're talking about C not matching current hardware well, what about C++? I think they can make that case about as well as the case that rust or swift would be decent C replacements.

        So they spin a ludicrous narrative. But again, that's aimed at giving clueless middle managers reasons to move away from C, and, presumably, onward to rust. So while this is utter and complete bollocks technically, it is vaguely plausible to the intended demographic.

        So this message exists for the same reason they keep proposing to put rust in places it won't shine. This is propaganda.

        If you want to do anything about it: write a piece on how C, despite its failings, is still vastly superior in practice to rust. And write it at a level understandable to a five year old, i.e., clueless middle management.

      • it is "close to the metal."

        This, in a nutshell.

      • English also is not a language because different people speak it differently and there is no enforced standard.
        Rust is not a language, but a chemical process

      • > Not C's fault.

        Well, it is to the extent that its popularity makes it the one people copy.

        You don't see a lot of people "wrap{ping} themselves in a grotesque simulacra of COBOL's skin", for instance.

    • BS and not BS. They are right, C is not and never was supposed to be a low level language. C is a high level language. ASM is a low level language and NEITHER is machine code. The code these guys are writing is just so far from the known universe that C looks close to the metal in comparison.
      • Rubbish. One of the reasons C and C++ are still used is because they allow you to easily, deterministically (usually) know exactly what the compiler will do and what the underlying code will look like, what the structures you define will look like, stack usage, all that shit.

        If C is high level, you are just turning it up to 11. "Guise, C is an 8, so C# is a 10 and RandoLangOfTheWeek is a 14!"

        C is a low level language compared to almost every other language in even semi-popular use.

    • More like CS [github.io] than BS: "I am Aria Beingessner, Gankra, and a cat."
    • If you're writing code in a vacuum you aren't the subject of this article.

      The problem, as the article states, is that anyone not writing code purely for embedded systems or OS code will almost always need to communicate with C code. And that means you need to speak not just assembly or bytecode; you need to speak "C as defined by the compiler that compiled the OS."

      And that "C" is incredibly inconsistent, not just between versions of "C" but between versions of a compiler. Basic definitions like Long Long

    • Re: (Score:3, Insightful)

      by Kisai ( 213879 )

      It's the kind of complaint I see a lot by other programming languages. "Nobody speaks my language, why do I have to learn C"

      Because C IS the language closest to representing the underlying hardware. C is to English as Java is to French. French is maybe used in 3 or so countries, but the vast majority of the world, including French-speaking countries, learn English because that's the language everyone can use, requires the least amount of knowledge of the grammar/syntax to be understood, and the least amount

    • by znrt ( 2424692 )

      i don't think this is BS. BS usually has an agenda, some message. this is just nonsense, meaningless banter. it's weird to read this from people you'd expect to be intelligent.

      oh well, i guess every sector has to have "influencers" of sorts now ...

  • Nonsense (Score:5, Insightful)

    by mustafap ( 452510 ) on Sunday April 03, 2022 @03:08PM (#62413610) Homepage

    C is still perfectly acceptable and relevant for the vast majority of computers on the planet. Just not the ones the author works on. Microcontrollers far outnumber servers, laptops, tablets, and phones, the platforms where the author's comments are actually reasonable.

    • C is still perfectly acceptable and relevant for the vast majority of computers on the planet. Just not the ones the author works on. Microcontrollers far outnumber servers, laptops, tablets, and phones, the platforms where the author's comments are actually reasonable.

      And further, a lot of software has to be safety certified and validated, for which C++ has a tendency to hide processing in places that are not obvious in the code.

      The relative simplicity of C is an asset when trying to determine whether the code is correct.

      • If that were true, there wouldn't be so much undefined behavior going on in our operating systems, browsers, etc., that leads to security vulnerabilities. Even well known, seasoned programmers are susceptible to doing this with C.

        • That's more due to layer upon layer of abstraction than anything else. You can't invoke twenty libraries without penalizing your code somehow.
          Add in the fact that many paradigms also hide what's actually going on and require absurd data transfer strategies to fit within that paradigm (I'm talking about you, Object Oriented Programming), and you've got a massive problem - and then I'm not even getting into the entire design pattern nightmare. Basically, common sense problem solving and highly skilled programmer
          • Rust doesn't have that problem. The only way a library I import with rust can break something is if that library itself has a problem, which is pretty rare. The code I write that uses it is guaranteed not to cause undefined behavior, data races, etc. Sure, I could create a race condition, but it won't ever cause memory corruption, dirty reads, dirty writes, buffer overflow, etc.

  • Odd. (Score:5, Insightful)

    by rlwinm ( 6158720 ) on Sunday April 03, 2022 @03:08PM (#62413614)
    So as an embedded programmer with some 30+ years of work on everything from real-time bare metal code to embedded Linux doodads I guess I haven't been programming in a language. Because even to this day I don't hear a single engineer I work with (and I work with lots of external companies of different cultures) question the language we write in - in fact it's almost explicit that it's C.

    This article is pure garbage disconnected from reality.
    • by ukoda ( 537183 )
      Yes, pure garbage. Force them to write some machine code for a few years and see if they still want to claim C is not a programming language.

      To make claims that C is like something else does not take away from what it also is. If they claim it is used as a protocol it does not follow that it can no longer be considered a programming language. Apparently they didn't teach them the basics of logic.
      • Re:Odd. (Score:5, Interesting)

        by Darinbob ( 1142669 ) on Sunday April 03, 2022 @04:20PM (#62413804)

        The foreign function interface protocol isn't "C", it's generic programming. It's intended to be used with Fortran, Pascal, and many others. Thus, simple variables, simple functions, simple parameters, simple return values. No one thought having an ABI was weird until suddenly "how come it doesn't do it in a way that's easier for my favorite language?"

        This is why I avoid some new languages like the plague, not because they're bad languages, but because they have an entire religious cult attached.

  • Just make any old crazy statement to get clicks. Profit!!

  • And I suppose PASCAL is better? It's always preached top-down programming but effectively forced people to write their programs bottom-up. Not only that, but for many years there was no standard I/O package because it wasn't originally designed to be compiled or run; just checked over by the class's instructor.
  • by musicmaker ( 30469 ) on Sunday April 03, 2022 @03:12PM (#62413622) Homepage

    Bits into bytes, bytes into words, words into structs. Structs over buses and network cables. C isn't so much the lingua franca as it is the most common and successful abstraction over the physical way hardware communicates with hardware. This author seems to forget that C isn't itself the thing; it's an abstraction over the structures the hardware itself expresses. No matter what language you're in, you ultimately are executing op-codes on a CPU. Swift, Scala, whatever high level language you choose is ultimately always going to be restricted in certain ways because that is the physical reality of computing. I don't know what else you want. It's literally how. it. works.

    • C also struck a good balance between being concise and remaining intuitive when expressing ideas. There's a good reason that most languages used today have borrowed either in part or even quite heavily from it. Maybe there are better ways that things could be done, but there are a lot of worse ways, and that makes most changes side-grades: a matter of personal preference rather than an improvement in the best case, and a good intention that doesn't pan out in most others.

      There's a reason that not only is C still ar
  • by peppepz ( 1311345 ) on Sunday April 03, 2022 @03:14PM (#62413626)
    It's written in such a clickbait way that it seems like a YouTube video. Even if it contained some valid points, it's impossible for me to take it seriously.
  • by Otis B. Dilroy III ( 2110816 ) on Sunday April 03, 2022 @03:14PM (#62413628)
    line of crap that I have ever read.
    • If I had mod points I would vote that up.

      Idiots debating idiocy with other idiots. I am supposed to be impressed somehow with their sweeping vision of What Is Wrong when my life is all about meeting some contract deadline. That does not change no matter what programming language I am using.

    • line of crap that I have ever read.

      Luckily you don't often read about what Rust developers say, then. LOL

      This is par level stupid.

  • Wow (Score:5, Insightful)

    by 93 Escort Wagon ( 326346 ) on Sunday April 03, 2022 @03:15PM (#62413632)

    What an incredibly silly bit of verbal gymnastics. It's almost as if the whole motivation was to get advertising clicks.

    • Yes. Take this for example: "My problem is that C was elevated to a role of prestige and power, its reign so absolute and eternal that it has completely distorted the way we speak to each other."

      Reminds me of the English language though, as spoken by people outside England. Many dialects are still quite understandable.

      • by ceoyoyo ( 59147 )

        That's not written to get clicks. That's written to get grant money from humanities funding agencies.

  • Okay ... (Score:5, Insightful)

    by fahrbot-bot ( 874524 ) on Sunday April 03, 2022 @03:17PM (#62413636)

    I read the blog post and no more coffee for Aria.

    My problem is that C was elevated to a role of prestige and power, its reign so absolute and eternal that it has completely distorted the way we speak to each other. Rust and Swift cannot simply speak their native and comfortable tongues — they must instead wrap themselves in a grotesque simulacra of C's skin and make their flesh undulate in the same ways it does....

    C was here first -- or before Rust and Swift anyway -- and is used, almost literally, everywhere. Sounds like you're upset (or bitter/jealous) that it won't now get out of the way (for, arguably, perhaps better defined/implemented languages), but things are what they are and C is, basically, the (or, at least, a) "reference language". Get back to us when as many things, including operating systems, are written in Rust and/or Swift. Until then I'll keep C in the "Languages" section of my resume.

  • by Viol8 ( 599362 ) on Sunday April 03, 2022 @03:17PM (#62413638) Homepage

    ...it's doing quite well for itself, being the language of choice for most popular OS kernels, not to mention toolkits, DBMSs, compilers and interpreters, so before all those who worship at the altar of Guido come along and tell us how much better Python is, go check what language Python itself is written in. Ditto most of the other high level scripting languages too.

    • by ceoyoyo ( 59147 )

      Python is a great language. One of its killer features is how easy it is to incorporate C code.

      • by fahrbot-bot ( 874524 ) on Sunday April 03, 2022 @04:32PM (#62413824)

        Python is a great language.

        Great, but ridiculous. Let me delete one tab and wreck your entire program ... :-)

        [If they'd add support for brace-delimited blocks (or similar), I'd climb onboard. Until then I'll use Perl.]

        • by ceoyoyo ( 59147 )

          Oh damn! I see how well your C/Rust/Java program works when I delete a curly brace! Wait a minute....

          I've never seen the logic in that argument. A delimiter is a delimiter. I thought curly braces were a step down from good old begin/end or block/end block, but whatever. It's not hard to write a preprocessor to change your delimiter of choice into whatever the compiler/interpreter wants.

           Now, if you're one of those masochists who thinks an appropriate delimiter is a specific length string of identical charact

          • by fahrbot-bot ( 874524 ) on Sunday April 03, 2022 @05:04PM (#62413912)

            Oh damn! I see how well your C/Rust/Java program works when I delete a curly brace! Wait a minute....

            I've never seen the logic in that argument. A delimiter is a delimiter.

            Funny, but the C/Java/Perl etc... delimiters exist and are relevant only at the start/end of the block, not on every line of code within the block, like in Python. Get rid of *all* the indentation in C/Java/Perl and the code still works; not so in Python.

          • At least with a missing curly brace, or bracket, you can track down where it's most likely to go without too much trouble. Good luck trying to figure out where a missing tab should go.
          • by Junta ( 36770 )

            As one who writes a lot of Python code, it's more workable than people pretend, but it *is* true that if any sort of text processing step is going to mess with anything, it's going to be whitespace, either deleting or re-aligning. This usually isn't a problem when working with programming editors, but sometimes referencing code from instant messaging or a web forum makes python impossible to parse as the respective software 'helps' reflow the whitespace one way or another.

          • by DamnOregonian ( 963763 ) on Monday April 04, 2022 @03:00AM (#62414980)
            Oh that's nonsense. You delete a curly and your program will not compile.

            Fuck up a tab, and you haven't created invalid code. You've created code that now has new and exciting meaning.
    • This is a bit unfair, as Python isn't really where the criticism comes from. Anybody with some bare-bones knowledge would know that Python isn't designed to be a low-level language and frankly it doesn't even pretend to be. Heck, it relies on C for speed. (Nobody in their sane mind would, say, propose to write kernel code in Python). I think the gripes come from languages that do pretend to be as good as C (Rust, primarily). I won't discuss the merits of their claim, but just say that Python isn't the one you sho
  • Because it looked like a 25 sentence comment on a quote probably taken out of context. C isn't a lot of things because it's not meant to be. You don't grab a C compiler looking for a package manager or an interactive interpreter or a community manifesto. (Well, maybe you read K&R, but it's handy and concise.) You might want a standard library, but that's not really necessary either, depending on your target. I like C because I don't have to fight with syntax (I'm looking at you, Python) or do things t
  • by ukoda ( 537183 ) on Sunday April 03, 2022 @03:33PM (#62413664) Homepage
    "C is actually horribly ill-defined due to a billion implementations or its completely failed integer hierarch" ?!?!

    To steal from the first Google search result: "C has been standardized by ANSI since 1989 (ANSI C) and by the International Organization for Standardization (ISO)."

    C can take the number of implementations as a compliment but who would use anything other than GCC these days anyway?

    Yeah, the integer size thing can catch newbies, but you soon learn to include stdint.h, and if you are an embedded programmer dealing with real world hardware then things like uint8_t and uint32_t make it pretty clear what the hardware supports.
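
    For readers who have not seen the fixed-width style in practice, it looks roughly like this (the register name, address, and bit position are invented purely for illustration):

        #include <stdint.h>

        /* Hypothetical memory-mapped UART status register at a made-up address. */
        #define UART_STATUS   (*(volatile uint32_t *)0x40001000u)
        #define UART_TX_READY (1u << 5)

        static inline int uart_can_send(void)
        {
            return (UART_STATUS & UART_TX_READY) != 0;   /* exactly 32 bits wide */
        }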

    The whole basis for their argument is flawed. The fact that C is used as a reference point far beyond just compiling and linking code does not mean it is no longer a programming language. Can the authors really call themselves programmers if they can't tell the difference between '&', '|' and '^' in the logic of their arguments?
    • Re:Say what? (Score:5, Insightful)

      by Entrope ( 68843 ) on Sunday April 03, 2022 @03:55PM (#62413702) Homepage

      To be fair, maybe more fair than this load of bilge deserves, C leaves a bunch of details up to implementers. A lot of those are (or were originally) consciously left to the platform because there was a lot more variation in integer sizes, representation of negative numbers, pointer structure, and more when C was being standardized than there is today.

      However, an awful lot of the complaints from the summary are not really about C itself, but about platform specific binary interface standards. C doesn't say "pass arguments using registers in order XYZ, then spill to the stack". It doesn't say "leave the first page of virtual address space unmapped to help detection of null pointer dereferences". It doesn't specify how to implement memory barriers or garbage collection or a lot of other things. The complaints here mostly seem to be sour grapes that library interfaces are defined using languages other than this person wishes were in charge.

    • It's also the first time in my life I've heard of a "C integer hierarchy." I'm not sure how you can figure out which type of integer is "above" or "below" another integer.

      • by hawk ( 1151 )

        apparently, it's about "endian privilege" . . .

        [ok, I'll crawl back under my rock]

        hawk

  • "My problem is that C was elevated to a role of prestige and power, its reign so absolute and eternal that it has completely distorted the way we speak to each other.

    Rust and Swift cannot simply speak their native and comfortable tongues – they must instead wrap themselves in a grotesque simulacra of C's skin and make their flesh undulate in the same ways it does."

    What an absolute drama queen.

  • by RightwingNutjob ( 1302813 ) on Sunday April 03, 2022 @03:35PM (#62413670)

    The fundamental operations exposed to the programmer by C are manipulation of memory addresses and manipulation of numerical values stored at those addresses.

    Say what you will about pipelines and RISC and anything else, that's how any CPU operates at a high level and that's how any hardware one would like to control with a computer operates.

    This is already an abstraction since it's talking in math about digital circuits composed of transistors and diodes and semiconductor junctions. There are other abstractions that talk about computation in terms of abstract datatypes and functions and pictures and whatever, but as the summary says, it all has to translate to numbers stored at memory addresses in the end.

    This rant strikes me as if it came from a moron or a grifter. Since the ranter designs programming languages, moron doesn't seem to fit.

    File under "meaningless trash talk" and move on.

  • I really don't know what they are talking about. Their rant borders on ... feelings.
  • [I]t has to call C APIs. This is done via Foreign Function Interfaces (FFIs).... In other words, even if you never write any code in C, you have to handle C variables, match C data structures and layouts, link to C functions by name with their symbols....

    The real problem is that C was never designed or intended to be an Interface Definition Language, and it isn't very good at it.

    Well no shit, Sherlock. C has worked very well, and still works well, for what it was designed for. Not so much an "abstraction of assembly language" as a static but somewhat weakly-typed language that compiles to efficient native executable code. And there's no way of getting around having to deal with the internal details of how processors work at some point. Other languages "solve this problem" by relying on C libraries. And then they claim C is the problem? If C was such a barrier, there is noth

  • ... I've heard all of these about FORTRAN ... 30 years ago.

  • makes no sense (Score:4, Insightful)

    by cjonslashdot ( 904508 ) on Sunday April 03, 2022 @04:08PM (#62413752)

    "C is actually horribly ill-defined due to a billion implementations or its completely failed integer hierarchy"

    Actually, C has a tremendously successful history of being able to cross-compile to numerous architectures.

    And as for not being a real programming language: a few years ago I was writing a program in Ruby and getting frustrated, so I started again in C, which I had not used in 20 years, and after 40 minutes I had a 300-line program that compiled and worked the first time.

    To this day, my all-time coding productivity record was using C on a Sun workstation in the 80s, when I wrote 1000 lines of tested code in a single Saturday.

    • "Actually, C has a tremendously successful history of being able to cross-compile to numerous architectures."

      +1

      And it's pretty much the only language that offers interop with the languages you don't use for speed. You can include a C library in Java, Rust, Kotlin, Swift, Obj-C *and practically every other language used anywhere*. When there is some form of interop between two languages on that list, it is *invariably using a C-style wrapper*.

  • [I]t has to call C APIs. This is done via Foreign Function Interfaces (FFIs).... In other words, even if you never write any code in C, you have to handle C variables, match C data structures and layouts, link to C functions by name with their symbols....

    In other words "Dealing with legacy code pollutes my pure and beautiful environment."

    Suck it up, buttercup. No one likes working with legacy code. Legacy baggage hobbles every new environment. Sadly, it's cheaper in terms of risk and programmer time to just deal with it.

    Yeah, we know C has some huge shortcomings. C was invented at a time when performance was really important, so K&R didn't add expensive abstraction layers. C went on to be incredibly popular and built up a huge code base. Sure it would have been nice if K&R realized 16 and 32 bit values were going to be too small and that pinning down the data size model was going to matter but you know what? They weren't omniscient. And if they had, Unix never would have worked and the whole problem would have resolved itself.

    • Sure it would have been nice if K&R realized 16 and 32 bit values were going to be too small and that pinning down the data size model was going to matter but you know what? They weren't omniscient. And if they had, Unix never would have worked and the whole problem would have resolved itself.

      It's also a problem that has been solved by stdint.h

      • It's also a problem that has been solved by stdint.h

        Almost mentioned this. I think that's a pretty reasonable solution to the int size issue.

        I'm sure Rust calling C has all sorts of other problems, like strings. I don't know what a Rust string or array looks like but a cup of coffee says it's more complicated than a char *.
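
        For what it's worth, the usual difference is a NUL-terminated pointer versus a pointer-plus-length pair; a small C sketch of the two conventions (the struct below illustrates the counted-string idea only and is not any particular language's actual layout):

            #include <stddef.h>
            #include <string.h>

            /* C's convention: the length is implicit, found by scanning for NUL. */
            size_t c_len(const char *s) { return strlen(s); }

            /* The counted-string convention most newer languages use internally:
             * an explicit pointer and length, with no terminator required. */
            struct str_view {
                const char *ptr;
                size_t      len;
            };

            struct str_view view_from_c(const char *s)
            {
                struct str_view v = { s, strlen(s) };
                return v;
            }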

    • In other words "Dealing with legacy code pollutes my pure and beautiful environment."

      From how some people approach this at my current workplace (I'm in hardware, thankfully), it seems like there's nothing worse than someone else's code.

  • I never quite understood why C had types int, float, long, double, short (signed or unsigned), which do nothing to directly specify width, and not int, uint, or float followed by 8, 16, 32 or 64, e.g., uint16.

    • "int" is supposed to be the type of integer that runs most "naturally" on the processor you are using.

      That is, if you want to write code that will run easily on a 16 bit machine, a 32 bit machine, and a 64 bit machine, then you use "int" and you don't assume its range.
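
      A quick way to see this for yourself; the sizes printed below vary by platform, which is exactly the point (a typical 64-bit Linux box prints 2 4 8 8, while 64-bit Windows gives long only 4 bytes):

          #include <stdio.h>

          int main(void)
          {
              /* The standard only guarantees minimum sizes and an ordering:
               * short <= int <= long <= long long. */
              printf("short=%zu int=%zu long=%zu long long=%zu\n",
                     sizeof(short), sizeof(int), sizeof(long), sizeof(long long));
              return 0;
          }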

    • Re:type names (Score:5, Informative)

      by fahrbot-bot ( 874524 ) on Sunday April 03, 2022 @04:47PM (#62413868)

      I never quite understood why C had types int, float, long, double, short (signed or unsigned), which do nothing to directly specify width, and not int, uint, or float followed by 8, 16, 32 or 64, e.g., uint16.

      A lot has to do with the original implementation on the PDP-11. The first answer on What is the historical context for long and int often being the same size? [stackoverflow.com] cites and references the C99 rationale [open-std.org] (PDF), section 6.2.5. I've highlighted an interesting bit below:

      [...] In the 1970s, 16-bit C (for the PDP-11) first represented file information with 16-bit integers, which were rapidly obsoleted by disk progress. People switched to a 32-bit file system, first using int[2] constructs which were not only awkward, but also not efficiently portable to 32-bit hardware.

      To solve the problem, the long type was added to the language, even though this required C on the PDP-11 to generate multiple operations to simulate 32-bit arithmetic. Even as 32-bit minicomputers became available alongside 16-bit systems, people still used int for efficiency, reserving long for cases where larger integers were truly needed, since long was noticeably less efficient on 16-bit systems. Both short and long were added to C, making short available for 16 bits, long for 32 bits, and int as convenient for performance. There was no desire to lock the numbers 16 or 32 into the language, as there existed C compilers for at least 24- and 36-bit CPUs, but rather to provide names that could be used for 32 bits as needed.

      PDP-11 C might have been re-implemented with int as 32-bits, thus avoiding the need for long; but that would have made people change most uses of int to short or suffer serious performance degradation on PDP-11s. In addition to the potential impact on source code, the impact on existing object code and data files would have been worse, even in 1976. By the 1990s, with an immense installed base of software, and with widespread use of dynamic linked libraries, the impact of changing the size of a common data object in an existing environment is so high that few people would tolerate it, although it might be acceptable when creating a new environment. Hence, many vendors, to avoid namespace conflicts, have added a 64-bit integer to their 32-bit C environments using a new name, of which long long has been the most widely used. [...]

      In general, a long is at least as big as an int, which is at least as big as a short; a double is at least as big as a float.

    • by dskoll ( 99328 )

      There were computers with 36-bit words, you know. C was designed (for better or worse) to be easy to implement on a wide variety of computer architectures.

      Nowadays, almost all digital computers have word sizes that are a power of two, and <stdint.h> solves the problem nicely without constraining the actual compiler implementation.

      • There were computers with 36-bit words, ...

        Many CDC systems were 48-bit and, later, 60-bit. I was a sysadmin for a few at NASA Langley in the late 80s / early 90s, along with a Cray-2 and YMP -- the Crays ran a version of System V Unix (w/BSD features) called UNICOS.

  • by mveloso ( 325617 ) on Sunday April 03, 2022 @05:03PM (#62413902)

    In a way, they are correct: C has been around for so long and it's been used by low-level people so much that processor architectures and behavior are optimized for C.

    What does that really mean?

    Here's an example: processor cache architectures assume locality and sequential access, because that's how C allocates memory: in a big block. The way that stacks are allocated is also an artifact. In fact, the stack itself is an artifact of how C handles local variables. It's difficult to imagine how else to handle local variables at that level, really.
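
    A tiny sketch of what "assumes locality and sequential access" means in practice (the array size and layout are chosen only for illustration): walking a contiguous block row by row uses every cache line it fetches, while striding across it wastes most of them.

        enum { N = 1024 };

        /* Row-major traversal touches memory sequentially, so each cache line
         * fetched is fully used before moving on. */
        long sum_rows(int a[N][N])
        {
            long s = 0;
            for (int i = 0; i < N; i++)
                for (int j = 0; j < N; j++)
                    s += a[i][j];
            return s;
        }

        /* Column-major traversal of the same data jumps N*sizeof(int) bytes per
         * step, defeating the prefetcher and wasting most of each cache line. */
        long sum_cols(int a[N][N])
        {
            long s = 0;
            for (int j = 0; j < N; j++)
                for (int i = 0; i < N; i++)
                    s += a[i][j];
            return s;
        }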

    That said, that's what CS is about. I mean, GPUs have now been optimized for vector operations. They can go ahead and optimize CPUs for, well, dynamic languages? It's unclear what you'd actually optimize for. Object-oriented programming? Interpreted languages? But then thinking outside the C box at a low level will also be hard. It's difficult to say "how would we represent an object" without thinking about a block of memory.

    Maybe it's time to dust off the old symbolic machines stuff?

    • People keep saying this kind of thing, but they haven't been able to demonstrate it practically in any way: build a machine or a platform that is faster than C. So far that hasn't happened.

      We can look at graphics cards as an illustrative example, but it turns out they are faster only in specific cases, and not in others. Furthermore, graphics cards are interfaced through C (then Python uses C to interface with them).

      tl;dr C is faster in practice, despite some theoretical reasons other languages might be

  • Of course those deeply involved in Rust and Swift will lamely try to disparage other programming languages, especially one like C that has such a rich history and future. Doing so merely shows how little confidence they have in Rust and Swift.
  • by evil_aaronm ( 671521 ) on Sunday April 03, 2022 @05:35PM (#62413988)
    This article strikes me as the new kids trying to justify their reinvention of the wheel. Not because their wheel is any better, but because it's "new," and theirs.

    Kinda like Poettering and that tragic bastardization called systemd.
  • Bothered reading "About me" for this..."person"? Says it all really...

  • by KindMind ( 897865 ) on Sunday April 03, 2022 @06:12PM (#62414092)

    ... moderation button, to mark the story as -1 troll ...

  • by Maury Markowitz ( 452832 ) on Sunday April 03, 2022 @06:52PM (#62414174) Homepage

    "the underlying computational models of modern computers are nothing like the one that C represents, which was designed for a 1970s 16-bit minicomputer."

    More crap from the unwanted tabloid of the computer world.

    Modern computers are *literally designed to run C quickly*. When I say literally, I mean literally, literally.

    RISC-I, from where we get the term, was designed specifically to run C and Pascal (which is largely identical in opcode terms) as fast as possible. Here is one of the early papers on the topic:

    https://people.eecs.berkeley.edu/~kubitron/courses/cs252-F00/handouts/papers/p216-patterson.pdf

    As you can see in the tables at the end, the entire system was based on statistics of C and Pascal programs and was designed to make them run faster. Among its key design notes was the decision to use register windows, because they found that C programs spend a LOT of time doing function calls, as much as 45% of *all* of the time in a program. As such, they decided making this run as fast as possible would be key to a new design.

    Other what-we-now-call-RISC designs all followed the same pattern. The 801 was based on studies of PL/1 and Fortran programs on IBM machines and reached exactly the same conclusions. MIPS was based on running Unix statistics, and came to the conclusion the compiler could do the windows better than the hardware but was otherwise almost identical to RISC in design and layout. ARM was based on the RISC-I/II design after they visited the lab and WDC. CRISP is the "C-language Reduced Instruction Set Processor". Motorola specifically talks about making Unix fast when introducing the 88000.

    How is it possible the author of that statement is so utterly unaware of the history of the system they're almost certainly writing those words on?

    I'll agree with one thing: the machine you're running isn't like the one C was designed for; it's *much more* tuned to run C than the PDP-11 was.

  • When an entity identifies as a cat, it is announcing its intent to curry favor until it decides to bite you in the hand.
