Programming

ACM Magazine Criticizes Latest Draft of New C Standard, 'C23' (acm.org) 159

The ACM's software engineering magazine Queue delves into the latest draft of "a new major revision of the C language standard, C23... due out this year," noting the highs, lows, and several useful new features. The most important, if not the most exciting, make it easier to write safe, correct, and secure code. For example, the new <stdckdint.h> header standardizes checked integer arithmetic:

int i =...; unsigned long ul =...; signed char sc =...;
bool surprise = ckd_add(&i, ul, sc);


The type-generic macro ckd_add() computes the sum of ul and sc "as if both operands were represented in a signed integer type with infinite range." If the mathematically correct sum fits into a signed int, it is stored in i and the macro returns false, indicating "no surprise"; otherwise, i ends up with the sum wrapped in a well-defined way and the macro returns true. Similar macros handle multiplication and subtraction. The ckd_* macros steer a refreshingly sane path around arithmetic pitfalls including C's "usual arithmetic conversions."

C23 also adds new features to protect secrets from prying eyes and programmers from themselves. The new memset_explicit() function is for erasing sensitive in-memory data; unlike ordinary memset(), it is intended to prevent optimizations from eliding the erasure. Good old calloc(size_t n, size_t s) still allocates a zeroed array of n objects of size s, but C23 requires that it return a null pointer if n*s would overflow.

In addition to these new correctness and safety aids, C23 provides many new conveniences: Constants true, false, and nullptr are now language keywords; mercifully, they mean what you expect. The new typeof feature makes it easier to harmonize variable declarations. The preprocessor can now #embed arbitrary binary data in source files. Zero-initializing stack-allocated structures and variable-length arrays is a snap with the new standard "={}" syntax. C23 understands binary literals and permits apostrophe as a digit separator, so you can declare int j = 0b10'01'10, and the printf family supports a new conversion specifier for printing unsigned types as binary ("01010101"). The right solution to the classic job interview problem "Count the 1 bits in a given int" is now stdc_count_ones().

Sadly, good news isn't the only news about C23. The new standard's nonfeatures, misfeatures, and defeatures are sufficiently numerous and severe that programmers should not "upgrade" without carefully weighing risks against benefits...

The article complains that C23 "transforms decades of perfectly legitimate programs into Molotov cocktails," citing the way C23 now declares realloc(ptr,0) to be undefined behavior. ("Compile old code as C23 only for good reason and only after verifying that it doesn't run afoul of any constriction in the new standard.") It also criticizes C23's new unreachable annotation, as well as its lack of improvement on pointers. "Comparing pointers to different objects (different arrays or dynamically allocated blocks of memory) is still undefined behavior, which is a polite way of saying that the standard permits the compiler to run mad and the machine to catch fire at run time."

The article even cites the obligatory XKCD cartoon. "Let's not overthink it; if this code is still in use that far in the future, we'll have bigger problems."
  • by gnasher719 ( 869701 ) on Monday April 03, 2023 @06:58AM (#63421578)
    I assume this works for any combination of integer types. And would it have hurt to call it checked_add?
    • Re:Chk_add (Score:5, Funny)

      by Entrope ( 68843 ) on Monday April 03, 2023 @07:08AM (#63421608) Homepage

      "Type-generic" does mean what you assume.

      And C could not afford the third vowel in the function name creat(), so obviously, it would have hurt to spell out the word "checked".

      • by tlhIngan ( 30335 )

        And C could not afford the third vowel in the function name creat(), so obviously, it would have hurt to spell out the word "checked".

        creat() is not a C function that is part of the C standard library. creat() is a UNIX system call (granted, the C library usually wraps it in a function for convenience purposes, but you could asm() to make the system call, or use syscall() if your C library has it).

        It's a system call. In fact, using open() is a far better way to do things.

        Note that open() is also a system ca

      • I blame Merv Griffin.

    • There are likely still compilers in use that have a maximum symbol length of 8 characters. So they can’t define any symbol name longer than that. Just a guess.

      • Uh this is the C23 standard so you won't be using it with any existing compilers!
        • I realize that. I mean upgrades to existing compilers that might have an 8-symbol limit for one reason or another. Probably embedded and the like. The world of C compilers is quite large.

I think some very old compilers had a six character limit on names. You could use longer names in the source but the compiler internally truncated them. I thought that the C99 standard required that the compiler support some minimum like 32 characters for identifier lengths. I tried to Google but can't find any authoritative answers. Interesting point though. There should be an exception to moderation that you can positively moderate replies to your own posts so I could give you a point.
    • Allowing an apostrophe (i.e. single quote) as a digit separator in binary strings? WTH?! What was wrong with the underscore as a digit separator, like C# and Java?
      • by Erioll ( 229536 )
C++ went with the tick mark in C++14: link [github.com] (and not just for binary, you can use those in integer literals as well). So it's consistent with its sister/daughter language.
      • Underscores usually require holding shift to type and there's already too god damn many of them?
        • by AmiMoJo ( 196126 )

          Spare a thought for Japanese keyboard users. Shift for both underscore and single quote marks.

True, but quite a few C standard changes in the past were handled with macros inside of headers. Ie, stdbool.h. There usually is a requirement to be backwards compatible. Ie, code written in an older standard should work in a newer standard with minimal or no changes. Probably the reason stdbool.h was used: there's a possibility of it colliding with existing code that has its own "#define true 1" or "const int true = 1". That's rather rare I think, but on the other hand having stdint.h types all be b

  • Is there any reason to make comparing unrelated pointers undefined behaviour, instead of just unspecified behavior, like returning true or false in an unpredictable and possibly inconsistent way? The latter, together with pointer difference giving an unpredictable result, would be enough to check if a pointer points inside an array.
They don't "make" something undefined behaviour. They just don't define the behaviour. ie, they don't spend any effort to come up with a scheme of doing something, and they don't require the compiler writers to do so either. That's what undefined behaviour is - the lack of definition in the standard. Nothing stops compilers from actually defining a behaviour, but you may never get different compilers to agree on a behaviour, or even different versions of the same compiler.

      Compilers, static checkers and ru
No no no. Pointer comparisons for unrelated pointers have a definition for cases with the same array, and explicitly state that the behaviour is undefined otherwise. Which makes it impossible to test whether p points to any element of array a. All they had to do is make it "unspecified" with a result of true or false. Can you show me any processor with a C compiler not capable of doing that?
        • by Uecker ( 1842596 )

          Compiler optimizations break this. With TS 6010 you can convert to uintptr_t and then comparisons work as expected. But most optimizing compilers do not support this at this point.

        • by Darinbob ( 1142669 ) on Monday April 03, 2023 @01:05PM (#63422758)

          Consider the older segmented architectures. The actual pointer values are split across registers in many with some compiler implementations which makes directly comparing pointers require extra code - if the pointers are into the same array then the compiler can assume they're in the same segment and create more optimized code. There are similar optimizations when using some other 16 or 8 bit systems, not everything fits nicely into a flat memory model with uniform access that C prefers.

      • by sjames ( 1099 )

They don't "make" something undefined behaviour. They just don't define the behaviour.

        Declaring the behavior to be undefined explicitly repudiates any common convention that may seem to have been adopted, even where no known compiler does anything but the convention.

      • Compilers, static checkers and runtime sanitizers already do warn about undefined behaviour now

        This is an important point in C. Unlike in Rust, such extra tools are separate and not wrapped up into a single tool. And it's been that way since day 1, with a Unix idea of separate tools for separate jobs, with Lint as the add-on tool on top of C.

Also, C has a different goal than many languages. It wants older standard code to work with newer compilers, and newer standard code to work with older compilers. Not always, it's going to break somewhere but the standards try to make it break only in small w

    • Rather than making comparing unrelated pointers an undefined behaviour, they should have introduced a "Maybe" logic keyword, together with "True" and "False"...
    • by bsolar ( 1176767 )

      My understanding is that in unspecified behaviour there are multiple "correct" ways to perform an operation and the compiler can pick whatever way at any time, without a need of being consistent. In undefined behaviour, there is no correct way to perform the operation.

      In a segmented memory model, how would comparing pointers which reside in different segments work? The comparison would need to return a result depending on the relative position of the two pointers in the address space, but since they are in

      • by vadim_t ( 324782 )

        What do you mean? Like x86 real mode?

First, addresses in different segments are very much comparable. It's just a quirky addressing scheme that resolves to a 20-bit address. There's a normalization procedure. Second, who cares? That model is long dead, and shouldn't impede the progress of modern code.

        • by bsolar ( 1176767 )

          What do you mean? Like x86 real mode?

First, addresses in different segments are very much comparable. It's just a quirky addressing scheme that resolves to a 20-bit address. There's a normalization procedure.

          You can compare the physical memory addresses, but from what I understand that's semantically not the same as comparing their address in the address space. The standard specifies the latter for pointer comparison:

          When two pointers are compared, the result depends on the relative locations in the address space of the objects pointed to.

          The way I understand it, conceptually, a different segment is a different address space. An address in a segment *does* have a relative physical position in memory to an address in another segment, but no relative position in the address space since the second pointer resides in a different address

        • What do you mean? Like x86 real mode?

First, addresses in different segments are very much comparable. It's just a quirky addressing scheme that resolves to a 20-bit address. There's a normalization procedure.

          That doesn't work for 80286 protected mode, where segments are completely distinct objects. To find the CPU physical address, you'd have to read the segment descriptor and do a bunch of math. I don't remember off the top of my head, but an unprivileged program may not even be able to do that. The 386 architecture continued to support this segment silliness, even though essentially nobody ever used it to map less than the entire memory space. I assume that they finally dropped it for x64.

          Second, who cares? That model is long dead, and shouldn't impede the progress of modern code.

          There are plenty of

          • I assume that they finally dropped it for x64.

            Nope! amd64 is designed to be fully backwards compatible with x86, so you can still get it all the way into protected mode via "compatibility mode".

        • Long dead but supported in C. There are new and current chips with similar oddities too. Not uncommonly, different regions of memory access might use different instructions, such as the first 128 bytes accessible in a single byte instruction, Or in PIC there is very very little memory (dozens up to a few hundred bytes); memory is registers, and registers are memory, and you're using non-standard C anyway. There are DSPs which have all sorts of interesting quirks, such as pointer size not matching word s

    • by jeremyp ( 130771 )

      That's just a subset of undefined behaviour and it limits what an implementation can do. For example, the designers of a C compiler for a segmented architecture may want to trap if you try to compare two pointers in different segments but give a sane answer if you compare two pointers in the same segment.

      "Undefined behaviour" means "the implementor can do whatever is convenient" and for implementors targeting a flat address space, it's convenient to return a sane value no matter where the two pointers came

    • From C99 standard:

      3.4.3
      undefined behavior
      behavior, upon use of a nonportable or erroneous program construct or of erroneous data,
      for which this International Standard imposes no requirements

      3.4.4
      unspecified behavior
      use of an unspecified value, or other behavior where this International Standard provides
      two or more possibilities and imposes no further requirements on which is chosen in any
      instance

      So undefined behavior is what badly written code might do. Dereferencing the contents of a pointer after it has been freed.
Unspecified behavior is a lot about details of properly written code. Are the padding bytes in a structure initialized to 0 or are they semi-random?

      In this instance, comparing unrelated pointers is not allowed in the standard, and if you do it might generate an error, or a warning, or it might just work - it depends upon the compiler or system implementation. The st

  • Ugh. (Score:4, Interesting)

    by vadim_t ( 324782 ) on Monday April 03, 2023 @07:10AM (#63421614) Homepage

    UB is a pox on C and C++.

    Now, defenders will make the argument that UB allows for greater optimization. But for realloc, really!? Memory allocation is going to be avoided in high performance code anyway. Eking out a few extra microseconds on what is already a slow path in exchange for likely introducing crashes and security bugs is a bad tradeoff.

    Worse, this will probably break existing code that works perfectly fine at present, and create a whole bunch of problems that pop out of nowhere as soon as clang or GCC decide to take advantage of this, and somebody builds formerly perfectly good code with it.

    What we need is to minimize UB everywhere possible. Some people may be unaware of this, but "UB" is not "implementation defined". UB literally says that any outcome is acceptable, including results that look insane to most any developer.

    • by Ed Avis ( 5917 )
      Unlike the ACM article's author, I'm not offended by the new unreachable notation. It won't bite you unless you explicitly decide to use it. The problem is the implicit undefined behaviour allowing the compiler to do crazy things. In most cases I can kind of see why it's that way, but it needs a strong warning from the compiler that it's about to elide half of your code having decided it can never be reached without undefined behaviour being triggered.
Clang has an interesting option. You can mark functions as "does not return", for example a failing assert. But then you can tell the compiler "assume it doesn't return as far as warnings etc. are concerned, but actually it can return". For example to save the day after a failed assert.
    • Most importantly, "implementation defined" means "you'll get the same result every time, but that result will depend on your compiler." "Undefined behavior" means "you may get wildly different results depending on what values are stored at a random place in memory, what else is running at the moment, or whether it's Tuesday."

Implementation defined means it'll work one way at first, but the compiler maker will change that meaning when you're working on a major project with a strict timeline that could greatly benefit from the latest version of the compiler, while it breaks the code you wrote last month.

        • Still better than undefined behavior. But yeah, it's probably good to avoid implementation defined when you can as well.

Undefined behaviour means "Your program can crash or worse." The compiler can assume that you don't invoke undefined behaviour. You can't check if an overflow happened, because overflow is undefined behaviour, so in case of overflow the code can crash, or the compiler can remove the check because overflow can be assumed not to happen.
        • by Uecker ( 1842596 )

          Undefined simply means it is not defined. Yes, compilers can use it for optimization. They can also use it for adding run-time checking. They can also define it. Or another standard - such as POSIX - can define it.

    • But for realloc, really!?

The behavior was already: may or may not return anything of use, and may or may not free the memory.

      If it's UB, implementations are still free to define it and that will probably be a quality of implementation kind of thing.

      • by vadim_t ( 324782 )

        No, that's implementation defined.

        UB means that it's effectively radioactive, and the compiler can make completely insane things happen. Eg, UB gives you fun stuff like this: https://kristerw.blogspot.com/... [blogspot.com]

        • by Uecker ( 1842596 )

He is right though. Undefined behavior simply means the standard does not define it. This does not mean compilers cannot also do something sensible.

          • Comment removed based on user account deletion
            • by Uecker ( 1842596 )

              Who would be "anyone important"? There is no single super power who can decide. If you want something changed, you could contribute to an open-source compiler or also submit proposals to the ISO C committee.

            • When Undefined Behavior originally appeared in the spec, all the C programmers thought it meant one thing, and all the compiler makers decided it meant something else.

              There's this myth among C programmers that the compilers are written by a mustache twirling cabal of evil masterminds intent on being as perverse as possible. This isn't true.

Compiler authors are trying to make the code run as fast as possible because that's ALSO what C programmers want, and do so within the spec.

              The compilers are not and can

              • Undefined behavior also means the compiler can flag the code as erroneous. Unspecified behavior will list what the options are. So we already had unspecified behavior in the prior C standard for realloc(ptr, 0), and if a compiler flagged this as an error it would not have been compliant with the standard. The standards committee could have just added more options to the unspecified behavior if there was a need for that.

                • Undefined behavior also means the compiler can flag the code as erroneous.

                  It can! Sometimes it does.

                  So we already had unspecified behavior in the prior C standard for realloc(ptr, 0), and if a compiler flagged this as an error it would not have been compliant with the standard.

Technically yes. In practice, many people, myself included, compile with -Wall -Wextra -Werror or the equivalent, which is very strictly speaking not compliant, but widely accepted practice.

                  • I turn off a lot of warnings, just for old code to work. Some of the warnings are ridiculous - ie, unused function parameters. There's a reason for the parameter - the API requires it, but there's a reason to not use the parameter, because it's not needed in the implementation. Very common in stub functions which our simulator uses. So it's a lot of wasted effort (months) to fix up just warnings that are not actual errors or due to incorrect code. Other warnings popping up because of #ifdefs, which caus

                • by Uecker ( 1842596 )

Compilers can diagnose whatever they want and still be compliant. They cannot stop translating a strictly conforming program. What undefined behavior allows is that the compiler can extend the behavior in some way. For example, they can insert a run-time check.

              • Comment removed based on user account deletion
        • Not entirely. It may be radioactive. However your compiler vendor may choose to define precisely what it does. After all, that's entirely valid given the scope of UB. POSIX already defines what realloc() does as realloc(n, 0) is defined to be free(n). POSIX can continue to define it. So, if you ran your code on a POSIX compliant system, it would do what you expected. If not, then, well, you checked the docs before, right?

          • by Junta ( 36770 )

            Though I have seen that it's not *technically* identical
            This:
            pt = realloc(pt, 0)
            Will be considered to be a memory leak by debugging tools

            This:
            free(pt)
            won't.

C23 appears to take a stronger stance against it, but it's been a bit wobbly to rely upon in C as far back as C99. There's probably a reason why the author had to resort to an unrelated example to illustrate the 'corner cases of realloc' using a well known utility, it's likely he can't find a realloc example in respectable codebases that are well

            • by Uecker ( 1842596 )

C23 takes *no* stance by making it undefined. This just reflects reality: Different vendors always did different things, and nobody could rely on it. And if you relied on what your specific vendor did, you still can.

              • by Junta ( 36770 )

                I'd say it takes a stance against it. It was implementation-defined, now it's undefined. Yes, an implementation *can* still promise a certain behavior (and I imagine everyone will probably just keep doing what they were doing), but generally speaking undefined behavior is deliberately a bit more "don't do that" sort of guidance, compared to "implementation-defined" or "unspecified".

                • by Uecker ( 1842596 )

                  It is not a "don't do that" guidance, although it seems increasingly understood in this way. (related to general confusion around UB). This change was made in particular to allow POSIX to define it.

POSIX was ever so slightly different here than prior C standards. POSIX allows returning a null ptr while also setting errno. Ie, to POSIX this code generates an error. Prior C standards don't mention errno in this case.

          • by jeremyp ( 130771 )

            But POSIX doesn't define the return value from realloc(p, 0) beyond either NULL or a pointer to an object that can be freed.

            • But POSIX doesn't define the return value from realloc(p, 0) beyond either NULL or a pointer to an object that can be freed.

Yes? POSIX says you can free using realloc if you wish. That will likely keep working indefinitely because POSIX won't change the spec. You're ill advised to use the return value, I think.

    • Re:Ugh. (Score:4, Funny)

      by codebase7 ( 9682010 ) on Monday April 03, 2023 @09:04AM (#63421950)

      What we need is to minimize UB everywhere possible.

Sir, this is C. You know, the scary programming language?

      The one your parents, friends, and teachers warned you about?

      The one that every tabloid decries at least once a month?

      The one that has warning signs posted around every validation tester's office?

      The one that has that UB for cheap in a dark alley and unmarked white vans?

      The one that has every modern "programmer" clutching their balls as they run screaming in pure terror?

      If you absolutely need your compiler to define everything for you so your AI generated scrawl doesn't create a security vulnerability the size of Andromeda, you're in the wrong place.

    • Re:Ugh. (Score:4, Informative)

      by AmiMoJo ( 196126 ) on Monday April 03, 2023 @10:14AM (#63422182) Homepage Journal

      Undefined Behaviour often exists for performance reasons.

      The classic example is overflowing a signed type. On most CPUs it wraps around to the smallest negative value possible, but that is just how 2's complement binary numbers work and not all CPUs use that method for negative numbers.

So either you define some behaviour, and every CPU that doesn't work that way is crippled by having to check every signed calculation for overflows and implement whatever the language designer thinks is best, or you leave it undefined and the option is there for people to make platform-specific optimizations to their code, or rely on platform-specific behaviour.

The basic assumption is that the C programmer knows what they are doing. I have no idea if it's true, but my guess would be that most of the guys still writing C meet that assumption. C++ has less UB because, well, C++ coders... ;-)

    • by jeremyp ( 130771 )

      I'm not too sure about the realloc example. The man page on my Mac says

      If size is zero and ptr is not NULL, a new, minimum sized object is allocated and the original object is freed.

      That says to me its behaviour is already aligned with the new standard and I'm not sure if it ever wasn't.

      Furthermore, I don't think I've ever seen C code that uses the behaviour described by the article. The example of grep doesn't use it, it only uses the "passing in a null pointer is the same as malloc" behaviour. Also, the example stack is a bit bogus. A real implementation would never bother to realloc with less memory because you'r

Prior standards made this "unspecified", meaning systems can do what they want, and the code itself is not an error under the standard. POSIX specifies two allowed methods: what the Mac describes above, returning a minimally sized object that can be freed, or return NULL and set errno to an appropriate value.

        The new "undefined" means code doing this is now non compliant with the standard, ie, the code is erroneous. Even though it looks small, that's a big change in semantics.

        How often was this use of reall

        • by Uecker ( 1842596 )

"undefined" does not mean "erroneous". It simply means it is not defined by ISO C. Certain errors lead to undefined behavior, but in other cases it simply means the code is not portable. In this case, code using realloc with size zero was already not portable.

           

  • Don't worry (Score:4, Funny)

    by Comboman ( 895500 ) on Monday April 03, 2023 @07:25AM (#63421646)

    Constants true, false, and nullptr are now language keywords; mercifully, they mean what you expect.

    Don't worry, you can still hide #define ture 0 and #define flase 1 in a header somewhere to trip up your typo-ridden coworkers.

For all the C I've written over the years, it never occurred to me how dumb it was that they weren't built-in keywords, and that I had to define them every time or write my own headers to include in everything else I write to get something like that.

    What took you guys so long?

    Oh and congrats for finally getting those in there.

    • Because the C committee is even more conservative about adding keywords than C++. Some C programmers are automatically against anything that arrived in C++ first.
    • Why would you need the keywords when 0 and 1 already exist!

You didn't have to define them yourself, though you did have to include stdbool.h. The reason was the fear that people had created their own defines for true/false, since they were not protected keywords, so the committee decided to give those programmers this long a heads-up to change their code.
  • by dskoll ( 99328 ) on Monday April 03, 2023 @07:33AM (#63421666) Homepage

    The change to realloc is gratuitous and stupid. I hope that doesn't make it to the final standard... people need to speak up about that.

They are breaking existing code, including the simple "grep" command.


    $ echo foo | ltrace grep bar |& grep realloc
    realloc(0, 128) = 0x55a17f5596f0

In the new version this realloc pattern is no longer defined behavior. And this is only the tip of the iceberg.

    C standards should never break existing code, unless there is a really valid reason for it. Simplifying an interface is not a valid reason.

I'm not sure what they are thinking, but breaking existing code bases just for the sake of it is never the right thin

  • C needs to die (Score:2, Flamebait)

    by dsanfte ( 443781 )

    Well maybe that's a bit strong. It certainly still has a few legitimate uses. But outside of embedded systems and specialist applications, nobody should be using memory-unsafe languages like C or C++.

    Even the NSA agrees.

    https://www.itpro.co.uk/develo... [itpro.co.uk]

    • I don't see the point of either outside of specialized niches. Where performance and/or low level hardware access matters greatly, and memory safety less so. (If both matter, then I think Rust should be given serious consideration.)

All kinds of GUI applications for Linux are written using C or C++ against GTK, Qt, wxWidgets, SDL2, etc., and maybe I'm missing something but I don't see a reason for any of that. What about a typical GUI requires raw memory access? What about it requires raw pointers? What

      • by tepples ( 727027 )

        What about a typical GUI requires raw memory access? What about it requires raw pointers? What about it can be done more easily in C or C++ than, say, Python, presuming that there are good quality bindings?

        1. C or C++ is useful for making these "good quality bindings."
        2. C or C++ is useful for doing CPU-intensive data processing for which there isn't a ready-made accelerator extension on PyPI, especially on Windows where only one compiler (Visual C++, not GCC) is capable of building extensions compatible with the ABI of the interpreter distributed by Python.org.

  • Looks like a nice easy way to hide evil code in innocent-looking github sources?
  • by DarkOx ( 621550 ) on Monday April 03, 2023 @08:19AM (#63421780) Journal

    The main criticism is the changes to realloc when dealing with grow from zero or shrink to zero behaviors.

I agree with the article's author: the old behavior made managing such algorithms much, much simpler. That will be a huge step backwards. However, the other changes seem like really nice updates that reflect how the language is used and by whom today.

    If the realloc change affects you, write a wrapper function with a few quick tests for when to delegate to free or malloc instead, and ctrl-r your way back to sanity. If you are worried about the performance, don't be: the compiler is going to inline whatever you did if the code path is at all hot. And even if it is a hot path, if you are doing dynamic allocation there, you are not going to notice a few clocks of additional overhead; you may even win some back if the platform's realloc implementation gets slightly simpler.
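    A minimal sketch of such a wrapper (the name xrealloc is made up here, not anything standard): it pins down the two cases whose behavior C23 leaves undefined and delegates everything else to the library realloc.

    ```c
    #include <stdlib.h>

    /* Sketch of the wrapper suggested above. It nails down the
     * shrink-to-zero and grow-from-zero cases explicitly and passes
     * everything else through to realloc unchanged. */
    static void *xrealloc(void *ptr, size_t size)
    {
        if (size == 0) {        /* shrink to zero: act like free() */
            free(ptr);
            return NULL;
        }
        if (ptr == NULL)        /* grow from zero: act like malloc() */
            return malloc(size);
        return realloc(ptr, size);
    }
    ```

    The NULL-pointer branch is technically redundant (realloc(NULL, n) is already defined as malloc(n)), but spelling it out documents the intended semantics in one place.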

    • by Uecker ( 1842596 )

      The problem with realloc persisted for decades where different vendors did different things. You could not rely on it in portable code. So from a practical point of view, this is not a step backward, it is just acknowledging reality and not misleading the programmer.

      • by HiThere ( 15173 )

        It's a step backwards. They should have picked the best behavior and standardized on that. To me, that would mean having realloc free the allocation and return nullptr on zero-size requests.
        OTOH, that won't directly affect me, as I never use it. But it seems a really stupid decision.

        • by Uecker ( 1842596 )

          An international standards committee does not work like this. It is not a super government that decides on the best thing and then everybody else simply obeys. It works by finding consensus between different interests. In this case, different vendors implemented different things with no willingness to change for more than a decade (because it would break existing code for this environment). For portable C it was simply impossible to use realloc and this became a security problem. Making it undefined makes t

      • It breaks backward compatibility, and therefore existing code, and that should be done only rarely and with very good reason.

        I don't know C well enough to discern whether it was truly necessary. I think I understand the arguments both pro and con. I just don't know how to weigh them.

        Also, I hope that compilers will provide flags to keep the old behavior for a while, and warnings if the now-undefined construct is used. Otherwise people like me who use source-based Linux distributions will likely be very g

    • by Junta ( 36770 )

      Note that if, hypothetically, some compiler did decide to change what realloc does because C23 'frees them', then the problem arises when that compiler builds a shared library and suddenly existing binaries that have worked forever exhibit weird behavior.

      In practice, I doubt any implementation is jumping up and down to change the behavior of realloc (current behavior is as good as any if it is 'undefined'), but broadly be aware in C that backwards compatibility is key and does not rely upon everyone in the

  • What's the problem with realloc(ptr, 0)? It already had undefined behaviour for all intents and purposes: it could either return null or the address of a magic non-null pointer that you could free() multiple times. Now you could also have daemons flying out of your nose, in addition to those two options. Big deal.

  • ...Pedant maximum Reached... drowning in technical minutiae, "smartiness", and the seeding of the fields of street cred... love it!

  • stdbool.h (Score:4, Informative)

    by OrangeTide ( 124937 ) on Monday April 03, 2023 @10:24AM (#63422200) Homepage Journal

    Constants true, false, and nullptr are now language keywords; mercifully, they mean what you expect

    _Bool has been around since C99, with stdbool.h helpfully wrapping it as bool plus true and false macros. This was back when people wanted to avoid introducing new keywords that would break older programs. They still screwed that up with "restrict" in C99. But most (all?) of the rest in C99 and C11 were _[A-Z] style with an optional macro in a header (_Alignas, _Bool, _Complex, _Imaginary, _Static_assert, etc.).
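
    For reference, a minimal sketch of the pre-C23 spelling: bool, true, and false come from <stdbool.h> macros over the _Bool keyword, whereas in C23 they are language keywords with the same meaning and the header is no longer needed.

    ```c
    /* C99 spelling: bool/true/false are macros from <stdbool.h>.
     * In C23 this #include becomes unnecessary because bool, true,
     * and false are keywords. */
    #include <stdbool.h>
    #include <stdio.h>

    int main(void)
    {
        bool flag = true;
        printf("%d %d\n", flag, false);  /* prints "1 0" */
        return 0;
    }
    ```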

    And honestly, it takes so long for major platforms to pick up C standards, they wait until at least a similar feature lands in C++ and even then they aren't motivated to do much with C.

    I'm primarily a C programmer for a living, although I worked as a Go programmer for several years, and even I kind of think the ISO C panel should just retire. The language can be frozen for another 20 years; by then we can worry about whoever wins the current language wars. Probably not Go nor Carbon, Rust maybe? Nim and Zig are both interesting but I don't see any commercial push behind them. SPARK has already filled a niche at my company that once was exclusively for C/C++. Honestly, I don't really know who will win, I'm not a psychic, but I don't see any practical feature showing up in C that will suddenly reverse its decline.

  • There's a common belief held by "old" computer scientists that a computer can only do what it's programmed (by a person) to do.

    When "something" else is in charge, it's sort of like when Word keeps auto correcting what you're trying to type and you can't turn it off.

    Just saying.
    • I'm not that old, but I would tend to express that a bit differently.

      If the algorithm is simple enough that a human can easily predict it, the algorithm isn't all that useful (after all I could just do it myself). It's just largely a convenience feature.

      If the algorithm is complicated enough that the user can't easily predict it, the algorithm isn't all that useful, as you have no way to get predictable results and it may fail in catastrophic ways.

      Of course, clearly, neither of these statements is true

  • Constants true, false, and nullptr are now language keywords

    What about
    FileNotFound [thedailywtf.com]?
