Programming

Fighting the Culture of 'Worse Is Better' 240

An anonymous reader writes: Developer Paul Chiusano thinks much of programming culture has been infected by a "worse is better" mindset, where trade-offs to preserve compatibility and interoperability cripple the functionality of vital languages and architectures. He says, "[W]e do not merely calculate in earnest to what extent tradeoffs are necessary or desirable, keeping in mind our goals and values -- there is a culture around making such compromises that actively discourages people from even considering more radical, principled approaches." Chiusano takes C++ as an example, explaining how Stroustrup's insistence that it retain full compatibility with C has led to decades of problems and hacks.

He says this isn't necessarily the wrong approach, but the culture of software development prevents us from having a reasoned discussion about it. "Developing software is a form of investment management. When a company or an individual develops a new feature, inserts a hack, hires too quickly without sufficient onboarding or training, or works on better infrastructure for software development (including new languages, tools, and the like), these are investments or the taking on of debt. ... The outcome of everyone solving their own narrow short-term problems and never really revisiting the solutions is the sea of accidental complexity we now operate in, and which we all recognize is a problem."
  • by Viol8 ( 599362 ) on Tuesday October 14, 2014 @08:39AM (#48139507) Homepage

    Back in the day, the clue was in the name. If it wasn't compatible but simply similar, then it would have been called something else. Java, perhaps.

  • by NotDrWho ( 3543773 ) on Tuesday October 14, 2014 @08:41AM (#48139517)

    It's easy for a programmer to say "We should stop worrying so much about compatibility and interoperability" when they don't have to deal with customers, support, or actually selling the end product. When a customer calls up and says, "Hey, how come this new version of Windows doesn't work with any of my old Windows software?" you can't just tell them "Because our programmers thought it was better to get a fresh start."

    • When a customer calls up and says, "Hey, how come this new version of Windows doesn't work with any of my old Windows software?" you can't just tell them "Because our programmers thought it was better to get a fresh start."

      "Hey, how come this new version of Mac OS doesn't work with any of my old Mac OS 9 software?", said Mac users in response to Classic support being dropped with the release of Mac OS X 10.5.

      "Hey, how come this new version of OS X doesn't work with any of my old PowerPC software?", said Mac users in response to Rosetta being dropped with the release of OS X 10.7.

      Both of those are from just the last 7 years, and I wouldn't be surprised if we could rattle off more, both for OS X and iOS. The fact is, you can te

      • by tlhIngan ( 30335 )

        When a customer calls up and says, "Hey, how come this new version of Windows doesn't work with any of my old Windows software?" you can't just tell them "Because our programmers thought it was better to get a fresh start."

        "Hey, how come this new version of Mac OS doesn't work with any of my old Mac OS 9 software?", said Mac users in response to Classic support being dropped with the release of Mac OS X 10.5.

        "Hey, how come this new version of OS X doesn't work with any of my old PowerPC software?", said Mac

        • If you want to see what happens when Microsoft actually tries a fresh start, see Windows Vista, where UAC introduced unprivileged-by-default operation (breaking the many apps that assumed users were admins and bombarding their users with dozens of elevation dialogs).

          Yeah, but obviously that would break things; look at how little warning they gave developers. They only released APIs and standards that UAC plays well with back in 2001, with Windows XP. Surely that's not enough time to modify their code.

          Seriously, I rememb

        • Near as I can tell, you've pointed out an additional difference between Microsoft and Apple, rather than addressing or contradicting anything I was discussing.

          Basically, while Apple does indeed slap developers for misusing APIs, as you stated, it also frequently deprecates features and APIs that are working as intended while their customers are still using software that's dependent on those features and APIs, which is what I was pointing out. Both Classic and Rosetta were working as intended and were being

      • I can answer these, as I was there.

        "Hey, how come this new version of Mac OS doesn't work with any of my old Mac OS 9 software?", said Mac users in response to Classic support being dropped with the release of Mac OS X 10.5.

        Because Apple was unwilling to port the Classic 68K emulator to Intel because of the difference in processor byte order, among other things, making such a port not worthwhile in terms of performance of the Classic software. The user experience would have been crap, and so the decision was made by upper management to not support Classic going forward on Intel.

        For the PPC versions of Classic, they could have been supported under Rosetta, but it would have meant an approxima

    • For the most part, we get these compatibility issues from someone trying to make the previous system do something it wasn't designed to do well, so they did some hacks to get it to work.
      When we went from DOS to Windows, the idea of common drivers came into play. Before that, you needed to make assembly calls to support your devices.

      Now the real troublemakers are the salesmen who push the product as something to solve all your needs, when it was made to solve particular problems.

  • Huh (Score:5, Insightful)

    by JanneM ( 7445 ) on Tuesday October 14, 2014 @08:42AM (#48139531) Homepage

    So.. preserving backwards compatibility and interoperability across versions is a bad thing? If he's unhappy with the feature set of C++ (and I wouldn't blame him for that), then how about simply picking up a different language instead? That's what a new, non-compatible C++ version would be in any case.

    Look at how great it has worked out for Python. It's been six years since the only mildly incompatible version 3 was released, and it has still not managed to become dominant over the legacy version 2. A more radical break would almost certainly have had an even tougher road ahead.

    • And the stupid thing is that everything Python 3 changed was something the language desperately needed.

      Exactly one function (print) parsed as a statement instead of a function?
      Exception-handling syntax different from every other block format?
      Strings that weren't Unicode by default?

      I mean, you're right that Python 3 never truly recovered from the rocky transition, but those were all things the language desperately needed.

    • > A more radical break would almost certainly have had an even tougher road ahead.

      This is why we are still waiting for Perl 6, if it ever gets released.

      It's just too radical a break from what the Perl community expects. Even if you've never used Perl, you can tell from the drastic difference between Perl 5's dependable Camel [wikimedia.org] and Perl 6's Camelia [wikimedia.org].

      • by JanneM ( 7445 )

        > This is why we are still waiting for Perl 6, if it ever gets released.

        I suspect in the case of Perl 6 (and perhaps also for Python) it may have been better to give the language a new name, and allow even more radical changes. Keeping the name strongly signals that it's still the same language. Breaking compatibility is exactly what makes it a different one.

    • by LWATCDR ( 28044 )

      Once you change a language enough, it is no longer the same language.
      I just do not see the issue. If you want C++ without C, create a new language that fits that description. Apple created Objective-C without the C and called it Swift. So make C++ without the C and call it Sure.

    • Paul Chiusano is in the wrong job or is too young to remember why we geezers are so concerned about compatibility.
      • Hmm... correcting myself: after reading TFA (I know, I know, bad idea), I haven't seen anywhere that the author writes something clearly against compatibility; it seems to me a misinterpretation by the anonymous reader who created the summary.
    • Python can't hold a candle to Perl in this regard. The Perl 6 design process started in 2000.

  • Seems like someone (or a consortium of someones) should take C++, drop the C compatibility requirement, make whatever "cleanup" changes that allows, and call it C+++. Just make sure there's a module ready to go for gcc.
    • Seems like someone (or a consortium of someones) should take C++, drop the C compatibility requirement, make whatever "cleanup" changes that allows, and call it C+++. Just make sure there's a module ready to go for gcc.

      Yeah, and basically you'd also clean up the syntax, since if you're breaking compatibility then you may as well make it parsable. What you get is D or Rust.

      Both of them seem like fine languages.

      However, I always end up reaching for C++, because then I can reuse my old code and keep using my libra

      • Haven't looked at D in any detail, but I was under the impression it was more different from C++ than what I had in mind. I was imagining C+++ as staying as close as possible to C++, but with whatever modest improvements are enabled by the omission of C compatibility. But maybe that's in fact what D is.
  • by Afty0r ( 263037 ) on Tuesday October 14, 2014 @08:48AM (#48139569) Homepage

    "Technical Debt is a thing, people"

    • I am rather saddened that "TLDR" is a thing, and a big one these days...
      • by mcgrew ( 92797 ) *

        It's always been that way. Fewer than 1% of Americans are illiterate, but something like 97% are aliterate. When I was in school, very few kids wore glasses; there were no computers or cell phones, and the TV was across the room. Reading (or any other close-up work) at a young age makes you nearsighted.

  • I had never heard the term before, had to wiki it, then realized it was a fundamental of software engineering that I was taught decades ... ago :( so old
    Don't create app functionality that the user won't use. More bells and whistles mean more dev time, more test time, and more to maintain; it's a simple concept.
    That's why apps now have functionality metrics (Firefox seems really big on them, for example).
  • Worse is better is basically the KISS principle for software. C++ is not an example of that, but C is. This guy is an idiot.
    • by mcgrew ( 92797 ) *

      Worse is better is basically the KISS principle for software.

      Huh? It seems the opposite of the KISS principle. Making my old code no longer work is hardly keeping it simple. Most modern, ugly web design is as confusing and complicated and bloated as they can make it.

      I always followed the KISS principle, it's the easiest way to write bug-free code. Hard to write a buggy version of "hello, world".

      This guy is an idiot.

      I certainly don't disagree there.

      • This guy has no clue what 'worse is better' actually means, despite linking to a Wikipedia article explaining it, hence why I'm calling him an idiot. One of the most commonly cited examples of worse-is-better design is Unix. Instead of having a few large, comprehensive programs, Unix systems and Unix-like systems have a ton of small programs that each do one thing. cd is 'worse' than explorer.exe because it doesn't do as much, but it's better because if something is broken with cd, you just fix cd a lot easier
  • by QuietLagoon ( 813062 ) on Tuesday October 14, 2014 @08:49AM (#48139579)
    "Worse is better" is little more than Chiusano's opinion of what is happening.

    So he thinks that compatibility and interoperability are not features he likes. OK, I'm OK with that.

    However, that is his opinion, nothing more, nothing less.

    There are reasons why interoperability and compatibility are desired. Providing those characteristics is not the easiest path; on the contrary, it is easier to just say "screw compatibility, screw interoperability," and you'll probably finish your task more quickly.

    So then the question becomes, why do people invest extra effort in order to assure interoperability and compatibility?

    ...which we all recognize is a problem....

    And now he presumes to speak for everyone....

    Overall it sounds like he just got out of a bad meeting in which someone told him that his opinions are not worth the air used to utter them, and now he's trying to convince the world that he is right and the world is wrong.

  • Take a look at evolution. Sexual reproduction has so many hurdles to jump through before a beneficial mutation can find a toehold. In asexual reproduction, individuals can rapidly and radically adjust to the changing environment and pass on the beneficial mutations to the next generations. They produce teeming masses of viruses, bacteria, fungi, insects, and at most reptiles. (A confirmed case of parthenogenesis in a shark sent shock waves through biologists. But I think it has never happened among mam
    • by mcgrew ( 92797 ) *

      Take a look at evolution. Sexual reproduction has so many hurdles to jump through before a beneficial mutation could find a toehold. In asexual reproduction individuals can rapidly and radically adjust to the changing environment and pass on the beneficial mutations to the next generations.

      The late writer and biochemist Isaac Asimov would have disagreed with you [bestlibraryspot.net] vehemently. Asimov held a PhD in biochemistry and did cancer research at Boston University.

      The above linked short sci-fi story was originally title

  • Simple != worse (Score:5, Insightful)

    by pla ( 258480 ) on Tuesday October 14, 2014 @08:56AM (#48139631) Journal
    Once upon a time, I wrote "clever" code. Truly beautiful, almost poetic in its elegance. Note I said "elegance", not "simplicity".

    I don't know who to credit for this (probably read it on Slashdot), but a single perspective completely changed the way I view coding:
    It takes substantially more effort to debug than it does to write code in the first place. If, therefore, I write code as clever as I possibly can - I can't effectively debug it (without investing far more time than I should) if something changes or goes wrong.

    Now, that doesn't mean "worse is better"... I can still produce good code; I can even still write the occasional clever function when performance demands it. But for the 99.9% of code that has almost no impact whatsoever on performance, I can just say "if X then Y else Z" rather than using cool-but-cryptic bitmasking tricks to avoid executing a conditional instruction. And hey, whaddya know, I can actually read it at a glance six months later, rather than praying I didn't forget to update my comments.
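
    As a small illustration of the trade-off being described (a sketch with invented function names, not anything from the comment above):

    #include <climits>

    // Readable version: obvious at a glance, and a modern optimizer will
    // usually emit branch-free code for it anyway.
    int abs_readable(int x) {
        return (x < 0) ? -x : x;
    }

    // "Clever" version: the classic bitmask trick. It avoids a conditional,
    // but six months later it needs a comment to explain itself, and it
    // quietly relies on arithmetic right shift and non-INT_MIN inputs.
    int abs_clever(int x) {
        int mask = x >> (sizeof(int) * CHAR_BIT - 1);  // all ones if x is negative
        return (x + mask) ^ mask;
    }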


    On the flip side of this, a few weeks ago I helped a friend put together a spreadsheet with a few complex formulas in it. I love me some SUMIFS, arguably the best new feature of Excel in the past decade. Note that clause, "in the past decade". This weekend, she called me because her nice helpful spreadsheet wouldn't work - on Excel 2003. It seems that while 2003 has SUMIF, MS didn't add SUMIFS until 2007. The choice of one seemingly harmless backward-compatibility-breaking function made the whole thing useless in a given context. Now, in fairness, I can hear you all screaming "just upgrade already!"... But in the real world, well, we still have people using Windows 95.
    • I'm with you on this. As a programmer, the thing I hate the most is "Gee, Mom, look what I can do!" code -- obtuse code written to impress rather than be simple, obvious and functional. And yes there are indeed times when something mind-bendingly complex is needed to achieve the required goal, but by and large, the KISS principle applies. As to the article's main point, I have to ask what is the purpose of breaking backward compatibility: Making it faster to produce readable, easily maintainable code, being

      • by Zalbik ( 308903 )

        I agree, but I also think there needs to be a balance between KISS and other principles (e.g. DRY). I've come across developers who use KISS as an excuse to be lazy.

        I recently came across some code from a colleague: hundreds of lines filling in object properties from data in a spreadsheet. Each property being filled in was coded as a separate line, calling one of five different routines based on the data type to be parsed.

        I asked "why didn't you just add a configuration (or just an array)

    • by u38cg ( 607297 )
      That stuff is why they invented pivot tables.
    • I don't know who to credit for this
      ...
      If, therefore, I write code as clever as I possibly can - I can't effectively debug it

      Based on your quote, probably (originally) Don Knuth.

      “Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.” -Don Knuth.

    • by Jeremi ( 14640 )

      But for the 99.9% of code that has almost no impact whatsoever on performance, I can just say "if X then Y else Z" rather than using cool-but-cryptic bitmasking tricks to avoid executing a conditional instruction.

      ... and even in that other 0.1% of the time, it's likely that your compiler will optimize the pretty human-readable code into the cool-but-cryptic bitmasking trick at the assembly level anyway. There's no need for the human programmer to do that sort of obfuscatory wizardry at the source code level, when the compiler can do it for him -- and likely do it more reliably as well, since compiler writers pay more attention to what is strictly language-legal vs. what-seems-to-sort-of-work-today.

      • and even in that other 0.1% of the time, it's likely that your compiler will optimize the pretty human-readable code into the cool-but-cryptic bitmasking trick at the assembly level anyway

        That's almost universally untrue. The 99.9% is made up of the union of "code that executes infrequently enough" and "code that the compiler can auto-optimize."

        Now, predicting what's in that 0.1% is tricky, which is why it is often better to optimize later after profiling reveals it. And may someone protect you from me

    • by Garen ( 246649 )

      It sounds like what you're referring to may be a reference to an oft-quoted piece from Brian Kernighan (originally via "The Elements of Programming Style"):

      “Everyone knows that debugging is twice as hard as writing a program in the first place. So if you’re as clever as you can be when you write it, how will you ever debug it?”

  • I think the author's original focus on C++ as an example of "worse is better" is a sad distraction. Clearly C++ was designed with the goal of being compatible with C. There are plenty of examples of languages that took on object-oriented programming but threw away backwards compatibility with C as a design goal: D, Java, and C# come to mind.

    That said, I think he does have an interesting point about our unwillingness to sit down and carefully consider our response to problems as they arise during dev

    • Clojure is designed to be compatible - not backwards compatible, but intercalling compatible - with Java. The consequence is that a Clojure program can crash out of stack when it still has masses of heap. Why? Well, the JVM was designed for small embedded devices which would run small programs that weren't expected to do a lot of recursion; and they were low-power with limited memory, so allocating the stack as a vector was seen as an efficiency win. The fact that most of the time we don't run Java on small embed

  • So Paul, is this a plug for your book or yet another argument for functional programming?

    Not much useful information or examples as to why or where "Worse is better" is harming the world of computing. There is always a better tool to help solve a problem. But for many reasons they may or may not be appropriate. You the programmer should know when and where to use them.

    One of my favorite quotes (the bold text at the end is the good part):

    Plan 9 failed simply because it fell short of being a compelling enough

  • I think we are seeing fresh starts. We've seen a shift away from desktop applications towards the web as a common framework, which has become remarkably more useful. And now we are seeing moves towards mobility. Both web and mobility have forced a genuine separation between view systems and business logic that in the desktop world was a goal often not met. Then, with the rise of devops, architectures are becoming even more factory-component-like, where the parts are interchangeable and thus design mistakes can be corr

  • Does anyone have any idea of the hacks he's talking about?

    Since C++ is intended to be, and has always been, a superset of C, how could there be any problems and hacks caused by compatibility with C? How could it be any better by discarding part of the language itself?
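
    (One wart that often gets cited in these discussions, offered here purely as an illustration and not drawn from TFA: C++ kept C's rule that arrays decay to pointers at function boundaries, silently losing their size.)

    #include <cstdio>

    // "values" is really just a const int*; the array's length is gone,
    // so it has to be passed separately.
    void print_all(const int values[], int n) {
        for (int i = 0; i < n; ++i) {
            std::printf("%d\n", values[i]);
        }
    }

    int main() {
        int data[4] = {1, 2, 3, 4};
        // sizeof(data) is 16 here, but inside print_all the parameter is a
        // pointer -- a classic source of off-by-one and buffer bugs.
        print_all(data, 4);
        return 0;
    }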

  • by Greyfox ( 87712 ) on Tuesday October 14, 2014 @09:20AM (#48139809) Homepage Journal
    I've been finding C++ pretty nice lately. Back when I started, the STL and templates weren't really a thing. They just added classes to C and called it good, and working with it then was a pretty shitty experience. I wandered off for a while, and when I came back they'd added the STL, which provided some badly-needed data structures and language capabilities. Between that and Boost and the tweaks to the syntax they made in C++11, it doesn't really feel like a bad language to work with. It can still be pretty bad if a programmer that's new to it goes crazy with operator overloading or tries to write it like Java or C. The streams library also feels pretty clunky, but you can always fall back to using the C file I/O or possibly Boost's asio library if you hate it.

    A problem I've been seeing lately is that everyone seems to think software is carved in stone. In the past 3 or 4 years I've heard a LOT of excuses why some flaw in one system or another made a feature impossible. In these cases, fixing the flaw would be pretty trivial. Instead of doing THAT, people just build another layer of crap on top of the previous layer of crap and try to kind-of get something working. Code is not immutable. If it doesn't do something you need it to do, MAKE it do what you need it to do. Write a library, redesign a layer, simplify an interface, whatever. Don't just wring your hands and make the problem worse! Code is made to change. No design is ever perfect right from the start. If you try to make your design perfect from the start, you'll just end up paralyzed, afraid of doing anything because you might do it wrong. Start with a design that seems reasonable and adjust it as needed. Write small, decoupled libraries that can support that, and write unit tests to ensure that each component behaves as expected. It's really not that hard, people!

    • You must work in a pretty dedicated corner of open source to think that you can actually get all your flaws fixed. I myself write and post patches to fix bugs, and my patches are ignored like the bug doesn't exist. I've got a nice suite of patched open source projects, since the flaw was in my way and I needed it gone. Getting other people to do stuff you want, even if it's as simple as accepting a patch to make their own project work better, is non-trivial. 100x as much when you're dealing with proprietary code
    • by Jeremi ( 14640 )

      Code is not immutable. If it doesn't do something you need it to do, MAKE it do what you need it to do. Write a library, redesign a layer, simplify an interface, whatever.

      I completely agree in principle, but in practice, the more software that is using the current version of the code, the more things will break when you change the design. That has the effect of making the code less malleable, proportional to the number of its dependents.

      So for a function that is used only by your own program, it's no problem at all. For an in-house library that is used in several programs across your company, it's a bit of a hassle but doable. For a new computer language that is being use

  • Python (Score:4, Insightful)

    by Daniel Hoffmann ( 2902427 ) on Tuesday October 14, 2014 @09:24AM (#48139825)

    How did breaking compatibility work out for Python 3?

    • by jythie ( 914043 )
      Pretty badly. My project has pushed back upgrading for something like 5 years now, and it is doubtful we ever will. The only thing that could push us over is if some project we depend on switches exclusively to it, and the scientific Python libraries show little interest in doing that.
  • by Karmashock ( 2415832 ) on Tuesday October 14, 2014 @09:40AM (#48139939)

    If being perfect means not having critical core features then you're confused about what is and is not perfect. Compatibility is important. In many applications it is vital. Period - end of story. Does maintaining compatibility make the project more complicated? Yep. Coding is hard.

    Next issue.

  • Interoperability is king.
    I don't care how awesome your new application is: if I can't get my data into it and out of it, it's worthless to me. I've been down this road. If I have to start a massive project just to start using your application, or plan for a massive one to stop using it, that's a cost to me... a big one. And if my people have to go to training just to be able to use it because you didn't want to bother meeting a standard... or I have to hire people straight out of college? Again, that's a huge

  • If you don't like the mess in C++, find a better language and use that. C++ is actually the most popular language for performance-critical code? Hm, I wonder how that happened? Because of or despite its C compatibility?

    The world is full of bad technology that is popular because its version 1.0 was really popular at the time. The fact we still use all of it says something about the market for technology; apparently, backwards compatibility to a fault makes for more long-term popular systems than do-it-right-

  • by jythie ( 914043 ) on Tuesday October 14, 2014 @09:49AM (#48140023)
    A phrase I picked up years ago seems to apply here: "standard is better than the best solution". It is not a culture of 'worse is better', but of competing interests, where the pure joy of developing languages is not the primary metric for determining the best course of action.
  • Without worse-is-better, you make sure the job is completely done before release. So it takes years to make progress, and you end up building an extremely complex system to cover every possibility. Because of that very complexity, it is difficult to extend for new requirements, which become apparent after the system is specified but before it is built.

    If you want to build a nuclear power plant control system or something equally as critical and unchanging, sure, go ahead and engineer everything out the wazoo

  • by DavidHumus ( 725117 ) on Tuesday October 14, 2014 @10:25AM (#48140361)

    So far, I don't think I've seen a single comment here that got the point of the essay.

    He's not talking about incremental "improvements" to existing languages; he's pointing out that the common attitude of "we'll make this language easy to learn by making it look like C" is a poor way to achieve any substantial progress.

    This is true but everyone who's invested a substantial amount of time learning the dominant, clumsy, descended-from-microcode paradigm is reluctant to dip a toe into anything requiring them to become a true novice again.

    I've long been a big fan of what are now called "functional" languages like APL and J - wait, hold on - I know that started alarm bells ringing and red lights flashing for some of you - and find it painful to have to program in the crap languages that still dominate the programming ecosystem. Oh look, another loop - let me guess, I'll have to set up the same boilerplate that I've done for every other loop because this language does not have a grammar to let me apply a function across an array. You want me to continue doing math by counting on my fingers when I've got an actual notation that handles arrays at a high level, but I can't use it because it's "too weird". (end rant)
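
    (Roughly the difference being complained about, sketched in C++ terms; this is an invented example, and still far clunkier than the one-line APL/J equivalent:)

    #include <algorithm>
    #include <vector>

    // Apply one function across a whole array, with no per-loop boilerplate.
    std::vector<double> scaled(const std::vector<double>& xs, double k) {
        std::vector<double> out(xs.size());
        std::transform(xs.begin(), xs.end(), out.begin(),
                       [k](double x) { return k * x; });
        return out;
    }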

    There have been any number of studies - widely ignored in the CS world - going back decades (see this http://cacm.acm.org/magazines/... [acm.org]) - pointing out how poorly dominant programming memes mesh with the way most people think about problems and processes. Meanwhile, the 1960s called - they want their programming languages and debugging "techniques" back - "printf", anyone?

    • by serviscope_minor ( 664417 ) on Tuesday October 14, 2014 @11:09AM (#48140725) Journal

      Meanwhile, the 1960s called - they want their programming languages and debugging "techniques" back - "printf", anyone?

      What's wrong with printf?

      printf has some very nice features. Firstly, you get the history of what happened at previous iterations of your algorithm, right there, which is something you don't get with a stepping debugger. Secondly, you can use a second language, such as awk, to process it very easily to find things out.

      I find debugging numeric code without printf to be a major pain. Sometimes it's worth plotting things a bit more dynamically, but you can do that with awk + gnuplot on the stream of data coming out of printf.
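
      (A minimal sketch of that workflow; the solver here is an invented toy, and the awk lines just show the kind of post-processing meant:)

      #include <cstdio>

      //   ./solver | awk '$1 == "iter" && $4 > 1.5 { print }'
      //   ./solver | awk '$1 == "iter" { print $2, $4 }' > trace.dat   (then plot with gnuplot)
      int main() {
          double x = 100.0;
          for (int i = 0; i < 20; ++i) {
              x = 0.5 * (x + 2.0 / x);                 // Newton's method converging to sqrt(2)
              std::printf("iter %d x %.12f\n", i, x);  // full history, unlike a stepping debugger
          }
          return 0;
      }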

  • All of my competitors should adopt the author's philosophy of software development immediately. His ivory tower FP idealism is worthy of emulation by all.

    I will keep muddling through based on years of experience, leveraging existing code and know-how, maintaining backwards compatibility, planning long-term changes that sometimes take years to complete, deprecating unneeded features in as non-disruptive a manner as possible. And then, when the opportunity arises to do something radically different (like wi

  • Backward compatibility. Which btw Microsoft has broken a time or three.
  • silliness (Score:5, Interesting)

    by silfen ( 3720385 ) on Tuesday October 14, 2014 @11:06AM (#48140707)

    To accuse the C++ community of not having engaged in "reasoned discussion" about backwards compatibility is silly. Chiusano may not like the tradeoffs that C++ makes (I don't), but they are the result of a glacially slow and tedious community process and discussions. Whatever C++ is, it is by choice and reflection. Furthermore, "worse is better" refers to keeping things simple by cutting corners, and you really can't accuse C++ of keeping things simple.

    (Charges about too much backwards compatibility are ironic from someone who promotes Scala, a language that makes many compromises just in order to run on top of the JVM and remain backwards compatible with Java.)

  • by PPH ( 736903 ) on Tuesday October 14, 2014 @11:07AM (#48140711)

    ... the Wayland/systemd PR team is shifting into high gear.

  • Backwards compatibility is a good thing, but too much of it can be bad. There are glaring errors in languages which have not been fixed in the name of some faux backwards compatibility argument.

    I'll start with one that was fixed.... thirty years after the fact, all the while arguing that it couldn't be fixed, in the name of backward compatibility. The example? ANSI C gets the right answer for sqrt(2) while the original C didn't (with no function prototypes, the integer literal 2 was passed as an int where the library expected a double).

    An example of things that C got wrong and have yet to be fixed? the bug inducing

  • The outcome of everyone solving their own narrow short-term problems and never really revisiting the solutions is the sea of accidental complexity we now operate in, and which we all recognize is a problem.

    It strikes me that this describes our economic system as well.
