Bug Programming

Celebrating 30th Anniversary of the First C++ Compiler: Let's Find Bugs In It 153

New submitter Andrey_Karpov writes: Cfront is a C++ compiler that came into existence in 1983 and was developed by Bjarne Stroustrup ("30 YEARS OF C++"). At the time, the language was known as "C with Classes". Cfront had a complete parser, symbol tables, and built a tree for each class, function, etc. Cfront was based on CPre. Cfront defined the language until circa 1990, and many of the obscure corner cases in C++ are related to Cfront's implementation limitations, because Cfront worked by translating C++ to C. In short, Cfront is a sacred artifact for a C++ programmer, so I just couldn't help checking such a project [for bugs].
  • by __aaclcg7560 ( 824291 ) on Thursday November 05, 2015 @11:19AM (#50871009)
    I'm sure I got a PDP-8 somewhere in my back closet.
    • Just in case you need a space heater?

      That must be some huge closet. The first VAX 11/780 I saw was the size of two side-by-side ovens.

      • Since the first C++ compiler compiles to C code, a PDP-8 would be needed to test backward compatibility of the C code.
      • by Anonymous Coward

        Shit, I was an ENIAC once back in the forties. You should have seen me. I was freaking huge!

      • I'll see your VAX 11/780 and raise you a Data General Eclipse MV9600U with a 9-track tape drive and a CPU cabinet taking up quite a lot of space.
    • I am going to bet you that the 3B2 [wikipedia.org] was the primary computer architecture for cfront.

      However, it does appear that cfront was extremely portable [wikipedia.org]:

      As Cfront was written in C++, it was a challenge to bootstrap on a machine without a C++ compiler/translator. Along with the Cfront C++ sources, a special "half-preprocessed" version of the C code resulting from compiling Cfront with itself was also provided. This C code was to be compiled with the native C compiler, and the resulting executable could then be used t

  • by Anonymous Coward

    "Sacred Artifact"? Are you kidding?

    I will happily agree that the language and compilers were both pretty awful back then. The worst warts have been beaten out by now, and it's a pretty capable language in the main, but that's now. 1990s C++ was horrible.

  • by Anonymous Coward

    This "new submitter" is the "science adviser" of the company who wrote that blog post, of which the main point is to sell you their product.

    • by Anonymous Coward

      The funny thing is that on reddit, this guy and his employees have been blogspamming his product for years, hiding it in posts like "analysis of this project", or "finding bugs in that project."

      Despite having multiple shill accounts like "Coder_CPP" and "resistor54" (among others), the community barely catches on and we have to suffer through cleverly placed product placement every few days.

      Part of how they've gotten away with it for so long is that their reputation is high enough that they have their own subreddit http://r

      • by tibit ( 1762298 )

        That's true, but even if I wanted to throw money at them, I can't. They don't offer a stand-alone product. It's a Visual Studio plug-in. I don't use Visual Studio, other than for the compilers, so - meh.

  • by Anonymous Coward

    Before C++ came around, wasn't C just a glorified macro assembler?

    • by grimmjeeper ( 2301232 ) on Thursday November 05, 2015 @11:58AM (#50871315) Homepage

      Before C++ came around, wasn't C just a glorified macro assembler?

      Not exactly. It was intended to be an actual compiled language, but one with as little overhead as possible and the ability to touch the hardware easily. It's considered a "middle-level language": low-level languages would be the various assemblers and whatnot, while high-level languages (i.e. most other compiled languages) deliberately abstract away the low-level functionality to make writing applications easier. C was designed specifically as an in-house tool at Bell Labs for rewriting Unix and escaped sometime in the late 60's/early 70's. It was and always will be a compiled language, albeit one with only a very small base of core functionality.

      When they wrote it, Kernighan and Ritchie discarded most of the overhead that came with other languages. The tight linking of pointers to arrays, together with pointer arithmetic, simplified the compiler while providing the bare minimum of array functionality. The lack of pretty much any built-in functions made the language simple and compact. Putting all that functionality in libraries meant you included only the pieces you used; if you didn't need something, the language didn't force you to link it in. The language gave you just the few pieces that were absolutely necessary and you were responsible for the rest. The standard libraries that evolved after that are what gave C the ability to be a general-purpose language.

      C is a great language for writing small, tight, efficient, low-level programs as long as you know what you're doing and are willing to work with just a few small, sharp tools. It still has its place in embedded systems and for writing operating systems (or at least the kernel). Beyond that, it's really quite limited. But no, it's not just a macro assembler. It is more than that.
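
      A minimal sketch of the pointer/array relationship described above (illustrative only, written in C-compatible C++, not taken from the thread):

        #include <stdio.h>

        int main(void) {
            int a[4] = {10, 20, 30, 40};
            int *p = a;             /* an array name decays to a pointer to its first element */

            /* a[i] is defined as *(a + i), so all four expressions read the same element */
            printf("%d %d %d %d\n", a[2], *(a + 2), p[2], *(p + 2));   /* 30 30 30 30 */

            /* pointer arithmetic steps in units of the pointed-to type, not bytes */
            printf("%d\n", (int)((p + 2) - p));                        /* 2 */
            return 0;
        }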

      • by lgw ( 121541 )

        Meh, the only thing C gives you over a good macro assembler is parsing of arithmetic expressions and a somewhat processor-independent syntax. It's a nice layer of abstraction now that compiler optimization is so good, but originally it was a real pain if you needed performance. (No one can hand-code optimal assembly against a modern architecture, so the performance arguments have swapped ends these days.)

        • When C was first created, there was little to no compiler optimization available. The lack of overhead in the C language meant that the compiler added little to no bloat over the code that was written. What you wrote is what you got.

          Of course, that meant the inefficiency and bloat came in when average programmers started to implement more complex data and algorithms without knowing how to do it efficiently...

          • by lgw ( 121541 )

            The abstraction in C made writing hand-optimized code a guessing game in the early days. Rather than just writing what you wanted, you had to guess what would be compiled into what you wanted, and check the object code to see if you guessed right. How to rotate a register, how to get both the result and remainder of a single division instruction, that sort of thing. Often people just inlined the desired assembly.

            "What you wrote is what you got" was not how veteran assembly devs saw it - there was a lot o

            • Well sure. Any time you had a project that already mixed C and assembly, there already was a lot of detailed work to manipulate the processor very specifically and no small amount of just inlining the assembly code rather than figuring out how to write C syntax. And I've done my share of that, mostly early in my career. But compared to high level languages, C offered substantial efficiency improvements at the cost of having to "roll your own" on just about everything you wanted to do. And more often tha
              • by lgw ( 121541 )

                Sure, I get your point, but there were plenty of fast low-level (or "mid-level" if you want to call it that) languages back in the day, which have mostly faded into obscurity or oblivion now*. C was popular and lasting IMO because it was straightforward, because it felt like a sort of cross-platform macro-assembler. Just a few arcane quirks to figure out for the instructions C didn't represent directly.

                *My worst coding project ever was supporting legacy PL/S [wikipedia.org] code. (Made extra fun since IBM never released

                  Yeah, I still don't know what made C stand out over other languages. I never really got to know other languages that were around at the time. I bet the ability to cross-compile between platforms was a significant contributor. That, and it didn't obfuscate much, so you mostly knew what was going on for every line of code you wrote.

                  It still amazes me the legacy that an escaped Bell Labs experiment like C has had with syntax of so many languages. C++, Java, C#, csh, Perl, Ruby, et al trace their

                  • by david_thornley ( 598059 ) on Thursday November 05, 2015 @03:10PM (#50872767)

                    Way back when, there were lots of different architectures, and they had their own C-level languages (CDC system programmers used Cybol, for example). C is the one that survived, because it was the language of Unix, and Unix caught on. It's multiplatform because Unix is. That gave it staying power. Nobody uses Cybol anymore, because it was tied to an architecture that just died.

      • by jrumney ( 197329 )

        The lack of pretty much any built in functions

        There's sizeof(), but that may have been a later addition.

        • The sizeof() function may have the syntax of a function but it's evaluated at compile time. It doesn't actually compile into a function call.
          • by jrumney ( 197329 )
            So what is a "built in function", if not one that is evaluated by the compiler rather than being defined in a library or by a macro in a header file?
            • It is exactly what it says. It is an actual function (i.e. executable code module) but it's built into the language and provided in the base linked package. C supplies defined functions as part of a library that is not part of the core language. That's the difference.
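
              A small illustration of the compile-time point above (an assumed example, not from the thread): sizeof participates in constant expressions, so no call is ever emitted.

                #include <cstdio>

                struct Point { double x, y; };

                // Evaluated entirely by the compiler; nothing here exists at run time.
                static_assert(sizeof(Point) >= 2 * sizeof(double), "sizeof is a constant expression");

                int main() {
                    char buffer[sizeof(Point)];    // legal array bound: the value is known at compile time
                    std::printf("%zu %zu\n", sizeof buffer, sizeof(Point));
                    return 0;
                }
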
    • Yea, so? But it hid the details of the processor programming model from the programmer, so C programs became CPU-independent; all you needed to do was create a cross compiler for your new CPU. It's just abstracting away the hardware dependence of macro assemblers...

      Java is just a way to abstract away the memory management issues of C++ (well it's at least partly that..)

      • Re: (Score:3, Interesting)

        by lgw ( 121541 )

        You do realize there are no "memory management issues" in modern C++, right? Java abstracts away the memory management issues of C, and of C++ written like C.

        The advantage of Java and C# is the easier learning curve. C++ can be bizarre and arcane, while the managed languages are simpler to get right, and so you can far more easily hire programmers who won't screw everything up.

        • Oh yes, there are memory management issues in C++; you still have to take care of them unless you only use stack and static storage. There are memory management issues in Java too, but most don't understand why they exist or how to avoid them (thankfully most don't run into the issues, but they are there).

          What you are talking about is that we have ABSTRACTED memory management tasks by going through libraries and templates. The "problem" hasn't been "fixed", all we've done is make it easy for programmers to n

          • by lgw ( 121541 )

            What you are talking about is that we have ABSTRACTED memory management tasks by going through libraries and templates. The "problem" hasn't been "fixed", all we've done is make it easy for programmers to not think about it if they wish.

            Yeah, sure, that's all any programming language ever is: layers of abstraction over assembly. My point was, from a practical perspective, you don't worry about memory leaks in modern C++ code any more than you do in modern Java code. But of course Java only offers the "no leaks" way, while in C++ you must discover the "no leaks" way yourself, which gives Java the easier learning curve. (You can still get memory leaks with anything, of course, if you try hard enough, but I'm talking about the 99% case here).

            But that doesn't mean C++ is old and arcane, nor is it bizarre

            Placement ne

            • What you are talking about is that we have ABSTRACTED memory management tasks by going through libraries and templates. The "problem" hasn't been "fixed", all we've done is make it easy for programmers to not think about it if they wish.

              Yeah, sure, that's all any programming language ever is: layers of abstraction over assembly. My point was, from a practical perspective, you don't worry about memory leaks in modern C++ code any more than you do in modern Java code. But of course Java only offers the "no leaks" way, while in C++ you must discover the "no leaks" way yourself, which gives Java the easier learning curve. (You can still get memory leaks with anything, of course, if you try hard enough, but I'm talking about the 99% case here).

              Um, like it or not, memory leaks still exist in Java and it's not that hard to stumble into code that causes them. The problem is that most people don't understand the parts that are abstracted away, they don't think they need to, so it's easy to stumble over your shoelaces and not know why. Yea, Java makes it a bit harder to bleed memory, but it doesn't eliminate the problem.

              But that doesn't mean C++ is old and arcane, nor is it bizarre

              Placement new. Weak references. Static initialization quirks (so many lock management and reference counting bugs caused by that over the years!). A Turing-complete templating language, so that you can turn your compiler run into a game of Tetris. Heck, just the fundamental need to understand that vectors will copy their members as they grow, and so you need to use the appropriate kind of smart pointer for that (which has changed, what, 3 times now over the life of the language).

              Bizarre and arcane. All there for good reason, to solve problems that can't otherwise be solved, but bizarre and arcane.

              One man's garbage is another's treasure. You need to understand what you pay for all this convenience. I'm not saying Java doesn'
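
              A minimal sketch of the vector-growth point quoted above (my own illustration, not part of the discussion): growth reallocates and copies/moves the elements, so raw pointers into the buffer dangle, while holding elements through smart pointers keeps the pointees stable.

                #include <cstdio>
                #include <memory>
                #include <vector>

                int main() {
                    std::vector<int> v;
                    v.push_back(1);
                    int *raw = &v[0];                      // points into the vector's current buffer

                    for (int i = 0; i < 1000; ++i)
                        v.push_back(i);                    // growth reallocates; `raw` is now dangling
                    (void)raw;                             // dereferencing it here would be undefined behavior

                    // Storing shared_ptr instead: growth copies the (small) pointers,
                    // but the pointed-to objects never move.
                    std::vector<std::shared_ptr<int>> sv;
                    sv.push_back(std::make_shared<int>(1));
                    std::shared_ptr<int> keep = sv[0];
                    for (int i = 0; i < 1000; ++i)
                        sv.push_back(std::make_shared<int>(i));
                    std::printf("%d\n", *keep);            // still valid after all the reallocations
                    return 0;
                }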

              • by lgw ( 121541 )

                Dude, you need to be less sensitive if you think I was running down C++ at the expense of Java. Don't make it a religious issue, you won't find happiness along that path.

        • There are memory management issues. Passing around auto_ptr and unique_ptr is tricky. shared_ptr can be a performance drag, as its memory accesses aren't necessarily local. Smart pointers can create a circular data structure just like in Java, except that good garbage collection in Java will collect that structure once it's inaccessible. If you're doing more than just mechanically using smart pointers (placement new, for example), there are other hassles.

          I love the language, but I can't recommend using
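
          A small illustration of the reference-cycle point above (an assumed example): two shared_ptr-linked nodes keep each other alive forever, where a tracing collector would reclaim them; a weak_ptr back-reference is the usual fix.

            #include <cstdio>
            #include <memory>

            struct Node {
                std::shared_ptr<Node> next;    // a strong link in both directions creates a cycle
                // std::weak_ptr<Node> next;   // a weak back-link would let the cycle be destroyed
                ~Node() { std::puts("destroyed"); }
            };

            int main() {
                auto a = std::make_shared<Node>();
                auto b = std::make_shared<Node>();
                a->next = b;
                b->next = a;    // use counts are now 2 and 2; they can never reach zero
                return 0;       // nothing is printed: both Nodes leak, unlike under a tracing GC
            }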

          • except that good^H^H^H^H every garbage collection in Java^H^H^H^H any language will collect that structure once it's inaccessible.
            FTFY.

          • by tibit ( 1762298 )

            auto_ptr doesn't exist, as far as anyone sane is concerned. unique_ptr is not passed around, it's moved around. shared_ptr is supposed to be used with make_shared.

            I fully agree that using C++ without understanding what's going on will catch up with you, sooner or later. I have an engineer with very little programming background to train in C++, and all I can say is that it requires that I pay a lot of attention to detail, so as not to teach destructive overgeneralizations or oversimplifications, while tea
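
            A short sketch of the two conventions mentioned above (illustrative only; make_name and take_ownership are made-up names): unique_ptr ownership is transferred with std::move rather than copied, and shared_ptr is normally created via make_shared.

              #include <cstdio>
              #include <memory>
              #include <string>

              // A unique_ptr cannot be copied; returning it moves ownership out to the caller.
              std::unique_ptr<std::string> make_name() {
                  return std::make_unique<std::string>("cfront");
              }

              void take_ownership(std::unique_ptr<std::string> p) {
                  std::printf("%s\n", p->c_str());
              }   // the string is destroyed here, when its last owner goes away

              int main() {
                  auto name = make_name();
                  take_ownership(std::move(name));    // explicit transfer; `name` is empty afterwards

                  // make_shared does one allocation for the object and its control block together.
                  auto s = std::make_shared<std::string>("30 years");
                  auto s2 = s;                        // copying a shared_ptr just bumps the use count
                  std::printf("%s (use_count=%ld)\n", s2->c_str(), (long)s.use_count());
                  return 0;
              }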

        • I.e., you can hire dumber programmers in order to make dumber programs.

    • I can see why you're anonymous.

    • Yes, it was. However, they did not call it a "glorified macro assembler"; they called it "portable".

      No idea why so many answers to you pretend it wasn't.

      The inventors of C defined it that way.

      OTOH: what is wrong with that :D

  • by Anonymous Coward

    1979 - Work on "C with classes" started
    At that time, "Object Oriented programming was considered too slow, too special purpose, and too difficult for ordinary mortals."

    Well, I'm glad C++ fixed that last problem!

    • by umghhh ( 965931 )
      Now you are responsible for the coffee on my keyboard and parts of my display, and frankly this is tragic rather than funny. I still laugh, though...
  • Translating to C would not impose a limitation on the language features of C++; it's possible to generate whatever C code you need to support C++ features. Using an LALR parser would introduce limitations on language design, however. This was once very common.

    • by lgw ( 121541 ) on Thursday November 05, 2015 @12:45PM (#50871703) Journal

      Translating to C would not impose a limitation on the language features of C++

      Practical limitations, for one guy banging out a language implementation in a hurry. C syntax was kept intact wherever possible, so that no translation would be needed. Which in turn led to quick adoption of C++ by C coders (which doomed C++ code to forever be ruined by C-style coding).

      • by tibit ( 1762298 )

        doomed C++ code to forever be ruined by C-style coding

        This is very true. The especially troubling aspect of how C++ is usually taught is that they start with the C feature set and then "build" on top of that. It's completely ass-backwards, and I hate these teachers with a passion. I then have to undo the damage they've done to the impressionable students that then work with me...

        • by lgw ( 121541 )

          I've basically given up professional C++ over this. I'd love a smaller project where I get to set the coding standards for a team small enough that I could actually CR everything and keep the rot from setting in, but somehow I've migrated to the big guys where that sort of team doesn't exist.

  • by mveloso ( 325617 ) on Thursday November 05, 2015 @12:09PM (#50871403)

    CFront wasn't a compiler; it was a preprocessor that spat out C code that was subsequently compiled by whatever C compiler you happened to have.

    Looking at CFront output was the best way to understand how C++ actually worked at the time, since it was all mapped to pretty straightforward C constructs. I don't think anyone around today knows what a vtable and a ptable are, but back then it was how you could tell the programmers who really dug into the language from those who didn't.

    • by Kjella ( 173770 )

      Looking at CFront output was the best way to understand how C++ actually worked at the time, since it was all mapped to pretty straightforward C constructs. I don't think anyone around today knows what a vtable and a ptable are, but back then it was how you could tell the programmers who really dug into the language from those who didn't.

      A ptable I've got no idea about, but a vtable is used whenever you use a virtual function, and it costs an extra pointer dereference compared to a plain function call. And I never learned C, but I had to learn that much to make inheritance work like I wanted in C++.
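
      Not Cfront's actual output, but a rough sketch of the kind of C a Cfront-style translator could emit for a virtual call (names like Shape_vtbl are made up for illustration; the code also compiles as C++): the "vtable" is a per-class table of function pointers, every object carries a hidden pointer to its class's table, and the virtual call becomes the extra pointer dereference mentioned above.

        #include <stdio.h>

        /* C++ being modelled:
         *   struct Shape  { virtual double area() const; };
         *   struct Circle : Shape { double r; double area() const override; };
         *   double f(const Shape *s) { return s->area(); }   // virtual call
         */

        struct Shape;
        typedef struct {
            double (*area)(const struct Shape *self);   /* one slot per virtual function */
        } Shape_vtbl;

        typedef struct Shape {
            const Shape_vtbl *vptr;                     /* hidden pointer stored in every object */
        } Shape;

        typedef struct {
            Shape base;                                 /* base subobject laid out first */
            double r;
        } Circle;

        static double Circle_area(const struct Shape *self) {
            const Circle *c = (const Circle *)self;     /* recover the derived object */
            return 3.14159265358979 * c->r * c->r;
        }

        static const Shape_vtbl Circle_vtbl = { Circle_area };

        double f(const struct Shape *s) {
            return s->vptr->area(s);                    /* the extra indirection of a virtual call */
        }

        int main(void) {
            Circle c = { { &Circle_vtbl }, 2.0 };
            printf("%f\n", f(&c.base));                 /* area of a radius-2 circle */
            return 0;
        }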

    • Certainly every professional C++ programmer I've ever worked with knows what a vtable is. I have no idea what a ptable is though - never heard the term before.

      • Looking at the article, there are Pclass, Ptable, Pname... there's no real documentation explaining them, but it's probably just a reminder to the programmer that the variable represents a "Pointer to" something.

        It's frowned on today to put a variable's type in the name (look up Hungarian Notation), but in the old days, C allowed implicit type casting, which meant that the compiler would not blink if you assigned a non-pointer variable to a pointer variable. It made for some...interesting bug hunting back

        • It still makes for bug hunting today. See the article today about 32-bit vs 64-bit Windows. Assign a pointer (to an int) to an int and it works just fine in a 32-bit build. Compile for 64-bit and fire up your debugger. Most compilers will generate a warning for this, but there are often so many warnings (from things that used to be considered a sign of smartness) in any project of appreciable size that cleaning them all up is a project in and of itself, and you don't know which ones will destabilize your 64
          • by tlhIngan ( 30335 )

            It still makes for bug hunting today. See the article today about 32-bit vs 64-bit Windows. Assign a pointer (to an int) to an int and it works just fine in a 32-bit build. Compile for 64-bit and fire up your debugger. Most compilers will generate a warning for this, but there are often so many warnings (from things that used to be considered a sign of smartness) in any project of appreciable size that cleaning them all up is a project in and of itself, and you don't know which ones will destabilize your 64-bit

            • I don't think that anybody would (intentionally) do this today. There are very few "green field" projects out there, though. Everybody has millions of lines of existing code that was written before those features existed (C++11 and C++14 were the first updates in a decade). I go to places where many of the files are still in 8.3 format, all uppercase letters, from a time when the file systems were limited that way. Systems are large, and you may call an API that uses a fairly modern interface but it wr
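
              A minimal sketch of the 32/64-bit pitfall described above (an assumed example): a plain int can no longer hold a pointer on 64-bit targets, and uintptr_t is the portable way to round-trip one.

                #include <cstdint>
                #include <cstdio>

                int main() {
                    int value = 42;
                    int *p = &value;

                    // The old habit was to stuff a pointer into a plain int. With 32-bit pointers
                    // both are 4 bytes and it "works"; with 64-bit pointers the high bits are lost,
                    // which is why modern compilers warn about pointer-to-int truncation.

                    // The portable spelling: uintptr_t is an integer type wide enough for a pointer.
                    std::uintptr_t bits = reinterpret_cast<std::uintptr_t>(p);
                    int *back = reinterpret_cast<int *>(bits);
                    std::printf("%d\n", *back);    // 42: the round-trip through uintptr_t is well defined
                    return 0;
                }
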
    • I don't know if it was CFront, but the first C++ programs I wrote were for homework assignments in the late 1980's. The compiler output was unreadable, of course, because it wasn't compiling my code. It was compiling a C++ to C translation of my code. All I really knew was that there was a problem with my program *somewhere* at the line, or above it, where the first error message appeared. It was an exercise in agony management.

      I remember thinking, "what an awful language!" at the time. Of course

      • I was a bit luckier, in that I learned the language using Turbo C++. It was a fantastic compiler/IDE for DOS. I actually still have very fond memories of working on my hobby projects in that environment. Later in my schooling, using gcc on Unix workstations felt really primitive by comparison. By the time I turned pro, it was pretty much all MSVC.

        What's funny (or sad, I suppose) is that, to this day, template-generated error messages are as horrible as ever. Hopefully we'll either get concepts or some

  • I remember when it first came out. Back in the days of BBS's

    The old joke was

    "Have you seen the new C+++"

    I wonder how many will still get the joke?

    • Hmm, slashdot ate the punchline.

      "Have you seen the new C+++ No Carrier"

      Had to drop the greater-than and less-than signs. Wow, how times have changed. lol

      • I needed the "No Carrier" reminder and then I got the joke...Wow, yes I remember those days.

      • by narcc ( 412956 )

        Times haven't changed, <things> are exactly the same as they were previously. ><

        It is a tradition for users, regardless of how long they've been here, to forget something simple like how to use < and > in a forum post. I don't know how many times I've seen someone complain that Slashdot ate this or that bit of their post. Should have used the preview button...

        Just use &lt; instead of typing a < and you'll be fine. You can also use &gt; in place of simply typing a >, i

  • It's the monthly PVS-Studio slashvertisement. In this case, only the last words of the summary (within parentheses, at that) suggested what this was about, and it took a couple of paragraphs into TFA before it was mentioned, but sure enough, it was your regular post showing PVS-Studio output from a random open source software project.

    Seriously, this is getting old.
