
GCC Compiler Finally Supplanted by PCC?

Posted by ScuttleMonkey
from the all-good-things-must-end dept.
Sunnz writes "The leaner, lighter, faster, and, most importantly, BSD-licensed compiler PCC has been imported into OpenBSD's CVS and NetBSD's pkgsrc. The compiler is based on the original Portable C Compiler by S. C. Johnson, written in the late '70s. Even though much of the compiler has been rewritten, some of the basics still remain. It is currently not bug-free, but it compiles on the x86 platform, and work is being done on it to take on GCC's job."


  • by Anonymous Coward on Monday September 17, 2007 @12:11PM (#20637643)
    "NetBSD's" pkgsrc is really everyone's pkgsrc [wikipedia.org]. Try it on what you're running right now.

    It's my primary package manager on Interix, Mac OS X, Linux, and NetBSD.

  • by Anonymous Coward on Monday September 17, 2007 @12:12PM (#20637681)
    In the story on Undeadly, one of the comments states that he was able to build OpenBSD's source for bin, usr.bin, usr.sbin, and sbin with pcc. It seems only a little more work until at least OpenBSD can be built using it (PCC is still missing __attribute__ support and some asm stuff, IIRC).
  • by DiegoBravo (324012) on Monday September 17, 2007 @12:24PM (#20637879) Journal
    "So, really, there's no point in duplicating effort on a BSD licensed compiler. Correct me if I'm wrong."

    From the discussion of TFA:

    The licence is just the tip of the iceberg [undeadly.org]

  • by Aladrin (926209) on Monday September 17, 2007 @12:28PM (#20637961)
    This got modded funny, but I'm sure it deserves insightful instead.

    GCC compiles on a LOT of different architectures. Does PCC? Does it do as good a job at compiling? Can we plop our current GCC-compiled source on PCC and have it compile without huge headaches?

    And what about these bugs that are even referenced in the summary? How could it POSSIBLY supplant GCC if it's that buggy? In fact, how could it have supplanted GCC if it hasn't taken GCC's place AT ALL yet?

    Try these headlines:

    GCC Compiler Finally Has 'Free' Competition
    New Compiler To Supplant Gnu Compiler?
    Battle of the licenses: Does the license of your compiler MATTER AT ALL!?
  • by pigiron (104729) on Monday September 17, 2007 @12:36PM (#20638121) Homepage
    GCC has been continuously changed, not continuously improved. With each new chip it optimizes for, it seems to drop support for an older one. Plus, it is dog slow.
  • Re:Why? (Score:5, Informative)

    by TheRaven64 (641858) on Monday September 17, 2007 @12:56PM (#20638485) Journal
    License aside, the biggest benefit is that it's small. The OpenBSD (for example) base system is a self-contained operating system, including everything required to build it from source. It has the following requirements from its compiler:
    • Must compile C code (GCC does this).
    • Must support all of the platforms OpenBSD targets (GCC has a habit of dropping support for various platforms).
    • Must be easy to add new backends for new architectures (GCC makes this really hard).
    • Must be easy to audit for security (GCC is a tangled mess).
    Maybe a few I've missed. GCC is like Linux; it's a fairly good solution for a lot of problems, but it's rarely the best solution for any given problem. PCC is a better fit for the needs of the OpenBSD base system.
  • by TheRaven64 (641858) on Monday September 17, 2007 @12:59PM (#20638581) Journal
    There's also an option three:

    - It has different goals to GCC.

  • by bockelboy (824282) on Monday September 17, 2007 @01:05PM (#20638715)
    Gah, that's a bit of a flame. If you look at the architecture, there's good front-end/back-end separation. The front and back ends are necessarily extremely complicated, but you can rip out the entire front end and compile your code to GIMPLE, or whatever their SSA form is called. There have also been several different languages (Java, C, C++, Ada, Obj-C) with production front ends to GCC's back end.

    I do know that I wrote an intermediate analyzer for a semester-length class, along with another grad student. In fact, the prof suggested we use GCC because several others have done the exact same thing, going back many years. It's not easy, but it's possible.

    Sounds like whoever posted that was just extremely frustrated and wanted to blow off some steam. It can happen. GCC used to be a lot worse, and a lot further behind academia. There have been growing pains in the last couple of years getting it to "catch up". Perhaps that's what he was frustrated with?
  • Re:That's dumb. (Score:3, Informative)

    by TheRaven64 (641858) on Monday September 17, 2007 @01:05PM (#20638717) Journal
    PCC and LLVM have different goals. LLVM is intended to be a replacement for GCC with a modern architecture. PCC is intended to be a simple C compiler, not too heavy on the optimisation, but easy to audit and able to produce correct code. LLVM is about an order of magnitude bigger than PCC, making it much harder to audit. PCC would be great for the OpenBSD base system, while LLVM would make a good choice for compiling packages. It's all about choosing the right tool for the job.
  • by Richard_at_work (517087) <richardprice@nOSPam.gmail.com> on Monday September 17, 2007 @01:25PM (#20639107)
    At one point in GCC's history (in fact, not all that long ago), you couldn't even use GCC to do that - there was a big issue about Red Hat shipping a version of GCC by default that could not compile the Linux kernel; you had to install an earlier version if you wanted to do that.
  • by DreadSpoon (653424) on Monday September 17, 2007 @01:26PM (#20639115) Journal
    The biggest reason for the new compiler (despite the jackass article submitter's position) is that GCC does *NOT* support every architecture. GCC drops architectures frequently as the core contributors lose interest, which hurts OSes like NetBSD that try to support more than the mainstream architectures. NetBSD relies on a combination of GCC 2, 3, and 4 to compile the OS on all of the architectures it supports.

    The idea with PCC is not that it will be BSD licensed (nobody really gives a fuck what license the compiler is under), but that it will be supported directly by the BSD community, including the NetBSD hackers who have their bazillion architectures to support.
  • by DreadSpoon (653424) on Monday September 17, 2007 @01:37PM (#20639321) Journal
    First: PCC has not YET supplanted GCC. The BSDs are hoping it will in the future.

    Second: The biggest attraction of PCC is NOT the license. The article submitter who stated otherwise is a jackass.

    Third: There are technical reasons why GCC is actually unusable by some BSDs, such as NetBSD, which aims to support many architectures that GCC has dropped. NetBSD uses a combination of GCC 2, 3, and 4 to compile all of its different architectures. The NetBSD developers would rather have a single compiler that handles them all. Obviously, PCC is nowhere near that level yet.

    Fourth: GCC politics are a pain in the ass for many BSD developers who just want to submit patches to a compiler without the overhead of GNU's policies and GCC's management.

    Fifth: GCC produces crappy code more often than anyone would like. GCC bugs are far from unheard of, performance of generated code is often unpredictable between releases, and on many less commonly used architectures or sources GCC will produce incorrect code. Yes, these cases are very rare, but the BSD folks have hit the problem often enough for it to be a concern. PCC, being simpler and less bloated with cruft from multiple rewrites of the internals, will hopefully produce correct and predictable code more often than GCC.

    Sixth: PCC actually works today. It can compile most of the NetBSD userspace, as I recall, and the kernel will be ready to roll soon after some inline assembler problems are fixed. This isn't some theoretical hacky project - it works right now. It's not ready to replace GCC just yet, by any means, but it's a lot more than some Slashdotters seem to think it is.
  • Re:Interesting... (Score:3, Informative)

    by Chandon Seldon (43083) on Monday September 17, 2007 @02:00PM (#20639755) Homepage

    The GNU/Linux thing was kind of retarded given that Linux distributions feature code under a lot of different licenses, yet GNU is the only one that's mentioned.

    Let's at least get RMS's position right: The GNU project was founded in 1984 to create a free operating system. In 1991, they were almost completely finished - they had written every essential component of a Unix-like operating system except for a kernel. Linus came along, wrote the Linux kernel, combined it with the almost-complete GNU system, and called the whole thing Linux. The GNU people were rightly upset that they were getting no credit for their work (to build a complete Unix-like OS).

    The counter argument from Linus is that the term "Operating System" means "kernel", and that anything outside the kernel is just "userspace tools". That's a difficult position to defend - the simplest counter argument being that operating systems run programs and that Linux can't even run "Hello World" without GNU System components like GNU Libc.

  • by YU Nicks NE Way (129084) on Monday September 17, 2007 @02:22PM (#20640135)
    No: taking the BSD license OFF the code drives us nuts. You're free to incorporate our code and give us no credit. We object if you take that right from a downstream user. If you dual license, you don't take that freedom away. If you replace the license, you do.

    We want to encourage people to use our code if it's the best code for the task. Period. You want to undermine copyright. Well, you're free to do that, but some of us don't think that's a good idea, and others of us don't think it's that important.
  • by synthespian (563437) on Monday September 17, 2007 @02:26PM (#20640215)
    Here's the content (just so it stays in this Slashdot thread and gets archived here).

    Re: BSD Licensed PCC Compiler Imported (mod 21/25)
    by Marc Espie (213.41.185.88) (espie@openbsd.org) on Sun Sep 16 13:28:48 2007 (GMT)
    > > I am saying think this through, and carefully. Rewriting a giant suite of programs just because you don't agree with the philosophy behind it sounds awful to people who have no stakes in BSD licenses.
    >
    > It's not just the licence that is a concern about the GCC suite: dropping support for hardware that OpenBSD supports, fluctuating compilation quality, and the licence are all matters of concern to users.

    The licence is just the tip of the iceberg.

    GCC is developed by people who have vastly different goals from us. If you go back and read the GCC lists, you'll notice several messages by me where I violently disagree with the direction it's following. Here is some *more* flame material.

    - GCC is mostly a commercial compiler these days. Cygnus Solutions has been bought by Red Hat. Most GCC development is done by commercial Linux distributors, and also Apple. They mostly target *fast* i386 architectures and PowerPC. A lot of work has been done on specmarks, *but* the compiler is getting bigger and bigger, and slower and slower (very much so).

    - GCC warnings are not *really* useful. The -Wall flag reports many real issues, and quite a few spurious ones.

    - There is a lot of churn in GCC which ends up with it no longer supporting some architectures that are still relevant to us.

    - The whole design of GCC is perverted so that someone cannot easily extract a front-end or back-end. This is broken by design, as the GPL people do believe this would make it easier for commercial entities to `steal' a front-end or back-end and attach it to a proprietary code-generator (or language). This is probably true. This also makes it impossible to write interesting tools, such as intermediate analyzers. This also makes it impossible to plug old legacy back-ends for old architectures into newer compilers.

    - As a result, you cannot have the new interesting stuff from newer GCC without also losing stuff... every GCC update is an engineering nightmare, because there is NO simple choice. You gain some capabilities, and you also lose some important stuff.

    - It's also very hard to do GCC development. Their branching system makes it very likely that some important work falls between the cracks (and this happens all the time). If you develop code for GCC, you must do it on the most recent branch, which is kind of hard to do if your platform is currently broken (happens *all the time* if you're not running linux/i386). Even when you conform, it's hard to write code to the GNU coding standards, which are probably the most illegible coding guidelines for C. It's so obvious they were written by a lisp programmer. As a result, I've even lost interest in rewriting a few pieces and getting them into the GCC repository.

    - some of their most recent advances do not have a chance to work on OpenBSD, like preparsed includes, which depend on mmap() at a fixed location.

    - there are quite a few places in GCC and G++ where you cannot have full functionality without having a glibc-equivalent around.

    - some of the optimisation choices are downright dangerous, and wrong for us (like optimizing memory fills away, even if they deal with crypto keys).

    - don't forget the total nightmare of autoconf/libtool/automake. Heck, even the GCC people have taken years to update their infrastructure to a recent autoconf. And GCC is *the only program in the ports tree* that actually uses its own libtool. Its configuration and reconfiguration fails abysmally when you try to use a system-wide libtool.

    I could actually go on for pages...

    I've actually been de facto maintainer of GCC on OpenBSD for a few years by now, and I will happily switch to another compiler, so frustrating has been the road with GCC.
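    [Editor's note: Espie's point about dangerous optimizations — "optimizing memory fills away, even if they deal with crypto keys" — refers to dead-store elimination. A minimal C sketch of the hazard and a common workaround of the era (function names here are made up for illustration):

```c
/* Illustrative sketch of the dead-store hazard: the compiler may
 * legally delete a memset whose result is never read. */
#include <string.h>
#include <stddef.h>

/* Because `key` is never read after the memset, an optimizing
 * compiler may remove the call entirely, leaving the secret
 * bytes sitting in memory. */
void scrub_unsafe(unsigned char *key, size_t len) {
    memset(key, 0, len);
}

/* A common workaround: write through a volatile pointer, so the
 * stores count as observable side effects and cannot be elided. */
void scrub_volatile(unsigned char *key, size_t len) {
    volatile unsigned char *p = key;
    while (len--)
        *p++ = 0;
}
```

    OpenBSD later addressed this class of problem with explicit_bzero(), and C11 added memset_s() for the same reason.]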
  • Re:Interesting... (Score:4, Informative)

    by Goaway (82658) on Monday September 17, 2007 @03:11PM (#20640989) Homepage

    But modifying, even forking GCC is practical
    You haven't looked at the gcc codebase, have you?
  • Re:Interesting... (Score:3, Informative)

    by j-pimp (177072) <.moc.liamg. .ta. .1891yppiz.> on Monday September 17, 2007 @04:20PM (#20642211) Homepage Journal

    What would be really cool is to see someone like the OpenBSD crowd, if they're so keen on C, develop some verification tools that maybe only work on a very, very restricted subset of C. Any code which does not conform to this restricted "more easily verifiable" subset of C in the core OS would be rejected. I don't know how practical it would be, but it would be cool to see :). I mean as an academic, obviously I think we should all be using Z [wikipedia.org], but I understand this doesn't make good sense in a lot of real-world projects. But if you want to get serious about correctness, don't pussyfoot around: get serious about correctness.

    I think you miss the point. The OpenBSD people are married to Unix and C, in much the same way Bjarne believes the answer is to fix C++, not to use Java and C#. Yes, you're right, it would be nice if one of them "saw the light" and programmed in something where you just can't create a buffer overflow so easily. That being said, they should write the compiler/VM/interpreter/kernel that runs under that, as they have proven themselves to be among the few chosen by $DEITY to write decent C. Until a CPU is created where you can handle arrays in a non-dangerous manner at the machine-code level, someone will have to write in C or assembly.

  • by pthisis (27352) on Monday September 17, 2007 @04:20PM (#20642215) Homepage Journal

    Several years ago I tried making a Linux-to-Windows cross-compiler and failed.


    FWIW, if your issue was GCC politics that's pretty much what caused the egcs split (and re-merge eventually, with a new philosophy and maintenance crew in charge of GCC). So if you looked at things prior to the gcc 2.95 era, you were looking at a different set of maintainers (and politics/philosophy/etc) than what's there now.

    I think I put a decent amount of effort into my attempts, and I definitely knew how to produce a standard Linux-hosted, Linux-targeted instance of GCC that would produce working binaries. A few years later I installed Watcom, and while it did not support Linux, I could install already-working binaries that allowed me to compile DOS, Windows, OS/2, and NetWare binaries from my Windows machine.

    Now, the reasons for this are largely political. GCC works just fine as a cross-compiler; I'm sure today I could get it to work now that I have written a lot more code, compiled more tarballs, and generally know more than I did then. I was able to get a FreeBSD-to-Windows cross-compiler working just fine thanks to the ports collection. Watcom never got a ready-for-prime-time Linux compiler, but what they shipped to end users as "experimental" was always a Windows-hosted compiler targeting Linux.

    Now there is no technical reason that gcc or a third party can't make the cross-compiling process simpler, but other than poor college students who like to experiment, anyone who needs a cross-compiler either can do it themselves, can hire someone who can, or has to do a lot of hoop-jumping.


    Last time I did it it was pretty straightforward (as straightforward as cross-compilation can be), and the documentation included worked fine.

    The problem is that to get a full cross-compiler setup isn't just a gcc problem; you need a libc (with headers), and a linker (binutils) as well, and libc is a particular pain.

    I had no problems in the 2.96 era or thereabouts building a linux->windows cross-compiler using only the GCC-included instructions; I basically did:
    1. Build binutils (linker), using "./configure --target=i386-mingw32 --prefix=/usr/local" or whatever target you're using
    2. untar pre-built libc/headers in /usr/local
    3. Build gcc using "./configure --target=i386-mingw32 --prefix=/usr/local --with-gnu-as=i386-redhat-linux".

    The flags might be slightly wrong as that's from memory. Note that I didn't bother bootstrapping libc; if you want, that's also doable; see, e.g., http://www.libsdl.org/extras/win32/cross/README.txt [libsdl.org] if you want a simple hand-holding script to do it for you.
  • Re:Interesting... (Score:3, Informative)

    by pthisis (27352) on Monday September 17, 2007 @04:37PM (#20642493) Homepage Journal

    But modifying, even forking GCC is practical
    You haven't looked at the gcc codebase, have you?


    It may not be forkable for the average person off the street, but if actual compiler developers are unhappy enough with something then forking GCC is certainly practical and we have plenty of examples of that happening.

    egcs forked gcc effectively enough that the fork displaced the original. Several bounds-checking gcc forks have been used in production systems. Apple and others have periodically forked (and sometimes re-merged).
  • by VGPowerlord (621254) on Monday September 17, 2007 @05:16PM (#20643127) Homepage
    As I recall, that was version 2.96 [gnu.org], which was actually the development branch for 3.0. Not surprisingly, development versions have bugs, which is why they shouldn't be used by mainstream users.
  • by YU Nicks NE Way (129084) on Monday September 17, 2007 @09:15PM (#20646073)
    Our license entails the statement "If you distribute this source code, then any recipient can take it private. If you don't redistribute this source code, then you can do anything you please." What that means is that those who choose to redistribute the source (e.g. you folks) have burdens not borne by those who take the code private.

    Yes, that really does mean what it says -- you can take our code private as a binary without giving us a thing, but if you preserve the source, you have to give us credit.
  • Re:Interesting... (Score:1, Informative)

    by Anonymous Coward on Tuesday September 18, 2007 @12:59AM (#20647727)
    Do you have any idea how huge GCC is?

    If you want a simple, straightforward, bug-free compiler, you're better off writing one from scratch. It's actually not that hard - a non-optimizing (or minimally optimizing) C compiler is pretty simple (although multi-platform support is a PITA).

    If you turn off optimizations in GCC, the generated code is absolutely awful (and not much more likely to be bug-free than optimized code), compared to simplistic compilers that are designed to generate reasonably efficient code without the need for complicated optimizations.
  • by Secret Rabbit (914973) on Tuesday September 18, 2007 @02:15AM (#20648193) Journal

    And what about these bugs that are even referenced in the summary? How could it POSSIBLY supplant GCC if it's that buggy? In fact, how could it have supplanted GCC if it hasn't taken GCC's place AT ALL yet?
    Easy. Read the rest of the summary.

    and work is being done on it to take on GCC's job.
