GCC Compiler Finally Supplanted by PCC?
Sunnz writes "The leaner, lighter, faster and, most importantly, BSD-licensed compiler PCC has been imported into OpenBSD's CVS and NetBSD's pkgsrc. The compiler is based on the original Portable C Compiler by S. C. Johnson, written in the late '70s. Even though much of the compiler has been rewritten, some of the basics still remain. It is not yet bug-free, but it compiles on the x86 platform, and work is underway to have it take over GCC's job."
One OT clarification: (Score:2, Informative)
It's my primary package manager on Interix, Mac OS X, Linux, and NetBSD.
Re:"Nothing for you to see here" indeed... (Score:1, Informative)
The licence is just the tip of the iceberg (Score:3, Informative)
From the discussion of TFA:
The licence is just the tip of the iceberg [undeadly.org]
Re:Not for NetBSD for sure (Score:5, Informative)
GCC compiles on a LOT of different architectures. Does PCC? Does it do as good a job at compiling? Can we plop our current GCC-compiled source on PCC and have it compile without huge headaches?
And what about these bugs that are even referenced in the summary? How could it POSSIBLY supplant GCC if it's that buggy? In fact, how could it have supplanted GCC if it hasn't taken GCC's place AT ALL yet?
Try these headlines:
GCC Compiler Finally Has 'Free' Competition
New Compiler To Supplant Gnu Compiler?
Battle of the licenses: Does the license of your compiler MATTER AT ALL!?
"production-quality"? - not quite (Score:2, Informative)
Re:Why? (Score:5, Informative)
Re:Stupid waste of time (Score:3, Informative)
- It has different goals to GCC.
Re:The licence is just the tip of the iceberg (Score:3, Informative)
I do know that I wrote an intermediate analyzer for a semester-length class, along with another grad student. In fact, the prof suggested we use GCC because several others have done the exact same thing, going back many years. It's not easy, but it's possible.
Sounds like whoever posted that was just extremely frustrated and wanted to blow off some steam. It can happen. GCC used to be a lot worse, and a lot further behind academia. There have been growing pains in the last couple of years getting it to "catch up". Perhaps that's what he was frustrated with?
Re:That's dumb. (Score:3, Informative)
Re:"Nothing for you to see here" indeed... (Score:4, Informative)
not every architecture (Score:5, Informative)
The idea with PCC is not that it will be BSD licensed (nobody really gives a fuck what license the compiler is under), but that it will be supported directly by the BSD community, including the NetBSD hackers who have their bazillion architectures to support.
sensationalist bullshit yet again (Score:5, Informative)
Second: The biggest attraction of PCC is NOT the license. The article submitter who stated otherwise is a jackass.
Third: There are technical reasons why GCC is actually unusable by some BSDs, such as NetBSD, which aims to support many architectures that GCC has dropped. NetBSD uses a combination of GCC 2, 3, and 4 to compile all of its different architectures. The NetBSD developers would rather have a single compiler that handles them all. Obviously PCC is nowhere near that level yet, of course.
Fourth: GCC politics are a pain in the ass for many BSD developers who just want to submit patches to a compiler without the overhead of GNU's policies and GCC's management.
Fifth: GCC produces crappy code more often than anyone would like. GCC bugs are far from unheard of, performance of generated code is often unpredictable between releases, and on many less commonly used architectures, or for some sources, GCC will produce incorrect code. Yes, these cases are very rare, but the BSD folks have hit the problem often enough for it to be a concern. PCC, being simpler and less bloated with cruft from multiple rewrites of the internals, will hopefully produce correct and predictable code more often than GCC.
Sixth: PCC actually works today. It can compile most of the NetBSD userspace, as I recall, and the kernel will be ready to roll soon after some inline assembler problems are fixed. This isn't some theoretical hacky project - it works right now. It's not ready to replace GCC just yet, by any means, but it's a lot more than some Slashdotters seem to think it is.
Re:Interesting... (Score:3, Informative)
Let's at least get RMS's position right: The GNU project was founded in 1984 to create a free operating system. In 1991, they were almost completely finished - they had written every essential component of a Unix-like operating system except for a kernel. Linus came along, wrote the Linux kernel, combined it with the almost-complete GNU system, and called the whole thing Linux. The GNU people were rightly upset that they were getting no credit for their work (to build a complete Unix-like OS).
The counterargument from Linus is that the term "Operating System" means "kernel", and that anything outside the kernel is just "userspace tools". That's a difficult position to defend - the simplest counterargument being that operating systems run programs, and that Linux can't even run "Hello World" without GNU system components like GNU libc.
Re:Stupid waste of time (Score:4, Informative)
We want to encourage people to use our code if it's the best code for the task. Period. You want to undermine copyright. Well, you're free to do that, but some of us don't think that's a good idea, and others of us don't think it's that important.
Re:"Nothing for you to see here" indeed... (Score:5, Informative)
by Marc Espie (213.41.185.88) (espie@openbsd.org) on Sun Sep 16 13:28:48 2007 (GMT)
> > I am saying think this through and carefully. Rewriting a giant suite of programs just because you don't agree with the philosophy behind it sounds awful to people who have no stakes in BSD licenses.
>
> It's not just the licence that is a concern with the GCC suite: its dropping support for hardware that OpenBSD supports, its fluctuating compilation quality, and its licence are all matters of concern to users.
The licence is just the tip of the iceberg.
GCC is developed by people who have vastly different goals from us. If you go back and read the GCC lists, you'll notice several messages by me where I violently disagree with the direction it's following. Here is some *more* flame material.
- GCC is mostly a commercial compiler, these days. Cygnus Solutions has been bought by Red Hat. Most GCC development is done by commercial Linux distributors, and also Apple. They mostly target *fast* i386 architectures and PowerPC. A lot of work has been done on specmarks, *but* the compiler is getting bigger and bigger, and slower and slower (very much so).
- GCC warnings are not *really* useful. The -Wall flag reports many real issues, along with quite a few spurious ones.
- There is a lot of churn in GCC which ends up with it no longer supporting some architectures that are still relevant to us.
- The whole design of GCC is perverted so that someone cannot easily extract a front-end or back-end. This is broken by design, as the GPL people do believe this would make it easier for commercial entities to `steal' a front-end or back-end and attach it to a proprietary code-generator (or language). This is probably true. This also makes it impossible to write interesting tools, such as intermediate analyzers. This also makes it impossible to plug old legacy back-ends for old architectures into newer compilers.
- As a result, you cannot have the new interesting stuff from newer GCC without also losing stuff... every GCC update is an engineering nightmare, because there is NO simple choice. You gain some capabilities, and you also lose some important stuff.
- it's also very hard to do GCC development. Their branching system makes it very likely that some important work is falling between the cracks (and this happens all the time). If you develop code for GCC, you must do it on the most recent branch, which is kind of hard to do if your platform is currently broken (happens *all the time* if you're not running linux/i386). Even when you conform, it's hard to write code to the GNU coding standards, which are probably the most illegible coding guidelines for C. It's so obvious it was written by a lisp programmer. As a result, I've even lost interest in rewriting a few pieces and getting them into the GCC repository.
- some of their most recent advances do not have a chance to work on OpenBSD, like preparsed includes, which depend on mmap() at a fixed location.
- there are quite a few places in GCC and G++ where you cannot have full functionality without having a glibc-equivalent around.
- some of the optimisation choices are downright dangerous, and wrong for us (like optimizing memory fills away, even if they deal with crypto keys).
- don't forget the total nightmare of autoconf/libtool/automake. Heck, even the GCC people have taken years to update their infrastructure to a recent autoconf. And GCC is *the only program in the ports tree* that actually uses its own libtool. Its configuration and reconfiguration fails abysmally when you try to use a system-wide libtool.
I could actually go on for pages...
I've actually been de facto maintainer of GCC on OpenBSD for a few years by now, and I will happily switch to another compiler, so frustrating has been the road with GCC.
Re:Interesting... (Score:4, Informative)
Re:Interesting... (Score:3, Informative)
What would be really cool is to see someone like the OpenBSD crowd, if they're so keen on C, develop some verification tools that maybe only work on a very, very restricted subset of C. Any code in the core OS which does not conform to this restricted "more easily verifiable" subset of C would be rejected. I don't know how practical it would be, but it would be cool to see :). I mean as an academic, obviously I think we should all be using Z [wikipedia.org], but I understand this doesn't make good sense in a lot of real-world projects. But if you want to get serious about correctness, don't pussyfoot around: get serious about correctness.
I think you miss the point. The OpenBSD people are married to Unix and C, in much the same way as Bjarne believes the answer is to fix C++, and not to use Java and C#. Yes, you're right, it would be nice if one of them "sees the light" and programs in something where you just can't create a buffer overflow so easily. That being said, they should write the compiler/VM/interpreter/kernel that runs under that, as they have proven themselves to be among the few chosen by $DEITY to write decent C. Until a CPU is created where you can handle arrays in a non-dangerous manner at the machine-code level, someone will have to write in C or assembly.
Re:"Nothing for you to see here" indeed... (Score:3, Informative)
FWIW, if your issue was GCC politics that's pretty much what caused the egcs split (and re-merge eventually, with a new philosophy and maintenance crew in charge of GCC). So if you looked at things prior to the gcc 2.95 era, you were looking at a different set of maintainers (and politics/philosophy/etc) than what's there now.
Last time I did it it was pretty straightforward (as straightforward as cross-compilation can be), and the documentation included worked fine.
The problem is that to get a full cross-compiler setup isn't just a gcc problem; you need a libc (with headers), and a linker (binutils) as well, and libc is a particular pain.
I had no problems in the 2.96 era or thereabouts building a linux->windows cross-compiler using only the GCC-included instructions; I basically did:
1. Build binutils (linker), using "./configure --target=i386-mingw32 --prefix=/usr/local" or whatever target you're using
2. Untar pre-built libc and headers into the install prefix
3. Build gcc using "./configure --target=i386-mingw32 --prefix=/usr/local --with-gnu-as=i386-redhat-linux".
The flags might be slightly wrong as that's from memory. Note that I didn't bother bootstrapping libc; if you want, that's also doable; see, e.g., http://www.libsdl.org/extras/win32/cross/README.txt [libsdl.org] if you want a simple hand-holding script to do it for you.
Re:Interesting... (Score:3, Informative)
It may not be forkable for the average person off the street, but if actual compiler developers are unhappy enough with something then forking GCC is certainly practical and we have plenty of examples of that happening.
egcs forked gcc effectively enough that the fork displaced the original. Several bounds-checking gcc forks have been used in production systems. Apple and others have periodically forked (and sometimes re-merged).
Re:"Nothing for you to see here" indeed... (Score:3, Informative)
Re:Stupid waste of time (Score:3, Informative)
Yes, that really does mean what it says -- you can take our code private as a binary without giving us a thing, but if you preserve the source, you have to give us credit.
Re:Interesting... (Score:1, Informative)
If you want a simple, straightforward, bug-free compiler, you're better off writing one from scratch. It's actually not that hard - a non-optimizing (or minimally optimizing) C compiler is pretty simple (although multi-platform support is a PITA).
If you turn off optimizations in GCC, the generated code is absolutely awful (and not much more likely to be bug-free than optimized code), compared to simplistic compilers that are designed to generate reasonably efficient code without the need for complicated optimizations.
Re:Not for NetBSD for sure (Score:3, Informative)