GCC Compiler Finally Supplanted by PCC?
Sunnz writes "The leaner, lighter, faster, and, most importantly, BSD-licensed compiler PCC has been imported into OpenBSD's CVS and NetBSD's pkgsrc. The compiler is based on the original Portable C Compiler by S. C. Johnson, written in the late '70s. Even though much of the compiler has been rewritten, some of the basics still remain. It is not yet bug-free, but it compiles on the x86 platform, and work is being done to ready it to take on GCC's job."
"Nothing for you to see here" indeed... (Score:5, Interesting)
Wake me up when you're able to use PCC instead of GCC to do a 'make world' (or
Re: (Score:3, Insightful)
Re:"Nothing for you to see here" indeed... (Score:4, Interesting)
You bring up a good point. For years I have been looking for an open source compiler that's about the same quality as GCC, but is anything but GCC. I'm not too picky about the politics, as long as they're a different set of politics from the GCC politics. I had great hope for Open Watcom [openwatcom.com], but the license was bad enough for Debian to consider it non-free, and they are not actively trying to be an alternative to GCC. It's quite a shame, but I really don't blame them. Technically Watcom is about ready for primetime on Linux; they just need enough people to periodically try to compile their pet open source Linux program with it and send an "I can't get this to work" mail to the list, but no one seems to care. PCC, on the other hand, has a much larger set of people who have a reason to like it for reasons other than it not being GCC.
Re:"Nothing for you to see here" indeed... (Score:4, Interesting)
Re: (Score:3, Interesting)
I don't follow politics, so care to explain what's wrong with gcc's politics? Or, what _is_ gcc's politics?
I honestly don't know. However, it is an old project maintained by people who have very specific ideas of how things should work, just like Linus has very specific ideas about development (no C++ code in the kernel; you could use something besides Git, but you would be an idiot; etc.)
Now, I don't know much about the inner workings of GCC or Watcom, but I do know this: several years ago I tried making a Linux-to-Windows cross compiler and failed. I think I put a decent amount of effort into my attempts
Re: (Score:3, Informative)
FWIW, if your issue was GCC politics that's pretty much what caused the egcs split (and re-merge eventually, with a new philosophy and maintenance crew in charge of GCC). So if you looked at things prior to the gcc 2.95 era, you were looking at a different set of maintainers (and politics/philosophy/etc) than what's there now.
Re: (Score:3, Interesting)
The page you linked to says that Linux and FreeBSD ports are underway.
The Linux compiler has worked at times, and they went as far as writing a binary called owcc that takes standard POSIX flags for cc and executes wcc with the equivalent args. You can get working Linux binaries, and they will compile non-trivial code if you try hard enough.
The point is that, yes, Watcom lacks in some areas technically. However, and this is especially true on Windows where it works great, it's more a lack of interest than anything technical that keeps it from being a GCC alternative.
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
I do like the fact that we will (hopefully) have two good compilers, but people aren't going to simply start licensing things under BSD just because of GPLv3; they will just keep using GPLv2, because they were using the GPL for a reason.
Re: (Score:3, Insightful)
Well, no. On the other hand, the post to which the GP was responding seems to be implying that a GPLv3 GCC will infect its output, on account of its being "poisonous and viral". Either that, or it's a screaming non sequitur and totally off topic. You can't blame the GP for giving a poster the benefit of the doubt.
Re: (Score:3, Interesting)
Wow, that sounds great. Sign me up.
Embedded work is far better when the toolchain is based on GCC. Every proprietary compiler I've used has been a fight just to get started. I recently tried out avr-gcc and was delighted, all the GCC experience I had just dropped straight in.
Shame on any manufacturer who doesn't add a GCC backend for their CPU.
Re:"Nothing for you to see here" indeed... (Score:4, Informative)
Re: (Score:3, Informative)
Pure ISO C99 has limitations when writing a kernel (Score:5, Insightful)
Interesting... (Score:5, Insightful)
Re:Interesting... (Score:5, Insightful)
I don't know, I'm not a BSD user, but as much as RMS likes to claim that 'Linux' is GNU/Linux, maybe BSD users want their OS to be self-reliant?
Would you like to compile Linux using a Microsoft compiler?
Re:Interesting... (Score:5, Insightful)
Re:Interesting... (Score:5, Interesting)
Re:Interesting... (Score:4, Funny)
Re:Interesting... (Score:5, Interesting)
Actually, if I could write C as well as he does, I would do so more often. The problem is not him writing in C; it's other people, who are not as good as he is, writing in C. Due to the scope of his work, him writing in C does not lead to more bad C being written. So I'm actually thankful he is coding in C.
That being said, he should encourage lesser programmers (including myself) specifically not to code in C.
Re:Interesting... (Score:5, Insightful)
I am the GPP, incidentally, and while I'm pleased I got modded up, I certainly wasn't going for funny :). I'm sorry, but C is just a poor choice for ensuring correctness.
First of all, open up your copy of the C standard (any of them will do) and grep for the phrase "undefined behaviour". C was standardized in a time when everyone and their dog had their own C compiler. Each C compiler did things in a different way, often in contradictory ways. The C standard came along and said "hey, you know what? You're ALL right". I'm being facetious, and the C standard has done a great job in promoting C, but the C standard has really not evolved very far in terms of guaranteeing semantics.
I don't mean to bring this up to say that "you can't write correct code" in C or such nonsense. Obviously it's easy with good habits (I recommend comp.lang.c as the best place to pick up these habits) to write conforming and well-defined code. But, if you're trying to verify code that's already been written, either by hand or via some automated tool like a static analyzer, it is painful.
The second problem with C is that it allows a lot of features that make verification of semantics difficult. Pointer aliasing, global variables (even "extern" global variables!!), etc. make static analysis dreadful. If you want to perform static analysis properly on C programs, it's hard to get around whole-program analysis, which is why no one uses static analysis with C code :). Seriously, what does C have beyond lint? How many people even use lint? It's not very useful.
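To make the aliasing point concrete, here is a minimal, hypothetical C sketch (the function and names are mine, not the poster's) of why a tool can't reason about a function in isolation: unless it sees every call site, it cannot assume 'out' and 'scale' point to different objects, so it can't even safely hoist the load of *scale out of the loop.

    #include <stdio.h>

    /* If 'scale' aliases an element of 'out', each store changes the scale
     * factor for later iterations, so the result depends on call sites the
     * analyzer may never see. */
    static void scale_in_place(int *out, const int *scale, int n)
    {
        int i;
        for (i = 0; i < n; i++)
            out[i] = out[i] * *scale;   /* *scale is re-read every iteration */
    }

    int main(void)
    {
        int buf[4] = {2, 3, 4, 5};
        int i;

        scale_in_place(buf, &buf[0], 4);   /* 'scale' aliases buf[0] */

        for (i = 0; i < 4; i++)
            printf("%d ", buf[i]);          /* prints 4 12 16 20, not 4 6 8 10 */
        printf("\n");
        return 0;
    }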
Of course static analysis is not the end-all be-all of ensuring correct code. There's good coding habits and testing and profiling and whatnot too. But, I would argue that whatever effort can be put into verifying C code can be better put into code in other languages. The semantics of C are sometimes loosely defined, and very often far-reaching, preventing the use of modular reasoning. Whole-program analysis is not your friend.
What would be really cool would be to see someone like the OpenBSD crowd, if they're so keen on C, develop some verification tools that maybe only work on a very, very restricted subset of C. Any code in the core OS which does not conform to this restricted "more easily verifiable" subset of C would be rejected. I don't know how practical it would be, but it would be cool to see :). I mean, as an academic, obviously I think we should all be using Z [wikipedia.org], but I understand this doesn't make good sense in a lot of real-world projects. But if you want to get serious about correctness, don't pussyfoot around: get serious about correctness.
Re: (Score:3, Informative)
What would be really cool would be to see someone like the OpenBSD crowd, if they're so keen on C, develop some verification tools that maybe only work on a very, very restricted subset of C. Any code in the core OS which does not conform to this restricted "more easily verifiable" subset of C would be rejected. I don't know how practical it would be, but it would be cool to see :). I mean, as an academic, obviously I think we should all be using Z [wikipedia.org], but I understand this doesn't make good sense in a lot of real-world projects. But if you want to get serious about correctness, don't pussyfoot around: get serious about correctness.
I think you miss the point. The OpenBSD people are married to Unix and C, in much the same way that Bjarne believes the answer is to fix C++, not to use Java and C#. Yes, you're right, it would be nice if one of them "saw the light" and programmed in something where you just can't create a buffer overflow so easily. That being said, they should write the compiler/VM/interpreter/kernel that runs under that, as they have proven themselves to be among the few chosen by $DEITY to write decent C. Until a cpu is
Re: (Score:3, Interesting)
I'm sorry, but C is just a poor choice for ensuring correctness.
Far too true. However, for working with flexible or hazy requirements, or when you want code that is fast, C is very hard to beat. Also, just because this is true today doesn't mean it will be true forever.
"... hey, you know what? You're ALL right". I'm being facetious, and the C standard has done a great job in promoting C, but the C standard has really not evolved very far in terms of guaranteeing semantics.
Once again, totally on the money. The standards group wasn't concerned enough with fixing things like bitfields to make them useful, or with standardizing the method of determining the size of an "int". I think the standard evolved more to making the compiler writers happy than to make any real ef
Re:Interesting... (Score:4, Insightful)
I can understand some applications having closed source licenses...but a compiler is a means, not an end...it really just seems painful.
Re:Interesting... (Score:4, Informative)
Re: (Score:3, Informative)
It may not be forkable for the average person off the street, but if actual compiler developers are unhappy enough with something then forking GCC is certainly practical and we have plenty of examples of that happening.
egcs forked gcc effectively enough that the fork displaced the original. Several bounds-checking gcc forks have been used in production systems. Apple and others have periodically forked (and somet
Re: (Score:3, Interesting)
Re: (Score:3, Insightful)
It's not like a C compiler is a very difficult thing to write. GCC is much more than just a C compiler. But if a C compiler is all you need... maybe GCC is overkill.
Re: (Score:3, Insightful)
In fact, Fords do something Toyotas don't do, namely explode.
Although GCC doesn't do that (that I know of), I see no problem with having an alternative. There are competing applications for everything else, from desktops (Gnome/KDE) to kernels (Linux/BSD), so why not compilers?
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
He can't force people to change the name of Linux, it's just that people decided to go along with it on their own. The GNU/Linux thing was kind of retarded given that Linux distributions feature code from a lot of different licenses, and GNU is the only one that's mentioned?
Re:Interesting... (Score:4, Insightful)
The justification I've usually seen for that is that GNU is the single biggest "contributor", as it were, particularly with respect to gcc, the command tools, etc. More than just that, though, it could be argued that without GNU, Linux would just be a kernel, with no user space to run. Of course, it could equally be argued that without Linux, the GNU user space tools would just be a nice collection of tools with no OS to run on...
Re:Interesting... (Score:5, Insightful)
The real reason that Stallman wants this is that he early on correctly perceived that Linus is totally ideology agnostic, and so he wanted to put the idea of GNU/Linux out there so people would talk about the ideology. I don't think this is bad or anything. I think the ideology needs to be heard more widely.
It could also be argued that without the GNU project, Linus wouldn't have had a license ready to use for Linux, and I think that contribution by the GNU project weighs at least as much as all the userspace tools which someone would likely have eventually written anyway.
Re: (Score:3, Interesting)
Debian GNU/kFreeBSD (Score:3, Interesting)
Re: (Score:3, Insightful)
Not true; they run just fine on Solaris, including OpenSolaris, which is OSS. In fact I MUCH prefer Solaris with the GNU tools loaded; the old SysV tools suck by comparison, and I only use them for the rare script that breaks on the GNU tools (they are overall very good about preserving backward compatibility, and man will almost always tell you when they don't).
Re: (Score:3, Insightful)
GNU tools ran fine on other OSes long before Linux became so popular, including Solaris (and SunOS before it), AIX, HP/UX, IRIX, NEXTSTEP, Ultrix, and so on.
Re: (Score:3, Insightful)
At what point are you forced to grant credit? Shall we call it KDE/QT/GNU/Linux?
No it couldn't. The comparable BSD tools have been around longer than the Linux kernel.
Before Linux, it was pretty common to install sever
Re: (Score:3, Insightful)
Re: (Score:3, Informative)
Let's at least get RMS's position right: The GNU project was founded in 1984 to create a free operating system. In 1991, they were almost completely finished - they had written every essential component of a Unix-like operating system except for a kernel. Linus came along, wrote the Linux kernel, combined it with the almost-complete GNU system, and
Re:Interesting... (Score:5, Insightful)
Better idea, let's just get history correct.
Ok, true enough.
Sure, and I've almost created a free energy device ... I've done everything apart from this one bit that creates energy for free. Also, GNU did not "create" everything else apart from the kernel ... they created some pieces and were doing the distribution work, so other people "donated" their work.
True enough.
Not even close to true. Linux has only ever distributed the kernel ... other people combined it and called the whole thing "Red Hat Linux" or "Slackware Linux". GNU should/could have done this but hadn't bothered to do the work to make a usable distribution (as more than a collection of tarballs), and were happily ignoring Linux and telling everyone else to ignore it and use GNU Hurd when it would be ready "any time now". This was pretty obvious naming at the time; we didn't call Solaris "GNU/Solaris" when we installed GCC, GNU tar, etc. on it.
They got a huge amount of credit, for the work they did. They just didn't get their name in lights ... because they refused to do the work required for that. Then they complained and wanted more recognition than anyone else got who'd done the same amount of work as they had (like Perl or Xorg etc.) ... this created a "slight" backlash by people who actually know what happened.
Re: (Score:3, Interesting)
No, the RMS position is not that, since Linux is not the OS developed by the GNU project. It is an OS whose common feature is the Linux kernel; as usually distributed, it includes various tools from the GNU project. The RMS position is, roughly, "The GNU System is an OS developed by the GNU Project, and therefore every project that incorporates any components from that system is
The licence is just the top of the iceberg (Score:3, Informative)
From the discussion of TFA:
The licence is just the top of the iceberg [undeadly.org]
Re:The licence is just the top of the iceberg (Score:5, Insightful)
GCC's intermediate formats GIMPLE and GENERIC are based on a research compiler, not a deliberate perversion. There are no technical steps taken to stop reuse, and indeed it has been done - Sun distributes the GCC 4.0.4 front-end altered to use their own SPARC code generator as a back-end.
Re:The licence is just the top of the iceberg (Score:5, Interesting)
Actually, the post you're replying to is total bollocks. GCC has had a clear divide between front and back end (not to mention a source-language independent middle layer for performing optimizations) since I first looked at it in about 1996. Each layer is hideously complex, but they are all there.
Re: (Score:3, Informative)
I do know that I wrote an intermediate analyzer for a semester-length class, along with another grad stude
Re:Interesting... (Score:5, Interesting)
The BSD license that PCC is under, I understand, is actually a problem even to the BSD folks: PCC is actually extremely old (it was originally written for the PDP11!) and apparently it still carries the advertising clause.
Re:Interesting... (Score:4, Insightful)
Reason 2) Competition
Reason 3) Choice
Reason 4) Tweak Stallman's nose
Re: (Score:3, Insightful)
3. they object to the restriction on their freedom?
Look, I am a Linux user and hacker, but even I understand the BSD folks' need to have their code be free.
Re: (Score:3, Funny)
You just restated point #2.
Re: (Score:3, Insightful)
to me, it is the GPL that ensures that the *code* remains free, while the BSD license ensures that it is the *user* that remains free.
I really like both licenses, but they serve different purposes, and it highlights the priorities of the different groups.
Re: (Score:3, Insightful)
I don't think it's meant as FUD. Actually the idea that the code itself has been emancipated, so that no one can bind it or its children with the chains of ownership ever again, is quite an attractive and appealing idea.* Stallman was a bloody genius to come up with such a wild abstraction. He's freed the slaves all over again, it's just that these slaves are not human.**
However, while this philosophy m
Re:Interesting... (Score:4, Insightful)
3. they object to the restriction on their freedom?
or
4. they like competition and choice, even if the "market leader" is pretty good.
or
5. they've learned that a monoculture isn't good for the ecology (even if the "market leader" is pretty good).
Re: (Score:2, Interesting)
Indeed, the linked article says that PCC is 5-10 times faster than GCC, but currently performs only one optimization... What use is speed of compilation if the binaries produced are slower?
Re: (Score:3, Insightful)
"The BSD folks would love to have a BSD-licensed drop-in replacement for GCC"
could somebody provide a reference to verify that "the BSD folks" do in fact have such a desire?
Thanks!
Re:"Nothing for you to see here" indeed... (Score:5, Interesting)
This has been a long time coming. If you've ever looked at GCC code, you'll be familiar with the feeling of wanting to claw your eyes out (I had to for an article on the new Objective-C extensions *shudder*). I am somewhat surprised it's PCC not LLVM, but it makes sense. OpenBSD wants a C compiler in the base system, that can compile the base system and produces correct code. Support for C++, Objective-C, Java and Fortran would all be better off in ports. PCC is faster than GCC, smaller than GCC, more portable than GCC, easier to audit than GCC, and already compiles the OpenBSD userspace. I wouldn't be surprised if it replaces GCC in the OpenBSD base system soon. If it does, GCC (or maybe LLVM) will still probably be one of the first things I install from ports, but I'd still regard it as a good idea.
Re:"Nothing for you to see here" indeed... (Score:5, Informative)
by Marc Espie (213.41.185.88) (espie@openbsd.org) on Sun Sep 16 13:28:48 2007 (GMT)
> > I am saying think this through and carefully. Rewriting a giant suite of programs just because you don't agree with the philosophy behind it sounds awful to people who have no stakes in BSD licenses.
>
> It's not just the licence that is a concern about the GCC suite: its dropping support for hardware that OpenBSD supports, its fluctuating compilation quality, and its licence are all matters of concern to users.
The licence is just the top of the iceberg.
GCC is developed by people who have vastly different goals from us. If you go back and read the GCC lists, you'll notice several messages by me where I violently disagree with the direction it's following. Here is some *more* flame material.
- GCC is mostly a commercial compiler, these days. Cygnus has been bought by Red Hat. Most GCC development is done by commercial Linux distributors, and also Apple. They mostly target *fast* i386 architectures and PowerPC. A lot of work has been done on specmarks, *but* the compiler is getting bigger and bigger, and slower and slower (very much so).
- GCC warnings are not *really* useful. The -Wall flag shows many right things, and quite a few wrong issues.
- There is a lot of churn in GCC which ends up with it no longer supporting some architectures that are still relevant to us.
- The whole design of GCC is perverted so that someone cannot easily extract a front-end or back-end. This is broken by design, as the GPL people do believe this would make it easier for commercial entities to `steal' a front-end or back-end and attach it to a proprietary code-generator (or language). This is probably true. This also makes it impossible to write interesting tools, such as intermediate analyzers. This also makes it impossible to plug old legacy back-ends for old architectures into newer compilers.
- As a result, you cannot have the new interesting stuff from newer GCC without also losing stuff... every GCC update is an engineering nightmare, because there is NO simple choice. You gain some capabilities, and you also lose some important stuff.
- it's also very hard to do GCC development. Their branching system makes it very likely that some important work falls between the cracks (and this happens all the time). If you develop code for GCC, you must do it on the most recent branch, which is kind of hard to do if your platform is currently broken (happens *all the time* if you're not running Linux/i386). Even when you conform, it's hard to write code to the GNU coding standards, which are probably the most illegible coding guidelines for C. It's so obvious they were written by a Lisp programmer. As a result, I've even lost interest in rewriting a few pieces and getting them into the GCC repository.
- some of their most recent advances do not have a chance to work on OpenBSD, like preparsed includes, which depend on mmap() at a fixed location.
- there are quite a few places in GCC and G++ where you cannot have full functionality without having a glibc-equivalent around.
- some of the optimisation choices are downright dangerous, and wrong for us (like optimizing memory fills away, even if they deal with crypto keys; see the sketch after this post).
- don't forget the total nightmare of autoconf/libtool/automake. Heck, even the GCC people have taken years to update their infrastructure to a recent autoconf. And GCC is *the only program in the ports tree* that actually uses its own libtool. Its configuration and reconfiguration fails abysmally when you try to use a system-wide libtool.
I could actually go on for pages...
I've actually been de facto maintainer of GCC on OpenBSD for a few years by now, and I will happily switch to another compiler, so frustrating has been the road with GCC.
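To illustrate the "optimizing memory fills away" point above, here is a minimal, hypothetical C sketch (not from Espie's post): a memset() of key material just before the buffer goes out of scope is a dead store from the compiler's point of view and may be removed, while writes through a volatile-qualified pointer must be kept.

    #include <stddef.h>

    /* A zeroing helper the optimizer cannot discard: the stores go through a
     * volatile-qualified pointer, so they count as observable side effects. */
    static void secure_wipe(void *buf, size_t len)
    {
        volatile unsigned char *p = buf;
        while (len--)
            *p++ = 0;
    }

    static void handle_secret(void)
    {
        unsigned char key[32];

        /* ... fill 'key' and use it ... */

        /* memset(key, 0, sizeof key);  <- a dead store: since 'key' is never
         *                                 read again, a compiler may delete it */
        secure_wipe(key, sizeof key);    /* survives optimization */
    }

    int main(void)
    {
        handle_secret();
        return 0;
    }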
Kind of depends... (Score:5, Insightful)
Kind of depends on who you ask, doesn't it?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Not for NetBSD for sure (Score:5, Funny)
Re:Not for NetBSD for sure (Score:5, Informative)
GCC compiles on a LOT of different architectures. Does PCC? Does it do as good a job at compiling? Can we plop our current GCC-compiled source on PCC and have it compile without huge headaches?
And what about these bugs that are even referenced in the summary? How could it POSSIBLY supplant GCC if it's that buggy? In fact, how could it have supplanted GCC if it hasn't taken GCC's place AT ALL yet?
Try these headlines:
GCC Compiler Finally Has 'Free' Competition
New Compiler To Supplant Gnu Compiler?
Battle of the licenses: Does the license of your compiler MATTER AT ALL!?
Re:Not for NetBSD for sure (Score:5, Insightful)
Actually, support for different architectures is one of the main reasons OpenBSD is looking at it. GCC has a habit of dropping architectures because 'nobody uses them,' which causes some OpenBSD (and NetBSD) ports to remain stuck with old versions of GCC. The x86 backend for PCC was written in three weeks by one person, so it seems reasonable to assume it should be possible to add support for the other required platforms relatively easily.
It's worth remembering that in BSD-land, things are divided into the base system and third party packages. The base system needs a C compiler that is capable of compiling the userland (which PCC already does for OpenBSD), is small, portable, and easy to audit. Packages have quite different requirements; they need support for more languages, etc. PCC is likely to replace GCC in the BSD base systems, but that doesn't mean that people won't install GCC or LLVM for compiling other things.
Re: (Score:3, Informative)
Why is the license important? (Score:2)
Okay, if you run BSD I can see why the license is important: you want your software to be under the BSD license. But for the rest of us, what advantage does a BSD-licensed compiler offer? It's not like GCC forces the GPL onto compiled software, does it?
For commercial software, pointing to the GCC source isn't that much of a burden if you need to distribute a compiler.
I'd have thought the fact that a compiler was faster and/or lighter would be much more important than the license
Re: (Score:2)
It's what MSFT does for Unix Services for Windows. the GPL components simply get acknowledged and pointed back at the developers.
Re: (Score:3, Funny)
That's "tastes great" and "LESS filling". Clearly you're trying to push a "tastes great" agenda by deliberately misrepresenting the opposing viewpoint. Typical tactics for the tasteistas.
(imagine Daffy Duck saying all that)
Quick! (Score:5, Funny)
Re: (Score:3, Insightful)
The same rule applies to copyright notices. You are not allowed to modify the copyright n
One OT clarification: (Score:2, Informative)
It's my primary package manager on Interix, Mac OS X, Linux, and NetBSD.
Maybe someday (Score:2)
Answer (Score:4, Insightful)
No. Next question.
Stupid waste of time (Score:4, Insightful)
Re:Stupid waste of time (Score:4, Funny)
I'm just sayin'...
Re: (Score:3, Informative)
- It has different goals to GCC.
Re: (Score:3, Interesting)
Actually... the BSD license is GPLv2 or GPLv3-compatible, because it doesn't impose any restrictions beyond those included in GPLv2 or GPLv3. So BSD code can be incorporated into a GPL program (Theo de Raadt's recent rants notwithstanding).
Re:Stupid waste of time (Score:4, Informative)
We want to encourage people to use our code if it's the best code for the task. Period. You want to undermine copyright. Well, you're free to do that, but some of us don't think that's a good idea, and others of us don't think it's that important.
Re: (Score:3, Informative)
Yes, that really does mean what it says -- you can take our code private as a binary without giving us a thing, but if you preserve the source, you have to give us credit.
*yawn* (Score:5, Funny)
Sounds like a challenge... (Score:2)
Like I don't have enough things to spend my time on. Maybe I should finish my PDP11 page [kicks-ass.net] first.
How many compilers can we cram into a phone booth? (Score:2)
OK, the on-chip cache on my Core 2 Duo is many times larger than the full RAM on any PDP-11 I've ever heard of. So why should I be interested in a
It compiles? (Score:2)
That's dumb. (Score:5, Interesting)
Either way, they'd be much, much better off if they imported LLVM and redirected their compiler brain power to clang [llvm.org].
Re: (Score:3, Informative)
Call me when it compiles something other than x86. (Score:2)
I really think some effort needs to be put in to support other processors like ARM, MIPS and PPC - after all, GCC is hardly a star performer on any of these (code generation is terrible), and they are not the fastest platforms either. This is where actual
LLVM / clang (Score:5, Interesting)
If you're interested in advanced compiler technology, check out LLVM [llvm.org], which is a ground-up redesign of an optimizer and retargetable code generator. LLVM supports interprocedural cross-file optimizations, can be used for JIT compilation (or not, at your choice) and has many other capabilities. The LLVM optimizer/code generator can already beat the performance of GCC-compiled code in many cases, sometimes substantially.
For front-ends, LLVM supports two major ones for C family of languages: 1) llvm-gcc, which uses the GCC front-end to compile C/C++/ObjC code. This gives LLVM full compatibility with a broad range of crazy GNU extensions as well as full support for C++ and ObjC. 2) clang [llvm.org], which is a ground-up rewrite of a C/ObjC frontend (C++ will come later) that provides many advantages over GCC, including dramatically faster compilation and better warning/error information.
While LLVM is technologically ahead of both PCC and GCC, the biggest things it has going for it are the size of its community and the commercial contributors [llvm.org] that are sponsoring work on the project.
-Chris
Re: (Score:3, Interesting)
llvm-gcc is quite mature (it has built huge amounts of code, including app
Re:Increasingly extremeist? (Score:4, Insightful)
Although competition is good... (Score:2)
It seems profoundly stupid to stress the BSD license as the "most important" feature of this new software.
The GPL may not be as free as the BSD license, but one needs to be a real zealot to switch primarily for that reason. I hope FreeBSD will wait for it to work on other platforms and only switch because it is "leaner, lighter, and faster".
We need to replace gcc ... why? (Score:4, Interesting)
Sure, I prefer BSD-style licenses, and so do some other people, but what drives gcc development is the GNU license. I think I'll stick to the compiler that's debugged. Oh, that's right, I forgot, it comes with a debugger too. If you like that sort of thing.
not every architecture (Score:5, Informative)
The idea with PCC is not that it will be BSD licensed (nobody really gives a fuck what license the compiler is under), but that it will be supported directly by the BSD community, including the NetBSD hackers who have their bazillion architectures to support.
Re:We need to replace gcc ... why? (Score:5, Interesting)
You couldn't have gotten that statement any MORE WRONG if you had tried.
GCC's "production quality" is an on-again, off-again thing. Through most of v3.x it had too many bugs to count, and was inherently unreliable. It couldn't even compile ITSELF with the most basic optimizations or the resulting binary would generate incorrect code. Up until v4 it also misaligned stack variable. It had, and still has, MANY bugs. That GCC successfully compiles code at all is almost entirely due to it being so popular that everyone knows it, and works around its bugs without even thinking about it.
It has never had GOOD support for any other platforms than x86. Remember the RedHat GCC2.96 fiasco? They forked it because they needed it to support more platforms than it currently did. And even through v3.x the non-x86 ports of GCC had even more bugs than on x86, commonly falling apart if you attempt to use any optimizations. Now, they're DROPPING support for those platform entirely, which is a big problem for developers of operating systems for those platforms.
"Improved" is pretty vague. HURD has probably been "improved" for every minute of it's existence as well... Meanwhile the far younger ICC (Intel's compiler) beats the pants off of GCC without even trying.
What's more, GCC's "improvements" come at great cost. If you're a full-time developer, for the final release you want optimized code, but while developing, you want to compile and be able to test code frequently, and so as quickly as humanly possible. GCCv3+, even with all optimizations disabled, takes far, far longer to compile binaries than even older versions of GCC, and as it says, something like 10X slower than PCC.
The license issue is only incidental. These (and other) problems pushed them away from using GCC. Since they happen to be BSD developers, they'd prefer their work to be BSD licensed, and so it is.
sensationalist bullshit yet again (Score:5, Informative)
Second: The biggest attraction of PCC is NOT the license. The article submitter who stated otherwise is a jackass.
Third: There are technical reasons why GCC is actually unusable by some BSDs, such as NetBSD, which aims to support many architectures that GCC has dropped. NetBSD uses a combination of GCC 2, 3, and 4 to compile all of its different architectures. The NetBSD developers would rather have a single compiler that handles them all. Obviously PCC is nowhere near that level yet.
Fourth: GCC politics are a pain in the ass for many BSD developers who just want to submit patches to a compiler without the overhead of GNU's policies and GCC's management.
Fifth: GCC produces crappy code more often than anyone would like. GCC bugs are far from unheard of, performance of generated code is often unpredictable between releases, and on many less commonly used architectures (or with less common source constructs) GCC will produce incorrect code. Yes, these cases are very rare, but the BSD folks have hit the problem often enough for it to be a concern. PCC, being simpler and less bloated with cruft from multiple rewrites of the internals, will hopefully produce correct and predictable code more often than GCC.
Sixth: PCC actually works today. It can compile most of the NetBSD userspace, as I recall, and the kernel will be ready to roll soon after some inline assembler problems are fixed. This isn't some theoretical hacky project - it works right now. It's not ready to replace GCC just yet, by any means, but it's a lot more than some Slashdotters seem to think it is.
GCC (Score:3, Funny)
Then the answer is no. I may be alone in the world, but I'm perfectly happy with the GCC compiler and have been for years. It does what it's supposed to, it is FREE, it is cross-platform (MinGW), and it annoys the BSD guys.
Clear winner: GCC.
It has been pointed out here that people who choose a compiler based on its license are idiots. Well, if I'm working on Windows I use MinGW specifically because of its license. If I'm working in Linux, and I usually am, I choose GPL above all others. Count me as an idiot if you like, but you can shove the alternatives. I know what I am getting, I have a reasonable expectation of what is coming in the future, and if I need to modify it (heaven forbid) I can. BSD is a fine license for people who NEED it. I don't. When given the choice I choose GPL. GCC slower? Maybe so. The code works and I get paid. If it takes 3 hours for Qt to compile, I bill for 3 hours.
Sorry, but I'm a pragmatist in all things except freedom. I've been burned enough. (Admittedly, I've personally never been burned by BSD code, unless you count Windows.)
GCC is the Microsoft Windows of compilers... (Score:5, Funny)
You're not going to supplant GCC until you get all the code that depends on GCC-specific features modified to be standard, portable C. That's a barrier to entry as steep as Microsoft's application barrier to entry. Now, it's not as bad as it was in the early '90s, when GCC was sprouting new C extensions everywhere (like the ability to have declarations not at the start of blocks, the ability to leave the second operand out of the ternary conditional operator, or things like alloca), and a lot of those features have since become common and even standardized (and others, like the shortcut ternary, have been deprecated). But it's not as easy as just having a good compiler, or even a good language-translating ecosystem like TenDRA... the playing field is anything but level.
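As a concrete (and hypothetical, not from the thread) illustration of the kind of GCC-specific constructs meant here, this sketch contrasts the GNU "a ?: b" shorthand and the non-standard alloca() idiom with portable spellings; the file as a whole only builds with GCC or a compiler that copies the extension, which is rather the point.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* GNU extension: "a ?: b" means "a ? a : b" without evaluating 'a' twice. */
    static const char *pick_gnu(const char *override)
    {
        return override ?: "default";
    }

    /* The portable spelling accepted by any C compiler. */
    static const char *pick_portable(const char *override)
    {
        return override ? override : "default";
    }

    /* Portable replacement for the common alloca(len + 1) idiom:
     * heap allocation (or a C99 variable-length array) instead of a
     * non-standard builtin. */
    static char *dup_portable(const char *s)
    {
        size_t len = strlen(s);
        char *buf = malloc(len + 1);
        if (buf != NULL)
            memcpy(buf, s, len + 1);
        return buf;             /* caller frees */
    }

    int main(void)
    {
        char *copy = dup_portable(pick_portable(NULL));
        printf("%s / %s\n", pick_gnu("explicit"), copy ? copy : "(oom)");
        free(copy);
        return 0;
    }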
Re: (Score:3, Insightful)
But you're still stuck with using glibc if you want to be able to compile anything. There are different libcs floating around - uClibc, etc. - but they're all GNU and they're all meant for the embedded market. I doubt you'd be able to recompile the Linux kernel with any of them.
Re:Why? (Score:5, Informative)
Re:Does it crash less? (Score:4, Insightful)
while the person you're responding to *is* a troll, I guess it's worth pointing out that GCC and other highly optimizing compilers will "break" some apps that a simpler compiler won't break. Why?
Many optimizations rely on careful reading of the standard, and explicitly taking the liberties that the standard lets you take. For instance, the following loop terminates on a simple compiler, but becomes infinite on some optimizing compilers:
int i = 1;
while (i > 0)
    i = i * 2;
The ANSI C standard specifically leaves signed integer overflow undefined. If you were expecting 'i' to go negative after 31 iterations, and for that to terminate the loop, you could be in for a nasty surprise.
Suppose an application relied on this behavior, and now it misbehaves when compiled with GCC. Did GCC "break" that application? In some sense, yes: the app functions correctly with compiler (a) but not with compiler (b), so the app must be compiled with compiler (a). The breakage, however, happened because the application is not strictly conforming. It uses compiler-dependent semantics, and that's hardly GCC's fault.
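For contrast, here is a sketch of the same loop written so that its behaviour is fully defined: unsigned arithmetic is required to wrap modulo 2^N, so every conforming compiler, at any optimization level, must terminate it.

    #include <stdio.h>

    int main(void)
    {
        /* Unsigned overflow is defined to wrap, so 'i' eventually becomes 0
         * and the loop is guaranteed to stop (after 32 doublings on a
         * 32-bit unsigned int). */
        unsigned int i = 1;
        int doublings = 0;

        while (i > 0) {
            i = i * 2;
            doublings++;
        }
        printf("wrapped to zero after %d doublings\n", doublings);
        return 0;
    }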
Simpler compilers also don't reorder code as much, and don't optimize away as much "dead code." Stuff that really should have memory barriers, explicit synchronization and perhaps the volatile keyword applied to them run just fine without all those things when compiled with a simple compiler and run on a scalar, in-order CPU. The source code is also easier to read, because in the end the semantics are much more restricted--meaning the compiled output more closely resembles the source input. Give that code to a highly optimizing compiler, though, and run it on a super-scalar, out-of-order machine, and it'll break left, right and center. Is it the compiler's fault? Is it the CPU's fault? It's really the gap between the semantics the programmer thought he had (and happened to have in the simpler environment), and what C actually guarantees.
Simpler compilers implement simpler semantics that are easier to understand, but only because they're compiling a very restricted form of C that offers way more implicit guarantees than the C standard actually does. Personally, unless that's made explicit (and therefore truly guaranteed forevermore by the compiler), I suspect it's actually a recipe for disaster. If nothing else, it could lead to code that's significantly harder to move to different platforms, since it'll start to rely on these simpler, "easier" semantics. Of course, then again, super-scalar out-of-order CPUs still strip a bunch of that away, so who knows, it might not be that bad.
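As a small, hypothetical illustration of the "works on a simple compiler, breaks on an optimizing one" pattern described above: a busy-wait on a plain flag set from a signal handler may be collapsed into a single load by an optimizer, whereas a volatile sig_atomic_t flag forces a fresh read on each pass (volatile still isn't a substitute for real memory barriers or atomics between threads).

    #include <signal.h>

    /* Without 'volatile', an optimizer may note that the loop body never
     * writes 'done', read it once, and spin forever. volatile sig_atomic_t
     * is the standard-sanctioned type for a flag set from a signal handler. */
    static volatile sig_atomic_t done = 0;

    static void on_sigint(int signo)
    {
        (void)signo;
        done = 1;
    }

    int main(void)
    {
        signal(SIGINT, on_sigint);

        while (!done)
            ;                   /* wait for Ctrl-C */

        return 0;
    }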
--Joe
Re: (Score:3, Insightful)
It is extremely difficult and next to pointless to write code that is strictly conforming. It is in fact quite useful, for instance, to use unions to re-interpret bit patterns. (Note that "portable" is something rather different from "strictly conforming," or even "conforming." Many non-conforming programs are still highly portable because of commonalities among implementations.)
For example, suppose you want to bit-reverse an entire array in memory. That is, bit 0 of the first element in the array swap
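Since the parent brings up unions for re-interpreting bit patterns, here is a minimal, hypothetical sketch of that idiom: not strictly conforming, but portable in practice to any implementation with IEEE-754 single-precision floats.

    #include <stdio.h>
    #include <stdint.h>

    /* Read the raw bit pattern of a float through a union member.
     * The result is implementation-defined rather than guaranteed by the
     * letter of the standard -- exactly the "portable but not strictly
     * conforming" distinction made above. */
    static uint32_t float_bits(float f)
    {
        union {
            float    f;
            uint32_t u;
        } pun;

        pun.f = f;
        return pun.u;
    }

    int main(void)
    {
        printf("bits of 1.0f: 0x%08lx\n", (unsigned long)float_bits(1.0f));
        return 0;
    }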