GCC Compiler Finally Supplanted by PCC?

Sunnz writes "The leaner, lighter, faster, and, most importantly, BSD-licensed compiler PCC has been imported into OpenBSD's CVS and NetBSD's pkgsrc. The compiler is based on the original Portable C Compiler by S. C. Johnson, written in the late '70s. Even though much of the compiler has been rewritten, some of the basics still remain. It is not yet bug-free, but it compiles on the x86 platform, and work is underway to have it take on GCC's job."
  • Kind of depends... (Score:5, Insightful)

    by KingSkippus ( 799657 ) * on Monday September 17, 2007 @12:07PM (#20637585) Homepage Journal

    ...and most importantly, BSD Licensed...

    Kind of depends on who you ask, doesn't it?

  • by DiegoBravo ( 324012 ) on Monday September 17, 2007 @12:10PM (#20637635) Journal
    ...Wake me up when you're able to use PCC instead of GCC to do a 'make bzImage'
  • Answer (Score:4, Insightful)

    by christurkel ( 520220 ) on Monday September 17, 2007 @12:12PM (#20637665) Homepage Journal
    GCC Compiler Finally Supplanted by PCC?

    No. Next question.
  • Interesting... (Score:5, Insightful)

    by cromar ( 1103585 ) on Monday September 17, 2007 @12:12PM (#20637669)
    I really don't see any point in implementing a new C compiler under the BSD license. There's no reason to duplicate effort: it's not like the compiled binaries would be under the GPL. And any GPL libraries you link to, you wouldn't need to distribute (thus avoiding the GPL). So, really, there's no point in a BSD-licensed compiler. Correct me if I'm wrong.
  • by ( 780570 ) on Monday September 17, 2007 @12:12PM (#20637677)
    Seriously. Let's duplicate the wheel twice: once for GPL, once for BSD, and then bicker amongst ourselves. Stuff like this stands in the way of actual progress being made. Neither side is right, I don't have a solution, but this is just dumb.
  • Re:GNUless Linux (Score:3, Insightful)

    by RLiegh ( 247921 ) on Monday September 17, 2007 @12:15PM (#20637713) Homepage Journal
    Not really; you already have the Sun and Intel compilers for Linux (I've been told that the Intel compiler has even been tweaked so you can build a bzImage with it).

    But you're still stuck with using glibc if you want to be able to compile anything. You do have different libcs floating around (uClibc, etc.), but they're all GNU and they're all meant for the embedded market. I doubt you'd be able to recompile the Linux kernel with any of them.
  • by MissP ( 728641 ) on Monday September 17, 2007 @12:16PM (#20637739)
    With respect to

    "The BSD folks would love to have a BSD-licensed drop-in replacement for GCC"

    could somebody provide a reference to verify that "the BSD folks" do in fact have such a desire?

  • Re:Interesting... (Score:5, Insightful)

    by everphilski ( 877346 ) on Monday September 17, 2007 @12:20PM (#20637807) Journal

    I don't know - I'm not a BSD user - but as much as RMS likes to claim that 'Linux' is GNU/Linux, maybe BSD users want their OS to be self-reliant?

    Would you like to compile Linux using a Microsoft compiler? :)
  • Re:Interesting... (Score:5, Insightful)

    by Anonymous Coward on Monday September 17, 2007 @12:29PM (#20637983)

    Would you like to compile Linux using a microsoft compiler? :)
    If it produced the best code, why not? People already compile Linux using the Intel compiler.

  • Re:Interesting... (Score:3, Insightful)

    by WindBourne ( 631190 ) on Monday September 17, 2007 @12:29PM (#20637987) Journal
    3. they object to the restriction on their freedom?

    Look, I am a Linux user and hacker, but even I understand BSD's need to have their code be free.
  • by SuperKendall ( 25149 ) on Monday September 17, 2007 @12:33PM (#20638069)
    FSF's increasingly extremist political views

    FSF's views have not changed over the years - they have remained consistent.

    The reason they look "increasingly extremist" is because of other views on software freedom heading the other way, thus drawing an even greater contrast. That does not mean that the FSF views are the ones that are extreme...

    If anything the FSF and RMS have shown themselves to be quite prescient in warning against dangers that have come to pass.
  • Re:Interesting... (Score:4, Insightful)

    by Brandybuck ( 704397 ) on Monday September 17, 2007 @12:33PM (#20638077) Homepage Journal
    Reason 1) Avoid a monoculture

    Reason 2) Competition

    Reason 3) Choice

    Reason 4) Tweak Stallman's nose
  • by FranTaylor ( 164577 ) on Monday September 17, 2007 @12:34PM (#20638093)
    Even if a compiler generates miserably inefficient code, it is valuable if the code is correct. It is a valuable tool for the verification of other compilers. It can also be used as part of a compiler bootstrapping process. Since its code size is probably a small fraction of GCC's, it may make a better teaching tool. If people are actually going to use it, given that it must coexist in a world with much more mature compilers, it will itself probably become much more mature in a relatively short period of time. GCC currently has no competitors in the free realm and has suffered from neglect in the past. A little competition may keep the developers on their toes and prevent another egcs.

  • by IamTheRealMike ( 537420 ) on Monday September 17, 2007 @12:39PM (#20638173)
    The reason they look increasingly extremist is because the FSF tends to make up policies and rules which bind GCC development in order to avoid the theoretical risk of making GPL violations easier. As compiler technology advances these restrictions have become increasingly burdensome, in particular, several of the technical advantages of LLVM are things the GCC team would have liked to do but RMS nixed because it would have made it too easy to circumvent the license.
  • Re:Interesting... (Score:3, Insightful)

    by Jose ( 15075 ) on Monday September 17, 2007 @12:41PM (#20638211) Homepage
    Look, I am a Linux user and hacker, but even I understand BSD's need to have their code be free.

    to me, it is the GPL that ensures that the *code* remains free, while the BSD license ensures that it is the *user* that remains free.

    I really like both licenses, but they serve different purposes, and it highlights the priorities of the different groups.

  • Re:Interesting... (Score:3, Insightful)

    by Jeff DeMaagd ( 2015 ) on Monday September 17, 2007 @12:42PM (#20638235) Homepage Journal
    The difference is that both are open source and no matter what level of hot air that RMS can emit, he can't force anyone to do anything other than abide by the license of his software.

    He can't force people to change the name of Linux, it's just that people decided to go along with it on their own. The GNU/Linux thing was kind of retarded given that Linux distributions feature code from a lot of different licenses, and GNU is the only one that's mentioned?
  • Re:Interesting... (Score:4, Insightful)

    by Tim C ( 15259 ) on Monday September 17, 2007 @12:47PM (#20638329)
    The GNU/Linux thing was kind of retarded given that Linux distributions feature code from a lot of different licenses, and GNU is the only one that's mentioned?

    The justification I've usually seen for that is that GNU is the single biggest "contributor", as it were, particularly with respect to gcc, the command tools, etc. More than just that, though, it could be argued that without GNU, Linux would just be a kernel, with no user space to run. Of course, it could equally be argued that without Linux, the GNU user space tools would just be a nice collection of tools with no OS to run on...
  • by TheRaven64 ( 641858 ) on Monday September 17, 2007 @12:51PM (#20638389) Journal

    Actually, support for different architectures is one of the main reasons OpenBSD is looking at it. GCC has a habit of dropping architectures because 'nobody uses them,' which causes some OpenBSD (and NetBSD) ports to remain stuck with old versions of GCC. The x86 backend for PCC was written in three weeks by one person, so it seems reasonable to assume it should be possible to add support for the other required platforms relatively easily.

    It's worth remembering that in BSD-land, things are divided into the base system and third party packages. The base system needs a C compiler that is capable of compiling the userland (which PCC already does for OpenBSD), is small, portable, and easy to audit. Packages have quite different requirements; they need support for more languages, etc. PCC is likely to replace GCC in the BSD base systems, but that doesn't mean that people won't install GCC or LLVM for compiling other things.

  • by Anonymous Conrad ( 600139 ) on Monday September 17, 2007 @12:52PM (#20638401)

    The whole design of GCC is perverted so that someone cannot easily extract a front-end or back-end. This is broken by design, as the GPL people do believe this would make it easier for commercial entities to `steal' a front-end or back-end and attach it to a proprietary code-generator (or language).
    That's entirely wrong. RMS has been worried about this, and he (through the FSF, who own the copyright) has previously objected to any patches that serialize GCC's intermediate state for just this reason. (Although GCC's new link-time optimization work will change this.)

    GCC's intermediate formats GIMPLE and GENERIC are based on a research compiler, not a deliberate perversion. There are no technical steps taken to stop reuse, and indeed it has been done - Sun distribute the GCC 4.0.4 front-end altered to use their own SPARC code generator as a back-end.
  • Re:Interesting... (Score:4, Insightful)

    by jc42 ( 318812 ) on Monday September 17, 2007 @12:59PM (#20638571) Homepage Journal
    3. they object to the restriction on their freedom?

    4. they like competition and choice, even if the "market leader" is pretty good.

    5. they've learned that a monoculture isn't good for the ecology (even if the "market leader" is pretty good).
  • Re:Interesting... (Score:5, Insightful)

    by Omnifarious ( 11933 ) * <> on Monday September 17, 2007 @01:13PM (#20638871) Homepage Journal

    The real reason that Stallman wants this is that he early on correctly perceived that Linus is totally ideology agnostic, and so he wanted to put the idea of GNU/Linux out there so people would talk about the ideology. I don't think this is bad or anything. I think the ideology needs to be heard more widely.

    It could also be argued that without the GNU project, Linus wouldn't have had a license ready to use for Linux, and I think that contribution by the GNU project weighs at least as much as all the userspace tools which someone would likely have eventually written anyway.

  • Re:Interesting... (Score:3, Insightful)

    by afidel ( 530433 ) on Monday September 17, 2007 @01:24PM (#20639081)
    Of course, it could equally be argued that without Linux, the GNU user space tools would just be a nice collection of tools with no OS to run on...

    Not true; they run just fine on Solaris, including OpenSolaris, which is OSS. In fact I MUCH prefer Solaris with the GNU tools loaded; the old SysV tools suck by comparison, and I only use them for the rare script that breaks on the GNU tools (they are overall very good about preserving backward compatibility, and man will almost always tell you when they don't). I know many of the stodgy Solaris admins don't like to load anything that didn't come on the first install disk, but if you have control of your environment and have Linux experience it is really nice to load the GNU toolset.
  • by Mr Z ( 6791 ) on Monday September 17, 2007 @01:47PM (#20639525) Homepage Journal

    while the person you're responding to *is* a troll, I guess it's worth pointing out that GCC and other highly optimizing compilers will "break" some apps that a simpler compiler won't break. Why?

    Many optimizations rely on a careful reading of the standard, and on explicitly taking the liberties that the standard lets you take. For instance, the following loop terminates on a simple compiler, but becomes infinite on some optimizing compilers:

    int i = 1;

    while (i > 0)
        i = i * 2;

    The ANSI C standard actually leaves signed integer overflow undefined. If you were expecting 'i' to go negative after 30 iterations, and for that to terminate the loop, you could be in for a nasty surprise.

    Suppose an application relied on this behavior, and now it misbehaves when compiled with GCC. Did GCC "break" that application? In some sense, yes: the app functions correctly with compiler (a) but not with compiler (b), so the app must be compiled with compiler (a). The breakage, however, happened because the application is not strictly conforming. It uses compiler-dependent semantics, and that's hardly GCC's fault.

    Simpler compilers also don't reorder code as much, and don't optimize away as much "dead code." Stuff that really should have memory barriers, explicit synchronization, and perhaps the volatile keyword applied to it runs just fine without all those things when compiled with a simple compiler and run on a scalar, in-order CPU. The source code is also easier to read, because in the end the semantics are much more restricted--meaning the compiled output more closely resembles the source input. Give that code to a highly optimizing compiler, though, and run it on a super-scalar, out-of-order machine, and it'll break left, right and center. Is it the compiler's fault? Is it the CPU's fault? It's really the gap between the semantics the programmer thought he had (and happened to have in the simpler environment) and what C actually guarantees.

    Simpler compilers implement simpler semantics that are easier to understand, but only because they're compiling a very restricted form of C that offers way more implicit guarantees than the C standard actually does. Personally, unless that's made explicit (and therefore truly guaranteed forevermore by the compiler), I suspect it's actually a recipe for disaster. If nothing else, it could lead to code that's significantly harder to move to different platforms, since it'll start to rely on these simpler, "easier" semantics. Of course, then again, super-scalar out-of-order CPUs still strip a bunch of that away, so who knows, it might not be that bad.

  • Re:Interesting... (Score:3, Insightful)

    by Thuktun ( 221615 ) on Monday September 17, 2007 @01:49PM (#20639559) Homepage Journal

    Of course, it could equally be argued that without Linux, the GNU user space tools would just be a nice collection of tools with no OS to run on...
    It could not be successfully argued.

    GNU tools ran fine on other OSes long before Linux became so popular, including Solaris (and SunOS before it), AIX, HP/UX, IRIX, NEXTSTEP, Ultrix, and so on.
  • Re:Interesting... (Score:4, Insightful)

    by Austerity Empowers ( 669817 ) on Monday September 17, 2007 @02:05PM (#20639837)
    But modifying, even forking GCC is practical and rational, whereas making your own, new, compiler and supporting it for all eternity is not. I can understand much of the BSD bent on licenses, but in this case...I don't see it. Compilers are never "done", and writing one with a license that does not ensure other people's updates make it in is just ensuring that the author is perpetually supporting this himself.

    I can understand some applications having closed source licenses... but a compiler is a means, not an end... it really just seems painful.

  • by otis wildflower ( 4889 ) on Monday September 17, 2007 @02:24PM (#20640191) Homepage
    Seriously, if you're writing code for a living, especially performance-critical code, isn't hardware/platform optimization of the end-use binary far more important than speed of compilation? Particularly if that binary gets blown out to hundreds or thousands of boxes. If I had to choose between a slow but hand-tuned-for-my-platform GCC and a quick other compiler that produced correct but mediocre-performing (no SSE/3DNow/VMX/VIS or whatever) binary code, I'd say GCC, no contest.

    And frankly, slower compilers mean secksier hardware requirements for workstations.. ("Yes, GCC4 is slow, that's why I need that dual quad-core Xeon with 4GB RAM!!")

  • Re:Interesting... (Score:5, Insightful)

    by Anonymous Coward on Monday September 17, 2007 @02:36PM (#20640415)

    I am the GPP, incidentally, and while I'm pleased I got modded up, I certainly wasn't going for funny :). I'm sorry, but C is just a poor choice for ensuring correctness.

    First of all, open up your copy of the C standard (any of them will do) and grep for the phrase "undefined behaviour". C was standardized in a time when everyone and their dog had their own C compiler. Each C compiler did things in a different way, often in contradictory ways. The C standard came along and said "hey, you know what? You're ALL right". I'm being facetious, and the C standard has done a great job in promoting C, but the C standard has really not evolved very far in terms of guaranteeing semantics.

    I don't mean to bring this up to say that "you can't write correct code" in C or such nonsense. Obviously it's easy with good habits (I recommend comp.lang.c as the best place to pick up these habits) to write conforming and well-defined code. But, if you're trying to verify code that's already been written, either by hand or via some automated tool like a static analyzer, it is painful.

    The second problem with C is that it allows a lot of features that make verification of semantics difficult. Pointer aliasing, global variables (even "extern" global variables!!), etc. make static analysis dreadful. If you want to perform static analysis properly on C programs, it's hard to get around whole-program analysis, which is why no one uses static analysis with C code :). Seriously, what does C have beyond lint? How many people even use lint? It's not very useful.

    Of course static analysis is not the end-all be-all of ensuring correct code. There's good coding habits and testing and profiling and whatnot too. But, I would argue that whatever effort can be put into verifying C code can be better put into code in other languages. The semantics of C are sometimes loosely defined, and very often far-reaching, preventing the use of modular reasoning. Whole-program analysis is not your friend.

    What would be really cool to see from someone like the OpenBSD crowd, if they're so keen on C, is some verification tools that maybe only work on a very, very restricted subset of C. Any code in the core OS which did not conform to this restricted, "more easily verifiable" subset of C would be rejected. I don't know how practical it would be, but it would be cool to see :). I mean, as an academic, obviously I think we should all be using Z, but I understand this doesn't make good sense in a lot of real-world projects. But if you want to get serious about correctness, don't pussyfoot around: get serious about correctness.

  • Re:Interesting... (Score:3, Insightful)

    by Splab ( 574204 ) on Monday September 17, 2007 @02:45PM (#20640567)
    It often depends on what you are doing, some programs need fast over reliable - others need stuff like strong exception safety, there should be room for both crowds.
  • Re:Interesting... (Score:5, Insightful)

    by Nevyn ( 5505 ) * on Monday September 17, 2007 @02:47PM (#20640607) Homepage Journal

    Let's at least get RMS's position right:

    Better idea, let's just get history correct.

    The GNU project was founded in 1984 to create a free operating system.

    Ok, true enough.

    In 1991, they were almost completely finished - they had written every essential component of a Unix-like operating system except for a kernel.

    Sure, and I've almost created a free energy device ... I've done everything apart from the one bit that creates energy for free. Also, GNU did not "create" everything else apart from the kernel ... they created some pieces and were doing the distribution work, so other people "donated" their work.

    Linus came along, wrote the Linux kernel,

    True enough.

    combined it with the almost-complete GNU system, and called the whole thing Linux.

    Not even close to true. Linus has only ever distributed the kernel ... other people combined it and called the whole thing things like "Red Hat Linux" or "Slackware Linux". GNU should/could have done this but had not bothered to do the work to make a usable distribution (as more than a collection of tarballs), and were happily ignoring Linux and telling everyone else to ignore it and use GNU Hurd, which would be ready "any time now". This was pretty obvious naming at the time; we didn't call Solaris "GNU/Solaris" when we installed GCC, GNU tar, etc. on it.

    The GNU people were rightly upset that they were getting no credit for their work (to build a complete Unix-like OS).

    They got a huge amount of credit for the work they did. They just didn't get their name in lights ... because they refused to do the work required for that. Then they complained and wanted more recognition than anyone else got who'd done the same amount of work (like Perl or Xorg etc.) ... this created a "slight" backlash from people who actually know what happened.

  • Re:Interesting... (Score:3, Insightful)

    by peacefinder ( 469349 ) <> on Monday September 17, 2007 @03:22PM (#20641221) Journal
    "Or, I understand the philosophy, but I think that particular way of putting it is a bit of FUD slinging."

    I don't think it's meant as FUD. Actually the idea that the code itself has been emancipated, so that no one can bind it or its children with the chains of ownership ever again, is quite an attractive and appealing idea.* Stallman was a bloody genius to come up with such a wild abstraction. He's freed the slaves all over again, it's just that these slaves are not human.**

    However, while this philosophy maximizes the freedom of the code itself, it does so at some cost to other interests. One cannot bind the software with the chains of ownership again - that's the whole point - which is a bit of a problem if one wishes to write code that can be used by any human for any purpose even if it means some of the code's children can be chained up again by some bloody-minded capitalist while others remain free. The BSD license maximizes the freedom of people, while not providing complete protection from slavery for all of the code's children.

    Both licenses attempt to maximize freedom, but they do so in different ways and for different interests; to secure those different freedoms they must make different tradeoffs.

    [*: When I say this I definitely do not mean to spread Fear, Uncertainty, or Doubt. It's meant as a compliment... although there are of course some who will view Stallman's grand idea as just bizarre philosophical wanking.]
    [**: And if you think this debate is interesting, just wait until the GPL philosophy meets seemingly self-aware robots! See also Asimov's book "I, Robot".]
  • Re:Interesting... (Score:3, Insightful)

    by evilviper ( 135110 ) on Monday September 17, 2007 @03:46PM (#20641643) Journal

    GNU is the single biggest "contributor", as it were,

    At what point are you forced to grant credit? Shall we call it KDE/QT/GNU/Linux?

    it could be argued that without GNU, Linux would just be a kernel, with no user space to run.

    No it couldn't. The comparable BSD tools have been around longer than the Linux kernel.

    Of course, it could equally be argued that without Linux, the GNU user space tools would just be a nice collection of tools with no OS to run on...

    Before Linux, it was pretty common to install several of the GNU tools on the closed-source Unix systems of the day. Problems with the GNU tools on Minix were in fact the motivation for creating Linux. Today, everyone who has to use a Solaris system for any length of time installs the GNU tools and tries very hard to forget that the Solaris crap even exists. Bash, if nothing else, is a common install on any Unix system (even though OpenBSD's branch of pdksh, and ksh93, are actually better shells, IMO).
  • Re:Interesting... (Score:2, Insightful)

    by lyberth ( 319170 ) on Monday September 17, 2007 @03:49PM (#20641703) Homepage
    Didn't you read that this is work dating all the way back to the '70s?
    Also, why should they not be allowed to make a new compiler? Toyota started to make cars long after Ford made theirs - even though - eh - Ford already made cars that did all the things that Toyota cars did.
    What license doesn't ensure... oh, let me turn it around: what license ensures that committed code makes it in? The manager of a project allows or disallows code to be committed, not the license.

    A compiler is an application, an application that compiles source code.

    A lot of people on the BSD developer teams are tired of GCC becoming more and more complex and slower and slower at compiling. That might indicate why some are trying to find alternatives.
  • Re:Interesting... (Score:3, Insightful)

    by Cro Magnon ( 467622 ) on Monday September 17, 2007 @03:57PM (#20641853) Homepage Journal

    Toyota started to make cars long after Ford made theirs - even though - eeh - Ford already made cars that did all the things that the Toyota cars did.

    In fact, Fords do something Toyotas don't do, namely explode. :)

    Although GCC doesn't do that (that I know of), I see no problem with having an alternative. There are competing applications for everything else, from desktops (Gnome/KDE) to kernels (Linux/BSD), so why not compilers?
  • by tajribah ( 523654 ) on Monday September 17, 2007 @04:33PM (#20642457) Homepage
    If they are interested in these architectures, they should help GCC folks maintain them instead of resurrecting obsolete compilers.
  • Re:Interesting... (Score:3, Insightful)

    by Bluesman ( 104513 ) on Monday September 17, 2007 @04:43PM (#20642587) Homepage
    The code base of GCC is huge and complex, not to mention sparsely documented. Going through the entire thing to make sure it's perfectly secure is likely a harder task than starting from scratch.

    It's not like a C compiler is a very difficult thing to write. GCC is much more than just a C compiler. But if a C compiler is all you need... maybe GCC is overkill.
  • by Crayon Kid ( 700279 ) on Monday September 17, 2007 @08:15PM (#20645503)
    Please explain how the license for GCC affects code compiled by it.
  • by mrsteveman1 ( 1010381 ) on Monday September 17, 2007 @10:54PM (#20646823)
    What will actually happen is GCC will get forked at GPLv2, if the majority of people are behind the v2 fork the v3 version will become irrelevant.

    I do like the fact that we will (hopefully) have 2 good compilers, but people aren't going to simply start licensing things under BSD just because of GPLv3, they will just keep using GPLv2 because they were using GPL for a reason.
  • That's not a good test, given that Linux depends on gcc-isms. It's not written in any standardised form of C, which would be a far fairer test.
    How can a kernel be written in a standardized form of the C language? The C language does not specify any mechanism for alignment, struct packing, code sections, CPU-specific function calling conventions, or any of the various other attributes, nor does it allow for the kinds of inline assembly language needed to talk to the memory mapper or the I/O hardware.
  • Re:Interesting... (Score:3, Insightful)

    by tepples ( 727027 ) <> on Tuesday September 18, 2007 @12:17AM (#20647473) Homepage Journal

    Of course, it could equally be argued that without Linux, the GNU user space tools would just be a nice collection of tools with no OS to run on...
    Without Linux, the Linux developers would be working on HURD. Without Linux and without HURD, we'd likely be using GNU over Solaris, *BSD, or Minix (all of which are Free by now).
  • by DrJimbo ( 594231 ) on Tuesday September 18, 2007 @03:07AM (#20648451)
    The license doesn't explicitly say you can't modify it, because it goes without saying that you are not allowed to modify it. If the default were for licenses to be freely modifiable by recipients, then they would all be worthless. Also, it would be asinine for the default to be that you can freely modify the license, because then every single license would have to have a standard clause that says you can't modify it.

    The same rule applies to copyright notices. You are not allowed to modify the copyright notice on a work even though it doesn't explicitly say such modifications are forbidden. The BSD license reminds recipients that they have to keep the copyright intact, but this is done as a courtesy and is not required.

    As for your Wikipedia quote, I already gave a detailed explanation of how BSD licensed code can be distributed along with code that has a more restrictive license because this might be what has caused the widespread misunderstanding that you are still suffering from.

    The rule is trivially simple: unless you are given explicit permission from the original author, you can't change the license or copyright notice on someone else's work. Period.

  • by NickFortune ( 613926 ) on Tuesday September 18, 2007 @05:51AM (#20649185) Homepage Journal

    I am not sure what you are suggesting here... Should people who feel that the GPL license is heading in a disastrous direction just STFU?

    Well, no. On the other hand, the post to which the GP was responding seems to be implying that a GPLv3 GCC will infect its output, on account of its being "poisonous and viral". Either that, or it's a screaming non sequitur and totally off topic. You can't blame the GP for giving a poster the benefit of the doubt.

    It really doesn't matter if the new license affects the compiled code.

    What a strange thing to say! It most certainly does matter if a compiler imposes its licensing terms on any programs it compiles. This would be a major show-stopper for all sorts of deployment scenarios.

    Happily, GCC has never imposed such restrictions. You could argue that the point is moot in this case, but it's hardly unimportant.

    In this case the message is that BSD licensing looks better every day.

    I can't see why. It's not like the FSF will (or could) discontinue the GPLv2. If it was a good licence before v3 (and it was) then GPLv2 is still a good licence now.

  • by Mr Z ( 6791 ) on Tuesday September 18, 2007 @09:00PM (#20662585) Homepage Journal

    It is extremely difficult and next to pointless to write code that is strictly conforming. It is in fact quite useful, for instance, to use unions to re-interpret bit patterns. (Note that "portable" is something rather different than "strictly conforming," or even "conforming." Many non-conforming programs are still highly portable because of commonalities among implementations.)

    For example, suppose you want to bit-reverse an entire array in memory. That is, bit 0 of the first element in the array swaps locations with bit N-1 of the last element of the array, bit 1 swaps locations with bit N-2, etc., all the way down to the middle of the array. How would you implement that as a strictly conforming ANSI C program? It turns out to be rather difficult to do correctly. (Why would you want to? Well, it's a handy way to flip bitmaps for one thing.)

    First, you have to know how many bits are in each unsigned char. There could be anywhere from 8 to who-knows-how-many bits in an unsigned char. (Yes, a 256-bit unsigned char is legal in ANSI C, as long as there are at least as many bits in a short int, int, long int, and long long int.) So, you can't rely on any fast, cute implementation such as this ever-popular bit-reversal routine:

    unsigned int bitrev(unsigned int x)
    {
        x = ((x & 0xFFFF0000U) >> 16) | ((x & 0x0000FFFFU) << 16);
        x = ((x & 0xFF00FF00U) >> 8 ) | ((x & 0x00FF00FFU) << 8 );
        x = ((x & 0xF0F0F0F0U) >> 4 ) | ((x & 0x0F0F0F0FU) << 4 );
        x = ((x & 0xCCCCCCCCU) >> 2 ) | ((x & 0x33333333U) << 2 );
        x = ((x & 0xAAAAAAAAU) >> 1 ) | ((x & 0x55555555U) << 1 );
        return x;
    }

    That code relies on implementation-defined behavior. It cannot be part of a strictly conforming program. It can be part of a conforming program, though it only works as expected on machines whose unsigned int is 32 bits. (That happens to be over 90% of the PCs and *NIX boxes people work with these days, but that wasn't true as recently as 10 years ago.)

    What about other undefined things? Well, sometimes an implementation defines them usefully. For example, consider this bit of code:

    union {
        unsigned int word;
        unsigned char bytes[4];
    } foo;

    foo.word = 0x01020304;

    if (foo.bytes[0] == 1 && foo.bytes[1] == 2 && foo.bytes[2] == 3 && foo.bytes[3] == 4) printf("Big endian\n");
    else if (foo.bytes[3] == 1 && foo.bytes[2] == 2 && foo.bytes[1] == 3 && foo.bytes[0] == 4) printf("Little endian\n");
    else printf("Unknown endian\n");

    This is useful code. Chances are nearly every compiler you meet (at least, one that offers 32-bit ints) will handle this correctly and tell you the endianness of the machine. That means it's reasonably portable. It also happens to be quite undefined.

    Sure, it fails miserably on oddball machines with non-standard word sizes, but most programs only care to be portable amongst the vast majority of machines that have 8-bit char, 16-bit short, 32-bit int. (This is part of the reason why LP64 machines are more popular than ILP64 machines.)

    In general, compilers implement a superset of the standard by providing reasonable semantics to expressions that the standard leaves undefined. For instance, on most compilers, signed arithmetic wraps around the same as unsigned arithmetic, and the values you get are exactly what you'd expect from 2s complement arithmetic, despite the fact that the standard leaves those results undefined. Heck, until the adoption of C9x, C++ style comments were not technically legal in C programs, but most compilers accepted them.

