$100,000 Open Source Design Competition

Hrvoje Niksic writes "The Software Carpentry project has announced its first Open Source design competition. They offer prizes totaling $100,000 for people who come up with good designs for tools to replace autoconf and make, as well as for a bug tracking system and a regression testing suite. Good luck!"
  • by Anonymous Coward
    Because only then is it truly free. And yes, I mean both free as in speech and free as in beer.
  • by Anonymous Coward
    Did anyone else notice the requirement that "All tools will be ... be implemented primarily in, or scriptable with, Python." And curiously Guido van Rossum is one of the judges.... Perhaps a little bias in the language selection. Note: not intended to start a language war, just an interesting point.
  • Yes, I have studied Python, and have used it, but only to learn about it. I haven't done anything serious with it. I liked it a lot, but I like other languages better. I'm not totally biased against interpreted languages; I've done major projects in Java and PHP, for example.

    It comes down to choosing the best tool for the job. For one web site I was working on I used a combination of C, C++, bash and PHP. Each did something that it was better suited for than the others. I like having that kind of flexibility, so I don't like the idea of Python, or any other language, being mandated.

    1. Have you studied Python?
    2. Have you used it?
    3. Have you used it for so-called "serious" programming?

    I've seen some amazing software engineering support systems done in Python already. Stuff that is quite a bit more complex to design and implement than make et al. Stuff of the kind that previously made a long-time C++ user like me say "this screams for C++". Until, that is, I saw it done and maintained in Python with much less (and cleaner) code and time than would ever be possible in C++.

    --

  • OK, fair enough.

    --

  • In simple terms, the autoconf-automake-libtool chain needs to be unified into a SINGLE program with a SINGLE database of bits and pieces (which could possibly be overlaid by a user's own database). First, there is the speed issue. Second, there is the problem that version inconsistencies produce (how many times have you had automake or autoconf complain...). Third, and most importantly, each part of the toolchain mentioned is a single point of failure in the build process -- reducing that number to two-ish (gcc, NewMakeTool, possibly libraries) would make building a lot easier and more predictable (the predictability is NECESSARY if you want to distribute source and expect it to compile the same way on someone else's box as it does on yours).
    John
  • Forcing the structures in the source code, by (for example) inventing a new language that uses XML or SGML for the 'authoritative' version, sounds like what you are talking about -- this allows for some of the fancy editing features in Visual Braindamage to be added trivially to any such language.

    Question: What are the pros/cons of what I shall term 'enforced coding structures' -- is there a proper term for this?
    John
  • They aren't exactly robust; depending on spaces vs. tabs often leads to problems.
  • The referees [software-carpentry.com] are all presented on the project home page.
  • That's answered in the FAQ [software-carpentry.com]. Short version: it's very complex, and they want experience with replacing the simpler tools first.
  • I don't think so.

    Because Free Software/Open Source Software is based on collaboration while contests are based on competition. The two are almost perfect opposites.

    In this contest, we will have programmers duplicating their efforts on many projects rather than having many programmers expending their efforts on a few projects. What's more, they won't necessarily be doing it because they are interested in the project, but in some cases purely for the money. And after all, no-one writes FS/OSS for the money, do they?



  • Incremental compiles.

    If you look at the IDE provided by the IBM VisualAge C++ compiler, it provides incremental compiles. So, if you only change a comment within the file, the file won't be recompiled. Again, if a particular function was changed in a manner that didn't impact the rest of the file, only the code for that function is recompiled. This vastly reduces "make" time. Of course, this needs to be supported by the compiler too, but then both the compiler and make are GNU tools, so at least in theory, this is possible...

  • Thanks a lot!
  • Hmm. Let's see, mandate a particular development language for programs to be adopted by "medium-level" program designers.

    Let's apply this "logic" to building skyscrapers: "Design applicants must use Legos (tm) because they are easier for use by new engineers."

    My point: Not all languages have the same strengths or weaknesses. Mandating a particular language that may or may not map well onto the problem domain is a Bad Idea (tm). Of course, GOOD SOFTWARE DESIGNERS already know that this is a Bad Idea (tm) and so they try to choose the best language for the problem.

    The contest is not a bad idea, but the rules are illogical at best. I will be highly surprised if they can build a make or autoconf replacement by the end of this year. These are complex problems, and "just using Python" will not make them easier. Cheers

    --

  • If you look at the page carefully, you will find that the money is being put up by the US Government through the National Labs. So the US Govt is now *directly* funding Open Source. The products would be used to further parallel programming, beowulf, and just plain normal programming in a significant way. Yippee!

    Why Python? A lot of developers at the Labs are biologists, physicists, chemists, weather folk, etc. who develop large beowulf and other numerical codes. Given that most scientists pick up programming without any formal training, they are more interested in spending time on their coding and science than in Makefile and autoconf arcana (and don't tell me that an m4-based config system is not arcane).

    Python is easily understandable. A Java/C++ programmer would pick it up in 2 hours, a C programmer in a day or two. People without any formal training pick it up in a week or so. Existing C, C++ and Fortran code can be easily wrapped using SWIG -- this has been used to implement the outer calculation and visualization loop for massively parallel molecular dynamics programs. See http://www.swig.org (papers) for details.

    This is the basic point: How do you extend the advantages of software engineering to those who are not software developers, but simulation builders, scientists, and other engineers? And how do you stop being elitist and lower the barrier of entry into Open Source for the non-experts and the common man? And how do you ensure extensibility? The answer is scripting, and good coding practices for the non-expert demand Python.
  • I can say that it is (a) very easy to learn without sacrificing power.

    The problem I have with Python is exactly that it has sacrificed some power. Take lists as an example. Python lists are arrays, so inserts and deletes are O(N)!

    I agree that more language choices are good. I also think that there is a right tool for the job, and that the job of collecting dependency information doesn't cry out for a scripting language.

  • heh, the organizers slam `make' and then require python? does anyone else see the irony of this?

    a good chunk of the FAQ is spent defending this decision, w/ the crux of the answer being along the lines of "we feel python is the best compromise...". too bad, one would think the most viable approach is to educate new developers rather than dumbing down the tools.

    --thi

  • My objection against make is not its complicated syntax (which is only complicated because different levels of parsing - make's and sh's - intermix, and regular expressions need a bit of familiarity), but that it is slow.

    There's more to make's apparent "slowness" than meets the eye. Peter Miller [mailto] has written an excellent analysis in his paper, "Recursive Make Considered Harmful [pcug.org.au]" -- his argument is that make has been misused for years, and we need to rethink how we use it. Instead of recursive invocations of make, we need to use the features of modern make implementations (e.g. GNU make) to make whole-project Makefiles that can do the job make exists to do.

    Because Unix projects were once small enough to fit in a single directory comfortably, people got used to the idea of "one directory, one Makefile". When projects began to require many directories to organize the source files, many Makefiles and recursive invocations of make became the norm. This turns out to be extremely inefficient and prone to error, for a variety of reasons detailed in the paper. Instead, he advocates using many fragments of a single Makefile (one fragment per directory) and including those with make's include directive. (Hence the need for a modern make.) The paper also contains a section about writing efficient Makefiles, with techniques to significantly improve processing speed even with traditional recursive make techniques.
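    To make the technique concrete, here is a minimal sketch (mine, not Miller's actual example) of such a whole-project build, assuming GNU make and hypothetical directory names; recipe lines must begin with a tab:

        # Top-level Makefile: one make process, many per-directory fragments.
        MODULES := lib app
        CFLAGS  := -Wall -O2

        # Each directory contributes its sources via a module.mk fragment,
        # e.g. lib/module.mk contains just: SRC += lib/foo.c lib/bar.c
        SRC :=
        include $(patsubst %,%/module.mk,$(MODULES))

        OBJ := $(SRC:.c=.o)

        myprog: $(OBJ)
            $(CC) $(CFLAGS) -o $@ $(OBJ)

    Because the single make process sees the whole dependency graph, a change under lib/ correctly triggers a relink of myprog without any recursive invocation.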

    Common objections to this technique are also addressed:
    • 4.1. A Single Makefile Is Too Big
    • 4.2. A Single Makefile Is Unmaintainable
    • 4.3. It's Too Hard To Write The Rules
    • 4.4. I Only Want To Build My Little Bit
    • 4.5. The Build Will Take Too Long
    • 4.6. You'll Run Out Of Memory
    While these techniques don't seem to have caught on much yet, there are some real-world projects (e.g. XEmacs [xemacs.org]) that seem to be doing so successfully...

    Take the time to read the paper; it looks to be worthwhile...
  • Ok, they had me up until the bit about having to build the tools in python.

    Who cares if it makes sense? They're paying. I've been using an inappropriate language (required by non-techies) for 13 years, but the paychecks just keep coming. :-)

    It's like programming for Windoze. Even if the platform doesn't make sense for the application, that's what the customer has. If the first words out of your mouth are, "Ok, first of all, we need to get you a modern computer," then the job will go to someone else.


    ---
  • why do they want to replace the tools in question?

    Cynical answer: Licensing? make and autoconf are the backbone of a lot of what we do, and both are GPL'ed. Their license is not going to be GPL; it's going to be MIT-X style.

    why not just extend them as appropriate?

    Agreed, a good GUI would do the trick for a lot of people. If the intent of this really is to improve functionality, though, I am sceptical that radio-buttons would really achieve this.

    I read the link pretty quickly...is this the DOE funding it? It looks like it. Does anyone know of any govt issues with GPL'ed software?

  • This is a response to your comments about a 'database' based compilation system. The ReiserFS filesystem, which has been in the /. news lately due to its support for journaling, is designed for handling a multiplicity of small files/large directories efficiently. The FS creator, Hans Reiser, is trying to move the filesystem in the direction of being able to replace your 'database' without any loss of functionality or speed, and without sacrificing readability.

    Check out his future plans on his site. Don't have URL handy, sorry.
  • They are actually throwing (mostly) US taxpayer money at the project; their contribution runs about 40% for the initial design, and about 5% for the total project. The rest is through government grants, according to their web page.

    LetterRip
  • autoconf and automake are hideous.

    I'm not a dumb guy. I've been coding for various platforms and languages for many years, and always enjoy playing with new technology and software. But next to sendmail configuration files, autoconf/automake seem to have been created by the devil himself. I've never seen such an orgy of scripts, binaries and m4 working to create makefiles that take way too long to comprehend and debug. It shouldn't be this complex and bloated.

    make, by itself, is perfect as a build tool. What I do like is the way glib/gtk+/gnome/libxml and other tools provide you with a -config script which you can call with backticks in your compile lines (e.g. CFLAGS = `gtk-config --cflags`).
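    A complete Makefile in this style stays tiny; a hypothetical sketch (the recipe line must begin with a tab):

        # No autoconf: the -config script supplies all platform-specific flags.
        CFLAGS = -Wall `gtk-config --cflags`
        LDLIBS = `gtk-config --libs`

        hello: hello.c
            $(CC) $(CFLAGS) -o hello hello.c $(LDLIBS)

    The backticks are expanded by the shell when the recipe runs, so the flags always match whatever gtk-config reports on the build machine.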

    glib (not to be confused with glibc) also provides many safe routines (i.e. a safer sprintf()) and utility functions we all could use and have probably written for ourselves. It also guarantees the existence of various core functions, by simply either being an alias for the system function or, where it doesn't exist, providing an implementation. Thus instead of targeting individual Unices, I assume the target platform has ANSI C (or C++) plus glib and any other libs I need. No need to worry whether it's strings.h or string.h, or whether there's a strdup().

    Thus, it is my humble opinion that autoconf and friends should be replaced by simple make, *-config scripts and glib (or some other common library).
  • Now, the second issue is a bit different. You talk about storing intermediate data, such as parse trees. An extension to the "everything under revision control" method. What benefit does this get you? If the source has changed, you are going to have to rebuild the output anyway. If the original has not changed, you can just use the object file from revision control. What is the point?

    Oh man, I wish I had the references handy, but there are a couple of algorithms for incremental lexing and parsing which run in O(n), where n is the number of *modified* tokens in the input. If you keep intermediate representations around, you don't have to rebuild the entire parse tree just because you changed X *= 2; to X *= 3; in a 50,000 line source file.

  • Given that the solutions provided by autoconf and make aren't needed by Python programmers, *why* would you expect a Python programmer to want to scratch this itch?
    -russ
  • Similarly, make [...] needs a simple GUI front end for newbies more than it needs rewriting.

    See http://www.alphalink.com.au/~gnb/maketool/ [alphalink.com.au]

    Disclaimer: I wrote it.

  • I don't suppose you've considered the facts that:
    1) It's an Adobe product, and those are all huge resource hogs to begin with.
    2) He's running win98, which is even worse about resources than Adobe? A friend of mine has to reboot his win98 box every few hours because, half the time, it doesn't free memory after use - with any software.


    Dreamweaver
  • Ten years ago, someone could have waved a hundred thousand dollar check in front of Linus's nose, and it wouldn't have got us to this point any faster.

    It wouldn't have gotten us to this point at all - Linus first wrote Linux because he couldn't afford anything other than Minix.

    Lack of money can be a good thing. (VERY unusual though. ;-)
  • by Anonymous Coward
    Why fix something if it isn't broken?
  • RMS doesn't slam the MIT X license in particular, he is advocating the benefits of copylefts. If you don't want a copyleft [gnu.org], the MIT X license is fine. I believe he described it as "a simple non-copyleft free software license with no particular problems" at one point.
  • Minix was broken (not properly supporting 32-bit platforms), and not really fixable (the license only allowed distribution of fixes in the form of patches).
  • I don't see the relevance of motivation for a postulated "classic OSS model" in a project where US$200,000 will be invested in development.

    What can happen is that the new utilities are enough of an improvement for some projects to switch to, but that the vast majority will stick to the existing well-known tools, since they are "good enough" and most developers already know them. This means that the programmers most likely to improve the existing tools will instead use something entirely different for their projects, so the majority will end up with worse tools than they would have without this competition.

  • I read the link pretty quickly...is this the DOE funding it? It looks like it. Does anyone know of any govt issues with GPL'ed software?
    Well, given that the development of GNAT (GNU Ada) was funded with the specific requirement that the code be made GPL, I doubt the government as a whole has a problem with GPL'ed code.
  • Arbitrary inserts and deletes may be O(N), but lookups and appends are O(1). I wouldn't consider this to be exactly a case of 'sacrificing power', although you might prefer real lists (actually, I probably would, and car and cdr and a real lambda, but anyway..)

    Aside from that, did you read my post? I said that it would be a good idea to write the dependency-collection code in a C module. But I can definitely see the advantages for writing a large part of the code in Python.

    Daniel
  • It'd be nice to have one language with extensions that are reasonable, and understandable.

    Well, there's Python :-) Seriously, I believe that is one of its design goals, and having learned it I can say that it is (a) very easy to learn without sacrificing power (aside from the flexibility/speed tradeoff you make with an interpreted language, but that's not really a power issue per se; you could even say it's more powerful for some things), and (b) highly extensible -- everything is provided via objects and modules, and I mean *everything* -- files and strings are objects; all routines for string manipulation, regex searching, and so on are in modules, etc.

    Of course, the only thing I use it for these days is writing automatic mailhandlers (it has excellent mail-parsing facilities) -- mainly because I don't run into much stuff where I need Python's capabilities. Go figure :)

    Daniel

    PS - while Python is nice, having many computer languages is a **GOOD THING**. Having a language that more closely models the problem -- for example, being able to use a functional language like Lisp or ML -- makes things a lot simpler and faster. More languages means more choices. End rant. :)
  • I mean, honestly, what is a "filesystem" except a way of organizing chunks of arbitrary data in a tree structure?

    If you have a "database" that also can organize chunks of arbitrary data in a tree structure, but is so much faster than ext2 at this that you believe it should be built into programs... why not wrap a filesystem driver around it and build it into every program at once?

    You're right, I was not thinking of arbitrary data when I spoke of that "database". I want to exploit features of a certain kind of data that should be stored efficiently - the intermediate code representation.

    We usually store the initial representation of the program (the source) and the final representation (object files, executables, libs..) but throw away the intermediate representations that compilers, assemblers and linkers calculate time and time again.

    For a few cases we do keep those intermediate representations. Precompiled headers are an example of an intermediate result kept near the beginning of the translation chain; template repositories (used in AT&T's cfront-derived C++ compilers) are an example near the end.

    What I thought of is similar to one of those relational databases, in that the focus shifts from the raw input data to the intermediate representation, which would be kept as a whole.

    Let's say the running core would consist not of a collection of files of characters, but of data structures that are more suited for compilation -- typically a forest of parse trees and related intermediate information for the whole project.

    One could import/export sources from that core, and the compilation process would be a special kind of query that exports binaries from that core.

    More global optimization strategies would be possible because compilation would proceed on the whole project, not only on single compilation units as with normal compilers.

    The trouble with template instantiation would be gone.

    But, yes, now comes the but :-), such a system is certainly complicated. I had a lot of fun with precompiled headers and template repositories in the past, so I assume it is not easy to write them in a way that they work flawlessly.

  • Another one - pmake [usrsharedo...perasciigz]

    PMake - A Tutorial
    Adam de Boor

    If there's a missing requirement in the rules of the contest, it's the lack of a migration path from make. Without one, you just have an interesting toy, because no one will move an existing significant system.

    Important point!

  • One way this has been dealt with in the past is to store everything (source, source deltas, compiled objects and linked executables) in a source code management system along similar lines to SCCS or CVS. I cite as an example the CMS tools on OpenVMS.

    I am sure there is a lot to learn from those old systems.
    (Why some of the VMS guys inflicted Windows NT on us is a different thing :)

    That we seem to have lost some features as well, rather than only progressing, shines through in the hacks and rants of great hackers like Richard Stallman and Jamie Zawinski. Take this quote

    Back before the current dark ages began, I hacked on Lisp Machines. (..)
    Have you ever wondered why we're stuck with crap like Unix and MS-DOS? Read Richard Gabriel's Worse is Better paper for a great explanation of why mediocrity has better survival characteristics than perfection

    from Jamie's page [jwz.org] and this quote from Richard

    Yes, with string-based interpreters you can represent programs as strings, and you can interpret them as strings, but you can't parse their syntax very easily as strings. In LISP, programs are represented as data in a way that expresses their structure, and allows you to easily do things to the programs that are based on understanding them.

    from a recent RMS interview [linuxcare.com]. Both refer to the LISP machines of MIT, which seem to have operated on a higher level program representation than mere strings.

    I interpret these rants as saying that today's machines are stronger but, in a sense, dumber.

    The question is whether one could combine the strengths of both worlds: the higher-level representation found in LISP machines and the performance of our present C-compiled systems.

    Like I tried to explain above, my feeling is that this could be achieved by shifting the primary representation to something closer to the intermediate structures that arise during compilation. It would indeed have similarities to a configuration management system. Adding a line to a text source would, after check-in, result in an immediate update of a persistent parse tree in the program database core.

    OpenVMS compilers and the linker can be invoked to operate on CMS objects without having to pre-fetch anything first.

    Do you have any reference where I can read more about CMS? (I would be happy also to have some nice review on the strengths of the LISP machines)

    If we were to implement something like this for ourselves I'd say the first thing to do is to find a lightweight, fast and efficient implementation of an object repository. Does anyone know of such?

    Sounds to me like what OODBs are promising. The one I have tried so far (POET 3 under Win32 and Solaris) was horrible. No idea how they perform today, as they seem to be two major revisions further along.

  • You speak of checking sources in and out of the "core" (I assume you mean a revision control repository), and checking binaries out. You also mention keeping parse trees and similar temporary data structures in the repository as well.
    We've really got two things here.

    No, I am not talking revision control repository here. I am talking moving towards a representation that is a bit further down the road of compilation.

    Obviously it is possible to represent a program at many different stages:

    1. Starting from a collection of text files to
    2. preprocessed intermediate files,
    3. internal parse trees,
    4. other intermediate representations like RTL in case of gcc and possibly
    5. assembler source to
    6. object files to
    7. linked executables.

    If we treat each translation step as a mapping between representations, I can draw the compilation process this way:

    1 -> 2 -> 3 -> 4 -> 5 -> 6 -> 7

    In fact not all mappings are one-way (losing information); some are (or could be made) bijective:

    1 <-> 2 <-> 3 -> 4 -> 5 -> 6 -> 7

    I now suggest focusing more on the program in the representation around stage 3. Why?

    Because this is the representation in terms of the language, on that for example the compiler works on for code generation.

    And I am not thinking just of compilation here. I am more thinking of turning the collection of sources into a database of things more understood, e.g. program-related units like classes, functions, macros, ..

    Take Emacs for example. Press Esc-x and then spc and you get a listing of all available functions. The Emacs kernel "knows" all of its functions. Compare that to grepping a deep C/C++ source tree for some function. That grepping should be replaced by a more appropriate query on a more appropriate database. (Like that Esc-x spc sequence is a query on the dumped Emacs kernel.)

    If the source has changed, you are going to have to rebuild the output anyway. If the original has not changed, you can just use the object file from revision control. What is the point?

    Not every change to the source would result in a complete rebuilding of the internal structures. Some mechanism, like the access optimization found in relational databases, would have to decide whether the whole has to be rebuilt or only parts have to be changed. This is certainly one of the hard parts.

    Now, I suppose you could argue that you don't always need to recompile an entire source file; you may have changed just one function. But to know that, the compiler is going to have to do a source analysis of it anyway.

    If I edit at the text stage you are right; possibly most changes will result in a new translation. If I edit at the language stage ("add a function", "delete that class"), not necessarily.

  • I suggest to have a look at this article [auug.org.au] on the problems with recursive makefiles.
  • Sorry, posted a local link. ;)

    Here [freebsd.org] is the one I meant (HTML [freebsd.org], PS [freebsd.org], ASCII [freebsd.org])

  • Ok, they had me up until the bit about having to build the tools in python.

    As far as I understand ("scriptable with Python"), they want to be able to run it from Python.

    What I am more surprised about is to see Chris DiBona on the referee board.

    He worked on the "Open Sources" book and is the leading PR guy of VA Linux (for a bad example of his advocacy style, listen to this interview [tamu.edu]). But there it ends. He was certainly not picked because he is an expert in tool design.

    Does anyone know about the other referees?

  • I mean, honestly, what is a "filesystem" except a way of organizing chunks of arbitrary data in a tree structure?

    If you have a "database" that also can organize chunks of arbitrary data in a tree structure, but is so much faster than ext2 at this that you believe it should be built into programs... why not wrap a filesystem driver around it and build it into every program at once?
  • One way this has been dealt with in the past is to store everything (source, source deltas, compiled objects and linked executables) in a source code management system along similar lines to SCCS or CVS. I cite as an example the CMS tools on OpenVMS. OpenVMS compilers and the linker can be invoked to operate on CMS objects without having to pre-fetch anything first.

    I'm not sure exactly how stuff is stored in CMS (apart from the obvious fact that source deltas are stored as diffs) but I'd guess that there must be some sort of efficient indexing involved since access to such a store doesn't have to be quite as flexible as a general purpose file system.

    If we were to implement something like this for ourselves I'd say the first thing to do is to find a lightweight, fast and efficient implementation of an object repository. Does anyone know of such?

    Consciousness is not what it thinks it is
    Thought exists only as an abstraction

  • I can't see anyone slapping together a tool to replace make any time soon... that's one piece of heavy parsing code.

  • To this and similar proposals, the complaint has been made that it undermines the cooperative nature that should properly inhere in the world of free software. If I compete, and in order to do so I write some incredibly cool tool, I'll probably keep the tool a "trade secret" in preparation for the next competition. This is a completely valid complaint; people are put in competition with one another when they ought not to be.

    The free market is a wonderful thing, and you don't want to discard the parts of it that work well. It's not unreasonable to offer compensation to somebody to write a useful piece of software.

    I recently came across a proposal by an economist (in the UK, I think) called "social policy bonds", which is applicable here. His proposal was that the government would create a financial instrument (a piece of paper) which could be redeemed for a fixed amount of money when some measurable social goal was fulfilled. Once the bonds were created, they would be auctioned to the highest bidder. A further free-market tweak could be put on the idea: bonds are issued by individuals rather than the government. Rather than collected tax dollars, an individual puts a chunk of cash in escrow with a private financial institute, which gives the individual a certificate serving the same function. If the condition is met, anybody can redeem the certificate and take the money out of escrow. (Until that time, the escrow agency can invest the money, or collect interest on it.)

    I've discussed this idea with a couple of banks in my area and they aren't interested in acting as escrow agents. The idea is too weird and new for them. Maybe I'll try insurance companies. The viability of a certificate is contingent upon the reputation of the escrow agent.

    A similar instrument could be used in place of Software Carpentry's competition. It would remove the stipulation that only one person could prosper for each goal. People would be able to profit by contributing to the efforts of others. As with shares of stock, all the owners have an incentive to cooperate to cause the price of the shares to rise.

    The economist who originated this idea is named Ronnie Horesh. His proposal [geocities.com] goes into much greater detail than I have done here. It's a cool idea, probably useful for many different goals.

  • How does implementing a tool in a scripted language make it easier for newcomers to learn and use?

    It doesn't. It makes it easier for the evaluators to evaluate. Once they get a set of winners, why don't we port 'em to C++?

  • "If you don't see any problem with that, just look at what RMS have to say about it."

    Why should anyone care what RMS thinks about someone else's project? Have we all lost the ability to think for ourselves?
  • "Now we can see the philosophy of some of those who support other "free" licenses."

    I seem to recall the original post in this thread called MIT/X the "wrong license". Are you saying that it's okay for GPL advocates to criticize MIT/X, but that it's at the same time wrong for MIT/X advocates to criticize GPL?
  • "The GPL is shorter than most EULAs, and it only really places one restriction on the code; You can't take advantage of this community work unless you are willing to participate in the community by giving back."

    Unfortunately, the legalese in the GPL creates some very problematic restrictions above and beyond "giving back" to the community. First of all, anything given back must also be GPL. Second, anything that dynamically links to a GPL library (remember, RMS doesn't want you to use the LGPL) must also be under the GPL, even if the code is 100% wholly your own.

    "The GPL is far less restrictive than any commercial license with which I'm familiar."

    So what? Khrushchev was less restrictive than Stalin, but what difference does that make? However, a commercial license does have one huge advantage over the GPL: no commercial library that I am aware of tells the developer what license they must use.
  • "However, I thought that people willing to contribute/take part of this contest should be warned that the code they submit might be made propietary by the organizer or someone else."

    Of course we are aware of that! Do you think we are children unable to read the license for ourselves? We are adults and fully capable of deciding for ourselves what to do with our lives and our code.
  • No, I don't think it should all be BSD or MIT or whatever. You were making the argument that the GPL was the single best license because of a single comparison with a bad license. I tried to illustrate the fallacious argument with an absurd comparison.

    People should use whatever license they choose. Hopefully they will choose a license appropriate for their projects. It's not much skin off my back if they choose something I don't agree with. It's their life and I'll let them run it. Even if they choose a commercial license. Even if they choose the GPL for a library.

    And as for the lack of chafing under commercial licenses, re-read my post. No commercial license dictates what license you may or may not use.
  • I haven't used a huge variety of commercial libraries that include the source code, so I'll stick with one that everyone knows of, the MFC.

    I can write an MFC application and license it under **ANY** license I choose, including GPL, BSD or public domain and I can freely redistribute the MFC dll's. Compared to the GPL, the MFC license is much, much freer for me to *use* in the way that libraries are used.

    If a developer wishes to use the GPL for a library, then that is their right. But understand that by doing so, they deny the *use* of their library to everyone developing under a different license or using a non-GPL library.
  • "He didn't move for restrictive legislation..."

    But he did call for taxation on all software in order to fund free software. And he did call for the elimination of artistic rights.

    No other faction of Free Software, or even commercial and proprietary software, has the temerity to *demand* what other developers do with their software.
  • "What's wrong with a bit of Marxism, anyway? It sure makes programmming easier - which is what I care about."

    With the MIT or BSD licenses, I can include the entire readable and understandable license at the top of each and every source file. I don't have to wonder if I'm going to be sued for using library A with library B. In short, since there are fewer restrictions with the BSD or MIT, programming is easier (unless you like the lack of choice inherent in Marxism).

    "Just don't try telling me on one hand that the GPL is evil because it forces people to give away their code, and then encourge people to licence under the BSD so the code can be "truly free"."

    No one was calling the GPL evil (at least in this thread). But it's absolutely ludicrous to call the GPL more free than the BSDL when in fact the GPL is more restrictive. And though it's not evil, it is questionable that the GPL requires its users to redistribute political propaganda.
  • "Are you allowed to modify the MFC and redistribute it under ANY license?"

    No, which is why I prefer any Open Source license to Microsoft's EULA. However, there are many conditions where I am FREER with MFC than with a GPLd library. And one of these is a very common condition, that of writing an application.

    If Microsoft says "you can do A but not B" and GNU says "you can do B but not A", then there will be instances where the GPL will be the more restrictive license.

    "There is still the LGPL, whose use is discouraged by RMS, but is prefered by RMS to any other license, save the GPL, for libraries."

    The contender for my biggest beef with the GPL is the restriction against dynamically linking to non-GPL code. RMS is just plain wrong on this issue. It is at this point that the GPL escapes the bounds of polite society and starts sticking its nose where it doesn't belong, namely, other people's code. And this isn't just some petty gripe. The very desktop that I am using has been declared *illegal* by Redhat.
  • "But neither thr BSD or MIT licence force you to include the source, do they?"

    But you were arguing that the GPL makes programming easier. Stop changing the subject :-)

    "...but apart from that I don't know what you are talking about."

    I'm talking about that whole introduction at the top of the GPL before section zero.
  • I agree totally.

    For a crowd that likes to toss around the words FREE and FREEDOM as much as they do, they are very stuck on the notion that there is only one right way to do something.

    The quickest way to earn the enmity of the Free Software community is to freely and voluntarily choose something. Choose *BSD and they bitch about proprietary exploitation. Choose KDE and they moan that it's illegal. Choose Redhat and they kvetch that it's too commercial.

    Choose to award someone $100K in a contest to improve autoconf and they're incensed that someone would spend his money without asking permission of Slashdot first.
  • "I actually like autoconf, and I have been using it in some of my projects, but the Makefiles generated by automake are just too bloated"

    I think you just answered your own question.
  • Hmmm, I didn't know that the DOE was funding this. This is not a good thing in my view. I can see one of my earlier posts is going to come back to haunt me. You'd think I'd learn to read the entire article first before commenting on it :-)

    However, I notice that most people bitching about this are not complaining about the source of funding, rather they are bitching that money is going someplace other than where they want it to.
  • I can understand wanting to standardize on one language to help make this "suite" a cohesive whole, but they've got to select the right tool for the job. Hell, I don't even have python installed on most of the boxes I use, but you can bet c and c++ will always be there.

    What exactly is the problem? Yeah, I guess Python won't be installed on every machine - but neither will any of those new tools. And if you are going to install new things, what's the problem with installing Python?

    -- Abigail

  • Why fix something if it isn't broken?

    I don't recall anyone saying to Linus Why are you fixing Minix? It ain't broken! Sometimes, you just want to find out whether you can do better than what already exists.

    -- Abigail

  • If you don't see any problem with that, just look at what RMS has to say about it.

    RMS isn't God, GNU isn't a religion and the FSF isn't a bunch of prophets. We all know RMS is hardheaded, and only believes in himself. But many people have found out that there are more ways leading to Rome. You might have an opinion about the choice of license, but just shouting "Wrong license!" and waving a political document written by someone else as "evidence" doesn't get you far.

    If you don't like the license, don't participate. Don't use any of the products that might be created with this license. You might even write something better, and release that under your preferred license.

    Just don't act as a doomsayer.

    -- Abigail

  • However, I thought that people willing to contribute/take part in this contest should be warned that the code they submit might be made proprietary by the organizer or someone else.

    This suggests worse things than can actually happen. While the X (or MIT) license allows you to relicense code for distribution, it doesn't allow you to strip off an existing license. What that means is that if I write code, and license it under the MIT license, you can take the code, possibly modify it, and distribute it under a license of your choice (GPL if you wish). But that doesn't mean I can no longer distribute my code under the MIT license. I don't lose rights. But I give you more rights than you would get had I distributed the code under GPL.

    You can take MIT licensed code, modify it, and distribute it as GPL code. You can't do it the other way around. MIT licensed code allows you to do anything that GPL code allows you to do - and then some. It's not hard to figure out which gives you more freedom. I prefer MIT style licenses because I don't want to have to back up my preferences with legal action.

    -- Abigail

  • You speak of checking sources in and out of the "core" (I assume you mean a revision control repository), and checking binaries out. You also mention keeping parse trees and similar temporary data structures in the repository as well.

    We've really got two things here.

    The first is the question of keeping just "source" files under revision control (and by source, I mean anything you use to build an executable - code, images, resources, etc.), or keeping everything (object files, final executables, etc.) under revision control.

    The argument for "just source" is that you should always be able to build an identical finished product from the proper source, and so keeping generated output files around is a waste of machine resources. You also run into fewer problems with the unexpected dependencies and conflicts you encounter during debugging.

    The argument for "everything" is that you can reduce build times by having those output files pre-generated, so the burden of rebuilding on a change is put on the person making the change, and everyone else just uses their output. You can also make the argument that finding the above-mentioned dependencies early on will lead to better code.

    As far as I'm concerned, this is largely a matter of opinion, and you should go with whatever works for you.

    Now, the second issue is a bit different. You talk about storing intermediate data, such as parse trees. An extension to the "everything under revision control" method. What benefit does this get you? If the source has changed, you are going to have to rebuild the output anyway. If the original has not changed, you can just use the object file from revision control. What is the point?

    Now, I suppose you could argue that you don't always need to recompile an entire source file; you may have changed just one function. But to know that, the compiler is going to have to do a source analysis of it anyway, so why bother trying to cache the output? If your source files are big enough that this is a significant problem, you probably need to look at splitting up your source a bit more. It isn't just increased build times that are at issue here; programmer comprehension drops the bigger a source file gets.

    Don't get me wrong; I'm not trying to shoot you down here. I'm just trying to see what benefits one would get from the ideas you are suggesting.

    Incidentally, Borland's Incremental Linker, used in C++Builder and Delphi, does do something similar to what you are suggesting. If you change one object file and go to relink to make an updated executable, it simply replaces the parts of the executable that depend on the changed object code. The parts that did not change stay the same. Saves a little time.
  • (try telling MSVC to run lex and yacc and then compile the output files, using only .dsp files! HA!)
    I do this all the time. It's a bit fiddly, because you have to muck with that horrible custom build step dialog, but it's not even buried too deeply in the GUI. Just say "grammar.y" produces "grammar.tab.c" (and "grammar.tab.h" if you're having it make tokens for the lexer too) by means of "bison -d grammar.y", and put both files in the project. Similar for lex.

    That said, I still prefer makefiles. Much easier just to set up a general rule and tell make to go to it.
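    For comparison, a hypothetical sketch of such a general rule in GNU make (file names made up; recipe lines must begin with a tab):

        # One pattern rule covers every grammar: bison -d emits the
        # .tab.c/.tab.h pair for any .y file.
        %.tab.c %.tab.h: %.y
            bison -d $<

        # flex writes lex.yy.c by default.
        lex.yy.c: scanner.l
            flex $<

        parser: grammar.tab.o lex.yy.o
            $(CC) -o $@ grammar.tab.o lex.yy.o

    Set up once, the rules cover every grammar in the project - no per-file dialog clicking.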

  • None of the labels you mention is entirely off base. His writings definitely do borrow rhetorical techniques and concepts from those of Marxism. I don't know about "Svengali," but there is some legitimacy to the claims that he is a fraud.

    What's wrong with a bit of Marxism, anyway? It sure makes programming easier - which is what I care about.

    I personally believe that the FSF's 501(c)(3) tax exemption was obtained fraudulently, because the purpose of the FSF is (and always has been) to compete directly with for-profit businesses. This is not allowed, and since he intended to do this from the start, it may well be that he could be accused of defrauding the IRS.

    Glad to see someone is watching out for the poor IRS!

    Not just by "giving back" -- that's commonly done under other licenses such as the MIT X and BSD licenses. Rather, the GPL demands that the author give up any prospect of licensing his or her work for money. He or she must give the code not only to the original developers but to everyone for free. This is an onerous requirement which, as Richard Stallman himself states, is designed to reduce programmers' salaries and compromise their livelihoods.

    But at least then, when that programmer is gone, you can guarantee that you have an up-to-date copy of the source - try that with a commercial licence - and don't give me that line about how all BSDL software producers are nice guys who throw in the code for free - that is the same as the GPL, and invalidates your argument.

    Not true. I can pay a fee to license a commercial, royalty-free software library and use it in my work without being forced to compromise my livelihood as a programmer.

    Until you need to fix a bug in that library, and find you don't have the source and the company is out of business.

    RMS's discussion of such licenses isn't reasoned. It's demagoguery which is designed to deceive and to hide his true intent.

    As opposed to your well reasoned and level headed criticisms of the GPL, say?

    You know what I like best about the BSD licence? The fact you can re-licence code under it to the GPL. Life is beautiful, sometimes!

    Look, I have nothing against the BSDL - it is a nice licence if you want to let others make money off your code. Just don't try telling me on one hand that the GPL is evil because it forces people to give away their code, and then encourage people to licence under the BSD so the code can be "truly free".

    GPL - all the source, all the time!

  • No, it makes programming much harder by imposing a long, complex, baroque, restrictive license -- loaded with political baggage -- upon the code. The GPL is to open source software what Soviet Communism was to socialism -- a scheme which claimed to be idealistic but in fact had much more base motivations. Want programming to be easier? Want to see the state of the art advance? Use an open source license that makes the source truly available for reuse by all with virtually no strings attached. The GPL is not that license.

    Sure I want to see "the state of the art advance" - and I want to see the code. With the GPL I can and will.

    I'm not just watching out for the IRS; I'm watching out for anyone who pays taxes or contributes to United Way. By dubbing itself a "charity," the FSF has extracted money both from the taxpayers and from people who believed they were making donations to truly charitable causes. The FSF is not a charity, since it does not reserve its work for those in need and because its primary purpose is to compete directly with for-profit businesses.

    LMAO! Somebody has a big chip on his shoulder!

  • The source code is available for FreeBSD & OpenBSD, right? There is no difference between that and GPL'ed code, except that GPL'ed code can never be hidden, while BSDL code can.

    That means that if a company wants to make some advance then with Linux they have to release the code (ignoring Binary Kernel Modules) while with *BSD they don't.

    I guess you'll say "but they often do release the code after a while" - well, I don't want to wait and hope. I want it guaranteed!

  • With the MIT or BSD licenses, I can include the entire readable and understandable license at the top of each and every source file. I don't have to wonder if I'm going to be sued for using library A with library B. In short, since there are fewer restrictions with the BSD or MIT, programming is easier (unless you like the lack of choice inherent in Marxism).

    But neither the BSD nor the MIT licence forces you to include the source, do they? That is what I like about the GPL.

    But it's absolutely ludicrous to call the GPL more free than the BSDL when in fact the GPL is more restrictive. And though it's not evil, it is questionable that the GPL requires its users to redistribute political propaganda.

    No one is claiming that the GPL is more free than the BSDL. It's not - and that is why I like it.

    As for political propaganda - well, I guess it forces you to redistribute the source, which can empower those who normally wouldn't have it, but apart from that I don't know what you are talking about.

  • Whilst I applaud any company that's ready to spend substantial wads of cash on OpenSource development, I really think that competitions are the wrong way to go about it:

    I thought this at first, too. Seems like a dumb thing for an "Open Source" company to do, since it appears to encourage competition at the expense of the OSS model. But upon reading the webpage I discovered there's more to the story.

    • It encourages secrecy and non-cooperation between the various people working on projects like this.

    Actually, after the initial design phase, finalists are monetarily encouraged to join forces, since they can double their take if they win.

    • It doesn't encourage the best people to do the work because they'll say to themselves "I could work for six months on this - and then lose the competition and get nothing".

    This is not obvious from just reading the blurb, but the biggest part of the competition is purely to design such tools; implementation details are discouraged. Though a good design is quite hard, it doesn't take six months, especially if the designer in question has already bounced around such ideas. Perhaps the money will draw out a designer who's been itching but hasn't yet scratched.

    Also, the deadline for the initial design submissions is March 31, so the most one could waste is ten weeks of planning. (And since it's just a design, there's no debugging!)

    • The wording of the competition seems to prohibit developers from doing the rational thing which is to start with the best parts of the existing autoconf/make system and just fix whatever is perceived to be broken.

    From the rules (emphasis is mine): "Designs based on existing tools, written in any language, are welcome. Such designs will be judged on the same basis as those written from scratch." So this is still possible.

    • Putting money into OpenSource teams has to be done with great care since it can often result in serious internal debates about who contributed most and who deserves what share of the money.

    You're right on with this one. The rules of the game change a bit when we're talking about quantities of money rather than merely number of listings in the CREDITS file.

    Failing that, break the money up into $20k 'grants' and offer them to people who are already working in the right direction.

    As I said, existing projects can compete. And the $100,000 is broken up into $2500 apiece for each of four finalists in the four categories, and then a second award of $7500 for the final winner in those categories and $2500 each for the runners up.

    This competition is A Bad Thing.

    Again, that's what I thought at first. But after reading the rules I had to agree it's not as dumb as it sounds. They're merely putting some money into trying to find the best design for some new tools (since they're claiming that the inherent design limitations of the existing tools are what they're trying to overcome). Then once the initial submissions are weeded through, they basically let the OSS model take over, even monetarily rewarding finalists who join forces and end up with a better final design.

  • This is a real question, no offense meant, no flamebait. Isn't autoconf just there to work around a) the problems of the C programming language with programming across platform boundaries (e.g. sizeof(int) differs) and b) the differences between *ix systems (function call XYZ is not available on some systems, has another name on another system, etc.)?

    If so, wouldn't it be a better approach to use a standardized high-level programming language that is a bit further away from the operating system? I know C cannot be dropped immediately, but if more and more tools are built to cover its shortcomings...

    OTOH, I like make, I'm using it on all kinds of platforms (Linux, Sun OS, Win32) to create programs in different languages or just to rebuild some LaTeX source.
  • You're spot on about the uses of autoconf you mention, but it has other advantages as well. It encourages the use of very deep packages without having to remember the (nightmarish) rules about recursive Makefiles. Instead of dumping all my source in one directory, I can organise it in a logical fashion and get the flexibility, dependency checking and modularity that go with it. That's just too much work to do every time with vanilla make, I find.
    Autoconf also simplifies creating a distribution (make dist), putting checks into your code (make check), installing and uninstalling correctly no matter where the prefix is, and checking for the correct libraries before compile time.
    Not that I don't understand make - in fact I think that using autoconf and scanning its output has helped me get a handle on how make does its magic - but doing it by hand? Eeek.
    A standardised programming language would work. I think Perl Makefiles are already a step towards that idea. But a new one would take time to catch on. Nearly all GNU stuff comes with configure scripts. I get nervous when I download source and it doesn't have a configure, because 90% of the time something is wrong and it needs a bit of makefile hacking.
    A front-end for autoconf would be a good idea, as proposed by some previous posters. Those who want to learn it before a nice GUI comes out or a replacement is found can check Havoc Pennington's online book [gnome.org], which has a chapter on the package.
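    (For anyone who hasn't looked: the input you actually write for autoconf/automake is short. A minimal, hypothetical sketch - the package name, files and checks are made up for illustration:

        dnl configure.in -- the autoconf input
        AC_INIT(src/main.c)
        AM_INIT_AUTOMAKE(myprog, 0.1)
        AC_PROG_CC
        AC_CHECK_LIB(m, sqrt)
        AC_CHECK_HEADERS(string.h strings.h)
        AC_OUTPUT(Makefile src/Makefile)

        # src/Makefile.am -- the automake input
        bin_PROGRAMS = myprog
        myprog_SOURCES = main.c util.c

    The several-thousand-line configure script and the bloated Makefile.in are generated entirely from these few lines.)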

  • make has been used to manage dependencies between project components for almost a quarter of a century. While it was a major advance over the hand-written shell scripts that preceded it, make's semi-declarative syntax is clumsy, and even short make scripts can be very difficult to debug. In addition, its functionality is not accessible from other programs without heroic effort, and it provides little support for common operations such as recursion.

    How can one expect to actually call himself a developer if he can't even manage to understand Makefiles? IMHO standard make does wonderfully what it was designed for, and GNU make is half-way to creeping featurism [jargon.org]. The consistency and interoperability of the system utilities add elegance to Unix, and learning how to use them helps to keep the brain working! :) Stupidifying it is a mistake.

    This isn't elitism, but I believe that replacing make(1) to make it more accessible to dumb people doesn't make sense, unless you dumbify the programming languages as well.

    (I actually like autoconf, and I have been using it in some of [helllabs.org] my projects [helllabs.org], but the Makefiles generated by automake are just too bloated. I use my own very nice recursive Makefiles to build and package the system, and it works quite well!)

    • The contender for my biggest beef with the GPL is the restriction against dynamically linking to non-GPL code.

    And, the LGPL doesn't have this problem. The LGPL also allows you to distribute modified source. So, if you find the GPL too restrictive for your libraries, how about the LGPL?

    • It is at this point that the GPL escapes the bounds of polite society and starts sticking its nose where it doesn't belong, namely, other people's code.

    I never hear this complaint about commercial licenses that generally completely disallow any modifications to the distributed (often binary) product whatsoever, even when such changes are for your use only. Aren't they saying what you can do with your code?

    If you feel the GPL "sticks its nose where it doesn't belong" there is an easy solution: use no GPL code. You are no worse off than if the GPLd code had not been provided for your use.

    Heck, use the GPLd code to test against as you develop your own workalike code. I'm sure the FSF, unlike a lot of commercial licensors, would not involve itself in a look-and-feel lawsuit against you.

    • And this isn't just some petty gripe. The very desktop that I am using has been declared *illegal* by Redhat.

    So, it upsets you that you've run afoul of some licensing restriction. Seems like a petty gripe to me.

    I've often wanted to do things with commercial software that's not allowed by the license. For example, I often want to install it on more than one machine even if it's absolutely guaranteed that it will only be in use on one at a time, like a laptop that only I use and a desktop that only I use. The license prevents me from doing it. It's frustrating, but I knew when I bought it what the license allowed. Petty of me to complain about something to which I implicitly agreed by buying and then using the product.


    -Jordan Henderson

    • "The GPL is far less restrictive than any commercial license with which I'm familiar."

      So what? Khrushchev was less restrictive than Stalin, but what difference does that make? However, a commercial license does have one huge advantage over the GPL: no commercial library that I am aware of tells the developer what license they must use.

    So, I take it by your analogy that you feel that any restriction on a license is evil. I suppose you are suggesting that all software should be covered under a BSD license or something.

    I was addressing the point that the poster felt that the GPL places "massive restrictions" on you. I think the restrictions are pretty similar to those placed on you by commercial licenses, only less restrictive, yet you don't hear complaints about people chafing under those.


    -Jordan Henderson

    • None of the labels you mention is entirely off base. His writings definitely do borrow rhetorical techniques and concepts from those of Marxism. I don't know about "Svengali," but there is some legitimacy to the claims that he is a fraud. I personally believe that the FSF's 501(c)(3) tax exemption was obtained fraudulently, because the purpose of the FSF is (and always has been) to compete directly with for-profit businesses. This is not allowed, and since he intended to do this from the start, it may well be that he could be accused of defrauding the IRS.

    You seem big on throwing around unsubstantiated claims. Please cite the tax law that forbids 501(c)(3) corporations from competing with for-profit entities.

    Mitre (a 501(c)(3) corporation) submits competitive bids, and often wins, against other Defense contractors. Another company that I'm familiar with, CTC (Concurrent Technology Corporation, a 501(c)(3) corporation), competes with for-profit companies to get various government contracts. Last I checked PBS (television) and NPR (radio) stations (all 501(c)(3)) compete with for-profit media.

    • Yes, let's look at this. BeOS? Hmm.... They use GCC. BSD UNIX? Also GCC. In fact, GCC has usurped virtually all of the compiler business except for a few embedded niches and Microsoft Windows. It has done this via predatory pricing -- an explicitly intended activity of the FSF. Why is this any more justified than Microsoft "cutting off Netscape's air supply?" The answer: It's not.

    Congratulations, you've found one of the very few OS vendors that don't support their own compiler. I wonder if BeOS would even exist were it not for the great freely usable compiler technology.

    Your claim that gcc has eliminated all compilers except "a few embedded niches and Microsoft Windows" is bizarre. Sun, Compaq, IBM, SGI, and HP all have quite active compiler groups. Intel builds a compiler for their architectures. There never was much of a third party market for compilers before gcc, most compilers were provided by the OS/Hardware vendor and they still are.

    Gcc succeeds through "predatory pricing"? Another interesting claim. By this reasoning, I suppose ALL freeware should be stopped as being "predatory".

    In any case, predatory pricing is only illegal if it seeks to establish an illegal monopoly. Someone who gives their product away, and keeps on giving it away, is doing nothing illegal. If it were disallowed, we'd be shutting down the people who voluntarily clean the roadsides as being predatory against the businesses that do the same.

    • RMS's discussion of such licenses isn't reasoned. It's demagoguery which is designed to deceive and to hide his true intent.

    I'm still waiting to see some reason in rebuttal. Did you know that US Senate rules forbid the use of the word "Demagogue"? It's because it's an empty epithet that adds nothing to the discussion. RMS doesn't throw around labels and call that debate, unlike his detractors.

    • Absolutely. Check your history. It is historical fact, verifiable from RMS's own writings and from his remarks in public and in print, that the GPL was conceived as an instrument of spite against Symbolics -- a commercial spinoff of the MIT AI Lab. And all other companies of its ilk.

    I'm familiar with that history. So what? RMS saw what he considered an injustice and moved to correct it. He didn't move for restrictive legislation or just sit around and complain about it, he got to work in building something to correct an injustice. It's hardly spite directed at Symbolics, as it's not reasonable to assume that they would ever use GPL'd software. You might consider it spite against all who commit this, to RMS's thinking, wrong, but isn't that what correcting injustice is all about?

    Again, RMS's detractors characterize him with labels (spiteful, malicious) while not actually adding anything interesting to the discussion of ideas in which RMS engages.


    -Jordan Henderson

    • And as for the lack of chafing under commercial licenses, re-read my post. No commercial license dictates what license you may or may not use.

    You are wrong. Typically, when you have a commercial license that includes source (a la SAP) you are only allowed to use that source under the original license. That's dictating what license you may or may not use. You are not allowed to redistribute it at all.


    -Jordan Henderson

  • Arrogance and ignorance are a bad combination.

    • Mitre is not a 501(c)(3) non-profit nor does it claim to be a charity.

    I direct your attention to this page [mitre.org], where it is stated:

    • "Both companies will be not-for-profit charities, under the provisions of IRS Section 501(c)(3)."

    Mitre was a 501(c)(3) that broke up recently into two 501(c)(3) companies, Mitre and Mitretek.

    Being a 501(c)(3) doesn't mean you are a traditional "charity", although you could indeed give tax-deductible contributions to Mitre.

    Although I can't provide a web reference for CTC, I recently attended a briefing at CTC's headquarters where it was stated plainly that CTC is indeed a 501(c)(3) corporation. In fact, the briefing materials did mention that you can make tax-deductible contributions to CTC!

    So, perhaps I was right initially. People do routinely libel RMS. You accused him of committing criminal fraud in his incorporation of the FSF as a 501(c)(3) entity.

    Gosh, do you need more examples of 501(c)(3) corporations that compete with for-profit corporations? Many hospitals and some HMOs are 501(c)(3) corporations. There are both 501(c)(3) and for-profit consumer counseling services. I could go on. Your assertion that 501(c)(3) corporations are not allowed to compete with for-profit corporations is absurd.

    I suppose the rest of your unfounded conjectures and suppositions are about as reliable.

    It's no surprise that RMS, the FSF and the GPL are so negatively represented in the Computer Industry Press when columnists routinely bluster authoritatively on subjects about which they know nothing.


    -Jordan Henderson

  • I don't think you've substantiated your claim with that one passage from the GNU Manifesto.

    Your claim was that the GPL was designed to lower programmers' salaries.

    The section that you quoted is in response to the anticipated question "Won't everyone stop programming without a monetary incentive?"

    In the answer, RMS says that won't happen (and in fact he's been proven correct by Linux) and then goes on to posit how the existence of this large body of software will reduce the incentive for people to produce non-free software, with which they could probably make more money.

    This is just a recognition that you can make more money from producing something that is scarce versus producing something that is freely available. People who object to the GPL often seem to enjoy promoting an artificial scarcity to advance themselves.

    The Holy Grail of Software Engineering for 40 years has been reuse, reuse, reuse. These artificial scarcities have served to make software reuse very spotty and poorly practiced. The GPL is the only license that enforces a discipline of software reuse. This is a good thing.

    As I said, RMS saw that as an effect of the GPL, not as a guiding principle. If it had been, it would have been up front in the Manifesto per se, and not in the anticipated objections section.

    Look, I don't agree with RMS on everything. His redefinition of the term "free" is not really completely reasonable. He has an unjustified utopian view about a post-scarcity world. I also don't agree that the wide adoption of GPL'd software will mean less pay for programmers. Most of the remuneration for programming I've received is for maintenance work or pay for specific modification that would typically not be available in public sources. The GPL only serves to increase the opportunities for pay for this type of work.

    But, I find RMS's opinions reasoned. I find his detractors hysterical. For example, you said that it's not far off the mark to call RMS a communist. A communist wouldn't say:

    • "
    • There is nothing wrong with wanting pay for work, or seeking to maximize one's income, as long as one does not use means that are destructive."

    As RMS states in his GNU Manifesto [fsf.org]. Somehow, RMS's protestations and plain statements that show he is not a "Communist" are always ignored.


    -Jordan Henderson

    • I can write an MFC application and license it under **ANY** license I choose...

    Are you allowed to modify the MFC and redistribute it under ANY license?

    If not, it seems like the license you receive with the MFC is more restrictive, in a very important way, than the GPL.

    There is still the LGPL, whose use is discouraged by RMS, but is preferred by RMS to any other license, save the GPL, for libraries. The LGPL grants you, if I understand it correctly, the same freedom from restrictions that you enjoy with the license that comes with MFC.


    -Jordan Henderson

    • RMS isn't God, GNU isn't a religion and the FSF isn't a bunch of prophets.

    The referenced page [gnu.org] doesn't ask us to accept anything on faith. RMS doesn't support worship of him or any of his principles. RMS primarily makes reasoned arguments. You may disagree with those arguments, but if you were up to the challenge you'd use reason yourself.

    RMS tells us his view as to why other licenses, and specifically the X/MIT license, can lead to problems. It's a pity his detractors have to bring up these tired cultist labels that actually serve to remove reason from the discussion.

    • If you don't like the license, don't participate. Don't use any of the products that might be created with this license.

    It's funny that with all of the libel that RMS takes for his stands, you never see RMS suggest that others who don't support the GPL should stop using GPL'd products. Gcc comes to mind as something that has benefited many in the Open Source "Community" who snipe at RMS, the FSF and the GPL.

    Now we can see the philosophy of some of those who support other "free" licenses. They are factionalists. The FSF and the GPL support Freedom in software. The software is permitted to be used by anyone, even those who work against FSF goals.


    -Jordan Henderson

    • Certain corporations which perform specific activities for the government, as specified by Congress, may also be classified as tax-exempt under 501(c)(3).

    You're making this up as you go along, right?

    The examples I gave were all 501(c)(3)s. As I said, it's not by my definition; each of these organizations applied for and was granted this status by the IRS. As you say, see Publication 557 [fedworld.gov] for details.

    While the functions of 501(c)(3) organizations may not seem "charitable", they are generally referred to as "a 501(c)(3) charity". Donations can generally be made to these organizations on a tax-deductible basis. As I said, charity is broadly defined. Both Mitre and CTC refer to themselves as 501(c)(3) charities. As this IRS [irs.gov] page states:

    • The organizations described in 501(c)(3) are commonly referred to under the general heading of "charitable organizations."
    • The exempt purposes set forth in 501(c)(3) are charitable, religious, educational, scientific, literary, testing for public safety, fostering national or international amateur sports competition, and the prevention of cruelty to children or animals. The term charitable is used in its generally accepted legal sense and includes relief of the poor, the distressed, or the underprivileged; advancement of religion; advancement of education or science; erection or maintenance of public buildings, monuments, or works; lessening the burdens of government; lessening of neighborhood tensions; elimination of prejudice and discrimination; defense of human and civil rights secured by law; and combating community deterioration and juvenile delinquency.

    Note also that as long as the corporation involves itself in its original chartered function, it can keep its tax-exempt status:

    • The articles of organization must limit the organization's purposes to one or more of the exempt purposes set forth in 501(c)(3) and must not expressly empower it to engage, other than as an insubstantial part of its activities, in activities that are not in furtherance of one or more of those purposes. This requirement may be met if the purposes stated in the articles of organization are limited in some way by reference to 501(c)(3). In addition, assets of an organization must be permanently dedicated to an exempt purpose.

    You can contribute, tax-deductibly, to Mitre, CTC or the FSF because of their "charitable" status.

    The real issue is that you accused RMS of perpetrating a fraud by invalidly constituting a 501(c)(3) corporation. You went on to claim that such corporations were not allowed to compete with for-profit corporations. Both claims are bunk and any examination of the relevant tax code sections demonstrates this pretty clearly.


    -Jordan Henderson

    • I have not seen RMS libeled.

    Perhaps libeled is a bit strong. You do see RMS alternately called a communist, a Svengali, a fraud, and a number of other things. He generally is not criticized in a way that could legally be termed libel.

    • This shows the destructiveness of the GPL. While these people would very much prefer to use another tool, the predatory nature of the GPL has eliminated alternatives.

    An amazing claim. Let's see, every single OS vendor has a compiler suite which they heavily support. Most chip manufacturers have a compiler (Intel, Motorola, IBM) for their architecture. There are any number of companies that sell commercial compilers. There is lcc. Seems like there are many, many, many alternatives.

    Oh, you mean a good, free, cross architecture compiler that really works well?

    Since you make a speculation about a market that could have been had it not been dominated by that mean, destructive GPL'd gcc, allow me to make one. Intel, IBM, Compaq (and DEC), Motorola and probably any number of other companies spent a lot developing code generation for gcc. Had the GPL not tied their hands and required them to give these changes back, these companies almost certainly would have sold the modified compiler as a product. After all, each of these companies had their own compilers that they sold as products. Why would they give away their work on gcc had they not been forced to?

    Rather than gcc destroying the market for a good cross architecture compiler, it set up an environment where such a thing could thrive.

    Compilers (FORTRAN, Algol, Cobol) and free software have both existed since the 1950s. It wasn't until a GPL'd compiler appeared that a good, free, cross-architecture compiler appeared.

    • By tying it up with a multi-page license that's rife with legalese and places massive restrictions on its use. Yeah, right.

    The GPL is shorter than most EULAs, and it only really places one restriction on the code: you can't take advantage of this community work unless you are willing to participate in the community by giving back. It may be more restrictive than the X/MIT or BSD licenses, but I don't see how you can reasonably call this a "massive restriction". The GPL is far less restrictive than any commercial license with which I'm familiar. What would you call commercial licensing? Tremendously restrictive? Unbelievably restrictive?

    • Not so. They differ with RMS, and with good reason. Stallman's agenda is one of spite and malice.

    Sure. He unfairly gets called all kinds of names. A reasoned discussion [gnu.org] of the issues of various "free" licenses is met with "RMS isn't God, GNU isn't a religion and the FSF isn't a bunch of prophets." and absolutely no substantive arguments, and yet Stallman's is the agenda of spite and malice?

    • While many users (in particular, "end users") can use the software in the way that best suits their needs, programmers cannot. This is the purpose of the GPL: to transform open source from a public good into a weapon directed against those who engage in activities of which Richard Stallman does not approve.

    Another amazing claim. What public good could these Open Source programmers be pursuing that the GPL doesn't allow, except turning an Open Source product into a Closed Source product? Seems like any other kind of Open Source license allows programmers to transform open source from a public good into a private good. You speak of engaging in "activities" (plural), but really there's only one activity that the GPL doesn't allow: benefiting from others' work without giving back changes to the community that gave you your start.


    -Jordan Henderson

  • It's my understanding that most of the writers of free software do so because they already have jobs and/or other income. Perhaps there are budding developers that need funding more than them?

    I'm sure this has good intentions, but will OSS developers of the future have to compete against each other with duplicated efforts to write the best free software, in hopes of winning cash and prizes?

    Sounds like some kind of game show to me...

    --

  • Conceptually, make sucks. The semi-declarative part is the good part. It's trying to do procedural things in a semi-declarative language that doesn't work well.

    The good idea in make is the dependency graph. Having a dependency graph that accurately describes what goes into your product is a very useful thing. But make doesn't let you use that dependency graph in multiple ways. You can't use it for linking. You can't use it to decide what goes into a distribution file. And it isn't enforced; there's nothing that checks that there aren't dependencies on things not in the graph.

    Make's "time greater than" approach to determining if something needs to be built is terrible. If make actually logged what the last compile used as inputs, it would be safe. Instead, the usual answer to doing anything that involves the possibility of a file changing to an earlier date is to either recompile everything manually or use something like "touch" to fool make.

    Some of the modern IDEs, notably CodeWarrior, have rethought the dependency issue and seem to have got it mostly right. (MSVC is still cranking out makefiles internally.) But the UNIX toolset is still stuck in the 1970s.

    I rewrote make once, in 1979, to fix some of these problems. (I couldn't distribute it, so it was only used in-house.) I'm disappointed that in twenty years, nobody has rethought make. All they've done is add cruft.
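
    To make the input-logging idea concrete, here is a minimal sketch in Python (the manifest filename and the function names are inventions for the example, not any real tool's): a target is out of date when the recorded content fingerprints of its inputs change, not when some timestamp happens to be newer.

        import hashlib
        import json
        import os

        MANIFEST = ".build-log.json"   # hypothetical record of the last build's inputs

        def fingerprint(path):
            # Hash the file's contents; unlike a timestamp, this is
            # unaffected by touch(1), restores from backup, or clock skew.
            with open(path, "rb") as f:
                return hashlib.sha1(f.read()).hexdigest()

        def needs_rebuild(target, inputs):
            # Rebuild if the target is missing, or if any input's content
            # differs from what the last build recorded.
            if not os.path.exists(target) or not os.path.exists(MANIFEST):
                return True
            with open(MANIFEST) as f:
                recorded = json.load(f).get(target, {})
            return any(recorded.get(p) != fingerprint(p) for p in inputs)

        def record_build(target, inputs):
            # Log exactly which input contents produced this target.
            log = {}
            if os.path.exists(MANIFEST):
                with open(MANIFEST) as f:
                    log = json.load(f)
            log[target] = dict((p, fingerprint(p)) for p in inputs)
            with open(MANIFEST, "w") as f:
                json.dump(log, f, indent=2)

    With a log like this, setting a file back to an earlier date, or touching it, simply doesn't matter; only an actual change of content triggers a rebuild.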

  • [From the Software Carpentry [software-carpentry.com] project coordinator]

    Thank you all for your postings regarding the Software Carpentry [software-carpentry.com] project. To answer some of the points that have come up several times:

    This is a design competition, rather than a programming competition. Good entries should be relatively language-neutral --- we believe that at the 5000-word level, the similarities between modern object-oriented languages (C++, Java, Python, etc.) are more important than their differences.

    Designs based on existing tools are very welcome. If, for example, you think the only way to meet the criteria for the "build" category is to extend the syntax of standard Makefiles, then please submit that as a design. (However, for the reasons discussed in the FAQ [software-carpentry.com], if your plan for an implementation is simply to provide a Python scripting interface to GNU Make, you'll have to convince the judges that there's no "pure Python" way to achieve the same ends.) A toy illustration of what "pure Python" might mean appears at the end of this post.

    No, Software Carpentry is not a company looking for some publicity. The project is being funded by Los Alamos National Laboratory [lanl.gov], who believe that computational scientists and engineers need easier-to-use software engineering tools, and administered by CodeSourcery, LLC [codesourcery.com], who believe that those tools would be of use to the whole Open Source community. The FAQ [software-carpentry.com] talks about LANL's reasons for funding the project, as does this article [ddj.com] from Dr. Dobb's Journal [ddj.com].

    Yes, one of the project's goals is to give up-and-coming software designers a chance to get some attention, just as architects and classical musicians do.

    Yes, the competition is open to submissions from any country.

    No, this is not part of some perfidious Pythonesque plot for world domination :-). We thought very seriously about using Perl for the implementations, but after teaching classes in both Perl and Python at Los Alamos National Laboratory, came to the conclusion that the latter had a gentler learning curve. (This is not meant as disparagement of Perl as a tool for full-time professional programmers; it is simply an empirical observation about computational scientists and engineers.) Neither Guido nor any other member of the Python development team had any part in setting up the project, choosing Python, or choosing the competition categories.
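
    Finally, the promised illustration of what "pure Python" might mean. This is a rough, entirely unofficial toy, not a reference design: the names are invented for the example, and there is no up-to-date checking and no cycle detection. It simply shows make's core, a dependency graph walked in dependency order, expressed directly in Python.

        def build(target, rules, done=None):
            # `rules` maps a target to (dependencies, action); leaf files
            # (plain sources) simply have no entry.
            if done is None:
                done = set()
            if target in done:
                return
            deps, action = rules.get(target, ([], None))
            for dep in deps:
                build(dep, rules, done)
            if action is not None:
                action()               # here: just print; really: run a compiler
            done.add(target)

        rules = {
            "app":    (["main.o", "util.o"], lambda: print("link app")),
            "main.o": (["main.c"], lambda: print("compile main.c")),
            "util.o": (["util.c"], lambda: print("compile util.c")),
        }
        build("app", rules)            # prints the two compiles, then the link

    A real entry would need to say how such rules are declared, checked, and extended; the point is only that the machinery itself is small.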

  • by Vic Metcalfe ( 355 ) on Sunday January 16, 2000 @03:43AM (#1368159) Homepage
    Ok, they had me up until the bit about having to build the tools in python.

    Don't get me wrong, I have nothing against python, or scripting in general, but these tools scream c or c++ to me.

    I can understand wanting to standardize on one language to help make this "suite" a cohesive whole, but they've got to select the right tool for the job. Hell, I don't even have python installed on most of the boxes I use, but you can bet c and c++ will always be there.

    From their FAQ... "Requiring that all tools be written in, or scriptable with, a single language will make it easier for newcomers to learn, use, and extend these tools."

    How does implementing a tool in a scripted language make it easier for newcomers to learn and use?

    Oh well, other than that mandate this looks like a really cool project. I wish Software Carpentry all the luck in the world!

  • by Per Abrahamsen ( 1397 ) on Sunday January 16, 2000 @03:58AM (#1368160) Homepage
    ... such as cook [pcug.org.au], many of them arguably better. The reason they don't take off is that GNU make is "good enough", and people already know make.

    The same is true for all the programs they want to replace. At best, this competition will give some developers experience they can use to enhance the standard tools. At worst, it will divert some free software talent towards enhancing and maintaining a little-used set of alternative tools, rather than improving the tools used by the rest of the community. Most likely, someone will have wasted US$200,000.

  • by Daniel ( 1678 ) <(dburrows) (at) (debian.org)> on Sunday January 16, 2000 @06:01AM (#1368161)
    How does implementing a tool in a scripted language make it easier for newcomers to learn and use?
    Well, I assume that they'll allow Python code in the control files themselves, the same way Makefiles allow sh code and autoconf allows m4 code. Writing the tool in the interpreted language makes this easier -- I suppose you could try to optimize by writing most of the code in C, then providing Python-visible hooks and calling the interpreter as appropriate; this might be less useful than you'd think, though. My inclination would be to write only the dependency-resolution stuff in C -- nothing else seems likely to be time-critical. (A sketch of what such a control file might look like is at the end of this post.)
    Anyway, back to the reason to choose Python (as opposed to other scripting languages) -- Python is actually more common than you might think, it's not that hard [1] to install, and it's sane.
    Daniel
    [1] I'm assuming you're willing to use binary packages; for example, the Debian ones.
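
    As a purely hypothetical sketch (the rule() registry below is invented for the example), here is the flavor of a control file that is just Python; note how the wildcard logic is ordinary Python instead of a special-purpose Makefile function:

        import glob

        RULES = {}

        def rule(target, deps, command):
            # Register a build rule; a real tool would supply this, check
            # whether `target` is out of date, and run `command` if so.
            RULES[target] = (deps, command)

        # --- what the user-visible control file could look like ---
        sources = glob.glob("*.c")                  # what make needs $(wildcard) for
        objects = [src[:-2] + ".o" for src in sources]

        for src, obj in zip(sources, objects):
            rule(obj, [src], "cc -c %s -o %s" % (src, obj))

        rule("app", objects, "cc -o app " + " ".join(objects))
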
  • by mvw ( 2916 ) on Sunday January 16, 2000 @06:29AM (#1368162) Journal
    And that's why *BSD stuff will never beat GNU software - they took the time to do it properly; you're "too busy" to bother.

    That is a stupid remark, especially the time argument. Both cultures have excellent results, are fruitful to each other and are likely to stay with us for a while.

    The lack of a BSD compiler is the result of no one being interested enough so far to write one. No more, no less. There is no reason why this could not change one day.

  • by Guy Harris ( 3803 ) <guy@alum.mit.edu> on Sunday January 16, 2000 @10:06AM (#1368163)
    So write a new interface, not a new tool.

    Perhaps the person to whom you're responding meant that the interface intrinsic to the tool - i.e., the semantics of Makefiles - was "confusing and lame", and therefore that "writing a new interface" would mean "writing a new tool".

    Sometimes incremental improvement of existing tools merely involves moving closer to a local optimum in the solution space, and avoiding a better local optimum somewhere else in the solution space.

    That's why I think that this competition isn't ipso facto a bogus idea - perhaps people won't come up with something better, perhaps they'll come up with something that's a little better but not enough to supplant the existing tools (NOTE: the availability of the new tools does not mean the old tools will go away! It's not as if you won't still be able to use make and autoconf.), but perhaps they might come up with something that has an underlying model that's significantly better, so that the new tools are easier to use, or more powerful, or more powerful and easier to use (e.g., it may be easier to make use of the tools' power).

    No, I don't know offhand what such a tool might look like - but that in no way indicates that such a different-and-better tool is impossible.

  • That's the point behind every pie-in-the-sky competition. When JFK got up on the podium and said he wanted a man on the moon, he was really standing in front of a hundred thousand engineers and saying, "I dare you." This group is doing the same thing.

    Will we get a new make out of it? You and I would say no, but all it takes is a few kids sitting around somewhere listening to us laugh, and they get even more fired up.

    The money isn't going to be what drags the programmers out of the woodwork - it's going to be the recognition that goes along with the money. (Sadly, the only way to get recognition in these IPO days is money, but that's another rant.) Ten years ago, someone could have waved a hundred thousand dollar check in front of Linus's nose, and it wouldn't have got us to this point any faster.
  • by rhet ( 29034 ) on Sunday January 16, 2000 @09:55AM (#1368165) Homepage

    So many of the previous posts are about why this is a Bad Thing because it diverts talent away from other things, why replace something that works, yada yada yada. The same could have been said of Linus' work on linux instead of contributing to BSD, etc. etc. Why are people so hung up on "investing in open source (no, I refuse to capitalize open source)" and sticking with something "because that's how we've always done it?"

    Here's a company willing to throw money at open source development and they get blasted. Who cares what they do. Does it affect you? NO. It doesn't. You do your thing, let Software Carpentry do theirs. If you love make, then use it. If you don't, that doesn't mean you're dumb or stupid as previous posters seem to imply, it just means that you would like an alternative to "make." Come on, folks, let's put the FREEDOM back in FREE code.

    The open source zealots (and slashdotters in particular) are, ironically, (moderator: that's my own damn observation, not flame bait) among the most UNFREE people in this world. You seem to like any idea as long as it jibes with what you already believe. Live and let live.

    More open source s/w will NEVER be a Bad Thing. After all, it will only result in more FREE code and that means more FREEDOM.



  • by dbrower ( 114953 ) on Sunday January 16, 2000 @06:01AM (#1368166) Journal
    And there is jam [perforce.com]; a paper on it includes this bibliography:

    Atria Software, "Building Software Systems with ClearMake", ClearCase Users Manual, Natick MA, May 1994.

    Geoffrey M. Clemm, The Odin Reference Manual, available via anonymous FTP from ftp.cs.colorado.edu.

    S. I. Feldman, "Make - A Program for Maintaining Computer Programs", BSD NET2 documentation, April 1986 (revision).

    Glenn Fowler, "The Fourth Generation Make", Proceedings of the USENIX Summer Conference, June 1985.

    Peter Miller, "Cook - A File Construction Tool", Volume 26, comp.sources.unix archives, 1993.

    Christopher Seiwald, "Jam -- Make(1) Redux", Usenix UNIX Applications Development Symposium, Toronto, Canada April 1994.

    Richard M. Stallman and Roland McGrath, "GNU Make - A Program for Directed Recompilation", Free Software Foundation, 1991.

    Zoltan Somogyi, "Cake, a Fifth Generation Version of Make", Australian Unix System User Group Newsletter, April 1987.

    Dennis Vadura, dmake(1) manual page, Volume 27, comp.sources.misc archives, 1990.

    Which shows some of the different approaches that have been taken, even though some of them don't qualify, or are themselves the things being replaced.

    If there's a missing requirement in the rules of the contest, it's the lack of a migration path from make. Without one, you just have an interesting toy, because no one will move an existing significant system.

    -dB

  • by Tet ( 2721 ) <slashdot@astradyne . c o .uk> on Sunday January 16, 2000 @03:59AM (#1368167) Homepage Journal
    The list of judges is impressive, so it's probably not just some new company seeking publicity. However, the question is, why do they want to replace the tools in question? The existing tools already have 90% or more of the required functionality, so why not just extend them as appropriate?

    Certainly autoconf needs some work to tidy it up (particularly the generated configure script), but it's not as bad as they make out. As for it being the last major application to use m4, I guess they've forgotten about sendmail...

    Similarly, make has some deficiencies, but again, it's mostly there, and what it does lack can be fairly easily added. It needs a simple GUI front end for newbies more than it needs rewriting.

    Overall, it's not a bad idea, but I think that the effort should have been put into more pressing areas, such as having an embeddable editor API for X (so that individual apps can have an editable text area, and the user gets to choose which editor is actually used).

    I can't help thinking that perhaps this is part of Guido's grand plan for Python to take over the world (not necessarily a bad thing in itself, but I'm always suspicious of things with political motives).

  • by Stiletto ( 12066 ) on Sunday January 16, 2000 @04:07AM (#1368168)
    Replace autoconf?
    Replace make?????

    These are robust, time-tested tools for creating software. If a better way existed to manage projects, we (programmers in general) would probably have it by now.

    I was just recently a member of a team that converted a very large project from Microsoft's hideous Visual Studio project (.dsp) files to autoconf, automake, and make. Why was this done? Because it's easier to use, more flexible (try telling MSVC to run lex and yacc and then compile the output files, using only .dsp files! HA!), and opens the program up to porting to other platforms.

    Now on the other hand, if all "Software Carpentry" wants is versions of autoconf and make ported to python, well, I guess it's not that silly, but why would you want to do that? The source code for these programs is extremely portable already. Implementing them in Python gains you nothing.

    ________________________________
  • by sbaker ( 47485 ) on Sunday January 16, 2000 @07:54AM (#1368169) Homepage
    Whilst I applaud any company who's ready to spend substantial wads of cash on OpenSource development, I really think that competitions are the wrong way to go about it:

    • It encourages secrecy and non-cooperation between the various people working on projects like this. That transforms the usual model for OpenSource development from web-based cooperation into commercial competition.
    • It doesn't encourage the best people to do the work because they'll say to themselves "I could work for six months on this - and then lose the competition and get nothing". Since they have already fostered a competitive model, that's a bad thing.
    • The wording of the competition seems to prohibit developers from doing the rational thing which is to start with the best parts of the existing autoconf/make system and just fix whatever is perceived to be broken. Even worse, the people who might have been working on improving those projects are now dragged off to start again from scratch.
    • Putting money into OpenSource teams has to be done with great care since it can often result in serious internal debates about who contributed most and who deserves what share of the money.
    It would have been much better if this company had hired a good programmer for a year or two and had them work on 'fixing' whatever it is that they dislike in autoconf/make.

    Failing that, break the money up into $20k 'grants' and offer them to people who are already working in the right direction.

    This competition is A Bad Thing.

  • by mvw ( 2916 ) on Sunday January 16, 2000 @04:51AM (#1368170) Journal
    Replace autoconf?
    Replace make?????
    These are robust, time-tested tools for creating software. If a better way existed to manage projects, we (programmers in general) would probably have it by now.

    I was just recently a member of a team that converted a very large project from Microsoft's hideous Visual Studio project (.dsp) files to autoconf, automake, and make. Why was this done? Because it's easier to use, more flexible (try telling MSVC to run lex and yacc and then compile the output files, using only .dsp files! HA!), and opens the program up to porting to other platforms.

    First, I completely agree with you that project configuration through MS Developers Studio projects is inferior to UNIX style configuration.

    I had many fruitless discussions about this with some of my colleagues (sigh). For some reason it seems nearly impossible to convince people with a DOS/Windows background that the complicated make syntax is less of a PITA than the fact that a myriad of parameters is hidden behind various corners of the MS Dev Studio graphical user interface.

    My theory is that UNIX people love ASCII representations while the Windows crowd loves roaming GUIs. No idea why. But believe it or not, there are people who can, for example, memorize an astonishing list of key/value pairs from the NT registry.

    On the other hand, one must admit that it is hard to understand make without being familiar with the way the basic UNIX tools (awk, sed, grep, sh, ...) interact in the more complex make files.

    My objection to make is not its complicated syntax (which is only complicated because different levels of parsing - make's and sh's - intermix, and regular expressions need a bit of familiarity), but that it is slow.

    This is not a failure of make itself, but of the way we traditionally organize programs into files and directories. In effect we use the filesystem as a database, and query this database by giving paths.

    I would expect a speedup if we moved away from that organization to some kind of program database - a database that is very close to the semantics of the programming language used: a database of classes, macros, functions, globals, etc. So I expect that it would be possible to improve the speed of a dependency-aware construction tool like make in such an environment.

    It would also solve some issues with C++ (especially template resolution).

    However, I have not yet seen a move away from the traditional C-style breakup into files, compilation, and linking towards something that more resembles a database with tools on top of it.

    Possibly because it would give up many benefits, the most obvious being the ASCII representation and the flexibility and power of interaction with the UNIX simple-tools-work-together style, which is a very good style - in fact a very clever hierarchical design.

    Another reason is that the database approach would have some drawbacks of its own. The slowness of make is partly due to scanning the file hierarchy, but on the other hand that scanning is what lets components be added or deleted so easily. A database solution would insist on registering new items and would probably be more complicated to use here. (A toy sketch of caching those scans in a small database is at the end of this post.)

    My objection to autoconf is mostly license-based. The whole autoconf/automake/libtool toolchain is an impressive one, but it is inherently bound to the GPL. If we ever want a truly free BSD, we have to think of an alternative. But as with the system compiler, we have other, more important work to tackle with our limited resources right now.
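
    To come back to the database idea for a moment: here is a toy sketch in Python of its most modest form, nothing more than caching the results of the file-hierarchy scan in a small persistent database (every name in it is invented):

        import glob
        import os
        import re
        import shelve

        INCLUDE_RE = re.compile(r'#\s*include\s*"([^"]+)"')

        def includes(path, db):
            key = "%s:%s" % (path, os.path.getmtime(path))
            if key in db:
                return db[key]                      # answered from the database
            with open(path) as f:
                deps = INCLUDE_RE.findall(f.read()) # the slow part, done once per change
            db[key] = deps                          # (stale keys would need pruning)
            return deps

        db = shelve.open(".depcache")
        for src in glob.glob("*.c"):
            print(src, "->", includes(src, db))
        db.close()

    A real program database would of course store much richer structure (classes, functions, template instantiations), but even this much avoids re-parsing files that have not changed.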

"If value corrupts then absolute value corrupts absolutely."

Working...