$100,000 Open Source Design Competition
Hrvoje Niksic writes "The Software Carpentry project has announced its first Open Source design competition. They offer prizes totaling $100,000 for people who come up with good designs for tools to replace autoconf and make, as well as a bug tracking system and a regression testing suite.
Good luck!"
Re:Cook is NOT free (Score:1)
requires Python (Score:1)
Re:3 questions (Score:1)
It comes down to choosing the best tool for the job. For one web site I was working on, I used a combination of C, C++, bash, and PHP. Each did something it was better suited for than the others. I like having that kind of flexibility, so I don't like the idea of Python, or any other language, being mandated.
3 questions (Score:1)
I've seen some amazing software engineering support systems done in Python already. Stuff that is quite a bit more complex to design and implement than make et al. Stuff of the kind that previously made a long-time C++ user like me say "this screams for C++". Until, that is, I saw it done and maintained in Python with much less (and cleaner) code and time than would ever be possible in C++.
--
Re:3 questions (Score:1)
--
Re:Should we move away from the file system paradi (Score:1)
John
Re:The old folks were not stupid (Score:1)
Question: What are pros/cons of what I shall term 'enforced coding structures' -- is there a proper term for this?
John
Re:They have got to be kidding... (Score:1)
Try cook (Score:1)
The referees... (Score:1)
Re:Where's source revision control? (Score:1)
Is a contest a good idea? (Score:1)
I don't think so.
Because Free Software/Open Source Software is based on collaboration, while contests are based on competition. The two are almost perfect opposites.
In this contest, we will have programmers duplicating their efforts on many projects rather than many programmers expending their efforts on a few projects. What's more, they won't necessarily be doing it because they are interested in the project, but in some cases purely for the money. And after all, no-one writes FS/OSS for the money, do they?
Here is a feature that can improve make (Score:1)
Incremental compiles.
If you look at the IDE provided by the IBM VisualAge C++ compiler, it provides incremental compiles. So, if you only change a comment within the file, the file won't be recompiled. Again, if a particular function was changed in a manner that didn't impact the rest of the file, only the code for that function is recompiled. This vastly reduces "make" time. Of course, this needs to be supported by the compiler too, but then both the compiler and make are GNU tools, so at least in theory, this is possible...
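The comment-insensitivity half of this can be sketched in a few lines: fingerprint each file with its comments stripped out, so that comment-only edits leave the fingerprint (and hence the build) untouched. This is only an illustration of the idea; the function names and the crude regex-based comment stripping are mine, not VisualAge's.

```python
import hashlib
import re

def strip_comments(source):
    # crude removal of C-style comments (illustration only; a real tool
    # would tokenize properly instead of using regexes)
    source = re.sub(r"/\*.*?\*/", "", source, flags=re.S)
    source = re.sub(r"//[^\n]*", "", source)
    return source

def build_fingerprint(source):
    # hash the comment-stripped text: comment-only edits leave the
    # fingerprint unchanged, so no recompile would be triggered
    return hashlib.sha1(strip_comments(source).encode()).hexdigest()

v1 = "int f(void) { return 1; } /* old comment */"
v2 = "int f(void) { return 1; } /* new comment */"
v3 = "int f(void) { return 2; }"
print(build_fingerprint(v1) == build_fingerprint(v2))  # True: comment-only edit
print(build_fingerprint(v1) == build_fingerprint(v3))  # False: code changed
```

Function-level granularity would work the same way, just with one fingerprint per parsed function instead of per file.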
Re:The referees... (Score:1)
Fundamental Design Tenet Already Broken (Score:1)
Hmm. Let's see, mandate a particular development language for programs to be adopted by "medium-level" program designers.
Let's apply this "logic" to building skyscrapers: "Design applicants must use Legos (tm) because they are easier for use by new engineers."
My point: Not all languages have the same strengths or weaknesses. Mandating a particular language that may or may not map well into the problem domain is a Bad Idea (tm). Of course, GOOD SOFTWARE DESIGNERS already know that this is a Bad Idea (tm), and so they try to choose the best language for the problem.
The contest is not a bad idea, but the rules are illogical at best. I will be highly surprised if they can build a make or autoconf replacement by the end of this year. These are complex problems, and "just using Python" will not make them easier. Cheers
--
Why Python, and who is really paying? (Score:1)
Why Python? A lot of developers at the Labs are biologists, physicists, chemists, weather folk, etc. who develop large Beowulf and other numerical codes. Given that most scientists pick up programming without any formal training, they are more interested in spending time on their coding and science than on Makefile and autoconf arcana (and don't tell me that an m4-based config system is not arcane).
Python is easily understandable. A Java/C++ programmer would pick it up in two hours, a C programmer in a day or two. People without any formal training pick it up in a week or so. Existing C, C++, and Fortran code can be easily wrapped using SWIG; this has been used to implement the outer calculation and visualization loop for massively parallel molecular dynamics programs. See http://www.swig.org
This is the basic point: how do you extend the advantages of software engineering to those who are not software developers, but simulators, scientists, and other engineers? How do you stop being elitist and lower the barrier of entry into Open Source for the non-experts and the common man? And how do you ensure extensibility? The answer is scripting, and good coding practices for the non-expert demand Python.
Re:Language Standardization Issues (Score:1)
I can say that it is (a) very easy to learn without sacrificing power.
The problem I have with Python is exactly that it has sacrificed some power. Take lists as an example. Python lists are arrays, so inserts and deletes are O(N)!
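The O(N) front-insert behaviour is easy to demonstrate against collections.deque, which the standard library provides precisely because lists are array-backed. A quick sketch, not a rigorous benchmark:

```python
from collections import deque
from timeit import timeit

def fill_list(n):
    xs = []
    for i in range(n):
        xs.insert(0, i)   # shifts every existing element: O(N) per insert
    return xs

def fill_deque(n):
    xs = deque()
    for i in range(n):
        xs.appendleft(i)  # O(1) per insert
    return xs

n = 5000
# both produce the same sequence, only the cost differs
assert fill_list(n) == list(fill_deque(n))
print("list :", timeit(lambda: fill_list(n), number=1))
print("deque:", timeit(lambda: fill_deque(n), number=1))
```

On any recent machine the list version is dramatically slower as n grows, since the total work is quadratic.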
I agree that more language choices are good. I also think that there is a right tool for the job, and that the job of collecting dependency information doesn't cry out for a scripting language.
python restriction (Score:1)
a good chunk of the FAQ is spent defending this decision, w/ the crux of the answer being along the lines of "we feel python is the best compromise...". too bad, one would think the most viable approach is to educate new developers rather than dumbing down the tools.
--thi
Recursive Make Considered Harmful (Score:1)
There's more to make's apparent "slowness" than meets the eye. Peter Miller [mailto] has written an excellent analysis in his paper, "Recursive Make Considered Harmful [pcug.org.au]" -- his argument is that make has been misused for years, and we need to rethink how we use it. Instead of recursive invocations of make, we need to use the features of modern make implementations (e.g. GNU make) to make whole-project Makefiles that can do the job make exists to do.
Because Unix projects were once small enough to fit in a single directory comfortably, people got used to the idea of "one directory, one Makefile". When projects began to require many directories to organize the source files, many Makefiles and recursive invocations of make became the norm. This turns out to be extremely inefficient and prone to error, for a variety of reasons detailed in the paper. Instead, he advocates using many fragments of a single Makefile (one fragment per directory) and including those with make's include directive. (Hence the need for a modern make.) The paper also contains a section about writing efficient Makefiles, with techniques to significantly improve processing speed even with traditional recursive make techniques.
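A minimal sketch of the layout Miller advocates (the directory and fragment names here are made up for illustration): one top-level Makefile includes a small fragment from each directory, so a single make process sees the entire dependency graph.

```make
# Makefile (top level) -- the only place make is ever invoked
MODULES := lib app

# pull in one fragment per directory
include $(patsubst %,%/module.mk,$(MODULES))

program: $(OBJS)
	$(CC) -o $@ $(OBJS)

# lib/module.mk -- per-directory fragment, just contributes its objects:
#   OBJS += lib/parse.o lib/eval.o
# app/module.mk:
#   OBJS += app/main.o
```

Because make sees every dependency at once, it never rebuilds too much or too little the way recursive invocations can.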
Common objections to this technique are also addressed:
Take the time to read the paper; it looks to be worthwhile...
Re:They mandate Python!? (Score:1)
Who cares if it makes sense? They're paying. I've been using an inappropriate language (required by non-techies) for 13 years, but the paychecks just keep coming. :-)
It's like programming for Windoze. Even if the platform doesn't make sense for the application, that's what the customer has. If the first words out of your mouth are, "Ok, first of all, we need to get you a modern computer," then the job will go to someone else.
---
Re:Obvious question: why? (Score:1)
why do they want to replace the tools in question?
Cynical answer: licensing? make and autoconf are the backbone of a lot of what we do, and they are also GPL'ed. This project's license is not going to be the GPL; it's going to be MIT/X style.
why not just extend them as appropriate?
Agreed, a good GUI would do the trick for a lot of people. If the intent of this really is to improve functionality though, I am suspicious that radio-buttons would really achieve this.
I read the link pretty quickly...is this the DOE funding it? It looks like it. Does anyone know of any govt issues with GPL'ed software?
Re:Should we move away from the file system paradi (Score:1)
Check out his future plans on his site. Don't have URL handy, sorry.
Re:Freedom (unless you're not like us)? (Score:1)
LetterRip
Re:They have got to be kidding... (Score:1)
I'm not a dumb guy. I've been coding for various platforms and in various languages for many years, and I always enjoy playing with new technology and software. But next to sendmail configuration files, autoconf/make seem to have been created by the devil himself. I've never seen such an orgy of scripts, binaries, and m4 working to create makefiles that take way too long to comprehend and debug. It shouldn't be this complex and bloated.
make, by itself, is perfect as a build tool. What I do like is the way glib/gtk+/gnome/libxml and other tools provide you with a -config script, which you can use in backticks in your compile lines (e.g. CFLAGS = `gtk-config --cflags`).
glib (not to be confused with glibc) also provides many safe routines (i.e. a better sprintf()) and utility functions we all could use and have probably written for ourselves. It also guarantees the existence of various core functions, either by simply being an alias for the system function or, if it doesn't exist, by providing an implementation. Thus, instead of targeting Unices, I assume the target platform has ANSI C (or C++) plus glib and any other libs I need. No need to worry whether it's strings.h or string.h, or whether there's a strdup().
Thus, it is my humble opinion that autoconf and friends should be replaced by plain make, *-config scripts, and glib (or some other common library).
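The whole *-config approach fits in a few Makefile lines; a sketch, assuming gtk-config is on the PATH (as it was with GTK+ 1.x):

```make
# the backticks run the config script at build time, so the Makefile
# never hard-codes include paths or library locations
CFLAGS = -Wall `gtk-config --cflags`
LIBS   = `gtk-config --libs`

hello: hello.c
	$(CC) $(CFLAGS) -o $@ hello.c $(LIBS)
```

The portability questions autoconf probes for are answered at build time by the script the library itself shipped.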
Re:Source or everything; keeping temporary data (Score:1)
Oh man, I wish I had the references handy, but there are a couple algorithms for incremental lexing and parsing, which run big-O(n) where n is the number of *modified* tokens in the input. If you keep intermediate representations around, you don't have to rebuild the entire parse tree just because you changed X *= 2; to X *= 3; in a 50,000 line source file.
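The flavour of those algorithms can be shown with a toy line-granular lexer cache: unchanged lines hit the cache, so re-lexing after the X *= 2; to X *= 3; edit only does real work on the one modified line. (The regex and names here are illustrative, not taken from any of those papers.)

```python
import re

TOKEN = re.compile(r"\w+|[^\w\s]")  # words, or single punctuation chars
_cache = {}  # line text -> token list

def lex_line(line):
    # memoize per line: an unchanged line is never re-lexed
    if line not in _cache:
        _cache[line] = TOKEN.findall(line)
    return _cache[line]

def lex(source):
    tokens = []
    for line in source.splitlines():
        tokens += lex_line(line)
    return tokens

v1 = "X *= 2;\nY = X + 1;"
v2 = "X *= 3;\nY = X + 1;"
lex(v1)          # populates the cache for both lines
print(lex(v2))   # only "X *= 3;" is actually re-lexed
```

Real incremental parsers extend the same idea up the tree: subtrees whose token spans are untouched are reused wholesale.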
autoconf && make are useless to Python. (Score:1)
-russ
Re:Obvious question: why? (Score:1)
Similarly, make [...] needs a simple GUI front end for newbies more than it needs rewriting.
See http://www.alphalink.com.au/~gnb/maketool/ [alphalink.com.au]
Disclaimer: I wrote it.
Blame the coder, not to language (Score:1)
1) It's an Adobe product, which are all huge resource hogs to begin with.
2) He's running Win98, which is even worse about resources than Adobe. A friend of mine has to reboot his Win98 box every few hours because half the time it doesn't free memory after use, with any software.
Dreamweaver
Re:Yeah, right... That's the point, actually (Score:2)
It wouldn't have gotten us to this point at all - Linus first wrote Linux because he couldn't afford anything other than Minix.
Lack of money can be a good thing. (VERY unusual, though.)
Another point of view (Score:2)
A fine non-copyleft free software license (Score:2)
Re:Another point of view (Score:2)
Re:There are lots of make replacements... (Score:2)
What can happen is that the new utilities are enough of an improvement for some projects to switch to, but the vast majority will stick with the existing well-known tools, since they are "good enough" and most developers already know them. This means that the programmers most likely to improve the existing tools will instead use something entirely different for their projects, so the majority will end up with worse tools than they would have without this competition.
Re:Obvious question: why? (Score:2)
Re:Language Standardization Issues (Score:2)
Aside from that, did you read my post? I said that it would be a good idea to write the dependency-collection code in a C module. But I can definitely see the advantages for writing a large part of the code in Python.
Daniel
Re:Language Standardization Issues (Score:2)
Well, there's Python
Of course, the only thing I use it for these days is writing automatic mailhandlers (it has excellent mail-parsing facilities) -- mainly because I don't run into much stuff where I need Python's capabilities. Go figure
Daniel
PS - while Python is nice, having many computer languages is a **GOOD THING**. Having a language that more closely models the problem -- for example, being able to use a functional language like Lisp or ML -- makes things a lot simpler and faster. More languages means more choices. End rant.
Re:We should move away from the database paradigm (Score:2)
If you have a "database" that also can organize chunks of arbitrary data in a tree structure, but is so much faster than ext2 at this that you believe it should be built into programs... why not wrap a filesystem driver around it and build it into every program at once?
You're right, I was not thinking of arbitrary data when I spoke of that "database". I want to exploit features of a certain kind of data that should be stored efficiently - the intermediate code representation.
We usually store the initial representation of the program (the source) and the final representation (object files, executables, libs..) but throw away the intermediate representations that compilers, assemblers and linkers calculate time and time again.
For a few cases we do keep those intermediate representations. Precompiled headers are an example of an intermediate result kept near the beginning of the translation chain; template repositories (used in AT&T cfront-derived C++ compilers) are an example near the end.
What I thought of is similar to one of those relational databases, in that the focus shifts from the raw input data to the intermediate representation, which would be kept as a whole.
Let's say the running core would consist not of a collection of files of characters, but of data structures better suited for compilation - typically a forest of parse trees and related intermediate information for the whole project.
One could import/export sources from that core, and the compilation process would be a special kind of query that exports binaries from that core.
More global optimization strategies would be possible because the compilation proceeds on the whole project, not only single compilation units as normal compilers do.
The trouble with template instantiation would be gone.
But, yes now comes the but :-), such a system is certainly complicated. I had a lot of fun with precompiled headers and template repositories in the past. So I assume it is not easy to write them in a way that they work flawlessly.
Re:There are lots of make replacements... (Score:2)
PMake - A Tutorial
Adam de Boor
If there's a missing requirement in the rules of the contest, it's the lack of a migration path from make. Without that, you just have an interesting toy, because no one will move their existing significant system without it.
Important point!
The old folks were not stupid (Score:2)
I am sure there is a lot to learn from those old systems. :)
(Why some of the VMS guys inflicted Windows NT on us is a different thing.)
That we seem to have lost some features as well, rather than only progressed, shines through in the hacks and rants of great hackers like Richard Stallman and Jamie Zawinski. Take this quote
Back before the current dark ages began, I hacked on Lisp Machines. (..)
Have you ever wondered why we're stuck with crap like Unix and MS-DOS? Read Richard Gabriel's Worse is Better paper for a great explanation of why mediocrity has better survival characteristics than perfection
from Jamie's page [jwz.org] and this quote from Richard
Yes, with string-based interpreters you can represent programs as strings, and you can interpret them as strings, but you can't parse their syntax very easily as strings. In LISP, programs are represented as data in a way that expresses their structure, and allows you to easily do things to the programs that are based on understanding them.
from a recent RMS interview [linuxcare.com]. Both refer to the LISP machines of MIT, which seem to have operated on a higher level program representation than mere strings.
I interpret these rants as saying that today's machines are stronger, but in a sense also dumber.
The question is whether one could combine the strengths of both worlds: the higher-level representation found in LISP machines and the performance of our present C-compiled systems.
Like I tried to explain above, my feeling is that this could be achieved by shifting the primary representation to something closer to the intermediate structures that arise during compilation. It would indeed have similarities to a configuration management system. Adding a line to a text source would, after check-in, result in an immediate update of a persistent parse tree in the program database core.
OpenVMS compilers and the linker can be invoked to operate on CMS objects without having to pre-fetch anything first.
Do you have any reference where I can read more about CMS? (I would be happy also to have some nice review on the strengths of the LISP machines)
If we were to implement something like this for ourselves I'd say the first thing to do is to find a lightweight, fast and efficient implementation of an object repository. Does anyone know of such?
Sounds to me like what OODBs are promising. The one I had to try so far (POET 3 under Win32 and Solaris) was horrible. No idea how they perform today, as they seem to be two major revisions farther.
Re:Source or everything; keeping temporary data (Score:2)
We've really got two things here.
No, I am not talking revision control repository here. I am talking moving towards a representation that is a bit further down the road of compilation.
Obviously it is possible to represent a program at many different stages:
If we treat each translation step as a mapping between representations, I can draw the compilation process this way:
1 -> 2 -> 3 -> 4 -> 5 -> 6 -> 7
In fact, not all mappings are one-way (losing information); some are (or could be made) bijective:
1 <-> 2 <-> 3 -> 4 -> 5 -> 6 -> 7
I now suggest focusing more on the program in the representation around stage 3. Why?
Because this is the representation in terms of the language - the one that the compiler, for example, works on for code generation.
And I am not thinking of just compilation here. I am thinking more of turning the collection of sources into a database of things better understood, e.g. program-related units like classes, functions, macros, ..
Take Emacs for example. Press Esc-x and then spc and you get a listing of all available functions. The Emacs kernel "knows" of all its functions. Compare that to grepping a deep C/C++ source tree for some function. That grepping should be replaced by a more appropriate query on a more appropriate database. (Like that Esc-x spc sequence is a query on the dumped Emacs kernel).
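Python's standard ast module makes a toy version of that query easy: parse the source once and walk the tree for named units, instead of grepping text. A sketch only; a real program database would of course persist and incrementally update this index.

```python
import ast

def index_functions(source):
    """Map each function/class name to the line where it is defined."""
    tree = ast.parse(source)
    units = {}
    for node in ast.walk(tree):
        # collect named program units: functions (including methods) and classes
        if isinstance(node, (ast.FunctionDef, ast.ClassDef)):
            units[node.name] = node.lineno
    return units

src = """
def build(target):
    pass

class Scanner:
    def scan(self):
        pass
"""
print(index_functions(src))  # {'build': 2, 'Scanner': 5, 'scan': 6}
```

The query "where is scan defined?" then becomes a dictionary lookup rather than a tree-wide grep.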
If the source has changed, you are going to have to rebuild the output anyway. If the original has not changed, you can just use the object file from revision control. What is the point?
Not every change to the source would result in a complete rebuilding of the internal structures. Some mechanism, like the access optimization found in relational databases, would have to decide whether the whole has to be rebuilt, or whether only parts have to be changed. This is certainly one of the hard parts.
Now, I suppose you could argue that you don't always need to recompile an entire source file; you may have changed just one function. But to know that, the compiler is going to have to do a source analysis of it anyway.
If I edit at the text stage, you are right: possibly most changes will result in a new translation. If I edit at the language stage ("add a function", "delete that class"), not necessarily.
Recursive Make Considered Harmful (Score:2)
Re:There are lots of make replacements... (Score:2)
Here [freebsd.org] is the one I meant (HTML [freebsd.org], PS [freebsd.org], ASCII [freebsd.org])
Referees - why DiBona? (Score:2)
As far as I understand ("scriptable from Python"), they want to be able to run it from Python.
What I am more surprised about is to see Chris DiBona on the referee board.
He worked on the "Open Sources" book and is the leading PR guy of VA Linux (for a bad example of his advocacy style listen to this interview [tamu.edu]). But there it ends. He is certainly not picked because he is an expert of tool design.
Does anyone know about the other referees?
We should move away from the database paradigm (Score:2)
If you have a "database" that also can organize chunks of arbitrary data in a tree structure, but is so much faster than ext2 at this that you believe it should be built into programs... why not wrap a filesystem driver around it and build it into every program at once?
Re:We should move away from the database paradigm (Score:2)
I'm not sure exactly how stuff is stored in CMS (apart from the obvious fact that source deltas are stored as diffs) but I'd guess that there must be some sort of efficient indexing involved since access to such a store doesn't have to be quite as flexible as a general purpose file system.
If we were to implement something like this for ourselves I'd say the first thing to do is to find a lightweight, fast and efficient implementation of an object repository. Does anyone know of such?
Consciousness is not what it thinks it is
Thought exists only as an abstraction
Yeah, right... (Score:2)
I can't see anyone slapping together a tool to replace make any time soon... that's one piece of heavy parsing code.
alternative funding model (Score:2)
The free market is a wonderful thing, and you don't want to discard the parts of it that work well. It's not unreasonable to offer compensation to somebody to write a useful piece of software.
I recently came across a proposal by an economist (in the UK, I think) called "social policy bonds", which is applicable here. His proposal was that the government would create a financial instrument (a piece of paper) which could be redeemed for a fixed amount of money when some measurable social goal was fulfilled. Once the bonds were created, they would be auctioned to the highest bidder. A further free-market tweak could be put on the idea: bonds are issued by individuals rather than the government. Rather than collected tax dollars, an individual puts a chunk of cash in escrow with a private financial institute, which gives the individual a certificate serving the same function. If the condition is met, anybody can redeem the certificate and take the money out of escrow. (Until that time, the escrow agency can invest the money, or collect interest on it.)
I've discussed this idea with a couple of banks in my area and they aren't interested in acting as escrow agents. The idea is too weird and new for them. Maybe I'll try insurance companies. The viability of a certificate is contingent upon the reputation of the escrow agent.
A similar instrument could be used in place of Software Carpentry's competition. It would remove the stipulation that only one person could prosper for each goal. People would be able to profit by contributing to the efforts of others. As with shares of stock, all the owners have an incentive to cooperate to cause the price of the shares to rise.
The economist who originated this idea is named Ronnie Horesh. His proposal [geocities.com] goes into much greater detail than I have done here. It's a cool idea, probably useful for many different goals.
Re:They mandate Python!? (Score:2)
It doesn't. It makes it easier for the evaluators to evaluate. Once they get a set of winners, why don't we port 'em to C++?
Re:Wrong license, I'm afraid (Score:2)
Why should anyone care what RMS thinks about someone else's project? Have we all lost the ability to think for ourselves?
Re:Factionalism (Score:2)
I seem to recall the original post in this thread called MIT/X the "wrong license". Are you saying that it's okay for GPL advocates to criticize MIT/X, but that it's at the same time wrong for MIT/X advocates to criticize GPL?
Re:Wrong license, I'm afraid (Score:2)
Unfortunately, the legalese in the GPL creates some very problematic restrictions above and beyond "giving back" to the community. First of all, anything given back must also be GPL. Second, anything that dynamically links to a GPL library (remember, RMS doesn't want you to use the LGPL) must also be under the GPL, even if the code is 100% wholly your own.
"The GPL is far less restrictive than any commercial license with which I'm familiar."
So what? Khrushchev was less restrictive than Stalin, but what difference does that make? However, a commercial license does have one huge advantage over the GPL: no commercial library that I am aware of tells the developer what license they must use.
Re:Wrong license, I'm afraid (Score:2)
Of course we are aware of that! Do you think we are children unable to read the license for ourselves? We are adults and fully capable of deciding for ourselves what to do with our lives and our code.
Re:Wrong license, I'm afraid (Score:2)
People should use whatever license they choose. Hopefully they would choose a license appropriate for their projects. It's not much skin off my back if they choose something I don't agree with. It's their life and I'll let them run it. Even if they choose a commercial license. Even if they choose the GPL for a library.
And as for the lack of chafing under commercial licenses, re-read my post. No commercial license dictates what license you may or may not use.
Re:Wrong license, I'm afraid (Score:2)
I can write an MFC application and license it under **ANY** license I choose, including GPL, BSD, or public domain, and I can freely redistribute the MFC DLLs. Compared to the GPL, the MFC license is much, much freer for me to *use* in the way that libraries are used.
If a developer wishes to use the GPL for a library, then that is their right. But understand that by doing so, they deny the *use* of their library to everyone developing under a different license or using a non-GPL library.
Re:Wrong license, I'm afraid (Score:2)
But he did call for taxation on all software in order to fund free software. And he did call for the elimination of artistic rights.
No other faction of Free Software, or even commercial and proprietary software, has the temerity to *demand* what other developers do with their software.
Re:No way you are getting away with that! (Score:2)
With the MIT or BSD licenses, I can include the entire readable and understandable license at the top of each and every source file. I don't have to wonder if I'm going to be sued for using library A with library B. In short, since there are fewer restrictions with the BSD or MIT, programming is easier (unless you like the lack of choice inherent in marxism).
"Just don't try telling me on one hand that the GPL is evil because it forces people to give away their code, and then encourage people to license under the BSD so the code can be "truly free"."
No one was calling the GPL evil (at least in this thread). But it's absolutely ludicrous to call the GPL more free than the BSDL on the fact that the GPL is more restrictive. And though it's not evil, it is questionable that the GPL requires its users to redistribute political propaganda.
Re:Wrong license, I'm afraid (Score:2)
No, which is why I prefer any Open Source license to Microsoft's EULA. However, there are many conditions where I am FREER with MFC than with a GPLd library. And one of these is a very common condition, that of writing an application.
If Microsoft says "you can do A but not B" and GNU says "you can do B but not A", then there will be instances where the GPL will be the more restrictive license.
"There is still the LGPL, whose use is discouraged by RMS, but which is preferred by RMS to any other license, save the GPL, for libraries."
The contender for my biggest beef with the GPL is the restriction against dynamically linking to non-GPL code. RMS is just plain wrong on this issue. It is at this point that the GPL escapes the bounds of polite society and starts sticking its nose where it doesn't belong, namely, other people's code. And this isn't just some petty gripe. The very desktop that I am using has been declared *illegal* by Redhat.
Re:No way you are getting away with that! (Score:2)
But you were arguing that the GPL makes programming easier. Stop changing the subject.
"...but apart from that I don't know what you are talking about."
I'm talking about that whole introduction at the top of the GPL before section zero.
Re:Freedom (unless you're not like us)? (Score:2)
For a crowd that likes to toss around the words FREE and FREEDOM as much as they do, they are very stuck on the notion that there is only one right way to do something.
The quickest way to earn the enmity of the Free Software community is to freely and voluntarily choose something. Choose *BSD and they bitch about proprietary exploitation. Choose KDE and they moan that it's illegal. Choose Redhat and they kvetch that it's too commercial.
Choose to award someone $100K in a contest to improve autoconf and they're incensed that someone would spend his money without asking permission of Slashdot first.
Re:Is make too difficult? (Score:2)
I think you just answered your own question.
Re:Freedom (unless you're not like us)? (Score:2)
However, I notice that most people bitching about this are not complaining about the source of funding, rather they are bitching that money is going someplace other than where they want it to.
Re:They mandate Python!? (Score:2)
What exactly is the problem? Yeah, I guess Python won't be installed on every machine - but neither will any of those new tools. And if you are going to install new things, what's the problem with installing Python?
-- Abigail
Re:Another point of view (Score:2)
I don't recall anyone saying to Linus, "Why are you fixing Minix? It ain't broken!" Sometimes, you just want to find out whether you can do better than what's already there.
-- Abigail
Re:Wrong license, I'm afraid (Score:2)
RMS isn't God, GNU isn't a religion, and the FSF isn't a bunch of prophets. We all know RMS is hardheaded and only believes in himself. But many people have found there are more ways leading to Rome. You might have an opinion about the choice of license, but just shouting "Wrong license!" and waving a political document written by someone else as "evidence" doesn't get you very far.
If you don't like the license, don't participate. Don't use any of the products that might be created with this license. You might even write something better, and release that under your preferred license.
Just don't act as a doomsayer.
-- Abigail
Re:Wrong license, I'm afraid (Score:2)
This suggests worse things than can actually happen. While the X (or MIT) license allows you to relicense code for distribution, it doesn't allow you to strip off an existing license. What that means is that if I write code, and license it under the MIT license, you can take the code, possibly modify it, and distribute it under a license of your choice (GPL if you wish). But that doesn't mean I can no longer distribute my code under the MIT license. I don't lose rights. But I give you more rights than you would get had I distributed the code under GPL.
You can take MIT-licensed code, modify it, and distribute it as GPL code. You can't do it the other way around. MIT-licensed code allows you to do anything that GPL code allows you to do - and then some. It's not hard to figure out which gives you more freedom. I prefer MIT-style licenses because I don't want to back up my preferences with legal actions.
-- Abigail
Source or everything; keeping temporary data (Score:2)
We've really got two things here.
The first is the question of keeping just "source" files under revision control (and by source, I mean anything you use to build an executable - code, images, resources, etc.), or keeping everything (object files, final executables, etc.) under revision control.
The argument for "just source" is that you should always be able to build an identical finished product from the proper source, and so keeping generated output files around is a waste of machine resources. You also run into fewer problems with the unexpected dependencies and conflicts you encounter during debugging.
The argument for "everything" is that you can reduce build times by having those output files pre-generated, so the burden of rebuilding on a change is put on the person making the change, and everyone else just uses their output. You can also make the argument that finding the above-mentioned dependencies early on will lead to better code.
As far as I'm concerned, this is largely a matter of opinion, and you should go with whatever works for you.
Now, the second issue is a bit different. You talk about storing intermediate data, such as parse trees, as an extension of the "everything under revision control" method. What benefit does this get you? If the source has changed, you are going to have to rebuild the output anyway. If the original has not changed, you can just use the object file from revision control. What is the point?
Now, I suppose you could argue that you don't always need to recompile an entire source file; you may have changed just one function. But to know that, the compiler is going to have to do a source analysis of it anyway, so why bother trying to cache the output? If your source files are big enough that this is a significant problem, you probably need to look at splitting up your source a bit more. It isn't just increased build times that are at issue here; programmer comprehension drops the bigger a source file gets.
Don't get me wrong; I'm not trying to shoot you down here. I'm just trying to see what benefits one would get from the ideas you are suggesting.
Incidentally, Borland's Incremental Linker, used in C++Builder and Delphi, does do something similar to what you are suggesting. If you change one object file and go to relink to make an updated executable, it simply replaces the parts of the executable that depend on the changed object code. The parts that did not change stay the same. Saves a little time.
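If one did want to keep compiler output around without trusting timestamps, one option is a cache keyed on a content hash of the source, so an unchanged file is never recompiled and a changed one always is. A toy sketch, with every name invented for illustration:

```python
import hashlib

class ObjectCache:
    """Toy compilation cache: compiled output is stored under a hash of
    the source text, so identical source is never compiled twice."""

    def __init__(self):
        self.store = {}          # content hash -> compiled output
        self.hits = self.misses = 0

    def compile(self, source_text, compile_fn):
        key = hashlib.sha1(source_text.encode()).hexdigest()
        if key in self.store:
            self.hits += 1       # identical source seen before: reuse it
        else:
            self.misses += 1
            self.store[key] = compile_fn(source_text)
        return self.store[key]

cache = ObjectCache()
fake_compile = lambda src: "obj(%d bytes)" % len(src)
cache.compile("int main() { return 0; }", fake_compile)
cache.compile("int main() { return 0; }", fake_compile)  # unchanged: hit
cache.compile("int main() { return 1; }", fake_compile)  # changed: miss
print(cache.hits, cache.misses)  # -> 1 2
```

The point is that the cache pays off exactly when sources do not change, which is the same case the "keep everything under revision control" camp is optimizing for.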
Re:They have got to be kidding... (Score:2)
That said, I still prefer makefiles. Much easier just to set up a general rule and tell make to go to it.
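For concreteness, the kind of general rule in question is only a few lines of Makefile; a minimal sketch with made-up file names (recipe lines must start with a tab):

```make
CC     = gcc
CFLAGS = -Wall -O2
OBJS   = main.o util.o

prog: $(OBJS)
	$(CC) -o $@ $(OBJS)

# One pattern rule builds every .o from its .c:
%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@
```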
No way you are getting away with that! (Score:2)
What's wrong with a bit of Marxism, anyway? It sure makes programming easier - which is what I care about.
Glad to see someone is watching out for the poor IRS!
But at least then when that programmer is gone you can guarantee that you have an up to date copy of the source - try that with a commercial licence - and don't give me that line about how all BSDL software producers are nice guys who throw in the code for free - that is the same as GPL, and invalidates your argument.
Until you need to fix a bug in that library, and find you don't have the source and the company is out of business.
As opposed to your well reasoned and level headed criticisms of the GPL, say?
You know what I like best about the BSD licence? The fact you can re-licence code under it to the GPL. Life is beautiful, sometimes!
Look, I have nothing against the BSDL - it is a nice licence if you want to let others make money off your code. Just don't try telling me on one hand that the GPL is evil because it forces people to give away their code, and then encourage people to licence under the BSD so the code can be "truly free".
GPL - all the source, all the time!
Re:Making programming easier (Score:2)
Sure I want to see "the state of the art advance" - and I want to see the code. With the GPL I can and will.
LMAO! Somebody has a big chip on his shoulder!
Which invalidates your arguement (again)! (Score:2)
The source code is available for FreeBSD & OpenBSD, right? There is no difference between that, and GPL'ed code, except that GPL'ed code can never be hidden, while BSDL code can.
That means that if a company wants to make some advance then with Linux they have to release the code (ignoring Binary Kernel Modules) while with *BSD they don't.
I guess you'll say "but they often do release the code after a while" - well, I don't want to wait and hope. I want it guaranteed!
Re:No way you are getting away with that! (Score:2)
But neither the BSD nor the MIT licence forces you to include the source. That is what I like about the GPL.
No one is claiming that the GPL is more free than the BSDL. It's not - and that is why I like it.
As for political propaganda - well, I guess it forces you to redistribute the source, which can empower those who normally wouldn't have it, but apart from that I don't know what you are talking about.
Re:Is this the best way to invest $100k in Linux? (Score:2)
Whilst I applaud any company who's ready to spend substantial wads of cash on OpenSource development, I really think that competitions are the wrong way to go about it:
I thought this at first, too. Seems like a dumb thing for an "Open Source" company to do, since it appears to encourage competition at the expense of the OSS model. But upon reading the webpage I discovered there's more to the story.
Actually, after the initial design phase, finalists are monetarily encouraged to join forces, since they can double their take if they win.
This is not obvious from just reading the blurb, but the biggest part of the competition is purely to design such tools; implementation details are discouraged. Though a good design is quite hard, it doesn't take six months, especially if the designer in question has already bounced around such ideas. Perhaps the money will draw out a designer who's been itching but hasn't yet scratched.
Also, the deadline for the initial design submissions is March 31, so the most one could waste is ten weeks of planning. (And since it's just a design, there's no debugging!)
From the rules (emphasis is mine): "Designs based on existing tools, written in any language, are welcome. Such designs will be judged on the same basis as those written from scratch." So this is still possible.
You're right on with this one. The rules of the game change a bit when we're talking about quantities of money rather than merely number of listings in the CREDITS file.
Failing that, break the money up into $20k 'grants' and offer them to people who are already working in the right direction.
As I said, existing projects can compete. And the $100,000 is broken up into $2500 apiece for each of four finalists in the four categories, and then a second award of $7500 for the final winner in those categories and $2500 each for the runners up.
This competition is A Bad Thing.
Again, that's what I thought at first. But after reading the rules I had to agree it's not as dumb as it sounds. They're merely putting some money into trying to find the best design for some new tools (since they're claiming that the inherent design limitations of the existing tools are what they're trying to overcome). Then once the initial submissions are weeded through, they basically let the OSS model take over, even monetarily rewarding finalists who join forces and end up with a better final design.
Use of autoconf? (Score:2)
If so, wouldn't it be a better approach to use a standardized high-level programming language that is a bit more away from the operating system? I know C cannot be dropped immediately, but if more and more tools are built to cover its shortcomings...
OTOH, I like make, I'm using it on all kinds of platforms (Linux, Sun OS, Win32) to create programs in different languages or just to rebuild some LaTeX source.
Re:Use of autoconf? (Score:2)
Autoconf also simplifies creating a distribution (make dist), putting in checks into your code (make check), installing and uninstalling correctly no matter where the prefix is, and checking for the correct libraries before compile time.
Not that I don't understand make - in fact I think that using autoconf and scanning its output has helped me get a handle on how make does its magic - but doing it by hand? Eeek.
A standardised programming language would work. I think Perl makefiles are already a step towards that idea. But a new one would take time to catch on. Nearly all GNU stuff comes with configure scripts. I get nervous when I download source and it doesn't have a configure because 90% of the time, something is wrong and it needs a bit of makefile hacking.
A front-end for autoconf would be a good idea as proposed by some previous posters. Those who want to learn it before a nice GUI comes out or a replacement is found can check Havoc Pennington's online book [gnome.org] which has a chapter on the package.
Is make too difficult? (Score:2)
How can one expect to actually call himself a developer if he can't even manage to understand Makefiles? IMHO the standard make does wonderfully what it was designed for, and the GNU make is half-way to creeping featurism [jargon.org]. The consistency and interoperability of each of the system utilities adds elegance to Unix, and learning how to use them helps to keep the brain working! :) Stupidifying it is a mistake.
This isn't elitism, but I believe that replacing make(1) to make it more accessible to dumb people doesn't make sense, except if you dumbify the programming languages as well.
(I actually like autoconf, and I have been using it in some of [helllabs.org] my projects [helllabs.org], but the Makefiles generated by automake are just too bloated. I use my own very nice recursive Makefiles to build and package the system, and it works quite well!)
Re:Wrong license, I'm afraid (Score:2)
And, the LGPL doesn't have this problem. The LGPL also allows you to distribute modified source. So, if you find the GPL too restrictive for your libraries, how about the LGPL?
I never hear this complaint about commercial licenses that generally completely disallow any modifications to the distributed (often binary) product whatsoever, even when such changes are for your use only. Aren't they saying what you can do with your code?
If you feel the GPL "sticks its nose where it doesn't belong" there is an easy solution, Use no GPL code. You are no worse off than if the GPLd code had not been provided for your use.
Heck, use the GPLd code to test against as you develop your own workalike code. I'm sure the FSF, unlike a lot of commercial licensors, would not involve itself in a look-and-feel lawsuit against you.
So, it upsets you that you've run afoul of some licensing restriction. Seems like a petty gripe to me.
I've often wanted to do things with commercial software that's not allowed by the license. For example, I often want to install it on more than one machine even if it's absolutely guaranteed that it will only be in use on one at a time, like a laptop that only I use and a desktop that only I use. The license prevents me from doing it. It's frustrating, but I knew when I bought it what the license allowed. Petty of me to complain about something to which I implicitly agreed by buying and then using the product.
-Jordan Henderson
Re:Wrong license, I'm afraid (Score:2)
So what? Khrushchev was less restrictive than Stalin, but what difference does that make? However, a commercial license does have one huge advantage over the GPL: no commercial library that I am aware of tells the developer what license they must use.
So, I take it by your analogy that you feel that any restriction on a license is evil. I suppose you are suggesting that all software should be covered under a BSD license or something.
I was addressing the point that the poster felt that the GPL places "massive restrictions" on you. I think the restrictions are pretty similar to those placed on you by commercial licenses, only less restrictive, yet you don't hear complaints about people chafing under those.
-Jordan Henderson
Re:Wrong license, I'm afraid (Score:2)
You seem big on throwing around unsubstantiated claims. Please cite the tax law that forbids 501(c)(3) corporations from competing with for-profit entities.
Mitre (a 501(c)(3) corporation) submits competitive bids, and often wins, against other Defense contractors. Another company that I'm familiar with, CTC (Concurrent Technology Corporation, a 501(c)(3) corporation), competes with for-profit companies to get various government contracts. Last I checked PBS (television) and NPR (radio) stations (all 501(c)(3)) compete with for-profit media.
Congratulations, you've found one of the very few OS vendors that doesn't support their own compiler. I wonder if BeOS would even exist were it not for the great freely usable compiler technology?
Your claim that gcc has eliminated all compilers except "a few embedded niches and Microsoft Windows" is bizarre. Sun, Compaq, IBM, SGI, and HP all have quite active compiler groups. Intel builds a compiler for their architectures. There never was much of a third party market for compilers before gcc, most compilers were provided by the OS/Hardware vendor and they still are.
Gcc succeeds through "predatory pricing"? Another interesting claim. By this reasoning, I suppose ALL freeware should be stopped as being "predatory".
In any case, predatory pricing is only illegal if it seeks to establish an illegal monopoly. Someone giving something away that continues to give away their product is not disallowed. If it were, we'd be shutting down the people who voluntarily clean the roadsides as being predatory against the businesses that do the same.
I'm still waiting to see some reason in rebuttal. Did you know that US Senate rules forbid the use of the word "Demagogue"? It's because it's an empty epithet that adds nothing to the discussion. RMS doesn't throw around labels and call that debate, unlike his detractors.
I'm familiar with that history. So what? RMS saw what he considered an injustice and moved to correct it. He didn't move for restrictive legislation or just sit around and complain about it; he got to work building something to correct an injustice. It's hardly spite directed at Symbolics, as it's not reasonable to assume that they would ever use GPL'd software. You might consider it spite against all who commit this, to RMS's thinking, wrong, but isn't that what correcting injustice is all about?
Again, RMS's detractors characterize him with labels (spiteful, malicious) while not actually adding anything interesting to the discussion of ideas in which RMS engages.
-Jordan Henderson
Re:Wrong license, I'm afraid (Score:2)
You are wrong. Typically, when you have a commercial license that includes source (ala SAP) you are only allowed to use that source under the original license. That's dictating what license you may or may not use. You are not allowed to redistribute it at all.
-Jordan Henderson
Yes, you are Incorrect on several points (Score:2)
Arrogance and ignorance are a bad combination.
I direct your attention to this page [mitre.org], where it is stated:
Mitre was a 501(c)(3) that broke up recently into two 501(c)(3) companies, Mitre and Mitretek.
Being a 501(c)(3) doesn't mean you are a traditional "charity", although you could indeed give tax deductible contributions to Mitre.
Although I can't provide a web reference for CTC, I recently attended a briefing at CTCs headquarters where it was stated plainly that CTC is indeed a 501(c)(3) corporation. In fact, the briefing materials did mention that you can make tax deductible contributions to CTC!
So, perhaps I was right initially. People do routinely libel RMS. You accused him of committing criminal fraud in his incorporation of the FSF as a 501(c)(3) entity.
Gosh, do you need more examples of 501(c)(3) corporations that compete with for-profit corporations? Many Hospitals and some HMOs are 501(c)(3) corporations. There are both 501(c)(3) and for-profit consumer counseling services. I could go on. Your assertion that 501(c)(3) corporations are not allowed to compete with for-profit corporations is absurd.
I suppose the rest of your unfounded conjectures and suppositions are about as reliable.
It's no surprise that RMS, the FSF and the GPL are so negatively represented in the Computer Industry Press when columnists routinely bluster authoritatively on subjects about which they know nothing.
-Jordan Henderson
Re:The Birth of the GPL (from Stallman himself) (Score:2)
Your claim was that the GPL was designed to lower programmer's salaries.
The section that you quoted is in response to the anticipated question "Won't everyone stop programming without a monetary incentive."
In the answer, RMS says that won't happen (and in fact he's been proven correct by Linux) and then goes on to posit how the existence of this large body of software will reduce the incentive for people to produce non-free software, with which they could probably make more money.
This is just a recognition that you can make more money from producing something that is scarce versus producing something that is freely available. People who object to the GPL often seem to enjoy promoting an artificial scarcity to forward themselves.
The Holy Grail of Software Engineering for 40 years has been reuse, reuse, reuse. These artificial scarcities have served to make software reuse very spotty and poorly practiced. The GPL is the only license that enforces a discipline of software reuse. This is a good thing.
As I said, RMS saw that as an effect of the GPL, not as a guiding principle. If it had been, it would have been up front in the Manifesto per se, and not in the anticipated objections section.
Look, I don't agree with RMS on everything. His redefinition of the term "free" is not really completely reasonable. He has an unjustified utopian view about a post-scarcity world. I also don't agree that the wide adoption of GPL'd software will mean less pay for programmers. Most of the remuneration for programming I've received is for maintenance work or pay for specific modification that would typically not be available in public sources. The GPL only serves to increase the opportunities for pay for this type of work.
But, I find RMS's opinions reasoned. I find his detractors hysterical. For example, you said that it's not far off the mark to call RMS a communist. A communist wouldn't say:
As RMS states in his GNU Manifesto [fsf.org]. Somehow, RMS's protestations and plain statements that show he is not a "Communist" are always ignored.
-Jordan Henderson
Re:Wrong license, I'm afraid (Score:2)
Are you allowed to modify the MFC and redistribute it under ANY license?
If not, it seems like the license you receive with the MFC is more restrictive, in a very important way, than the GPL.
There is still the LGPL, whose use is discouraged by RMS, but is preferred by RMS to any other license, save the GPL, for libraries. The LGPL grants you, if I understand it correctly, the same freedom from restrictions that you enjoy with the license that comes with MFC.
-Jordan Henderson
Re:Wrong license, I'm afraid (Score:2)
The referenced page [gnu.org] doesn't ask us to accept anything on faith. RMS doesn't support worship of him or any of his principles. RMS primarily makes reasoned arguments. You may disagree with those arguments, but if you were up to the challenge you'd use reason yourself.
RMS tells us his view as to why other licenses, and specifically the X/MIT license can lead to problems. It's a pity his detractors have to bring up these tired cultist labels that actually serve to remove reason from the discussion.
It's funny that with all of the libel that RMS takes for his stands, you never see RMS suggest that others who don't support the GPL should stop using GPL'd products. Gcc comes to mind as something that has benefitted many in the Open Source "Community" who snipe at RMS, the FSF and the GPL.
Now we can see the philosophy of some of those who support other "free" licenses. They are factionalists. The FSF and the GPL support Freedom in software. The software is permitted to be used by anyone, even those who work against FSF goals.
-Jordan Henderson
Re:Yes, you are Incorrect on several points (Score:2)
You're making this up as you go along, right?
The examples I gave were all 501(c)(3)s. As I said, it's not my definition: each of these organizations applied for and was granted this status by the IRS. As you say, see Publication 557 [fedworld.gov] for details.
While the functions of 501(c)(3) organizations may not seem to be "charitable" in function, they are generally referred to as "a 501(c)(3) charity". Donations can generally be made to these organizations on a tax-exempt basis. As I said, charity is broadly defined. Both Mitre and CTC refer to themselves as 501(c)(3) charities. As this IRS [irs.gov] page states:
The exempt purposes set forth in 501(c)(3) are charitable, religious, educational, scientific, literary, testing for public safety, fostering national or international amateur sports competition, and the prevention of cruelty to children or animals. The term charitable is used in its generally accepted legal sense and includes relief of the poor, the distressed, or the underprivileged; advancement of religion; advancement of education or science; erection or maintenance of public buildings, monuments, or works; lessening the burdens of government; lessening of neighborhood tensions; elimination of prejudice and discrimination; defense of human and civil rights secured by law; and combating community deterioration and juvenile delinquency.
Note also that as long as the corporation involves itself in its original chartered function, it can keep its tax-exempt status.
You can contribute, tax-deductibly to Mitre, CTC or the FSF because of their "charitable" status.
The real issue is that you accused RMS of perpetrating a fraud by invalidly constituting a 501(c)(3) corporation. You went on to claim that such corporations were not allowed to compete with for-profit corporations. Both claims are bunk and any examination of the relevant tax code sections demonstrates this pretty clearly.
-Jordan Henderson
Re:Wrong license, I'm afraid (Score:2)
Perhaps libeled is a bit strong. You do see RMS alternately called a communist, a Svengali, a fraud and a number of other things. He generally is not criticized in a way that could legally be termed libel.
An amazing claim. Let's see, every single OS vendor has a compiler suite which they heavily support. Most chip manufacturers have a compiler (Intel, Motorola, IBM) for their architecture. There are any number of companies that sell commercial compilers. There is lcc. Seems like there are many, many, many alternatives.
Oh, you mean a good, free, cross architecture compiler that really works well?
Since you make a speculation about a market that could have been had it not been dominated by that mean, destructive GPL'd gcc, allow me to make one. Intel, IBM, Compaq (and DEC), Motorola and probably any number of other companies spent a lot developing code generation for gcc. Had the GPL not tied their hands and required them to give these changes back, these companies almost certainly would have sold the modified compiler as a product. After all, each of these companies had their own compilers that they sold as products. Why would they give away their work on gcc had they not been forced to?
Rather than gcc destroying the market for a good cross architecture compiler, it set up an environment where such a thing could thrive.
Compilers (FORTRAN, Algol, Cobol) and free software have both existed since the 1950s. It wasn't until a GPL'd compiler appeared that a good, free, cross-architecture one did.
The GPL is shorter than most EULAs, and it only really places one restriction on the code: you can't take advantage of this community work unless you are willing to participate in the community by giving back. It may be more restrictive than the X/MIT or BSD licenses, but I don't see how you can reasonably call this a "massive restriction". The GPL is far less restrictive than any commercial license with which I'm familiar. What would you call commercial licensing? Tremendously restrictive? Unbelievably restrictive?
Sure. He unfairly gets called all kinds of names. A reasoned discussion [gnu.org] of issues of various "free" licenses is met with "RMS isn't God, GNU isn't a religion and the FSF isn't a bunch of prophets." and absolutely no substantive arguments and Stallman's agenda is one of spite and malice?
Another amazing claim. What public good could these Open Source programmers be considering that is not allowed by the GPL, except turning an Open Source product into a Closed Source product? Seems like any other kind of Open Source license allows programmers to transform open source from a public good into a private good. You speak of engaging in "activities" (plural), but really there's only one activity the GPL doesn't allow. That's benefitting from others' work without giving back changes to the community that gave you your start.
-Jordan Henderson
Free Software : The Price is Right (Score:2)
I'm sure this has good intentions, but will OSS developers of the future have to compete against each other with duplicated efforts to write the best free software, in hopes of winning cash and prizes?
Sounds like some kind of game show to me...
--
Re:Is make too difficult? (Score:2)
The good idea in make is the dependency graph. Having a dependency graph that accurately describes what goes into your product is a very useful thing. But make doesn't let you use that dependency graph in multiple ways. You can't use it for linking. You can't use it to decide what goes into a distribution file. And it isn't enforced; there's nothing that checks that there aren't dependencies on things not in the graph.
Make's "time greater than" approach to determining if something needs to be built is terrible. If make actually logged what the last compile used as inputs, it would be safe. Instead, the usual answer to doing anything that involves the possibility of a file changing to an earlier date is to either recompile everything manually or use something like "touch" to fool make.
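As a sketch of both complaints, in Python since that is what the competition mandates (every name here is invented): a build model where the dependency graph is explicit data you can query for more than one purpose, and where "out of date" is decided by comparing recorded content fingerprints from the last build rather than timestamps, so a file restored to an earlier date is still caught.

```python
import hashlib

# The dependency graph as explicit data: target -> list of inputs.
graph = {
    "app":    ["main.o", "util.o"],
    "main.o": ["main.c"],
    "util.o": ["util.c"],
}

def build_order(graph):
    """Topological sort: every input is visited before the target using it."""
    order, seen = [], set()
    def visit(node):
        if node in seen:
            return
        seen.add(node)
        for dep in graph.get(node, []):
            visit(dep)
        order.append(node)
    for node in graph:
        visit(node)
    return order

def fingerprint(text):
    return hashlib.sha1(text.encode()).hexdigest()

# A log of what the last build actually used, instead of mtime comparisons.
sources  = {"main.c": "int main(){}", "util.c": "void util(){}"}
last_log = {"main.c": fingerprint("int main(){}"),
            "util.c": fingerprint("void util(){} /* old */")}

stale = [f for f, text in sources.items()
         if last_log.get(f) != fingerprint(text)]

print(build_order(graph))  # dependencies come before the targets that use them
print(stale)               # -> ['util.c'], even if its mtime went backwards
```

The same graph could just as easily answer "what goes into the distribution file" or "what does the link line need", which is the reuse make doesn't offer.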
Some of the modern IDEs, notably CodeWarrior, have rethought the dependency issue and seem to have got it mostly right. (MSVC is still cranking out makefiles internally.) But the UNIX toolset is still stuck in the 1970s.
I rewrote make once, in 1979, to fix some of these problems. (I couldn't distribute it, so it was only used in-house.) I'm disappointed that in twenty years, nobody has rethought make. All they've done is add cruft.
Multi-item response from the project coordinator (Score:2)
[From the Software Carpentry [software-carpentry.com] project coordinator]
Thank you all for your postings regarding the Software Carpentry [software-carpentry.com] project. To answer some of the points that have come up several times:
This is a design competition, rather than a programming competition. Good entries should be relatively language-neutral --- we believe that at the 5000-word level, the similarities between modern object-oriented languages (C++, Java, Python, etc.) are more important than their differences.
Designs based on existing tools are very welcome. If, for example, you think the only way to meet the criteria for the "build" category is to extend the syntax of standard Makefiles, then please submit that as a design. (However, for the reasons discussed in the FAQ [software-carpentry.com], if your plan for an implementation is simply to provide a Python scripting interface to GNU Make, you'll have to convince the judges that there's no "pure Python" way to achieve the same ends.)
No, Software Carpentry is not a company looking for some publicity. The project is being funded by Los Alamos National Laboratory [lanl.gov], who believe that computational scientists and engineers need easier-to-use software engineering tools, and administered by CodeSourcery, LLC [codesourcery.com], who believe that those tools would be of use to the whole Open Source community. The FAQ [software-carpentry.com] talks about LANL's reasons for funding the project, as does this article [ddj.com] from Dr. Dobb's Journal [ddj.com].
Yes, one of the project's goals is to give up-and-coming software designers a chance to get some attention, just as architects and classical musicians do.
Yes, the competition is open to submissions from any country.
No, this is not part of some perfidious Pythonesque plot for world domination :-). We thought very seriously about using Perl for the implementations, but after teaching classes in both Perl and Python at Los Alamos National Laboratory, came to the conclusion that the latter had a gentler learning curve. (This is not meant as disparagement of Perl as a tool for full-time professional programmers; it is simply an empirical observation of computational scientists and engineers.) Neither Guido nor any other member of the Python development team had any part in setting up the project, choosing Python, or choosing the competition categories.
They mandate Python!? (Score:3)
Don't get me wrong, I have nothing against Python, or scripting in general, but these tools scream C or C++ to me.
I can understand wanting to standardize on one language to help make this "suite" a cohesive whole, but they've got to select the right tool for the job. Hell, I don't even have Python installed on most of the boxes I use, but you can bet C and C++ will always be there.
From their FAQ... "Requiring that all tools be written in, or scriptable with, a single language will make it easier for newcomers to learn, use, and extend these tools."
How does implementing a tool in a scripted language make it easier for newcomers to learn and use?
Oh well, other than that mandate this looks like a really cool project. I wish Software Carpentry all the luck in the world!
There are lots of make replacements... (Score:3)
The same is true for all the programs they want to replace. At best, this competition will give some developer experience they can use for enhancing the standard tools. At worst, it will divert some free software talents towards enhancing and maintaining a little used set of alternative tools, rather than enhancing the tools used by the rest of the community. Most likely, someone will have wasted US$200,000.
Re:They mandate Python!? (Score:3)
Well, I assume that they'll allow Python code in the control files themselves, the same way Makefiles allow sh code and autoconf allows m4 code. Writing the tool in the interpreted language makes this easier -- I suppose you could try to optimize by writing most of the code in C, then providing Python-visible hooks and calling the interpreter as appropriate; this might be less useful than you'd think, though. My inclination would be to write only the dependency-resolution stuff in C -- nothing else seems likely to be time-critical.
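A sketch of why an interpreted implementation makes "Python in the control files" nearly free: the tool can just execute the control file, exposing a small vocabulary to it. All names below are invented for illustration, not taken from any real tool.

```python
# A build control file that is itself Python: the tool exec()s it,
# handing it a tiny registration vocabulary.
targets = {}

def target(name, deps=(), command=None):
    """Record one build target and its dependencies."""
    targets[name] = {"deps": list(deps), "command": command}

control_file = """
# Users get a full language for free: loops, conditionals, variables.
for mod in ["main", "util"]:
    target(mod + ".o", deps=[mod + ".c"], command="cc -c " + mod + ".c")
target("app", deps=["main.o", "util.o"], command="cc -o app main.o util.o")
"""

exec(control_file, {"target": target})
print(sorted(targets))  # -> ['app', 'main.o', 'util.o']
```

With a compiled implementation in C, getting this effect means embedding an interpreter anyway, which is why writing the whole tool in the scripting language is the path of least resistance.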
Anyway, back to the reason to choose Python (as opposed to other scripting languages) -- Python is actually more common than you might think, it's not that hard [1] to install, and it's sane.
Daniel
[1] I'm assuming you're willing to use binary packages; for example, the Debian ones.
Re:Should we move away from the file system paradi (Score:3)
That is a stupid remark, especially the time argument. Both cultures have excellent results, are fruitful to each other and are likely to stay with us for a while.
The lack of a BSD compiler is the result of no one being interested enough so far in writing one. No more, no less. No reason why this could not change one day.
Re:Another point of view (Score:3)
Perhaps the person to whom you're responding meant that the interface intrinsic to the tool - i.e., the semantics of Makefiles - was "confusing and lame", and therefore that "writing a new interface" would mean "writing a new tool".
Sometimes incremental improvement of existing tools merely involves moving closer to a local optimum in the solution space, and avoiding a better local optimum somewhere else in the solution space.
That's why I think that this competition isn't ipso facto a bogus idea - perhaps people won't come up with something better, perhaps they'll come up with something that's a little better but not enough to supplant the existing tools (NOTE: the availability of the new tools does not mean the old tools will go away! It's not as if you won't still be able to use make and autoconf.), but perhaps they might come up with something that has an underlying model that's significantly better, so that the new tools are easier to use, or more powerful, or more powerful and easier to use (e.g., it may be easier to make use of the tools' power).
No, I don't know offhand what such a tool might look like - but that in no way indicates that such a different-and-better tool is impossible.
Re:Yeah, right... That's the point, actually (Score:3)
Will we get a new make out of it? You and I would say no, but all it takes is a few kids sitting around somewhere listening to us laugh, and they get even more fired up.
The money isn't going to be what drags the programmers out of the woodwork - it's going to be the recognition that goes along with the money. (Sadly, the only way to get recognition in these IPO days is money, but that's another rant.) Ten years ago, someone could have waved a hundred thousand dollar check in front of Linus's nose, and it wouldn't have got us to this point any faster.
Freedom (unless you're not like us)? (Score:3)
So many of the previous posts are about why this is a Bad Thing because it diverts talent away from other things, why replace something that works, yada yada yada. The same could have been said of Linus' work on linux instead of contributing to BSD, etc. etc. Why are people so hung on "investing in open source (no, I refuse to capitalize open source)" and sticking with something "because that's how we've always done it?"
Here's a company willing to throw money at open source development and they get blasted. Who cares what they do? Does it affect you? NO. It doesn't. You do your thing, let Software Carpentry do theirs. If you love make, then use it. If you don't, that doesn't mean you're dumb or stupid as previous posters seem to imply; it just means that you would like an alternative to "make." Come on, folks, let's put the FREEDOM back in FREE code.
The open source zealots (and slashdotters in particular) are, ironically, (moderator: that's my own damn observation, not flame bait) among the most UNFREE people in this world. You seem to like any idea as long as it jibes with what you already believe. Live and let live.
More open source s/w will NEVER be a Bad Thing. After all, it will only result in more FREE code and that means more FREEDOM.
Re:There are lots of make replacements... (Score:3)
Atria Software, "Building Software Systems with ClearMake", ClearCase Users Manual, Natick MA, May 1994.
Geoffrey M. Clemm, The Odin Reference Manual, available via anonymous FTP from ftp.cs.colorado.edu.
S. I. Feldman, "Make - A Program for Maintaining Computer Programs", BSD NET2 documentation, April 1986 (revision).
Glenn Fowler, "The Fourth Generation Make", Proceedings of the USENIX Summer Conference, June 1985.
Peter Miller, "Cook - A File Construction Tool", Volume 26, comp.sources.unix archives, 1993.
Christopher Seiwald, "Jam -- Make(1) Redux", Usenix UNIX Applications Development Symposium, Toronto, Canada April 1994.
Richard M. Stallman and Roland McGrath, "GNU Make - A Program for Directed Recompilation", Free Software Foundation, 1991.
Zoltan Somogyi, "Cake, a Fifth Generation Version of Make", Australian Unix System User Group Newsletter, April 1987.
Dennis Vadura, dmake(1) manual page, Volume 27, comp.sources.misc archives, 1990.
These show some of the different approaches that have been taken, even though some of them don't qualify, or are themselves the very tools being replaced.
If there's a missing requirement in the rules of the contest, it's the lack of a migration path from make. Without that, you just have an interesting toy, because no one will move their existing significant system without it.
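To make the migration-path point concrete, here's a hedged sketch of the minimum such an aid would have to do: translate the simplest `target: deps` / tab-command Makefile form into a neutral rule list a successor tool could import. The function name and output format are invented for illustration, and real Makefiles (variables, pattern rules, includes) need far more than this:

```python
# Minimal sketch of a make-migration aid. Handles only the plain
# "target: deps" header line followed by tab-indented commands;
# variables, pattern rules, and includes are deliberately ignored.

def parse_simple_makefile(text):
    rules = []
    current = None
    for line in text.splitlines():
        if not line.strip() or line.lstrip().startswith("#"):
            continue                     # skip blanks and comments
        if line.startswith("\t") and current:
            current["commands"].append(line.strip())
        elif ":" in line:
            target, _, deps = line.partition(":")
            current = {"target": target.strip(),
                       "deps": deps.split(),
                       "commands": []}
            rules.append(current)
    return rules

example = """\
prog: main.o util.o
\tcc -o prog main.o util.o
main.o: main.c
\tcc -c main.c
"""
```

Even a crude translator like this would let an existing project at least bootstrap into a new tool rather than start from scratch.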
-dB
Obvious question: why? (Score:4)
Certainly autoconf needs some work to tidy it up (particularly the generated configure script), but it's not as bad as they make out. As for it being the last major application to use m4, I guess they've forgotten about sendmail...
Similarly, make has some deficiencies, but again, it's mostly there, and what it does lack can be fairly easily added. It needs a simple GUI front end for newbies more than it needs rewriting.
Overall, it's not a bad idea, but I think that the effort should have been put into more pressing areas, such as having an embeddable editor API for X (so that individual apps can have an editable text area, and the user gets to choose which editor is actually used).
I can't help thinking that perhaps this is part of Guido's grand plan for Python to take over the world (not necessarily a bad thing in itself, but I'm always suspicious of things with political motives).
They have got to be kidding... (Score:4)
Replace make?????
These are robust, time-tested tools for creating software. If a better way existed to manage projects we (programmers in general) would probably have it by now.
I was just recently a member of a team that converted a very large project from Microsoft's hideous Visual Studio project (.dsp) files to autoconf, automake, and make. Why was this done? Because it's easier to use, more flexible (try telling MSVC to run lex and yacc and then compile the output files, using only .dsp files! HA!), and opens the program up to porting to other platforms.
Now on the other hand, if all "Software Carpentry" wants is versions of autoconf and make ported to Python, well, I guess it's not that silly, but why would you want to do that? The source code for these programs is extremely portable already. Implementing them in Python gains you nothing.
Is this the best way to invest $100k in Linux? (Score:4)
Failing that, break the money up into $20k 'grants' and offer them to people who are already working in the right direction.
This competition is A Bad Thing.
Should we move away from the file system paradigma (Score:5)
Replace make?????
These are robust, time-tested tools for creating software. If a better way existed to manage projects we (programmers in general) would probably have it by now.
I was just recently a member of a team that converted a very large project from Microsoft's hideous Visual Studio project (.dsp) files to autoconf, automake, and make. Why was this done? Because it's easier to use, more flexible (try telling MSVC to run lex and yacc and then compile the output files, using only .dsp files! HA!), and opens the program up to porting to other platforms.
First, I completely agree with you that project configuration through MS Developers Studio projects is inferior to UNIX style configuration.
I had many fruitless discussions about this with some of my colleagues (sigh). For some reason it seems nearly impossible to convince people with a DOS/Windows background that make's complicated syntax is less of a PITA than having a myriad of parameters hidden behind various corners of the MS Dev Studio graphical user interface.
My theory is that UNIX people love ASCII representations while the Windows crowd loves roaming GUIs. No idea why. But believe it or not, there are people who can, for example, memorize an astonishing list of key/value pairs from the NT registry.
On the other hand, one must admit that it is hard to understand make without being familiar with the way the basic UNIX tools (awk, sed, grep, sh...) interact in the more complex makefiles.
My objection to make is not its complicated syntax (which is only complicated because different levels of parsing - make's and sh's - intermix, and the regular expressions take a bit of familiarity), but that it is slow.
This is not a failure of make itself, but of the way we traditionally organize programs into files and directories. In fact, we use the filesystem as a database, and query this database by giving paths.
I would expect a speedup if we moved away from that organization toward some kind of program database - a database very close to the semantics of the programming language in use, e.g. a database of classes, macros, functions, globals, etc. In such an environment I would expect it to be possible to improve the speed of a make-like, dependency-aware construction tool.
It would also solve some issues with C++ (especially template resolution).
However, I have not yet seen a move away from that traditional C-style project breakup, compilation, and linking toward something that more resembles a database with tools on top of it.
Possibly because it would give up many benefits, the most obvious ones being the ASCII representation and the flexibility and power of the UNIX simple-tools-work-together style. Which is a very good style - in fact, a very clever hierarchical design.
Another reason is that the database approach would have drawbacks of its own. The slowness of make is partly due to scanning the file hierarchy, but that scanning is also what makes it easy to add or delete components. A database solution would insist on registering new items, and might be more complicated to use here.
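To illustrate the program-database idea with a toy sketch: instead of asking the filesystem "which files changed?", you ask an index "which units define or use this symbol?". Everything below (the class, the schema, the method names) is invented for illustration, not any existing tool:

```python
# Toy "program database" sketch: an in-memory symbol index that
# answers dependency queries at symbol granularity rather than
# file-timestamp granularity. All names here are invented.
from collections import defaultdict

class ProgramDB:
    def __init__(self):
        self.defines = defaultdict(set)  # symbol -> units defining it
        self.uses = defaultdict(set)     # symbol -> units using it

    def record(self, unit, defines=(), uses=()):
        for sym in defines:
            self.defines[sym].add(unit)
        for sym in uses:
            self.uses[sym].add(unit)

    def dependents_of(self, unit):
        """Units that must be reconsidered when `unit` changes."""
        out = set()
        for sym, units in self.defines.items():
            if unit in units:
                out |= self.uses[sym]
        out.discard(unit)
        return out

db = ProgramDB()
db.record("util.c", defines=["hash_init", "hash_get"])
db.record("table.c", defines=["table_new"], uses=["hash_init"])
db.record("main.c", uses=["table_new", "hash_get"])
```

The registration step is exactly the drawback noted above: nothing gets into the index unless something records it, whereas make discovers files for free by scanning.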
My objection to autoconf is mostly license-based. The whole autoconf/automake/libtool toolchain is an impressive one, but it is inherently bound to the GPL. If we ever want a truly free BSD, we have to think of an alternative. But as with the system compiler, we have other, more important work to tackle with limited resources right now.