

Debugging Configure
An anonymous reader writes "All too often, checking the README of a package yields only the none-too-specific "Build Instructions: Run configure, then run make." But what about when that doesn't work? In this article, the author discusses what to do when an automatic configuration script doesn't work -- and what you can do as a developer to keep failures to a minimum. After all, if your build process doesn't work, users are just as badly off as if your program doesn't work once it's built."
Debugging configure (Score:4, Funny)
I've written a little program that debugs configure automatically. To compile my program, simply run configure, then run ... oh, wait. Never mind.
Re:Debugging configure (Score:2, Interesting)
In fact, I NEVER run configure in the same source tree twice (still smarting after a bitter two-day debugging experience that was resolved by doing exactly the above).
Re:Debugging configure (Score:2, Informative)
Re:Debugging configure (Score:2)
And, when there's no Makefile, there's no "make clean" either
Re:Debugging configure (Score:2)
2 configure
3 error
4 rm -Rf
5 goto 1
D'oh!
You talk to the developer (Score:4, Insightful)
For some projects you can't find the developers; then it's time to learn how autoconf works, read the autoconf mailing lists, and work out which test failed and why.
Most tests fail because
* some required feature is not supported on your platform,
* or because the configure script is old and autoconf has been updated
The latter is sadly quite common in badly maintained projects as autoconf has undergone revisions and badly written autoconf definitions have started failing.
If your platform lacks a feature, you write around it for your platform with "ifdefs" which get activated according to the test results. I don't know how to do this; fortunately for me, all my projects have been working under people who understand autoconf quite well.
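Roughly, what the generated test boils down to is a little shell probe like this -- a hand-rolled sketch rather than real autoconf output, with strlcpy picked only as an example of a function some platforms lack:

# Try to compile and link a tiny program that uses the feature,
# and record the answer in config.h for the ifdefs to pick up.
cat > conftest.c <<'EOF'
#include <string.h>
int main(void) { char buf[4]; strlcpy(buf, "hi", sizeof buf); return 0; }
EOF
if ${CC:-cc} conftest.c -o conftest >/dev/null 2>&1; then
    echo '#define HAVE_STRLCPY 1' >> config.h
fi
rm -f conftest.c conftest

The application source then wraps its private fallback in #ifndef HAVE_STRLCPY ... #endif, so the workaround only gets compiled where the probe failed.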
Sam
Re:You talk to the developer (Score:2, Interesting)
Of course, it could be considered a bug if the configure script fails silently or ambiguously just because a dependency was missing, but I see it a lot.
It'd really help if the authors would print very verbose missing-dependency messages, even going as far as including a URL for the dependencies, if they aren't common.
One build that comes to mind that was total hell was Flightgear. That thing ha
Re:You talk to the developer (Score:3, Insightful)
First, I always put the last line or two of the error messages into Google. Eight times out of ten, I find someone asking the same question, and usually someone has already posted an answer.
Actually, before even that I check with Linux From Scratch [linuxfromscratch.org] (and Beyond LFS). But there are a whole lot of packages they don't cover.
Re:You talk to the developer (Score:2)
* some required feature is not supported on your platform,
In which case, the configure script should print a message telling you about it (and ideally, what configure option (e.g. --without-foo) to try to get around it, or what other package to install).
Preferably in noticeable enough text to stand out from all the normal messages configure spits out. And maybe prefixed by "Don't Panic!" in large friendly letters....
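For what it's worth, the shell that ends up in configure for such a check is short; something along these lines (a sketch -- libfoo, the package names, and the URL are all made up):

# Probe for the (hypothetical) libfoo header by compiling a one-liner.
cat > conftest.c <<'EOF'
#include <foo.h>
int main(void) { return 0; }
EOF
if ! ${CC:-cc} -c conftest.c -o conftest.o >/dev/null 2>&1; then
    echo "*** DON'T PANIC ***"
    echo "configure could not find foo.h (the libfoo development headers)."
    echo "Install your distribution's libfoo development package, fetch the"
    echo "source from http://www.example.org/libfoo/, or rerun configure"
    echo "with --without-foo to build without foo support."
    rm -f conftest.c conftest.o
    exit 1
fi
rm -f conftest.c conftest.o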
(I'm in the midst of tweaking my configure scripts for a rather co
configure.in parsing (Score:2)
Basically caching the results, but also moving the code that checks for the conditions to the user's machine rather than having it pregenerated on the developer's machine.
When I run configure it often says stuff like "check...(cached)", but configure still runs slow as molasses -- so I'm not sure what exactly it is doing...
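(For what it's worth, with recent autoconf the cache only persists between runs if you ask for it; the options below are the autoconf 2.5x names as far as I remember, so double-check ./configure --help.)

# First run: record every test result in config.cache
# (-C is shorthand for --cache-file=config.cache).
./configure -C
# Re-running in the same tree reuses the answers and prints "(cached)".
./configure -C --enable-debug
# A site-wide cache file can even be shared between packages:
./configure --cache-file=$HOME/config.cache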
Re:configure.in parsing (Score:2)
Yeah, and where's the C compiler that will fix bugs in my code and just do what I MEAN??
Re:configure.in parsing (Score:2, Interesting)
This is what the pre-make-kit project (http://pmk.sf.net) is trying to do. The project aims to provide an alternative to autoconf. You should have a look.
tips and tricks (Score:5, Interesting)
The "code for checking" is all just a bunch of macros. Believe me, the slowdown you see isn't the shell reading a bunch of lines of text.
Some points that might speed things up:
Re:tips and tricks (Score:2)
'sed' aside, I wonder how much of that has to do with the ridiculous Makefiles (ok, Makefile.ins) that get generated if you use automake. I recently tried automake on a project, and the Makefiles were fscking huge! 600 and 800 line makefiles! I blew them away and made Makefile.ins out of the original development Makefiles I had. Sheesh.
Re:tips and tricks (Score:2)
Nothing whatsoever. Those are created long before the user (not the developer) runs 'configure'.
As to the size... well, yeah, you need that in there if you're going to be portable, and safe, and not use any weird extensions for weird versions of make, and still support all the makefile targets that a mature project would require. (Packaging, cleaning, dependencies, relocatable
Re:tips and tricks (Score:2)
Shrug. I may try again on my next project. I just tried the default on my current project and didn't like the results, so I went back to handmade Makefile.ins.
(BTW, I see you're one of the GNU libstdc++ maintainers -- any particular reason w
Re:tips and tricks (Score:2)
The Makefile.in -> Makefile translation is purely a sed operation. It doesn't even involve any regexes, just 's/@fixed_text@/other_fixed_text/g', which is entirely bounded by how good the sed implementation is. (Only a DFA engine is required, so if sed simply assumes that something tricky is going to happen and uses an NFA, or doesn't even have a DFA engine, then it's going to be needlessly slow. I've seen the time go from 20+ seconds to less than one second simply by replacing sed.)
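In other words, that step is nothing more exotic than this (simplified -- in reality config.status does it with a much longer list of @name@ variables, and the values here are just examples):

# Makefile.in -> Makefile: fixed-string substitution of @name@ placeholders.
sed -e 's/@CC@/gcc/g' \
    -e 's|@prefix@|/usr/local|g' \
    -e 's/@VERSION@/1.2.3/g' \
    Makefile.in > Makefile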
Re:tips and tricks (Score:2)
Amazingly, that all made sense. I must be having flashbacks...
I find configure quite useful (Score:3, Informative)
It's still better to see something like "Testing for SSL... failed" than go "Can't locate libmcop_mt.so -- Huh? What child is this? Let's google..."
And if all else fails, remove the problematic code in configure, make, make install && pray.
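Before ripping code out of configure, it's sometimes enough to overrule just the one broken test: autoconf keys every check on an ac_cv_* cache variable, so you can feed it the answer yourself (the variable name below is only an example -- grep the configure script for the real one):

# Hand configure the answer instead of letting the broken test decide.
./configure ac_cv_func_mmap_fixed_mapped=yes
# Older (2.13-era) scripts take it from the environment instead:
ac_cv_func_mmap_fixed_mapped=yes ./configure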
Re:I find configure quite useful (Score:4, Insightful)
Makefiles are often poorly written. In particular, people very often fail to use
If you want to see well-written Makefiles, look in the BSD source tree. Taking one at random, here's FreeBSD's src/usr.sbin/edquota/Makefile:
# @(#)Makefile 8.1 (Berkeley) 6/6/93
# $FreeBSD:
PROG= edquota
MAN= edquota.8
WARNS?= 4
Re:I find configure quite useful (Score:3, Insightful)
What if you have a package that has to work on FreeBSD AND OpenBSD AND Linux AND a bazillion other Unixes AND win32/cygwin AND win32/mingw AND
It won't be that simple any more, and at that point it's probably impossible to "write well" as well.
Bad, bad example (Score:2)
Make itself is badly underspecified. POSIX Make simply doesn't scale well enough to support a medium-sized or larger project; it doesn't standardize anything like dependency tracking or including other makefile fragments.
As a result, every implementation of make grew its own features to handle the deficiencies. Including files, to take your example, can be done with every make program out t
Re:I find configure quite useful (Score:1, Informative)
As a user, I give up when dealing with unpleasant configure errors.
Re:I find configure quite useful (Score:2)
I give up when configure works, but the software still won't compile
Re:I find configure quite useful (Score:1)
This is one of the reasons Linux is not mainstream. Packages need to be distributed as precompiled binaries for common architectures (or at least the one it was written on) as well as source.
Re:I find configure quite useful (Score:2)
Ports (Score:5, Insightful)
For example, I wrote my binary patching tool on FreeBSD, but I don't recommend that people (even on FreeBSD) build it directly from the source tarball; instead, I advise people to use the ports tree, since that puts BSDiff into FreeBSD's packaging system. If someone running Gentoo wants to use BSDiff, they can install it from portage, which adds workarounds for gmake and Linux breakage.
Most developers only have access to a couple platforms for testing their code. Rather than doing a poor job of supporting every platform, it makes much more sense to support one platform directly, and allow other people to step in and provide the necessary patches and packaging to support other platforms.
Re:Ports (Score:3, Informative)
That said, I do agree that a good autoconf configuration is very hard to accomplish, especially when you're doing it for the first time.
Autoheadache (Score:3, Insightful)
If autoconf is so problematic and such a PITA, why use it in the first place?
autoconf/make is more trouble than it's worth. Portable makefiles and small portability test programs are the right way to do it.
Some people just don't value complexity enough. These tools are needlessly complex. This page has more info: http://www.ohse.de/uwe/articles/aal.html
Re:Autoheadache (Score:2)
The number of layers of indirection used by automatic configuration systems is absurd. The number of otherwise useless skills which must be mastered in order to debug a code generator, and the complexity of that task even in the presence of mastery of those skills, represent a barrier to portability which is much more substantial than the relatively trivial task of constructing code which is portable in the first place.
Re:Autoheadache (Score:2)
Re:Autoheadache (Score:2)
Autoconf, though, is pretty handy for dealing with things that are fairly commonly in different places (or even named differently) on diff
Huh? (Score:3, Insightful)
I don't know of any aspect of computer programming that's both non-trivial and easy to accomplish when you're doing it for the first time.
You know, I've never had that much trouble with it. I've never had more trouble than doing it by hand would have g
Re:Ports (Score:2)
Rather than attempting to include support for every architecture via autoconf, I think the BSD ports approach is far superior
The problem is, how do you install the ports system? I'm currently trying to install Gentoo [gentoo.org] on my Linux box, but the installation is failing.
Re:Ports (Score:2)
Re:Ports (Score:2)
I just started the install last night. When I woke up this morning, I got that error. Assuming (incorrectly) that the make would restart from where it left off, I started it up again, and thereby lost the error message. But now it's many hours later, and I've reached the error message once again, so I will be doing a search on Google. Maybe I screwed up with my -march setting.
Re:Ports (Score:1)
CFLAGS="-march=k6 -O3 -mcpu=i686 -fomit-frame-pointer -funroll-loops -pipe"
Re:Ports (Score:1)
Re:Ports (Score:1)
I had the same problem years back when trying to build LFS for a WinChip-based iOpener. I was compiling everything on a P3 and making sure to optimize for i586 (I think), but the glibc had to be done differently, and consequently it kept optimizing for i686 and practically every binary would terminate with an illegal instruction.
Re:Ports (Score:2)
According to the bug reports, I don't know. :^) Seriously, I think that is a K6-3, but it is so poorly documented by AMD. You have to look in /proc/cpuinfo [or whatever it is called], & compare the flags to the bug reports. Even then, it still may not work. I know that for a fact because I tried all of the K6 alternatives & it would fail during the gcc compile as well. I had to end up using i586. It was kind of sad, but @ least I finished installing it.
It just happens that I fin
Re:Ports (Score:1)
Re:Ports (Score:2, Interesting)
For example, you rely on BSD make. That makes some sense for a BSD project, of course. But BSD make is not available on Cygwin. If you don't want to use the more widely available GNU make and its extensions, you could at least restrict your makefile to the subset of cons
Re:Ports (Score:2)
In that particular case, I wrote the program for a specific purpose -- FreeBSD Update -- and was surprised by how many people wanted to use it for other purposes. So no, I wasn't really trying to write portable code. (On the other hand, GNU make is five times as large as B
Re:Ports (Score:2)
That's my point. I didn't think about portability at all while I was writing BSDiff, yet with a couple patches, it compiles and runs properly on a completely different platform.
But I was suggesting you write a POSIX makefile, with which people can use whatever make utility they prefer. Isn't that a laudable aim?
Well... yes, but POSIX make is missing some rather imp
Re:Ports (Score:3, Informative)
by far the most portable and ubiquitous build system in the world.
Portability is laudable because it allows people to use your code.
If using GNU make could be a barrier to the adoption of your code, that might be a reason not to use it, but the license of GNU make doesn't have any obvious bearing on the usability of your application which is built by its mechanisms.
Re:Ports (Score:2)
A long time ago in a galaxy far far away, there were two makes. One was BSD make. The other was SysV make. They were not compatible with each other. Then along came GNU and instead of choosing one or the other, decided to meld them together in an unholy marriage whose offspring wasn't compatible with either. As if that weren't enough, it added in half a million extensions of its own.
Backwards (Score:5, Informative)
That's exactly how autoconf doesn't work. "Including support for every architecture" is how previous build systems worked, like xmake. Those were an unmitigated disaster.
Autoconf's goal is to allow the build system to adapt itself to whatever system happens to be available. It does not look at the OS and say, hey, I'm on Solaris, bring out these hardcoded settings. Instead it performs tests for each feature of interest -- where is the compiler? where are the SSL libraries? what's the exact function signature for this not-quite-standard-C subroutine? If your system has been customized from the factory default, so to speak, then hardcoded answers will be wrong, no matter how diligent those porting people were in originally deriving them.
The idea is to have configure discover whatever the correct answers are, and set variables appropriately so that you don't have to care about the differences -- you can write the code once, as you say, and let the automatics do the necessary adjustments, rather than other people. Assume you're running on a POSIX system, for example, and let configure #define things as necessary to make up for the non-POSIX systems.
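The "where is the compiler?" part, for instance, is just a shell probe along these lines (a simplified sketch of the idea, not the actual generated code):

# Honor the user's $CC if set; otherwise take the first compiler on $PATH.
if [ -z "$CC" ]; then
    for candidate in gcc cc c89; do
        if command -v "$candidate" >/dev/null 2>&1; then
            CC=$candidate
            break
        fi
    done
fi
echo "checking for a C compiler... ${CC:-none found}"

Every other test works the same way: ask the machine in front of you, rather than consulting a table compiled somewhere else.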
Re:Backwards (Score:3, Insightful)
unmitigated disaster?
In my view any project that needs to resort to automake in order to configure the build environment has already failed. It has failed to deliver a portable Makefile, and failed to deliver portable application source code, and tried to work around that failure after the fact by patching it with automake.
Re:Backwards (Score:4, Insightful)
Huh? When did this switch from autoconf to automake? Okay, sure, I'll play. I think automake is a fine tool.
Your view is wrong, since that's not what automake is for. That's autoconf's job.
Here you're clearly talking out of your ass. Go look at a generated Makefile.in. Its whole purpose in life is to be more portable than any hand-written makefile could ever be. Go ahead, try to implement the same complicated, real-world-necessary rule patterns yourself, without resorting to a nonportable feature of GNU make, or BSD make, or Sun make, or...
The point is to save typing during the input (Makefile.am) and yet keep the output (Makefile.in/Makefile) utterly portable. And it succeeds admirably. What, you'd rather force everyone to maintain hundreds of lines of makefile by hand?
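To put numbers on "tiny": the complete hand-written input for a small program is a couple of lines, and the tools do the rest. A sketch (the project name and file names are made up):

# A minimal autotools project: two hand-written files, everything
# else is generated.
cat > configure.ac <<'EOF'
AC_INIT([frobnicator], [0.1])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
EOF
cat > Makefile.am <<'EOF'
bin_PROGRAMS = frobnicator
frobnicator_SOURCES = main.c util.c util.h
EOF
aclocal && autoconf && automake --add-missing
wc -l Makefile.am Makefile.in   # a handful of lines in, hundreds out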
What the fuck does that have to do with the makefile? Or the build system? This is a red herring, and a specious argument.
...a tool which, in fact, does not actually patch any files. Brilliant argument there, buddy. Perhaps you should pay attention to which tool does what.
Feh. Go back to your hand-written tedious makefiles. I'll stick with tiny free-form automake files. And add you to my /. killfile.
Re:Backwards (Score:2, Insightful)
Sure, for small and simple programs you can resort to hand-crafted Makefiles and just skip all that autofoo business, no question.
But once you bring stuff like multiple libraries (even if it's just no-install static libraries) or conditional compilation (enable/disable features that rely on certain libraries for example) into the equation, you'll have an extremely hard time getting this right on most platforms wit
Re:Backwards (Score:2)
Maybe most don't. Some of us do.
In my office alone I've got 11 boxes with 3 architectures (x86, PPC, 68K) and 6 OSs (Linux, Solaris, FreeBSD, MacOS, Windows - oh, okay, 5), including a small Beowulf cluster (only 5 nodes of P-166s, but hey..). Then there's the four machines in the basement, one of which has yet another architecture (Sparc). And if I get really wild and crazy,
Re:Backwards (Score:2)
Similarly, portable application source code is rather difficult in many cases. You can write to a standard like ISO C or ISO C++, but then what do you do about the systems that don't quite conform to that standard? Being portable to m
Re:Backwards (Score:3, Insightful)
You're right in that xmake and similar systems didn't work well. You're right in that an automatic configuration system should query for capabilities.
But you're wrong in that autoconf/automake is the answer. I tried to build some software yesterday and I got the error that I wasn't using the correct version of automake. WTF! When the specific versions of automake/autoconf are themselves configuration variables, something's seriously wrong.
automake is not required to use ./configure (Score:2)
Tell the maintainer of your troubles... it needs to be fixed.
autoconf/automake are self-contained, and should only need to be used for building your own configure scripts, or for setting up packages from CVS.
Re:automake is not required to use ./configure (Score:2)
And that's exactly what I'm doing. automake/autoconf need a simple way to switch between versions. There isn't one that I can find. If you know of a way, please tell me. Rid me of this aggravation.
If these incompatibilities were between major versions of the software, I could understand. But GNU has an annoying propensity to break compatibility for every minor release with most of their software.
Re:automake is not required to use ./configure (Score:2)
It's not so much the minor incompatibilities, in my experience; it's the overeagerness of developers to keep bumping the "minimum required version" needlessly, just to use the latest.
Autoconf made a huge leap, with many changes, and some projects still use the last of the old versions (2.13). Automake had a series of broken minor versions, so there's a jump from 1.4 to 1.7 or so.
Both automake and autoconf can have multiple versions sit side-by-side, so that's what some distros do; and Debian, at least,
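At any rate, the side-by-side installs usually show up as versioned program names, so a quick look tells you what you have to work with (the exact names below are just what I'd expect on a Debian-ish box -- treat them as an assumption and check your own /usr/bin):

# See which autotools versions are installed in parallel.
ls /usr/bin/autoconf* /usr/bin/automake* /usr/bin/aclocal*
# Then run the pair a given package expects, e.g. (example version numbers):
aclocal-1.7 && automake-1.7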
Re:automake is not required to use ./configure (Score:2)
check out 'modules' (for switching btw. versions) (Score:1)
Re:Ports (Score:1)
Re:Ports (Score:2)
However, even if you do let other people provide the necessary patches for other platforms, there's something to be said for having a central repository of these patches. If I happen to use a weird system where the argu
Re:Ports (Score:2)
Sure; there's nothing to say that you can't look at the patches people are distributing. But there's a balance -- it doesn't make sense to include lots of #ifdef AIX code when 99.99% of users don't run AIX.
if you want your app to run on AIX 4.x do you really exp
Re:Ports (Score:2)
The webpage missed a major resource (Score:5, Informative)
The GNU Autotools have their own published book, the electronic edition of which is online [redhat.com]. This doesn't seem to be listed in the resources at the end of the article.
Re:least favourite thing to have to change is... (Score:2, Insightful)
Headers under include/linux are not guaranteed to work between kernel versions, and that's why they need to be avoided. In fact, it is not recommended to use them (except if you are libc). The headers are under include/linux because they ARE Linux kernel headers, not because somebody wanted to break compatibility with BSD.
For example, the joystick interface used to be pretty much the same as BSD's (Linux 2.0 era) but not anymore (the new interface is much better, but that's another story). If Linux used