
Alternatives to Autoconf? (108 comments)

Posted by Cliff
from the dodging-the-complexities dept.
Despairing Developer queries: "Once, autoconf was a great way to make widely portable programs. But now, when you have to spend more time sorting out incompatibilities between autoconf versions, breaking up battles between autoconf compatibility wrappers and configure.in compatibility functions that try to outsmart each other, and on top of that watch the list of dependencies grow (it's not that fun to compile perl on Unicos) and performance dive rapidly, what is your escape plan? Is there a drop-in replacement for autoconf? Is there something else out there that is as portable as autoconf to use instead?"
  • QMake (Score:3, Interesting)

    by rufus0815 (651685) on Friday May 21, 2004 @06:55AM (#9213392)
    QMake (from Trolltech) is a possible replacement, though not free (according to the FSF)!
    • Re:QMake (Score:3, Informative)

      by blackcoot (124938)
      not entirely sure why this got marked as a troll, but i guess that's what the meta-mod system is for. two things: firstly, it's available under some crazy dual license scheme which allows you the option of accepting either the QPL [trolltech.com] or the GPL (see here [trolltech.com]). secondly, i don't think that qmake comes even close to covering the ground auto(conf|make) does. i guess the closest relation is to automake, except crippled. as far as i know, it doesn't have any of the detection / configuration stuff that autoconf does. of
      • Re:QMake (Score:3, Interesting)

        by rufus0815 (651685)
        AFAIK it still hasn't got a lot of features that automake has (especially the detection / configuration stuff - as you correctly said).
        Nevertheless, it's easy for beginners, and it's cross-platform!
        So for POSIX developers it might be insufficient - for Windoze developers (M$VC) it would be a great improvement (the way M$VC handles project settings is horrific)!
      • Re:QMake (Score:1, Informative)

        by Anonymous Coward

        not entirely sure why this got marked as a troll

        Because QMake is free. I couldn't find any comment by the FSF about QMake, and it's dual licensed under exactly the same terms as Qt itself. From the FSF website:

        We recommend that you avoid using the QPL for anything that you write, and use QPL-covered software packages only when absolutely necessary. However, this avoidance no longer applies to Qt itself, since Qt is now also released under the GNU GPL.

        Simply suggesting QMake wouldn't have been a t

    • QMake is free according to the FSF. Actually, it always has been free according to the FSF. Even its predecessor, tmake, was always free according to the FSF. What have you been smoking?
  • mod article up! (Score:5, Insightful)

    by blackcoot (124938) on Friday May 21, 2004 @07:09AM (#9213438)
    i wish there was a way to moderate articles up, because you've hit on one of my major (*major*) pet psychotic hatreds regarding developing software. auto(conf|make) sucks badly. it's bearable if you're developing from scratch (not depending on other libraries) or require that your bundled versions of libraries be used. but when your software depends on, say, 123098123871237 other packages (e.g. you're writing for gnome or kde), you're boned.

    unfortunately, there are no reasonable replacements that i know of, which is probably a testament to the nastiness inherent in solving this problem. a pity, really -- auto(conf|make) and company are a really good idea (in theory). unfortunately, there seems to be some really bad crack smoke involved in designing these tools. first (and probably foremost) in my mind: why isn't there a database of some sort which would at least allow the option of keeping track of which versions of what applications have been configured how and installed where?
    • Re:mod article up! (Score:5, Informative)

      by noselasd (594905) on Friday May 21, 2004 @07:31AM (#9213502)
      Try pkg-config --list-all
      pkg-config provides you with compiler/linker/preprocessor flags for
      compiling a program that uses various libraries.
      Now, if only all libraries provided a .pc file..
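For anyone who hasn't looked inside one, a .pc file is just a few lines of text recording a library's build flags. The sketch below uses a made-up "demo" package and illustrative paths, and emulates with sed what `pkg-config --cflags --libs demo` would print:

```shell
#!/bin/sh
# A made-up "demo" package; a real build would simply run:
#   cc `pkg-config --cflags --libs demo` -o app app.c
cat > demo.pc <<'EOF'
Name: demo
Version: 1.2.3
Description: illustrative example
Cflags: -I/usr/local/include/demo
Libs: -L/usr/local/lib -ldemo
EOF

# What pkg-config --cflags / --libs would report for this file:
cflags=$(sed -n 's/^Cflags: //p' demo.pc)
libs=$(sed -n 's/^Libs: //p' demo.pc)
echo "$cflags $libs"   # -I/usr/local/include/demo -L/usr/local/lib -ldemo
```

The point of the format is exactly this simplicity: a package installs one small text file, and every build system can query it the same way.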

      • Mod parent up! (Score:5, Interesting)

        by sofar (317980) on Friday May 21, 2004 @07:36AM (#9213523) Homepage
        pkg-config singlehandedly is creating a sea of calm in autoconf land while still keeping the strength of autoconf. Take Xfce4, for instance: within a relatively short timeframe this small group (4-5 devs) has written an entire framework that compiles on tens of platforms by using PKG_CONFIG extensively together with autoconf.
        • Re:Mod parent up! (Score:2, Insightful)

          by Jahf (21968)
          Maybe, but I often find as a user, when trying to compile a package that has a pkg-config .pc requirement on another package, that the other package forgot to leave its .pc file around.

          For instance, when compiling Evo from fresh source on a version of SuSE a while back, I installed the various -devel RPMs that were required, and yet those -devel RPMs left out their .pc files, meaning that even though I had the development libraries, Evo couldn't find them.

          Not saying pkg-config may not be useful
          • Re:Mod parent up! (Score:3, Informative)

            by sofar (317980)

            well, that's actually your distro's fault. And yes, binary distros suck at providing headers and other -config and .pc stuff you need.

            source distros like Lunar-linux [lunar-linux.org] provide a wonderfully rich platform with all of this -dev stuff installed by default, the way the developer meant it. These source distros are excellent for developers.
          • Re:Mod parent up! (Score:3, Informative)

            by irix (22687)

            As someone who works on an open-source project that requires evolution-devel to compile, let me say that I am well aware of that problem. When GNOME went over to pkgconfig, many distros took a while to build their -devel packages with the .pc files.

            That being said, I agree with the parent post - pkgconfig goes a long way toward solving problems with automake and autoconf. Everything in GNOME now uses it, so setting up the build environment for anything that depends on GNOME libraries is much, much easier.

        • Re:Mod parent up! (Score:3, Interesting)

          by KewlPC (245768)
          Until a project decides not to put its .pc files in the standard place. For instance, when installing Clanlib on my Gentoo system (using Portage, no less), the .pc files didn't get put in /usr/lib/pkgconfig (which is where pkg-config expects them to be on Gentoo).

          Instead they got put in /usr/lib/clanlib-0.7.7, and when I tried to build a program that used pkg-config to find clanlib, the build broke. If I were a Linux newbie, I'd probably have just given up rather than try to find the .pc files that I knew
          • This is the one thing that bugs me about pkgconfig. I mean, how hard can it be to add a --install option, allowing pkgconfig to choose where to place the .pc file?

            Maybe I'll go make a patch...

            Regards
            elFarto
      • Re:mod article up! (Score:3, Interesting)

        by Crayon Kid (700279)

        Which brings me to another issue: why isn't the output from 'configure --help' available in machine-readable form? Say, XML? This would help a lot with the creation of graphical configuration tools for source packages. AFAIK there are a couple of helper apps out there that do this, but they have to go through horrible hoops parsing the output from 'configure --help'.


        • The place to extract that information is not from 'configure --help', it's from the configure.{in,ac} file directly. The basic --help output (the standard options like --prefix, etc) is known already, so just get the arguments to AC_ENABLE_* and AC_WITH_*, or however they're spelled.
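A rough sketch of that approach, scraping the option names out of the AC_ARG_ENABLE/AC_ARG_WITH macro arguments instead of parsing `--help` output. The configure.ac content here is entirely made up; a real extractor would also need to handle multi-line macro calls:

```shell
#!/bin/sh
# Hypothetical configure.ac; only the macro names matter for this sketch.
cat > configure.ac <<'EOF'
AC_INIT([demo], [1.0])
AC_ARG_ENABLE([debug], [build with extra assertions])
AC_ARG_WITH([ssl], [use OpenSSL from DIR])
AC_OUTPUT
EOF

# Recover the configure options from the first macro argument:
opts=$(
  sed -n 's/^AC_ARG_ENABLE(\[\([a-z-]*\)\].*/--enable-\1/p' configure.ac
  sed -n 's/^AC_ARG_WITH(\[\([a-z-]*\)\].*/--with-\1/p' configure.ac
)
echo "$opts"
```

This prints `--enable-debug` and `--with-ssl`, which a graphical front end could then render as checkboxes without ever running configure.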

        • ...This would help a lot with the creation of graphical configuration tools for source packages. ...

          How is a graphical configuration tool going to help anything? You want to pile another layer of crap onto an already burdened autoconf just so you can play with your mouse? What the hell is wrong with you? Don't be such a pussy: type './configure --help', figure out the options on the command line, and stop recommending open source programmers waste their time developing mere gimmicks. Do you work for Micr

    • why isn't there a database of some sort which would at least allow the option of keeping track of which versions of what applications have been configured how and installed where?

      You mean why doesn't ./configure keep such a database or why isn't there one on all Unix boxes?

      The latter is obvious... there is no standard package management system.

      I can't imagine a mode of operation for configure where you're re-running it so frequently as to be a real time waster. Unless you've got a source base so screwed
      • Though you must remember, not all OSes/distros utilize a packaging system, nor should they change their ways. For libraries on POSIX systems, looking in the library directories is sufficient for version information; for non-POSIX it might be a little harder (perhaps on the order of "stamping" a standard string format in ASCII within the binary). For user-space programs it's very easy (if the programmers are smart)... "-v" ...My $0.02
    • auto(conf|make) sucks badly

      Worst understatement ever!
      • it sucks so badly that i couldn't find a better way to express just how abhorrent the experience of dealing with auto* is. i was going to say something about how legions of puppies and kittens and other cute little furry animals die torturous deaths at the hands of commie nazi pedophile terrorist hippies. urm. yes. you see the problems i was having ;-P
  • Two suggestions: (Score:3, Informative)

    by warrax_666 (144623) on Friday May 21, 2004 @07:19AM (#9213459)

    These are both sort of combined configuration and build systems (which is the way it should be, IMHO). Scons requires Python (>=1.5.2, IIRC), so it is "only" as portable as Python itself (which is to say "very"), while cmake doesn't require anything except a C++ compiler. The actual "language" in Scons is just regular python, while cmake uses a hideous custom language.
    • by sofar (317980) on Friday May 21, 2004 @07:24AM (#9213479) Homepage

      they may be a drop-in replacement for developers, but for packagers and people trying to track changes and new versions, both cmake and scons (blender!) are horrible. They cost us (a group of 10 people working on a distro) an enormous amount of extra time (blender's upgrade to .33 took me a whole day to figure out, whereas before it took me only 20 minutes to fully test a new blender version).

      all in all, autoconf may be a problem for developers, but for packagers it is still *by* *far* the best.
      • As a non-Linux packager, I want autoconf to die. It makes building a software package into a random event - either it works (by magic), or debug hell is afore you. Somewhat like installing Windows software.

        I actually found it easier porting software back before anybody made any attempt at making it fully automatic. autoconf's use of sh as a back-end language for a compiler for an auto-detect language often makes it necessary to muck about in the "object files" (sh files), and reverse engineering these i

  • problem inevitable (Score:5, Insightful)

    by fraccy (780466) on Friday May 21, 2004 @07:20AM (#9213462) Homepage
    I feel your pain, but this isn't just autoconf. It's a general theme of the way we compute: version nightmares. I think the problem is unavoidable because of the way we currently compute: 1) competition and today's enormous diversity will always lead to heterogeneous systems, no matter how good the intentions initially; 2) the semantics of software and its environment are not embedded in the data, which of course means that when a version changes, something somewhere breaks, and someone somewhere has to fix it. There are only two solutions: either decrease diversity through standardisation (oh heck, the version of that keeps changing too), or real autonomic computing operating at a higher level of abstraction. Roll on autonomic computing: real self-configuring and self-adapting systems. Until then, we can only attempt to minimise the problems, and can only ever solve them in a limited scope for a limited period of time.
  • by ville (29367)
    How about scons [scons.org]? I must admit I hardly used all of autotools' potential as they were quite complicated, so I can't say if scons offers anything even close to what autotools had.

    // ville

    • I second scons. It's perhaps more a replacement for make, but I don't think I'll ever write a makefile again after discovering scons, and the autotools are just way too complicated to bother with.
      • by sofar (317980) on Friday May 21, 2004 @07:31AM (#9213504) Homepage
        SCons isn't the solution either. SCons relies way too heavily on python and doesn't make better distribution tarballs. Most developers using SCons roll out horrible tarballs that cannot even detect the proper Python version! (blender!!!).

        SCons makes you lazy. Do the work, and your application builds BETTER on MORE platforms with autoconf.
        • Ugh. Autoconf is such an ugly piece of shit. It's nearly impossible, for example, to create a KDE application without copying the autoconf/automake mess of an existing setup. And you're hosed if you need to do anything moderately unusual in your build procedure. In the end, 'make' is really just a one-off, hackish, single-purpose programming language. It's much better to replace it with a real, general programming language.
  • Try PMK, for example (Score:5, Informative)

    by wsapplegate (210233) <wsapplegate@est.un.goret.info> on Friday May 21, 2004 @07:32AM (#9213507) Homepage

    You can find it at pmk.sourceforge.net [sourceforge.net]

    Or else, you can have a look at A-A-P [a-a-p.org], by none other than Bram Moolenaar, the author of the One True Editor, a.k.a. ViM :-)

    There is also Package-framework [freshmeat.net], by Tom Lord, the author of the infamous Arch [gnuarch.org] SCM.

    I was about to mention SCons, too, but other people already did (it always pays to check other comments just before posting, especially on /. :-)

    To sum it up: there is no shortage of alternatives to the incredibly hairy Autoconf/Automake nightmare. The problem is, people are still using them for the very same reason they use CVS instead of Arch/Subversion, or Sendmail instead of Postfix/Exim: because they're considered ``standard'' tools, and people feel more comfortable with software they know to be used by plenty of other people (millions of programmers can't all be wrong. Can they?). I really hope they'll stop making this kind of mistake soon, so I won't need to curse them every time I have to debug some Autoconf breakage...

    • by cgreuter (82182)

      I took a quick look at pmk a while back. I got as far as this FAQ entry:

      7. Why not supporting cygwin?

      Because cygwin is not handling executable files as it should. It is absurd to have to take care about a trailing '.exe' under a Unix-like environment.

      Absurd it may be, but Windows is the most popular platform out there and refusing to support it because it's too icky is just plain dumb. They've refused to make pmk useful enough to actually be valuable to me, so I haven't bothered using it for anyth

      • ``Absurd it may be, but Windows is the most popular platform out there and refusing to support it because it's too icky is just plain dumb.''

        Yay, a flamewar! I'll join! I see this issue the other way: it's not portable software not supporting Windows, it's Windows not supporting portable software.

        Before Windows was created, there was the POSIX API. Though definitely inspired by UNIX, POSIX is an API that any operating system could support, and there are indeed non-UNIX systems that are POSIX-compliant. Wh
        • Yay, a flamewar! I'll join!

          I'm not usually very good at flamewars, so just pretend I wrote a lot of incoherent abuse here.

          Asserting that developers are dumb because they do not support a system that was designed to be incompatible is, I would say, misguided.

          If you go back and read my post, you will see that I'm referring to the behaviour as dumb, not the people. That's a big distinction. These folks have written a working non-trivial software package and that requires some smarts. I'm talking a

          • Sh.t, that wasn't a flamewar, that was healthy discussion! You spoiled it. ;-)

            Anyway, I must go to bed now, or I'll be writing incoherent abuse because I'm incapable of doing anything else. Thanks for your explanation!
  • by aminorex (141494) on Friday May 21, 2004 @08:56AM (#9213943) Homepage Journal
    > Is there something else out there that is as
    > portable as autoconf to use instead?

    Yeah. It's called GNU Make.

    Seriously, if you write your makefiles and your
    code in a responsibly portable manner, there's
    absolutely no reason for autoconf or automake.
    And it's not hard. I've done it repeatedly.
    The auto* tools are an antipattern virus.
    • > Seriously, if you write your makefiles and your
      > code in a responsibly portable manner, there's
      > absolutely no reason for autoconf or automake.

      Well, that's right, except sometimes you can't avoid some #ifdef quirkiness (because a function has to be invoked with different parameters in some foreign C library you target, or because Windows uses '\' instead of '/' as a directory delimiter, etc.). In these cases, the simplest way to go is to write a Makefile.Linux, a Makefile.FreeBSD, a Makefile.
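The per-platform-makefile approach sketched above can be wired up with uname(1); a minimal sketch (the Makefile.* naming follows the comment, and the `make` invocation is only echoed here since the makefiles themselves are hypothetical):

```shell
#!/bin/sh
# Pick the platform-specific makefile by asking for the kernel name.
sys=$(uname -s)               # e.g. Linux, FreeBSD, SunOS
makefile="Makefile.$sys"
echo "would run: make -f $makefile"
```

A one-line wrapper Makefile can do the same dispatch, at the cost of maintaining one makefile per platform you actually support.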

    • by grubba (89) on Friday May 21, 2004 @11:44AM (#9215922)

      So how do you portably detect which flavor of system calls you have on the OS without autoconf, and without an explicit OS <=> feature database?

      eg:

      • Is your getservbyname_r OSF/1- or Solaris-style?
      • Does your getspnam_r take 4 or 5 arguments?
      • Does your struct hostent have the field h_addr_list?
      • Are you on a Linux system with a broken <sched.h>?

      All of the above are easily detectable with autoconf.
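The detection behind such checks is conceptually simple: configure writes a tiny probe program and sees whether it compiles. A hand-rolled sketch of the h_addr_list test from the list above (the compiler name `cc` and the HAVE_* macro name are assumptions; autoconf's real probes are generated from templates):

```shell
#!/bin/sh
# Probe: does struct hostent have the field h_addr_list?
cat > probe.c <<'EOF'
#include <netdb.h>
int main(void) { struct hostent h; (void)&h.h_addr_list; return 0; }
EOF

# If the probe compiles, emit the corresponding config.h define;
# a real configure script would append this to config.h.
if cc -c probe.c -o probe.o 2>/dev/null; then
    echo "#define HAVE_H_ADDR_LIST 1"
fi
```

The application code then tests `#ifdef HAVE_H_ADDR_LIST` instead of guessing from OS-identifying macros.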

      I however agree with you that there's absolutely no need for automake.

      • That's what ifdef is for.

        But more importantly, if you're writing application
        code using a system call layer, you've already lost
        the game.
        • by Anonymous Coward
          I had to answer you anonymously so I could mod the parent up.

          Of course that's what ifdef is for, but the ifdef isn't dangling; it's always:

          #ifdef SOMETHING

          and where do you think the SOMETHING comes from for cases like:

          * Is your getservbyname_r OSF/1- or Solaris-style?
          * Does your getspnam_r take 4 or 5 arguments?
          * Does your struct hostent have the field h_addr_list?
          * Are you on a Linux system with a broken <sched.h>?

          I'll tell you: autoconf does some checks, possibly test-compiles, basically discovers the local landscape and s
          • #if defined(__sun__) || defined(__FreeBSD__)
            #define FOO(a) (a)
            #elif defined(__linux__) && defined(__GNUC__)
            #define FOO(a) (((a)+3)&~3)
            #else
            #error Your system sucks
            #endif
  • by Anonymous Coward on Friday May 21, 2004 @10:16AM (#9214704)
    Chances are the other packages people here are talking about haven't ever been built on a Cray, either. A Cray is not exactly commonplace.

    Usually, the simplest way to deal with broken scripts/automake/autoconf tests is to use a better shell. Take bash, tcsh, or whichever shell works on Unicos, and run your autoconf tests through that. If you think you've found a problem in autoconf itself, run the regression test suite and submit a bug report.

    A quick google search over the autoconf mailing list archive shows that there were 0 posts about Unicos in the last two years. So chances are that any autoconf bugs on Unicos from the last two years simply went unreported, or that it has no bugs on Unicos ;) I'd think that the autoconf developers would appreciate a good bug report, just like any other project would.

    If your problem is autoconf's performance, then a) use a faster shell, b) use --cache-file, c) use ccache to run those myriad little C tests faster. Or fix the tests in question.
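The --cache-file mechanism deserves a word of explanation: cached results are ordinary shell variable assignments that a later configure run sources instead of re-running the probes. A tiny sketch (the variable name is a real autoconf cache key; the file content mimics the format configure writes):

```shell
#!/bin/sh
# One cached test result, in configure's cache format: the ${var=value}
# expansion assigns "yes" only if the variable is not already set.
cat > config.cache <<'EOF'
ac_cv_header_stdlib_h=${ac_cv_header_stdlib_h=yes}
EOF

# A later ./configure --cache-file=config.cache run effectively does this:
. ./config.cache
echo "$ac_cv_header_stdlib_h"   # yes
```

On a slow machine, skipping hundreds of compile-and-run probes this way is where most of the speedup comes from.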

    A classic on slow platforms is the maximum command-line length test. That one comes from libtool, not from autoconf. It's been vastly improved in libtool 1.5.6, for example. If your upstream doesn't use a recent version of libtool, convince them to do so. :)
  • by jcorgan (30025)
    I initially read the headline as, "Alternatives to Ashcroft." Damn, I wonder what's on my mind these days?

  • Perl? (Score:1, Flamebait)

    Perl doesn't use autoconf. It does have a script that sort of mimics autoconf, but isn't. Also, most Perl extensions use `perl Makefile.PL` to configure themselves.
    • perl itself is autoconf'ed. perl Makefile.PL is just for XS modules
      • Nope, like I said, Perl has a script that works like autoconf, but isn't.

        From Perl's configure.gnu:
        #! /bin/sh
        #
        # $Id: configure,v 3.0.1.1 1995/07/25 14:16:21 ram Exp $
        #
        # GNU configure-like front end to metaconfig's Configure.
        #
  • I have a replacement (Score:5, Interesting)

    by Chemisor (97276) on Friday May 21, 2004 @10:56AM (#9215211)
    I was pretty fed up with autoconf myself, and wrote a little C app to emulate it. Initial ./configure time dropped to a second and reconfigure is instantaneous. I would recommend it for your simpler projects: bsconf.c [sourceforge.net] and bsconf.h [sourceforge.net], the latter being the configuration file.
  • by Russ Steffen (263) on Friday May 21, 2004 @12:59PM (#9217167) Homepage

    ... Those who do not understand autoconf are doomed to reinvent it. Poorly.

  • by spitzak (4019) on Friday May 21, 2004 @01:28PM (#9217667) Homepage
    Case in point: I wanted to compile cairo [cairographics.org] on my Mandrake 9.1 system. I couldn't until I edited the autoconf file to remove "new commands" and added phony files to make pkgconfig happy. Then it compiled just fine and worked. I tried to compile the demos and was completely frustrated and eventually hand-wrote a trivial makefile and they all compiled just fine and worked (except for the GTK one...). I am now trying to compile the "Glitz" OpenGL backend, and I am running into the same troubles: I can't prove it yet, but I strongly suspect it will compile just fine on my machine, if I can just get around the mysteries and complaints of autoconf/automake/pkgconfig, probably by wasting a great deal of time and divining the basic, and probably simple, Makefile that would compile it.

    The incredibly frustrating, and dare I say stupid thing is that the only thing I need to "update" on my machine is the damn autoconf tools! I actually have all the libraries and headers these things need. That is completely backwards! In fact due to autoconf they have pretty much said "this only compiles on the very newest experimental Linux system". Well in that case, you have eliminated the need for "autoconf" and you could send out Makefiles that work only on a new Linux system. That would probably be easier for somebody like me to edit and get working on an older system.

    When you do try to fix this, you run into the horrifyingly bad syntax and rules of M4 and GMake. Supposedly this is because they want to be backward-compatible and run on older systems. But they lie, when they freely add new "autoconf" commands so that the newest version is needed. Why not scrap the whole thing and try a modern syntax?

    My proposed solution: make ONE compiled program that does "configure" and "make" and "make depend" (and "install" and "uninstall" and "clean" and "dist" and all the other special-cased targets...). This program can use the existing automake/conf stuff so it can be compiled for multiple platforms. The program then reads one file, in editable text (no XML, and it should be trivial to add/remove a source file by adding/deleting a line). This file should be parsed in a procedural way, with "if" and "for" statements and functions (i.e. it is perl or something), and the result should be the dependency tree; the program can then run the commands (the result is extremely static and has actual filenames and commands, not templates or "rules"). Make it really easy to add and remove switches to the compiler. Make it save user answers to questions in a file so it can provide those answers again, and make a gui program that provides panels and checkboxes to change those answers. Make it automatically check for dependencies in any C-style source files by looking for "#include" without running the compiler, and save the results in a binary database with date stamps so it can run this instantly as needed. Any package dependencies should be checked with "if" statements in the file; the program would have enough commands to do typical file system things like look for files or grep a file for something, and it is then trivial to write a file that checks for something in multiple places by using "if-else" statements.
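The #include scan proposed in that comment really is cheap to do without running the compiler; a sketch (the source file is a throwaway example, and a real scanner would also have to follow the discovered headers recursively):

```shell
#!/bin/sh
# A throwaway source file to scan:
cat > main.c <<'EOF'
#include <stdio.h>
#include "util.h"
int main(void) { return 0; }
EOF

# Pull header names out of #include lines with a single sed pass,
# accepting both the <...> and "..." forms:
deps=$(sed -n 's/^#include[ ]*["<]\([^">]*\)[">].*/\1/p' main.c)
echo "$deps"
```

This prints `stdio.h` and `util.h`, the raw material for the date-stamped dependency database the comment describes.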

    • I wanted to compile cairo on my Mandrake 9.1 system. I couldn't until I edited the autoconf file to remove "new commands" and added phony files to make pkgconfig happy.

      Unless I misunderstand the problem, autotools is not to blame for your woes. The way I interpret your complaint is that you did not have a build environment that satisfies the build dependencies of cairo.

      Now, you might have a valid complaint that cairo does not adequately document its build requirements, or that it has build requirements

      • The autoconf program would not compile the configure.in file provided with Cairo.

        You are right that more problems were probably due to pkgconfig than autoconf. However, for the average person smart enough to type "make", these all look like an impenetrable mess and are all the same thing.
  • I think you are trying to fix the problem on the wrong level.

    There is nothing wrong with autoconf, and there is nothing wrong with new releases of software depending on the latest {autoconf|make|libc|lib|.*} whatsoever.

    Take gentoo for example. After installing a minimal OS I can type emerge gnome and it will pull roughly 200 packages and compile them all cleanly. I honestly don't have any idea which of those 200 unrelated projects require which version of autoconf or whatever. Gentoo knows and deals with
    • That being said, my gentoo servers pull a new autoconf version about once a month for some random unrelated update. That's really a bit excessive.

      I am starting to think the same thing. I completely understand why it is happening though. Every time behavior changes in a commonly used library or other common dependency (commercial or open source) managed by the autotools, those version changes must be reflected in the autotools.

      The thing is, if you want the latest and greatest packages built from source

  • ...is to use a language that's not going to get bogged down in the mess of portability. Java's a good example in theory but in practice is known to be somewhat brittle. Python/Perl/Mozilla seem to be a much better example in practice.

    This, of course, limits you to systems on which the language has been implemented but it allows you to push the burden of portability onto somebody else.
    • Yup. C was designed as a systems programming language, which would let you get close to the hardware. That's why, for instance, an int isn't defined to be any particular number of bits. C was really created as an alternative to coding in assembly language. It was never conceived of as a language that would be used for writing huge, cross-platform end-user applications. C++ is likewise a fine language for certain purposes, but by designing it to be C-compatible, they gave it some of the same limitations and
      • by Anonymous Coward
        C code is more portable than any other language.

        I speak from experience, having taken many odd projects and tried to get them to work on some even odder platforms. If you are writing code that needs to be widely used, say it implements a protocol or parses a file format you hope to be standard, C is your only choice.

        If you are writing in perl or python, you are doing so because it is easier for you as the developer, not because it's easier or better for the person running it. In that case, one of two th
    • > ...is to use a language that's not going to get
      > bogged down in the mess of portability.

      This is a solution people should seriously consider -- a lot of project code doesn't need to be in C because it isn't performance-sensitive on the level that C addresses. Using a language that has a high degree of portability and the ability to extend into C for the places where you need to explicitly optimize may be a valid alternative solution to a robust configuration system. Now if only there wasn't such a
  • Surprised that no one has mentioned 'smake'

    Does anyone have a summary of the differences of all the various *makes?
  • to those who have some decent autoconf kung-fu:

    after you get the hang of it, is it comfortable enough to use with all your building? or is it something you add on at the end when the app/project is ready for export?

    I'm good enough at makefiles to make them pretty portable (i.e. it'll work on the 2-3 platforms I really care about at the time), and have just never bitten the bullet and learned to use autoconf.

    still, if I download something, and the install procedure is anything besides "./configure; make; make install", I c
    • by grubba (89)

      autoconf has two purposes:

      • Retargeting the installation and configuring options (eg --prefix, --with-*).
      • Detection of the build environment (eg compiler, system calls, etc).

      Converting from a plain Makefile to configure.in+Makefile.in is straightforward if your Makefile already uses variables for binaries and directories.
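Concretely, a Makefile.in is just the Makefile with @VAR@ placeholders, and configure's last step is plain text substitution. A sketch that emulates it with sed (the substituted values are hard-coded here, where configure would detect them):

```shell
#!/bin/sh
# A two-line Makefile.in with the usual placeholders:
cat > Makefile.in <<'EOF'
CC = @CC@
prefix = @prefix@
EOF

# configure would substitute values it detected; we hard-code them:
sed -e 's|@CC@|cc|' -e 's|@prefix@|/usr/local|' Makefile.in > Makefile
cat Makefile
```

The generated Makefile reads `CC = cc` and `prefix = /usr/local`, which is why a Makefile that already parameterizes binaries and directories converts so easily.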

      The main reason to use autoconf is the second point above; when you write code that uses system calls (eg read(2), write(2)) and not libc calls (eg fread(3C), fwrite(3C)), that differ be

    • after you get the hang of it, is it comfortable enough to use with all your building? or is something you add on at the end when app/project is ready for export?

      I have to say yes. I just converted my latest project, htop [sf.net], to autoconf and automake, and don't regret a thing. I just followed the tutorial from the info pages (yes, not the manpages, the other ones, that nobody reads :) ), wrote two files (configure.ac and Makefile.am), and everything worked.

      There is also a very cool program called autoscan

  • Write portable code. (Score:4, Interesting)

    by V. Mole (9567) on Friday May 21, 2004 @03:24PM (#9219147) Homepage
    It's really not that hard. The autotools are a crutch for people who can't be bothered to actually learn the C language and library, or the difference between POSIX and what their current environment provides. Go read #ifdef Considered Harmful [literateprogramming.com] (PDF) by Henry Spencer, about the right way to deal with portability problems, and notice that the whole approach of autoconf is wrong.

    "checking for stdlib.h..." indeed. If you don't have an ISO-C environment, you're so screwed that there's nothing autoconf can do to save you: you don't know how the language works, so whether or not you're missing a few library functions isn't going to make any difference.

    I once worked on a fairly large process control system, of which less than 1% was portability code. It ran on Solaris, AIX, HP-UX (this was pre-Linux). It also ran on VMS and WinNT. Most of the portability issues dealt with the entirely different models for process management and memory-mapped files, not with pipsqueak stuff like autoconf.

    It's real simple: you read the docs. You determine what the standard actually requires, not what your development system happens to do. And you program to that, then test.

    As for the autotools' proclaimed ability to port to systems you weren't planning for, that's so much BS. If the system is sufficiently different that ANSI C and POSIX aren't sufficient, then you're going to have to update your code anyway.

    Replacing autoconf with some other crutch isn't going to help. Just ditch it entirely.

    • by Rob Riggs (6418)
      The autotools are a crutch for people who can't be bothered to actually learn the C language and library, or the difference between POSIX and what their current environment provides.

      Yours is a rather myopic and provincial view of the problem, IMHO. The autotools are a necessary evil when one has more dependencies than just ISO C. What versions of libraries are installed? What odd bugs must I work around? Where are the headers for package foo located on this system? What compiler options should I pass?

      • I don't think you fully grasp the problems that the autotools actually do solve for developers.

        If the autotools did, in fact, solve the problems, then I'd have less of an objection. But they don't. They merely mask the problems in copious output, unreadable logs, and bad code. Autoconf is bad enough, but when you get the atrocious automake and libtool plastered on, it's pretty much undecipherable.

        The gtk/gnome approach of the *-config script is a much cleaner way of dealing with optional subsystems. Le

        • I'm going to guess you haven't done much porting between non-unixlike systems, if you think autoconf is a good thing.

          Quite true. The only systems that I have ever seen that work across UNIX and non-UNIX systems are all custom-made. The autotools work best at solving UNIX compatibility issues. And I think it does a fine job if that's the target.

          I also agree that the gtk/gnome approach is better than some of the autoconf stuff. But it doesn't solve all of the problems that the autotools solve, such as

  • by k4_pacific (736911)
    A new program: autoconfconf

    I am struck by the recursive nature of the situation.

  • Autoconf is ok, but it should have been designed with its own versioning scheme in mind, to avoid nasty incompatible hacks in the future. That future is now. But in software, everything is fixable; someone just has to care about it. Any volunteers, instead of slashdot ranting? Yes, I can hear you: you just don't like this m4 stuff, sorry.
