Operating Systems Software

Porting Open Source to Minor Platforms is Harmful 709

Tamerlan writes "Ulrich Drepper posted a blog entry titled 'Dictatorship of Minorities'. He argues that open source projects' attempts to support non-mainstream (read: non-Linux) operating systems slow down development and testing. While Ulrich may be biased (he is a Red Hat employee), he has a point: if you have ever read the mailing list of any large open source project, you know that a significant share of the traffic is about platform-specific bugs or a new release being broken on some exotic platform."
  • by geomon ( 78680 ) on Monday May 30, 2005 @08:21PM (#12680165) Homepage Journal
    But I agree with the opinion if, as a developer and/or integrator, you are attempting to reach a mass market. Red Hat, then SuSE and others abandoned the Sparc platform a while ago. I bought an old Ultra 1 and am now limited to a few distros to keep my Sparc machine up to date.

    But it was my understanding that the idea behind open source was to roll your own if your machine was not covered. If I want to keep the Sparc chugging along, I guess I'd better learn to port it myself.
  • Of course (Score:2, Insightful)

    Of course it is... And by that logic, developing software at all is harmful - takes time, money, and all the same stuff it takes to port it.

    • Re:Of course (Score:5, Insightful)

      by Johnny O ( 22313 ) on Monday May 30, 2005 @09:23PM (#12680565) Homepage
      And this is the logic which most Windoze software manufacturers throw at us Linux users. It is why we Linux users are missing out on some great software.

      So, tell me... In our minority, why on earth would we take that attitude?

      Ulrich is off his rocker.
      • Agree, and... (Score:5, Insightful)

        by leonbrooks ( 8043 ) <SentByMSBlast-No ... .brooks.fdns.net> on Monday May 30, 2005 @11:16PM (#12681262) Homepage
        ...adapting your application to architectures as diverse as x86, ppc, MIPS and Sparc at different word-widths is a great way to uncover subtle and long-standing bugs.

        To be sure, robustness may be as optional for you as it was for Microsoft (and would still be, absent competition from Linux), but in the long run it seems to pay off.

        Most of us Linux users would not regard, for example, The GIMP as particularly robust, but compared to the typical Win32 app it's a paladin of reliability. My sister-in-law [goldenlight.bur.st] routinely leaves it open (and unsaved) for weeks on end, confident that it will still be there when she gets back (though a recent hard-disk failure of mine seems to have put the fear of God into her WRT reliability). She also happily browses everywhere fearlessly, knowing that she can't damage anything on her own machine, and nobody she or I know of has ever been burnt by malware while browsing in Linux.

        MS users just don't do that - not more than once or twice, anyway.
        • by IamTheRealMike ( 537420 ) on Tuesday May 31, 2005 @08:09AM (#12683161)
          Wow, it's not often I find myself defending Ulrich Drepper of all people, but I do think there's a huge misconception here.

          People seem to be repeating a lot of "folk wisdom" about portability. Oh it's just bugs. Make your damn software clean you damn coder. Etc.

          I can guarantee you that anybody who says this has never actually read the sources to glibc, binutils or gcc. Hell, they probably never even read the mailing lists! I have, and when Ulrich says an enormous amount of effort goes into supporting minority platforms he is totally correct. Hello, binutils isn't the GIMP, people! Other platforms have totally different architectures and often need huge amounts of platform-specific code from these projects. This isn't a case of sloppy coding, it's a case of massive amounts of work being done to support edge cases. Go read the sources to bfd sometime. Adding support for one platform that uses different assumptions about basic things like memory layout can require huge reworkings of the code.

          Essentially there are a lot of people spouting off here based on their experiences of compiling FooApp on FreeBSD or whatever. Have you written a C library? A threading library? No ... then you are probably not really qualified to judge how much work this generates for Ulrich.

          Oh, and for the guy later on in this thread who says "AIX is not a minority platform, WTF?" - I say to him WTF. AIX most definitely is a minority platform. Maybe not in the world he lives in, but in the real world Windows is dominant, sometimes people think about Linux/Mac or even FreeBSD and everything else barely registers at all unless you administer high end servers or work on embedded software (most people do not).

          I've been bitten by this mentality before. Back when exec-shield was first developed, it broke Wine (which I work on). So I set out developing a fix. Eventually, I wrote a GNU linker script that arranged virtual memory such that things would work correctly even when exec-shield was active. But it didn't work, because of a simple bug in the kernel. No problem, right? Just fix the bug, right? Well, actually, somebody did. A patch was written, submitted, got into Andrew Morton's tree ... and it didn't compile on Itanium. The original author didn't have access to such a machine, neither did I, and the person who reported the failure (who worked for Intel, IIRC) was overloaded with other work and couldn't fix it. So the patch was dropped.

          In the end, a few months later, we had a different solution that was about a million times more complicated. Largely because a simple bugfix patch didn't compile for unspecified reasons on a platform nobody uses and this was grounds for it to be dropped. That mentality of "all computers are born equal" is why Debian has become a laughing stock and it cuts both ways.

    • Straw man (Score:5, Insightful)

      by Markus Registrada ( 642224 ) on Monday May 30, 2005 @11:43PM (#12681400)
      Of course the headline Slashdot reported is not what he said. Uli is abrupt, but he is practical, and not stupid. He's not always right, but when he's wrong he's interestingly wrong. If you think he's arguing for something stupid, you aren't paying attention.

      What he said was *not* that Glibc, or Gcc, and whatnot shouldn't be ported to AIX, m68k, and whatever. What he said was that he does not care to *maintain* those ports, and should not be expected to. IBM (or IBM's customers) can certainly afford to maintain a port for AIX. Let them. Likewise, all those embedded-system houses dependent on m68k targets are welcome to step up and supply their own patches to keep their ports working.

      If a patch to mainline breaks the AIX port, it's the job of the AIX maintainers to figure out how to fix the patch, not him, and not whoever contributed the patch (but has no access to AIX targets).

      He's not even saying he would reject patches needed to support minority targets. Whoever's maintaining the m68k port doesn't need to maintain a fork. They are entirely welcome to send along whatever patches they need installed. They need only be sure their patches don't break any supported targets. This certainly makes more work for users of less popular targets, but it spreads the work around, instead of piling it up on those doing mainline development. The mainline maintainers have plenty else to worry about.
      • Re:Straw man (Score:5, Interesting)

        by Nasarius ( 593729 ) on Tuesday May 31, 2005 @02:20AM (#12682053)
        (but has no access to AIX targets).

        With a project as big and important as GCC, you'd think they'd have a server for each platform set up for all their developers to play with. Gentoo has Sparc, MIPS, PPC, etc. boxes for their developers to use for porting software.

        It seems to me that a smart idea would be to have some kind of system where a developer could submit a patch, which would then be sent out to a server farm, where each server would try to compile GCC with the patch, then run a test suite. Doesn't Mozilla do something like this nightly?

        "I don't have access to a [foo] box" should never be a valid excuse with larger projects.

        • With a project as big and important as GCC, you'd think they'd have a server for each platform set up for all their developers to play with.

          So, everybody who fixes something that (incidentally) affects emission of debug annotations in Gcc has to learn all the idiot formats used in AIX, Solaris, Tru64, PE, and what-have-you just because FSF happens to have those machines? Yes, somebody on the project might know intimate object-file format details about any of those. It's unreasonable to expect everybody

  • Question (Score:3, Insightful)

    by cyberfunk2 ( 656339 ) on Monday May 30, 2005 @08:24PM (#12680188)
    Are they referring to Mac OS here? I highly value the open source ports made to Mac OS X, such as Firefox.

    Furthermore, at least on OS X, the Fink project makes many programs buildable on OS X, but puts the maintenance onus mostly on the Fink people, not the original authors. Of course this can have its own problems.
  • Java? (Score:2, Funny)

    by Lingur ( 881943 )
    Wasn't Java supposed to solve this problem? I was under the impression that you could run Java apps on any platform (albeit slowly) without worrying about compatibility.
    • Re:Java? (Score:5, Funny)

      by Anonymous Coward on Monday May 30, 2005 @08:45PM (#12680320)

      Yeah - Sorry about that, our bad.

      -- Sun Microsystems

      • Re:Java? (Score:5, Funny)

        by Anonymous Coward on Monday May 30, 2005 @08:51PM (#12680360)
        Obligatory Zawinski paraphrase:
        Some people, when confronted with a problem, think "I know, I'll use Java." Now they have two problems.
    • Re:Java? (Score:3, Insightful)

      by kaffiene ( 38781 )
      Java DOES solve that problem. The Linux crowd by and large won't use it because it's not quite their flavour of 'free'.
      • Re:Java? (Score:3, Interesting)

        More specifically, the VIRTUAL MACHINE architecture addresses this problem. The .NET/Mono framework is another stab.

        I suspect that we have not yet seen the VM architecture in its full maturity.
  • Debian (Score:3, Insightful)

    by 3770 ( 560838 ) on Monday May 30, 2005 @08:25PM (#12680201) Homepage

    I'm a fan of Debian, but I think that Debian's effort to support the myriad of architectures out there is hurting it.

    It does a great service to the rest of the Linux community though, because it helps keep things portable.

    But having a requirement that something work on a large number of platforms slows down the release cycle.
    • by jesterzog ( 189797 ) on Monday May 30, 2005 @09:11PM (#12680499) Journal

      But having a requirement that something work on a large number of platforms slows down the release cycle.

      I'm not sure that I agree. To me, the Debian delays seem to be at least as much a political and leadership thing. In particular, consider how quickly Debian went into a freeze for a new release after a change in leadership. Granted, it's a week behind schedule, but the green line is now going down rapidly [debian.org], and we're expecting a new release within a matter of days. If it were so difficult to support and release on multiple architectures, this likely wouldn't have been able to happen.

      I'm not trying to imply that the old Debian leadership was necessarily bad or that the new leadership is particularly good. But a change in attitudes very quickly resulted in a new release. This suggests that support for lots of architectures has little to do with it, whereas leadership attitudes have a lot to do with it.

      We'll have to wait and see just how reliable Debian Sarge turns out to be when it's promoted to stable, of course. (Disclaimer: I run Debian Sarge on my home workstation and laptop.)

  • by EbNo ( 174259 ) on Monday May 30, 2005 @08:25PM (#12680203)
    There are many instances where OpenBSD developers indicated that a bug found in one port led to discovery of problems that affected several other platforms. It seems in this case that multiplatform support is beneficial, and the larger the number of platforms, the greater the likelihood that such bugs will be found and fixed.
    • I was about to make a similar remark, but thank you for stating it! While some code may "work" in one place, this by no means makes it bug-free. There are many instances of bad code working by sheer luck and only under a specific arch/platform. By ensuring code works under multiple architectures you will help eradicate bugs that may be exploitable. For example, when a program segfaults repeatedly under OpenBSD, I know that the program in question is not managing memory correctly. (OpenBSD with its memory protection refuses to allow reads/writes to illegal addresses that on other platforms could have resulted in exploitable holes.) While I have written many a fix for such programs, it is nice to easily identify which programs/developers have a clue and which do not.
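      A minimal C sketch of the kind of bug described above (the strings and names here are made up for illustration): the copy writes one byte past the allocation. On a forgiving allocator the stray byte often lands in malloc slack and the program appears to work; under OpenBSD's stricter malloc it is far more likely to fault immediately, pointing straight at the culprit.

      ```c
      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>

      int main(void)
      {
          const char *src = "hello, world";

          /* BUG: no room for the terminating NUL byte. */
          char *dst = malloc(strlen(src));
          if (dst == NULL)
              return 1;

          /* strcpy() writes strlen(src) + 1 bytes, one past the allocation. */
          strcpy(dst, src);

          puts(dst);
          free(dst);
          return 0;
      }
      ```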
    • As you say, there are examples where porting has helped a project. I know that porting one of my games to four platforms (Classic Mac OS -> Windows -> Linux -> OS X) has helped eliminate bugs that I never knew were there. Also, I learned things that have made my later projects easier to port, since I am better able to write them "correctly" to begin with. By avoiding platform-specific libraries and techniques I write better code.
      • by Kvorg ( 21076 ) on Tuesday May 31, 2005 @04:13AM (#12682371)
        I am deeply convinced that porting assures portability, and portability is one aspect of clean code where bugs and wrong assumptions are noticed, resolved and corrected.

        Surely porting to platforms such as the Alpha and UltraSparc was a very good basis for porting to platforms such as AMD-64. This is a crucial advantage for free software, where we can be sure that we will be able to support new platforms and make interesting platforms mainstream.

        On the other hand, the premise that the main maintainers cannot be responsible for all the porting efforts is reasonable. Debian is thinking along the same lines, and for good reasons.

        I think it is wrong and bad to assume porting is a bad thing and avoid it. Even apparently futile projects such as porting free software for closed commercial platforms gives a large amount of flexibility in design and portability and helps projects such as embedded graphical environments.

        Portability is just one facet of the advantages of free software, and as such is a precious thing that we have to cultivate. But it should be just another part of the free and open collaborative development process, not an obligation for the main developers.

        Just my 2 cents.
        • I too have found endian specific bugs in code I maintain. It's why I keep an Ultrasparc III in my closet for compile testing.

          I can easily port between Linux and *BSD, but the problem comes when users ask me to port my code to non-POSIX OSes, or ones that claim to be POSIX but really aren't (Win32). Or my latest peeve: sort of POSIX but non-ELF, non-GNU (OS X).
      • As you say, there are examples where porting has helped a project. I know that porting one of my games to four platforms (Classic Mac OS -> Windows -> Linux -> OS X) has helped eliminate bugs that I never knew were there. Also, I learned things that have made my later projects easier to port, since I am better able to write them "correctly" to begin with. By avoiding platform-specific libraries and techniques I write better code.

        Contrast this with the paragraph below:

        Jon Ross, who wrote the orig

    • by AaronW ( 33736 ) on Monday May 30, 2005 @09:09PM (#12680488) Homepage
      I have found this to be the case numerous times when working on KDE for Solaris. I've found numerous bugs, a few of them endian-specific, that were not specific to just Solaris. Supporting multiple similar platforms can be a good thing in terms of finding bugs. Some bugs will show up much more frequently on different platforms due to differences in things like memory managers or even how some APIs are implemented.
    • Sorry, had to do a Blake's 7 Zen impersonation there. Seriously, a well-designed program for Linux should run just as well under any *BSD, or any Unix platform, with little or no modification. POSIX is POSIX, no matter what the label on the box, after all.


      (The only exceptions would be things that use specific Linuxisms, such as some of the Netfilter calls, anything to do with a filesystem that is only available on Linux - XFS, JFS, Lustre for example. The network structures use different variable names, but that's not an excuse as you should be using an abstraction library in those cases.)

      • anything to do with a filesystem that is only available on Linux - XFS, JFS, Lustre for example

        You do realize that XFS is a port from SGI's UNIX variant (IRIX, I believe), and that JFS was ported from OS/2 to Linux (and was originally written for AIX and the S/390 mainframe, if I recall the details correctly).

        Lustre might be Linux only. I believe it started out life as a Linux-only project. I believe that ReiserFS is fairly Linux only (I thought they had ports to other OS's, but I can't find anything o

    • by Nailer ( 69468 ) on Monday May 30, 2005 @09:51PM (#12680750)
      Wait a sec...

      Porting Open Source to Minor Platforms is Harmful

      What the fuck? Where'd you get that from? Read Ulrich's post. How about:

      Delaying the development of features because of problems with minority platforms that can't be fixed by the bug reporters is Harmful

      You may disagree, but unlike the title of this article, it does actually cover what Ulrich is talking about.
    • by Arker ( 91948 ) on Monday May 30, 2005 @09:52PM (#12680763) Homepage

      Indeed.

      Now, to be fair, I have to concede the man does have a point. Supporting several configuration options and several platforms increases complexity, if your goal is simply to get the thing running and marketable.

      At the same time, though, dealing with that increased complexity can give a project the impetus it needs to clean up spaghetti code that 'just runs' and replace it with more auditable and correct code, which is really a gain in the long run. Assuming, of course, your goal is to write good software - not just to write 'good enough' software and get it out the door in line with a deadline from marketing.

  • by Chordonblue ( 585047 ) on Monday May 30, 2005 @08:27PM (#12680217) Journal
    I'm not sure I totally agree with this article - at least as far as Windows porting is concerned. Programs like OOo are gaining acceptance in the Windows world and that foothold has led my own organization to 'embrace and extend' that success. For instance, for the first time we will be purchasing Apples - running NeoOffice of course - and we already have a few Linux terminals here for public use.

    I like to think of OSS/GPL stuff as a 'gateway drug' - to use an analogy. Using it may not automatically make people go to Linux, but it certainly makes it an increasing possibility.

  • by Dancin_Santa ( 265275 ) <DancinSanta@gmail.com> on Monday May 30, 2005 @08:28PM (#12680219) Journal
    First off, everyone will complain about this. The thinking goes that Open Source == Freedom, so the more choice in platforms you have, the more Freedom you have and thusly you actually help Open Source more.

    I think this is incorrect.

    First off, Open Source, despite its close engagement with Freedom, ought also to stand for what is best in the software engineering world. This means clean, lightweight, portable code. For better or worse there is a standard, which is POSIX. Linux, to the extent that it uses the GNU system, is basically POSIX-compliant. Open Source projects ought to target POSIX and keep themselves free of proprietary entanglements.

    This can be achieved by focusing efforts on programming for Linux, the premier Open Source operating system. Only by keeping the code clean can a project be easily ported, but a project that isn't even near completion ought not be ported at all. Such non-mainline work results in incompatibilities and divergences from the main trunk of code that cannot be easily fixed down the road.

    A very good example is the Symbian/Nokia gcc compiler which has many special extensions and cannot be used to compile for any other targets or operating systems. Well, they are doing away with their special version of the compiler and finally going back to the main line gcc tree. Unfortunately, all that work to specialize gcc for their platform is tossed out the window now. Work to no avail, essentially.

    The key here is not to focus on Linux specifically. Rather, it is to focus on a standard and program to that. Since Linux is one of the best of the standard-bearers, it makes sense to complete programming there first rather than start porting to esoteric platforms right away.
  • NetBSD? (Score:5, Interesting)

    by Bananatree3 ( 872975 ) on Monday May 30, 2005 @08:29PM (#12680224)
    The development of NetBSD for over 40 different platforms has brought a lot of good development to many platforms that would otherwise be dominated by a single operating system. A good instance is the handheld platforms (HPC, Palm PC, etc.), which would otherwise be dominated by Windows CE (except for the few Linux Palm PCs, but the majority are WinCE). NetBSD on the majority of these platforms has reached great maturity, and it is still developing well.
  • Try a VM (Score:2, Insightful)

    by alext ( 29323 )
    A properly engineered solution to this problem was developed some time ago: a VM.

    There are 1000s of Java projects on SourceForge that will never have this problem.
    • Re:Try a VM (Score:5, Insightful)

      by mellon ( 7048 ) * on Monday May 30, 2005 @08:50PM (#12680349) Homepage
      Spoken like someone who's probably never tried running their JVM-based GUI application on a new platform. Java cross-platform compatibility is a nice idea in theory, but in practice you wind up having to test everywhere and tweak your code when you run into differences in GUI implementations, so it's definitely not write-once, run everywhere. A more complete API specification would help here, but if wishes were horses, there'd be a lot of poo on the road.
      • Re:Try a VM (Score:4, Insightful)

        by mrchaotica ( 681592 ) on Monday May 30, 2005 @09:57PM (#12680799)
        No kidding. Lots of Swing Java apps (as opposed to Cocoa Java ones) are horrible on the Mac because they fail to act reasonably like a Mac app, whether they use the Mac "look and feel" or not (not to mention that sometimes they're hardcoded to Java or GTK style, so they look horrible too).

        This isn't an issue limited to Mac OS, by the way -- Java programmers ought to take the differences between GTK and Windows into account, too.
    • Re:Try a VM (Score:3, Insightful)

      by quelrods ( 521005 )
      Except that Java is one of the most non-portable solutions ever. Sun's Java runs on exactly: Windows x86, Windows ia64, Linux x86, Linux ia64, Solaris sparc32, and Solaris sparc64. Some decently written C code easily runs on more systems than that while only requiring a compile. Java is only portable in theory. Unless Sun opens the JVM it will never be fully portable.
    • Re:Try a VM (Score:3, Informative)

      by Rich0 ( 548339 )
      Clearly you aren't running on amd64. I've given up on running just about anything other than helloworld.java on this platform, using any JDK I can get my hands on (both Blackdown and Sun, stable and beta versions).

      A VM is just another architecture. In theory we could just write everything for x86 and then run emulators on every other platform, and it would be about the same thing.

      The problem is that Java works great as long as you only run it on an x86, or maybe a sparc or a mac. And java apps have the
    • Re:Try a VM (Score:4, Informative)

      by AndyL ( 89715 ) on Monday May 30, 2005 @10:51PM (#12681129)
      A quick glance at a changelog for a Java project, such as Azureus, shows all sorts of "compatibility fixes". Not just for compatibility across different OSes, but also for compatibility across different versions of the VM that's supposed to solve everyone's problems.
  • I call bad c code (Score:5, Insightful)

    by quelrods ( 521005 ) <(quel) (at) (quelrod.net)> on Monday May 30, 2005 @08:30PM (#12680229) Homepage
    Overall the argument is mostly bogus. For example, many Linux developers have trouble writing code that even compiles under any of the *BSDs. That is just sloppy coding. If everyone got in the habit of at least writing code that doesn't use system-specific includes (Linux developers seem the worst at this) and compiled with gcc -Wall -pedantic or something similar, it wouldn't be much of an issue. While I can see that a request to make something work on OpenBSD VAX might be better ignored, I fail to see how supporting at the very least Linux/*BSD (Open, Net, Free) on ppc, sparc, sparc64, and x86 is supporting a minority. Overall, OSS users/developers ARE a minority, and to argue over which minority beats whom is silly. Also, to only bother to support Linux is no better than only bothering to support Windows!
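    As a hedged illustration of the point about system-specific includes, the sketch below contrasts two Linux/glibc-only headers (left commented out) with the ISO C and POSIX equivalents that also build on the BSDs; the program itself is trivial and exists only to exercise the headers.

    ```c
    /* #include <malloc.h>         glibc-ism: malloc() is declared in <stdlib.h> */
    /* #include <linux/limits.h>   Linux-only: PATH_MAX comes from <limits.h>    */

    #include <limits.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    int main(void)
    {
        /* sysconf() is POSIX and works across Linux and the BSDs. */
        long pagesize = sysconf(_SC_PAGESIZE);

        printf("PATH_MAX = %d, page size = %ld bytes\n", PATH_MAX, pagesize);
        return EXIT_SUCCESS;
    }
    ```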
    • Re:I call bad c code (Score:5, Interesting)

      by TCM ( 130219 ) on Monday May 30, 2005 @09:59PM (#12680805)
      Very good point. Supporting minority platforms is only troublesome if your code is riddled with i386isms and lazy coding practices. If you do it right the first time, with portability always in mind, not only will you get cleaner and more maintainable code, you also get the ability to run just about anywhere as a bonus. My prime example, as usual, is NetBSD.

      It's sad that whenever you build some GNU sources or the latest Linux app du jour you get tons of warnings. Some projects even compile their files with 2>&1 >/dev/null. How sad is that?

      It's not just the Linux folks. OpenSSL is actually worrying me:

      /x/s.NetBSD.netbsd-3/crypto/dist/openssl/crypto/des/fcrypt_b.c
      /x/s.NetBSD.netbsd-3/crypto/dist/openssl/crypto/des/fcrypt_b.c(85): warning: pointer casts may be troublesome [247]
      /x/s.NetBSD.netbsd-3/crypto/dist/openssl/crypto/des/fcrypt_b.c(94): warning: pointer casts may be troublesome [247]
      /x/s.NetBSD.netbsd-3/crypto/dist/openssl/crypto/des/fcrypt_b.c(111): warning: bitwise operation on signed value possibly nonportable [117]
      /x/s.NetBSD.netbsd-3/crypto/dist/openssl/crypto/des/fcrypt_b.c(111): warning: possible pointer alignment problem [135]
      /x/s.NetBSD.netbsd-3/crypto/dist/openssl/crypto/des/fcrypt_b.c(111): warning: pointer casts may be troublesome [247]
      /x/s.NetBSD.netbsd-3/crypto/dist/openssl/crypto/des/fcrypt_b.c(111): warning: possible pointer alignment problem [135]
      /x/s.NetBSD.netbsd-3/crypto/dist/openssl/crypto/des/fcrypt_b.c(111): warning: pointer casts may be troublesome [247]
      /x/s.NetBSD.netbsd-3/crypto/dist/openssl/crypto/des/fcrypt_b.c(111): warning: possible pointer alignment problem [135]
      /x/s.NetBSD.netbsd-3/crypto/dist/openssl/crypto/des/fcrypt_b.c(111): warning: pointer casts may be troublesome [247]
      /x/s.NetBSD.netbsd-3/crypto/dist/openssl/crypto/des/fcrypt_b.c(111): warning: possible pointer alignment problem [135]
      /x/s.NetBSD.netbsd-3/crypto/dist/openssl/crypto/des/fcrypt_b.c(111): warning: pointer casts may be troublesome [247]
      /x/s.NetBSD.netbsd-3/crypto/dist/openssl/crypto/des/fcrypt_b.c(111): warning: bitwise operation on signed value possibly nonportable [117]

      etc. etc.

      And this in a project that's driving quite a number of sites, authentication mechanisms and whatnot?

      Most if not all of the sources that are native to NetBSD, i.e. not imported like OpenSSL and GCC, compile without any warnings. You automatically get a good feeling about using it.

      Something needs to change in coding land.

      </rant>
  • by dist_morph ( 692571 ) on Monday May 30, 2005 @08:30PM (#12680232)
    Let's just eradicate them once and for all. A homogeneous Linux monoculture will be easier to maintain and be to the benefit of all of us.
  • The point of open source is that the features that get added and the platforms that are supported are the ones that people put the time in to code. If a project supports platform X, it stands to reason that someone, somewhere, uses platform X with the given application and has the skill to make them work together. This isn't a commercial project, where you have a marketroid telling you 'someone, somewhere wants feature X and for the application to work on platform Y'.
  • by pedantic bore ( 740196 ) on Monday May 30, 2005 @08:34PM (#12680262)
    Most of the bugs that I've seen that are "platform-specific" are not actually due to bugs in that platform -- they're just ordinary bugs that were there all along, unnoticed due to poor assumptions. (Back in my day, we called this the "all the world's a VAX" assumption -- now it's "all the world's an x86".) Finding these bugs and removing them makes the code better.

    The bugs due to platform bugs -- well, knowing about them helps improve the platform.

    If you think fixing these bugs is a pain in the neck, fine. If you think it's a waste of time, however, think again.
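    One concrete example of such a latent bug, as a rough sketch (the variable names are invented): stashing a pointer in an int "works" on ILP32 platforms, where the two happen to be the same size, and silently truncates on LP64 targets such as Alpha, UltraSPARC or AMD64. The bug was there all along; the minority platform merely exposes it.

    ```c
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        char buf[16];

        /* Latent bug: the pointer is truncated to 32 bits on LP64 systems. */
        int cookie = (int)(intptr_t)buf;
        char *back = (char *)(intptr_t)cookie;

        /* Portable: uintptr_t is defined to round-trip a pointer exactly. */
        uintptr_t ok = (uintptr_t)buf;
        char *good = (char *)ok;

        printf("round-trip via int:       %s\n",
               (uintptr_t)back == (uintptr_t)buf ? "ok" : "BROKEN");
        printf("round-trip via uintptr_t: %s\n",
               (uintptr_t)good == (uintptr_t)buf ? "ok" : "BROKEN");
        return 0;
    }
    ```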

    • by runningduck ( 810975 ) on Monday May 30, 2005 @08:57PM (#12680408)
      Porting software to different platforms has two distinct benefits:

      1) identifying subtle bugs

      2) preparing software for future platforms

      - Subtle Bugs -
      As stated in the parent post, porting software to various platforms helps uncover bugs that may not surface during routine testing in a monoculture.

      - Future Platforms -
      Making software portable prepares software for the future. As computer technology advances, software that has been developed to be portable will be the first code running on the new hardware. I could go on and on about this, but I think the recent articles regarding Intel's Itanium already make this point loud and clear.
  • Keeping your code portable helps eliminate stupid assumptions, which make your software useless when the dominant platform changes. Once, all the world was a VAX, and people did stupid things. Then, the world changed. They kept doing stupid things.

    Think, for example, about 64-bit cleanliness. A piece of software which supported Alpha, UltraSPARC64, SGI's MIPS64, and so on, would have been fairly trivial to port to IA64, AMD64, and PPC64 when they started to become significant. OTOH, code which assumed it was running on a 386 would have been a pain in the ass to port to even just AMD64.

    Also, by supporting a broad spectrum of compilers, you will probably be able to understand what is going wrong when your compiler of choice changes. Witness the code breakage on gcc 3. Devs who had already ported their software to a variety of compilers were better able to respond to any issues and fix their code.

    Many monoculturalists make stupid endian-ness assumptions. Now, Mac OS X is becoming a significant market. If you have stupid endian-ness assumptions, then you may wind up having to basically rewrite in order to gain access to those millions of potential customers/users.

    Imagine if OpenGL only supported SGI and 386. Or libtiff only worked on i386. People just wouldn't use them. Things like that get used because they are ubiquitous, and you can build them anywhere.
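    To make the endianness point concrete, here is a small hedged sketch (the function names are invented): copying a uint32_t into a buffer with memcpy bakes the host's byte order into the data, while the explicit-shift version emits the same bytes on x86, PowerPC, SPARC and everything else.

    ```c
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Non-portable: the byte layout depends on the host's endianness. */
    static void put_u32_naive(uint8_t out[4], uint32_t v)
    {
        memcpy(out, &v, sizeof v);
    }

    /* Portable: always big-endian ("network order") on the wire. */
    static void put_u32_be(uint8_t out[4], uint32_t v)
    {
        out[0] = (uint8_t)(v >> 24);
        out[1] = (uint8_t)(v >> 16);
        out[2] = (uint8_t)(v >> 8);
        out[3] = (uint8_t)v;
    }

    int main(void)
    {
        uint8_t a[4], b[4];

        put_u32_naive(a, 0x12345678u);   /* 78 56 34 12 on x86, 12 34 56 78 on SPARC */
        put_u32_be(b, 0x12345678u);      /* 12 34 56 78 everywhere                   */

        printf("naive:      %02x %02x %02x %02x\n", a[0], a[1], a[2], a[3]);
        printf("big-endian: %02x %02x %02x %02x\n", b[0], b[1], b[2], b[3]);
        return 0;
    }
    ```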
  • by Sinner ( 3398 ) on Monday May 30, 2005 @08:39PM (#12680283)
    Porting to minor platforms exposes bugs, real bugs, that might not have been found otherwise. It enforces good software engineering practices.

    Of course, you can overdo it. Take a look at InfoZip for example. No, seriously, take a look at it. It works on every platform you can think of, but the price is that the code is almost unreadable. The biggest problem is all the cruft needed to maintain 16-bit compatibility. It desperately needs updating to handle non-ASCII filenames intelligently, but the last thing that code needs is another layer of #ifdef's.

    There comes a time when you just have to say "fuck the Amiga".

    • the price is that the code is almost unreadable.

      I have seen this phenomenon but it certainly doesn't have to be that way. One of the best kept secrets of having both cleanliness and portability is to eliminate #ifdefs from sources as far as possible. The obvious ideal is to have identical source and abstract platform differences by other means (e.g. macros instead of #ifdefs, and using identical APIs on different platforms, supplying only platform dependent implementations).

      Keeping it clean probably al
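      A rough sketch of that approach, squeezed into one file for brevity (the names compat_sleep and xsleep_ms are invented): a single small shim owns the platform #ifdef, and the application code beneath it calls one identical API everywhere.

      ```c
      /* --- compatibility shim: the only place a platform #ifdef appears --- */
      #ifdef _WIN32
      #  include <windows.h>
      static void xsleep_ms(unsigned ms)
      {
          Sleep(ms);
      }
      #else
      #  define _POSIX_C_SOURCE 200112L   /* expose nanosleep() under strict -std= */
      #  include <time.h>
      static void xsleep_ms(unsigned ms)
      {
          struct timespec ts = { ms / 1000, (long)(ms % 1000) * 1000000L };
          nanosleep(&ts, NULL);
      }
      #endif

      /* --- application code: identical on every platform, no #ifdefs ------ */
      #include <stdio.h>

      int main(void)
      {
          puts("pausing...");
          xsleep_ms(250);
          puts("done");
          return 0;
      }
      ```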

  • Ummmm . . . (Score:5, Insightful)

    by erikharrison ( 633719 ) on Monday May 30, 2005 @08:39PM (#12680284)
    Sometimes, we use hyperbole to make a point.

    Unfortunately, I don't think that Ulrich is doing that.

    AIX is not a minority platform. What The Fuck. Okay, so the AIX guys are asshats in the way they treat GCC, fine. But GCC's claim to fame is that it is the cross-compiling, multiplatform compiler du jour. I think Ulrich loses a lot of credibility when he says that GCC should not support AIX because it's a minority platform.

    *nix applications which run primarily in userspace should port to the various BSDs and Linux easily, and if they don't then 99% of the time it's a bug. And in many cases, it's a bug that will affect the working platforms eventually (relying on nonstandard behavior of system calls, linker oddities, assumptions about file placement, etc). And if a closed Unix platform has paid developers to assist in the porting, then it should run on that platform too. And if the paid devs are dickbrains, then a good project leader should say so. Behave, or fork and get your whining ass out of my tree.

    These AIX GCC guys shouldn't be saying "This patch breaks AIX, kill it", they should be saying "This patch fixes *blank* on AIX", at least most of the time.
    • Re:Ummmm . . . (Score:3, Interesting)

      by dvdeug ( 5033 )
      Okay, so the AIX guys are asshats in the way they treat GCC, fine

      I've watched the GCC list; David Edelsohn has consistently worked with the other people on the list. He takes the time to work out a solution to the AIX problems. OTOH, when Ulrich Drepper thinks something is broken on GNU/Linux, he will fight tooth and nail for the solution he thinks is right, as long as it doesn't mean that he has to clearly explain why his solution is right.
      • Big big BIG ditto (Score:3, Informative)

        by devphil ( 51341 )

        All of what dvdeug says here is true. I've done more than watch the list; I'm a maintainer, and David Edelsohn (an IBM employee) has always been willing to work with the GCC community. He even pushes the IBM developers and management to make each release of AIX slightly less bizarre than the previous release. Ulrich simply insults you if you disagree with him. He does not participate on the GCC lists; when he does send a message, it's a flame. I've never seen an explanatory email from him, on any topi

  • He's crazy. (Score:3, Insightful)

    by mellon ( 7048 ) * on Monday May 30, 2005 @08:42PM (#12680303) Homepage
    Porting reveals bugs. It also forces you to rethink short-sighted decisions. Furthermore, most of the problems I run into with porting have to do with cross-version incompatibility on Linux - the BSDs actually have comparatively stable APIs.

    This line of thinking is a lot like how I presume Microsoft thinks of things: if we just port to this one API, it doesn't matter how bletcherous it is. But as Microsoft has discovered, this kind of thinking actually turns into a straitjacket, which prevents them from being responsive when they need to be.
  • by ArbitraryConstant ( 763964 ) on Monday May 30, 2005 @08:43PM (#12680307) Homepage
    "IMO the most notorious case is how the gcc development is held hostage by Edelsohn and maybe IBM as a whole by requesting that everything always works perfectly well on AIX. How often has one seen "this patch breaks AIX, back it out". It cannot reasonably be expected that everybody tests on AIX. It is an proprietary OS running on proprietary and expensive hardware which not many people have access to. The overall development speed could be significantly improved by dropping the AIX requirement which, in some form or another, has been agreed upon by the steering committee. AIX is irrelevant in general today, i.e., the situation changed. And the people in the steering committee are too nice to just tell the very small minority causing the problem to take a hike."

    GCC is the de facto standard because it runs on more platforms than anybody else.

    If it ceases to run on all these platforms, one of three things will happen:
    a) a fork will appear that supports them,
    b) another compiler will take its place as the de facto standard, or
    c) people will be forced to use whatever the default cc is on their OS.

    In any of these cases, the portability concerns will get an order of magnitude worse.
  • by bani ( 467531 ) on Monday May 30, 2005 @08:48PM (#12680339)
    Porting to other platforms/architectures often reveals bugs in your primary target platform. It is often worth the effort to port to other platforms on this basis alone.

    Also, if it takes you a lot of effort to stay architecture-nimble, there is something fundamentally wrong with your design. That in itself should be a warning.

    But there is no benefit at all in supporting something like PA Risc, m68k, cris, etc as configurations of the generic code.

    Ulrich obviously has no clue whatsoever about embedded systems, and should therefore STFU on this point. One of the most popular embedded platforms is a 68k variant (ColdFire) -- it's probably second behind ARM. By dumping support for 68k you castrate Linux in the embedded marketplace. There's much more to 68k Linux than Sun3 and Atari/Amiga.

    His rant against mingw as "undeserving" is stupid. mingw is an enabler -- it means people can develop for Win32 without having to pay Microsoft $$$$ for the privilege of doing so.

    His "dictatorship of the minorities" argument is actually self-defeating on this point, because Microsoft users are in the majority. By his own arguments, we should be concentrating on supporting Win32 as the primary target for gcc and the primary architecture for Linux.

    Utterly ridiculous.

    Bug-eyed rants like his just serve to reinforce the stereotype that all open source advocates are completely unhinged. It is not helpful in the least.
  • by toby ( 759 ) on Monday May 30, 2005 @08:52PM (#12680369) Homepage Journal
    In my experience, porting is like the water of a river washing over river stones. Over time, every port makes the stone smoother. This applies whether it's a new architecture, O/S, compiler, or even just the unfamiliar box of some other user.

    There are bugs that just don't get flushed out until you port to: non-x86; 64-bit; big-endian; Win32; OS X; etc, etc, etc. Drepper should know better: all the world's not a VAX, etc. (though a VAX port is a fine start :-)

    Also, every port makes the process of porting itself easier. It's no coincidence that the most reliable and defect-free software is typically the most-ported software. This has always been true: TeX and METAFONT (where the monetary bug bounty [tug.org] doubled for every bug report, so assured was Knuth of its quality); Apache; Linux itself; NetBSD; GCC and friends; etc.

  • Hypocritical (Score:5, Interesting)

    by Temporal ( 96070 ) on Monday May 30, 2005 @08:54PM (#12680383) Journal
    If you write your code to be portable in the first place, fixing platform-specific issues should be quick and easy.

    And, of course, you write your code to be portable because you make sure it runs on the big three: Windows, Mac OSX, and Linux.

    Right?

    Actually, I think a much larger problem is just that: many OSS developers don't even try to support Windows. Yes, I know you hate the OS and don't want to support Microsoft, etc., etc. But how can you complain about major software not supporting Linux when you're writing your own software that doesn't support Windows? Isn't that entirely hypocritical?

    My take: Port your software to every platform you can, especially Windows. This gives freedom of OS to your users. And if you're a Linux user yourself, you should understand just how valuable and important this freedom is.
  • by kimanaw ( 795600 ) on Monday May 30, 2005 @08:55PM (#12680390)
    As someone who has supported multiplatform s/w that's hosted on

    • Win32
    • Linux (various, incl. PPC)
    • Solaris
    • AIX
    • HPUX
    • OS X
    • FreeBSD
    • MVS
    • OS/400
    • multiple other "minor" Un*x platforms
    • a Zaurus
    • a PocketPC
    • some routers running a proprietary kernel

    ...I call BULLSHIT!

    The bugs one finds on "minor" platforms usually end up being bugs on the "major" platforms that you just haven't found before. Of course, for those of you still intent on/forced to write code in C/C++, you're likely getting your just deserts.

  • by bluGill ( 862 ) on Monday May 30, 2005 @08:56PM (#12680402)

    It is rare that I can say someone is wrong on all counts, but I have not found one defensible statement in there. (Though I guess one could be hidden and I missed it)

    His first mistake is thinking GNU is everything. Maybe for him it is, but most people use what works. When the boss sits me down at an AIX machine I want it to work - I'm not allowed to install Linux (though I'd install *BSD if I could wipe the OS), I'm supposed to get work done.

    Minorities are useful despite the cost of working with them. Bugs that are one in a million may happen every time on AIX. One-in-a-million bugs are very hard to find. I've spent days looking at a partial crash trace wondering why it broke, and whether it will happen again. With no known way to duplicate the bug it is really hard to fix, and hard to be sure the 'fix' works. When it fails every time, the bug is easy to fix.

    Good programmers should have no problem writing cross platform code. When your code breaks on AIX, it is a sign of bad code - even if the breakage is because AIX doesn't have a function you expect.

    Cross-platform compilers (gcc) are much easier for me to work with. Because gcc is cross-platform I can compile my stuff at home and debug it, then bring it to work, compile it there, and assume it works. Particularly with gcc 2.95, the support for C++ was so bad that you could not count on code written for it to work on a better compiler.

    Speaking of gcc 2.95, other vendors have had better compilers for years, while gcc is only now arriving. Even today, gcc isn't a great C++ compiler (though 4.x is much better). There is no point in throwing stones at other vendors - their compilers may have been expensive, but they at least worked close to right.

    The upper/lower case differences with Windows are a non-factor. You should never have any word that differs by case only - it leads to many bugs if you do.

    The API differences on Windows are mostly handled by Cygwin and mingw. Those areas that are different are places where you should have your code modular anyway. Mostly we are talking about device and networking code. IPv6 is on the way (has been for 10 years now...); you need some different code to support that. There is no standard for device code - what works on OpenBSD won't work on Linux or FreeBSD.

    True, almost nobody cares about VAX - but it is interesting anyway. If your code is modular like it should be, then supporting those weird things isn't a big deal - you write your code, and let those who care about it do the testing.

    A short summary: there should be only one OS that anyone runs: Red Hat Enterprise Linux on x86 (not x86-64). Not Fedora Core, much less Gentoo or those other non-Red Hat distributions. You FreeBSD people can go to hell.

    He wants to take his ball and go home; I don't care. We are better off without people like him in the open source world.

    • by bani ( 467531 ) on Monday May 30, 2005 @09:03PM (#12680453)
      Just FYI: mingw doesn't deal with API differences at all. mingw is just a gcc port which can emit Win32 PE binaries. A good deal of mingw effort is spent writing free versions of Win32 DLLs to link against, but they don't abstract or translate APIs -- they're just free implementations of the Win32 SDK.

      Cygwin does deal with API differences, however. It's a completely different beast.
    • Writing portable code is good for the soul.

      It makes you read the docs. It forces you to use a standard API when byte order is important. It keeps you from hardcoding values (e.g. sizeof(void*)). It keeps you from making platform-dependent optimizations that might not even be supported by the next version of the platform you're on, or if you do, it forces you to make them modular.

      It forces you to figure out what behavior you can rely on. Bug compatability with older versions relies on the magnanimity of the m
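      For the byte-order point specifically, a minimal sketch of "use a standard API": htons()/htonl() from <arpa/inet.h> rather than hand-rolled swapping or an assumption that the host is little-endian. The port and address values are arbitrary.

      ```c
      #include <arpa/inet.h>
      #include <stdint.h>
      #include <stdio.h>

      int main(void)
      {
          uint16_t port = 8080;
          uint32_t addr = 0x7f000001u;          /* 127.0.0.1 */

          /* Convert to network byte order before putting values on the wire. */
          uint16_t wire_port = htons(port);
          uint32_t wire_addr = htonl(addr);

          printf("host port 0x%04x -> wire 0x%04x\n",
                 (unsigned)port, (unsigned)wire_port);
          printf("host addr 0x%08x -> wire 0x%08x\n",
                 (unsigned)addr, (unsigned)wire_addr);
          return 0;
      }
      ```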
  • by dyfet ( 154716 ) on Monday May 30, 2005 @09:19PM (#12680543) Homepage
    First, supporting many platforms often reveals interesting and important bugs which can be missed because they do not manifest themselves well or often on the "primary" platform or target architecture most commonly being used.

    Second, platforms are not stagnant. Code that only works on 386 Linux may some day have to deal with an x64-only world. Who knows what may happen in the future. Making decisions because you reject portability means you reject the future for your code as well.

    Third, different compilers are very useful for finding less obvious bugs. Ideally this means having a choice beyond gcc, if one is talking about C/C++, for example :). Using a single compiler means bugs your compiler doesn't itself catch will likely be retained. Even using different versions of gcc can help. Different compilers are often good at finding completely different sets of bugs in source.

    Finally, pointer/integer size and endian prejudices are evil in C/C++ code. You will find these things very quickly if you spend your whole life exclusively on i386 and one day try to port to ppc.
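    One small example of an i386 habit that only shows up elsewhere (a hedged sketch, not from any particular project): whether plain char is signed is up to the platform ABI. It is signed on x86 and unsigned by default on PowerPC and ARM, so code that stores byte values or EOF in a plain char changes behaviour the first time it is compiled on another architecture.

    ```c
    #include <stdio.h>

    int main(void)
    {
        char c = (char)0xFF;    /* -1 where char is signed, 255 where it is not */

        if (c < 0)
            puts("plain char is signed here (typical of x86)");
        else
            puts("plain char is unsigned here (typical of ppc/arm)");

        /* Portable habit: keep getchar()'s result in an int so EOF (-1) can
         * be distinguished from the byte 0xFF on every platform. */
        int ch = getchar();
        if (ch == EOF)
            puts("end of input detected correctly");
        return 0;
    }
    ```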

  • Axe to grind (Score:5, Informative)

    by Dan Berlin ( 682091 ) on Monday May 30, 2005 @09:19PM (#12680546)
    As a GCC developer (bias: I work for IBM Research), the only time I've ever seen David Edelsohn complain about something not working on AIX, it was broken on other significant platforms as well (Cygwin, etc.), or was latently buggy and just working by luck.

    Judge for yourself. Go read the gcc list. Count the number of patches backed out in the past year because they broke AIX vs because they broke some other platform.

    It sounds like an unnecessary personal snipe, which, for people who know Uli, well, I won't bother finishing that.

    So if this is the most "notorious case" Ulrich's got, then he's wrong.

    Particularly the "GCC would be developed much faster".
    That is, in fact, the funniest thing I've heard all day.

    GCC would be developed faster if there were less sniping, fewer fiefdoms, and more collaboration. Which, except for a few people, is what is generally happening. Our development process is accelerating, not slowing down.

    And it certainly isn't slowed down because people need to bootstrap on AIX, which they don't.

    Nobody has ever required patches to be bootstrapped on AIX unless they are very likely to have some material effect on that platform.

    This is just the same requirement we pose for any wide ranging change: Test it by compiling it for the architectures it is likely to break on.

    Note I didn't say running. We don't require anyone to have AIX boxen around. Cross-compiles work fine.

    Though if you break some architecture, you are expected to at least try to help the maintainer of that arch fix it.
  • by Pope Raymond Lama ( 57277 ) <gwidionNO@SPAMmpc.com.br> on Tuesday May 31, 2005 @12:03AM (#12681504) Homepage
    I track development of The GIMP - it compiles just fine on all the "strange minor platforms", and a recent chain of bugs in compiling on Irix was resolved overnight - a matter of the Irix user reporting the bugs and the core developers committing the fixes.

    However, there is a non-minor yet weird platform which actually does generate a lot of traffic on the list, and is strange to most developers. Anyone checking The GIMP Bugzilla will find a lot of open bugs for the Microsoft Windows platform. That, however, doesn't slow the project down either. It simply goes on, and the developers who work on compiling and making the Windows installer do what they can for the workarounds.

  • by dtfinch ( 661405 ) * on Tuesday May 31, 2005 @12:41AM (#12681668) Journal
    Porting, like most other open source development, happens on an as-needed basis. Developers decide "I need/want this, so this is what I'll work on." If someone needs a specific port of Linux, they will put effort into developing one - effort that might not go into OSS development otherwise. You can't assume that if you get them to stop, that energy will be focused on what YOU want them to work on.

    If there's a problem with developers being bossed around into doing niche work with no compensation, and they don't like it, they need to stand up for themselves. For example, if IBM wants gcc to work well on AIX, they should either make it happen themselves or pay the gcc developers to better look out for their interests. If, on the other hand, the gcc developers are well compensated for fixing AIX problems (I don't know what the situation is), then there's no problem, except in the eyes of bystanders who don't understand the situation.
  • by S3D ( 745318 ) on Tuesday May 31, 2005 @01:18AM (#12681841)
    I strongly disagree here too. I use a lot of open source programs, but I'm working with Windows and Symbian: OO, the GIMP, Axiom, Maxima, gcc (a major component of the Symbian SDK), Firefox/Thunderbird/Sunbird, etc.
    And I can't switch to Linux - all my projects are for Windows and Symbian (and the Nokia SDK is Windows-only; a homegrown port requires Wine anyway). And all the time I'm telling my clients and coworkers: look how much more convenient OO is than Word, how Firebird is safer, what nice features the GIMP has, and with Axiom and Maxima you don't have to pay several thousand dollars a year for Mathematica. Dropping support for "minor" platforms would be a huge discouragement for people to use OSS. Don't forget that some OSS projects are designed mostly for non-Linux platforms. The Vincent OpenGL ES implementation is oriented toward PPC/Symbian and doesn't make much sense for desktop Linux.
  • Define Minor (Score:3, Insightful)

    by marcovje ( 205102 ) on Tuesday May 31, 2005 @01:45AM (#12681936)

    Open Source is always about developer headcount, not user headcount.

    Half the Linux distros have fewer developers than the average BSD. Let's kill them off.
  • by Per Abrahamsen ( 1397 ) on Tuesday May 31, 2005 @02:33AM (#12682102) Homepage
    The way to resolve the problem is to have two lists of supported platforms, primary platforms and secondary platforms. Primary platforms must work, there should be no releases that break the primary platforms, and new features must be developed with all the primary platforms in mind.

    For secondary platforms, patches that make the application work on them should be accepted and encouraged, but releases won't be delayed, and new features can be accepted even if a solution for the secondary platform has not been found. In general, users of a secondary platform should not rely on the official releases of the project, but get their code directly from the maintainer of the secondary platform (or from a CVS branch).

    Which platforms are primary and which are secondary should depend on the application. For easy-to-use end-user stuff like Firefox, IA32 GNU/Linux, IA32 MS Windows and Mac OS X would be a good set of primary platforms. For GCC/binutils/gdb the set needs to be much more varied, and include popular embedded platforms. The strength of GCC has always been portability and cross-compilation. It has only rarely been the best compiler for native compilation on popular platforms.
  • by jschrod ( 172610 ) <[jschrod] [at] [acm.org]> on Tuesday May 31, 2005 @04:33AM (#12682419) Homepage
    This should be from-the-guy-who-breaks-glibc-compatibility-with-every-minor-release.

    Seriously, isn't this the same Ulrich Drepper who can't even be bothered to get glibc right? glibc incompatibilities -- even in patch versions -- are a major headache on Linux. Compare that to those ``obsolete'' platforms like AIX and Solaris, where I can still run binaries that I compiled in the early '90s or even the '80s. glibc is one of the main reasons why Linux application deployment sucks in major (read: heterogeneous) installations. Kernel differences are actually not as problematic, but glibc bites us all day long.

    He has already shown that he won't bother with people who run computing centers. Here he is, spouting more hobbyist opinions. Nothing new, move along.

    • The movement from glibc 2.2 to 2.3 was particularly painful. Ugh.
    • by synthespian ( 563437 ) on Tuesday May 31, 2005 @08:21AM (#12683245)
      glibc is one of the main reasons why Linux application deployment sucks in major (read: heterogenous) installations.

      This is what Marc Espie, an OpenBSD developer, said about Ulrich on O'Reilly's OnLamp [onlamp.com] (commenting on the proactive measures OpenBSD takes in C programming vs. Ulrich's "Linux programmers are geniuses" view):

      "We have had a lot of success explaining the issues and getting a lot of people to switch from strcpy/strcat to strlcpy/strlcat.

      Weirdly enough, the Linux people are about the only major group of people that has constantly stayed deaf to these arguments. The chief opponent to strlcpy in glibc is most certainly Ulrich Drepper, who argues that good programmers don't need strlcpy, since they don't make mistakes while copying strings. This is a very mystifying point of view, since bugtraq daily proves that a lot of Linux and free software programmers are not that bright, and need all the help they can get.

      (Considering the shining, flaming personality of Drepper, and the fact that he is always Right, this is not that surprising, though)."
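      For readers who have not met it, a hedged sketch of what strlcpy buys you, with a fallback for platforms that do not ship it (the HAVE_STRLCPY guard and my_strlcpy name are invented; this is an illustrative reimplementation, not the OpenBSD source):

      ```c
      #include <stdio.h>
      #include <string.h>

      #ifndef HAVE_STRLCPY
      /* Copy at most size-1 bytes, always NUL-terminate (if size > 0), and
       * return the length the full copy would have needed. */
      static size_t my_strlcpy(char *dst, const char *src, size_t size)
      {
          size_t srclen = strlen(src);

          if (size != 0) {
              size_t n = (srclen >= size) ? size - 1 : srclen;
              memcpy(dst, src, n);
              dst[n] = '\0';
          }
          return srclen;
      }
      #define strlcpy my_strlcpy
      #endif

      int main(void)
      {
          char buf[8];

          /* strcpy(buf, "this is far too long") would overflow buf. */
          if (strlcpy(buf, "this is far too long", sizeof buf) >= sizeof buf)
              fprintf(stderr, "warning: input truncated\n");

          printf("buf = \"%s\"\n", buf);
          return 0;
      }
      ```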
  • by madou ( 888303 ) on Tuesday May 31, 2005 @06:54AM (#12682766) Homepage
    I have a real problem with attitudes like "we do not support non-free operating systems". Of course software should be free, IMHO. But dropping support for non-free platforms takes away the ability to use at least free application software from users who aren't in a position to decide which OS they want to use, be it at work or due to limited technical skills.

    Even more important, this type of attitude (( flame me, but I'd call it bunker mentality )) harms collaboration between open source projects, and also between commercial software vendors and open source projects. (( if you don't know what I'm talking about, take a look at the copyright notices for g++'s STL headers ))

    To make the point clearer, let's take Ulrich's ideas a bit further. From a BSD purist's point of view, GPL licensed software does qualify as "non-free".

    What if e.g. the OpenSSH guys decided to drop support for non-free operating systems such as Linux, particularly commercial distributions like Redhat that include proprietary code?

    "Of course, you Linux guys may always maintain a separate tree that includes supports for those exotic systems."

    So we'd have X people who could be working on something way more useful, trying to keep a forked tree in sync with the original project. Great.
  • by omb ( 759389 ) on Tuesday May 31, 2005 @07:10AM (#12682830)
    If you read what was actually said, most replies make no sense; to paraphrase:

    Mainstream developers, using common architectures, which will change over time, should not hold themselves hostage to proprietary, minority or legacy platforms ... and lack of platform access makes this impractical in any event.

    This makes complete sense. If, as is actually the case, HP, IBM and Sun have, by incompetence or greed, placed themselves in a position where their platforms _depend_ on GNU tools, they need to spend some support revenue on the tool-chain and provide gratis platform access. This is how it used to be before Red Hat bought Cygnus.

    Finally, no one is going to drop legacy platforms outright; their users have to do the work, pay, or resign themselves to a feature freeze.
  • A bug is a bug. (Score:3, Insightful)

    by autopr0n ( 534291 ) on Tuesday May 31, 2005 @11:12AM (#12684771) Homepage Journal
    If there's a bug in the main source code base that only manifests on a particular platform, it's still a bug.
