
Alternatives To SF.net's CompileFarm? 186

cronie writes "Not long ago, SourceForge.net announced the shutdown of the Compile Farm, a collection of computers running a wide variety of OSes, available for compiling and testing open source projects. SF.net stated their resources 'are best used at this time in improving other parts' of the service. I consider this sad news for the OSS community, because portability is one of the strengths of OSS, and not many of us have access to such a variety of platforms to compile and test our software on. As a consequence, I expect many projects to drop support for some of the platforms they can't get access to. Are there any sound alternatives with at least some popular OS/hardware combinations? Any plans to create one? (Perhaps Google or IBM might come up with something?)"
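
One stopgap for projects losing the farm is to script builds on whatever remote machines contributors can lend. Below is a minimal, hypothetical sketch in Python: it copies a release tarball to a few remote hosts over ssh and runs the usual configure/make/check cycle on each. The host names, tarball name, and build commands are placeholders, not a real service.

    #!/usr/bin/env python3
    """Minimal sketch of a do-it-yourself compile farm: copy a source
    tarball to several remote machines over ssh and run the build there.
    Host names and paths are hypothetical placeholders."""

    import subprocess

    # Hypothetical OS/architecture mix; substitute machines you actually have access to.
    HOSTS = ["solaris-sparc.example.org", "netbsd-alpha.example.org", "linux-ppc.example.org"]
    TARBALL = "myproject-0.1.tar.gz"

    def build_on(host):
        # Copy the release tarball to the remote machine.
        subprocess.run(["scp", TARBALL, f"{host}:/tmp/"], check=True)
        # Unpack, configure, build, and run the test suite remotely.
        remote_cmd = (
            f"cd /tmp && gunzip -c {TARBALL} | tar xf - && "
            "cd myproject-0.1 && ./configure && make && make check"
        )
        result = subprocess.run(["ssh", host, remote_cmd])
        return result.returncode == 0

    if __name__ == "__main__":
        for host in HOSTS:
            status = "OK" if build_on(host) else "FAILED"
            print(f"{host}: {status}")
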
  • by Anonymous Coward on Sunday March 11, 2007 @03:46AM (#18306158)
    That's not much use for testing compiling on Solaris on SPARC64, or Tru64 on Alpha, etc...
  • by TeraCo ( 410407 ) on Sunday March 11, 2007 @04:03AM (#18306202) Homepage
    The only problem is that the people compiling aren't the same as the people who are buying.
  • by CaptainTux ( 658655 ) <papillion@gmail.com> on Sunday March 11, 2007 @04:08AM (#18306220) Homepage Journal
    Personally, I see less and less need for compiled and distributed software as broadband internet becomes ubiquitous and rich internet applications become more sophisticated. As it stands now, there is very little that traditional software does that can't be replicated on the web using the right technology. Software as a service is slowly becoming a reality and compiled software is soon to go the way of the dinosaurs.
  • by Animats ( 122034 ) on Sunday March 11, 2007 @04:15AM (#18306242) Homepage

    It was announced afterwards for a reason. They're not really taking it down because nobody wants it or anything, it's because they lack manpower to keep it working. It basically needs a lot of work to get it back in a usable state, and it's not widely used, so they're just dropping it.

    This is the classic downside of "software as a service".

  • by Anonymous Coward on Sunday March 11, 2007 @04:43AM (#18306346)
    Not to be a jerk, but even if the future of computing is "write once, run anywhere," doesn't it still stand to reason that the code should be tested to make sure it can indeed run anywhere? I tend to agree that languages like Java which abstract away much of the OS and hardware specifics will become suitable for more and more tasks as system performance increases, but 1) there will always be applications in which hardware and/or OS-specific optimization will be necessary, and 2) even without considering this, only a fool would trust such abstraction layers (e.g. the JVM) implicitly, claiming multiplatform support without ever having tested on the platforms in question.
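
    To illustrate the point (in Python rather than Java, for brevity): even in a "portable" language, platform details leak through, which is exactly what per-platform testing is meant to catch. A small, assumption-free sketch that just reports what the current platform looks like:

        #!/usr/bin/env python3
        """Sketch of why 'portable' code still needs testing on each platform:
        the same program sees different path separators, line-ending
        conventions, default encodings, and byte orders depending on where it runs."""

        import locale
        import os
        import sys

        print("platform:          ", sys.platform)
        print("path separator:    ", repr(os.sep))
        print("line separator:    ", repr(os.linesep))
        print("preferred encoding:", locale.getpreferredencoding())
        print("byte order:        ", sys.byteorder)

        # Code that hard-codes any of these (say, splitting paths on "/" or
        # assuming little-endian binary files) will pass on the developer's
        # machine and fail elsewhere; a compile/test farm exists to catch that.
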
  • Re:Obvious (Score:1, Insightful)

    by fromvap ( 995894 ) on Sunday March 11, 2007 @04:47AM (#18306358)
    You seem to have mixed up "users" and "developers." For most people, anything that involves a command line IS rocket science. When a techie comes along and ports something and makes it available, it can be a huge gift to the normal users. When everyone figures porting is so easy that it isn't worth doing for the newbies, you end up in a situation where an essential program like GimpShop has its Linux version lagging far behind the latest release of Gimp, and its Windows version lagging even further behind the Linux GimpShop.
  • by Peet42 ( 904274 ) <Peet42@Net[ ]pe.net ['sca' in gap]> on Sunday March 11, 2007 @05:24AM (#18306460)

    The only problem is that the people compiling aren't the same as the people who are buying.


    True, but remember that the more software that eventually runs on your platform, the more people are likely to adopt it.
  • by Excelcia ( 906188 ) <slashdot@excelcia.ca> on Sunday March 11, 2007 @05:27AM (#18306464) Homepage Journal
    And what is the client running? A web browser running on machine with an OS. So, you need compiler, programming, and testing infrastructure for:
    • The application provider's OS
    • The application provider's network services
    • The application
    • The client's OS
    • The client's network client
    And this is supposed to be so much less complicated to write, distribute, and debug than traditional systems that you can do away with traditional compile farms? Software as a service, no need to install anything. Unless, of course, you want to print something. Or is that a service too? Burning a DVD is a service? Put your DVD-R in the drive, connect to your favourite DVD authoring service, and... go to sleep. Maybe tomorrow your disc will be done. Unless DVD or HD-DVD quality video is something you expect to get solely off broadband.

    There are so many things that the majority of people reading this do on a daily basis, and that software-as-a-service can't reasonably deliver, that I just have to laugh when people bring this up. Beyond a wet dream for Microsoft where they lovingly sit back and watch the monthly subscription dollars roll in, this is never going to happen.
  • by Antique Geekmeister ( 740220 ) on Sunday March 11, 2007 @05:28AM (#18306468)
    It's expensive: power, cooling, rent on the building with the rackspace, and bandwidth all add up to a considerable chunk of change. And the professional skills to run such a farm are unusual and expensive to hire, or to contribute. Even a modest Q/A testing and evaluation farm can cost a few hundred thousand dollars a year when you add up all the costs.
  • by Anonymous Coward on Sunday March 11, 2007 @05:31AM (#18306482)
    Well, for starters, a DDoS attack won't affect any of my traditional software. Nor will my cheap-ass ISP going bust. Or road construction outside my house cutting through the cable/phone lines.

    It wasn't so long ago that a whole COUNTRY (Pakistan) lost its internet access because one cable was damaged.

    Traditional software ain't going away anytime soon.
  • by rbarreira ( 836272 ) on Sunday March 11, 2007 @06:06AM (#18306572) Homepage
    Implement a media player on the "software as a service" model.

    Now implement a cryptography library on the "software as a service" model. Oops, you're sending plain text data through the cables...

    Now implement a real time application on the "software as a service" model.

    Now implement an application which requires near-100% availability on the "software as a service" model.

    Now implement a high-end game on the "software as a service" model.

    Are you done? Do you like the results?
  • by imemyself ( 757318 ) on Sunday March 11, 2007 @06:31AM (#18306624)
    I don't know if it would be suitable for the sort of thing you're talking about, but HP has (or at least they had a year or so ago) a program where you can telnet into a variety of different systems, mostly OpenVMS and HP-UX running on a few different architectures. I know that you didn't have network access from the box that you telnetted into, but I don't know what other restrictions there were. It might be something to check out if you're interested in making software for some of HP's higher-end stuff, but don't have the hardware to run OpenVMS or HP-UX.
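
    If you can get an account on such a system, even a telnet-only login can be scripted. A rough, hypothetical sketch in Python; the host name, account, and build commands are placeholders, and it uses telnetlib, which ships with Python up to 3.12 (it was removed in 3.13):

        #!/usr/bin/env python3
        """Sketch of driving a build on a borrowed remote system over telnet,
        e.g. a guest HP-UX or OpenVMS box.  Host, account, and commands are
        hypothetical placeholders."""

        import getpass
        import telnetlib

        HOST = "testdrive.example.com"   # placeholder, not a real hostname
        USER = "guest"

        tn = telnetlib.Telnet(HOST)
        tn.read_until(b"login: ")
        tn.write(USER.encode("ascii") + b"\n")
        tn.read_until(b"Password: ")
        tn.write(getpass.getpass().encode("ascii") + b"\n")

        # Kick off a build in the guest account's home directory and capture the output.
        tn.write(b"cd myproject && ./configure && make && make check\n")
        tn.write(b"exit\n")
        print(tn.read_all().decode("ascii", errors="replace"))
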
  • by Haeleth ( 414428 ) on Sunday March 11, 2007 @07:55AM (#18306726) Journal

    This is true.. how many operating systems are in wide use for most applications these days?

    We have... Windows, MacOS, Linux, and BSD.
    All of them in numerous different versions, and in the case of OS X, Linux, and BSD, running on a variety of hardware. (There's still PLENTY of PPC-based Macs around, for one.)

    I spose there's still people working with Sun/Solaris and HP/UX and AIX
    Damn right. More than you'd think, in fact.

    but for the most part, open source devs care that it works on their stuff, and to heck with whatever else.
    Do you consider this an attitude to be encouraged?

    And even if you don't see a problem with it, what about those open source devs who do actually kind of like the idea of testing on a variety of hardware? There aren't many hobbyists who can afford to buy servers from HP and IBM.
  • by vrai ( 521708 ) on Sunday March 11, 2007 @08:14AM (#18306800)

    I spose there's still people working with Sun/Solaris
    Yes, such as the entire banking industry and almost all its associated software vendors. Admittedly there's been a move towards Solaris/x86, but there's still a huge market for UltraSparc machines; not all jobs can be efficiently distributed across multiple machines, and Intel architecture can't provide more than 16 cores. The Cell processor is attracting a lot of attention as a potential replacement for Sparc, and it requires specialist development machines. You can't really test your new Cell-optimised uber-parallel pricing model on a four-core Intel.

    For most open source software you're completely correct: it'll never run on anything more exotic than a Core Duo. But if you're developing something other than desktop applications (e.g. programming languages, libraries, frameworks, etc.) and you want your software to be used by the widest possible audience, you need to test it on as many architectures and operating systems as possible.

  • Usage stats? (Score:3, Insightful)

    by Orlando ( 12257 ) on Sunday March 11, 2007 @08:19AM (#18306826) Homepage
    As a consequence, I expect many projects to drop support for some of the platforms they can't get access to.

    Do we have any actual data on how popular the service was? I think this was a neat idea, but if it wasn't being used it won't be missed...
  • Re:VMs (Score:4, Insightful)

    by Curtman ( 556920 ) on Sunday March 11, 2007 @09:22AM (#18307084)

    The SF.net CompileFarm was not there to provide 'power'.

    I believe he meant this [ibm.com] kind of power. ;)
  • by Anonymous Coward on Sunday March 11, 2007 @10:08AM (#18307312)
    I've heard this argument before. From a vendor's point of view, it is so divorced from reality as to be laughable. Let's face it, if you, as a developer, have enough pull to direct a couple million US dollars toward a hardware contract (or even a few hundred thousand), you probably aren't sucking around for free equipment and resources. If you don't have the ability to throw that much cash at a vendor, there isn't a lot of incentive to talk, is there? Workstation/server vendors have different cost structures and markets than PC vendors.

  • by Anonymous Coward on Sunday March 11, 2007 @12:41PM (#18308188)
    Using the compile farm to do regular automatic compiles of the latest code (if there are any changes) was one of the most useful features of SF. *sigh*
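
    For anyone recreating that feature on their own hardware, the loop is simple enough to run from cron on each build machine. A minimal sketch in Python, assuming a Subversion checkout; the paths and commands below are hypothetical placeholders:

        #!/usr/bin/env python3
        """Sketch of a 'rebuild only if something changed' job, meant to be
        run periodically (e.g. from cron).  Assumes a Subversion checkout in
        WORKDIR; the layout and commands are placeholders."""

        import subprocess

        WORKDIR = "/home/build/myproject"      # hypothetical checkout location
        STAMP = "/home/build/last_built_rev"   # revision we last built

        def current_revision():
            # Bring the working copy up to date, then ask which revision it is at.
            subprocess.run(["svn", "update"], cwd=WORKDIR, check=True)
            return subprocess.run(["svnversion"], cwd=WORKDIR,
                                  capture_output=True, text=True, check=True).stdout.strip()

        def last_built_revision():
            try:
                with open(STAMP) as f:
                    return f.read().strip()
            except FileNotFoundError:
                return None

        rev = current_revision()
        if rev != last_built_revision():
            subprocess.run(["make", "clean"], cwd=WORKDIR)
            result = subprocess.run(["make", "check"], cwd=WORKDIR)
            print(f"revision {rev}: {'OK' if result.returncode == 0 else 'FAILED'}")
            with open(STAMP, "w") as f:
                f.write(rev)
        else:
            print(f"revision {rev}: no changes, nothing to do")
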
