
Competitive Cross-Platform Development?

Posted by Cliff
from the keeping-up-without-losing-compatibility dept.
Avalonia asks: "I work for a software company in the oil and gas exploration industry with a software development team of seven. Our software and development environment is cross-platform on Solaris, Irix, Linux and Windows. Most of our customers are on Solaris and Irix 64-bit systems, but Linux and Windows are increasingly important. Our environment is based around an elaborate command-line system of Makefiles controlling four different compilers (gcc 3.1, Sun Forte, Irix MIPSpro and Visual C++ 7). Needless to say, maintaining this system and producing modern multi-threaded C++ that will go through the four build systems is time-consuming in the extreme. A large proportion of our time is spent finding C++ code that just works rather than being creative and competitive with new functionality. What tools and strategies can we use to increase our productivity and regain our competitive advantage, without going for Windows only?"

"Our recent single-platform competitors (Windows only) can seriously outrun us in terms of productivity by using a single modern IDE - such as C++Builder or Visual Studio - although we can scale onto larger multiprocessor Unix systems. With Windows 64-bit imminent we may lose our 'big-iron' scalability advantage. Java is not currently an option for the high-performance numerical and immersive graphical aspect of our applications."

  • try (Score:3, Insightful)

    by BigChigger (551094) on Tuesday November 05, 2002 @04:59PM (#4601860)

    I think it runs on several of the environments you mention. And I think there are C++ plugins for it.

  • by SirSlud (67381) on Tuesday November 05, 2002 @05:00PM (#4601868) Homepage
    why all the different compilers?

    Hrm, this seems too simple an answer, there must be something wrong with it... but can't gcc cross-compile? At least then you could dump a lot of the compiler-specific scripting in your build procedures.
    • Along with GCC for cross-platform compatibility, look at using GLIBC to make sure the functions you expect to be there are there on all of the platforms.

      Haven't checked recently, but is Cygwin still being maintained? If it is, then you have GCC available on all four of your platforms, along with the same development libraries and headers.

      You may lose some of your "Big Iron" support since you are using GLIBC instead of the native libraries. (but I'm not an expert in this to know what kind of performance hit you may take).
      • by aridhol (112307) on Tuesday November 05, 2002 @05:11PM (#4602019) Homepage Journal
        You aren't necessarily using GLibC. GCC works just as well with other LibC implementations, so you can use your "Big Iron"'s own LibC and LibC++.
    • by ncw (59013) on Tuesday November 05, 2002 @05:24PM (#4602119) Homepage
      We do exactly this in our main product (a control and monitoring system). It is about 300k lines, split almost evenly between C and C++. Portability was a major concern in its design - we've already had to port it from an obsolete platform!

      We compile using gcc for unix (linux mostly) and Windows using mingw. We cross compile everything from linux and this all works from one Makefile. Recently we even managed to get the NullSoft NSIS installer working under Wine so we can make the install package under linux too.

      Once we got all this ironed out we don't really have to worry which platform we are working on - "it all just works". Any developer can compile for every platform too.

      We split the design into a server part and a client part. The server part doesn't do anything fancy but the client part of course interfaces with the user. We had flirtations with wxWindows[*] and GTK[*] as cross-platform GUIs but in the end we decided to use SDL. SDL is very simple but it really works excellently - our application looks identical down to the last pixel on Windows and Linux. Of course we had to write our own windowing system, but that is what C++ is for, isn't it? ;-)

      [*] In our experience GTK doesn't work very well under Windows, wxWindows is just too different on Windows/Unix, and we couldn't (then) afford the licence fee for Qt for commercial products. SDL seemed just the answer for us.
    • by bmajik (96670) <> on Tuesday November 05, 2002 @05:27PM (#4602160) Homepage Journal
      gcc is the compiler people on Linux use because it's available, not because it's the best.

      gcc on MIPS-IRIX is just awful. gcc is the least common denominator in terms of performance and just as bad as the others w.r.t. compilerisms and peculiarities. It just so happens that if you ported your code to gcc, it would _compile_ everywhere and run in a degraded state on non-Linux-x86 platforms.

      so to review:
      On Solaris - the Sun compiler smokes gcc for C, C++, and Fortran code

      On IRIX - the SGI C++ compiler is almost a reference for how a good C++ compiler should be done. Oh yeah, its code generation is ideal on MIPS architectures (big surprise). Can gcc even emit MIPS4 code yet?

      On Windows - well, msvc is a pretty performant compiler.
      • Have you tried gcc 3.2? Personally, I have not seen any big difference between the code quality of VC++, gcc, and Intel's compiler, though I don't have a great deal of experience comparing them.

        I can't speak for the other platforms.

      • by JoeBuck (7947) on Tuesday November 05, 2002 @07:23PM (#4603401) Homepage

        This message is outdated, possibly reflecting experience with older GCC versions. GCC 3.x is in many ways closer to ISO C++ conformance than MSVC, and it has a new x86 backend that is a big improvement over what we had before.

        Sun's C++ compiler generates faster code than GCC for some cases, but slower for others. Sun tuned their compiler for the standard benchmarks; you will not see the gains they advertise on other code. In the recent past, Sun has regularly broken binary compatibility in patch releases, leading to no end of problems for us in supporting customers.

        If you need Fortran, gcc's Fortran is not great. Also, the ia64 support is immature; you will not get fast code out of gcc for that platform.

        Sun, HP, and MSVC are all riddled with compiler bugs of various types; GCC's bugginess is now somewhere in the middle of the pack.

        Finally, differences between compilers can often be greatly reduced by simplifying the coding of inner loops. With code that has been given this treatment, we find that Intel's compiler is only about 5% better than gcc on our large codes.

        But if you do cross-platform C++, GCC can be a very good choice, as you have one set of front-end compiler bugs to work around instead of five or six.

  • Gcc? (Score:4, Insightful)

    by JanneM (7445) on Tuesday November 05, 2002 @05:01PM (#4601877) Homepage
    Why not just use gcc for all four platforms? The sticking point would likely be Windows, but even if you elect to stay with MFC++ for Windows, you've reduced the incompatibilities from four to two different compilers.

    • Re:Gcc? Speed. (Score:5, Informative)

      by for(;;); (21766) on Tuesday November 05, 2002 @05:10PM (#4602014)
      gcc is built for portability, not speed. VC++'s code is faster, but has zero portability and its own magical, LSD-inspired "innovations." ["for(int i=0;i<n;i++); for(int i=0;i<n;i++); ? Why would anybody ever want to compile code like that?"] That Intel compiler mentioned on /. a while back sounded fast; but the gcc-for-everything approach may not be best, if they find compiled Java too slow.
      • If you regularly compile your code on multiple platforms this is less of an issue. For instance, we compile our cross-platform code on Visual Studio, Borland, GCC on Linux, GCC on OSX, GCC on Solaris, and CodeWarrior on OSX/OS9.

        It's amazing how many things you find this way. It's actually a good way to find many bugs as well. Since we've been doing this a while, we also decided to avoid a lot of the "newer" C++ features, and that really helped both speed and portability. (Since not all compilers did STL well for some time.)

      • `for(int i=0;i<n;i++); for(int i=0;i<n;i++);' is the right way according to the C++ standard; the wrong way is `for(int i=0;i<n;i++); for(i=0;i<n;i++);'
      • Re:Gcc? Speed. (Score:4, Interesting)

        by gillbates (106458) on Tuesday November 05, 2002 @06:13PM (#4602675) Homepage Journal
        gcc is built for portability, not speed. VC++'s code is faster

        You're joking, right? Perhaps I'm a little behind the times, but I was under the impression that GCC used a register-based architecture where VC++ uses a stack-based architecture. While GCC might spit out some average-performing code with the default options, using -O3 will produce very fast running code. I've compared the performance of code compiled with GCC vs. the same program written in assembler, and at its best, GCC is only 50% slower than hand-coded assembly, which is very good considering that some very well respected compilers will produce code that is 10 times slower than hand-coded assembly.

        I've also looked at the assembly language output of VC++, and it's a joke. VC++ often inserts large sections of extraneous instructions into the code. I've managed to follow function calls in VC++, and it is not uncommon for a function call to have prolog and epilog code of several dozen instructions. Interestingly, GCC uses a register-based scheme, and I can actually follow the code pretty easily, in spite of the fact that it uses the AT&T syntax. GCC just produces cleaner code.

      • Re:Gcc? Speed. (Score:3, Interesting)

        by aminorex (141494)
        gcc 3.2 is generally superior to VC++ emitted code,
        in my experience.

        Mingw32 is the target of choice if you don't want
        to license Cygwin.
    • Re:Gcc? (Score:3, Interesting)

      by Dog and Pony (521538)
      Using GCC on Windows is not a problem anymore. I do it often, with no problems - and keep in mind that I'm not very good at this stuff on any platform. :)
  • GCC (Score:3, Informative)

    by captaineo (87164) on Tuesday November 05, 2002 @05:01PM (#4601885)
    You can, with some loss of performance, use GCC on all of those platforms. That would cut out compiler quirks as a variable.

    Don't write Makefiles yourself. Instead write a script that translates simple build rules (foo.cpp -> foo.o -> foo.exe) into a custom Makefile for each platform. I went this route after battling for years with complex Makefile rules that never quite worked.
    • Re:GCC (Score:3, Informative)

      by Sludge (1234)

      Don't write Makefiles yourself. Instead write a script that translates simple build rules (foo.cpp -> foo.o -> foo.exe) into a custom Makefile for each platform. I went this route after battling for years with complex Makefile rules that never quite worked.

      I'm starting to look into using Cons for a cross-platform C/C++ makefile alternative. I haven't used it in a large project yet, but I can definitely get up and running faster than with Make.

  • GCC/DJGPP (Score:2, Informative)

    by maverickbna (578984)
    gcc is cross-platform, can cross-compile to different platforms, and for those who feel the need to use Windows, gcc has been ported to the ever-popular development suite called DJGPP.
  • by jukal (523582) on Tuesday November 05, 2002 @05:04PM (#4601932) Journal
    ...uhm...ahmmm...mmm. Dunno what to add.
    • More difficult than you would like. Ever try to get something using templates to compile on those four platforms? I don't recommend it. If you are trying to get into the more complex things that you can do with C++, it's *very*, *very* difficult to write 100% portable code.
      • it's *very*, *very* difficult to write 100% portable code.

        Yes, but it might be *very*, *very* hard to manage non-portable code, too. Well, there are more experienced minds in here, but I would say at least maximizing the percentage of platform-independent code is the path to go. Hehehe, giving advice is just so easy - and fun! :)

  • Go client/server? (Score:5, Informative)

    by WasterDave (20047) <davep&zedkep,com> on Tuesday November 05, 2002 @05:05PM (#4601944)
    I assume that most of your problems are in the GUI end of the equation - why not break the application into two bits? Put the numerical stuff on a grunty 8 way box, and cook up the UI with whatever language best suits the available (and hireable) skills and platform?

    Communication between the two is probably best through SOAP, although to be honest I've not looked into this area for a long time. The GUI can still be built in Java (I believe Java has some reasonably fast OpenGL wrappers now), or look into wxWindows using the existing C++ resource.

  • I've done this! (Score:5, Informative)

    by hajo (74449) on Tuesday November 05, 2002 @05:07PM (#4601961) Homepage
    I've done something similar in a mixed environment. The way we set it up was to use Java for GUI and logic development, then run a profiler against it and go native on those functions that needed it. These functions would be compiled into a library. (You'd be amazed how little of your code you need to optimize for huge performance gains!)
    If I had to do it again I would do the same thing, except I would use Python as the 'main, relatively slow, easy to code and maintain' language.

    • The problem with Python is that its GUI features are not fully cross-platform yet. We're still eagerly awaiting full OSX support. I don't know how that is going. There was a windowing toolkit I downloaded off SourceForge but I couldn't get it working right.
      • The problem with Python is that its GUI features are not fully cross-platform yet.

        Not true: wxPython. Sure, it doesn't come bundled with official Python distributions, but wxPython is cross-platform and quite capable, and blows the socks off Tk.
        • Have you got wxPython to work, though? (That's what I was alluding to.) I don't know if it is because I installed Fink, which has its own Python distro, but I never was able to get it to work right.

          Has anyone else had these problems? I'm thinking it was due to Fink as I've had other subtle problems with Python since I installed it.

  • by g4dget (579145) on Tuesday November 05, 2002 @05:07PM (#4601965)
    If you can do it at all, use something like Python: custom software is much easier to write in it than in C++, and it has good support for numerical operations. Java, too, is much easier and safer than C++.

    If you have to use C++, then wxWindows is a great environment: it works on lots of platforms and has extensive support for platform-independent I/O, threading, and networking.

  • by Dog and Pony (521538) on Tuesday November 05, 2002 @05:08PM (#4601976)
    wxWindows - a mature cross-platform C++ library, and not only for GUI, either.

    I don't know exactly what you need, but wxWindows and GCC cross-compiling (see the mingw32 FAQ, for instance) might be it?

    wxWindows also has good bindings to Python and Perl etc. for more rapid cross-platform development.
  • What tools and strategies can we use to increase our productivity and regain our competitive advantage, without going for Windows only?

    All Windows productivity oxymoron jokes aside, how do you go from most of your users being on Solaris and Irix to thinking about going Windows only?

    Erg...I just saw the second part of the article, after the ad. That was annoying. Maybe it's time to sign up for a subscription.

  • Primary platform? (Score:4, Insightful)

    by binaryDigit (557647) on Tuesday November 05, 2002 @05:09PM (#4601992)
    Why not do what a great many other people do (though I have a feeling that you may be doing this already), and target a specific platform for initial release, and then release on the others afterward? This allows you to focus on the platform that gives you the most bang for the buck, but still keep your scaling advantage. If you already have an established product/development environment, then you should already know enough to keep from making any of the "big mistakes" when it comes to writing portable code. Plus this allows you to divvy up your engineers into functionality vs. porting.

    Another thing would be to standardize on, say, gcc. Since the source is available, you can make whatever tweaks you need to get around any performance issues (I know, easier said than done). Then standardize on things like configure.
  • by buysse (5473) on Tuesday November 05, 2002 @05:10PM (#4602002) Homepage
    It sounds like your biggest problem isn't the cross-platform code as much as the Makefiles and the compiler differences.

    I would suggest first using gcc on all UN*X platforms, and also trying out something like ant instead of the various forms of make you're dealing with now.

    Also, have you considered using a library like Qt to handle most of the porting details? It's not free, but it is good if you can deal with its oddities (I personally consider preprocessing to be evil).

    Good luck.

  • this topic (if you haven't read it already). The basic idea is to break out all parts that don't work across the different platforms into their own sections of code (classes/interfaces), assuming you aren't doing that already. Then when you do updates (that will hopefully work platform-independently) you can just write those sections once. Pretty obvious, but I have seen so many places with this same issue that actually have completely separate code bases rather than breaking parts out.

    Unfortunately, if you are already doing this then you are probably screwed unless someone releases .NET CLRs for all those platforms.
  • controlling four different compilers (gcc 3.1, Sun Forte, Irix MIPSpro and Visual C++ 7)

    Couldn't you just use gcc across all systems? There are also plenty of open-source IDEs around. We use Eclipse along with ClearCase. Very good for cross-platform work.
    • Using GCC across all platforms doesn't guarantee an absence of problems (although it clearly simplifies things). I remember spending over a day tracking down a problem that was due to differences between GCC on OSX and GCC on Linux.
    • Couldn't you just use gcc across all systems?

      Not if you care about performance. Code compiled with a vendor-written compiler like Forte or MIPSpro can be over twice as fast as gcc's. At present, for example, gcc cannot optimize for the MIPS processor family, but SGI's compiler is finely tuned (remember SGI's bread-and-butter is fast compilation).

      Remember, gcc is the lowest common denominator. It'll compile your code, sure, but that's all it will do, even with -O2. It might do a little better on Linux/x86 because it is the primary compiler on that platform, but I'd be very surprised if VC++-generated code doesn't outperform it, even allowing for the additional overhead of Win32.

      This isn't a criticism of gcc, since performance wasn't its design objective, but it does ably illustrate that Open Source isn't always the solution, and commercial software is often better.
  • ant instead of make? (Score:2, Informative)

    by mrjive (169376)
    Ant is an excellent alternative to Makefiles. It might not solve all your problems, but you should at least be able to simplify your build process quite a bit.

    Apache Ant
  • Instead of different makefiles for each compiler, why not write wrappers for the various compilers to give them all the same interface? Then at least for compiler invocation you could have the same makefile for all four platforms, just set CC=cc_wrapper which is a shell script (or .cmd script on Windows, or Perl or whatever) which mangles its command-line options into whatever weird syntax this platform's compiler expects.

    With more fundamental issues like differences in the code accepted by different compilers, you can use #ifdefs in the code, which is not elegant but probably much better than makefile hacking.

    Some suggested switching to gcc on all platforms, but then others said it might produce slower code. Well, you won't know until you benchmark. It's possible (just) that you might decide to compile most of your object files with gcc, to eliminate most of the compiler-specific hackery, and just compile those parts of the program that are speed-critical using the native compiler. This assumes that the object formats used by gcc and the native compiler are compatible, but in principle there's no reason they shouldn't be. (Perhaps not in practice, but give it a try.)
  • Check out the latest edition of OilIT. Applications like this are why Fortran is still a very popular language among those who use computers for numerically-intensive work. Fortran 9x is so easily portable, it's not even funny. Bindings exist for all of the major threading standards (OMP, MPI, etc.) and it is FAST. Further, is there any language with a better selection of libraries available for number-crunching? I doubt it. Develop your 'work' code in Fortran, and your front-end in Java. Both are highly portable. The interface between the two will be platform-specific, but this will only be a small portion of the total project & much easier to maintain than the large, #ifdef-heavy project you have now. $0.02
  • by eddy (18759) on Tuesday November 05, 2002 @05:15PM (#4602042) Homepage Journal

    Just add more people to your team! Double! Triple! GO WILD WITH NEW RECRUITS!

    Sheeesh, some people never learn...

  • by pVoid (607584)
    Why not isolate threads, files, and IPC into lightweight objects?

    Sure you are going to have incompatible concepts across platforms (e.g. Windows doesn't support just unlinking an inode), but I'm sure you can find a happy subset without making too much compromise.

    I know Alias|Wavefront uses a very similar concept for their Studio and Maya products. (studio looks and behaves identically on all platforms - so it *is* possible).
  • by AKAImBatman (238306) on Tuesday November 05, 2002 @05:18PM (#4602063) Homepage Journal
    ...why you think you can't use Java. For all the bad press Java gets about being "slow", it is mostly old, outdated FUD. Newer virtual machines are often faster than C/C++ applications, especially in the number-crunching arena. Intensive graphics are no big issue since Java now has a fullscreen API (page flipping, double buffering, and all that), a very fast implementation of Java3D, and (if you prefer) OpenGL wrappers.

    Even if you feel that Java doesn't cut it for everything, apply the 80/20 rule: 80% of your non-performance-critical code in Java, and the latter 10% in C/C++. This solution would at least *reduce* your multi-platform woes. You might try posting this on JavaGaming. The guys over there are wizards at making Java perform with intensive graphics. (No surprise, considering that some of the industry's greatest performance experts hang out there.) They can also help you find the APIs you need. I'd really take a second look before you toss Java out as an option.
    • Yes, I know, I know, it is probably FUD, but how come I never ever find any Java applications that are as fast as they say? I mean, I have this 1.8GHz computer with (only) 256 MB of RAM and the rest average stuff. I have the HotSpots and the JITs people talk about from different vendors. Why do no Java apps actually perform?

      I like coding in Java, you might even say "I want to believe" (in my best X-Files voice), but how? People point to stuff like JBuilder, or other commercially built applications by big businesses. Nada. And no, I am not talking about my own applications either - they're slow too... :)

      So give me one good example and tell me how to run it, and I will believe. No one has, yet.
      • No problem (Score:5, Insightful)

        by AKAImBatman (238306) <> on Tuesday November 05, 2002 @06:58PM (#4603167) Homepage Journal
        Step 1. Go to [] and download the JRE 1.4.1.

        Step 2. Visit [] and click on "Webstart Now!".

        Step 3. Right click and save Meat Fighter. Find where you saved the JAR file and double click.

        Step 4. Right click and save Duke Nukes Stuff. Double click on the JAR.

        Step 5. Visit jGoodies and try their wide variety of products.

        If you are under Linux, I'm afraid the games probably won't perform well. (Little issue with getting X to be configured to handle high speed direct-framebuffer graphics). However, DataDino should work, although you may need to get the installer instead of using the super-cool WebStart link (Mozilla problem only!). If you don't have a database to use, visit the "Supported Databases" page and download the test HSQLDB database.

        The plain and simple fact is that Java is fighting two issues:

        1. Poorly written apps that give all Java apps a bad name. (For example, "genius" A decides to load a table before releasing the event thread. Table takes 5 minutes to load and user gets annoyed. The solution would have been to load the table in a separate thread so that the user can see and interact with the table items as they are being loaded.)

        2. Perceived performance vs. actual performance. People see Swing and the default look and feel and instantly "feel" that the app is slower than windows. Nothing could be farther from the truth. In all reality, it is probably running faster than the Windows app, it just doesn't seem right. This is caused by the Java L&F being way too "flat". Your brain doesn't quite connect the buttons and other objects as being solid objects to be manipulated.
        • Thank you, I will.

          But I will tell you beforehand, I am not totally buying that explanation (#2) - it would mean that almost every Java developer, including old professionals, would suck, since I've seen quite a few Java apps.

          And your point 1. is nothing I recognize at all... but maybe I am just lucky.
          • Explanation (Score:3, Insightful)

            by AKAImBatman (238306)
            Think about Point #2 in this context. How many programs have you tried that you are surprised to learn are Java? If you said none, you probably aren't looking hard enough. (Hint: Look for the java.exe file in the installation directory.) You tend to recognize Java apps by their distinctive look. The distinctive look has problems with perceived performance. As such, many good programmers change it to use a non-standard look.

            After you visit jGoodies, you should understand more of what I mean.

            As for Point #1. I don't know enough about MFC to be 100% sure, but I believe that Windows automatically handles repainting when you are populating complex objects such as tables. (e.g. You'll tend to notice large tables in SQL Server Enterprise Manager paint nothing in the table as you scroll. Instead, you can watch the text filled in after the fact.) Swing (the Java GUI toolkit) requires the programmer to make these optimizations. Why? Because that's who *should* be doing it.

            What if for some reason, I want to design a scrollable table that is fast enough when pulling data over dial-up connections? Under Java, I might design it so that the data doesn't display until the user stops scrolling, or I might display partial data. Under MFC, do I have much choice? Not without jumping through a great deal of hoops.

            Notice how Microsoft writes new components every time they have a new piece of software (e.g. Office toolbars, Outlook shortcut bar, etc.). They do this to improve performance in their programs. Java programmers shouldn't have to rewrite GUI components, just data models. However, few and far between is the programmer who actually does this.

            BTW, another spot you might want to visit is Swing Sightings. You can find links to all kinds of well-written Java programs.

            If you'd like to try a Java program that uses native components instead of Java Swing, try Eclipse. While I personally don't like it, it should help you understand the perceived problem a little more.
    • 80% of your non-performance-critical code in Java, and the latter 10% in C/C++
      And throw the remaining 10% out?
    • The fullscreen API and 3D features don't help if you just want quick response in a traditional, windowed productivity app.

      The performance of Swing still lags behind native code. We have some Java tools for in-house use; I can't bear to make them greater than 30% of the screen size because the refresh rate is too painful.

      Maybe there are some aggressive coding techniques that would accelerate things, but if you're not a game developer, your boss won't consider GUI optimization time well spent. The Qt or wx libraries (or even Microsoft Visual Basic(tm)) will give you a snappy-feeling application after a few minutes of assisted layout.
  • JNI is your friend (Score:5, Insightful)

    by Tim Macinta (1052) on Tuesday November 05, 2002 @05:19PM (#4602071) Homepage
    Java is not currently an option for the high-performance numerical and immersive graphical aspect of our applications.

    Java isn't an all or nothing deal. You could write your app in Java and then convert the parts that really need performance into C and call it via JNI. Then you only have to deal with keeping a much smaller C library portable.

    • by CmdrWass (570427)

      Not only that, there may be some merit for the "graphical aspect" of his argument, BUT the "high-performance numerical" part of his argument doesn't hold water.

      Java gets a bad rap for being "inefficient". The problem with Java isn't that it is slow; it is that it is so easy to learn that you have people programming in the language who have no business writing computer programs. I have proven time and time again to my peers that I can write code that is AS efficient or in some cases more efficient than comparable C++ programs. The thing is, I'm a Java expert, and I know how to tweak things for performance. Any language can be inefficient if the person writing the code doesn't know what they are doing. And quite frankly, very few programmers I've met are at that level.
  • Use ACE (Score:3, Informative)

    by TheGreatAvatar (49772) on Tuesday November 05, 2002 @05:19PM (#4602080) Homepage
    I've developed several multi-threaded, multi-platform applications using ACE. This is a very well thought out package that uses various patterns to abstract away the vagaries of the different OS / compilers. ADAPTIVE Communication Environment
    • Re:Use ACE (Score:2, Informative)

      by cK-Gunslinger (443452)
      I whole-heartedly agree with this. Many large corporations and projects are using ACE with lots of success. I find that multi-threaded and/or multi-process application development is simple and straightforward. We are only using the wrapper-functionality of ACE for the most part, but when combined with the pattern-abstractions and implementation, you can really achieve some well-designed, well-performing, cross-platform code.
  • Just rewrite all your makefiles for gmake - which is cross-platform and can drive any compiler - use one of these libraries (ACE or Rogue Wave) along with STL, and you are set.

    I've done this... it works...
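    A rough sketch of what that single gmake setup might look like. The uname strings, compiler names, and flags here are assumptions to adapt, not tested settings:

```make
# One GNU makefile picking the compiler per platform via uname.
UNAME := $(shell uname -s)

# Default: gcc everywhere else (Linux, mingw/cygwin on Windows)
CXX      := g++
CXXFLAGS := -O2

ifeq ($(UNAME),SunOS)
  CXX      := CC        # Sun Forte
  CXXFLAGS := -fast
endif
ifeq ($(UNAME),IRIX64)
  CXX      := CC        # MIPSpro
  CXXFLAGS := -Ofast
endif

app: foo.o bar.o
	$(CXX) $(CXXFLAGS) -o $@ $^

%.o: %.cpp
	$(CXX) $(CXXFLAGS) -c $<
```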
  • Have you looked at using the Bridge pattern (cf. the GoF Design Patterns book)?

    The GoF book gives an example of using the Bridge pattern to provide a platform-independent interface to a GUI API.

    Disclaimer: I don't have any real-world experience with the Bridge pattern, so I can't say how easy it is to make work, or how it performs in performance critical situations.

  • Hmm... (Score:3, Interesting)

    by Fugly (118668) on Tuesday November 05, 2002 @05:25PM (#4602145) Homepage
    I've seen a few shows on TV recently that featured some really wicked-looking oil and gas exploration software. Stuff that lets geologists view layers of terrain collected from seismic data in 3D, moving around in realtime... If that's the kind of stuff you're into, I can understand why Java isn't an option for display right now.

    However, java is exactly what you need. You can scale it across processors on the big iron or run it on the desktop without recompiling.

    Have you considered only writing your display logic in C++ and using java for the backend number crunching? For raw floating point math, I've read that java is barely slower than native code at this point. It's my understanding that talking to the OS so you can get to the hardware is where you take the major performance hits using java. If you could do your raw crunching in multi-threaded java code, you could then deliver the data through one of many different mechanisms to your display logic and have that be the only code that you need to port from OS to OS...

    Another thing you could possibly look at is licensing 3rd party libraries made for cross-platform development. From your post, the only thing I know you're definitely having trouble porting is thread-related code. I'm sure there are multi-platform threading libraries for C++ out there somewhere.

  • Qt (Score:5, Interesting)

    by neonstz (79215) on Tuesday November 05, 2002 @05:26PM (#4602153) Homepage

    Have you looked at Qt []? It supports all the platforms you are developing for. It is primarily a platform-independent GUI toolkit, but it also has a lot of other stuff: container classes (if you for some reason won't use the STL), thread support, SQL classes, XML classes and socket classes, all of which are platform-independent. It is not just a portable GUI toolkit; I think it is the best GUI toolkit there is, and I recommend it even if you're writing for Windows only. If you think of Qt more as a platform than a GUI toolkit, writing applications that run on multiple platforms (with native speed) may be easier than you think. (I'm not an employee of Trolltech, although I am wearing a Qt t-shirt as I write this :)

    • MKS Toolkit does a terrific job of integrating a Unix ksh-based CLI with WinNT/2K/XP, and couples closely with IBM DB/2 so that you can run DB/2 UDB scripts as you can under AIX. It is expensive, but worth it if you can afford it.

      Qt is a very, very nice cross-platform development library. Another option you could look at used to be Neuron Data's Elements Environment, but they renamed the company ( and I don't know if they still sell EE as a separate product (Advisor is a repackaging of their rule-based software, which was built on EE.)

      There are also the ever-present Rogue Wave class libraries, but I don't think they'll address your GUI requirements. However, if you use it to split out the core application functionality from the presentation (GUI), it might be helpful.

      A solid set of macros with compiler/platform detection directives can help a great deal for porting code, though many people prefer to use sed or perl scripts instead (a la config). Macros have the advantage of dealing with portability more consistently, and they localize the platform-specific changes to the headers and migration binding code (usually done as a base library.)

      Using cross-platform libraries such as IBM's ICU (Unicode support), et al., can also make your code much more portable without requiring extra work after the initial coding.

      There are also various open source projects that provide portable thread libraries, portable GUI toolkits, etc.

  • Simply moving over to GCC for all four platforms does seem to be the obvious choice; you can also flatten GUI differences by using a portable multiplatform library.

    I've had some experience doing just that. To date, Qt [] is the most mature of those and will give you uniform access to GUI, networking, threading and even database access for Win32, Unices (including Linux) and MacOS.

    If you aren't so worried about GUIs but need to output multimedia contents portably, SDL [] is a viable alternative. The portability of some of the more esoteric components is dubious but SDL has the distinct advantage of being completely free.

    As for performance concerns some people have raised about archaic versions of GCC, don't let that stop you -- even if you don't use GCC 3.2 (which produces very good code), the subtle improvement in speed is very rarely worth the greatly increased complexity in development and maintenance.

    Besides, with recent GCCs I'm hard-pressed to actually find any significant difference between code generated by it and other compilers (for IA32, anyways, and with all relevant optimizations turned on).

    -- MG

  • - First, Qt ( is awesome. High performance, excellent portability (including all the UNIX platforms you mentioned, plus Linux/Windows/MacOS X), legacy integration (with Motif, etc.), multithreading, etc. There are also third party toolkits to integrate more advanced visualization into Qt ( tion=Show&AID=108).
    Nowadays, Qt doesn't just handle GUIs. It also includes networking and threading abstractions, with really nice object oriented interfaces to them.

    - Second, think about what IDE features you're looking for. If it's a GUI builder (which seems doubtful, considering that you probably spend a lot more time writing visualization algorithms than dialog boxes), then you could use Qt Designer.
    But, if you're more generally interested in project management, integrated debugging, and source browsing, I'd suggest you take a look at Visual SlickEdit (, which integrates all those features into an amazing, cross-platform IDE/editor.

    Good luck!
  • If you definitely need speed AND flexibility, I'd write the main application logic and GUI in pure Java, then profile the application to find the parts that really need to be accelerated, and write those parts in C++ (you'd be VERY surprised at how easy this is to do in Java; your Java code wouldn't even know that it is calling C++ code).

    Give it a try; that way you can use a single IDE for all of your code (NetBeans, Forte, JBuilder, and Visual Age are all good tools).

  • I'm guessing you'll get plenty of suggestions to change your language, which is certainly something to consider if you have that option.

    But if you're like me, you don't have that option. You've got a load of C++ that's not simply going to magically transform into Python or Java overnight.

    I would suggest the hard road. Boil the supported standard features of your compilers down to a list, and tell people to stick to that list unless they can make a case that all of the compilers now support the new feature they want to use.

    Although I feel it is dated now, Netscape used to publish such a guide [] for their developers.

    For example, in our early days, we would not permit namespaces or RTTI.

    Now, as compilers have gotten more supportive of the ISO C++ standard, we permit those features in our codebase.

    But we haven't yet decided to open the floodgates on exception handling, although it's supported pretty broadly.

    Finally, you really need an automated build system that runs the latest repository snapshots through the compilers on all the platforms and throws the results up on a web page, like Tinderbox.

    That will tend to enforce good standards as developers will see that their check-in attempts fly through with green and no warnings, or get dirty yellow about warnings, or red with downright errors during the build.

  • Ask the experts! (Score:2, Informative)

    by joeytsai (49613)
    Why not see what others facing the same issue have done? In particular, I'm thinking of mozilla [], another C++ application with builds for Linux x86, Windows, OS X, OS/2, HP-UX, AIX and Solaris.

    Yeah, they had to make their own toolkit (XUL), but I don't know if you need one (it wasn't totally clear from the question).

    In particular, check out this helpful document the mozilla team made about writing portable C++ code [].
  • I have written many cross-platform projects in C++ without a problem. All I did was identify low-level platform-dependent issues and encapsulate them in subroutines, typedefs, #defines, etc. Then I wrote the software in regular C++ against the portability library. Example:

        typedef unsigned long PORTA_PID;
        // This function returns the current process ID.
        PORTA_PID getProcessID(void);

    As a previous respondent said, GUIs are a different matter. I would suggest standardizing on one GUI environment such as Java (or even MFC) and splitting the functionality so the GUI is a thin shell that communicates with the main business logic living in other C++ processes. You don't need no stinkin' tools; the amount of code required in the portability library is very small. That's my 2$. I'm worth more than 2c (just).
  • by AugstWest (79042) on Tuesday November 05, 2002 @05:52PM (#4602436)
    I don't understand why people even look at Makefiles anymore now that Ant [] exists. We've completely automated all of our builds and deploys across NT, Linux and Solaris with it, across different architectures and different locations.

    I'm not going to expound on using Java, since it is fairly ubiquitous these days and if it would work for you, I'm sure you would have already considered it.
  • by MSBob (307239)
    How are you, mate? How is everything at m*e?

    Have you considered java? Sounds insane, I know but Java3D seems to be coming of age and I've seen some pretty impressive large data visualisation demos written in it. It saves you a lot of headaches with cross platform development issues.

    I realise that most people think of Java only as a server side technology but Sun has been putting a lot of effort into making it more appealing to the scientific programming community. You should really give it another chance.

    Yours truly,

    You know who, eh? :-)

  • I do not know why no one considers Java performant.

    I have a few words / acronyms...

    1. JIT. As in Just-In-Time compilation. Meaning Java software approaches native speed as time goes on.

    2. JNI. As in Java Native Interface. OK, so you have a couple of C++ libraries that really have to be as fast as they possibly can be, without necessarily incurring the wrath of the platform demon by being written in assembly. That doesn't mean that the 90% of your code that doesn't vastly affect the performance of your system ALSO has to be written in C++.

    3. Silicon. As in what chips are made of. Including chips that run Java. Though these are really targeted at the embedded market (to have your coffee machine run Java or whatever), a high-performance version is available: it plugs into your PCI bus and runs Java there instead of on the main processor.

    Oh, not exactly on-topic, but insert the obligatory note about how .NET is language-independent, supposedly platform-independent (being an open standard and all), and performant, since it's compiled rather than interpreted.
  • A colleague of mine is in the same boat as you. He asked me to implement some of his numerical calculations in java. We then benchmarked both. Java was the clear winner for us. You should consider trying the same.

    Go out and download and install Java's sdk []. Also, take a look at jama [].

  • by swagr (244747) on Tuesday November 05, 2002 @06:52PM (#4603101) Homepage
    Java is not currently an option for the high-performance numerical and immersive graphical aspect of our applications.

    So what you're saying is:
    You've coded it in Java, used native methods where applicable, optimized it, run it, and found it was too slow on every single hardware configuration known to man.

    Or are you just guessing?

    If you posted on Slashdot hoping we'd help you, give us the details. How "not an option" is it?
  • I still prefer Tcl (with Itcl for object-oriented programming). For me, it's the easiest software environment to work with. I also use the websh and dtcl extensions for Apache (
    Using Tcl, you'll have very compact and clean code.
  • Java is not currently an option for the high-performance numerical and immersive graphical aspect of our applications. Read: "We don't want to learn Java." No Java programmer who read that believes you. The answer is that what you can learn to do in Java in 5 DAYS won't be fast enough, but if you're willing either to hire a good contractor (for God's sake, ask to SEE something he's built) or to take 8 weeks or so to get GOOD, you can EASILY do this in Java... unless of course you are doing special effects for ILM, in which case I humbly apologize, and I've got some questions about the next movie.
  • cmake and cygwin (Score:3, Interesting)

    by cant_get_a_good_nick (172131) on Tuesday November 05, 2002 @07:12PM (#4603295)
    As far as makefiles go, cmake [] looks promising. It seems to be a generalized Imake replacement. I haven't used it myself, but it looks interesting. It is now part of the cygwin toolchain.

    As far as tools go, look at cygwin []. My company uses GNU makefiles on NT and UNIX, with generalized Makefiles for each project and platform-specific build rules in universal gmake include headers. We use ACE for a lot of the cross-platform C++ stuff; a lot of our products are servers, so we avoid the cross-platform GUI stuff.
  • Would a cross-platform SDK with a C API do, instead of C++? C is more efficient than C++ for many purposes and much nicer for integration.

    You might want to take a look at Probatus Spectra SDK []. It provides many standard and advanced functionalities, as well as many helpful high-level frameworks for networking, globalization, and so on. A comprehensive cross-platform compatibility layer is an integral feature of the SDK.

    You mentioned that you need especially threading. Probatus Spectra SDK provides a very nice cross-platform API for threading. For example, it cross-implements the conditional variables in Unix threads and thread events in Windows.

    It also has an excellent build system based on a framework of makefiles that hides all platform-dependent issues.

    The currently supported platforms are Linux, Solaris (Forte compiler), HP-UX, IBM AIX, Tru64 and MS Windows (VC++ and Borland C++ Builder compilers). Both 32- and 64-bit builds are supported on Solaris.
  • Definitely check out:

    FLTK - - light-weight cross-platform C++ GUI toolkit with OpenGL support, etc.

    Boost - portable, peer-reviewed C++ libraries including threading support, etc.

  • Honestly, I would use Visual Studio only as a development environment -- but NOT rely on it. On the other platforms it's easy enough to use whatever text editor the developers are comfortable with.

    I tried Intel's compiler on Windows for the first time today, and it's far superior to the one that comes with VC 6.0... it integrates nicely with the IDE (remember, the IDE is just an IDE; the compiler is a command-line utility), and in my case (Fourier transforms, MP3 decoding) it produces code that is a LOT faster (though compile time is slower and the output is larger...).

    Point being, there is no reason (ever) to tie yourself to visual studio. It's a glorified text editor, really, and as long as you keep it that way (stay away from MFC et al) then it won't introduce any problems with cross-platform programming. I personally run a couple of (private) cross-platform projects, and the best advice is to treat code as code. Don't rely on any of VS's magic, just use it as an environment that lets you edit text and compile with the push of the F7 key.
  • RogueWave (Score:2, Informative)

    by HapNstance (38538)
    We have very similar needs in the medical software industry and we find RogueWave to be a pretty good solution for our cross platform C++ needs. They have various libraries (database abstraction, STL, math, etc.) which are all cross platform and provide decent performance. There is still some tweaking to be done to get all platforms to compile clean but it is much less than what we had before we moved to RogueWave.
  • by AxelTorvalds (544851) on Tuesday November 05, 2002 @08:14PM (#4603875)
    Bummer: when you ask about this stuff it's usually pretty late, and there are often legacy decisions already made that are hard to break free from. Or you're young and naive and don't like something at work.

    This stuff is done for real. At IBM I worked on a very large project that compiled on AIX (several distinct versions that we were sensitive to), Solaris, Windows NT, Linux, HP-UX, and supposedly OS/400, although I never actually built the OS/400 piece nor have I seen it operate. First things first: you need good coding conventions, and don't let some punk break them either. Secondly, you have to design some abstractions and build some foundation classes; or buy a really good set or download some good free ones -- I've heard positive things about ACE. This is mostly a problem with Windows any more; a few years back you might support win16, win32, PM, and UNIX, but now it's pretty much just POSIX and Windows. You need to abstract the machine stuff out: threads, possibly strings and such (Unicode vs. non-Unicode), possibly basic types (big endian vs. little), networking code. A rule of thumb on this kind of project is that you should never talk to the OS directly without something in between. It's a huge effort to make that OS abstraction layer, or to learn the ins and outs of an off-the-shelf one, but it's worth it, even if you pay with a little performance. If you build one from scratch, as IBM usually does or did, you can tune some things for your application; your OS abstraction layer can be a great "helper" or "utility" layer. Nothing sucks more than coding away on AIX building some cool classes and adding some cool new stuff, then checking it in and finding that it doesn't compile on any other platform, and you've got to figure that crap out ASAP to make a deadline.

    Typically, well-coded C and C++ can go from compiler to compiler pretty easily. Then you can use Pro64 on MIPS, ACC on SPARC, and Intel C on Windows for performance-critical portions of code. You have to be smart about it, though, and use some good conventions. The biggest rule would be: avoid MS Visual Studio, which is by far the most non-standard setup out there, and if you do use it, don't use its projects unless you have to. Some good makefiles with some good rules can make this pretty easy. I don't know why more people don't do it, but look at the Linux kernel's rules file. I have a Rules.make that I've built up; it includes things like different options for debug builds, profiled builds, and optimized builds, and sets up some common rules for compiling C++ and C code and what have you. My makefiles include that file, and then they are usually pretty short, generally not much more than a list of .cxx files and a library name. Then it's easy to make sweeping changes, too. I think a good build system, one that will last, should usually take a day or two to put together pretty early on in a project, unless you can crib some good stuff from another project. The goal is a flexible and reliable build system that you don't have to worry about. Far too often people start cobbling a build system together, and then after 6-9 months it's broken and brittle and hard to change because so many things have been changed and added throughout the project. Put some effort in up front, consolidate your rules in one place, and use some environment variables to control build switches. Use some shell scripts to figure out various things -- not hundreds of lines of Bourne shell code, just little bits. Do this until AAP is ready and rocking, and then use that. Also, if it needs to be said, use GNU Make; it runs damn near everywhere and it's pretty good at what it does. The 15 minutes it takes to learn will save you hours and even days in the long run.

    I'm a huge advocate of a solid and strong build. Mozilla is a project that festered for weeks or maybe even months because you couldn't build the damn thing when it went open source. Building code is something that can be done so well by tools that if you're worried about it, you need to fix the build. Building software is hard work, so take the pieces you can out of the equation; the build is the first one.

    Next, I assume you've abstracted the GUI from the meat. If not, make that job one if you wish to have any chance in hell against your single-platform competitors, or even your multiplatform competitors when you get down to it. View/data model and client/server: learn it, live it, love it. Or switch to a web-based interface; lots of people do.

    While we're on abstraction. If you guys are really serious then you're probably going to have machine specific components. Look at /usr/src/linux/include and /usr/src/linux/arch for a starting point of reference. I would envision a project like this have a set of small Mips, Wintel, Linux-x86, etc. directories. Everything else can probably be compiled with GCC and then in those directories you'd have your assembly and machine specific compiled code.

    Lastly, you want a staged check-in process. People hate not being able to commit code, but at the same time you don't want them to commit code for real until it compiles on all of your platforms. Honestly, I don't know of a really good way to do this. Set up a build lab, do nightlies against the real code, do nightlies against the "staged" code, then have some kind of weekly merge meeting. That's how I've seen it done; it's time-consuming and somewhat painful. Bite off too much and you're spending a shitload of time merging stuff.
