Does C# Measure Up? 677

An anonymous reader queries: "Windows::Developer is offering a detailed, quantitative examination [free login required] of C#'s performance versus Java, C, C++ and D. 'Overall the results were surprising, although perhaps unexciting, in showing that C# (and to a less extent Java) is, to a good degree, on a par in efficiency terms with its older and (presumed to be) more efficient counterparts C and C++ at least as far as the basic language features compared in this analysis are concerned,' writes the author, Matthew Wilson. I'm only an amateur coder, and confess to not understanding most of the two-part article. I'd love to hear how true programmers view his results, which are too wide-ranging to summarize easily here. How about it Slashdot, as this special edition asks, 'Can C# keep up with compiled languages like C, C++, and D or byte-code based Java?'"

While we're on the topic of C#, rnd() queries: "It's been a while now since Mono and DotGnu began eroding the market power of Microsoft by creating open source implementations of C# and the Common Language Runtime. Over the weekend I loaded Mono and did some informal benchmarking of object creation, intensive message passing, massive iteration, etc., and the results show that Mono is about 90% as fast as Microsoft's implementation after a very short time. I now want to switch my .NET development over to Linux/Mono exclusively, but I want to first settle on a free alternative to Visual Studio .NET 2003. Any suggestions?"

This discussion has been archived. No new comments can be posted.

  • Differences (Score:5, Funny)

    by fredistheking ( 464407 ) on Monday September 15, 2003 @07:32PM (#6969963)
    C# is higher pitch than C but less so than D.

    --
  • In Java's case ... (Score:2, Interesting)

    by Anonymous Coward
    Hotspot is really good about compiling bytecodes down to the native machine language. That code is real honest to goodness native code, and will execute at approximately C++ speed.

    Why use anything else?

    - David
    • Why use anything else?

      Templates. [josuttis.com]

      Any other questions?

      • by JasonB ( 15304 ) on Monday September 15, 2003 @07:45PM (#6970066) Homepage
        Java SDK v1.5 (not yet released) contains support for 'generics', which are very much like C++ templates for Java:

        http://java.sun.com/features/2003/05/bloch_qa.html [sun.com]
        • by Bingo Foo ( 179380 ) on Monday September 15, 2003 @08:04PM (#6970218)
          How are these mythical generics at metaprogramming? If Java eventually gets libraries like Blitz++, Loki, and boost::Spirit, I'll take a look. Oh, but that would require operator overloading, too. Maybe Sun should release the "J2UE" (Java 2 Usable Edition) as a JVM implementation of C++.
          </snarky>
        • by Anonymous Brave Guy ( 457657 ) on Monday September 15, 2003 @09:00PM (#6970829)
          Java SDK v1.5 (not yet released) contains support for 'generics', which are very much like C++ templates for Java:

          I really can't believe this thread. Why do people always come up with this worn out line whenever someone suggests that C++ templates are an advantage? And how come so many people have done so in replying to the same post? All but the first are (-1, Redundant), and the first is (-1, Ill-informed).

          OK, please, read this carefully, for I shall write it only once (in this thread):

          Java's generics are not even close to the power of C++ templates. They are glorified macros to fix a bug in the type system that should never have been there.

          C++ templates were at that level around a decade ago. Today, they're used not only to create generic containers (for which they are, no doubt, very useful) but also, via metaprogramming techniques, as the tool underlying most of the really powerful developments in C++ for the past few years: expression templates, high performance maths libraries, etc.

          If you didn't already know that, please read it again and understand it before proceeding.

          Java's generics don't even come close to the same power. They're a cheap knock-off, aimed at just one of the uses of C++ templates, which fixes a glaring flaw in the previous Java language. For that, they serve their purpose well, but please don't pretend they're anything more.
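The "glorified macros" charge above can be made concrete. A minimal sketch (not from the article; class names are mine): Java generics are erased at compile time, so every parameterization shares one runtime class, whereas each C++ template instantiation is its own type with its own generated code.

```java
// Sketch illustrating the claim above: Java generics are erased at
// compile time, so unlike C++ templates they generate no per-type code.
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<String>();
        List<Integer> numbers = new ArrayList<Integer>();

        // Both parameterizations share a single runtime class; the type
        // parameter exists only for compile-time checking.
        System.out.println(strings.getClass() == numbers.getClass()); // true
    }
}
```

This is also why there is no Java analogue of template metaprogramming: there is nothing left of the type parameter at runtime, and nothing computed per instantiation at compile time.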

        • ... compared to the code-generating power of lisp macros [c2.com].
      • Templates are in Java 1.5 (see generics) so this argument is moot.
      • by Trejkaz ( 615352 )
        Yay! All the power of Java's Generics but with 10 times the code bloat.
      • Why use anything else?

        Templates.

        Any other questions?

        In C++, templates are absolutely necessary. In any other language, they aren't... and not even useful...
  • by Vaevictis666 ( 680137 ) on Monday September 15, 2003 @07:36PM (#6969988)
    I now want to switch my .NET development over to Linux/Mono exclusively, but I want to first settle on a free alternative to Visual Studio .NET 2003. Any suggestions?

    How about Eclipse? [eclipse.org]

    • by Anonymous Coward
Eclipse does not seem to support C# natively (I confess to not having tried the C# plugin).

      I would recommend SharpDevelop. I understand work is underway to make it less dependent on WinForms so that it can run under Linux/Mono.

If the original poster was looking for free-as-in-beer IDEs, there's always the personal edition of Borland's C#Builder, as well.

      • by Glock27 ( 446276 ) on Monday September 15, 2003 @08:20PM (#6970339)
Eclipse does not seem to support C# natively (I confess to not having tried the C# plugin).

        AFAIK, all language support in Eclipse is by plugin. You're basically saying you've never tried it, but you advise us to try something else.

        Huh?

        (I've used Eclipse a bit for Java, and it is excellent. I'm pretty sure it'll be a fine environment for many languages by the time it's all said and done.)

        • Eclipse can be good for other languages. My problem with it is that it's just too slow many times. I did have a problem in Windows getting C/C++ to compile, but I think it was looking for cygwin + gcc, and not the MSVC++ I had installed at the time. I too have used it extensively for Java, and probably wouldn't use anything else on a decent-sized project.
  • by Anonymous Coward on Monday September 15, 2003 @07:40PM (#6970038)
    Raed it. Ingroe it. Bncehmrak aynawy.
  • Duh (Score:5, Insightful)

    by be-fan ( 61476 ) on Monday September 15, 2003 @07:42PM (#6970048)
This really makes sense if you understand how JITs work, and understand the nature of the benchmarks. These benchmarks are mostly microbenchmarks that test a particular operation in a tight loop. JITs can compile this loop once and run it directly on the CPU afterwards. This doesn't extrapolate well to situations where the JIT gets involved more often in the benchmark.
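A hedged sketch of the microbenchmark shape described above (the numbers and method names are illustrative, not from the article): one hot loop that a JIT can compile once, with "warmup" iterations run first so the timed run measures compiled code rather than interpretation.

```java
// Illustrative microbenchmark shape: a single tight loop the JIT can
// compile once and then execute at native speed.
public class TightLoop {
    static long sum(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            total += i; // the entire "benchmark" is this one loop
        }
        return total;
    }

    public static void main(String[] args) {
        // Warmup gives the JIT a chance to compile sum(); the timed run
        // below then measures compiled code, not interpreted bytecode.
        for (int i = 0; i < 10000; i++) sum(1000);

        long start = System.nanoTime();
        long result = sum(10000000);
        long elapsed = System.nanoTime() - start;
        System.out.println(result + " in " + elapsed + " ns");
    }
}
```

In a real application the JIT is invoked across many methods, classes are still loading, and code is not uniformly hot, so results from this pattern do not transfer directly.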
    • Re:Duh (Score:4, Insightful)

      by msgmonkey ( 599753 ) on Monday September 15, 2003 @08:24PM (#6970374)
But having said that, there is the "rule" that 10% of the code runs 90% of the time, which is why caches work. If the JIT is optimizing the code that runs 90% of the time, and isn't doing a bad job with the other 10% (since the JIT compiler itself would most likely stay in second-level cache), then the benchmark results would be relatively indicative.
  • by contrasutra ( 640313 ) on Monday September 15, 2003 @07:43PM (#6970058) Journal
    I now want to switch my .NET development over to Linux/Mono exclusively, but I want to first settle on a free alternative to Visual Studio .NET 2003. Any suggestions?

    VI. :-D

C, D... I feel bad for the F programmer.
  • Mono risks (Score:3, Insightful)

    by alext ( 29323 ) on Monday September 15, 2003 @07:43PM (#6970062)
    Mono is a bit faster than Java, for some benchmark?

    Well, why didn't you say so before?!

    That way we could have avoided all that tedious discussion about IP infringements, functionality etc. in yesterday's [slashdot.org] Mono debate. Clearly some dude's performance figure should override all other considerations when choosing a platform.
    • Microsoft has sought protection for a "programming language and compiler system capable of generating object code that performs comparably or better than C."
  • Uhm... (Score:4, Funny)

    by Black Parrot ( 19622 ) on Monday September 15, 2003 @07:45PM (#6970070)


    > Overall the results were surprising, although perhaps unexciting, in showing that C# (and to a less extent Java) is, to a good degree, on a par in efficiency terms with its older and (presumed to be) more efficient counterparts C and C++ at least as far as the basic language features compared in this analysis are concerned

    Could we have a few more weasel-words in that sentence, please?

Java has received a lot of attention over the last 7 or 8 years or so. It's a great idea in theory, but it falls way short in practice. It does remain viable for a few select uses, though: small widgets, servlets and web app servers. Other than that, using Java will mean excessive memory usage and slow performance.

Garbage collection in Java is not guaranteed. It's what I call Union. It'll clean up when it god damn well feels like it. In the meantime, the system slows to a crawl.

Graphics in Java are abysmally slow.
    • by trompete ( 651953 ) on Monday September 15, 2003 @07:54PM (#6970139) Homepage Journal
      to add....

      GUI development isn't that bad in Visual Studio. In fact, it is easy as hell. With emulators like Wine available, it makes sense to develop the software with Visual Studio and its amazing debugging tools for WINDOWS and then use Wine to run it on Linux.

      One could use the GTK and GNU-C to develop multi-platform software in the first place, but then you are dealing with the parent poster's problem of debugging on multi-platforms, not to mention that GTK sucks under Windows (Anyone else use GAIM?)
    • by Cnik70 ( 571147 ) on Monday September 15, 2003 @07:58PM (#6970168) Homepage
You apparently haven't taken the time to work with any of the newer releases of Java (1.3 and 1.4). Java is a very mature and worthy language, especially when it comes to developing non-platform-dependent applications. I prefer Java since I CAN program it on a Linux box, transfer it to a Windows box, a Mac, or even a mainframe, and the program will still run exactly the same without changing a single line. Let's see C, C++, .Net or C# do that... Also don't forget about a little thing called J2ME, which is slowly but surely making its way onto cell phones and other small devices. You may want to take another look at today's Java before declaring that it sucks.
      • by Gunfighter ( 1944 ) on Monday September 15, 2003 @08:11PM (#6970270)
        I'm afraid I'll have to agree with Cnik70 on this one. I can remember struggling through Java programs back in the day and thinking to myself, "No way in Hell this will ever become what everyone is saying it will be." That was then, this is now. Internal benchmarks of our code under mod_perl, PHP, Python (Zope), and Java for our web development show Java (Tomcat) to be the winner by a landslide when it comes to scalability, performance, and rapid development. From what I understand, even eBay is switching to a Java platform (anyone know more about this?). Of course, this is all on the server side...

        As far as GUI applications are concerned, the only thing that is slow about running Java GUI apps on modern hardware is the startup time. This can supposedly be taken care of with accelerator apps which keep a JVM running in the background just waiting for Java apps to be run. Even without such acceleration, I still use jEdit as my text editor of choice for all my programming needs (http://jedit.org/), and as a sysadmin I don't even program in Java (where jEdit is best applied). I usually stick to Perl/Python for automating systems administration scripts. Nevertheless, I find that the features, performance, and overall ease-of-use in jEdit save me loads of time (nice CVS integration too).

        Bottom line, Java is already in enterprise computing environments and, with an experienced developer, ready for primetime in smaller applications as well.

        -- Gun
    • by MikeApp ( 151816 ) on Monday September 15, 2003 @07:59PM (#6970183)
      You claim to have worked on numerous Java projects, then complain only about GUI apps? The large majority of Java projects are server-side, which is where it really rocks. Write once and deploy on your choice of platform (Linux, Solaris, or Windows if need be).
      • by pla ( 258480 ) on Monday September 15, 2003 @08:15PM (#6970305) Journal
        The large majority of Java projects are server-side, which is where it really rocks. Write once and deploy on your choice of platform (Linux, Solaris, or Windows if need be).

        Although true (in my experience), the idea itself has sooooo many things wrong with it...

        Portability - Server-side Java, by nature, does not involve any OS-specific activity. So, with no loss of portability or functionality, you could do the same in C/C++. Which, incidentally, will run on any platform for which GCC exists - About 30 *times* the number of platforms for which a JVM exists.

Performance - Yes, servers tend to have fairly impressive hardware resources available to them. So let's cripple that hardware by making it run an interpreted language. JIT? Server-side apps also tend to have very short process lives, doing a small task and exiting. In such situations, JIT causes worse performance, as it wastes time optimizing something that will never execute again during this process' invocation.


        I believe (though I do not wish to put words in his mouth) that CurtLewis only mentioned GUI programming because if you use Java anywhere else, you have misused it. It makes it easy to write an app with a similar user experience on any hardware with the resources to run the JVM. If the idea of a "user experience" has no meaning to your app, using Java means you have made a suboptimal choice.
        • by EMN13 ( 11493 ) on Monday September 15, 2003 @09:13PM (#6970953) Homepage
Server-side tasks are just great for Java; your arguments don't add up.

Any platform I've heard of for server stuff will run Java. C/C++ may have a wider distribution, but not amongst relevant platforms. Irrelevant to the discussion are portable/client-side/small/light platforms, and the computation-heavy things. Servers are about "service" - the naming is not a coincidence.

Most server processes run forever, or as close as possible; not very short-lived at all. Starting a new process (aka application) for every client is generally not best practice - those are threads, and they don't require constant re-JIT-ing or other JVM/CLR overhead.

          Rather, the larger the system, and the more data, generally the less code in relation to that data, and even less JVM in relation to data. So for the quintessential server with tons of data gunk, the c/c++ advantage is much smaller than in the GUI.

Furthermore, it is a mistake to compare the portability of Java to that of C++ in the manner you do. C++ implementations aren't generally compatible. Take a look at the Mozilla coding guidelines for portable C++: http://mozilla.org/hacking/portable-cpp.html

C++ isn't portable, normally. C++-- might be, though. Then again, it might not be. The Java language is very standardized; and in case you don't have a compiler on that platform, the bytecode is too!

In conclusion, if you're using C++ for a server-side task you should consider using Java instead. As a matter of fact, most scripting languages are probably better suited than C++; I can hardly imagine a worse fit.
          • by Anonymous Brave Guy ( 457657 ) on Monday September 15, 2003 @09:45PM (#6971241)

            Give or take a few quirks, most recent (last five years, say) C++ compilers support all the major features acceptably well for most purposes. There are only a few issues that cause significant problems: the infamous export, Koenig look-up and partial specialisation of templates come to mind. None of these features is used very often anyway, though.

            Coding standards that forbid the use of templates, exceptions and half the standard library in C++ are common, but way out of date.

            By the way, I write code for a living that gets shipped on more than a dozen different C++ platforms. Alas, some of them are well pre-standard, and so don't support even basic template or exception mechanics properly, which sucks. But this is an issue we've looked into in some detail, so I'm pretty confident in my statements above.

        • Oracle (Score:4, Interesting)

          by Un pobre guey ( 593801 ) on Monday September 15, 2003 @10:15PM (#6971469) Homepage
          So, with no loss of portability or functionality, you could do the same in C/C++.

          Try this:

          Write a backend app in Java, and one with the same functionality in C or C++. Make sure they both read and write data to an Oracle database. Now, with stopwatch in hand, make them both run on Win2K, Cygwin, Solaris, and Linux.

          Guess which language will allow you to finish first.

    • We're going to have to point out the separation of language and platform for you again. If the Perl community can figure it out, then anyone should be able to.

      Java 'the language' is good. Java 'the machine' is good. Java 'the platform' blows presumably because all the people who could make it better are sitting on Slashdot whining about Java's performance instead of doing something about it.

    • Informative my ass...

      Dude, show me a better IDE than Intellij IDEA [intellij.com] which is entirely written in Java, and then we'll start talkin'.

    • by Dr. Bent ( 533421 ) <ben.int@com> on Monday September 15, 2003 @08:12PM (#6970284) Homepage
I've worked in a variety of Java development projects in the past and not once has it ever risen to the task to show itself as a worthy choice and/or a mature language. Instead it has invariably wasted companies' time and money.

      I would be willing to bet the reason they failed is because you do not understand how to use Java correctly. As long as we're playing the personal experience game, it has been my experience that hardcore C/C++ programmers tend to make horrible Java programmers because they think Java should behave like C/C++. It doesn't (obviously), and when you try to shove that round peg into a square hole what you get is a huge mess.

At my company, we have a bunch of old-school C/C++ programmers and a few Java programmers. As our Java products started to take off (now our #1 selling product line), we wanted to move some of those developers over to Java to help out.

It was a disaster. They made object pools in order to try to manage their own memory. They were calling System.gc() and yield() instead of using a Java profiler to find bottlenecks. They never used lazy loading. They never used fail-fast ("exceptions are too slow!", they said). The result was that all the projects they worked on were extremely brittle, used twice as much memory, and ran much slower than our original Java stuff, because they were constantly fighting against the system instead of working with it.

Try reading Effective Java [amazon.com] by Josh Bloch and Thinking in Java [mindview.net] by Bruce Eckel. Do what these guys suggest and your Java apps will run just as well as anything written in C or C++.
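The object-pool reflex described above can be sketched in a few lines. This is a minimal illustration with hypothetical names, not code from any project mentioned here: the pool adds bookkeeping and keeps objects live longer, while plain allocation of short-lived objects is exactly what a generational collector is built for.

```java
// Minimal sketch (hypothetical names): hand-rolled pooling of cheap
// objects versus simply allocating and letting the collector work.
import java.util.ArrayDeque;
import java.util.Deque;

public class PoolVsAlloc {
    static class Point {
        double x, y;
    }

    // The C/C++-style reflex: manage object lifetimes by hand.
    static final Deque<Point> pool = new ArrayDeque<Point>();

    static Point acquire() {
        Point p = pool.poll();
        return (p != null) ? p : new Point();
    }

    static void release(Point p) {
        pool.push(p); // extra bookkeeping, extra liveness, extra bugs
    }

    public static void main(String[] args) {
        // Idiomatic Java: short-lived objects are cheap to allocate and
        // cheap to collect, so plain allocation usually wins.
        double sum = 0;
        for (int i = 0; i < 100000; i++) {
            Point p = new Point();
            p.x = i;
            p.y = -i;
            sum += p.x + p.y;
        }
        System.out.println(sum); // 0.0
    }
}
```

Pooling can still make sense for genuinely expensive resources (connections, threads), which is a different case from the cheap value objects above.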
    • by GroundBounce ( 20126 ) on Monday September 15, 2003 @08:46PM (#6970667)
I am in the process of coding a data collection server application in Java which collects data from remote devices that dial up over a phone line. The application must interface with the serial port at a low level, send email, generate XLS files, send faxes, create charts, and, of course, communicate with a DBMS. It also has a significant GUI component as well. Oh yes, this application must be able to run both on Linux and Windows (2000 or XP).

I have been developing on Linux. All in all, to date I've coded around 30,000 lines of code. Because of the high level of portability of the APIs, I had to change all of about a dozen lines of code to get it up and running on Windows. That's one dozen out of 30,000. There's no way you could even come close to that using C or C++, regardless of how cross-platform the library developers say their libraries are.

As far as performance, Java may have been slow three or four years ago, but the last several versions of the JDK and the HotSpot JVM have seen steady and rapid improvement in performance and stability. Swing, although it has improved, may still be a bit slow, but computational code written in Java is only slightly slower than in C++. Even current versions of Swing, although arguably slower than native GUIs like GTK+ or Qt, are fast enough so as to not be noticeable on any machine faster than 800MHz or so.

      Most people who say Java is unstable or slow are remembering their experience from the JDK 1.1 days. The current JDK 1.4 bears little resemblance to that in terms of performance and maturity.

    • by cartman ( 18204 ) on Tuesday September 16, 2003 @03:28AM (#6973115)

      I disagree.

      Garbage collection in Java is not guranteed... It'll clean up when it god damn well feels like it.

This is false. Garbage collection in Java is guaranteed. The VM times garbage collection to occur when the CPU is otherwise idle, or when additional memory is needed by the VM or by other programs. This is the most reasonable behavior.

      In the meantime, the system slows to a crawl.

      This is false. Garbage collection is deferred to improve performance. Deferring gc does not make anything "slow to a crawl," since unused objects consume no cache and take no processor cycles.

      Graphics in Java are abysmally slow.

      This is an exaggeration. It is noticeably less responsive however.

      Java was supposed to be: Write once, run anywhere, but what it is in reality is: Write once, debug everywhere... over and over and over.

      Java requires far fewer debug cycles than C or C++, since an entire class of bugs (tricky "pointer bugs") is eliminated. Thus, "over and over and over" is not accurate.
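The "whole class of bugs" point above can be shown concretely. A minimal sketch (names are mine, not from the thread): a memory error that is silent undefined behavior in C or C++ surfaces in Java as a well-defined, catchable exception at the exact faulting statement.

```java
// Sketch of the point above: an off-by-one access that might silently
// "work" (or corrupt memory) in C/C++ is a defined, catchable error in
// Java, so it cannot hide and resurface on another platform.
public class BoundsDemo {
    public static void main(String[] args) {
        int[] data = new int[4];
        try {
            int x = data[4]; // off-by-one: valid indices are 0..3
            System.out.println(x);
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("caught: index 4 out of bounds");
        }
    }
}
```

The bug still has to be fixed, of course, but it is found immediately and deterministically, which is the substance of the "fewer debug cycles" claim.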

I've worked in a variety of Java development projects in the past and not once has it ever risen to the task to show itself as a worthy choice and/or a mature language. Instead it has invariably wasted companies' time and money.

      It's possible the difficulty was with you or your team, not Java. Perhaps you're unfamiliar with the tools or the language. Other organizations have produced enormous projects, successfully, in Java.

      Stick with C and C++ for most development, there's a reason they are the standard: they work.

      For backend development and enterprise applications, Java is the standard, not C or C++. It's been that way for some time. There's a reason for it.

  • by DaMeatGrinder ( 565296 ) on Monday September 15, 2003 @07:48PM (#6970093) Homepage
    These articles set a metric of what is "good". They judge a bunch of languages based on this criteria, and announce a "winner".

    This is just one way to slice the pie.

    Languages are appropriate for different uses. I use C while kernel hacking. I use C++ for its template abstractions. I use PHP for web pages, Perl for command-line scripting. I use bash/tcsh for boot-scripts. I respect VB as an accessible language, but I have no use for a single-platform language.

What language you use depends on your application. Comparing C, C++, and C# is like comparing a wrench and a screwdriver, and concluding they can both be used as a hammer.

    • by Waffle Iron ( 339739 ) on Monday September 15, 2003 @08:09PM (#6970252)
Comparing C, C++, and C# is like comparing a wrench and a screwdriver.

      Moreover, languages like C and C++ can be used in very different ways, depending on the circumstances. You can code the "safe convenient" way using tools like STL or glib to manage strings, containers, etc. I've found that the overall performance of such an application often is in the same ballpark of a Java implementation (other than Java's obnoxious startup time).

However, C and C++ also allow you to write in a "masochistic balls-to-the-wall" mode that gives you much higher performance in exchange for 10X the programming effort. To do this, you often have to analyze your algorithms over and over until you can implement them using only stack and static structures. You avoid malloc() at all costs, avoid copying any data unless absolutely necessary, etc. You might disassemble the compiler output, run profilers and arrange data in cache-friendly patterns to squeeze out even more performance. The implementation will be much more brittle and prone to bugs, but you can often get a 10X or more speed improvement over the "natural" C or C++ implementation. Obviously, very few problems warrant this kind of attention, but making blanket statements about "comparing" other languages to C/C++ really should acknowledge the large range of performance that these languages can cover.

    • by kylef ( 196302 ) on Monday September 15, 2003 @08:47PM (#6970687)
What language you use depends on your application. Comparing C, C++, and C# is like comparing a wrench and a screwdriver.

      And this is where the .NET Framework shines, because the CLR is a generic virtual machine to which any number of languages can be compiled. Currently there are C#, C++, VB, and even Java (under the moniker J#). There has been talk of writing a Python compiler and even possibly a Perl compiler. So you can choose your language of choice, and your resulting binaries or objects will fully interoperate with the other .NET languages and class libraries.

      And as far as this article is concerned, I think the interesting point is not that they're comparing apples to oranges, but just that the performance numbers for CLR-compiled C# aren't so horrible that they should scare off the majority of developers.

      • Bad Kool-Aid. (Score:5, Interesting)

        by rjh ( 40933 ) <rjh@sixdemonbag.org> on Tuesday September 16, 2003 @12:05AM (#6972243)
        Currently there are C#, C++, VB, and even Java

        If I understand the .NET framework correctly, there is no way to support either multiple inheritance or templates--in which case C++ cannot be accurately modeled in .NET. Nor will Java be .NETtable after 1.5, which will introduce pale imitations of templates (but imitation enough to give the CLR a hissyfit).

The .NET CLR does not support multiple languages. It supports one language--C#. Its "multiple language support" comes from being able to compile many functionally-identical languages with different syntaxes down to the same bytecodes.

        But truly different languages are not representable in the CLR. Show me how to do a Scheme continuation in the CLR, please, or export a C++ template, or a LISP macro.

        The only languages .NET supports are those which are subsets of C#. And once you realize that, .NET becomes much less interesting.

        (Warning: haven't used .NET much in the last few months. Last I checked these things were true, but after being very unimpressed with .NET I haven't kept up with things.)
        • Re:Bad Kool-Aid. (Score:4, Informative)

          by Keeper ( 56691 ) on Tuesday September 16, 2003 @01:23AM (#6972655)
          If I understand the .NET framework correctly, there is no way to support either multiple inheritance or templates--in which case C++ cannot be accurately modeled in .NET. Nor will Java be .NETtable after 1.5, which will introduce pale imitations of templates (but imitation enough to give the CLR a hissyfit).

          The next version of .Net (2.0) is supposed to introduce native template support to IL. Not sure about multiple inheritance, but I doubt it.

The .NET CLR does not support multiple languages. It supports one language--C#. Its "multiple language support" comes from being able to compile many functionally-identical languages with different syntaxes down to the same bytecodes.

If you want to get technical, the CLR supports IL (Intermediate Language), which is interpreted by the virtual machine. The virtual machine has a set of instructions it understands, and the various .Net compilers compile to that set of instructions. This isn't very different from taking a C++ program and building an executable that consists of instructions for an x86 CPU. The machine (virtual or not) provides the facilities for performing the operations you want to do.

If you can represent what you want to do in IL, you can write a compiler for it. There's some guy out there who wrote a 386 asm->IL compiler (why? because he could, I guess...) for example. Not that I think it's a terribly great idea, but it is neat.

          But truly different languages are not representable in the CLR. Show me how to do a Scheme continuation in the CLR, please, or export a C++ template, or a LISP macro.

IL doesn't have special instructions designed to support those things, but they can be done; they just won't necessarily be done fast or efficiently. For instance, you could do C++ templates the way your average C++ compiler does them: your compiled code has a unique object for each templated type required. And I can pretty much guarantee you that you could write a Scheme or LISP compiler -- it'd just be a pain in the ass to do (which I'd imagine any proper LISP compiler would be), and it wouldn't be wicked fast.

          The only languages .NET supports are those which are subsets of C#. And once you realize that, .NET becomes much less interesting.

          A more accurate statement would be that IL supports object oriented / functional languages very well, and other things not so well.

          Most commonly used programming languages are OOP or functional in nature. I haven't used a non object oriented language since I got out of college. Perl, C, C++, Java, Basic, Fortran, Pascal, Python, Cobol, Smalltalk, various shell languages, etc. are all either functional or object oriented languages.

          I'd argue that implementing optimal support in a form that most languages translate well to (as opposed to a form that rarely used languages translate well to) was a good design decision.

The power of this allows you to write code to get what you want done, instead of spending a lot of time writing code to get a round peg to fit in a square hole. There are many other peripheral benefits that .Net brings to the table, and I personally think this isn't one of the big ones -- I think it's more of a "cool" factor than anything else -- but dismissing it outright because the design doesn't perfectly suit some obscure language that is generally used by AI researchers is kind of silly in my opinion.
          • Re:Bad Kool-Aid. (Score:3, Informative)

            by mrdlinux ( 132182 )
I don't know why the OP mentioned Lisp macros, because they certainly have nothing to do with the backend of a Lisp compiler. However, problems do still exist with compiling to IL, mostly centered on calling conventions and potential limitations imposed by the CLR.

For example, Common Lisp has advanced object-oriented features such as multiple-dispatch methods and method combination, and this would most likely have a large conceptual mismatch with the typical calling conventions used by languages in IL, he
        • Re:Bad Kool-Aid. (Score:3, Interesting)

          by gglaze ( 689567 )
          How can this comment be (+5 Interesting) while each of the subsequent comments are 1's and 2's? When I first read this, I almost jumped to write a diatribe, and found that the other replies had already covered most of the important points very nicely. This comment should be marked (-1 : Ill-Informed) or some such rating, as it is highly misleading to anyone on /. who might be trying to learn about .NET.

          As the others have mentioned here, the CLR is an abstraction of the assembly instruction layer, not the
      • You know, I used to think that the idea behind .NET supporting all these languages was interesting, but now that I've actually used it, I can't see what the big deal is; especially since the development environment to program for .NET is different for each of the different languages. Sure, you can hook a smattering of languages together, but who on Earth really does that? I have never worked for a company that says "write in whatever language you want to, we code to .NET so it doesn't matter!" What a nig
  • Conclusions (Score:3, Informative)

    by SimuAndy ( 549701 ) on Monday September 15, 2003 @07:55PM (#6970148) Homepage
    After a quick registration, I'm enlightened by the conclusions I drew from his article:
    • C# fares rather well.
    • ... but almost never as well as native C and C++ implementations done "smartly".
    • Java suffers from runtime-based overhead, with the advantage of a well-tested set of runtimes for various platforms. Unfortunately, due to the Java implementation of client-side runtime libraries, programming in Java is still "write once, test everywhere. curse. port everywhere" for real applications.
    And most importantly, as a game programmer, I'm going to pay attention to the relative performance the author received from the Intel-branded compiler, written to the metal of the Pentium IV with streaming SIMD instructions. -andy
  • by defile ( 1059 ) on Monday September 15, 2003 @07:58PM (#6970170) Homepage Journal

    OK lets get a few things settled.

    Given: two identical applications: A, written in a low-level language like machine assembly, C, or C++; B, written in a high-level language like Java, Python, VB, hgluahalguha.

    If the application is high in CPU burn (let's call it X), like oh, for (i = 0; i < 1000000000; i++), A will significantly outperform B.

    If the application is copying a very large file using basic read/write system calls and large enough buffers (let's call this Y), A and B will have very similar performance.

    If the application is printing hello world, they will have similar performance, although the startup costs for B may be higher, and A will probably finish executing faster.

    MOST applications written today are written to solve for Y. The code that most programmers write today is NOT the CPU intensive portion. Usually the CPU intensive portion is in the library called by the programmer: rendering a box, moving things around on a storage device, making something appear on a network.

    In these cases, a high or low level language makes no freaking difference on execution speed. However, your choice WILL make a huge difference on time to develop, maintainability, resultant bugginess, SECURITY, etc.

    OF COURSE THERE ARE EXCEPTIONS. Maybe you're writing a routine that needs to draw lines fast, or move bytes through a network filter at 100MB/sec, or you're compressing a file, whatever. In these cases you tend to write the performance critical code in a more low level language so you have greater control over the physical machine. Sometimes you write the entire application in the low level language.

    Many high level languages provide mechanisms for calling low-level code when it's necessary for performance. It's often pretty easy.

    The performance argument is a red herring.

    • by good-n-nappy ( 412814 ) on Monday September 15, 2003 @08:52PM (#6970752) Homepage
      Maybe you're writing a routine that needs to draw lines fast

      This is one of the specific things you can't really do with JNI and Java anymore. Java graphics is now really complicated. There's no way you'll be able to use low level OS rendering methods and have them integrate with Java2D and Swing.

      It's actually a real problem. You've got no recourse when the graphics primitive you need is too slow. You end up with a Java workaround or you switch to OpenGL or DirectX, which don't have good support for fonts and strokes and such.

      This is where C# may end up beating Java, on Windows at least. Eclipse SWT has promise too since you at least have the potential to implement your own graphics code.
  • by rmdyer ( 267137 ) on Monday September 15, 2003 @07:59PM (#6970179)
    ...my life. It's been mostly C/C++ but also a good amount of assembler and VB. Maybe someone here can answer one of the questions that keeps popping up when I write anything. My question is, why do we always end up creating libraries/classes that contain other code we will never use? What I would like is a compile environment where each function or object that I use is individually addressable, without having to pull in other "stuff" I don't need in my specific app. Is that so hard? Why doesn't the OS manage code better than pulling in a whole library? If I use only strcat() for example, why do I need to load in the entire C string library?

    The problem gets even worse with C++ and objects. Huge numbers of member functions and public variables that will never be used. Microsoft's .NET and Sun's Java take the cake by making you load an entire "run-time" engine. This consists of vast numbers of "ready-to-go" objects that make your simple "Hello World!" app into an 11 Meg program. Java can't even share the "run-time" between apps very well!

    Is there a program out there that can tell the efficiency of the operating system environment, apps and OS, by how many functions "aren't" getting used in a normal day by a user? I'm going to go out on a limb here and suggest that most RAM isn't being utilized by apps.

    What I would like is an extremely efficient programming environment that compiles my six line x++ program down to a few hundred bytes total...that's in-memory while running. I want to use my RAM for data and number crunching, not unusable code.

    • > If I use only strcat() for example, why do I need to load in the entire C string library?

      Because strcat can call another string library function, which can call another, etc...

      Code duplication is bad, remember?
      • ...be smart enough to "know" which sub-functions each function you are using needs to have loaded. If strcat() only uses 3 sub-functions, then only those should be loaded for it. And, if those functions are already loaded "somewhere in memory", the OS shouldn't load them again, and again, and again, as so often happens these days.

        I'm saying that the compiler/OS should be smart enough to "itemize" your entire application in terms of absolute functions/variables needed at compile and run-time.

        Where is that
        • Shared libraries (Score:4, Insightful)

          by n0nsensical ( 633430 ) on Monday September 15, 2003 @08:27PM (#6970416)
          And, if those functions are already loaded "somewhere in memory", the OS shouldn't load them again, and again, and again, as so often happens these days.

          If the functions are in a shared library, they shouldn't be. Each library (including, for example, the C runtime) is loaded once into memory and every process uses the same code. If you modify that code in memory, Windows makes an individual copy of the page for the modifying process. I couldn't tell you exactly what the overhead is used for, but the OS isn't loading 20 copies of strcpy or whatever as long as each executable is dynamically linked.
        • There are enough answers here, but I can't resist jumping in and clarifying.

          Most modern compilers in conjunction with OSes already basically do this. GCC and MSVC and the like will all link in only the functions that they deem referenced by your program and its dependencies, provided you statically link the library.

          Dynamic link or shared libraries work differently. The OS "loads" them once and uses that one copy for all programs that require that library. The linker that builds the DLL has no way of k
    • by MobyDisk ( 75490 ) on Monday September 15, 2003 @08:27PM (#6970414) Homepage
      Linkers already do this. At this point, it is ubiquitous functionality. The linker has a list of functions that are referenced, and functions that are not referenced. Functions and libraries not referenced are discarded. I know for MSVC, this is done by using /opt:ref which is part of the default RELEASE builds. This all works for OO code like C++ as well since the methods boil down to functions as far as the linker is concerned.

      There are some special cases:
      1) You mentioned strcat() which can be inlined by many compilers. In this case, you trade speed for code bloat - the library isn't really used at all here. In MSVC this is an "intrinsic" function.

      2) Dynamic libraries may be treated differently. It is more difficult to try to partially load them. I'm not sure how Linux handles this. Windows allows for a DLL to contain multiple code and data segments, which can be loaded individually if needed.
    • by Spinality ( 214521 ) on Monday September 15, 2003 @08:33PM (#6970510) Homepage
      You're right. But with ever-increasing CPU horsepower, memory bandwidth, memory size, etc., there simply is no incentive for optimizing the things you're talking about. Moore's 'Law' has attenuated the advancement of software technology, by eliminating the need for efficient software.

      There are plenty of dinosaurian programmers like myself who remember hacking away on small machines. Save a byte by referencing a constant stored as an opcode in the instruction stream? Excellent! SLR R1,R1 is 60ns faster than SL R1,R1? Excellent! Decrease the size of the PDP-8 bootstrap loader by one word? Excellent! (Flipping those toggle switches took a lot of time.)

      In 'the old days' hardware was expensive and programmers were cheap. Hard problems were solved by incredible brainpower from great engineers. As a result, by the early 70's software had made huge advances over the punchcard/paper-tape era, and we had learned a lot about how to build quality systems.

      But there was a magic moment when the curves crossed between the cost of programming time versus the cost of hardware. Suddenly, it became easier to solve a problem by adding iron rather than by thinking harder. And so we slid backwards down the slope.

      As far as I'm concerned, hardly anything significant has happened in software architecture or praxis for a few decades. Sure, we have a bunch of fancy new widgets, and the common man's programming paradigm has changed. And the Open Source movement finally has given substance and a PHB-understandable framework to what Unix, LISP, Smalltalk, and other hacker communities used to do behind the scenes.

      But most of today's 'new' language, compiler, data management, operating system, and other computer science paradigms had their fundamental elements invented back in the 60's and 70's. We're finally RE-discovering many great concepts of the past. But it seems we've totally forgotten the importance of efficient, defect-free code, and the methods that might be used to create it.

      This is not to say that only great systems were built in 'the good old days' -- those days weren't that good, and there was plenty of crap. But the really bright folks figured out how to do things RIGHT; and yet they wound up being ignored, because 'doing it right' became less important than 'letting Bruce the part-time lifeguard do it over the weekend in Visual Basic.'

      So while I totally agree with your rant, and I've made a hundred similar rants of my own, the fact is we probably won't see fundamental improvements in software platforms until subatomic physics finally provides a wall against which hardware advances can bounce. As long as the answer to every performance/capacity complaint is "wait six months" there's no incentive to invest the man-centuries it will take to revamp our software environments. We probably need to build intelligent software that can optimize itself, because WE NEVER WILL.

      </rant>
      • But with ever-increasing CPU horsepower, memory bandwidth, memory size, etc., there simply is no incentive for optimizing the things you're talking about.

        Not entirely true. We've got more memory and CPU power, but the storage bandwidth has not kept pace. Every significant performance problem I've tackled in the past few years has had to do with cache misses and/or page faults. I've worked on shrink wrap code where the customers have decent though not high-end machines. Load times can be 30 seconds of

    • by cookd ( 72933 ) <douglascook@NOSpaM.juno.com> on Monday September 15, 2003 @10:31PM (#6971593) Journal
      Very complicated question. Ughh, there are answers at many levels, and they are all different. But here goes.

      Any decent linker nowadays is "smart." This means that it already does what you are asking for -- it knows how to figure out exactly what the dependencies are, and bring in only the symbols (a symbol in this context is a chunk of code or data) that are (directly or indirectly) referenced by your code. Even though you link against all of the C Runtime, or all of the string library, the linker realizes that you are only using strcat. For this example, we'll assume that strcat uses strlen and strcpy. So your call to strcat pulls in strcat, strcpy, and strlen, but nothing else. So what you mention actually is already happening. (Unless you turn off the "smart" linking, as is common for debugging purposes.)

      However, there are some additional factors at work. The first is the C Runtime (CRT). ANSI C has some very specific requirements about how the environment is to be set up before main() is called, and how the system is to be cleaned up after main() exits. C also has specs about how to prepare the system for unexpected termination and signal handling. Setup and cleanup reference a bunch of additional symbols, so you end up with much more than just main(), strcat(), strcpy(), and strlen() -- you also have atexit(), exit() and etc. There is usually a process by which you can get rid of this and start directly in main with no setup code, but then you can't use any of the CRT-supplied functions (since the CRT isn't initialized) -- you have to set up your process yourself, handle signals yourself, and are limited to calling OS functions directly (no nice wrappers like fopen, printf or such).

      Then there is the issue of linking. Static or dynamic? Static linking means that all of the symbols you reference, directly or indirectly, are compiled into your binary. Dynamic linking means that all of the symbols are converted to references to external binaries, and when your binary loads, the external binaries will also load and you will use the symbol as defined in the external binary. Static linking means everything you need (and nothing you don't) is right there, compiled into your binary, so you'll never load anything you don't need. On the other hand, every program that is statically linked has its own copy of the linked-in routines, which can be wasteful of disk space and memory. With dynamic linking, the entire external binary has to be available, even if you only need one symbol from it. On the other hand, there only needs to be one copy of the binary on disk, no matter how many times it is used. And most of the time, the operating system can arrange things so that only one copy of the binary's code is loaded into memory, no matter how many processes are using it. This can save a lot of memory. For most systems, it turns out to be much more efficient to load a multi-megabyte dynamic link library into every process rather than statically link just the 200k that you actually need from that library.

      Finally, there is the OS involvement. The OS has to do a certain amount of setup for any process, no matter how trivial that process is. It has to allocate a stack, map the process into memory, set up virtual memory tables for it, etc. On a modern OS, in order for it to provide the services we expect, it has to set up a bunch of stuff just in case the process decides to make use of it. It is the price we pay for having a lot of power available to us.

      So for an example, I wrote up two test programs. I'm a Windows guy, so everything was done using Visual C++ 7.1. The first test was just an empty main(). Compiled and linked statically, it takes 24k on disk. That is basically just the CRT startup and shutdown code and the signal handlers (plus error message strings, etc.). It also links (dynamically) to kernel32.dll, ntdll.dll, and the OS itself. It allocates 568k of user memory/136k VM, 7k of kernel memory, and holds 14 kernel objects (thread handle, process han
  • What is "good"? (Score:5, Insightful)

    by ljavelin ( 41345 ) on Monday September 15, 2003 @08:05PM (#6970225)
    I don't know if anyone has done any formal study on the complexity of development tools over time - but the fact is that programming tools are getting higher-level over time.

    When I started out in this business, a language like C was a high level programming language. It did a lot for the programmer, especially compared to assembler and FORTRAN.

    Everything we did every day was to save memory and CPU cycles. Can we squeeze a date field into 8 bits? You bet we'd try! And we did! Heck, we could ignore weekends and holidays. Phew!

    At the same time, databases were hierarchical. The databases were very close to the machine, so they were darn fast. As long as you didn't do any unexpected queries (like "sort by first name"), everything was blazing fast and tight.

    Then came the higher level systems. Ouch, they sucked! We were the very first customer to run DB2 in production (quiz: do you know which one?). Anyhow, it sucked rocks compared to the hierarchical databases - they were just optimized for speed! Why would anyone ever want a relational database?

    But over years we came to see the light. With faster and faster machines, the number of cycles was less important. With bigger memory and disk, storage was less important. And it was butt-easy to use these tools. Easier and MUCH more maintainable.

    So yeah, Java and C# are going to use more memory and more cycles than plain old C (if using the languages as expected). But for most tasks, that isn't the whole story.

    The whole story is that Java and C# result in less expensive programs. And those programs should run fast enough. Yeah, not in EVERY case. But in most cases.

    Performance comparisons be damned.

    • Re:What is "good"? (Score:3, Interesting)

      by hey! ( 33014 )
      First of all, I've been at this professionally for over 20 years. C has always been considered to be a "low level" language -- even compared to something like Fortran. Fortran, in the versions most of us learned back when it was more or less mandatory to learn it, was not "low level"; it was just primitive, which is not the same thing at all.

      The big change over the years isn't just that machines have gotten faster: systems have become more complex. Typical application programs are more complex t
  • by nissin ( 706707 ) on Monday September 15, 2003 @08:06PM (#6970228)
    I've programmed extensively in C and Java, with some C++. I took a typical anti-Microsoft stance early on and refused to even look at C#. I was finally convinced to try it, and I must say that it has some nice features.

    I recommend that any programmers out there try using it before discounting it. It might be especially interesting for those C++ programmers out there who don't like Java for one reason or another.

    • C, C++, and Java (Score:3, Informative)

      by SHEENmaster ( 581283 )
      share portability in common. I can write an app in C, and run it on my server, laptop, palmtop, ancient SunServer, or even a Windows machine. The same goes for Java and C++.

      If I use C#, I'm effectively locked into .NET. Mono is a good start, but not enough to make my code reliably portable.

      C# has all the speed of Java with all the portability of X86 assembly linked to Windows libraries. Microsoft's BS patents will help ensure that the portability problems aren't corrected.

      The real purpose behi
  • by Prof.Phreak ( 584152 ) on Monday September 15, 2003 @08:22PM (#6970355) Homepage
    No matter what you say, C is *always* faster. You *cannot* write a loop in C#, and claim that it will run faster than a comparable loop in C. For one, the interpreter for Java or C# is very likely written in C itself.

    Does the speed matter? Does anyone care? No. Well, not for a vast majority of applications. Most of the time the I/O speed is the bottle-neck, not the "code" speed. So if you write a word processor, or some database app in C#, or C++, or C, or Java, more often than not, their 'apparent speed' will be comparable to each other.

    A better question to ask is: Would you do numerical analysis, or number crunching/factoring algorithms in Java or C#? Would you even do them in C? (or would you go for Fortran or distributed Haskell?)
      No matter what you say, C is *always* faster. You *cannot* write a loop in C#, and claim that it will run faster than a comparable loop in C.

      Actually, in practice, no it's not and yes you can.

      What you're ignoring is the "late optimisation" effect of virtual-machine and intermediate language execution models. The installer or run-time can take intermediate code and convert it to native code, just as a C compiler would, but with full knowledge of the environment in which that code is going to run. That a

      • Actually things like this (re-optimizing compilers) have been done for binary programs. Please note that this is not install-time optimization (as is the case for the C# compile on install, or something like Gentoo); this is run-time optimization, which produces faster code. I know of experiments on Alphas (based on what FX!32 did, which is an interpretive emulator + translator + run-time optimizer -- might be where more JITs try to go, and may be considered a very early and very very advanced JIT compiler) and in product
        • I agree completely: a lot of it is down to the maturity of optimisation technology for the language in question. This is part of the reason that some high level languages -- OCaml, for example -- can generate code of a comparable speed to low-level beasts like C and C++: with a simpler underlying model, it's easier to perform good optimisations. It's also the reason C is still faster than C++ for some things, even though theoretically, C++ should never be slower.

          I was including dynamic optimisation (durin

  • by dido ( 9125 ) <dido@imperiuUUUm.ph minus threevowels> on Monday September 15, 2003 @08:32PM (#6970497)

    Too bad Limbo [vitanuova.com] is too deeply tied to Inferno [vitanuova.com]. It's unfortunate. From looking at how it's been designed it seems that Dennis Ritchie still hasn't lost his touch as an exceptional language designer. The Dis virtual machine that goes with it is supposedly designed to make a JIT simpler and more efficient, and well, according to Vita Nuova it gets something like 50% to 75% of the performance of compiled C/C++ code, which, if the article and Vita Nuova's benchmarks are accurate, totally blows C# and Java out of the water performance-wise. Now if only Vita Nuova would care to make it half as platform-neutral as Java... But hey, who cares, I'm already trying to do that [sourceforge.net]. :)

  • by supton ( 90168 ) on Monday September 15, 2003 @08:37PM (#6970566) Homepage

    Bruce Eckel sums it up best:
    "Programmer cycles are expensive, CPU cycles are cheap, and I believe that we should no longer pay for the latter with the former. " from a post [artima.com] by Bruce Eckel [mindview.net] on artima.com [artima.com].

    Perhaps people should stop obsessively benchmarking platform VMs, and start benchmarking coding productivity and teamwork, perhaps in Python [python.org], with the performance bits in C. For a real-world example, Zope [zope.org] does exactly that: 95% of the code in Zope ends up being done in Python - only the real performance-intensive stuff need be in C... and the stuff done in Python is easy to read, modify, reuse, and tweak (thus, better productivity both for developers that use Zope as an app-server platform and for developers who work on Zope's core).

  • by Sean Clifford ( 322444 ) on Monday September 15, 2003 @08:43PM (#6970638) Journal
    Well, there's always vi a la vim win32 port [vim.org]. :)

    I do a lot of ASP3.0/SQL2k and some utility development on Win32, taking a stab at .NET. It would be nice to move over to Mono [go-mono.com].

    Anyway, I've done a bit of poking around and ran across SharpDevelop - AKA #develop [icsharpcode.net]. It's open source a la GPL and looks a lot like Visual Studio; it compiles C# and VB.NET, has C# => VB.NET code conversion, does projects or files, and has syntax highlighting for the whole MS shebang. It's at 0.97 - this build was released Friday 9/12/2K3, officially in beta, and you can get the binaries here [icsharpcode.net], go snag the source here [icsharpcode.net], and get the MS .NET 1.1 SDK here [dotnetgerman.com].

    To those folks who hiss and moan about the whole GNOME/.NET/Mono thing, take a gander at the rationale [go-mono.com] before playing jump to conclusions [myfavmovies.com] (mp3).

    SharpWT - AKA #WT [sharpwt.net] is a .NET port of Java SWT on both Windows/.NET and Linux/Mono platforms. So...you can develop your .NET apps to run on both Win32 and Linux with pretty much the same GUI. Neat, eh?

    Anyway, intrepid Windows Developer, if you can pry yourself away from the MSDN Library for a few minutes, you might find there's something to this Mono business.

  • by occamboy ( 583175 ) on Monday September 15, 2003 @08:45PM (#6970663)
    I've been coding for a million years: 6502 machine code all the way up to a recent foray into C#, and almost everything in between. Here's my take, for what it's worth. And, it's your chance to mod me down for pontificating!

    Delphi is the best all-around language ever for producing Windows apps. The Delphi programmer has control over everything if they want it, but they don't need to muck around with nasty details unless they need to. It encourages clean coding. Performance is superb. And the IDE is excellent. The Delphi package (language plus IDE) has been the path of least resistance to getting an app done for the past six or seven years.

    Prior to Delphi, the way to go was VB with C DLLs. Do the UI in VB, do the internals as C DLLs. VB was great for abstracting nasty stuff, but it often overabstracted, and performance was ungood. Writing companion DLLs in C boosted flexibility and performance.

    Generally speaking, C is basically not much beyond portable assembly language. If a reasonable alternative is available, and it usually is, using C for anything beyond embedded systems or super resource-critical applications is probably not a good idea, as the code tends to be dangerous and obfuscated. And keeping track of pointers is just nutty.

    C++ is a nightmare. In theory it's object oriented, but all of the code that I've seen is a total mess, more like C using the C++ libraries. Ugh.

    Java is almost good: it is pretty safe, and encourages good habits. But I find it clumsy - like it was designed by academics rather than practical developers. Lack of enumerated types, for example, is insane, and encourages unsafe programming. It's also a pig, which doesn't matter so much for a lot of things these days - we usually have lots of CPU cycles to spare. For doing GUI apps, Swing is a weird joke - a pig's pig - which should be forgotten immediately. And the Java IDEs have only recently become usable for command-line stuff, but they suck as bad as Swing for GUI development.

    Now C#. Having toyed with C# for an app recently, I'm quite impressed. It's sort of like the best of Java with some of Delphi's goodness added. It's ALMOST as good as Delphi - which makes sense, since the same fellow (Anders Hejlsberg) was key in developing both of these languages. And .NET, to which C# is bolted, is pretty good as well. And the IDE is very nice. So, all-in-all, I think that C# is a good choice for Windows development. The web forms stuff seems interesting, but I have a feeling that using it would be something that I'd end up regretting - there's bound to be nasty gotchas that won't show up until late in the project.
    • How about Python? (Score:3, Insightful)

      by MikeBabcock ( 65886 )
      I know the article is about C#, but every time I read something about C# I think to myself "I can do that in Python". Python is cross-platform, interpreted, object-oriented, fast, easy to extend with C or C++ and even more exciting for some people, usable transparently with Java as Jython.

      I've found that almost all the programs I've written I could've written smaller and better and more obvious (and therefore easier to debug and maintain) with Python. None of those that I have rewritten show any noticeab
  • by cartman ( 18204 ) on Tuesday September 16, 2003 @03:12AM (#6973042)

    I've done significant work in benchmarking Java programs versus equivalent C/C++ programs. I've found that garbage collection is the sole reason that Java is slower than C++; having a "virtual machine" imposes no performance penalty whatsoever. This is unsurprising, when you think about it; a JIT compiler simply moves the latter phases of compilation ("byte code->assembly") to runtime.

    Contary to what many people think, even most traditional C compilers work by compiling source to byte code, optimizing the byte code, then compiling byte code to assembly. There's no reason to expect that the resultant assembly will be any slower just because the final phase is done at run time by a JIT.

    Programs compiled using IBM's Java compiler routinely outperform equivalent programs compiled using "gcc -O2", if garbage collection is not used in the Java program being compiled.

  • by Ninja Programmer ( 145252 ) on Tuesday September 16, 2003 @04:59AM (#6973398) Homepage
    First of all -- what's the deal with this whole "WARMUPS" thing? This is just the most explicit way possible of training the JIT mechanisms without measuring its overhead. That might be fine if you believe that the overhead asymptotically costs nothing, however, I don't know what evidence there is of this. The test should use other mechanisms other than this explicit mechanism to allow the language itself to demonstrate that the overhead is of low cost.

    The way this test is set up, the JIT people could spend hours or days optimizing the code without it showing up in the tests. This is the wrong approach and will do nothing other than to encourage the JIT developers to cheat in a way such as this just to try to win these benchmarks.

    Ok as to the specific tests:

    1. Float-to-integer conversion on x86 is notoriously slow and CPU micro-architecturally dependent. It also depends on your rounding standard -- the C standard dictates a rounding mode that forces the x86s into their slowest mode. However, using the new "Prescott New Instructions", Intel has found a way around this issue that should eventually show up in the Intel C/C++ compiler.

    This does not demonstrate anything about a language other than to ask the question of whether or not the overhead outside of the float-to-int conversion rises to the level of overshadowing the slowness of the conversion itself.

    (That said, obviously Intel's compiler knows something here that the other guys don't -- notice how it demolishes the competition.)

    2. Integer-to-string conversion is just a question of the quality of the library implementation. A naive implementation will just use divides in the inner loop, instead of one of the numerous "constant divide" tricks. String-to-integer conversion uses multiplies, and its performance is likewise determined chiefly by implementation quality.

    3. The Pi calculation via iteration has two integer->floating point conversions and a divide in the inner loop. Again, this will make it limited to CPU speed, not language speed.

    4. The calculation of Pi via recursion is still dominated by the integer divide. It will be CPU-limited, not language-limited.

    5. The Sieve of Eratosthenes is a fair test. However, if SLOTS is initialized to millions, and the comparable C implementation uses true bits instead of integers, then I think the C implementation should beat the pants off of C#, Java, or anything else.

    6. The string concatenation test, of course, is going to severely ding C for its pathetic string library implementation (strcat has an implicit strlen calculation in it, making it dramatically slower than it needs to be). Using something like bstrlib [sf.net] would negate the advantage of C#, Java, or any other language.

    7. The string comparison with switch is a fair test, and gives each language the opportunity to use whatever high level "insight" that the compiler is capable of delivering on. It should be noted that a sufficiently powerful C compiler should be capable of killing its competition on this test, however, I don't believe any C compiler currently in existence is capable of demonstrating this for this case. I.e., this *could* be a legitimate case to demonstrate that C# or Java's high level abstractions give it an advantage over where the state of the art is in C compilers today.

    8. Of course, the tokenization test is another serious ding on the rather pathetic implementation of the C library. None of strtok, strspn, etc. is up to the task of doing serious high-performance string tokenization. If you use an alternative library (such as bstrlib [sf.net] or even just PCRE [pcre.org]) you would find that C would be impossible to beat.

    -----

    Ok, while the results here are interesting, I don't think there were enough tests here to truly test the language, especially in more real world (and less laboratory-like) conditions. Please refer to The Great Win32 Computer Language Shootout [dada.perl.it] for a more serious set of tests.
  • by ClubStew ( 113954 ) on Tuesday September 16, 2003 @07:40AM (#6974085) Homepage

    When are people going to learn that all languages targeting the Common Language Runtime (CLR, only part of the .NET Framework) compile to the same thing: Intermediate Language (IL)? C# is not being tested here. The C# compiler optionally optimizes and compiles source code into IL, which is later JIT'd and executed. Because of this, it's the CLR and the base class library (BCL) that are being tested.

    This could just as easily have been done with VB.NET (not that I condone anything resembling VB) or J#. MC++ is a different monster because it can (but doesn't have to) use native code in mixed mode and is thus not verifiable.

    The only differences, besides language syntax, are the specific language compiler and the optimizations it makes. The C# compiler does -- as far as I've tested -- a better job of optimizing. The VB.NET compiler, for instance, automatically adds an extra local variable to support the syntax FunctionName = ReturnValue whether or not you use it!

    So, let's get the facts straight. C# is not being tested here. The .NET CLR and BCL are.
