New Intermediate Language Proposed 443
WillOutPower writes "Sun is inviting Cray (of supercomputer fame) and IBM (needs no introduction...) to join and create a new intermediate run-time language for high-performance computing. After Java's bytecode, Java Grande, and Microsoft's IL language for the Common Language Runtime, it seems a natural progression. I wonder if the format will be in XML? Does this mean ubiquitous grid computing? Maybe now I won't have to write my neural network in C for performance :-)"
I bet the format turns out to be... (Score:2, Funny)
Re:I bet the format turns out to be... (Score:2)
Didn't we do this once before? (Score:4, Interesting)
write an interpreter on your target hardware that
would run the pascal p-code. It was supposed to
solve all sorts of problems. Except it was slow.
Nobody would write anything for it, I guess
because they didn't like Pascal, or UCSD didn't
fire anybody's imagination with the product.
I don't see why we need to go through this again.
If you need performance write it in assembler or
use nicely optimized C. If you don't then an
interpreted scripting language will usually
suffice. What's the benefit to yet another
layer of abstraction?
Re:Didn't we do this once before? (Score:2, Interesting)
Re:Didn't we do this once before? (Score:2, Informative)
Re:Didn't we do this once before? (Score:5, Informative)
A more typical usage would be to write anything that needs better performance or that needs access to non-.net libraries in C++ (since it can be compiled to machine code before distributing) and then use that component in other languages that are easier for putting a GUI together. Again, it's always been possible to do stuff like this, but
With the only penalty being "hello world" is 7 meg (Score:2)
Re:Didn't we do this once before? (Score:5, Insightful)
Re:Didn't we do this once before? (Score:3, Informative)
Re:Didn't we do this once before? (Score:4, Insightful)
I think what he's saying is that the syntax isn't the only thing that defines a language. A language's type system probably plays a more important part in defining how the language works.
With .Net, it may seem like you have a lot of interoperating languages, but they're all basically the same language with different superficial characteristics. VB developers complain [devhood.com] about how VB.Net is totally different from previous versions of Visual Basic. It's because they gutted its internals and implanted C#. I wouldn't be able to tell the difference because I see similar syntax, but someone who really knows the language will detect a different core.
That's not to say that different type systems cannot be emulated. Nice [sf.net] is a language with Java-like syntax but with a much better type system (among other things), and it still runs on an ordinary JVM. However, any interoperability will have to be at the level of the lowest common denominator. If you want to call Nice code from Java, your interface ends up having to give up some power.
You really can't even share libraries between truly different languages. The STL just doesn't fit into the Java/C#-style type systems (though generics are a step towards accommodating the STL). Perl libraries are also distinct. Imagine dealing with a Haskell-style lazy list in your C# code. It just won't feel right.
Re:Didn't we do this once before? (Score:5, Insightful)
I really did a lot of programming in UCSD Pascal, and for a long time UCSD p-code was the most widespread operating system/virtual machine.
If you need performance write it in assembler or
use nicely optimized C.
Assembler loses all higher-level abstractions, like inheritance, interface implementation, class relationships (relations, aggregations and compositions), and thread synchronization. The same is true for C, which in addition cannot express higher-level concepts at the source level. You might as well use assembler instead of C.
How do you optimize assembler? The operating system, the (nonexistent but hypothetical) VM, the loader, the processor: none of them can optimize "assembler". I mean: in Java byte code I have all the higher-level abstractions of the system inspectable via reflection etc. In assembler I have nothing.
New bytecodes, able to express more higher-level information, e.g. parallelization, could even address this problem: suppose you have a CPU server, suppose code migrates to your server, suppose you want to trust that code, and suppose the "owner" of the code does not want to trust you.
Or, consider this: you want byte code as a mobile agent, similar to the scenario above, but it should be allowed to replicate over a grid only under certain restrictions.
You want to optimize every replica at the VM where it is finally executed, to make optimal use of resources at that point. How do you do that in "assembler"?
Modern byte codes will likely be even closer to the constructs of the high-level languages than today's byte code is: resource allocation, object creation, class loading, higher-level concepts like delegation, parallelism, synchronization (probably on multiple mutexes), serialization, distributed (pervasive) computing, probably OODB support built in, probably a lightweight EJB-like execution environment, probably a four-level hierarchy of VM, meta-container, container and executed code.
Well, I could continue for a day with improvements
What's the benefit to yet another
layer of abstraction?
The benefit is to optimize on that layer of abstraction and then to project/generate/assemble the optimization down onto the machine layer (or the next lower layer).
angel'o'sphere
Re:Didn't we do this once before? (Score:4, Insightful)
Unless you really need to use every cycle, you're better off writing in a high level language and then recoding the critical portions (as identified through thorough profiling) in assembly language. (I speak as one who needed to use every cycle when I was a games programmer in the 80s. I've often thought of doing an all-assembly, no OS required app today, just to see how ludicrously fast it would run.)
You gain extensive experience with the processor and platform to which you are writing, and you work bloody hard. It also depends on whether you are optimising for space or speed. For example: writing a game for the Amiga, I was told by the customer that it had to run on machines with half a meg of RAM (the entry-level machine). I once spent a whole day seeking a way to save 12 bytes; the first part of the solution involved recoding a routine using a different algorithm. The rewrite saved me 8 of my 12 bytes, and executed in the same number of clock cycles (that was a crucial constraint). I then got the other four bytes by using the interrupt vector for an interrupt I'd disabled. As I was writing to the silicon (not even using any ROM routines), I could get away with this. I wonder what kind of warnings a modern C++ compiler would throw up for this kind of behaviour ;-)
Assembly language is fun, but life can be too short. I had to spend so much time fitting the above-mentioned game into half a meg that, by the time it came to market, 1Mb was the standard required by all games anyway.
If you design your code well you have plenty. Even when you inline code to save the overhead of call/return, you will be aware of the functional purpose of those 50 instructions considered as a single entity. The same discipline required to write well-constructed code is needed for assembler. It's similar to using an old version of BASIC, with only GOTO and GOSUB for transferring control; although it allows the sloppy thinker to produce spaghetti code, a good coder will adhere to the same abstractions as they would use in a higher level language.
I'll stop rambling about the past and go and write myself a Forth system now :-)
(P.S. p-code was extremely cool. When I first got acquainted with Java, it was the first thing I thought of. Plus ca change, plus ca ne change pas...)
UCSD Pascal (Score:4, Insightful)
Article text (Score:3, Informative)
Sun's goal is to apply its expertise in Java to defining an architecture-independent, low-level software standard - like Java bytecodes - that a language could present to any computer's run-time environment. Sun wants the so-called Portable Intermediate Language and Run-Time Environment to become an open industry standard.
The low-level software would have some support for existing computer languages. But users would gain maximum benefit when they generated the low-level code based on the new technical computing language Sun has asked IBM and Cray to help define.
Whether IBM and Cray will agree to collaborate on the effort is unclear. Both companies have their own software plans that include developing new languages and operating systems as part of their competing work on the High Productivity Computing Systems (HPCS) project under the Defense Advanced Research Projects Agency (Darpa).
"We think languages are one area where the three of us should cooperate, not compete," said Jim Mitchell, who took on leadership of Sun's HPCS effort in August.
Last week Sun proposed to IBM's HPCS researchers they pool separate efforts on such a software language, an idea Sun said Darpa officials back. Sun also plans to invite Cray into the effort. Representatives from IBM and Cray were not available at press time.
The language could be used not just for the petascale systems in the project, but for a broader class of scientific and technical computers.
"Java has made it easy to program using a small number of threads. But in this [technical computing] world you have to handle thousands or hundreds of thousands of threads. We need the right language constructs to do that," Mitchell said.
What's the point? (Score:2, Insightful)
Re:What's the point? (Score:3, Interesting)
Re:What's the point? (Score:5, Interesting)
So, one of the ideas behind C# was to make an intermediate language (MS-Java-byte-code, if you will) which could be quickly compiled for the CPU in question. Stick a system call environment and garbage collector around it and you have [roughly] what C# is. One of the nice things about Java was that it was for no specific machine... it was very very simple at the instruction level, but making native code from that can be a pain.
Now, from the looks of the posted article, some folks now want an intermediate language that can represent concepts like instruction vectorization and maybe SMP (hyper-threading) and perhaps some other more complicated constructs that Java's machine code just doesn't talk about.
The end result is that you would have very fast machine code for the number-crunching loops in the code, plus portability. The compile time would be fairly quick, and the optimization for the local CPU would be "smart" and fast if you marked up which instructions were vectorizable.
Why C# falls short, I can't say. I've only looked at the Java machine, never at how C# represents a program.
Hope this is helpful!
Re:What's the point? (Score:2, Insightful)
So, one of the ideas behind C# was to make an intermediate language (MS-Java-byte-code, if you will) which could be quickly compiled for the CPU in question. Stick a system call environment and garbage collector around it and you have [roughly] what C# is.
Another Java programmer, who has almost no experience with .NET, yet thinks he has enough "understanding" of .NET to put it into a nutshell.
However, you're not as bad as most people.
Why C# falls short, I can't say.
Could it be that there is less th
Re:What's the point? (Score:3, Insightful)
Anyways, why are there a bunch of Java programmers, ignorant of
Same reason you get C/C++/Perl programmers slanderi
Re:What's the point? (Score:2, Insightful)
OTOH, I'm dubious of Java for similar reasons. But Sun is in less of a position to be abusive.
Re:What's the point? (Score:3, Insightful)
Because it has anti-trust all over it and would hurt Microsoft more than help it.. The same thing applies to
That may be so that antitrust would come into play, but I really doubt that is the reason Microsoft hasn't charged app developers licensing fees for Win32.
The real value of an operating system is the applications that run on it. Microsoft wants everybo
Re:What's the point? (Score:2)
I will say that I'm a little annoyed that you chose to fly off the handle like you did. So, rather than get into a shouting match with you, here is a paper [msdnaa.net] for you and the folks here to check out on the JVM and CLR. Note that they are both largely stack machines, but the CLR from MS made some design improvements. Please also note (if you are familia
Re:What's the point? (Score:3, Informative)
Can you provide any evidence for the quoted 300% speed boost? I code Java for a living, and ignoring the JVM startup hit (how often do you start and stop the sort of app you'd write in Java anyway?), it's anything but slow.
Re:What's the point? (Score:3, Insightful)
It's also a memory hog!
Kiddin. Whatever I code in C. I let the compiler sort out what platform it runs on. [hint: I write portable C code so it doesn't matter]
Tom
Re:What's the point? (Score:4, Informative)
*cough* *cough*
Bullshit [datadino.com]
Bullshit [jgoodies.com]
Bullshit [smartcvs.com]
Bullshit [jboss.org]
Bullshit [sf.net]
Bullshit [thinkfree.com]
Bullshit [netbeans.org]
Please take your bullshit trolling elsewhere. There are those of us with work to do.
Re:What's the point? (Score:3, Interesting)
Yep, that's me
As to a custom solution in VB, I'll bet it didn't have pie charts, graphs, "top 100" listings, and other great features that help find tho
Re:What's the point? (Score:5, Insightful)
The other benefits of using an IL are manifold. New languages can be implemented without having to write a compiler for each platform. New architectures can be supported without having to write compilers for each language.
Re:What's the point? (Score:5, Interesting)
All good compilers already use well-designed intermediate languages. A general intermediate language that aims to be equally suitable for many high level languages will most likely be inferior to the best intermediate language for a particular high level language.
The other benefits of using an IL are manifold. New languages can be implemented without having to write a compiler for each platform.
Great. Just what we need - another of those braindead technological "advances" like human-readable data interchange formats that makes life easier for a few developers (simpler, cheaper compiler development) and harder for millions of users (worse performance). Frankly, the only advantage for the rest of us I can think of would be the higher probability of the resulting tools being mostly bug-free.
No, this is about Sun control (Score:2)
Re:What's the point? (Score:5, Insightful)
If a language were sufficiently high-level that you could describe to the compiler that you were implementing a recursive function (e.g. a recursive sort), the compiler should then be able to perform fold/unfold optimisation and convert the code into a more efficient iterative (tail-call) form. Fans of Haskell and similar languages might recognise this. Some C compilers will convert recursion to iteration where possible, but only in simple cases.
The fact is that today, even as C has reached maturity and as high level as it is, there are still some optimisations that are impossible because of subtleties of the language. For example, multiple pointers may point to the same memory, but depending on how the pointers are assigned, the compiler has no idea that this is the case, and has to follow the code in a literal fashion.
My personal view is that languages like Java still have a lot to offer. I would like to see a lot more investment in the compiler to perform better optimisations, and would also like to see a compile-on-install system for Java like C#'s; if I run an application it would at least be nice if the compiled parts were cached somewhere. This I believe could yield good performance gains. It's interesting that Sun's Server HotSpot VM actually performs more optimisation when compiling a class than the Client VM; because of the increase in time taken to load and compile a class, the Client VM omits some optimisation techniques in favour of speedier loading. I guess this decision is to make GUIs more responsive and reduce app load times; compile-at-install would remove this constraint. We should be going to higher-level languages, not lower, and concentrate on getting the compiler correct.
Re:What's the point? (Score:5, Insightful)
Furthermore, getting from the high-level language to the intermediate language is cross-platform, which means that any optimizations done at this level are then available to all of the code generators for different platforms; this code is reused across back-ends. It also means that you can support multiple front-ends with the same back-end, and make your C++ and Java automatically compatible by virtue of sharing an intermediate language, and they also both benefit from the same architecture-specific back-end.
There's no reason that having an intermediate language means that you'll stop compiling at that level and use an interpreter for the intermediate language to run the program. In fact, gcc always compiles its intermediate language into machine code, and it can compile Java bytecode into machine code as well. Modern JVMs compile the bytecode into native machine code when the tradeoff seems to be favorable, and they can do optimizations at this point that a C compiler can't do (such as inlining the function that a function pointer usually points to).
An intermediate language essentially pushes more of the skill into the optimizing compiler, because the same optimizing compiler can be used for more tasks. Also, if the compiler is used at runtime, it can optimize based on profiling the actual workload on the actual hardware. This is especially important if, for example, IBM decides to distribute a single set of binaries which should run optimally on all of their hardware; you run the optimizer with the best possible information.
Re:What's the point? (Score:5, Interesting)
Because that doesn't give you the best performance. Machine code represents an exact processor implementation. Tradeoffs have to be made with backwards compatibility (eg Redhat is compiled for Pentium), expected cache sizes (optimising size vs performance), processor specifics (Itanium has 4 instructions per bundle, Sparc has a delay slot after branches) etc.
While it is true that you could compile for an exact machine, it is a horrible way of trying to ship stuff to other people, and it does require recompilation if anything changes. (The former is why Redhat pretty much picks base Pentium - if they didn't they would need 5 or so variants of each package just in the Intel/AMD space. Granted they do supply a few variants of some packages, but not everything, and Gentoo people can confirm that doing everything does help).
Using IL lets the system optimise for the exact system you are running at the point of invocation. It can even make choices not available at compile time. For example if memory is under pressure it can optimise for space rather than performance.
It also allows for way more aggressive optimisation based on what the program actually does. While whole program optimisation is becoming available now (generally implemented by considering all source as one unit at link time), that still doesn't address libraries. At runtime bits of the standard libraries (eg UI, networking) can be more optimally integrated the running program.
Machine code also holds back improvements. For example they could have made an x86 processor with double the number of registers years ago. If programs were using IL, a small change in the OS kernel and suddenly everything is running faster.
Needless to say, using IL aggressively is not new. To see it taken to the logical conclusion, look into the AS/400 (or whatever letter of the alphabet IBM calls it this week). I highly recommend Inside the AS/400 [amazon.com] by Frank Soltis.
Re:What's the point? (Score:4, Insightful)
What's wrong with making a good compiler that writes directly to machine code?
a) it won't run on my phone, because no one will port the compiler
b) it won't run on my new internet-enabled microwave, because no one wants to port the compiler
c) it won't run on my car's electronics, as no one wants to port the compiler
d) it won't run on the next ESA space probe, the Venus Express, because no one wants to port the compiler
and so on.
What's wrong with having one ultimate VM designed, freeing all software developers from all porting issues once and for all?
What's wrong with having one ultimate VM designed, freeing all hardware developers from being held back by compatibility issues?
Come on, code geeks. Make a step into the future!! A 4 GHz Pentium is about 16 million times faster than my Apple ][ which I used 15 years ago. Why should I be burdened with coding habits over 20 years old? I don't want to write 10 to 100 lines of assembler a day, because it expresses far less in terms of instructions than 10 to 100 lines of C. And I don't want to write 10 to 100 lines of C a day, because it expresses far less than 10 to 100 lines of C++.
We need more different higher-level languages and more VMs, as it is easier to make a new VM than a new processor. We do not need more compilers for the same old languages just because someone built a new processor somewhere.
angel'o'sphere
Re:What's the point? (Score:2)
What's wrong with encouraging good clean portable code, using good clean portable libraries?
Compilers 101 (Score:5, Informative)
Imagine N high-level languages and M target platforms. A naive approach would wind up creating NxM separate compilers.
Intermediate languages (ILs) allow you to write N "front-ends" that compile the N high-level languages to the IL, and M "back-ends" that compile from the IL to the M target platforms. So rather than needing NxM compilers, you only need N+M.
Even more significant is the optimizer. Front-ends and back-ends are relatively straightforward, but optimizers are very hard to write well. In the naive approach, you need NxM optimizers. With an IL, you only need one. The front-end translates to IL; the optimizer transforms IL to better IL, and the back-end translates to native code.
In summary, to answer one of your questions:
Every optimizing compiler uses an IL anyway. These companies, I presume, are simply agreeing to use the same IL across their products (though I'm only guessing because the article is slashdotted).
XML ? (Score:5, Funny)
Re:XML ? (Score:2, Insightful)
Re:XML ? (Score:2, Interesting)
Re: (Score:3, Informative)
Re:XML ? (Score:2)
Re:XML ? (Score:4, Funny)
Re:XML ? (Score:2, Insightful)
Haven't used Mozilla recently, have you?
Re:XML ? (Score:2)
Tom
Re:XML ? (Score:5, Funny)
Then I saw Posted by michael and everything was better.
Please clarify (Score:2, Funny)
Re:XML ? (Score:2)
The only way to get performance out of XML would be to have IBM, Sun or Cray making XML cards, like how graphics cards speed up 3D rendering to the point of almost real-time ray tracing. Load your DTD and XSLT into your XPU (XML Processor Unit) and fire away.
I know it's just some weird delusion of the poster - the article mentions nothing about it. How hard would it be to create your own XML processing hardware? Would it be faster and what would it do?
Re:XML ? (Score:2)
but the biggest question is... (Score:5, Funny)
will sun survive until then?
GCC (Score:5, Insightful)
who cares about who Sun invites? (Score:3, Interesting)
I suspect the next intermediate language for high-performance numerical computing is either going to be the CLR, some extension of the CLR, or something entirely different, developed in academia.
Buzzword compliance (Score:5, Interesting)
What I hope is that Sun takes a good, long look at the only intermediate assembly that has been designed with language neutrality in mind, Parrot [perl.com]. While this article is over 2 years old, it's a decent starting point. Parrot has already been used to implement rudimentary versions of Perl 5, Perl 6, Python, Java, Scheme and a number of other languages. The proof of concept is done, and Sun could start with a wonderfully advanced next-generation byte code language if they can avoid dismissing Parrot as "a Perl thing" with their usual disdain for things "not of Sun".... IBM on the other hand is usually more open to good ideas.
Re:Buzzword compliance (Score:4, Insightful)
"Parrot is strongly related to Perl 6... Perl 6 plans to separate the design of the compiler and the interpreter. This is why we've come up with a subproject, which we've called Parrot that has a certain, limited amount of independence from Perl 6." [emphasis added]
That certainly doesn't sound like it's been designed with language neutrality in mind. For what it's worth, MS's IL was designed with at least four languages in mind - VB.NET, C#, managed C++ and J# - and a couple of dozen others have been or are being ported to it, including Fortran, Cobol, Haskell, and (iirc) even perl.
As you say, the article is over two years old, so maybe they've changed their goals since then - but that article at least gives a very strong impression that Parrot is tied intimately in with Perl.
Re:Buzzword compliance (Score:5, Informative)
Re:Buzzword compliance (Score:4, Insightful)
Right now the only widely used intermediate language that comes close to being suitable for high-performance numerical computing is Microsoft's CLR (JVM actually still has better implementations, but it lacks important primitives like value classes).
My God, they've reinvented CADOL (Score:3, Funny)
*SIGH* (Score:3, Insightful)
Soon we'll need a 10 GHz CPU just to be able to boot tomorrow's OS in less than 5 minutes.
Cheer up. (Score:4, Interesting)
1) Nobody forces you to write in Java for PIII. Write hand-optimised asm sniplets for PIII and include them in bigger Java or C app for time-critical pieces. You get real PIII performance.
2) The software quality drops, but slower than CPU speed rises. That means your Java app for PIII will still work -slightly- faster than hand-coded ASM for 486.
3) Development cost. You can spend a week to write a really fast small piece of code in ASM. Or you can spend that week on writing quite a big, though slow app.
Most visible in games. Things like Morrowind, where crossing the map without stopping takes an hour or more, and exploring all the corners is months of play, were plainly impossible when it all required hand-coding. Now it takes a developer less time to create a mountain in the game than it takes a player to climb it. Of course the player needs a better CPU to be able to display the mountain, which wasn't hand-optimised, just created in an artificial high-level language defining the game world, but if you're going to enjoy the experience - why not?
Re:Cheer up. (Score:2)
You forgot to mention that Morrowind was buggy, slow, crashed often, and had horrible loading times.
(was a great game anyways)
Re:*SIGH* (Score:2)
ANDF? (Score:3, Informative)
That format could be extended into a vendor-neutral format for interpretation, just-in-time compilation, and batch compilation.
Need more info... (Score:4, Insightful)
The article is very light on details.
Huh?
So, how many languages are being proposed here? A new "low-level" one, plus a higher-level "technical computing language" designed to make the most of the lower-level one? Just what's so special about this new low-level language that requires a specific new language to get the "maximum benefit" out of it? I don't have to write in Java to be able to compile to the JVM bytecode. For that matter, I could write in Java and compile to some other assembly language.
New back-ends ("low-level languages," if I understand the article) are added to GCC all the time. We never needed to add a whole 'nother front-end just for them.
I suspect that the real situation is less weird, and the journalist got confused... or heck, who knows, maybe they're proposing half a dozen new languages. It's Sun, after all.
Odd. I wouldn't have thought you'd need to do that these days anyway.
Now what I would like... (Score:2)
Re:Now what I would like... (Score:2)
This is kind of nonsense. It cannot really be "assembly" (at least in the normal sense in which assembly is defined, where there is a 1-1 or close to 1-1 correspondence between instructions in the language and machine instructions) if it supports things like procedures, nested complex arithmetic expressions, and named variables, and no sane high-level language can be without those things. Re
FREE YOUR MIND! (Score:3, Interesting)
Ever thought about writing interpreter or VM in VHDL and implementing it on a FPGA board? That would be pretty similar.
>if it supports things like procedures
Stacks substituted for local variables, CALL, RET: what's the problem?
>nested complex arithmetic expressions,
Can be un-nested at compile time, not really going far from assembly. Just remember that each +, ( or % is a separate call. Can be RPN, why not? That's very close to ass
Reminds me of the old quote... (Score:5, Insightful)
Re:Reminds me of the old quote... (Score:2)
". . . except for too many layers of indirection."
-- addendum (possibly by a Java or C# programmer).
high-performance computing (Score:3, Interesting)
In my opinion, I would like a C language variant that lets me specify how many bits to use for a variable, because it would save a lot of time on memory bandwidth (cache space included), and it is very tedious to make a good implementation of that in assembly.
Re:high-performance computing (Score:2)
In C we already have types like uint8_t and int16_t. In addition to that you can use bit-fields for fields in your structs. What more would you ask for? Remember that often extra space is actually added between your variables, for performance reasons. Aligned access is faster than unaligned access.
Re:high-performance computing (Score:2)
You're joking, right? C has officially had that (bit-fields) since C89. And any time the number of bits is not either 1 or a multiple of 8, it's much, much slower than it would be otherwise, because despite any difference in memory/cache
What struck me was this (Score:2, Funny)
Is he on crack?
That's elementary, Watson. (Score:2)
1) Learn the new language well.
2) Find some sucker who will employ you, or other ???
3) Profit!!!
4) For money earned, employ programmers from India to write your neural network in assembly for performance.
Parrot assembly? (Score:3, Insightful)
Re:Parrot assembly? (Score:2)
Re:Parrot assembly? (Score:2)
what is new about this?? (Score:4, Informative)
I guess it is one more time around the (reinvention) wheel for sun.
Re:what is new about this?? (Score:2)
Re:what is new about this?? (Score:2)
The problem is with the high level, (Score:2)
APL, J and K are three languages that could be considered IL for high performance computing. But they're high enough to write code directly in them; And so far, it seems, implementations do a wonderful job withou
C-Bonics (Score:3, Funny)
Re:C-Bonics (Score:2)
Comment amused me,
It delivered mirth to me.
Mod the parent up!
-Me
If you've not implemented parallel code X-Arch (Score:5, Informative)
Maintaining high-performance code across CPU architectures is bad enough (and I know of some supercomputing centers which are continuing with technically inferior AMD64/Xeon clusters rather than switch to PPC970, precisely because they know they can't afford to re-optimize for that arch).
Factor in that today most numerically intensive code is still written in FORTRAN because competing languages simply can't be as easily optimized.
Now let's think about SMP: while POSIX threads are portable, the best performance probably requires different threading code depending on arch/unix variant. (And of course NPTL for Linux is still in CVS.)
Now let's think about massively parallel, where inter-cpu communication will be handled a bit differently on every platform.
So the payoffs to developing an efficient cross-platform language layer are pretty substantial. (Which does not imply that I expect IBM to jump on to Sun's bandwagon on this :-))
Grid (Score:2, Insightful)
There are several issues with regard to current programming techniques and grid computing for HPC. Some include:
too little, too late (Score:5, Interesting)
We already have such a runtime: it's called "CLR". The CLR is roughly like the JVM but with features required for high performance computing added (foremost, value classes).
Sun wants the so-called Portable Intermediate Language and Run-Time Environment to become an open industry standard.
I hope people won't fall for that again. Sun promised that Java would be an "open industry standard", but they withdrew from two open standards institutions and then turned Java over to a privately run consortium, with specifications only available under restrictive licenses.
Sun's goal is to apply its expertise in Java to defining an architecture-independent, low-level software standard - like Java bytecodes - that a language could present to any computer's run-time environment.
Sun's "expertise" in this area is not a recommendation: the JVM has a number of serious design problems (e.g., conformant array types, arithmetic requirements, lack of multidimensional arrays) that attest to Sun's lack of expertise and competence in this area.
What this amounts to is Sun conceding that Java simply isn't suitable as a high-performance numerical platform and that it will never get fixed (another broken promise from Sun). But because the CLR actually has many of the features needed for a high-performance numerical platform, Sun is worried about its market share.
The question for potential users is: why wait until 2010 when the CLR is already here? And why trust Sun after they have disappointed the community so thoroughly, both in terms of broken promises on an open Java standard and in terms of technology?
Maybe we will be using a portable high-performance runtime other than the CLR by 2010, but I sure hope Sun will have nothing to do with it. (In fact, I think there is a good chance Sun won't even be around then anymore.)
Mod parent up (Score:2, Interesting)
Now, I'm all for having alternatives, but what's going to happen to Java? Will Java compile to IL? The Mono project and DotGNU project both have plans for compiling Java to IL, allowing
Re:too little, too late (Score:3, Informative)
But that is only true for comparable code. If you write C# code using value classes and do an idiomatic translation into Java, the Java code will run very slowly because you cannot express something like a value class efficiently in Java. It doesn't matter how good the Java JIT is, it just can't optimize things that the JVM doesn't let programm
This could have cool implications (Score:2, Interesting)
Now we are talking! I want my C# to compile to native code on Linux, Sun, and IBM mainframes. I want to take the Java programmers in my firm and have their code call my C# and vice versa.
SUN does not understand High Performance Computing (Score:2, Insightful)
SUN has not displayed an understanding of HPC. Adding Ope
XML? (Score:3, Funny)
About time. (Score:3, Interesting)
-- this is not a
Massive parallelism that doesn't suck is hard (Score:3, Insightful)
Over the last few decades, there have been many exotic parallel architectures. Dataflow machines, connection machines, vector machines, hypercubes, associative memory machines (remember LINDA?), perfect shuffle machines, random-interconnect machines, networked memory machines, and partially-shared-memory machines have all come and gone. Some have come and gone more than once. None has been successful enough to sell commercially in quantity. Very few of these machines have ever been purchased by any non-government entity.
There are two ends of the parallelism spectrum - the shared-memory symmetrical multiprocessor, where all memory is shared, and the networked cluster, where no memory is shared. Both are successful and widely used. Everything in between has been a flop.
Despite decades of failure, people keep coming up with new bad ways to hook CPUs together, and getting government agencies to fund them. It's more a pork program than a way to get real work done.
By the time one of these big weirdo machines is built, debugged, and programmed, it's outdated. A few years later, people are getting the same job done on desktops. Look at chess. In 1997, it took Deep Blue to beat Kasparov. Kasparov is now losing games to a desktop four-processor IA-32 machine.
Figuring out more effective ways to use clusters is far more cost effective than putting a National Supercomputer Center in some Congressman's district in Outer Nowhere. [arsc.edu] There's a whole chain of these tax-funded "National Supercomputer Centers". The "Alabama Supercomputer Center" [asc.edu] has ended up as an ISP for the public school system, hosting E-mail accounts and such. It's all pork.
JAVA vs. .NET (Score:3, Insightful)
The Mono/DotGNU projects are similarly unfathomable. It will be nice to have these tools available to run more bloated GUIs, but if one of these projects really wanted to differentiate itself, it should focus on a C#-to-native compiler using gcc's backend and let the other focus on a compiler to MSIL. I guarantee that project would become the 'winner'.
Just one more reason to Free Java (Score:3, Interesting)
----------
I would like to see GNU/Linux become a more powerful platform, and by that I mean a platform that provides the user with a pleasant experience. To provide a pleasant experience, a platform must give the user a choice; a choice of applications is a step in the right direction. However, GNU/Linux is not such a platform yet. If it were, it would have been embraced by the masses already, and it has not been. There are a few things a GNU/Linux system is lacking, and one of the more important missing components is a convenient tool that lets a novice create his or her own software for the platform: software that easily manipulates data imported from multiple sources and allows graphical interfaces to be built on that data. In the Microsoft world this functionality is provided by a ubiquitous tool, Visual Basic. In the Free Software world there are many tools that are extremely powerful, but none of them has the kind of momentum that Visual Basic enjoys on the Microsoft platform.
To answer the question "What can be the VB for Free Software?" we need to look at the kinds of problems this tool will have to solve. The problems solved by VB are of many kinds, but for the general public VB provides the bridge that closes the gap between a user and the multitude of small problems the user wants to solve. Of course it is possible to just create a VB IDE for Free Software platforms, but I believe there is a more interesting solution to this problem, and it is Java. Just like VB, Java runs in a virtual machine, so the user never has direct access to hardware resources; the JVM's abstraction layer provides a nice buffer between the user and the hardware, and at the same time Java behaves the same way on many other platforms, including Windows. Java has thousands of convenience libraries, and there is enough Free Software written in Java that could be integrated into an IDE. However, there is a big problem with the language itself: it is not Free.
Sun allows anyone to use Java for free, but nobody can modify the language itself except Sun. For Java to become for Free Software and GNU/Linux what VB became for Microsoft, Java has to be freed and put out under the GPL. There is probably good business sense in it for Sun Microsystems as well: their language suddenly becomes the language of choice for millions, and thousands will work on improving the language, the virtual machine, the compiler, etc. Sun would then be in the position Linus finds himself in: they become the gatekeepers of the vanilla Java tree, while Java branches and becomes much more widespread than it is right now. Sun can capitalize on that by providing more Java-based solutions and services.
Now, it is likely that Sun management will not agree to change Java's status; however, if there were an immediately profitable reason for them to do so, they just might turn around and start thinking about it. A profitable reason could be a large sum of cash available to them upon releasing Java under the GPL. Where could this money come from? It could be collected from FS and OS supporters, the developers and users who would like to see more momentum in the GNU/Linux movement toward a successful (widespread) desktop solution. I suppose no one will seriously object to having one more powerful tool in their Free Software toolbag. Java can be this tool, and it may be just the thing needed to tip the scales toward the quick appearance of a useful and popular GNU/Linux desktop.
Re:Next try? (Score:3, Funny)
Re:Next try? (Score:3, Funny)
Better yet, where is Java Venti?
If you stop and think about it, what could a Sun/Starbucks partnership entail? The Starbucks card working on the SunRay platform, taking your virtual login identity to every Starbucks location you frequent? Even better to realize that just about all Starbucks locations have WiFi hotspots. Oh the conspiracy!
Re:Next try? (Score:5, Informative)
It's probably because there's no Java user community [sun.com] or useful [gnu.org] implementations [gnu.org] out there [mindprod.com]. And it has virtually no practical application on the desktop [java.net] for that matter. Maybe because it doesn't do 3D [colorado.edu] or sound [jsresources.org]. Or is not so useful as far as scalable RDBMS abstraction [slashdot.org] or a real application server [jboss.org] for the enterprise. Maybe they need to move into the mobile [sun.com] market [sun.com]. What's really needed is a good Java IDE [eclipse.org] to get developers on board. Changes should be driven by the software community [jcp.org], and making the source open [sun.com] would help as well. Sun should also be making improvements [sun.com] in Java's next(?) version [sun.com].
You're right, I guess "we" should just cut our losses.
Re:Next try? (Score:5, Insightful)
Oh god (Score:3, Insightful)
Hey, I heard that Microsoft just released a new version of their OS and called it "Longhorn". Can I say, "OK, so now that WinXP is on the retreat they try to enter a new area?"
Personally, I would consider "Hm, Microsoft seems to be catching up to us. Let's make something better than current Java OR
Re:Just what we need... (Score:2)
Re:Just what we need... (Score:2)
Re:English as an intermediate language? (Score:4, Informative)
You need Forth - possibly the only language where you make up the language as you go along.
Example of the Forth definition for the "make everything explode because the time or energy has run out" routine in a game I wrote years ago:
: kill_everything ( - )
FWIW, "bursts" was a convenience word used to make it read better. Its definition was:
: bursts ( thing_to_burst - )
(The bits in brackets are stack-diagram comments. The argument "thing_to_burst" is actually the address of the data structure representing the animated entity.)
By judicious use of the English language in choosing your names, you could write what people thought was pseudocode, and it compiled and ran :-)