Java Programming

Rootbeer GPU Compiler Lets Almost Any Java Code Run On the GPU

An anonymous reader writes "Today the source code to the Rootbeer GPU Compiler was released as open source on github. This work allows for a developer to use almost any Java code on the GPU. It is free, open source and highly tested. Rootbeer is the most full featured translator to convert Java Bytecode to CUDA. It allows arbitrary graphs of objects to be serialized to the GPU and the GPU kernel to be written in Java." Rootbeer is the work of Syracuse University instructor Phil Pratt-Szeliga.
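
The programming model, going by the project's README, looks roughly like the sketch below. Treat it as illustrative only: the runtime package name and the exact launch method (run vs. runAll) have varied across Rootbeer versions.

    import java.util.ArrayList;
    import java.util.List;

    // Package name as in the 2012 README; it may differ in later releases.
    import edu.syr.pcpratts.rootbeer.runtime.Kernel;
    import edu.syr.pcpratts.rootbeer.runtime.Rootbeer;

    // Each Kernel instance becomes one unit of GPU work; the object's fields
    // (an arbitrary object graph) are serialized to GPU memory for the launch.
    public class ArrayMult implements Kernel {
        private final int[] values;
        private final int index;

        public ArrayMult(int[] values, int index) {
            this.values = values;
            this.index = index;
        }

        // This method body is what Rootbeer translates to CUDA.
        public void gpuMethod() {
            values[index] *= 11;
        }

        public static void main(String[] args) {
            int[] values = new int[1024];
            for (int i = 0; i < values.length; i++) {
                values[i] = i;
            }
            List<Kernel> jobs = new ArrayList<Kernel>();
            for (int i = 0; i < values.length; i++) {
                jobs.add(new ArrayMult(values, i));
            }
            Rootbeer rootbeer = new Rootbeer();
            rootbeer.runAll(jobs); // serialize, run on the GPU, deserialize results
            System.out.println(values[10]); // prints 110
        }
    }
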
  • Any code? (Score:5, Funny)

    by XxtraLarGe ( 551297 ) on Saturday August 11, 2012 @11:26PM (#40961759) Journal

    This work allows for a developer to use almost any Java code on the GPU.

    Except for the code my students write. :rolleyes:

    • by XxtraLarGe ( 551297 ) on Saturday August 11, 2012 @11:28PM (#40961765) Journal
      By the way, this comment is not an indictment of my teaching...
  • x264 (Score:4, Interesting)

    by girlintraining ( 1395911 ) on Saturday August 11, 2012 @11:31PM (#40961773)
    Maybe now the x264 developers will add GPU support and we'll finally have a solution for video encoding that uses the processor and GPUs in parallel. Here's to hoping... :\
    • It would be quite difficult for a C/assembly project to use a Java compiler.

      • Re: (Score:3, Interesting)

        It would be quite difficult for a C/assembly project to use a Java compiler.

        *shrug* They've been dragging their heels for years, claiming that there's no practical way to do it, that the quality is inferior, etc., etc. And now there's a way to generate bytecode that can be executed on the GPU and return predictable results. I'm sure someone who knows assembler can figure out a simple FIFO or IPC / shared memory arrangement... At this point, they can't hide behind technical hurdles: It's clear GPUs can be used, they just don't want to because they're stuck up.

        • Re:x264 (Score:5, Informative)

          by Anonymous Coward on Sunday August 12, 2012 @03:01AM (#40962645)

          Programs don't magically become faster when they run on a GPU. Linear algebra algorithms, for example, work really well on CPUs, but ported 1:1 to a GPU (as this would do) their performance is abysmal. Usually you need an algorithm specially designed to exploit the massive parallelism of a GPU without getting stuck synchronizing or doing other kinds of communication. GPUs really like doing the same operation on independent data, which is basically what happens when rendering an image; they are not designed for operations that need information from all the other data, or from neighbouring data in a grid. Just because something works on a GPU does not mean it's efficient, so the performance could be much worse using a GPU.
          Also, balancing CPU and GPU usage is even harder (maybe impossible?), since you cannot predict what kind of system your software will run on. These days the CPU usually just feeds the GPU with data (on the current Tesla cards only one CPU core can drive a GPU; this rises to 32 in the Kepler generation) and does whatever processing can't be done on the GPU, but the two do not share workloads.

          I don't know how the H.264 codec is structured or whether encoding can gain performance at all. But I really doubt x264 can simply be ported, since it relies heavily on CPU-specific features (SSE etc.), which are quite different from the much higher-level bytecode that Java produces.
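
          To make the point about "the same operation on independent data" concrete, here is a hypothetical plain-Java sketch (the class and method names are illustrative, not from Rootbeer or x264):

              // Hypothetical illustration: why some loops map to GPUs and others don't.
              public class ParallelismDemo {
                  // GPU-friendly: every iteration touches independent data, so each
                  // element could be handled by its own GPU thread with no communication.
                  static void scale(float[] a, float k) {
                      for (int i = 0; i < a.length; i++) {
                          a[i] *= k;
                      }
                  }

                  // GPU-hostile as written: iteration i depends on iteration i - 1, so a
                  // 1:1 port serializes the whole GPU. A fast GPU version needs a different
                  // algorithm (e.g. a parallel scan), not a straight translation.
                  static void prefixSum(float[] a) {
                      for (int i = 1; i < a.length; i++) {
                          a[i] += a[i - 1];
                      }
                  }
              }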

          • Also, balancing CPU and GPU usage is even harder (maybe impossible?), since you cannot predict

            Computational code is quite regular, so it is usually possible to predict its performance, especially on GPUs which are quite simple architectures (no cache or reordering).

        • Re: (Score:3, Interesting)

          by Anonymous Coward

          If you've been following x264 development at all, you'd know that all the coders who attempted to add complete GPU acceleration to x264 either failed or gave up because it was too difficult. Their motto has always been 'patches welcome', but the result can't be slower than x264 on a CPU or degrade quality significantly. Thus far that hasn't really been an issue, since off-hand I can't think of anybody who was actually able to provide a patch that compiled and produced valid output, let alone had acceptable performance.

        • by Bert64 ( 520050 )

          And how fast does the bytecode run?
          GPUs are specialist processors: running the kind of things they are designed for, they are very fast. Making them run things they are not suited to can result in terrible performance.

    • But x264 is coded in C and assembly, not Java.
  • Why?

    • Re:Super (Score:4, Insightful)

      by fm6 ( 162816 ) on Sunday August 12, 2012 @12:26AM (#40962065) Homepage Journal

      Why is any hack done? Because it can be.

    • Why?

      I don't know how close the current state of the project is to this scenario, but I'd want it because my netbook slows to a crawl every time Eclipse recompiles a module I'm working on, and I don't want to turn off auto-compile because I'd lose some other features that come along with it. And my netbook sits beside the couch, always ready to go when I'm lounging with whatever recreational project I happen to have sitting in Dropbox. Nothing better in a quiet house on a Saturday morning.

    • Warming tea would be one obvious application. I don't know if they've got to the point where it can roast coffee yet; I suspect they'd have to make Eclipse run there in its entirety for that.

  • Blah its CUDA (Score:5, Interesting)

    by zixxt ( 1547061 ) on Sunday August 12, 2012 @12:19AM (#40962015)

    It's CUDA only, meaning it does not support any open standards. Call me when I can target OpenCL.

  • There's hope yet! (Score:5, Insightful)

    by oakgrove ( 845019 ) on Sunday August 12, 2012 @12:19AM (#40962017)

    Damn, Slashdot, I almost had a freaking heart attack when I moused over (you don't think I actually clicked, do you? New here?) the link in the summary and it went to the actual GitHub page rather than some crappy 10-page blog post based on something pulled off the Reuters wire last week.

    I'm impressed!

  • Very nice. (Score:4, Informative)

    by Animats ( 122034 ) on Sunday August 12, 2012 @12:39AM (#40962127) Homepage

    Here, from GitHub, is the short presentation. [github.com] This is very impressive. It finds parallelism automatically, at least for simple cases. Over 50x performance improvement on matrix multiply and naive Fourier transform (not FFT), both of which have very simple inner loops. Not clear how it does on less obvious problems.
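
    For a sense of why matrix multiply is the showcase: each output cell can be computed independently, so the work maps naturally onto one kernel object per cell. A hypothetical sketch in the Kernel style shown above (not the presentation's actual code; the runtime package is assumed from the 2012 README):

        import edu.syr.pcpratts.rootbeer.runtime.Kernel;

        // Hypothetical: one Kernel per output cell of C = A * B. Every instance
        // runs the same simple inner loop on independent data -- exactly the
        // access pattern GPUs reward.
        public class MatMulCell implements Kernel {
            private final float[] a, b, c; // row-major n x n matrices
            private final int n, row, col;

            public MatMulCell(float[] a, float[] b, float[] c, int n, int row, int col) {
                this.a = a; this.b = b; this.c = c;
                this.n = n; this.row = row; this.col = col;
            }

            public void gpuMethod() {
                float sum = 0f;
                for (int k = 0; k < n; k++) {
                    sum += a[row * n + k] * b[k * n + col];
                }
                c[row * n + col] = sum;
            }
        }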

  • Legal Problems. (Score:4, Insightful)

    by softcoder ( 252233 ) on Sunday August 12, 2012 @02:01AM (#40962435)

    Considering the approach Oracle is taking of trying to copyright the Java APIs and charge license fees just for using them (see Oracle v. Google), I can't see any sane person developing on a non-Oracle-provided Java platform. If they can sue Google over Dalvik, they can certainly sue whoever deploys Rootbeer if they feel like it.
    pgmer6809

    • by DrXym ( 126579 )
      They can sue. Doesn't mean they're going to win.
      • Re: (Score:2, Insightful)

        by Anonymous Coward

        But because US law works so well, it does mean you'll be ruined long before the trial is over.

      • Re:Legal Problems. (Score:5, Insightful)

        by ultranova ( 717540 ) on Sunday August 12, 2012 @06:24AM (#40963205)

        They can sue. Doesn't mean they're going to win.

        It doesn't really matter whose side the law eventually takes if getting there takes long enough to bankrupt you.

    • Did you miss the news? Oracle lost that battle on all claims.

      Of course they can still sue.

    • IANAL, but since Oracle has now designated OpenJDK (GPLed) as the official reference platform in place of their own, I don't think there are many (any?) legal problems to worry about. Can a company that contributes heavily to a GPLed language turn around and sue somebody for using that code? I didn't think so... but again, IANAL.

    • by devent ( 1627873 )

      Nice FUD, troll. Google was sued because they took the Java APIs without getting a license from Oracle (that was before Java was released under the GPL with OpenJDK). At least that's what Oracle was trying to convince the judge and the jury of.

      Long story short: Oracle thought Google needed the TCK license (or some other license) to use the Java APIs, and tried to convince the court that APIs (or, more specifically, the structure, sequence and organization of APIs) are copyrightable. We now all know the outcome.

    • by caseih ( 160668 )

      Nice try with the FUD. Rootbeer is simply a bytecode translator; the Java API isn't involved here at all. And the place where the translated bytecode executes is the graphics card, not some JRE-derived virtual machine and runtime API implementation. There is no runtime involved at all: the Java code is simply math that gets translated directly into GPU instructions.

      This is no different from Google's Web Toolkit, which translates Java (not necessarily bytecode, though) to JavaScript, and which has never been a legal problem.

  • by Kartu ( 1490911 ) on Sunday August 12, 2012 @03:06AM (#40962665)
    Shouldn't the title mention "nVidia GPU", or did I miss that AMD now supports CUDA on top of OpenCL?
    • At the moment, few people care about ATI cards for anything other than video games.
      This may change in the next few years as better ATI graphics cards for HPC are introduced.
