Nvidia Releases Hardware-Accelerated Film Renderer 251
snowtigger writes "The day we'll be doing movie rendering in hardware has come: Nvidia today released Gelato, a hardware rendering solution for movie production with some advanced rendering features: displacement, motion blur, raytracing, flexible shading and lighting, a C++ interface for plugins and integration, plus lots of other goodies used in television and movie production. It will be nice to see how this will compete against the software rendering solutions used today. And it runs under Linux too, so we might be seeing more Linux rendering clusters in the future =)" Gelato is proprietary (and pricey), which makes me wonder: is there any Free software capable of exploiting the general computing power of modern video cards?
Fab for machinima (Score:5, Informative)
Eat some gelato (Score:2, Informative)
Re:This would be more useful (Score:5, Informative)
Re:3D graphics cards aren't relevant (Score:4, Informative)
I guess this is what BMRT has turned into... (Score:1, Informative)
If anyone's wondering, a couple of the latest releases of BMRT (Blue Moon Rendering Tools) before NVIDIA pulled the plug on them are available here [unsw.edu.au]
General computing on graphics hardware (Score:3, Informative)
This might also be interesting: GPGPU [gpgpu.org] /Arvid
Linux software (Score:5, Informative)
Take a look at the Jahshaka [jahshaka.com] project. It is a real-time video editing suite whose designers have been working with, and supposedly getting support from, Nvidia, so I would imagine they have (or certainly will have) access to these video cards. I can't imagine them not taking advantage of this technology.
The other nice thing is that, if memory serves me correctly, this program is being designed to work on Windows, Linux and OS X, so good news all around.
Using GPU for signal processing (Score:5, Informative)
A quick Googling revealed the following:
- BrookGPU [stanford.edu]
- GPGPU [gpgpu.org]
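To give a feel for what projects like BrookGPU actually express: the core idea is a "kernel" (a pure function) mapped elementwise over streams of data, which is exactly the data-parallel shape a GPU's fragment pipeline executes across many pixels at once. This is a plain-Python sketch of that model, not real BrookGPU code (real kernels are written in a C-like shader dialect and compiled to run on the card):

```python
# Sketch of the stream-programming model popularized by BrookGPU:
# a kernel is a side-effect-free function applied to corresponding
# elements of one or more input streams. On a GPU each "element"
# would be processed by a fragment unit in parallel; here we just
# loop, which shows the semantics without the hardware.

def run_kernel(kernel, *streams):
    """Apply `kernel` elementwise across the input streams."""
    return [kernel(*elems) for elems in zip(*streams)]

# Example kernel: saxpy (y = a*x + y), a classic streaming workload.
def saxpy(x, y, a=2.0):
    return a * x + y

xs = [1.0, 2.0, 3.0]
ys = [10.0, 20.0, 30.0]
print(run_kernel(saxpy, xs, ys))  # -> [12.0, 24.0, 36.0]
```

The point of the restricted model is that, because each output element depends only on its own inputs, the compiler is free to schedule every element in parallel on the graphics hardware.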
Re:This would be more useful (Score:4, Informative)
Maya, Houdini and XSI are all available for Linux, and they work well.
Nice to see some good out of BMRT/Exluna. (Score:5, Informative)
For those who don't remember, BMRT was a really cool RenderMan-based renderer that Pixar had some sort of love/hate relationship with. IIRC, they used it, yet they sued the company. In the end, NVIDIA bought them, though it wasn't clear why at the time.
Re:3D graphics cards aren't relevant (Score:2, Informative)
http://www.pinnaclesys.com/ProductPage_n.asp?Prod
for details
Absolutely (Score:2, Informative)
Check out www.jahshaka.com. It's an open source video compositing / FX package that leverages the 3D accelerator chip on your graphics card to do incredible things. This is one to watch, it's definitely going places.
You can download binaries for linux and windows (and MAC), and source tarballs are available for the savvy.
I know it's not strictly a "renderer", but it employs many of the functions of a renderer to create realtime effects and transitions.
Re:Nice to see some good out of BMRT/Exluna. (Score:5, Informative)
Interestingly, BMRT was free as in $$$ but not as in Free Software. This was one of the first software packages that made me recognize how big that distinction is. (A free-as-in-Free-Software program would probably have lived on, since people could have coded around some of the disputed intellectual property; a free-as-in-$$$ program could be killed with the carrot and stick of a buyout opportunity and the threat of a lawsuit.)
Re:This would be more useful (Score:2, Informative)
Free software ready indeed! (Score:5, Informative)
Re:Nice to see some good out of BMRT/Exluna. (Score:1, Informative)
Indeed. NVidia's FAQ for this group [nvidia.com] says "It is the evolution of NVIDIA's acquisition of Exluna in 2002."
round and round we go (Score:2, Informative)
Re:'pricey' - but worth it? (Score:3, Informative)
I think the point is not that it can render just like other engines, but that it can do so at a far greater speed (with a lot more flexibility and features than the PURE card). That would indeed be worth the money to all but the smallest studios - much faster feedback at full quality is an artist's dream, quite apart from the (more expensive) option of using it to accelerate your render farm.
What they don't really say anywhere is *how much* faster it is. There are many factors involved, but if you basically have 16 * 4 * 2 FP execution units running at 400 MHz in a highly parallel configuration, backed by 32 GB/s of bandwidth, there is quite a bit of potential there (~50 GFLOPS vs P4's ~2 GFLOPS?).
For a farm... If, say, it renders 5x faster than a given render machine, then that's 4 machines (and engine licences) you don't have to buy, which would easily cover the cost.
Re:Video Cards as Renderers (Score:2, Informative)
Seems to be Open now? (Score:4, Informative)
You're kidding, right? (Score:4, Informative)
PRMan is a fine product, but it has its limitations, as well as its price. There are numerous competitors, many of which use the same RenderMan interface but offer more speed and/or more features at a lower price (BMRT and Entropy were notable, and relevant, until Pixar squashed them with the threat of an expensive court case). Brazil, AIR, etc - these RIB-based renderers drop into the same place in the workflow.
Please explain to me why a dedicated rendering device from NVidia would be any better than your average UNIX or Linux machine?
Only if you explain why your average UNIX or Linux machine is better than a Commodore 64 or a PDA, which is also "in essence nothing but processor operations" etc :-) If you listed SPEED in there, you're on the right track.
A modern GPU has far more floating-point hardware than any general-purpose CPU, and it's all geared towards the process of rendering pixels. For certain tasks, one of those expensive dedicated rendering devices from nVidia could be better than FIFTY of your "average" UNIX or Linux machines! Is that enough of an advantage to consider?
Dang, I went and fed the troll, didn't I...
Re:Little value... (Score:4, Informative)
And, apparently, orders of magnitude faster.
Personally, I'd put that rather firmly into the advantage column, and for a number of reasons. You could either render your movie with a smaller farm (always a plus) or you could render even more complex scenes in the same time period--which is probably what most people would use this technology for. On the commentary track of Monsters Inc, the guys from Pixar note that despite having MUCH faster hardware (and a lot more of it), the average time to render a single frame of Monsters Inc was just as long as for a single frame of Toy Story. Why? Because the frames were FAR more complex.
I think this is a Good Thing(tm) at least for the people who have the imagination to use it.
But ATI's solution is free (Score:2, Informative)
Re:3D graphics cards aren't relevant (Score:3, Informative)
A couple of years ago I got a GeForce4 4800 and a Quadro4 900 XGL. I performed the required resistor mod and flashed the GeForce4 with the Quadro4's BIOS.
Sure, the GeForce4 got recognised as a Quadro4 900 XGL in the Windows display control panel, but when I ran benchmarks like SPECviewperf it was obvious the modded GeForce4 did NOT perform like a real Quadro4 900 XGL. Capabilities like the HW-accelerated clip planes did not seem to be present in the GeForce4, and this made a big difference to the scores I was getting.
Re:I like this... (Score:3, Informative)
Look at the AMD64 [devx.com] ("Opteron", etc.) CPU. Linux support is here, but native versions of Microsoft Windows have yet to be released.
GeForce 6800 (Score:2, Informative)