Nvidia Releases Hardware-Accelerated Film Renderer
snowtigger writes "The day we'll be doing movie rendering in hardware has come: Nvidia today released Gelato, a hardware rendering solution for movie production with some advanced rendering features: displacement, motion blur, raytracing, flexible shading and lighting, a C++ interface for plugins and integration, plus lots of other goodies used in television and movie production. It will be nice to see how this will compete against the software rendering solutions used today. And it runs under Linux too, so we might be seeing more Linux rendering clusters in the future =)" Gelato is proprietary (and pricey), which makes me wonder: is there any Free software capable of exploiting the general computing power of modern video cards?
3D graphics cards aren't relevant (Score:2, Interesting)
Rendering artefacts between cards? (Score:5, Interesting)
So they wrote an engine to do renderman->OpenGL and ran it across many boxes.
The problem was that rendering on different cards produced random artefacts - different colors, etc. - and they couldn't figure out why.
When working on a single box they got consistent results, but then they only had the power of one renderer.
Re:I like this... (Score:2, Interesting)
Re:3D graphics cards aren't relevant (Score:5, Interesting)
This film renderer is different. It uses the GPU and CPU together as powerful floating-point processors (not sure if Gelato does anything more than that).
Re:3D graphics cards aren't relevant (Score:5, Interesting)
Re:Rendering artefacts between cards? (Score:4, Interesting)
I have seen this problem in software renderers as well. The problem seemed to be that part of the render farm was running on different processors (some Intel, some AMD, and many different speeds and revs), and one of them supposedly had a little difficulty with high-precision floating-point math and computed the images with a greenish tone. Took over a week to figure this one out.
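The kind of cross-machine drift described above is easy to reproduce in miniature. The snippet below is a toy illustration (not the actual farm bug): at 32-bit precision, merely changing the order in which the same numbers are summed changes the result, because small addends get rounded away when they meet a large running total.

```python
# Toy illustration of order-dependent float32 rounding.
# Summing 1000 ones and then 1e8 keeps the ones; adding 1e8 first
# rounds every subsequent +1.0 away (the spacing between adjacent
# float32 values near 1e8 is 8.0).
import numpy as np

vals = np.concatenate([np.ones(1000, dtype=np.float32),
                       np.float32([1e8])])

def accumulate(xs):
    total = np.float32(0.0)
    for x in xs:
        total = np.float32(total + x)  # force float32 at every step
    return total

small_first = accumulate(vals)        # ones accumulate, then 1e8 is added
large_first = accumulate(vals[::-1])  # 1e8 first, each +1.0 is lost
print(small_first, large_first)       # 100001000.0 vs 100000000.0
```

Different compilers, CPUs, and SIMD paths reorder and round operations differently, which is why the "same" renderer can produce subtly different pixels on heterogeneous hardware.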
ExLuna, take 2 :) (Score:2, Interesting)
M4 open GL VJtool. (Score:3, Interesting)
GPU as 2nd processor (slightly offtopic) (Score:3, Interesting)
I expect that once it suddenly becomes clear that the GPU in a modern video card has serious processing power, someone will release a version of the SETI@Home [berkeley.edu] client which can use the rendering engine as a processor. Bearing in mind that most computers use their GPUs for a very small percentage of their logged-in life, I suspect there is real potential for using them for analysis in distributed computing projects.
Re:'pricey' (Score:3, Interesting)
No, you can get raytracing hardware for less than the software and a Quadro FX would cost you.
For example, there's the ART PURE P1800 card [artvps.com], which is a purpose-built raytracing accelerator. It's a mature product with an excellent feature set, speaks RenderMan, and has good integration with all the usual 3D packages. It's generally acknowledged as a very fast piece of kit with excellent image quality and plenty of quality/speed trade-off options. And if you've a deeper wallet, they do much bigger network-appliance versions.
Re:Rendering artefacts between cards? (Score:5, Interesting)
Ah, those were the days. We were on a deadline and rendered it over Christmas. After four hours the disks would be full and it would be time to transfer it to the DPS-PVR. I spent six days where I couldn't leave the room for more than four hours, sleep included. Was pretty wild!
VH1 viewers voted it second best CGI video of all time, behind Gabriel's Sledgehammer so I guess it was worth it!
Video Cards as Renderers (Score:4, Interesting)
Re:3D graphics cards aren't relevant (Score:2, Interesting)
- 32 bit per component precision in the pixel shader
- dynamic branches in the pixel and vertex shaders
- unlimited instruction count for pixel and vertex shader programs
These features are very useful: they make it possible to render frames on the GPU at full quality. Considering that GPUs have a HUGE amount of processing power, this will make rendering much faster.
For example, the above-mentioned new GPUs from NVidia have 32 FPUs in the pixel pipeline alone. Each of those FPUs can perform 4 FP operations at 32-bit precision per clock. That's 128 operations per clock, or 51.2 GFLOPS at a 400 MHz core clock, in the pixel shaders alone.
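The arithmetic above is easy to sanity-check. This is just a back-of-the-envelope sketch of the quoted figures; the 400 MHz core clock is the value implied by dividing 51.2 GFLOPS by 128 operations per clock.

```python
# Sanity check of the peak-throughput numbers quoted in the comment.
fpus = 32                  # FPUs in the pixel pipeline
ops_per_fpu_per_clock = 4  # 4-wide FP32 operations per clock
clock_hz = 400e6           # implied core clock (51.2e9 / 128)

ops_per_clock = fpus * ops_per_fpu_per_clock
gflops = ops_per_clock * clock_hz / 1e9
print(ops_per_clock, gflops)  # 128 ops/clock, 51.2 GFLOPS
```

This is peak theoretical throughput, of course; real shader workloads won't keep every FPU busy every clock.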
Heeeey.. that dino looks familiar... (Score:2, Interesting)
Re:Teh horror !!! (Score:3, Interesting)
I've toyed with shaders some and implemented a system for image processing on GPUs. Quite a lot of fun, really, though we didn't do any comparisons against the CPU to see how much faster it was. (That project isn't published anywhere, though.)
Re:BURN!!!!!! (Score:3, Interesting)
Could make a nice addition to GIMP (if there isn't one already).
Comment removed (Score:3, Interesting)
Re:I think it'll start happening a lot more (Score:3, Interesting)
How do you know ISVs are getting annoyed? Do you go to lunch with ISVs every other day?
Not only are you generalizing crudely; I think your point is actually not sound at all. You think Adobe cares about Microsoft dominating?
The much more plausible explanation is that they (nVidia) already had the drivers/software for the architecture on Linux (read the previous posts: they bought the card).
Yet another plausible explanation is that drivers are more difficult to implement on Windows (because of kernel-mode constraints).
And before people start flaming, I'll tell you right away:
I've written drivers for both. A seg fault in a Linux kernel driver will generate a message saying "oops, you segfaulted your kernel", whereas a seg fault, or even an invalid heap memory tag, in Windows will instantly halt the system with a blue screen. Don't even get me started about IRQLs in Windows - they are essential for proper SMP multithreading, but really are very difficult to work with.
Would be nice to have it running on OSX as well (Score:2, Interesting)
For the kind of work I do, a Nvidia Quadro FX 3000G would be best for driving large displays.
Re:But ATI's solution is free (Score:3, Interesting)
ASHLI is *not* a renderer. It isn't anywhere near doing what Gelato does. Gelato takes a scene file and gives you a picture. It does it very nicely, using motion blur, programmable shading, and all sorts of fun stuff like that. It was written by the ex-Exluna boys. (Larry Gritz, Matt Pharr, Craig Kolb -- three mofos who know their shizzle.)
ASHLI takes a RenderMan shader in RSL and gives you a compiled shader for OpenGL or DirectX. It's then up to you to write the whole renderer.
It's cool, but if you are seriously writing lots of RenderMan shaders, you could probably just as well write them in GL Slang for your in-house customised OpenGL renderer. ASHLI's utility is limited. Frankly, it is pretty much a complete non sequitur in a discussion about a renderer.