
ATI Releases Competition for NVIDIA's Cg

death00 writes "ATI has released a beta of RenderMonkey, their suite of open, extensible shader development tools. ATI showed these tools for the first time at Siggraph 2002. Should be interesting to see who wins the shader development race, NVIDIA's Cg, RenderMonkey or whatever 3Dlabs has on the go."
  • by Anonymous Coward
    All right, let's get all the name jokes out of the way now.

    Or maybe ATI did it on purpose. I mean, what would you rather program in, Cg or RenderMonkey... it's a no-brainer!

  • by javilon ( 99157 ) on Friday August 23, 2002 @10:55AM (#4126792) Homepage
    "Should be interesting to see who wins the shader development race, NVIDIA's Cg, RenderMonkey or whatever 3Dlabs has on the go."

    Or maybe nobody wins. Maybe three incompatible ways to do things will hurt developers.
    What they should be doing is reaching an agreement and putting it into OpenGL.
    • by Anonymous Coward
      Isn't Cg open source? Why couldn't ATI just add their modifications to Cg in such a way that at compile time, both Nvidia and ATI cards would get the proper code to do shading properly?

      Or is this yet another pissing contest?
      • Isn't Cg open source? Why couldn't ATI just add their modifications to Cg in such a way that at compile time, both Nvidia and ATI cards would get the proper code to do shading properly?


        According to nVidia's Cg FAQ they should be able to modify the Cg compiler to do this. Whether it would actually make sense to do so, or whether it requires some level of support for nVidia's programmable shader techniques in the first place, I don't know.
      • by yasth ( 203461 )
        Because Cg is written more or less directly to NVidia's methodology and silicon. So if you don't share that methodology and want to do things differently, you will be working against the design goals of the language. Which would you prefer: a hack that adds on some functionality, or a proper, well-designed language?
        • Don't forget that Cg and DirectX 9's shader language are one and the same. It's not just nVidia that designed the language.

          If you're going to write a shader for a DirectX game, you might as well run it through the Cg compiler & get a version for OpenGL 1.4 too. The beta 2 version nVidia released recently adds support for ARB_vertex_program as well as NV30-specific profiles.

          Better still, if you choose the run-time compile option, you can get your high-level shader compiled & optimised for whatever supported hardware or API version is present - which means NV30-specific support if it's there, or a fallback to ARB if it isn't.

          And if ATI play along with their own profile, you could automatically get compiled code that takes advantage of Radeon 9700-specific features too.
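
          A minimal sketch of what that run-time path looks like, using the Cg runtime's OpenGL interface; the file name and entry point are placeholders, the profile enum name is the one from the released Cg 1.x headers, and a GL context is assumed to already be current:

          #include <cstdio>
          #include <Cg/cg.h>
          #include <Cg/cgGL.h>

          // Compile one high-level shader at run time against whatever vertex
          // profile the installed driver advertises as best (an NV30 profile where
          // available, ARB_vertex_program otherwise). Assumes a current GL context.
          int main()
          {
              CGcontext ctx = cgCreateContext();

              CGprofile profile = cgGLGetLatestProfile(CG_GL_VERTEX);
              cgGLSetOptimalOptions(profile);

              // "diffuse.cg" and "main" are made-up names for this sketch.
              CGprogram prog = cgCreateProgramFromFile(ctx, CG_SOURCE, "diffuse.cg",
                                                       profile, "main", 0);
              if (!prog) {
                  std::fprintf(stderr, "Cg: %s\n", cgGetErrorString(cgGetError()));
                  return 1;
              }

              cgGLLoadProgram(prog);        // hand the compiled code to the driver
              cgGLEnableProfile(profile);   // then bind and draw as usual
              cgGLBindProgram(prog);

              std::printf("compiled for: %s\n", cgGetProfileString(profile));
              cgDestroyContext(ctx);
              return 0;
          }

          The same source could just as easily be pointed at a Direct3D profile; the point is that the profile choice happens when the game runs, not when the shader is written.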

          • Maybe the 9700, but the 8500's a whole different monkey. In order to squeeze the extra performance out of the pixel shader, they have a phase instruction which allows the shader to make a second pass and do dependent texture reads. Something like that'd be hard to compile code to use, I'd wager. At the very least, it'd require the developer know how to tweak Cg code to get it to compile nicely for the 8500, which would probably require they know the assembly-level stuff anyways.

            Remember that the 9000 is essentially a modified 8500. They aren't totally getting rid of this architecture yet; they want people to still support it as much as possible. Hence the desire to stick to assembly-level shader coding as long as possible.

            Overall, RenderMonkey looks like a nice, useful shader-creation environment for prototyping assembly-level shaders in. All they need to do is add support for OpenGL2's shader language, and Microsoft's, when they come out, and I'm sold.

            (And before anyone pipes up that that's Cg, Cg's a totally different product; it's a language, compiler, libraries, etc, but mostly a language spec. RenderMonkey's aiming to be a complete content management system. Cg's for programmers who write tools, RenderMonkey's a tool for your programmers and artists.)
    • Or maybe nobody wins. Maybe three incompatible ways to do things will hurt developers.
      What they should be doing is reaching an agreement and putting it into OpenGL.


      Developers are already doing it at a lower level by writing code directly to each card to get programmable shaders when they want them. Even if every major card manufacturer makes up their own method of doing things, there's a possibility that programming for each card becomes faster than it currently is.

      As far as OpenGL goes, both nVidia and ATI have been working for quite some time to get (their own) programmable shaders into OpenGL and Direct3D. RenderMonkey doesn't even support OpenGL yet, which isn't really surprising given ATI's history of OpenGL support in their drivers.
    • by j1mmy ( 43634 )
      The 3Dlabs proposal is actually for extending OpenGL with a C-like shader language, and it looks pretty clever.
    • Or maybe nobody wins. Maybe three incompatible ways to do things will hurt developers.

      You mean like:
      Mac vs Windows vs Linux?

      Or maybe:
      Gnome vs KDE vs Motif?

      Or how about:
      rpm vs apt-get vs pkg?

      • Or maybe nobody wins. Maybe three incompatible ways to do things will hurt developers.

        You mean like:

        Mac vs Windows vs Linux?

        Yep. The biggest hurdle for Linux and the Mac is Windows compatibility - hence SoftWindows/SoftPC, Wine, VMWare, Bochs, Win4Lin - not to mention cross-platform systems like Java, HTML etc, all designed to circumvent this problem.

        Remember what (almost) killed Unix in the first place? Fragmentation. Each Unix vendor adding their own little "features", making cross-platform support a little harder each time. If they'd been able to agree on a single "Unix" rather than each going their own way, we could all be using commodity Unix boxes interchangeably.

        Or maybe:

        Gnome vs KDE vs Motif?

        Yes, another case in point: this is a big PITA for developers - and, again, something the original Unix people avoided like the plague. You were supposed to have "a window manager" - it doesn't matter which, they all use the same protocol, and are all interchangeable. Now, we have a nightmare of rival libraries, doing the same thing in different ways.

        One of the greatest advantages Windows has is uniformity - for users and developers alike. With very few exceptions, a 7 year-old Windows 95 (Win32) application will run perfectly happily on Windows XP; even old DOS applications from 20 years ago will usually run, unless they access low-level things directly. Meanwhile, adding items to the Programs menu hasn't even remained consistent within Gnome, let alone between Gnome, KDE and others; as I type this, CodeWeavers are still trying to update Crossover to support Gnome 2's new approach.

        Or how about:

        rpm vs apt-get vs pkg?

        apt-get and rpm work very nicely as a team, so that's hardly a good example...

        Haven't you ever wondered why so much effort goes into standards - the IETF with RFCs, the Single Unix Spec, POSIX, W3C, Linux Standards Base...?

    • Well, all these shader languages are just a higher-level way of generating the assembly instructions for the shaders. The difference between Cg and RenderMonkey is just like the difference between C, C++, <insert language of choice>, etc. They're all just an easier way than coding for the metal. So multiple high-level shader languages will hurt us exactly as much as multiple high-level programming languages have.

      Just because people are proposing different, competing ways of doing things doesn't make them nefarious. Maybe they're just trying to hit different goals.
    • Not competing (Score:5, Insightful)

      by Namarrgon ( 105036 ) on Friday August 23, 2002 @12:03PM (#4127390) Homepage
      Except they're not really in competition with each other.

      - Cg is a high-level shading language (compatible with DirectX 9's HLSL) that will compile to both DirectX and OpenGL APIs, or to various sets of OpenGL extensions, even at runtime if desired.

      - Render Monkey is an IDE that supports various shading languages, including Microsoft's (and therefore Cg, at least when they add DX9 support). AFAIK, it's not that dissimilar to nVidia's own Effects Browser.

      - OpenGL 2.0 is a lot more than just a shading language, but in any case, it's still at the proposal stage. Cg may yet be adopted for the language, but it will likely end up being quite similar at least.

      So I see no reason why you couldn't write your shaders in Cg (sorry, DX9 HLSL) within the RenderMonkey environment, and then compile your results to OpenGL 2.0.

      nVidia have said they'll support whatever the ARB decides, but even if OpenGL 2.0 does use the 3DLabs language, there's no particular reason you couldn't use a Cg profile to output an OpenGL 2.0 HL shader, or an ARB_vertex_program shader, or something even lower-level.

      Hell, why not just write your shaders in RenderMan & then use RenderMonkey or Effects Browser or whatever to import the RIBs & compile that down. Ever wondered why nVidia bought Exluna? There's a lot of RenderMan expertise right there...
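
      To make the "same source, several back ends" point concrete, here is a rough sketch using the Cg runtime to compile one trivial vertex shader to both an ARB_vertex_program profile and a DirectX vs_1_1 profile. The shader itself is a made-up minimal example, not anything from nVidia's kit, and the profile enum names are those from the Cg 1.x headers:

      #include <cstdio>
      #include <Cg/cg.h>

      // A deliberately tiny Cg vertex shader: transform the position and pass it on.
      static const char* kSource =
          "struct vout { float4 pos : POSITION; };\n"
          "vout main(float4 pos : POSITION, uniform float4x4 mvp) {\n"
          "    vout OUT;\n"
          "    OUT.pos = mul(mvp, pos);\n"
          "    return OUT;\n"
          "}\n";

      // Compile the same source for one target profile and print the result.
      static void compileTo(CGcontext ctx, CGprofile profile) {
          CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, kSource, profile, "main", 0);
          if (prog)
              std::printf("--- %s ---\n%s\n", cgGetProfileString(profile),
                          cgGetProgramString(prog, CG_COMPILED_PROGRAM));
          else
              std::fprintf(stderr, "%s\n", cgGetErrorString(cgGetError()));
      }

      int main() {
          CGcontext ctx = cgCreateContext();
          compileTo(ctx, CG_PROFILE_ARBVP1);   // OpenGL ARB_vertex_program output
          compileTo(ctx, CG_PROFILE_VS_1_1);   // DirectX vs_1_1 assembly output
          cgDestroyContext(ctx);
          return 0;
      }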

      • Re:Not competing (Score:3, Informative)

        by Forkenhoppen ( 16574 )
        Effects Browser requires that you write the plugin as you usually would, compile, and create a .dll for it, etc., and then you can view it in the Effects Browser. (From the Effects Browser you can't change anything; only view.)

        With RenderMonkey, you don't worry about C/C++ at all. All of the settings (constants, input textures, etc.) for the shader are nicely displayed on the left, and you can just click and change them as you wish. This is absolutely nothing like NVidia's Effects Browser.
    • Yes, competition is bad and hurts the consumer. True innovation can only be ensured by a single strong and dominant industry leading standard.

      VIVA REDMOND!

      Oh wait, that's not quite what you meant to imply, was it?

  • co-operate (Score:1, Insightful)

    by CptSkydrop ( 577286 )
    Why don't they work together and stop thinking about the shareholders! It would be positive for everyone involved.
    • stop thinking about the shareholders

      ha! that is a good one! Almost lost my breakfast on it!
    • hmm, a better thought, why doesn't someone create a language that is similar to both and can be translated directly into either for compilation.

      Suggestions for names?

      JavaRender ? ;)
    • Re:co-operate (Score:5, Insightful)

      by MikeTheYak ( 123496 ) on Friday August 23, 2002 @11:19AM (#4126992)
      Perhaps because the shareholders own the corporations, and corporations are chartered to serve the interests of the shareholders? Imagine if you owned a store, only to find that the employees were giving away your stock to charity. Sure it would be a nice gesture, but it's not exactly the right thing to do.

      You could argue that it would be in the shareholders' best interest that ATi and nVidia cooperate on a standard, but it would be unethical for either corporation to deliberately flout the interests of the shareholders.

      • They regularly fuck over the shareholder with dilution from stock based option compensation anyway, so fucking the shareholder is nothing new.

        Your post is the antithesis of the open source philosophy. By your logic, no company would ever write for open source software because it benefits their competitors as much as it benefits them. You can see that isn't the case with many companies these days, betting more and more on open source.

        Open standards never hurt companies. The company with the best implementation of the open standard will win in the end. The only reason not to adopt open standards is to try to hedge in a monopoly bet. It is the same with adopting an open standard and then bastardizing it so that people have to write for your version of the standard only. "Works best with Windows", etc.
        • They regularly fuck over the shareholder with dilution from stock based option compensation anyway, so fucking the shareholder is nothing new.

          That doesn't make it right. Regardless, shareholders get to vote directors on and off the board, so there is accountability should the shareholders wish to make use of it.

          Your post is the antithesis of the open source philosophy. By your logic, no company would ever write for open source software because it benefits their competitors as much as it benefits them. You can see that isn't the case with many companies these days, betting more and more on open source.

          Of course, you assume that I should take the "open source philosophy" as personal dogma. As it happens, though, my post is entirely in line with open source as I understand it. What I said was that corporations should not screw over shareholders, not that they shouldn't adopt open standards. I actually raised the possibility that shareholders might benefit from a common standard.

          I think what you're failing to infer from my post is that if a store owner wants to give away stock to charity and the employees do so in line with the owner's directions, then that's just fine by me.

        • Your post is the antithesis of the open source philosophy. By your logic, no company would ever write for open source software because it benefits their competitors as much as it benefits them. You can see that isn't the case with many companies these days, betting more and more on open source.


          Do you really believe that anyone benefits more from the biggest portions of IBM's investments in Linux than IBM and their customers do? IBM sees all of their customers taking their lucrative support contracts elsewhere and tossing expensive mainframes because they want to use Linux instead of the proprietary lockin that IBM has had for so long. So, to hold on to customers (and maybe even get some new ones) they start porting their software to Linux and start porting Linux to their hardware. I'm sure they spent plenty of time figuring out how much this move would benefit them vs. their competitors before they did it.

          In the end a very small number of their customers can become self-supporting because they now have the source code to everything they need, while the majority of their customers stay with IBM when they would've been switching to anyone that could supply them with a boatload of x86 parts and Linux support. IBM even managed to save some of these people some money in the short term by keeping them from having to buy new hardware. Who knows how many people saw IBM's embracing of Linux as some embrace of the 'open source philosophy' and brought their employers into IBM's open arms, but there's little doubt that they've gained some new customers through the tactic.
    • Well, because under US corporate law, if a company fails to do what is best for its shareholders, then it is civilly liable, and in some cases criminally negligent, and can be sued.

      • Hemos, your thoughts about the liability of corporate employees are not entirely true.

        Under most U.S. law (mostly State law relating to fiduciary obligations of Directors, Officers & Executives (DOE)), the standard is called the Business Judgement Rule (BJR).

        Under the BJR, a DOE is protected from suits for breach of fiduciary duty if the DOE (1) is disinterested (i.e. financially independent of the outcome), (2) acts in good faith, and (3) exercises due care in reaching the decision.

        Not every decision that a DOE makes has to directly accrue to the financial benefit of the corporation. Corporations can, and are expected to, contribute to the local community, support charities, exercise some respect for the environment, i.e. perform "good acts".

        These "good acts" will not always (or even ever) benefit the corporation financially. Nonetheless, DOEs are never successfully sued because they should have given a dividend instead of donating a piece of land to the city for a park, decided to give employees a day off to help with Habitat for Humnaity, or install CO2 scubbers when the fines are cheaper.

        So, contrary to what I see posted on Slashdot frequently, every decision an employee of a company makes DOES NOT have to be the short-term, profit-maximizing decision. Believe it or not, the law allows some leeway in the decision making of running a company.

        For more information, see this link [tmib.com]

        (and please don't respond with "but Enron...." or "look at WorldCom, Global Crossing, etc." I am speaking about the applicable law of fiduciary duties, not engaging in a public policy debate or Crossfire-esque corporate-bashing)


  • Namely, the fact that OSX actually renders its GUI in 3D (w/ hw accel). Doom3 mentioned just because it will do the same as ID's publications have always done: sell the hottest 3D graphics cards with every new PC. Pie menus because they make thus far the best use of available 2D space. Catch my drift?

    If only someone would pay me to implement the first usable prototype. *wish*
    • Namely, the fact that OSX actually renders its GUI in 3D (w/ hw accel).

      No, it doesn't. What the new version of Quartz - the one shipping in Jaguar, a.k.a. OS X.II (well, X.2:) - does is let developers create composite objects, which are then sent to the graphics card to actually render. It's using the graphics card, but it's not doing 3D acceleration (OS X does not have a 3Dwm, last I checked); still, it's a good thing since it does free up CPU time for other tasks. At this point I'm obliged to mention that Windows does the same thing with GDI+, and has been doing so since the initial release of Windows XP; obviously you need a driver which supports this, but almost all video card drivers currently do.

      Jaguar should have been the real release of OS X - it finally delivers everything Apple promised - it's too bad they released it a year early (and in shoddy condition) just to get it out the door ahead of XP. Well, you know Steve Jobs and his ego.

      I am further obliged to mention that I think Berlin/Fresco might be doing this if you're running it on the OpenGL graphics driver. Of course, XFree86 doesn't. When even Berlin supports something, you know X is behind the curve. :-P

      • The next version of Enlightenment is supposed to use OpenGL for rendering. So it's a window manager thing, not an XFree86 thing. Of course, it would be nice if XFree86 would support hw rendering of all Xlib primitives.. Oh well..
        • It is an X thing - X is what provides the basic drawing primitives that applications are supposed to use. So Enlightenment sidesteps X and uses OpenGL for rendering - Enlightenment is just a single program, and does not provide drawing functionality to the other X clients: they're still stuck out in the cold.
    • Pie menus because they make thus far the best use of available 2D space.

      Pie menus don't make the best use of available 2d space (in fact they tend to use more space than is strictly necessary). They do make the most efficient use (thus far) of mouse movement, though, which is precisely why people are messing with bringing them out of their normal domain (games) and into normal applications.
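
      That efficiency comes from the geometry: every slice is the same short flick away from where the menu opened, so selection is a direction rather than a distance. A rough sketch of the usual angle-to-slice mapping, with an eight-slice menu assumed purely for illustration:

      #include <cmath>
      #include <cstdio>

      // Map a mouse gesture (dx, dy measured from where the pie menu opened)
      // to one of `slices` equal wedges. Any direction is an equally short
      // movement, which is the mouse-efficiency argument for pie menus.
      int pieSlice(double dx, double dy, int slices) {
          const double PI = 3.14159265358979323846;
          double angle = std::atan2(-dy, dx);      // screen y grows downward
          if (angle < 0.0) angle += 2.0 * PI;      // normalise to [0, 2*pi)
          double wedge = 2.0 * PI / slices;
          // Offset by half a wedge so slice 0 is centred on "east".
          return static_cast<int>((angle + wedge / 2.0) / wedge) % slices;
      }

      int main() {
          // A short drag up and to the right lands in the north-east wedge.
          std::printf("slice = %d\n", pieSlice(12.0, -12.0, 8));   // prints 1
          return 0;
      }
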
  • Apples vs. Oranges ? (Score:4, Interesting)

    by tandr ( 108948 ) on Friday August 23, 2002 @11:06AM (#4126889)

    NVIDIA's Cg is a compiler to create shaders.

    ATI's RenderMonkey is a toolkit to debug (low-level?) shaders. I did not find any word about "compiler" on ATI's site.

    It is more like they are extending each other -- ATI gives the IDE, NVIDIA provides the compiler.
    • Indeed, ATI people have stated that it would be possible to create shaders in whatever language you prefer. DX9 HLSL, Cg, OpenGL 2.0 SL.

      But more importantly, all the ARB members are working to unify the various approaches in the OpenGL 2.0 shading language. 3DLabs and NV have both contributed proposals, and the relative merits are being discussed.

      Expect to see a much more manageable shader situation in 2003. A developer will be able to use whichever tool they wish, like RenderMonkey, or roll their own. I fully expect to see tools like this as plug-ins to 3DSMax, Maya, Softimage etc.

      FYI, the name refers to RenderMan, in case anyone didn't get that.

      (i don't work for any of the above)
    • I agree with everything except the extending part.

      I see assembly-ish code in the screenshots, and the address of the site places it in the Radeon SDK. This looks like a nice tool to ease the customization/creation of low-level shaders. Certainly interesting, but not at all what the Slashdot story seems to imply.
    • I thought it was Apple vs. Microsoft. Crap. Out of the loop again.
  • If ATI and NVIDIA can start leapfrogging each other the way AMD and Intel have, then the end result is simply going to be faster, more powerful, and less expensive video cards for us.

    If RenderMonkey is better than Cg, then great! Cg can then try to leapfrog RenderMonkey, and then the API for both just gets faster, more reliable, and easier for developers to use.

    [rant]

    I'm going to be building myself a new PC shortly, and I'm going to put an ATI Radeon 9700 in it.

    Personally, I've been pretty upset with NVIDIA ever since they bought out 3dfx and told Voodoo owners to go screw themselves, that they weren't releasing any new drivers or supporting any Voodoo products. I bought a Voodoo5, instead of a Geforce2 - due to the stability of the Voodoo2 and Voodoo3 I had owned, and due to reading the complaints about NVIDIA's drivers... and a week later 3dfx went under. D'OH!

    • Except they aren't getting cheaper, just more powerful. A state of the art card still costs $400+ and probably will for the foreseeable future.

      On the other hand, today's so called 'budget' cards are far more capable than they used to be, so in that respect prices (with regards to price/performance) have gone down I suppose.
      • True, but look at the vast difference in performance for the high end cards. The Radeon 9700 is *supposedly* 40% faster than the GeForce4.. now, compare either card to the top of the line cards of a few years ago, like when the Voodoo3 3500 card came out. Even my Voodoo5, which was a very fast card at one point, is simply no match for either of the latest Radeon / GeForce cards.

        So in a way, your cost has gone down, because you're getting more value for your money.

        • and I'm still using a 64MB GeForce2 I bought when they were first released. At this point I'm looking at probably waiting for the next chipset release to drive down the prices on the high-end 128MB GF4 cards before buying one.

          On the other hand, most of the people I know that only play games occasionally or that just want the 'best bang for the buck' usually get an MX card, which is more than enough for most people, and far cheaper than the Ti and Radeon cards.
    • Yes, but gripes aside about how open their drivers are, the GeForce2 kicked the crap out of any Voodoo card. nVidia's dominance continues today.

      nVidia did the right thing burying an inferior product line and buying 3dfx for their IP.

      siri

    • wrong,

      Intel and AMD compete on _implementations_ of an API (x86). That is good.

      An example of competition on proprietary APIs and why it's not good is the competition between OpenGL and DirectX.

      If DirectX wins (as seems to be happening), all other operating systems are left out.

      If OpenGL wins, all the people investing in DirectX game development will have to rewrite their rendering code.
      • OpenGL vs Direct3D is a really bad example.

        D3D is probably better for games right now as it matches the hardware more closely, but for any longer-term graphics work, basing it on D3D would be a big risk. The result: D3D is the games king, and OpenGL is used more for longer-term graphics projects that may be cross-platform. Also, OpenGL is much better for anyone learning graphics because it is so much cleaner than D3D.

        Both have advantages and disadvantages and both fill important parts of the spectrum.

        Which one will win? Who knows, and indeed who cares; both are well supported and actively developed. In 5 years, when graphics hardware has gotten to the point where it is really stable, we might have a winner (my bet's on OpenGL because it's so much cleaner), but until then the two standards both serve a purpose.

        • Which one will win? Who knows, and indeed who cares; both are well supported and actively developed. In 5 years, when graphics hardware has gotten to the point where it is really stable, we might have a winner (my bet's on OpenGL because it's so much cleaner), but until then the two standards both serve a purpose.



          I doubt there will be any clear winner unless you focus on the area that each was built for in the first place (ie Direct3D will most likely win in games, especially Windows-only games, OpenGL will remain king on workstations).

          As for which is 'cleaner', I guess that depends on how you look at it and what you're used to. An advantage of Direct3D for people learning programming is that the style of the API is kept throughout DirectX, and you'll find that even the most hardcore OpenGL programmers are using DirectX for the sound and input on the Windows versions of their games.
          • OpenGL vs D3D (Score:2, Informative)

            by Steveftoth ( 78419 )
            I think that you and the parent are missing the real reason that OpenGL is king on the workstations and Direct3D never will be. OpenGL is extendable by everyone and Direct3D is completely controlled by MS.

            You can't go and make your own version of Direct3D with a hypercube (or whatever) extension that draws a super widget because Direct3D is closed. OpenGL is completely open and a vendor of 3-D hardware can add whatever they want to the API. The programmers have to then use the extension, but for workstations where the constraints are defined this is usually not a problem.

            PC Games on the other hand have to run on the most diverse set of hardware ever. From the lowest 90MHz Pentium to the newest 2600+ Athlon, with a range of video cards to make a driver writer weak at the knees. (they don't have to run well, just run) D3D solves this by making all functions static and creating software implementations of them. So you program to a 'version' of the API and you know exactly what the machine is capable of and if you can't get an accelerated version of the functionality, then at least you don't have to create your own. Great for game writers, but not as useful for workstation class applications, since they are more concerned with accuracy than games are (in general).

            OpenGL is better when the HW that you want to support is smaller, since you can get better performance out of it. Direct3D is better when you just want it to run without having to write a ton of code to emulate the top level functionality.
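
            For anyone who hasn't used the extension mechanism being described: at run time you check the driver's extension string and then ask for the new entry points by name. A rough sketch for ARB_vertex_program on Windows (the lookup call there is wglGetProcAddress; on X11 it's glXGetProcAddressARB), assuming a GL context is already current and that glext.h provides the function-pointer typedefs:

            #include <cstdio>
            #include <cstring>
            #include <windows.h>   // wglGetProcAddress
            #include <GL/gl.h>
            #include <GL/glext.h>  // PFNGL*ARBPROC typedefs

            // Returns true if ARB_vertex_program is usable on this driver.
            bool setupVertexProgramExtension() {
                // 1. Is the extension advertised at all? (Quick substring check.)
                const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
                if (!ext || !std::strstr(ext, "GL_ARB_vertex_program")) {
                    std::fprintf(stderr, "ARB_vertex_program not supported\n");
                    return false;
                }

                // 2. The new entry points aren't in opengl32.lib; fetch them by name.
                PFNGLGENPROGRAMSARBPROC glGenProgramsARB =
                    reinterpret_cast<PFNGLGENPROGRAMSARBPROC>(
                        wglGetProcAddress("glGenProgramsARB"));
                if (!glGenProgramsARB)
                    return false;

                GLuint id = 0;
                glGenProgramsARB(1, &id);  // ready to load a vertex program into id
                return true;
            }
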
            • I think that you and the parent are missing the real reason that OpenGL is king on the workstations and Direct3D never will be. OpenGL is extendable by everyone and Direct3D is completely controlled by MS.


              That's very true, but you also missed the reason that Direct3D got where it is today, because OpenGL was definitely the king in games up until at least DirectX5. The reason is simply that Direct3D is updated on a fairly regular basis to take advantage of the new features on video cards as quickly as possible (and those features are most often geared towards what the game developers want to see), whereas such support is provided in OpenGL only through those card-specific extensions most of the time (as it takes much longer for OpenGL itself to be updated with those features).

              OpenGL is a much more static API and that has its own appeal in areas where your application may be in use for several years, rather than just the 60 hours that a player may spend on your game. As long as people believe that they need the latest flash-bang-gizmo graphics tricks in order to sell their games, they'll want the API they use to make their lives easier when they use those tricks. Even the idea that OpenGL is faster can be questioned when looking at cards that support the calls being used (though that certainly wasn't the case in earlier versions of Direct3D, which were always slower than properly-written and supported OpenGL).

              You can't go and make your own version of Direct3D with a hypercube (or whatever) extension that draws a super widget because Direct3D is closed.

              Then again, any graphics chip manufacturer is free to add new features to a card and give the developers the ability to use them alongside Direct3D before Direct3D supports them. This is basically what nVidia and ATI are doing with Cg and RenderMonkey (giving them a higher-level language that compiles down to hardware-level code for DirectX8 while also compiling to OpenGL and DirectX9 code for those calls that are (or will be) supported).
              • Oh, and I forgot to add that a lot of the high-end workstation graphics cards don't even support Direct3D anyway, and render OpenGL much more efficiently than an equally powerful consumer card, simply because the hardware (and driver) has been optimized for it.
              • I wasn't aware that you could extend Direct3D; I guess you'd need a separate API from the card manufacturer or something, right?
                OpenGL assumes that each card will have its own extensions and thus provides a common extension mechanism.

                The reason that DirectX has been able to grow so quickly is that MS has kept complete control over the platform; only with versions 8 and higher have they actually let the other companies (read nVidia and ATI) have a real say in the API, it seems. I know that they always consulted the card manufacturers, but they still had the final say.

                Contrast that with OpenGL, where it has taken forever to get version 2.0 out because there are so many people with so many opinions to consider. Open APIs are just slower to develop, but in the end, hopefully better since the requirements are more diverse.
        • OpenGL isn't 'so much cleaner' than D3D, and hasn't been since D3D v3. In the six versions since then, it's been cleaned up a lot - even John Carmack admits to that.

    • Personally, I've been pretty upset with NVIDIA ever since they bought out 3dfx and told Voodoo owners to go screw themselves, that they weren't releasing any new drivers or supporting any Voodoo products. I bought a Voodoo5, instead of a Geforce2 - due to the stability of the Voodoo2 and Voodoo3 I had owned, and due to reading the complaints about NVIDIA's drivers... and a week later 3dfx went under. D'OH!

      nVidia didn't really 'buy out' 3dfx, they just bought up certain portions of their technology when 3dfx went under. As for complaints about nVidia's drivers, I'd be interested to know what they were at that time, since I dumped 3dfx for nVidia when the Voodoo3 made it extremely clear that 3dfx was not going to be able to produce cards comparable to the Voodoo and Voodoo2 on nVidia's timeline. Through the TNT lines and GeForce lines the only problems I've had with drivers have been specific to particular games and particular driver versions, and fixed very quickly in most cases.
    • True, but a common interface is also good.

      For example, The OpenGL interface hasn't changed for donkeys' years. The reason it hasn't changed is that it was designed correctly in the first place. But hardware has still gotten better, thanks to killer apps such as games forcing competition.

      Indeed, having to go through the time-expense of learning it might even dissuade developers from picking up a new interface.

      Still, these things tend to settle into a reasonably steady state without requiring our opinions, so don't worry; it'll be cool. :)

    • That's pretty funny - you are going with ATI because you were upset with the lack of 3dfx driver support.

      You are in for a big surprise: ATI is the WORST by far of any of the desktop gfx card companies as far as driver support goes. They are infamous for releasing buggy drivers or discontinuing support for many of their lines at a moment's notice. Compare this to Nvidia, who releases a universal driver which covers years' worth of part releases and is updated on a regular basis. Even if ATI releases a superior part they just shoot themselves in the foot by releasing shitty drivers on an unpredictable schedule. I used to like ATI back in the day but have been burned one too many times by them.

      • by dinivin ( 444905 ) on Friday August 23, 2002 @11:53AM (#4127304)
        Even if ATI releases a superior part they just shoot themselves in the foot by releasing shitty drivers on an unpredictable schedule.

        That was true in the past, but I haven't had a single issue with either my original 64 Meg Radeon, my Radeon 7500, or my Radeon 8500. I really wish people would stop living in the past and realize that ATI has dramatically improved their driver support in the past year.

        Dinivin
    • No, it's not a good thing (competition).

      AMD and Intel leapfrogging each other turned out to be beneficial for everybody ('cept for those funny guys with the colored plastic) because their products are mostly compatible with each other. Two (three? four?) divergent advanced shader standards means more "simply texture the triangles because it works everywhere" games. Which is not good.
    • Personally, I've been pretty upset with NVIDIA ever since they bought out 3dfx and told Voodoo owners to go screw themselves, that they weren't releasing any new drivers or supporting any Voodoo products. I bought a Voodoo5, instead of a Geforce2 - due to the stability of the Voodoo2 and Voodoo3 I had owned, and due to reading the complaints about NVIDIA's drivers... and a week later 3dfx went under. D'OH!

      You mean like when 3dfx bought out STB, and told STB card owners to go screw themselves (especially NVidia STB card owners)? Gee, that doesn't sound familiar at all.
    • Interesting. I am looking at building a semi-low-cost machine in the near future, keeping the case (but not the PS), hard drives, burner, and sound card from my current machine. I was debating between the ATI 8500 and the GeForce Ti 4200, and finally settled on the GeForce because NVIDIA offers support for Linux/XFree, whereas ATI, per contact with their sales/marketing, has stated they have NO plans to offer anything similar. One person went so far as to ask why in the world I would WANT hardware 3D support on Linux.
  • by brejc8 ( 223089 ) on Friday August 23, 2002 @11:08AM (#4126903) Homepage Journal
    Experience in C, C++, x86 Assembler, Cg, Render Monkey, Will work for peanuts.
  • by thelinuxking ( 574760 ) on Friday August 23, 2002 @11:08AM (#4126908)
    Except it's backwards... first they have RenderMan, then RenderMonkey.
  • monkey (Score:4, Funny)

    by !splut ( 512711 ) <.ude.ipr.mula. .ta. .tups.> on Friday August 23, 2002 @11:10AM (#4126923) Journal
    Well, if we go on the basis of clever names, ATI wins hands down. Plus, monkeys have always made me think of high quality shaders.
    • Yes, I agree 100%. It goes along with the M&M Rating System for movies... Monkeys and Midgets... if a movie has monkeys in it, it gets 1 star.. if it has midgets in it, it gets 1 star.. if it has monkeys AND midgets, it gets a whopping 2 stars (the highest rating possible). If only ATI had called it MonkeyMidgetRender, then we could be assured it would beat out the other 2.
    • Dude, you crack my silly white ass up.


  • I do game programming, sometimes.

    I do it in assembly language, almost always.

    Since the advent of Cg and now RenderMonkey, I want to know if there has been any comparison between assembly language, Cg and RenderMonkey?

    I won't say that I know what the exact categories for such a comparison are. Perhaps people who have more knowledge than I do in this field may think of what to compare (features? ease of use? speed? code size? clarity?)

    • by Anonymous Coward
      Good old JC (John Carmack :) wrote/talked about this briefly a little while back (I can't remember where)

      Basically, like the difference between C and ASM, you will be sacrificing speed and smaller code size for easier readability, maintainability, and faster production time. Because hardware has become so advanced and the programs are getting so complex, the switch is necessary. Of course, there will always be the guys who hand-code everything in ASM, but for most people, especially game dev studios who have specific budgets and deadlines, you have to take it because of the time it can save.

      Check Carmack's .plan, or his address from QuakeCon 2002. I think this is where he talks about it.
    • by Anonymous Coward
      I do game programming, sometimes.

      I do it in assembly language, almost always.


      As an aside, are you still using assembler for everything? If so - why? You should be concentrating on the game, being as productive as possible and building 100% of what you're doing in a higher level language (such as C++).

      When you're done, the last phase of your optimisation should be converting the most performance-critical sections of the program to assembler if it's still needed after optimising your higher-level language code. For a modern game this shouldn't be more than about 5%, if at all - particularly if it uses 3D acceleration.

      If your game development is a hobby, nevermind - you should be in it for the enjoyment, not the productivity.
  • Oh, sure, ATI might give lip service to our little primate friends. But take a closer look at their branding for this project. [ati.com] That's no monkey, mister! That there's a gorilla! (Or possibly an extra from the movie "Congo".)

    Where are the cute tricks involving prehensile tails? Where are the happy waltzes to some itinerant organ grinder's music? WHERE'S MY MONKEY?!

  • I vote Nvidia. RenderMonkey sounds cool, but ATI have had banana problems in the past..
  • Over at NVNews [nvnews.net] it says.. There are similar APIs in the works from Microsoft and 3DLabs and a new version of OpenGL that all work similarly, and according to Carmack there "isn't much difference" between them.
  • My next card is a GeForce.
    I can't play any games at all on my PC without a crash if I use the latest ATI drivers for XP. This has been the case for the last couple of revisions already.
    I am forced to run the XP-packaged drivers. Slower, but no bluescreens. Of course, no OpenGL games. Oh well.
    I have a Radeon if you're curious.
    • Maybe you need to learn how to properly install & set up drivers?

      I have only run ATI cards for the last 3 years & I never have any problems beyond what you see with the occasional game here or there (which also affects Nvidia)... (Radeon 32MB, Radeon 64MB VIVO, & currently a Radeon 8500LE if you're curious)

      Because of work I also run Win XP Pro at home & their drivers don't crash my system...

      I have noticed as a tech that dumb people can't ever seem to install a driver properly... How you screw it up I'm still uncertain, but somehow they always do...

      Where I work we sell cards with both Nvidia & Ati chips... People come back complaining that they couldn't install both types of cards (or that they wouldn't run after being installed). I can then spend 5 minutes & have their system up & running with their new hardware...
      • If you don't uninstall the old driver set before installing the new one, you can really mess up your system. Likewise if you had an NVidia card previously and didn't completely uninstall the drivers for that before installing the Radeon ones. Did you upgrade without uninstalling at any time previously? If so, you may want to first uninstall your current drivers, and then pick through the registry for dangling settings from old cards. NVidia has a doc describing how to do this for their cards somewhere on their site..

        Under Win2k+, ATi's drivers are rock-solid stable. Unless, of course, your motherboard has a lousy AGP driver, in which case you're screwed. (I've found that some motherboard manufacturers only cater to one brand of video card.)

        It can be especially bad whenever you have a relatively new motherboard. Some manufacturers ship these things out, and the entire first revision of the board has a flaw. There was a whole generation of VIA chips like this a while back.. I think it was the KT133A chipset on a certain ASUS board.. I had one, but ended up returning it and getting the revision board.

        The worst I've had, as far as instability's concerned, with my 8500LE, is a few occasional reboots whenever exiting an especially taxing 3D application. It is a bit annoying, but at least it doesn't lock hard like previous Windows versions did..

        Incidentally, if it's your own code that's causing it to reboot/lock up, then I suggest you check your vertex shader setting code (see the sketch after this comment). I had a rebooting problem with my own code for a while, and it turned out to be that I was setting an invalid vertex shader. On an NVidia card, it'll just ignore it and use the fixed-function pipeline. On an 8500, the computer'll reboot. Likewise, if it's a specific game that's screwing up on just 8500s, then this could be the problem there as well.

        Just my two cents..

        Cheers..
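
        On that invalid-shader point, one cheap defence is to check the return code when the shader is set and drop back to the fixed-function pipeline if it fails, rather than carrying on with a bad handle (the debug DirectX runtime is far more likely to flag this than the retail one). A minimal DirectX 8 sketch; the handle and FVF layout here are placeholders:

        #include <windows.h>
        #include <d3d8.h>

        // Try a programmable vertex shader; if this card/driver rejects the
        // handle, fall back to a fixed-function FVF instead of drawing with
        // an invalid shader (which some drivers ignore and others choke on).
        void setShaderOrFallback(IDirect3DDevice8* device, DWORD shaderHandle)
        {
            if (FAILED(device->SetVertexShader(shaderHandle))) {
                // Placeholder FVF: position + normal + one set of texture coords.
                device->SetVertexShader(D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_TEX1);
            }
        }
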
    • Same experience w/ my 8500 DV. While I was able to upgrade the drivers somewhat, and I did eventually get OpenGL working, it still crashed.

      I upgraded to a Gf4 4600 (128). I'm very very pleased. My old Gf2 Ultra left a bad taste in my mouth but this is certainly a rock solid product.
    • I had an ASUS GeForce2 MX with a DVD decoder and TV out. Only I couldn't use the DVD decoder because it caused a hard freeze. And I couldn't use the TV out because it blue screened. And I couldn't use the packaged drivers at all because they left artifacts on the screen.

      So I used the XP bundled drivers. They worked, but they were slow and didn't support TV out or hardware DVD decoding.

      On the other hand, I now have a Radeon 8500 that worked swell right out of the box. My iBook has a Radeon Mobility 7500 and hasn't given me the first problem.

      It's all relative, and probably has a lot to do with your particular hardware and software configurations. I'm not saying Radeon is better than GeForce; I'm saying that just because it wasn't the right solution for you doesn't mean it isn't the right solution for someone else.

      Then again, if you get poor support, or worse, no support (like with *cough*ASUS*cough*), then that's another story. But if they give you your money back and their blessings on buying a new card, you can't really hold it against them.
      • Then again, if you get poor support, or worse, no support (like with *cough*ASUS*cough*), then that's another story. But if they give you your money back and their blessings on buying a new card, you can't really hold it against them.


        I found that although ASUS is slow updating their drivers (I had an ASUS GeForce 256 SDR), their cards work extremely well with the reference drivers. Of course, you probably still can't get many special features working with reference drivers, but there are a few that do (TV out should work for one).

        I didn't know anyone bothered with hardware DVD decoders beyond the on-board MPEG-2 acceleration that's standard on most video cards anymore, though. I stopped messing with the decoder cards and/or cards that included specific mention of on-board DVD-decoding when the difference between hardware and software decoding became mostly an issue of whether or not the software would run rather than the difference in image quality.

        All of that said, I've seen a lot of people have problems with both. The difference is that I've usually been able to trace most of the problems with nVidia cards back to Via AGP drivers, whereas most of the ATI problems traced to ATI-specific updates for the games involved or ATI driver updates (though there have been a fair number of nVidia driver updates to fix specific problems too).

        My own experience has been that nVidia cards are about 90% flawless on my systems (the other 10% being that brand new game that needs that driver that nVidia didn't release until the next day, even though the developers had them and they were available as 'leaked' drivers if you knew where to look for that sort of thing). My biggest worry with nVidia has been deciding which card vendor to deal with (especially since I was a die-hard DiamondMM fan before S3 bought them, and my Hercules card has been great but they're only releasing ATI cards now, and the ASUS card was good but they never kept up with drivers), but at the same time I like having that choice and knowing that nVidia's reference drivers work (and very well at that) with 99% of the cards out there using their chips.

        Maybe if I had gone with ATI instead of nVidia when I dropped 3dfx I'd feel the same way about ATI. I just have a hard time believing that when I consider where ATI was at the time (the ATI 3D Rage Pro?).
  • What good is it with no Linux port? Like anybody uses Windows to do stuff like this!
    • Lots of games are written for Windows. Most of them, in fact. If there's no Linux developer toolkit, it's just because ATI is catering to the bigger audience. But having said that, you're partially right - I personally do no development on Windows at all, and would much rather see Linux tools than Windows ones. (probably the majority of the slashdot folks are in this category, or want to / soon will be)
  • Every video card comes out with its own programming suite. Reminds me of the bad old days when you had to write your own print interface to every printer you wanted to support. The nice thing about 'industry standards' is that there are so many to choose from.
  • Yay competition!

    Wait...This is Slashdot, home of the "competition is bad" thread...

    Boo! Competition! Users and developers don't want choice! What does this look like? A democracy?
  • Is Rendermonkey a predecessor to Renderman? Is Renderneanderthal out there?
  • It's easy, Cg wins over ATI, as nVidia has Cg for Linux. There is no RenderMonkey for ATI. I was at SIGGRAPH, and I was very impressed with ATI's demos. Now I am disappointed that ATI does no Linux.

    go nvidia!

    Bram
  • The coding effort to get high-quality shader effects on real materials looks difficult. I guess this is bound to happen with complex pixel shaders.. although I guess RenderMonkey would be a better GFX development IDE than Visual C++..

    Is there any effort by ATI or other vendors to just create a standard library of materials and effects without relying on coding? (ex. "object covered by material Glass 1021" instead of "dot-product sum-add multiply")

    Perhaps the 3-D modeling vendors would incorporate these libraries for artists' development while seamlessly integrating with a game engine via a standard file format that includes vertices/textures/shader algorithms..

    -bobby
    • CGShaders [cgshaders.org] has a collection of shaders available for developers to use as a base (or as is) in their applications. It may not be a 'standard library', but at least it's a good starting point for some people, and should grow with time.
  • In other gaming news, I think it's time to take the link to Old Man Murry [oldmanmurry.com] off the front page of Slashdot. Old Man Murry has been AWOL for quite a while now and it's time to give up on them, at least until they actually get their site back up. Maybe the link could be replaced with Penny Arcade [penny-arcade.com] or something.
  • Now imagine if Microsoft adopted the name "monkey.net."

    Now imagine the logo for it as Ballmer's face....

    That is all.

"It takes all sorts of in & out-door schooling to get adapted to my kind of fooling" - R. Frost

Working...