
NVidia announces Cg: "C" for Graphics

mr_sheel writes: "Thresh's FiringSquad has an article about Cg, a new language developed by NVidia in collaboration with Microsoft. 'Up until now,' Brandon Bell writes, 'game developers have had to use complex assembly language to create the lifelike graphics gamers experience in today's latest games.' Cg eases the process of bringing graphics to the screen and compiles for DirectX 8, 9, and OpenGL 1.4. Many companies, including 3D Studio Max, Blizzard, and over 100 game developers, have already jumped onto the Cg bandwagon. Will this replace assembly graphics coding once and for all?"
  • Isn't this (Score:5, Informative)

    by mcspock ( 252093 ) on Saturday June 15, 2002 @01:01PM (#3708243)
  • Didn't we just see this here a couple of days ago?
    • I always find it funny when there are people repeating the "didn't we see this already?" line.

      Kind of ironic, isn't it?
      • I always find it funny when there are people repeating the "didn't we see this already?" line.

        Well, to you and the person who moderated me down, let me explain something that you don't grasp. The original post was at 1:59. I saw it at 2:00. There were no other responses of this nature posted when I looked, and I did look. But in the seconds it took me to type my short post, preview it, and post it, several others made similar observations. It happens. Get over it.

        Perhaps the fact that you always find it funny indicates that the problem (duplicated original posts) happens often enough that it indeed should be mentioned.

  • by cp4 ( 250029 ) on Saturday June 15, 2002 @01:08PM (#3708268)
    Hey Timothy.... go to your preference page and UNBLOCK all CmdrTaco stories.... this way you can see what he posted and not post it yourself.
    I know some of his stories suck but it's for the good of all of us.

    You can keep Katz stuff blocked.
  • When Slashdot starts picking stories it already scavenged from thereg, linuxtoday, or arstechnica and posted earlier this week, that means the world is running at slow speed on a sunny Saturday like this one.

    PPA, the girl next door
  • Another article (Score:3, Informative)

    by purepower ( 231452 ) on Saturday June 15, 2002 @01:08PM (#3708271) Homepage
    Extreme Tech also has an article [extremetech.com] about Cg.
  • You know, Cg may seem attractive, but game developers who really know their stuff will probably stick to assembly. Or, alternatively, use Cg in parts of a game but use assembly where it counts.
    • I'm not an expert in graphics programming but I do some firmware and programmable logic for 8-bit micros. I thought that today it was generally accepted that compilers for Intel machines and similar CPUs had advanced to the point where they were able to produce assembly that was more or less on par with hand-written assembly. Is this not true?

      Assembly for an 8-bit micro with less than 100 instructions isn't a hard thing to grasp... but more complex CPUs have hundreds and hundreds of instructions. It must be nearly impossible for someone to always know they are using the best instruction or sequence of instructions to perform a given task.
  • timothy should have his post deleted and his account banned for repeated double-posting. It's like he's spamming us!
  • ...that cygwin goes under just in time for cgwin to come out? Conspiracy or coincidence - you decide!
  • MS Mistrust (Score:3, Interesting)

    by feldsteins ( 313201 ) <scott @ s c ottfeldstein.net> on Saturday June 15, 2002 @01:18PM (#3708298) Homepage
    "...in collaboration with Microsoft..."

    I just hope that phrase doesn't mean non-DirectX operating systems (Linux, Mac OS X) are about to get the short end of the graphics stick. I can visualize features not being implemented for OpenGL, or worse, support for OpenGL discontinued at some strategic point in the future "because our customers strongly prefer DirectX" [says Microsoft].

    OK, maybe I'm paranoid. Maybe this is basically nVidia's baby and MS is only involved a little bit. Let's all hope. Can someone reassure me?
    • I'm with you on the mistrust; I'm worried about nVidia adopting conventional MS practices or being dominated by their "partner" Microsoft. I do like nVidia, and love their graphics cards, but this plants a little seed of doubt for the future. Hmm.

      I wish I could tell you that wouldn't happen, but it seems to me that it is somewhat likely down the road.

      Or worse, getting taken over by Microsoft.
      • by jpt.d ( 444929 )
        nVidia doesn't make graphics cards, they make graphics chips. It's like saying Linux is an operating system.
        • Yes, you are technically correct. What I meant was that I like cards that use nVidia chips. Since the chip is the core of the graphics system, it's more akin to describing a computer as a "Pentium", which isn't correct but certainly is common usage.

          And Linux is technically a kernel not an OS, but 95% of the time I hear the word it's referred to as an OS.

          So what is your point?
        • http://slashdot.org/articles/99/04/09/1516203.shtml
    • I kan read artical (Score:5, Informative)

      by Enonu ( 129798 ) on Saturday June 15, 2002 @01:42PM (#3708371)
      Since Cg is designed specifically for vertex and pixel shader programs, DirectX versions 8 and 9 are supported as well as OpenGL 1.4. The compiler itself is cross platform; in particular programs written for Windows, Linux, Macintosh, and Xbox are supported. And if all that isn't enough, the compiler can create code for all GPUs that support DirectX 8 (or above) and/or OpenGL 1.4, making it very universal. In keeping with Linux tradition, NVIDIA has open-sourced certain components of the compiler, allowing content developers to add their own customizations as well.
    • My guess would be that Nvidia originally started looking into this as a tool to help developers use the full power of the Xbox. Since the Xbox was a partnership with Microsoft, this would be too.
    • Re:MS Mistrust (Score:3, Interesting)

      by fferreres ( 525414 )
      Nvidia = Microsoft since the beginning. I can say that because I've known them since the NV1 (1995, I think), which I own.

      They were virtually killed by 3Dfx many times, yet someone kept pumping money from "somewhere" into them. The only change in focus after their failure was this: target Direct3D.

      They kept pumping money in and selling at a loss until they bought 3Dfx. Most other players (Rendition and many others) were already dead by then. Now they control the market with Microsoft by their side. Microsoft was always against 3Dfx; their Direct3D was never "friendly" with 3Dfx cards (though it worked fine, features were really targeted at Nvidia cards).

      It's as if Nvidia and Microsoft developed Direct3D together. It's not a coincidence. Microsoft would never tolerate a monopolistic provider in a key market unless they own/control it somehow (and no, you don't need Microsoft putting money in directly to own it. There are a thousand ways to own something in an unnoticed fashion).

      Remark: I know Nvidia cards are the best and excellent ones! This has nothing to do with it :) The 3D market's history is full of "black holes" which you just can't understand well without some conspiracy.
  • Why? (Score:2, Insightful)

    by dhaberx ( 585739 )
    Why make a new language with its own compiler for this? Wouldn't it make much more sense to make some libraries?
    • I'm sure the Ph.D.s at Nvidia thought about that and disregarded it. Pretty arrogant for a bunch of Slashbots to second-guess the cream of the crop...
  • Well, if this has been posted before, I missed it and am glad I had a chance to read it. Sorry to all you others who are annoyed by the duplication.

    The technology seems like a necessary step for the industry. I do graphics programming, although I'm not elite enough to do it in raw assembly; I have used OpenGL a few times. From the article it does sound like it will be much easier to develop visualization code that is more standard - this would have made working on my thesis much easier! Oh well, better late than never, I guess...

    Also, the article focused heavily on examples of characters from game environments; it would be nice to see graphics examples of other types of applications besides gaming - for example, scientific visualization and similar areas, which is what I'm more interested in than gaming development.

    However, I am concerned that NVidia won't stay compliant with standards across video cards and things may get ugly in the future - that they'll pull a Microsoft, as it were. I hope not, because it sounds like an interesting product that could quickly become dominant if they continue to do things right.

    • Re:Good stuff! (Score:2, Insightful)

      by KewlPC ( 245768 )
      ENOUGH!

      This has nothing to do with raw x86 assembly! This isn't doing graphics programming in raw assembly!

      There is an assembly-like language for writing vertex and pixel shaders on nVidia and ATI GPUs. Your entire program could be nothing but high-level calls to DirectX 8 or OpenGL 1.4 in C++, but if you want to take advantage of the ability of the GeForce3, GeForce4 (but not GeForce4 MX), and recent ATI cards to run custom pixel and vertex shaders, you have to write said shaders in a small assembly-like language and use DirectX or OpenGL calls to load them into the GPU. These shaders get run by the card's GPU and have absolutely nothing to do with the CPU.

      The idea here is that instead of writing your pixel or vertex shaders in something similar to assembly (again, it *ISN'T* assembly), you can write them in something similar to C, and the Cg compiler will compile your shaders into machine code for the GPU (Graphics Processing Unit; it's on the GeForce3, GeForce4 (but not GeForce4 MX), and recent ATI cards), not the CPU!

      Writing shaders in this assembly-like language for the GPU is nothing like trying to do graphics in x86 assembly back in the day, ok?
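
      To make that concrete, here is a hedged sketch of the difference for a trivial vertex transform. The Cg names and semantics below are approximations based on NVidia's published examples, not something from the article:

        // In the DX8-style shader assembly, this transform is four dp4
        // instructions against constant registers
        // (dp4 oPos.x, v0, c0 ... dp4 oPos.w, v0, c3).
        // In Cg, the same thing reads like C:
        struct VertOut {
            float4 pos : POSITION;  // clip-space position the rasterizer expects
        };

        VertOut main(float4 position : POSITION,
                     uniform float4x4 modelViewProj)
        {
            VertOut OUT;
            OUT.pos = mul(modelViewProj, position);  // one matrix-vector multiply
            return OUT;
        }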
      • Well, thanks for the clarification. It seems like the article could have made that fact clearer, or maybe it was there and I didn't pick up on the distinction. So nVidia's Cg is only for the vertex and pixel shaders? I suppose it doesn't provide a high-level API for the rest of the graphics process as I originally thought. Doh.

        Too much marketing-speak in the article I guess. Or I'm just dumb.
  • by magic ( 19621 ) on Saturday June 15, 2002 @01:26PM (#3708324) Homepage
    At flipcode.com [flipcode.com]

    -m
  • by Lethyos ( 408045 ) on Saturday June 15, 2002 @01:26PM (#3708325) Journal
    Will this replace assembly graphics coding once and for all?

    What on earth are you talking about? This will be a requirement... ALWAYS. You know why? Because nVidia is only one company, and not everyone conforms to their "standards". That's a good thing, especially because they are in bed with Microsoft: they are a conduit for MS to control the graphics market, as nVidia is up for sale to the highest bidder.

    So, what do you do when you want to strengthen control on a market? You introduce a new language that makes it easier for developers to take advantage of one piece of hardware on your platform!

    No, this will NOT replace ASM for driver/graphics engine development. It should be disregarded because there's nothing standard about it. Hopefully, the leaders in the industry will always support chipsets other than nVidia's.
    • Sure it's non-standard-- except that it works with ATI cards. Doh!
      • My god, how naive :( ... I'm not trolling. I'm just suggesting you pay a little more attention to this kind of stuff, so that we can prevent unstoppable monopolies from being the result.

        Do you think this Cg will get the most out of ATI cards? Do you think ATI likes it, or that it's a good thing for them? Why did Nvidia develop this thing with Microsoft's help? Why wasn't ATI there? Why weren't Matrox and 3Dlabs invited?

        It may be compatible with ATI cards as in beer, not as in freedom. And that beer will probably taste like you don't want a second round.

        This can only hurt 3D card makers and promote Nvidia. Microsoft is making sure they can control it at a software level rather than a hardware level. It's the only way they have to lock you in and control the market. Just like they control OEM bundling, and everything else in the PC industry.

        Don't complain in the future unless you had some vision of what the consequences of some "innovations" would be.
        • The thing is that everything Nvidia has said sounds like they don't intend to play that game. Most of these comments are at the knee-jerk, haven't-read-the-article level.

          Proprietary standards are not a bad thing when there are no open standards. In fact, many standards bodies look to current practice for examples of standards. If everyone waited for standards, there would be fewer (or worse) standards. And for a proprietary standard, this one seems pretty open.

          I understand that Nvidia has been heavily involved in the creation of the suspiciously-similar OpenGL 2.0. Their release of Cg will give them useful information for refining the standard. It's also a solid pragmatic decision-- since many people are using Direct3D, they can either laugh and point at them or release something similar to OpenGL 2.0 that works with Direct3D. It also gets the tools to people now, rather than when OGL 2 is done.

          The release of Cg could have the benefits I describe, or it could be as bad as you suggest. At this time, I don't think there's enough reason to think either one of them's true.
    • by liquidsin ( 398151 ) on Saturday June 15, 2002 @03:02PM (#3708647) Homepage
      Since Cg is designed specifically for vertex and pixel shader programs, DirectX versions 8 and 9 are supported as well as OpenGL 1.4. The compiler itself is cross platform; in particular programs written for Windows, Linux, Macintosh, and Xbox are supported. And if all that isn't enough, the compiler can create code for all GPUs that support DirectX 8 (or above) and/or OpenGL 1.4, making it very universal. In keeping with Linux tradition, NVIDIA has open-sourced certain components of the compiler, allowing content developers to add their own customizations as well.

      Do you even read articles? How did you get modded insightful? Do moderators read articles?
      • by fferreres ( 525414 ) on Saturday June 15, 2002 @06:50PM (#3709389)
        Do you even read articles? How did you get modded insightful? Do moderators read articles?

        Yes, I think they do read them. There's nothing in the article that contradicts what he says. Actually, Nvidia IS IN BED with Microsoft. 3Dfx was not and got killed. 3DLabs was not and suffered.

        If it weren't for id, I'd say OpenGL would be dead right now and you would not be able to play any 3D games without loading Windows.

        Now, all this can sound unsound, but if you really followed what has happened in the 3D world since 1995, you will notice it's not a crazy idea.

        Microsoft needs the games to run under Windows and Xbox, and NOT to run on any other platform. This is as true as the sky is blue. So the original poster does have a valid point, which I'd mod as Insightful any day.
      • "In keeping with Linux tradition, NVIDIA has open-sourced certain components of the compiler, allowing content developers to add their own customizations as well."

        This is as true as saying:

        "In keeping with MS tradition, NVIDIA has closed-sourced certain components of the compiler, but still allowing content developers to add their own customizations as well."
  • by Space Coyote ( 413320 ) on Saturday June 15, 2002 @01:32PM (#3708342) Homepage
    There's an interesting editorial [theregister.co.uk] at The Register pointing out some of the flaws in Cg, and speculating about NVidia's intentions for future development of the language as it relates to their core graphics hardware business.

    From the article:

    • No break, continue, goto, switch, case, default. These are useful features that can be used without penalty on other vector processors.
    • No pointers. This is Cg's most serious omission. Pointers are necessary for storing scene graphs, so this will quickly become a serious omission for vector processors that can store and process the entire scene or even sections of it.
    • No integers. This may be appropriate to NVIDIA, but is not a universal design decision.
    • Arrays use float indices. This is an odd design decision, relevant to DirectX 8 and Nvidia only.

    It may be that NVidia is holding back support for such rudimentary language features until they are supported in their own hardware. I don't think this is a formula for a widely-adopted language at all, and it smells a little of 3dfx's efforts with Glide.

    • Arrays use float indices. This is an odd design decision, relevant to DirectX 8 and Nvidia only.

      Ummm, what?

      I know absolutely nothing about graphics programming. Not ashamed, it's just not my area of specialty. However, I'm intrigued as to why floats would be used for an array index. If anyone can enlighten, I'd be interested to hear.

      • It doesn't make any sense at all. Is it just a hashtable with float keys? Or are floats just the only numerical type, and they're cast to ints for array lookups?
      • Ummm, what?

        I know absolutely nothing about graphics programming. Not ashamed, it's just not my area of specialty. However, I'm intrigued as to why floats would be used for an array index. If anyone can enlighten, I'd be interested to hear.


        Just guessing, but could it be for automatic interpolation of values between a table's elements? The Register's article mentioned something about this thing being DX8-specific, so it must already be in use somewhere. Still, I can't imagine this being anything but a kludge.
        It's difficult to see nVidia being involved in something kludgy; it must've been some of MS's work.
        • I doubt it's for interpolation. It's easy to average two numbers and pick the midpoint but proper interpolation is a *very* complex subject. To get really good interpolation you need to graph both sides of the data set and extrapolate over the area you're interested in, from both sides. Then where (if) those meet, that's the new value. The farther away you graph from, the smoother the interpolated area.

          This is something it might be nice to have a function to do, but if this was done on every array access it's going to be hella-slow.

          Not to mention, sometimes you store arrays of things you don't want interpolated. You could interpolate between shades in a palette, but perhaps you're holding three colors in an array for three effects. Effect one (blood) is red, effect two (slime) is green, etc... If you use floats as integers, either you round them to get ints, or you have something that's never exact. So in this case you'd have 1.000003, for instance, and it'd interpolate between the red and green, even though you didn't want that behaviour.

          So, because automatic interpolation between array values is hard to do, and hard to do in a way that you'd want, I don't think they're doing it.

          Most likely they're dealing in floats simply because they've got hardware that can deal with floats very quickly, and they trunc or round to get the desired value when using them in an integer context.
      • I would assume the last two are related.

        No integers. This may be appropriate to NVIDIA, but is not a universal design decision.

        Arrays use float indices. This is an odd design decision, relevant to DirectX 8 and Nvidia only.
        Without ints, what else would you use?

    • It's interesting that the top two gripes (no break and no pointers) are totally irrelevant to shaders. We're not talking about generic vector processors here, but specifically vertex and pixel shaders. Current shaders don't do conditionals, and they don't process sections of a scene. They get a few floats as input (vertex coordinates and normals) and produce a few numbers as output.
    • by donglekey ( 124433 ) on Saturday June 15, 2002 @03:14PM (#3708698) Homepage
      Whoever wrote that article may understand 3D, but doesn't understand where shaders fit in, or the history and experience already here. Pixel and vertex shaders have been around since the inception of commercial 3D, in the form of Renderman surface and displacement shaders. They are small, very modular programs which don't need access to a large amount of information at one time. Because of that implied modularity, and the isolation of the calculations relative to the rest of the scene, there is no need for OO (I know you didn't mention it, but someone in the FIRST story about this did, and it's a good question), and no real need for pointers.

      Furthermore, because of the very analog nature of what is being described, control statements and decision shortcuts aren't a very big deal. Of course there are if/else statements, but they are not used as much as simple and very general algorithms. Hard decisions lead to aliasing, because they rule out gradual change. Also, because of the analog nature of what is being reproduced, integers are used very rarely, almost exclusively for loop counters.

      Using float indices for arrays is a kick-ass design decision. It allows for smooth and elegant interpolation between discrete values, and I can't stress enough what a cool idea that is.
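
      To illustrate what that buys you, here is a rough sketch (plain C, since Cg's actual semantics aren't spelled out here) of a float-indexed lookup that blends between adjacent entries. This is a guess for illustration, not anything from NVidia's documentation:

        /* Hypothetical: lookup(table, n, 2.25f) returns table[2] blended
           25% of the way toward table[3]. */
        float lookup(const float *table, int n, float idx)
        {
            if (idx <= 0.0f)           return table[0];      /* clamp low  */
            if (idx >= (float)(n - 1)) return table[n - 1];  /* clamp high */
            int   i = (int)idx;        /* integer part    */
            float f = idx - (float)i;  /* fractional part */
            return table[i] * (1.0f - f) + table[i + 1] * f; /* linear blend */
        }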

      In short, The Register is wrong, and this IS a formula for a widespread language, because it is copying another very mature, widespread language: the Renderman shading language. The only thing I am worried about is that it will be geared toward only Nvidia products, thus competing with OpenGL 2.0 (whenever the vapor settles).

      Keep in mind that I am not trying to argue with you; I am trying to argue against The Register's stance. The designers at Nvidia are very aware of the vast history of Renderman, I am sure, and this language looks just fine.

      For anyone who wants to get into writing shaders, the book 'Advanced Renderman: Creating CGI for Motion Pictures' by Anthony Apodaca and Larry Gritz is your bible. It covers everything you need to know and more, and I highly recommend it.
    • Cg (and the pixel and vertex shaders written in it) has nothing to do with processing scenes, etc.

      All they're for is telling the GPU how to draw an object. For example, let's say you wanted to draw a polygon that was supposed to be part of someone's face. You would write your pixel shader to simulate the way light bounces off human skin, use OpenGL calls to load it into the GPU, then tell OpenGL to draw the polygon using the shader you just loaded.

      The advantage that Cg brings is that instead of having to write the mathematics necessary to realistically simulate the way light bounces off human skin (or some other material) in an assembly-like language, you can do it in a C-like language. This would make writing shaders *MUCH* easier.

      And, it will work on ATI cards that support custom pixel and vertex shaders as well, so it really does benefit everybody, since now those who've got a Geforce3, Geforce4 (but not a Geforce4 MX) or a recent ATI card will see more games that support the ability of these cards to do custom shaders.
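
      For the curious, the "load it into the GPU" step on nVidia hardware goes through the GL_NV_vertex_program extension. A minimal C sketch, assuming the extension is present and its function pointers are resolved (treat the exact tokens and signatures as approximate):

        #include <string.h>
        #include <GL/gl.h>
        #include <GL/glext.h>

        /* id: a program name; src: the shader text (hand-written assembly
           today, Cg compiler output tomorrow). */
        void load_vertex_program(GLuint id, const char *src)
        {
            glBindProgramNV(GL_VERTEX_PROGRAM_NV, id);
            glLoadProgramNV(GL_VERTEX_PROGRAM_NV, id,
                            (GLsizei)strlen(src), (const GLubyte *)src);
            glEnable(GL_VERTEX_PROGRAM_NV);  /* subsequent draws run the program */
        }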
  • All loops can be implemented using if/else/goto. Adding while, for, switch, try, etc. helps promote structured programming for complicated tasks. But I think it's good to promote KISS here. So include goto, because even though it's been vilified it is still a staple; HLLs just try to discourage using it directly. But anything more is just needless bloat.
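
      To see why if/goto suffices, here's a quick illustrative sketch (plain C) of a while loop and its lowered form:

        /* the structured version... */
        int sum_while(const int *a, int n)
        {
            int i = 0, s = 0;
            while (i < n) { s += a[i]; i++; }
            return s;
        }

        /* ...and the same loop using only if and goto */
        int sum_goto(const int *a, int n)
        {
            int i = 0, s = 0;
        top:
            if (i >= n) goto done;
            s += a[i];
            i++;
            goto top;
        done:
            return s;
        }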

      Pointers? Obviously, buffer overruns just aren't doing enough damage in protected memory and we have to start corrupting our framebuffers as well.

      Are there any integer-based 3D cards anyway? If there are, they can write their own integer language.

      But I believe that if NVidia were really interested in making shader programming easier and doing it in an open way, they'd write an extension to GCC that lets it compile to their machine code. NVidia's policy of open sourcing "certain components" while keeping critical portions proprietary reminds me so much of Marie Antoinette's "Let them eat cake."
  • Is this a new "language" or is it a library? What does this do that GL or DirectX cannot? I personally don't see a need for this when we have OpenGL and, to a lesser extent, DirectX (because there isn't a *nix implementation).
  • by levik ( 52444 ) on Saturday June 15, 2002 @01:56PM (#3708405) Homepage
    This looks like a very smart move on NVIDIA's part. Remember GLIDE? Everyone loved it, and it arguably contributed a lot to 3Dfx's success in the late 90s. The only problem with GLIDE was that it simply didn't work on any other cards.

    Which was fine while the market was 90% Voodoo, but once other players got more or less established, the benefit of easily developing stuff with GLIDE was overshadowed by the loss of a chunk of your target audience, and the developers moved away from it.

    Now imagine if GLIDE worked on all the competitors' cards, but just *slightly* worse than on a Voodoo card. Not enough to be unplayable or anything, but worse nonetheless. Then there's a good chance developers would still use GLIDE, and 3Dfx could claim superiority on all those products.

    However "open" Cg will be, NVIDIA will definitely get the edge on any software written with it, if only because they will have had a head start.

    I wonder though if this language is at all similar to GLIDE, which they acquired together with 3Dfx. I also wonder what someone who is very good at low-level graphics programming (like John Carmack) thinks of the idea of this language.

    • ... the benefit of easily developing stuff with GLIDE was overshadowed by the loss of a chunk of your target audience, and the developers moved away from it.

      That's what Direct3D was for from the beginning. But Glide was there BEFORE Direct3D, so 3Dfx didn't owe their success to it. You may also recall that Voodoo cards were several times faster than the closest competitor... for years.

      After that, companies started selling "promises" like the S3 ViRGE cards, the RIVA crap, and the like. They sold a lot. After that, some harm was done, because everyone had crappy cards which only had Direct3D drivers. The cheapo crap market killed 3Dfx. They couldn't get financing; they lost the edge.

      Good thing that Nvidia was MS-funded (my guess), so at least we have their damn fine cards. But I fear promoting Cg is not a move towards better competition. I can only see it hurting other vendors, not enabling faster/better games.
  • by BitHive ( 578094 ) on Saturday June 15, 2002 @02:04PM (#3708428) Homepage
    To all of you who complain your submissions never get accepted - just resubmit stories from a few days ago!

    "Same shit, different day."

  • by s4f ( 523726 )
    Assembly will always be faster than a compiled language. That's just one of those immutable laws. It's the way the universe is put together.
    • Assembly will always be faster than a compiled language.

      Not exactly true. Perfectly written assembly will always be faster than a compiled language. If it were easy enough to come up with perfectly written assembly that most programmers could do it, there wouldn't be any need for compilers at all.

    • um, sorry but you have to compile assembly also...

      org 100h                ; DOS .COM program, loaded at offset 100h
      mov dx,msg              ; DS:DX -> '$'-terminated string
      mov ah,9                ; DOS print-string service
      int 21h
      mov ah,4Ch              ; DOS terminate-program service
      int 21h
      msg db 'You still have to compile me!',0Dh,0Ah,'$'

      ; a .COM file is a flat binary, so assemble it with:
      nasm -f bin compileme.asm -o compileme.com
    • Silicon is cheap. Programmers are not. The point of higher level languages isn't to write faster programs. It's to write programs faster.
  • Many companies, including 3D Studio Max

    3D Studio MAX is a product, not a company. Discreet [discreet.com], a subsidiary of Autodesk [autodesk.com], is the company currently making MAX.
  • by Screaming Lunatic ( 526975 ) on Saturday June 15, 2002 @02:36PM (#3708554) Homepage
    The official community site is at cgshaders.org [cgshaders.org]. There's a Linux toolkit [nvidia.com] out now, and an interview [cgshaders.org] with NVIDIA's David Kirk, along with articles, a shader repository, and forums for help.
  • The shocking part about this story is that Thresh's FiringSquad still exists.
  • Wow! (Score:5, Informative)

    by be-fan ( 61476 ) on Saturday June 15, 2002 @03:06PM (#3708668)
    I'm impressed. This is the second time this has been posted on /., and people are STILL clueless about what Cg is! Incredible!

    Cluestick: Cg is not a general-purpose language like C/C++. It is not an API like OpenGL/DirectX. Instead, it is a simple, high-level replacement for the assembly language traditionally used to program the pixel and vertex shader units on graphics cards. These programs are typically a few dozen instructions long and basically map a small set of inputs to a small set of outputs. So you could write a program to rotate a vertex around a point in Cg, but not something like Quake...
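
    A sketch of what such a program might look like (approximate Cg - the sincos builtin and POSITION semantics follow NVidia's published examples; the parameter names are hypothetical):

      // Rotate a vertex about 'center' in the XY plane by 'angle' radians.
      float4 main(float4 pos : POSITION,
                  uniform float2 center,
                  uniform float  angle) : POSITION
      {
          float s, c;
          sincos(angle, s, c);         // sine and cosine in one builtin
          float2 p = pos.xy - center;  // translate so 'center' is the origin
          pos.xy = center + float2(p.x * c - p.y * s,
                                   p.x * s + p.y * c);  // rotate, translate back
          return pos;
      }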
  • by RoninM ( 105723 ) on Saturday June 15, 2002 @04:13PM (#3708882) Journal
    Cg looked awfully familiar to me (and not just because we had this article before). You might want to compare it to the OpenGL 2.0 shading language defined here (PDF) [3dlabs.com] and implemented here [3dlabs.com].

    All of this leaves me a little bit confused. I'm not sure why we need two (or perhaps more) C-based shader languages, at least one of which (Cg) is hardware-specific but API-neutral.

    • Because Cg is here now and OpenGL 2.0, if the past is any indication, won't even be fully standardized (let alone implemented) for another couple of years.

      I'm fairly sure NVidia recognizes this as a stop-gap measure, since both DX9 and OpenGL 2.0 will have their own high-level shader compilers. But since neither is shipping (outside of the DX9 beta, anyway)... well, this does the job for now.

    • "This zip file contains the source code and project files necessary to build the OpenGL Shading Language compiler as a standalone executable for Windows platforms. It is 3Dlabs intention to help OpenGL move forward by providing source code to the OpenGL Shading Language parser and intermediate-code generator that we have implemented. We are making this code available in the hopes of encouraging rapid development of OpenGL 2.0 implementations as well as common tools and utilities."

      The Cg toolkit is available as a tarball or as an RPM. Hmmm... anybody porting the OpenGL Shading Language to Linux?
  • 3D Studio Max isn't a company... it's a software package developed and distributed by Discreet (www.discreet.com).
  • It's not like I'm an obsessive reader of /. or anything, but even I noticed that this story was posted only two days ago. Really, you guys are meant to be editors - don't you even read /.? I mean, reposting a story that is a few weeks old is fair enough, but two days? Hell, the old story is still listed on the front page!
  • by iggie ( 183722 )
    No, 'g' is for Graphics, 'C' is for C.
  • by theMightyE ( 579317 ) on Sunday June 16, 2002 @01:35AM (#3710172)

    With Microsoft involved we will still get the Blue Screen of Death, only now it'll be anti-aliased, vertex-shaded, and running at 400+ fps.

  • Interesting: this gets posted twice, but my submission noting that not everyone agrees [theregister.co.uk] gets rejected.

    I'm not sure I'll ever fully understand Slashdot.
