NVidia announces Cg: "C" for Graphics 176
mr_sheel writes: "Thresh's FiringSquad has an article about Cg, a new language developed by NVidia in collaboration with Microsoft. 'Up until now,' Brandon Bell writes, 'game developers have had to use complex assembly language to create the lifelike graphics gamers experience in today's latest games.' Cg eases the process of bringing graphics to the screen and compiles for DirectX 8, 9, and OpenGL 1.4. Many companies, including 3D Studio Max, Blizzard, and over 100 game developers, have already jumped onto the Cg bandwagon. Will this replace assembly graphics coding once and for all?"
Isn't this (Score:5, Informative)
Yes and No (Score:5, Funny)
Re:Isn't this (Score:2, Informative)
Re:Isn't this (Score:1)
Re:Isn't this (Score:1, Insightful)
Break it down (Score:1)
deja vu all over again (Score:1, Redundant)
Re:deja vu all over again (Score:1)
Kind of ironic, isn't it?
Re:deja vu all over again (Score:1)
Well, to you and the person who moderated me down, let me explain something that you don't grasp. The original post was at 1:59. I saw it at 2:00. There were no other responses of this nature posted when I looked, and I did. But in the seconds I was typing my short post, previewing it and posting it, several others made similar observations. It happens. Get over it.
Perhaps the fact that you always find it funny indicates that the problem (duplicated original posts) happens often enough that it indeed should be mentioned.
Re:deja vu all over again (Score:1)
Hey Timothy! (Score:5, Funny)
I know some of his stories suck but it's for the good of all of us.
You can keep Katz stuff blocked.
Re:This wouldn't happen... (Score:1)
Re:This wouldn't happen... (Score:2)
You're just making that up, jackass. There is no limit [mysql.com] to how short the text is that you query for.
This limitation has little to do with MySQL (besides general poor performance) and everything to do with SlashCode's poorly designed searching facilities.
Re:This wouldn't happen... (Score:3, Interesting)
No news good news. (Score:2)
PPA, the girl next door
Re:No news good news. (Score:5, Funny)
User Bio:
Active Open Source bi-geek girl who loves boys with ear-rings and enjoys reading Playboy in the train. I play bass in an all girls band and work in the porn movie industry as an amateur actress to pay for my scholarship. Drop me a line sometime...
Next door to whom exactly?
Re:No news good news. (Score:2, Funny)
A bass player? Pshaw, as if.
Re:No news good news. (Score:1)
How about you drop me a line? Pink Fruit Person...
Re:No news good news. (Score:3, Funny)
That's total copyright infringement.
PPA, the girl next door.
Re:No news good news. (Score:2)
-
damn
Oh - and if you're looking for a one night stand, I'm not your guy.
Re:No news good news. (Score:2)
He might not be, but I am.
Another article (Score:3, Informative)
Assembly ain't going anywhere (Score:1)
Re:Assembly ain't going anywhere (Score:1)
Assembly for an 8-bit micro with less than 100 instructions isn't a hard thing to grasp... but more complex CPUs have hundreds and hundreds of instructions. It must be nearly impossible for someone to always know they are using the best instruction or sequence of instructions to perform a given task.
Re:Assembly ain't going anywhere (Score:1)
Abuse! (Score:1)
Isn't it interesting... (Score:2)
Re:Isn't it interesting... (Score:1, Funny)
Thank you, thank you very much.
Re:Isn't it interesting... (Score:2)
The operative word here is wants to hack on. Sure, there are lots of people who want to use it, but how many are there with the time, the ability, and the desire to maintain it?
Cygnus, BTW, is now a division of RedHat.
MS Mistrust (Score:3, Interesting)
I just hope that phrase doesn't mean non-DirectX operating systems (Linux, Mac OS X) are about to get the short end of the graphics stick. I can visualize features not being implemented for OpenGL, or worse, support for OpenGL being discontinued at some strategic point in the future "because our customers strongly prefer DirectX" [says Microsoft].
OK, maybe I'm paranoid. Maybe this is basically nVidia's baby and MS is only involved a little bit. Let's all hope. Can someone reassure me?
Re:MS Mistrust (Score:2)
I wish I could tell you that wouldn't happen, but it seems to me that it is somewhat likely down the road.
or worse getting taken over by Microsoft.
Re:MS Mistrust (Score:2, Funny)
Re:MS Mistrust (Score:2)
And Linux is technically a kernel not an OS, but 95% of the time I hear the word it's referred to as an OS.
So what is your point?
Re:MS Mistrust (Score:1)
I kan read artical (Score:5, Informative)
Re:I kan read artical (Score:2)
Re:MS Mistrust (Score:1)
Re:MS Mistrust (Score:3, Interesting)
They were virtually killed by 3Dfx many times, yet someone kept pumping money in from "somewhere" for them. The only change in focus after those failures was this: target Direct3D.
They kept pumping money in and selling at a loss until they bought 3Dfx. Most other players (Rendition and many others) were already dead by then. Now they control the market with Microsoft by their side. Microsoft was always against 3Dfx; their Direct3D was never "friendly" with 3Dfx cards (though it worked fine, features were really targeted at Nvidia cards).
It's like Nvidia and Microsoft developed Direct3D together. It's not a coincidence. Microsoft would never tolerate a monopolistic provider in a key market unless they own/control it somehow (and no, you don't need Microsoft putting money in directly to own it. There are a thousand ways to own something in an unnoticed fashion).
Remark: I know Nvidia cards are the best and excellent ones! This has nothing to do with it.
Why? (Score:2, Insightful)
Re:Why? (Score:1)
Re:Why? (Score:1)
Re:Why? (Score:1)
check out this message. [slashdot.org]
Good stuff! (Score:2)
The technology seems like a necessary step for the industry. I do graphics programming, although I'm not elite enough to do it in raw assembly... I have used OpenGL a few times. From the article it does sound like it will be much easier to develop visualization code that is more standard - this would have made working on my thesis much easier! Oh well, better late than never I guess...
Also, the article focused heavily on examples of characters from game environments; it would be nice to see graphics examples of other types of applications besides gaming. For example, scientific visualization and similar areas - that is what I'm more interested in than game development.
However, I am concerned that NVidia won't stay compliant with standards across video cards and things may get ugly in the future. To pull a Microsoft, as it were. I hope not, because it sounds like an interesting product that could quickly become dominant if they continue to do things right.
Re:Good stuff! (Score:2, Insightful)
This has nothing to do with raw x86 assembly! This isn't doing graphics programming in raw assembly!
There is an assembly-like language for writing vertex and pixel shaders on nVidia and ATI GPUs. Your entire program could be nothing but high-level calls to DirectX 8 or OpenGL 1.4 in C++, but if you want to take advantage of the ability of the Geforce3, Geforce4 (but not Geforce4 MX), and recent ATI cards to do custom pixel and vertex shaders, you have to write said pixel and vertex shaders in a small assembly-like language, and use DirectX or OpenGL calls to load them into the GPU. These shaders get used by the card's GPU, and have absolutely nothing to do with the CPU.
The idea here is that instead of writing your pixel or vertex shaders in something similar to assembly (again, it *ISN'T* assembly), you can write them in something similar to C, and the Cg compiler will compile your shaders into machine code for the GPU (Graphics Processing Unit; it's on the GeForce3, GeForce4 (but not GeForce4 MX), and recent ATI cards), not the CPU!
Writing shaders in this assembly-like language for the GPU is nothing like trying to do graphics in x86 assembly back in the day, ok?
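For a feel of what that GPU language looks like: a typical DirectX 8 vertex program is only a handful of instructions, kept as a string on the host side. Here's a rough, untested sketch of the classic "transform the vertex and pass the colour through" program (register conventions depend on your vertex declaration, so treat it as illustrative only):

/* Illustrative only: roughly what a DirectX 8 vertex shader in the
   assembly-like language looks like, held as a string in the host C code.
   Register names follow the usual vs.1.1 conventions. */
const char *vs_source =
    "vs.1.1\n"
    "dp4 oPos.x, v0, c0\n"   /* transform the position by the matrix   */
    "dp4 oPos.y, v0, c1\n"   /* stored in constant registers c0..c3    */
    "dp4 oPos.z, v0, c2\n"
    "dp4 oPos.w, v0, c3\n"
    "mov oD0, v5\n";         /* pass the vertex colour straight through */

Cg's whole pitch is that you write the same thing in a C-like dialect and let the compiler produce the above.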
Ok, it's not real assembly (Score:2)
Too much marketing-speak in the article I guess. Or I'm just dumb.
See what some game developers think (Score:4, Insightful)
-m
Re:that should be "amateur game developers" (Score:1, Funny)
What Are You Talking About? (Score:3, Insightful)
What on earth are you talking about? This will be a requirement... ALWAYS. You know why? Because nVidia is only one company. Not everyone conforms to their "standards". This is good especially because they are in bed with Microsoft. They are a conduit for MS to control the graphics market as nVidia are up for sale to the highest bidder.
So, what do you do when you want to strengthen control on a market? You introduce a new language that makes it easier for developers to take advantage of one piece of hardware on your platform!
No, this will NOT replace ASM for driver/graphics engine development. It should be disregarded because there's nothing standard about it. Hopefully, the leaders in the industry will always support chipsets other than nVidia's.
Re:What Are You Talking About? (Score:2)
Re:What Are You Talking About? (Score:2)
Do you think this Cg will get the most out of ATI cards? Do you think ATI likes it, or that it's a good thing for them? Why did Nvidia develop this thing with Microsoft's help? Why wasn't ATI there? Why weren't Matrox and 3Dlabs invited?
It may be compatible with ATI cards as in beer, not as in freedom. And that beer will probably taste like you don't want a second round.
This can only hurt other 3D card makers and promote Nvidia. Microsoft is making sure they can control it at the software level rather than the hardware level. It's the only way they have to lock you in and control the market. Just like they control OEM bundling, and everything else in the PC industry.
Don't complain in the future unless you have some vision of what the consequences of some "innovations" are.
Re:What Are You Talking About? (Score:2)
Proprietary standards are not a bad thing when there are no open standards. In fact, many standards bodies look to current practice for examples of standards. If everyone waited for standards, there would be fewer (or worse) standards. And for a proprietary standard, this one seems pretty open.
I understand that Nvidia has been heavily involved in the creation of the suspiciously-similar OpenGL 2.0. Their release of Cg will give them useful information for refining the standard. It's also a solid pragmatic decision-- since many people are using Direct3D, they can either laugh and point at them or release something similar to OpenGL 2.0 that works with Direct3D. It also gets the tools to people now, rather than when OGL 2 is done.
The release of Cg could have the benefits I describe, or it could be as bad as you suggest. At this time, I don't think there's enough reason to think either one of them's true.
Re:What Are You Talking About? (Score:4, Insightful)
Do you even read articles? How did you get modded insightful? Do moderators read articles?
Re:What Are You Talking About? (Score:4, Interesting)
Yes, I think they do read them. There's nothing in the article that contradicts what he says. Actually, Nvidia IS IN BED with Microsoft. 3Dfx was not, and got killed. 3DLabs was not, and suffered.
If it weren't for id, I'd say OpenGL would be dead right now and you would not be able to play any 3D games without loading Windows.
Now, all this can sound unsound, but if you really followed what has happened in the 3D world since 1995 you will notice it's not a crazy idea.
Microsoft needs the games to run under Windows and Xbox, and to NOT run on any other platform. This is as true as the sky is blue. So the original poster does have a valid point, which I'd mod as Insightful any day.
Re:What Are They Talking About? (Score:2)
This is as true as saying:
"In keeping with MS tradition, NVIDIA has closed-sourced certain components of the compiler, but still allowing content developers to add their own customizations as well."
Re:What Are You Talking About? (Score:2)
The problem is not with making "writing [efficient] vertex and pixel shaders easier and more portable across graphics cards"; the problem is with Microsoft and NVidia controlling the "writing [efficient] vertex and pixel shaders easier and more portable across graphics cards" for the good and benefit of all society.
It doesn't matter how many times you read the article... this is just obvious. Unless it's a standard with real representation, it's bad for you and me (unless we own certain stock).
Why it won't work (link) (Score:4, Interesting)
From the article:
It may be possible that NVidia is holding back support for such rudimentary language features until such time as they are supported in their own hardware. I don't think this is a formula for a widely-adopted language at all, and smells a little of 3dfx's efforts with Glide.
Float for array indices? (Score:2)
Arrays use float indices. This is an odd design decision, relevant to DirectX 8 and Nvidia only.
Ummm, what?
I know absolutely nothing about graphics programming. Not ashamed, it's just not my area of specialty. However, I'm intrigued as to why floats would be used for an array index. If anyone can enlighten, I'd be interested to hear.
Re:Float for array indices? (Score:2)
Re:Float for array indices? (Score:2)
I know absolutely nothing about graphics programming. Not ashamed, it's just not my area of specialty. However, I'm intrigued as to why floats would be used for an array index. If anyone can enlighten, I'd be interested to hear.
Just guessing, but could it be for automatic interpolation of values between a table's elements? The Register's article mentioned something about this thing being DX8 specific, so it must already be in use somewhere. Still, I can't imagine this being anything but something of a kludge.
It's difficult to see nVidia being involved in something kludgy; it must've been some of MS's work.
Re:Float for array indices? (Score:3, Interesting)
This is something it might be nice to have a function for, but if it were done on every array access it would be hella-slow.
Not to mention, sometimes you store arrays of things you don't want interpolated. You could interpolate between shades in a palette, but perhaps you're holding three colors in an array for three effects. Effect one (blood) is red, effect two (slime) is green, etc... If you use floats for integers, either you round them to get ints, or you have something that's never exact. So in this case you'd have 1.000003, for instance, and it'd interpolate between the red and green, even though you didn't want that behaviour.
So, for the reason that automatic interpolation between array values is hard to do, and hard to do in a way that you'd want, I don't think they're doing it.
Most likely they're dealing in floats simply because they've got hardware that can deal with floats very quickly, and they truncate or round to get the desired value when using them in an integer context.
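If the hardware really did interpolate, the behaviour being guessed at above would amount to something like this plain C sketch (purely illustrative; nothing here is taken from the Cg spec):

#include <math.h>

/* Hypothetical interpolating lookup: a float index blends the two
   nearest table entries. Just a sketch of the semantics being discussed,
   not anything from Cg's documentation. */
float lookup(const float *table, int n, float index)
{
    if (index <= 0.0f)
        return table[0];
    if (index >= (float)(n - 1))
        return table[n - 1];
    int lo = (int)floorf(index);      /* lower neighbour   */
    float t = index - (float)lo;      /* fractional weight */
    return table[lo] * (1.0f - t) + table[lo + 1] * t;
}

Truncating instead, as suggested above, would just mean returning table[(int)index].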
Re:Float for array indices? (Score:2, Insightful)
No integers. This may be appropriate to NVIDIA, but is not a universal design decision.
Arrays use float indices. This is an odd design decision, relevant to DirectX 8 and Nvidia only.
Without ints what else would you use?
Re:Why it won't work (link) (Score:2)
Re:Why it won't work (link) (Score:5, Insightful)
Furthermore, because of the very analog nature of what is being described, control statements and decision shortcuts aren't a very big deal. Of course there are if/else statements, but they are not used as much as simple and very general algorithms. Hard decisions lead to aliasing, because they rule out a gradual change. Also, because of the analog nature of what is being reproduced, integers are used very rarely, almost exclusively for loop counters.
Using float indices for arrays is a kick-ass design decision. It allows for smooth and elegant interpolation between discrete values, and I can't stress enough what a cool idea that is.
In short, The Register is wrong, and this IS a formula for a widespread language, because it is copying another very mature, widespread language: the Renderman shading language. The only thing I am worried about is that it will be geared towards only Nvidia products, thus competing with OpenGL 2.0 (whenever the vapor settles).
Keep in mind that I am not trying to argue with you; I am trying to argue against The Register's stance. The designers at Nvidia are surely very aware of the long history of Renderman, and this language looks just fine.
For anyone who wants to get into writing shaders, the book 'Advanced Renderman: Creating CGI for Motion Pictures' by Anthony Apodaca and Larry Gritz is your bible. It covers everything you need to know and more, and I highly recommend it.
Re:Why it won't work (link) (Score:1)
All they're for is to tell the GPU how to draw an object. For example, let's say you wanted to draw a polygon that was supposed to be part of someone's face. You would write your pixel shader to simulate the way light bounces off human skin, use OpenGL calls to load it into the GPU, then tell OpenGL to draw the polygon using the shader you just loaded.
The advantage that Cg brings is that instead of having to write the mathematics necessary to realistically simulate the way light bounces off human skin (or some other material) in an assembly-like language, you can do it in a C-like language. This would make writing shaders *MUCH* easier.
And, it will work on ATI cards that support custom pixel and vertex shaders as well, so it really does benefit everybody, since now those who've got a Geforce3, Geforce4 (but not a Geforce4 MX) or a recent ATI card will see more games that support the ability of these cards to do custom shaders.
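For the curious, the host-side "load it into the GPU" step looks roughly like this in C, assuming the ARB_fragment_program extension (a sketch, not code from the article; on most platforms the gl*ARB entry points have to be fetched with wglGetProcAddress/glXGetProcAddress before use):

#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>   /* ARB_fragment_program tokens */

/* Sketch: hand a hand-written fragment (pixel) program to the driver.
   Assumes the ARB_fragment_program extension is present and its entry
   points have already been resolved. */
void load_pixel_shader(const char *src)
{
    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(src), src);
    glEnable(GL_FRAGMENT_PROGRAM_ARB);   /* later draws use the program */
}

Whether you wrote src by hand or had Cg generate it, the loading step is the same.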
Re:Why it won't work (link) (Score:2)
Pointers? Obviously, buffer overruns just aren't doing enough damage in protected memory and we have to start corrupting our framebuffers as well.
Are there any integer-based 3D cards anyway? If there are, they can write their own integer language.
But I believe that if NVidia were really interested in making shader programming easier and doing it in an open way, they'd write an extension to GCC that lets it compile to their machine code. NVidia's policy of open sourcing "certain components" while keeping critical portions proprietary reminds me so much of Marie Antoinette's "Let them eat cake."
Language or Library? (Score:1)
I wonder what John Carmack thinks of this (Score:3, Interesting)
Which was fine while the market was 90% Voodoo, but once other players got more or less established, the benefit of easily developing stuff with GLIDE was overshadowed by the loss of a chunk of your target audience, and the developers moved away from it.
Now imagine if GLIDE worked on all the competitors' cards, but worked just *slightly* worse than on a Voodoo card. Not enough to be unplayable or anything, but worse nonetheless. Then there's a good chance developers would still use GLIDE, and 3Dfx could claim superiority on all those products.
However "open" Cg will be, NVIDIA will definitely get the edge on any software written with it, if only because they will have had a head start.
I wonder though if this language is at all similar to GLIDE, which they acquired together with 3Dfx. I also wonder what someone who is very good at low-level graphics programming (like John Carmack) thinks of the idea of this language.
Re:I wonder what John Carmack thinks of this (Score:2)
That's what Direct3D was for from the beginning. But Glide was there BEFORE Direct3D, so 3Dfx was not responsible for their success. Be sure to also note that Voodoo cards were several times faster than the closest competitor... for years.
After that, companies started selling "promises" like the S3 ViRGE cards, the RIVA crap and the like. They sold a lot. After that, some harm was done, because everyone had crappy cards which only had Direct3D drivers. The cheapo crap market killed 3Dfx. They couldn't get financing, and they lost the edge.
Good thing that Nvidia was MS funded (my guess), so at least we have their damn fine cards. But I fear promoting Cg is not a move towards better competition. I can only see it hurting other vendors, not enabling faster/better games.
Let this be a lesson (Score:3, Funny)
"Same shit, different day."
C'mon (Score:1)
Re:C'mon (Score:1)
Assembly will always be faster than a compiled language.
Not exactly true. Perfectly written assembly will always be faster than a compiled language. If it were easy enough to come up with perfectly written assembly that most programmers could do it, there wouldn't be any need for compilers at all.
Re:C'mon (Score:1)
org 100h
mov dx,msg
mov ah,9
int 21h
mov ah,4Ch
int 21h
msg db 'You still have to compile me!',0Dh,0Ah,'$'
nasm -f bin compileme.asm -o compileme.com
Re:C'mon (Score:2)
nit picking (Score:1)
3D Studio MAX is a product, not a company. Discreet [discreet.com] is the name of the company currently making MAX, a subsidiary of Autodesk [autodesk.com]
cgshaders.org and Linux Toolkit (Score:5, Interesting)
Good Gravy (Score:2)
Wow! (Score:5, Informative)
Cluestick: Cg is not a language like C/C++. It is not an API like OpenGL/DirectX. Instead, it is a simple, high-level replacement for the assembly language traditionally used to program pixel and vertex shader units on graphics cards. These programs are typically a few dozen instructions long and basically map a small set of inputs to a small set of outputs. So you could write a program to rotate a vertex around a point in Cg, but not something like Quake...
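To put "a few dozen instructions mapping a small set of inputs to a small set of outputs" in concrete terms, the rotate-a-vertex example boils down to nothing more than this (written as ordinary CPU-side C for illustration, not actual Cg):

#include <math.h>

/* The scale of a shader-style program: rotate one 2-D vertex around a
   pivot point. A handful of inputs in, two outputs out, no loops over
   the scene. Plain C used for illustration, not Cg. */
void rotate_about(float cx, float cy, float angle,
                  float x, float y, float *ox, float *oy)
{
    float s = sinf(angle), c = cosf(angle);
    float dx = x - cx, dy = y - cy;   /* move the pivot to the origin */
    *ox = cx + dx * c - dy * s;       /* standard 2-D rotation        */
    *oy = cy + dx * s + dy * c;
}

In Cg the same kind of mapping runs per-vertex on the GPU; the point is that shader programs are tiny, self-contained functions, not whole applications.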
OpenGL 2.0 Shader Language (Score:5, Interesting)
All of this leaves me a little bit confused. I'm not sure why we need two (or, perhaps, more) C-based shader languages, at least one of which (Cg) is hardware-specific, but API neutral.
Re:OpenGL 2.0 Shader Language (Score:1)
I'm fairly sure NVidia recognizes this as a stop-gap measure, since both DX9 and OpenGL 2.0 will have their own high-level shader compilers... But since neither is shipping (outside of the DX9 beta, anyway)... Well, this does the job for now.
Re:OpenGL 2.0 Shader Language (Score:2)
The Cg toolkit is available as a tarball or as an RPM. Hmmm... Anybody porting the OpenGL 2.0 shader language to Linux?
dont you mean discreet? (Score:1)
Come ON Guys! (Score:2)
Hmmm (Score:1)
BSOD refresh rates (Score:4, Funny)
With Microsoft involved we will still get the Blue Screen of Death, only now it'll be anti-aliased, vertex shaded, and happen at 400+ fps.
Re:BSOD refresh rates (Score:2)
Not everyone agrees with Cg (Score:2)
I'm not sure I'll ever fully understand Slashdot.
Re:Complex assemly language? (Score:2)
Because it's about programming the GPU not the CPU... Pixel shaders, vertex shaders, etc. Until now this was always done in assembler.
Milalwi
Re:Complex assemly language? (Score:2)
Please grab a clue: high-performance 3D graphics aren't done in assembler. They're done in C with OpenGL calls. The OGL calls are quite high level, nothing as simple as 'put a point at x,y on the screen'. Why do I know? Because I know OpenGL, perhaps?
I was going to mod people here, but I decided to get out the clue stick instead.
Re:Complex assemly language? (Score:2, Informative)
Re:Complex assemly language? (Score:1, Informative)
Why don't you give yourself a good whack.
Pixel shaders are an extension to OpenGL and DirectX that allow you to load a small GPU assembly routine into the hardware. The point of Cg is to write pixel shaders in a C-like language that gets translated into the OGL/DX GPU assembly routines. This lets the programmers focus on the higher-level details, rather than worrying about which register can be used when.
Re:Complex assemly language? (Score:2)
Re:Complex assemly language? (Score:2, Informative)
Where?
high-performance 3D graphics aren't done in assembler. They're done in C with OpenGL calls.
Or DirectX, with calls to handle an assembly language shader. I don't know how OpenGL does it, but under Direct3D, it is quite possible to produce some GPU assembly language programs in ASCII, and get the API to assemble them.
The OGL calls are quite high level, nothing as simple as 'put a point at x,y on the screen'
No, you need to go for this level of complexity:
glBegin(GL_POINTS);
glVertex2f(x, y);
glEnd();
Re:Complex assemly language? (Score:3, Interesting)
Perhaps you should look at this [cgchannel.com] before you comment further?
"Writing code for existing Pixel and Vertex Shaders is akin to writing assembly code. Eventually it'll work but it's a laborious, low-level exercise with almost no comprehensibility if someone else works on the code."
Maybe you "know" OpenGL, but have you ever written a pixel or vertex shader?
Milalwi
Re:Microsoft? (Score:1)
Re:Microsoft? (Score:1)
It's not a disincentive to non-MS-based game developers.
Re:Microsoft? (Score:1, Redundant)
It's not a disincentive to non-MS-based game developers.
Why not? It looks like a Windows-only technology to me. After all, it uses DirectX. True, it also claims to use OpenGL, but while OpenGL is cross-platform, the compiler in the article is not.
look a little deeper (Score:1)
Re:Inefficiencies (Score:2)
Why just offload rendering to custom hardware? Offload physics and AI too!
Re:Inefficiencies (Score:2)
That is precisely my problem with Cg. It's very limited, aimed specifically at their hardware. Yet graphics hardware is becoming much more generically programmable than this (it already was; the PS2 had a more generically programmable pipeline a whole year before Nvidia hit the market with programmable shaders).
All Cg is is the easy bits of C (easy to write a compiler for), with a bunch of stuff missing and intrinsic functions bolted on to do the vector/shader bits. It's just as easily done with a full C/C++ compiler, and already has been [codeplay.com] by my company.
Surely Cg will be seen for what it is: a short-sighted proprietary system for supporting their hardware, from a company with no experience in producing cross-platform, high-performance vectorizing compilers for game developers.
The game companies that have come out in support of this are all PC/Xbox based, obviously. There, Cg is basically an easier-to-use authoring tool for the dominant hardware's shader features.
A generic language for computer graphics this is not!
Re:Hello! Anyone have eyes!? (Score:1)