OpenGL 1.3 Spec Released
JigSaw writes "The OpenGL Architecture Review Board announced the new OpenGL 1.3 specification (1.8 MB PDF). In OpenGL 1.3, several additional features and functions have been ratified and brought into the API's core functionality. New features include cube map texturing, multisampling, new texture modes that provide more powerful ways of applying textures to rendered objects, a compressed texture framework, etc. Let's all hope that GL can catch up with Direct3D now; with the latest DirectX 8, Direct3D has taken some big steps in feature set, speed, and even reduced bloat when it comes to coding for it, while OpenGL 1.2 was released more than two years ago and did not offer as much."
As the technology evolves ... (Score:1)
Contrast this with D3D in DX8. If the hardware doesn't support acceleration of a feature, will it do it in software? If it does, would users want it? Is there a way to specify that if a feature is implemented only in software, it should not be used at all?
Re:As the technology evolves ... (Score:2)
It depends; yes, you can tell it not to do it if you want, programmatically. But ultimately, this is why games have render engine feature options screens. So you can turn off stuff your system can't handle, or handles badly.
Of course, you have to design for it - and this is NOT a problem that is DirectX-specific - the same issues also apply to OpenGL.
Simon
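For the curious, a minimal sketch of what that programmatic check looks like with the DirectX 8 caps mechanism - cube maps are just an arbitrary example feature here:

```cpp
// Hypothetical sketch: query the HAL (hardware) device caps so a feature
// that would only be emulated in software can simply be left disabled.
#include <d3d8.h>

bool CubeMapsAcceleratedInHardware()
{
    IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
    if (!d3d)
        return false;

    D3DCAPS8 caps;
    bool supported = SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                                  D3DDEVTYPE_HAL, &caps))
                  && (caps.TextureCaps & D3DPTEXTURECAPS_CUBEMAP) != 0;

    d3d->Release();
    return supported;   // false -> hide/disable the option in the settings screen
}
```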
does it do shaders? (Score:1, Insightful)
Plagiarism alert! (Score:1)
John Carmack? (Score:1)
Catch up to what? (Score:5, Insightful)
Comparing DX, or rather D3D, to GL is like comparing UNIX to Windows. You can either allow modular extensions or rewrite the API every release, thus breaking backwards compatibility for no reason. GL extensions from ATI and NVIDIA are much easier to use for development than D3D, IMHO.
Only moogles may disagree. We still love you dan! =)
Re:Catch up to what? (Score:2)
Re:Catch up to what? (Score:1)
I work on OGL clones of DX games as a hobby. Once I have my OGL-based engine under the game, I don't have to worry about a newer API version breaking my binaries, much less my source.
I for one still play old DOS games, and I enjoy having a 'long shelf life' in the titles I buy. I no longer buy DX-based games even if I really want them. If Morrowind doesn't move off DX (an Xbox/"PC" game) I will never buy it. I waited years for Morrowind too...
Re:Catch up to what? (Score:2)
Re:Catch up to what? - DX8 does break DX7 games (Score:1)
Frankly, I don't need Windows or DirectX 8 to live a rich and fulfilling life. In fact, I find I'm much happier without them.
Re:Catch up to what? (Score:1)
Apple (Score:3, Insightful)
Re:Apple (Score:1)
I'm can't wait (Score:5, Funny)
Re:I'm can't wait (Score:1, Offtopic)
Blowing karma for originality.
Re:I'm can't wait (Score:2, Interesting)
The funny bit... (Score:2, Insightful)
Gamer 1 " Good god this quake 3 is SUCH 24 bit color, how could they stand it?"
Gamer 2 "Totally!"
Re:The funny bit... (Score:2, Insightful)
Re:The funny bit... (Score:3, Interesting)
When it comes to 3D, you and everybody else can; that's because 24-bit color only has 8 bits of integer precision per color channel. With today's games, every pixel on the screen is rendered from many textures, lightmaps, bumpmaps, etc. This introduces errors when there are only 8 bits of precision.
John Carmack (id Software) has stated that more precision is needed on future GFX cards.
Consider this:
In floating-point math:
120/50*30/2 = 36
In integer math, that answer would be 30 (calculating from left to right).
From what I understand, the cards nowadays use integer math.
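A tiny compilable version of that arithmetic, if you want to see the rounding loss directly:

```cpp
#include <cstdio>

int main()
{
    float f = 120.0f / 50.0f * 30.0f / 2.0f;  // 2.4 * 30 / 2 = 36
    int   i = 120 / 50 * 30 / 2;              // 2 * 30 / 2 = 30; the fraction is lost at each step
    std::printf("float: %g  int: %d\n", f, i);
    return 0;
}
```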
Re:The funny bit... (Score:1)
bits aren't actually used for anything other than padding. At least when just running a desktop.
Re:The funny bit... (Score:2, Informative)
Alpha is one way to do the smoke/fog effects.
Alpha is the transparency of a material/texture.
Re:The funny bit... (Score:1)
I heard somewhere that Photoshop can use a 32-bit display to accelerate redraws of multilayer images (letting the hardware sort out the transparency). It would be possible using, say, OpenGL, but I'm not sure that Windows will do this, even in 32-bit mode. I really don't have any clue about whether or not the Mac would.
Re:The funny bit... (Score:1)
Re:The funny bit... (Score:2)
The Cineon format sometimes used in digital film work (compositing, etc.) uses three 10-bit channels on a logarithmic scale. It depends what your ultimate display medium will be.
No one can. (Score:3, Informative)
Re:No one can. (Score:2)
Tell you what. Take about 5 or 6 images of the same size in 32-bit RGBA; keep them fairly simple. Now mathematically blend them together in various ways, such as multiplying, adding, etc. Now look at the result. This causes an effect known as "banding" pretty easily.
Take those same graphics, convert each channel to a float between 0 and 1, and do your blends like that, clamping at 1.0 and 0.0 if necessary. When you convert back to 32-bit color, the image will probably not show the artifacts found in the simply blended version. You can achieve the same results by using a higher-precision framebuffer in a 3D card, be it 64- or 128-bit, as some people are suggesting.
While the human eye is only capable of seeing about 10 million colors (I think that's the right number?), 16.7 million plus an alpha channel isn't enough when you do too many blends, simply because each blend lowers your precision.
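A small sketch of the same idea on a single channel: darkening and re-brightening in 8-bit integers collapses neighbouring values into one result (banding), while doing the chain in floats and converting back once at the end does not:

```cpp
#include <cstdio>

int main()
{
    for (int start = 10; start <= 12; ++start) {
        unsigned char c8 = (unsigned char)start;   // 8-bit channel
        float         cf = start / 255.0f;         // float channel in [0, 1]

        for (int i = 0; i < 3; ++i) { c8 /= 2; cf *= 0.5f; }  // three darkening blends
        for (int i = 0; i < 3; ++i) { c8 *= 2; cf *= 2.0f; }  // three brightening blends

        // Inputs 10, 11, and 12 all come out as 8 in the 8-bit chain,
        // but survive intact when only the final float is quantized.
        std::printf("in %2d -> 8-bit %2d, float %2d\n",
                    start, (int)c8, (int)(cf * 255.0f + 0.5f));
    }
    return 0;
}
```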
Exactly (Score:2)
Re:No one can. (Score:1)
Well, and even if 32-bit used more bits for each color, 24-bit already stores more colors than we mere humans are (typically) able to see.
Re:The funny bit... (Score:2, Funny)
Re:The funny bit... (Score:1)
Anyway, if we can't tell the difference between 8-bit color and higher color depths, then why do so many professional video rigs record and play back in 10-bit color, and why was Shrek recorded to film in 16-bit color?
The answer is that when an image emphasizes one color over another, banding can occur. Also, it isn't hard to find instances where detail was compromised by contrast in digital images. The sections about digital cameras on www.photo.net discuss this issue somewhat.
Re:The funny bit... (Score:2, Funny)
Re:The funny bit... (Score:1)
Your hardware probably uses 32bpp during calculations because it's faster that way.
But if your graphics card isn't very exotic, it only uses 24bpp to actually display a colour, and eight bits are ignored.
Re:The funny bit... (Score:2)
Hmm. (Score:2, Redundant)
OpenGL was, is, and will be here. (Score:1, Interesting)
Then, there's this very nice company called Epic Games. It created Unreal and Unreal Tournament (while trying to push Glide) and is now doing Unreal Warfare. These guys provide nice competition to id Software and YES, they use Direct3D. Now take a modern computer with an NVIDIA card (chances are you already have one anyway) and play some Quake 2 and Quake 3... See the framerates? OK... Now start up Unreal/UT, select D3D as the renderer, and... do I really have to tell you how low your FPS will go?
Start up Half-Life, the most popular online 3D FPS game at the moment (due to CS), and try switching back and forth between the OpenGL and D3D renderers and compare the framerates. I know some of you are going to scream that HL is based on the Quake engine, etc., but just to let you know, only 20% of the HL engine code comes from Quake.
Re:OpenGL was, is, and will be here. (Score:2, Informative)
The fact is, the renderer in UT produces much better-looking results than Quake's, and is designed for larger maps too. It also handles mirrors, etc. much better. It even has procedural texturing built in. So this isn't a valid comparison; UT runs slower because it does MORE. (And looks better for it.)
Start up Half-Life, the most popular online 3D FPS game at the moment (due to CS), and try switching back and forth between the OpenGL and D3D renderers and compare the framerates. I know some of you are going to scream that HL is based on the Quake engine, etc., but just to let you know, only 20% of the HL engine code comes from Quake.
Clue: most of that 20% is the RENDERING CODE, which is still largely OpenGL-based. They have a wrapper layer between OpenGL and DirectX for the DirectX output. That's where the slowdown comes in. (For example, surfaces don't have to be decomposed into triangles in OpenGL; in DirectX they have to be... and in Half-Life, none of the surfaces are decomposed into triangles by preprocessing the data - which is why it's slower; OpenGL drivers are optimized for this kind of work... but they're doing the conversion themselves.)
Simon
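To illustrate the decomposition point with a rough sketch (not actual Half-Life code): classic OpenGL will take an n-sided surface as-is, while a GL-to-D3D wrapper has to triangulate it before submission:

```cpp
#include <GL/gl.h>

// OpenGL path: hand the convex n-gon straight to the driver.
void DrawSurfaceGL(const float (*verts)[3], int count)
{
    glBegin(GL_POLYGON);
    for (int i = 0; i < count; ++i)
        glVertex3fv(verts[i]);
    glEnd();
}

// Roughly what a wrapper must do for a triangle-list API instead:
// emit a fan (v0,v1,v2), (v0,v2,v3), ... and submit those triangles.
int TriangulateFan(const float (*verts)[3], int count, float (*out)[3])
{
    int n = 0;
    for (int i = 1; i + 1 < count; ++i) {
        for (int k = 0; k < 3; ++k) out[n + 0][k] = verts[0][k];
        for (int k = 0; k < 3; ++k) out[n + 1][k] = verts[i][k];
        for (int k = 0; k < 3; ++k) out[n + 2][k] = verts[i + 1][k];
        n += 3;
    }
    return n;   // 3 * (count - 2) triangle vertices
}
```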
Re:OpenGL was, is, and will be here. (Score:2)
I really wish that people like you would stop talking about things you know nothing about. Unreal Tournament is limited by the CPU, not the graphics card, because it uses a slow visibility-determination scheme that favors its software renderer. Remember that UT is based on Unreal, which was in development long before 3D hardware came about.
It's NOT slow because of D3D, as the next version of the engine (where this issue is fixed) will prove.
Even Carmack will tell you that.
Re:OpenGL was, is, and will be here. (Score:1)
BS. Every 3D game out there nowadays is limited by both. Lots of RAM will help you run UT better, and so will a graphics card; the CPU comes third. When it comes to many other games, both the CPU and the graphics card are equally important. I have yet to see a modern game that actually relies more on the CPU than on the video card, though this doesn't mean the CPU can't be a bottleneck.
Implementation (Score:4, Insightful)
As to this issue of Direct3D having a bigger feature set and so on, this is only a worthy argument if we are talking MS Windows. Outside of the Windows platform, Direct3D means nothing, since it isn't available there. OpenGL is currently the only cross-platform solution worth mentioning (please correct me if there is another). IMHO, the SDL game API made the right move in using OGL for its 3D graphics, since the last thing we need is yet another graphics API that is just barely supported. Maybe one thing that will help OGL, especially in games, is if more noise were made about it.
Re:Implementation (Score:1)
They already do [nvidia.com].
Cross-platform and "implementations that actua..." (Score:2)
Now that the Win9X line is scheduled to wither, and the WinNT line is to be the One True Solution with the release of WinXP, the term "cross platform" becomes irrelevant, just like "Office" without the "MS " prefix. Of course at some point in the future, "cross-platform" may need to be resurrected, with the release of the X-Box or WinCE-9.3.
On a slightly more serious vein...
The issue of "implementations that actually take advantage" crossed with OpenGL extensions that may differ from vendor to vendor is a bit of a red herring. After all, not all cards are going to have these new whiz-bang features. Someone enlighten me if Direct3D still has the accursed "capability bits" that are under-architected for telling true capabilities, and may leave a game falling back on software rendering without warning - unless the game is 'written to the cards' in the first place.
IMHO, the newest games will be written to a reasonable common denominator, then with a few extra modes, first to a "better common denominator", and finally to the full feature set of a few of the newest cards. This isn't a "write once, run on any Direct3D with the BEST eye-candy" situation by any means.
I hate pdf! (Score:1)
Use HTML or something you ppl!
Slogan: Stop nVidia from blocking 3dfx completely!
A little bit offtopic but... (Score:4, Informative)
But the "evil" API Direct3D is already (mostly) available for Linux. Haven't you heard of trasgaming (http://www.transgaming.com/ [transgaming.com])?
They are currenly working on D3D port to WINE [wineqh.com].
(If you don't know, their license is not fully "free", but they will make it "free" when they get enough "support".)
[ By the way, I don't think opengl will die anytime soon. Because "serious" graphics work is not only "games". have you used SGI? they do not support D3D or whatsoever ]
Prof. graphics programs are moving to d3d (Score:2)
A lot of people without a clue will scream and cry that OpenGL is faster, easier, and can do more, but frankly, if that were the case, more people would use OpenGL in the games they write. OpenGL is a nice API and I use it a lot; my library DemoGL is based on it, and if OpenGL dies it will break my heart. But when I think realistically, OpenGL is practically dead on Windows: the ICD connector DLL (opengl32.dll) isn't updated by Microsoft, the documentation SUCKS compared to D3D, and, for example, ATI's OpenGL driver is horrible, making developing OpenGL software much harder than developing D3D-based software.
Re:A little bit offtopic but... (Score:2, Funny)
Re:SGI is dead. (Score:1)
Currently I am trying to develop some software for it. It runs on a machine costing $1,000,000 with 20 CPUs. (You may check this page [uiuc.edu].)
Um... (Score:5, Informative)
No, nVidia drivers give you all the features... (Score:2)
Re:Um... (Score:3, Insightful)
Is this really such a *bad* thing? K & R wrote the UNIX API and said "do it this way." Is anyone complaining? IEEE standardized the API into POSIX and told people to "do it this way." POSIX is perhaps the most successful OS API in history. Sometimes a nice standard is just better than some additional freedom (especially when that freedom is for hardware developers, who aren't highest on my ethics list).
Re:Um... (Score:2)
What's interesting is your nick. See, the Be people wrote their OS precisely to dispose of old garbage like the Unix API.
>>POSIX is perhaps the most successful OS API in history.
Notice that most OSes are not POSIX-compliant: not Linux, not Windows, not Mac, not Be, not AtheOS, not Hurd. Full POSIX compliance is hard to find.
Re:Um... (Score:2)
As for POSIX compliance, Windows 2000 (Cygwin), Linux, BeOS, AtheOS, Hurd, and Mac OS X will run 99% of all (non-GUI) POSIX software out there. I'd say that's pretty damn successful.
Re:Um... (Score:1)
wha? (Score:1)
Think about it: the OpenGL spec has not changed in how long, and what kind of games are being produced with it?
Basically all the best. I'm not a graphics programmer, but I think it's safe to say something is good when the best choose it.
Look at that list... (Score:3, Informative)
Tribes2 is multi-API. So are some other biggies (Unreal Tournament comes to mind.)
Re:Look at that list... (Score:1)
I think he started favoring OpenGL in the days of Quake 1, about the time of the appearance of the Voodoo 1. To support that card (and future accelerators, of course), he attempted to port the Quake software renderer to both OpenGL and DirectX.
He succeeded with OpenGL in a single weekend. With DirectX, however, the API at the time (DirectX 1.0? 2.0?) was crap, or it was poorly documented. So he gave up on it.
I don't really know why he still doesn't use DirectX. Maybe he's just being consistent
Re:Look at that list... (Score:2)
3.0, as far as I know... the original story can be found in many places (here, for instance [vcnet.com] - look at the appendix at the end).
those are your list of the best games ?? (Score:1, Troll)
Berlin and OpenGL (Score:1)
OpenGL, CORBA and Unicode display server
We'll all be writing super-lifelike 3D games in Python in a while.
Come on! (Score:3, Redundant)
drooool... (Score:2, Informative)
NVidia OpenGL bad-ass extensions [nvidia.com]
NVidia DX8 SDK [nvidia.com]
Both contain very similar stuff, you'll find, I think, and I've always found OpenGL to be a better interface anyway. DX8 is night-and-day better than DX7 or before, but it still carries a bit of the bloat around the middle that DirectX is famous for...
Extensions vs Core (Score:5, Insightful)
If you go with OpenGL you have to write your program for each different vendor extension that comes out. Honestly, what are the chances of ATI or PowerVR ever supporting NV_texture_shader or NV_texture_shader2?
One of the main aims of DirectX was to avoid the situation in the days of games under DOS, where a game developer would have to write different code for each target video card. Through the use of vendor extensions, OpenGL does no better than DOS did - requiring the developer to figure out exactly which cards he is going to support and write to those extensions individually, and also sacrificing future compatibility if the next generation of cards supports different extensions.
Writing to DirectX gives some degree of future-proofing your application as forward compatibility of the core API is preserved through revisions of DirectX. Sure this may carry a bit of "bloat around the middle" but that's the price you pay for compatibility.
Of course, if you aren't writing for Windows you're stuck with extensions.
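For reference, a rough sketch of what "writing to those extensions individually" means at startup in an OpenGL 1.x game (the vendor paths here are just placeholders):

```cpp
#include <GL/gl.h>
#include <cstring>

// GL_EXTENSIONS is one big space-separated string in OpenGL 1.x.
// (A robust check would match whole tokens, since strstr can hit prefixes.)
bool HasExtension(const char* name)
{
    const char* all = (const char*)glGetString(GL_EXTENSIONS);
    return all && std::strstr(all, name) != 0;
}

void PickShaderPath()
{
    if (HasExtension("GL_NV_texture_shader"))        { /* NVIDIA-specific path */ }
    else if (HasExtension("GL_ATI_fragment_shader")) { /* ATI-specific path */ }
    else                                             { /* plain multitexture fallback */ }
}
```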
Re:Extensions vs Core (Score:2, Interesting)
Standardized extensions aren't necessarily part of the spec, but they work the same across implementations. I don't think it'll be too long before we see some sort of standardized shader extension. But then, if you have to write microcode for the vertex shaders and so on, don't you have to redo that anyway, unless the cards' shader processors are binary compatible? I don't know about you, but I wouldn't trust DirectX to take my optimized per-vertex code and translate it to a different shader language set.
...but then what do I know
This is a job for the ARB (Score:4, Interesting)
If you go with OpenGL you have to write your program for each different vendor extension that comes out. Honestly, what are the chances of ATI or PowerVR ever supporting NV_texture_shader or NV_texture_shader2?
I'd put the chances quite high if it's a decent spec. Perhaps it might not be called NV_texture_shader in a year's time, it'll be ARB_texture_shader, and as an ARB-mandated extension will end up being supported by every sane driver with the required hardware support. You can bet that the NVidia drivers will still support the old NV_texture_shader as well though.
This is the way the OpenGL spec grows. Manufacturers are free to go ahead and implement whatever features they'd like or need in OpenGL, and they implement them as vendor-specific extensions. If someone else needs that functionality for their driver, well, before you know it the vendor-specific extension will become ARB-mandated, and probably pored over and refined a little by all the OpenGL board in the process - a board which consists of all the major 3D hardware and software manufacturers. Shortly after, most drivers will support that. Eventually the ARB extensions will probably be integrated into the next OpenGL base spec, as just happened with OpenGL 1.3.
So, there's no one single vendor controlling the spec. 3D vendors can be as creative as they want. Only if a feature becomes used on several different 3D architectures does it become a standard. Your code will continue to run fine on architectures where you used the vendor-specific extensions, as the vendor will almost certainly keep the original functionality around indefinitely as well as supporting the ARB-mandated version of it. If you want, you can go back a little later and use the ARB extension instead in your code, and the functionality becomes available to everyone.
By using DX8 instead of OpenGL you know that effects designed for the NVidia pixel shader will magically just work on the next-generation Radeons. At the same time, you're handing over control of the whole API to Microsoft, which does not make 3D chipsets, and you're stuck with their idea of how the pixel shader ought to work, as opposed to an API for it designed by the company that makes the chipsets, and then later (if it's successful), reviewed and revised by everyone important in the industry. I won't even start on the cross-platform issues.
Your choice.
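A sketch of how that promotion plays out in code, using multitexture - exactly the functionality that just moved into the 1.3 core: ask for the newer name first, fall back to the extension name, and drivers tend to keep exporting both:

```cpp
#include <windows.h>
#include <GL/gl.h>

typedef void (APIENTRY *ACTIVETEXTUREPROC)(GLenum texture);

ACTIVETEXTUREPROC LoadActiveTexture()
{
    // Prefer the promoted (core 1.3) entry point, then fall back to the
    // ARB_multitexture extension name that older drivers export.
    ACTIVETEXTUREPROC fn =
        (ACTIVETEXTUREPROC)wglGetProcAddress("glActiveTexture");
    if (!fn)
        fn = (ACTIVETEXTUREPROC)wglGetProcAddress("glActiveTextureARB");
    return fn;   // NULL means neither is available; use single texturing
}
```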
But is that good enough? (Score:2)
You are right. Microsoft isn't a hardware vendor. They are a software vendor which is simply the other side of the API. There is no good argument that an API should be specified by either side, so long as it is done in consultation with the other.
I agree that using DX is trusting Microsoft to do the "right thing" by hardware companies, but I'd also argue that it is well within their interests to do just that. If they invent a spec that just sucks, then they are only harming their own API's acceptance by the game industry as a whole - hardware vendors and ISVs. Microsoft has been very fast to incorporate the latest hardware advances into DX and to work closely with the hardware vendors to ensure they converge on an interface that is manageable and uniform.
Remember DX8 has been available to ISVs for over a year now, even long before the nVidia specs on the GF3 were available. DX certainly gives you the time advantage over OpenGL. Waiting for the ARB spec to come out isn't the best solution for a game designer who wants to get their game using the latest hardware as soon as possible. A game developer who wrote to the DX8 spec could be sure their game will have life on the top level cards produced by all hardware manufacturers simply through MS's weight in the marketplace.
I'll happily grant that the OpenGL extensions for the GF3 are going to be much more closely aligned to the hardware than DX8 pixel shaders are. I'd expect that to be the case seeing they are vendor designed extensions for their own chipset.
What it seems to come down to if you want the latest and greatest hardware support in your software (assuming you are a Windows-only designer) is to either support the latest DX and trust Microsoft to have the weight to pull the hardware designers into line (a pretty sure bet), or support OpenGL vendor extensions and hope the vendors don't change them, implement different ones or settle on a totally different ARB extension and write different code for each card you plan to support.
It makes DX very attractive if you are a Windows developer, especially given the fact you are almost certainly using other DX components to handle audio, input and force feedback.
Re:But is that good enough? (Score:2)
Re:This is a job for the ARB (Score:2)
Actually, it's devised by several companies that make chipsets, with whom Microsoft works closely to ensure that it is (a) desired and (b) feasible.
Re:This is a job for the ARB (Score:2)
Actually, it's [the DX8 API] devised by several companies that make chipsets, with whom Microsoft works closely to ensure that it is (a) desired and (b) feasible.
So how do you explain the fact that the DX8 pixel shading API doesn't support everything the GeForce3 does? Surely if Microsoft had really worked that closely with NVidia there would not be this difference?
Or perhaps Microsoft dumbed down the API to only support what it thought would be common features to all procedural pixel shaders?
Either way, it sure doesn't sound like NVidia designed it.
Remember, no matter who is involved with Direct3D, in the end, the only entity actually controlling it is Microsoft. They have the last word on what goes in and what stays out. They're the people who write and distribute the code that forms the DX8 APIs. If they don't like it, you don't get to use it.
With OpenGL, the people with the real power are the developers of end-user software. It's them who get to decide what vendor extensions to take or leave, not Microsoft. Further down the line, once an extension becomes popular... well, every OpenGL ARB member has the same rights as every other regarding choosing what becomes part of the standard spec.
I thought most of us had decided that governance by consensus amongst equals was superior to dictatorship?
about OpenGL 1.2... (Score:2, Insightful)
A difference no-one mentioned (Score:1)
What would be great would be a team-based 3D game under X that gives you little windows showing the point of view of the others on your side, just like the views from the marines' cameras in "Aliens." With OpenGL it wouldn't be too much of a hassle to export the views from each machine and re-scale them.
A very important feature would be for the screens to go to static as each team member goes down.
Re:A difference no-one mentioned (Score:2)
Also, GLX (or whatever IRIX called it when you went between machines) was never very fast. People would always complain to me that their program was slow, and I would discover that they had accidentally rsh'ed into another machine. In some ways it would have been preferable if it didn't work at all, so people would know immediately not to do that.
It is too bad but I would have to say this is not a selling point of OpenGL anymore.
The Best Part about this is.... (Score:2, Funny)
But does it do shaders? (Score:1, Interesting)
Well, it wasn't listed in the "core features" in the press release, so I doubt it.
like deja vu all over again (Score:5, Informative)
Re:like deja vu all over again (Score:2)
timmothy != taco (Score:1)
you missed it (Score:1)
Re:like deja vu all over again (Score:1, Insightful)
My guess is that it's easier for people skimming the homepage of /. [slashdot.org] to read. ;-)
OpenGL = Direct3D 8.1 (Score:5, Insightful)
OpenGL uses extensions, so you don't have to rev the version number to add functionality; you only have to have supporting drivers (and/or hardware).
That is why OGL hasn't been revved in so long: it didn't need it, so you can provide a stable API for the developers.
It is just cleaner to have this new stuff "built-in", so they do it every now and then.
Re:OpenGL = Direct3D 8.1 (Score:5, Insightful)
Re:OpenGL = Direct3D 8.1 (Score:1, Insightful)
DirectX is a great standard for 3D graphics.
I've spent many a day playing DirectX games on Linux/BSD/Macintosh/BEOS/OS2/QNX/Amiga.
Oh crap! What am I saying? DIRECTX DOESN'T SUPPORT ANYTHING NON MICROSOFT!!!
Yes that's a standard alright.
Standard piece of BS.
Re:OpenGL = Direct3D 8.1 (Score:2, Interesting)
As said here a bunch of times, OpenGL relies on extensions to expand its functionality. AFAIK, both NVIDIA and ATI offer these extensions for their cards (as well as a lot of extensions from other developers).
If both ATI and NVIDIA release OpenGL extensions to support new feature X, is there something that keeps developers from having to implement feature X twice, once for each API/card, compared with DirectX where there is one standard way to do it?
Re:OpenGL = Direct3D 8.1 (Score:1)
OpenGL is more flexible in this way. The only problem is that some things need to be migrated from vendor to multi-vendor extensions faster. As soon as both NVIDIA and ATI have a new feature, it is time for the GL committee to hammer out an initial draft for a multi-vendor extension, I think.
Re:OpenGL = Direct3D 8.1 (Score:1, Interesting)
Vendor-specific extension (NV_, ATI_, SGI_, etc.)
Multi-vendor extension (EXT_)
ARB extension (ARB_)
Then it might be let into the core, if it's really that useful and supported by the companies sitting on the ARB. Once the extension hits EXT_ status, you can pretty much count on it being supported on the chips that matter.
Re:OpenGL = Direct3D 8.1 (Score:2)
Until the day we get the extension starting with "ARB" it is not a true extension that everyone can use. I only consider standard extensions to be part of OGL. What's the point of using the extension if it only works for one card, and you'll have to write different code in order to support another card?
In this aspect, D3D is doing very well - even though some features are supported only by 1 chip today (e.g. pixel shaders), the interface is neutral. It ensures your code will work without even a recompile when a newer card from another company comes out. Can we do it in OpenGL?
Although I still use OGL, I might not for much longer unless *vendor-independent* extensions come out faster.
Re:OpenGL = Direct3D 8.1 (Score:2)
Yeah, D3D is better at what it does (Score:1)
Re:Yeah, D3D is better at what it does (Score:1)
Re:OpenGL = Direct3D 8.1 (Score:1)
SDL's 3D code is just a wrapper around OpenGL for whatever platform you're using, if I'm not mistaken. So the 'future' that you point out is wrappers for OpenGL?
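A minimal sketch (assuming the SDL 1.2 API of the time) of what that amounts to: SDL just creates the window and the GL context, and from then on you call OpenGL directly:

```cpp
#include <SDL.h>
#include <GL/gl.h>

int main(int argc, char** argv)
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    if (!SDL_SetVideoMode(640, 480, 0, SDL_OPENGL)) {  // portable GL context creation
        SDL_Quit();
        return 1;
    }

    glClearColor(0.0f, 0.0f, 0.3f, 1.0f);  // from here on it's plain OpenGL
    glClear(GL_COLOR_BUFFER_BIT);
    SDL_GL_SwapBuffers();

    SDL_Delay(2000);
    SDL_Quit();
    return 0;
}
```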
OpenGL Best Game Hands Down. (Score:1)
Which of course everyone on this board knows, is...
Homeworld!
:-)
You ARE the mothership when you have a 64MB Video card and a 21 inch Sony G500 monitor out there in "The Wastelands" in 4 player internet gaming mode on Won.
The tactical, combat and collaboration OpenGL gives to the game is spectacular.
Nothing is more breathtaking than to radio for Help! to your teammate while you're outgunned and outnumbered as the two bastards decide to double-team ya. Oh no! 5 Destroyers and 2 Heavy Cruisers?!! Heeeeeeeellllllllllpppp!!!
^%#@^&%#&
As your Mothership burns and the sniveling little idiots radio in, "Yeah, this guy sucks...."
Only to see them run like scared chickens, duly beheaded, when that moment comes....
Hyperspace Signature Detected.
No, it isn't just a frigate...
Sweet Mother of Pearl its the entire fleet!
Sweet Jesus! 3 Heavy Cruisers and 8 Destroyers coming out of hyperspace like the cavalry at 1600x1200 in 32-bit OpenGL mode!!!!
2 minutes later the two assholes' fleets are burning at high res and in 32-bit color!
My god, it's beautiful.
-hack
Hehe, yeah Homeworld rocks. (Score:1)
instead (Score:2, Insightful)
Re:Open GL is Dying (Score:1)
Re:Open GL is Dying (Score:1)
Could also mean that D3D is more difficult to use, hence the 1:4 ratio of Usenet posts, or that DirectX does more than just graphics. Not that one would expect anything but a fundamentally flawed analysis in an obvious troll.
Re:Open GL is Dying (Score:2, Funny)
Please grow up, before you anger the real developers.
Re:Open GL is Dying (Score:2, Insightful)
In the ruthlessly Darwinist gamer/graphics market the answer to "who is the market leader" and "who has the best solution" is usually the same, as long as you consider that "best solution" does not mean "most sensible and powerful API". From a developer's point of view OpenGL may very well be better but it just isn't where the money is.
(And BTW, I am not the author of the parent to which you replied. That's a not-even-thinly-disguised recycled anti-BSD troll.)
What about pro apps, i.e. Maya? (Score:1)
MAX is soooo much better on GL. I've never tried HEIDI or the other drivers. I wonder what Maya would run like in D3D? Not so well, I imagine.
Re:What about pro apps, i.e. Maya? (Score:1)
Re:OpenGL Death: This is not such a sad thing (Score:2, Informative)
Yes it is. Instead of just writing directly to the [OpenGL] API for all 3 platforms (Win32, Mac, Linux), I now have to use a wrapper (assuming you don't support consoles, which not every game developer can/does).
Of course, most PC game developers are just using DX, so this doesn't affect them AT ALL.
> it does mean good things for the majority of the game playing market
Gamers don't care which (graphics) API a game uses.
> However, todays hardware is written with Direct3D in mind,
That's not true. The GeForce cards expose more of their functionality under OpenGL than D3D.
I believe you mean "the majority of today's PC hardware has better support under D3D than OpenGL." And, yes, you are right.
The point is, though, that even if OpenGL were vanquished tomorrow, we game developers STILL have to support at LEAST *3* APIs: one on PC (Xbox), PS2, and GameCube. (Xbox is basically DX8, but I won't know 100% for sure until we get our dev boxes.)
Re:OpenGL Death: This is not such a sad thing (Score:1)
Err... unless the card they're using has a great D3D driver and a crap OpenGL driver, or vice versa. I want to point out Matrox as a historical example, but I'm not sure about that (please educate me), and my Matrox card works fine under Linux.
Re:OpenGL Death: This is not such a sad thing (Score:1)
But I don't like people being forced into accepting a standard that isn't supported by any other operating system. How are people developing for Linux going to support it?