
OpenGL 1.3 Spec Released

JigSaw writes "The OpenGL Architecture Review Board has announced the new OpenGL 1.3 specification (1.8 MB PDF). In OpenGL 1.3, several additional features and functions have been ratified and brought into the API's core functionality. New features include cube map texturing, multisampling, new texture modes that provide more powerful ways of applying textures to rendered objects, a compressed texture framework, and more. Let's all hope that GL can catch up with Direct3D now: with the latest DirectX 8, Direct3D has taken big steps in feature set, speed, and even non-bloatiness when it comes to coding for it, while OpenGL 1.2 was released more than two years ago and did not offer as much."
This discussion has been archived. No new comments can be posted.


  • Re:The funny bit... (Score:3, Interesting)

    by Lussarn ( 105276 ) on Wednesday August 15, 2001 @08:27PM (#2126065)
    I for one can't tell 24bit colour from anything higher

    When it comes to 3D you and everybody else can, because 24-bit colour has only 8-bit integer precision per channel. In today's games every pixel on the screen is rendered from many textures, lightmaps, bumpmaps, etc., and this accumulates errors when there are only 8 bits of precision.

    John Carmack (id Software) has stated that more precision is needed on future GFX cards.

    Consider this:
    In floating-point math,
    120/50*30/2 = 36
    In integer math the answer would be 30 (calculating from left to right).

    From what I understand, today's cards use integer math.

  • by dnaumov ( 453672 ) on Wednesday August 15, 2001 @07:53PM (#2132708)
    And the reason? It's the API of choice for John Carmack, the main programmer of Quake, Quake 2 and Quake 3: Arena. YES, you can find DirectInput code in Quake 3, but can you find a SINGLE LINE of D3D code in ANY of the games coming out of id Software? No chance in hell.

    Then, there's this very nice company called Epic Games. It created Unreal and Unreal Tournament (while trying to push Glide) and is now doing Unreal Warfare. These guys provide nice competition to id Software and YES, they use Direct3D. Now take a modern computer with an NVIDIA card (chances are you already have one anyway) and play some Quake 2 and Quake 3... see the framerates? OK... Now start up Unreal/UT, select D3D as the renderer and... do I really have to tell you how low your FPS will go?

    Start up Half-Life, the most popular online 3D FPS game at the moment (thanks to CS), try switching back and forth between the OpenGL and D3D renderers, and compare the framerates. I know some of you are going to scream that HL is based on the Quake engine, etc., but just to let you know, only 20% of the HL engine code comes from Quake.
  • by .pentai. ( 37595 ) on Wednesday August 15, 2001 @06:22PM (#2135432) Homepage
    That's what ARB extensions are for.
    Standardized extensions which aren't necessarily part of the spec, but which work the same across implementations. I don't think it'll be too long before we see some sort of standardized shader extension. But then, if you have to write microcode for the vertex shaders etc., don't you have to redo that anyway, unless the cards' shader processors are binary compatible? I don't know about you, but I wouldn't trust DirectX to take my optimized per-vertex code and translate it to a different shader language set.

    ...but then what do I know :P
  • by marm ( 144733 ) on Wednesday August 15, 2001 @08:18PM (#2139846)

    If you go with OpenGL you have to write your program for each different vendor extension that comes out. Honestly, what are the chances of ATI or PowerVR ever supporting NV_texture_shader or NV_texture_shader2?

    I'd put the chances quite high if it's a decent spec. Perhaps in a year's time it won't be called NV_texture_shader but ARB_texture_shader, and as an ARB-mandated extension it will end up being supported by every sane driver with the required hardware support. You can bet that the NVIDIA drivers will still support the old NV_texture_shader as well, though.

    This is the way the OpenGL spec grows. Manufacturers are free to go ahead and implement whatever features they'd like or need in OpenGL, and they implement them as vendor-specific extensions. If someone else needs that functionality for their driver, then before you know it the vendor-specific extension becomes ARB-mandated, probably pored over and refined a little by the whole OpenGL board in the process - a board which consists of all the major 3D hardware and software manufacturers. Shortly after, most drivers will support it. Eventually the ARB extensions will probably be integrated into the next OpenGL base spec, as just happened with OpenGL 1.3.

    So, there's no one single vendor controlling the spec. 3D vendors can be as creative as they want. Only if a feature becomes used on several different 3D architectures does it become a standard. Your code will continue to run fine on architectures where you used the vendor-specific extensions, as the vendor will almost certainly keep the original functionality around indefinitely as well as supporting the ARB-mandated version of it. If you want, you can go back a little later and use the ARB extension instead in your code, and the functionality becomes available to everyone.

    By using DX8 instead of OpenGL you know that effects designed for the NVidia pixel shader will magically just work on the next-generation Radeons. At the same time, you're handing over control of the whole API to Microsoft, which does not make 3D chipsets, and you're stuck with their idea of how the pixel shader ought to work, as opposed to an API for it designed by the company that makes the chipsets, and then later (if it's successful), reviewed and revised by everyone important in the industry. I won't even start on the cross-platform issues.

    Your choice.

  • by Anonymous Coward on Wednesday August 15, 2001 @05:07PM (#2143492)
    Developers now have a card-vendor-neutral way to access programmable shaders (pixel and vertex shaders) from DX8. But does OpenGL 1.3 have anything comparable, or do we have to resort to NVIDIA or ATI extensions? If that is the case, OpenGL will be hit hard unless a standard vendor-neutral extension is added soon.

    Well, it wasn't listed in the "core features" in the press release, so I doubt it.

  • by Anonymous Coward on Wednesday August 15, 2001 @06:28PM (#2143739)
    The usual evolution of extensions is as follows (given that they are useful extensions):
    Vendor-specific extension (NV_, ATI_, SGI_, etc.)
    Multivendor extension (EXT_)
    ARB extension (ARB_)

    Then it might be let into the core, if it's really that useful and supported by the companies sitting on the ARB. Once an extension hits EXT_ status, you can pretty much count on it being supported on the chips that matter.

  • by MasterVidBoi ( 267096 ) on Wednesday August 15, 2001 @05:34PM (#2144177)
    I've been considering tinkering with OpenGL graphics for awhile, and I have a small question that someone more experienced than I can answer.

    As has been said here a bunch of times, OpenGL relies on extensions to expand its functionality. AFAIK, both NVIDIA and ATI offer these extensions for their cards (as well as a lot of extensions from other developers).

    If both ATI and NVIDIA release OpenGL extensions to support new feature x, is there anything that keeps developers from having to implement feature x twice, once per API/card, compared with DirectX where there is one standard way to do it?
  • Re:I'm can't wait (Score:2, Interesting)

    by Nate Fox ( 1271 ) on Wednesday August 15, 2001 @04:56PM (#2144752)
    Actually, this isn't too far off. ABC News [go.com] had an article about Sony cams with 'night vision' and a filter that can (kinda) see through clothes.
