OpenGL 1.5

Yogijalla writes "SGI and OpenGL ARB just announced the OpenGL 1.5 specification, introducing support for a new OGL Shading Language. Also, check out the new Java bindings to OpenGL. OGL 1.5 is a step towards the OGL 2.0, already suggested 2.0 by 3DLabs." Also worth pointing out that OpenML SDK has been released as well.
  • Gotta love (Score:5, Funny)

    by Anonymous Coward on Thursday July 31, 2003 @02:38AM (#6577761)
    those editing skills.

    OGL 1.5 is a step towards the OGL 2.0, already suggested 2.0 by 3DLabs.
    • by Randolpho ( 628485 ) on Thursday July 31, 2003 @09:42AM (#6579288) Homepage Journal
      It makes no mention of Java3d [sun.com], which is a scene graph API with bindings to OpenGL or Direct3d. Is this announcement going to be a thin binding, or a new version of Java3d? Or will it replace Java3d?

      Inquiring minds want to know! :)
      • by deanj ( 519759 ) on Thursday July 31, 2003 @09:48AM (#6579333)
        Java3D is currently on hold. This isn't a replacement for Java 3D, it's just something different. Kinda like how Inventor is different from OpenGL, although it uses OpenGL.
        • Hmm....

          I didn't know Java3D was on hold -- I've been trying to write a program with it for the past few months too.

          I wonder what this is going to be like. I'm not sure if I would prefer a thin wrapper or a totally new API.
        • by Mithrandir ( 3459 ) on Thursday July 31, 2003 @11:30AM (#6580338) Homepage
          Not on hold - it's officially dead. All of the resources working on the project have been moved off and are now working on either JSR-184 or the new OpenGL bindings JSR (once it gets started).

          We're involved in various efforts to replicate and replace Java3D with another scene graph. If you need something immediate, take a look at the Xith3D project from Dave Yazel and the MagiCosm guys. Alternatively, if you can wait a couple of months, we're in the process of ripping our OpenGL scene graph out of the core of Xj3D [xj3d.org] to turn it into a separate project. The difference between the two is that Xith3D is focused on games (it has primitive objects for particle systems, for example) whereas our work is focused on CAVEs, powerwalls, Linux clusters, etc.
          • by deanj ( 519759 ) on Thursday July 31, 2003 @11:43AM (#6580462)
            It's not "officially dead" until Sun says it is. When I spoke to the developers at JavaOne, they said it was on hold. That's the official line right now.
            • Sun cannot ever proclaim anything dead. It is an admission of failure to them and therefore not acceptable. If you are waiting for them to declare it dead, you'll probably die before they state that. JSDT is still officially "on hold" too - it has been for the last 4 years...

              When everything has been cancelled, the developers are no longer answering emails on the java3d list, and they have been assigned to other parts of Sun, it is dead. Also, we already have been told that we'll have access to parts o
              • Yeah, I know... I know... technically (official Sun party line), it's not dead, but for all practical purposes, you're right, it is dead. I was just relating what one of the developers said at JavaOne. I did a lot of work with it, so it's hard to see it go.

                It'll be interesting what happens with the patents, particularly the .compile() part of the API, since I believe that's one of the things Henry Swizworal (I never could spell that) and the hardware guy (whose name escapes me at the moment...John Deering
    • ...DirectX is already up to 9!
    • OGL 1.5 is a step towards the OGL 2.0, already suggested 2.0 by 3DLabs.

      A suggestion 2.0 is a suggestion you can't refuse.
  • by ericvids ( 227598 ) on Thursday July 31, 2003 @02:42AM (#6577771)
    But I'm happy to see that they're finally providing a high-level alternative to the ARB_vertex_program and ARB_fragment_program extensions. I've been dabbling in these extensions and it's been a huge pain. Also just in time for the class I'm teaching next semester.

    I wonder when these will become standard (not just as an ARB extension but as an ARB required feature). Hopefully in 2.0? It will save a lot of calls, at the very least--just check the version number of the GL implementation, no more searching extension strings... :)
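
    For illustration, here is a minimal C sketch of the two checks being contrasted above: parsing the version string once a feature is in core, versus scanning the extension string for an ARB extension. The helper names are made up, and it assumes a current GL context.

        #include <GL/gl.h>
        #include <stdio.h>
        #include <string.h>

        /* Core check: GL_VERSION starts with "major.minor", e.g. "1.5.0 ..." */
        static int have_gl_1_5(void)
        {
            int major = 0, minor = 0;
            const char *ver = (const char *) glGetString(GL_VERSION);
            return ver && sscanf(ver, "%d.%d", &major, &minor) == 2
                && (major > 1 || (major == 1 && minor >= 5));
        }

        /* Extension check: naive substring search of the extension string. */
        static int have_extension(const char *name)
        {
            const char *ext = (const char *) glGetString(GL_EXTENSIONS);
            return ext && strstr(ext, name) != NULL;
        }

        /* Vertex buffer objects, for example, are core in 1.5 and an ARB
           extension before that:
           if (have_gl_1_5() || have_extension("GL_ARB_vertex_buffer_object")) ... */
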
    • by luther ( 112387 ) on Thursday July 31, 2003 @02:55AM (#6577805) Homepage

      I wonder when these will become standard (not just as an ARB extension but as an ARB required feature). Hopefully in 2.0? It will save a lot of calls, at the very least--just check the version number of the GL implementation, no more searching extension strings... :)


      You can see at:

      http://www.opengl.org/developers/about/arb/notes/meeting_note_2003-06-10.html

      that the reason it was called 1.5 and not 2.0 is that the shading language didn't make it into the core spec, and that they plan to promote it to the core (and release 2.0) soon, possibly even at the next ARB meeting in September or the one thereafter.

      Here is the quote:

      The result was a simple majority of non-abstaining YES votes, but not a supermajority. Interpretation of this vote required some care since final spec approval requires a supermajority vote, while consideration of features for the final spec requires only a simple majority. Because the NO votes were strongly held, we expect that trying to approval a core revision including the shading language would carry extremely high risk of failing to approve the spec. We will therefore not include the shading language into the core at this time, but instead drive a new core version as soon as there's more experience with the extensions, perhaps as soon as this fall.

      As previously agreed in the marketing working group, we will call the new core revision OpenGL 1.5, reserving OpenGL 2.0 for a future core revision including the shading language.
      • As previously agreed in the marketing working group, we will call the new core revision OpenGL 1.5, reserving OpenGL 2.0 for a future core revision including the shading language.

        I think it was a marketing error. They should have called it OpenGL 2.0 Full Speed and reserve OpenGL 2.0 Hi-Speed for the future core spec.

        Sorry, couldn't resist a joke referring to a recent turmoil [slashdot.org] in USB packaging names, which were later clarified [slashdot.org].

    • Somewhat old, it's been there since Monday...

      Then why didn't you submit the story on Monday?
  • Woohoo (Score:5, Interesting)

    by Duncan3 ( 10537 ) on Thursday July 31, 2003 @02:46AM (#6577786) Homepage
    "Non power-of-two Textures"

    That's _thee_ key feature Apple needed to do the fully OpenGL desktop, along with a pile of more elegant error handling of course. Glad to see it's now standard.

    It also makes the modeling and artist guys much happier. Do you have any idea how hard it is to make everything out of squares?

    2.0 should put the last of what we need for the next 5 years into OpenGL, then maybe people can start writing more portable games again.
    • Re:Woohoo (Score:5, Informative)

      by ericvids ( 227598 ) on Thursday July 31, 2003 @03:02AM (#6577825)
      > That's _thee_ key feature Apple needed to do the fully OpenGL desktop,

      But there have always been tools to circumvent the power-of-two limitation. You can always use only part of a texture on a primitive (triangle, quad, etc.). There are tools to tile non-power-of-two textures on a power-of-two texture to minimize memory usage.

      At least in game dev, it didn't really prove to be useful. (Except for those artist whiners who insist that they can use any size image that comes out of Photoshop. *laugh* just kidding guys... I hear ye)

      I'm just thinking that Apple couldn't plausibly use the late introduction of non-power-of-two textures into OpenGL as an excuse for not making the desktop fully OpenGL already. Besides, they can always introduce Apple-specific extensions--why haven't they done that already? Methinks they're just lazy. :)
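
      As an illustration of the workaround described above (using only part of a power-of-two texture), here is a hedged C sketch; the buffer name pixels and the 640x480 size are just examples:

          /* Pad a 640x480 RGBA image into a 1024x512 texture and reference
             only the used corner through texture coordinates. */
          const int img_w = 640, img_h = 480;
          const int tex_w = 1024, tex_h = 512;    /* next powers of two */
          GLuint tex;
          glGenTextures(1, &tex);
          glBindTexture(GL_TEXTURE_2D, tex);
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
          glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, tex_w, tex_h, 0,
                       GL_RGBA, GL_UNSIGNED_BYTE, NULL);        /* allocate only */
          glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, img_w, img_h,
                          GL_RGBA, GL_UNSIGNED_BYTE, pixels);   /* fill the corner */
          /* Draw with texcoords (0,0)..(s_max,t_max) instead of (0,0)..(1,1). */
          GLfloat s_max = (GLfloat) img_w / tex_w;              /* 0.625  */
          GLfloat t_max = (GLfloat) img_h / tex_h;              /* 0.9375 */
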
      • Re:Woohoo (Score:3, Informative)

        by Anonymous Coward
        Apple did implement GL_EXT_texture_rectangle (DRI implements it as well). It has some uses, especially with GL_APPLE_ycbcr_422 or GL_MESA_ycbcr_422, but in games it is a non-issue, for the reasons you mentioned. Do not forget that NPOT textures still aren't first-class citizens compared to POT textures; you lose tiling, for example.
        • Oops, sorry for the apparent misinformation. I'm aware of those extensions--what I meant in my previous post is that Apple can always introduce their Apple-specific extensions into their desktop, but they haven't done so yet. :)
        • Re:Woohoo (Score:4, Informative)

          by arekusu ( 159916 ) on Thursday July 31, 2003 @03:50AM (#6577972) Homepage
          However, NPOT textures are the only way to get fully accelerated DMA AGP updates. POT textures are not currently accelerated in 10.2.6, see Apple's TextureRange sample code.

          As you point out, this is primarily useful with video textures, but any game that does animated textures can take advantage of it.
      • Re:Woohoo (Score:4, Informative)

        by Duncan3 ( 10537 ) on Thursday July 31, 2003 @03:36AM (#6577929) Homepage
        Yes there are hacks, but those hacks eat memory, and you only have so much texture memory on a card.

        And Apple has a loooong list of OpenGL extensions. They implemented this all some time ago now. You should see some of the realtime video editing that they can do.

        Nothing gets into OpenGL that wasn't an extension first - that way it's all developed and debugged before we have to deal with it.
        • Re:Woohoo (Score:3, Informative)

          by ericvids ( 227598 )
          Oops, check my response to Anonymous Coward on the other subthread. :) Sorry for the misinformation. Apple did have non-power-of-two textures through the multivendor GL_EXT_texture_rectangle extension; I was just pointing out that they didn't bother using this for their desktop. :)

          You're right about the hack-nature of what I proposed though. In practice, however, you only lose very little texture memory, if you have a good tiler. Besides, I believe that non-power-of-two textures will suffer some memory
      • >> That's _thee_ key feature Apple needed to do the fully
        >> OpenGL desktop,
        >
        > But there have always been tools to circumvent the
        > power-of-two limitation. You can always use only part
        > of a texture on a primitive (triangle, quad, etc.).

        The gotcha here is that Apple needs to take a window backing store, a buffer, and apply that as a texture to the quad. Doing this by the old slice-n-dice into assorted power-of-two tiles is a pain, and an intermediate copy that just slows things do
      • An implementation is free to use the hacks to get the non-power-of-2 textures. The point is that the hack is now inside OpenGL and thus the interface *allows* a non-hack implementation. This is entirely a good thing. Way too much of OpenGL and X (and Windows GDI and some parts of DirectX) is crap like this where just a trivial amount of extra code would make a better interface, but the implementors don't add it because "it can be done by the user".

        Apparently it is very difficult for developers to see wher

    • Re:Woohoo (Score:4, Interesting)

      by zenyu ( 248067 ) on Thursday July 31, 2003 @03:24AM (#6577887)
      "Non power-of-two Textures"

      That's _thee_ key feature Apple needed to do the fully OpenGL desktop, along with a pile of more elegant error handling of course. Glad to see it's now standard.


      I suspect it's for bind to texture, not anything that can already be done by just using part of the texture. Supposedly nVidia has been waiting for 1.5 before doing bind to texture in UNIX environments (currently only a WGL extension.) For me, on the FX, copy has actually turned out faster than bind, but that is hopefully just a driver limitation. Rectangular textures also have nice coordinates for using them in multi-layer programmable pipeline settings. (I haven't read the specs yet, just extrapolating from the nVidia extension.)
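
      For reference, a hedged sketch of the "copy" path the comment compares against bind-to-texture; the texture handle and 512x512 size are made-up examples, and error handling is omitted:

          /* Render the scene to the back buffer, then copy the result into
             an already-created 512x512 texture for use in a later pass. */
          glBindTexture(GL_TEXTURE_2D, scene_tex);
          /* ... draw the scene here ... */
          glCopyTexSubImage2D(GL_TEXTURE_2D, 0,  /* target, mip level  */
                              0, 0,              /* destination offset */
                              0, 0, 512, 512);   /* read rect in the framebuffer */
          /* Bind-to-texture (e.g. WGL_ARB_render_texture) avoids this copy,
             but at the time it was only exposed through WGL. */
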
    • No doubt, my knowledge of 3D APIs and hardware is pathetic; however, what exactly do you mean by "fully OpenGL desktop"?

      I thought Apple's intentions were to give users a PostScript desktop. OpenGL is simply a window to unused 3D hardware that could be used in place of an expensive proprietary PostScript coprocessor (i.e., like those old NeXT cubes).

      Isn't Quartz Extreme just an OpenGL wrapper?
      • by Dominic_Mazzoni ( 125164 ) * on Thursday July 31, 2003 @03:52AM (#6577975) Homepage
        No doubt, my knowledge of 3D APIs and hardware is pathetic; however, what exactly do you mean by "fully OpenGL desktop"?

        Quartz Extreme, Apple's name for the Mac OS 10.2 version of the Quartz Compositor, uses OpenGL to render what you see on the screen. Note that just because it uses OpenGL doesn't mean that it uses any 3D - all it takes advantage of is 2D bitmaps, special effects like shadows, and alpha transparency.

        But that's a really big deal! It means that all of the bitmaps representing your windows and other screen objects are in your graphics card's RAM, and moving a window in OS X doesn't require the CPU to compute any pixels at all. (Unfortunately, resizing is slower, because every redraw requires sending a new bitmap, of a different size, to the graphics card.) This also allows them to do the Genie effect, window scale effects, and Expose super fast without wasting any CPU cycles. Compare that to Windows or X - when you move a window, all of the windows underneath it get repaint events, which can take a while to trigger depending on the application.

        Note that the Quartz Compositor is a totally different thing than Quartz, the new 2D graphics library in OS X that is designed to replace QuickDraw.

        The Quartz Compositor is what makes it possible for QuickTime movies to keep playing when you minimize them to the dock, transparent overlays to smoothly fade in and out when you hit a volume control key on the keyboard, and 10.3's fast user switching to literally "rotate" the screen in 3D to show the other user's desktop.

        Quartz Extreme - i.e. using OpenGL to implement Quartz Compositor - is what makes it fast.

        The great thing about Quartz Extreme is that Apple has only begun to explore the possibilities. The fast user switching effect probably only took them a day to code up, because all of the underlying technology was there.
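
        To make the compositing idea above concrete, here is a purely illustrative OpenGL sketch (the win fields are invented) of how a compositor can draw each window's backing store as an alpha-blended textured quad, so moving a window only changes vertex positions:

            glEnable(GL_TEXTURE_2D);
            glEnable(GL_BLEND);
            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
            glBindTexture(GL_TEXTURE_2D, win->tex);   /* window contents, already on the card */
            glBegin(GL_QUADS);
                glTexCoord2f(0, 0); glVertex2f(win->x,          win->y);
                glTexCoord2f(1, 0); glVertex2f(win->x + win->w, win->y);
                glTexCoord2f(1, 1); glVertex2f(win->x + win->w, win->y + win->h);
                glTexCoord2f(0, 1); glVertex2f(win->x,          win->y + win->h);
            glEnd();
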
        • by darkov ( 261309 ) on Thursday July 31, 2003 @04:12AM (#6578026)
          But that's a really big deal! It means that all of the bitmaps representing your windows and other screen objects are in your graphics card's RAM, and moving a window in OS X doesn't require the CPU to compute any pixels at all.

          Actually, that's due to the fact that the bits are in your computer's RAM. Quartz double buffers all drawing so that it doesn't flash and looks smooth. Because of this no redrawing has to be done when part of a window is revealed. On the other hand, resizing means that the buffer has to be reallocated and redrawn.
        • by IamTheRealMike ( 537420 ) on Thursday July 31, 2003 @04:30AM (#6578080)
          Yes, all that is absolutely correct. I'd just like to point out that you don't really need to use OpenGL to do those things, though; at the end of the day the 2D/3D primitives all boil down to a sequence of instructions to the card. For instance, X pixmaps (on XFree86) are already stored in video RAM.

          The main problem with XFree86's responsiveness is not whether it uses OpenGL or not (which ultimately makes little difference) but how it interacts with applications and how it pokes the video card. For instance, very few drivers fully accelerate RENDER (which is 2D hardware acceleration for alpha channel blending and some other things). That means you end up doing very slow framebuffer reads, compositing in software, then uploading. I guess part of the reason for using OpenGL was to work around the reluctance of driver manufacturers to write specialist fully optimized drivers for their hardware.

          Not to mention that most apps are very slow at processing Expose events. There has been talk of doing what MacOS does here and having apps directly render client-side into a compressed backing store.

        • Interesting... so how exactly does postscript/pdf play into this?
          • It's a different layer of abstraction. DisplayPDF is used to 'describe' the widgets, the window contents, etc. The OpenGL part is getting this information (in a rawer form, i.e. pixels and textures, what have you) to the card. It's kinda like how the graphics card cares very little about the internal representation of a JPEG file; it just draws the pixels sent to it. If Apple based their desktop on JPEGs (a really bad idea) they could still use OpenGL to render it.
      • The Quartz engine (the "PostScript desktop") is used largely to create the contents of the windows. OpenGL, at this stage, is used to paint the windows onto the screen, after they've been rendered by Quartz (or Quickdraw in some cases). OpenGL gets more hardware support than Quartz rendering. A fully OpenGL desktop might leverage more of the hardware so that it becomes more of a "wrapper" for GL, but I don't think it's that simple. Maintaining the rendering quality of Quartz may mean OpenGL isn't up to scra
    • The easy way around that limitation is to have the artist draw the texture however he wants in any size and then have the coder stretch it into a best fit square. For instance if you have a 640x480 texture you can simply use Photoshop or whatever to set the image size to 512x512. For older cards you can have a second version shrunk to 256x256.

      OpenGL and DirectX use 0.0-1.0 coordinates so it has no effect on the output (as far as the texture being mapped where it's supposed to be) as long as you don't cro
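
      A hedged sketch of that stretch-to-power-of-two approach, done in code with GLU's image scaler rather than in Photoshop; src_pixels is an assumed 640x480 RGBA buffer, a texture object is assumed to be bound, and error checking is omitted:

          /* Requires <GL/glu.h> and linking against GLU. */
          static GLubyte scaled[512 * 512 * 4];                     /* RGBA destination  */
          gluScaleImage(GL_RGBA,
                        640, 480, GL_UNSIGNED_BYTE, src_pixels,     /* 640x480 source    */
                        512, 512, GL_UNSIGNED_BYTE, scaled);        /* 512x512 result    */
          glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 512, 0,
                       GL_RGBA, GL_UNSIGNED_BYTE, scaled);
          /* Texture coordinates 0.0-1.0 still cover the whole image, so the
             mapping is unchanged, exactly as the comment says. */
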
  • Doom 3 (Score:5, Interesting)

    by DeathPenguin ( 449875 ) * on Thursday July 31, 2003 @02:50AM (#6577794)
    The shipping date is coming up in a few months for Doom 3. Any ideas on whether it will be using OpenGL 1.5, or is Carmack still intent on pushing the industry forward by implementing draft standards?
    • Re:Doom 3 (Score:5, Insightful)

      by Yuioup ( 452151 ) on Thursday July 31, 2003 @03:06AM (#6577836)
      I wouldn't be surprised if OpenGL 1.5 is largely based on the work that Carmack has done on DOOM III.

      • Re:Doom 3 (Score:5, Insightful)

        by Forkenhoppen ( 16574 ) on Thursday July 31, 2003 @10:07AM (#6579509)
        It's sad but true; Carmack's probably the one person out there with the biggest influence on new revisions of the OpenGL spec.

        As to whether or not it's largely based on his work, however, that's another story. Honestly, there are tons of people working on the same things that Carmack is. He's just the best known, with the biggest profile. The technology behind Doom III, while interesting, is really just the next logical extension of the current state of 3D graphics.
    • Re:Doom 3 (Score:3, Insightful)

      by bogie ( 31020 )
      Not to bring your down, but all signs point to Doom3 being a lot more than a few months away.
      • all signs point to Doom3 being a lot more than a few months away

        According to my fuzzy recollection, JC was saying that Microsoft is paying them large amounts of money to hold off releasing until the XBox version is ready. That must be some large pile of money.

        I think he also said that id would use the time profitably, to make Doom 3 even better. It's apparently been release-ready for quite some time.
      • Not to bring "your" down, but they'll release a
        playable beta version that we'll have for FREE
        for a long damn time like they always do. Are you
        old enough to remember when Quake3 Demo was the
        most popular game on the net? If wasn't even a full
        game. Nobody paid for it. They will do the exact
        same thing with Doom3. Rumor has it they are
        releasing the first multiplayer Doom3 demo next
        month at quakecon. Considering they have already
        announced they will be having Doom3 deathmatch
        at quakecon, this rumor is probably clo
    • Re:Doom 3 (Score:3, Insightful)

      by FreonTrip ( 694097 )
      Doom 3 will ship with at least five OpenGL rendering paths to address the fragmentation of capabilities in the video card market: a failsafe path for the original Geforce and Geforce2s, an older Radeon-specific path, a Geforce3/4 specific path, a newer Radeon-specific path, and an OpenGL 2.0 path. The latter's probably the one in which you're most interested. :-) His tinkering with a prerelease 3Dlabs P10 board and the preliminary OpenGL 2.0 spec impressed him enough to convince him that this was The Rig
  • by Biomechanoid ( 515993 ) on Thursday July 31, 2003 @02:51AM (#6577795)
    An OpenML SDK is now available

    http://www.khronos.org/openml/sdk/linux_requirements.html

    I'd really like to try this OpenML SDK, but it seems you are required to enter your phone number - now why is that??

  • So: (Score:4, Interesting)

    by Anonymous Coward on Thursday July 31, 2003 @02:56AM (#6577807)
    - What still remains before we can say OpenGL is back toward its original goal (you write for one standard instead of having to write for every single little card driver), something kind of ruined by the fact that, for many things these days, every card uses a different OpenGL "extension" to achieve the exact same thing?

    - What does DirectX still excel at where OpenGL is lagging behind?

    - Which of the things in the above two lists will be fixed by OpenGL 2.0, when/if it is adopted?
    • Re:So: (Score:5, Informative)

      by ericvids ( 227598 ) on Thursday July 31, 2003 @03:26AM (#6577894)
      These are valid concerns. I'm surprised the guy got marked as a troll.

      > - What still remains before we can say OpenGL is back toward its original goal [...] every card uses a different OpenGL "extension" to achieve the exact same thing?

      Well, I must say that OpenGL never really strayed from its original goal. OpenGL follows a pseudo-Bazaar philosophy--let vendors push for features they want, and if they get a massive following then it's good enough to put into the standard. I say pseudo-Bazaar because, unlike open source, this process happens somewhat too slowly for it to be competitive with DirectX.

      > - What does DirectX still excel at where OpenGL is lagging behind?

      The only thing that DirectX seems to excel at right now is standardized support for a lot of graphics programming constructs, i.e. its D3DX library (especially those mesh/skinning functions, quaternion arithmetic and the myriad transformation matrix builders it has by default--can anyone say D3DXMatrixShadow? :))

      However, we can also say that DirectX contains too many features that won't be used by many (and in fact some of them do get dropped in subsequent API releases). OpenGL, on the other hand, tries to promote a *clean* standard, not a super-bleeding-edge standard that contains a lot of cruft. That is also the reason why OpenGL lacks the utility functions I mentioned in the paragraph above; many developers have a (portable) base of utilities that works well for them, and all they want is a (portable) API to display their graphics, not something like DirectX, which coerces you into using the Microsoft-only stuff for almost everything (including the math, which the programmer should really handle himself).

      > - Which of the things in the above two lists will be fixed by OpenGL 2.0, when/if it is adopted?

      OpenGL 2 is a draft (and therefore moving) standard, and it will be largely up to the ARB to decide which widely used features are fit for the standard. It doesn't necessarily mean it will beat DirectX in terms of functionality, but it will surely be a cleaner, more general standard that vendors will be happy to adhere to.
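
      As a concrete illustration of the D3DX-style convenience functions mentioned above (and of the kind of math OpenGL leaves to the application), here is a hedged C sketch of a planar-shadow matrix builder; the function name is invented and the layout matches glMultMatrixf's column-major convention:

          /* m = dot(plane, light) * I - light * plane^T, the classic matrix
             that flattens geometry onto the plane as seen from the light. */
          static void build_shadow_matrix(GLfloat m[16],
                                          const GLfloat plane[4],   /* a, b, c, d */
                                          const GLfloat light[4])   /* x, y, z, w */
          {
              GLfloat dot = plane[0]*light[0] + plane[1]*light[1]
                          + plane[2]*light[2] + plane[3]*light[3];
              int row, col;
              for (col = 0; col < 4; ++col)
                  for (row = 0; row < 4; ++row)
                      m[col*4 + row] = ((row == col) ? dot : 0.0f)
                                     - light[row] * plane[col];
          }
          /* Usage: glPushMatrix(); glMultMatrixf(m); draw occluder; glPopMatrix(); */
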
      • by Anonymous Coward on Thursday July 31, 2003 @03:42AM (#6577950)
        I've heard the comments before that Direct3D/Quickdraw3D are "high-level" standards, and OpenGL is a "low level" standard-- i.e., OpenGL is largely primitive things, meaning developers must implement a bit more engine on their own; Direct3D tries to bring in the programmer at the higher level but limits them if their needs don't exactly fit that of Direct3D. Is that an accurate portrayal?

        What I always wondered is why the OpenGL people don't promote a two-level standard; the low-level is OpenGL as it exists now, the second level of the standard would be optional, and consist of the kinds of things that Direct3D/Quickdraw3D would have offered, higher level things. The second-level standard would be implemented on *top* of the first level standard, meaning it would be as portable as the base is and not provide a roadblock to changes in creating new OpenGL versions. Something like Mesa.

        Is this an attractive idea, or does the present existence of third-party libraries that sit on top of OpenGL make such an idea irrelevant? Even if so, it seems a "standard" higher-level library for OpenGL could take out one of the big complaints of Direct3D programmers ("OpenGL is too much work!")

        Let me know if anything I've said here is wrong; I've followed the Direct3D/OpenGL argument but have personally done nothing more complex than some simple GLUT applications. (And I didn't even get enough into GLUT to see to what extent it functions as a higher-level 'cover' API for OpenGL...)
        • by ericvids ( 227598 ) on Thursday July 31, 2003 @04:14AM (#6578037)
          > I've heard the comments before that Direct3D/Quickdraw3D are "high-level" standards, and OpenGL is a "low level" standard

          Yep, Direct3D does try to bring in the programmer at the higher level, and it does limit the programmer if they're programming anything other than games. That's because Direct3D (and DirectX in general) is meant for games in the first place; other media-intensive apps are somewhat secondary, although they can be done in DirectX, and for that particular application you have DirectShow (which used to be separate from DirectX, FYI, but is now part of it. I don't know why, but Microsoft said so.)

          I think Direct3D is more of that two-level standard you're trying to elaborate on. In fact, for a while Direct3D really was defined like that--it used to have a "retained mode" for high-level apps and an "immediate mode" for low-level ones. (They've since unified it into one immediate mode and done away with retained mode altogether.)

          The primary users of OpenGL, on the other hand, have been in the field for ages already, which means that they probably have all of the higher level stuff as part of their intellectual property. Using another vendor's API for what they already have is generally a dumb thing to do (lost productivity due to the fact that most of their apps are written in their old, working libraries already, and rewriting them in Direct3DX is tedious and error-prone). Besides, OpenGL clearly defined itself as a standard for displaying high-performance graphics, and helping the programmer with his other tasks (representing models, parameterizing the effects he can do in his engine, etc.) isn't really part of that goal.
          • This suggests there is a hole waiting to be filled by OpenGL.GAME - extensions that provide those features the games writers want specifically.

            Open Source being Open Source, I imagine there are five or ten contenders already floating around - anyone know which are best of breed for BSD-ish licencing and GPL licencing, respectively?
            • I don't know about 'extensions' but there are plenty of game libraries.

              SDL could be considered one, it handles OpenGL windows.

              PLIB (.sf.net) is a game library; it contains some features that assist game developers (a GUI, fonts, a scene graph, utility fns).

              OpenSceneGraph is rather good.

              and a host more - use google to find them, or search through SourceForge, or even the OpenGL page has a set of links.

              As for the licence - most of these are LGPL, which I think is the most appropriate licence for a library.
        • What I always wondered is why the OpenGL people don't promote a two-level standard; the low-level is OpenGL as it exists now, the second level of the standard would be optional.

          That is what GLU is for, and GLUT does part of that job too.
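
          To illustrate the layering, here is a hedged C fragment showing GLU helpers that are implemented purely in terms of core GL (values are arbitrary; assumes <GL/glu.h> and a current context):

              glMatrixMode(GL_PROJECTION);
              glLoadIdentity();
              gluPerspective(60.0, 4.0 / 3.0, 0.1, 100.0);   /* frustum helper         */

              glMatrixMode(GL_MODELVIEW);
              glLoadIdentity();
              gluLookAt(0, 0, 5,  0, 0, 0,  0, 1, 0);        /* camera helper          */

              GLUquadric *q = gluNewQuadric();
              gluSphere(q, 1.0, 32, 32);                     /* higher-level primitive */
              gluDeleteQuadric(q);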

        • by p3d0 ( 42270 ) on Thursday July 31, 2003 @09:10AM (#6579038)
          What I always wondered is why the OpenGL people don't promote a two-level standard; the low-level is OpenGL as it exists now, the second level of the standard would be optional, and consist of the kinds of things that Direct3D/Quickdraw3D would have offered, higher level things.
          Forgive my ignorance, but isn't that what GLU is for?
        • "What I always wondered is why the OpenGL people don't promote a two-level standard"

          That would be silly. Why not give people a choice of what to use on the second level? Especially if it's implemented in terms of the first level, it really doesn't matter what you use.
    • OGL 2.0... (Score:4, Informative)

      by zenyu ( 248067 ) on Thursday July 31, 2003 @03:47AM (#6577961)

      - What still remains before we can say OpenGL is back toward its original goal (you write for one standard instead of having to write for every single little card driver), something kind of ruined by the fact that, for many things these days, every card uses a different OpenGL "extension" to achieve the exact same thing?

      - What does DirectX still excel at where OpenGL is lagging behind?


      I don't think point 1 has really been lost; unless you are doing really cutting-edge stuff you can use OpenGL pretty happily as is. Many scientific applications are actually coded to Performer, which works just fine on OpenGL 1.0. I've written lots of stuff, some just a couple years ago, that used plain immediate-mode OGL 1.0, with a switch added later on for vertex arrays.

      What remains is the vertex and pixel shaders; these will be in 2.0. They are already pretty much supported with the nv FX and I guess the 3Dlabs card. I haven't been programming the ATI card, though many have for its speed advantage, but from what I understand it doesn't quite live up to the requirements of 2.0. Also, I think 3Dlabs is pressing for infinite-length programs; this can be implemented in the driver by simply compiling to multiple passes implicitly, though who knows about the performance. But the nv would handily beat the ATI if you do this because it can natively handle pretty long instruction streams. Unless this is already a driver trick, I dunno.

      2.0 will almost certainly wait until ATI is ready at the hardware level at least. If you program to extensions... OpenGL is ahead of DirectX, but this means you are stuck with the vendor if you use their specific stuff, say using fp30 Cg on the FX. I think everyone pretty much does program to extensions and not the standard if they are doing cutting-edge stuff, usually with a compile-time or run-time code switch based on the extensions present.
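
      A hedged sketch of the compile- or run-time switch described above; the path names are invented, and the substring test is deliberately simple:

          /* Needs <GL/gl.h> and <string.h>. */
          enum render_path { PATH_FIXED, PATH_ARB_PROGRAMS, PATH_NV_FP30 };

          /* Pick the most capable path the driver advertises, falling back
             to the fixed-function pipeline otherwise. */
          static enum render_path choose_path(void)
          {
              const char *ext = (const char *) glGetString(GL_EXTENSIONS);
              if (ext && strstr(ext, "GL_NV_fragment_program"))
                  return PATH_NV_FP30;         /* vendor-specific fast path */
              if (ext && strstr(ext, "GL_ARB_fragment_program"))
                  return PATH_ARB_PROGRAMS;    /* multi-vendor ARB path     */
              return PATH_FIXED;
          }
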
    • - What still remains before we can say OpenGL is back toward its original goal (you write for one standard instead of having to write for every single little card driver), something kind of ruined by the fact that, for many things these days, every card uses a different OpenGL "extension" to achieve the exact same thing?

      Agreed.

      - What does DirectX still excel at where OpenGL is lagging behind?

      Totally disagree. D3D9 is currently 'ahead' of OpenGL, with its unified shader system, effect file framework,
      • ...and the HLSL (which can be processed in software). OpenGL is too vendor specific (read: Cg)...

        You realize, of course, that HLSL and Cg are the exact same language, right? Microsoft helped nVidia develop Cg, and then renamed it to HLSL for the DX9 implementation.
  • OpenGL + Perl (Score:4, Informative)

    by Anonymous Coward on Thursday July 31, 2003 @03:11AM (#6577849)
    Apropos of the recent Perl 6 announcement, I'll just go ahead and mention the perl interface [cpan.org] as well. Wouldn't want anybody to forget.
  • Anybody checked out the OpenGL Shader Language? Is it like HLSL (same syntax as Cg)? I've been learning Cg, and I have to say that I like it... I'm kinda hoping developers will standardize on Cg.
  • Indeed (Score:5, Informative)

    by Ridge ( 37884 ) on Thursday July 31, 2003 @03:31AM (#6577910)
    This is good news, but I should point out that the OpenGL [opengl.org] 1.5 spec is not actually available yet; it's only been announced. Hopefully it will be available with some headers and implementations Real Soon Now (tm). I know the past few ATI Catalyst drivers have had experimental glslang extensions in them... Of course, it'd be nice if Microsoft would update their implementation so that OpenGL on Windows could make use of this without going through extensions, but I'm not holding my breath, nor is it really a huge issue.

    The OpenML [khronos.org] SDK is an alpha release and the final spec for it won't be out until about this time next year.

    However, the Khronos Group also released the OpenGL ES [khronos.org] spec, and there are actually a couple of implementations already available. OpenGL ES is for embedded systems and mobile devices; it's essentially just a subset of OpenGL. Seems like it might be pretty nifty...
  • by Frohboy ( 78614 ) on Thursday July 31, 2003 @03:31AM (#6577913)
    Granted, OpenGL 1.5, with improved programmable shader support, is indeed a step toward OpenGL 2.0, but it is really a fairly minor evolutionary step.

    When OpenGL 1.0 was initially proposed, it provided a standard implementation for fixed pipeline segments (with the idea that individual implementations could selectively choose which pieces of the pipe would be implemented in software, and which would be implemented in hardware). This was a very significant development, as it meant that everyone could operate with the same set of rules, and could do the same things, but those without hardware support may suffer some performance penalties (of course, with modern CPUs, some of the stages in the pipeline can have very high-perf implementations in software).

    Since then, the rules have changed significantly. Hardware developers have started to suggest that the behaviour of the individual components of the pipeline could be programmable. NVidia and ATI have already responded to this, providing a wide variety of programmable shader technologies (e.g. programmable vertex and pixel shaders). I understand that OpenGL 1.5 essentially brings this level of programmability up to current levels (I think that DirectX 9.0 does the same thing, but I would love for someone to correct me on this. I haven't touched DirectX in a while, so I'm a little rusty. In fact, at the pace that hardware is evolving, I'm actually very rusty, and likely collapsing due to decay.) OpenGL 2.0 extends this idea of programmability to every stage of the pipeline. For most current video cards, this means that a lot of that programmability has to be implemented in software (which is essentially what people are doing anyway. If you want to implement programmable textures, you write software that interfaces with your video card's static texture routines.) 3DLabs is hoping to turn the computer graphics world on its ear by providing almost completely programmable graphics cards. Nearly every stage of the pipeline should be programmable in hardware. Of course, we will have to wait to see what they deliver, but I imagine that even if their cards suck ass in terms of performance (or they'll be targeted to the super high-end, so most of us will never see them), they should offer some features that will force some new developments from ATI and NVidia, which will eventually make their way down to regular consumers.

    It's good that OpenGL 1.5 is out, to help keep OpenGL on the map of standards (especially since DirectX is a really inconvenient standard for those of us who don't run Windows), but I'm still pretty psyched for the release of OpenGL 2.0, w00t!
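
    Since the thread is about the new shading language, here is a hedged sketch of loading a trivial GLSL vertex shader through the ARB extension entry points that predate the 2.0 core API; it assumes the function pointers have already been fetched (e.g. via glXGetProcAddressARB or wglGetProcAddress) and omits all compile/link error checking:

        static const char *vsrc =
            "void main(void)\n"
            "{\n"
            "    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n"
            "}\n";

        GLhandleARB vs   = glCreateShaderObjectARB(GL_VERTEX_SHADER_ARB);
        GLhandleARB prog = glCreateProgramObjectARB();
        glShaderSourceARB(vs, 1, &vsrc, NULL);
        glCompileShaderARB(vs);
        glAttachObjectARB(prog, vs);
        glLinkProgramARB(prog);
        glUseProgramObjectARB(prog);   /* replaces the fixed-function vertex stage */
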
    • I think that DirectX 9.0 does the same thing, but I would love for someone to correct me on this.

      Okay, so here's one for you, not a correction but more like a clarification: DirectX has had this functionality since at least DirectX 8. For game developers, these things aren't "new and exciting" anymore, they're things you need to make a new game. So - this issue of not having a standard interface to programmable shaders in OpenGL has been a big factor in making people move to DirectX for a while now, so thi

  • by BigFootApe ( 264256 ) on Thursday July 31, 2003 @03:37AM (#6577933)
    AFAICR, nobody has been able to do work on programmable hardware shader support for DRI (because of IP issues on some GL_ARB_vertex* extensions). Is the new shader language similarly problematic?
    • I suspect that it is due to a lack of documentation from the hardware vendors rather than IP issues. After all, DRI wouldn't *implement* the vertex/shader programs, just expose the functionality that the hardware already has.

      Unfortunately, no-one on the DRI mailing list responded the last few times I asked there...
  • id is already doing amazing shit with OpenGL; will this be something that helps a game like D3 do more, or do the same thing faster?
  • by lokedhs ( 672255 ) on Thursday July 31, 2003 @04:07AM (#6578013)
    It's great to see that the Java bindings will become "official". Anyone who messed around with Java3D knows why this is a good thing.
  • by rexguo ( 555504 ) on Thursday July 31, 2003 @04:17AM (#6578048) Homepage
    I'd much prefer to see Sun/SGI base their work on GL4Java (www.jausoft.com/gl4java) rather than start from scratch all over again. The industry needs this now and needs it fast. Microsoft already had DX9 bindings for .NET months ago, but Sun/SGI has only announced this -now-? GL4Java, which is open source, has been around for a long time and is pretty mature. It has survived the competition from commercial offerings like Magician (which is now dead). In fact, last year, SGI (or was it Sun?) used a customized version of GL4Java to show off the new NIO features of Java, rendering a 300MB+ terrain dataset in real time. The speed at which Sun/JCP develops Java, and the speed at which SGI/ARB develops OpenGL, is a shame; let's hope they change this tradition this time!
    • Actually, Sun is already sporting a new OpenGL binding for Java, JOGL [java.net], which has been available for several weeks now.

      One of its key features is that the Java bindings are automatically generated from the C OpenGL bindings, so it's pretty trivial to keep it up to date with the very latest changes. Compare this to GL4Java, which has really started falling out of date.

  • All I remember about it is that cool poster of flames on the water or something.
    • Fahrenheit was a great success, actually. Not many projects succeed in doing exactly what they set out to do.

      Microsoft said that Fahrenheit would be the new standard 3D API and replacement for DirectX, and they managed to get SGI on board with it, which of course was the only purpose.

      Remember that at this time DirectX was hopelessly behind OpenGL, and Microsoft needed to make sure that OpenGL development came to a standstill for a year or so while they were improving DirectX. After they had succeeded with th

  • OpenAL (Score:2, Interesting)

    by Wuukie ( 47391 )
    Now we have OpenGL and OpenML. It seems nobody picked up OpenAL when Loki left the building.

    Do Linux game developers (or anyone at all) use OpenAL nowadays for environmental sound effects? Is it any good in its present state? It seems the website www.openal.org [openal.org] hasn't been updated since 2002. Well, most of the stuff seems to be from 2001...
    • "It seems nobody picked up OpenAL" Creative are the main supporters now. OpenAL is used in UT2003 and quite a few Linux games. The website is badly out of date - you pretty much need to be on the developer list to get the latest information.
  • Related to this topic, I would like to know if I can run OpenGL programs on one of my thin clients over X, i.e. what hardware and software is required?

    I know that people have done this with SGI kit, but most commodity X drivers don't seem to support remote OpenGL and there is precious little information around.
    Has anyone tried this?

    - Brian
    • by Goth Biker Babe ( 311502 ) on Thursday July 31, 2003 @07:22AM (#6578504) Homepage Journal
      If it's a decently implemented X server on the client then it should support it, provided the application isn't using any calls or extensions that aren't supported. For example, my Linux PC (with a GeForce4 graphics card) will work as an X server for my Indy for some applications but not for others.
    • The GLX X extension should allow remote execution of OpenGL apps. It basically works in a similar manner to X itself - instead of the app being executed remotely and drawing bitmaps on the local display, the OpenGL commands are passed to the local display, which renders everything. I tried it once a couple of years ago, but it crashed my X server, so I've never actually seen it work.

      TimeZone
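
      A hedged C sketch of checking for what the comment above describes - whether the target display speaks GLX at all, and (once a context exists) whether rendering is direct or indirect (the over-the-wire, remote-capable mode). Compile/link details are omitted:

          #include <GL/glx.h>
          #include <stdio.h>

          int main(void)
          {
              /* XOpenDisplay(NULL) honours $DISPLAY, e.g. thinclient:0 */
              Display *dpy = XOpenDisplay(NULL);
              int error_base, event_base;
              if (!dpy || !glXQueryExtension(dpy, &error_base, &event_base)) {
                  fprintf(stderr, "GLX not available on this display\n");
                  return 1;
              }
              printf("GLX present; indirect rendering over X is possible.\n");
              /* After glXCreateContext(), glXIsDirect(dpy, ctx) reports whether
                 rendering is direct (local DMA) or indirect (sent over the wire). */
              return 0;
          }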

  • by MarkSwanson ( 648947 ) on Thursday July 31, 2003 @08:42AM (#6578890) Homepage
    Sun announced full Open[GL,AL] support here:

    http://games-core.dev.java.net/

    Here's a great example of using OpenGL/OpenAL under Win32/Linux written in Java.
    (It uses the LWJGL - which is an OpenGL/OpenAL Java wrapper that uses nio).

    http://www.puppygames.net/
  • Microsoft rushed ahead with Direct3D in the mid 1990s, made lots of well-publicized mistakes early on because of a general lack of 3D knowledge, then nailed it starting with the graphics side of DirectX 8. The next version, DirectX 9, is a dead-on match for what's generally considered the state of the art in PC video cards. Microsoft isn't even planning DirectX 10, because DX9 is still way beyond what most people need or use (well, that, and the overall decline of the PC video market). And, believe it
  • I notice the same thing happens every time news is released about OpenGL or DirectX. Basically, it becomes a bash fest where one camp supports one API, and the other camp supports the other API. I've used both, and both have their merits (well... since Direct3D 7, they have... before that, D3D basically stunk). Now is a good time to bemoan what could have been.

    Anyone remember Fahrenheit? The collaborative effort between SGI and Microsoft to redesign both APIs into a new, leaner, more capable common a
