Khronos Releases OpenGL 4.2 Specification 98
jrepin tips news that the Khronos Group has announced the release of the OpenGL 4.2 specification. Some of the new functionality includes:
"Enabling shaders with atomic counters and load/store/atomic read-modify-write operations to a single level of a texture (These capabilities can be combined, for example, to maintain a counter at each pixel in a buffer object for single-rendering-pass order-independent transparency); Capturing GPU-tessellated geometry and drawing multiple instances of the result of a transform feedback to enable complex objects to be efficiently repositioned and replicated; Modifying an arbitrary subset of a compressed texture, without having to re-download the whole texture to the GPU for significant performance improvements; Packing multiple 8 and 16 bit values into a single 32-bit value for efficient shader processing with significantly reduced memory storage and bandwidth, especially useful when transferring data between shader stages."
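The bit-packing feature mentioned above can be illustrated in plain code. Below is a hypothetical sketch of the semantics of the new pack/unpack built-ins (packUnorm4x8 and friends in GLSL 4.20); the function names here are Python stand-ins for illustration, not the GL API itself:

```python
def pack_unorm_4x8(values):
    """Pack four floats in [0.0, 1.0] into one 32-bit integer,
    least significant byte first (mirroring packUnorm4x8 semantics)."""
    packed = 0
    for i, v in enumerate(values):
        # Clamp to [0, 1], scale to 0..255, round to nearest integer.
        byte = round(max(0.0, min(1.0, v)) * 255.0)
        packed |= byte << (8 * i)
    return packed

def unpack_unorm_4x8(packed):
    """Inverse operation (mirroring unpackUnorm4x8 semantics)."""
    return [((packed >> (8 * i)) & 0xFF) / 255.0 for i in range(4)]

print(hex(pack_unorm_4x8([1.0, 0.0, 1.0, 0.0])))  # 0xff00ff
```

The win on real hardware is that an RGBA color or a pair of half-floats travels between shader stages as a single 32-bit value instead of four separate ones, cutting storage and bandwidth.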
Re: (Score:3, Informative)
PARENT LINK IS GOATSE
Re: (Score:2, Funny)
It took me a good 30 seconds or so before I realized that I wasn't looking at some real-world DirectX code written in C++.
Re: (Score:2)
Yeah, it looks like C# to me.
Feature Bloat? (Score:1)
Re: (Score:1)
Re: (Score:2)
or Vista (locked to D3D10, but GL4.2 is still available).
D3D11 works just fine under Vista.
Re: (Score:2)
At this point DirectX is so important they're not going to drop anything from hardware because OpenGL does or doesn't support it. The only difference is whether there'll be an interface to use it on Linux or not, and the time you buy is only temporary. What do you do in five years, when developers ask for features every card sold in the last five years has? Is it going to be less work then, or will it be just as much work, with you now five years behind the cutting edge? The answer is the latter, you're real
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Re:Feature Bloat? (Score:4, Insightful)
OpenGL promised in the last version to cut away a lot of the features from really old versions, just like DirectX 10 did. This would have the disadvantage of breaking compatibility but the advantage of making it easier to support into the future and more efficient right now. Instead they maintained backward compatibility which made it bloated and hard to use.
You are aware that the complaints that you are repeating were about OpenGL 3.0, were addressed in 3.1, and the current topic is 4.2? 3.0 introduced a deprecation mechanism, and deprecated a load of stuff. 3.1 removed it. In 4, all of the fixed-function stuff is gone completely.
Re: (Score:2)
Re:Feature Bloat? (Score:4, Insightful)
You know that is one of the things MSFT really should be given credit for, because DirectX "just works".
Is that why so many games using DirectX come bundled with all those DirectX updates and patches?
Checking a randomly chosen game I have installed here (Dragon Age Origins) shows no less than 74(!) cab files with DirectX updates and patches, from Feb2005_d3dx9_24_x86.cab to NOV2007_d3dx9_36_x86.cab
Yep, it "just works".
Re: (Score:2)
How exactly do you think you're going to use the newest version if you don't have the newest version you stupid dickhead?
The normal way would be to provide an installer that installs the latest version. The problem with DirectX is that there's so much breakage and lack of backward and forward compatibility that developers often have to pick just which updates to install.
So? Who gives a fuck how many files it has.
It's not the number of files, it's the amount of distinct patches needed.
Re: (Score:2)
Re: (Score:2)
Hey, Hairyfeet, speaking of old gaming...
You know of any solutions to play games from that screwfest DX 5-7 period? I've been trying to get TIE Fighter 95 and XvT going, Win98 is too old for VirtualBox and AFAIK, DOSBox is for DOS-age games. Two of my favorite games of all time are stuck in the middle.
BTW, from our conversation last month (on the topic of old games) : My Dingoo A320 [dealextreme.com] finally showed up. The stock NES and GBA emulators would be pretty nice, but they seem to have some unreliability in dealing
Re: (Score:2)
Re: (Score:2)
Uhhh...seriously WTF dude. You DO know that DirectX IS FREE, right? Why in the hell would a game company NOT include the files just to make sure their customers could run the game?
You missed the point entirely. Why would they need to provide dozens of cherry-picked patches, instead of just, um, install the latest version of DirectX?
The problem is that you can't - DirectX has so many compatibility problems that Microsoft has to provide the updates individually, so the end user (or developer) can pick and choose just which updates to install.
"Just works", indeed.
Re: (Score:2)
Re: (Score:2)
It's not about space, it's about a claim of "just works".
As in "it just works if and only if you also install the November 2006 redist as well as the February 2007 one, because the 2009 final one doesn't have the compatible parts that are only in those two redists"?
This is the problem with shifting standards as you go, with the distributed code being authoritative. OpenGL doesn't have that problem; the libraries are made according to the specs, not the other way around.
Wine.apk (Score:2)
That's funny, Wine doesn't seem to be on the Android market....
You're missing the point. PC gaming is irrelevant. As it is, even dedicated portable gaming consoles seem to be becoming irrelevant. What is relevant are millions of small form factor devices, all of which use OpenGL and none of which run Windows. That is where gaming is going.
Re: (Score:1)
Re: (Score:2)
The atomic stuff and the counter stuff does sound useful to me, though I have not had the pleasure of programming a game yet. Well, not in the last 25 years, anyway.
Re: (Score:1)
Seriously? Taking coded features out of the software layer and placing them on the hardware layer can multiply the speed of operation by an order of magnitude. OpenGL is far behind DirectX in that sense. DirectX is in many ways easier, and faster because of it. OpenGL needs to ditch some of the features it is holding onto for backwards compatibility. Anything older than 7 years should be on the chopping block if it isn't needed.
Re: (Score:1)
OpenGL Has Left DirectX In The Dust (Score:5, Insightful)
I don't know what delusional planet you are posting from, but here in the Real World OpenGL left DirectX in the dust in both features and performance a long time ago.
Khronos is absolutely on fire with giving developers what they want as quickly as possible. OpenGL developers have access to the absolute bleeding edge features of new graphics cards that people who are stuck still using DirectX have to wait around for Microsoft to get off their ass and implement.
It shouldn't be surprising OpenGL has won the API war with Microsoft:
210 Million OpenGL ES based Android devices a year.
150 Million OpenGL ES based iOS devices a year.
Every Linux, Mac, and Windows machine
The dying PC games market and the last place Xbox 360 are the only places left in the world still using the dead end DirectX API.
Re: (Score:1)
Thank you.
Let's try this again:
I don't see Rift, Eve Online (who actually had support once but dropped it, for the reasons I'm discussing) or other games being developed for Linux.
Let's confuse the issue more:
In fact, if I do a search for Linux gaming I get old games, bad games, unfinished games that were abandoned, and a very few clones of the classics done as learning exercises... all in OpenGL.
And let's not forget to piss some people off:
Why? Because in general the so called "pro game devs" only
Re: (Score:2)
Also add the Nintendo platforms to the OpenGL list. Those professional 3D graphics and GIS programs running on MS Windows are using OpenGL as well.
Re: (Score:2)
Why the hell are talking about linux? While DirectX is windows only, OpenGL runs on pretty much every platform in common use. Before accusing others of fallacious logic, look to your own strawman.
Re: (Score:2)
The parent stated that OpenGL had more features and better performance than DirectX. You responded criticizing Linux gaming, and accusing the parent of making fallacious arguments. Either you're just totally off-topic and decided to randomly rant about how Linux gaming sucks in this thread, or you're making an implicit assertion that Linux/Windows parallels OpenGL/DirectX, which is false.
Re: (Score:3)
Uh, quite a few Windows games use OpenGL, even the forthcoming Rage (so OpenGL on Windows is hardly "dead"). A lot of them (especially older ones) even offer both - I can set Half-Life or Unreal Tournament to use OpenGL, Direct3D, or even software rendering.
Most popular engines support both. UE3, used by about half the games on the market, uses OpenGL on the PS3, Wii, Mac, iPhone and Linux, and D3D on the XBox and Windows.
See, you're thinking too much about Windows VS Linux VS Mac, when the developers are t
Re: (Score:3)
They did that already. As of OpenGL 3.1 the only non-deprecated rendering method is Vertex Buffer Objects. Link [opengl.org].
There are a lot of things OpenGL could do to make itself more accessible: better-supported cross-platform utility libraries; three or four shortcut commands that set the various glEnable() states that 95% of new developers actually care about; streamlining the eyebrow-raising pile of mipmap generation options; and the entire process of setting up a vertex buffer object could be MASSIVELY simplified...
Ho
Re: (Score:2)
Microsoft might have been Slashdot's Great Satan for a long time, but they do listen to the sort of developers they're hungry for, and DirectX is one of the better examples of that.
Well, it used to be, but they really screwed up by not supporting DirectX 10 on XP. If you use DirectX 10, you are limited to the operating systems with around 40% of the market. If you use DirectX 9, you get another 50%, but you're limited to old features. On the other hand, Intel, nVidia and AMD all support OpenGL - with all of the latest shader functionality exposed, either as part of the core standard or via extensions - on XP, Vista, and Windows 7. Oh, and you also get another 5-10% from OS X, if y
Re: (Score:3)
I'm a newbie at this stuff, but here goes:
"single-rendering-pass order-independent transparency" - let's say I have three translucent objects, A, B, and C, at roughly the same depth, with parts of A in front of and behind parts of the others (and maybe the same is true for B and C as well). Figuring out the correct draw order is absolute fucking murder, and there still isn't a generalized approach for anybody but the most advanced of the most advanced (like Dual depth peeling [nvidia.com] or making convex hulls out of all
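To see why this is hard without OIT, here is a minimal sketch of the classic back-to-front sort that ordinary alpha blending forces on you (the object names and depth values are made up for illustration):

```python
# Translucent objects with one camera-space depth each (larger = farther).
objects = [
    {"name": "A", "depth": 5.0},
    {"name": "B", "depth": 2.0},
    {"name": "C", "depth": 8.0},
]

# Painter's algorithm: draw farthest first, so nearer translucent
# surfaces blend correctly over what is already in the framebuffer.
draw_order = sorted(objects, key=lambda o: o["depth"], reverse=True)
print([o["name"] for o in draw_order])  # ['C', 'A', 'B']
```

The catch: a single depth per object can't express "partly in front of, partly behind", which is exactly the interpenetrating case described above. That's what per-pixel approaches, such as OpenGL 4.2's per-pixel counters with image load/store, aim to solve in a single pass.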
Re: (Score:1)
Re: (Score:2)
In that case, I'd suggest phrasing the question a bit less inflammatorily next time :) You did not come across as a nice guy who would like to know the concrete benefits and drawbacks.
Re: (Score:2)
Is this one of those things you would get practically automatically with ray-tracing? It seems to me that a z-buffer just isn't capable of adequately dealing with this kind of situation unless you actually want to sort the objects by depth for ev
Re: (Score:2)
Just a note on OIT, with regard to DirectX: since the fixed-function pipeline was obsoleted in favor of the programmable, shader-based approach, API-implemented OIT has been orphaned as something of a strange hybrid between a crutch for people just getting into advanced topics and a luxury item.
DirectX is primarily focused on high-end games, and nowadays most games (it would seem) use some variation of Deferred Lighting. The type of deferred lighting you use would determine which, of the many, sorting appr
Re: (Score:1)
Re: (Score:2)
Not only that, but the exact same post from dev346, dev347 and dev348.
If only all those efforts went into something productive.
Re: (Score:2)
Looks like the MichaelKristopeit troll decided to move to a shorter username. :)
Re: (Score:2)
Slashdot should out anonymous posters that reach -1
In Soviet Russia (Score:2)
In Soviet Russia, something else expects you!
How much of the API is needed for HW accel? (Score:3)
Perhaps somebody in the know can enlighten me about this.
I see many fairly advanced features and functions in both the DX and OpenGL APIs, but I was under the impression that modern graphics cards were basically designed to do a few fairly primitive operations very well and in parallel. So basically, how much of these APIs actually deals with interfacing with the graphics card and its hardware-accelerated features, and how much is more along the lines of a standard library that contains frequently used graphics algorithms?
Maybe my view of how programming is done these days is a bit naive, but I've always sort of felt there was a difference between APIs that are there to let you use the hardware without mucking around with terribly low-level and platform-dependent stuff like interrupts, and, on the other hand, standard libraries that are pretty much the same code on most platforms, which you just don't want to write all over again whenever you make a new program (things like a container class for C++).
My idea of what OpenGL and DirectX did was to let you access the features of the video card without having to worry about all the little differences between one card and another. So you could send the card a bunch of textures or something without having to rewrite the code for every card you wanted to run on.
Am I missing a lot here? Do the OpenGL and DirectX APIs also deal with a load of stuff that is just generally handy to have around when writing graphics programs?
Re:How much of the API is needed for HW accel? (Score:5, Informative)
(Disclaimer: Simplifications follow.)
Originally, there was OpenGL, which provided the model of a graphics pipeline as a set of stages where different things (depth culling, occlusion, texturing, lighting) happened in a specific order, with some configurable bits. There was a reference implementation that implemented the entire pipeline in software. Graphics card vendors would take this and replace some parts with hardware. For example, the 3dfx Voodoo card did texturing in hardware, which sped things up a lot. The GeForce added transform and lighting, and the Radeon added clipping.
Gradually, the blocks in this pipeline stopped being fixed function units and became programmable. Initially, the texturing unit was programmable, so you could add effects by running small programs in the texturing phase (pixel shaders). Then the same thing happened for vertex calculations, and finally you got geometry shaders too.
Then the card manufacturers noticed that each stage in the pipeline was running quite similar programs. They introduced a unified shader model, and now cards just run a sequence of shader programs on the same execution units.
As to how specialised they are... it's debatable. A modern GPU is a Turing-complete processor. It can implement any algorithm. Some things, however, are very fast. For example, copying data between bits of memory in specific patterns that are common for graphics.
Modern graphics APIs are split into two parts. The shader language (GLSL or HLSL) is used to write the small programs that run on the graphics card and implement the various stages of the pipeline. The rest is responsible for things like passing data to the card (e.g. textures, geometry), setting up output buffers, and scheduling the shaders to run.
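The "sequence of shader programs" idea above can be sketched as a toy software model. This is purely illustrative (no real GL calls, all names made up): each stage is just a small function mapped over a stream of data, which is essentially what the unified shader model turned the hardware into:

```python
# Toy model of a programmable pipeline: per-vertex and per-fragment
# "shaders" are ordinary functions run over streams of data.

def vertex_shader(position, offset):
    # Per-vertex program: translate the vertex by a uniform offset.
    return tuple(p + o for p, o in zip(position, offset))

def fragment_shader(color):
    # Per-fragment program: halve the brightness.
    return tuple(c * 0.5 for c in color)

vertices = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
transformed = [vertex_shader(v, (0.5, 0.5)) for v in vertices]
shaded = [fragment_shader((1.0, 0.8, 0.6))]
print(transformed[0], shaded[0])
```

On real hardware both stages would be compiled from GLSL or HLSL and scheduled onto the same execution units; the host-side API's job is just to feed in the data and wire the stages together.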
Re: (Score:2)
The GeForce added transform and lighting
Nerdmode:on
Oh, how excited I was to get a GeFORCE256 card and talk about T&L in hardware in my home PC... Ironically I worked with an SGI RE2 about two feet away from me at the time and couldn't get as excited about it (have you ever worked with Irix? LOL.)
Nerdmode:off
Re: (Score:2)
So basically, how much of these APIs actually deal with interfacing the graphics card and it's hardware accelerated features
Most of these APIs. OpenGL and D3D are basically meant to be thin, portable layers encapsulating the capabilities of (some generation of) the graphics hardware.
and how much of it is more along the lines of just a standard library that contains frequently used graphics algorithms?
Not much of it. You can use the GLU (GL Utilities) library for some software utility functions (basically just convenience or comfort stuff, no thick API layers). Even for very basic stuff like matrix multiplications you have to use 3rd-party libraries (if you need to do it on the CPU rather than in a shader on the GPU). The API implementations ma
Re: (Score:1)
OpenGL a thing of the past (Score:1)
But DirectX is no better.
The future lies in directly programming the hardware with a classical programming language, building your own renderers in software, hopefully not limited by outdated polygon technology.
Re: (Score:2)
Re: (Score:3)
It's a troll, or loufoque is a bit detached from reality, but this does bring up an interesting point: a lot of what people are looking into these days in terms of rendering is voxels drawn using polygons. Minecraft? Basically those tiles are voxels being rendered as uniform convex hulls, which lends itself to some amazing efficiency.
This [youtube.com] is even more interesting from a technical perspective - stretching isosurfaces across voxel terrain to create a truly malleable world.
Re: (Score:2)
HAHAHAHA!
Wait, you're serious?
HAHAHAHAHAHA!
NO games programmer wants to get involved in bare hardware coding. That would require so much redundant code to be written, and testing would be an absolute nightmare. Even the vaunted Intel Larrabee design was going to have drivers and code so that it would appear to games as a regular OpenGL/DirectX card. You could write your own code, sure, but it would default to acting just like any other card (as far as the software can tell).
Re: (Score:2)
Ever heard of engines and other middleware?
Re: (Score:2)
Even engine developers don't want to do that unless necessary for some really, really cool feature (realtime ray-tracing, maybe). It's just far too much work.
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Who owns OpenGL (Score:1)
OpenGL.... (Score:2)
We are already at 4.2...
Wow, GL moves fast. But who cares, when you need to force your users to update their drivers just so you can use the bare minimum of "new" features?
Seriously, what's the rate of adoption? Perhaps it was never labeled properly, but don't all default installs of GL support only 1.x? What drivers and cards provide support for GL 2.0+?
And most importantly, why should I bother developing in a newer version of GL if I don't know whether the user will be able to update to the right version to run a game?