OpenGL 3.0 Released, Developers Furious 643
ikol writes "After over a year of delays, the OpenGL ARB (part of the Khronos industry group) today released the long-awaited spec for OpenGL 3.0 as part of the SIGGRAPH 2008 proceedings. Unfortunately, it turns out not to be the major rewrite that was promised to developers. The developer community is generally furious, with many game developers intending to jump ship to DX10. Is this the end of cross-platform 3D on the cutting edge?"
Re:Question (Score:5, Informative)
Re:Question (Score:5, Informative)
Cross-platform 3D is useful, but OpenGL stopped being cutting-edge many years ago. The model that it uses is falling farther and farther from the model that the hardware supports, and many new extensions and features are not supported on many platforms (particularly ATI). It has become increasingly difficult to write cutting-edge graphics software, and OpenGL 3 does little to fix that.
Re:Typical F/OSS Failure (Score:2, Informative)
OpenGL isn't from the open source community, actually. Its "open" is the old "open systems" open, which is rather less open.
While people might say Linux uses OpenGL, a lot of the time it's using Mesa, which implements the OpenGL specifications but is NOT a certified OpenGL implementation.
(This revised 3.0 might be good news for Mesa: unlike the originally threatened backward-incompatible 3.0, which, perhaps contrary to this Slashdot "story", most OpenGL folk decided they didn't like anyway, it looks like it might be implementable without certain patent issues biting.)
No it doesn't (Score:5, Informative)
Part of the reason for DX's success is that nobody else seems interested in developing anything to compete with it. OpenGL is the only cross-platform 3D API I'm aware of, and it and DX are all there is these days. GL's problem is that it isn't keeping up with the hardware. "Just use vendor-specific extensions" isn't a realistic solution in most cases. Thus GL is suffering and DX is winning by default.
If someone like Apple did develop a good 3D API, it might do well. However nobody seems interested.
Re:Err, yeah. (Score:5, Informative)
The PS3 uses OpenGL ES for basic rendering (GL with all the ancient cruft ripped out) and NVIDIA's Cg for the actual shaders.
what we have here is a misunderstanding (Score:1, Informative)
DirectX is graphics, input, sound and peripherals interaction.
OpenGL can only handle the graphics portion of a game; everything else needs other products. The unified nature of DirectX makes it superior in many ways.
The only bad thing about it is that in its pure form it's Windows and Xbox only, if we don't count Wine.
Re:Question (Score:5, Informative)
That's most of the problem though... they did rewrite OpenGL, then they scrapped it. So in the process, we got a few years of the new version not existing. And a year of communication (from ARB/Khronos) not existing, particularly frustrating after they'd spent the previous year saying they were going to work on communications and transparency.
Even better, GL2 was supposed to be a cleaned up API, so this was the second time they promised a rewrite and scrapped it.
So either they were completely wrong about the justification for the rewrite both times (which doesn't bode well for the group in charge of the API) or we are missing out on the benefits the rewritten API would have provided.
Probably the biggest problem was the communication, though; if they'd admitted the problems as they happened, there probably would have been less backlash. As it is, everyone was still pretty much expecting the original 3.0 design. Not getting that, on top of a year's worth of promised status updates, on top of the previous poor communication the promised status updates were supposed to fix, on top of the promised-then-scrapped 2.0 update, etc., leads to an unhappy community.
(For those not following the situation, advertised benefits included:
simpler api = simpler drivers = better conformance + fewer driver bugs
new object model = less need for consistency checking in drivers = faster drivers with fewer bugs
getting rid of outdated code paths = easier to understand the api, easier to tell what will be fast
probably some more I forgot)
Re:Calm down (Score:4, Informative)
Thing is it's not even close to being easy to use anymore. Especially not if you're interested in performance.
Because of the two decades of crud that has accumulated, there are so many ways of falling off the fast path in OGL, and it's next to impossible to know beforehand what will and will not work. Drivers are also a bitch to develop and maintain because of the size of the thing, which makes things even worse since what works on Nvidia may not work on ATI and vice versa.
The only way to fix this is a good cleanup to bring it in line with modern hardware. What they did was add even more crud.
The Chicken and the Egg (Score:5, Informative)
Vista is approaching 20% of the market. Top Operating System Share Trend [hitslink.com] You can't expect Linux ports if entry-level DX9/DX10 outperforms OGL.
GL is doomed in the short-to-medium term... (Score:5, Informative)
...and probably irrelevant in the longer term.
This is not the first time this has happened. GL2 was also supposed to be a cleanup, but turned out to be anything but. This latest fiasco is more significant as a failure of governance than of technology. All the right ideas were there; they were published in some detail over a year ago in the Pipeline newsletters. But the ARB very obviously a) can't agree to get anything meaningful done, and b) now has subzero credibility with developers. It's not coming back from that.
So yes, I think cutting-edge cross-platform 3D is dead for the next 2-3 years. Let's face it, it was never exactly healthy. It's not the end of the world. Linux share is currently growing mostly at the low end, netbooks etc, while the Mac seems to be thriving despite the fact that Apple doesn't give a flying fsck about gaming and never has.
Fast forward a couple of years, though, and things like Larrabee will be hitting the market; embarrassingly parallel hardware that can be programmed with standard languages and tools. The API's role as gatekeeper of functionality will be gone. And suddenly, 3D rendering libs can be written by anyone with the time and expertise, without having to go through Microsoft or the ARB or NV or AMD/ATi or Intel or anyone. Experimentation, competition, cross-fertilization, evolution. Remember Outcast's [wikipedia.org] voxel engine? Seen things like Anti-Grain [antigrain.com]? This will happen.
(Or, yes, people could just reimplement the DXwhatever API directly, but that would be a little disappointing.)
Today was not a good day, by any stretch of the imagination. But it's not the end.
Re:Question (Score:5, Informative)
Well, it is a Microsoft product, so it's not without its flaws (the Vista dependency, for one), but overall it's a good API for taking advantage of modern hardware without all the legacy crud that plagues OpenGL.
If you've used D3D8 or older, you'll find it a massive improvement.
Re:Question (Score:1, Informative)
Re:Hard to believe the new standards change anythi (Score:3, Informative)
ATI supports OpenGL, and Intel added full support as well when they implemented DX10 (only one chipset has these so far, IIRC).
The apparent failure of OpenGL to provide a significant rival to DX10 sucks, though, especially since DX10 being Vista-only might have given game makers an incentive to jump ship in order to get bleeding-edge graphics onto XP systems.
Re:Err, yeah. (Score:3, Informative)
OpenGL ES (PSGL) is provided, but I don't think anyone is seriously using it except to do initial porting efforts.
Sony supply an alternative low level api called libGcm.
the misunderstanding is yours (Score:4, Informative)
That's an old, oft-perpetuated myth, that the value of D3D has anything to do with the satellite APIs.
Let's confront it with facts:
- Direct3D is the absolutely dominant component of DirectX, in terms of received and deserved attention from users, as well as R&D effort by MSFT. It's the advancement of D3D that drives MSFT to release each next version of DirectX.
- Each component of DX is a completely independent API, sharing only design conventions. OpenGL games on Windows use DirectInput for input, perfectly ignoring Direct3D.
- Funnily enough, the satellite APIs are actually being phased out by MSFT. DirectPlay was always thoroughly ignored by the industry. For DirectSound and DirectInput there are replacements already. Not to mention the fate of hardware acceleration of DirectSound in Vista.
D3D and DX are de facto interchangeable terms. Get over it.
That OpenGL doesn't handle sound or input is absolutely irrelevant in a discussion about its applicability.
Re:people still make opengl games? (Score:2, Informative)
DX11 brings "compute shaders" to the table, which is a Good Thing - forcing a standard for GPU computation, allowing say hardware accelerated physics libraries to run on GPUs from multiple vendors.
Windows 7 is the usual product development cycle, and it was in the pipeline before Vista was even beta.
Re:Question (Score:5, Informative)
the legacy crud that plagues OpenGL.
Did you read "the deprecation model" (appendix e) of the OpenGL 3.0 spec? OpenGL 3.0 apparently provides for a mode (a "forward compatible context") that helpfully excludes deprecated "legacy crud".
This sounds very handy for people trying to update codebases - they can presumably switch to a forward-compatible context, do a build, see what breaks.
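In spirit, that audit could even start before touching the build: grep the codebase for entry points appendix E deprecates. A toy sketch of the idea; the deprecation list below is a small illustrative sample, not the spec's full list:

```python
# Toy sketch: a grep-style pass over source text for GL entry points
# marked deprecated in the OpenGL 3.0 spec (appendix E) -- a poor man's
# preview of what would break in a forward-compatible context.
# The DEPRECATED set here is illustrative, not exhaustive.
import re

DEPRECATED = {"glBegin", "glEnd", "glVertex3f", "glMatrixMode",
              "glPushMatrix", "glPopMatrix", "glNewList", "glCallList"}

def find_deprecated_calls(source: str):
    """Return, sorted, the deprecated GL calls appearing in `source`."""
    calls = set(re.findall(r"\bgl[A-Z]\w*", source))
    return sorted(calls & DEPRECATED)

code = """
    glMatrixMode(GL_MODELVIEW);
    glBegin(GL_TRIANGLES);
    glVertex3f(0, 0, 0);
    glEnd();
    glDrawArrays(GL_TRIANGLES, 0, 3);
"""
print(find_deprecated_calls(code))  # glDrawArrays is not flagged
```

Of course, only an actual build against a forward-compatible context catches everything, but a pass like this gives a quick first estimate of how much legacy surface a codebase touches.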
Vocal minority... (Score:1, Informative)
Sure, *some* developers might be "furious", and maybe it was wrong for Khronos to abandon the original proposal.
In any case, even with OpenGL 2.0 you can do a lot of stuff. Are there games that benefit from using Direct3D instead of OpenGL 2.0? Having done a lot of programming in DX and MDX (before it was rolled into XNA) and in OpenGL, I really wonder what someone might claim is missing in OpenGL...
I think the features people are "furious" about must be pretty damned esoteric.
Re:Question (Score:5, Informative)
AFAIK you cannot use the dependency resolution logic of apt or yum or w/e without also divulging the source code, something which is never going to happen with commercial s/w.
kjella@desktop:~$ dpkg --info opera_9.51.2061.gcc4.qt3_i386.deb
new debian package, version 2.0.
size 8295240 bytes: control archive= 6485 bytes.
34 bytes, 2 lines conffiles
1275 bytes, 21 lines control
16580 bytes, 231 lines md5sums
1719 bytes, 54 lines * postinst #!/bin/sh
572 bytes, 18 lines * postrm #!/bin/sh
179 bytes, 9 lines * prerm #!/bin/sh
Package: opera
Version: 9.51.2061.gcc4.qt3
Section: non-free/web
Priority: optional
Architecture: i386
Depends: libc6 (>= 2.1.3), xlib6g (>= 3.3.6) | xlibs | libxmu6, libqt3-mt (>= 3.3.4), libstdc++6
Suggests: flash-npapi-plugin | flashplugin-nonfree | swf-player | libflash-mozplugin | mozilla-plugin-gnash, pdf-npapi-plugin | djvulibre-plugin | mozilla-acroread, cupsys-client | lpr, sun-java6-jre | sun-java5-jre | java-gcj-compat, linux-libertine | ttf-dejavu | ttf-bitstream-vera | msttcorefonts, xine-plugin | gxineplugin | mplayerplug-in | kaffeine-mozilla | mozilla-mplayer | mozilla-helix-player | gecko-mediaplayer, mozplugger | plugger, mozilla-bonobo, aspell
Conflicts: opera-static
Replaces: opera-static
Provides: opera-static, www-browser
Installed-Size: 20100
Maintainer: Opera Packaging Team
Bugs: mailto:packager@opera.com [mailto]
Description: The Opera Web Browser
Welcome to the Opera Web browser. It is smaller, faster,
customizable, powerful, yet user-friendly. Opera
eliminates sluggish performance, HTML standard violations,
desktop domination, and instability. This robust Web
browser lets you navigate the Web at incredible speed and
offers you the best Internet experience.
The binaries were built on Debian using gcc-4.0.0.
I think someone sent us a telex saying they want their troll back.
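Worth noting: the Depends: line in the dpkg output above is declarative metadata shipped inside the .deb itself, which is all a resolver needs; no source code is involved. A rough sketch of reading such a line (real tools like dpkg/apt handle far more syntax than this):

```python
# Sketch: dependency info in a .deb is plain declarative metadata, so
# resolving it requires no source code. Minimal parse of a Depends:
# line into clauses, where each clause is a list of alternatives
# (the `a | b` syntax) as (name, version-constraint) pairs.

def parse_depends(line):
    """Split 'libc6 (>= 2.1.3), a | b' into a list of alternative-lists."""
    deps = []
    for clause in line.split(","):
        alts = []
        for alt in clause.split("|"):
            alt = alt.strip()
            name, _, version = alt.partition(" (")
            alts.append((name.strip(), version.rstrip(")") or None))
        deps.append(alts)
    return deps

line = "libc6 (>= 2.1.3), xlib6g (>= 3.3.6) | xlibs | libxmu6, libstdc++6"
for clause in parse_depends(line):
    print(clause)
```

A package manager satisfies each clause by installing any one of its alternatives, which is exactly what the long Suggests: line above relies on.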
Re:This can't be good. (Score:2, Informative)
The thing OpenGL is typically bad at is removing legacy stuff
One of the innovations of OpenGL 3.0 is a means for deprecating and removing legacy stuff, see appendix E of the spec....
Re:No it doesn't (Score:5, Informative)
If someone like Apple did develop a good 3D API, it might do well. However nobody seems interested.
Sadly, won't ever happen. Apple's "commitment" to gaming on their platform doesn't extend far beyond 3D chess and Tetris clones. Hell, having a working Flash client is probably Apple's idea of supporting "gaming" for their users.
Apple appears to be quite content with OpenGL in its current state, and hasn't even come close to pushing its limits.
Have you installed the DirectX SDK lately? It's sad how wide the divide is. On the DirectX side you get a *massive* library of documentation, sample code snippets, entire sample projects, and more guides than you can shake a stick at. Compare this with the new hotness that is Apple's iPhone SDK. Worlds apart. The iPhone SDK documentation is absolute trash. There are almost no tutorials; "sample code" is hardly ever commented; there are no code snippets to accompany tricky API calls; and the entire thing uses so much Objective-C-speak that I'm quite surprised anybody but a hardened Mac developer can even begin to comprehend it.
One company is very good at fostering a developer community and making sure it's easy to get on board with their API. The other seems to go out of its way to torture devs.
Disclaimer: I am a hobbyist iPhone developer, Mac user, Xbox owner, and DirectX developer.
Re:people still make opengl games? (Score:5, Informative)
MPC: So, you said Rage is a 60Hz game. Is it an OpenGL or DirectX game?
JC: It's still OpenGL, although we obviously use a D3D-ish API [on the Xbox 360], and Cg on the PS3. It's interesting how little of the technology cares what API you're using and what generation of the technology you're on. You've got a small handful of files that care about what API they're on, and millions of lines of code that are agnostic to the platform that they're on.
MPC: Are you using DirectX 9 equivalent? For Doom 4 as well?
JC: Yes to both. It's one of those things I get asked a lot. What's big and exciting for DirectX 10 or DirectX 11? There's not a whole lot of... really not a whole lot. The big touted geometry shaders were, in many ways, a mistaken belief that people desperately wanted to create stencil shadow volumes.
So less than a month ago John said that he's still developing with OpenGL and that DX10 isn't really a worthwhile improvement.
And congratulations on referring me to something he said ages ago, when you find something more recent feel free to reply
Oh and source of interview: http://www.maximumpc.com/article/features/e3_2008_the_john_carmack_interview_rage_id_tech_6_doom_4_details_and_more?page=0%2C0 [maximumpc.com]
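The structure Carmack describes, a small handful of API-aware files under millions of agnostic lines, is just a thin backend interface. A toy sketch, with all class and method names made up for illustration:

```python
# Toy sketch of an API-agnostic renderer layer: engine code talks only
# to Renderer, while the backends are the only files that know whether
# GL or D3D exists. All names here are invented.

class Renderer:
    def draw_mesh(self, mesh):
        raise NotImplementedError

class GLRenderer(Renderer):
    def draw_mesh(self, mesh):
        return f"glDrawElements({mesh})"        # stand-in for real GL calls

class D3DRenderer(Renderer):
    def draw_mesh(self, mesh):
        return f"DrawIndexedPrimitive({mesh})"  # stand-in for D3D calls

def render_frame(r: Renderer, meshes):
    # Engine code: agnostic to which backend it was handed.
    return [r.draw_mesh(m) for m in meshes]

print(render_frame(GLRenderer(), ["level", "player"]))
```

Swapping D3DRenderer for GLRenderer changes nothing above this interface, which is why "what generation of the technology you're on" barely matters to the bulk of the code.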
Xbox does not support DX10, only Vista does (Score:2, Informative)
As for where do I get the information for DX10 not running on the Xbox, read below.
1up reports that ATI has debunked a rumor that Xbox 360 could be upgraded to support DirectX 10 via a patch. "Xbox360 cannot run DX10," an ATI spokesperson told 1up. Currently, Microsoft's console runs an advanced version of DirectX 9, which, according to ATI, features "memory export that can enable DX10-class functionality such as stream-out."
http://www.joystiq.com/2006/08/24/xbox-360-cant-run-directx-10-confirms-ati/ [joystiq.com]
Mod parent up, maybe (Score:4, Informative)
All I've really seen of the PS3 dev kit is what was on display at GDC. The Sony guys talked about GL ES and NVIDIA's Cg toolchain for shaders, so that's what I posted. This, however, sounds a lot more like what I expected from Sony and is right in line with the PS2 dev kit (emphasis mine):
Sony supply an alternative low level api called libGcm.
If libGcm is what I think it is (macro'd constants to build push buffers + raw DMA access) then pretty much nobody will be using the GL stuff. Coding right to the hardware is what PlayStation development is all about.
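For those wondering what "macro'd constants to build push buffers" means in practice: the idea is packing command words directly into a buffer that the GPU's command processor consumes via DMA. A toy illustration with invented opcodes; nothing here reflects the real libGcm API:

```python
# Toy push-buffer builder: commands are packed as raw little-endian
# words into a buffer a GPU command processor would read via DMA.
# The opcode values are invented for illustration only.
import struct

CMD_SET_SHADER = 0x01
CMD_DRAW       = 0x02

class PushBuffer:
    def __init__(self):
        self.buf = bytearray()

    def emit(self, opcode, *args):
        # One word for the opcode, one per argument.
        self.buf += struct.pack("<I", opcode)
        for a in args:
            self.buf += struct.pack("<I", a)

pb = PushBuffer()
pb.emit(CMD_SET_SHADER, 7)       # bind shader #7
pb.emit(CMD_DRAW, 0, 300)        # first vertex, vertex count
print(len(pb.buf))               # 5 words = 20 bytes
```

The appeal over a full API like GL is obvious: no validation, no state tracking, just words in a buffer, which is why "coding right to the hardware" wins on a fixed console.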
Re:No it doesn't (Score:3, Informative)
There is no Half Life for Mac.
Depends on how you mean that. According to this section of the Half-Life article, [wikipedia.org] Sierra made one, said they were going to release it, [planetfortress.com] and then never did....
Makes you wonder if it's just sitting in a vault somewhere....
Re:This can't be good. (Score:5, Informative)
What MS call "Shader model 4" (even though geometry programs aren't, strictly speaking, shaders as they don't necessarily SHADE anything per se) includes mandatory support for geometry programs.
The geometry program sits in the programmable pipeline between the vertex program (which is used for real-time vertex deformation in hardware, world-space to object-space clipping to generate texcoords for the fragment program, etc) and the fragment program (which is used to colour fragments [1] based upon the output of the vertex program and input from one or more texture samplers.)
Unlike "old" vertex programs, a geometry program is able to generate new geometry on the fly. This allows a whole heap of really cool stuff, such as real-time shadowing effects, for essentially free.
So, yeah, much as I hate to admit it (and REALLY hate the Direct3D 'shader' nomenclature concerning pipeline programs,) D3D10 actually has changes with merit from D3D9c.
[1] "Fragment" is a fancy name for a candidate pixel: the per-sample data that rasterisation produces in window space. After shading and occlusion testing, the surviving fragments are written out as pixels. Thus, the term 'pixel shader' is rather inaccurate.
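A CPU-side caricature of where the geometry stage sits, per the description above. Everything here is illustrative Python, not real shader code; the "shaders" are plain functions and the amplification is a naive centroid split:

```python
# Toy emulation of the programmable pipeline order described above:
# a vertex stage transforms vertices one at a time, then a geometry
# stage may emit *more* primitives than it received (the new ability
# in shader model 4), before fragments would be shaded.

def vertex_stage(v, offset):
    # "vertex program": per-vertex transform (here a simple translation)
    return tuple(a + b for a, b in zip(v, offset))

def geometry_stage(tri):
    # "geometry program": may output new geometry on the fly.
    # Naively split one triangle into three around its centroid.
    a, b, c = tri
    m = tuple(sum(coords) / 3 for coords in zip(a, b, c))
    return [(a, b, m), (b, c, m), (c, a, m)]

tri = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
moved = tuple(vertex_stage(v, (0.0, 0.0, 1.0)) for v in tri)
out = geometry_stage(moved)
print(len(out))  # one triangle in, three out
```

Shadow-volume extrusion works the same way in principle: the geometry program receives a primitive and emits extra faces, which is why it removes a round trip to the CPU.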
Re:The Chicken and the Egg (Score:5, Informative)
That's measured from visitors to thousands of sites, including many large mainstream websites. So that'd be computers in use, not licenses sold. Yep, Vista adoption is going even better than XP's did back in 2001, and no, not everyone gets rid of Vista for XP (I'm very happy with upgrading from XP to Vista myself -- no plans on ever downgrading to XP). Not that anyone here would admit to that...
Re:OpenGL falling down a pit (Score:3, Informative)
That's the entire problem. nVidia would have functionality available in both DX and GL drivers on release day and would frequently submit it to the ARB for ratification as an extension, which often would become a core feature in the future.
Unfortunately, nobody else took their lead. In good scenarios, you'd have separate implementations on different hardware - extensions named GL_NV_blah and GL_ATI_blah. Sometimes only one would implement it (I think ATI's vertex buffers were judged superior and promoted to core shortly thereafter).
The worst offender, though (and really the sign that OpenGL was going nowhere), was geometry shaders. nVidia had the supporting hardware first, but rather than make it an extension specific to their drivers, they implemented it and submitted it as a GL_EXT extension, one that everyone should implement. Nobody else did.
Not licenses - users (Score:4, Informative)
FYI:
We collect data from the browsers of site visitors to our exclusive on-demand network of live stats customers. The data is compiled from approximately 160 million visitors per month.
Additional estimates about the website population:
76% participate in pay per click programs to drive traffic to their sites.
43% are commerce sites
18% are corporate sites
10% are content sites
29% classify themselves as other (includes gov, org, search engine marketers, etc.) About Our Market Share Statistics [hitslink.com]
Net Applications stats are global.
Its clients - for reasons which should be blindingly obvious - are interested only in meaningful stats about users, not licenses.
Plenty of people are buying computers with Vista and switching to another OS, or downgrading to XP.
The numbers simply aren't there to support this argument.
Re:Not licenses - users (Score:5, Informative)
Net Applications doesn't give a damn about the locked-down corporate desktop - it is selling stats about the home market to retailers like Target.
Every user I know personally who has tried Vista rolled back to XP or moved to Linux.
The plural of anecdote is not data. Net Applications builds its stats from 160 million page views each month.
yet for all the new Vista licenses being sold, XP dominates the statistics you linked, 70 percent to 17.
The Net Applications stats are global.
There are by some measures a billion users world-wide running Windows - most on older hardware that cannot be realistically upgraded to Vista.
But something like 1 in 5 users will have made a very significant investment in hardware and in Vista in less than two years.
Not completely a "fail" (Score:4, Informative)
Re:Um, DUH!!! Who own OpenGL now? (Score:5, Informative)
Please stop modding up this troll.
That article is 6 years old.
Most of those patents are hardware patents totally irrelevant for OpenGL (or Direct3D for that matter).
Also, Microsoft is not a member of the group that actually writes the OpenGL specification. They have no vote on what does or doesn't get into OpenGL.
Of course this might give them leverage on some of the hardware vendors (like Nvidia) that will have to implement the new OpenGL standard in the future. But history does not show them trying to use this in any way against OpenGL.
But claiming they "own OpenGL" is nonsense.
Re:Question (Score:5, Informative)
Implements.
Re:KDE? (Score:1, Informative)
Actually, that would be only half a circle, 2*pi is a full one, am i rite?
Re:KDE? (Score:1, Informative)
Re:The Chicken and the Egg (Score:3, Informative)
I can't believe you are the obvious one to this discussion. I'm not dogging you in the least. I'm saying that floppies should have never been removed from the computers.
I build systems and always spec a floppy drive. I have a few colleagues who seem to skip out on them, and nothing makes me more pissed off than an 11:00 pm service call that ends up in a 3 am hunt for a floppy because something needs to be done to the boot sector, or the BIOS needs flashing after a Windows update made something previously stable too unstable to boot the OS. There are other times, like when whatever problem stops access to the CD-ROM drives and you need to transfer a small file. Of course the network is out, because generally when that happens it is due to a virus infection or some other malware (WinAntivirus2008). Floppies should never be skipped out on. And when a tech claims they aren't needed, that tech either isn't exposed to the abuses of idiots working on them, or they don't fix the problem but just reimage the drive, which can lead to another can of worms.
Re:The Chicken and the Egg (Score:5, Informative)
Really? Perhaps if you're using Excel on a Pentium II for your calcs...
XP hit 20% of the market in less than a year, and was at 40% by the 24 month mark.
Vista was released in November '06, and in August '08 it is still below 20%. It might make that 20% share by November, 24 months after it was released.
That's a dismal performance by any standards, but for a monopoly OS that was seven years in development, it's an astonishing failure.
Comment removed (Score:5, Informative)
Re:Question (Score:3, Informative)
The goal of Wine is a full reimplementation of the Windows API which will make Windows unnecessary.
Emphasis mine.
Wine comes with a full set of headers and libraries which make it possible for a programmer to view the Win32 API as a spec and recompile it with the Wine implementation.
However, Wine also comes with a program that loads native Win32 .exe files and tricks them into thinking that they're running on a bona-fide Win32 OS. This is how most end-users experience Wine, and it's hard to argue that's not an emulator.
WINE should really stand for "Wine Is Not just an Emulator" or maybe "Wine Is Not a hardware Emulator".
Explanation from OpenGL ARB Working Group Chair (Score:5, Informative)
Basically, they got tangled in the implementation details and decided to play it safe with OpenGL 3.0 instead of starting from scratch.
========
What happened to Longs Peak?
In January 2008 the ARB decided to change directions. At that point it had become clear that doing Longs Peak, although a great effort, wasn't going to happen. We ran into details that we couldn't resolve cleanly in a timely manner. For example, state objects. The idea there is that all state is immutable. But when we were deciding where to put some of the sample ops state, we ran into issues. If the alpha test is immutable, is the alpha ref value also? If we do so, what does this mean to a developer? How many (100s?) of objects does a developer need to manage? Should we split sample ops state into more than one object? Those kinds of issues were taking a lot of time to decide.
Furthermore, the "opt in" method in Longs Peak to move an existing application forward has its pros and cons. The model of creating another context to write Longs Peak code in is very clean. It'll work great for anyone who doesn't have a large code base that they want to move forward incrementally. I suspect that that is most of the developers that are active in this forum. However, there is a class of developers for which this would have been a potentially very large burden. This clearly is a controversial topic, and has its share of proponents and opponents.
While we were discussing this, the clock didn't stop ticking. The OpenGL API *has to* provide access to the latest graphics hardware features. OpenGL wasn't doing that anymore in a timely manner. OpenGL was behind in features. All graphics hardware vendors have been shipping hardware with many more features available than OpenGL was exposing. Yes, vendor specific extensions were and are available to fill the gap, but that is not the same as having a core API including those new features. An API that does not expose hardware capabilities is a dead API.
Thus, prioritization was needed, and we made several decisions.
1) We set a goal of exposing hardware functionality of the latest generations of hardware by this Siggraph. Hence, the OpenGL 3.0 and GLSL 1.30 API you guys all seem to love ;)
2) We decided on a formal mechanism to remove functionality from the API. We fully realize that the existing API has been around for a long time, has cruft, and is inconsistent in its treatment of objects (how many object models are in the OpenGL 3.0 spec? You count). In its shortest form, removing functionality is a two-step process. First, functionality will be marked "deprecated" in the specification. A long list of functionality is already marked deprecated in the OpenGL 3.0 spec. Second, a future revision of the core spec will actually remove the deprecated functionality. After that, the ARB has options. It can decide to do a third step, and fold some of the removed functionality into a profile. Profiles are optional to implement (more below) and their functionality might still be very important to a sub-set of the OpenGL market. Note that we also decided that new functionality does not have to, and will likely not, work with deprecated functionality. That will make the spec easier to write, read and understand, and drivers easier to implement.
3) We decided to provide a way to create a forward-compatible context. That is an OpenGL 3.0 context with all deprecated features removed. Giving you, as a developer, a preview of what a next version of OpenGL might look like. Drivers can take advantage of this, and might be able to optimize certain code paths in the forward-compatible context only. This is described in the WGL_ARB_create_context extension spec.
4) We decided to have a formal way of defining profiles. During the Longs Peak design phase, we ran into disagreement over what features to remove from the API. Longs Peak removed quite a lot of features as you might remember. Not coincidentally, most of those features are marked deprecated in OpenGL 3
Re:The Chicken and the Egg (Score:3, Informative)
I can't believe you are the obvious one to this discussion. I'm not dogging you in the least. I'm saying that floppies should have never been removed from the computers.
I build systems and always spec a floppy drive. I have a few colleagues who seem to skip out on them, and nothing makes me more pissed off than an 11:00 pm service call that ends up in a 3 am hunt for a floppy because something needs to be done to the boot sector, or the BIOS needs flashing after a Windows update made something previously stable too unstable to boot the OS. There are other times, like when whatever problem stops access to the CD-ROM drives and you need to transfer a small file. Of course the network is out, because generally when that happens it is due to a virus infection or some other malware (WinAntivirus2008). Floppies should never be skipped out on. And when a tech claims they aren't needed, that tech either isn't exposed to the abuses of idiots working on them, or they don't fix the problem but just reimage the drive, which can lead to another can of worms.
Don't you have, like... a USB stick? You DO know that modern computers can boot from them, etc.? The reason we gave up floppies isn't because the need for what they were good for disappeared. It's because something a lot better in a dozen ways came along. Sure, floppy drives are good for older computers, but those have them already.
I haven't had a floppy drive in five years and haven't needed one a single time since. During that time I've actively used three desktop computers (2 of my own and 1 at work) and two laptops (one at work and one personal). I've done a lot of tweaking around with everything possible and can't imagine a single use for floppies that USB sticks wouldn't be just as good for.
Re:No it doesn't (Score:3, Informative)
You don't have to convince MS to implement the API. You might notice that MS doesn't implement OpenGL, yet it works just fine. Windows doesn't prevent 3rd-party APIs. Video card vendors are free to implement whatever APIs they like. In XP they really have nothing at all to do with each other. In Vista, the XP method works fine, though Vista will have to turn off its shiny features if one is used, or there's a new path that's fully WDDM compatible. nVidia and ATi both implement this with their OpenGL drivers. So GL apps run just fine, and don't have any problems with regards to the eye candy. In the case of nVidia, their OpenGL drivers are highly developed and are as fast as their DX drivers.
So if a company/group developed a good 3rd-party API, you could perhaps convince nVidia and ATi to implement it on Windows and other platforms. MS would never enter the equation.
For reference, this sort of thing has been done in the pro audio world. For a long time OSes didn't have very good support for low latency audio. So Steinberg developed a cross platform standard called ASIO. You get a soundcard with ASIO drivers and an ASIO compatible app and there you go, low latency bit accurate input/output. Even though OSes now all feature their own low latency APIs, it continues to be used in part for legacy reasons but also largely because it is cross platform whereas OS specific APIs aren't.
You can easily get a soundcard that has WDM drivers, and thus works with all the Windows APIs such as MME, DS, and WDM/KS, and also has ASIO drivers. You can run apps of either kind seamlessly. The Creative Labs X-Fi supports not only WDM and ASIO, but also Creative's own standard OpenAL. So the soundcard has 3 different basic APIs with which an app can talk to it. Winamp can play files via MME (which goes through the WDM driver) while UT3 does sound through OpenAL, and Cubase can talk to it via ASIO. Windows (including Vista) is just fine with all of this.
MS doesn't force an all or nothing approach to DirectX. Prior to Vista, DirectX support wasn't really even required and indeed some pro cards only accelerated OpenGL. Even in Vista, DirectX support is required only in so far as if you provide a WDDM driver, you are automatically providing DirectX support. However that doesn't stop you at all from implementing any other APIs you like, and the big graphics vendors do just that.
However, before any of that is going to happen, you need to have a good API.
Re:Question (Score:2, Informative)
It was posted yesterday:
http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=243193&fpart=7
Re:Question (Score:4, Informative)
And the article text:
What happened to Longs Peak?
In January 2008 the ARB decided to change direction. At that point it had become clear that doing Longs Peak, although a great effort, wasn't going to happen. We ran into details that we couldn't resolve cleanly in a timely manner. For example, state objects. The idea there is that all state is immutable. But when we were deciding where to put some of the sample ops state, we ran into issues. If the alpha test is immutable, is the alpha ref value also? If so, what does this mean to a developer? How many objects (100s?) does a developer need to manage? Should we split sample ops state into more than one object? Those kinds of issues were taking a lot of time to decide.
Furthermore, the "opt in" method in Longs Peak for moving an existing application forward has its pros and cons. The model of creating another context to write Longs Peak code in is very clean. It'll work great for anyone who doesn't have a large code base that they want to move forward incrementally. I suspect that that describes most of the developers who are active in this forum. However, there is a class of developers for whom this would have been a potentially very large burden. This is clearly a controversial topic, and has its share of proponents and opponents.
While we were discussing this, the clock didn't stop ticking. The OpenGL API *has to* provide access to the latest graphics hardware features. OpenGL wasn't doing that anymore in a timely manner. OpenGL was behind in features. All graphics hardware vendors have been shipping hardware with many more features available than OpenGL was exposing. Yes, vendor specific extensions were and are available to fill the gap, but that is not the same as having a core API including those new features. An API that does not expose hardware capabilities is a dead API.
Thus, prioritization was needed, and we made several decisions.
1) We set a goal of exposing the hardware functionality of the latest generations of hardware by this Siggraph. Hence the OpenGL 3.0 and GLSL 1.30 API you guys all seem to love ;)
2) We decided on a formal mechanism to remove functionality from the API. We fully realize that the existing API has been around for a long time, has cruft, and is inconsistent in its treatment of objects (how many object models are in the OpenGL 3.0 spec? You count). In its shortest form, removing functionality is a two-step process. First, functionality will be marked "deprecated" in the specification. A long list of functionality is already marked deprecated in the OpenGL 3.0 spec. Second, a future revision of the core spec will actually remove the deprecated functionality. After that, the ARB has options. It can decide to do a third step and fold some of the removed functionality into a profile. Profiles are optional to implement (more below), and their functionality might still be very important to a subset of the OpenGL market. Note that we also decided that new functionality does not have to, and likely will not, work with deprecated functionality. That will make the spec easier to write, read and understand, and drivers easier to implement.
3) We decided to provide a way to create a forward-compatible context: that is, an OpenGL 3.0 context with all deprecated features removed, giving you, as a developer, a preview of what a next version of OpenGL might look like. Drivers can take advantage of this, and might be able to optimize certain code paths in the forward-compatible context only. This is described in the WGL_ARB_create_context extension spec.
4) We decided to have a formal way of defining profiles. During the Longs Peak design phase, we ran into disagreement over which features to remove from the API. Longs Peak removed quite a lot of features, as you might remember. Not coincidentally, most of those features are marked deprecated in OpenGL 3.0. The disagreements happened because of different market needs. For some markets a feature is essential,
Re:Question (Score:3, Informative)
What is funny is that this DirectX implementation runs on top of -- guess what? -- OpenGL, and often outperforms the actual Microsoft DirectX implementation. ... and I just told the guys from /b/ to keep their memes out of Slashdot [slashdot.org], so I can't use "lulz" and "epic fail" without sounding like a hypocrite. Damn.
Re:Question (Score:2, Informative)
Re:Question (Score:3, Informative)
However, as Wine becomes better and better, it becomes more viable for companies to easily port their application across (using winelib etc.).
Re:Not licenses - users (Score:4, Informative)
"Apple is the #3 seller of laptops on the planet right now"
They're the #3 seller in the US. Worldwide, Apple's market share of computers is around 3%, which isn't enough to put them in the top 5 global laptop manufacturers.
Re:The Chicken and the Egg (Score:1, Informative)
You're completely wrong:
Windows XP adoption was below 10% (according to some) for the first year (Vista's being around 15%). It certainly wasn't anywhere near 20%. See Ed Bott's Windows adoption rates: a history lesson [zdnet.com]
But hey, this is /. so you get modded +5 for bashing Vista, even if it's factually wrong, and revisionist history. Yep, XP is the real failure here when you look at the adoption rate.
Re:OpenGL falling down a pit (Score:1, Informative)
Your GPU maintains its efficiency by keeping a tight pipeline on vertex transformations and draw calls.
DirectX has pretty much one way of controlling this pipeline, and it matches current-generation hardware well (and the hardware matches DirectX; it's a chicken-and-egg syndrome). A new generation of DX might give game programmers a completely new interface if that is advantageous in terms of efficiency on the next-gen GPU.
OpenGL has numerous ways of controlling the graphics pipeline; let's pick the vertex transformation pipeline as an example.
You have the dead slow glBegin/glEnd. This assumes your GPU is an input/output machine: whenever you provide it with a vertex, it will process and render it immediately. That worked great 15 years ago, but it's not how your GPU works today. To work around this, your driver will likely buffer up the vertices, but that requires extra processing power, dynamic allocations, and... we call it fluff.
You have the faster pointer system, where you can set up OpenGL to process a batch of vertices at once by simply giving it a pointer to your vertices. That allows OpenGL to create batch jobs based on your input directly, which is a lot faster than buffering up the vertices one at a time. This is a lot closer to how your current GPU works.
Or you can use vertex buffer objects, which basically cache this data on the GPU memory side, for even more efficiency.
The problem with OpenGL is that there are a lot of ways to do almost anything, and it's not clearly defined which are fast and which are not on modern hardware. So the drivers are expected to be fast on all paths, which is very, very hard and time-consuming for driver developers. Ultimately, this leads to worse drivers; and it's one of the major reasons why, for example, ATI's OpenGL drivers suck badly compared to their DX drivers.
OpenGL 3.0 starts off by deprecating a lot of the outdated old ways of doing things, effectively limiting you to the fast ways. It's a good start. The "massive outcry" is basically that it's not doing enough, fast enough.
Re:No it doesn't (Score:3, Informative)
To be fair, how long has the iPhone SDK been out, and how long has DX been out? It is a much more mature product.