OpenGL 3.0 Released, Developers Furious
ikol writes "After over a year of delays, the OpenGL ARB (part of the Khronos industry group) today released the long-awaited spec for OpenGL 3.0 as part of the SIGGRAPH 2008 proceedings. Unfortunately, it turns out not to be the major rewrite that was promised to developers. The developer community is generally furious, with many game developers intending to jump ship to DX10. Is this the end of cross-platform 3D on the cutting edge?"
KDE? (Score:5, Funny)
OpenGL 3.1 will rock
Re:KDE? (Score:5, Funny)
Re:KDE? (Score:5, Funny)
OpenGL 3.5.9 will rock
Re:KDE? (Score:5, Funny)
Unfortunately, once it hits version 3.14.159 it comes full circle, starting back at the beginning.
Re:KDE? (Score:4, Funny)
Re:KDE? (Score:5, Funny)
Question (Score:5, Insightful)
Is this the end of cross-platform 3D on the cutting edge?
Probably not. As long as DX remains solely in the hands of Microsoft, there will be a use for other forms of cross-platform 3D, more so as the non-MS OSes continue to grow in numbers.
Re:Question (Score:5, Informative)
Re:Question (Score:5, Insightful)
The Chicken and the Egg (Score:5, Informative)
Vista is approaching 20% of the market (Top Operating System Share Trend [hitslink.com]). You can't expect Linux ports if entry-level DX9.L/DX10 outperforms OGL.
Re: (Score:3, Insightful)
How do you get 20% out of 16.9%?
On some of my sites (very generic, non-tech orientated) Linux is hitting a very respectable 5.6% of all traffic.
So Vista is doing appallingly, considering it's on all new computers.
Re:The Chicken and the Egg (Score:5, Informative)
That's measured from visitors to thousands of sites, including many large mainstream websites. So that'd be computers in use, not licenses sold. Yep, Vista adoption is going even better than XP's did back in 2001, and no, not everyone gets rid of Vista for XP (I'm very happy with upgrading from XP to Vista myself -- no plans on ever downgrading to XP). Not that anyone here would admit to that...
Re:The Chicken and the Egg (Score:5, Informative)
Really? Perhaps if you're using Excel on a Pentium II for your calcs...
XP hit 20% of the market in less than a year, and was at 40% by the 24 month mark.
Vista was released in November '06, and in August '08 it is still below 20%. It might make that 20% share by November, 24 months after it was released.
That's a dismal performance by any standards, but for a monopoly OS that was seven years in development, it's an astonishing failure.
Re: (Score:3, Interesting)
XP hit 20% of the market in less than a year, and was at 40% by the 24 month mark.
XP was a major upgrade (especially from the ignorant end-user perspective) over the existing Windows 98, Vista is not.
That's a dismal performance by any standards, but for a monopoly OS that was seven years in development, it's an astonishing failure.
The only people who think Vista should be storming the market, are those who take some sort of perverse pleasure highlighting that it is not doing so. Everyone else understa
Re:Shotgun Marriage (Score:5, Interesting)
All problems stem from one company: Microsoft. Microsoft is the reason Vista's adoption is half what XP's was, and it is also the reason adoption is as high as it is. What do I mean?
Microsoft is forcing vendors to sell Vista instead of XP. Microsoft is also forcing hardware vendors to implement BIOS hacks to prevent transitions from Vista to XP. This is evidenced by many factors, such as the lack of available XP drivers for these new pre-fab, pre-installed Vista boxes.
Keep in mind that this is not the case with custom built. Custom built machines can take XP or Vista, or any other.
I remember the Foxconn debacle from a while back. I think there's something similar going on here. When you attempt to install XP on various hardware that came pre-installed with Vista, you can get the OS installed. But if you attempt to install drivers for those components, if you can find them, there is an almost complete failure to get these components to function.
This is not the case with all manufacturers. It is the case with Gateway and with Toshiba. Both of these manufacturers are forcing Vista installs. It may be with a few chipset packages such as the Intel GM/GL 965. But it is happening.
After a successful install of XP (and after verifying that the components work under Vista), if you then attempt to install, say, the wireless, wired, sound, or SMBus drivers, you'll get messages from the installers informing you that the devices aren't present.
You can confirm that this is a BIOS-level function: if you take a component from a machine that came pre-installed with XP and put it into the new machine where you have removed Vista and installed XP, that component's driver installer will also tell you that the device is not present, even though it worked properly in the other machine.
This clearly is an attempt by Microsoft to mandate to the manufacturers that they are not to support XP any longer even if the customer has chosen to do this on their own.
We did not have this situation when going from Win2k to XP, nor from Win98 to XP. It appears to be an issue specific to going from Vista back to XP, created by a BIOS-level hack.
Contact with others has confirmed the situation. Many have reported that this is occurring, and the consensus is that it is a mandate by Microsoft to prevent users from running XP on these machines.
As I said, it isn't all machines. It is a new tactic being implemented on newer hardware in an attempt to force us to stay with Vista.
One has to ask why this is the case. Why on earth is Microsoft so hell bent on forcing us to Vista? Is it some hidden back door? Why would Microsoft care which OS we run given that we have paid them for Vista and paid a second time for XP? What is their motive for mandating this type of issue? Why would they dictate that the sales support for XP has been dropped so quickly?
Something is awry here.
Re:The Chicken and the Egg (Score:4, Insightful)
Plenty of people are buying computers with Vista and switching to another OS, or downgrading to XP.
I wouldn't call it a downgrade.
Re:The Chicken and the Egg (Score:4, Insightful)
I wouldn't call it a downgrade.
Perhaps a "retrograde to XP"? Sounds kinda hip.
Not licenses - users (Score:4, Informative)
FYI:
We collect data from the browsers of site visitors to our exclusive on-demand network of live stats customers. The data is compiled from approximately 160 million visitors per month.
Additional estimates about the website population:
76% participate in pay per click programs to drive traffic to their sites.
43% are commerce sites
18% are corporate sites
10% are content sites
29% classify themselves as other (includes gov, org, search engine marketers, etc.). About Our Market Share Statistics [hitslink.com]
Net Applications stats are global.
Its clients - for reasons which should be blindingly obvious - are interested only in meaningful stats about users, not licenses.
Plenty of people are buying computers with Vista and switching to another OS, or downgrading to XP.
The numbers simply aren't there to support this argument.
Re:Not licenses - users (Score:5, Informative)
Net Applications doesn't give a damn about the locked-down corporate desktop - it is selling stats about the home market to retailers like Target.
Every user I know personally who has tried Vista rolled back to XP or moved to Linux.
The plural of anecdote is not data. Net Applications builds its stats from 160 million page views each month.
yet for all the new Vista licenses being sold, XP dominates the statistics you linked, 70 percent to 17.
The Net Applications stats are global.
There are by some measures a billion users world-wide running Windows - most on older hardware that cannot be realistically upgraded to Vista.
But something like 1 in 5 users will have made a very significant investment in hardware and in Vista in less than two years.
Re:Not licenses - users (Score:5, Insightful)
The plural of anecdote is not data.
You know, I hear this a lot, but I think you're wrong, at least in this case. Or to put it more succinctly, the plural of anecdote may not be data, but the collective of anecdote is indeed data. If every user Enderandrew knows, and every user I know, and...
I mean, how many times do you have to hear the exact same story from how many different people before you admit it's the truth? I don't know one single person who is using Vista as their home OS. Zero. Nada. None. And I work in end-user support. I talk to lots of people. It's my job. And seriously, not one. I know some people who had it for a while, and ditched it for either XP or (yes, seriously) Ubuntu Linux.
I spend a fair bit of time in hipster coffee shops (don't judge me, it's part of my job), the patrons of which I take as a fairly good bellwether of consumer tech, if only because there's a decent amount of disposable income floating around and a majority of the machines in use at such places tend to be less than a year old (yeah, lots of Macs, but even so...) And there is no way I'm going to believe that Vista is at a 20% adoption rate, at least not in this major Midwestern metropolitan area. Absolutely not.
-p.
PS: It's 3:00 AM central time, and I have been drinking. If something up there doesn't make sense or sounds stupid, please ask me to clarify rather than modding me down. I don't usually drink and post, officer, honest :)
Re:Not licenses - users (Score:4, Informative)
"Apple is the #3 seller of laptops on the planet right now"
They're the #3 seller in the US. Worldwide, Apple's market share of computers is around 3%, which isn't enough to put them in the top 5 global laptop manufacturers.
Re:Not licenses - users (Score:4, Interesting)
None of which is surprising, given the differences between the circumstances of the releases of XP and Vista.
1. XP initially wasn't a whole lot more than a facelift of Win2K. Which itself wasn't initially a whole lot more than a facelift of NT4.0 which initially was - well, you get the point. So while the development process that eventually resulted in XP was ongoing, they kept releasing versions of it along the way. This is perhaps the only *real* mistake they made when developing Vista.
2. Microsoft really did choose a great point in the NT timeline to cross the system over to be their consumer operating system as well. Their latest attempt at a consumer OS had been an abject failure, customers were getting more and more disgruntled with the inherent instability of the 9x platform, the home PC market was really picking up steam, and home-user hardware was finally to the point that it could support an NT OS that had all the bells and whistles needed to make it appealing to said home users. So when XP came out it was almost a no-brainer to switch from the problems that were the 9x system - and even so a lot of people held back for 2+ years.
3. XP had the advantage of being an upgrade from a clearly inferior system. Windows 9x was so much more limited in so many ways (couldn't even use more than 1 CPU!). Meanwhile, a lot of the improvements in Vista are not so obvious to the uninformed. It's not obviously more stable, and a lot of the small improvements don't immediately appear to be improvements because people have to re-learn shortcuts that they had been using for as long as 5 years or more.
4. WinME came out barely a year and a half before XP. It's not hard to remember the flaws of the last version when it's been that recent. But by the time Vista came out, it was 3 years since SP2, which fixed the majority of the glaring issues XP had. By then many people forgot the initial troubles of XP, if they were around to see them at all.
So sure, most people aren't going nuts over Vista. I don't use it (or like it) yet myself. But to say it's a failure compared to XP is false. In reality it's doing surprisingly well. In fact, if you want to point fingers, the only real mistake you can point at that MS made while developing Vista is that they didn't come up with at least one more *new* version of the old NT in the meantime to charge us for. So personally I'm happy to keep using XP until I'm ready to upgrade to Vista, which will probably be in another 2 years or so. Meanwhile, many people I know are happily using and enjoying Vista.
Re: (Score:3, Interesting)
Re: (Score:3, Informative)
I can't believe you are the obvious one to this discussion. I'm not dogging you in the least. I'm saying that floppies should never have been removed from computers.
I build systems and always spec a floppy drive. I have a few colleagues that seem to skip out on them, and nothing makes me more pissed off than doing an 11:00 pm service call that ends up in a 3 am hunt for a floppy because something needs to be done to the boot sector, or the BIOS needs to be flashed after a Windows update made something previously stable too unstable to boot the OS. There are other times, like when whatever problem stops access to the CD-ROM drives and you need to transfer a small file. Of course the network is out, because generally when that happens, it is due to a virus infection or some other malware (WinAntivirus2008). Floppies should never be skipped out on. And when a tech claims they aren't needed, that tech either isn't exposed to the abuses of idiots working on them, or they don't fix the problem but just reimage the drive, which can lead to another can of worms.
Don't you have, like... a USB stick? You DO know that modern computers can boot from them, etc.? The reason we gave up floppies isn't that the need for what they were good for disappeared. It's that something a lot better in a dozen ways came along. Sure, floppy drives are good for older computers, but those have them already.
I haven't had a floppy drive in five years and haven't needed one a single time during that. During that time I've actively used three desktop computers (2 of my own an
Comment removed (Score:5, Informative)
Re:The Chicken and the Egg (Score:5, Insightful)
Does that count the people who've pirated Vista and run it?
I've always found it sad that people have to bandwagon things like operating systems. I mean, take IRIX for example. It's possibly the worst, most unstable operating system in history (throughout its lifetime) and I had to suffer with it for years, but you don't read about people bashing it because it's *nix. I don't care too much about Vista. I don't care too much about any flavor of *nix either. It's all a toolbox, and people pretending otherwise have agendas that range from personal to political and monetary. Now I must admit I have a proclivity for laughing at Windows ME (how'd that ever get released? LOL)
Re: (Score:3, Informative)
However, as Wine becomes better and better, it becomes more viable for companies to easily port their application across (using winelib etc.).
Re:Question (Score:5, Informative)
AFAIK you cannot use the dependency resolution logic of apt or yum or w/e without also divulging the source code, something which is never going to happen with commercial s/w.
kjella@desktop:~$ dpkg --info opera_9.51.2061.gcc4.qt3_i386.deb
new debian package, version 2.0.
size 8295240 bytes: control archive= 6485 bytes.
34 bytes, 2 lines conffiles
1275 bytes, 21 lines control
16580 bytes, 231 lines md5sums
1719 bytes, 54 lines * postinst #!/bin/sh
572 bytes, 18 lines * postrm #!/bin/sh
179 bytes, 9 lines * prerm #!/bin/sh
Package: opera
Version: 9.51.2061.gcc4.qt3
Section: non-free/web
Priority: optional
Architecture: i386
Depends: libc6 (>= 2.1.3), xlib6g (>= 3.3.6) | xlibs | libxmu6, libqt3-mt (>= 3.3.4), libstdc++6
Suggests: flash-npapi-plugin | flashplugin-nonfree | swf-player | libflash-mozplugin | mozilla-plugin-gnash, pdf-npapi-plugin | djvulibre-plugin | mozilla-acroread, cupsys-client | lpr, sun-java6-jre | sun-java5-jre | java-gcj-compat, linux-libertine | ttf-dejavu | ttf-bitstream-vera | msttcorefonts, xine-plugin | gxineplugin | mplayerplug-in | kaffeine-mozilla | mozilla-mplayer | mozilla-helix-player | gecko-mediaplayer, mozplugger | plugger, mozilla-bonobo, aspell
Conflicts: opera-static
Replaces: opera-static
Provides: opera-static, www-browser
Installed-Size: 20100
Maintainer: Opera Packaging Team
Bugs: mailto:packager@opera.com [mailto]
Description: The Opera Web Browser
Welcome to the Opera Web browser. It is smaller, faster,
customizable, powerful, yet user-friendly. Opera
eliminates sluggish performance, HTML standard violations,
desktop domination, and instability. This robust Web
browser lets you navigate the Web at incredible speed and
offers you the best Internet experience.
The binaries were built on Debian using gcc-4.0.0.
I think someone sent us a telex saying they want their troll back.
Re:Question (Score:5, Funny)
Wine Isn't aN Emulator?
No way!
Re: (Score:3, Informative)
Re:Question (Score:5, Funny)
That's right, WINE is not an emulator. It just, uh, "approximates" the Win32 libs.
"Approximates?" No, that's not right. Simulates? Imitates? Hmm... if only there was a word for something that attempts to perform a task in an identical way to something else.
Re:Question (Score:5, Informative)
Implements.
Re: (Score:3, Informative)
The goal of Wine is a full reimplementation of the Windows API which will make Windows unnecessary.
Emphasis mine.
Wine comes with a full set of headers and libraries which make it possible for a programmer to view the Win32 API as a spec and recompile it with the Wine implementation.
However, Wine also comes with a program that loads native Win32 .exe files and tricks them into thinking that they're running on a bona-fide Win32 OS. This is how most end-users experience Wine, and it's hard to argue that's not an emulator.
WINE should really stand for "Wine Is Not just an Emulator" or maybe "Wine Is Not a ha
Re:Question (Score:5, Interesting)
WINE's Direct3D sits on top of native Linux OpenGL.
I don't think most developers are "furious". When OpenGL 3.0 was described as a backward-incompatible rewrite, they were a bit closer to furious. They spoke, and said they wanted backward compatibility retained a while longer. And lo, Khronos delivered, while providing a mechanism for migration to the new architectural constructs (buffer objects, shaders, moar buffer objects, moar shaders), and a way to build your code so that deprecated constructs fail.
Seriously, most people in the OpenGL community are fairly happy (though there's some grumbling over the still-wide OpenGL / OpenGL ES split).
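To make the migration path concrete, here is a minimal sketch of the buffer-object route the parent describes. The data and names are illustrative; it assumes a GL 2.x context whose entry points are already available (via GL_GLEXT_PROTOTYPES on Linux, or a loader on Windows) and a GLSL program with a vec3 position attribute bound to location 0:

#define GL_GLEXT_PROTOTYPES   /* assume an implementation exporting GL 2.x directly */
#include <GL/gl.h>
#include <GL/glext.h>

static const GLfloat tri[] = {
     0.0f,  0.5f, 0.0f,
    -0.5f, -0.5f, 0.0f,
     0.5f, -0.5f, 0.0f,
};

void draw_triangle_with_vbo(void)
{
    GLuint vbo;
    glGenBuffers(1, &vbo);                /* server-side buffer object      */
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(tri), tri, GL_STATIC_DRAW);

    glEnableVertexAttribArray(0);         /* generic attribute 0 = position */
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, (void *)0);
    glDrawArrays(GL_TRIANGLES, 0, 3);     /* no glBegin/glEnd anywhere      */

    glDisableVertexAttribArray(0);
    glDeleteBuffers(1, &vbo);
}

Nothing here touches the deprecated immediate-mode calls, so code in this style should survive in a forward-compatible context.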
Re:Question (Score:5, Interesting)
That certainly isn't my experience. Most people on the OGL discussion boards were very much looking forward to the changes to the API. The previews Khronos posted in the Pipeline newsletter looked bloody amazing.
But when those previews are followed by almost a year of complete silence and then finally an API which is nothing at all like the one they promised, but rather some more spit and polish on the mess that is OGL 2.1 (much like OGL 2.0 was really just 1.6 with a new name), people got pissed off. And rightfully so.
The only ones pleased with this change, as far as I've been able to gather, are the CAD people wanting to continue to run their old, stale OpenGL-based code until the end of time. For new development, using OpenGL is a pain in the backside, which is why I just began bringing my renderer up in D3D10.
Re: (Score:3, Insightful)
"The CAD people" are the bread and butter of OpenGL. It's incredibly important to keep those high-margin dudes happy. Direct3D totally fails at addressing CAD + grownups' visualisation needs, so killing OpenGL's core and basically captive market to keep low-margin gamer devs happy would be superdumb.
But wait, legacy-free OpenGL ES also exists, catering specifically to gaming+embedded markets! How about that!
Re:Question (Score:5, Interesting)
Imagine you were the owner of a CAD or animation software company. I suppose that when you have multiple OpenGL apps, each with tens of millions of lines of code, it's pretty hard to justify a rewrite from a business standpoint. Those "old stale" code bases each generate hundreds of millions of dollars each year, and they're orders of magnitude larger and more complex than games. It would take millions of $$ to port one of the major OpenGL apps to another API, and from a business standpoint, those $$ would be wasted -- they wouldn't be doing anything other than chasing someone else's aims and objectives -- not doing anything that would generate a decent return on the investment.
Your customers don't care what underlying API you use -- what they care about is that you solve their problems in a cost-effective way. If OpenGL 3.x were a complete and incompatible break, these companies would think "well, if those a$$h0les are going to make us rewrite the software, we might as well jump to DX instead and be done with it" (at least if you don't have to support Mac and Linux).
It's not too hard for people to figure out who I work for so let me add that these are my opinions only -- my employer may share them, or they may not -- I certainly make no representations in this -- but these opinions are mine.
Re:Question (Score:5, Insightful)
Their code will require a huge architectural rewrite no matter what the API looks like. Hardware just doesn't work the way these programs use the APIs anymore, and hasn't for a long time. Keeping this legacy stuff around in the new API won't change that. It'll still be a complete mismatch with the hardware.
If they want to take advantage of GL3 (either the promised or the delivered version) they will have to rewrite large parts of their code, so why not just drop all this backwards-compatibility nonsense and make GL3 actually good, while still keeping GL2 around for legacy? With the original plan for interoperability between the two, they could still switch to GL3 one piece at a time while they rewrote their codebase to modern standards. This would have been much simpler for everyone involved, these companies included.
Re: (Score:3, Interesting)
You are correct that code needs to be rewritten and even rearchitected -- the old way of doing things in GL is often a very poor match for today's hardware, and GL is pretty crufty these days -- but it would be nice to be able to do the rewrites incrementally over several releases as opposed to all at once (incrementally with multiple contexts is not so nice either). That said, I think it would have been better had GL3.0 been what we had been expecting as opposed to GL2.2, which is what we got.
Barthold Lich
Re:Question (Score:4, Informative)
And the article text:
What happened to Longs Peak?
In January 2008 the ARB decided to change directions. At that point it had become clear that doing Longs Peak, although a great effort, wasn't going to happen. We ran into details that we couldn't resolve cleanly in a timely manner. For example, state objects. The idea there is that all state is immutable. But when we were deciding where to put some of the sample ops state, we ran into issues. If the alpha test is immutable, is the alpha ref value also? If we do so, what does this mean to a developer? How many (100s?) of objects does a developer need to manage? Should we split sample ops state into more than one object? Those kinds of issues were taking a lot of time to decide.
Furthermore, the "opt in" method in Longs Peak to move an existing application forward has its pros and cons. The model of creating another context to write Longs Peak code in is very clean. It'll work great for anyone who doesn't have a large code base that they want to move forward incrementally. I suspect that that is most of the developers that are active in this forum. However, there is a class of developers for which this would have been a potentially very large burden. This clearly is a controversial topic, and has its share of proponents and opponents.
While we were discussing this, the clock didn't stop ticking. The OpenGL API *has to* provide access to the latest graphics hardware features. OpenGL wasn't doing that anymore in a timely manner. OpenGL was behind in features. All graphics hardware vendors have been shipping hardware with many more features available than OpenGL was exposing. Yes, vendor specific extensions were and are available to fill the gap, but that is not the same as having a core API including those new features. An API that does not expose hardware capabilities is a dead API.
Thus, prioritization was needed, and we made several decisions.
1) We set a goal of exposing hardware functionality of the latest generations of hardware by this Siggraph. Hence, the OpenGL 3.0 and GLSL 1.30 API you guys all seem to love ;)
2) We decided on a formal mechanism to remove functionality from the API. We fully realize that the existing API has been around for a long time, has cruft and is inconsistent with its treatment of objects (how many object models are in the OpenGL 3.0 spec? You count). In its shortest form, removing functionality is a two-step process. First, functionality will be marked "deprecated" in the specification. A long list of functionality is already marked deprecated in the OpenGL 3.0 spec. Second, a future revision of the core spec will actually remove the deprecated functionality. After that, the ARB has options. It can decide to do a third step, and fold some of the removed functionality into a profile. Profiles are optional to implement (more below) and its functionality might still be very important to a sub-set of the OpenGL market. Note that we also decided that new functionality does not have to, and will likely not work with, deprecated functionality. That will make the spec easier to write, read and understand, and drivers easier to implement.
3) We decided to provide a way to create a forward-compatible context. That is an OpenGL 3.0 context with all deprecated features removed. Giving you, as a developer, a preview of what a next version of OpenGL might look like. Drivers can take advantage of this, and might be able to optimize certain code paths in the forward-compatible context only. This is described in the WGL_ARB_create_context extension spec.
4) We decided to have a formal way of defining profiles. During the Longs Peak design phase, we ran into disagreement over what features to remove from the API. Longs Peak removed quite a lot of features as you might remember. Not coincidentally, most of those features are marked deprecated in OpenGL 3.0. The disagreements happened because of different market needs. For some markets a feature is essential,
Re:Question (Score:5, Funny)
Jesus Christ, why don't you just change your last name to match your company's and be done with it? Do they own you? Do you feel the need to make a similar disclaimer every time you take a dump?
Excuse me, folks! I just wanted to let everyone else in the bathroom know that this stink is not the fault of my employer! My employer does not necessarily have as much gas as I do!
Are you worried that your masters will punish you? If so, I suggest that you reconsider your loyalties.
Re:Question (Score:5, Informative)
Well, it is a Microsoft product, so it's not without its flaws (the Vista dependency, for one), but overall it's a good API for taking advantage of modern hardware without all the legacy crud that plagues OpenGL.
If you've used D3D8 or older, you'll find it a massive improvement.
Re:Question (Score:5, Informative)
the legacy crud that plagues OpenGL.
Did you read "the deprecation model" (Appendix E) of the OpenGL 3.0 spec? OpenGL 3.0 apparently provides for a mode (a "forward-compatible context") that helpfully excludes deprecated "legacy crud".
This sounds very handy for people trying to update codebases - they can presumably switch to a forward-compatible context, do a build, see what breaks.
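For the curious, here's a sketch of what that looks like against the WGL_ARB_create_context extension the spec describes (Windows-flavoured; it assumes an ordinary 2.x context is already current so the extension pointer can be fetched):

#include <windows.h>
#include <GL/gl.h>

/* enums from the WGL_ARB_create_context extension spec */
#define WGL_CONTEXT_MAJOR_VERSION_ARB          0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB          0x2092
#define WGL_CONTEXT_FLAGS_ARB                  0x2094
#define WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB 0x0001

typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(HDC, HGLRC, const int *);

HGLRC create_forward_compatible_context(HDC dc)
{
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 0,
        WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
        0  /* attribute list terminator */
    };
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");
    if (!wglCreateContextAttribsARB)
        return NULL;  /* driver doesn't expose the extension */
    /* deprecated entry points (glBegin and friends) will error in this context */
    return wglCreateContextAttribsARB(dc, NULL, attribs);
}

Build and run against a context like this, and the deprecated calls announce themselves at run time instead of lurking silently in the codebase.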
Re:Question (Score:5, Interesting)
Sure, but that does nothing to help driver development. They still need to support all the deprecated features if the application requests them (most likely for a very long time to come as well), and driver quality is one of the major problems with OGL right now.
The "old" GL3 was also supposed to include interoperability with GL2 mind. But it would not do it by layering yet more stuff on top of the old, which I can't imagine will do driver quality any favours.
Re:Question (Score:5, Interesting)
most likely for a very long time to come as well
Seems rather FUDy... Why introduce a deprecation model if not to encourage people toward the more OpenGL ES-like, non-deprecated bits? Yeah, you can still call glBegin/End, but it'll presumably hiss nastily at you.
I just don't see it as "layered on top", particularly - you do things the new way if you want your code to run in forward-compat mode. It's "beside" rather than "on top".
(certainly unlikely to be "layered on top" at the driver sources level, would be inverted if anything - any old fixed pipeline functionality emulated with programmable hardware.)
Bit of a book-scam though. Whole 'nother round of red/orange book purchases...
Re:Question (Score:5, Interesting)
The problem here isn't the actual implementation of the old fixed function pipeline. That has been emulated with shaders for yonks already.
The problem lies in the state machine at the core of OpenGL. This will have to be there no matter what "deprecation level" you're running at and I can't imagine the IHVs will implement a standalone version of that for each of these levels. The result is that every feature will impact others since they interact with the same core system, enabled or not. IHVs will have to hack up their currently stable code to add OGL "3" support, and they will break things in the process.
What really breaks my heart is that OGL2 could "easily" be layered on top of the original GL3 they proposed. That way they could take care of backwards compatibility while still providing lean and mean drivers for the rest of us. The other way around isn't nearly as easy though (if at all possible), and will do jack squat for driver simplicity.
Re: (Score:3, Interesting)
Two levels, for now. There will be more in the future as more stuff gets deprecated if I'm reading it right.
And the forward compatible API is nowhere near as clean as the one promised a year ago. The fact that it may be layered on top of a cleaner API inside the driver doesn't really help me.
Will the IHVs (or Tungsten for Gallium) develop separate state trackers (or whatever it is they're doing internally) for each deprecation level though?
Re: (Score:3, Interesting)
EXT_direct_state_access [opengl.org] is their answer to your state machine problems. Although it hasn't abolished state, the extension is designed to make it accessible: whereas previously programmers had to update selectors and latch in state, the EXT_direct_state_access extension attempts, from what I can discern, to provide easy on-demand access to various states, no context switching required.
As you are the only sane comment I've read from this entire thread, I'd be interested to hear what you think of EXT_direct_st
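For anyone who hasn't read the extension, a small sketch of the difference. It assumes the glTextureParameteriEXT pointer has been loaded through the usual extension machinery, and the texture name passed in is illustrative:

#include <GL/gl.h>

/* signature from the EXT_direct_state_access spec; resolved at run time */
extern void (APIENTRY *glTextureParameteriEXT)(GLuint, GLenum, GLenum, GLint);

void set_min_filter(GLuint tex)
{
    /* Selector model: latch the object into hidden state, mutate, restore. */
    GLint prev;
    glGetIntegerv(GL_TEXTURE_BINDING_2D, &prev);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glBindTexture(GL_TEXTURE_2D, (GLuint)prev);  /* easy to forget this step */

    /* Direct state access: name the object, touch nothing else. */
    glTextureParameteriEXT(tex, GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
}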
Re: (Score:3, Insightful)
Re:Question (Score:5, Informative)
That's most of the problem though... they did rewrite OpenGL, then they scrapped it. So in the process, we got a few years of the new version not existing. And a year of communication (from ARB/Khronos) not existing, particularly frustrating after they'd spent the previous year saying they were going to work on communications and transparency.
Even better, GL2 was supposed to be a cleaned up API, so this was the second time they promised a rewrite and scrapped it.
So either they were completely wrong about the justification for the rewrite both times (which doesn't bode well for the group in charge of the API) or we are missing out on the benefits the rewritten API would have provided.
Probably the biggest problem was the communication though; if they'd admitted the problems as they happened, there probably would have been less backlash. As it is, everyone was still pretty much expecting the original 3.0 design, so not getting that, on top of a year's worth of promised status updates that never came, on top of the previous poor communication those status updates were supposed to fix, on top of the promised-then-scrapped 2.0 update, etc., leads to an unhappy community.
(For those not following the situation, advertised benefits included:
simpler api = simpler drivers = better conformance + fewer driver bugs
new object model = less need for consistency checking in drivers = faster drivers with fewer bugs
getting rid of outdated code paths = easier to understand the api, easier to tell what will be fast
probably some more I forgot)
Re: (Score:3, Informative)
What is funny is that this DirectX implementation runs on top of -- guess what? -- OpenGL. And it often outperforms the actual Microsoft DirectX implementation. ... and I just told the guys from /b/ to keep their memes out of Slashdot [slashdot.org], so I can't use "lulz" and "epic fail" without sounding like a hypocrite. Damn.
Re:Question (Score:5, Informative)
Cross-platform 3D is useful, but OpenGL stopped being cutting-edge many years ago. The model that it uses is falling farther and farther from the model that the hardware supports, and many new extensions and features are not supported on many platforms (particularly ATI). It has become increasingly difficult to write cutting-edge graphics software, and OpenGL 3 does little to fix that.
Re: (Score:3, Interesting)
Question is, what does OSX currently have handy that would replace it? (it's been way too long, my memory sucks, so let me take a stab here... Quartz, Core Graphics, whatever-it's-called-nowadays?)
Either way, any developer having to keep two separate code branches for two separate library sets is (okay, just IMHO) begging for pain.
No it doesn't (Score:5, Informative)
Part of the reason for DX's success is that nobody else seems interested in developing anything to compete with it. OpenGL is the only cross-platform 3D API I'm aware of, and it and DX are all there is these days. GL's problem is that it isn't keeping up with the hardware. The "just use vendor-specific extensions" approach isn't a realistic solution in most cases. Thus GL is suffering and DX is winning by default.
If someone like Apple did develop a good 3D API, it might do well. However nobody seems interested.
Re:No it doesn't (Score:5, Informative)
If someone like Apple did develop a good 3D API, it might do well. However nobody seems interested.
Sadly, won't ever happen. Apple's "commitment" to gaming on their platform doesn't extend far beyond 3D chess and Tetris clones. Hell, having a working Flash client is probably Apple's idea of supporting "gaming" for their users.
Apple appears to be quite content with OpenGL in its current state, and hasn't even gotten close to pushing its limits.
Have you installed the DirectX SDK lately? It's sad how wide the divide is. On the DirectX side you get a *massive* library of documentation, sample code snippets, entire sample projects, and more guides than you can shake a stick at. Compare this with the new-hotness that is Apple's iPhone SDK. Worlds apart. The iPhone SDK documentation is absolute trash. There are almost no tutorials, "sample code" is hardly ever commented. No code snippets to accompany tricky API calls, and the entire thing uses so much Objective-C-speak that I'm quite surprised anybody but a hardened Mac developer can even begin to comprehend it.
One company is very good at fostering a developer community and making sure it's easy to get on board their API. The other seems like it goes out of their way to torture devs.
Disclaimer: I am a hobbyist iPhone developer, Mac user, Xbox owner, and DirectX developer.
Re: (Score:3, Funny)
Re: (Score:3, Informative)
There is no Half Life for Mac.
Depends on how you mean that. According to this section of the Half-Life article, [wikipedia.org] Sierra made one, said they were going to release it, [planetfortress.com] and then never did....
Makes you wonder if it's just sitting in a vault somewhere....
Re: (Score:3, Informative)
You don't have to convince MS to implement the API. You might notice that MS doesn't implement OpenGL, yet it works just fine. Windows doesn't prevent 3rd-party APIs. Video card vendors are free to implement whatever APIs they like. In XP they really have nothing at all to do with each other. In Vista, the XP method works fine, though Vista will have to turn off its shiny features if one is used, or there's a new path that's fully WDDM compatible. nVidia and ATi both implement this with their OpenGL driver
Re:Question (Score:5, Interesting)
This has, however, everything to do with ATI and nothing really to do with OpenGL, as it is the hardware manufacturer who ultimately decides which capabilities to expose in the drivers. ATI's OpenGL drivers were *always* bad, buggy, and badly performing (go on, search for some old benchmarks; you will see that ATI cards that easily outperform their NVidia counterparts in DirectX fall heavily behind when it comes to OpenGL apps and games).
The developers' expectation here was that if OpenGL 3.0 included all the newest stuff in the core spec, ATI (and Intel and others) would be forced to support it (so they could pass certification and be able to call their products compliant). However, the same expectation of improved OpenGL drivers was there when ATI was bought by AMD, and that too never really materialized. ATI simply doesn't care enough about OpenGL; their main focus was always DirectX, and I don't see that changing in the near future.
As for OpenGL 3.0, the rage is that the Khronos group promised us moist, delicious cake (a whole new API, yay!) but after a long, long wait delivered only a small biscuit. I didn't expect much, so I'm not disappointed, and overall the spec is a good step (deprecation model for lots of old stuff, FBO finally promoted to core, the direct access extension), but just like KDE 4.0, it is only a first step, and it *really* depends on where it goes from here.
This can't be good. (Score:5, Insightful)
Jumping ship to DX10 would be nice, if it were cross-platform. (No, Xbox + PC does not count as "cross-platform".)
Unfortunately for those of us on Linux/Mac, a lot of Windows developers don't care.
Unfortunately for those of you who think you don't care about this, consider that porting an app generally improves it, and can shake out bugs which aren't as apparent on the other platform -- which means potentially less reliable games, even if you're only on Windows.
And unfortunately for those of us who hate Vista, that's kind of a requirement for DirectX 10. At least with OpenGL, those in charge have no agenda to push Vista -- so an OpenGL 3.0 game should run on XP, if it runs on anything.
Re: (Score:3, Insightful)
Even with a cross-platform graphics layer, we're not seeing a lot of high-performance games that run on all of Windows, Linux, and Mac. The problems of developing and debugging this kind of software are big enough to discourage people from doing it for multiple platforms in any case.
Re:This can't be good. (Score:5, Interesting)
How about the creation of a fully operational open source, cross platform, DX10 or DX11 implementation, not created by Microsoft but by the community,
Wine will do this, eventually.
and fully working natively (not through Wine)
That's a bit harder, because it requires driver support.
and supported by NVidia and ATI drivers?
The official ones? Never going to happen. Anyone want to guess how many patents Microsoft has on DirectX tech?
And the unofficial ones haven't even gotten GL right, yet, and you're proposing they try to support another interface?
More importantly, you're assuming this is a good idea -- that we should be working to clone a Microsoft technology, instead of improving on one which has been open from the start (GL).
Re: (Score:3, Insightful)
If not for the reasons stated above, then at least for the reason of being able to suddenly convert a lot more games natively for other platforms than Windows more easily.
But, how can we improve on it? Just wait?
Re: (Score:3, Insightful)
But, how can we improve on it? Just wait?
We can't. OpenGL evolution is controlled by the people who build hardware. If the hardware guys add a new feature, they can add an extension to OpenGL to support it. If they can persuade someone else to add the same feature, they can propose it as a standard extensions, and then propose it to be a required part of the next version of the spec.
The thing OpenGL is typically bad at is removing legacy stuff. OpenGL ES is, in many ways, a nicer API - it is designed for embedded systems and removes a lot of
Re:This can't be good. (Score:5, Interesting)
How about the creation of a fully operational open source, cross platform, DX10 or DX11 implementation, not created by Microsoft but by the community,
Wine will do this, eventually.
Wine uses OpenGL to do the actual rendering, AFAIK. It reads the DirectX function calls, but it doesn't interface with the hardware itself; it basically just implements the functions with OpenGL calls.
So while the OpenGL dependency may be less obvious to the user or casual developer, it's still there, and a bad OpenGL release means a bad DirectX implementation in Wine.
I'm no expert though, correct me if I'm wrong about this.
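As an illustration of that kind of translation layer (a made-up simplification, not actual wined3d code): a D3D-style Clear can be expressed entirely in OpenGL calls, which is roughly the pattern such a layer follows for the whole API.

#include <GL/gl.h>

/* Hypothetical example: clear to a D3DCOLOR-style 0xAARRGGBB value using GL. */
void d3d_style_clear(unsigned long argb)
{
    GLclampf a = ((argb >> 24) & 0xff) / 255.0f;
    GLclampf r = ((argb >> 16) & 0xff) / 255.0f;
    GLclampf g = ((argb >>  8) & 0xff) / 255.0f;
    GLclampf b = ( argb        & 0xff) / 255.0f;

    glClearColor(r, g, b, a);                           /* set GL clear state */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); /* perform the clear  */
}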
Re: (Score:3, Insightful)
There are two basic problems with this:
1) Direct3D is tied to Windows pretty well, so making it crossplatform would be difficult (your DX10 implementation may need to include a lot of Wine).
2) You'll need the graphics vendors to support the API. With Intel and ATI opening up their specs, we're closer to having a way for the community to make up-to-date graphics drivers, but there's still the problem of NVIDIA. Without them, no one's going to try to write software with this new API, and it seems unlikely tha
Re:This can't be good. (Score:5, Insightful)
Gallium3d [tungstengraphics.com] will enable just that. The Wikipedia [wikipedia.org] page even mentions DirectX and wine.
That said, I don't think the uproar over OpenGL 3.0 is as widespread as the summary would have you believe. OpenGL's grave will likely be right next to Unix, X, vi and C (i.e. no time soon).
Re:This can't be good. (Score:5, Informative)
What MS call "Shader model 4" (even though geometry programs aren't, strictly speaking, shaders as they don't necessarily SHADE anything per se) includes mandatory support for geometry programs.
The geometry program sits in the programmable pipeline between the vertex program (which is used for real-time vertex deformation in hardware, world-space to object-space clipping to generate texcoords for the fragment program, etc) and the fragment program (which is used to colour fragments [1] based upon the output of the vertex program and input from one or more texture samplers.)
Unlike "old" vertex programs, a geometry program is able to generate new geometry on the fly. This allows a whole heap of really cool stuff, such as real-time shadowing effects, for essentially free.
So, yeah, much as I hate to admit it (and REALLY hate the Direct3D 'shader' nomenclature concerning pipeline programs,) D3D10 actually has changes with merit from D3D9c.
[1] A fragment is roughly a candidate pixel produced by rasterisation: a sample carrying depth and other interpolated data. After shading and occlusion testing, the surviving fragments become pixels. Thus, the term 'pixel shader' is rather inaccurate.
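On the GL side, equivalent functionality is exposed by the EXT_geometry_shader4 extension; a minimal pass-through geometry program looks something like the sketch below. It assumes a GL 2.x context with the extension available, and a real effect would of course emit extra vertices rather than just copying them:

#define GL_GLEXT_PROTOTYPES   /* assume an implementation exporting GL 2.x directly */
#include <GL/gl.h>
#include <GL/glext.h>

#define GL_GEOMETRY_SHADER_EXT 0x8DD9  /* from EXT_geometry_shader4 */

static const char *gs_src =
    "#version 120\n"
    "#extension GL_EXT_geometry_shader4 : enable\n"
    "void main() {\n"
    "    for (int i = 0; i < gl_VerticesIn; ++i) {\n"
    "        gl_Position = gl_PositionIn[i]; // copy each input vertex...\n"
    "        EmitVertex();                   // ...and re-emit it unchanged\n"
    "    }\n"
    "    EndPrimitive();\n"
    "}\n";

GLuint make_passthrough_geometry_shader(void)
{
    GLuint gs = glCreateShader(GL_GEOMETRY_SHADER_EXT);
    glShaderSource(gs, 1, &gs_src, NULL);
    glCompileShader(gs);
    /* attach to a program object alongside vertex and fragment shaders;
       input/output primitive types are then set via glProgramParameteriEXT */
    return gs;
}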
Is this the end? (Score:5, Interesting)
jump ship to DX10
And when they do they wander into Direct/Input/Sound/Video/Play/etc. OpenGL does 3D rendering. The rest? Cobble it together from whatever other obsolete scraps are available.
The non-Microsoft "stacks" suck. Bottom line.
The concept of a 2D "layer" still hasn't made it into the basic SDL API. Couldn't believe it when I learned that.
I guess professional game developers don't care that Microsoft owns the machinery of their livelihoods. They sure aren't contributing to their own independence in any noticeable way.
Re:Is this the end? (Score:5, Interesting)
Sorry my anonymous brethren, but you're exaggerating a bit. First off, DirectDraw (DirectX 2d API), DirectInput, and DirectPlay are all deprecated for other APIs. Granted, the other APIs are Microsoft but even they aren't always consistent across MS platforms. For example, DirectInput [wikipedia.org] is replaced by one API on the 360 and a different one for the PC.
SDL handles cross-platform input and some basic platform functionality on the open side. Not that you could expect it to run on a console, but it should run on a Mac, Linux, or Windows.
The open equivalent of DirectSound is OpenAL, which looks a lot like OpenGL in usage. Of course, that's more of a negative, since they both need an overhaul. It *is* cross-platform and supports 3d sound though.
The other APIs aren't nearly as important for game development.
Re: (Score:3, Insightful)
As for professional game developers not caring that Microsoft owns the machinery of their livelihoods. Of course they don't, why should they, and even if they did what would they do about it? Game developers don't write Operating Systems so someone else will always hold their livelihoods, and so long as Microsoft offers them a decent platform to write games for(which it does) and so long as Microso
Re: (Score:3, Insightful)
but though much a grin
I read that about 20 times trying to figure out what it said... finally, I decided that I think you meant "to my chagrin [wordreference.com]".
Err, yeah. (Score:5, Interesting)
Heh - Games developers may have that luxury, but 3D/GC vendors certainly don't. So unless someone decides to port DX10 to OSX (*snort*) or Linux (sing it with me now: "render farms!"), OpenGL will continue to have a decently-sized userbase for a very long time.
IMHO, anyone making the claim that they're going to suddenly jump to DX10 is only making noise; nobody is dumb enough to cut off the fastest-growing consumer market sectors.
(...besides, doesn't the PS3 use OGL, or do they use some other home-brewed library set? Not sure there...)
Re:Err, yeah. (Score:5, Informative)
The PS3 uses OpenGL ES for basic rendering (GL with all the ancient cruft ripped out) and NVIDIA's Cg for the actual shaders.
Re: (Score:3, Informative)
OpenGL ES (PSGL) is provided, but I don't think anyone is seriously using it except to do initial porting efforts.
Sony supply an alternative low level api called libGcm.
Mod parent up, maybe (Score:4, Informative)
All I've really seen of the PS3 dev kit is what was on display at GDC. The Sony guys talked about GL ES and NVIDIA's Cg toolchain for shaders, so that's what I posted. This, however, sounds a lot more like what I expected from Sony and is right in line with the PS2 dev kit (emphasis mine):
Sony supply an alternative low level api called libGcm.
If libGcm is what I think it is (macro'd constants to build push buffers + raw DMA access) then pretty much nobody will be using the GL stuff. Coding right to the hardware is what PlayStation development is all about.
Re:Err, yeah. (Score:5, Interesting)
Render farms don't use OpenGL or DX for rendering in programs such as Lightwave/Maya/Blender; the frames are rendered by the CPU, not the GPU (there are a couple of exceptions to this).
The only place the video card comes into play is when you are running the 3D app and modelling huge poly objects. I can slow Blender down to a crawl in big scenes on my older PowerBook with only 64MB of video RAM, but it runs smoothly on my old G4 tower with 256MB of video RAM, yet the render times on the same frame are about the same (1.5GHz vs. 2x1GHz G4 CPUs, both with 1.25GB of RAM).
DX10 is not cross platform? (Score:5, Funny)
OpenGL falling down a pit (Score:4, Interesting)
Not unlike the one XFree86 fell down.
It needs to be forked. We need a fork of the 3D library, much like Xorg was forked.
The fork/rewrite should be more like DX10 than like OpenGL.
The library needs to be able to interoperate with current and future video hardware, so that all hardware acceleration features will be available to applications using the 3D library...
That means providing an underlying interface compatible with both the OpenGL and DX10 ABIs and conventional hardware drivers.
Re:OpenGL falling down a pit (Score:5, Insightful)
"The library needs to be able to interoperate with current and future video hardware, so that all hardware acceleration features will be available to applications using the 3D library..."
Now, I know next to nothing about the nitty-gritty details of OpenGL or DirectX,
but I really thought they were pretty much equal (in terms of being able to fully utilise the hardware)
I was under the impression that MS wrote the DirectX API, and graphics hardware was expected to provide an interface to GPU operations as per MS's API spec.
On the flip side, OpenGL is less centrally controlled;
instead, graphics hardware vendors provide their own API calls for new GPU operations, and expose those new calls to OpenGL via its "extension" interface,
and every so often the OpenGL spec gets updated, and new GPU functions (currently exposed via separate, per-vendor extensions) are standardised into a single implementation.
Are developers really saying that OpenGL cannot do things DirectX can?
I thought as long as (say) Nvidia kept providing drivers, and software kept querying for the hardware's capabilities, DirectX & OpenGL were pretty much on a par with each other...
Can anyone provide a semi-layman's explanation?
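A semi-layman's sketch of the extension dance described above, in C. The query calls themselves are real; the extension and function names here are hypothetical placeholders:

#include <string.h>
#include <windows.h>  /* wglGetProcAddress; GLX and AGL have their own lookups */
#include <GL/gl.h>

typedef void (APIENTRY *PFNGLSOMEVENDORFUNCNVPROC)(GLenum);  /* hypothetical */

PFNGLSOMEVENDORFUNCNVPROC load_vendor_function(void)
{
    /* 1. Ask the driver what it supports... */
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    /* (real code tokenises rather than using strstr, to avoid prefix collisions) */
    if (!exts || !strstr(exts, "GL_NV_some_extension"))  /* hypothetical name */
        return NULL;  /* this hardware/driver doesn't expose it */

    /* 2. ...then fetch the entry point by name. */
    return (PFNGLSOMEVENDORFUNCNVPROC)wglGetProcAddress("glSomeVendorFuncNV");
}

DirectX skips this negotiation because Microsoft defines one fixed interface per version and vendors implement to it; GL instead standardises proven extensions after the fact, which is exactly the lag being complained about here.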
Re: (Score:3, Informative)
That's the entire problem. nVidia would have functionality available in both DX and GL drivers on release day and would frequently submit it to the ARB for ratification as an extension, which often would become a core feature in the future.
Unfortunately, nobody else took their lead. In good scenarios, you'd have separate im
Re:OpenGL falling down a pit (Score:4, Insightful)
If you're talking about forking OpenGL... how do you convince Nvidia, AMD, Intel, PowerVR, Apple, Microsoft and who knows how many other companies to implement your incompatible version of the API in their OpenGL stacks?
However if you're simply talking about GLX/Mesa then you'll be happy to know that it's being reimplemented in the Gallium3D project.
Re:OpenGL falling down a pit (Score:5, Insightful)
You don't fork a spec. You create a new one and try to get it accepted by the industry (ATI, Nvidia and Intel in this case).
Good luck with that.
Re: (Score:3, Insightful)
OpenGL is forked. It's designed for forking. Every single vendor implements extensions. When two vendors have implemented their own extension then it can be proposed for inclusion in the standard.
The community at large was unhappy with the slow speed of development around OpenGL 2.0, so they let Khronos take over development of the next version. It seems that this didn't work very well, in hindsight.
No. (Score:5, Interesting)
Is this the end of cross-platform 3D on the cutting edge?
It isn't, because the OpenGL ARB is going to play nice and revise their specs, pleasing developers, and therefore GAMERS, as much as they can, and fix the matter. Won't you now? Don't make us wait.
GL is doomed in the short-to-medium term... (Score:5, Informative)
...and probably irrelevant in the longer term.
This is not the first time this has happened. GL2 was also supposed to be a cleanup, but turned out to be anything but. This latest fiasco is more significant as a failure of governance than of technology. All the right ideas were there; they were published in some detail over a year ago in the Pipeline newsletters. But the ARB very obviously a) can't agree to get anything meaningful done, and b) now has subzero credibility with developers. It's not coming back from that.
So yes, I think cutting-edge cross-platform 3D is dead for the next 2-3 years. Let's face it, it was never exactly healthy. It's not the end of the world. Linux share is currently growing mostly at the low end, netbooks etc, while the Mac seems to be thriving despite the fact that Apple doesn't give a flying fsck about gaming and never has.
Fast forward a couple of years, though, and things like Larrabee will be hitting the market; embarrassingly parallel hardware that can be programmed with standard languages and tools. The API's role as gatekeeper of functionality will be gone. And suddenly, 3D rendering libs can be written by anyone with the time and expertise, without having to go through Microsoft or the ARB or NV or AMD/ATi or Intel or anyone. Experimentation, competition, cross-fertilization, evolution. Remember Outcast's [wikipedia.org] voxel engine? Seen things like Anti-Grain [antigrain.com]? This will happen.
(Or, yes, people could just reimplement the DXwhatever API directly, but that would be a little disappointing.)
Today was not a good day, by any stretch of the imagination. But it's not the end.
OpenGL is NOT only games (Score:5, Insightful)
Professional apps (CAD/simulators/visualizations...) make up the majority of the OpenGL market and they have to be supported for decades (no, military or airlines do not buy a new training system every two years ...)
So breaking compatibility is a deal breaker. This is exactly what OpenGL 3.0 is about. I have been developing OpenGL applications for a decade now, and all of them are still running and being used. How many 10-year-old games can you actually get working today? God forbid, on Vista? That is the difference.
Also, about the "newest features not supported by OpenGL": how many "newest features" are your typical games actually using? Perhaps one or two, and they are optional, because the game must run even on non-bleeding-edge hardware (how many games are DX10-only? - commercial suicide ...)
So to wrap this up - the title is EXTREMELY misleading and making up a storm where one doesn't exist.
Marketshare (Score:3, Insightful)
While the gaming community is growing at an awesome rate, I doubt it's the same size as, and it's definitely not bigger than, the Hollywood industry. Coming from the special effects/render farm industry, I can tell you that every single movie that makes it to the big screen today is, in one way or many, made with products that use OpenGL. The gaming community/developers are of course frustrated that OpenGL is not DX10, but let's face it, Hollywood has an endless budget and a lot of say. This story does not surprise me, and OpenGL is still going to be the best cross-platform solution for many (and most) 3D technologies, gaming aside.
Not completely a "fail" (Score:4, Informative)
Explanation from OpenGL ARB Working Group Chair (Score:5, Informative)
Basically, they got tangled up in the implementation details and decided to play it safe with OpenGL 3.0 instead of starting from scratch.
========
What happened to Longs Peak?
In January 2008 the ARB decided to change directions. At that point it had become clear that doing Longs Peak, although a great effort, wasn't going to happen. We ran into details that we couldn't resolve cleanly in a timely manner. For example, state objects. The idea there is that all state is immutable. But when we were deciding where to put some of the sample ops state, we ran into issues. If the alpha test is immutable, is the alpha ref value also? If we do so, what does this mean to a developer? How many (100s?) of objects does a developer need to manage? Should we split sample ops state into more than one object? Those kinds of issues were taking a lot of time to decide.
Furthermore, the "opt in" method in Longs Peak to move an existing application forward has its pros and cons. The model of creating another context to write Longs Peak code in is very clean. It'll work great for anyone who doesn't have a large code base that they want to move forward incrementally. I suspect that that is most of the developers that are active in this forum. However, there is a class of developers for which this would have been a potentially very large burden. This clearly is a controversial topic, and has its share of proponents and opponents.
While we were discussing this, the clock didn't stop ticking. The OpenGL API *has to* provide access to the latest graphics hardware features. OpenGL wasn't doing that anymore in a timely manner. OpenGL was behind in features. All graphics hardware vendors have been shipping hardware with many more features available than OpenGL was exposing. Yes, vendor specific extensions were and are available to fill the gap, but that is not the same as having a core API including those new features. An API that does not expose hardware capabilities is a dead API.
Thus, prioritization was needed, and we made several decisions.
1) We set a goal of exposing the hardware functionality of the latest generations of hardware by this SIGGRAPH. Hence the OpenGL 3.0 and GLSL 1.30 API you guys all seem to love ;)
2) We decided on a formal mechanism for removing functionality from the API. We fully realize that the existing API has been around for a long time, has cruft, and is inconsistent in its treatment of objects (how many object models are in the OpenGL 3.0 spec? You count). In its shortest form, removing functionality is a two-step process. First, functionality is marked "deprecated" in the specification; a long list of functionality is already marked deprecated in the OpenGL 3.0 spec. Second, a future revision of the core spec actually removes the deprecated functionality. After that, the ARB has options. It can decide to take a third step and fold some of the removed functionality into a profile. Profiles are optional to implement (more below), and their functionality might still be very important to a subset of the OpenGL market. Note that we also decided that new functionality does not have to work with deprecated functionality, and likely will not. That will make the spec easier to write, read, and understand, and drivers easier to implement.
3) We decided to provide a way to create a forward-compatible context: an OpenGL 3.0 context with all deprecated features removed. This gives you, as a developer, a preview of what a next version of OpenGL might look like. Drivers can take advantage of this and might optimize certain code paths in the forward-compatible context only. This is described in the WGL_ARB_create_context extension spec (a sketch of its use follows this list).
4) We decided to have a formal way of defining profiles. During the Longs Peak design phase, we ran into disagreement over which features to remove from the API. Longs Peak removed quite a lot of features, as you might remember. Not coincidentally, most of those features are marked deprecated in OpenGL 3.0.
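For the curious, a minimal sketch of point 3 in practice, following the WGL_ARB_create_context extension spec (constants copied from the spec; error handling omitted, and it assumes a legacy context is already current so the entry point can be fetched):

#include <windows.h>
#include <GL/gl.h>

/* Attribute names/values from the WGL_ARB_create_context spec. */
#define WGL_CONTEXT_MAJOR_VERSION_ARB          0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB          0x2092
#define WGL_CONTEXT_FLAGS_ARB                  0x2094
#define WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB 0x0002

typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(HDC, HGLRC, const int *);

HGLRC create_forward_compatible_context(HDC dc)
{
    /* The entry point only exists at run time, via the extension mechanism. */
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)
            wglGetProcAddress("wglCreateContextAttribsARB");
    if (!wglCreateContextAttribsARB)
        return NULL;  /* driver doesn't expose the extension */

    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 0,
        WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
        0  /* attribute list terminator */
    };
    /* In the resulting context, the deprecated entry points are gone. */
    return wglCreateContextAttribsARB(dc, NULL, attribs);
}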
They should have run with OpenGL ES (Score:4, Interesting)
As a developer I'd like to see OpenGL ES given priority over OpenGL. OpenGL ES matches the hardware much better than OpenGL does.
OpenGL itself could then be implemented as a library on top of OpenGL ES. This would move all the legacy crud out of the main driver and make driver writers' jobs a lot easier (an OpenGL ES driver is a lot smaller than an OpenGL driver); see the sketch after this comment.
OpenGL ES could become the basis for Linux graphics drivers instead of OpenGL (why does a window manager need all those OpenGL functions? It doesn't...)
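As a toy sketch of that layering idea, the removed immediate-mode calls can be buffered in the wrapper and replayed through the array-drawing path that OpenGL ES does keep. The wrapper names here are invented, and a real compatibility layer would also have to translate primitives ES lacks, such as GL_QUADS:

#include <GLES/gl.h>

/* Invented wrapper: accumulate glBegin/glVertex/glEnd-style calls on the
 * client side, then submit them with the ES array-drawing path. */
#define MAX_VERTS 4096
static GLfloat verts[MAX_VERTS * 3];
static GLsizei vcount;
static GLenum  vprim;

void legacyBegin(GLenum mode) { vprim = mode; vcount = 0; }

void legacyVertex3f(GLfloat x, GLfloat y, GLfloat z)
{
    if (vcount < MAX_VERTS) {
        verts[vcount * 3 + 0] = x;
        verts[vcount * 3 + 1] = y;
        verts[vcount * 3 + 2] = z;
        ++vcount;
    }
}

void legacyEnd(void)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, verts);
    glDrawArrays(vprim, 0, vcount);      /* one driver call per batch */
    glDisableClientState(GL_VERTEX_ARRAY);
}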
Re:Calm down (Score:4, Informative)
The thing is, it's not even close to being easy to use anymore, especially not if you're interested in performance.
Because of the two decades of crud that have accumulated, there are countless ways of falling off the fast path in OGL, and it's next to impossible to know beforehand what will and will not work. Drivers are also a bitch to develop and maintain because of the size of the thing, which makes matters even worse, since what works on Nvidia may not work on ATI and vice versa.
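One illustrative (and by no means exhaustive) example of such a split, assuming buffer objects as in OpenGL 1.5+ and a gl.h recent enough to declare them (e.g. Mesa's; on Windows the entry points have to be fetched via an extension loader): the same geometry can be submitted in ways the driver loves or hates, and nothing in the API tells you which is which.

#include <GL/gl.h>

#ifndef GL_ARRAY_BUFFER
#define GL_ARRAY_BUFFER 0x8892  /* GL 1.5 constant, for older headers */
#endif

static const GLfloat tri[9] = {
     0.f,  1.f, 0.f,
    -1.f, -1.f, 0.f,
     1.f, -1.f, 0.f,
};

/* Usually a fast path: the data already sits in a buffer object (assumed
 * filled with tri[] elsewhere) that the driver can keep in video memory. */
void draw_fast(GLuint vbo)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glVertexPointer(3, GL_FLOAT, 0, (const GLvoid *)0);
    glEnableClientState(GL_VERTEX_ARRAY);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDisableClientState(GL_VERTEX_ARRAY);
}

/* Usually a slow path: immediate mode, several driver calls per vertex,
 * re-marshalled from client memory every single frame. */
void draw_slow(void)
{
    glBegin(GL_TRIANGLES);
    glVertex3fv(&tri[0]);
    glVertex3fv(&tri[3]);
    glVertex3fv(&tri[6]);
    glEnd();
}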
The only way to fix this is a proper cleanup to bring it in line with modern hardware. What they did instead was add even more crud.
Re:Calm down (Score:4, Insightful)
That would have been back in the mid-to-late 1990s. The Indys (those flattish, sparkly-blue workstations) were the first to have a software implementation of OpenGL.
OpenGL was designed primarily so that third-party graphics card manufacturers could write device drivers compatible with the CAD and scientific-visualization markets. The developers didn't really care much about 3D sound, real-time physics engines, or AI. All they needed was a GUI and one or more hardware-accelerated 3D graphics contexts for applications to run in. The most complicated lighting models at the time were smooth-shaded, textured geometry lit by point light sources.
Now, modern game engines use multiple vertex and fragment shaders for things like relief mapping, occlusion mapping, reflection, refraction, BRDFs, BTFs, spherical harmonics, environment mapping, ambient shadow mapping, real-time radiosity, and particle systems animated using textures and feedback loops. Current research is attempting to include computational fluid dynamics for animating dust around racing cars (EA) and partial differential equations to synthesize the spots, stripes, and spirals of virtual creatures (Spore), and that's just the visual part of the game. Then there is 3D surround sound, the physics (collision detection between animated characters, their environment, and everything else), along with multi-player network support (sockets).
To make a PC game that sells, it has to have the visual appeal that pushes the graphics hardware to its full potential. That means having people experienced in all of these fields, or having the budget to afford middleware. Only large companies can really do this now.
Re: (Score:3, Informative)
ATI supports OpenGL, and Intel added full support as well when they implemented DX10 (though only one chipset has this so far, IIRC).
The apparent failure of OpenGL to provide a significant rival to DX10 sucks, though, especially since DX10 being Vista-only might have given game makers an incentive to jump ship to OpenGL in order to get bleeding-edge graphics onto XP systems.
the misunderstanding is yours (Score:4, Informative)
That's an old, endlessly repeated myth: that the value of D3D has anything to do with the satellite APIs.
Let's confront it with facts:
- Direct3D is the absolutely dominant component of DirectX in terms of attention received (and deserved) from users, as well as R&D effort by MSFT. It's the advancement of D3D that drives MSFT to release each new version of DirectX.
- Each component of DX is a completely independent API, sharing only design conventions. OpenGL games on Windows use DirectInput for input while ignoring Direct3D entirely.
- Funnily enough, the satellite APIs are actually being phased out by MSFT. DirectPlay was always thoroughly ignored by the industry. For DirectSound and DirectInput there are replacements already. Not to mention the fate of hardware-accelerated DirectSound in Vista.
D3D and DX are de facto interchangeable terms. Get over it.
Not handling sound/input in OpenGL is absolutely irrelevant to any discussion of its applicability.
Re: (Score:3, Insightful)
It also allows you to pick libraries that have been made by experts in the area the lib focuses on.
Are you implying that Microsoft just throws whomever at the various portions of DirectX? The networking guy programs DirectSound, the sound guy programs DirectInput, the janitor programs Direct3D? I don't generally stand up to defend MS, but I'm willing to bet they have plenty of money and clout to get actual experts to implement their specs. And I know it's totally anecdotal, but most game developers I know, or whose opinions I've read, seem to feel that DirectX is a more intuitive and feature-rich platform.
Re: (Score:3, Insightful)
but to 90% of the market it's irrelevant, since they don't run Vista and therefore don't have DX10; OpenGL is competing with DX9, not 10, and winning.
<sarcasm>Yes, because game developers look to the now and not the future.
When FPS Games take multiple years to develop, its a well known fact game designers program it with current generation graphics in mind. They never plan ahead 1-3 years so that at release they have a game that will push grahpics to the limit</sarcasm>
Honestly, the biggest supporters of OpenGL i can think of have been ID Software and Epic MegaGames, and i expect even Carmack would be getting tired of working around a primiti
Re:people still make opengl games? (Score:5, Informative)
MPC: So, you said Rage is a 60Hz game. Is it an OpenGL or DirectX game?
JC: It's still OpenGL, although we obviously use a D3D-ish API [on the Xbox 360], and CG on the PS3. It's interesting how little of the technology cares what API you're using and what generation of the technology you're on. You've got a small handful of files that care about what API they're on, and millions of lines of code that are agnostic to the platform that they're on.
MPC: Are you using DirectX 9 equivalent? For Doom 4 as well?
JC: Yes to both. It's one of those things I get asked a lot. What's big and exciting for DirectX 10 or DirectX 11? There's not a whole lot of... really not a whole lot. The big touted geometry shaders were, in many ways, a mistaken belief that people desperately wanted to create stencil shadow volumes.
So less than a month ago John said that he's still developing with OpenGL and that DX10 isn't really a worthwhile improvement.
And congratulations on referring me to something he said ages ago; when you find something more recent, feel free to reply.
Oh, and the source of the interview: http://www.maximumpc.com/article/features/e3_2008_the_john_carmack_interview_rage_id_tech_6_doom_4_details_and_more?page=0%2C0 [maximumpc.com]
Re:Um, DUH!!! Who own OpenGL now? (Score:5, Informative)
Please stop modding up this troll.
That article is 6 years old.
Most of those patents are hardware patents, totally irrelevant to OpenGL (or Direct3D, for that matter).
Also, Microsoft is not a member of the group that actually writes the OpenGL specification. They have no vote on what does or doesn't get into OpenGL.
Of course, this might give them leverage over some of the hardware vendors (like Nvidia) that will have to implement the new OpenGL standard in the future. But history does not show them ever using this against OpenGL.
But claiming they "own OpenGL" is nonsense.