John Carmack Discusses 360's Edge, Considers DS 244
Via a Gamasutra post, John Carmack's comments on upcoming id choices. Game|Life has a few quick comments on Carmack's hope to bring Orcs and Elves to the DS. This would be id's first game on a Nintendo platform in some time. Likewise, he makes it clear that he considers the 360 the dev platform of choice due to the ease of development on the console. From the article: "the honest truth is that Microsoft dev tools are so much better than Sony's. We expect to keep in mind the issues of bringing this up on the PlayStation 3. But we're not going to do much until we're at the point where we need to bring it up to spec on the PlayStation 3. We'll probably do that two or three times during the major development schedule. It's not something we're going to try and keep in-step with us. None of my opinions have really changed on that. I think the decision to use an asymmetric CPU by Sony was a wrong one."
Ouch... (Score:2, Funny)
Oooh,*burn*!
Re: (Score:2, Funny)
Carmack: Yeah, that's all bull****.
Well... (Score:4, Insightful)
Re:Well... (Score:5, Interesting)
I sure as hell wouldn't take his advice on game design though. He repeats the same tired formulas over and over. DOOM was cool in the 90's, but these days you gotta be a bit more creative.
Re: (Score:2)
He admits as much in the interview. The sting is in what he has to say about Windows and the Xbox 360, with not so much as a bone tossed the way of the OS X and Linux PC gamer.
Re: (Score:3, Informative)
Re: (Score:2, Informative)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
True enough. But the better tool set does influence your choice of platform and technology - and even a developer in Carmack's league has to make these choices.
Re: (Score:2)
Re:Well... (Score:5, Insightful)
It really is a shame that the games tacked on these days tend to be glorified engine demos...
Re: (Score:2)
That said, if you're after a deathmatch shooter, it doesn't get much better than id's gear.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Apparently not seeing that his games are all still played today. Look at Zelda: TP. Get the Master Sword, defeat 8 dungeons, kill Ganondorf, save Zelda and the world. Where have we seen this before?
Re: (Score:3, Interesting)
DOOM 3 and Quake 4 (though Q4 is a bit better in my mind) really just feel like rehashes of the same ol', same ol', just with much better graphics. Better things have been done in the FPS genre, and those 2 just have not caught up. Quake Wars looks like it will be the answer however (admit
Re: (Score:3, Interesting)
All plots are old and tired. Fantasy seldom has any sort of different plot conflict than Lord of the Rings...Man vs. Evil. But the execution is what separates LotR from "Wizards of the Coast" licensed D&D spinoff novels.
Same is true with Zelda...The plot remains the same, but the details are always fresh and creative. It's the same with games like "Gears of War"...How tired is that formula? But the game is a great game!
Re:Well... (Score:4, Funny)
... PS3 fanboys dismiss Carmack as "moron". (Score:4, Interesting)
This thread has been one of the funniest things I've ever seen. All the PS3 fanboys are bashing Carmack for his comments about Cell, despite the fact that it's quite clear none of them program at all, let alone program on asymmetric CPUs.
Hilarity ensues as people who would have been lauding Carmack to the skies if they'd seen only his gripes about the 360 CPU attempt to prove that he's totally irrelevant and afraid of learning about technology.
Re: (Score:3, Funny)
Re:... PS3 fanboys dismiss Carmack as "moron". (Score:5, Insightful)
Re: (Score:3, Funny)
Re: (Score:2)
Re: (Score:2)
-Eric
Re: (Score:2)
Re: (Score:3, Insightful)
If you only programmed directly for the API, you would get something like Civ 4, a game with average graphics, but that needs a top graphics card.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Informative)
All the posters who seem like they know what they're talking about are simply ignored.
Re: (Score:2, Funny)
Re: (Score:2)
-Eric
Re: (Score:2)
I figured that Slashdot users would get a kick out of reading something that reminds us of just how far we've come. I mean, compared to that thread, jokes about Natalie Portman and Hot Grits are practically computer science!
Re: (Score:2)
Right (Score:2)
How exactly is the XBox 2 ("360") going to run OpenGL code, Carmack's API of choice?
Re: (Score:2)
But is OGL still his first choice?
He isn't convinced that it is time to make the jump to DX10, but DX9 seems to be delivering pretty much everything he wants.
Re:Right (Score:5, Interesting)
On a PC, OpenGL and Direct3D 9 are almost exactly equivalent. On a modern engine, 70% of the code is (or could be, if you put a little effort into the design) API-independent. Things like scene traversal, resource management, batching, loading and generating geometry and so on. In the API-specific part, you have a small piece of code that does useful work, and in an ideal world this is all you'd need. It simply manages resources for you, so you can (for example) tell it to load a mesh onto the video card and it does it, or give it a triangle mesh and tell it to draw it.
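The layering described above can be sketched in a few lines. This is an illustrative simulation, not id's actual code: all class and function names here are hypothetical, and the "backend" just records calls where real code would talk to OpenGL or Direct3D.

```python
# Sketch of an API-independent engine layer: scene traversal and resource
# management never mention GL or D3D; only the small backend object does.

class Backend:
    """Minimal rendering backend interface (hypothetical names)."""
    def upload_mesh(self, verts): raise NotImplementedError
    def draw(self, handle): raise NotImplementedError

class StubGLBackend(Backend):
    """Stand-in for a real OpenGL backend; records calls instead of rendering."""
    def __init__(self):
        self.meshes, self.draw_calls = {}, 0
    def upload_mesh(self, verts):
        handle = len(self.meshes)
        self.meshes[handle] = list(verts)   # pretend this went to VRAM
        return handle
    def draw(self, handle):
        self.draw_calls += 1

def render_scene(backend, scene):
    """The API-independent 70%: works unchanged against any Backend."""
    handles = [backend.upload_mesh(mesh) for mesh in scene]
    for h in handles:
        backend.draw(h)
    return backend.draw_calls

backend = StubGLBackend()
assert render_scene(backend, [[(0, 0, 0)], [(1, 1, 1)]]) == 2
```

Swapping `StubGLBackend` for a hypothetical `StubD3DBackend` would leave `render_scene` untouched, which is the whole point of the split.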
The larger part is dealing with all kinds of stupid garbage. For example, in Direct3D 9 you can lose any data you have stored in VRAM at any time, so you have to detect this condition, re-load (or regenerate) all the data, and fix it before you can do any more rendering. It's not difficult, but it's more work. In OpenGL (and on games consoles) you do not need to do this. With OpenGL, you need to detect and load appropriate extensions, which you don't have to do as much in Direct3D 9 (you do have caps bits, which serve much the same function). So in the OpenGL code, you might need a couple of extra paths to deal with hardware / drivers that don't support specific extensions. You could have four separate paths for things like render-to-texture, or several different texture formats for floating point textures, you might have to deal with drivers that don't support S3TC compression, and so on. You've got to deal with the operating system, other software components you're using, and the whole thing's a huge mess.
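The "lost device" dance described above looks roughly like this. This is a hedged sketch simulated with a fake device object; real Direct3D 9 code would call `TestCooperativeLevel` and `Reset` and re-create the `D3DPOOL_DEFAULT` resources, and all names here are made up for illustration.

```python
# Simulation of D3D9 lost-device recovery: on a lost device, everything
# stored in VRAM is gone and must be re-uploaded before rendering resumes.

class FakeDevice:
    def __init__(self):
        self.lost = False
        self.vram = {}
    def test_cooperative_level(self):
        return "lost" if self.lost else "ok"
    def reset(self):
        self.lost = False
        self.vram.clear()          # everything in VRAM is discarded

def ensure_device_ready(device, reload_resources):
    """Detect a lost device, reset it, and re-load all VRAM data."""
    if device.test_cooperative_level() == "lost":
        device.reset()
        reload_resources(device)   # re-upload or regenerate everything
    return device.test_cooperative_level() == "ok"

dev = FakeDevice()
dev.vram["mesh0"] = b"vertex data"
dev.lost = True                    # e.g. the user alt-tabbed from fullscreen
ok = ensure_device_ready(dev, lambda d: d.vram.update(mesh0=b"vertex data"))
assert ok and "mesh0" in dev.vram
```

On OpenGL and on consoles this whole code path simply doesn't exist, which is the poster's point.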
That entire part is unnecessary on a console. You have one piece of hardware, which maps directly to the available API, with 100% of all features exposed, including things that PC APIs deliberately prevent you from doing. It makes writing code so much simpler - you write some basic, low-level rendering code which interacts directly with the hardware, then get on with the useful task of writing the game engine. You never have to worry about whether or not you've hit a slow path (happens in OpenGL a lot) or a bug (happens in Direct3D a lot, and OpenGL if you're using Intel's rather crappy drivers) in the driver, because the drivers don't actually do anything more than queueing command packets for submission to the video hardware.
The same applies not just to the Xbox but also to the GameCube, the Wii, the Dreamcast, and to a lesser extent the PS2 (because you have to write all rendering code in VU assembly to do anything useful).
Re: (Score:2)
Re: (Score:2)
Doom on the Wii (Score:5, Funny)
Vista & DX10 (Score:3, Insightful)
Re: (Score:2)
If people actually kept insisting that Dell etc preload XP instead of Vista, then the WINE etc people would have a chance to take over the desktop from Windows.
That's because Carmack et al. will keep writing stuff for XP+DX9.
Then all those Linux people will have time to make XP+DX9 compatible stuff. Once that happens, Microsoft could end up like Intel trying to go Itanic, but everyone ignoring Intel and sticking to x86 because AMD provides a compatible path.
As is most peo
Re:Carmack? (Score:5, Funny)
Nope, Carmack was just responsible for Doom... Doom II... Quake... Quake II... Quake III... Doom III...
You're after Romero's head. And not the one in map30.
Re:Carmack? (Score:4, Funny)
Re: (Score:2, Funny)
Re:Quit your whining... (Score:5, Insightful)
I think you're wrong about id's dedication to writing good 3D engines for the hardware of the times, regardless of complexity. Quake3's engine, for instance, allowed for multithreaded rendering when nobody else was even considering multiple CPUs.
It wasn't completely stable -- and I wonder how many people actually turned the feature on -- but it was multithreading *way* ahead of its time on the gaming front.
C
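The pattern the poster is describing - game logic producing render commands on one thread while a render thread drains them - can be sketched as follows. This is an illustrative Python simulation, not Quake 3's actual SMP code; the command names are invented.

```python
# Producer/consumer render-command queue: the game thread keeps simulating
# while a dedicated render thread issues the queued draw work.

import queue
import threading

commands = queue.Queue()
drawn = []

def render_thread():
    while True:
        cmd = commands.get()
        if cmd is None:            # sentinel: shut the renderer down
            break
        drawn.append(cmd)          # stand-in for issuing GPU work

t = threading.Thread(target=render_thread)
t.start()
for i in range(3):
    commands.put(("draw_surface", i))   # game thread enqueues and moves on
commands.put(None)
t.join()
assert drawn == [("draw_surface", 0), ("draw_surface", 1), ("draw_surface", 2)]
```

The FIFO queue preserves command order, so the render thread sees exactly the sequence the game thread produced.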
Re: (Score:3, Interesting)
At any rate, it's trendy to bash the PS3 lately. It really is a leap in console capabilities (and I don't even like/buy/use consoles period), but all the negative comments about developers avoiding the PS3 are creating a self-fulfilling prophecy. The problem is people viewing it as a hurdle rather than an opportunity.
The fact of the matter is that multi-threading is here to
Re: (Score:2)
Re:Quit your whining... (Score:5, Insightful)
The problem is people viewing it as a hurdle rather than an opportunity.
It's a hurdle and an opportunity. Practical people see this, theorists don't. Hell, even IBM admitted that it was a hurdle for programmers in an architecture talk about Cell that I attended at UT.
It's a safe bet that the XBox 1080 (or whatever) will have multi cores, and of course the PC industry is full-steam ahead on that front.
Are you mental? The 360 is already multi-processor, and multi-core is just a performance optimization to reduce communication overhead between processors (while making DRAM access more expensive). It already requires multi-processor programming which Carmack is an early adopter of. You think he doesn't know how to write a multi-threaded application? Please. His point, and a very good one at that, is that it is harder to write multi-threaded code when some of your processors have drastically different capabilities than others. Like I said even IBM, the creator of Cell, agrees with this assessment, so it is nothing but bald-faced denial of reality to pretend otherwise.
Re: (Score:3, Informative)
Actually, the 360 is multi-core, not multi-processor (it has a single Xenon CPU with 3 symmetrical PPC-based cores), and multi-core is more of a cost optimization and less of a performance one (AMD's communication overhead reduction between cores, as well as cache-sharing technologies, is an optimization only viable through the use of multi-core packages
Re: (Score:3, Interesting)
Blame id's game designer for the game design, Carmack is a coder and making the game fun is not his department. He makes it run and easy to modify, the game designers make the gameplay work as intended after that.
Doom 3 has a great engine but most of its capabilities aren't in plain sight. Someone made bots for Doom 3 and just by using the AAS (with the AAS data automatically calculated for the MP maps) they were qui
Comment removed (Score:4, Interesting)
Re:Quit your whining... (Score:4, Informative)
Re: (Score:2)
No they don't, they truly don't. Most people don't begin to truly "get" parallel processing and -- even worse -- no "popular" language gets it either; all of the languages based on Dijkstra's concurrency model (shared memory and semaphores/locks) are fundamentally broken.
Re: (Score:2)
That it may be a wonderful opportunity, if you can overcome it, doesn't change the fact that the cost of developing a game for PS3 will be higher than the cost of developing a similar game on the 360. It may look a lot better, but so what? You have to sell enough copies to cover your costs.
Re:Quit your whining... (Score:5, Insightful)
I'm a PS2 programmer; as "bad" as the PS2 is to program for, the PS3 is FAR worse; getting any sort of decent performance out of the PS2 involves utilizing each processor, and now our job is (at least) 7 times harder on the PS3?! Only a masochist would want to program on that thing. Give me RAM, lots of it, and not fragmented into tiny little pieces. And fewer, faster processors rather than many slower ones. There are only so many tasks that can be parallelized.
Even on the XBox, MS's tools were miles ahead of Sony's. Most of Sony's PS2 tools haven't been updated in ~4 years, and you wonder why developers are avoiding the PS3?!
And you have the gall to tell me and others it's a self-fulfilling prophecy, when people like me are saying the PS3 is "hard, damn hard" _based on past experience_, when you've never even programmed the PS2? Boy, are you naive!
Maybe _you_ want the "opportunity" of staying up late chasing down DMA bugs, trying to figure out why the hell your streaming engine isn't loading some data fast enough (guess what -- streaming is even MORE troublesome 'cuz we have 4 times the memory to fill, yet the DVD transfer rate has only doubled on next-gen consoles), trying to debug VU code when one of your models isn't skinning properly, trying to figure out where you're going to fit all the game assets in memory, etc. But I've jumped through enough hurdles that I don't want or need any more than necessary, because I have better things to do (such as implementing the game) than fighting broken and limited hardware. The principles are indeed the same, but the devil is in the details, and frankly, we're getting tired of having to spend such insane amounts of time on them.
And yes, I do actually love programming the PS2. The risk/reward ratio is very fulfilling. The XBox (1 or 360) even more so. But people aren't bitching when they are stating facts -- "Programming the PS3 is hard. Period." The risk/reward ratio is out of line compared to other consoles -- and we have to ask "Why? Why does it have to be so difficult?"
Maybe Sony will wise up and realize that when you make it -easy- to develop on your system, people will -want- to, and will be more than happy to spend the time experimenting. It's all about minimizing the cycle: code new feature, compile, link, export assets, convert to native format. Make it easier on the developers and we will love you -- make it harder and we will hate it. It's not rocket science, only computer (and social) science.
Anyways, I've rambled on long enough.
Cheers
Re: (Score:3, Insightful)
You're the first developer I've heard indicate that programming the PS3 is harder than programming the PS2. With PS3, you at least have a fairly straightforward PPE and the use of OpenGL for the graphics... it's got to be easier to bring up a game with that than on PS2, right?
Re: (Score:2)
Re:What's happened to us... (Score:5, Insightful)
You're an idiot!!!
I developed games for years before I 'burnt out' on the deadlines and schedule
The fact is no one wants to program for hardware that is 'interesting and challenging' they want to produce software which is 'interesting and challenging'. It is very difficult to produce an advanced 3d engine on a piece of hardware when you're fighting for adequate performance and as you add complexity you're constantly fighting with the hardware to get stable performance.
Essentially, an architecture which is 'difficult' to program for makes creating high-performance applications like building a sand castle in the rain.
It's difficult to believe ... (Score:5, Insightful)
You're a fool if you think developers only want "the most powerful box". There's a lot more to the industry, heck software programming in general, than power. Also, you completely ignore the fact that if you were right, then you'd have developers flocking to the Xbox, since it was the most powerful console last generation.
Re: (Score:2)
Not really no, it's a repeat of the past PC history: push more pixels on the screen, output more flops. Nothing innovative, and nothing truly "leaping", especially when you don't build the tools to allow your devs to work on massively parallel machines.
Uh... You're aware that the XBox 360 already has 3 (symmetric) cores aren't you? You don't seem to be...
Re: (Score:2)
Actually that's aiming rather low.
I want a machine that looks like a single powerful machine, but can be made out of tons of different PCs, where if one PC dies, it doesn't really matter.
Something like clustered VMS - but more generalized than Google's map reduce stuff.
On Linux there's OpenSSI, but it's still got a long way to go - stuff like Postgresql won't work well on OpenSSI. If AMD or Intel or someone can help fix tha
Re: (Score:2)
Can't help you for general purpose stuff, but if you want a way to code applications on multiple machines, use Erlang, it's been built for this from ground up.
Re: (Score:2)
3 processors, each dual-core, I thought.
Re: (Score:2)
It's not just a safe bet, it's a sure thing, unless Microsoft decides to rip the multiple cores out of the current XBOX 360 design and go "old skool" or something. Oh, and the architectures are already completely different between the XBOX 360 and PS3 despite both consoles u
Re: (Score:2, Insightful)
Also, MS has been known for providing very good development tools for quite some time. They used those for earlier games, too. Doesn't mean they can't make them cross-platform.
Re: (Score:2)
But I take his word seriously - if he says it was a mistake, it probably was in many respects knowing atypical programmers.
It's not as if this opinion was not said before by others.
Re:Quit your whining... (Score:5, Interesting)
Now, don't get me wrong, I love the Saturn. Good developers could do amazing things with it -- Radiant Silvergun, Panzer Dragoon Saga, NiGHTS, Powerslave, Virtua Fighter 2, Astal, Guardian Heroes... but most chose to develop for the slightly less powerful and far more developer-friendly PlayStation.
And why? Because it made sense. It's not just a matter of developers being lazy or unskilled; if it is too hard to develop for a system, that also means doing so will take longer and cost more.
Who the hell modded that interesting? (Score:2)
That is utter bullshit. The Saturn was much better at 2D, whereas the PSone totally trounced it in 3D. And 95% of the games were 3D.
Re: (Score:2)
Re:Quit your whining... (Score:4, Insightful)
I think Carmack's view of the PS3 is a lot more realistic than yours. He's talking from the perspective of someone who has to ship a product or his friends don't eat. You're talking about what would be nice if programmer time were free.
FWIW, I have a PS3, which I am using to do Cell development. It really is very impressive... And it also really is a lot more work than a more traditional multicore system. The decision to specialize an extra time here reflects Sony's PS2 design (crappy CPU with two very impressive and non-interchangeable vector processors to make up for it), and I think it also reflects Sony's arrogance; they simply assume that, of course, people will be willing to spend twice as long developing software on their system to get a noticeable but not earth-shattering improvement in performance.
Re:Quit your whining... (Score:4, Informative)
As a Software Engineer I grow more and more disgusted with the x86, and to a lesser extent Power, architecture every day. Ah, if only I could go back to the CBM days.
Re:Quit your whining... (Score:5, Insightful)
With the absolute best tools conceivable, even with tools that we don't even have the technology to build, Cell will still be harder to develop for than a more conventional processor.
Yes, it's more powerful; probably a LOT more powerful. It's still more work to get anything done.
Your claim that it will be "easier" as it allows "better division of code and work" is just plain nonsense. Any division of code and work I can do on a multi-core system, I can do on a single-core system, too. That division is already available to me. The extra work Cell imposes is that I have to divide it asymmetrically; I can't just partition the task in whatever chunks the task makes sense in, I have to partition a lot of it in terms of the very specific requirements of the SPEs.
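The asymmetric-partitioning constraint can be made concrete with a little arithmetic. This is an illustrative sketch with assumed numbers (the code/stack footprint and element size are invented; only the 256KB local store figure comes from the discussion): on an SMP machine you split work however the task suggests, but an SPE's local store dictates a fixed chunk budget.

```python
# How the SPE's 256KB local store dictates chunk sizes for streamed data.
# CODE_AND_STACK and elem_size are assumptions for illustration only.

LOCAL_STORE = 256 * 1024
CODE_AND_STACK = 64 * 1024            # assumed footprint of the SPE program
DOUBLE_BUFFERS = 2                    # DMA into one buffer, work on the other

def spe_chunk_size(elem_size):
    """Elements that fit in one DMA chunk after code/stack and buffering."""
    usable = LOCAL_STORE - CODE_AND_STACK
    per_buffer = usable // DOUBLE_BUFFERS
    return per_buffer // elem_size

def partition(n_elems, chunk):
    """Split a workload into (start, end) ranges of at most `chunk` elements."""
    return [(i, min(i + chunk, n_elems)) for i in range(0, n_elems, chunk)]

chunk = spe_chunk_size(elem_size=16)  # e.g. one 16-byte float4 per element
assert chunk == 6144
ranges = partition(100_000, chunk)
assert ranges[0] == (0, 6144) and ranges[-1][1] == 100_000
```

On a symmetric multi-core box none of this bookkeeping exists: any thread can touch any part of the working set through the shared cache hierarchy.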
That work won't go away, even with perfect tools. It's harder, and it will always be harder.
I think it's probably worth it for supercomputing. I'm less sure that it's worth it for consoles, because game development costs are a plague upon the industry, and making them worse won't help.
Will it get easier than it is now? Yes, but the underlying fact that it is "harder to program and takes a little more time" won't. What might change is that, right now, it takes a lot more time; that might be plausibly reduced. But, in the end, making Cell just as easy as a multicore SMP system is in the same bin as lossless compression that is guaranteed to compress ALL possible inputs.
Re: (Score:2)
Re:Quit your whining... (Score:5, Interesting)
There's a huge difference between Cell and "PPC with vectors". It's called local store. Each SPE has 256KB of local storage. You have to have your code in there, and then stream data through. That means you have to do a fair amount of setup and partitioning specifically around that 256KB limit, which wouldn't apply on a multi-core PPC. That's a real issue, and you can't just paper it over for real projects.
And yes, in theory, the tools should be able to hide some of that from you. That's why Carmack's comments about the tools are so damning. If there has ever been an architecture which desperately needs polished and mature tools, this is it.
Also, I don't know where you get the idea that SPEs can do scalar code. They are 100% vector-only. The closest they can get is to emulate scalar code by ignoring the rest of a vector while manipulating only its first slot. That can be done, but it leaves you with a very slow processor spending a lot of its time masking things out and merging vectors together. (or, if you just omit those slots, and use only one slot out of each potential vector, your data takes much more space; 4x as much for 32-bit objects, 16x as much for bytes.)
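The mask-and-merge overhead the poster describes can be modeled in a few lines. This is a toy simulation using Python lists as 4-wide vector registers, not actual SPE intrinsics; it just shows that a single scalar write costs several whole-vector operations.

```python
# Toy model of "scalar on a vector-only unit": to change one lane you
# broadcast the scalar, build a lane mask, and merge whole vectors --
# several vector ops where a real scalar unit needs a single store.

def vec_select(a, b, mask):
    """Per-lane select: take b where mask is set, else a (one vector op)."""
    return [bi if m else ai for ai, bi, m in zip(a, b, mask)]

def scalar_store(vec, lane, value):
    """Emulated scalar write into one lane of a 4-wide vector."""
    splat = [value] * 4                      # vector op 1: broadcast
    mask = [i == lane for i in range(4)]     # vector op 2: build the mask
    return vec_select(vec, splat, mask)      # vector op 3: merge

v = [1.0, 2.0, 3.0, 4.0]
assert scalar_store(v, 2, 9.0) == [1.0, 2.0, 9.0, 4.0]
```

Three vector operations (plus the load and store of the full vector) to change one value is exactly why naive scalar code crawls on such hardware.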
Re: (Score:3, Insightful)
IBM has already given one solution to this particular complexity in the form of an advanced compiler that implements software cache [ibm.com] among other enhancements to neutralize your arguments. Pay close attention to "Scalar code on SIMD units" which supports your argument that the SPEs can not intrinsically handle scalar operations specifically (though with the right k
Re:Quit your whining... (Score:4, Informative)
The software cache imposes a fair number of cycles of latency per access even when the datum in question is in the cache; it's even slower when the datum isn't in the cache. Is it useable? Certainly. However, it's going to give you a serious performance hit, especially when used heavily. You can't just use it and get full performance from the SPE without worrying about local store; you have to use it sparingly only for access to large chunks of data, while keeping the bulk of the data you're working on in local store all the time. It does not solve the problem; it mitigates it.
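The per-access tax described above is easy to see in a toy model. This is a hedged sketch of a tiny direct-mapped software cache; the line size, line count, and cycle costs are all invented for illustration, not IBM's actual numbers.

```python
# Toy direct-mapped software cache: every access pays a tag check (cycles
# even on a hit), and a miss additionally pays a simulated DMA transfer.

LINE = 128                      # bytes per cache line (assumed)
LINES = 8                       # deliberately tiny cache for the example
HIT_COST, MISS_COST = 4, 200    # assumed cycle costs, hit vs. miss

class SoftwareCache:
    def __init__(self, memory):
        self.memory = memory
        self.tags = [None] * LINES
        self.data = [None] * LINES
        self.cycles = 0

    def read(self, addr):
        line_addr = addr - addr % LINE
        slot = (line_addr // LINE) % LINES
        if self.tags[slot] == line_addr:      # tag check on EVERY access
            self.cycles += HIT_COST
        else:                                 # miss: "DMA" the line in
            self.cycles += MISS_COST
            self.tags[slot] = line_addr
            self.data[slot] = self.memory[line_addr:line_addr + LINE]
        return self.data[slot][addr % LINE]

mem = bytes(range(256)) * 32
c = SoftwareCache(mem)
c.read(0); c.read(1); c.read(4096)   # miss, hit, then a conflicting line
assert c.cycles == MISS_COST + HIT_COST + MISS_COST
```

Even the hit path costs cycles that direct local-store access would not, which is why heavy reliance on the software cache erodes the SPE's theoretical throughput.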
The auto-SIMDization stuff does not make scalar code efficient. What that does is, in some cases, automatically convert scalar operations on arrays into vector operations on the same arrays. It doesn't solve the more general problem of operations that aren't being performed on whole arrays; those stay slow. What's impressive is that they handle the alignment problems involved in vectorizing, e.g., a[i] = b[i + 1] * c[i + 3];, not that they have magically cured the problem.
So we're back to where we started. The Cell development support and compiler can give you comparative ease of use, but you lose a lot of the performance potential of the processor. All those theoretical numbers you see are based on the assumption that you are doing 90% of your calculations with no latency; not waiting for DMA, not accessing a cache, and so on.
So, really, you do have to do a lot of extra work to get the theoretical performance, or even very close to it. The software cache and other techniques mitigate this, so you can get enough performance from the SPEs to benefit from them somewhat without having to do anything exceptionally elaborate, but they don't give you the theoretical performance. If you want that performance, you have to do a lot of fiddly little management of, for instance, the tiny local store available on the SPEs. If you want to have that handled automatically, you take a very noticeable performance hit.
Re: (Score:3, Insightful)
It's not just thinking in parallel, Cell requires that you think in terms of what the SPEs are good at, which is streaming data processing with no branches. The transition to parallel thinking is much easier with the Xbox360 because you can parallelize along any task boundary since each processor is the same and capable of the same general processor functions.
As a Software Engineer I g
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
See also:
http://www-128.ibm.com/developerworks/power/library/pa-cellspu/ [ibm.com]
Re:odd (Score:5, Informative)
What a bizarre concept. When I worked for a software company in Japan, the only way that culture intruded into software development (aside from an organizational structure that makes Dilbert's seem positively effective) was the fact that the programmers who could read English used English manuals. Japanese manuals and other reference/instructional materials were so vague and information-poor that the language barrier was easier for my co-workers to overcome than reading materials in their native tongue!
I will grant you that Asians (East Asians, at least) seem much more comfortable with interfaces that we find incredibly cluttered. But that's not programming as such.
Re:odd (Score:4, Interesting)
Honestly, developers don't want to re-create the wheel each time for their build processes, but these kinds of things end up forcing that on them. The first chance they have to jettison it, they're going to.
Re: (Score:2)
Re:odd (Score:5, Funny)
Re:odd (Score:4, Funny)
Re: (Score:2)
VB is not about logic, or intelligence.
Re: (Score:3, Informative)
Besides, what do you think if() and while() do in programs? How about == and !=? <, <=, >, and >= are just a series of boolean comparisons.
That isn't to say you can write an application using just boolean logic, as you'd need math in there at some point...
Re: (Score:2)
Nope. You can build the math out of boolean logic.
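The claim above, made concrete: addition really can be built from nothing but boolean operations, one full adder per bit chained into a ripple-carry adder. This is a textbook construction, sketched here for illustration.

```python
# Addition from pure boolean logic: a full adder per bit (XOR for the sum,
# AND/OR for the carry), chained into an 8-bit ripple-carry adder.

def full_adder(a, b, cin):
    s = (a ^ b) ^ cin                        # sum bit: XOR of the three inputs
    cout = (a and b) or (cin and (a ^ b))    # carry out
    return s, cout

def add(x, y, bits=8):
    """Add two non-negative ints using only boolean ops, modulo 2**bits."""
    carry, out = False, 0
    for i in range(bits):
        a, b = bool(x >> i & 1), bool(y >> i & 1)
        s, carry = full_adder(a, b, carry)
        out |= int(s) << i
    return out

assert add(13, 29) == 42
assert add(200, 100) == 44   # wraps mod 256, like an 8-bit register
```

Subtraction, comparison, and multiplication can all be layered on top of the same gates, which is the parent's point: the math is built out of the boolean logic, not the other way around.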
Re: (Score:2)
Re: (Score:2)
2) Boolean logic is a system. It has certain, limited capabilities which do not include anything like the concept of jumps. You can build other systems that do include jumps out of boolean logic, but that still doesn't mean that if and while are boolean constructs. They aren't.
Re: (Score:2)
Re: (Score:2, Insightful)
Speaking of which, can anyone point me to a link or other resource that talks about, or actually has, the so-called debates between Carmack and Bill Gates about OpenGL and DirectX? Last time I tried Google with this, I couldn't find much useful information.
Re: (Score:2)
Actually, he's been on Microsoft's side ever since Quake [wikipedia.org] in 1996 or so. David Kushner's book, Masters of Doom [amazon.com] talks just a little about their partnership. After Doom II's major success, Microsoft saw the chance to push id to support its Windows platform. The original Quake was released for DOS, but later a WinQuake version was released that worked using everything from DirectX except Direct3D
Re: (Score:2)
/begs to the gods of winter-een-mas that John Carmack does make something awesome for the Wii
Re: (Score:2)
You're ascribing the slashdot geek's viewpoint to that of a "normal" guy like Carmack. Carmack, like all "normal" humans, isn't interested in the OSS/anti-MS jihad, where "hate" would be involved.
Re: (Score:2)
Re: (Score:3, Informative)