Sony Programming PlayStation (Games) IT Technology

Sony Says Nobody Will Ever Use All the Power of a PS3 581

Tighthead Prop writes "Sony executive Phil Harrison has made some brash comments about the Cell processor and the PlayStation 3. Harrison says that the current PS3 game lineup is using less than half of the machine's power, adding that 'nobody will ever use 100 percent of its capacity.' Is he right? 'The major reason Harrison wants to hype up the "unlimited" potential of the PS3's architecture is to downplay comparisons between games running on Sony's console and Microsoft's Xbox 360. The two systems are not completely dissimilar: they both contain a PowerPC core running at 3.2 GHz, both have similarly-clocked GPUs, and both come with 512 MB of RAM.'"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Woh! Business model! (Score:2, Interesting)

    by Stormx2 ( 1003260 ) on Wednesday December 20, 2006 @11:04AM (#17312528)
    Hold on. Why sell a product with something the consumer will never use? Unless this is a rallying cry, why make consumers pay hundreds of dollars for something they aren't going to use?!
  • Linux Performance (Score:4, Interesting)

    by eldavojohn ( 898314 ) * <eldavojohn@noSpAM.gmail.com> on Wednesday December 20, 2006 @11:05AM (#17312536) Journal
    Harrison says that the current PS3 game lineup is using less than half of the machine's power, adding that 'nobody will ever use 100 percent of its capacity.'
    Well, perhaps this statement will be true for games. I'm not sure. But I have been hearing rumors that the PS3, while running Linux, is not too impressive because it lacks serious memory. Remember, I'm no expert, but I read of a study running Fedora Core 5 on the PS3 versus a Mac G5 running FC5 [geekpatrol.ca], and also a German study claiming the PS3 is little better than a Pentium III at 800 MHz when it comes to Linux [google.com].

    But Harrison could be correct depending on how he defines 'capacity.' In computer science one must be careful with absolutes like "never ever," and he hasn't defined capacity precisely. If he means there will never be a PS3 game that uses the machine to its full capacity, then he's probably right.
  • Kind of funny. (Score:4, Interesting)

    by Viewsonic ( 584922 ) on Wednesday December 20, 2006 @11:05AM (#17312542)
    Ubisoft says Assassin's Creed will have more intelligent AI in the 360 version simply because its three dedicated cores offer raw horsepower that the PS3 doesn't have. You can also tell that the PS3 has run into issues with its 256 MB of texture memory compared to the 360: many textures are blurry and low-res next to their 360 counterparts. It's the PS2 hype all over again.
  • Re:Architecture (Score:4, Interesting)

    by aadvancedGIR ( 959466 ) on Wednesday December 20, 2006 @11:18AM (#17312746)
    Or there is another reason, far less flamebait than my GP post: since the PS3 and the 360 are somewhat similar, game developers will be tempted to build their games on the common ground between the two systems; therefore, even on a superior PS3, the game will be exactly as it is on the 360.
  • by hAckz0r ( 989977 ) on Wednesday December 20, 2006 @11:45AM (#17313140)
    Not being able to utilize 100% of the computing power is inherent in the design of the Cell processor. Don't get me wrong, it's a powerful chip, but it's like any multi-processor, distributed, or multithreaded system. With the Cell it takes time to set up and tear down the configuration between the processors, and if there is no data to work on this very nanosecond then that processor is starved and is essentially spinning, waiting for something to do. The Cell has some unique capabilities to configure its processor units in parallel or in a serial data flow through shared memory, but if the task cannot be broken down into computational algorithms that keep every processor unit busy, then you are simply not running at 100%.
  • by thatguywhoiam ( 524290 ) on Wednesday December 20, 2006 @11:51AM (#17313218)
    First off, this is a famous Sony marketdroid and you should pay him as much heed as you would any other marketdroid from any big corporation. He's just ignorant enough to make boneheaded statements such as this.

    Having said that, for such a nerd-oriented site, I can't believe some of the parsing going on here, and it must come down at least partially to latent Sony-hate (for whatever reason).

    Let's just put the word 'Sony' aside, for ONE second. Just bear with me here.

    The PS3's 3.2 GHz Cell processor, developed jointly by Sony, Toshiba and IBM ("STI"), is designed to dynamically assign its processing elements to different types of work independently. It has a PowerPC-based "Power Processing Element" (PPE) and six accessible 3.2 GHz Synergistic Processing Elements (SPEs); a seventh runs in a special mode dedicated to OS security, and an eighth is disabled to improve production yields. The PPE, SPEs and other elements ("units") are connected via the Element Interconnect Bus, a ring-style bus. The PPE has a 512 KiB level 2 cache and one VMX vector unit. Each SPE is a RISC processor with 128 128-bit SIMD general-purpose registers and superscalar execution. Each SPE contains 256 KiB of non-cached memory (local store, "LS") that is shared by program code and working data; SPEs access additional data in main memory using DMA. The floating point performance of the whole system (CPU + GPU) is reported to be 2 TFLOPS. The Cell CPU itself achieves 204 GFLOPS single precision and 15 GFLOPS double precision. The PS3 ships with 256 MiB of Rambus XDR DRAM, clocked at CPU die speed.

    That is one deeply weird hunk of hardware. And it's pretty fucking cool. Or at least, IBM seems to think so.

    Someone has tried to dumb down an explanation like this for our boy Phil, and he shat out this 'will never use the full potential' idiocy, which in turn riles all the nerds because it's just such a lame thing to say; you can poke holes in it all day (such as 'why build such a complicated beast if we will never be able to program it?', which is equally idiotic).

    So the statement is 100% true, and 100% meaningless.

    Like the hamburger truck at the end of my street that claims Greatest Burgers in the Universe.

  • huh? (Score:2, Interesting)

    by OriginalArlen ( 726444 ) on Wednesday December 20, 2006 @12:00PM (#17313358)
    I thought the PS3 had an all-new processor called the Cell, nothing to do with PPC but designed from scratch to be massively parallelisable and distributed: a cluster in a box, in fact. Was I dreaming? Or could it be that the story submitter took a bit of a knock in a maul or a ruck, or had a scrum come down on him, with consequent massive brain trauma and oxygen starvation? That's what happened to me; tighthead is the hardest position on the damn pitch IMO, you can't even punch your oppo back... mind you, this is Welsh borders school rugby I'm talking about. Much more important than just a game.
  • Re:Kind of funny. (Score:1, Interesting)

    by Anonymous Coward on Wednesday December 20, 2006 @12:01PM (#17313364)
    (if correct)

    Translation:
    Ubisoft cannot break up their AI algorithm to run on so many cores. Therefore, it runs well on one core, just as many PC games do.

    I think this will be a continuing problem with these game consoles as well as PCs. SMP has been around for ages, and yet most programmers don't know anything about writing efficient code for multiple processors (or cores). In fact, I've never been asked to write a multithreaded application in any class during my bachelor's degree. I've got very few classes left and I don't see it happening. The best we did was talk about the existence of threads in Operating Systems. I realize there are schools that actually cover this, but I can safely say that Western Michigan University does not until its graduate programs. Many of us will pick it up on our own, but imagine the students who just get by and only memorize what is taught.

    I can't comment on the textures part because I'm not familiar with the specifications for either system, and I've never seen a PS3 side by side with an Xbox 360 (the PS3 demo units around here were either off or broken), though I have seen an Xbox 360 side by side with a Wii.

    All I will say is that the Sega Genesis was very popular despite having poor graphics compared to the SNES. If the games are fun, people won't care. In fact, the PS1 and PS2 weren't necessarily the best systems for graphics either. This is somewhat subjective, but personally I think the GameCube and Xbox look better than the PS2, and for some games (Tony Hawk 2, for instance) the Dreamcast looked better still. Whether you agree with me or not, the point is that many people bought Sony consoles, so many that Sega stopped selling hardware because it couldn't compete with Sony and Microsoft. (Of course, Microsoft had the advantage of helping Sega set up its Windows CE-based Dreamcast and watching it fail before the Xbox launch.)

    Graphics are not what's important; games are. The PS2 launch sucked too, and Sony got many of you to buy it eventually. I'm sure they knew it would be a slow process. I've even heard PSP sales are starting to go back up, so anything is possible.

  • by jandrese ( 485 ) <kensama@vt.edu> on Wednesday December 20, 2006 @12:01PM (#17313368) Homepage Journal
    Sure you could use all of the memory fairly easily, but could you soak up every single CPU cycle available to you, especially during the H Blanks? Even if you did that, could you soak up all of the cycles during the (relatively long) V Blank? Remember, even a cycle or two of "slack" would mean you're not using 100% of the machine, and worse, even if you did use up every single cycle of CPU time, you can bet that some marginal machines with slightly marginal processors will roll the screen if you do that.

    Even if you managed that, your game would require two joysticks to play and require constant input on both of them, otherwise you'd be wasting a joystick port. I'm not even going to get into the mode switches and whatnot. It's basically impossible to use 100% of any machine like that.
  • by Doctor Memory ( 6336 ) on Wednesday December 20, 2006 @12:03PM (#17313392)
    You don't have to be a pro driver to get the most out of a Ferrari on a race track. I've taken my Porsche on several tracks, as part of the PCA Driver Education program [pca.org] (basically a racing school without the high cost). I'd further argue that anybody who drives a Ferrari and slams on the brakes to avoid an accident is using it to the fullest, but I doubt that limiting "performance" to braking performance would sway much opinion...
  • by JayBlalock ( 635935 ) on Wednesday December 20, 2006 @02:02PM (#17314968)
    I would suggest that the REAL reason that so few games take advantage of the processing power of the system anymore is that, back in the "good old days," no one really knew what was coming "next."

    Remember, when the NES came out, the video game market was just recovering from a horrendous crash. (that, for a couple years, prevented Nintendo from gaining ground in America) No one knew how long the console would "last," so there was no reason not to try to squeeze everything out of it possible. (resulting in games like Battletoads which, to this day, look closer to the 16-bit games than 8-bit) Same even held true for the next generation. The future was fuzzy. Better to use incredible programming tricks to give the Genesis "Mode 7" effects or hack math coprocessors onto the cart than bet on something better being around the corner.

    If you disagree with this, just ask yourself - would Starfox, with its horribly expensive hardware hacks, EVER have been made if people were certain a polygon-based console was less than two years away?

    But after the Saturn and Playstation came out, and the PSX became huge, suddenly the next generation started to be a sure thing. Why squeeze every drop of power when you can just wait a little longer and release a game on a superior system? I refer you, for example, to Shenmue - began development on the Saturn (as a Virtua Fighter spinoff), finally released on the Dreamcast. Or Dinosaur Planet / Starfox Adventures - first for N64, finally released on Gamecube. Ditto for Eternal Darkness. There are innumerable examples these days.

    And SPEAKING of Shenmue, there's also a cautionary tale there. The Dreamcast was 2 years into its life. The PS2 was on the horizon, and Sony was fudding endlessly to try to get people to save their money for the PS2. Sega decided (unwisely) to try to have their actions speak louder than their words and poured *$80 million* into a supergame which was going to be so incredibly good that no one who saw it would even see the NEED for a PS2.

    That game, of course, was Shenmue. And it was probably better looking and playing than the first wave of PS2 games. Nonetheless, it didn't save the console. And, in fact, its huge expense likely contributed greatly to Sega's rapid crumble afterwards. (and AM2's followup effort, Propeller Arena, looked better than PS2 flight sims for a couple years following... except that it was dumped by Sega and was never even officially released)

    So, combined, what we have here is a very clear message - DON'T TRY TO PRESERVE A DYING CONSOLE. There is no easily-seen reason to do so any more. It sucks, but it's true. You (the developer) can make just as much money delaying the game's release for a year or two, and you risk sinking your entire company if you try too hard to hold onto the past.

  • by flitty ( 981864 ) on Wednesday December 20, 2006 @03:23PM (#17315984)
    2K Games/Take-Two/Rockstar
    * Red Dead Revolver 2 ~Rockstar North, TBA~ (i'm betting not exclusive by the time it comes out)
    Atlus
    * Shin Megami Tensei 4 ~Atlus R&D1, TBA~
    Capcom
    * Devil May Cry 4 ~Capcom Studio 1, Q4 2007~
    * Monster Hunter 3 ~Capcom Studio 1, 2008~
    Eidos
    * Age of Conan ~Funcom, Q3 2007~ (not exclusive, coming to PC if i'm not mistaken)
    * Untitled ~Action~ ~TBD, TBA~ (not a game, stupid)
    Koei
    * Blade Storm: Hundred Years War ~Omega Force, 2007~
    * Fatal Inertia ~Koei Canada, 2007~
    * Mahjong Taikai IV ~In-house, Nov. 22~ (who cares?)
    * Ni-Oh ~In-house, 2007~
    Konami
    * Bomberman ~Hudson, TBA~ (if it's as awesome as the last Bomberman for X360...)
    * Coded Arms: Assault ~KCET, 2007~ (another PSP port?)
    * Gradius VI ~TBD, TBA~
    * Mahjong Fight Club ~TBD, Launch~ (who cares!)
    * Metal Gear Solid 4: Guns of the Patriots ~Kojima Productions, Q4 2007~
    * Rengoku: The End of the Century ~Hudson, TBA~
    * Untitled ~RPG~ ~TBD, TBA~ (not a game)
    * Untitled ~RPG~ ~Hudson, TBA~ (not a game)
    Midway
    * Unreal Tournament 2007 ~Epic, 2007~ (coming to PC, if you were a betting man)
    Namco Bandai
    * Mobile Suit Gundam: Crossfire ~BEC, Launch~
    * Ridge Racer 7 ~In-house, Launch~ (isn't this Ridge Racer 6?)
    * Tekken 6 ~In-house, 2007~
    * Untitled ~Anime Project~ ~TBD, TBA~ (not a game)
    * Untitled ~Mech Action~ ~TBD, TBA~ (duh)
    * Untitled ~RPG~ ~TBD, TBA~ (duh)
    * Untitled ~Shooter~ ~TBD, TBA~ (duh)
    * Untitled ~Sports~ ~TBD, TBA~ (shutup)
    Nippon Ichi Software
    * Makai Wars ~In-house, TBA~ (probably not American-released)
    Sega Sammy
    * Fifth Phantom Saga ~Sonic Team, TBA~
    * Full Auto 2: Battlelines ~Pseudo, Launch~ (Full Auto port)
    * Guilty Gear BB ~Arc System Works, TBA~
    * Miyazato Sega Golf Club ~AM1, Launch~ (not American launch?)
    * Virtua Fighter 5 ~AM2, Q1 2007~
    * Untitled ~RPG~ ~Obsidian, TBA~ (not a game, yet)
  • by pdaoust007 ( 258232 ) on Wednesday December 20, 2006 @03:40PM (#17316158)
    "Sony has convinced you that you *need* blu-ray..and it's just not true."

    That depends... I actually bought a PS3 almost solely for watching Blu-ray movies. If you look around, the 20 GB PS3 is actually the best high-definition disc player for the money. It's even cheaper than standalone players that won't do a fraction of what the PS3 can do. I know the 360 has an HD DVD add-on, but the lack of HDMI output was a deal-breaker for me. For someone who doesn't need/want HDMI, the 360 + add-on is also a very attractive option.

    Now I just need to sit tight and hope Sony's format wins... :-)
  • by DrXym ( 126579 ) on Wednesday December 20, 2006 @04:52PM (#17317220)
    Why do people say the PS3 is complex? It has a GNU toolchain and supports a ton of 3rd party APIs (Unreal Engine, OpenGL, Collada, PhysX, Havok, etc.). If you can program a computer then you can program the PS3. Even if you want to get your hands dirty with SPU programming, it doesn't look that hard. If the libspe API in the Cell SDK is anything to go by, then SPU development is pretty straightforward and very familiar to anyone who has had to spawn a thread before.

    The hardest thing would be figuring out which parts of the program should go on SPUs. But that's a problem that all multi-threaded apps face, and it's not specific to the PS3. A 360 title that intends to use its 3 cores to their full potential has similar issues. If there is something "hard" about it, it is that so few games need the system's full potential that it's hard to know what that potential even is.

    Just look at the games appearing for the PS2. I doubt anyone would have imagined when the PS2 launched that you'd see games like Shadow of the Colossus, Bully or God of War by the end. I expect Harrison is just alluding to that.

  • by plalonde2 ( 527372 ) on Wednesday December 20, 2006 @07:51PM (#17319794)
    The amount of RAM is a different issue from being bottlenecked on the memory subsystem. Long ago, a CPU running at 1 MHz had memory running at the same rate; you could effectively manage a memory access per instruction. Over time, CPU speeds grew much faster than memory speeds, so caches showed up to try to mask the gap. On a PS2, a cache miss wound up costing 40-60 cycles. Ouch. The trend has continued, and now it's worse: on the PS3 a cache miss is something ludicrous like 400-600 cycles. Think of it: 500 instructions possible in the time it takes to fetch from memory. Without getting clever, you wind up spending a lot of time stalled waiting for memory, and that's without piles of contention from lots of different threads and processors trying to use the same bus. That's what's meant by being bottlenecked on memory.
  • by faragon ( 789704 ) on Wednesday December 20, 2006 @08:43PM (#17320256) Homepage
    IMO, the problem is not the memory latency per se; it is the Cell PPE's in-order execution that kills performance: the CPU instruction pipeline blocks waiting for a memory load after a cache miss. Most modern CPUs that deal with high-latency RAM are out-of-order, to increase IPC [wikipedia.org]; the in-order PPE, however, has to be programmed with explicit prefetch in mind to avoid pipeline stalls. Don't expect great performance from C/C++ code until the compiler gets decent loop unrolling, pipeline-stall optimizations, etc. (explicit prefetch will still be necessary for streaming processing).
