
Oracle To Invest In Sun Hardware, Cut Sun Staff

Posted by timothy
from the taketh-with-one-hand dept.
An anonymous reader writes "There's been much speculation as to what Oracle plans to do with Sun once the all-but-certain acquisition is complete. According to separate reports on InfoWorld, Oracle has disclosed plans to continue investing in Sun's multithreaded UltraSparc T family of processors, which are used in its Niagara servers, and the M series server family, based on the Sparc64 processors developed by Fujitsu. However, Larry Ellison has reportedly said that once the Sun acquisition is complete, Oracle will hire 2,000 new employees — more people than it expects to cut from the Sun workforce. Oracle will present its plans for Sun to the public Wednesday."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by Anonymous Coward on Wednesday January 27, 2010 @03:34PM (#30923944)

    What about all of Sun's software? Solaris? Java? NetBeans? Their C, C++ and FORTRAN compilers? OpenOffice.org?

  • by Anonymous Coward on Wednesday January 27, 2010 @03:56PM (#30924372)
    As an Oracle employee, I can tell you "fear-based fealty" is not at all how Oracle works. They have a long history of acquisitions, and the strategy is always the same: keep the best and brightest from the acquired company, and let everyone else go. Heck, they've bought entire companies specifically to get their best engineers (Virtual Iron). They're practically obsessed with getting the best people, not the best bootlickers.
  • by PCM2 (4486) on Wednesday January 27, 2010 @04:44PM (#30925328) Homepage

    Ellison reiterated that MySQL is not competitive with Oracle's database business, but complementary to it. They plan to increase investment in the business. No specific announcements about development direction or how Oracle plans to package it (no mention of an "Unbreakable MySQL," for example).

  • by Anonymous Coward on Wednesday January 27, 2010 @05:05PM (#30925780)

    http://blogs.sun.com/BestPerf/entry/free_compiler_wins_nehalem_race

  • by Anonymous Coward on Wednesday January 27, 2010 @05:19PM (#30926054)

    Sorry, I have a very difficult time considering MySQL to be Sun-caliber software.

    Frankly, that acquisition has already gone down in history as one of the biggest tech industry blunders of all time. Sun gave up a lot of money, and in return got a completely shoddy product. MySQL isn't just a bad database system; it goes out of its way to be stupid whenever it can. The mere fact that its default MyISAM storage engine supports neither transactions nor foreign key constraints makes it a complete joke. Even SQLite now supports both of those!

    Sun could have done much better for themselves, and for their customers, had they invested even just a small fraction of that money into the development of PostgreSQL. Unlike MySQL, PostgreSQL is the type of professional database system that would have fit in really well with their Solaris and Java offerings.

    Regardless of what Oracle says now, I hope they kill off MySQL as quickly as possible. MySQL is a disease, and needs to be eliminated. It is responsible for more corrupt and lost data than basically anything else in history. And it's something that Oracle doesn't need associated with them, given that Oracle's database products are basically the complete opposite of MySQL in every way.
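The SQLite claim above is easy to check with a short sketch using Python's stdlib sqlite3 module (the table names here are illustrative, not from any real schema):

```python
import sqlite3

# Autocommit mode, so BEGIN/ROLLBACK are managed explicitly below.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("PRAGMA foreign_keys = ON")  # FK enforcement is opt-in in SQLite
conn.execute("CREATE TABLE parent (id INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE child (pid INTEGER REFERENCES parent(id))")

# Foreign key constraint: inserting a child with no matching parent fails.
try:
    conn.execute("INSERT INTO child VALUES (1)")
except sqlite3.IntegrityError:
    print("FK violation caught")

# Transaction: rolling back undoes the insert.
conn.execute("BEGIN")
conn.execute("INSERT INTO parent VALUES (1)")
conn.execute("ROLLBACK")
count = conn.execute("SELECT COUNT(*) FROM parent").fetchone()[0]
print(count)  # prints 0
```

Both checks pass on any SQLite version with foreign key support (3.6.19 and later).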

  • by TheRaven64 (641858) on Wednesday January 27, 2010 @05:27PM (#30926202) Journal

    The most interesting thing about that was that the 'auto parallelization' code used 8 cores to get only slightly more than 50% more performance than it got with one core. To be honest, I'm a bit surprised it got any benefit at all. Parallelizing a loop is only legal when the compiler can prove there are no dependencies between loop iterations (which means it must be able to see all of the code that executes in the loop, including the bodies of called functions), and it will often cause a slowdown because adjacent iterations touch data in the same cache line, so you get a lot of cache-line churn.

    Compiler performance is very important on the newer UltraSPARCs, like the T1 and T2, because they do not do out-of-order execution. That means data dependencies between instructions can cause pipeline stalls (which, hopefully, won't be a problem, because you've got another thread or seven waiting to run). The compiler needs to know the length of the pipeline and schedule the instruction stream with it in mind. It also needs to do things like space out floating point operations on the T1, which has a much higher floating point latency than most other chips. Moving data between the floating point unit and an integer register (e.g. branching on a comparison between floating point values) takes several cycles, so the compiler needs to be aware of this and shuffle instructions accordingly.
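The independence requirement described above can be shown in miniature with a Python sketch (stdlib only; this illustrates the legality condition, not the compiler transformation itself, and the thread pool here is for demonstration rather than speed):

```python
from concurrent.futures import ThreadPoolExecutor

def body(i):
    # Each iteration reads only its own index, so iterations have no
    # mutual dependencies and may legally run in any order, or in parallel.
    return i * i

serial = [body(i) for i in range(8)]

# Parallel execution is only safe because the iterations are independent;
# a loop like s += a[i] carries a cross-iteration dependency and cannot be
# split this way without restructuring it as a reduction.
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel = list(pool.map(body, range(8)))

print(serial == parallel)  # prints True
```

pool.map preserves input order, so the parallel result matches the serial one exactly whenever the iterations are independent.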

  • by David Gerard (12369) <slashdotNO@SPAMdavidgerard.co.uk> on Wednesday January 27, 2010 @07:15PM (#30927778) Homepage

    The killer app for the Niagaras is Java, lots of Java. That's CPU-hungry but of course you can run a separate Java app (in the JVM) on each of your 96 threads. Makes a Niagara server well worth the money in our experience.

"Why waste negative entropy on comments, when you could use the same entropy to create bugs instead?" -- Steve Elias
