Vista's Graphics To Be Moved Out of the Kernel 555
Tiberius_Fel writes "TechWorld is running an article saying that Vista's graphics will not be in the kernel. The goal is obviously to improve reliability, alongside the plan to make most drivers run in user mode." From the article: "The shift of the UI into user mode also helps to make the UI hardware independent - and has already allowed Microsoft to release beta code of the UI to provide developers with early experience. It also helps make it less vulnerable to kernel mode malware that could take the system down or steal data. In broader terms, this makes Windows far more like Linux and Unix - and even the MacOS - where the graphics subsystem is a separate component, rather than being hard-wired into the OS kernel."
The Bloat Divides? (Score:4, Insightful)
So this is like cell division. The bloat of Windows divides into the Kernel and UI pools.
Taking this article into account [slashdot.org], it seems clear why the graphics card requirement is so massive. However, if this much is being pulled out of the kernel, then why is the minimum RAM requirement still so high?
"if you hold down ctrl+shift+alt and tap the backspace you can watch a video of steve wrecking a chair"
Finally, can I turn the GUI off on my server? (Score:5, Insightful)
Now for the marketing (Score:4, Insightful)
New! - Microsoft's Exclusive Patented Technology allows for graphics outside the kernel, to provide higher stability.
New! - Microsoft's Revolutionary Technology allows for graphics outside the kernel, to provide higher stability.
Just wait.... they'll make it sound like a new concept. Rather than a copycat.
Reinventing Unix (Score:0, Insightful)
Open GL Drivers? (Score:5, Insightful)
Re:Finally, can I turn the GUI off on my server? (Score:3, Insightful)
Re:The Bloat Divides? (Score:4, Insightful)
The simple fact is that it's possible to do great graphics, at least for a GUI, without needing a bloody supercomputer (Yes yes yes I *know*. I'm overstating for effect). Basically if they did these things properly they would see a lot of the hating go away.
Re:Finally, can I turn the GUI off on my server? (Score:5, Insightful)
Ah, yes. Just what we all want. Command-line administration of Active Directory and Exchange.
Windows Server 2003's GUI overhead is extremely small in comparison to the other tasks it's performing. Besides, it's not a matter of being "scared" of a CLI; in fact pretty much all the Windows sysadmins I know (including myself) use the Windows command line on a regular basis. Believe it or not, a GUI really can give a boost to speed and efficiency when it comes to server management, regardless of what the zealots here might say.
Obligatory: (Score:4, Insightful)
Re:Finally, can I turn the GUI off on my server? (Score:2, Insightful)
You still need alternate access to console (Score:2, Insightful)
Re:Now for the marketing (Score:5, Insightful)
Day in and day out, Microsoft takes a beating around here for putting too many irrelevant subsystems into their kernel.
And then, when Microsoft makes a positive design change, they are attacked for HYPOTHETICAL marketing. You don't know how (or if) they'll market this.
I can see it now: Bill Gates shows up at your front door, hands you a million dollars, and walks away. You run to your computer and submit the headline, "BILL GATES IS A TRESPASSER."
Re:Finally, can I turn the GUI off on my server? (Score:5, Insightful)
Never used or seen Netware or used any UNIX, have you?
There is no NEED for a GUI on the server. Keep the admin tools on the client! If you can't administer AD from your client, restart the AD Admin Service on the server.
Admins should only physically touch servers when there is a hardware problem or network problem. If you are sitting on the console of your server using the GUI, I would suggest that you are not a very experienced sysadmin.
Re:This is NOT a good thing. (Score:5, Insightful)
Re:More like Mac & linux = understatement of t (Score:2, Insightful)
Re:YES a COPYCAT (Score:3, Insightful)
There are design tradeoffs made in doing operating system design. Back when NT4.0 came out, Microsoft decided that the performance of the kernel-mode graphics system was superior to that of the user-mode graphics system. Now that the hardware is so much faster, there's virtually no difference in performance...so now they're moving it back into userland.
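The tradeoff the parent describes is the cost of crossing the user/kernel boundary: every call into the kernel pays for a mode switch, which is the overhead NT 4.0 avoided by putting graphics in kernel mode. A rough, hypothetical Python microbenchmark of that cost (absolute numbers vary wildly by CPU and OS, and Python's own call overhead dominates both loops, so treat this as an illustration only):

```python
import os
import time

def user_mode_call():
    # Pure user-mode work: no kernel entry at all.
    return 42

N = 200_000

t0 = time.perf_counter()
for _ in range(N):
    user_mode_call()               # stays entirely in user space
user_cost = (time.perf_counter() - t0) / N

t0 = time.perf_counter()
for _ in range(N):
    os.getppid()                   # each call crosses into the kernel and back
syscall_cost = (time.perf_counter() - t0) / N

print(f"plain call: {user_cost * 1e9:.0f} ns, syscall: {syscall_cost * 1e9:.0f} ns")
```

On 1996 hardware that per-call gap multiplied across thousands of GDI calls per frame was significant; on modern hardware it mostly vanishes in the noise, which is the parent's point.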
Re:Now for the marketing (Score:3, Insightful)
Why would someone need to think of something original when they can just keep recycling the same old jokes over and over?
I'm no MS fanboy myself, considering some of the mistakes they've made in the past. However, I'm disappointed with what passes for humor here sometimes.
Re:Finally, can I turn the GUI off on my server? (Score:3, Insightful)
Re:Finally, can I turn the GUI off on my server? (Score:4, Insightful)
Re:Finally, can I turn the GUI off on my server? (Score:3, Insightful)
Well yeah, in the same sense that Unix is DOS on steroids.
I know this analogy is not entirely correct, but wasn't the point of Win9x that it put the gui INTO the kernel?
No. The point of Win9x was to look like Mac OS. Moving the GUI into the kernel was a poorly thought out premature optimization. Microsoft is doing the right thing by changing that.
Re:This is NOT a good thing. (Score:5, Insightful)
Any bugs that exist in the kernel mode driver would yield the same problems in user mode. If a video driver incorrectly configures your graphics card, you're going to get a garbled display, period.
I don't think we're too worried about garbled displays here. If you have a kernel mode driver, it can do whatever the hell it likes with the entire kernel address space. Even if it isn't malicious, a badly written kernel driver can cause all sorts of corruption all over the place.
Re:So what does this mean for cheaters? (Score:3, Insightful)
Re:The Bloat Divides? (Score:3, Insightful)
In fact, I'd like to see an ability in Vista Server to shut down the UI completely unless someone is actually using the system in an interactive mode.
Re:Finally, can I turn the GUI off on my server? (Score:2, Insightful)
If I said, "provide me an example of a situation where a GUI would be quicker/easier/fewer steps/less error prone/whatever compared to a command line interface, and I will give you $100", you don't think you could come up with even 1?
I personally could fill a book with pros for both GUI and command line.
Ahum... NVIDIA userspace kernel module (Score:3, Insightful)
Yeah, running graphics drivers in kernel space is just plain ugly... Luckily for us Linux users, we can get full graphics acceleration by running the "userspace" NVIDIA kernel module:

size
   text    data   bss     dec    hex filename
2476901  947920  6916 3431737 345d39
Re:Apple and Microsoft (Score:3, Insightful)
the reason this is happening now... (Score:3, Insightful)
Re:The Bloat Divides? (Score:5, Insightful)
Just my two cents because I get sick of morons bloviating this crap...
NT borrowed almost NOTHING from the VMS or *nix world. Cutler was the author of VMS and a brilliant *nix designer, but he also knew the shortcomings of both OS models. NT was designed specifically to be different and not be tied to a *nix or, for that matter, a VMS architecture.
(In fact Cutler could have made NT a full *nix Windows, as Microsoft owned Xenix at the time, and was willing to go with whatever the Cutler team decided would create the next great OS architecture.)
People can bitch about Windows and specifically Win32, but there is not a whole lot of NT itself that is flawed or attackable in its design. It still embodies kernel and architectural concepts today that you cannot find in any other consumer-level OS. PERIOD.
For graphics and sound to work best, commonly used objects are stored in memory, ideally most rapidly accessible by the chipset which makes use of it. If you can pre-load a graphics card with most of your GUI toolkit you can do some amazingly fast rendering.
Ok, this is partially true; however, the thing people seem to miss is that when Microsoft dropped video into Ring0 with NT4, it was to improve video performance for games, specifically WinG and DirectDraw. This was a major performance increase at the time because the higher-level GDI calls of Win32 were mostly non-accelerated for gaming. Also, 3D-accelerated video cards were basically non-existent back then, so machines didn't have a powerful GPU to utilize.
And moving video back up out of Ring0 of course means more stability: a new NVidia beta build no longer locks up the whole Windows machine when it misbehaves. Graphics drivers are the root of the vast majority of system lockups in Windows, since most users don't run MS-certified drivers and instead run the latest incarnations.
Additionally, with the new graphics subsystem concepts in Vista, having video drivers in Ring0 is far less important, as the entire WPF is designed to take advantage of the GPU for everything from off-screen buffering (like OS X) to drawing the controls themselves and 3D interfaces.
In fact with the new WPF in Vista, the GPU can even be used to accelerate printing, and creation of XPS graphical/display documents.
So there is no longer a need or reason for the small performance benefit of having video in Ring0, since the GPU (even an older GPU by today's standards) handles all the gaming work and now even the new UI controls and 3D vectoring of the UI.
Basically MS is saying: we are moving to where the GPU will do its job, so we no longer have to compensate with software rendering and no longer need video drivers to have Ring0 access.
Microsoft considered this move with Windows XP, but with the driver changes needed and the UI still being GDI+ based for most applications, there was still a lot of software rendering taking place. It was only for games that it really didn't matter, as they were already using DirectX and OpenGL for performance.
My two cents....
(And if you don't believe my post, please go look this stuff up: do your own research rather than following the rants of me or any other biased Slashdotter. Truly, I don't profess to know everything, and my rant is short; you will probably learn more by looking up the stuff I talk about than by taking my post, or any post, at face value.)
bugs, not cycles (Score:3, Insightful)
The big deal is eliminating a potential source of crashes. Right now, a video driver bug can (and often does) bring down the entire system. By putting the GUI in a user process you can (in theory) avoid all that. What's more, you get that added stability whether you decide to use the GUI or not.