
Interview With Cosmoe's Bill Hayden

Posted by timothy
from the smooshing-things-together dept.
Eugenia writes: "Over a month ago it was reported that a developer had forked the AtheOS operating system and ported its GUI on top of Linux, without the use of XFree86. This combined OS, called Cosmoe, would support Linux, AtheOS, BeOS and even Macintosh's Carbon APIs (without the use of GNUStep - his port of Carbon is wrapped around the Be API). OSNews today features an interview with the architect of the combined OS, Bill Hayden, in which he explains a lot about his plans for Cosmoe."


  • GnuSTEP and Carbon (Score:5, Informative)

    by Eugenia Loli (250395) on Tuesday May 14, 2002 @12:45AM (#3514905) Homepage Journal
    Duh! I made a mistake when I submitted the story. GNUSTEP wraps around the Cocoa API, not the Carbon one. Sorry for the confusion.

    I was deep into porting a game to MacOSX at the time of the submission and everything was like a big knot in my head... :P
    • amazing. a first post by submitter. anyway, i know it's off topic, but i was just curious what books you might recommend to learn cocoa and carbon. i'm getting tired of windows programming.
      • Hello Dooras. :)

        > amazing. a first post by submitter

        Yes, weird, isn't it? I mean, I loaded Slashdot just before going to bed tonight, and the story had just come up... :o

        At OSNews I have already written 2-3 book reviews about MacOSX programming. I am new to MacOSX myself; I only got this G4 450 MHz Cube some weeks ago, but I have been reading about Carbon and Cocoa since January, because I was seriously thinking of getting a Mac anyway.

        So, here is a review [osnews.com], a second [osnews.com] one, and I also recommend this book [osnews.com] as well. Please come back soon on OSNews, because I will be reviewing another Carbon book soon, which (so far) seems to be the best of the lot.

        (I have the whole OSX book series here; all the latest MacOSX programming books can be found on the shelf behind me. :D )

      • I don't do Carbon so I'm not much help there but for Cocoa get Cocoa Programming for Mac OS X by Aaron Hillegass. Stay away from Learning Cocoa, it sucks.
      • by duffbeer (114852)
        I wouldn't waste my money on books. The developer tools you can download for free from apple are fantastic, and come with ample tutorials, examples, and documentation to get going. I got ahold of a mac to get back into client-side coding after a few years in web apps and was up and running in only a couple of weeks.

        Have fun, you'll love it. =) Also, you don't have to bother with objc if you don't like it. I've been cranking along using the java bindings just fine. I love the java option, as it lets me utilize lots of java code of my own and some harvested from the net.
      • As another said, stay away from the O'Reilly book Learning Cocoa; it is an abomination and practically useless.
  • .cx is perhaps the most unfortunate 3 characters to have at the end of a URL.
  • by 56ker (566853) on Tuesday May 14, 2002 @12:56AM (#3514947) Homepage Journal
    Words/phrases in story -

    operating system - 1 point

    Linux - 10 points

    Linux (again) - extra 5 points

    AtheOS - 5 points

    BeOS - 15 points

    APIs - 2 points

    Be - 5 points

    Total : 43 points!

    40-45 points: This story is so geeky it's bound to be accepted on slashdot!
    • 5 points for AtheOS, and 15 for BeOS? Shouldn't that be the other way around?
    • Well, that was absolutely pointless. BeOS doesn't even have the same geekiness factor. It's more geeky than MacOS, though I think not as geeky as MacOS X. It isn't GPL'ed, so it's not geeky enough for the Slashdot crew. Try "I wanted an early post, but didn't bother to read the article." Karma slut.
      • Hmm, MacOS isn't geeky, it's grognardy. It caters to an older generation of geeks who still think Mach is cool. BeOS is a hipper kind of geeky, catering to the fresh young people who think pervasive multithreading and transparent SMP are where it's at.
  • by tswinzig (210999) on Tuesday May 14, 2002 @01:06AM (#3514990) Journal
    When AtheOS was "outed," it was really far along. Especially when you consider it was all written by one person.

    Similarly, OpenBeOS was impressive because it garnered a big crowd working on it rather quickly, and working code soon followed, to the chagrin of many. (There's already much work done on the kernel, via NewOS, BFS, the network stack, the GUI implementation, various preference and utility apps, and much more.)

    AtheOS was a new OS built for fun (seemingly) by a guy that was impressed (but maybe not directly influenced) by BeOS. More power to him.

    OpenBeOS is being built by fans of BeOS who want to see an open source version that can live on in binary compatibility (for the first releases), and eventually progress beyond what Be, Inc. did (RIP).

    Where does Cosmoe fit into things? This guy forked AtheOS against the original author's wishes (welcome to the world of Open Source, Kurt), in order to ... what? Run BeOS apps on Linux? Run AtheOS apps on Linux? Run BeOS apps on AtheOS? Run MacOS X apps on Linux?

    Honestly, I'm trying to figure out what the goals are; I don't mean to be negative. If the guy is just doing this like Kurt, to have fun, then great... Otherwise, why promote this thing so much when virtually nothing is done? He admits that most of the hard stuff is still waiting to be done. Instead of doing an interview, announcing the code fork, etc., why not start coding and announce it when you've got something to show for it?
    • Because maybe people will see the benefit of an OS that can run all the apps, whether they are Linux, BeOS, MacOS, or similar, and maybe these same people will know a bit about code and will have time to spare to help out, thus giving this project a leapfrog start!

      In that case, we'll all have...

      one OS to rule us all and leave this proprietary code behind in the darkness.

      Why do MOST OSes run only code written for them? Sure, technically it makes sense, but as a user I want to run an app; I don't care what OS it's written for.

      That's the real benefit of a truly OPEN system, IMHO...
    • by tftp (111690) on Tuesday May 14, 2002 @01:19AM (#3515026) Homepage
      I tend to agree. The Cosmoe web site is devoid of any content whatsoever. Not even a screenshot - despite the author's claim of having a graphic artist's soul.

      The experience of developing BSD and Linux already tells us that a good OS is most definitely not a one-man job. The goals of Cosmoe are highly undefined. Even if the author produces something working, it is likely to be very simple and not up to the standard that BSD and Linux set every day.

      IMO, mistake #1 is to start a project without setting a reachable goal and establishing the means to reach that goal. So many projects fail (in the open- and closed-source worlds) because of that. Unrealistic expectations, deadlines that are years off the mark, and a lack of understanding of how complicated some things are (just look at QoS, for example!) drive projects into the ground.

      Of course, everyone is free to do whatever he likes with his own free time, but setting up a Web site to sell the OS seems to be a little bit premature.

    • Out of interest, when did Kurt express opposition to a fork?

      My limited understanding, and perhaps things have moved on since then, is that Kurt wants control over his own project. But he must have GPL'd it for a reason, and I always assumed that reason was so that others who wanted to try other things with his code could do so without needing him to change the direction of his work.

      Is there a mailing list posting from Kurt clarifying his position?
    • Honestly I'm trying to figure out what the goals are
      Then go and read the posts on the AtheOS list where he announced his plans. There's quite interesting stuff in those exchanges. After some flaming about forking from many people, he first acknowledges that he wasn't clear enough about his project, then he states that this is not a fork of AtheOS.
      What he plans is this: he takes the AtheOS API, which he likes, adds the BeOS API, which he likes, and puts this on top of the Linux kernel, which he likes, in order to someday replace X, which he dislikes and which is the real target of his project.
      He is in no way forking AtheOS, as this is a Linux (kernel) project using code from the AtheOS project that he found valuable. Open source at its best, in my book. Too often valuable projects just can't reuse existing code; there's no reason to cry a river when some valuable code is actually found usable.
      On the coding side, he claims to have six months of work in the project already. But just yesterday, I was talking about this project and said: "there isn't even a site!". So I guess this project has entered the "there aren't even screenshots yet" phase, but I'm going to be a little more humble and wait and see.
      by the way, you should be able to have this thing run alongside X....
    • Ok, here is my ungrateful whiney rant. Feel free to skip it and move along.

      We have tons of half-baked, alpha, spin-off, desktop operating system/UI projects that never get anywhere, and which invoke laughter when anybody claims Linux/<some free unix> is ready for the desktop. I don't mean to sound ungrateful, I realize that these guys are doing this "for themselves" as a hobby, but unless desktop developers get together, think over the hard questions, and come to solutions which all can agree on ("standards"? Gasp!), all we will have is a sea of hobby projects. Work that has already been done needs to be leveraged.

      It seems to me that AtheOS, OpenBeOS, BlueOS, and NewOS (which I just discovered today) all have the same goal in common - to create a new, all-encompassing, semantics-enforcing, object-oriented UI (basing it off the BeOS APIs simply because this is one of the areas BeOS did really well) fundamentally integrated with the rest of the OS. Surely these projects can work together? What about Berlin - is *none* of the work they've done relevant?

      In the end, it may simply be that more work will get done *without* cooperating because each hobby developer is incentivized to work on his own thang (working in parallel, through the magic of open source), but it just really screams of inefficiency to me - the work that is done on any of these projects is probably reusable in the others (and I think bootstrapping by using the X drivers is a great idea). Is there really any fundamental philosophical difference between any of these projects? We'll never get anywhere if everybody is reinventing their own unicycle - let's combine them into a useful vehicle. I'm also more than partly motivated because I, myself, as a user, am pretty sick of X (no matter what is thrown on it) and am ready for a free desktop OS designed from the ground up *as* a desktop OS.
  • Why not support the XFree developer team instead of creating another GUI? The chances are higher that it will be integrated into a bigger user base. It is very unlikely that it will happen in the Cosmoe case.

    Anyone remember Berlin?
    http://www.berlin-consortium.org
  • by g4dget (579145)
    please understand that the previous sentence does NOT mean that I will be using XFree86 as the engine (shudder), just that Cosmoe will leverage XFree86's existing drivers to drive Cosmoe's graphics engine.

    What is supposed to be wrong with the X11 graphics engine? Why do people keep complaining about it?

    X11 does hardware-accelerated 2D and 3D graphics with transparency, anti-aliased fonts and graphics, ClearType-like rendering, and server-side geometric transforms.

    You can get a good X11 server into less than 1Mbyte with almost no off-screen memory, or you can give it hundreds of megabytes for caching, buffering, fonts, and textures.

    X11 provides a uniform and powerful API for all sorts of input devices.

    It can't be the client/server architecture that people complain about because neither Windows nor OSX do direct graphics I/O for their UI either. Neither can it be the footprint, because if you look at Windows or OSX GUI apps and system resources, they are as big as X11 on similar hardware, or bigger.

    Graphic design can't be the problem either: X11 imposes no constraints on the toolkits, GUIs, or desktops you build on top of it, and X11 toolkits have emulated Windows, OS/2, and MacOS/OSX so closely that it is really hard to tell the difference.

    So, what concretely is supposed to be wrong with X11? Why this visceral dislike? Why do people keep starting projects to replace X11 with unaccelerated display servers?

    • X11 imposes no constraints on the toolkits

      That's why: it's not X, it's the lack of standard toolkits.
      In joe-user mode, I want a standard look&feel across all my apps. If all the developers started using either KDE or GNOME, or if KDE & GNOME joined forces to create a standard toolkit for X which everyone used, then it would be fine. But instead every sodding app works completely differently.

      • In joe-user mode, I want a standard look&feel across all my apps

        X11 is the equivalent of GDI, Quartz, or DisplayPostscript. If you want to try to enforce a consistent user environment on top of it for your new OS, you can do that as much as you can do it on any other platform.

        Conversely, Quartz and GDI fail to enforce a consistent UI either. Toolkits like Borland, FLTK, wxWindows/Universal, Qt, Swing, and OpenStep run on top of both of them and they give you applications that look and behave differently from native apps.

        In other words, I think this is a red herring. If you build a new windowing system, things will be consistent for a while, because your own applications will be the only ones. Once it becomes popular, the consistency vanishes.

      • I want a standard look&feel across all my apps

        My response seems so obvious, but no one ever seems to suggest it. If you want all your apps to look and feel the same, then I suggest you run KDE or GNOME (or any other toolkit/appkit). Just pick one. There are enough apps for each that you never should have to leave your chosen "standard" look and feel. And for the rest of us who don't care, let us run what we want.
      • The MacOS and Windows UIs have plenty of inconsistencies, even in those applications that come from the OS vendor itself. The Interface Hall of Shame [iarchitect.com] picks some of them apart. Have a look [iarchitect.com] at some of the applications that supposed HCI experts at IBM and Microsoft are producing. And that isn't even counting all the Windows and MacOS apps written using cross-platform toolkits that differ substantially from native apps (often, they are actually better than native apps).

        X11-based UIs aren't perfect, but on the whole, they are no worse than the commercial stuff. And whatever problems GUIs like Gnome and KDE have aren't the responsibility of X11, which is merely a windowing and graphics library (roughly like Quartz).

      • You're right that the lack of a consistent user interface is a huge problem for desktop users. I recently moved all my work from MacOS to Linux, so I really really really see the difference between a coherent UI and a chaotic one.

        But why should it be X's job to enforce a standard UI? X is low-level. It's like saying the C compiler should enforce a standard UI.

        The problem isn't that there's no standard way of doing a UI using X. The problem is that there are too many standard ways of doing it. Due to the nature of open source, I don't see how this is going to change. If they weren't rugged individualists, they wouldn't have become open source developers. The best we can hope for is that some sufficiently large organization produces a big suite of desktop tools that all follow the same standards. Maybe GNOME will be it. But I don't see how you can blame the problem on X.

    • Graphic design can't be the problem either: X11 imposes no constraints on the toolkits, GUIs, or desktops you build on top of it, and X11 toolkits have emulated Windows, OS/2, and MacOS/OSX so closely that it is really hard to tell the difference.

      LOL. If no one could tell the difference between X11 and MacOS, do you think people would still be buying Macs at double the price? Hint: there's a reason folks still buy Macs.

    • by po8 (187055) on Tuesday May 14, 2002 @02:16AM (#3515293)

      As many of the responses to your post illustrate, folks just don't get the idea that XFree86 is a highly modular system. They don't get the idea that the fastest path to a high-quality GUI desktop for their favorite OS is to start with the existing XFree86 server, extend it as necessary, and layer atop it with a decent client side. Yes, Xlib's time has come and gone, and Xt has always been pretty hopeless. So use something like XCB [pdx.edu] as a base, and design the GUI API of your dreams atop it.

      Also note that many of the XFree86 features you mention are either brand-new or not-quite-there-yet. For example, decent font support has only been solid for about a year now, and is still evolving a bit. Server-side affine transformations have been specified but not yet implemented. The spec for proper anti-aliasing of polygons was just finalized last week: it was implemented this week. (That's how fast XFree86 is moving these days with Keith Packard [keithp.com] working on it full time. Keith has repeatedly demonstrated that it's pretty easy to add the "missing" functionality you want as an X extension.) As folks get used to the Render and FontConfig APIs, I expect to see correspondingly less interest in building window systems from scratch.

      IMHO, the "visceral dislike" comes from several factors, including outdated ideas about what X is and how well it works (the performance claims I see around here sometimes crack me up), insufficient appreciation of the difficulty of what X does, and NIH syndrome.

      The good news is that all the carping isn't slowing down the clueful folks any. KDE 3 is nice enough that for the first time since the mid-80s I'm not running twm as my window manager any more. I expect things to only get better from here.

      • by Permission Denied (551645) on Tuesday May 14, 2002 @03:18AM (#3515569) Journal
        IMHO, the "visceral dislike" comes from several factors, including outdated ideas about what X is and how well it works (the performance claims I see around here sometimes crack me up), insufficient appreciation of the difficulty of what X does, and NIH syndrome.

        Disclaimer: I'm an X11 programmer. I love X. It allows me to install matlab on only one machine but use it from anywhere (instead of buying 30 licenses for the 30 people who would occasionally use it). I wrote my own window manager since everyone else's window manager got at least something wrong, and the X11 APIs are extremely clean and elegant - especially if you compare them to, say, the win32 SDK.

        Here's why I think the kids don't like X nowadays:

        1. Athena toolkit. Occasionally they'll find a scrollbar that they can't figure out how to use (pre-version 21 emacs) or buttons used for popping up menus (like xfontsel or gv), and it's pretty damned ugly.

          I don't know how to solve this problem: XFree86 programs like xfontsel, bitmap or xvidtune shouldn't depend on GTK or QT, but at the same time, we should limit exposure to Athena. At least use Xaw3d instead.

        2. Fonts still suck. XFree86 4.2 doesn't come with any truetype fonts; you have to recompile QT and set an environment variable just to get konqueror to look decent; old apps like netscape won't ever use truetype fonts, and even getting netscape to display legible text involves venturing through Dante's ninth level.

          This problem can probably be solved by the Linux distributors - include decent truetype fonts, set up QT correctly, etc. I imagine they're probably doing this at this point, but I don't keep up with them.

        3. Configuration still sucks. Only once - once - did I find that XF86Setup correctly set up a graphics card, and I've worked with a lot of machines. Same deal with "X -configure" on XF86 4.x. I have to fix XF86Config on every machine, and a number of times, I rewrote the thing from scratch as the default template is extremely long and verbose (eg, it has commented-out examples of how to set up multi-monitor systems, something I've never seen in all my years of computing). In addition, the little XF86Config generators still won't set up vital options such as ZAxisMapping, so you end up editing it by hand anyway. They still expect you to know your monitor's sync frequencies, which is absolutely unreasonable.

          Solution to this problem? The config generators must be fixed. This is really a very, very big problem. I shouldn't ever have to choose my graphics card from a drop-down list - the config generator should figure out what graphics card I have by snarfing its PCI ID. "X -configure" does this, but it's none too friendly and still doesn't produce a usable XF86Config - it should be integrated with something like XF86Setup from the 3.x days, which also allowed you to set the various other options. No keyword should be added to XF86Config until the config generators are updated to set up the new option.

          I've never managed to watch a DVD using Linux/XFree86. I'm a unix systems programmer, so I'm not some noob who's afraid to read docs - however, the last time I checked, oms still doesn't do sound sync, so it's completely useless. In XFree86 4.1, the ATI drivers were completely broken and wouldn't correctly do DGA, etc on every ATI card I've seen. It's much better now with XFree86 4.2, but it's extremely dangerous to say that watching DVDs in Linux is feasible at this point. This will only encourage the neophyte to actually try it - they'll have to upgrade and recompile their kernel, upgrade their XFree86, mess with some crap in /proc to enable DMA on their DVD drive, get a CVS checkout of oms as the published tarballs are outdated, search for the correct css plugin, as there are at least three different ones, and then, maybe, perhaps, they can try to see if it will even work. I didn't write down the crap I went through while trying this, so I'm missing a whole bunch of steps here - suffice it to say, this will take a few days. If you try to run quake3 or any other 3D game under Linux, you'll run into the same things.

          Solution to this problem? Cut back on the advocacy. Let's be honest and admit that very few people will be able to watch DVDs or play newer games in Linux at this point.
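          As a concrete illustration of the ZAxisMapping complaint in point 3: the fix is a one-liner once you know it exists, but the config generators won't write it for you. A minimal hand-written InputDevice section might look like this (a sketch only; the device path and protocol are assumptions for a typical PS/2 wheel mouse):

```
Section "InputDevice"
    Identifier "Mouse0"
    Driver     "mouse"
    Option     "Protocol"     "IMPS/2"      # assumed: PS/2 wheel mouse
    Option     "Device"       "/dev/mouse"  # assumed device path
    # Map wheel motion to button 4/5 events; the generators omit this.
    Option     "ZAxisMapping" "4 5"
EndSection
```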

        • If you try to run quake3 or any other 3D game under Linux, you'll run into the same things.

          Never had ANY trouble setting up quake3 in linux. Not a GeForce either, a G400 Max. A couple of kernel options and a few tweaks to XF86Config, and it was the first "application" I had working on my new Gentoo Linux install besides X.

          And since I'm already WAY offtopic, I'll mention that the Gentoo Linux [gentoo.org] install is long, but worth it and not difficult at all. The instructions on the website are FANTASTIC. I have never seen better installation instructions in my entire life. I was a hardcore slack user for a while, and this came as a bit of a shock. System is nice too :P

          Anyway, play q3! And, to a lesser extent, rtcw! Support your local Cache [cached.net]! 'Cause computer games are damn fun.

        • DVD Playing on Linux (Score:4, Informative)

          by benjj (302095) on Tuesday May 14, 2002 @03:54AM (#3515668) Homepage
          Woh woh woh woh!

          I think it must have been a while since you tried dvd playing on linux. AFAIK oms isn't even developed anymore. I use xine with the dvdnav plugin. It installed easily on debian, and the only sound sync problems I had were when I tried it using esd.

          I also run 3d games under linux. Both Wolfenstein and Quake III work, as does Max Payne running under WineX. All with no trouble after I installed the NVidia drivers.

          I do agree with you about the X configuration issues, this seems to be something that each commercial distro is trying to solve in their installers (somewhat unsuccessfully in my experience).
        • by po8 (187055) on Tuesday May 14, 2002 @04:08AM (#3515705)

          Your points are mostly well-taken.

          1. There is a serious effort underway to remove all Athena dependencies from the sample X apps, and not to replace those dependencies with Gnome or KDE dependencies. But it'll be a while before this happens: the replacement has to be designed and built first.
          2. Font support is being improved as we speak. Distro vendors can certainly help. The KDE and Gnome groups are helping. I'm using anti-aliasing on most every font on my screen now, and although it was decidedly non-trivial I didn't actually have to stand on my head to do it.
          3. IMHO the configuration situation, while bad, is not as bad as you describe. Certainly the only viable way to configure X is to run "XFree86 -configure" on a 4.2 server and edit the output. But the edits aren't that hard any more. As you note, card detection is automatic, and usually works. The VESA bits make modern monitor detection also automatic, eliminating that source of confusion. Mostly it's input devices that are a continuing source of grief. Keith Packard finally rewrote the mouse protocol autodetect fairly recently: XFree86 now successfully autodetects your mouse type the first time you move the mouse around. (This, BTW, was surprisingly hard.)

            The default XF86Config file format may be moving to XML. This would help a lot with newbies being able to use sensible tools to edit their configuration. In particular, XML editors are pretty good at not messing with parts of the file they don't understand...

          The DVD player thing is a special case, since there are folks in the world actively trying to make it hard :-). But if you run Debian, you can very easily install usable XFree86 bits, a usable kernel, and the current Xine bits. It's then just a question of finding a .deb for the Xine CSS plugin, and you should be able to watch movies---I can.

          The DRM/DRI support for 3D has stabilized to the point that it mostly just works. As you suggest, if it doesn't, you are probably out of luck unless you have direct access to a guru. This is true in Windows-land as well. The traditional solution there is to buy new hardware to make your software work. Buying a modern Nvidia card means you automatically get usable Linux drivers and some tech support, so this is always an option.

          I agree that there are some things that still require some expert help, and that this is too bad. But all of this has gotten pretty off topic. If you check out the 3D and video HW support of the competition to XFree86 (e.g. Cosmoe [which is apparently going to call its initial distribution potatoe :-)]) you'll find it to be far inferior, to say the least. X may not be perfect, but it's tremendously good. Help out or just be patient, and it will get even better.

          • But if you run Debian, you can very easily install usable XFree86 bits, a usable kernel, and the current Xine bits. It's then just a question of finding a .deb for the Xine CSS plugin, and you should be able to watch movies---I can.

            Use vlc [videolan.org]. Much easier to use than xine, and it's packaged with debian.

        • I don't really understand those issues. If a Linux user does the same thing that a Windows user does, install a ready-made distribution, they don't ever have to deal with any of this. Modern Linux distributions configure the server and OpenGL, figure out the monitor, install a desktop, and install TrueType fonts. Even the DVD players work out of the box (you have to get a third party CSS decoder plug-in yourself for encrypted DVDs--thank the MPAA).

          If you choose to compile and install things by hand, of course, you have to edit XF86Config files. But even there things have gotten much easier (e.g., in XF86v4, you don't need to worry about modelines anymore).

        • The biggest problem with X, no matter how wonderful it is technically speaking, is that it does not enforce GUI semantics. This can be considered either a good or a bad thing. From the perspective of a desktop OS, I think it is a bad thing. I don't know the guts of X well, but isn't the fact that video drivers are implemented in userland an architectural problem to begin with? Plus, the resources mechanism is absolutely byzantine and needs to be razed and salted, as do the complex distinctions between server and client (wait, who's the server, who's the client, who has the toolkit? who's running the window manager? what the fsck is going on?). X simply suffers from being everything to everyone. I mean, it does an OK job being everything to everyone, but sometimes, as when designing a *cohesive* desktop operating system, you *purposefully* don't want to have every option, and you *purposefully* want to force/standardize some things. Currently, to get a GUI you have to choose a permutation of Toolkit/Window Manager/Desktop, each of which might have its own subtle semantic differences and, due to the "flexibility" of X, may or may not work with the others with varying degrees of success. Just because a toothpick is "modular" doesn't mean you should build a house out of them.

          Now maybe X is as modular as you say, and can be used as the building blocks for a more narrowly-defined GUI. But I think X has engendered as much bile against it as there is inertia behind it. (and a lot of the problems of X are the same old unsolved problems of Unix in general - no standardized method of configuration or collation of preferences for one)
          • The biggest problem with X, no matter how wonderful it is, technically speaking, is that it does not enforce GUI semantics.

            X11 is the equivalent of GDI or Quartz; it doesn't have to enforce GUI semantics. If you want to enforce a "coherent" desktop on top of it, you can impose whatever draconian measures you like. KDE looks quite coherent and standardized to me, for example.

            It's a myth, in any case, that Windows or MacOS are any more coherent than, say, KDE. Take a look here [iarchitect.com] for an extensive critique. And you think that the appearance or window management behavior can't be changed on Windows? Think again: Stardock [windowblinds.net], Litestep [litestep.net], Microsoft PowerToys [microsoft.com].

            but isn't the fact that video drivers are implemented in userland an architectural problem to begin with?

            The video drivers are in the kernel. The drawing and acceleration is in the display server. The toolkit is in the application. It's fast and it's robust. It's what NeXTStep and MacOSX do as well. Where is the "architectural" problem?

            Plus, the resources mechanism is absolutely byzantine and needs to be razed,

            Neither Gnome nor KDE use the X11 resource mechanism. They use something much more like Windows. That's actually a shame because the X11 resource mechanism is better.
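            For reference, the X11 resource mechanism under discussion is just pattern/value pairs matched against widget hierarchies; a minimal ~/.Xresources sketch (the widget names here are only examples):

```
! Loaded with: xrdb -merge ~/.Xresources
! A wildcard pattern themes every matching widget in every Xt app
XTerm*background:       black
XTerm*foreground:       grey90
! A fully qualified name targets one widget instance
xclock.clock.hourColor: navy
```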

            as well the complex distinctions between server and client (wait, who's the server, who's the client, who has the toolkit?, who's running the window manager? what the fsck is going on?).

            Windows, MacOS, and NeXTStep make the same distinction as X11: they have a low-level graphics and windowing component that runs in a display server, and they have a high level toolkit part that runs in a display client.

            Altogether, it looks to me like you have a rather outdated notion of what Windows, MacOS, and X11 are. Windows and MacOS have pretty much become like X11 architecturally; they simply lack the well-defined and efficient X11 protocol to support that architecture. On the other hand, X11 toolkits (for better or worse) have become much more like Windows and MacOS toolkits. All three of them have gotten direct rendering and 3D acceleration.
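To make the client/server split being argued about here concrete, here's a toy sketch (my own invention, not anything from the actual X sources) of a display-server protocol: the client serializes a fixed-size drawing request over a socket, the server owns the "framebuffer" and executes it, and the toolkit logic never touches the hardware directly.

```python
import socket
import struct
import threading

# Toy wire protocol: opcode (1 byte) + four uint16 args, loosely imitating
# the fixed-size request headers of the X11 protocol.
DRAW_RECT = 1

def server(sock):
    """The 'display server': owns the framebuffer, executes requests."""
    fb = set()  # pretend framebuffer: set of (x, y) pixels
    data = sock.recv(9)
    op, x, y, w, h = struct.unpack("!BHHHH", data)
    if op == DRAW_RECT:
        for px in range(x, x + w):
            for py in range(y, y + h):
                fb.add((px, py))
    sock.sendall(struct.pack("!I", len(fb)))  # reply: pixels touched

def client(sock):
    """The 'client': the toolkit lives here; it only speaks the protocol."""
    sock.sendall(struct.pack("!BHHHH", DRAW_RECT, 2, 3, 4, 5))
    (count,) = struct.unpack("!I", sock.recv(4))
    return count

s1, s2 = socket.socketpair()
t = threading.Thread(target=server, args=(s1,))
t.start()
pixels = client(s2)
t.join()
print(pixels)  # 4 * 5 = 20
```

The point of the sketch: nothing about this design dictates whether the server runs in the kernel (Windows GDI) or in userspace (X11); the client/server distinction is about the protocol boundary, not the privilege level.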

            • The video drivers are in the kernel. The drawing and acceleration is in the display server.
              >>>>>
              Except it's not. For almost all cards (NVIDIA and its kernel driver notwithstanding) the whole driver is in userspace, accessing the hardware via user-mapped I/O ports. This isn't the ideal situation, because for the absolute best performance you need some stuff in the kernel (which DRI does, but DRI support is rather limited and only covers 3D).

              The toolkit is in the application.
              >>>
              That's a crappy design. It's more flexible, but it's faster to have the toolkit server-side. That's why Qt (and GTK+) on X is slower than Qt on Windows. It basically uses X as a way to move bitmaps around the screen, which isn't the best (or fastest) way to use X. If the toolkit were in the server, communication between the client and server could be limited to a much higher-level (and thus low-bandwidth) protocol.

              As for Windows and MacOS becoming more like X, that's only half true. Windows has the GDI in the kernel, unlike X, where it's in userspace (personally, I think that's okay; I mean, networking is pretty big too, and that's in the kernel). Quartz is slow as hell, so that's a bad example. Either way, client/server archs are becoming more practical. Before, when basically everything was simple blits or pixel plotting, the latency of individual operations was critical. These days, with OpenGL serving as the support for the GUI (see Longhorn and Jaguar), clients have to package up commands anyway (vertex buffers, display lists, etc.), and each operation takes comparatively longer than a single PutPixel(). Thus, the latency of the communication isn't as much of a factor anymore.
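The batching argument above can be illustrated with a trivial sketch (numbers and batch size are made up for illustration): issuing one wire round trip per drawing call versus packaging commands into buffers, as vertex buffers and display lists do.

```python
# Count simulated round trips: one request per operation vs. batching,
# which is the point being made about command buffers above.
def draw_unbatched(ops):
    round_trips = 0
    for op in ops:
        round_trips += 1  # each PutPixel-style call crosses the wire alone
    return round_trips

def draw_batched(ops, batch_size=64):
    # package commands into buffers (like vertex buffers / display lists);
    # one round trip per full-or-partial batch (ceiling division)
    return -(-len(ops) // batch_size)

ops = ["plot"] * 1000
print(draw_unbatched(ops))  # 1000 round trips
print(draw_batched(ops))    # 16 round trips
```

With per-operation traffic, link latency is paid a thousand times; with batching, sixteen times. That's why, once clients batch commands anyway, the latency of a client/server boundary stops dominating.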
                • Except it's not. For almost all cards (NVIDIA and its kernel driver notwithstanding) the whole driver is in userspace, accessing the hardware via user-mapped I/O ports.

                That's what I said: the video driver (i.e., on Linux, the thing that does the mapping and switching) is in the kernel. The drawing and acceleration is in the display server, and for certain special applications, the application itself gets memory mapped into its address space.

                This isn't the ideal situation, because for the absolute best performance, you need some stuff in the kernel (which DRI does, but DRI support is rather limited and only for 3D).

                That's a crappy design. Its more flexible, but its faster to have the toolkit server-side. That's why Qt (and GTK+) on X is slower than Qt on Windows.

                Qt is slower on X11 than on Windows because Qt ignores most of the server-side facilities that X offers. The "crappy design" there is Qt, not X11, and it mostly means that the authors of Qt just didn't want to bother doing a high-quality X11 implementation: Windows apparently matters more to them. Furthermore, on Windows, the "toolkit" isn't server side either: the display server runs in the kernel, and Qt runs in user space.

                In fact, with the right toolkit, X11 is often faster than GDI. The reason is that X11 was designed to go through a bottleneck. GDI was designed assuming direct library calls and had to get retrofitted to work in a protected mode environment. Furthermore, X11 naturally takes advantage of multiple processors and graphics processors.

                Before, when basically everything was simple blits or pixel plotting, the latency of individual operations was critical.

                I'm not sure what you mean by "before". Windows was written that way. X11 had a client/server architecture from the start and has always worked well with it. Windows is the latecomer, and it still isn't very good at it.

                These days, with OpenGL serving as the support for the GUI (see Longhorn and Jaguar) clients have to package up command anyway (vertex buffers, display lists, etc) and each operation takes comparitvely longer than a single PutPixel(). Thus, the latency of the communication isn't as much of a factor anymore.

                As I was saying, X11 got it right from the start because X11 didn't assume that any program can just bash pixels in the frame buffer. Windows is playing catch-up.

                • That's what I said: the video driver (i.e., on Linux, the thing that does the mapping and switching)
                  >>>>>>>
                  Except it's not! The only part of the whole display system that's in the kernel is the part that alters the X server's TSS so it can use the I/O ports of the graphics hardware. This is unlike other OSes (even other ones with client/server archs), where an actual graphics driver that does stuff like initialization, handling interrupts, etc., runs in the kernel. The main weakness of X's approach is that because *no* part of the graphics driver is in the kernel, certain facilities of the card cannot be taken advantage of. DRI (and NVIDIA's kernel driver) fix this by putting a little bit of the graphics driver in the kernel, but both are rather limited in the range of hardware they support, and only really affect 3D operation.

                  Qt is slower on X11 than on Windows because Qt ignores most of the server-side facilities that X offers. The "crappy design" there is Qt, not X11, and it mostly means that the authors of Qt just didn't want to bother doing a high-quality X11 implementation: Windows apparently matters more to them. Furthermore, on Windows, the "toolkit" isn't server side either: the display server runs in the kernel, and Qt runs in user space.
                  >>>>>>>
                  Hmm. If that's the case, then X's design is so borked that implementing toolkit functionality server-side is too difficult to get working. There is no major toolkit that puts an appreciable amount of code server-side. This is one of the big things Berlin is trying to solve. As for the Windows case, it's hard to tell. The GDI spec is just the call interface to gdi32.dll; how much of the GDI is implemented in userspace is uncertain. It is entirely possible that gdi32.dll maps the graphics driver into the application and implements accelerated drawing in userspace. BeOS had an API that worked this way, btw.

                  I'm not sure what you mean by "before". Windows was written that way. X11 had a client/server architecture from the start and has always worked well with it. Windows is the latecomer, and it still isn't very good at it.
                  >>>>>>>
                  No, I should have said "in the past." I'm talking about the standard WIMPs that have been around for years, basically moving bitmaps around the screen and drawing lines and pixels. Now the communication latency isn't as important anymore, because a lot of overhead goes into packaging objects for the API to begin with.

                  As I was saying, X11 got it right from the start because X11 didn't assume that any program can just bash pixels in the frame buffer. Windows is playing catch-up
                  >>>>>>>
                  Oh please. The people who designed X never had any clue that 3D hardware would eventually come to their rescue and render the terrible latency in the interface a moot issue. Otherwise, they would have provisioned the system with something like DRI to begin with. Windows is not playing catch-up at all in this case. Windows did it right the first time: it specified the drawing API as a set of procedures exported by gdi.dll, nothing more. They've got the freedom to implement their graphics engine however they bloody want to, without being restricted to a 20-year-old protocol like X. And no, extensions don't help, because an app must be specifically compiled to use them. Extensions violate every principle of OS transparency out there. Take a look at DirectX for an interface that got backwards compatibility correct: just code your apps to the interface, and generations of hardware can go by and your app will automatically support the new advances.
        • They still expect you to know your monitor's sync frequencies, which is
          absolutely unreasonable.


          Why do they require you to know your monitor's sync frequencies, and how does Windows get away with not requiring people to know them?
  • Hmm.. (Score:3, Insightful)

    by josh crawley (537561) on Tuesday May 14, 2002 @02:03AM (#3515218)
    Is anybody thinking that this might be on the "Top Ten Vapourware of the year"?

    In other news, did I mention that I'm building a program for Linux that can eliminate ALL of those nasty unresolved dependencies? It works for DEBs, Slackware tarballs, and RPMs. It automagically scans and can determine what the developer really means when he puts the program names and versions in the RPMs. REALLY! I'M SERIOUS! (cough)
  • we'll have BeOS and Carbon API support under normal real world linux use [with X11]. After all, if he's using the linux kernel, he's just writing another user mode GUI layer, like X. It should be relatively trivial to modify the code so that instead of directly talking to the framebuffer, it opens up in a managed window under X, [like with wine].

    Why do I care? Because I like X, and I'm certainly not about to want to give that up that to run other neat apps that have been targeted to BeOS and carbon API. But if it's already on linux, well, nifty! [Once it's out of pre-alpha, of course]...
    • Well, you're not going to be able to run Mac binaries, just because of Carbon support--if that's what you think. If nothing else, Macs are big endian while PCs are little (or is it the other way around?).

      Even source compatibility will be tough, given the huge differences between the design of MacOS and Linux.
      • The reason you will probably be unable to run Carbon binaries is simply that those binaries are targeted at a PPC processor, not an x86 processor.

        As for source compatibility, the design difference between Linux and classic Mac systems is roughly the same as the difference between Mac OS 9 and Mac OS X. Carbon was designed to be implementable on a Unix core. If the Carbon layer is correctly implemented, then it should work.
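On the endianness question raised above: PPC Macs are big-endian and x86 PCs are little-endian, so the same 32-bit value is laid out in memory in reversed byte order. A quick illustrative sketch (the value is arbitrary):

```python
import struct

# A 32-bit value as a PPC Mac (big-endian) vs. an x86 PC (little-endian)
# would lay it out in memory: same number, reversed byte order.
value = 0x0A0B0C0D
big = struct.pack(">I", value)     # big-endian layout
little = struct.pack("<I", value)  # little-endian layout
print(big.hex())     # 0a0b0c0d
print(little.hex())  # 0d0c0b0a

# Naively reading big-endian bytes on a little-endian machine gives a
# completely different number, which is why binary compatibility fails:
print(hex(struct.unpack("<I", big)[0]))  # 0xd0c0b0a
```

This is exactly why byte-for-byte binary compatibility is off the table regardless of how faithful the Carbon API layer is; only source compatibility (recompiling for the host CPU) is in play.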

  • and it will be: a) pointless b) vaporware
  • ...but of course I'm going to be:


    Doesn't this sound a bit like Windows 3.1 being called an 'Operating System'?


    This is a GUI that supposedly will support the API's of multiple platforms. Methinks it won't be all that easy in the end. (think ardi with their executor trying to copy *just* the Mac API...yes, I *did* buy a copy).

    I don't see how this could be even remotely possible, unless linux-like numbers of developers jumped in.

  • At least now all of those people who bitch about X have something better to play with than SVGAlib. Personally I like X, but I'm probably biased since I like to run stuff across networks. Something like this may actually work on a large scale if Qt and GTK support is added. How many apps actually talk to X directly?
  • by tbien (28401)
    First of all, Carbon is the C-based MacOS X API derived from the old Toolbox functions of MacOS 9 and earlier. It has nothing to do with Cocoa, which is an Objective-C API based on OpenStep.

    GNUstep is the free (as in GPL) implementation of the OpenStep specification.
  • In case you're wondering why Bill doesn't like X11, he explains this on the Cosmoe mailing list. Basically Bill encountered the bug in Qt 2.x that means that you can't cut and paste between most KDE applications and most other apps.

    So what did he do? Did he report the bug to the Trolls, or at least to KDE's devel team? No. He decided to create a new OS based on AtheOS, BeOS, Linux, and anything else that creates hype. The only thing missing so far is a promise to build a next-generation Amiga.

    Oh yeah, and Bill thinks X is ugly. But, he also thinks AtheOS and Cosmoe are ugly. He proposes to fix this by making Cosmoe look different, which will be very hard, rather than making X look different which is easy but doesn't get you on Slashdot.

    Meanwhile every distro worth talking about either includes KDE 3.x or plans to do so very soon. So the bug that annoyed Bill is gone.
