W3C Releases Drafts For DOM L2 And More

TobiasSodergren writes "People at W3C seem to have had a busy Friday, according to their website. They have released no less than four working drafts (the Web Ontology Language (OWL) Guide and the QA Working Group's Introduction, Process and Operational Guidelines, and Specification Guidelines) and two proposed recommendations: XML-Signature XPath Filter 2.0 and HTML DOM 2. Does this mean that one can expect browsers to behave in a predictable manner when playing around with HTML documents? Hope is the last thing to leave optimistic people, right?"
  • by Anonymous Coward on Sunday November 10, 2002 @01:44AM (#4635823)
    Who needs more than h1, b, and i tags for documents?
  • doesn't matter... (Score:3, Insightful)

    by adamb0mb ( 68142 ) on Sunday November 10, 2002 @01:44AM (#4635824) Homepage Journal
    doesn't matter how many standards the W3C sets, MS is never going to follow them. They'll just set their own standards, and those will become the de facto standards... it's rough, but that's the way it is...
    • Re:doesn't matter... (Score:3, Interesting)

      by Goalie_Ca ( 584234 )
      I agree exactly with that. How many standards does IE 6 actually adhere to? One of my professors actually uses Microsoft Office (or some other MS product) to make the website and its components, and it is a pain in the ass to access unless I'm using IE 6. In fact, I was using Mozilla and ended up missing 6 pages from a document. I don't see why or how MS needs to break standards other than for their own agenda. If they do set their own standards, it should be something the whole world can agree upon. Communication technologies should all follow standard protocols!
      • by ender81b ( 520454 ) <wdinger@g m a il.com> on Sunday November 10, 2002 @02:36AM (#4635952) Homepage Journal
        Ok... you tripped rant mode.

        I work in a student computer lab for a fairly large university, about 28,000 students. You wouldn't *believe* the problems I have to deal with because of stupid, and I stress stupid, professors using stuff like MS Word/PowerPoint for their class notes and webpages.

        I'll give you a few examples. PowerPoint is the most common format for posting class notes. All good and fine, because thanks to OpenOffice even a Linux box can read PowerPoint slides just fine. The problem is printing them. Since we have only dot-matrix printers (long story...), if the professor uses too weird a color scheme the slides don't print worth a damn, even with the 'print only black/white' option checked. Problem #1.

        The bigger problem is when they use MS Word to post syllabi, notes, etc. Students have a problem viewing them at home for whatever reason (most likely they are using an old version of Word) and have to come back to campus to look at this stuff. It is insane. I always direct them to install OpenOffice, but sometimes they might only have a modem, so it isn't really an option. And if you talk to these professors about only posting stuff in MS Word, they get defensive and say things like 'everyone uses it' and other lines to that effect. Pointing out that just clicking 'save as rich text format' would cover 99% of the stuff they publish simply doesn't work. Sigh. It is becoming a real problem. Same with webpages - what standards? Microsoft is a standard; I'm sure this would work fine if you would use a *Microsoft* browser, etc, etc.

        Not that all professors are dumb; a lot use things like rich text format and try to stay away from Word, but a lot don't. It is a major headache for some students, and for me. And don't even get me started on how IE handles Word documents - it has the nasty tendency to embed them within the current frame, which causes havoc with printing, saving, etc - at least for your average student.

        Seriously, more teachers need to be educated on things like open formats. For instance, it wouldn't be that hard to develop a campus-wide XML format and a nice little front-end for making syllabi, class notes, outlines, etc. available to all faculty. That way you could ensure that everyone had equal access to the documents instead of forcing students to use MS products.
        • Well, if your university wants to have something like you suggest, they need to do it. They need to implement a campus-wide system that all professors have access to. They then need to make training either available or possibly mandatory on how to use the system. If they really want it to take off, they need to mandate its use.

          However, I really don't feel much sympathy for students, as I don't see professors using MS Office, or whatever else they like, as a problem. There is always the simple option of attending class and picking up the hardcopy when it is passed out. Indeed, many classes I have taken have no website at all, and it is your responsibility to attend class and get your information that way.

          Also, all the universities I have seen do at least a passable job (and usually much better) of providing computer facilities in places like the main library. It is not hard to go to the library and print what you need.

          If you want to mandate that professors all must use a given system for their websites, fine, but you'd better be prepared to make sure it works WELL and to provide any and all necessary support for them to use it. Otherwise, they need to be allowed to use what they like.
        • At university I found you could usually tell how good a lecturer would be by the material used for slides. Those using LaTeX and its slides package usually had the most interesting courses (if more difficult); those with wordprocessors in the middle; PowerPoint usually meant fairly fluffy. There were exceptions and it wasn't a perfect correlation, but it was certainly a factor in choosing what course to take.
        • How about a VBA macro to translate MS Word into HTML? Sure, there'd be some fidelity loss, and getting the professors to use it would be like pushing Jell-O up a hill with a toothpick, but it would be something...
          • I wouldn't even want to read what that HTML would look like afterwards. Can't they just write their documents in XHTML 1.0 and CSS 2? Once they had the original document done, their style would (should) be the same for each presentation.
      • IE6 W3 support (Score:5, Interesting)

        by Cardinal ( 311 ) on Sunday November 10, 2002 @03:40AM (#4636073)
        Actually, IE6 does a decent job. Their DOM1 support is good, their CSS1 is more or less complete, but their CSS2 is pretty crappy. Fixed positioning doesn't work, selectors [w3.org] like E[attr] are missing, etc.

        Lately I've been working on an app for a company's internal use, which means the delightful situation of being able to dictate minimum browser requirements. As a result, the app is designed for IE6/Mozilla. All development has been in Mozilla, and a lot of DOM use goes on. And it all works in IE6, no browser checking anywhere. My only regret is that I can't make use of the more advanced selectors provided by CSS2, so the HTML has a few more class attributes than it would need otherwise. But, overall, not bad.
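
        For what it's worth, here's a minimal sketch of the kind of plain DOM code that behaves the same in IE6 and Mozilla (the function and element names are invented), with a class attribute standing in for the CSS2 E[attr]-style selectors IE6 lacks:

          // Append a status line using only DOM calls both browsers support.
          function addStatus(text) {
              var p = document.createElement("p");
              p.className = "status"; // a plain class, since IE6 won't match attribute selectors
              p.appendChild(document.createTextNode(text));
              document.getElementById("log").appendChild(p); // "log" is a hypothetical container
          }
          addStatus("Record saved.");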

        Another positive note, IE6 SP1 finally supports XHTML sent as text/xml. So at last, XHTML documents can be sent with the proper mime type [hixie.ch].

        So despite being a Mozilla (Galeon) user, as a web developer who makes heavy use of modern standards, I look forward to seeing IE continue to catch up to Mozilla so that I can worry even less about browser-specific issues.
        • as a web developer who makes heavy use of modern standards, I look forward to seeing IE continue to catch up to Mozilla so that I can worry even less about browser-specific issues.

          Ah, yes, the 'not having to worry about browser-specific issues' notion. You haven't exactly been a web dev long enough, have you? (:P)

          This is _exactly_ what we thought HTML 3.2 would turn out to be... and look at how well that worked!

          And anyway, if it isn't W3C standards, it's resolution, colors (although that's fixed now... sort of), etc.
        • Another positive note, IE6 SP1 finally supports XHTML sent as text/xml.

          Great, except XHTML is supposed to be served as application/xhtml+xml, which IE6 SP1 still wants to download rather than display.

          I guess text/xml is one step closer, though... assuming it works properly.
        • Another positive note, IE6 SP1 finally supports XHTML sent as text/xml.

          How did you get text/xml to work in IE? When I try it, I get a collapsible tree view of the document's source code.

      • by mijok ( 603178 )
        In case you haven't noticed, MS benefits enormously from breaking standards and creating their own. To the average user, MS standards are the only standards, and since OpenOffice, Mozilla etc. can't implement .doc and their HTML 100% correctly, it makes them look bad, i.e. "that must be crap, my homepage looked good in IE".
    • I don't think that's necessarily true. It's a given that Microsoft's track record on standards compliance has been exceptionally poor relative to Mozilla and other similar efforts, but "MS HTML" is somewhat closer to the W3C's standards now than it used to be.

      Also, while IE is the most popular browser, it's not the only one, and a not insignificant proportion of the population uses Mozilla, Opera, and other browsers. Somewhat hypocritical of me, since I'm currently using IE on my Windows partition, as opposed to Mozilla on my FreeBSD partition, but on purely technical merits, IE isn't really the best browser, and the optimist in me is convinced that the greater portion of the online population will eventually go for the better solution. On the other hand, if they don't, why should we worry about it? The proletariat can do as they please. So long as "MS HTML" doesn't somehow become entirely proprietary, we retain the ability to access it, plus we get to view properly-rendered pages. Whee.

      Don't forget, either, that Microsoft actually is a member [w3.org] of the W3C. Microsoft can be accused of many things, but blatantly violating one's own standards is a rather stupid thing to do.
  • No. (Score:5, Insightful)

    by Trusty Penfold ( 615679 ) <jon_edwards@spanners4us.com> on Sunday November 10, 2002 @01:48AM (#4635836) Journal
    Does this mean that one can expect browsers to behave in a predictable manner

    When there was 1 standard (HTML), browsers didn't behave predictably.

    Now there are more, so there is more scope for implementations to have their quirks, not less.

    Standards are large and complicated descriptions of expected behaviour. Each implementor may have a slightly different interpretation. Different implementations will have their strengths and weaknesses which make different parts of the standard easier or harder to implement fully and/or correctly. There may even be reasons why an implementor may choose to ignore part of a standard (perhaps it is difficult and he believes that users don't want or need that functionality yet).

    Unfortunately, standards are an ideal to aim for, not a description of reality.
  • C++ XML API (Score:4, Interesting)

    by be-fan ( 61476 ) on Sunday November 10, 2002 @01:53AM (#4635856)
    I've been looking around for a nice, simple C++ API for XML parsing, and I've yet to find one. Java and Perl both have clean, native-feeling XML APIs (JDOM and XML::Simple), but so far the only C++ ones I've found map closely to DOM's overly complicated object model and don't "feel" like C++ libraries (they don't use the STL and whatnot). Anybody know of a library along the lines of JDOM, but for C++?
    • Re:C++ XML API (Score:3, Informative)

      by sporty ( 27564 )
      Have you tried the Xalan type stuff? http://xml.apache.org [apache.org]
      • Xerces C++ is a very good XML parser, but it's for really heavy duty stuff, not at all like JDOM or XML::Simple. Plus, the API is almost 1:1 to the DOM API, and isn't very C++ at all. From the Xerces C++ page:

        "For portability, care has been taken to make minimal use of templates, no RTTI, no C++ namespaces and minimal use of #ifdefs."

        The API is basically C with classes, uses XMLChar * instead of std::string, etc. I'm looking for something more along the lines of the Boost or Loki libraries in that they integrate cleanly with the STL.

        Let me use JDOM and XML::Simple as examples. They both simplify the (IMHO too complex) DOM model, as well as fitting closely to the language. JDOM, for example, uses standard Java strings and containers, while XML::Simple uses Perl associative arrays.
    • I haven't used it yet, but looking at Arabica [jezuk.co.uk] is on my todo list. No STL integration, but it does deliver std::string or std::wstring.
    • Re:C++ XML API (Score:5, Informative)

      by KidSock ( 150684 ) on Sunday November 10, 2002 @03:31AM (#4636052)
      I've been looking around for a nice, simple C++ API for XML parsing, and I've yet to find one. Java and Perl both have clean, native-feeling XML APIs (JDOM and XML::Simple), but so far the only C++ ones I've found map closely to DOM's overly complicated object model and don't "feel" like C++ libraries (they don't use the STL and whatnot). Anybody know of a library along the lines of JDOM, but for C++?

      Someone posted a neat little class to the expat mailing list ~2 years ago. Basically it was just a Node class with an STL list for children and a hashmap for attributes. It was very small, clean, and was in essence a DOM. It used expat, but trust me, the code was so tiny you could use any parser with it. It was like 200 lines of code.

      I liked it so much I created the same thing in C called domnode [eskimo.com].

      Search the expat archives [sourceforge.net]. Wish I could give you more to go on.
    • It's probably not what you want but FleXML [sourceforge.net] is a very fast way of parsing XML that conforms to a particular DTD. It's like lex and yacc - is that C++-like enough?

      I completely agree about all the weird reinvent-the-wheel stuff that DOM and similar libraries contain: it would be so much better if they could use the STL in C++ and native data structures in other languages (nested lists in Lisp, etc etc). It's just that a basic function call interface is the lowest common denominator, so if you want the same library on every language you have to invent a whole new list and tree API. Perhaps this is an indication that the same library on every different language isn't such a good idea. (Think of the Mozilla debate: 'the same on every platform' versus 'native on every platform'. I have a feeling that in programming languages as well as GUIs the second choice is better.)
    • I ran into this problem recently at work. I am developing for an app server in java, and we have decided to have it run through XML. This way we can have our powerbuilder gui app and our java servlet website (using xslt) use the same appserver. We tried using Apache's jxpath [apache.org], but I found it too limited in its support for xpath.

      Instead I implemented my own JDOM-like system that uses XPath to find nodes in a document, using Xalan's [apache.org] XPath API. This gives me the flexibility of XPath and the usefulness of a DOM-like XML API. I was thinking of porting it to C++ for use at home.

  • Standards (Score:2, Interesting)

    Web standards set by the W3C have little meaning right now. Standards are controlled by marketshare, and Internet Explorer has been the leading browser for at least a couple of years. Surely Mozilla and Opera will follow these standards, as they always have, but will IE do the same?

    Perhaps it's time we stopped sitting on our thumbs and complaining about Microsoft ignoring standards. An outright ban of IE is needed - from workplaces, schools, etc. Sites should block access to people using IE. This is the only way we can get our rights to web standards back!

    Seriously though, does anyone have any ideas on how we can take control of web standards away from MS ?
    • Seriously though, does anyone have any ideas on how we can take control of web standards away from MS ?

      I remember a slashdot link [slashdot.org] somewhere mentioning something about IE getting eliminated due to some sort of plugin junk?
      • As hypocritical as this may seem (we don't like vast patents, but we don't mind when they apply to Microsoft), I think it's our only hope. I don't see this company as one that just wants to get out there and earn money; I see them as one that doesn't like Microsoft but wouldn't mind earning some money on the side. If somebody could get them to exclude Microsoft from using the technology outright until they followed some rules (standardization, etc.), we could really give Microsoft a run for their money. Sadly enough, it seems too perfect, and I predict the company will eventually bow and just accept a nice fat check.
        • Whoops... sorry for the constant spelling mistakes. I *really* think Slashdot should implement some kind of editing system. Say, you can edit a comment for five minutes after you posted, as long as nobody has replied, and you get a little link that says you edited and links to the original version.


          Ugh...
          Slashdot requires you to wait 2 minutes between each successful posting of a comment to allow everyone a fair chance at posting a comment. It's been 1 minute since you last successfully posted a comment.
          Note: chances are, you're behind a firewall, or proxy, or clicked the Back button to accidentally reuse a form. We know about those kinds of errors. But if you think you shouldn't be getting this error, feel free to file a bug report, telling us: your browser type; your userid "614145"; what steps caused this error; whether you used the Back button on your browser; whether or not you know your ISP to be using a proxy, or any sort of service that gives you an IP that others are using simultaneously; and how many posts to this form you successfully submitted during the day. Please set the Category to "Formkeys." Thank you.
    • Perhaps it's time we stopped sitting on our thumbs and complaining about Microsoft ignoring standards. An outright ban of IE is needed - from workplaces, schools, etc. Sites should block access to people using IE. This is the only way we can get our rights to web standards back!

      Y'know, in a perfect world, I'd wholeheartedly agree with you. Is it a perfect world? Hence, the diatribe.

      Seriously though, does anyone have any ideas on how we can take control of web standards away from MS ?

      Ooops, sorry. Cancel diatribe... ;) Seriously, I don't think we as a community can really do anything substantial to Microsoft, since they don't want to listen to us anyway. Advocacy is about the only weapon we have, unless you come up with the "next killer app" that everyone needs and exclude any browser that doesn't follow the W3C standards you espouse. When you do that, you can set terms. Until then, we're just a bunch of Don Quixotes, tilting at windmills.

      Sorry for the dose of reality.

      Soko
    • Seriously though, does anyone have any ideas on how we can take control of web standards away from MS ?

      Why bother? Have you taken a look at these standards recently? They're huge and unwieldy. Perhaps a more attainable goal is to develop the next generation of browsers - a blank context for multimedia rendering as directed by the server-side script. Sort of a Shockwave Flash as a native platform.
    • Re:Standards (Score:4, Informative)

      by eddy the lip ( 20794 ) on Sunday November 10, 2002 @02:30AM (#4635940)

      Some days I'm more optimistic. Today's one of those days (tomorrow may not be, 'cause I'm digging deeper into IE's weird-ass DOM than I usually care to). But...

      Most web developers that have been around for a while would rather code to standards than to marketshare. Standards give you the promise of backward, and more importantly, forward, compatibility. It's also a helluva lot easier to sort out your code when a client asks for a redesign in a year or two if you've been conscious of more than just "making it look right" in the popular browser of the day.

      Markup designed for IE only often does truly evil things on other platforms - and there are going to be more cellphones and PDAs accessing web pages, not fewer. There are also serious organizational advantages to coding to standards - more tools for handling your pages, it's easier to whip up a quick Perl script to process standards-compliant HTML... the list of advantages is long.

      Just like any other field, there's a trickle-down effect. Not everyone will write good, W3C compliant code, but more will, more often. And despite their megalithic, feudal mentality, Microsoft will have to pay attention. IE6 is still a long ways away from adhering to standards, but it's much, much closer than IE4 was. This seems to have been in large part a reaction to developers bitching about their lack of compliance. I'm hopeful the trend will continue.

    • Re:Standards (Score:3, Interesting)

      by frawaradaR ( 591488 )
      Yeah, some really popular sites (like Slashdot) need to use standards compliant code and not cover for browser bugs. Wired recently went XHTML and CSS2. This is the way to go. If a browser can't render it, file a bug. If it doesn't work in IE, too bad!

      My own homepage doesn't render in anything but Mozilla, currently, but small, personal sites aren't gonna make or break anything (unless they come in the millions, which is unlikely).

      The people at Mozilla have provided us with a tool with 99% perfect rendering. Now it is up to the web site maintainers to actually enforce the use of Mozilla (or any other browser that fully adheres to standards; there is no other currently).

      But Slashdot won't take this upon its shoulders, because it doesn't believe in standards, just like M$.

      So M$ wins.
      • Re:Standards (Score:4, Informative)

        by whereiswaldo ( 459052 ) on Sunday November 10, 2002 @03:45AM (#4636079) Journal
        If a browser can't render it, file a bug. If it doesn't work in IE, too bad!

        Many sites can get away with this, but many cannot. If I'm selling a product on the web, I'll make darn sure that 99% of my customer's browsers work with my site. It's a good ideal to say "fix your IE bugs", but often not realistic.
    • Re:Standards (Score:2, Interesting)

      Surely Mozilla and Opera will follow these standards, as they always have, but will IE do the same?

      That depends quite a lot on your definition of ALWAYS as it applies to Mozilla... considering Mozilla was originally based on the Netscape source code (though I realize it has by now been virtually completely rewritten). People seem to forget that Netscape were the kings of non-standard HTML as an attempt to "lock in" customers. Hell, IE still to this day includes Mozilla in its user agent header to work around all the sites that would deny access to anything other than Netscape, back in the 2.0 era.

      • Hell, IE still to this day includes Mozilla in its user agent header to work around all the sites that would deny access to anything other than Netscape, back in the 2.0 era.

        At this I am very surprised. It's Microsoft's style to turn around and bite people in the ass when they have the upper hand. I wonder why MS hasn't "forced" Netscape only sites to change by updating their agent header?
  • My hope is that they eliminate JavaScript from all web browsers - not just the ability to turn JavaScript off... ELIMINATE IT. I have no need for it, and I don't go to any webpages that are reliant on it. The last thing I need is some sleazy company (like maybe Microsoft or something) popping up an annoying advertisement on my desktop.
    • JavaScript is good for helping forms behave in certain ways, like allowing you to change how a drop-down list works, etc. JavaScript is definitely needed to add to the functionality of HTML forms; other than that, I agree with you totally.
    • by Cheese Cracker ( 615402 ) on Sunday November 10, 2002 @02:24AM (#4635932)
      JavaScript is good for many things, like eliminating round trips to the server for basic input checks, making HTML documents smaller (and thereby faster to transmit), dynamically creating HTML in a frame, etc. Other people can probably give you more examples.
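
      To make the input-check case concrete, a minimal sketch (the form and field names are invented; the server still has to validate, since JavaScript can be switched off):

        // Do a basic sanity check client-side, saving a round trip to the server.
        function checkOrderForm(form) {
            var qty = parseInt(form.quantity.value, 10);
            if (isNaN(qty) || qty < 1) {
                alert("Please enter a quantity of 1 or more.");
                return false; // block the submit; no server round trip needed
            }
            return true;
        }
        // Hooked up as: <form onsubmit="return checkOrderForm(this);"> ... </form>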

      If you got a problem with popup ads, then please download the Opera browser [opera.com]... you'll find F12 to be your best friend. ;)

      If you really want to crusade against something, then VBScript is a better candidate - or why not Outlook... the worst virus spreading software ever created.
      • Outlook... the worst virus spreading software ever created.

        that reminds me: since I do not use Outlook/Express for e-mail (I use Mozilla at work and Opera's stuff at home), I just set my address list to use public addresses @ microsoft.com. That way, if for some reason (someone else in the family ignores one of the computer commandments and opens some virus in an attachment) it simply sends the crap to Microsoft and no one else.

        Junk snail mail is also handled by removing the postage-paid self-addressed envelope, filling it with metal scraps and placing it in the mail (the receiver is charged for the postage) - make the spammers/virus enablers pay whenever you can.
      • Those are all useful things to do. The problem is with how JavaScript does them. For example, for making HTML documents smaller, a client-side macro facility would be more reliable, more efficient, and simpler. For doing input checks, a pattern language would be better. And on and on.

        If JavaScript (by which I mean JavaScript, DOM, DHTML, etc.) were a simple, if limited, solution to those problems, it would be OK. But it isn't. It is much more complicated than technically better solutions, yet it still is extremely limited.

        Simple and limited, and complex and powerful are both acceptable engineering tradeoffs. But complex and limited and buggy is a bad engineering tradeoff. And that's JavaScript.

    • I use it to help cache my site.
      The banner rotation is via js so that the main page can be cached.
      (but not annoying pop-up/unders - some of us realise they are a detraction).
      Our banners don't link to any external sites.
      The banner is part of the web frame of reference.

      We have over 500 pages of content so I'm sure you'll excuse us our right to present deep links on our main page.

    • Or are you really demanding we all take a nice big step backwards and remove the capacity for client side scripting because you're a caveman and can't understand what it's used for?

      Do you think javascript == popup windows? The open-window call is abused, and I'd like to see the spec implement some kind of suggested behaviour along the lines of disregarding popups that aren't user-activated (Mozilla already does a great job of this, but making it part of the spec would be superior), but losing client-based scripting would be a blow to the usability of the Internet and to the palette of web designers trying to make intelligent sites.

      Client-side form validation, adapting pages, and heck, even silly stuff like graphical rollovers (which you can't do in CSS yet) are all things the Internet benefits from. Only an idiot would fail to anticipate how their page would work for users who don't have JavaScript turned on, but it can make the experience run that much more nicely and efficiently.
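
      A sketch of the rollover case (the image and element names are invented); written this way, a browser without JavaScript simply shows the static image, which is exactly the graceful fallback being described:

        // Swap a nav button's image on mouseover; degrades to a plain image link.
        function swap(imgName, src) {
            if (document.images) {              // object detection, not browser sniffing
                document.images[imgName].src = src;
            }
        }
        // <a href="/news/" onmouseover="swap('navNews', 'news_on.gif')"
        //                  onmouseout="swap('navNews', 'news_off.gif')">
        //   <img name="navNews" src="news_off.gif" alt="News"></a>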

      Not to mention that nuking JavaScript, an open, standards-based, accessible language, will simply promote the use of obnoxious proprietary technology like Flash.
  • The W3C is a joke (Score:2, Insightful)

    by Anonymous Coward
    What good is a standard if you never hold anyone's feet to the fire if they don't support it? If developers never have any incentive to actually get it right? If the standards are so vague that it allows for interpretations that can be so drastically different that the standard becomes useless?

    Has any company yet written a complete CSS1 implementation? A complete working version of DOM0? Yet here we are toiling away on XHTML and CSS3(!) and DOM Level 2. And they don't even seem to give a rat's ass if anyone actually follows the rules.

    From what I hear about CSS3, it's going to be such a massive specification that no company (save Microsoft, if they actually gave a damn) would possibly be able to implement it.

    What are we doing? The W3C puts out specifications that by the year become less and less relevant, because their possible implementation date grows ever more remote. We'll see CSS3 arrive, but will we ever see it in action? Or will it be supplanted by CSS4 and 5, which we will also never see? In the meantime we see developers actually building websites entirely out of Flash, because there's one reference implementation (one version, period) and it just works. Is that the future we want?

    It's time to hold these clowns accountable. Make them do some real work: make them create a working version of their spec. Make them brand one developer's work as a reference. Make them do something to prove that these standards are more than just empty clouds of words!
    • Re:The W3C is a joke (Score:3, Informative)

      by Anonymous Coward
      Has any company yet written a complete CSS1 implementation?
      Yes. Mozilla. Got most of CSS2 as well.
      A complete working version of DOM0?
      Once again, Mozilla. Also supports DOM1. Oh, and most of DOM2. See the Mozilla DOM Support [mozilla.org] doc for the details.
      Yet here we are toiling away on XHTML and CSS3(!) and DOM Level 2. And they don't even seem to give a rat's ass if anyone actually follows the rules.
      Good job the Mozilla developers care then. Mozilla supports XHTML and some CSS3 (see below) and DOM2 (see above).
      From what I hear about CSS3, it's going to be such a massive specification that no company (save Microsoft, if they actually gave a damn) would possibly be able to implement it.
      Mozilla implements bits of it, mainly as vendor-specific extensions. No, that's not the same as proprietary. Vendor specific extensions are allowed by the spec if implemented correctly e.g. properties should be prefixed with -vendorname- (Mozilla uses -moz-).
      • Mozilla supports XHTML and some CSS3 (see below) and DOM2 (see above).

        Unfortunately, Mozilla does not support DOM 2 HTML in XHTML... and probably never will, because the bug assignee doesn't seem to care about this rather crucial bug.

        Btw, DOM 0 is not a standard, but a collection of common garbage from the old days. It is supported in Mozilla only for backward compatibility, and people shouldn't use it in design. Mozilla explicitly does not support IE and NN4 only stuff such as document.all and document.layers.

    • It's better for the W3C to release specifications early on than to have each vendor wait for them and, in the meantime, develop their own proprietary solutions. Browser developers are much less likely to roll their own proprietary specifications if they can just read a W3C one and worry only about the implementation details.
      • The vendors are heavily represented in the W3C, and the discussions are open. Anyone can start implementing the specs way before they're final. It's not like people are waiting until the W3C releases the specs before starting to work on implementations. Much of the discussions in the W3C are based on what issues pop up as people try to implement various aspects of working drafts.
    • Re:The W3C is a joke (Score:4, Informative)

      by IamTheRealMike ( 537420 ) on Sunday November 10, 2002 @09:15AM (#4636632)
      What good is a standard if you never hold anyone's feet to the fire if they don't support it? If developers never have any incentive to actually get it right? If the standards are so vague that it allows for interpretations that can be so drastically different that the standard becomes useless?

      You have to have standards. The W3C are the people who are widely recognized as being the technical lead for the net. Now they don't make law, quite right, but if there was no W3C then Microsoft really WOULD own the web: as it is, we can and do take them to task when they break the rules. They can ignore us of course, yet whaddaya know but IE6 supports DOM/CSS Level 1. Not a particularly impressive achievement, but it's a start.

      The standards are actually very precise, which is one reason they are seen as being very large. There is hardly any room for interpretation in stuff like the DOM, CSS, XML etc. Of course, sometimes when the internal architecture of IE mandates it, Microsoft simply ignores things - the mime-type issue being a good example, but also the fact that you have to specify node.className = "class" to set the style on a new element, as opposed to setting the class attribute (which works fine in Mozilla). Why? Because (according to an MS developer) internally the MS DOM is based on object-model attributes, so that's what you have to set.
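
      To make that concrete, a small sketch of the difference (the element is illustrative):

        var div = document.createElement("div");
        // Works in Mozilla; per the above, IE ignores it for styling purposes:
        div.setAttribute("class", "warning");
        // Works in both, because it goes through the object-model property:
        div.className = "warning";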

      Has any company yet written a complete CSS1 implementation? A complete working version of DOM0? Yet here we are toiling away on XHTML and CSS3(!) and DOM Level 2. And they don't even seem to give a rat's ass if anyone actually follows the rules.

      [sigh] Yes. Mozilla supports DOM and CSS Level 2, and it has partial support for Level 3 now. Level 0 is the term used to refer to the pre-standardized technologies; it doesn't actually exist as a standard, so EVERY browser that can script web pages has a level zero DOM. It should be noted that TBL himself has stepped in on occasion to tick off Microsoft about stuff like browser blocks, bad HTML etc.

      From what I hear about CSS3, it's going to be such a massive specification that no company (save Microsoft, if they actually gave a damn) would possibly be able to implement it.

      Then you hear wrong.

      In the meantime we see developers actually building websites entirely out of Flash because there's one reference implementation (one version, period) and it just works. Is that the future we want?

      Developers do not build web pages out of flash. Marketing departments do. Luckily most web pages are not built by marketing.

      It's time to hold these clowns accountable. Make them do some real work: make them create a working version of their spec.

      Poor troll. The W3C already implement all their standards: go to w3.org and download Amaya. Nobody uses it for actually browsing the web, but there it is - proof that an actually very small organization with very few coders can implement these standards.

    • W3C's idea of a web browser. Hey, it already supports some CSS2 features!

      Amaya [w3.org]

      I'm all for standards, but they should have a basis in reality (read: working implementations) and not be some committee's idea of a good idea.

  • DOM not HTML (Score:3, Informative)

    by krokodil ( 110356 ) on Sunday November 10, 2002 @02:14AM (#4635907) Homepage
    Does this mean that one can expect browsers to behave in a predictable manner when playing around with HTML documents?


    You seem to be confusing the DOM with the HTML standard. The DOM does not enforce HTML document structure; it is just an OO representation of HTML and XHTML documents.

    • DOM can be used to "play around" with HTML documents, after they have been loaded by the browser.

      I seem to recall some web site using Javascript to expand and collapse discussion threads. Think it was kuro5hin [kuro5hin.org]. I'm not sure if it's using DOM to do that, but that is the sort of thing you can do with DOM.
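
      Something like this sketch is all it takes (the ids are invented); the DOM lets you flip a subtree's display style in place:

        // Toggle a discussion subtree between hidden and shown.
        function toggleThread(id) {
            var el = document.getElementById(id);
            el.style.display = (el.style.display == "none") ? "" : "none";
        }
        // <a href="#" onclick="toggleThread('replies42'); return false;">[-]</a>
        // <div id="replies42"> ...child comments... </div>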

  • huh? (Score:1, Funny)

    by Anonymous Coward
    WWE Releases Drafts For Doom II And More

    what does that mean?

    *squints*

    I gotta get some sleep..........
  • by TheSHAD0W ( 258774 ) on Sunday November 10, 2002 @02:33AM (#4635947) Homepage
    I thought they released a draft for DOOM 2.

    Yeah, considering how long ago it was released, the draft for it would be just about due...
    • by Anonymous Coward
      And I keep seeing "w3c" and thinking "wc3! cool a warcraft 3 article...oh wait."
  • by Proc6 ( 518858 ) on Sunday November 10, 2002 @02:51AM (#4635976)
    ... when Netscape did it to themselves. If you want to talk about standards, go look at charts showing what CSS properties Netscape versions properly support and which ones IE supports. IE kicks its ass all over the place. Netscape is downright broken on some very easy things. Now the new Netscape based on Mozilla, I can't comment. But that's when someone else did all the work for them, maybe Mozilla is fine. But IE is a pretty fast, stable browser that has supported more standards, more correctly than any version of Netscape prior to Mozilla. And if you want to talk about "MS's proprietary HTML tags", yea, Netscape did the same shit, so would anyone trying to own marketshare.

    How about an example from around the time of the Great Browser Holy Wars...

    NETSCAPE ONLY TAGS - blink - layer - keygen - multicol - nolayer - server - spacer

    INTERNET EXPLORER ONLY TAGS - bgsound - iframe - marquee

    Hmm... looks like Netscape had more.

    Look around you: proprietary "anything" is how you keep money coming in and marketshare up. If you're talking about some kind of open-source, community-developed code, like Mozilla, then yes, please avoid proprietary stuff. But quit bashing Microsoft just because they have a good browser that supports standards at least as well as their only major competitor, and are using the same technique as just about every other capitalist on the planet to make more money and keep investors happy. Netscape sucked and deserved to die.

    Now go ahead, mod me down because I stood up for MS.

    • It was a choice of either a mod, or a comment. I like discussion better than point systems.

      I tend to agree with you on the CSS sheets. For example, in IE there is a CSS rule that allows me to do a hover color change WITHOUT using the seemingly more popular JavaScript code. I like it; it's a better design for sites in my opinion. Netscape (older versions) craps on it, though.

      However, I don't really agree that Netscape sucked and deserved to die. Without it there would have been even less innovation. Even now, I use Opera over IE because of the ability to go to different and separate connections by using a simple tab layout at the top of the screen, all contained in one program, whereas to do something similar in IE I have to open up half a dozen instances of Explorer.

      • Um... your comment regarding CSS is not true of later versions of Netscape (6.0 and on). I use that mouseover color change all the time with CSS, and it renders perfectly in Mozilla, Netscape 6, 6.1, 6.2 and 7... sure, Netscape 4 doesn't support it, but IE 4 didn't either, so that's a silly argument. I could tell you that Mozilla is better than IE because IE 3 won't even open up MS's own home page anymore... but that's irrelevant.
      • IE6's :hover is pretty much broken. You can change the color, yes, but you can't change the display: of a box within the :hover element.

        No nice popup menus, in other words...
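
        That's why pure-CSS dropdowns need a JavaScript crutch in IE6; a rough sketch of the usual workaround (the ids are invented):

          // IE6 only applies :hover to links, so show/hide the submenu by hand.
          function showMenu(id) { document.getElementById(id).style.display = "block"; }
          function hideMenu(id) { document.getElementById(id).style.display = "none"; }
          // <li onmouseover="showMenu('sub1')" onmouseout="hideMenu('sub1')"> ... </li>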
    • by Anonymous Coward
      IE didn't start supporting CSS until after MS totally *destroyed* Netscape in the browser wars, so that's not why Navigator lost. Mozilla has excellent CSS and DOM support.

      There are some sites that are absolutely committed to IE and use evil tech like VBScript. Mostly, sites are optimized to IE's idiosyncrasies. Since there's no W3 standard on rendering broken, non-compliant code, IE made it render one way while Netscape rendered it another. With proper, compliant code, the pages look close enough, or at least don't entirely die when you load them. And of all those non-compliant tags, I typically only see iframe, spacer, and bgsound being used.

      But as IE market share grew, lazy/ignorant web designers (which includes Frontpage users) started to test only for IE. When MS destroyed Netscape, most web designers stopped testing for alternative browsers. So Microsoft indirectly caused mass W3C noncompliance.

      I think the problem with your post is that you confuse standards with features. CSS support is a feature. An analogy: the DMV license application form in my state comes with a voter registration form attached. DMVs aren't required to attach forms; it's just an added bonus to promote voting. But, the voter registration form has to be standard. If my DMV office created a "SDMV voter registration form" that had extra questions like political ideology and sexual preference, any other DMV would wonder what the hell the DMV branch was thinking when they made the form.

      It does seem that Mozilla is a lot more willing than the old Netscape and Opera to render broken, non-standard HTML pages, although IE will still render the mind-bogglingly broken things.

      With Mozilla 1.1, I have seen _no_ pages that only work in IE ( excluding those using Evil MS Tech (tm) ), and a minority (usually made by non-professionals) that totally screw up the rendering.
    • by skunkeh ( 410004 ) on Sunday November 10, 2002 @04:52AM (#4636163)

      Shock horror! Browser released in 1996 fails to support latest web standards!

      If you want to bash Netscape, aim at Netscape 6 or 7 (both of which have superb standards compliance thanks to the Mozilla project). Netscape 4 simply isn't relevant any more, and hasn't been for several years. It's only big companies and institutions who don't want the hassle of upgrading their site-wide PCs that are keeping it alive, and with any luck even they will give it up soon.

    • Javascript was a Netscape invention.

      How's about that for non-standard!

      My first introduction to the DOM and scripting was building an IE4-based VBScript application for the Boots The Chemist intranet. That's about as non-standard as you can get. The VBS/JS step debugger in Visual Studio was useful, if you could get it going.

      These days there are few differences between the various JavaScript/DOM implementations (getting XML documents without screen refreshes is unfortunately one of them *sigh*). My favoured route is to develop in Mozilla, then test in IE. I've done a drag-and-drop HTML email editor that works in Moz, IE & Opera. The scope of JavaScript doesn't really get exercised, as far as I've seen round the web.

      • Javascript was a Netscape invention. How's about that for non-standard!

        Was. Now it's an international standard, called "ECMAScript" [wikipedia.org] (ECMA-262) for the language and HTML DOM Level 2 for everything under document.

    • Now go ahead, mod me down because I stood up for MS.

      I wish I had mod points; I would mod you down. Not because you stood up for MS, but because I don't think you know what you're talking about.


      Most of the work on Mozilla is done by Netscape employees. I would guess much more than 3/4 of the Mozilla code is written by AOL/Netscape'rs.


      And as such, most of the kudos for Mozilla's design and engineering accomplishments goes to the Netscape engineering staff. There are a lot of very smart people in that group. I encourage anyone to try following Mozilla's development for a while. Track a bug, follow discussions on #mozilla, etc. I don't agree with a lot of what the Moz dev team does (sometimes my opinion is that they back-burner Linux bugs for Windows), but I have a lot of respect for them. And you should too.


      People say "Netscape sucks", "Mozilla rules", not realizing that Mozilla today would be a much smaller project (read: not as many platforms, not as many standards) if it weren't for the hard work and dedication to open source of AOL/Netscape.

  • by AcquaCow ( 56720 )
    [Does anyone ever] bother writing compliant HTML? People will always dream up crazy site designs, and they are going to go with whatever technology they can use to make that design a reality. Look at Flash; look what happened with DHTML. Netscape's DHTML manual went into documenting aspects of DHTML that weren't even supported in their browser.
    Standards can be made; just don't expect that people will ever follow them.

    -- AcquaCow
    • Them designers would probably be shocked to find out that it is much easier to write cool designs using proper standards. Not to mention how much easier it is to remake the design, or just change it a bit...

      The maintenance factor should be of major importance to businesses... as it is, they have sloppy code that takes years to debug (font tags, inline proprietary JavaScript, both CSS and styled HTML, sniffer code and so on), and they have to maintain several versions for various browsers. Maintaining one standards-compliant version, with style separated from content, is so much more economically sane.
      • Now, I write some fairly basic HTML, no more than font, table, p, and br tags really. I wrote a decent layout for my site (dcw.govsci.com [govsci.com]). True, it may never get updated, but I wrote it knowing how Netscape likes its HTML and how IE liked HTML. The page in the end looked perfect in IE, but rendered mainly single-column in Netscape. There were several little `quirks` I had to work around and kludge before it really worked properly in Netscape as well. Things I shouldn't have had to work around. For instance, I have a Java applet on my site. I think the width of it is set to 429, and the containing cell is set to 430px wide. If I take that Java applet up to 430px wide, it completely breaks my site. But only in Netscape (older versions, not Moz). I had all cell padding off, everything I could think of. It's just that Netscape handles that particular applet differently from IE/Moz/etc. I only tested for Moz/IE/Netscape at the time, though; Opera wasn't even remotely popular (or even out) when I came up with that design.

        This has become a slightly longer rant than I wanted to write (esp. at near 4am), but I suppose my point was that sure, Netscape and IE are both rendering the HTML to standard, but they handle certain objects differently, forcing the coder (me) to adjust the site accordingly to kludge around those slight differences. Standards or not, there are still differences.

        If we can come up with one solid independent rendering engine that is both fast and highly portable, use that in all browsers, I think we'd be set.

        5 mins to 4 am... it's time for bed.

        -- AcquaCow
    • Does anyone ever bother checking that their HTML is compliant? By which I mean validating it against the DTD. This ought to be an elementary step in HTML writing - just like compiling a C program is a first step towards checking it works - but it seems so difficult to set up that hardly anyone does it.

      Most Linux systems nowadays include nsgmls, but that command has so many obscure options and SGML prologues are hard to understand. There needs to be a single command 'html_validate' which runs nsgmls with all the necessary command-line options (and obscure environment variables, and DTD files stored in the right place) to validate an HTML document. If that existed then I'd run it every time before saving my document and I'm sure many others would too. But at the moment setting up nsgmls to do HTML validation (at least on Linux-Mandrake) is horribly complex. (Especially if you want to validate XML as well; you need to set environment variables differently between the two uses.)
    • er, yes. (Score:3, Informative)

      by DrSkwid ( 118965 )
      http://validator.w3.org is a great tool.

      If your code is valid HTML, then if anyone complains that their X browser doesn't render it properly, that's your first line of defense.

    • by Anonymous Coward
      Yes, I do. And I do it without ever once querying browser make or version. The catch? I've stopped supporting NS4 and IE4. That makes all the difference. It's hard, though. And once you start using the DOM extensively, you need to test every single line of code you write and have backup plans for every possible contingency. So far, though, I'm doing better, not worse, than in the old days of if((is_nav3 || has_frames) && ((!ie || has_jscript11) || iesubversion != 4)) pathology = (stupid_table_bug ? offset-10 : offset).
    • [Does anyone ever...] bother writing compliant html?

      Yes, I do, all the time.

      The current site I'm designing gets about 35,000 visitors a day, and it's going to be XHTML 1.1 (served as application/xhtml+xml to accepting clients, no less) with a full CSS layout (with the XHTML being semantically rich, so the CSS isn't required to make sense of the page; no DIV/SPAN soup), and hopefully level AAA on the Web Content Accessibility Guidelines 1.0.

      I do the same for tiny sites too; the latest being a site for a diving club.

      I have noticed a trend towards larger sites redesigning for XHTML and CSS recently; what was once a trend for personal sites seems now to be migrating up the hierarchy to larger sites such as Wired [wired.com] and AllTheWeb [alltheweb.com]. I don't expect this trend to reverse.
  • Eh? (Score:5, Funny)

    by Wrexen ( 151642 ) on Sunday November 10, 2002 @03:10AM (#4636007) Homepage
    Web Ontology Language (OWL) Guide

    Soon to be followed by the Acronym Formation Policy (FAP)?
    • Re:Eh? (Score:1, Insightful)

      by Anonymous Coward
      See "House at Pooh Corner", by AA Milne.

      Maybe WOL really was right!
    • The DNA (National Dyslexics Association) will surely complain....
  • by mdubinko ( 459807 ) on Sunday November 10, 2002 @03:21AM (#4636032) Homepage
    >2 proposed recommendations: XML-Signature XPath Filter 2.0 and HTML DOM 2.

    XML-Signature XPath Filter 2.0 is a final W3C Recommendation, not proposed.

    -m
  • Nice with standards... now we just have to sit back and wait for people to follow them. That could be a while, since there are quite a few developers who don't give a darn about adhering to them.
  • is not to hope for safety... in the form of standards that are adhered to!
  • Sorry... (Score:4, Informative)

    by WhaDaYaKnow ( 563683 ) on Sunday November 10, 2002 @04:00AM (#4636104)
    Does this mean that one can expect browsers to behave in a predictable manner when playing around with HTML documents?

    One simple example: innerHTML. This 'property' is not part of ANY W3C draft, yet many, many websites use it because both IE and Mozilla (Netscape) support it.

    Even though M$ is on the committee, their own browser still has plenty of features that are not defined in XHTML 1.0, DOM (level 2 or 3), CSS or whatever. And of course 99% of all web 'developers' are more than happy to use these features.
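
    To illustrate: the two fragments below do the same job, but only the second sticks to the W3C DOM (the container id is invented):

      // Nonstandard but universally supported:
      document.getElementById("msg").innerHTML = "Saved.";

      // The standards-only equivalent, noticeably more verbose:
      var msg = document.getElementById("msg");
      while (msg.firstChild) {
          msg.removeChild(msg.firstChild);
      }
      msg.appendChild(document.createTextNode("Saved."));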
  • Does this mean that one can expect browsers to behave in a predictable manner when playing around with HTML documents?

    As long as you do things strictly the DOM-1 way, current browsers have been working pretty much predictably for quite some time. I develop sophisticated DHTML and test it in IE, Mozilla and Opera, and I never have a problem as long as I use only DOM methods (which can sometimes be quite limiting, but bearable overall).

    A lot of people still do pre-DOM legacy DHTML because they have to make 4.x-compatible sites, but that's another story. DOM-2 may be more featureful, but it doesn't promise to make cross-browser development any easier. It can indeed make it harder, if it isn't implemented accurately and promptly across different browsers. Given the lesser incentive to implement it (DOM-1 is OK for most things), I find that quite possible.
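
    The usual defensive pattern, sketched here with an invented entry point, gates the DHTML on the DOM calls themselves rather than on browser names:

      // If the standard DOM is there, use it; otherwise the page stays static,
      // which 4.x-era browsers can live with.
      if (document.getElementById && document.createElement) {
          initDynamicMenus(); // hypothetical setup for the DHTML features
      }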

  • W3C: stop now (Score:3, Interesting)

    by g4dget ( 579145 ) on Sunday November 10, 2002 @06:04AM (#4636255)
    The W3C should have stopped with a full specification of HTML. Anything they have been doing beyond that has been doing more harm than good. The web succeeded because HTML was simple.

    Of course, some client-side code is useful, but unfortunately, the major contenders have dropped the ball on that one. The W3C has given us JavaScript+DOM+CSS+..., but it's way too complicated for the vanishingly small amount of functionality, and nobody has managed to implement it correctly; in fact, I doubt anybody knows what a correct implementation would even mean. Flash has become ubiquitous, but it just isn't suitable for real GUI programming and is effectively proprietary. And Java could have been a contender, but Sun has chosen to keep it proprietary, and the once small and simple language has become unusably bloated.

    But, hey, that means that there is an opportunity for better approaches to client-side programming. Curl might have been a candidate if it weren't for the ridiculous license. But someone outside the W3C will do something decent that catches on sooner or later.
    • If we don't have someone like the W3C putting this stuff in writing somewhere, how else are we going to have a hope in hell of browsers talking to each other?

      Should everyone just copy whatever Microsoft comes up with, because, let's face it, they have the largest userbase? Somehow I don't see people here appreciating that.

      I mean sure, you can say "wah wah, Microsoft didn't follow the standards, wah wah, Opera doesn't do this yet, this standards system is flawed!" but if there is no reference point for any of these things, how could you possibly expect things to improve?

      One thing that's obvious is that these technologies are needed, not just silly ideas implemented by bored programmers, so if they're going to exist, then better an appropriate committee come up with workable drafts than a lone company goes ahead and does what they feel like. (heck that's one of the main reasons MS came up with so much funky spec breaking stuff - call it embrace and extend if you want, but they wanted to do things before the standards were there, which is why we have this mess)
      • Re:W3C: stop now (Score:4, Interesting)

        by g4dget ( 579145 ) on Sunday November 10, 2002 @07:31AM (#4636419)
        Should everyone just copy whatever Microsoft comes up with

        Everybody is, for practical purposes. Who do you think is dreaming up a lot of the stuff that comes out of the W3C? Look at the authorships of the standards. And if you sit in those meetings, you'll quickly see that Microsoft doesn't often take "no" for an answer.

        Microsoft has even told us why they like their standards to be complicated: they believe that if they just make it complicated enough, nobody but them can implement it. Of course, Microsoft's reasoning is at the level of Wile E. Coyote, with Open Source being the Road Runner, but what can you do.

        One thing that's obvious is that these technologies are needed,

        We have a problem with creating dynamic web content, but the current crop of W3C standards for addressing that problem isn't working; it has turned into a Rube Goldberg contraption. Someone needs to start from scratch, and the W3C appears to be incapable of doing it.

        If we don't have someone like the W3C putting this stuff in writing somewhere, how else are we going to have a hope in hell of browsers talking to each other?

        Of course, things need to get written down and standardized. But the way standards are supposed to work is that people try things out in practice, whatever works well survives in the marketplace or among users, people create multiple implementations, then people get together and work out the differences among the implementations, then it all gets written up as a standard, and finally everybody goes back and makes their implementations standards compliant. It's a long, tedious process, but it does result in reasonable standards that real people can actually implement.

        What the W3C is often doing is using its position to create completely unproven systems on paper and let the rest of the world figure out how to deal with it. Or, worse, the W3C is used by powerful companies to push through "standards" that haven't stood the test of time and for which only they themselves have a working implementation. If you give that kind of junk the stamp of approval of a standards body, you make things worse, not better.

    • The W3C has given us JavaScript+DOM+CSS+..., but it's way too complicated for the vanishingly small amount of functionality, and nobody has managed to implement it correctly; in fact, I doubt anybody knows what a correct implementation would even mean.

      Huh? JavaScript is the Mozilla implementation of ECMAScript, a standard (not W3C) invented by Netscape. The DOM was also a Netscape idea, now standardized. CSS was originally proposed and largely designed by a guy from Opera. There are quite a few implementations out there actually, the idea that W3C technologies are too large to implement is crazy. Look at Mozilla, Amaya, even Konqueror is getting there now.....

      The W3C should have stopped with a full specification of HTML. Anything they have been doing beyond that has been doing more harm than good. The web succeeded because HTML was simple.

      Yes, and now that it's ubiquitous, do you really think we need to keep it simple? Being simple was great when the web was small; it let it grow very quickly. Why should we keep it simple now? Just for the sake of it? I'd rather have power. If that means there are only 3 or 4 quality implementations as opposed to 20, then so be it.

      The world is not a simple place, and the things we want to do with the web nowadays aren't simple either. If you want simplicity then feel free to write a web browser that only understands a subset of the standards, they are layered so people can do this. Just bear in mind that it won't be useful for browsing the web, because a lot of people like powerful technologies and use them.

      • Huh? JavaScript is the Mozilla implementation of ECMAScript, a standard (not W3C) invented by Netscape. The DOM was also a Netscape idea, now standardized.

        Yes, but the W3C gave it its blessing and built lots of other standards on it.

        Why should we keep it simple now? Just for the sake of it? I'd rather have power. If that means there are only 3 or 4 quality implementations as opposed to 20, then so be it.

        You are confusing complexity with power. The W3C standards are complex, but they aren't powerful. And that's the problem. Despite all the junk coming out of the W3C, it's still basically impossible to do reliable animations, drag-and-drop, document image display, editing, and other commonly desired things in web browsers.

        I want to do complex things, but after 10 years, the W3C standards still don't support it.

        The world is not a simple place, and the things we want to do with the web nowadays aren't simple either.

        Yes, and the W3C fails to meet the needs of people who want to do complex things. All the W3C does is provide people with ever more complex ways of doing simple things. That is not progress.

        If you want simplicity then feel free to write a web browser

        More likely, there are new web browsers and plugins around the corner that build on HTML/XHTML, but come up with better ways of doing the other stuff. It's harder now than it was 10 years ago, when any kind of bogus idea could succeed, but it's still possible. Perhaps Curl could have succeeded in that area if they had open sourced it. But something else will come along.

  • Microsoft are rolling their own anyway (.NET), and with their monopoly on over 90% of desktops, and IE6 sitting high in that prominent spot, I fail to see how this will make a difference to end users...
    • Just thought I'd point out that W3C standards and .NET are orthogonal; .NET doesn't specify anything about how to render web pages or do client-side scripting.

      Now, if you were talking about SOAP...
  • by Anonymous Coward
    Maybe you are missing the point: the W3C is centering its efforts on applications other than web development - say, document representation (XML), machine-understandable information, web information retrieval and so on.
    OWL is about information retrieval, and 'XML-Signature XPath Filter' is about document signing.
    The DOM stuff is no longer only a Dynamic HTML thing. The DOM is important because it is being actively used to manage XML documents, and the previous specifications are very clumsy because they are a compromise between earlier browser-specific de facto standards.
    Maybe there is a need to develop some simple DOM stuff from scratch instead of adding levels on top of a compromise approach. And again, as said above, provide a reference implementation to start with.

    Vokimon
  • The ontology project seems kinda cool, but it will never be practical for anything but the most stringently controlled or automated intranet.
  • by rocjoe71 ( 545053 ) on Sunday November 10, 2002 @12:52PM (#4637470) Homepage
    ...Because I use them all the time, testing against Mozilla 1.x, IE 6.0, 5.5 and 5.0.

    MSDN clearly marks out which functions are standard and which version of HTML/DOM they comply with.

    Mozilla is almost de facto compliant, because the standards are the only thing they have to work from, and they don't have an agenda like interoperation with MS Office/FrontPage.

    Standards compliance does work; it's the lazy/inept authors of web pages who are to blame for the faulty products that result from an ad-hoc approach to web page development.

    Then again, like the saying goes: "A bad workman always blames his tools..."
