W3C Releases Drafts For DOM L2 And More
TobiasSodergren writes "People at W3C seem to have had a busy Friday, according to their website.
They have released no fewer than four working drafts
(the Web Ontology Language (OWL) Guide
and three QA Working Group documents:
Introduction,
Process and Operational Guidelines,
and Specification Guidelines)
and two proposed recommendations:
XML-Signature XPath Filter 2.0
and DOM Level 2 HTML.
Does this mean that one can expect browsers to behave in a predictable manner when
playing around with HTML documents? Hope is the last thing to leave optimistic people, right?"
W3C standards getting out of hand (Score:3, Funny)
Re:W3C standards getting out of hand (Score:1)
Re:W3C standards getting out of hand (Score:4, Funny)
doesn't matter... (Score:3, Insightful)
Re:doesn't matter... (Score:3, Interesting)
Re:doesn't matter... (Score:5, Insightful)
I work in a student computer lab for a fairly large university, about 28,000 students. You wouldn't *believe* the problems I have to deal with because of stupid, and I stress stupid, professors using stuff like MS Word/PowerPoint for their class notes and web pages.
I'll give you a few examples. PowerPoint is the most common format for posting class notes. All good and fine, because thanks to OpenOffice even a Linux box can read PowerPoint slides just fine. The problem is printing them. Since we have only dot-matrix printers (long story...), if the professor uses too weird a color scheme the slides don't print worth a damn, even with the 'print only black/white' option checked. Problem #1.
The bigger problem is when they use MS Word to post syllabi, notes, etc. Students have problems viewing them at home for whatever reason (most likely they are using an old version of Word) and have to come back to campus to look at this stuff. It is insane. I always direct them to install OpenOffice, but sometimes they might only have a modem, so it isn't really an option. And if you talk to these professors about posting stuff only in MS Word, they get defensive and say things like 'everyone uses it' and the like. Pointing out that just clicking 'Save as Rich Text Format' would cover 99% of the stuff they publish doesn't work. Sigh. It is becoming a real problem. Same with web pages - what standards, Microsoft is a standard, I'm sure this would work fine if you would use a *Microsoft* browser, etc., etc.
Not that all professors are dumb; a lot use things like Rich Text Format and try to stay away from Word, but a lot don't. It is a major headache for some students, and for me. And don't even get me started on how IE handles Word documents - it has the nasty tendency to embed them within the current frame, which causes havoc with printing, saving, etc. - at least for your average student.
Seriously, more teachers need to be educated on things like open formats. For instance, it wouldn't be that hard to develop a campus-wide XML format and a nice little front-end for making syllabi, class notes, outlines, etc. available to all faculty. That way you could ensure that everyone had equal access to the documents instead of forcing students to use MS products.
Re:doesn't matter... (Score:1)
However, I really don't feel much sympathy for the students, as I don't see professors using MS Office, or whatever else they like, as a problem. There is always the simple option of attending class and picking up the hard copy when it is passed out. Indeed, many classes I have taken have no website at all, and it is your responsibility to attend class and get your information that way.
Also, all the universities I have seen do at least a passable job (and usually much better) of providing computer facilities in places like the main library. It is not hard to go to the library and print what you need.
If you want to mandate that professors all must use a given system for their websites, fine, but you'd better be prepared to make sure it works WELL and provide any and all necessary support for them to use it. Otherwise, they need to be allowed to use what they like.
Re:doesn't matter... (Score:1)
Re:doesn't matter... (Score:1)
Re:doesn't matter... (Score:2)
Re:doesn't matter... (Score:2)
Re:doesn't matter... (Score:1, Informative)
IE6 W3 support (Score:5, Interesting)
Lately I've been working on an app for a company's internal use, which means the delightful situation of being able to dictate minimum browser requirements. As a result, the app is designed for IE6/Mozilla. All development has been in Mozilla, and a lot of DOM use goes on. And it all works in IE6, with no browser checking anywhere. My only regret is that I can't make use of the more advanced selectors provided by CSS2, so the HTML has a few more class attributes than it would need otherwise. But, overall, not bad.
Another positive note, IE6 SP1 finally supports XHTML sent as text/xml. So at last, XHTML documents can be sent with the proper mime type [hixie.ch].
So despite being a Mozilla (Galeon) user, as a web developer who makes heavy use of modern standards, I look forward to seeing IE continue to catch up to Mozilla so that I can worry even less about browser-specific issues.
Re:IE6 W3 support (Score:2)
Ah, yes, the 'not having to worry about browser-specific issues' notion. You haven't exactly been a web dev long enough, have you? (:P)
This is _exactly_ what we thought HTML 3.2 would turn out to be... and look at how well that worked!
And anyway, if it isn't W3C standards, it's resolution, colors (although that's fixed now... sort of), etc.
Re:IE6 W3 support (Score:2)
Great, except XHTML is supposed to be served as application/xhtml+xml, which IE6 SP1 still wants to download rather than display.
I guess text/xml is one step closer, though.. assuming it works properly.
XHTML MIME types (Score:1)
Another positive note, IE6 SP1 finally supports XHTML sent as text/xml.
How did you get text/xml to work in IE? When I try it, I get a collapsible tree view of the document's source code.
Re:doesn't matter... (Score:1, Insightful)
Re:doesn't matter... (Score:2)
Also, while IE is the most popular browser, it's not the only one, and a not insignificant proportion of the population uses Mozilla, Opera, and other browsers. Somewhat hypocritical of me, since I'm currently using IE on my Windows partition, as opposed to Mozilla on my FreeBSD partition, but on purely technical merits, IE isn't really the best browser, and the optimist in me is convinced that the greater portion of the online population will eventually go for the better solution. On the other hand, if they don't, why should we worry about it? The proletariat can do as they please. So long as "MS HTML" doesn't somehow become entirely proprietary, we retain the ability to access it, plus we get to view properly-rendered pages. Whee.
Don't forget, either, that Microsoft actually is a member [w3.org] of the W3C. Microsoft can be accused of many things, but blatantly violating standards it helped write would be a rather stupid thing to do.
No. (Score:5, Insightful)
When there was 1 standard (HTML), browsers didn't behave predictably.
Now that there are more, there is more scope for implementations to have their quirks, not less.
Standards are large and complicated descriptions of expected behaviour. Each implementor may have a slightly different interpretation. Different implementations will have their strengths and weaknesses which make different parts of the standard easier or harder to implement fully and/or correctly. There may even be reasons why an implementor may choose to ignore part of a standard (perhaps it is difficult and he believes that users don't want or need that functionality yet).
Unfortunately, standards are an ideal to aim for, not a description of reality.
C++ XML API (Score:4, Interesting)
Re:C++ XML API (Score:3, Informative)
Re:C++ XML API (Score:2)
"For portability, care has been taken to make minimal use of templates, no RTTI, no C++ namespaces and minimal use of #ifdefs."
The API is basically C with classes, uses XMLChar * instead of std::string, etc. I'm looking for something more along the lines of the Boost or Loki libraries in that they integrate cleanly with the STL.
Let me use JDOM and XML::Simple as examples. They both simplify the (IMHO too complex) DOM model, as well as fitting closely to the language. JDOM, for example, uses standard Java strings and containers, while XML::Simple uses Perl associative arrays.
Re:C++ XML API (Score:2)
Re:C++ XML API (Score:5, Informative)
Someone posted a neat little class to the expat mailing list about two years ago. Basically it was just a Node class with an STL list for children and a hash map for attributes. It was very small, clean, and was in essence a DOM. It used expat, but trust me, the code was so tiny you could use any parser with it. It was like 200 lines of code.
I liked it so much I created the same thing in C called domnode [eskimo.com].
Search the expat archives [sourceforge.net]. Wish I could give you more to go on.
Re:C++ XML API (Score:3, Informative)
http://mail.libexpat.org/pipermail/expat-discuss/
Re:C++ XML API (Score:2)
I completely agree about all the weird reinvent-the-wheel stuff that DOM and similar libraries contain: it would be so much better if they could use the STL in C++ and native data structures in other languages (nested lists in Lisp, etc etc). It's just that a basic function call interface is the lowest common denominator, so if you want the same library on every language you have to invent a whole new list and tree API. Perhaps this is an indication that the same library on every different language isn't such a good idea. (Think of the Mozilla debate: 'the same on every platform' versus 'native on every platform'. I have a feeling that in programming languages as well as GUIs the second choice is better.)
Re:XML API (Score:1)
Instead I implemented my own JDOM-like system that uses XPath to find nodes in a document using Xalan's [apache.org] XPath API. This gives me the flexibility of XPath and the usefulness of a DOM-like XML API. I was thinking of porting it to C++ for use at home.
Standards (Score:2, Interesting)
Perhaps it's time we stopped sitting on our thumbs and complaining about Microsoft ignoring standards. An outright ban of IE is needed, from workplaces, schools, etc. Sites should block access to people using IE. This is the only way we can get our rights to web standards back!
Seriously though, does anyone have any ideas on how we can take control of web standards away from MS ?
Something about reading Eolas thingie.. (Score:1)
I remember a slashdot link [slashdot.org] somewhere mentioning something about IE getting eliminated due to some sort of plugin junk?
Re:Something about reading Eolas thingie.. (Score:1)
Re:Something about reading Eolas thingie.. (Score:1)
Ugh...
Re:Standards (Score:2)
Y'know, in a perfect world, I'd wholeheartedly agree with you. Is it a perfect world? Hence, the diatribe.
Seriously though, does anyone have any ideas on how we can take control of web standards away from MS ?
Ooops, sorry. Cancel diatribe...
Sorry for the dose of reality.
Soko
Re:Standards (Score:1)
Why bother? Have you taken a look at these standards recently? They're huge and unwieldy. Perhaps a more attainable goal is to develop the next generation of browsers - a blank context for multimedia rendering as directed by the server-side script. Sort of Shockwave Flash as a native platform.
Re:Standards (Score:4, Informative)
Some days I'm more optimistic. Today's one of those days (tomorrow may not be, 'cause I'm digging deeper into IE's weird-ass DOM than I usually care to). But...
Most web developers that have been around for a while would rather code to standards than to marketshare. Standards give you the promise of backward, and more importantly, forward, compatibility. It's also a helluva lot easier to sort out your code when a client asks for a redesign in a year or two if you've been conscious of more than just "making it look right" in the popular browser of the day.
Markup designed for IE only often does truly evil things on other platforms - and there are going to be more cell phones and PDAs accessing web pages, not fewer. There are also serious organizational advantages to coding to standards - more tools for handling your pages, it's easier to whip up a quick Perl script to process standards-compliant HTML... the list of advantages is long.
Just like any other field, there's a trickle-down effect. Not everyone will write good, W3C-compliant code, but more will, more often. And despite their megalithic, feudal mentality, Microsoft will have to pay attention. IE6 is still a long way from adhering to standards, but it's much, much closer than IE4 was. This seems to have been in large part a reaction to developers bitching about their lack of compliance. I'm hopeful the trend will continue.
Re:Standards (Score:3, Interesting)
My own homepage doesn't render in anything but Mozilla, currently, but small, personal sites aren't gonna break or make anything (unless they come in the millions, which is unlikely).
The people at Mozilla have provided us with a tool of 99% perfect rendering. Now it is up to the web site maintainers to actually enforce the use of Mozilla (or any other browser that fully adheres to standards; there is no other currently).
But Slashdot won't take this upon its shoulders, because it doesn't believe in standards, just like M$.
So M$ wins.
Re:Standards (Score:4, Informative)
Many sites can get away with this, but many cannot. If I'm selling a product on the web, I'll make darn sure that 99% of my customers' browsers work with my site. It's a good ideal to say "fix your IE bugs", but often not realistic.
Re:Standards (Score:2, Interesting)
That depends quite a lot on your definition of ALWAYS as it applies to Mozilla... considering Mozilla was originally based on the Netscape source code (though I realize it has now been virtually completely rewritten). People seem to forget that Netscape were the kings of non-standard HTML as an attempt to "lock in" customers. Hell, IE to this day still includes "Mozilla" in its user-agent header to work around all the sites that would deny access to anything other than Netscape, back in the 2.0 era.
Re:Standards (Score:2)
At this I am very surprised. It's Microsoft's style to turn around and bite people in the ass when they have the upper hand. I wonder why MS hasn't "forced" Netscape only sites to change by updating their agent header?
No need - they have Passport (Score:2)
Changing headers is no use in that scenario
I just wish one little thing (Score:1)
Re:I just wish one little thing (Score:1)
Re:I just wish one little thing (Score:1)
Re:I just wish one little thing (Score:2)
There are some excellent accessible, standards-compliant scripts now for creating trees / drop-down menus from HTML nested lists - browsers without JavaScript see the plain list, while browsers with JavaScript get a nice expanding tree (two example scripts exist; the general idea is sketched below):
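To make the technique concrete, here is a rough, illustrative sketch only - not one of the scripts being referred to - and the list id "sitemap" is invented for the example:

    // Progressive enhancement: turn a nested <ul id="sitemap"> into an expanding tree.
    // Browsers without JavaScript (or without a DOM) simply see the full list.
    function makeTree(listId) {
      if (!document.getElementById) return;      // old browsers keep the plain list
      var root = document.getElementById(listId);
      if (!root) return;
      var items = root.getElementsByTagName("li");
      for (var i = 0; i < items.length; i++) {
        var sub = items[i].getElementsByTagName("ul")[0];
        if (!sub) continue;                       // leaf item, nothing to collapse
        sub.style.display = "none";               // start collapsed
        items[i].onclick = (function (branch) {   // closure keeps the right sublist
          return function () {
            branch.style.display =
              (branch.style.display == "none") ? "block" : "none";
          };
        })(sub);
      }
    }
    // e.g. <body onload="makeTree('sitemap')">  (event bubbling is ignored here
    // to keep the sketch short; a real script would be more careful)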
Re:I just wish one little thing (Score:1)
Re:I just wish one little thing (Score:4, Informative)
If you have a problem with popup ads, then please download the Opera browser [opera.com]... you'll find F12 to be your best friend.
If you really want to crusade against something, then VBScript is a better candidate - or why not Outlook... the worst virus-spreading software ever created.
Re:I just wish one little thing (Score:2)
That reminds me: since I do not use Outlook/Express for e-mail (I use Mozilla at work and Opera's mail client at home), I just set my address list to use public addresses @ microsoft.com. That way, if someone else in the family ignores one of the computer commandments and opens a virus in an attachment, it simply sends the crap to Microsoft and no one else.
Junk snail mail is also handled by removing the postage-paid self-addressed envelope, filling it with metal scraps, and placing it in the mail (the receiver is charged for the postage) - make the spammers/virus enablers pay whenever you can.
client side scripting: good, JavaScript: bad (Score:2)
If JavaScript (by which I mean JavaScript, DOM, DHTML, etc.) were a simple, if limited, solution to those problems, it would be OK. But it isn't. It is much more complicated than technically better solutions, yet it still is extremely limited.
Simple and limited, and complex and powerful are both acceptable engineering tradeoffs. But complex and limited and buggy is a bad engineering tradeoff. And that's JavaScript.
Just because you don't feel the need .... (Score:1)
The banner rotation is via js so that the main page can be cached.
(but not annoying pop-up/unders - some of us realise they are a detraction).
Our banners don't link to any external sites.
The banner is part of the web frame of reference.
We have over 500 pages of content so I'm sure you'll excuse us our right to present deep links on our main page.
This is a troll, right? (Score:1)
Do you think JavaScript == popup windows? The window.open call is abused, and I'd like to see the spec include some kind of suggested behaviour along the lines of disregarding popups that aren't user-activated (Mozilla already does a great job of this, but making it part of the spec would be superior), but losing client-side scripting would be a blow to the usability of the Internet and to the palette of web designers trying to make intelligent sites.
Client-side form validation, adapting pages, and heck, even silly stuff like graphical rollovers (which you can't do in CSS yet) are all things the Internet benefits from. Only an idiot would fail to anticipate how their page would work for users who don't have JavaScript turned on, but it can make the experience that much nicer and more efficient.
Not to mention that nuking JavaScript, an open, standards-based, accessible language, will simply promote the use of obnoxious proprietary technology like Flash.
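As an illustration of the form-validation point (a sketch only; the form id "orderform" and the field name "email" are made up, and the real checks belong on the server regardless):

    // Client-side validation as a convenience layer: with JavaScript off, the form
    // still submits normally and the server-side checks still apply.
    function validateOrder(form) {
      var email = form.elements["email"];
      if (!email || email.value.indexOf("@") == -1) {
        alert("Please enter a valid e-mail address.");
        if (email) email.focus();
        return false;                  // block submission so the user can fix it
      }
      return true;                     // looks fine, submit as usual
    }

    window.onload = function () {
      if (!document.getElementById) return;   // older browsers just skip the extra check
      var form = document.getElementById("orderform");
      if (form) {
        form.onsubmit = function () { return validateOrder(this); };
      }
    };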
The W3C is a joke (Score:2, Insightful)
Has any company yet written a complete CSS1 implementation? A complete working version of DOM0? Yet here we are toiling away on XHTML and CSS3(!) and DOM Level 2. And they don't even seem to give a rat's ass if anyone actually follows the rules.
From what I hear about CSS3, it's going to be such a massive specification that no company (save Microsoft, if they actually gave a damn) would possibly be able to implement it.
What are we doing? The W3C puts out specifications that by the year become less and less relevant because their possible implementation date grows further and further remote. We'll see CSS3 arrive but will we ever see it in action? Or will it be supplanted by CSS4 and 5 which we will also never see? In the meantime we see developers actually building websites entirely out of Flash because there's one reference implementation (one version, period) and it just works. Is that the future we want?
It's time to hold these clowns accountable. Make them do some real work: make them create a working version of their spec. Make them brand one developer's work as a reference. Make them do something to prove that these standards are more than just empty clouds of words!
Re:The W3C is a joke (Score:3, Informative)
Re:The W3C is a joke (Score:2, Interesting)
Unfortunately, Mozilla does not support DOM 2 HTML in XHTML... and probably never will, because the bug assignee doesn't seem to care about this rather crucial bug.
Btw, DOM 0 is not a standard, but a collection of common garbage from the old days. It is supported in Mozilla only for backward compatibility, and people shouldn't use it in design. Mozilla explicitly does not support IE and NN4 only stuff such as document.all and document.layers.
Re:The W3C is a joke (Score:1)
Re:The W3C is a joke (Score:2)
Re:The W3C is a joke (Score:4, Informative)
You have to have standards. The W3C are the people who are widely recognized as being the technical lead for the net. Now they don't make law, quite right, but if there was no W3C then Microsoft really WOULD own the web: as it is, we can and do take them to task when they break the rules. They can ignore us of course, yet whaddaya know but IE6 supports DOM/CSS Level 1. Not a particularly impressive achievement, but it's a start.
The standards are actually very precise, which is one reason they are seen as being very large. There is hardly any room for interpretation in stuff like the DOM, CSS, XML, etc. Of course, sometimes when the internal architecture of IE mandates it, Microsoft simply ignore things - the MIME-type issue being a good example, but also the fact that you have to set node.className = "class" to style a new element, as opposed to setting the class attribute (which works fine in Mozilla). Why? Because (according to an MS developer) internally the MS DOM is based on object-model attributes, so that's what you have to set.
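A small sketch of the difference being described (based on the parent's report of IE's behaviour, not re-verified against every IE version; run it after the page has loaded):

    // Creating a styled element: the setAttribute way versus the property way.
    var para = document.createElement("p");

    // Per the DOM spec this should be enough, and it works in Mozilla:
    para.setAttribute("class", "warning");

    // Reportedly old IE ignores that, so cross-browser code sets the
    // property instead, which works in both:
    para.className = "warning";

    para.appendChild(document.createTextNode("Standards are fun."));
    document.body.appendChild(para);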
Has any company yet written a complete CSS1 implementation? A complete working version of DOM0? Yet here we are toiling away on XHTML and CSS3(!) and DOM Level 2. And they don't even seem to give a rat's ass if anyone actually follows the rules.
[sigh] Yes. Mozilla supports DOM and CSS Level 2, and it has partial support for Level 3 now. "Level 0" is the term used to refer to the pre-standardized technologies; it doesn't actually exist as a standard, so EVERY browser that can script web pages has a level-zero DOM. It should be noted that TBL himself has stepped in on occasion to tick off Microsoft about stuff like browser blocks, bad HTML, etc.
From what I hear about CSS3, it's going to be such a massive specification that no company (save Microsoft, if they actually gave a damn) would possibly be able to implement it.
Then you hear wrong.
In the meantime we see developers actually building websites entirely out of Flash because there's one reference implementation (one version, period) and it just works. Is that the future we want?
Developers do not build web pages out of flash. Marketing departments do. Luckily most web pages are not built by marketing.
It's time to hold these clowns accountable. Make them do some real work: make them create a working version of their spec.
Poor troll. The W3C already implements its standards; go to w3.org and download Amaya. Nobody uses it for actually browsing the web, but there it is: proof that an actually very small organization with very few coders can implement the standards.
Amaya *cough* *cough* :-) (Score:2)
Amaya [w3.org]
I'm all for standards, but they should have a basis in reality (read: working implementations) and not just be some committee's notion of a good idea.
DOM not HTML (Score:3, Informative)
You seem to confuse the DOM with the HTML standard. The DOM does not enforce HTML document structure; it is just an object-oriented representation of HTML and XHTML documents.
Re:DOM not HTML (Score:2)
DOM can be used to "play around" with HTML documents, after they have been loaded by the browser.
I seem to recall some web site using Javascript to expand and collapse discussion threads. Think it was kuro5hin [kuro5hin.org]. I'm not sure if it's using DOM to do that, but that is the sort of thing you can do with DOM.
huh? (Score:1, Funny)
what does that mean?
*squints*
I gotta get some sleep..........
Ohhhh... _DOM_. (Score:3, Funny)
Yeah, considering how long ago it was released, the draft for it would be just about due...
Re:Ohhhh... _DOM_. (Score:1, Funny)
Yea, bash MS some more... (Score:3, Flamebait)
How about an example from around the time of the Great Browser Holy Wars...
NETSCAPE ONLY TAGS - blink - layer - keygen - multicol - nolayer - server - spacer
INTERNET EXPLORER ONLY TAGS - bgsound - iframe - marquee
Hmm... looks like Netscape had more.
Look around you: proprietary "anything" is how you keep money coming in and market share up. If you're talking about some kind of open-source, community-developed code, like Mozilla, then yes, please avoid proprietary stuff. But quit bashing Microsoft just because they have a good browser that supports standards at least as well as their only major competitor, and are using the same technique as just about every other capitalist on the planet to make more money and keep investors happy. Netscape sucked and deserved to die.
Now go ahead, mod me down because I stood up for MS.
Re:Yea, bash MS some more... (Score:1)
It was a choice of either a mod, or a comment. I like discussion better than point systems.
I tend to agree with you on the CSS. For example, IE supports a CSS rule that lets me do a hover color change WITHOUT using the seemingly more popular JavaScript approach. I like it; it's a better design for sites in my opinion, though Netscape (older versions) craps on it.
However, I don't really agree that Netscape sucked and deserved to die. Without it there would have been even less innovation. Even now, I use Opera over IE because of the ability to keep different, separate pages open in a simple tab layout at the top of the screen, all contained in one program, whereas to do something similar in IE I have to open up half a dozen instances of Explorer.
Re:Yea, bash MS some more... (Score:1)
Re:Yea, bash MS some more... (Score:2)
No nice popup menus, in other words.
Re:Yea, bash MS some more... (Score:2, Insightful)
There are some sites that are absolutely committed to IE and use evil tech like VBScript. Mostly, though, sites are just optimized for IE's idiosyncrasies. Since there's no W3C standard for rendering broken, non-compliant code, IE renders it one particular way while Netscape renders it another. With proper, compliant code, the pages look close enough, or at least don't entirely die when you load them. And of all those non-compliant tags, I typically only see iframe, spacer, and bgsound being used.
But as IE market share grew, lazy/ignorant web designers (which includes Frontpage users) started to test only for IE. When MS destroyed Netscape, most web designers stopped testing for alternative browsers. So Microsoft indirectly caused mass W3C noncompliance.
I think the problem with your post is that you confuse standards with features. CSS support is a feature. An analogy: the DMV license application form in my state comes with a voter registration form attached. DMVs aren't required to attach forms; it's just an added bonus to promote voting. But, the voter registration form has to be standard. If my DMV office created a "SDMV voter registration form" that had extra questions like political ideology and sexual preference, any other DMV would wonder what the hell the DMV branch was thinking when they made the form.
It does seem that Mozilla is a lot more willing than the old Netscape and Opera to render broken, non-standard HTML pages, although IE will still render the mind-bogglingly broken things.
With Mozilla 1.1, I have seen _no_ pages that only work in IE ( excluding those using Evil MS Tech (tm) ), and a minority (usually made by non-professionals) that totally screw up the rendering.
Re:Yea, bash MS some more... (Score:4, Insightful)
Shock horror! Browser released in 1996 fails to support latest web standards!
If you want to bash Netscape, aim at Netscape 6 or 7 (both of which have superb standards compliance thanks to the Mozilla project). Netscape 4 simply isn't relevant any more, and hasn't been for several years. It's only big companies and institutions who don't want the hassle of upgrading their site-wide PCs that are keeping it alive, and with any luck even they will give it up soon.
Let's not forget JS, VBS & JSCRIPT (Score:2)
JavaScript was a Netscape invention. How's about that for non-standard!
My first introduction to the DOM and scripting was building an IE4-based VBScript application for the Boots The Chemist intranet. That's about as non-standard as you can get. The VBScript/JScript step debugger in Visual Studio was useful, if you could get it going.
These days there are few differences between the various JavaScript/DOM implementations (getting XML documents without screen refreshes is unfortunately one of them *sigh*). My favoured route is to develop in Mozilla, then test in IE. I've done a drag-and-drop HTML email editor that works in Moz, IE & Opera. The scope of JavaScript doesn't really get exercised as far as I've seen around the web.
JS is a standard (Score:1)
JavaScript was a Netscape invention. How's about that for non-standard!
Was. Now it's an international standard: "ECMAScript" [wikipedia.org] (ECMA-262) for the language, and DOM Level 2 HTML for everything under document.
mozilla is run by netscape (Score:2)
Now go ahead, mod me down because I stood up for MS.
I wish I had mod points, I would mod you down. Not because you stood up for MS, but because I don't think you know what you're talking about.
Most of the work on Mozilla is done by Netscape employees. I would guess much more than three-quarters of the Mozilla code is written by AOL/Netscape people.
And as such, most of the kudos for Mozilla's design and engineering accomplishments goes to the Netscape engineering staff. There are a lot of very smart people in that group. I encourage anyone to try following Mozilla's development for a while. Track a bug, follow discussions on #mozilla, etc. I don't agree with a lot of what the Moz dev team does (sometimes my opinion is that they back-burner Linux bugs for Windows), but I have a lot of respect for them. And you should too.
People say "Netscape sucks", "Mozilla rules", not realizing that Mozilla today would be a much smaller project (read: not as many platforms, not as many standards) if it weren't for the hard work and dedication to open source of AOL/Netscape.
Does anyone ever... (Score:2, Insightful)
Standards can be made; just don't expect that people will ever follow them.
-- AcquaCow
Re:Does anyone ever... (Score:1)
The maintenance factor should be of major importance to businesses... as it is, they have sloppy code that takes years to debug (font tags, inline proprietary JavaScript, both CSS and styled HTML, sniffer code and so on), and they have to maintain several versions for various browsers. Maintaining one standards-compliant version, with style separated from content, is so much more economically sane.
Re:Does anyone ever... (Score:1)
This has become a slightly longer rant than I wanted to write (especially at nearly 4 am), but I suppose my point was that sure, Netscape and IE both render the HTML to standard, but they handle certain objects differently, forcing the coder (me) to adjust the site to kludge around those slight differences. Standards or not, there are still differences.
If we can come up with one solid, independent rendering engine that is both fast and highly portable, and use that in all browsers, I think we'd be set.
Five minutes to 4 am... it's time for bed.
-- AcquaCow
Re:Does anyone ever... (Score:2)
Most Linux systems nowadays include nsgmls, but that command has so many obscure options and SGML prologues are hard to understand. There needs to be a single command 'html_validate' which runs nsgmls with all the necessary command-line options (and obscure environment variables, and DTD files stored in the right place) to validate an HTML document. If that existed then I'd run it every time before saving my document and I'm sure many others would too. But at the moment setting up nsgmls to do HTML validation (at least on Linux-Mandrake) is horribly complex. (Especially if you want to validate XML as well; you need to set environment variables differently between the two uses.)
Re:Does anyone ever... (Score:1)
You've got to be kidding me. You've never used the W3C validator? I couldn't live without that thing...
http://validator.w3.org [w3.org]
er, yes. (Score:3, Informative)
Is a great tool.
If your code is valid HTML, then when anyone complains that browser X doesn't render it properly, that's your first line of defense.
Re:Does anyone ever... (Score:1, Interesting)
Re:Does anyone ever... (Score:2)
Yes, I do, all the time.
The current site I'm designing for gets about 35,000 visitors a day, and it's going to be XHTML 1.1 (served as application/xhtml+xml to accepting clients, no less - the negotiation logic is sketched below) with a full CSS layout (with the XHTML being semantically rich, so the CSS isn't strictly required; no DIV/SPAN soup), and hopefully level AAA on the Web Content Accessibility Guidelines 1.0.
I do the same for tiny sites too; the latest being a site for a diving club.
I have noticed a trend towards larger sites redesigning for XHTML and CSS recently; what was the trend for personal sites now seems to be migrating up the hierarchy to larger sites such as Wired [wired.com] and AllTheWeb [alltheweb.com]. I don't expect this trend to reverse.
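For the curious, "served as application/xhtml+xml to accepting clients" boils down to checking the HTTP Accept header. A deliberately naive sketch of that decision (q-values and edge cases are ignored; how you read the header and set the response type depends on your server environment):

    // Decide which MIME type to serve an XHTML page under, based on the Accept header.
    function chooseXhtmlType(acceptHeader) {
      var accept = (acceptHeader || "").toLowerCase();
      if (accept.indexOf("application/xhtml+xml") != -1) {
        return "application/xhtml+xml";   // the client explicitly accepts XHTML
      }
      return "text/html";                 // fall back for IE and friends
    }

    // chooseXhtmlType("text/html,application/xhtml+xml,*/*") -> "application/xhtml+xml"
    // chooseXhtmlType("text/html")                            -> "text/html"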
Re:Does anyone ever... (Score:2)
Yes, used sensibly to denote sections, since HTML provides no better way to mark them up yet.
There's nothing wrong with using DIV and SPAN, it's just when that's all you have that things get questionable.
Compare: To: Now, which do you think has more semantic meaning and will degrade better?
With CSS, both can easily be made to render identically, but the second, non-DIV-and-SPAN-soup version degrades much better. Unfortunately, a worrying number of people seem to think the former method is what CSS is all about - the default Movable Type [movabletype.org] templates are a good example of this brain-damaged view of HTML.
Eh? (Score:5, Funny)
Soon to be followed by the Acronym Formation Policy (FAP)?
Re:Eh? (Score:1, Insightful)
Maybe WOL really was right!
Re:Eh? (Score:1)
Not "Proposed" Recommendation anymore, it's final (Score:3, Informative)
XML-Signature XPath Filter 2.0 is a final W3C Recommendation, not proposed.
-m
Standards (Score:2)
the last hope of the doomed (Score:1)
Sorry... (Score:4, Informative)
One simple example: innerHTML. This 'property' is not part of ANY W3C draft, yet many, many websites use it because both IE and Mozilla (Netscape) support it.
Even though M$ is on the committee, their own browser still has plenty of features that are not defined in XHTML 1.0, DOM (level 2 or 3), CSS or whatever. And of course 99% of all web 'developers' are more than happy to use these features.
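To make the innerHTML example concrete, here is the same fragment produced both ways (a sketch; the element id "status" is invented for the example):

    var target = document.getElementById("status");

    // The popular shortcut: supported by both IE and Mozilla,
    // but not defined in any W3C DOM specification.
    target.innerHTML = "Build <em>finished</em>.";

    // The equivalent using only standard DOM Level 1 methods:
    while (target.firstChild) {
      target.removeChild(target.firstChild);   // clear any existing content
    }
    target.appendChild(document.createTextNode("Build "));
    var em = document.createElement("em");
    em.appendChild(document.createTextNode("finished"));
    target.appendChild(em);
    target.appendChild(document.createTextNode("."));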
Re:Sorry... (Score:2)
look here for more info [developer-x.com]
DOM-2 irrelevant to cross-browser issues (Score:2, Informative)
As long as you do things strictly DOM-1 way, current browsers have been working pretty much predictably for quite some time. I develop sophisticated DHTML and test it in IE, Mozilla and Opera, and I never have a problem as long as I use only DOM methods (which can sometimes be quite limiting, but bearable overall).
A lot of people still do pre-DOM legacy DHTML because they have to make 4.x-compatible sites, but that's another story. DOM-2 may be more featureful, but it doesn't promise to make cross-browser development any easier. Indeed, it can make things harder if it isn't implemented accurately and promptly across different browsers. Given the weaker incentive to implement it (DOM-1 is OK for most things), I find that quite possible.
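A typical shape for that kind of strictly-DOM-1 code, with a feature test up front instead of browser sniffing (a generic sketch, not the poster's actual code):

    // Test for the DOM methods once, then use only what the test guarantees.
    var hasDom = !!(document.getElementById &&
                    document.createElement &&
                    document.createTextNode);

    function appendNotice(text) {
      if (!hasDom) return;                      // degrade silently on 4.x-era browsers
      var div = document.createElement("div");
      div.appendChild(document.createTextNode(text));
      document.getElementsByTagName("body")[0].appendChild(div);
    }

    // appendNotice("Rendered via DOM-1 methods only - works in IE, Mozilla and Opera.");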
W3C: stop now (Score:3, Interesting)
Of course, some client-side code is useful, but unfortunately, the major contenders have dropped the ball on that one. The W3C has given us JavaScript+DOM+CSS+..., but it's way too complicated for the vanishingly small amount of functionality, and nobody has managed to implement it correctly; in fact, I doubt anybody knows what a correct implementation would even mean. Flash has become ubiquitous, but it just isn't suitable for real GUI programming and is effectively proprietary. And Java could have been a contender, but Sun has chosen to keep it proprietary, and the once small and simple language has become unusably bloated.
But, hey, that means that there is an opportunity for better approaches to client-side programming. Curl might have been a candidate if it weren't for the ridiculous license. But someone outside the W3C will do something decent that catches on sooner or later.
Re:W3C: stop now (Score:1)
Should everyone just copy whatever Microsoft comes up with, because, let's face it, they have the largest user base? Somehow I don't see people here appreciating that.
I mean sure, you can say "wah wah, Microsoft didn't follow the standards, wah wah, Opera doesn't do this yet, this standards system is flawed!" but if there is no reference point for any of these things, how could you possibly expect things to improve?
One thing that's obvious is that these technologies are needed, not just silly ideas implemented by bored programmers. So if they're going to exist, then better that an appropriate committee come up with workable drafts than that a lone company go ahead and do whatever it feels like. (Heck, that's one of the main reasons MS came up with so much funky spec-breaking stuff - call it embrace and extend if you want, but they wanted to do things before the standards were there, which is why we have this mess.)
Re:W3C: stop now (Score:4, Interesting)
Everybody is, for practical purposes. Who do you think is dreaming up a lot of the stuff that comes out of the W3C? Look at the authorships of the standards. And if you sit in those meetings, you'll quickly see that Microsoft doesn't often take "no" for an answer.
Microsoft has even told us why they like their standards to be complicated: they believe that if they just make them complicated enough, nobody but them can implement them. Of course, Microsoft's reasoning is at the level of Wile E. Coyote, with open source being the Road Runner, but what can you do.
One thing that's obvious is that these technologies are needed,
We have a problem with creating dynamic web content, but the current crop of W3C standards for addressing that problem isn't working; it has turned into a Rube Goldberg contraption. Someone needs to start from scratch, and the W3C appears to be incapable of doing it.
If we don't have someone like the W3C putting this stuff in writing somewhere, how else are we going to have a hope in hell of browsers talking to each other?
Of course, things need to get written down and standardized. But the way standards are supposed to work is that people try things out in practice, whatever works well survives in the marketplace or among users, people create multiple implementations, then people get together and work out the differences among the implementations, then it all gets written up as a standard, and finally everybody goes back and makes their implementations standards compliant. It's a long, tedious process, but it does result in reasonable standards that real people can actually implement.
What the W3C is often doing is using its position to create completely unproven systems on paper and let the rest of the world figure out how to deal with it. Or, worse, the W3C is used by powerful companies to push through "standards" that haven't stood the test of time and for which only they themselves have a working implementation. If you give that kind of junk the stamp of approval of a standards body, you make things worse, not better.
Re:W3C: stop now (Score:2)
Huh? JavaScript is the Mozilla implementation of ECMAScript, a standard (not W3C) invented by Netscape. The DOM was also a Netscape idea, now standardized. CSS was originally proposed and largely designed by a guy from Opera. There are quite a few implementations out there actually, the idea that W3C technologies are too large to implement is crazy. Look at Mozilla, Amaya, even Konqueror is getting there now.....
The W3C should have stopped with a full specification of HTML. Anything they have been doing beyond that has been doing more harm than good. The web succeeded because HTML was simple.
Yes, and now that it's ubiquitous, do you really think we need to keep it simple? Being simple was great when the web was small; it let it grow very quickly. Why should we keep it simple now? Just for the sake of it? I'd rather have power. If that means there are only 3 or 4 quality implementations as opposed to 20, then so be it.
The world is not a simple place, and the things we want to do with the web nowadays aren't simple either. If you want simplicity then feel free to write a web browser that only understands a subset of the standards, they are layered so people can do this. Just bear in mind that it won't be useful for browsing the web, because a lot of people like powerful technologies and use them.
Re:W3C: stop now (Score:2)
Yes, but the W3C gave it its blessing and built lots of other standards on it.
Why should we keep it simple now? Just for the sake of it? I'd rather have power. If that means there are only 3 or 4 quality implementations as opposed to 20, then so be it.
You are confusing complexity with power. The W3C standards are complex, but they aren't powerful. And that's the problem. Despite all the junk coming out of the W3C, it's still basically impossible to do reliable animations, drag-and-drop, document image display, editing, and other commonly desired things in web browsers.
I want to do complex things, but after 10 years, the W3C standards still don't support it.
The world is not a simple place, and the things we want to do with the web nowadays aren't simple either.
Yes, and the W3C fails to meet the needs of people who want to do complex things. All the W3C does is provide people with ever more complex ways of doing simple things. That is not progress.
If you want simplicity then feel free to write a web browser
More likely, there are new web browsers and plugins around the corner that build on HTML/XHTML, but come up with better ways of doing the other stuff. It's harder now than it was 10 years ago, when any kind of bogus idea could succeed, but it's still possible. Perhaps Curl could have succeeded in that area if they had open sourced it. But something else will come along.
Who needs W3C standards (Score:2)
Re:Who needs W3C standards (Score:2, Informative)
Now, if you were talking about SOAP...
It is not only Web development (Score:1, Insightful)
OWL is about information retrieval, and 'XML-Signature XPath Filter' is about document signing.
The DOM stuff is no longer just a Dynamic HTML thing. The DOM is important because it is actively used to manage XML documents, and the earlier specifications are very clumsy because they are a compromise between the previous browser-specific de facto standards.
Maybe there is a need to develop some simple DOM API from scratch instead of adding levels on top of a compromise approach. And again, as said above, give it a reference implementation to start with.
Vokimon
Ontologies (Score:2)
Re:Ontologies (Score:2)
I'm glad the W3 is there, pushing this, but I hope ontologies aren't just a web fad and can mature to a usable component of 'the semantic web'.
(Of course, I also hope to one day get back to school or get a job that pays more than my expenses.)
Standards *DO* work. (Score:5, Insightful)
MSDN clearly marks out which functions are standard and which version of HTML/DOM they comply with.
Mozilla is almost compliant by default, because the standards are the only thing they have to work from and they don't have an agenda like interoperation with MS Office/FrontPage.
Standards compliance does work, it's the lazy/inept authors of web pages that are to blame for faulty product resulting from an ad-hoc approach to web page development.
Then again, like the saying goes: "A bad workman always blames his tools..."
Re:Help, About (Score:1)