Saving Bandwidth Through Standards Compliance, Pt. 2

elijahao writes "If you found part one of the interview with Mike Davidson of ESPN interesting, the second part has been posted today."
  • Redesigning (Score:3, Insightful)

    by Anonymous Coward on Saturday March 29, 2003 @02:08PM (#5622840)
    Having recently been involved in the redesign of a large site using a CMS, I can sympathize with the issues at work. The biggest struggle I faced was that our company used an outside design firm to come up with a "look." There was little understanding of the issues around building a template-driven site, and they came up with a design totally unsuited to the project. We (the IT department) were given a handful of layered Photoshop files and expected to code behind them.
    • Whoever hired this group to deliver a look without HTML to back up their approach should be put in charge of something they understand.
    • Oh, the joys of working with ignorant pixelpushers...
      And don't try to tell them that what they cooked up won't work. Even if it's just a small change in the layout to make it work in the template without resorting to voodoo, they get all worked up, as if you were personally insulting them.

      What they don't know is that the real insults start as soon as they are off the phone :-)
  • Bad example (Score:4, Insightful)

    by JimDabell ( 42870 ) on Saturday March 29, 2003 @04:05PM (#5623368) Homepage

    One step forward, two steps back:

    Positioning footers is a huge Achilles heel of absolute positioning. It is ridiculous that you cannot embed three absolutely positioned columns within a master div and then position a footer below that master div. This is a well known problem of absolute positioning and there are a few workarounds, none of which are very elegant.

    Actually, it's dead simple to do this with css 2. Unfortunately, Internet Explorer doesn't support a decent amount of css 2. Having said that, there are plenty of workarounds that work in Internet Explorer that aren't anywhere near as bad as this:
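
    One commonly cited workaround - not necessarily the one ESPN evaluated - is to float the columns instead of absolutely positioning them, and let the footer clear them. A minimal sketch, with invented class names:

      <style type="text/css">
        .col { float: left; width: 33%; }
        #footer { clear: both; }
      </style>
      <div>
        <div class="col">left</div>
        <div class="col">centre</div>
        <div class="col">right</div>
      </div>
      <div id="footer">partner footer</div>

    Because the columns are floated, clear: both pushes the footer below whichever column runs longest, at any font size the reader picks.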

    The workaround we settled on for the front page was simply positioning our partner's footer a concrete pixel value from the top of the screen. Since our front page is always roughly the same length, we don't need to worry about any of our columns creeping down into the footer.

    Excuse me? How on earth can they possibly know how high their home page is? That would depend on the size of the text, which depends on the font size I've picked to surf with.

    Then there's validation. Telling me my site needs to validate in order to be standards-compliant is like telling me I need a flag in my lawn to call myself an American.

    What a fucking idiot. Validation is a mechanical syntax check of the document. If your site doesn't validate, you aren't conforming to the rules of HTML/XHTML. It's more like saying he needs to be an American citizen to call himself an American.

    For a simple, small, text-heavy site like a blog, validation may come relatively easily, but when you have a site like ours which dynamically writes out a lot of content, uses third-party statistical tracking, makes liberal use of Flash, and offers complex and flexible advertising modules, validation is simply a pie in the sky.

    Okay, let's take these things one at a time:

    Sometimes we dynamically open divs and other tags with document.write and the validator can't figure out why we're closing a tag which appears not to be open.

    If you are closing an element (not tag), then it had better be open. If you open the element via a script, close it via a script, otherwise you are not following the specifications. The validator can't "figure it out" because it isn't compliant code. This guy seems to think that the use of client-side scripting somehow makes invalid documents magically valid.
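
    If the element only exists when scripting is on, open and close it from the script, and the validator never sees an orphaned end tag. A sketch (the id and the adMarkup variable are invented):

      <script type="text/javascript">
        // Both tags come from the script, so the static markup
        // the validator parses contains neither of them.
        document.write('<div id="ad-slot">');
        document.write(adMarkup);   // must itself be valid markup
        document.write('<\/div>');  // escaped so "</" can't end the script block
      </script>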

    Our ad server requires us to send ampersand-delimited variables to it which are not URL-encoded and the validator treats any ampersands in your code as invalid.

    It's a one-liner in most languages to fix this. If you are using a third-party ad server, ask them to give you compliant code; it should be part of your contract anyway, to reduce business risk.
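
    For instance, in the static markup (the ad-server URL here is made up):

      <!-- Invalid: the validator reads the bare & as the start of an entity reference -->
      <img src="http://ads.example.com/serve?site=espn&page=front">

      <!-- Valid: the browser decodes &amp; back to & before requesting the URL -->
      <img src="http://ads.example.com/serve?site=espn&amp;page=front">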

    Our statistical tracking code puts id attributes to certain script tags, which the validator claims is not valid.

    Sounds like exactly the same thing. Ask your suppliers to give you code that follows the specifications.
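
    For what it's worth, HTML 4.01's DTD doesn't allow id on the script element at all (XHTML 1.0's does, if memory serves), so one stopgap is to hang the id on a wrapper the tracking code can look up instead. A sketch with an invented id and URL:

      <div id="stat-tracker">
        <script type="text/javascript" src="http://stats.example.com/track.js"></script>
      </div>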

    We sometimes do not include alt tags for images which aren't important unless they are physically seen. Some people will say "Just include alt=''", but I simply don't agree with including alt tags for the heck of it.

    Well, existing user-agents treat empty alt attributes differently from missing alt attributes, and for good reason. It may mean little to him because he doesn't use that software, but others do. That is why you follow specifications: so all user-agents get a good deal.
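
    Concretely (the image name is invented): alt="" tells a text browser or screen reader the image is pure decoration and can be skipped, while a missing alt forces it to guess, which is why Lynx falls back to printing noise like [INLINE] or the filename:

      <!-- Decorative: empty alt means "nothing to announce here" -->
      <img src="corner-trim.gif" alt="">

      <!-- Missing alt: text browsers fall back to [INLINE] or the filename -->
      <img src="corner-trim.gif">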

    We display all of our Flash elements using a home-spun JavaScript delivery method which is way more flexible and compatible than even the method Macromedia recommends.

    • The only downside is that it doesn't validate. Boohoo.

      How can ESPN.com be touted as a site that is "saving bandwidth through standards compliance" when they're not standards compliant? It's possible to do all the absolute-positioning and other CSS tricks without making the site completely standards non-compliant.

      I think the intentions are noble (encouraging upgrades to compliant browsers, reducing page weight with less code), but it seems like somebody didn't finish the job. That's fine if that's what
      • They're not saving bandwidth either, really. The page is still loading in another window from their slow server; it's up to something like 220K at this point.

        What's rendered so far doesn't look so hot.

        I applaud the effort, but fixed pixels are never a good solution. The whole idea behind standards compliance is separating content from presentation. The side effect is that you don't have absolute control over the presentation as an author. People just need to get over it. If they want absolute control ov
    • Re:Bad example (Score:3, Interesting)

      by sigwinch ( 115375 )
      Indeed. I'm looking at ESPN.com on a 1600x1200 screen under a recent Mozilla, and it is an unreadable, shitty looking pile of dreck:
      • Text hanging across columns
      • Inter-line spacing so small that the characters of one line physically overlap the previous line
      • Ugly line breaks in the scores sidebar
      • Content boxes that stick down too far and chop off the top of the box below
      • Boxes that have their bottom part chopped off by the box below (they screwed it up both ways)
      • Shitty Javascript menus with expander buttons
      • I'm looking at it on a 1280x1024 screen, in Mozilla 1.3, and I see none of the problems you describe (except for the colors in the lite site - that yellow needs to go). Perhaps it's time to upgrade your pre-1.0 Moz build?

        I'd imagine the reason it goes to espn.go.com is the same reason all C|Net sites go to .com.com - so a single cookie can be used across multiple sites on their network.
        • Perhaps it's time to upgrade your pre-1.0 Moz build?

          I'm running 1.0.1.

          I'm looking at it on a 1280x1024 screen, in Mozilla 1.3, and I see none of the problems you describe...

          Crank your resolution up to 1600x1200 and set the font to a comfortable size: the site disintegrates into unusability. Leave the original fonts alone and the characters are 8 pixels tall--small enough to draw 120 lines of text on the screen--which is hideously painfully small.

          ESPN.com is simply an amateurish disaster, design

          • ESPN.com is simply an amateurish disaster, designed by people who know little about standards, browsers, or usability.

            I wouldn't be that harsh...though they aren't a good example of standards compliance. On a practical note, even at 1280x1024 with Mozilla 1.3 the text chop/overlap problem is obvious. Just increase the font size once or twice and look at the menus.

            BTW...what's the point of using a fixed-width page? Why not use variable-width columns?

          • You need to fix the DPI (dots per inch) setting on your monitor, then.

            A 12pt font should be the same size across all screens and all resolutions.

            If your vid. card's drivers don't come with the ability to change the DPI setting, then it's time to get a better vid. card (or possibly OS). Shoot, even my ancient ATI Rage 128 Pro drivers let me do this.

            At home I have a 20" IBM P202 monitor running off an ATI Rage 128 Pro card at 1280x1024 and the fonts are set at standard 12pt -- I just cranked the DPI up to
            • The other reply is correct: ESPN.com specifies absolute pixel sizes.

              The DPI setting on XFree86 is erratic anyway, but that's beside the point.
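
              The difference is easy to see in CSS (the selector is invented). Pixel-sized text ignores the monitor's DPI, and IE's View > Text Size can't rescale it either:

                /* Pixel sizing: fixed on screen regardless of DPI or reader preference */
                .story { font-size: 10px; }

                /* Relative sizing: tracks whatever default the reader has chosen */
                .story { font-size: 0.85em; }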

    • Relax, man (Score:3, Interesting)

      Well, this guy is clearly not a "fucking idiot." He simply believes in practical solutions and is not interested in abstract validation. I am a bit more on your side (I find, for instance, opening tags in JavaScript to be a maintenance nightmare), but I respect his approach. All of us know the difficulty of taking a Photoshop document from a designer used to print publishing and turning it into a compliant web page.

      Calm down. He's on your team. Don't be so absolute.

    • Our ad server requires us to send ampersand-delimited variables to it which are not URL-encoded and the validator treats any ampersands in your code as invalid.

      It's a one-liner in most languages to fix this. If you are using a third-party ad server, ask them to give you compliant code; it should be part of your contract anyway, to reduce business risk.

      There's no need to bother with the third-party ad server. Just replace & with &amp; in the href and src attribute values as you would anyw

  • Annoying! *groan* (Score:2, Interesting)

    by usotsuki ( 530037 )
    Why does everything have to be candy-coated and designed in ways that discriminate against those of us who would use Mozilla if we had the choice but are hog-tied (at our public library) into using Netscape 4.7? It's only two years old, guys. It isn't like it's hard to code a page that will look correct in NS4. I'll go further and say that a decent Web page should be 100% viewable in ANY browser, not just the latest cream of the crop. Got Netscape 2.01 on one of those old 603e machines? Logged into som
    • Re:Annoying! *groan* (Score:3, Interesting)

      by h3 ( 27424 )

      Netscape 4.7? It's only two years old, guys. It isn't like it's hard to code a page that will look correct in NS4

      Well, while the specific dot version may only be two years old, I believe the NS4 series was released in the '97-'98 timeframe, making the codebase in the area of 5-6 years old! That's half the age of the web!

      And no, it's not hard to code a page that will look correct in NS4. It is hard to code a page that will look correct and good, and do so in the most recent browsers, and use proper a

    • .. Netscape 4.7? It's only two years old, guys.

      XHTML is 3 years old.
      CSS 1 is 7 years old.

      And neither standard appeared overnight.
    • If a page isn't viewable in Lynx, that's the coder's fault. All my pages are viewable with Lynx, *if I can help it*.

      So, your pages aren't viewable in Lynx 100% of the time then, is what you're saying.

      Netscape 4.7? It's only two years old, guys. It isn't like it's hard to code a page that will look correct in NS4.

      Yes, it is - at least if, as someone else pointed out, the page also has to look right in modern browsers. And no, NS4.7 isn't two years old.

      http://www.blooberry.com/indexdot/history/browsers.htm

      NS
  • One excellent stooopidity of the validator that he points out is the practice of not allowing &-delimited variables in URLs. (Look at your /. URL and you'll see some.)

    This is the reason that my website [sillytech.com] does NOT validate [w3.org].

    I wanted to validate. I tried to validate. But the ampersands screwed me.

    • Well, duh, you should be writing "&amp;" instead of "&" in the URLs.
      • Yeah, I could do that, but why should I have to?

        What's the problem with &'s in URLs?

        • & in HTML (Score:3, Informative)

          by LiamQ ( 110676 )
          & begins an entity or character reference in HTML, so a literal & needs to be escaped as &amp;. Otherwise, you would have confusion in a case such as href="foo.cgi?bar=baz&copy=yes" (which is valid HTML but probably not what the author intended with that copyright sign in the URI).
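
          Spelled out (same example, fixed):

            <!-- The parser may read "&copy" as the copyright-sign entity -->
            <a href="foo.cgi?bar=baz&copy=yes">broken</a>

            <!-- Escaped: the browser decodes &amp; back to & before following the link -->
            <a href="foo.cgi?bar=baz&amp;copy=yes">fixed</a>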
    • You just need to change the & character to &amp; in your attributes. This is because HTML entities such as &eacute; are allowed in attribute values just as in normal text. So when you want a plain ampersand you have to escape it.

      BTW, some web application libraries (such as Perl's CGI.pm) are moving to a newer style of URL that uses semicolons rather than ampersands to separate the parameters.
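
      So the same link can be written either way; a sketch (CGI.pm splits parameters on both & and ;):

        <!-- Classic separator, escaped for HTML -->
        <a href="foo.cgi?bar=baz&amp;copy=yes">...</a>

        <!-- Semicolon separator: nothing to escape in the first place -->
        <a href="foo.cgi?bar=baz;copy=yes">...</a>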
  • by ajwade ( 662555 )

    The absolute positioning trick destroys the layout in Galeon (I've got the minimum font size set to 22 for the sake of my sanity). The left-hand column overlaps the centre column (although Gecko should arguably character-wrap to prevent that), and some of the text in the boxes on the right is missing because it doesn't fit. And the only reason the line spacing isn't far too small is because I've overridden it in my user stylesheet to fix similarly brain-damaged sites. The "lite" site isn't much better.

    To be
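
    (For anyone wanting the same fix: in Mozilla-based browsers a user override of that sort goes in the profile's chrome/userContent.css. The exact value is whatever suits your eyes; 1.3 here is just an example:)

      /* Force readable line spacing everywhere, over the page's own CSS */
      * { line-height: 1.3 !important; }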

  • I commend their development efforts, but such a task is not easy. I am a web developer too. I write in XHTML Strict sometimes, and XHTML Transitional most of the time.

    http://www.froggy.com.au/mike.skinner/Mike Skinner - Resume.htm

    If I am building a site for a target browser and version (MSIE 6 on an intranet, etc), I will build to XHTML Strict, just to keep my brain active. XHTML Strict is a pain in the butt; some things are virtually impossible to do (or the workarounds are not elegant).

    Otherwise I like to
