Google To Promote Web Speed On New Dev Site

CWmike writes "Google has created a Web site for developers that is focused exclusively on making Web applications, sites and browsers faster. The site will allow developers to submit ideas, suggestions and questions via a discussion forum and by using Google's Moderator tool. Google hopes developers will join it in improving core online technologies such as HTML and TCP/IP. For Google, a prime example of how Web performance can be enhanced is the development of HTML 5, which the company believes provides a major improvement in how Web applications handle JavaScript. 'We're hoping the community will spend some time on the basic protocols of the Internet,' Google product manager Richard Rabbat said. 'There's quite a bit of optimization that can be done [in that area].'"

  • by Anonymous Coward on Wednesday June 24, 2009 @10:48AM (#28452429)

    Come on now. The price of downloading HTML and JavaScript source is peanuts compared to images and Flash animations. The solution is better web design, not another layer of complexity in the process. There is no shortage of low-hanging fruit to be picked here. Metric tons, you could say.

  • Re:It's a plague. (Score:3, Insightful)

    by gmuslera ( 3436 ) on Wednesday June 24, 2009 @10:52AM (#28452483) Homepage Journal
    I remember when the recommendation was that your webpage in total (counting all the resources it includes: code, graphics, etc.) shouldn't weigh more than 50KB. What is the average total page size today? 500KB? 1MB? And that's before counting the cost of loading lots of separate resources: the main page, style sheets, JavaScript files, and graphics both small and large (and it only gets worse with Flash apps and movies).

    Technology is advancing (I think I read somewhere that JS processing is up to 100x faster in modern browsers), and there are a lot of developer tools that give advice on how to improve the responsiveness of your site (yes, most of them linked from that Google site), so maybe the good parts of the web could improve their speed in the near future.
  • by reed ( 19777 ) on Wednesday June 24, 2009 @11:36AM (#28453147) Homepage

    The number one slowdown I see on pages is linking to all kinds of external resources: images, Flash movies, iframes, CSS, bits of JavaScript. Each of these requires at least another DNS lookup and a new HTTP connection, and often those external servers take a really long time to respond (because they're busy doing the same for all the other websites using them). Why is this going on in each user's browser? It should all be done behind the scenes on the web server (a rough sketch of one way to do that follows this comment). Why would you put the basic user experience of your users or customers in the hands of random partners who are also doing the same for competing sites? It takes some load off your server, but I think the real reason people link in external resources as images, objects, etc. is simply that it's easier than implementing it in the back end. If you really want to offload work, then design a mechanism that addresses that need specifically.

    We've ended up with a broken idea of what a web server is. Because it was the easiest way to get started, we now seem to be stuck with the basic notion that a web server is something that maps request URLs directly to files on the server's hard disk, which are either returned as-is or executed as scripts. This needs to change (and it is changing a little, as those "CGI scripts" have evolved into scripts built on real web app frameworks).
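
    As a rough illustration of "doing it behind the scenes on the web server" (my own sketch, not anything Google or the parent describes; the URLs, filenames and paths are made-up assumptions), a deploy-time Python script could mirror third-party assets onto your own host, so each visitor's browser never has to make those extra DNS lookups and connections:

        # Hypothetical build/deploy step: fetch the third-party assets you would
        # otherwise hot-link and serve them from your own host instead.
        import pathlib
        import urllib.request

        # Example URLs only -- substitute whatever external assets your pages embed.
        EXTERNAL_ASSETS = {
            "analytics.js": "https://partner.example.com/analytics.js",
            "widget.css": "https://cdn.example.net/widget.css",
        }

        def mirror_assets(dest_dir="static/vendor"):
            """Download third-party assets once at deploy time, not once per visitor."""
            dest = pathlib.Path(dest_dir)
            dest.mkdir(parents=True, exist_ok=True)
            for name, url in EXTERNAL_ASSETS.items():
                with urllib.request.urlopen(url, timeout=10) as resp:
                    (dest / name).write_bytes(resp.read())

        if __name__ == "__main__":
            mirror_assets()

    Concatenating the mirrored scripts into one file, or inlining the small ones, takes the same idea further: fewer requests and fewer round trips per page view.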

  • by quanticle ( 843097 ) on Wednesday June 24, 2009 @12:07PM (#28453683) Homepage

    The problem with gzip compression (in this case) is that it's not lossy. All of the "unnecessary" things that you have (e.g. the unneeded closing tags on some elements) will still be there when you decompress the transmitted data. I think the grandparent wants a compression algorithm that's "intelligently lossy"; in other words, smart enough to strip off all the unneeded data (comments, extra tags, etc.) and then gzip the result for additional savings.
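
    To make that two-step idea concrete, here's a minimal sketch (my illustration, not from TFA): first throw away bytes the browser never needed, then gzip what's left. The regexes are deliberately naive -- a real minifier has to respect <pre> blocks, inline <script> contents, and so on -- and "index.html" is just an assumed sample file:

        import gzip
        import re

        def naive_minify_html(html):
            """Strip HTML comments and collapse whitespace (deliberately simplistic)."""
            html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)  # drop comments
            html = re.sub(r">\s+<", "><", html)                      # whitespace between tags
            return re.sub(r"\s{2,}", " ", html).strip()              # collapse the rest

        if __name__ == "__main__":
            page = open("index.html", encoding="utf-8").read()
            for label, text in (("gzip only", page), ("minify + gzip", naive_minify_html(page))):
                print(label, len(gzip.compress(text.encode("utf-8"))), "bytes")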

  • by quanticle ( 843097 ) on Wednesday June 24, 2009 @12:17PM (#28453839) Homepage

    From TFA:

    Sometimes PHP novices attempt to make their code "cleaner" by copying predefined variables to variables with shorter names. What this actually results in is doubled memory consumption, and therefore, slow scripts.

    It seems to me that this is a flaw in the PHP interpreter, not the PHP programmer. The way I see it, the interpreter should be lazily copying data in this case. In other words, the "copy" should be a pointer to the original variable until the script calls for the copy to be changed. At that point the variable should be copied and changed. I believe this is how Python handles assignments, and I'm surprised that PHP does not do it the same way.
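
    For what it's worth, here is a small Python illustration of the "don't copy until you have to" idea (my example, not from TFA). Python's approach is plain aliasing rather than true copy-on-write -- a change made through either name is visible through both -- but it shows why giving a value a second, shorter name need not cost any extra memory:

        import copy

        big = ["x"] * 1_000_000     # a sizeable list
        alias = big                 # no data is copied: both names refer to one object
        print(alias is big)         # True -- one list, two names, no extra memory

        alias.append("y")           # visible through big as well: aliasing, not copying
        real_copy = copy.copy(big)  # a second list is allocated only when asked for
        print(real_copy is big)     # False

    Whether PHP's interpreter does something equivalent under the hood is exactly the question the parent raises.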

  • by JohnnyBGod ( 1088549 ) on Wednesday June 24, 2009 @01:03PM (#28454669)

    WHOOSH

  • by Tokerat ( 150341 ) on Wednesday June 24, 2009 @01:55PM (#28455625) Journal

    If you save 320 bytes per file, serving 200 different files 750,000 times per day each (imagine some HTML docs that load a bunch of images, JavaScript, and CSS), that's 1.3TB over the course of 30 days. It adds up fast.

    320 was chosen out of the air, as the total length of removed JavaScript comments (320 bytes is the content of two full SMS messages), trimmed image pixels, or extraneous tabs in an HTML document. Of course some files will see more page hits than others, some days will see less traffic on the site, and some files/file types are likely to be reduced by different amounts. The question still remains: wouldn't you like to reduce your bandwidth bill and make your users happier with your site, all at the same time? With less traffic, maybe you don't need to bother; 500 hits/day certainly paints a different picture (915MB/month). But upper-mid-sized sites that rely on leased hosting should really be keeping an eye on this, and it certainly would be good netiquette for everyone to ensure optimized traffic.
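
    A quick back-of-the-envelope check of those figures (same assumptions as above: 320 bytes saved per file, 200 files, 30 days), in Python:

        BYTES_SAVED_PER_FILE = 320   # "chosen out of the air", per the comment above
        FILES = 200
        DAYS = 30

        def monthly_savings_gib(hits_per_file_per_day):
            """Total bytes saved over 30 days, expressed in GiB."""
            total_bytes = BYTES_SAVED_PER_FILE * FILES * hits_per_file_per_day * DAYS
            return total_bytes / 2**30

        print(f"{monthly_savings_gib(750_000) / 1024:.2f} TiB/month")  # ~1.3 TiB
        print(f"{monthly_savings_gib(500) * 1024:.1f} MiB/month")      # ~915 MiB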
