Google To Host Ajax Libraries

ruphus13 writes "So, hosting and managing a ton of Ajax libraries, even when working with MooTools, Dojo, or Scriptaculous, can be quite cumbersome, especially as they get updated along with your code. In addition, several sites now use these libraries, and the end-user has to download the library each time. Google will now provide hosted versions of these libraries, so users can simply reference Google's hosted version. From the article, 'The thing is, what if multiple sites are using Prototype 1.6? Because browsers cache files according to their URL, there is no way for your browser to realize that it is downloading the same file multiple times. And thus, if you visit 30 sites that use Prototype, then your browser will download prototype.js 30 times. Today, Google announced a partial solution to this problem that seems obvious in retrospect: Google is now offering the "Google Ajax Libraries API," which allows sites to download five well-known Ajax libraries (Dojo, Prototype, Scriptaculous, Mootools, and jQuery) from Google. This will only work if many sites decide to use Google's copies of the JavaScript libraries; if only one site does so, then there will be no real speed improvement. There is, of course, something of a privacy violation here, in that Google will now be able to keep track of which users are entering various non-Google Web pages.' Will users adopt this, or is it easy enough to simply host an additional file?"
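
For the curious, using the service looks roughly like this; a minimal sketch following Google's announced URL pattern (treat the exact version number as illustrative):

    <!-- Option 1: reference the library straight from Google's servers -->
    <script type="text/javascript"
            src="http://ajax.googleapis.com/ajax/libs/prototype/1.6.0.2/prototype.js"></script>

    <!-- Option 2: use the loader API -->
    <script type="text/javascript" src="http://www.google.com/jsapi"></script>
    <script type="text/javascript">
      google.load("prototype", "1.6");  // fetches the same Google-hosted copy
    </script>

Either way, every site that does this ends up pointing at the same URL, which is what lets the browser's cache do its job.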
  • Yabbut (Score:3, Interesting)

    by FlyByPC ( 841016 ) on Wednesday May 28, 2008 @10:35AM (#23570441) Homepage
    Yeah, but what if Google decides that nobody is using these -- or they can't legally host them for whatever reason -- or they just decide that they don't want to do this anymore?

    I like Google too -- and this is nice of them -- but I like the idea of a website being as self-sufficient as possible (not relying on other servers, which introduce extra single points of failure into the process).

    At the risk of sounding like an old curmudgeon, whatever happened to good ol' HTML?
  • by causality ( 777677 ) on Wednesday May 28, 2008 @10:37AM (#23570467)
    The "problem" already exists. It's "how can we collect more data about user's browsing habits?" You have to consider that Google is a for-profit business and hosting these files represents a bandwidth cost and a maintainence cost for them. They are unlikely to do this unless they believe that they can turn that into a profit, and the mechanism available to them is advertising revenue.

    This is very similar to the purpose of the already-existing google-analytics.com. I block this site in my hosts file (among others) and I take other measures because I feel that if a corporation wants to take my data and profit from it, they first need to negotiate with me. Since Google is not going to do that, I refuse to contribute my data. To the folks who say "well, how else are they supposed to make money?" I say that I am not responsible for the success of someone else's business model; they are free to deny me access to their search engine if they so choose, and I would also point out that Google is not exactly struggling to turn a profit.

    The "something of a privacy violation" mentioned in the summary seems to be the specific purpose.
  • by Anonymous Coward on Wednesday May 28, 2008 @10:43AM (#23570555)
    That's the idea. AdWords, these "hosted" JS libraries, Urchin/Google Analytics, Google Friend Connect -- Google clearly wants to be involved in every single web "page" that's ever served.

    http://www.radaronline.com/from-the-magazine/2007/09/google_fiction_evil_dangerous_surveillance_control_1.php

  • by Gopal.V ( 532678 ) on Wednesday May 28, 2008 @10:43AM (#23570559) Homepage Journal

    I didn't see a Slashdot article when Yahoo put up hosted YUI packages [yahoo.com] served off their CDN.

    I guess it's because Google is hosting non-Google libraries?

  • by samuel4242 ( 630369 ) on Wednesday May 28, 2008 @10:45AM (#23570581)
    Yahoo already does this with their own YUI libraries. See here [yahoo.com]. Anyone have any experience with this? I'm a bit wary of trusting Yahoo, although I guess it's easy enough to swap it out.
  • by SuperBanana ( 662181 ) on Wednesday May 28, 2008 @10:45AM (#23570585)

    Yeah, but what if Google decides that nobody is using these -- or they can't legally host them for whatever reason -- or they just decide that they don't want to do this anymore?

    Think broader. What happens when:

    • Google decides to wrap more than just the promised functionality into it? For example, maybe "display a button" turns into "display a button and report usage stats"?
    • Google gets hacked and malicious Javascript is included?

    But, yes -- you're right. This is a scary new dependency. For a company full of PhD geniuses supposedly Doing No Evil, nobody at Google seems to understand how dangerous they are to the health of the web. In fact, I'd suggest they do, and they don't care -- because they seem hell-bent on making everything on the web touch/use/rely upon Google in some way. This is no exception.

    A lot of folks don't even realize how Google is slowly drawing open-source projects into reliance on it, too (with Google Summer of Code).

  • by Shakrai ( 717556 ) * on Wednesday May 28, 2008 @10:54AM (#23570703) Journal

    This is very similar to the purpose of the already-existing google-analytics.com. I block this site in my hosts file (among others) and I take other measures because I feel that if a corporation wants to take my data and profit from it

    Do you actually have to block it in your hosts file in order to effectively deny them information? I have it blacklisted in NoScript -- is that sufficient? I'd always thought it was called via Javascript.

  • by Anonymous Coward on Wednesday May 28, 2008 @10:55AM (#23570723)
    A far better solution would be to add metadata to the script tag, which the browser could check to see whether it already has the file. For security reasons you would have to define it explicitly in order to use it, so if you don't define it, there can never be a mix-up.

    E.g.:

    script type="javascript" src="prototype.js" origin="http://www.prototype.com/version/1.6/" md5="..............."

    When another site wants to use the same lib, it can reuse the origin, and the browser will not download the file again from the new site. It's crucial to use the md5 (or another hash), which the browser must calculate the first time it downloads the file; otherwise it would be easy to create a bogus file and get it to run on another site.

    Of course, this approach is only as secure as the hash.
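
    Spelled out as a complete tag, the proposal might look like the sketch below. Note that the origin and md5 attributes are this poster's hypothetical invention -- no browser implements them -- and the placeholder stands in for a real digest of the canonical file:

        <!-- Hypothetical shared-cache script tag. "origin" names the
             canonical copy of the file; "md5" lets the browser verify
             a cached copy before reusing it across sites. -->
        <script type="text/javascript"
                src="prototype.js"
                origin="http://www.prototype.com/version/1.6/"
                md5="(digest of the canonical prototype.js)"></script>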
  • by causality ( 777677 ) on Wednesday May 28, 2008 @11:07AM (#23570889)

    Do you actually have to block it in your hosts file in order to effectively deny them information? I have it blacklisted in NoScript -- is that sufficient? I'd always thought it was called via Javascript.


    The file is indeed JavaScript and it's called "urchin.js" (nice name, eh?). Personally, I use the hosts file because I don't care to have my IP address showing up in their access logs at all. It's not that I think that would necessarily be a bad thing; it's that I don't see what benefit there would be for me, and, as others have mentioned, the additional DNS query and traffic could only slow down the rendering of a given Web page.

    I also use NoScript, AdBlock, RefControl and others. RefControl is nice because the HTTP Referrer is another way that sites can track your browsing; before Google Analytics it was common for many pages to include a one-pixel graphic from a common third-party host for this reason. Just bear in mind that some sites (especially some shopping-cart systems) legitimately use the referrer so you may need to add those sites to RefControl's exception list in order to shop there, as the default is to populate the referrer with the site's own homepage no matter what the actual referrer would have been.
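
    For anyone who wants to do the same, the hosts-file approach is just a few lines; a minimal sketch (these are the commonly blocked Analytics hostnames -- extend the list to taste):

        # /etc/hosts (on Windows: %SystemRoot%\system32\drivers\etc\hosts)
        # Point Google Analytics hostnames at the local machine so the
        # browser never contacts Google's servers at all.
        127.0.0.1    google-analytics.com
        127.0.0.1    www.google-analytics.com
        127.0.0.1    ssl.google-analytics.com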
  • Umm, no (Score:4, Interesting)

    by holophrastic ( 221104 ) on Wednesday May 28, 2008 @11:24AM (#23571103)
    First, I block all google-related content, period. This type of thing would render many sites non-operational.

    Second, I've always had this complaint with external JavaScript files in general. When you're already downloading a 50K HTML page, another 10K of JavaScript code inline in the same file downloads at full speed. The external file requires yet another hit to the server, and everything involved therein. It almost never makes any sense. Even as a locally cached file, on a broadband connection, downloading the extra 10K is typically faster than opening and reading the locally cached file!

    But still, hosting a part of your corporate web-site with google simply breaches most of your confidentiality and non-disclosure agreements that you have with your clients and suppliers. It's that simple. Find the line that reads "shall not in any way disclose Confidential Information to any third party at any time, including consultants and contractors, copy and/or merge the Confidential Information/business relationship with any other technology, software or materials, except contractors with a specific need to know . . ."

    Simply put, if your Confidential client conversations go over gmail, you're in breach. If google tracks/monitors/sells/organizes/eases your business with your clients or suppliers, you're in breach -- i.e. it's illegal, and your own clients/suppliers can easily sue you for giving google their trade secrets.

    Obviously it's easier to out-source everything and do nothing. But there's a reason that Google and other such companies offer these services for free -- it's free as in beer, at the definite cost of every other kind of free; and it's often illegal for businesses.
  • by alta ( 1263 ) on Wednesday May 28, 2008 @12:06PM (#23571751) Homepage Journal

    I don't see what benefit there would be for me
    There are benefits; they just may not be as direct as you'd like, or as appreciated.

    We use Analytics, almost exclusively to improve the experience of our customers. We don't care how many people come to our site. We care how many buy... and we have internal reports for that. What we do care about is:

    • How many people are not using IE? (We found it was worth making sure all pages worked on nearly all browsers.)
    • How many people are at 1280x1024 or over? We dropped the notion that we needed to program for 800x600, thereby letting people use more of those big-ass screens they buy.
    • Where are most of the people located? We now have an east coast and a west coast server.
    • What pages are most viewed? We made them easier to get to.
    • Who doesn't have Flash? It was 2.08%, but I'm still not going to let them put Flash on my site.
  • by dintech ( 998802 ) on Wednesday May 28, 2008 @12:08PM (#23571779)
    That doesn't make any sense. What the GP is saying is that he would like to be able to exclude users who deliberately set out to circumvent his business model. Kudos to him; I hope he finds a way and posts it on Slashdot when he's done.
  • Re:Umm, no (Score:3, Interesting)

    by Bogtha ( 906264 ) on Wednesday May 28, 2008 @12:37PM (#23572269)

    Second, I've always had this complaint with the whole external javascript files. When you're already downloading a 50K html page, another 10K of javascript code in the same file inline downloads at full-speed. The external file requires yet another hit to the server, and everything involved therein. It almost never makes any sense.

    It almost always makes sense. The external file only requires another hit to the server the first time you see it. From that point on, every page hit is smaller in size because you don't have to download the JavaScript again.
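
    A back-of-the-envelope illustration, using the grandparent's own numbers (50K of HTML, 10K of script) and assuming ten page views with a warm cache:

        Inline script:   10 views x (50K + 10K)         = 600K transferred
        External script: 10 views x 50K + 1 x 10K fetch = 510K transferred

    The gap only widens with more page views or a bigger script.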

    Even as a locally cached file, on a broadband connection, downloading the extra 10K is typically faster than opening and reading the locally cached file!

    Downloading 10K is faster than loading 10K from disk? What are you using, floppies?

    Even if it is faster for you, it isn't faster for the website. Try magnifying that extra 10K for tens of thousands of visitors.

    But still, hosting a part of your corporate web-site with google simply breaches most of your confidentiality and non-disclosure agreements that you have with your clients and suppliers. It's that simple.

    If your contracts bar things like this, then they bar a hell of a lot. You can't use any external hosts. You can't use a CDN. You can't use most advertising on your website.

    Find the line that reads "shall not in any way disclose Confidential Information to any third party at any time

    And what confidential information do you think is being disclosed?

  • by jfmiller ( 119037 ) on Wednesday May 28, 2008 @01:43PM (#23573319) Homepage Journal
    The whole idea of having a single URI for these very common .js files is that they can be cached, and not just on your local computer. Any proxy that follows the HTTP/1.1 caching protocol could serve these files out of a local cache.

    Moreover, if this idea catches on, Web browsers will begin shipping with these well-known URIs preinstalled, perhaps even with optimized versions of the scripts that cut out all the IE6 cruft. What is really needed to make this work is a high-bandwidth, high-availability server run by someone with enough name recognition to get themselves on Slashdot. Google sounds like the right choice to me.

    If this works, in 5 years most of the requests for these URIs will never even leave your computer, and you cannot beat that kind of privacy.
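
    For the curious, the exchange that makes this shared caching possible would look roughly like the following (illustrative values following Google's published URL pattern; the exact headers Google sends may differ):

        GET /ajax/libs/prototype/1.6.0.2/prototype.js HTTP/1.1
        Host: ajax.googleapis.com

        HTTP/1.1 200 OK
        Content-Type: text/javascript; charset=UTF-8
        Cache-Control: public, max-age=31536000
        Expires: (one year from the request date)

    Because the URL pins an exact version, the file never changes, so browsers and intermediate proxies alike can safely hold onto it for a full year.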
  • by richardtallent ( 309050 ) on Wednesday May 28, 2008 @01:48PM (#23573409) Homepage
    If someone could hack into the ISP's DNS server, it wouldn't matter where the fake code was being requested from.

    And, frankly, they could do far more dangerous (and easier-pay-off) things than redirect requests for a JavaScript library -- such as redirecting eBay, PayPal, etc. to a phishing site.
  • by Firehed ( 942385 ) on Wednesday May 28, 2008 @02:06PM (#23573659) Homepage
    Well that's just the dilemma. I use Google Analytics on all my sites, and sort of use the information to see what keywords are effective and most common. I don't then turn around and use that information to focus my content on those areas like any smart person would, but I don't really care if someone stumbles across my blog either (much more interesting are HeatMaps, not that I use that information in a meaningful way either).

    However, it's not just Google that's grabbing that kind of information. Anyone with a server is probably keeping referrer logs whether they intend to or not, and some people [ittybiz.com] get a chuckle out of the nonsense. I'd suggest the vast majority do nothing. If Google can get that information by providing a valuable service, and then turn around and create value from doing so at no cost to site owners or visitors, more power to them.
  • by Bobb Sledd ( 307434 ) on Wednesday May 28, 2008 @03:52PM (#23575233) Homepage
    Yes, you've hit the nail on the head.

    But more specifically, my sites are military and all the content must come from trusted military web servers (better if it's the same one for all the content).
  • bindun (Score:2, Interesting)

    by grrowl ( 953625 ) on Wednesday May 28, 2008 @09:33PM (#23580033) Homepage Journal
    One very important metric Google Analytics doesn't capture is "who doesn't have JavaScript enabled?" Another thing to keep in mind: the whole "hosting scripts for global caching" thing was already done by Yahoo! with their YUI libraries, so be sure to apply all your Google-directed conspiracy hate to them as well.
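
    (For what it's worth, you can approximate that missing metric yourself with a noscript fallback; a minimal sketch, where /nojs.gif is a made-up beacon path whose hits you'd count in your server logs:)

        <!-- Browsers with JavaScript enabled never render noscript content,
             so requests for this beacon image count the no-JS visitors. -->
        <noscript>
            <img src="/nojs.gif" width="1" height="1" alt="">
        </noscript>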
  • by Phantom of the Opera ( 1867 ) on Thursday May 29, 2008 @12:39AM (#23581721) Homepage

    Everything from your age, race, language, skin colour, religion, marital status, where you work, what you drive, your education level, your income level, where you went to school, who your friends are, where you vacation, where you shop, your taste in movies, your taste in porn, your taste in books, your politics, whether you have kids... they might even know your face.

    It's a surveillance wet dream.
    It's not that they are specifically tracking *you*; they are tracking your 'type'. It's even more insidious. Ever think about the case where it *is* true? That you are that predictable and actually do fit pretty close to one of a handful of templates? It's a truism that we are individuals, but what if that is less the case than instinct and pride want to allow?
