3 reasons why you should let Google host jQuery for you
By Dave Ward. Updated October 6, 2015.

All too often, I find code similar to this when inspecting the source for public websites that use jQuery:
<script type="text/javascript" src="/js/jQuery.min.js"></script>
If you’re doing this on a public facing website, you are doing it wrong.
Instead, I urge you to use the Google Hosted Libraries content delivery network to serve jQuery to your users directly from Google’s network of datacenters. Doing so has several advantages over hosting jQuery on your server(s): decreased latency, increased parallelism, and better caching.
In this post, I will expand upon those three benefits of Google’s CDN and show you a couple examples of how you can make use of the service.
Just here for the links?
If you’ve already read this post and are just here for the links, you’re in the right place!
If you care about older browsers, primarily versions of IE prior to IE9, this is the most widely compatible jQuery version:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js"></script>
If you don’t care about oldIE, this one is smaller and faster:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.4/jquery.min.js"></script>
Either way, you should use a fallback to a local copy, just in case the Google CDN fails (unlikely) or is blocked in a region that some of your users access your site from (slightly more likely), like Iran or sometimes China. This is an example of using a local fallback with jQuery 2.x:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.4/jquery.min.js"></script>
<script>
  if (typeof jQuery === 'undefined') {
    document.write(unescape('%3Cscript%20src%3D%22/path/to/your/scripts/jquery-2.1.4.min.js%22%3E%3C/script%3E'));
  }
</script>
Now, on to the reasons why you should use these links.
Decreased Latency
A CDN — short for Content Delivery Network — distributes your static content across servers in geographically diverse physical locations. When a user’s browser resolves the URL for these files, their download will automatically target the closest available server in the network.
In the case of Google’s AJAX Libraries CDN, this means that any users not physically near your server will be able to download jQuery faster than if you force them to download it from your arbitrarily located server.
There are a handful of CDN services comparable to Google’s, but it’s hard to beat the price of free! This benefit alone could decide the issue, but there’s even more.
Increased parallelism
To avoid needlessly overloading servers, browsers limit the number of connections that can be made simultaneously. Depending on the browser, this limit may be as low as two connections per hostname.
Using the Google AJAX Libraries CDN eliminates one request to your site, allowing more of your local content to be downloaded in parallel. It doesn’t make a gigantic difference for users whose browser allows six concurrent connections, but for those still running a browser that only allows two, the difference is noticeable.
Better caching
Potentially the greatest benefit of using the Google AJAX Libraries CDN is that your users may not need to download jQuery at all.
No matter how well optimized your site is, if you’re hosting jQuery locally then your users must download it at least once. Each of your users probably already has dozens of identical copies of jQuery in their browser’s cache, but those copies of jQuery are ignored when they visit your site.
However, when a browser sees references to CDN-hosted copies of jQuery, it understands that all of those references refer to the exact same file. Because all of these CDN references point to exactly the same URL, the browser can trust that those files truly are identical and won’t waste time re-requesting the file if it’s already cached. Thus, the browser is able to use a single copy that’s cached on-disk, regardless of which site the CDN references appear on.
This creates a potent “cross-site caching” effect which all sites using the CDN benefit from. Since Google’s CDN serves the file with headers that attempt to cache the file for up to one year, this effect truly has amazing potential. With many thousands of the most trafficked sites on the Internet already using the Google CDN to serve jQuery, it’s quite possible that many of your users will never make a single HTTP request for jQuery when they visit sites using the CDN.
Even if someone visits hundreds of sites using the same Google hosted version of jQuery, they will only need to download it once!
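For reference, Google serves these fully versioned jQuery URLs with far-future caching headers along these lines (taken from a Firebug capture quoted later in the comments; the exact dates will differ):

Cache-Control: public, must-revalidate, proxy-revalidate, max-age=31536000
Expires: Sat, 22 Jan 2011 20:27:03 GMT
Content-Encoding: gzip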
Implementation
Note: At the time I originally wrote this, Google used to recommend this loader approach, but no longer does. I’m leaving this section of the post intact, but you can skip to the next section unless you’re planning on time traveling to 2008 someday.
By now, you’re probably convinced that the Google AJAX Libraries CDN is the way to go for your public facing sites that use jQuery. So, let me show you how you can put it to use.
Of the two methods available, this option is the one that Google recommends:
The google.load() approach offers the most functionality and performance.
For example:
<script src="http://www.google.com/jsapi"></script>
<script>
  google.load("jquery", "1");
  google.setOnLoadCallback(function() {
    // Place init code here instead of $(document).ready()
  });
</script>
While there’s nothing wrong with this, and it is definitely an improvement over hosting jQuery locally, I don’t agree that it offers the best performance.

As you can see, loading, parsing, and executing jsapi delays the actual jQuery request. Not usually by a very large amount, but it’s an unnecessary delay. Tenths of a second may not seem significant, but they add up very quickly.
Worse, you cannot reliably use a $(document).ready() handler in conjunction with this load method. The setOnLoadCallback() handler is a requirement.
Back to basics
In the face of those drawbacks to the google.load() method, I’d suggest using a good ol’ fashioned <script> declaration. Google does support this method as well.
For example:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.2/jquery.min.js"></script>
<script>
  $(document).ready(function() {
    // This is more like it!
  });
</script>
Not only does this method avoid the jsapi delay, but it also eliminates three unnecessary HTTP requests. I prefer and recommend this method.
HTTPS all the things
Note that the script reference above uses the HTTPS protocol. For years, the examples on this page used a protocol-relative reference, beginning with just // instead of http:// or https://, and this section of the post described how that was not a typo and was actually a valid URL.
I no longer recommend that approach.
You should always use the HTTPS jQuery CDN reference, even if you’re including jQuery on a page that is itself being served via unencrypted HTTP. This greatly reduces the chances of someone tampering with the contents of the jQuery script between Google’s CDN servers and your users’ browsers. The overhead is minimal and all of the three benefits that I described earlier are still applicable when using the encrypted transport.
Other implementations
WordPress – If you’re using WordPress and want to take advantage of the Google CDN throughout your WordPress site without manually editing themes, plugins, etc, check out Jason Penney‘s Use Google Libraries plugin.
HTML5 Boilerplate – H5BP is a web site skeleton that provides a great starting point for building a modern site. As part of that, it uses a reference to jQuery on the Google CDN (and an automatic fallback to a local copy for users that aren’t able to load jQuery from the CDN for some reason).
Conclusion
According to a recent study, Google will consume 16.5% of all consumer Internet capacity in the United States during 2008. I think it’s fair to say that they know how to efficiently serve up some content.
The opportunity to let the pros handle part of your site’s JavaScript footprint free of charge is too good to pass up. Given how often even returning users experience the "empty cache" load time of your site, it’s important to take advantage of an easy optimization like this one.
What do you think? Are you using the Google AJAX Libraries CDN on your sites? Can you think of a scenario where the google.load() method would perform better than a simple <script> declaration?
Your turn. What do you think?
A blog isn't a blog without comments, but do try to stay on topic. If you have a question unrelated to this post, you're better off posting it on Stack Overflow instead of commenting here. Tweet or email me a link to your question there and I'll try to help if I can.
We tried to do this, for all the reasons you state. Unfortunately, for the few weeks that we had it that way, our local development internet connection was a bit unpredictable. Every time the internet connection came down, testing ground to a screeching halt; it just wasn’t considered worth it in the end.
There are ways around this. Not certain which framework you’re using… But in Django you set a debug setting, and you’ve got access to this in your templating framework.
I realize this is an old post but in response to Paul’s comment from 4:37AM on 12/10/2008:
Although I mostly disagree with the general attitude of this article (that I don’t really care enough to address), there are multiple things that you could have done to resolve your specific issue.
#1 – Add a simple check somewhere in the code to determine if it’s being hosted off of the development server. For example, I always make sure W:\adam.machine (Windows) or /adam.machine (Linux) exists on my development machines. Then in the actual code, I do a quick check to see if that file exists and if so – adjust accordingly. For example, if it were determined to be the development machine – use a local copy of jQuery.
#2 – Similar to above, except it’d work for entire networks. Assuming you have a static IP address: add a check in the code that determines if viewer_ip matches development_ip and if so – use a local copy of jQuery.
#3 – Add a hosts entry (%systemroot%\system32\drivers\etc\hosts or /etc/hosts) for ajax.googleapis.com that points to the local server. That way, any requests for the jQuery file on Google’s server will be redirected to the local server, and thus, a local copy would be used.
#4 – Similar to #3 – use a proxy that automatically uses a local copy rather than Google’s version. For example, if the development server were a Windows machine: use something like Fiddler as the web-proxy and add a rule that redirects requests for ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js to localhost/jquery.min.js.
#5 – Use both versions on the relevant pages, where Google’s is included first and the local copy second. Modify the local copy so that it checks if some_jquery_function() has already been declared (may wanna do a couple to be safe, in case x_function() is removed) – and if it doesn’t exist, continue [including the local copy] as expected. If the function already exists, though, that must mean that the inclusion of Google’s copy succeeded and thus the local copy doesn’t need to do anything more than the check (e.g. automatically return).
I’m sure there are dozens of different ways this issue could be addressed, even aside from those listed above. Point being: when something poses an unexpected problem, do whatever necessary to resolve it, rather than simply removing whatever caused it. I’ll admit that sometimes it does take a certain level of creativity – but we all have the power to overcome it. It just takes a little bit of thought and more times than not, an equal amount of effort.
(NOTE: I didn’t read anything else on this page, so if any of those solutions have been suggested already – I apologize.)
That’s why you should include a local fallback script; look at html5boilerplate.
Dave,
The only thing that bugs me about this is that the intellisense in VS2008 doesn’t pick up the google js file. This isn’t a big deal as I just add a local reference for working with it and then remove it before deployment.
Would be good if the intellisense worked on remote JS files.
Jon
If you’re working in JavaScript include files, you can use:
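Presumably a Visual Studio reference directive at the top of the .js file, along these lines (the vsdoc filename is only an example):

/// <reference path="jquery-1.3.2-vsdoc.js" />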
This is a great post which expands a post I made on my blog called Differences Between Local And Google Hosted jQuery Installations so hopefully anybody who comes here needing help with jQuery on Google will find all the information they need!
Keep up the good work, Ian.
@Paul: When doing development, you should have a local copy anyway IMHO. You’re not developing/testing on your live server are you?
If you’re doing this on a public facing* website, you are doing it wrong…
Strong words.
What if you are serving up content over https?
https://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js
What if you’re serving up content hosted in and most commonly accessed from within the PRC ( think GFW )?
If you have some users in China (or Iran), you’ll want to use a fallback approach like this one: http://weblogs.asp.net/jgalloway/archive/2010/01/21/using-cdn-hosted-jquery-with-a-local-fall-back-copy.aspx
If your users are primarily in those blocked regions, treat it as an intranet and just use a locally hosted copy.
I’d say 80-90% of our users are inside “The Zone”, so we host all necessary info and force anything that requires “not our” content to use AJAX… Developing a site aimed at the PRC has some interesting challenges – we can’t assume ANY service will be available bar ours.
What if my main server is located in Europe but I also have users in Canada and China? Can Google’s CDN cope with it and host lots of media? I’m using CDNsun, but I have some doubts as to such big destinations.
The CDN will automatically serve content from the closest datacenter to each individual user, on a user-by-user basis. So, covering both Europe and Canada works great. I believe there are intermittent issues with availability in China due to their censorship firewall, so be sure to use a fallback to local as described in several other comments above (which you should always use just in case).
Using the google.load() method will also allow you to use Google’s ClientLocation :)
http://code.google.com/apis/ajax/documentation/#ClientLocation
True, but caution: ClientLocation is often null even when it’s clear that location information is available.
It’s a nice API and I’m glad they make it free, but just know that even with corporate, fixed-IP addresses it often doesn’t know anything about location.
What if the user’s network (for some reason) is blocking Google’s CDN?
This could happen if some crazy corporate policy exists that stops the downloading of some other file on the CDN (or blocks Google entirely).
These users will no longer be able to use your site correctly.
I don’t think you’re very likely to find that combination in practice. If they’re willing to block 16.5% of the Internet in one shot, they’ve probably blocked your site too.
I can speak from experience – that this happens.
It’s a royal pain, too.
Indeed it does. And the people on the other end generally won’t be back.
If you’re coding the site correctly, the experience will degrade but not lose any functionality in this scenario. Since some users turn JS off at the browser level, you should be doing this anyway.
yeah yeah, but let’s face it – not everybody has time to pour into perfect degradation when < 1% of their users fall into the no-js category
If you’re thinking degradation, ur doin it rong. Think progressive enhancement instead. That way you get degradation as a matter of course with no extra effort. If you can’t do that, you have no business developing in JavaScript.
Thanks for the write-up, Dave. I’ve been using the Google hosted jQuery for a while and wondering if I shouldn’t be. Your points remind me why I started doing it in the first place and makes me feel more confident that I should continue doing so.
Couldn’t you use JS to verify google’s jQuery script loaded… Then fallback to a locally hosted version if it’s unavailable?
if (typeof jQuery == 'undefined') {
  // load local version...
}
https://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js….
Nice one fella!
My only concern would be breaking changes with a new version. I know, not probable, but definitely possible…….
If you’re using the <script/> method, you don’t have to worry about that. The 1.2.6 URL will always load 1.2.6, even after 1.3 is released.
If you used the google.load() method and didn’t specify the full version explicitly (e.g. “1” or “1.2”), that could definitely be a problem.
^^^
Over engineering?
This is probably the most worthless thing Google has come up with. Yeah sure it’s nice that they are supporting the libraries, but really, how is this beneficial? All we do to handle the problem you describe in your post is to download the minified versions of all the plugins and then cut and paste the code into one file.
Now we don’t have to worry about sending 15 files across the pipe (we use a lot of plugins) and all we’re really sending is the minified files and our custom javascript file. 2 files and that’s it.
Why would you want to force me to re-download jQuery in your combined include? That’s an unnecessary waste of time and bandwidth.
Script combining is generally a good idea, but rolling jQuery into your monolithic JS include will never match the performance of using Google’s CDN; not even for users with an empty cache.
You’d be very, very surprised at how much it does matter. Rolling it into 1 big JS is the preferred way to package a JS-packed site.
If you roll jQuery and all your plugins into one JS include and I use the Google hosted jQuery in conjunction with a combined include with just my plugins, my site will load faster than yours every single time. Even for users with an empty cache.
I could argue this all day long since you seem overly stuck in your ways, but 95% of the time it’s far faster to roll up your own JS file, jsmin it, and gzip compress it to your users.
Unless you’re running a heavily loaded server (and even then), it would still be faster to load one file than it would be to do 2 full remote calls.
Of course the timing is also dependent upon a few things: your distance to the server hosting the files (hops), your connection, the call order, and how loaded your server is.
I’m not a big fan of someone proposing an idea as the one and only way to accomplish something — and all other ways of doing it are wrong (which is basically what you said, exempting LAN of course). Also, comparing via one tool on one browser is the wrong way to go about showing performance gains.
Anyway, I’d suggest you check out some other methods before claiming a dependency as the best way to do it. I call it a dependency because you’re now fully relying on a third-party. There’s many ways to deliver a payload faster — through things such as dynamic JS importing/loading, etc.
For what it’s worth, so you don’t think I’m picking one way of doing things and arbitrarily claiming it’s best, I’ve been testing it both ways for quite awhile. Even using the old Google Code hosted version, which was less optimized for serving users, was faster than rolling it up.
That’s before you even consider the users who will show up and already have it cached locally.
I’d love to see a post making the counterpoint, with numbers. If you made one, I would be happy to link to it from this post, to offer more complete information.
Interesting post. Thanks for sharing this.
One reason not to use it is privacy:
Google will know all of your clients’ IP addresses, referrers, etc. It’s like using Google Analytics, but with only Google seeing your statistics.
Under European data protection laws it may even be illegal.
Yeah, that’s one aspect that I’m not crazy about. At this point though, I’ve mostly given up on Google not knowing what’s going on with a public facing site’s traffic.
++
Privacy was my first thought while reading this article. One should try to not give Google more information than absolutely necessary.
That’s not true. Given that a client makes a request once a year if the browser keeps the caching contract, I would hardly call it intrusive.
Chances are your clients will be using Google to search for stuff more than once a year.
It would still make the request to get the 304, it just wouldn’t secure a data transfer. Google’s servers could still be logging those 304 requests.
Google currently sets an “expires” header of one year in the future on these files. If your browser has cached Google’s jQuery-1.2.6.min.js on the client side and you visit a new site that uses it within a year, the browser doesn’t even have to check for a 304.
If Google really were as prying as some of the more paranoid among us would suggest, then they wouldn’t set that expires header. They’d happily pay the bandwidth bill to log the 304s.
Completely agreed.
I don’t think there is a privacy concern. Google is something that anyone using the internet is accessing anyway, so sharing an IP address with Google may not be a big deal if performance and code manageability is your goal.
Sharing an IP address isn’t the biggest deal… but sharing domain and Referer is. Expires headers only reduce the problem.
Further, the same argument (only checking once a year due to Expires headers) supports self-hosting. The file is only downloaded once and the security issue is entirely eliminated. I don’t think performance suffers more than a few milliseconds (if that), and only upon the first page visited. It’s nothing. This is not even an arguable point if your pages include more than a dozen resources.
Thus, I find security/privacy to be the overriding concern here.
Odds are they are using Chrome or have a google toolbar installed and google can see those traffic patterns anyway. If they aren’t using chrome or the google toolbar, it is still fairly likely they came to your site from a google search, or a site with google code on it (which would again allow them to see traffic patterns).
Thanks for this post! But why do you initialize your script with google.setOnLoadCallback() instead of $(document).ready()?
Because I still use $(document).ready() even when I load jQuery with Google’s jsAPI and it works well.
I’ve found that it depends on how fast google.load() loads jQuery and where your $(document).ready() is.
It would probably be of more value if Google also stored copies of frequently used plugins as well as the main jquery file.
Even with the delay, there is one advantage of the google.load() method.
Remember that SCRIPT tags block the download of other components and the loading of the dom. This blocking nature is primarily designed to support document.write() and other synchronous features of the language.
When a script is loaded dynamically, it is not a blocking download. (Document.write is also broken, but you shouldn’t be using that anyhow.)
So, even though the total time for jQuery’s load might be 200ms longer, if the jQuery loading takes a while, then your page is functional that whole time, rather than freezing up waiting for it.
If you’re just including jQuery, a mere 16723 bytes gzipped, it’s probably not too terrible. If you were loading lots of different scripts and modules, or if perceived load time was absolutely critical, then it could be more significant.
Isaac, that’s a great point. Depending on your existing page’s logic, switching to google.load() may or may not be an advantage if you have onload code that (un)intentionally depends on the blocking behavior of regular script tags. Definitely something to consider and test for if making the switch.
It probably bears more thorough testing, but I’ve found that google.load() exhibits the same blocking behavior as a normal script element.
I’m assuming (dangerous!) that google.load() works by injecting a script element via document.createElement(), which would be subject to the same blocking issues.
Hi Dave. But if there truly was blocking with google.load(), then this isn’t consistent with your response to Julien’s comment above where you said “I’ve found that it depends on how fast google.load() loads jQuery and where your $(document).ready() is.” when he asked about why you would need to use setOnLoadCallback() rather than jQuery’s $(document).ready().
I haven’t done any testing, but if there truly was blocking with google.load(), then I’d think there would be no reason you couldn’t use $(document).ready().
Try this, for example:
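A minimal sketch of the sort of test being described, assuming jQuery is loaded via google.load() and followed by a site.js include:

<script src="http://www.google.com/jsapi"></script>
<script>
  google.load("jquery", "1.3.2");
  // jQuery has not finished loading at this point, so this throws "$ is not defined".
  $(document).ready(function() {
    // ...
  });
</script>
<script src="site.js"></script>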
Attempting to access the jQuery object in the same block as the google.load will fail because jQuery hasn’t had time to load. Yet, if you watch in something like Firebug, the google.load() of jQuery will still block site.js until it completes loading (after the early $ access already threw an error anyway).
Just to underscore this point, this post has been receiving traffic from this search query:
http://www.google.com/search?q=%22%24+is+not+defined%22+jquery+google+ajax+api
So, people are definitely running into this issue in practice.
The google.load() function supports a callback parameter, to let you execute a function when the script has loaded. You should be using that instead for this sort of thing.
Hi, thanks.
You’re right, but not for all conditions.
Read my blog to know why you should not host it on Google:
http://tajary.blogspot.com/2008/12/1-reason-why-you-should-not-let-google.html
Thank you
Any comments on this? Seems Google code might be blocked in certain countries. If so, that would be a strong argument against this method. Anyone have more info on this?
Alireza’s problem is due to the US embargo against Iran, as absurd as that is.
Anyone knowledgeable as to which countries are blocked from Google code due to this embargo?
Absurd and ridiculous as it may be, it might be a production-stopper if you plan on doing business in any of those countries (and not already hindered from doing so if your company resides in the U.S).
Once again, good point: this should NOT be done. I would rather 100,000 users have .5 sec of extra load time than block a single user. Point made.
It’s like .. I want it to be a good idea .. I’m an optimization freak, but in practice, this is just over engineering that doesn’t help.
This is a good piece of code for testing. It demonstrates that while loading jQuery via google.load(), the browser will continue to execute/process the page until it hits another piece of external content to retrieve. I do get the “$ is not defined” error.
In contrast, there’s no error when using a script tag for jQuery since the browser completely halts execution of the page until jQuery has fully downloaded and been parsed.
But with this, as Isaac pointed out, there may be a small delay in the page fully loading compared to google.load(). Assuming all your onload jQuery logic is currently wrapped up in $(document).ready(), switching to Google’s setOnLoadCallback() method seems safe to do. If Google’s CDN serves jQuery quickly, I don’t think I’d be worried about just using the script tag, avoiding google.load().
It’s all very well arguing the point of using Google’s setOnLoadCallback() to initiate your code while not hindering page loads, but at the end of the day it is quite likely that you’re calling a third party jQuery plugin locally, and the fact that this WILL be done using a script tag will no doubt result in the script loading fully before the DOM continues to load happily.
What’s the recommended pattern for lazy-loading the jquery framework (and associated plugins)? I am thinking of scenarios like webparts, where you require the use of jquery, but don’t want to cause another http request and subsequent parsing of the same library.
You could use something like this before first use of jQuery in each of your webparts:
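A sketch of the kind of guard meant here, with the version number only as an example; the check keeps a second webpart from requesting jQuery again if another one on the page has already loaded it:

<script>
  // Only inject jQuery if no other webpart has already loaded it.
  if (typeof jQuery === 'undefined') {
    document.write('<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"><\/script>');
  }
</script>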
I’m not sold on the benefits of Google hosted jQuery.
What’s your assumption re your visitors’ browser cache settings (on/off)?
If a majority of your visitors have a clear cache, then sure, CDN hosting of ALL objects would accelerate loads. But otherwise, there are basic principles to improving client side render performance without introducing a reliance on an external host:
– use minimized versions of jquery (remove white space, decrease file size)
– gzip (compress whats left to send over wire)
– apache mod_expires (cache for X timeframe)
– host jquery on separate host (deliver static objects and dynamic by separate hosts)
– locate JS files at bottom (prevent script blocking)
– Load all required JS files on early in visit process. don’t double up…Static object loads shouldn’t be coupled with query responses
Also worth noting: once you start a browser session and your browser either confirms the objects’ statuses or redownloads them, you won’t notice a benefit for the rest of that session, because the user’s local cache is used.
Even if you ignore the caching angle, Google’s CDN is going to serve files faster than you will be able to. Their CDN is one of the best in the world. Users potentially hitting the page with it pre-cached is just icing on the cake.
Sure, CDN hosting all static assets is even better, but how many sites really do that? The vast majority of sites in the wild use no CDN.
Even for sites that do use a CDN, using Google’s jQuery is an opportunity for cost cutting and potentially increased parallelism (assuming more than just JS is distributed via CDN).
* gzip (compress whats left to send over wire)
Totally in agreement on this. It really matters a lot whether scripts are compressed or uncompressed.
Check the headers. Google’s hosted versions are all compressed, minimized, everything. They’ve basically taken it upon themselves to optimize the hell out of these javascript libraries for the fastest possible hosting of them, with far-future expires headers and everything else.
Using these will almost always be faster than not using them, in virtually all cases.
…but what if the user got this fine plugin named “NoScript”.
NoScript won’t load scripts from other domains as standard – the user has to accept them. :(
Are there possibilities like Zach said?
Excuse my bad English, and greets from North Rhine-Westphalia,
mathias
btw. this is a great article!
This is interesting. What if you are using custom jQuery plugins? Can you still use this method?
Sure. It’s just like using a local copy of jQuery, only faster.
Do you know all those sites that fetch jQuery from Google, Dave? Well, 9 times out of 10 you will see calls to ajax.googleapis.com that are taking ages to reply, so I think that using a different host to fetch jQuery is slower – overall – than including it from the server that one is already connected to.
I’ve always found the opposite to be the case (Google serves it faster than my servers can). Do you have any examples of sites that hang on the dependency?
I’ve seen this with slow DNS servers.
I’d generally avoid pulling in JQuery from Google, because I’ve seen this a lot as well – You see it in the browser status bar: ‘Looking up ajax.googleapis.com’, etc. I’ve seen sites that pull in dependencies from four or five different domains, and each one requires a DNS lookup and then a TCP connection needs to be established. Unfortunately, although many ISPs offer nice fast bandwidth, they sadly don’t offer the same experience with their DNS servers.
Of course, some users will benefit from CDNs, but many users will feel the opposite effect.
I couldn’t agree more. And with mobile devices where upload is slower than download, it may be faster for mobile devices to keep the connection open and download jQuery than for the device to start a new DNS request for ajax.googleapis.com.
Isn’t it a fact that every OS offers some kind of local DNS caching nowadays? The chances that ajax.googleapis.com is already cached there are probably very high. We currently use this implementation:
window.jQuery || document.write(”)
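The full form of that one-liner, as popularized by HTML5 Boilerplate, is roughly the following; the local path is only an example:

window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>')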
As we are in the EU, the privacy aspect is also important for us. Although the script will be cached most of the time, any real request offers Google information about the IP and the referer which is – in doubt – not wanted by the user. I know this still sounds petty-minded in a lot of people’s ears, but I guess we all learned that the privacy-is-fucked-up-anyway attitude is no way to go for in the future.
Regarding the “$ is not defined” issue when loading using googles ajax method. I use the following and have had 0 issues with it so far…
google.setOnLoadCallback(init);

function init() {
  $(function() {
    // initialise...
  });
}
So, uh, Google’s having *serious* latency issues (at least from my part of the world), and large and small sites alike that rely on the hosted jquery are hanging completely. Awesome! Granted, this will hopefully happen only rarely, but…
And that’s exactly why I don’t use Google for my jQuery stuff. Local is best. I started seeing issues with Gmail from yesterday onwards. Who knows, maybe their maps are no longer reduce (or the other way round). Again, CDN is nice in theory, but don’t think that it beats local (well, depending on most sites’ target audience).
Looks to have more to do with network routing and less with Google: http://blogs.zdnet.com/BTL/?p=18064
None of my sites were very affected by this. Remember that the CDN hosted files are served with a +1 year expires header. Returning users don’t even require a 304 ping.
It would be a problem for new users, but how many new users (who don’t have access to Google search to find you) affected by the routing issues are you likely to bounce in that couple hour window? It’s a pain, but no where near a catastrophe.
One big surprise is that with Firefox 3 fetching jquery 1.3.2. from Google I get a 200 status code every time instead of 304.
So there is no caching benefit, just the benefit from the CDN.
Can anybody confirm this ?
thanks
Because it’s served with a far-future expires header, you’ll only see a request if your cache doesn’t contain a copy of the file. For up to a year after it’s cached, no request is made at all when the browser encounters a reference to it, not even to check for a 304.
So, the only time you’ll ever see a request, it will have a 200 response. You shouldn’t see subsequent requests (and 200 responses) though, unless something’s preventing your browser from caching the file.
Firefox’s disk cache is so ineffective that you should assume it doesn’t exist. See https://bugzilla.mozilla.org/show_bug.cgi?id=559729 and its many, many dependencies, especially bugs 175600 (limit of 8192 cached items), 193911 (space limit of 50MB), and 290032 (some files are never cached due to a shitty hash function – may have been fixed in FF3.6).
FF4 may be better – at least, some of those bugs have been marked fixed – but I would want to see test results.
What you mention is true for jquery-1.2.6 but not for 1.3.2
Try the simplest page and check with firebug. If you reload the simple page with 1.2.6 you will see a 304 status code but a 200 status code for 1.3.2:
I think the issue is that the response adds a http header Age not zero which is not included when you request 1.2.6.
Try it! I was quite surprised !
It’s normal to see a 200 response on the first request.
It would be abnormal to see a 304. With a far-future expires header, the browser shouldn’t even be pinging for a 304 if it has it cached.
When it’s being properly cached, you shouldn’t even see it appear in the Firebug net tab.
Sure, the previous post skipped the basic html. I compared a page with jquery 1.2.6 and another with 1.3.2. The page with the old jquery caches properly but not the one with 1.3.2.
At least with firebug 1.3 I see the 304 because if you look at the header responses it returns a Last-Modified, which takes precedence over expires
It takes 5 minutes to do this test in firebug but I assume you haven’t even tested what I am saying.
Your script snippet in Back to Basics references 1.3.2, so you should update the article. I will also try with IE and report the findings.
I use the Google hosted 1.3.2 on several sites. I double checked them this morning, after reading your comment, to make sure it’s still caching properly in Firefox.
It is for me.
The browser never makes a request for jQuery (1.3.2), not even a ping for 304, unless I clear my cache or force an update with a shift/ctrl reload.
I also verified it in Live HTTP headers and Wireshark. Firefox is using the locally cached copy and isn’t sending even a single byte over my connection when it hits a reference to http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js
Do you have a publicly available page that reproduces what you’re seeing? I’ll take a look at it.
Thanks for checking.
My issue is minor, I was refreshing both a page with 1.2.6 and a page with 1.3.2. The page with 1.2.6 checks the 1.2.6 link and gets a 304. The page with 1.3.2 gets a 200
Bravo, Google :)
If you happen to run a WordPress site, there’s a plugin that will easily do this for you, for all available JS libraries:
http://wordpress.org/extend/plugins/use-google-libraries/
Just install and activate. Operation is automatic, no configuration needed.
Thanks for sharing your knowledge!
If you are using OpenDNS and you have set the protection level to highest, there is a great chance that the Google website will be blocked for your entire network. These are some wise words, though, on CDN and general website performance.
Thanks a lot! You have opened my eye and now I am surely going to put the Google hosted jQuery on my blogger blog! :) thanks once again!
I’ve been doing this as a matter of course since Google first announced they were hosting the files. I’ve not personally seen any performance increase, as I always keep my pages as lightweight as possible in any case, but neither have I had any problems. I agree that this is a best practise missed by most, and it’s always worthwhile exploiting any optimisations you can!
Sorry, the first code should be:
google.load("jquery", "1.3.2");
google.setOnLoadCallback(function() {
  // this worked
});
And the second block of code should be:
$(function() {
  // This didn't work on IE but it did on FF
});
I am considering using google api, but I have a question.
What is the likelihood that Google will host old versions indefinitely? I have made websites for clients for almost 15 years now and one of the oldest is still running today. I would hate to think that a site I make today would not run in 20 years’ time because the JS is no longer hosted on Google.
That’s a good question that I don’t know the answer to. My guess would be that they would continue hosting it as long as it was actively serving requests. There’s so little overhead in it, I’d be surprised if they went out of their way to break something being used like that.
Worst case, it’s a very quick search-replace to globally update a site to use a locally hosted legacy version instead of Google’s.
Hmm, not very reassuring. But I still would like to use it. Best solution I can think of is to check to see if jquery has loaded, if not load a local version?
Have any thoughts on how that could be done best?
You can do this:
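A minimal sketch of that check, in the same spirit as the fallback near the top of this post (the local path is only an example):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
<script>
  // If the Google copy failed to load, fall back to a local copy.
  if (typeof jQuery === 'undefined') {
    document.write('<script src="/js/jquery-1.3.2.min.js"><\/script>');
  }
</script>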
Because script elements block rendering until they’ve loaded and been executed, you can assume if jQuery isn’t present in the subsequent script block that it has failed to load from Google and then react accordingly.
You confirmed my thoughts. Thank you very much for the help.
I’ve tried the check-and-eventually-load-locally method above; I intentionally used a bad URL for the Google jQuery to see if the local loading of jQuery was really done. And indeed it was, but looking at the Firebug network tab, I see that other JS libraries requiring jQuery load before the local copy of jQuery, breaking any jQuery-dependent functionality on the page.
Here’s my test code:
Firefox loads the second library as soon as Google gives the 404 error, and then loads the local copy of jQuery after some other resource files, such as CSS or GIFs.
Do you know a way to correct this behaviour? Am I doing something wrong?
Thanks!
Sorry, I’ve lost some piece of code. Hope this works:
thanks for your advice
keep it up
Yeah, I just found out from another programmer out there that this is definitely better than hosting the jQuery script yourself. It was nice to see exactly why it is better after reading your article.
Thanks a lot.
It would be nice to use Google’s CDN, but Google is blocked in some countries and some networks… Anyhow, it is a good idea to use some CDN, perhaps build your very own to increase parallelism and improve caching. What about latency? Well, we are talking about 20kb of jQuery minified and gzipped; just make sure to use fewer images and optimize their size :)
Thanks Dave, Encosia hasn’t been in my feeds for a very long time, but it never fails to teach me something new and interesting. I like this post and I adore the discussion opened up by the readers. You could add the pros & cons that have arisen so far to your post; that would be a good summary :)
thanks again and keep it up!
I have an ASP.NET app that does dynamic combining of the various scripts (jQuery, plugins, etc.) – a total of ~110Kb, combined, gzipped.
When I serve jquery combined with other scripts (1 script tag), latency is around 500ms.
When I use Google’s CDN hosted version of jQuery (1.3.2), Firebug shows latency of around 828ms for the scripts in the net tab.
All tests are done on local VS development server.
What do you think? Will these results change when I host my application on some online server?
Yes, you should expect to see the extra HTTP request be much more significant in a local setting. That won’t be the case once deployed to a live server.
I personally have other stuff within my "jQuery" file, like jQuery Tools and JSHTML, which I use on almost every page where I might use jQuery.
So that’s one reason I wouldn’t use it, since you can’t have any other files in it. Also, it’s yet another http request so if you’re already loading other stuff, why not have jQuery be loaded in there as well.
I can see myself using this for stuff like tutorials though, where all I would need is jQuery.
Thanks for the article! Firebug tells me, Google’s response is about 50 ms (jquery) and 150 ms (jquery ui) faster than my remote dev machine. It’s also gzipping, so I don’t have to take care of that. Seems to me like there is an advantage on using Google’s hosting the way you suggest. I also used your fallback code, thanks for that!
Although – right now I’ve got only 4 JS files, which might double or triple by the end of development. From past experience I still suspect that nothing is faster than combining and compressing all JS into one file (as well as CSS). As far as my limited experience goes, this is always the fastest solution, although it’s more tedious to maintain. The process has to be repeated for every update to any JS or CSS file. One could still use a CDN though, by just buying some cloud space. It should not be too expensive per year for a couple of text files which are often cached.
The fallback code works like a charm if remote JS files are blocked by NoScript. That leaves only the “user lives in Iraq”-scenario, unfortunately. Maybe a country specific IP range can be considered.
For https I’m thinking it would be possible to use the Analytics code:
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
Obviously changing the URL… Not tried yet, but it should work.
It is good for most shared hosting users to offload extra scripts and resources as much as possible, but when you have a low latency, high bandwidth server, you are just as well off since you no longer require an extra DNS request, which can waste lots of time.
Only if you assume that all of your users are closer to your server than they are one of Google’s CDN edge servers and that the user doesn’t have the Google CDN copy of jQuery already cached (which would lead to the browser not making a DNS lookup or HTTP request at all).
Was doing some research into this trying to pick whether to use
jsapi+google.load or the direct library path. I noticed that with the former
method, the js lib comes with a 1yr-in-the-future expiration date while with
the latter, a 1hr. Here are the (relevant) HTTP response headers from FireBug:
using google.load():
Content-Type text/javascript; charset=UTF-8
Last-Modified Thu, 14 Jan 2010 01:36:01 GMT
Date Fri, 22 Jan 2010 20:27:03 GMT
Expires Sat, 22 Jan 2011 20:27:03 GMT
Content-Encoding gzip
Cache-Control public, must-revalidate, proxy-revalidate, max-age=31536000
Age 176
Content-Length 23807
using straight url:
Response Headers
Content-Type text/javascript; charset=UTF-8
Last-Modified Thu, 14 Jan 2010 01:36:01 GMT
Date Fri, 22 Jan 2010 20:33:37 GMT
Expires Fri, 22 Jan 2010 21:33:37 GMT
Content-Encoding gzip
Cache-Control public, must-revalidate, proxy-revalidate, max-age=3600
Age 161
Content-Length 23807
Turns out, the confusion is that this reference:
http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js
Is not a reference to 1.4, but the latest 1.4.x release. Since it would defeat the point of a “latest” reference to cache it for a year without requiring a 304 check, they use shorter expires headers on those.
Using this reference:
http://ajax.googleapis.com/ajax/libs/jquery/1.4.0/jquery.min.js
Supplies the +1 year expires header as expected.
Paul Irish has more info on that here: http://paulirish.com/2009/caching-and-googles-ajax-libraries-api/
This is a great article, but when I try this approach the Intellisense will not work. Is there a workaround to fix that?
Assuming your code is in .js includes, download the vsdoc and place this at the top of your includes:
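Presumably a reference directive like this one, pointing at whatever local copy of the vsdoc you downloaded (the path is only an example):

/// <reference path="js/jquery-1.3.2-vsdoc.js" />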
For Intellisense inline in ASPX pages, you can use this trick:
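Most likely a server-side block that never executes, roughly like this (the vsdoc path is only an example):

<% if (false) { %>
  <script type="text/javascript" src="js/jquery-1.3.2-vsdoc.js"></script>
<% } %>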
It will never be rendered to the page, but Visual Studio will still parse the file for Intellisense.
Ok I tried this but I don’t seem to get it to work. Here is what I did
Did I put it in the wrong place? As long as I don’t reference the Google page my page works but as soon as I point to Google – no intellisense.
Thanks for your help.
I don’t have VS2008 handy to test with, but that code does give me jQuery Intellisense in inline code blocks in VS2010.
@Terry: I had the same problem.
This worked for me on VS2008:
http://nitoprograms.blogspot.com/2010/02/jquery-hosted-on-googles-cdn-with.html
This method would be great if everyone used it. The problem is that only something like < 5% of sites use it, meaning the extra DNS lookup to Google costs more time than just hosting the file yourself. Since you are only pulling one file from Google, you never beat the initial DNS overhead that using their CDN costs.
Having total control of your source code outweighs all of the advantages listed and discussed above. Why should you rely on someone else, even Google, that may have an impact, beyond your control, on your company/business website?
For a personal website, I can see the benefit, but not for company/business websites. Foolish thinking, not practical.
Unless you planned on modifying jQuery.js, you still have total control of your source code. If something were to happen, you could switch the reference to a local copy in minutes. It’s no less practical than using any other CDN, which business sites have been doing for years. Unfounded paranoia is no reason to slow the web down as a whole.
The difference is that the Google CDN for javascript libraries doesn’t offer an SLA.
Automatically falling back to a local copy isn’t difficult (see the examples of that throughout the comments here). In the bigger picture, I tend to agree with Nate that SLAs are empty anyway. I’m less interested in who to blame during the .0001% than I am how to improve performance/user-experience during the other 99.9999%.
I think if you agree to an SLA which is ‘empty’ then absolutely, there’s no point to it. But if you properly set an SLA with your hosting provider which guarantees your servers, with fines etc. for any problems, then moving a key part of your page engine outside that agreement definitely negates the whole purpose and would not be justifiable in terms of business accountability.
Switching back to a local copy is your solution if something happens. Then why bother with that headache in the first place? Besides, your suggestion shows that you did not consider the business implications. Redirecting your customers to Google equates to Google accessing your customers’ information. Furthermore, it is practical to point out that jQuery is only about 70k. What do you actually gain? NOTHING.
Therefore, it is still foolish and bad business practice.
Falling back to a local copy is automatic. Why bother? Eliminating a quarter-second or so of page-blocking JavaScript is well worth understanding how to best use the resources available.
As long as the majority of sites are using services such as AdSense, Google Analytics, Google Maps, etc, I have a hard time buying the FUD and paranoia. Of all the services like that, using the CDN is one of the most benign, since the browser may not even contact Google again for a full year after jQuery has been downloaded once.
I think Billy Nguyen’s concern cannot be dismissed. He points out that there are differences between commercial and non-commercial websites, and that they must be considered, especially the ones about privacy, and especially if you’re only gaining about a millisecond in performance. A business should have the option to invest in a safe CDN. Just as more and more businesses can’t rely on Google Analytics anymore, because then you’re sending certain two-party data to a third party (Google). In some countries, like here in Germany, this is already starting to go to court. The solution to this problem, too, is relying on software which can be hosted on the company’s own infrastructure.
On the other hand, when everyone points to Google for hosting of these files, they get cached a lot and Google’s servers aren’t contacted as often, so it’s even harder to make “sense” of this info.
Why not do it the other way around for business sites? Use a local copy, and when that is not available, fall back to a 3rd party CDN. Seems like an unusual case (unless you do have your own CDN or CDN-like setup), but if it does happen, it seems like good service to your customers not to bother them with the tech problems.
Privacy on the Internet is a nice notion, but largely an illusion. Of all the actively invasive methods that advertisers and ISPs use to track users, focusing on the potentially once-per-year jQuery CDN accesses is a non-starter for me. If Google were really out to leverage the service for “evil”, they definitely would not serve the file with a +1 year expires header.
Sorry, but I seriously disagree with you here. This cavalier “well, we don’t have privacy anyway” attitude is the very reason that privacy is such an issue in this day and age. When they came for the trade unionists, I said nothing.
You are wrong to just dismiss these claims. At best, you are being disingenuous by not giving security in this matter an equal discussion. At worst, you’re ignorant to the real problems here. In a below response to John Sanders, you state that, “if Google’s CDN were compromised, you’d probably find that out much quicker than if your own hosting was.” I don’t necessarily agree with that. When you use the Google APIs, you don’t have an SLA with them. They are under no contractual obligation to report exposure to you.
More importantly in this day and age, and the real reason I take serious issue with your “this is the best approach for everybody, except maybe intranets” attitude, Google does not need to be compromised in order for the content to be exploited. Man-in-the-middle attacks are extremely easy to pull off, especially when the content is being served via HTTP mechanisms.
Here’s the real problem.
If somebody visits a website that uses the Google-hosted jQuery and their session is compromised just once, every single session they have on any and EVERY site they subsequently visit that uses Google’s jQuery is also compromised. If someone were to inject a modified jQuery to Joe Sixpack while he was browsing the web at Starbucks (or even as he passed by a rogue access point with wifi enabled on his phone or laptop – on a freaking plane, even), the response can be set to cache that content for decades. At which point, every single request to a Google jQuery-enabled site is now affected. The very caching that you espouse as one of your main three benefits also introduces a serious potential vulnerability.
This isn’t being overly-paranoid, this is being realistic. As long as we’re using plaintext protocols, depending upon an outside CDN to provide portions of executable code that are shared among thousands of sites is just an awful idea.
Everybody is free to make their own decisions, but you should really be fair here and give discussion to the fact that in order to enable “Eliminating a quarter-second or so of page-blocking JavaScript,” site managers are opening up a new potential avenue for attack. When explained in these terms, many businesses may make a different decision.
Parallelism is also a dubious argument. Anybody can enable parallel loading of JS content on their site without much difficulty. And if one is concerned about the two-concurrent-connections-per-host browsers you discuss, you can just as easily serve the content from a second hostname of your own. Yes, it’s another DNS lookup, which adds latency, but the Google reference only avoids that cost if users already have the Google DNS entries cached.
Everybody needs to make this decision on their own, weighing all of the benefits against all of the downsides. This is far from a “one size fits all” solution, and there are many factors to consider. Risk assessment is a real concern here.
You have a heavily-linked page on this topic. You would do well to treat the topic honestly and fairly. I’m sorry, but I don’t feel you are currently doing that.
Interesting. I doubt the benefit of users not having to download jQuery at all because it’s cached exceeds that of minifying all your JavaScript files, including jQuery, into one file and serving it from CloudFront, which is what I do.
Thanks for the article Dave!
What makes me most uncomfortable about this approach is security. If Google’s servers are ever hacked (improbable, but not impossible), then think about how many websites will be running malicious code. Who knows what kind of information a bad person could grab by reading your cookies, or scraping your screen. For this reason, I would only use this strategy for websites that contain no private user information.
Hacking happens.
John
That’s definitely a valid concern.
Of course, if Google’s CDN were compromised, you’d probably find that out much quicker than if your own hosting was.
Exactly. This is the primary reason I’m hosting my JavaScript libraries locally.
However, I’m starting to use Google’s hosted Web fonts, so maybe using their hosted jQuery would be no worse. Also, they’re probably better at protecting the integrity of their CDN than I am — I’m a developer, not a sysadmin.
Dave, have you seen this article?
http://zoompf.com/blog/2010/01/should-you-use-javascript-library-cdns
It says using CDNs for JavaScript isn’t worth it – the DNS lookup adds a 1/3 second delay versus your own domain, and jQuery only takes 1/3 of a second to download.
He also says only 2% of the Alexa 2000 websites use a JavaScript CDN, so it’s unlikely the visitor has it cached already.
In short, he says it isn’t worth it.
What do you think?
I don’t agree with their assessment about the likelihood of a cache hit. With such a wide gamut of high-traffic sites (like Time, SlideShare, ArticlesBase, Break, Stack Overflow, Examiner, Woot, and ArmorGames, to name a few) using the Google CDN, it’s becoming more and more likely that the minority of your users are the group that don’t have a local copy cached.
I totally agree D Yeager. 300%.
I tested this on my website and loading MooTools (well it’s not jQuery, but the situation remains the same) from Google CDN adds latency due to the DNS request. We switched to a locally hosted copy and the site is faster now.
My advice is to test before diving into such advice. It may not be as good as it seems…
I agree that testing is important to any optimization. For example, my browser took 678ms just now to download MooTools from your server, but only 91ms from ajax.googleapis.com.
Of course, that datapoint might be irrelevant if you don’t expect to reach a global audience with your particular site. It’s a good reminder of how effective geographically dispersed edge servers are though.
Yes, the location (geographically) of the user plays an important role here. Our server would certainly perform better in France and nearby countries, while you see a penalty from America. Your remark just made me check how many folks visit our site from the other side of the Atlantic. It looks like we have 4%, so it’s reasonable not to use a CDN.
I guess a very good option here would be to have those major libraries shipped directly with the browser, for instance. That would definitely save some time for everyone!
I’m also wondering how this could affect the new criteria in Google’s PageRank: site load time is taken into account. If the Googlebot fetches everything from America, that would be a load time penalty for the crawler. Even more so if the DNS is cached: the CDN version would perform a lot better. Any thoughts on this?
I think I would rather site visitors accessed this locally on my servers than kicking off another DNS request. I think I’ll still run some tests to see if there’s really much difference. Maybe better to use the hosted version if you’re working with Google tools a lot such as Maps?
Cheers.
It’s smart to consider the DNS lookup. However, it turns out that so many of the most trafficked sites are referencing the ajax.googleapis.com domain now, the DNS lookup is going to be already cached at the browser or OS level for most users.
What happens when Skynet (Google) becomes self-aware?
Is it possible to use jQuery with Google Sites?
I think I’ll stick to serving scripts locally. I’ve got a fast server, and when I ran tests I got such a minute improvement that it was outweighed by the extra DNS request to another server. But it’s a great post to highlight the point, and it will be better for most sites, I’m sure ;-) (lol at Jim Kennelly’s Skynet comment)
Does it help make a decision knowing twice this month I’ve had issues getting Google Code hosted jQuery to work appropriately?
It’s hard to find information out there, but I know it happened.
Great article!
Awesome article. I was always against grabbing jQuery from Google for risk of it being slower. Great work.
I was doing this today, until I noticed that http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js is “Not Found” and will not load on my website. I reverted to a local copy; it is more reliable.
@James P.
If you agree to use jQuery hosted externally and your problem is the reliability of the external link, you can do a little check like this:
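For example (the version and local path here are just placeholders for whatever you actually use):

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
<script>
  // If the CDN copy didn't load for any reason, fall back to a copy on your own server.
  window.jQuery || document.write('<script src="/js/jquery-1.7.1.min.js"><\/script>');
</script>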
In this way you can serve a local copy of jQuery just in case the Google link fails.
Where would I place this backup code in case Google’s CDN was down and didn’t load? In my header.php or function.php?
Neither. JavaScript doesn’t belong in PHP or HTML files.
Seems like Google is moving away from direct inclusion and wants you to get an API key. Using the API method, would the js file still be cached?
To be more specific, will this still be cached
Scripts loaded through the jsapi loader are cached.
I still prefer the direct reference for pulling a single script off the CDN. It only requires a single HTTP request, where jsapi/google.load() results in two.
The place where using google.load() makes sense is when you’re pulling several scripts off their CDN. Then, its dynamic script injection will probably be faster than static script references.
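To make the two approaches concrete, they look roughly like this (the versions are just examples):

<!-- Direct reference: a single request for a single script -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>

<!-- jsapi loader: one request for the loader, which then injects the libraries you ask for -->
<script src="https://www.google.com/jsapi"></script>
<script>
  google.load("jquery", "1.4.2");
  google.load("jqueryui", "1.8.1");
  google.setOnLoadCallback(function () {
    // Runs once the page and the requested libraries are ready.
  });
</script>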
I would do this if it weren’t for the fact that I’ve corrected bugs inside jQuery that would otherwise cause errors to be served to my clients.
Thanks for the handy article. :)
Great article! Thanks for sharing.
Great article dude
It’s a great idea; recommended for everyone. Loads faster and bandwidth is saved.
Looking for your donation button. Finest decision support via comments that I have ever seen.
Hi Dave,
I only receive gzipped jQuery from the Google and Microsoft CDNs when I use “https://”
and not when I use “http://” in Firefox. I have not yet tested in other browsers.
example: “https://ajax.googleapis.com/ajax/libs/jquery/1.4.3/jquery.min.js” – gzipped
“http://ajax.googleapis.com/ajax/libs/jquery/1.4.3/jquery.min.js” – Only Minified not gzipped.
Thank You,
Vishal Astik
It’s coming in gzipped for me, as of a few minutes ago:
It’s possible you’ve got a proxy somewhere between you and the CDN that’s altering request headers on regular HTTP traffic but not SSL. Do you see gzip or deflate in the Accept-Encoding line here? http://www.xhaus.com/headers
Hi Dave,
Thank you very much.
You were right, the problem was due to the proxy.
When I try it from home, everything works fine.
Thank You,
Vishal Astik.
Hello,
I have tried to make a jQuery plugin for an HTML loading function. Please refer to the following link.
http://aspnet-ajax-aspnetmvc.blogspot.com/2010/10/dyamic-html-loader.html
Please suggest how I can improve it further.
Thanks,
Mohan
If you’re following this advice, you are doing it wrong.
There is one single reason that outweighs the 3 given in the article:
Google will track and save information about all of your site’s visitors, without them even noticing. By using the APIs, you just help to feed Google’s databases.
And if some alerted users have blocked scripts from Google for exactly this reason, they will not be able to use your site.
It is completely naive to think that Google hosts these APIs for the good of mankind. It’s all about harvesting personal data – who visits which site at which time. Using the APIs surrenders your unsuspecting visitors to Google. Don’t do that.
Luckily, that’s not very realistic. The CDN’s files are served with a far-future expires header, which dramatically minimizes how many HTTP requests are actually made to Google’s servers – exactly the opposite of how they would configure it for “evil”.
I developed a site for a client that is available worldwide (in languages I don’t speak), and ran into issues with people in other countries not being able to load the jQuery plugins from Google. I was not able to get exact error messages. We switched to hosting the jQuery files locally and it resolved our issue… for what it’s worth to anyone.
If you are using type=’text/javascript’, you are doing it wrong.
In 2008, when this was written, no one was using an HTML5 doctype. In HTML 4.01 and XHTML 1.0 [Transitional], script types are required. For that matter, they’re optional in HTML5, not forbidden.
Right, consider that an update comment for future readers. Optional as in ‘you could put anything there and it would still be ignored’.
I have a hard time believing Google is not interested in logging the IPs. They are willing to give anything away for free as long as they get a ping. Last time (a few years ago) I looked at the Google Analytics JS drop-in, they had a cute little one-pixel image wrapped in a noscript tag. PING.
Google may not have that much more market share yet as far as search goes, but they saturate the web with their ‘free services’ that track nearly every site you go to. Don’t fool yourselves. It is not benign.
Personally, I find it unethical to serve third-party content from a site. When a user types or clicks an address, they should be able to expect that that is where they are going. There should not be 15 different hosts hiding in the background JS.
Constructively, this could be avoided if there was a way to include a hash of the file along with the path in the script tag. It would then be obvious to the browser that the file was indeed the same library it had already loaded and not bother with it again. Very simple without having your every click tracked by big G.
Since the AJAX Libraries CDN serves content from a cookie-less domain, with a +1 year expires header, it’s extremely poorly suited to any sort of per-click tracking. The idea that AdSense or Google Analytics tracks you from site to site is a legitimate concern, but this CDN is configured to optimize performance at the direct loss of “trackability”.
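Worth noting on the hash idea above: browsers have since gained exactly that capability via subresource integrity. The hash below is only a placeholder; you’d generate it from the exact file you reference:

<!-- If the downloaded file's hash doesn't match the integrity attribute, the browser refuses to run it. -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js"
        integrity="sha384-REPLACE-WITH-THE-HASH-OF-THE-EXACT-FILE"
        crossorigin="anonymous"></script>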
Thanks for this great article and good trick ;)
This is a great article. Thank you for taking the time to write it. The commenting area is also very valuable.
Regarding this statement in your article:
While this is indeed a nice trick, I wanted you to know that it did not work for me and probably does not work, period. When I include the link to the jquery without the HTTP that way, jquery simply isn’t loaded.
Thanks again.
Ricardo
It definitely works. The one place you’ll run into trouble is if you’re editing/testing a file locally on Windows, without using a development server. From a page opened at file:///yourPage.html, the protocol-less reference will attempt to load jQuery from your local filesystem instead of using regular HTTP.
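In other words (version here is just an example):

<!-- Page opened straight from disk, e.g. file:///C:/site/page.html -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
<!-- The reference above resolves to file://ajax.googleapis.com/..., which doesn't exist, so jQuery never loads.
     Serving the page from a development server (http://localhost/...) or temporarily hard-coding
     http:// or https:// during local testing avoids the problem. -->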
I installed a plugin called “Use Google Libraries” that did all the work for me :) however, although it works, I find that it didn’t provide benefit #3 that you describe. For some reason, my website calls jquery 2x from Google’s CDN (probably some other plugin makes the call), and the site attempts to download it. Do you believe that if I hard coded my site like you recommend that it would stop the 2nd call? Or do I need to get rid of that plugin (assuming I can find it, I’ve got 19 installed).
Thanks!
That sounds like either two separate plugins are injecting the jQuery CDN reference or maybe you’ve got the “Use Google Libraries” plugin in addition to a theme that has the hard coded link already included.
I’d say try disabling the “Use Google Libraries” plugin first, and see if there’s still a single request being made for jQuery on the CDN. Also take a look at footer.php and header.php in your theme to see if jQuery is already being included there. If so, you shouldn’t need the plugin.
Good article, thank you.
This quote is actually not correct, although it may have been changed since the article was written. The Google CDN site currently says something different.
Great article!
I’m glad to see they changed that. Thanks for pointing it out.
Dear Encosia,
It is very hard to remember the path to jQuery on Google. So every time I want to include Google-hosted jQuery, I do a search for “jquery on google” and your blog always comes up on top.
I have done this about 25 times now, and every time I have to scroll all the way to the bottom of the page to get this bit.
I am sure a lot of people do this. Please insert similar text right at the top of the blog for people like me.
Thank you very much.
Done.
Excéllenté
Thank you very much for listening, this outta save a lot of time for me in future :)
I have used Google-hosted jQuery on my blog. It works like a charm; thanks for sharing the valuable trick.
We were doing this on a client ecommerce site, and as unlikely as it sounds, the Google-hosted jquery *did* go down. Even going down for a short time means a loss in revenue. We don’t use Google’s hosted jquery anymore.
I visit this site a couple of times a month just for the ajax.googleapis.com link. Saved me lots of time. Thanks!
*Never ever* make yourself dependent on third parties if you are serious about your website and you have alternative solutions. Google’s servers in particular are unreliable in my experience (I am getting server errors all the time on Google Groups, for instance, and emails on Gmail have simply disappeared or not been forwarded in the past).
Thomas
This is such an amazing tutorial, sir. Thanks a lot. It’s very helpful for beginners like me. I just want to ask, though: after all the coding with jQuery and having it hosted on Google, how would I be able to install the code into my site and make it work there? I am using the Blogspot platform. I already have the code for this particular purpose; I just don’t know how to install it on my site.
Please help me sir. Thanks and more power to you.
Just wondering why, if I leave the https off (//ajax.googleapis.com/…), jQuery fails to load. I just end up with a “$ is not defined” error.
That’ll happen when you load the page from your local filesystem (as opposed to using a development server and localhost-based address), since the page’s base protocol isn’t HTTP or HTTPS.
I prefer not to use scripts hosted by other parties. In the rare event that Google is unavailable, our web page might break, or it might even open a security hole because the library cannot be loaded. IMHO it’s better to put it on our own hosting. If our hosting is down then no one can access the web page at all, instead of opening the page with a security problem.
Wait, did you just say the absence of jQuery might leave a security hole?
Looks like you are just making things up out of thin air. I mean, seriously, who depends on JS for security? Take it out of there and put it where it belongs. Also, please stop complaining about Google going down so you didn’t have JS to secure your website.
Yes, it’s true we don’t depend on JS for security, yet a website that uses JS for some processes and breaks because that JS is unavailable might show some data that is not suitable for the public to see.
And how will you be sure that Google or another script provider does not put some other malicious script in their hosted copy?
http://www.networkworld.com/news/2007/040207-javascript-ajax-applications.html
and
http://www.clerkendweller.com/2011/1/7/Widespread-JavaScript-Vulnerabilities
If you’re relying on JavaScript to hide data that’s not suitable for public visibility, you’ve probably already been compromised and don’t know it yet. JavaScript can’t hide anything from view source or even a simple wget.
That said, you can use a fallback approach (linked on this page several times) to automatically load a local copy of jQuery if the CDN’s copy fails to load.
Actually I’m not hiding any data, nor reading data using JavaScript; I never do that on my projects.
CMIIW, but I have seen several websites that look fine yet displayed that kind of data when I disabled JavaScript in my web browser. That’s what I was pointing to as the security hole. Maybe that’s a different context though :)
Anyway, I agree on the fallback approach when using a CDN. Btw, do you think there’s a file hash checker to make sure the file on the CDN is the same as the one on the jQuery website?
Cheers
I’ve switched from using a specific version of jQuery (say v1.6.2) to a generic version (i.e. v1), which is also possible with the script tag:
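i.e. a reference along these lines:

<script src="//ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>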
Even if you trust that new versions of jQuery won’t break your site (and the last two major versions had breaking changes that affected many sites), you shouldn’t use the latest version reference for performance reasons. In order to ensure that a browser seeing that reference will always use the latest version, that reference is served with an extremely short expires header. So, you lose not only the cross-site caching benefit, but for repeat visitors this aspect of your site will be even slower than if you’d self hosted jQuery with proper expires headers yourself.
Excellent points. I used an old version of jQuery for some sites and discovered later that there were errors with IE6. These were solved in a more recent version of jQuery. Hence I thought it might be a good idea to let it update automatically. For these sites, performance wasn’t too much of an issue, but breaking the site would be of course. Thanks for the pointers.
This shows what a great topic this is, in that the discussion is spanning over two and a half years! For me personally there is still an element of suspicion and concern in using a third party and relying on them – no matter how big or dedicated they may be. I guess where revenue and performance are concerned, however, this is a deal breaker. Just to be contradictory, the idea of automatic updating does appeal, which I assume the hosted option would provide?
You should avoid the auto-updating usage, because it also comes with a very, very short expires header which hamstrings caching for repeat visitors. It’s also somewhat dangerous to assume that jQuery updates will not break old code these days – jQuery 1.5 and 1.6 included breaking changes (in the interest of the long-term good).
I’m not so sure that I like the idea of Google or anybody else having the ability to track exactly who is requesting my site’s pages. Since every request for the API libraries is accompanied by the key, then Google knows the site being accessed, and can also know who the visitor is and what page he has come from. I’m not paranoid, nor (I hope) are my visitors, but still giving someone that much information on my site and its user’s activities gives me something to think about.
Don’t worry much about that. Since Google’s serving the script with a +1 year expires header, they’re committing to a best-case scenario that’s extremely sub-optimal for tracking your users. With the huge cross-domain caching potential that the Google CDN has accumulated at this point, there’s a good chance they’ll never know that your users hit your site because the user will load jQuery from disk and not even need to make a request for a 304 result.
Great post. I only just got setup with jQuery today and am amazed by the great results you can get with such little code. I implemented the datepicker into my site which used a fraction of the code of my old JavaScript one and has much improved validation. I didn’t really know if I should host the files myself, but one quick Google search and 5 mins later and I now have a good understanding. Thanks!
It’s amazing how the comments have been going on for so long. I too came here in search of the quick link, thought it would be quicker than opening finder and grabbing a copy from one of my sites. Thirty minutes later and I’m not so sure it was quicker.
While I’m here, I figure I might as well put in my few cents, as there are clearly a few designers and web developers on here looking for real clarity on the matter.
For those who just skip to the end for the answer, here it is: hosting locally is always a better choice than using Google to host one jQuery file.
The most basic of reasons is control, which you lose by allowing someone else to add whatever they choose to your site. In addition, you are at the mercy of their network. Can you have faith in their abilities? Sure, nothing wrong with that. Should you base a business on letting an easily mitigated risk go on? No, no you shouldn’t.
Looking for the technical side of it? Excellent…
To start with the most obvious reason: no matter whether you have jQuery hosted on Google or on your own server, all subsequent requests are pulled from cache, meaning any suggested benefit of using Google is pointless after users have been to your site. If your site only ever loads one page, then you may have a point. I know from my experience that my clients like users to get past the home page, and the majority do.
Secondly, using the Google version results in a DNS lookup. Sure, your computer may have that cached, but if you’re relying on the assumption that it already exists, you might as well rely on the assumption that it doesn’t, since assuming that it doesn’t allows you to reduce a risk.
From a CDN standpoint, you normally look to use one when your customers are located across vast distances and you have a large amount of traffic. If that is the situation you find yourself in, then using a CDN for your own assets would be more beneficial than just one small jQuery file.
Additionally users won’t always get the closest response from a data center. For me I get sent up to Mountain View, CA even after first hopping to the data center in LA. It’s not a perfect science but more of a best guess, and guessing should have no part in logical conclusions.
As such, I don’t see any valid case for allowing Google to host a jQuery file for me or the companies I work for. At best some users may experience a noticeable benefit, say your customers in Montana accessing the site with their free month of AOL. The potential gains cannot offset the downsides associated with that practice… it’s good sense, in my professional opinion.
But by all means, don’t take my advice; live and let live, I say. But if you build and maintain enough sites for enough time… you’ll realize this was sound advice.
Cheers!
Dev Head
I am a little curious about why no one here discusses your user’s privacy. Google doesn’t make this web space available for altruistic reasons, they are a business whose model is to gather as much data about everyone as possible and sell access to that information to the highest bidder. When a user pulls javascript from GoogleAPIs, this allows Google to track who is pulling the script and for what website it is being pulled. This is a valuable piece of information to add to all of the other information in the user profiles.
I (as a user) mistrust any website that requires me to make a connection to Google’s servers (which we firewall at the router), and have gone so far as to close bank accounts where the programmers have been lazy enough to rely on Google instead of serving their own javascript. Admittedly, most users don’t care about being tracked (there’s a never-ending argument between the young and the old!), but for those of us who do, relying on GoogleAPIs is a strong signal that the website programmers, and by extension the company owning the website, have no regard whatsoever for user privacy.
That’s been discussed a few times throughout the comments. Due to the far-future expires header and cookie-less domain, Google could only gather extremely spotty tracking data if they tracked it all. Many, if not most, of the script references to Google’s CDN don’t result in any HTTP request to Google’s servers, and they lack tracking cookies when they do.
Compared to the even more ubiquitous services like AdSense and Google Analytics that do actively track viewers, the public CDN is harmless.
Compared to the even more ubiquitous services like AdSense and Google Analytics that do actively track viewers, the public CDN is harmless.
That’s a silly argument; compared to Gmail, the CDN is harmless, too (not to imply Google tracking is harmful). But I don’t use Gmail (hell, our mail server won’t accept connections from Gmail machines). I don’t allow connections to google-analytics.com. And none of that is relevant. We are talking solely about the possibility of combining website and user through GoogleAPIs, IP addresses, and existing Google cookies, which is possible.
You say they receive, “extremely spotty tracking data.” Since Google has never touched any data they didn’t mine and maintain, I don’t personally want to give them anything, even what you consider “spotty.”
Please don’t take this the wrong way, but if this weren’t profitable for Google in providing them additional data to mine, they wouldn’t be doing it. Google is a business, whose only product is information; they are not a not-for-profit giving things away, and regardless of how most geeks see them, they are not altruistic. Gmail, Google Maps, all of the services are designed to acquire data they can sell. That isn’t “evil,” but I prefer not to participate.
If you as a website require me to load data from any Google server, you are requiring me to participate in G’s tracking whether I wish to or not, and this makes no sense particularly if you are a service that should be providing privacy, like a bank or a domain registrar. I am actually changing registrars because my current one requires me to accept Google AJAX files for only one object; the verify code for the credit card. They “require” me to accept that JS file. I won’t, so I can’t pay them. How seriously stupid is that?
The problem isn’t Google’s CDN. It’s that your credit card company doesn’t offer a fallback when you block connections to the CDN. There’s no reason that both can’t coexist. When you loaded this page, it first attempted to load jQuery via the Google CDN too, but if you blocked that connection then it resorted to loading a local copy. Any site can do that in one line of code. I totally agree with you that it’s not necessary for them to require you to allow a CDN connection to use the service at all.
You’re drastically overestimating how useful IP and referrer data is when you only receive it in up to one year intervals though. With most connections coming from dynamic IPs or from behind NAT, a sporadic sampling of logged IPs without tracking cookies is worse than worthless. Trying to integrate that into the data collected by cookie-tracked services like AdSense, Analytics, and Gmail would only degrade Google’s ability to track you, not improve it.
Google does a lot of things that don’t directly add to their bottom line. From their contributions to charity to things like the Summer of Code each year, they certainly spend a lot of money that only benefits them by improving the web platform in general (so they can continue to serve ads on it) or by buying them goodwill. I can be as cynical as the next guy, but sometimes there aren’t monsters in every shadow. It makes perfect sense that they’d host these common JavaScript libraries on their CDN in order to help speed up the web and improve their advertising business in the long run.
The problem isn’t Google’s CDN.
I agree. And so there is no misunderstanding, it isn’t my credit card company but rather my domain registrar who is requiring the code, only in the payment screen, and only for the CVV entry object – everything else works on their site without problem, except that one tiny entry field. Still, it prevents me from paying them, so I am transferring my domains. (*shrug*) Most people would just bypass their firewalls or use a proxy, I suppose, but I can’t trust a company that’s using an allegedly secure connection and then offloading connections to a non-trusted third party without informing the user.
You’re drastically overestimating how useful IP and referrer data is when you only receive it in up to one year intervals though.
You are drastically underestimating the number of times user browser caches are cleared (in my case multiple times per week), browsers or systems reinstalled, et al. But the level of usefulness is irrelevant in any event; I do not believe Google has ever acquired any data that it didn’t maintain and eventually put to use, so even a small amount of usefulness is something I prefer not to provide.
(*sigh*) We will need to agree to disagree. While it is impossible to eliminate tracking (both on and off the Internet), it is possible to keep one monolithic company from knowing a frighteningly large amount of data about one and maintaining it past the 180-day legal limit in the United States. I chose to follow this path. You as a web programmer seem to think it’s a great idea to allow this company to monitor your users. I think that is the height of irresponsibility, while you believe it to be not only practical for the website but a benefit to the entire infrastructure. I expect they will do whatever is technically possible as their entire business model is predicated on violating the privacy of Internet users, while you believe they have only the noblest intentions and can be trusted with this massive amount of data, so inadvertently providing them more is not a problem.
And since we’re never going to agree on this matter, I leave the last word to you with my thanks for the enjoyable conversation.
Hello, I tried using your link with the // thing at the start of the URL, and Chrome interpreted it as file://. I didn’t check other browsers though.
That will happen if you’ve loaded the page itself via a file:// URL, since the // URL just loads the resource with the same protocol that the page was loaded with. That won’t happen if you access the page via HTTP or HTTPS (e.g. once deployed or using a server on localhost during development).
If you’re allowing third party JavaScript hosted on an external domain to modify the DOM of your site, you’re doing it wrong. Talk about a single point of failure. Imagine the scale of destruction you could cause by hacking the Google hosted version of jquery.
If you think Google’s public-facing servers being hacked is likely, the jQuery CDN should be the least of your worries. The surface area in AdSense, Google Analytics, GMail, and even Chrome itself is far, far larger.
I agree with you that the same problem exists with AdSense and Analytics. This is why we should be encouraging people to use server side log analytics programs instead of Google Analytics, and encouraging people to avoid poorly designed ad systems like AdSense.
GMail and Chrome have nothing to do with the discussion at hand.
What makes this jQuery issue *completely* different to the other things you just mentioned, is that the benefits are only slight, yet the drawbacks are *huge* for the web. Google Analytics and Adsense as a service *require* trusting your DOM to a third party domain. Using jQuery doesn’t.
If you’re encouraging people to make the web considerably less distributed in nature, and encouraging them to build a single point of failure, all for a barely noticeable improvement in performance, you’re doing it wrong.
GMail and Chrome are definitely relevant if you truly believe that Google’s content distribution servers being hacked is a realistic fear to act on. For all I know, I’m using a maliciously patched version of Chrome to write this comment.
I understand your point (if you look through the previous comments, it’s been discussed a few times before), but I can’t agree with the implied risk assessment.
It’s incredibly unlikely that Google’s CDN will be compromised to deliver a malicious copy of jQuery to begin with. Even if that did happen, it would certainly be discovered within hours, if not minutes. The sheer number of eyeballs on those files is the best anti-virus there is. In fact, I’d be shocked if there aren’t people out there running an automated, periodic diff on the CDN copy and the latest copy on the jQuery website to monitor for that specific issue.
Meanwhile, if someone compromises your server – a far, far more likely event – and replaces your copy of jQuery with a modified one, you’ll probably never know. Unless the change breaks something, the overwhelming majority of site owners (myself included) wouldn’t be vigilant enough to notice a subtle change like that in files they assume to be safe. I’d be much more worried about that vector than worrying about a black swan event in spite of the mitigating crowd-sourced security that comes with a popular public CDN.
Ultimately, I’m sure we’ll have to agree to disagree about this, and there’s nothing wrong with that. I do want to be sure that you (and others reading this later) understand that those of us advocating for the public CDN approach have considered the security implications though. We just don’t agree that the risk is probable enough to justify slowing the web down for fear of it.
It has a nice API and I am glad that they are free, but just know that even if the static IP addresses for the company, often not know something about the location.
1. What about the best practice of minimizing http requests by combining scripts into one single javascript file?
2. What’s best for sites requiring login and https?
Over time, the solution I’ve found to be optimal is to use the Google CDN for jQuery (with a local fallback, just in case), combine site-wide plugins into a single minified bundle to optimize long-term caching for the portion of scripts that changes infrequently, and then include your more volatile custom scripts in a separate minified bundle.
You can see that in action here at the bottom of the source for this page (except I haven’t bothered to minify my custom script since it’s so small).
Regarding HTTPS, the Google CDN does support HTTPS references to all of its content. An easy way to ensure you’re always using the best reference is to use the protocol-less URL. Then, browsers will automatically select the appropriate protocol based on what was used to load the page.
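Roughly, the end of the page ends up looking something like this (the version and local file names are just placeholders):

<!-- jQuery from the CDN; the protocol-less URL matches whatever protocol the page was loaded with -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="/js/jquery-1.7.1.min.js"><\/script>');</script>

<!-- Site-wide plugins, combined and minified into one bundle that rarely changes and caches well -->
<script src="/js/plugins.min.js"></script>

<!-- More volatile, site-specific code in its own bundle -->
<script src="/js/site.min.js"></script>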
You can take the latest version from the Google CDN with a version-less reference (e.g. /1/ instead of /1.7.1/ in the URL):
I prefer to hotlink to the Google APIs too, but some clients insisted on uploading it to their server. They said «it’s safer, what if the Google site crashes?». Personally, I have never seen Google down or hacked.
But yeah, I agree, it’s stupid to have a billion copies of jQuery in your browser cache.
This discussion only seems to be addressing the question of whether the jQuery library should be hosted by Google or not. I think a more important question is whether one should be using jQuery at all, because in principle everything that jQuery can do can be done with basic JavaScript as well (essentially it is just a kind of restructuring of JavaScript). In many cases (e.g. just for a drop-down menu) it simply is overkill to have to download the whole jQuery library, when a little bit of CSS and JavaScript does the job as well.
Even more importantly, I found that jQuery code (or that of modules using jQuery) can interfere with standalone JavaScript code. For instance, for our company website, I had written some JavaScript code that, for reasons of optimum page display, enforced a page reload when the window was resized. Recently we had this website completely redesigned externally, where the designers also added a couple of features using jQuery. However, due to certain features in the jQuery code (which I wasn’t actually able to pinpoint exactly), in IE7 this now forced a reload as soon as the page was loaded, so an infinite reload loop developed (which obviously IE7 users were not happy about).
As a result, I have now replaced all modules using jQuery with CSS and JavaScript code compatible with the other JavaScript on the page. At least this way I can keep control over the script actions and no unexpected results arise.
Thomas
@Thomas, yes, why on earth would anyone use a tool that makes programming easier and reduces code size by as much as 70%? I don’t understand those people. In fact, I’m sending this message in Morse over the telegraph because, as you’ve probably understood, that’s all the internet is: communication over wires. I for one prefer doing things the old and slow way, and I was excited when the homing pigeon informed me there was a like-minded soul somewhere out there. Yours sincerely, Hendrik.
@Hendrik,
I take it you are trying to be ironic here and actually dismiss my point of view.
Let me again point out that our company website was recently redesigned, with the designers not only implementing jQuery but also Mootools scripts for such basic things like dropdown menus, banner rotation with fade-in etc. As a result the homepage weighed in at over 500kB, which even on broadband connections can cause a noticeable waiting time until the page is fully rendered. Not to mention, as I said already, the problems which certain features in those scripts caused for the correct execution of other Javascript code. And since locating and debugging these kind of conflicts proved to be an utter nightmare, I decided to remove all the jQuery and Mootools scripts and write myself dedicated Javascript/CSS code that does exactly what the former did as well. Not only is all the scripting now without any internal conflicts, but the download size of the homepage has been cut by half to about 250kB.
So I really don’t see what in this case your argument of easier programming and reduced download size is based on. There is nothing that jQuery can do that simple Javascript couldn’t, and in most cases, the latter will provide by far the more efficient solution, not only as far as the amount of code is concerned, but also with regard the maintenance of the code (because if you want to keep control over all aspects of the scripting, you would have to know exactly what each jQuery statement does in terms of the Javascript involved, which I doubt anybody but the authors of jQuery would be able to do).
Just bolting another programming language on top of an already existing one may admittedly keep some additional programmers in bread and butter, but is in most cases problematic and inefficient and thus user-unfriendly.
Thomas
Sure, there might be cases where sticking to standard JS is a good option. However, there are millions of websites that are greatly helped by jQuery, no question about it. This specific thread is meant for people already convinced of its advantages and who are debating a jQuery specific problem. Therefore, while your argument may be valid in some cases, I’m sorry to say that your comments add nothing to this discussion.
I’ve always stored JS libraries locally. Now I see that it is a good idea to do it the way you recommend.
Just a heads up, you left http: out of the jQuery call.
The full thing should be:
Thanks, Bryan
That was intentional. When you use a protocol-less URL, the browser will automatically use the protocol that the base page was loaded with. More info here: http://encosia.com/cripple-the-google-cdns-caching-with-a-single-character/
Interesting.
Yeah, it is a bit tough to let go of the warm and fuzzy way of doing things. Considering, I’m all about minimalist code this should make me happy in the long run. :)
Thanks, Bryan
True Google has a lot of speed and would save you bandwidth. Cloudflare is a free CDN that would also do this for you for free.
Unfortunately, hackers will attack this if everyone uses it. A hacker could mount a MITM (man-in-the-middle) attack, which would modify the contents of the jQuery script. Since Google is trying to make your browser cache jquery.js, the hacker’s code would remain in your browser’s cache and would be loaded every time you load a website that uses the Google-hosted jQuery. This is a common attack on large computer networks, but home users on private networks should not have to worry about this.
The only problem we have with CDNs, is filtering. Sometimes you use a CDN library, but that library can’t be delivered, because the CDN itself is filtered in a specific country. However, all other reasons are valid. Thanks.
If you use a local fallback (example: http://weblogs.asp.net/jgalloway/archive/2010/01/21/using-cdn-hosted-jquery-with-a-local-fall-back-copy.aspx), users in those countries should get an immediate fallback to your locally hosted copy.
Thanks! Great post. I also noticed I was having a lot of issues with some servers hosting jQuery locally. I use Google instead now.
Really bad idea. I am presently working with a user of a web-based app that has done this, and the user’s firewall or proxy is blocking the URL.
That’s why it’s a good idea to also use a local fallback (ala: http://weblogs.asp.net/jgalloway/archive/2010/01/21/using-cdn-hosted-jquery-with-a-local-fall-back-copy.aspx) immediately after the CDN reference, for those few users that can’t load it from the CDN.
There is one benefit you didn’t mention. Google gets free tracking data on all the pages you use it on. Yay! ;)
I have no way to know whether they’re trying to harvest some sort of tracking data via the CDN or not, but it would be incredibly poor quality data for individual tracking. The domain is cookie-less and your browser can go for up to one year without ever making a second HTTP request to the CDN for a given script. So, they’d only get useful tracking data on you if you were the only user of a static IP address and had caching disabled in your browser.
First off, I liked your article, and used the method for a client, just now.
But, you are downplaying the tracking information Google does get.
Every time you load a page with this code, it will send a request to Google. Then, Google will send back a 304 “Not modified” response, if the file hasn’t been modified. You can verify this for yourself, using Firebug or another network monitor.
What this means, is every time your browser sends that request, Google gets your IP, your user agent (browser / os), the time of your request, and the referring URL (the page you are on). In my eyes, that’s a good amount of data.
The primary advantage of using a far-future expires/cache-control header is that the browser doesn’t need to make any request at all to use its locally cached assets, not even a 304 check.
You’ll see a 304 when you force a refresh in some browsers, but no request at all when you’re making regular requests to pages. Click around my site here (not manually refresh) and watch the network tab in your browser. You won’t see an HTTP request for the Google CDN reference after it’s been cached once (at least not for a year).
Note: caching is enabled in my browser. Also, from the user agent, different devices could be distinguished, i.e. a smartphone, a Mac, a Windows 7 computer, a Windows Vista computer.
Just saying. :)
Oh, you are right about the caching! Guess I was being overly cautious
Nothing wrong with a little caution. It’s probably a safe assumption that Google’s tracking you whenever they can. This is just one of those few situations where even Google shouldn’t be able to extract any useful tracking information from the data they’ll be getting.
One major problem we have hit today… OpenDNS has just categorised apis.google.com as a phishing site, meaning anyone who uses OpenDNS can’t access the jQuery we link to via the Google CDN.
Ok, not a Google problem, but a big enough headache today!!
it is a headache indeed – I’ve just switched back to local :)
Using a local fallback (e.g. http://weblogs.asp.net/jgalloway/archive/2010/01/21/using-cdn-hosted-jquery-with-a-local-fall-back-copy.aspx) will handle DNS issues like that automatically, without penalizing the majority of users who have good DNS.
edit: Looks like this is already fixed: http://forums.opendns.com/comments.php?DiscussionID=12755#Item_12
I wonder if Microsoft’s jQuery hosting domain got tagged as phishing as well
DEAD LINK ALERT – I would guess that WordPress is stripping the http: from the beginning of the link, resulting in this: “//ajax.googleapis.com…………….”
This is happening in the link at the top of the tut and at the bottom.
But it’s a great tut, very informative.
Actually, that’s done on purpose. This way, if you are on an HTTPS page, it matches that.
Ah ok… Worth noting that that link does fail if you try to run it from a local HTML file, say on your desktop… as I did. I guess ‘live’ would be different – great article though.
I can’t use google as my CDN because they have not enabled CORS.
A CDN doesn’t need to enable CORS for you to use its assets on your page. That’s only necessary for AJAX requests, not for referencing third party scripts.
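A quick way to see the distinction (same CDN copy in both cases):

<!-- No CORS involved: a plain <script> reference executes a cross-origin script just fine. -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js"></script>

<script>
  // CORS only matters when you try to *read* a cross-origin response with an AJAX request,
  // which the same-origin policy blocks unless the server opts in with CORS headers.
  $.get('https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js')
    .fail(function () {
      // Without Access-Control-Allow-Origin on the response, the request ends up here.
    });
</script>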
Hi, I don’t know if somebody has mentioned it, but ideally all websites would link to jQuery (hosted on the Google AJAX Libraries) without specifying an exact version. Because if one website links to jQuery 1.6.0, another to 1.6.1, and another to 1.6.4, then it will be much the same as linking to a copy on your own server.
So e.g. instead of linking to ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js
we should all link to ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js… see the difference? As soon as jQuery is updated, everybody is already pointing to the latest version, the caching is still there, and there’s no need to update your website.
Disadvantage: if jQuery breaks something, you have to correct your scripts or link to an older version. But I would risk that; after all, if something stops working then you just link to the older version and voilà: fixed.
You shouldn’t do that in production. In order to reliably serve up the latest version for /1 or /1.n, the CDN responds with very short caching expiration headers. So, you lose almost all of the caching benefit. At that point, serving jQuery from your own site with proper caching headers would actually be faster for returning visitors.
This might be a good idea if google delivered its jquery cache as type text/javascript. Unfortunately, it delivers it as text/plain, which is great for reading it but is likely to result in the script being blocked by any halfway-suspicious anti-XSS software.
Some individual users may be able to work around this, but my view would be that a site that gives me an hour’s work checking out whether I can trust this cache (which I haven’t done, so I don’t even know whether, for instance, google allows third parties to host content) isn’t worth visiting.
I keep google analytics permanently blocked. Why would I trust google not to move some analytics code onto this site?
This is the Content-Type that the Google CDN is serving jQuery to me with right now:
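(It’s the same header that shows up in the response dumps earlier in these comments.)

Content-Type: text/javascript; charset=UTF-8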
In any event, potential problems like that one are why it’s a good idea to also use a local fallback (see previous comments for links and examples of that). That way the majority of your users will benefit from the CDN, and the few with troublesome setups will still be okay.
Also, keep in mind that the ajax.googleapis.com domain is cookie-less. It’s not suited to analytics. Without a way to disambiguate dynamic IPs and machines behind NAT, trying to extract individual tracking data from the CDN would be pointless.
So this is one of those articles that encourages “webdesigners” to include content from Google.
I’ve blocked these Google services for privacy reasons and it really p*ss*s me off seeing that more and more websites just don’t function any more because some web-wiener thought it would be nice to remotely include this stuff from big G. Really, this is not only a bandwidth/latency issue – you’re stealing a bit of your users’ privacy and potentially locking out those who care about it – and really, privacy decisions should be left to the user!
Please also mind that there are many company/country networks that block Google services or cross-domain JavaScript inclusions (for “security” reasons). For those of us that are bound by these restrictions, some websites just don’t work any more, because somebody followed your idea and thought it would be great to include the script directly from Google.
By the way – the “decreased latency” is outweighed by the fact that a browser loading the site has to resolve the (additional) domain name, wait for the reply, and only then is able to make the request.
Do you know if Google or Twitter provide a similar service to host the Twitter Bootstrap library and CSS?
I found that this link is used in Twitter Bootstrap:
http://twitter.github.com/bootstrap/assets/css/bootstrap.css
Do you think it is a good idea to depend on this link?
No, I wouldn’t use that in production, because it’s not being served with a far future expires header.
About the tracking of information by Google: if you are already using Analytics, AdSense, or other Google products on your site, Google is already getting it.
Which is why I don’t use Google Analytics, Adsense, or Plus. I block most third-party content and am dismayed by web sites which compromise user privacy. I thought this public CDN craze was really great a couple years ago, now I avoid it.
Web sites that rely on public CDN’s are like “that guy” who forwards emails without using BCC. Don’t be “that guy”.
You, sir, are a conspiracy theorist.
Ad hominem is fallacious argumentation. Please debate in rational and supportable terms.
I agree that we should keep name calling to a minimum, but AntoxaGray isn’t far off the mark here though. Between the +1 year far-future caching and cookie-less domain, there’s very little (if any) tracking data that Google’s going to extract from CDN usage. Even if they tried to, that intermittent, anonymized data is going to be worse than what they already have on almost everyone through other sources anyway. You should welcome the prospect of them muddying their tracking data with noise from CDN usage!
Hi Dave. I don’t know what you mean by “worse”. Anything obtained includes IP and Referer which—especially in synergy with the vast amount of information already held by Google—breaches user privacy. Any “hit” provides Google with new information that IP address X is interested in example.com, and shapes what is known about the person at IP X.
Google is primarily an Internet advertising company. Knowing that X has an interest in example.com is extremely valuable information. Why do you think Google offers a CDN? Out of the goodness of their heart? Possible, but unlikely.
Now, I can offer a cookieless far-future expires just as much as Google can. This reduces the advantages of a CDN leaving very little substantive gain. We can respectfully disagree on the importance of that gain vs. privacy, but the gain is minimal (or negative in some cases of DNS lookup or certain countries) and first-hit only. Add the complexity/risk of needing a local backup or development implementation and the gain gets smaller.
Until you know someone who has been subject to the abusive and impenetrable side of Google it may sound like conspiracy to you. Or, you may find the perceived benefit worthwhile despite rare–but real–stories. I personally find otherwise, and I don’t base my opinion on conjecture or superstition; it is a personal weighing of attack surface vs. benefit.
One additional issue which I feel muddies the whole idea of public CDNs is versioning. Caching myriad versions of jQuery (and other libraries) reduces the benefits of shared caching. IMO, a public CDN should push the latest (i.e., bug-free, most secure) code base. But, as you mentioned previously, requesting version "1.*" from Google defeats the far-future Expires header. So we end up with the millions of unmaintained sites on the Internet all causing the caching of many years-old versions of jQuery. Needlessly. Yes, I know that breaking changes enter this discussion, but I think the whole public CDN idea needs further development to overcome versioning hell and the many other issues raised by commenters above.
I do find your article to be incredibly well done: persuasive, technically accurate, and informationally valuable. I was initially persuaded but reversed my opinion upon further reading and deliberation.
Thanks for adding your knowledge to the web!
By “worse”, I mean “worse”.
I don't work for Google and I'm not privy to any of the algorithms the AdWords/AdSense teams use to target ads, but I do spend a lot of time working with web analytics. Without a tracking cookie to identify a particular browser, a simple IP/referrer combo is worthless given the prevalence of dynamic IPs and the multitude of devices that masquerade behind a single IP with NAT. In fact, that kind of data is worse than worthless, because it pollutes your existing data if you try to include it in information gathered via cookie tracking. Google has far too many high-quality inputs to want to muddy that data with the CDN's random noise.
If you have specific information to share instead of FUD and innuendo, please do so. If I had concrete information that showed the Google CDN to be a credible privacy threat, I wouldn’t recommend using it for one more second.
Dave, you might be 100% correct. Your suggestions are valuable, but I find that I can address the performance aspects 99% as well with local hosting, and have achieved top rankings and 100% PageSpeed scores without CDNs. I'm not seeing the last 1% (or whatever) gain as significant enough compared to the other issues, of which privacy is only one. It might be a few milliseconds or even a tenth of a second, but it's only the first hit. I might go the other way if it were a quarter to half a second.
I don't intend to derail the discussion by delving into Google's abusive side. That is only one aspect of my thinking, which is not single-issue oriented. Google does have an occasional practice of unilaterally closing AdSense accounts and walking away with the cash balance without explanation, warning, evidence, or appeal.
Specific info:
http://www.nichepursuits.com/i-just-got-banned-from-google-adsense-now-what/
I will not get dragged into a debate on this topic. I provide it for your edification only. I understand the reasoning from both sides extremely well. If you knew someone who got sacked by Google for big bucks by this brick wall treatment then you might consider they aren’t the next best thing to sliced bread. They are a behemoth entity with their own interests in mind and will drop business partners on their head in a hot second and kick you out the back door without explanation, recourse, or so much as an email address or phone number to contact them.
Once again, it's not about a single issue, so I'm not interested in diverting to the above topic; post on that guy's site if you want. It's the bigger multi-pronged picture of privacy, added complexity, and the small gain vs. local control. I prefer local control for my websites (and for my government).
Your article remains a valuable contribution to the Internet community and provides great feedback regarding the use of CDNs by many others that I would not have conceived of on my own. I hope the discussion of supporting and opposing views will spark ideas, bring improvements and further the industry. Thanks again.
The benefits being shared here and elsewhere are all theoretical. I just came across an in-depth analysis of whether using a CDN provides the expected performance benefits:
http://www.root777.com/appdev/does-using-google-libraries-api-cdn-give-you-performance-benefits
Fragmentation is a legitimate issue, but you have to keep in mind that a single CDN reference in the top sites that he’s looking at will prime millions of browser caches each day. Coverage doesn’t need to be as thorough as you might intuitively think.
Also, his analysis didn’t go far enough. When you confine your analysis to larger sites, it’s much more likely that they’ll have their own CDN in place and not care to use a third party. When I did something similar a couple years ago, but took my crawler all the way down the top 1,000,000 sites (by Alexa ranking), I found that references to the Google CDN increased the further down that long tail that I went. So many WordPress themes, sites using H5BP, and other starting points use the CDN reference that its coverage among all sorts of new and/or niche sites is really great.
Putting those two facts together, I stand by my recommendation to use Google’s CDN for jQuery.
I use WordPress and having my own server cope with loading jQuery really seemed to slow my site down. After going with your CDN approach, my site now loads a whole 2 seconds quicker (testing with Pingdom).
http://codex.wordpress.org/Function_Reference/wp_enqueue_script
With the average web page nearing 1 MB these days, 28 KB is, for me, a negligible tradeoff for being absolutely sure that my website will be functional no matter what happens to the Google CDN:
1. It may be down or inaccessible (just google for ajax.googleapis.com and see how many people have problems).
2. It may be blocked by your firewall or your sysadmin.
3. It may be hijacked to serve malicious code.
4. Google might decide to change the naming conventions (this happened to my App Engine site once).
5. You cannot run your website on a local network that has no Internet access.
etc…
This page you're commenting on, with social sharing widgets, ads, images, and nearly 400 comments, only weighs in at 365 KB (at the time I'm writing this). jQuery 1.7.2's 33 KB is nearly 10% of that; not negligible.
You’re right that it’s a good idea to prepare for situations where the CDN is inaccessible, but falling back to a local copy in those situations is simple.
As for the security FUD, those doubts have been raised before, but I don't find them credible. It's far, far more likely that your own server will be hacked than Google's hardened CDN. On the other hand, if the Google CDN were hacked, that would be front-page news within minutes and remedied immediately, whereas orders of magnitude fewer eyeballs are watching the assets on your server for you. This is one case where you're actually safer sticking with the herd, IMO.
Google concerns me more than any hacker. Google is not our friend; they are a powerful entity which is even more empowered by their CDN. Other performance gains can be had without sacrificing user privacy.
Self-hosting with a far future expires header is every bit as effective except for the first hit, and perhaps faster considering the separate DNS resolution required for Google’s CDN.
I don’t find a modest one-hit performance gain to be sufficient reason to overlook privacy.
Dave,
When I first read this a couple years ago I was totally sold. Now, I’m switching back to local.
(1) PRIVACY, PRIVACY, PRIVACY. Google is evil. They really are. And if they aren’t yet, they will be. Or they will be compromised by government.
(2) I can set caching for one year also.
(3) I’ve now set up my own static domain to address parallelism and cookies.
AND THE BIG CDN KILLER…
(4) The latest server technologies automatically combine multiple .js and .css resources into one download, minifying and compressing them at the same time (see the build sketch just below). There is now much less penalty for maintaining customized jQuery and other resources locally.
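A sketch of one such build step (my assumption, since the comment doesn't name a specific tool): a gulp task that combines and minifies local scripts into a single bundle.

// gulpfile.js sketch; file paths are placeholders.
var gulp = require('gulp');
var concat = require('gulp-concat');
var uglify = require('gulp-uglify');

gulp.task('scripts', function () {
  return gulp.src(['js/jquery.js', 'js/plugins/*.js', 'js/site.js'])
    .pipe(concat('bundle.min.js')) // combine into one file
    .pipe(uglify())                // minify the combined output
    .pipe(gulp.dest('dist/js'));   // write the single bundle
});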
GREAT ARTICLE, NEVERTHELESS! THANKS! Your arguments remain powerful ones but I think the benefits can be reasonably reaped without sacrificing privacy.
Best wishes!!!
I've been hosting locally, and while reading this article I was beginning to feel a bit foolish, but you guys provide some good arguments both ways. I generally feel like I have more control if I'm hosting the files on my side, and the privacy issue is definitely worth giving some thought. If I were pressed for bandwidth, however, I might just let them host it.
I wonder whether Google would rank your site higher if they gained more stats through your CDN links? I would if I were them and were evil~
Diversifying assets on different domain names will increase your Google PageSpeed score and is part of the secret ranking algorithm. I have achieved 100% PageSpeed scores without CDN hosted assets, however. You can set up your own static site for .js and other assets to improve parallel HTTP GETs.
For Android (and other phone) applications, the cost of data to load jQuery from any external (i.e., off-phone) host can make the article's suggestion prohibitive.
Of course, one could insist that loading from an off-phone host be limited to those times (normally few) when it can be done via Wi-Fi, rather than via the phone's costly non-Wi-Fi data connection.
That could be implemented with a connection-type test and a decision to use the local jquery.js file whenever anything other than Wi-Fi would be used to bring it in from elsewhere, along the lines of the sketch below.
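A rough sketch of that idea (the Network Information API has limited and inconsistent browser support, and the file paths here are placeholders):

// Load jQuery from the CDN only when the device reports a Wi-Fi connection;
// otherwise use the copy bundled with the app.
var connection = navigator.connection || navigator.mozConnection || navigator.webkitConnection;
var onWifi = !!connection && connection.type === 'wifi';

var script = document.createElement('script');
script.src = onWifi
  ? 'https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js'
  : 'js/jquery.min.js'; // local, on-device copy
script.onload = function () {
  // jQuery is available here; start the app.
};
document.head.appendChild(script);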
Good article. I think any web developer must use the Google CDN and understand its benefits.
Though I would recommend expanding this approach a little, to be sure that jQuery is loaded even when the Google CDN is down or inaccessible; this also helps with local website development without an Internet connection:
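For example (a sketch; the version number and local path are placeholders):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js"></script>
<script>
  // If the CDN was unreachable (offline development, blocked network, outage),
  // jQuery will be undefined, so write out a reference to the local copy.
  window.jQuery || document.write('<script src="/js/jquery-1.11.3.min.js"><\/script>');
</script>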
In general, in order to keep page load times to a minimum, it’s best to call any Javascript at the end of the page, because if a script is slow to load from an external server, it may cause the whole page to hang.
This is off-topic, not to mention a nasty habit. Sticking script tags all over the page so they are only loaded when necessary is a terrible practice; sometimes you need a script halfway through a page, etc. It's the kind of fingernails-on-a-chalkboard style that I see older developers using.
Load them asynchronously if you really need to. For example, use a callback on the google.load() call.
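Roughly like this, if I recall the (now-retired) Google Loader correctly; treat it as a sketch, with the version number as a placeholder:

<script src="https://www.google.com/jsapi"></script>
<script>
  google.load("jquery", "1.7.2", {
    callback: function () {
      // Runs once jQuery has finished loading asynchronously.
      jQuery(function ($) {
        // page initialization that depends on jQuery goes here
      });
    }
  });
</script>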
You're definitely not doing it wrong if you decide to host jQuery yourself. The Google CDN sometimes takes a while to add a new release.
I just prefer having it on my side, compressed, gzipped, and merged with other files.
It’s always good to fall back to the local copy if for some reason Google’s CDN cannot be accessed. yepnope is good for this:
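For example (a sketch of the usual yepnope fallback pattern; the paths and version are placeholders):

<script src="/js/yepnope.min.js"></script>
<script>
  yepnope([{
    load: 'https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js',
    complete: function () {
      if (!window.jQuery) {
        // The CDN copy didn't load, so fall back to the local one.
        yepnope('/js/jquery-1.11.3.min.js');
      }
    }
  }]);
</script>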
You can download it from http://yepnopejs.com/.
Thanks just so much! I was here for the link actually, and you made it really easy!
Thanks for the run-down. I read your article after installing the Use Google Libraries plugin for WordPress on a recommendation by Yoast.
Thank you for helping me implement jQuery in WordPress. Since I am a beginner in WordPress theme development, I can say that your article was excellent.
Yes, Google-hosted jQuery does indeed bring lots of benefits to an external website; for example, basically all DotNetNuke CMS sites implement it. :)
FYI, every time I forget the path to the google CDN for jQuery, I search “why you should let google,” and then copy/paste the script tag you’ve got in the article.
I’m vaguely curious how much traffic this page generates by other people doing the same thing :]
This page doesn't get as much traffic as you might think: a rough average of 1,000 views on a weekday that doesn't have any extra traffic for one reason or another.
Funny that you mention using this page for reference. I do the same thing myself when I need to add the CDN reference to a page. That’s why I added the easy copy/paste text input at the top and try to keep it up to date with the latest version. Glad to hear that you find it handy too.
THANK YOU for this article. Your explanation of the difference between using the local version and the hosted version helped me so much.
I wondered why that never worked when testing on my local computer.
So simple. Insert the "http:" prefix, and now everything works perfectly.
YOU are a STAR !!
When it comes to institutional establishments such as schools, which have very heavy proxies and security settings, the administrator is often the one responsible for the firewall, and when web applications reference multiple external domains it can be a real pain to chase them through the DOM to allow the site to work properly. Including all files on one domain simplifies this task, so whether or not to use a CDN depends solely on the market's needs.
I agree that if you’re 100% inside an Intranet, especially one with only a single physical location, it does make sense to skip using the CDN.
Google’s CDN isn’t 100% reliable, so make sure your site falls back to a local copy.
A quick check revealed that ajax.googleapis.com adds additional cookies to your uploads and downloads: about 250K. Apart from wondering what Google might do with cookies from these static pages (spy on you, I suppose), it adds an additional burden to uploads, since cookies are transmitted as part of the headers. This might not seem like much, but with the advent of mobile devices on metered connections it is less desirable.
Apart from Google spying on you :-)
Host frameworks on your own server; it actually is easier.
developers.google.com/speed/libraries/, where the documentation is hosted, does set routine tracking cookies. The images on the 404 page you see if you visit ajax.googleapis.com directly are served from google.com, so they also carry some tracking cookies.
However, ajax.googleapis.com itself is a cookie-less domain. There are no cookies involved in requests or responses for the actual libraries hosted there; for example, a request for jQuery 1.9.1 a few minutes ago came back without any cookies.
I definitely wouldn’t recommend any CDN that used cookies, for performance reasons alone.
Was just here for the link… nailed it. Beers to you.
M@
Interesting article and discussion. However, I haven't seen a counter-response to the 'man in the middle attack' issue raised by Chris M. and Travis Cunningham. Am I missing something here? It seems to me quite a powerful argument against using a CDN.
You forgot the 4th reason – to make it easy for the NSA to compromise the security of your web application and impersonate your users without you noticing, by handing Google a ‘National Security Letter’. :-(
I'm skeptical of the NSA and their relationship with companies like Google, but this kind of conjecture is purely FUD. Of all the attack vectors the NSA probably has at its disposal, it's almost inconceivable that they'd choose to muck with something so public and easily detected. Not to mention that a site-specific attack through a poisoned version of jQuery would be unreliable anyway, due to the +1 year caching directive and the cross-site caching effect.
Amazing post, Great details.
I did the plugin for WordPress before reading it, yet it was incredibly educating and well written.
Thanks
I like the idea, but then everything that relies on jQuery is broken in places like China where ajax.googleapis.com doesn’t work.
That’s an interesting one!
I use this site a lot, just to copy your code. Anyway, your SEO is failing: you were like the #1 result, but now, after quite some time, it has dropped to 7th place… just sayin'
Out of curiosity, what’s the search that you do when you’re looking for this page?
Well, my search has always been "jQuery Google CDN". Maybe it's localized, but at first it was a bigger result with an image, if I'm right… now it's just a normal result.
Anyway, your proposal is fine for HTTP (and HTTP only) if a mandatory fallback is set.
Here is an example:
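(A sketch of the kind of snippet described; the version and local path are placeholders.)

<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js"></script>
<script>
  // Explicit http:// prefix: the browser raises a mixed-content warning if
  // this tag is accidentally left on an HTTPS page. The second line is the
  // mandatory local fallback.
  window.jQuery || document.write('<script src="/js/jquery-1.11.3.min.js"><\/script>');
</script>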
(I suggest including "http:" explicitly before //ajax.googleapis.com so I get the relevant security alert if I forget that code in an HTTPS page: there is a real danger of being the victim of a man-in-the-middle attack!)
Isn't this all moot now? I believe Google Code and related hosting are going away in the next few months.
Which is, clearly, the number one reason to not use free third party hosting for anything that your product relies upon.
This is a basic rule of thumb for development – make sure your product is entirely self sufficient.
However, I think it's tragic that their development hosting is going away – they claim it's due to the popularity of GitHub and the like in recent years. Still, nothing out there has the hosting speed that we have found on Google Code…
Google Code isn’t tied to Google’s Hosted Libraries CDN in any way. The project hosting shutdown won’t affect the CDN described in this post.
Google Code isn't the same thing as Google Hosted Libraries. There's nothing on the Hosted Libraries site ( https://developers.google.com/speed/libraries/ ) that suggests it's going away. Even if it did, there's also CDNJS.
Oh my, you are awesome!
I was using jquery-1.11.1.min.js (90 KB), but now from the Google API it is jquery-1.11.2.js at only 32 KB.
Very cool
Thanks for sharing
This is very helpful! I'm still new to this and I would really like to learn about jQuery. I've read a lot of tutorials and consulted different references just to understand what jQuery really is, how it's written, and how it works. Based on this page, http://www.lionleaf.com/blog/what-is-jquery/, it makes effects and animations easier. But given what you've said about letting Google host jQuery for me, would it work in any web browser, or just in Google Chrome?
Oliver
Correct. The Google CDN (and all other CDNs I’m aware of) is not browser specific. It’s just HTTP — it should even work with browsers from the 90s.
So, does that mean that Google's CDN is quite outdated?
Not at all.
Be aware that if you are calling libraries from Google’s CDN, your code will fail if it is running in China behind the firewall.
What about modern CDNs? Can they cope well with long distances? I work with CDNsun. My site has lots of media. Can Google still cope with it?
A static content CDN is fairly simple. I’m not aware of a distinction that would make a CDN “modern”, other than maybe HTTP/2 support.
The killer benefit when using Google’s CDN for popular libraries like jQuery and Angular is that so many sites reference those URLs that your users may already have the file cached for instant use before they’ve ever visited your site the first time. Using a site-specific CDN doesn’t allow for that benefit.
You may want to have a look at the book "How to Win Friends and Influence People".
Telling people they’re doing something wrong is a really bad way to get them to change. They get defensive and no matter how right they know you are, they’ll dig in and argue against you.
I host jquery locally and I quickly browsed your reasons, but because you’ve put my back up by telling me I’m wrong, I didn’t really bother reading the details and I’m going to continue to host locally….so you’ve failed to influence me – which is the point of your blog (unless the point is to just show off your knowledge).
"How to Win Friends and Influence People" – it's a good book – I've learnt some things from it.
Back when I wrote this, the “you’re doing it wrong” thing was kind of a meme for a while. Don’t take it too seriously.
Not sure I’d agree that the one phrase has hampered the post’s influence though. You wouldn’t be here reading this almost eight years later otherwise.
Hi Dave, for years I’ve come to your site every time I need the URL for Google’s CDN, but recently the widget around your code block has made it very annoying to simply copy the text. When I try to select the text to copy, the block expands on mouse over and deselects the text. Sometimes triple clicking doesn’t work either. Can you fix this or widen your site?
Sorry about that, Mike. I’m using a new syntax highlighting plugin and it seems to be trying to do too many things at once in that situation. I changed some settings so that it should allow you to highlight (or triple click) and copy without it interfering. You may need to wait until the page finishes loading though (and this one takes a few seconds due to the number of comments).
Give it a try now and let me know if that’s better.
Using cross-site script includes, whether from Google or whomever: YOU are DOING IT WRONG!
Since AdBlock, NoScript, and Ghostery are now mainstream, expect a broken website if you follow this dumb advice.
Have you tested that theory? I just gave this page a load with Ghostery and uBlock Origin installed and active with their default settings. The Google CDN was not blocked.
Do you have an example of one of these extensions that does actually block the CDN? I’d be interested to keep track of any that do.
Hey!
I really like how your example code blocks pop out onto the page when the mouse cursor hovers over them!
What is the name of that? I really want to learn how to implement that into my website code.
Thank you!! :D
I’m using a WordPress plugin called Crayon Syntax Highlighter to handle that.