441 Comments
  1. Paul
    Dec 10 - 4:37 am

    We tried to do this, for all the reasons you state. Unfortunately, for the few weeks that we had it that way, our local development internet connection was a bit unpredictable. Every time the internet connection came down, testing ground to a screeching halt; it just wasn’t considered worth it in the end.

    • peterp
      Dec 11 - 7:48 am

      There are ways around this. I’m not certain which framework you’re using… but in Django you set a debug setting, and you have access to it in your templating framework.

    • papsy
      Nov 08 - 3:26 am

      I realize this is an old post but in response to Paul’s comment from 4:37AM on 12/10/2008:

      Although I mostly disagree with the general attitude of this article (that I don’t really care enough to address), there are multiple things that you could have done to resolve your specific issue.

      #1 – Add a simple check somewhere in the code to determine if it’s being hosted off of the development server. For example, I always make sure W:\adam.machine (Windows) or /adam.machine (Linux) exists on my development machines. Then in the actual code, I do a quick check to see if that file exists and if so – adjust accordingly. For example, if it were determined to be the development machine – use a local copy of jQuery.

      #2 – Similar to above, except it’d work for entire networks. Assuming you have a static IP address: add a check in the code that determines if viewer_ip matches development_ip and if so – use a local copy of jQuery.

      #3 – Add a hosts entry (%systemroot%\system32\drivers\etc\hosts or /etc/hosts) that points the CDN hostname at the local server. That way, any requests for the jQuery file on Google’s server will be redirected to the local server, and thus a local copy would be used.

      #4 – Similar to #3 – use a proxy that automatically serves a local copy rather than Google’s version. For example, if the development server were a Windows machine: use something like Fiddler as the web proxy and add a rule that redirects requests for the CDN-hosted file to localhost/jquery.min.js.

      #5 – Use both versions on the relevant pages, where Google’s is included first and the local copy second. Modify the local copy so that it checks if some_jquery_function() has already been declared (may wanna do a couple to be safe, in case x_function() is removed) – and if it doesn’t exist, continue [including the local copy] as expected. If the function already exists, though, that must mean that the inclusion of Google’s copy succeeded and thus the local copy doesn’t need to do anything more than the check (e.g. automatically return).

      I’m sure there are dozens of different ways this issue could be addressed, even aside from those listed above. Point being: when something poses an unexpected problem, do whatever necessary to resolve it, rather than simply removing whatever caused it. I’ll admit that sometimes it does take a certain level of creativity – but we all have the power to overcome it. It just takes a little bit of thought and more times than not, an equal amount of effort.

      (NOTE: I didn’t read anything else on this page, so if any of those solutions have been suggested already – I apologize.)

    • AntoxaGray
      Aug 21 - 3:54 pm

      That’s why you should include a local fallback script; look at html5boilerplate.

  2. Jonathan
    Dec 10 - 4:38 am


    The only thing that bugs me about this is that the intellisense in VS2008 doesn’t pick up the google js file. This isn’t a big deal as I just add a local reference for working with it and then remove it before deployment.

    Would be good if the intellisense worked on remote JS files.


    • Dave Ward
      Dec 10 - 10:20 am

      If you’re working in JavaScript include files, you can use:

      /// <reference path="/path/to/jquery-vsdoc.js" />
  3. Ian Roke
    Dec 10 - 5:41 am

    This is a great post that expands on a post I made on my blog, called Differences Between Local And Google Hosted jQuery Installations, so hopefully anybody who comes here needing help with jQuery on Google will find all the information they need!

    Keep up the good work, Ian.

  4. [...]  3 reasons why you should let Google host jQuery for you | Encosia [...]

  5. Todd
    Dec 10 - 7:39 am

    @Paul: When doing development, you should have a local copy anyway IMHO. You’re not developing/testing on your live server are you?

  6. Paul
    Dec 10 - 8:15 am

    If you’re doing this on a public facing* website, you are doing it wrong…

    Strong words.

    What if you are serving up content over https?

  7. Marin
    Dec 10 - 8:18 am

    Using the google.load() method will also allow you to use Google’s ClientLocation :)

    • Jason Cohen
      Aug 21 - 4:38 pm

      True, but caution: ClientLocation is often null even when it’s clear that location information is available.

      It’s a nice API and I’m glad they make it free, but just know that even with corporate, fixed-IP addresses it often doesn’t know anything about location.

  8. Dave
    Dec 10 - 8:34 am

    What if the user’s network (for some reason) is blocking Google’s CDN?

    This could happen if some crazy corporate policy exists that stops the downloading of some file on the CDN (or blocks Google entirely).

    These users will no longer be able to use your site correctly.

    • Dave Ward
      Dec 10 - 10:15 am

      I don’t think you’re very likely to find that combination in practice. If they’re willing to block 16.5% of the Internet in one shot, they’ve probably blocked your site too.

      • Dave
        Dec 10 - 12:18 pm

        I can speak from experience – that this happens.

        It’s a royal pain, too.

      • Steve
        Jul 15 - 12:38 am

        Indeed it does. And the people on the other end generally won’t be back.

    • wekempf
      Dec 10 - 4:34 pm

      If you’re coding the site correctly, the experience will degrade but not lose any functionality in this scenario. Since some users turn JS off at the browser level, you should be doing this anyway.

      • Jim Robert
        Dec 22 - 3:54 pm

        yeah yeah, but let’s face it – not everybody has time to pour into perfect degradation when < 1% of their users fall into the no-js category

  9. Gabe
    Dec 10 - 8:52 am

    Thanks for the write-up, Dave. I’ve been using the Google-hosted jQuery for a while and wondering if I shouldn’t be. Your points remind me why I started doing it in the first place and make me feel more confident that I should continue doing so.

  10. [...] 3 Reasons Why You Should Let Google Host jQuery for You (Dave Ward) [...]

  11. Zach
    Dec 10 - 11:20 am

    Couldn’t you use JS to verify Google’s jQuery script loaded, then fall back to a locally hosted version if it’s unavailable?

    if (typeof jQuery == 'undefined') {
      // load local version…
    }
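
    Fleshed out slightly, Zach’s check could look like this (the local path is a hypothetical placeholder, and the browser-only fallback is shown as a comment since document.write only exists in a page):

```javascript
// Sketch of Zach's idea: after the Google <script> tag, test whether
// jQuery actually arrived, and only then pull in a local copy.
function needsLocalFallback(globalObj) {
  return typeof globalObj.jQuery === "undefined";
}

// In the page itself this would run against window, e.g.:
//   if (needsLocalFallback(window)) {
//     document.write('<script src="/scripts/jquery-1.2.6.min.js"><\/script>');
//   }
```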

  12. Paul
    Dec 10 - 11:36 am
  13. Mark Struzinski
    Dec 10 - 12:08 pm

    My only concern would be breaking changes with a new version. I know, not probable, but definitely possible…

    • Dave Ward
      Dec 10 - 12:13 pm

      If you’re using the <script/> method, you don’t have to worry about that. The 1.2.6 URL will always load 1.2.6, even after 1.3 is released.

      If you used the google.load() method and didn’t specify the full version explicitly (e.g. “1” or “1.2”), that could definitely be a problem.
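
      As a rough illustration of that floating behavior (the version list here is hypothetical, not Google’s actual catalog), a resolver might work like this:

```javascript
// Hypothetical sketch: how a partial version string floats to the newest
// matching release, while a fully specified one stays pinned.
// The "available" list must be sorted oldest to newest.
function resolveVersion(requested, available) {
  if (available.includes(requested)) return requested; // exact pin, e.g. "1.2.6"
  const matches = available.filter(v => v.startsWith(requested + "."));
  return matches.length ? matches[matches.length - 1] : null; // float, e.g. "1" or "1.2"
}
```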

    • bazztrap
      Dec 10 - 1:09 pm

      Over engineering?

  14. [...] 3 reasons why you should let Google host jQuery for you | Encosia Doing so has several advantages over hosting jQuery on your server(s): decreased latency, increased parallelism, and better caching. (tags: 2008 mes11 dia10 jQuery Google blog_post) [...]

  15. tony petruzzi
    Dec 10 - 4:08 pm

    This is probably the most worthless thing Google has come up with. Yeah sure it’s nice that they are supporting the libraries, but really, how is this beneficial? All we do to handle the problem you describe in your post is to download the minified versions of all the plugins and then cut and paste the code into one file.

    Now we don’t have to worry about sending 15 files across the pipe (we use a lot of plugins) and all we’re really sending is the minified files and our custom javascript file. 2 files and that’s it.

    • Dave Ward
      Dec 10 - 4:20 pm

      Why would you want to force me to re-download jQuery in your combined include? That’s an unnecessary waste of time and bandwidth.

      Script combining is generally a good idea, but rolling jQuery into your monolithic JS include will never match the performance of using Google’s CDN; not even for users with an empty cache.

      • Scythe
        Dec 11 - 10:33 am

        You’d be very, very surprised at how much it does matter. Rolling it into 1 big JS is the preferred way to package a JS-packed site.

        • Dave Ward
          Dec 11 - 10:35 am

          If you roll jQuery and all your plugins into one JS include and I use the Google hosted jQuery in conjunction with a combined include with just my plugins, my site will load faster than yours every single time. Even for users with an empty cache.

          • Scythe
            Dec 11 - 4:23 pm

            I could argue this all day long since you seem overly stuck in your ways, but 95% of the time it’s far faster to roll up your own JS file, jsmin it, and gzip compress it to your users.

            Unless you’re running a heavily loaded server (and even then), it would still be faster to load one file than it would be to do 2 full remote calls.

            Of course the timing is also dependent upon a few things: your distance to the server hosting the files (hops), your connection, the call order, and how loaded your server is.

            I’m not a big fan of someone proposing an idea as the one and only way to accomplish something – with all other ways of doing it wrong (which is basically what you said, exempting LAN of course). Also, comparing via one tool on one browser is the wrong way to go about showing performance gains.

            Anyway, I’d suggest you check out some other methods before claiming a dependency as the best way to do it. I call it a dependency because you’re now fully relying on a third party. There are many ways to deliver a payload faster – through things such as dynamic JS importing/loading, etc.

            • Dave Ward
              Dec 11 - 4:44 pm

              For what it’s worth – so you don’t think I’m picking one way of doing things and arbitrarily claiming it’s best – I’ve been testing it both ways for quite a while. Even the old Google Code hosted version, which was less optimized for serving users, was faster than rolling it up.

              That’s before you even consider the users who will show up and already have it cached locally.

              I’d love to see a post making the counterpoint, with numbers. If you made one, I would be happy to link to it from this post, to offer more complete information.

  16. [...] Thanks to this article for pointing out this option. [...]

  17. Vaibhav
    Dec 11 - 7:38 am

    Interesting post. Thanks for sharing this.

  18. ernesto
    Dec 11 - 8:24 am

    One reason not to use it is privacy:
    Google will know all of your clients’ IP addresses, referrers, etc. It’s like using Google Analytics, but with only Google seeing your statistics.
    Under European data protection laws, it may even be illegal.

    • Dave Ward
      Dec 11 - 9:07 am

      Yeah, that’s one aspect that I’m not crazy about. At this point though, I’ve mostly given up on Google not knowing what’s going on with a public facing site’s traffic.

    • Ell
      Dec 11 - 3:23 pm


      Privacy was my first thought while reading this article. One should try to not give Google more information than absolutely necessary.

    • manveru
      Dec 11 - 9:43 pm

      Google will know all of your client’s IP-addresses, referrers, etc. It’s like using google analytics, but only Google seeing your statistics.
      In European data protection laws it may even be illegal.

      That’s not true. Given that a client makes a request only once a year (if the browser keeps the caching contract), I would hardly call it intrusive.
      Chances are your clients will be using Google to search for stuff more than once a year.

      • Jinno
        Dec 15 - 2:06 pm

        It would still make the request to get the 304; it just wouldn’t incur a full data transfer. Google’s servers could still be logging those 304 requests.

        • Dave Ward
          Dec 16 - 8:51 am

          Google currently sets an “expires” header of one year in the future on these files. If your browser has cached Google’s jQuery-1.2.6.min.js on the client side and you visit a new site that uses it within a year, the browser doesn’t even have to check for a 304.

          If Google really were as prying as some of the more paranoid among us would suggest, then they wouldn’t set that expires header. They’d happily pay the bandwidth bill to log the 304s.
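
          The caching behavior described here can be sketched as a simple freshness test (a deliberate simplification; real browsers also consult Cache-Control and other rules):

```javascript
// Sketch of far-future expiration: while the Expires date is still in the
// future, a cached copy can be reused with no request at all, not even a
// conditional GET that would end in a 304.
function canUseCacheWithoutRequest(expiresHeader, now) {
  const expiresAt = Date.parse(expiresHeader); // NaN if the header is malformed
  return !isNaN(expiresAt) && expiresAt > now.getTime();
}
```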

      • Nirman Doshi
        Jun 19 - 11:45 pm

        Completely agreed.
        I don’t think there is a privacy concern. Google is something that almost anyone using the internet accesses, so sharing an IP address with Google may not be a big deal if performance and code manageability are your goals.

        • Kevin P. Rice
          Jun 20 - 6:23 am

          Sharing an IP address isn’t the biggest deal… but sharing domain and Referer is. Expires headers only reduce the problem.

          Further, the same argument (only checking once a year due to Expires headers) supports self-hosting. The file is only downloaded once and the security issue is entirely eliminated. I don’t think performance suffers more than a few milliseconds (if that), and only upon the first page visited. It’s nothing. This is not even an arguable point if your pages include more than a dozen resources.

          Thus, I find security/privacy to be the overriding concern here.

    • Gordon
      Sep 10 - 3:46 pm

      Odds are they are using Chrome or have a Google toolbar installed, and Google can see those traffic patterns anyway. If they aren’t using Chrome or the Google toolbar, it’s still fairly likely they came to your site from a Google search, or from a site with Google code on it (which would again let Google see the traffic patterns).

  19. julien
    Dec 11 - 9:25 am

    Thanks for this post! But why do you initialize your script with google.setOnLoadCallback() instead of $(document).ready()?
    I still use $(document).ready() even when I load jQuery with Google’s JS API, and it works well.

    • Dave Ward
      Dec 11 - 9:36 am

      I’ve found that it depends on how fast google.load() loads jQuery and where your $(document).ready() is.

  20. [...] I ran across this article on the Google AJAX Libraries content delivery network. This is the way to go. Using this will [...]

  21. Michael Brennan-White
    Dec 11 - 9:47 am

    It would probably be of more value if Google also stored copies of frequently used plugins as well as the main jquery file.

  22. Isaac Z. Schlueer
    Dec 11 - 1:03 pm

    Even with the delay, there is one advantage of the google.load() method.

    Remember that SCRIPT tags block the download of other components and the loading of the dom. This blocking nature is primarily designed to support document.write() and other synchronous features of the language.

    When a script is loaded dynamically, it is not a blocking download. (Document.write is also broken, but you shouldn’t be using that anyhow.)

    So, even though the total time for jQuery’s load might be 200ms longer, if the jQuery loading takes a while, then your page is functional that whole time, rather than freezing up waiting for it.

    If you’re just including jQuery, a mere 16723 bytes gzipped, it’s probably not too terrible. If you were loading lots of different scripts and modules, or if perceived load time was absolutely critical, then it could be more significant.

    • Ben Amada
      Dec 12 - 12:21 am

      Isaac, that’s a great point. Depending on your existing page’s logic, switching to google.load() may or may not be an advantage if you have onload code that (un)intentionally depends on the blocking behavior of regular script tags. Definitely something to consider and test for if making the switch.

    • Dave Ward
      Dec 12 - 12:26 am

      It probably bears more thorough testing, but I’ve found that google.load() exhibits the same blocking behavior as a normal script element.

      I’m assuming (dangerous!) that google.load() works by injecting a script element via document.createElement(), which would be subject to the same blocking issues.

      • Ben Amada
        Dec 12 - 1:54 pm

        Hi Dave. If there truly were blocking with google.load(), that wouldn’t be consistent with your response to Julien’s comment above – “I’ve found that it depends on how fast google.load() loads jQuery and where your $(document).ready() is.” – when he asked why you use setOnLoadCallback() rather than jQuery’s $(document).ready().

        I haven’t done any testing, but if google.load() truly blocked, I’d think there would be no reason you couldn’t use $(document).ready().

        • Dave Ward
          Dec 12 - 5:03 pm

          Try this, for example:

          <script src=""></script>
          <script type="text/javascript">
            google.load("jquery", "1.2.6");
            $(document).ready(function() {
              alert("I won't run.");
            });
          </script>
          <script src="/scripts/site.js"></script>

          Attempting to access the jQuery object in the same block as the google.load will fail because jQuery hasn’t had time to load. Yet, if you watch in something like Firebug, the google.load() of jQuery will still block site.js until it completes loading (after the early $ access already threw an error anyway).

          • Dave Ward
            Dec 15 - 7:42 pm

            Just to underscore this point, this post has been receiving traffic from this search query:


            So, people are definitely running into this issue in practice.

            • Otto
              Jul 07 - 2:03 pm

              The google.load() function supports a callback parameter, to let you execute a function when the script has loaded. You should be using that instead for this sort of thing.

              function jq_init() {
                alert("jQuery just finished loading");
              }
              google.load("jquery", "1.2.6", { callback: jq_init });
  23. [...] A very interesting read. Check it out here. [...]

  24. Coderies
    Dec 12 - 3:32 am

    [...] Having Google host your JavaScript libraries is a good thing! [...]

  25. [...] 3 reasons why you should let Google host jQuery for you A CDN — short for Content Delivery Network — distributes your static content across servers in various, diverse physical locations. When a user’s browser resolves the URL for these files, their download will automatically target the closest available server in the network. [...]

  26. Alireza Tajary
    Dec 12 - 4:42 pm

    Hi, thanks.
    You’re right, but not for all conditions.
    Read my blog to learn why you should not host it on Google.
    Thank you.

    • Whyatt
      Dec 16 - 3:40 am

      Any comments on this? Seems Google code might be blocked in certain countries. If so, that would be a strong argument against this method. Anyone have more info on this?

      • Dave Ward
        Dec 16 - 8:40 am

        Alireza’s problem is due to the US embargo against Iran, as absurd as that is.

        • Whyatt
          Dec 16 - 10:48 am

          Anyone knowledgeable as to which countries are blocked from Google Code due to this embargo?

          Absurd and ridiculous as it may be, it could be a production-stopper if you plan on doing business in any of those countries (and aren’t already hindered from doing so because your company resides in the U.S.).

    • Ben
      Sep 07 - 2:48 am

      Once again, good point – this should NOT be done. I would rather 100,000 users have 0.5 sec of extra load time than block a single user. Point made.

      It’s like… I want it to be a good idea. I’m an optimization freak, but in practice this is just over-engineering that doesn’t help.

  27. Ben Amada
    Dec 12 - 5:51 pm

    This is a good piece of code for testing. It demonstrates that while loading jQuery via google.load(), the browser will continue to execute/process the page until it hits another piece of external content to retrieve. I do get the “$ is not defined” error.

    In contrast, there’s no error when using a script tag for jQuery since the browser completely halts execution of the page until jQuery has fully downloaded and been parsed.

    <script type="text/javascript" src=""></script>
    <script type="text/javascript">
      $(document).ready(function() {
        alert("I will run.");
      });
    </script>
    <script src="/scripts/site.js"></script>

    But with this, as Isaac pointed out, there may be a small delay in the page fully loading compared to google.load(). Assuming all your onload jQuery logic is currently wrapped up in $(document).ready(), switching to Google’s setOnLoadCallback() method seems safe to do. If Google’s CDN serves jQuery quickly, I don’t think I’d be worried about just using the script tag and avoiding google.load().

    • Simon
      Jul 20 - 5:25 am

      It’s all very well arguing for Google’s setOnLoadCallback() to initiate your code without hindering page loads, but at the end of the day it’s quite likely that you’re also calling a third-party jQuery plugin locally, and the fact that this WILL be done with a script tag will no doubt result in that script loading fully before the DOM continues to load.

  28. [...] Shared a link on Google Reader. Why you should let Google host jQuery for you [...]

  29. Oskar Austegard
    Dec 15 - 3:01 pm

    What’s the recommended pattern for lazy-loading the jquery framework (and associated plugins)? I am thinking of scenarios like webparts, where you require the use of jquery, but don’t want to cause another http request and subsequent parsing of the same library.

    • Dave Ward
      Dec 15 - 5:03 pm

      You could use something like this before first use of jQuery in each of your webparts:

      <script type="text/javascript" src=""></script>
      <script type="text/javascript">
        if (typeof jQuery === "undefined") {
          google.load("jquery", "1.2.6");
        }
      </script>
      <script type="text/javascript">
        // Safe to use jQuery at this point.
      </script>
  30. [...] 3 reasons why you should let Google host jQuery for you [...]

  31. [...] of keeping them updated and serving them quickly is more and more time consuming. Fortunately, there is a solution: Let Google and Yahoo host them for [...]

  33. jeff
    Jan 08 - 11:56 am

    I’m not sold on the benefits of Google-hosted jQuery.
    What’s your assumption regarding your visitors’ browser cache settings (on/off)?
    If a majority of your visitors have a clear cache, then sure, CDN hosting of ALL objects would accelerate loads. Otherwise, there are basic principles for improving client-side render performance without introducing a reliance on an external host:

    - use minimized versions of jQuery (remove white space, decrease file size)
    - gzip (compress what’s left to send over the wire)
    - apache mod_expires (cache for X timeframe)
    - host jQuery on a separate host (deliver static and dynamic objects from separate hosts)
    - locate JS files at the bottom (prevent script blocking)
    - load all required JS files early in the visit process; don’t double up – static object loads shouldn’t be coupled with query responses

    Also worth noting: once your browser either confirms the objects’ status or re-downloads them, you won’t notice a benefit for the rest of that session, because the local cache is used.
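
    jeff’s list mentions Apache’s mod_expires; the same “cache for X timeframe” idea, expressed as raw response headers in a hypothetical Node handler, would be:

```javascript
// Sketch of far-future caching headers (the equivalent of mod_expires'
// "access plus 1 year"), independent of any particular web server.
function cacheHeaders(maxAgeSeconds, now) {
  const expires = new Date(now.getTime() + maxAgeSeconds * 1000);
  return {
    "Cache-Control": "public, max-age=" + maxAgeSeconds,
    "Expires": expires.toUTCString(),
  };
}
```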

    • Dave Ward
      Jan 09 - 1:23 am

      Even if you ignore the caching angle, Google’s CDN is going to serve files faster than you will be able to. Their CDN is one of the best in the world. Users potentially hitting the page with it pre-cached is just icing on the cake.

      Sure, CDN hosting all static assets is even better, but how many sites really do that? The vast majority of sites in the wild use no CDN.

      Even for sites that do use a CDN, using Google’s jQuery is an opportunity for cost cutting and potentially increased parallelism (assuming more than just JS is distributed via CDN).

    • Be Professional at Hosting
      Jun 19 - 11:45 am

      * gzip (compress whats left to send over wire)

      Totally in agreement with this. The difference between compressed and uncompressed scripts really matters.

      • Otto
        Jul 07 - 1:55 pm

        Check the headers. Google’s hosted versions are all compressed, minimized, everything. They’ve basically taken it upon themselves to optimize the hell out of these javascript libraries for the fastest possible hosting of them, with far-future expires headers and everything else.

        Using these will almost always be faster than not using them, in virtually all cases.

  34. Mathias
    Jan 22 - 4:41 pm

    …but what if the user has this fine plugin named “NoScript”?
    NoScript won’t load scripts from other domains by default – the user has to accept them. :(

    Is there a possibility like Zach said:

    Couldn’t you use JS to verify google’s jQuery script loaded… Then fallback to a locally hosted version if it’s unavailable?

    if (typeof jQuery == 'undefined')
    // load local version….

    Excuse my bad English, and greets from North Rhine-Westphalia.

    btw. this is a great article!

  35. [...] Encosia – 3 Reasons Why you should let google host jquery for you Google Ajax Search Blog Google Ajax Labs [...]

  36. [...] 3 reasons why you should let Google host jQuery for you | Encosia Let Google Host your jQuery! (tags: webdesign programming ajax download host performance javascript google web development tips tutorials jquery hosting webdevelopment optimization library scalability caching cache 2008 js article code webdev howto) [...]

  37. [...] 3 Reasons Why You Should Let Google Host jQuery For You: Dave Ward explains how to use Google’s ample resources to speed up your web site. [...]

  38. chiller
    Feb 24 - 2:16 pm

    This is interesting. What if you are using custom jQuery plugins? Can you still use this method?

    • Dave Ward
      Feb 26 - 5:19 pm

      Sure. It’s just like using a local copy of jQuery, only faster.

  39. [...] jQuery UI just released version 1.7, which is compatible with jQuery 1.3. After reading this blog entry, I’ve decided to try letting Google host jQuery for me. Google is also hosting the standard [...]

  40. [...] This blog post explains why Google hosting is a Good Idea. [...]

  41. Gerhard Lazu
    Apr 09 - 4:30 am

    Do you know all those sites that fetch jQuery from Google, Dave? Well, 9 times out of 10 you will see calls to it taking ages to reply, so I think that using a different host to fetch jQuery is slower – overall – than including it from the server one is already connected to.

    • Dave Ward
      Apr 09 - 7:28 am

      I’ve always found the opposite to be the case (Google serves it faster than my servers can). Do you have any examples of sites that hang on the dependency?

      • Ryan Platte
        Aug 19 - 9:04 am

        I’ve seen this with slow DNS servers.

        • Mick Sear
          May 14 - 2:34 pm

          I’d generally avoid pulling jQuery in from Google, because I’ve seen this a lot as well – you see it in the browser status bar: ‘Looking up’, etc. I’ve seen sites that pull in dependencies from four or five different domains, and each one requires a DNS lookup before a TCP connection can even be established. Unfortunately, although many ISPs offer nice fast bandwidth, they sadly don’t offer the same experience with their DNS servers.

          Of course, some users will benefit from CDNs, but many users will feel the opposite effect.

          • Nathar Leichoz
            Jul 29 - 10:40 pm

            I couldn’t agree more. And with mobile devices where upload is slower than download, it may be faster for mobile devices to keep the connection open and download jQuery than for the device to start a new DNS request for

  42. [...] 3 reasons why you should let Google host jQuery for you Use Google content delivery network to serve jQuery to your users directly from Google’s network of datacenters. Wow! [...]

  43. [...] For a more detailed look see Dave Ward’s 3 reasons why you should let Google host jQuery for you [...]

  44. [...] 3 reasons why you should let Google host jQuery for you [...]

  45. fraser
    May 11 - 12:05 pm

    Regarding the “$ is not defined” issue when loading via Google’s AJAX method: I use the following and have had zero issues with it so far…

    function init() {
      $(function() {
        // ... page setup goes here ...
      });
    }

  46. J
    May 14 - 11:06 am

    So, uh, Google’s having *serious* latency issues (at least from my part of the world), and large and small sites alike that rely on the hosted jquery are hanging completely. Awesome! Granted, this will hopefully happen only rarely, but…

  47. Gerhard Lazu
    May 14 - 11:13 am

    And that’s exactly why I don’t use Google for my jQuery stuff. Local is best. I started seeing issues with Gmail from yesterday onwards. Who knows, maybe their maps are no longer reduce (or the other way round). Again, CDN is nice in theory, but don’t think that it beats local (well, depending on most sites’ target audience).

  48. Dave Ward
    May 14 - 11:41 am

    Looks to have more to do with network routing and less with Google:

    None of my sites were very affected by this. Remember that the CDN hosted files are served with a +1 year expires header. Returning users don’t even require a 304 ping.

    It would be a problem for new users, but how many new users (who don’t have access to Google search to find you) affected by the routing issues are you likely to bounce in that couple-hour window? It’s a pain, but nowhere near a catastrophe.

  49. [...] There are a couple options available to you when you load the jQuery code; you can download a copy from the jQuery site and call it locally, or as we will do, call it directly from the Google code library. For more information why this is the better way to load jQuery, read 3 reasons why you should let Google host jQuery for you. [...]

  50. DotNetShoutout
    May 16 - 4:25 pm

    3 reasons why you should let Google host jQuery for you | Encosia…

    Thank you for submitting this cool story – Trackback from DotNetShoutout…

  51. jose
    May 20 - 1:52 pm

    One big surprise is that with Firefox 3 fetching jQuery 1.3.2 from Google, I get a 200 status code every time instead of a 304.

    So there is no caching benefit, just the benefit from the CDN.

    Can anybody confirm this?


    • Dave Ward
      May 20 - 2:02 pm

      Because it’s served with a far-future expires header, you’ll only see a request if your cache doesn’t contain a copy of the file. For up to a year after it’s cached, no request is made at all when the browser encounters a reference to it, not even to check for a 304.

      So, the only time you’ll ever see a request, it will have a 200 response. You shouldn’t see subsequent requests (and 200 responses) though, unless something’s preventing your browser from caching the file.

      • Zack
        Oct 11 - 12:18 pm

        Firefox’s disk cache is so ineffective that you should assume it doesn’t exist. See and its many, many dependencies, especially bugs 175600 (limit of 8192 cached items), 193911 (space limit of 50MB), and 290032 (some files are never cached due to a shitty hash function – may have been fixed in FF3.6).

        FF4 may be better – at least, some of those bugs have been marked fixed – but I would want to see test results.

  52. jose
    May 21 - 9:17 am

    What you mention is true for jQuery 1.2.6 but not for 1.3.2.

    Try the simplest page and check with Firebug. If you reload the simple page with 1.2.6 you will see a 304 status code, but a 200 status code for 1.3.2.

    I think the issue is that the 1.3.2 response adds a non-zero Age HTTP header, which is not included when you request 1.2.6.

    Try it! I was quite surprised !

    • Dave Ward
      May 21 - 9:51 am

      It’s normal to see a 200 response on the first request.

      It would be abnormal to see a 304. With a far-future expires header, the browser shouldn’t even be pinging for a 304 if it has it cached.

      When it’s being properly cached, you shouldn’t even see it appear in the Firebug net tab.

  53. jose
    May 21 - 12:10 pm

    Sure, the previous post skipped the basic HTML. I compared a page with jQuery 1.2.6 and another with 1.3.2. The page with the old jQuery caches properly, but not the one with 1.3.2.

    At least with Firebug 1.3 I see the 304, because if you look at the response headers it returns a Last-Modified, which takes precedence over Expires.

    It takes 5 minutes to do this test in Firebug, but I assume you haven’t even tested what I am saying.

    Your script snippet in Back to Basics references 1.3.2, so you should update the article. I will also try with IE and report the findings.

    • Dave Ward
      May 21 - 12:24 pm

      I use the Google hosted 1.3.2 on several sites. I double checked them this morning, after reading your comment, to make sure it’s still caching properly in Firefox.

      It is for me.

      The browser never makes a request for jQuery (1.3.2), not even a ping for 304, unless I clear my cache or force an update with a shift/ctrl reload.

      I also verified it in Live HTTP Headers and Wireshark. Firefox is using the locally cached copy and isn’t sending even a single byte over my connection when it hits a reference to the Google-hosted file.

      Do you have a publicly available page that reproduces what you’re seeing? I’ll take a look at it.

  54. jose
    May 23 - 4:00 am

    Thanks for checking.

    My issue is minor, I was refreshing both a page with 1.2.6 and a page with 1.3.2. The page with 1.2.6 checks the 1.2.6 link and gets a 304. The page with 1.3.2 gets a 200

  55. [...] code snippits on a central server hasn’t presented a prohibitive problem in the past as the jQuery library, CSS Resets and other snippits are all available hosted on a central server and used on major sites [...]

  56. [...] There are several other reasons why it may be a good idea to use Google’s API to load jQuery. Have a look at 3 reasons why you should let Google host jQuery for you. [...]

  57. [...] Now you need to link up your copy of jQuery. Or you can let the Google CDN host it. This is what we will do here. If you are wondering why, here is a great explanation. [...]

  58. Nikola
    Jun 16 - 2:23 pm

    Bravo, Google :)

  59. Otto
    Jun 17 - 3:49 pm

    If you happen to run a WordPress site, there’s a plugin that will easily do this for you, for all available JS libraries:

    Just install and activate. Operation is automatic, no configuration needed.

  60. [...] makes a pretty compelling argument that one ought to use Google’s CDN for internet-facing apps, and local copies for LAN-based apps, but you can choose whichever you feel is best for your app.  I’m guessing that linking to Google’s copy and downloading a local backup copy for yourself is probably the best way to go. [...]

  61. [...] 根据“3 reasons why you should let Google host jQuery for you”这篇文章中提到它的作用有三:1、减少传输时间;2、提高并行性能;3、更易于缓存。 [...]

  62. [...] 3 reasons why you should let Google host jQuery for you | Encosia. [...]

  63. Terry Kernan
    Jul 18 - 6:53 am

    If you are using OpenDNS and you have set the protection level to highest, there is a good chance that the Google website will be blocked for your entire network. These are some wise words on CDNs and general website performance, though.

  64. [...] Mer information om att hämta JavaScript-filer från Google AJAX Library finns på bloggen Encosia. [...]

  65. Swashata
    Jul 24 - 10:34 am

    Thanks a lot! You have opened my eyes, and now I am surely going to put the Google-hosted jQuery on my Blogger blog! :) Thanks once again!

  66. James
    Jul 31 - 3:01 am

    I’ve been doing this as a matter of course since Google first announced they were hosting the files. I’ve not personally seen any performance increase, as I always keep my pages as lightweight as possible in any case, but neither have I had any problems. I agree that this is a best practice missed by most, and it’s always worthwhile exploiting any optimisations you can!

  67. [...] Add a Static Text view in order to hold our web service call via dynamic HTML. As you can see, we are combining JavaScript and HTML. We are heavily using tags as we did in my previous Google Maps integration post. You will also notice our use of a jQuery CDN script include tag. We are using Google’s CDN for jQuery. I’ll write another post with more details on the benefits of using a CDN compared to keeping the jQuery file on your own network. But for now you can check out Dave Ward’s post on the topic at [...]

  68. [...] this argument, jQuery should be included as [...]

  69. [...] Using a lightbox effect on any of your pages? I was using Lightbox2 on most projects until I discovered a lighter, simpler version recently. Introducing Slimbox2. All the goodness of Lightbox2 in a 4K script. Also, using Google to serve your jQuery speeds up your page even more. This site’s post tells you how: 3 reasons why you should let Google host jQuery for you. [...]

  70. Grace Basilio
    Sep 02 - 10:31 pm

    Sorry, the first code should be:

    google.load("jquery", "1.3.2");
    google.setOnLoadCallback(function() {
    //this worked

    And the second block of code should be:

    $(function() {
    //This didn’t work on IE but it did on FF

  71. Yvar
    Sep 11 - 11:56 am

    I am considering using the Google API, but I have a question.
    What is the likelihood that Google will host old versions indefinitely? I have made websites for clients for almost 15 years now, and one of the oldest is still running today. I would hate to think that a site I make today would not run in 20 years’ time because the JS is no longer hosted on Google.

    • Dave Ward
      Sep 11 - 12:04 pm

      That’s a good question that I don’t know the answer to. My guess would be that they would continue hosting it as long as it was actively serving requests. There’s so little overhead in it, I’d be surprised if they went out of their way to break something being used like that.

      Worst case, it’s a very quick search-replace to globally update a site to use a locally hosted legacy version instead of Google’s.

      • Yvar
        Sep 11 - 12:17 pm

        Hmm, not very reassuring. But I still would like to use it. The best solution I can think of is to check whether jQuery has loaded and, if not, load a local version.
        Have any thoughts on how that could be done best?

        • Dave Ward
          Sep 11 - 12:35 pm

          You can do this:

          <script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
          <script type="text/javascript">
          if (typeof jQuery === 'undefined')
            document.write(unescape("%3Cscript src='/path/to/jquery-1.3.2.min.js' type='text/javascript'%3E%3C/script%3E"));
          </script>

          Because script elements block rendering until they’ve loaded and been executed, you can assume if jQuery isn’t present in the subsequent script block that it has failed to load from Google and then react accordingly.

          • Yvar
            Sep 11 - 2:43 pm

            You confirmed my thoughts. Thank you very much for the help.

          • Daniele
            Mar 12 - 9:10 am

            I’ve tried the check-and-eventually-load-locally method above; I intentionally used a bad URL for the Google jQuery to see if the local loading of jQuery was really done. And indeed it was, but looking at Firebug’s Net tab, I see that other JS libraries requiring jQuery load before the local copy of jQuery, breaking any jQuery-dependent functionality on the page.
            Here’s my test code:

            <!-- Call to non-existent jQuery file on Google -->
            <script type="text/javascript">
            	if (typeof jQuery === 'undefined'){
            		document.write(unescape("%3Cscript src='js/jquery-1.4.2.min.js' type='text/javascript'%3E%3C/script%3E"));
            	}
            </script>
            <!-- Tabs, Tooltip, Scrollable, Overlay, Expose. No jQuery. -->

            Firefox loads the second library as soon as Google gives the 404 error, and then loads the local copy of jQuery after some other resource files, such as CSS or GIFs.
            Do you know a way to correct this behaviour? Am I doing something wrong?


          • Daniele
            Mar 12 - 9:15 am

            Sorry, I’ve lost some piece of code. Hope this works:

            	if (typeof jQuery === 'undefined'){
            		document.write(unescape("%3Cscript src='js/jquery-1.4.2.min.js' type='text/javascript'%3E%3C/script%3E"));
            	}
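
            One likely cause of the ordering Daniele describes, assuming the fallback check was placed after the jQuery-dependent script tags: document.write() injects its script at the exact point in the page where it runs, so the check has to sit between the CDN reference and anything that needs jQuery. A sketch of the intended order (the URLs and file names are placeholders):

```html
<!-- 1. Try the Google CDN first. -->
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>

<!-- 2. Fallback check: must appear before any jQuery-dependent scripts. -->
<script type="text/javascript">
if (typeof jQuery === 'undefined') {
  document.write(unescape("%3Cscript src='js/jquery-1.4.2.min.js' type='text/javascript'%3E%3C/script%3E"));
}
</script>

<!-- 3. Only now reference the libraries that need jQuery (Tabs, Tooltip, etc.). -->
```

            With that ordering, the parser blocks on the fallback copy before it ever reaches the dependent libraries.
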
  72. SODEVE
    Sep 15 - 10:53 am

    [...] localhost, but Pyrmont is loading it from Google. There must be a reason for this. Dave Ward at Encosia teaches us that the reasons [...]

  73. mrvirus
    Sep 26 - 3:42 pm

    thanks for your advice

    keep it up

  74. MyWebmasterTips
    Sep 27 - 11:47 am

    Yeah, I just found out from another programmer out there that this is definitely better than hosting the jQuery script yourself. It was nice to see exactly why it is better after reading your article.

    Thanks a lot.

  75. develCuy
    Oct 01 - 11:28 pm

    It would be nice to use Google’s CDN, but Google is blocked in some countries and on some networks… Anyhow, it is a good idea to use some CDN, perhaps even build your own, to increase parallelism and improve caching. What about latency? Well, we are talking about 20 KB of jQuery minified and gzipped; just make sure to use fewer images and optimize their size :)

  76. [...] sites using content delivery networks, such as Amazon Cloudfront, to distribute some files. Also, Google does host JQuery JavaScripts, which can improve [...]

  77. Sam
    Oct 13 - 7:53 am

    Thanks Dave. Encosia hasn’t been in my feeds for a very long time, but it never fails to teach me something new and interesting. I like this post, and I adore the discussion opened up by the readers. You could add the pros & cons that have been raised so far to your post; that would make a good summary :)

    thanks again and keep it up!

  78. [...] Installing JQuery Published October 18, 2009 Uncategorized Leave a Comment There are two ways to install jquery. You can either go to the jquery website and download the latest stable release, or you can grab the very latest version straight from googles servers. I would propose the latter which not only has the advantage of simply being easier and quicker than ftping the latest release from JQuery, but, it is beneficial in lots of other ways that are beyond the scope of this post. – Check out this link for a fuller explanation [...]

  79. [...] in Web Development 0 comments I used to load jQuery library via Google AJAX Libraries API to save my bandwidth. It works in all browsers at the beginning, including Internet Explorer. However, it suddenly [...]

  80. [...] Make sure you have inserted jquery into your page, I am trying to get in the habbit of using the version hosted on google. Find out why here. [...]

  81. ReTox
    Oct 23 - 9:28 am

    I have an app that dynamically combines the various scripts (jQuery, plugins, etc.) – a total of ~110 KB, combined and gzipped.

    When I serve jQuery combined with the other scripts (1 script tag), latency is around 500ms.

    When I used Google’s CDN-hosted version of jQuery (1.3.2), Firebug showed latency of around 828ms for the scripts in the Net tab.

    All tests are done on local VS development server.

    What do you think? Will these results change when I host my application on some online server?

    • Dave Ward
      Oct 23 - 10:35 am

      Yes, you should expect to see the extra HTTP request be much more significant in a local setting. That won’t be the case once deployed to a live server.

  82. [...] embargo un artículo que leí hace algún tiempo [] recomienda evitar esa práctica, y utilizar el jQuery que Google cuelga de sus [...]

  83. Sunny Singh
    Oct 26 - 6:57 am

    I personally have other stuff within my “jQuery” file, like jQuery Tools and JSHTML, which I use on almost every page where I might use jQuery.

    So that’s one reason I wouldn’t use it, since you can’t have any other files in it. Also, it’s yet another HTTP request, so if you’re already loading other stuff, why not have jQuery be loaded in there as well?

    I can see myself using this for stuff like tutorials though, where all I would need is jQuery.

  84. [...] For a more detailed look see Dave Ward’s 3 reasons why you should let Google host jQuery for you. [...]

  85. snlr
    Nov 15 - 2:37 pm

    Thanks for the article! Firebug tells me Google’s response is about 50 ms (jQuery) and 150 ms (jQuery UI) faster than my remote dev machine. It’s also gzipping, so I don’t have to take care of that. Seems to me like there is an advantage to using Google’s hosting the way you suggest. I also used your fallback code, thanks for that!

    Although – right now I’ve got only 4 JS files, which might double or triple by the end of development. From past experience I still suspect that nothing is faster than combining and compressing all JS into one file (as well as CSS). As far as my limited experience goes, this is always the fastest solution, although it’s more tedious to maintain; the process has to be repeated for every update to any JS or CSS file. One could still use a CDN though, by just buying some cloud space. It should not be too expensive per year for a couple of text files which are often cached.

  86. snlr
    Nov 15 - 4:15 pm

    The fallback code works like a charm if remote JS files are blocked by NoScript. That leaves only the “user lives in Iraq”-scenario, unfortunately. Maybe a country specific IP range can be considered.

  87. phil
    Nov 18 - 6:22 pm

    For https I’m thinking it would be possible to use the Analytics code:

    var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
    document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));

    Obviously changing the URL… Not tried yet, but it should work.
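
    phil’s protocol-switching idea can be applied directly to the CDN URL. A small sketch (buildJQueryUrl is a made-up helper name; only the ajax.googleapis.com path comes from the Google CDN):

```javascript
// Pick the scheme from the page's protocol -- the same trick the Google
// Analytics snippet uses -- then build the jQuery CDN URL from it.
function buildJQueryUrl(pageProtocol, version) {
  var scheme = (pageProtocol === "https:") ? "https://" : "http://";
  return scheme + "ajax.googleapis.com/ajax/libs/jquery/" + version + "/jquery.min.js";
}

console.log(buildJQueryUrl("https:", "1.3.2"));
// https://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js
```

    In a page you would pass document.location.protocol as the first argument and document.write the resulting script tag, exactly as in the Analytics snippet above.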

  88. [...] Steve Souder talk about speeding up sites at SXSW Interactive and after seeing it outlined in this post. Considering the fact that you can only load so many things from one server at a time, an easy way [...]

  89. [...] Check it! [...]

  90. [...] on this however if you are interested in reading more you may want to swing by Encosia and read 3 reasons why you should let Google host jQuery for you. If you want to opt for this method you can simply amend the above example as [...]

  91. pceasies
    Jan 17 - 10:34 pm

    It is good for most shared-hosting users to offload extra scripts and resources as much as possible, but when you have a low-latency, high-bandwidth server, you may be just as well off hosting the file yourself, since you avoid the extra DNS request, which can waste lots of time.

    • Dave Ward
      Jan 18 - 12:08 am

      Only if you assume that all of your users are closer to your server than they are one of Google’s CDN edge servers and that the user doesn’t have the Google CDN copy of jQuery already cached (which would lead to the browser not making a DNS lookup or HTTP request at all).

  92. [...] to stop hosting your own local copies of common Javascript includes like jQuery and ASP.NET AJAX. Dave Ward summed up the top three reasons: Decreased [...]

  93. [...] library. Ne hostamo ga lokalno već ga loadamo direktno sa Googla. Pročitajte ovdje 3 razloga zašto je bolje učitavati jQuery s Googla. U navedenom primjeru uključujemo najnoviju verziju 1.4.0, a ako Vam je potrebna neka druga [...]

  94. Nathaniel Smith
    Jan 25 - 10:35 am

    Was doing some research into this trying to pick whether to use
    jsapi+google.load or the direct library path. I noticed that with the former
    method, the js lib comes with a 1yr-in-the-future expiration date while with
    the latter, a 1hr. Here are the (relevant) HTTP response headers from FireBug:

    using google.load():
    Content-Type text/javascript; charset=UTF-8
    Last-Modified Thu, 14 Jan 2010 01:36:01 GMT
    Date Fri, 22 Jan 2010 20:27:03 GMT
    Expires Sat, 22 Jan 2011 20:27:03 GMT
    Content-Encoding gzip
    Cache-Control public, must-revalidate, proxy-revalidate, max-age=31536000
    Age 176
    Content-Length 23807

    using straight url:
    Content-Type text/javascript; charset=UTF-8
    Last-Modified Thu, 14 Jan 2010 01:36:01 GMT
    Date Fri, 22 Jan 2010 20:33:37 GMT
    Expires Fri, 22 Jan 2010 21:33:37 GMT
    Content-Encoding gzip
    Cache-Control public, must-revalidate, proxy-revalidate, max-age=3600
    Age 161
    Content-Length 23807
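
    The practical difference between those two responses is the max-age value. A quick helper (maxAgeSeconds is a made-up name) applied to the two Cache-Control headers captured above; one plausible explanation, though an assumption on my part, is that the one-hour header is what Google returns for partially versioned URLs (e.g. a path ending in /1.4/), while fully versioned URLs get the full year:

```javascript
// Extract the max-age directive from a Cache-Control header value.
function maxAgeSeconds(cacheControl) {
  var match = /max-age=(\d+)/.exec(cacheControl);
  return match ? parseInt(match[1], 10) : 0;
}

// The two values from the Firebug captures above.
var viaLoader = "public, must-revalidate, proxy-revalidate, max-age=31536000";
var viaStraightUrl = "public, must-revalidate, proxy-revalidate, max-age=3600";

console.log(maxAgeSeconds(viaLoader) / (3600 * 24)); // 365 (days)
console.log(maxAgeSeconds(viaStraightUrl) / 3600);   // 1 (hour)
```

    A one-hour lifetime largely defeats the cross-site caching benefit, so it is worth verifying which header your chosen reference style actually returns.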

  95. Terry Harrison
    Jan 28 - 2:01 pm

    This is a great article, but when I try this approach Intellisense will not work. Is there a workaround for that?

    • Dave Ward
      Jan 28 - 2:58 pm

      Assuming your code is in .js includes, download the vsdoc and place this at the top of your includes:

      /// <reference path="~/path/to/jquery-1.x.x-vsdoc.js" />

      For Intellisense inline in ASPX pages, you can use this trick:

      <% if (false) { %>
      <script type="text/javascript" src="/path/to/jQuery-1.x.x-vsdoc.js"></script>
      <% } %>

      It will never be rendered to the page, but Visual Studio will still parse the file for Intellisense.

      • Terry Harrison
        Jan 29 - 11:19 am

        Ok I tried this but I don’t seem to get it to work. Here is what I did

        <%@ Page Language="VB" AutoEventWireup="false" CodeFile="testpage2.aspx.vb" Inherits="testpage2" %>
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
        <html xmlns="http://www.w3.org/1999/xhtml">
        <head runat="server">
        <script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
        <%  If False Then%>
            <script type="text/javascript" src="/_assets/js/jquery-1.3.2-vsdoc.js"></script>
        <%  End If%>

        Did I put it in the wrong place? As long as I don't reference the Google page my page works but as soon as I point to Google - no intellisense.
        Thanks for your help.

        • Dave Ward
          Jan 29 - 7:43 pm

          I don’t have VS2008 handy to test with, but that code does give me jQuery Intellisense in inline code blocks in VS2010.

  96. Stephen Cleary
    Feb 01 - 11:35 am

    @Terry: I had the same problem.

    This worked for me on VS2008:

  97. [...] Well, that certainly sounds effective! Other than that, Dave Ward gives us a tip where we should use Google’s CDN based hosted javascript libraries to improve the blog page load speed. Other than that, Yahoo also suggests website owners to use CDN [...]

  98. [...] 3 reasons why you should let Google host jQuery for you Convinced? You should be. [...]

  99. Christopher
    Feb 09 - 4:48 pm
    <script type="text/javascript">
    var jQueryVersion = '1.4.1';
    var jQueryRemoteProtocol = (("https:" == document.location.protocol) ? "https://" : "http://");
    document.write(unescape("%3Cscript src='" + jQueryRemoteProtocol + "ajax.googleapis.com/ajax/libs/jquery/" + jQueryVersion + "/jquery.min.js' type='text/javascript'%3E%3C/script%3E"));
    if(typeof jQuery === 'undefined'){document.write(unescape("%3Cscript src='/scripts/jquery-" + jQueryVersion + ".min.js' type='text/javascript'%3E%3C/script%3E"));}
    </script>
  100. Featured Slider
    Feb 15 - 12:44 pm

    [...] you have your jQuery library included too. I’m not using the WP core one because there are reasons not [...]

  101. [...] a further explanation of why you should let Google host jQuery for you visit [...]

  102. [...] reading 3 Reasons Why You Should Let Google host jQuery, I decided to do so, although I went for the easier link styling [...]

  103. will
    Feb 18 - 3:16 pm

    This method would be great if everyone used it. The problem is that only something like <5% of sites use it, meaning the extra DNS lookup to Google costs more time than just hosting the file yourself. Since you are only pulling one file from Google, you never make back the initial DNS overhead that using their CDN costs.

  104. [...] 3 reasons why you should let Google host jQuery for you [...]

  105. Billy Nguyen
    Feb 21 - 3:33 am

    Having total control of your source code outweighs all of the advantages listed and discussed above. Why should you rely on someone else, even Google, who may have an impact, beyond your control, on your company/business website?

    For a personal website, I can see the benefit, but not for company/business websites. Foolish thinking, not practical.

    • Dave Ward
      Feb 21 - 1:11 pm

      Unless you planned on modifying jQuery.js, you still have total control of your source code. If something were to happen, you could switch the reference to a local copy in minutes. It’s no less practical than using any other CDN, which business sites have been doing for years. Unfounded paranoia is no reason to slow the web down as a whole.

      • Christopher
        Feb 21 - 1:22 pm

        The difference is that the Google CDN for javascript libraries doesn’t offer an SLA.

        • Dave Ward
          Feb 21 - 1:26 pm

          Automatically falling back to a local copy isn’t difficult (see the examples of that throughout the comments here). In the bigger picture, I tend to agree with Nate that SLAs are empty anyway. I’m less interested in who to blame during the .0001% than I am how to improve performance/user-experience during the other 99.9999%.

          • Jeremy Jones
            Jul 21 - 4:17 am

            I think if you agree to an SLA which is ‘empty’ then, absolutely, there’s no point to it. But if you properly set an SLA with your hosting provider which guarantees your servers, with fines etc. for any problems, then taking a key part of your page engine outside of that guarantee definitely negates the whole purpose and would not be justifiable in terms of business accountability.

      • Billy Nguyen
        Feb 22 - 5:03 am

        Switching back to a local copy is your solution if something happens. Then why bother with that headache in the first place? Besides, your suggestion shows that you did not consider the business implications. Redirecting your customers to Google equates to Google accessing your customers’ information. Furthermore, it is practical to point out that jQuery is only about 70 KB. What do you actually gain? NOTHING.

        Therefore, it is still a foolish and ill-advised business practice.

        • Dave Ward
          Feb 22 - 11:36 pm

          Falling back to a local copy is automatic. Why bother? Eliminating a quarter-second or so of page-blocking JavaScript is well worth understanding how to best use the resources available.

          As long as the majority of sites are using services such as AdSense, Google Analytics, Google Maps, etc, I have a hard time buying the FUD and paranoia. Of all the services like that, using the CDN is one of the most benign, since the browser may not even contact Google again for a full year after jQuery has been downloaded once.

  106. snlr
    Feb 22 - 7:00 am

    I think Billy Nguyen’s concern cannot be dismissed. He points out that there are differences between commercial and non-commercial websites, and that they must be considered, especially the ones about privacy, and especially if you’re only gaining about a millisecond in performance. A business should have the possibility to invest in a safe CDN, just as more and more businesses can’t rely on Google Analytics anymore, because then you’re sending certain two-party data to a third party (Google). In some countries, like here in Germany, this is already starting to go to court. The solution for this problem, too, is relying on software which can be hosted on the company’s own property.

    On the other hand, since everyone points to Google for hosting of these files, they get cached a lot and Google’s servers are not contacted anymore, so it’s even harder to make “sense” of this info.

    Why not do it the other way around for business sites? Use a local copy, and when that is not available, fall back to a 3rd-party CDN. It seems like an unusual case (unless you do have your own CDN or CDN-like setup), but if it does happen, it seems like good service to your customers not to bother them with the tech problems.

    • Dave Ward
      Feb 22 - 11:45 pm

      Privacy on the Internet is a nice notion, but largely an illusion. Of all the actively invasive methods that advertisers and ISPs use to track users, focusing on the potentially once-per-year jQuery CDN accesses is a non-starter for me. If Google were really out to leverage the service for “evil”, they definitely would not serve the file with a +1 year expires header.

      • Chris M.
        Jul 19 - 3:35 am

        Sorry, but I seriously disagree with you here. This cavalier “well, we don’t have privacy anyway” attitude is the very reason that privacy is such an issue in this day and age. When they came for the trade unionists, I said nothing.

        You are wrong to just dismiss these claims. At best, you are being disingenuous by not giving security in this matter an equal discussion. At worst, you’re ignorant to the real problems here. In a below response to John Sanders, you state that, “if Google’s CDN were compromised, you’d probably find that out much quicker than if your own hosting was.” I don’t necessarily agree with that. When you use the Google APIs, you don’t have an SLA with them. They are under no contractual obligation to report exposure to you.

        More importantly in this day and age, and the real reason I take serious issue with your “this is the best approach for everybody, except maybe intranets” attitude, Google does not need to be compromised in order for the content to be exploited. Man-in-the-middle attacks are extremely easy to pull off, especially when the content is being served via HTTP mechanisms.

        Here’s the real problem.

        If somebody visits a website that uses the Google-hosted jQuery and their session is compromised just once, every single session they have on any and EVERY site they subsequently visit that uses Google’s jQuery is also compromised. If someone were to inject a modified jQuery to Joe Sixpack while he was browsing the web at Starbucks (or even as he passed by a rogue access point with wifi enabled on his phone or laptop – on a freaking plane, even), the response can be set to cache that content for decades. At which point, every single request to a Google jQuery-enabled site is now affected. The very caching that you espouse as one of your main three benefits also introduces a serious potential vulnerability.

        This isn’t being overly-paranoid, this is being realistic. As long as we’re using plaintext protocols, depending upon an outside CDN to provide portions of executable code that are shared among thousands of sites is just an awful idea.

        Everybody is free to make their own decisions, but you should really be fair here and give discussion to the fact that in order to enable “Eliminating a quarter-second or so of page-blocking JavaScript,” site managers are opening up a new potential avenue for attack. When explained in these terms, many businesses may make a different decision.

        Parallelism is also a dubious argument. Anybody can enable parallel loading of JS content on their site without much difficulty. And if one is concerned about the two-concurrent-sessions-per-host browsers as you discuss, you can just as easily serve the content from a different hostname. Yes, it’s another DNS lookup, which adds latency (assuming they already have the Google DNS entries cached).

        Everybody needs to make this decision on their own, weighing all of the benefits against all of the downsides. This is far from a “one size fits all” solution, and there are many factors to consider. Risk assessment is a real concern here.

        You have a heavily-linked page on this topic. You would do well to treat the topic honestly and fairly. I’m sorry, but I don’t feel you are currently doing that.

  107. Donald Jenkins
    Feb 23 - 9:02 pm

    Interesting. I doubt that the benefit of users not having to download jQuery at all, because it’s cached, exceeds that of minifying all your JavaScript files, including jQuery, into one file and serving it from CloudFront, which is what I do.

  108. Lifesize Blog
    Mar 03 - 2:43 pm

    [...] many advantages to this but we won’t get into them now (if you are really curious check out this post). For now just consider it “best [...]

  109. [...] If using JQuery, then use Google servers. [...]

  110. John Sanders
    Mar 10 - 6:47 pm

    Thanks for the article Dave!

    What makes me most uncomfortable about this approach is security. If Google’s servers are ever hacked (improbable, but not impossible), then think about how many websites will be running malicious code. Who knows what kind of information a bad person could grab by reading your cookies, or scraping your screen. For this reason, I would only use this strategy for websites that contain no private user information.

    Hacking happens.


    • Dave Ward
      Mar 11 - 1:05 pm

      That’s definitely a valid concern.

      Of course, if Google’s CDN were compromised, you’d probably find that out much quicker than if your own hosting was.

    • Marnen Laibow-Koser
      Mar 22 - 1:38 pm

      Exactly. This is the primary reason I’m hosting my JavaScript libraries locally.

      However, I’m starting to use Google’s hosted Web fonts, so maybe using their hosted jQuery would be no worse. Also, they’re probably better at protecting the integrity of their CDN than I am — I’m a developer, not a sysadmin.

  111. uberVU - social comments
    Mar 11 - 7:33 am

    Social comments and analytics for this post…

    This post was mentioned on Twitter by alexanderkahn: @chipkaye If you use a Google-hosted one, users will probably already have it cached, and thus will have to wait less.

  112. [...] we all know, there are very good reasons to let google host JQuery for you. Unfortunately, Trac doesn’t support loading JQuery from Google and there does not seem to be [...]

  113. [...] file. Instead of that, I prefer linking it directly to the Google. I recommend reading the article 3 reasons why you should let Google host jQuery for you at [...]

  114. jQuery för nybörjare
    Mar 23 - 4:32 pm

    [...] går även att hämta jQuery direkt från Google vilket ofta är att föredra då det bl.a. kortar sidans laddningstid särskilt om det redan finns en version av jQuery lagrad i [...]

  115. D Yeager
    Mar 26 - 10:20 am

    Dave, have you seen this article?

    It says using CDNs for JavaScript isn’t worth it – the DNS lookup adds a 1/3-second delay versus your own domain, and jQuery only takes 1/3 of a second to download.

    He also says only 2% of Alexa 2000 websites use a JavaScript CDN, so it’s unlikely the visitor has it cached already.

    In short he says:

    If we use Google’s JavaScript Library CDN, we are asking the majority of our website visitors (who don’t have jQuery already cached) to take a 1/3 of a second penalty (the time to connect to Google’s CDN) to potentially save a minority of our website visitors (those who do have a cached copy of jQuery) 1/3 of a second (the length of time to download jQuery 1.3.2 over a 768kbps connection).

    What do you think?

    • Dave Ward
      Mar 26 - 4:21 pm

      I don’t agree with their assessment about the likelihood of a cache hit. With such a wide gamut of high-traffic sites (like Time, SlideShare, ArticlesBase, Break, Stack Overflow, Examiner, Woot, and ArmorGames, to name a few) using the Google CDN, it’s becoming more and more likely that the minority of your users are the group that don’t have a local copy cached.

  116. [...] google jQuery host [...]

  117. Savageman
    Apr 12 - 4:53 am

    I totally agree with D Yeager. 300%.

    I tested this on my website, and loading MooTools (well, it’s not jQuery, but the situation remains the same) from the Google CDN adds latency due to the DNS request. We switched to a locally hosted copy and the site is faster now.

    My advice is to test before diving into such advice. It may not be as good as it seems…

    • Dave Ward
      Apr 12 - 9:32 am

      I agree that testing is important to any optimization. For example, my browser took 678ms just now to download MooTools from your server, but only 91ms from

      Of course, that datapoint might be irrelevant if you don’t expect to reach a global audience with your particular site. It’s a good reminder of how effective geographically dispersed edge servers are though.

      • Savageman
        Apr 12 - 12:47 pm

        Yes, the user’s geographic location plays an important role here. Our server would certainly perform better in France and nearby countries, while you have a penalty from America. Your remark just made me check how many folks visit our site from the other side of the Atlantic. It looks like we have 4%, so it’s reasonable not to use a CDN.

        I guess a very good option here would be to have those major libraries hosted locally, directly embedded in the browser, for instance. That would definitely save some time for everyone!

        I’m also wondering how this could affect the new criterion in Google’s PageRank: the site load time is taken into account. If the Googlebot fetches everything from America, that would be a load time penalty for the crawler. Even more so if the DNS is cached: the CDN version would perform a lot better. Any thoughts on this?

  118. The Venture Foundation ::
    Apr 19 - 10:36 am

    [...] For a more detailed look see Dave Ward’s 3 reasons why you should let Google host jQuery for you. [...]

  119. [...] the geeks among you, read why you should link to Google’s copy of jQuery instead of using a version on your own Web [...]

  120. Mag
    May 06 - 5:03 pm

    I think I would rather site visitors accessed this locally on my servers than kick off another DNS request. I think I’ll still run some tests to see if there’s really much difference. Maybe it’s better to use the hosted version if you’re working with Google tools a lot, such as Maps?

    • Dave Ward
      May 06 - 5:39 pm

      It’s smart to consider the DNS lookup. However, it turns out that so many of the most trafficked sites are referencing the domain now that the DNS lookup is going to be cached already at the browser or OS level for most users.

  121. [...] came across an old post that had some pretty good arguments for why we should link to Google’s AJAX libraries instead [...]

  122. [...] take full advantage of the Google servers, as described in a recent article by Dave Ward, the caching times suggest it’s best to specify the full version of the library you wish to [...]

  123. [...] wrote a great article on why you should use this ability, and the benefits to [...]

  124. [...] 3 Reasons Why You Should Let Google Host jQuery For You [...]

  125. Jim Kennelly
    Jun 10 - 1:22 pm

    What happens when Skynet(Google) becomes self aware?

  126. [...] This subject merits its own post but this has already been done by several other people so I'll simply link one of the particularly good articles. [...]

  127. RashTech
    Jun 17 - 1:04 pm

    Is it possible to use jQuery with Google Sites?

  128. Price
    Jun 19 - 5:15 am

    I think I’ll stick to localhost serving of scripts. I’ve got a fast server, and when I ran tests the improvement was so minute that it was outweighed by making extra DNS requests to another server. Still, it’s a great post to highlight the point, and it will be better for most sites I’m sure ;-) (lol at Jim Kennelly’s Skynet comment)

  129. Sean
    Jun 19 - 1:08 pm

    Does it help make a decision knowing twice this month I’ve had issues getting Google Code hosted jQuery to work appropriately?

    It’s hard to find information out there, but I know it happened.

  130. [...] you need to invoke the jquery framework up in the head of your document.SHARETHIS.addEntry({ title: "Getting Better Data From Your Links", url: [...]

  131. Tom Dignan
    Jun 26 - 2:07 am

    Great article!

  132. [...] however, an article I read some time ago [] recommends avoiding that practice, and using the jQuery that Google hosts on its [...]

  133. Shore
    Jul 11 - 5:01 pm

    Awesome article. I was always against grabbing jQuery from Google for risk of it being slower. Great work.

  134. [...] me wonder what is next.  Letting Google host my jQuery or site fonts?  That is just [...]

  135. James Parsons
    Jul 21 - 11:59 am

    I was doing this today, until I noticed that is “Not Found” and will not load on my website. I reverted to a local copy; it is more reliable.

  136. daniele
    Jul 22 - 2:46 am

    @James P.
    If you agree to use jQuery hosted externally and your problem is the reliability of the external link, you can do a little check like this:

    	if (typeof jQuery === 'undefined') {
    		document.write(unescape("%3Cscript src='js/jquery-1.4.2.min.js' type='text/javascript'%3E%3C/script%3E"));
    	}

    In this way you can serve a local copy of jQuery in case the Google link fails.
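    For anyone adapting this, here is a self-contained sketch of the same check (the function name and local path are illustrative, not part of daniele's original snippet):

```javascript
// Illustrative helper: given the current value of window.jQuery and a
// local path, return the fallback script tag to document.write(), or
// an empty string when the CDN copy loaded fine.
function buildFallbackTag(jQueryRef, localSrc) {
  if (typeof jQueryRef !== 'undefined') {
    // The CDN copy loaded; no fallback is needed.
    return '';
  }
  // The escaped slash keeps a literal "</script>" from terminating an
  // inline script block prematurely.
  return "<script src='" + localSrc + "' type='text/javascript'><\/script>";
}
```

    In a page you would call document.write(buildFallbackTag(window.jQuery, 'js/jquery-1.4.2.min.js')) immediately after the CDN script tag.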

    • Shane
      Mar 12 - 1:44 am

      Where would I place this backup code in case Google’s CDN was down and didn’t load? In my header.php or functions.php?

  137. [...] For a more detailed look see Dave Ward’s 3 reasons why you should let Google host jQuery for you. [...]

  138. [...] to look lol…. you can put it locally on your site, but it's better to use someone elses…..…query-for-you/ Nice sidebar, really sprucing up the [...]

  139. John
    Aug 23 - 9:51 am

    Seems like Google is moving away from direct inclusion, and wants you to get an API key. Using the API method, would the js file still be cached?

    • John
      Aug 23 - 10:09 am

      To be more specific, will this still be cached?

      <script src="">
      • Dave Ward
        Aug 23 - 10:24 am

        Scripts loaded through the jsapi loader are cached.

        I still prefer the direct reference for pulling a single script off the CDN. It only requires a single HTTP request, where jsapi/google.load() results in two.

        The place where using google.load() makes sense is when you’re pulling several scripts off their CDN. Then, its dynamic script injection will probably be faster than static script references.
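        To make the request arithmetic concrete, here is a hypothetical accounting helper (not Google's API, just a sketch of the cold-cache cost described above):

```javascript
// Hypothetical helper: HTTP requests needed on a cold cache for each
// loading style. The jsapi loader itself costs one extra script
// request before any google.load() calls run.
function requestsFor(scriptCount, useJsapiLoader) {
  return scriptCount + (useJsapiLoader ? 1 : 0);
}
```

        One CDN script via a static tag is one request; the same script via jsapi/google.load() is two, which is why the direct reference wins for a single script.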

  140. [...] the remote jQuery potential for decreased latency, improved parallelism and better caching, as detailed in this post on Encosia. On top of these benefits there is the beneficial side effect of this jQuery being on a Google [...]

  141. [...] 3 reasons why you should let Google host jQuery for you [...]

  142. [...] 3 reasons why you should let Google host jQuery for you [...]

  143. [...] 3 rea­sons why you should let Google host jQuery for you [...]

  144. Jet
    Aug 29 - 4:45 am

    I would do this if it weren’t for the fact that I’ve corrected bugs inside jQuery that would otherwise cause errors to be served to my clients.

  145. Zachary
    Sep 01 - 9:06 am

    Thanks for the handy article. :)

  146. Great article! Thanks for sharing.

  147. [...] 3 reasons why you should let Google host jQuery for you [...]

  148. Alinani
    Sep 16 - 3:50 am

    Great article dude

  149. Kristr
    Sep 19 - 4:11 am

    It’s a great idea; recommended for everyone. Loads faster and saves bandwidth.

  150. Dave
    Sep 21 - 3:48 pm

    Looking for your donation button. Finest decision support via comments that I have ever seen.

  151. [...] There are various reasons to serve common JavaScript libraries like jQuery from a CDN instead of your own server, and Dave Ward explained 3 major ones. [...]

  152. Vishal Astik
    Oct 28 - 3:42 am

    Hi Dave,

    I only receive gzipped jQuery from the Google and Microsoft CDNs when I use “https://”
    and not when I use “http://” in Firefox. I have not yet tested in other browsers.

    example: “” – gzipped
    “” – only minified, not gzipped.

    Thank You,
    Vishal Astik

    • Dave Ward
      Oct 28 - 8:48 am

      It’s coming in gzipped for me, as of a few minutes ago:

      It’s possible you’ve got a proxy somewhere between you and the CDN that’s altering request headers on regular HTTP traffic but not SSL. Do you see gzip or deflate in the Accept-Encoding line here?

      • Vishal Astik
        Oct 29 - 12:13 am

        Hi Dave,

        Thank you very much.

        You were right; the problem was due to the proxy.
        When I try it from home, everything works fine.

        Thank You,
        Vishal Astik.

  153. [...] For more information on method and the reason explanation, you can visit this [...]

  154. [...] site that included the same file will not have to reload it. For further details I recommend reading this [...]

  155. [...] why are we linking to the jQuery library through Google you may ask? The answer: reliability and speed. After we called jQuery, we get the javascript commands with our script.js file. Don’t forget [...]

  156. [...] Why CDN?:     Use Google CDN instead of MSFT CDN:  [...]

  157. Mohan
    Nov 12 - 10:52 am

    Hello Dear,

    I have tried to make a jQuery plugin for an HTML loading function. Please refer to the following link.

    Please suggest how I can improve it further.


  158. palindrome
    Nov 18 - 10:33 am

    If you’re following this advice, you are doing it wrong.

    There is one single reason that outweighs the 3 given in the article:

    Google will track and save the information of all your site visitors, without them even noticing. By using the APIs, you just help to feed Google’s databases.

    And if some alert users have blocked scripts from Google for exactly this reason, they will not be able to use your site.

    It is completely naive to think that Google hosts these APIs for the good of mankind. It’s all about harvesting personal data – who visits which site at which time. Using the APIs surrenders your unsuspecting visitors to Google. Don’t do that.

    • Dave Ward
      Nov 18 - 10:41 am

      Luckily, that’s not very realistic. The CDN’s files are served with a far-future expires header, which dramatically minimizes how many HTTP requests are actually made to Google’s servers – exactly the opposite of how they would configure it for “evil”.

  159. [...] are a number of good reasons to let Google host jQuery for you. Categories: Internet, Noteworthy, Personal and WordPress [...]

  160. [...] The reasons for doing this are best put by Dave Ward: [...]

  161. [...] installs jQuery at your site. But for several reasons, calling Google’s file is “the right [...]

  162. [...] Besides that, there is also the option of getting the jQuery files from a so-called CDN (Content Delivery Network). The most prominent provider of such a network is Google. This approach has several advantages over the conventional one (as explained by Dave Ward): [...]

  163. [...] Parallelism: Up to six files at a time can be requested from one server. But since two files are being requested from a second server, Google API, you can download 8 files concurrently. [...]

  164. [...] wrote a great article on why you should use this ability, and the benefits to [...]

  165. [...] You may also be interested in: Three reasons to let Google host jQuery for you. [...]

  166. 916 Networks
    Jan 01 - 11:51 pm

    I developed a site for a client that is available worldwide (in languages I don’t speak), and ran into issues with people in other countries not being able to load the jQuery plugins from Google. I was not able to get exact error messages. We switched to hosting the jQuery files locally and it resolved our issue… for what it’s worth to anyone.

  167. [...] You can read more about why using Google’s hosting of everything jQuery here. [...]

  168. [...] Also see Dave Ward’s 3 reasons why you should let Google host jQuery for you [...]

  169. Eric Muyser
    Jan 16 - 11:50 pm

    If you are using type=’text/javascript’, you are doing it wrong.

    • Dave Ward
      Jan 17 - 12:23 am

      In 2008, when this was written, no one was using an HTML5 doctype. In HTML 4.01 and XHTML 1.0 [Transitional], script types are required. For that matter, they’re optional in HTML5, not forbidden.

  170. Eric Muyser
    Jan 17 - 1:55 am

    Right, consider that an update comment for future readers. Optional as in ‘you could put anything there and it would still be ignored’.

  171. nate c
    Feb 05 - 11:07 pm

    I have a hard time believing Google is not interested in logging the IPs. They are willing to give anything away for free as long as they get a ping. Last time (a few years ago) I looked at the Google Analytics JS drop-in, they had a cute little one-pixel image wrapped in a noscript tag. PING.

    Google may not have that much more market share yet as far as search goes, but they saturate the web with their ‘free services’ that track nearly every site you go to. Don’t fool yourselves. It is not benign.

    Personally, I find it unethical to host 3rd party content on a site. When a user types or clicks an address, they should be able to expect that is where they are going. There should not be 15 different hosts hiding in the background JS.

    Constructively, this could be avoided if there were a way to include a hash of the file along with the path in the script tag. It would then be obvious to the browser that the file was indeed the same library it had already loaded, and it wouldn’t bother with it again. Very simple, without having your every click tracked by big G.

    • Dave Ward
      Feb 06 - 12:09 am

      Since the AJAX Libraries CDN serves content from a cookie-less domain, with a +1 year expires header, it’s extremely poorly suited to any sort of per-click tracking. The idea that AdSense or Google Analytics tracks you from site to site is a legitimate concern, but this CDN is configured to optimize performance at the direct loss of “trackability”.

  172. SOLD and All Change!
    Feb 09 - 5:43 pm

    [...] a side note using Google CDN’s jQuery is very good and better, check this link for reasoning, and this one for implementation (in general not just [...]

  173. Jeune gay
    Feb 11 - 11:22 am

    Thanks for this great article and good trick ;)

  174. [...] hosted copy of jQuery in this example (more detail on the benefits of such an approach can be found here); the following snippet is then placed between our web page’s head tags [...]

  175. Ricardo Rowe-Parker
    Mar 08 - 2:11 pm

    This is a great article. Thank you for taking the time to write it. The commenting area is also very valuable.

    Regarding this statement in your article:

    If you’re curious why the script reference is missing the leading http:, that’s a helpful trick which allows you to use a single reference that works on both HTTP and HTTPS pages.

    While this is indeed a nice trick, I wanted you to know that it did not work for me and probably does not work, period. When I include the link to jQuery without the http: that way, jQuery simply isn’t loaded.

    Thanks again.


    • Dave Ward
      Mar 08 - 2:13 pm

      It definitely works. The one place you’ll run into trouble is if you’re editing/testing a file locally on Windows, without using a development server. From a page opened at file:///yourPage.html, the protocol-less reference will attempt to load jQuery from your local filesystem instead of using regular HTTP.
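      A simplified model of the resolution rule (real browsers follow the full URL specification; the domain in the usage below is a placeholder):

```javascript
// Simplified sketch: a protocol-relative src inherits whatever
// protocol the page itself was loaded with, including file: for
// pages opened straight off disk.
function resolveProtocolRelative(pageProtocol, src) {
  if (src.indexOf('//') === 0) {
    return pageProtocol + src;
  }
  return src;
}
```

      On an https: page the reference resolves to the CDN over SSL; on a file: page it resolves to a nonexistent local path, which is exactly the failure mode described above.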

  176. [...] isn’t. But before I go into why you shouldn’t, you should read why you should, which you can here. All of which are valid [...]

  177. Shane
    Mar 12 - 1:37 am

    I installed a plugin called “Use Google Libraries” that did all the work for me :) However, although it works, I find that it didn’t provide benefit #3 that you describe. For some reason, my website calls jQuery twice from Google’s CDN (probably some other plugin makes the call), and the site attempts to download it. Do you believe that if I hard-coded my site like you recommend, it would stop the second call? Or do I need to get rid of that plugin (assuming I can find it; I’ve got 19 installed)?


    • Dave Ward
      Mar 12 - 8:31 pm

      That sounds like either two separate plugins are injecting the jQuery CDN reference or maybe you’ve got the “Use Google Libraries” plugin in addition to a theme that has the hard coded link already included.

      I’d say try disabling the “Use Google Libraries” plugin first, and see if there’s still a single request being made for jQuery on the CDN. Also take a look at footer.php and header.php in your theme to see if jQuery is already being included there. If so, you shouldn’t need the plugin.

  178. Besoin de conseils
    Mar 13 - 1:35 pm

    Good article, thank you.

  179. [...] 3 reasons why you should let Google host jQuery for you – Encosia. [...]

  180. [...] google link [...]

  181. Dan
    Mar 30 - 8:42 am

    Of the two methods available, this option is the one that Google recommends:

    The google.load() approach offers the most functionality and performance.

    This quote is actually not correct, although it may have been changed since the article was written. The Google CDN site currently says:

    Next, you load the libraries. The preferred method is to load the libraries via standard script tags, which will result in the fastest loads.


    The google.load() approach offers the most functionality, while additional performance is gained by using the script tag approach.

    Great article!

    • Dave Ward
      Mar 30 - 6:12 pm

      I’m glad to see they changed that. Thanks for pointing it out.

  182. Alex
    Apr 07 - 3:56 am

    Dear encosia,

    It is very hard to remember the path to jQuery on Google. So every time I want to include Google-hosted jQuery, I do a search for “jquery on google” and your blog always comes up on top.

    I have done this about 25 times now, and every time I have to scroll all the way to the bottom of the page to get this bit.

    <script type="text/javascript"

    I am sure a lot of people do this. Please insert similar text right at the top of the blog for people like me.

    Thank you very much.

  183. Alex
    Apr 09 - 7:01 am

    Thank you very much for listening; this ought to save a lot of time for me in the future :)

  184. vamban blog
    Apr 21 - 4:06 am

    I have used Google-hosted jQuery in my blog. Works like a charm, and thanks for sharing the valuable trick.

  185. Will Reinhardt
    May 03 - 2:58 pm

    We were doing this on a client’s ecommerce site, and as unlikely as it sounds, the Google-hosted jQuery *did* go down. Even going down for a short time means a loss in revenue. We don’t use Google’s hosted jQuery anymore.

  186. korbes
    May 03 - 5:46 pm

    I visit this site a couple of times a month just for the link. Saved me lots of time. Thanks!

  187. Thomas
    May 14 - 7:06 am

    *Never ever* make yourself dependent on third parties if you are serious about your website and you have alternative solutions. Google’s servers in particular are unreliable in my experience (I am getting server errors all the time, for instance on Google Groups, and emails on Gmail have simply disappeared or not been forwarded in the past)


  188. Jerico Jien
    May 17 - 12:13 pm

    This is an amazing tutorial, sir. Thanks a lot. It is very helpful for beginners like me. I just want to ask one thing, though. After all the coding with jQuery and having it hosted by Google, how would I be able to install the code on my site and make it work there? I am using the Blogspot platform. I already have the code for this particular purpose; I just don’t know how to install it on my site.

    Please help me, sir. Thanks and more power to you.

  189. [...] There are several other reasons why it may be a good idea to use Google’s API to load jQuery. Have a look at 3 reasons why you should let Google host jQuery for you. [...]

  190. [...] it is advisable to delegate hosting of the library to Google. Why? Read this. <script type="text/javascript" [...]

  191. Scott
    Jun 02 - 2:19 pm

    Just wondering why, if I leave the https off (//…), jQuery fails to load. I just end up with a “$ is not defined” error.

    • Dave Ward
      Jun 03 - 12:21 pm

      That’ll happen when you load the page from your local filesystem (as opposed to using a development server and localhost-based address), since the page’s base protocol isn’t HTTP or HTTPS.

  192. [...] personal projects or (even not so personal projects) using Google’s CDN hosted jQuery libraries is a no brainer. But what happens when you need a plan B? Or what happens when you realize that you need to have a [...]

  193. [...] If you’re curious as to why there’s no http:// as part of the src, it’s so you can use a single reference that works on both secure (https) and unsecure (http) pages. David Ward explains further here. [...]

  194. [...] Import jQuery into your HTML. [...]

  195. [...] reference that works on both secure (https) and unsecure (http) pages. David Ward explains further here. Note that you will need to add the http: if you’re testing your code on a local [...]

  196. [...] reference that works on both secure (https) and unsecure (http) pages. David Ward explains further here. Note that you will need to add the http: if you’re testing your code on a local [...]

  197. [...] if you need a little more convincing than that, Dave Ward does a nice job explaining the [...]

  198. Andre Margono
    Jun 15 - 11:27 pm

    I prefer not to use scripts hosted by other parties. In case Google is unavailable (which rarely happens), our web page might break, or the missing library might even open a security hole. IMHO, it’s better to put it on our own hosting. If our hosting is down then no one can access the web page at all, instead of opening the page with a security problem.

    • Not Andre Margono
      Jun 16 - 10:27 am

      Wait, did you just say the absence of jQuery might leave a security hole?

      Looks like you are just making things up out of thin air. I mean, seriously, who depends on JS for security? Take it out of there and put it where it belongs. Also, please stop complaining about Google going down so you didn’t have JS to secure your website.

      • Andre Margono
        Jun 30 - 7:19 pm

        Yes, it’s true we don’t depend on JS for security, yet if a website that uses JS for some processing breaks because the JS is unavailable, it might show some data that is not suitable for the public to see.
        And how will you be sure that Google or another script provider does not put some other malicious script in their hosted script?

        • Dave Ward
          Jul 01 - 1:24 am

          If you’re relying on JavaScript to hide data that’s not suitable for public visibility, you’ve probably already been compromised and don’t know it yet. JavaScript can’t hide anything from view source or even a simple wget.

          That said, you can use a fallback approach (linked on this page several times) to automatically load a local copy of jQuery if the CDN’s copy fails to load.

  199. Andre Margono
    Jul 01 - 2:13 am

    Actually, I’m not hiding any data or reading data using JavaScript; I never do that on my projects.
    CMIIW, but I saw several websites that looked fine, yet displayed those kinds of data when I disabled JavaScript in my web browser. That’s what I pointed to as the security hole. Maybe this is a different context, though :)

    Anyway, I agree on the fallback approach when using a CDN. Btw, do you think there’s a file hash checker to make sure the file on the CDN is the same as the one on the jQuery website?


  200. Hendrik P de Ruijter
    Jul 03 - 7:57 am

    I’ve switched from using a specific version of jQuery (say v1.6.2) to a generic version (i.e. v1), which is also possible with the script tag:

    <script src="//" type="text/javascript"></script>
    • Dave Ward
      Jul 03 - 11:17 am

      Even if you trust that new versions of jQuery won’t break your site (and the last two major versions had breaking changes that affected many sites), you shouldn’t use the latest version reference for performance reasons. In order to ensure that a browser seeing that reference will always use the latest version, that reference is served with an extremely short expires header. So, you lose not only the cross-site caching benefit, but for repeat visitors this aspect of your site will be even slower than if you’d self hosted jQuery with proper expires headers yourself.

      • Hendrik P de Ruijter
        Jul 03 - 3:48 pm

        Excellent points. I used an old version of jQuery for some sites and discovered later that there were errors with IE6. These were solved in a more recent version of jQuery. Hence I thought it might be a good idea to let it update automatically. For these sites, performance wasn’t too much of an issue, but breaking the site would be of course. Thanks for the pointers.
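        Dave's caching point reduces to a rule of thumb that can be sketched as a tiny check (an illustration only; the actual expires behavior is decided by the CDN, not by anything in your page):

```javascript
// Heuristic: only an exactly pinned release (e.g. "1.6.2") is served
// with the long-lived expires header; partial references like "1" or
// "1.6" roll forward and therefore cache poorly.
function isExactVersion(version) {
  return /^\d+\.\d+\.\d+$/.test(version);
}
```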

  201. Barney
    Jul 18 - 8:47 am

    It shows what a great topic this is that the discussion spans over two and a half years! For me personally there is still an element of suspicion and concern in using a third party and relying on them – no matter how big or dedicated they may be. I guess where revenue and performance are concerned, however, this is a deal breaker. Just to be contradictory, the idea of automatic updating does appeal, which I assume the hosted option would provide?

    • Dave Ward
      Jul 19 - 12:06 am

      You should avoid the auto-updating usage, because it also comes with a very, very short expires header which hamstrings caching for repeat visitors. It’s also somewhat dangerous to assume that jQuery updates will not break old code these days – jQuery 1.5 and 1.6 included breaking changes (in the interest of the long-term good).

  202. Susan Ottwell
    Jul 18 - 11:43 pm

    I’m not so sure that I like the idea of Google or anybody else having the ability to track exactly who is requesting my site’s pages. Since every request for the API libraries is accompanied by the key, Google knows the site being accessed, and can also know who the visitor is and what page he has come from. I’m not paranoid, nor (I hope) are my visitors, but still, giving someone that much information on my site and its users’ activities gives me something to think about.

    • Dave Ward
      Jul 19 - 12:04 am

      Don’t worry much about that. Since Google’s serving the script with a +1 year expires header, they’re committing to a best-case scenario that’s extremely sub-optimal for tracking your users. With the huge cross-domain caching potential that the Google CDN has accumulated at this point, there’s a good chance they’ll never know that your users hit your site because the user will load jQuery from disk and not even need to make a request for a 304 result.

  203. [...] I use Google CDN to include the jQuery and jQuery UI script and I suggest you do the same as there are some advantages of using Google CDN than hosting your own script. <script src="; [...]

  204. [...] the jQuery library from Google’s or Microsoft’s CDN for decreased latency, increased parallelism, and better caching. [...]

  205. [...] up on your jQuery and make sure you have the jQuery source file linked into your page (you are letting Google host jQuery for you, right?). If you don’t know much about jQuery, visit the jQuery documentation for some great [...]

  206. Alex
    Jul 31 - 10:07 am

    Great post. I only just got setup with jQuery today and am amazed by the great results you can get with such little code. I implemented the datepicker into my site which used a fraction of the code of my old JavaScript one and has much improved validation. I didn’t really know if I should host the files myself, but one quick Google search and 5 mins later and I now have a good understanding. Thanks!

  207. [...] to serve jQuery to your users directly from Google’s network of datacenters . Doing so has several advantages  over hosting  jQuery on your hosting server . [...]

  208. [...] to Encosia for this tip. The URL starts with // so it works with both http and [...]

  209. [...] This is a great article on why you should be using a CDN for this accessing these kind of libraries:3 reasons why you should let Google host jQuery for you . The one problem is dealing with them when your page is accessed via https and the “mixed [...]

  210. [...] reduces latency, increases parallelism, and improves caching (you can refer here (English) to learn more [...]

  211. Dev Head
    Aug 10 - 3:05 am

    It’s amazing how the comments have been going on for so long. I too came here in search of the quick link; I thought it would be quicker than opening Finder and grabbing a copy from one of my sites. Thirty minutes later and I’m not so sure it was quicker.

    While I’m here, I figure I might as well put in my few cents, as there are clearly a few designers and web developers on here looking for real clarity on the matter.

    For those who just skip to the end for the answer, here it is: local hosting is always a better choice than using Google to host one jQuery file.

    The most basic of reasons is control, which you lose by allowing someone else to add whatever they choose to your site. In addition, you are at the mercy of their network. Can you have faith in their abilities? Sure, nothing wrong with that. Should you base a business on letting an easily mitigated risk go on? No, no you shouldn’t.

    Looking for the technical side of it? Excellent…

    To start off, the most obvious reason is that no matter whether you have jQuery hosted on Google or on your server, all subsequent requests are pulled from cache, meaning any suggested benefit of using Google is pointless after users have been to your site. If your site only loads once, then you may have a point. I know from my experience my clients like users to get past the home page, and the majority do.

    Secondly, using the Google version results in a DNS lookup. Sure, your computer may have that cached, but again, if you’re relying on the assumption that it already exists, you might as well rely on the assumption that it doesn’t, since the assumption that it doesn’t allows you to reduce a risk.

    From a CDN standpoint, you normally look to use one when your customers are located across vast distances and you have a large amount of traffic. If this is the situation you find yourself in, then using a CDN for your own assets would be more beneficial than for just one small jQuery file.

    Additionally, users won’t always get the closest response from a data center. For me, I get sent up to Mountain View, CA, even after first hopping to the data center in LA. It’s not a perfect science but more of a best guess, and guessing should have no part in logical conclusions.

    As such, I don’t see any valid use in allowing Google to host a jQuery object for me or the companies I work for. At best some users may experience a noticeable benefit – say, your customers in Montana accessing with their free month of AOL. The potential gains cannot offset the downsides associated with that practice… it’s good sense, in my professional opinion.

    But by all means, don’t take my advice; live and let live, I say. But if you build and maintain enough sites for enough time… you’ll realize this was sound advice.


    Dev Head

  212. [...] jQuery for this, so you’ll need to have that installed. I use the Google Code hosted version, for many reasons. So to add the call to jQuery, our code becomes [...]

  213. [...] Category: Uncategorized | Permalink. ← Previous post [...]

  214. Charlie Summers
    Aug 24 - 1:20 pm

    I am a little curious about why no one here discusses your user’s privacy. Google doesn’t make this web space available for altruistic reasons, they are a business whose model is to gather as much data about everyone as possible and sell access to that information to the highest bidder. When a user pulls javascript from GoogleAPIs, this allows Google to track who is pulling the script and for what website it is being pulled. This is a valuable piece of information to add to all of the other information in the user profiles.

    I (as a user) mistrust any website that requires me to make a connection to Google’s servers (which we firewall at the router), and have gone so far as to close bank accounts where the programmers have been lazy enough to rely on Google instead of serving their own javascript. Admittedly, most users don’t care about being tracked (there’s a never-ending argument between the young and the old!), but for those of us who do, relying on GoogleAPIs is a strong signal that the website programmers, and by extension the company owning the website, have no regard whatsoever for user privacy.

    • Dave Ward
      Aug 24 - 1:32 pm

      That’s been discussed a few times throughout the comments. Due to the far-future expires header and cookie-less domain, Google could only gather extremely spotty tracking data, if they tracked it at all. Many, if not most, of the script references to Google’s CDN don’t result in any HTTP request to Google’s servers, and they lack tracking cookies when they do.

      Compared to the even more ubiquitous services like AdSense and Google Analytics that do actively track viewers, the public CDN is harmless.

  215. Charlie Summers
    Aug 24 - 6:15 pm

    Compared to the even more ubiquitous services like AdSense and Google Analytics that do actively track viewers, the public CDN is harmless.

    That’s a silly argument; compared to Gmail, the CDN is harmless, too (not to imply Google tracking is harmful). But I don’t use Gmail (hell, our mail server won’t accept connections from Gmail machines), and I don’t allow connections to those servers either. And none of that is relevant. We are talking solely about the possibility of combining website and user through GoogleAPIs, IP addresses, and existing Google cookies, which is possible.

    You say they receive, “extremely spotty tracking data.” Since Google has never touched any data they didn’t mine and maintain, I don’t personally want to give them anything, even what you consider “spotty.”

    Please don’t take this the wrong way, but if this weren’t profitable for Google by providing them additional data to mine, they wouldn’t be doing it. Google is a business whose only product is information; they are not a not-for-profit giving things away, and regardless of how most geeks see them, they are not altruistic. Gmail, Google Maps, all of their services are designed to acquire data they can sell. That isn’t “evil,” but I prefer not to participate.

    If you as a website require me to load data from any Google server, you are requiring me to participate in Google’s tracking whether I wish to or not, and this makes no sense, particularly if you are a service that should be providing privacy, like a bank or a domain registrar. I am actually changing registrars because my current one requires me to accept Google AJAX files for only one object: the credit card verification (CVV) field. They “require” me to accept that JS file. I won’t, so I can’t pay them. How seriously stupid is that?

    • Dave Ward
      Aug 24 - 11:31 pm

      The problem isn’t Google’s CDN. It’s that your credit card company doesn’t offer a fallback when you block connections to the CDN. There’s no reason that both can’t coexist. When you loaded this page, it first attempted to load jQuery via the Google CDN too, but if you blocked that connection then it resorted to loading a local copy. Any site can do that in one line of code. I totally agree with you that it’s not necessary for them to require you to allow a CDN connection to use the service at all.
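      The one-line fallback described above is commonly written as a pair of script tags; a minimal sketch, with an illustrative version number and local path:

      ```html
      <!-- Try the Google CDN first (protocol-relative URL) -->
      <script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>

      <!-- If the CDN request was blocked or failed, window.jQuery was never defined,
           so fall back to a local copy (the path here is illustrative) -->
      <script>window.jQuery || document.write('<script src="/scripts/jquery.min.js"><\/script>')</script>
      ```

      Because the inline check runs synchronously, any script included later on the page can assume jQuery is present either way.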

      You’re drastically overestimating how useful IP and referrer data is when you only receive it in up to one year intervals though. With most connections coming from dynamic IPs or from behind NAT, a sporadic sampling of logged IPs without tracking cookies is worse than worthless. Trying to integrate that into the data collected by cookie-tracked services like AdSense, Analytics, and Gmail would only degrade Google’s ability to track you, not improve it.

      Google does a lot of things that don’t directly add to their bottom line. From their contributions to charity to things like the Summer of Code each year, they certainly spend a lot of money that only benefits them by improving the web platform in general (so they can continue to serve ads on it) or by buying them goodwill. I can be as cynical as the next guy, but sometimes there aren’t monsters in every shadow. It makes perfect sense that they’d host these common JavaScript libraries on their CDN in order to help speed up the web and improve their advertising business in the long run.

  216. Charlie Summers
    Aug 25 - 12:16 pm

    The problem isn’t Google’s CDN.

    I agree. And so there is no misunderstanding: it isn’t my credit card company but rather my domain registrar who requires the code, only on the payment screen, and only for the CVV entry field; everything else on their site works without problem, except that one tiny entry field. Still, it prevents me from paying them, so I am transferring my domains. (*shrug*) Most people would just bypass their firewalls or use a proxy, I suppose, but I can’t trust a company that uses an allegedly secure connection and then offloads connections to a non-trusted third party without informing the user.

    You’re drastically overestimating how useful IP and referrer data is when you only receive it in up to one year intervals though.

    You are drastically underestimating the number of times user browser caches are cleared (in my case multiple times per week), browsers or systems reinstalled, et al. But the level of usefulness is irrelevant in any event; I do not believe Google has ever acquired any data that it didn’t maintain and eventually put to use, so even a small amount of usefulness is something I prefer not to provide.

    (*sigh*) We will need to agree to disagree. While it is impossible to eliminate tracking (both on and off the Internet), it is possible to keep one monolithic company from knowing a frighteningly large amount of data about one and maintaining it past the 180-day legal limit in the United States. I chose to follow this path. You as a web programmer seem to think it’s a great idea to allow this company to monitor your users. I think that is the height of irresponsibility, while you believe it to be not only practical for the website but a benefit to the entire infrastructure. I expect they will do whatever is technically possible as their entire business model is predicated on violating the privacy of Internet users, while you believe they have only the noblest intentions and can be trusted with this massive amount of data, so inadvertently providing them more is not a problem.

    And since we’re never going to agree on this matter, I leave the last word to you with my thanks for the enjoyable conversation.

  217. Anonymous Guy
    Aug 27 - 6:43 am

    Hello, I tried using your link (the // thing at the start of the URL), and Chrome interpreted it as file://. Didn’t check other browsers though.

    • Dave Ward
      Aug 27 - 1:24 pm

      That will happen if you’ve loaded the page itself via a file:// URL, since the // URL just loads the resource with the same protocol that the page was loaded with. That won’t happen if you access the page via HTTP or HTTPS (e.g. once deployed or using a server on localhost during development).
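      In other words, a protocol-relative reference inherits whatever scheme the page itself was loaded with; a sketch (version number illustrative):

      ```html
      <!-- Page loaded via http://  → resolves to http://ajax.googleapis.com/...
           Page loaded via https:// → resolves to https://ajax.googleapis.com/...
           Page opened via file://  → resolves to file://ajax.googleapis.com/... and fails -->
      <script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
      ```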

  218. [...] Read more about why to use Google for your jQuery Scripts & Google’s AJAX Libraries CDN here.  and learn more about jQuery [...]

  219. Mike
    Sep 22 - 4:39 am

    If you’re allowing third party JavaScript hosted on an external domain to modify the DOM of your site, you’re doing it wrong. Talk about a single point of failure. Imagine the scale of destruction you could cause by hacking the Google hosted version of jquery.

    • Dave Ward
      Sep 22 - 11:28 am

      If you think Google’s public-facing servers being hacked is likely, the jQuery CDN should be the least of your worries. The surface area in AdSense, Google Analytics, GMail, and even Chrome itself is far, far larger.

  220. Mike
    Sep 22 - 12:10 pm

    I agree with you that the same problem exists with AdSense and Analytics. This is why we should be encouraging people to use server side log analytics programs instead of Google Analytics, and encouraging people to avoid poorly designed ad systems like AdSense.

    GMail and Chrome have nothing to do with the discussion at hand.

    What makes this jQuery issue *completely* different to the other things you just mentioned, is that the benefits are only slight, yet the drawbacks are *huge* for the web. Google Analytics and Adsense as a service *require* trusting your DOM to a third party domain. Using jQuery doesn’t.

    If you’re encouraging people to make the web considerably less distributed in nature, and encouraging them to build a single point of failure, all for a barely noticeable improvement in performance, you’re doing it wrong.

    • Dave Ward
      Sep 22 - 2:04 pm

      GMail and Chrome are definitely relevant if you truly believe that Google’s content distribution servers being hacked is a realistic fear to act on. For all I know, I’m using a maliciously patched version of Chrome to write this comment.

      I understand your point (if you look through the previous comments, it’s been discussed a few times before), but I can’t agree with the implied risk assessment.

      It’s incredibly unlikely that Google’s CDN will be compromised to deliver a malicious copy of jQuery to begin with. Even if that did happen, it would certainly be discovered within hours, if not minutes. The sheer number of eyeballs on those files is the best anti-virus there is. In fact, I’d be shocked if there aren’t people out there running an automated, periodic diff on the CDN copy and the latest copy on the jQuery website to monitor for that specific issue.

      Meanwhile, if someone compromises your server – a far, far more likely event – and replaces your copy of jQuery with a modified one, you’ll probably never know. Unless the change breaks something, the overwhelming majority of site owners (myself included) wouldn’t be vigilant enough to notice a subtle change like that in files they assume to be safe. I’d be much more worried about that vector than worrying about a black swan event in spite of the mitigating crowd-sourced security that comes with a popular public CDN.

      Ultimately, I’m sure we’ll have to agree to disagree about this, and there’s nothing wrong with that. I do want to be sure that you (and others reading this later) understand that those of us advocating the public CDN approach have considered the security implications, though. We just don’t agree that the risk is probable enough to justify slowing the web down for fear of it.

  221. It has a nice API and I am glad that it’s free, but just know that even with static IP addresses for a company, you often can’t learn much about the location.

  222. mg
    Sep 23 - 1:44 pm

    1. What about the best practice of minimizing http requests by combining scripts into one single javascript file?

    2. What’s best for sites requiring login and https?

    • Dave Ward
      Sep 24 - 12:05 pm

      Over time, the solution I’ve found to be optimal is to use the Google CDN for jQuery (with a local fallback, just in case), combine site-wide plugins into a single minified bundle to optimize long-term caching for the portion of scripts that changes infrequently, and then include your more volatile custom scripts in a separate minified bundle.

      You can see that in action here at the bottom of the source for this page (except I haven’t bothered to minify my custom script since it’s so small).
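      That three-part layout might look like the following at the bottom of a page (bundle names and paths are illustrative):

      ```html
      <!-- 1. jQuery from the Google CDN, with a local fallback just in case -->
      <script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
      <script>window.jQuery || document.write('<script src="/scripts/jquery.min.js"><\/script>')</script>

      <!-- 2. Site-wide plugins, combined and minified; changes rarely, caches well -->
      <script src="/scripts/plugins.min.js"></script>

      <!-- 3. Volatile custom code in its own bundle, so edits here don't invalidate the plugin bundle's cache -->
      <script src="/scripts/site.min.js"></script>
      ```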

      Regarding HTTPS, the Google CDN does support HTTPS references to all of its content. An easy way to ensure you’re always using the best reference is to use the protocol-less URL. Then, browsers will automatically select the appropriate protocol based on what was used to load the page.

  223. Stefano Novelli
    Oct 01 - 5:32 pm

    You can take the latest version from the Google CDN with:
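    The snippet itself didn’t survive the page formatting; a version-less reference of the sort presumably meant here would look like this (URL pattern shown for illustration; note the caution in a later reply that such references defeat long-term caching):

    ```html
    <!-- "1" resolves to the newest 1.x release; "1.7" would resolve to the newest 1.7.x -->
    <script src="//ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>
    ```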

  224. AntoxaGray
    Oct 08 - 3:29 am

    I prefer to hotlink to the Google APIs too, but some clients insisted on uploading it to the server. They said «it’s safer; what if the Google site crashes?». Personally, I have never seen Google down or hacked.

    But yeah, I agree; it’s stupid to have a billion copies of jQuery in your browser cache.

  225. Thomas
    Oct 08 - 10:47 am

    This discussion only seems to address the question of whether the jQuery library should be hosted by Google or not. I think a more important question is whether one should be using jQuery at all, because in principle everything that jQuery can do can be done with basic Javascript as well (essentially it is just a kind of restructuring of Javascript). In many cases (e.g. just for a drop-down menu) it simply is overkill to download the whole jQuery library when a little bit of CSS and Javascript does the job as well.

    Even more importantly, I found that jQuery code (or that of modules using jQuery) can interfere with standalone Javascript code. For instance, for our company website I had written some Javascript code that, for reasons of optimum page display, enforced a page reload when the window was resized. Recently we had this website completely redesigned externally, where the designers also added a couple of features using jQuery. However, due to certain features in the jQuery code (which I wasn’t actually able to pinpoint exactly), in IE7 this now forced a reload as soon as the page was loaded, so an infinite reload loop developed (which obviously IE7 users were not happy about).

    As a result, I have now replaced all modules using jQuery with CSS and Javascript code compatible with the other Javascript on the page. At least this way I can keep control over the script actions and no unexpected results arise.


    • Hendrik P
      Oct 09 - 8:09 am

      @Thomas, yes, why on earth would anyone use a tool that makes programming easier and reduces code size by as much as 70%? I don’t understand those people. In fact, I’m sending this message in Morse over the telegraph because, as you probably understood, that’s all the internet is: communication over wires. I for one prefer doing things the old and slow way, and I was excited when the homing pigeon informed me there was a like-minded soul somewhere out there. Yours sincerely, Hendrik.

  226. Thomas
    Oct 09 - 10:11 am


    I take it you are trying to be ironic here and actually dismiss my point of view.

    Let me again point out that our company website was recently redesigned, with the designers not only implementing jQuery but also Mootools scripts for such basic things like dropdown menus, banner rotation with fade-in etc. As a result the homepage weighed in at over 500kB, which even on broadband connections can cause a noticeable waiting time until the page is fully rendered. Not to mention, as I said already, the problems which certain features in those scripts caused for the correct execution of other Javascript code. And since locating and debugging these kind of conflicts proved to be an utter nightmare, I decided to remove all the jQuery and Mootools scripts and write myself dedicated Javascript/CSS code that does exactly what the former did as well. Not only is all the scripting now without any internal conflicts, but the download size of the homepage has been cut by half to about 250kB.

    So I really don’t see what, in this case, your argument of easier programming and reduced download size is based on. There is nothing that jQuery can do that plain Javascript couldn’t, and in most cases the latter will provide by far the more efficient solution, not only as far as the amount of code is concerned, but also with regard to the maintenance of the code (because if you want to keep control over all aspects of the scripting, you would have to know exactly what each jQuery statement does in terms of the underlying Javascript, which I doubt anybody but the authors of jQuery would be able to do).

    Just bolting another programming language on top of an already existing one may admittedly keep some additional programmers in bread and butter, but is in most cases problematic and inefficient and thus user-unfriendly.


    • Hendrik P
      Oct 09 - 2:57 pm

      Sure, there might be cases where sticking to standard JS is a good option. However, there are millions of websites that are greatly helped by jQuery, no question about it. This specific thread is meant for people already convinced of its advantages and who are debating a jQuery specific problem. Therefore, while your argument may be valid in some cases, I’m sorry to say that your comments add nothing to this discussion.

  227. Roman
    Nov 07 - 5:03 pm

    I’ve always stored JS libraries locally. Now I see that it’s a good idea to do it the way you recommend.

  228. Bryan Hadaway
    Nov 10 - 3:17 am

    Just a heads up, you left http: out of the jQuery call.

    The full thing should be:

    <script type="text/javascript" src=""></script>

    Thanks, Bryan

  229. Bryan Hadaway
    Nov 11 - 5:25 pm


    Yeah, it is a bit tough to let go of the warm and fuzzy way of doing things. Considering, I’m all about minimalist code this should make me happy in the long run. :)

    Thanks, Bryan

  230. [...] below if you want to host it your self but having it Google hosted it has advantages as talk about here and you would need to add the theme images and css files into a module in your package if self [...]

  231. Travis Cunningham
    Nov 29 - 7:56 pm

    True, Google has a lot of speed and would save you bandwidth. Cloudflare is a free CDN that would also do this for you.

    Unfortunately, hackers will attack this if everyone uses it. A hacker could mount a MITM (man-in-the-middle) attack that modifies the contents of the jQuery script. Since Google is trying to make your browser cache jquery.js, the hacker’s code would remain in your browser’s cache and would be loaded every time you loaded a website using the Google-hosted jQuery. This is a common attack on large computer networks, but home users on private networks should not have to worry about it.

  232. Saeed Neamati
    Dec 03 - 6:54 am

    The only problem we have with CDNs, is filtering. Sometimes you use a CDN library, but that library can’t be delivered, because the CDN itself is filtered in a specific country. However, all other reasons are valid. Thanks.

  233. [...] [...]

  234. Jason
    Dec 19 - 12:52 pm

    Thanks! Great post. I also noticed I was having a lot of issues with some servers when hosting jQuery locally. I use Google instead now.

  235. Dan
    Dec 21 - 3:13 pm

    Really bad idea. I am presently working with a user of a web-based app that has done this, and the user’s firewall or proxy is blocking the URL.

  236. Chris
    Dec 29 - 2:55 pm

    There is one benefit you didn’t mention. Google gets free tracking data on all the pages you use it on. Yay! ;)

    • Dave Ward
      Dec 29 - 3:01 pm

      I have no way to know whether they’re trying to harvest some sort of tracking data via the CDN or not, but it would be incredibly poor quality data for individual tracking. The domain is cookie-less and your browser can go for up to one year without ever making a second HTTP request to the CDN for a given script. So, they’d only get useful tracking data on you if you were the only user of a static IP address and had caching disabled in your browser.

  237. Chris
    Dec 29 - 3:14 pm

    First off, I liked your article, and used the method for a client, just now.

    But, you are downplaying the tracking information Google does get.

    Every time you load a page with this code, it will send a request to Google. Then, Google will send back a 304 “Not modified” response, if the file hasn’t been modified. You can verify this for yourself, using Firebug or another network monitor.

    What this means is that every time your browser sends that request, Google gets your IP, your user agent (browser/OS), the time of your request, and the referring URL (the page you are on). In my eyes, that’s a good amount of data.

    • Dave Ward
      Dec 29 - 3:18 pm

      The primary advantage of using a far-future expires/cache-control header is that the browser doesn’t need to make any request at all to use its locally cached assets, not even a 304 check.

      You’ll see a 304 when you force a refresh in some browsers, but no request at all when you’re making regular requests to pages. Click around my site here (don’t manually refresh) and watch the network tab in your browser. You won’t see an HTTP request for the Google CDN reference after it’s been cached once (at least not for a year).
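      For reference, a far-future cacheable response carries headers along these lines (values illustrative, not captured from Google’s servers); with these, a primed cache makes no request at all until the expiration is reached:

      ```
      HTTP/1.1 200 OK
      Content-Type: text/javascript; charset=UTF-8
      Cache-Control: public, max-age=31536000
      Expires: (roughly one year after the request date)
      ```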

  238. Chris
    Dec 29 - 3:17 pm

    Note: caching is enabled on my browser. And also, from the user agent, different devices could be distinguished, i.e. a smartphone, a Mac, a Windows 7 computer, a Windows Vista computer.

    Just saying. :)

  239. Chris
    Dec 29 - 3:23 pm

    Oh, you are right about the caching! Guess I was being overly cautious

    • Dave Ward
      Dec 29 - 3:26 pm

      Nothing wrong with a little caution. It’s probably a safe assumption that Google’s tracking you whenever they can. This is just one of those few situations where even Google shouldn’t be able to extract any useful tracking information from the data they’ll be getting.

  240. Al
    Jan 04 - 6:19 am

    One major problem we hit today… OpenDNS has just categorised the Google CDN domain as a phishing site, meaning anyone who uses OpenDNS can’t access the jQuery we link to via the Google CDN

    Ok, not a Google problem, but a big enough headache today!!

  241. Chris
    Jan 04 - 3:03 pm

    I wonder if Microsoft’s jQuery hosting domain got tagged as phishing as well

  242. [...] And this explains why you may wanna use it, even if your visual editor works fine: [...]

  243. MegaSteve4
    Jan 12 - 6:13 pm

    DEAD LINK ALERT – I would guess that WordPress is stripping the h t t p from the beginning of the link, resulting in this “//…………….”

    This is happening in the links at the top of the tut and the bottom

    But great tut, v informative

  244. Chris
    Jan 12 - 6:16 pm

    Actually, that’s done on purpose. That way, if you are on an HTTPS page, the script reference matches that protocol.

  245. MegaSteve4
    Jan 19 - 3:57 pm

    Ah ok… Worth noting that the link does fail if you try to run it from a local HTML file, say on your desktop… as I did. I guess ‘live’ would be different. Great article though.

  246. John AtWork
    Jan 24 - 9:25 am

    I can’t use google as my CDN because they have not enabled CORS.

    • Dave Ward
      Jan 24 - 2:45 pm

      A CDN doesn’t need to enable CORS for you to use its assets on your page. That’s only necessary for AJAX requests, not for referencing third party scripts.

  247. Paolo Brocco
    Jan 26 - 10:10 am

    Hi, I don’t know if somebody mentioned it already, but ideally every website would link to jQuery (hosted on Google AJAX Libraries) without specifying an exact version. Because if one website links to jQuery 1.6.0, another to 1.6.1, and another to 1.6.4, then it’s the same as everyone linking to a copy on their own server.

    So instead of linking to a full version number, we should all link to the version-less URL… see the difference? As soon as jQuery is updated, everybody is already pointing to the latest version, the caching is shared, and there’s no need to update your website.

    Disadvantage: if a jQuery update breaks something, you have to correct your scripts or link to an older version. But I would risk that; after all, if something stops working, you just link back to the older version and voilà: fixed.

    • Dave Ward
      Jan 27 - 10:23 pm

      You shouldn’t do that in production. In order to reliably serve up the latest version for /1 or /1.n, the CDN responds with very short caching expiration headers. So, you lose almost all of the caching benefit. At that point, serving jQuery from your own site with proper caching headers would actually be faster for returning visitors.
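      The trade-off is visible in the URLs themselves; a sketch of the two styles (version numbers illustrative):

      ```html
      <!-- Fully versioned: the content at this URL never changes, so the CDN
           can serve it with a +1 year expiration -->
      <script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>

      <!-- Version-less: content can change under the same URL, so the CDN must
           use a very short expiration, forfeiting the long-term caching benefit -->
      <script src="//ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>
      ```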

  248. Andrew Fisher
    Feb 06 - 4:58 am

    This might be a good idea if google delivered its jquery cache as type text/javascript. Unfortunately, it delivers it as text/plain, which is great for reading it but is likely to result in the script being blocked by any halfway-suspicious anti-XSS software.

    Some individual users may be able to work around this, but my view would be that a site that gives me an hour’s work checking out whether I can trust this cache (which I haven’t done, so I don’t even know whether, for instance, google allows third parties to host content) isn’t worth visiting.

    I keep google analytics permanently blocked. Why would I trust google not to move some analytics code onto this site?

    • Dave Ward
      Feb 06 - 11:05 am

      For what it’s worth, the Google CDN is serving jQuery to me with a Content-Type of text/javascript right now, not text/plain.

      In any event, potential problems like that one are why it’s a good idea to also use a local fallback (see previous comments for links and examples of that). That way the majority of your users will benefit from the CDN, and the few with troublesome setups will still be okay.

      Also, keep in mind that the domain is cookie-less. It’s not suited to analytics. Without a way to disambiguate dynamic IPs and machines behind NAT, trying to extract individual tracking data from the CDN would be pointless.

  249. [...] If you don’t know why you should load your jQuery file from Google read about some of the advantages here. [...]

  250. K. Frost
    Feb 17 - 1:01 pm

    So this is one of those articles that encourages “webdesigners” to include contents from google.

    I’ve blocked these Google services for privacy reasons and it really p*ss*s me off seeing that more and more websites just don’t function any more because some web-wiener thought it would be nice to remotely include this stuff from big G. This is not only a bandwidth/latency issue: you’re stealing a bit of your users’ privacy and potentially locking out those who care about it; really, privacy decisions should be left to the user!

    Please also mind that there are many company/country networks that block google services or cross-domain javascript inclusions (for “security” reasons). For those of us that are bound by these restrictions, some websites just don’t work any more, because somebody followed your idea and thought it would be great to include the script directly from google.

    By the way, the “decreased latency” is outweighed by the fact that a browser loading the site has to resolve an additional domain name, wait for the reply, and only then make the request.

  251. [...] if you want to host it yourself but having it Google hosted it has advantages as talked about here and you would need to add the theme images and css files into a module in your package if self [...]

  252. [...] popular Javascript libraries, we can utilize Google CDN as a source to download the jQuery script. Dave Ward has a very good writing on the advantages of using Google [...]

  253. George
    Feb 28 - 9:30 am

    Do you know if Google or Twitter provides a similar service to host the Twitter Bootstrap library and CSS?

    I found that link used in Twitter Bootstrap.

    Do you think it is a good idea to depend on this link?

    • Dave Ward
      Feb 28 - 11:24 am

      No, I wouldn’t use that in production, because it’s not being served with a far future expires header.

  254. Codicode
    Mar 04 - 9:04 pm

    About the tracking of information by Google: if you are already using Analytics, AdSense, or other Google products on your site, Google is already getting it.

    • Kevin P. Rice
      Aug 21 - 4:43 pm

      Which is why I don’t use Google Analytics, AdSense, or Plus. I block most third-party content and am dismayed by websites that compromise user privacy. I thought this public CDN craze was really great a couple of years ago; now I avoid it.

      Websites that rely on public CDNs are like “that guy” who forwards emails without using BCC. Don’t be “that guy”.

      • AntoxaGray
        Aug 21 - 6:16 pm

        You, sir, are a conspiracy theorist.

        • Kevin P. Rice
          Aug 21 - 6:25 pm

          Ad hominem is fallacious argumentation. Please debate in rational and supportable terms.

          • Dave Ward
            Aug 21 - 7:45 pm

            I agree that we should keep name calling to a minimum, but AntoxaGray isn’t far off the mark here though. Between the +1 year far-future caching and cookie-less domain, there’s very little (if any) tracking data that Google’s going to extract from CDN usage. Even if they tried to, that intermittent, anonymized data is going to be worse than what they already have on almost everyone through other sources anyway. You should welcome the prospect of them muddying their tracking data with noise from CDN usage!

            • Kevin P. Rice
              Aug 21 - 8:48 pm

              Hi Dave. I don’t know what you mean by “worse”. Anything obtained includes the IP and Referer which, especially in synergy with the vast amount of information already held by Google, breaches user privacy. Any “hit” provides Google with new information that the person at IP address X is interested in a given site, and shapes what is known about that person.

              Google is primarily an Internet advertising company. Knowing that X has an interest in a particular site is extremely valuable information. Why do you think Google offers a CDN? Out of the goodness of their heart? Possible, but unlikely.

              Now, I can offer a cookieless far-future expires just as much as Google can. This reduces the advantages of a CDN leaving very little substantive gain. We can respectfully disagree on the importance of that gain vs. privacy, but the gain is minimal (or negative in some cases of DNS lookup or certain countries) and first-hit only. Add the complexity/risk of needing a local backup or development implementation and the gain gets smaller.

              Until you know someone who has been subject to the abusive and impenetrable side of Google it may sound like conspiracy to you. Or, you may find the perceived benefit worthwhile despite rare–but real–stories. I personally find otherwise, and I don’t base my opinion on conjecture or superstition; it is a personal weighing of attack surface vs. benefit.

              One additional issue which I feel muddies the whole idea of public CDNs is versioning. Caching myriad versions of jQuery (and other libraries) reduces the benefits of shared caching. IMO, a public CDN should push the latest (i.e., bug-free, most secure) code base. But, as you mentioned previously, requesting version “1.*” from Google defeats the far-future expires header. So we end up with the millions of unmaintained sites on the Internet all causing the caching of many years-old versions of jQuery. Needlessly. Yes, I know that breaking changes enters this discussion, but I think the whole public CDN idea needs further development to overcome versioning hell and the many other issues raised by commenters above.

              I do find your article to be incredibly well-done with a persuasive argument, technically accurate and informationally valuable. I was initially persuaded but reversed my opinion upon further reading and deliberation.

              Thanks for adding your knowledge to the web!

              • Dave Ward
                Aug 21 - 10:33 pm

                By “worse”, I mean “worse”.

                I don’t work for Google and I’m not privy to any of the algorithms the AdWords/AdSense teams use to target ads, but I do spend a lot of time working with web analytics. And without a tracking cookie to identify a particular browser, a simple IP/referrer combo is worthless with the prevalence of dynamic IPs and the multitude of devices that masquerade behind a single IP with NAT. In fact, that kind of data is worse than worthless because it pollutes your existing data if you try to include it in information gathered via cookie tracking. Google has far too many high-quality inputs to want to muddy that data with the CDN’s random noise.

                Until you know someone who has been subject to the abusive and impenetrable side of Google it may sound like conspiracy to you.

                If you have specific information to share instead of FUD and innuendo, please do so. If I had concrete information that showed the Google CDN to be a credible privacy threat, I wouldn’t recommend using it for one more second.

                • Kevin P. Rice
                  Aug 22 - 12:15 am

                  Dave, you might be 100% correct. Your suggestions are valuable, but I find that I can address the performance aspects 99% as well with local hosting, and have achieved top ranking and 100% PageSpeed without CDNs. I’m not seeing the last 1% (or whatever) gain being significant enough compared to the other issues, of which privacy is only one. It might be a few milliseconds or even a tenth, but it’s only the first hit. I might go the other way if it were 1/4 to 1/2 a second.

                  I don’t intend to derail the discussion by delving into Google’s abusive side. That is only one aspect of my consideration which is not single issue oriented. Google does have an occasional practice of unilaterally closing AdSense accounts and walking away with the cash balance without explanation, warning, evidence or appeal.

                  Specific info:

                  I will not get dragged into a debate on this topic. I provide it for your edification only. I understand the reasoning from both sides extremely well. If you knew someone who got sacked by Google for big bucks by this brick wall treatment then you might consider they aren’t the next best thing to sliced bread. They are a behemoth entity with their own interests in mind and will drop business partners on their head in a hot second and kick you out the back door without explanation, recourse, or so much as an email address or phone number to contact them.

                  Once again, it’s not about a single issue, so I’m not interested in diverting to the above topic–post on that guy’s site if you want. It’s the bigger multi-pronged picture of privacy, added complexity and the small gain vs. local control. I prefer local control for my web sites (and for my government).

                  Your article remains a valuable contribution to the Internet community and provides great feedback regarding the use of CDNs by many others that I would not have conceived of on my own. I hope the discussion of supporting and opposing views will spark ideas, bring improvements and further the industry. Thanks again.

  255. [...] imagine my surprise earlier today when I reading Dave Ward’s “3 reasons why you should let Google host jQuery for you“, when I noticed that his link to Google’s cache of jQuery was as follows: <script [...]

  256. [...] The code above can be inserted inside the head tag or at the very bottom of your document (which is normally recommended for better performance). In either case, it’s more efficient to include the css file before any javascript (usually inside the head tag). Notice that we’re using Google’s CDN to get jquery and jquery UI libraries (why?). [...]

  257. Drosan
    Mar 21 - 10:20 am

    The benefits being shared here and elsewhere are all theoretical. I just came across an in-depth analysis of using a CDN and whether it provides the expected performance benefits.

    • Dave Ward
      Mar 21 - 10:39 am

      Fragmentation is a legitimate issue, but you have to keep in mind that a single CDN reference in the top sites that he’s looking at will prime millions of browser caches each day. Coverage doesn’t need to be as thorough as you might intuitively think.

      Also, his analysis didn’t go far enough. When you confine your analysis to larger sites, it’s much more likely that they’ll have their own CDN in place and not care to use a third party. When I did something similar a couple years ago, but took my crawler all the way down the top 1,000,000 sites (by Alexa ranking), I found that references to the Google CDN increased the further down that long tail I went. So many WordPress themes, sites using H5BP, and other starting points use the CDN reference that its coverage among all sorts of new and/or niche sites is really great.

      Putting those two facts together, I stand by my recommendation to use Google’s CDN for jQuery.

  258. Henry
    Apr 02 - 9:33 pm

    I use WordPress and having my own server cope with loading jQuery really seemed to slow my site down. After going with your CDN approach, my site now loads a whole 2 seconds quicker (testing with Pingdom).

  259. [...] wanted to load jQuery from Google’s Content Delivery Network (why?). My initial attempt was to simply change the URL for the jQuery reference to the [...]

  260. Ivan
    Apr 05 - 12:53 pm

    With the average web page nearing 1MB these days, for me 28K is a negligible tradeoff for being absolutely sure that my website will be functional no matter what happens to the Google CDN.

    1. It may be down or inaccessible (just google for and see how many people have problems)
    2. It may be blocked by your firewall or your sysadmin
    3. It may be compromised to serve malicious code
    4. Google might decide to change the naming conventions (happened to my AppEngine site once)
    5. You cannot run your website on a local network


    • Dave Ward
      Apr 05 - 1:59 pm

      This page you’re commenting on, with social sharing widgets, ads, images, and nearly 400 comments, only weighs in at 365kb (at the time I’m writing this). jQuery 1.7.2’s 33kb is nearly 10% of that; not negligible.

      You’re right that it’s a good idea to prepare for situations where the CDN is inaccessible, but falling back to a local copy in those situations is simple.

      As for the security FUD, those doubts have been raised before, but I don’t find them credible. It’s far, far more likely that your own server will be hacked than Google’s hardened CDN. On the other hand, if the Google CDN were hacked, that would be front page news within minutes and remedied immediately, whereas orders of magnitude fewer eyeballs are watching the assets on your server for you. This is one case where you’re actually safer sticking with the herd, IMO.
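      As a sketch of how simple that fallback can be, here is the decision factored into a plain function (the local path js/jquery.min.js is hypothetical; in a real page you would document.write() the returned tag immediately after the CDN script tag):

      ```javascript
      // Sketch of the CDN-with-local-fallback pattern. If the CDN script
      // failed to define the jQuery global, build a <script> tag pointing
      // at a local copy; otherwise do nothing.
      function localFallbackTag(jQueryGlobal, localPath) {
        if (!jQueryGlobal) {
          // The closing tag is split so the string is safe inside a <script> block.
          return '<script src="' + localPath + '"></scr' + 'ipt>';
        }
        return null; // The CDN copy loaded fine; nothing to write.
      }
      ```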

      • Kevin P. Rice
        Aug 21 - 4:58 pm

        Google concerns me more than any hacker. Google is not our friend; they are a powerful entity which is even more empowered by their CDN. Other performance gains can be had without sacrificing user privacy.

        Self-hosting with a far future expires header is every bit as effective except for the first hit, and perhaps faster considering the separate DNS resolution required for Google’s CDN.

        I don’t find a modest one-hit performance gain to be sufficient reason to overlook privacy.
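        For reference, a far-future expires header like that can be set with a short Apache sketch (assuming Apache with mod_expires enabled; types and lifetimes are illustrative and should be adjusted for your own server):

        ```apache
        # Hypothetical .htaccess fragment: cache self-hosted JS/CSS for one year.
        <IfModule mod_expires.c>
          ExpiresActive On
          ExpiresByType application/javascript "access plus 1 year"
          ExpiresByType text/javascript "access plus 1 year"
          ExpiresByType text/css "access plus 1 year"
        </IfModule>
        ```

        With a one-year lifetime, versioned filenames (e.g. jquery-1.7.2.min.js) let an upgrade bust the cache.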

  261. Kevin P. Rice
    Apr 12 - 12:35 am


    When I first read this a couple years ago I was totally sold. Now, I’m switching back to local.

    (1) PRIVACY, PRIVACY, PRIVACY. Google is evil. They really are. And if they aren’t yet, they will be. Or they will be compromised by government.

    (2) I can set caching for one year also.

    (3) I’ve now set up my own static domain to address parallelism and cookies.


    (4) The latest server technologies automatically combine multiple .js and .css resources into one download, and minifies them and compresses them at the same time. There is now less penalty to maintaining customized jQuery and other resources locally.

    GREAT ARTICLE, NEVERTHELESS! THANKS! Your arguments remain powerful ones but I think the benefits can be reasonably reaped without sacrificing privacy.

    Best wishes!!!

    • Neil
      Aug 21 - 10:07 pm

      I’ve been hosting locally, and while reading this article I was beginning to feel a bit foolish, but you guys provide some good arguments both ways. I generally feel like I have more control if I’m hosting them on my side, and the privacy issue is definitely worth giving some thought. If I were pressed for bandwidth, however, I might just let them host it.

      I wonder if Google would rank your site higher if they gained more stats through your CDN links? I would if I were them and were evil~

      • Kevin P. Rice
        Aug 21 - 11:47 pm

        Diversifying assets on different domain names will increase your Google PageSpeed score and is part of the secret ranking algorithm. I have achieved 100% PageSpeed scores without CDN hosted assets, however. You can set up your own static site for .js and other assets to improve parallel HTTP GETs.

  262. IVI
    Apr 19 - 4:20 am

    For Android (and other phone) applications, the cost of data to load jQuery from any external (i.e., off-phone) host can make the article’s suggestion prohibitive.

    Of course, one could insist that an off-phone host be limited to those times (normally few) when it can be loaded via WiFi, rather than via the phone’s costly non-WiFi data connection.

    That could be implemented with a connection-type test and a decision to use the local jquery.js file whenever anything other than WiFi would be used to bring it in from elsewhere.
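    As a sketch of that decision (the Network Information API is non-standard and not available on every device, so the reported connection type may be undefined; the URLs and paths here are illustrative):

    ```javascript
    // Choose where to load jQuery from, based on a reported connection type.
    // On WiFi (or when the type is unknown, e.g. desktop browsers without the
    // API) use the off-phone CDN copy; otherwise avoid metered data and use
    // the bundled local copy.
    function chooseJQuerySource(connectionType) {
      var CDN_URL = '//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js';
      var LOCAL_URL = 'js/jquery-1.7.2.min.js';
      return (connectionType === 'wifi' || connectionType === undefined)
        ? CDN_URL
        : LOCAL_URL;
    }
    ```

    In a browser you would pass in something like navigator.connection && navigator.connection.type, where supported.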

  263. Arman P.
    Apr 21 - 8:00 am

    Good article. I think any web developer should use the Google CDN and understand its benefits.
    Though I would recommend expanding this approach a little, to be sure that jQuery is loaded even when the Google CDN is down (inaccessible); this approach also helps with local website development without an internet connection:

    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
    <script>!window.jQuery && document.write('<script src="js/jquery-1.7.2.min.js"><\/script>')</script>
  264. bedrijf
    Apr 27 - 4:07 pm

    In general, in order to keep page load times to a minimum, it’s best to call any Javascript at the end of the page, because if a script is slow to load from an external server, it may cause the whole page to hang.

    • TylerM
      Oct 30 - 2:13 pm

      This is off-topic, not to mention a nasty habit. Sticking script tags all over the place so they are only loaded when necessary is a terrible practice; sometimes you need a script halfway through a page, etc. It’s the kind of fingernails-against-chalkboard style that I see older developers using.

      Load them asynchronously if you really need to. For example, use a callback on the google load call.

  265. Nando Vieira
    May 14 - 2:49 pm

    You’re definitely not doing it wrong if you decide to host jQuery yourself. The Google CDN takes some time before adding a new release.

    I just prefer having it on my side, compressed, gzipped, and merged with other files.

  266. [...] generally use Google to host my jQuery – if you’re wondering why take a look at ‘3 reasons why you should let Google host jQuery for you‘ by Dave [...]

  267. Michael
    May 16 - 9:59 am

    It’s always good to fall back to the local copy if for some reason Google’s CDN cannot be accessed. yepnope is good for this:

          // Load the jQuery library, with a fallback if the CDN fails
          yepnope([{
            // Load the jQuery library from Google's AJAX API CDN
            load: '//ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js',
            // Determine if the local copy of the jQuery library needs to be loaded
            callback: function () {
              // If the jQuery library could not be loaded then load the local copy
              if (!window.jQuery) {
                yepnope('js/jquery-1.7.2.min.js');
              }
            }
          }]);
    You can download it from

  268. [...] | encosia Más información | Google Libraries API En Genbeta Dev | Google Libraries API, un repositorio de [...]

  269. [...] Ward (Encosia) has a comprehensive list of over 100 other mentions of the advantages of using this method.  If [...]

  270. Bravo.I
    Jul 05 - 3:33 am

    Thanks just so much! I was here for the link actually, and you made it really easy!

  271. [...] missing. There is no HTTP or HTTPS in the include statement. Why, you say? Well in an article from, the author (Dave Ward) explains the following: It’s not exactly light reading, but section 4.2 [...]

  272. Bookmarklet | Elitist
    Jul 13 - 11:58 pm

    [...] CDN? Google’s, for very good reasons! [...]

  273. [...] #Tips: You should let Google host jQuery for you rather than host it yourself! You may refer to article 3 reasons why you should let Google host jQuery for you. [...]

  274. [...] SharePoint document library, I prefer to use the Google Hosted CDN. Thanks to Dave Ward for putting 3 reasons to let Google Host jQuery for you, this also saves me from keeping multiple local versions of [...]

  275. [...] There are a couple good reasons for following the second method. The main reason is that it will be faster for the user. If you want a more detailed understanding of how it is faster for the user, be sure to check out this link. [...]

  276. Bosphorus
    Oct 19 - 4:05 am

    Thanks for the run-down. I read your article after installing the Use Google Libraries plugin for WordPress on a recommendation by Yoast.

  277. jobin
    Nov 12 - 11:13 pm

    Thank you for helping me implement jQuery in WordPress. Since I am a beginner in WordPress theme development, I can say that your article was excellent.

    • Baldwin
      Dec 07 - 2:25 am

      Yes, using Google-hosted jQuery indeed brings lots of benefits to your external website; for example, basically all DotNetNuke CMS sites implement it. :)

  278. [...] learn more on why that is a good idea and why is the http: in the URLs missing in a good article here. There are several themes that can be used for highlighting effect, or you can define your own, on [...]

  279. [...] be using Google’s jQuery CDN.  This post was inspired by Dave Ward‘s article, “3 Reasons why you should let Google host jQuery for you,” which deep-dives into various implementations too complex to be covered here.  A special [...]

  280. [...] inside the head tag). We’re using Google’s CDN to get jquery and jquery UI libraries (why?). The archive contains copies of both jquery and jquery UI (I’ve made a custom build of jquery UI [...]

  281. [...] meget mere på dette site inkl. gode tips til den bedste måde at implementere det [...]

  282. [...] the first <script> tag links to the jQuery file hosted by Google; it is a common practice nowadays and it is done for performance reasons; if you are interested to find out more about this, you can read a more comprehensive article. [...]

  283. Adam
    Feb 21 - 11:13 am

    FYI, every time I forget the path to the google CDN for jQuery, I search “why you should let google,” and then copy/paste the script tag you’ve got in the article.

    I’m vaguely curious how much traffic this page generates by other people doing the same thing :]

    • Dave Ward
      Feb 25 - 7:07 pm

      This page doesn’t get as much traffic as you might think. A rough average of 1,000 views on a weekday that doesn’t have any extra traffic for some reason or another.

      Funny that you mention using this page for reference. I do the same thing myself when I need to add the CDN reference to a page. That’s why I added the easy copy/paste text input at the top and try to keep it up to date with the latest version. Glad to hear that you find it handy too.

  284. Una
    Mar 05 - 9:56 am

    THANK YOU for this article. Helped me so much, your explanation of the difference between using the local version and hosted version.
    I wondered why that never worked when testing on my local computer.
    So simple. Insert the http: Now everything works perfectly.
    YOU are a STAR !!

  285. [...] noticed that it made a call to the theme’s storage of jquery 1.3x. Having read the article “3 reasons why you should let Google host jQuery for you” I opened up the theme’s header.php and replaced the reference with a call to the latest [...]

  286. Bob McCharlie
    Mar 19 - 3:10 am

    When it comes to institutional establishments such as schools, which have very heavy proxies and security settings, the administrator is often the one responsible for the firewall, and when web applications reference multiple domains it can be a real pain to chase them through the DOM to allow the site to work properly. Including all files on one domain simplifies this task, so whether to use a CDN or not depends solely on the market’s needs.

    • Dave Ward
      Mar 20 - 7:31 am

      I agree that if you’re 100% inside an Intranet, especially one with only a single physical location, it does make sense to skip using the CDN.

  287. [...] There are a couple good reasons for following the second method. The main reason is that it will be faster for the user. If you want a more detailed understanding of how it is faster for the user, be sure to check out this link. [...]

  288. [...] which case you should probably not use anything that terrible), but it has the added benefit of using the Google CDN: mo caching, less [...]

  289. [...] Note: the “missing” http: in that URL isn’t a mistake. You can learn more about that here. [...]

  290. Jimeny Crickbak
    Jun 26 - 6:20 pm

    Google’s CDN isn’t 100% reliable, so make sure your site falls back to a local copy.

  291. Johan
    Jul 11 - 2:33 pm

    A quick check revealed that it adds additional cookies to your upload and download, about 250K. Apart from wondering what Google might do with cookies from these static web pages (spy on you, I suppose), it adds an additional burden on upload. Cookies are transmitted as part of the header. This might not seem like much, but with the advent of mobile devices on metered connections it is less desirable.

    Apart from Google spying on you :-)

    Host frameworks on your server, it actually is easier.

    • Dave Ward
      Jul 11 - 4:55 pm

      The domain where the documentation is hosted does set routine tracking cookies, and the images on its 404 page also carry some tracking cookies.

      However, the CDN itself is served from a cookie-less domain. There are no cookies involved in requests/responses for the actual libraries hosted there, e.g. jQuery 1.9.1 as of a few minutes ago:

      Request URL:
      Request Method:GET
      Status Code:200 OK
      Request Headers
      User-Agent:Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.71 Safari/537.36
      Response Headers
      Cache-Control:public, max-age=31536000
      Content-Type:text/javascript; charset=UTF-8
      Date:Thu, 11 Jul 2013 05:45:49 GMT
      Expires:Fri, 11 Jul 2014 05:45:49 GMT
      Last-Modified:Fri, 08 Feb 2013 15:35:10 GMT
      X-XSS-Protection:1; mode=block

      I definitely wouldn’t recommend any CDN that used cookies, for performance reasons alone.

  292. [...] Remember, you can use this technique to also add custom JavaScript files to your pages. You could also add jQuery plugins to your theme with ease, or if you’d prefer Google to host jQuery for you. [...]

  293. M@
    Jul 17 - 8:27 pm

    Was just here for the link… nailed it. Beers to you.


  294. […] Note: the “missing” http: in that URL isn’t a mistake. You can learn more about that later in this post. […]

  295. […] Why use a CDN? […]

  296. […] Why use a CDN? […]

  297. […] along with code snippets of how to use Google’s CDN to host JQuery libraries for free, check out Encosia. There are also Google CDN plug-ins for WordPress sites, […]

  298. Susan Drakenviller
    May 30 - 1:02 pm

    Interesting article and discussion. However, I haven’t seen a counter response on the ‘man in the middle attack’ issue raised by Chris M. and Travis Cunningham. Am I missing something here? Because it seems to me quite a powerful argument against using a CDN.

  299. […] three most commonly cited reasons to dequeue WordPress’ bundled version of jQuery in favor of Googles […]

  300. […] code will work whether you load your own hosted copy of jQuery, a copy from Google CDN or WordPress’ default […]

Leave a Reply

Mobile Theme