Note: This post is part of a long-running series of posts covering the union of jQuery and ASP.NET: jQuery for the ASP.NET Developer.

Topics in this series range all the way from using jQuery to enhance UpdatePanels to using jQuery to completely manage rendering and interaction in the browser, with ASP.NET acting only as a backend API. If the post you're viewing now is something that interests you, be sure to check out the rest of the posts in this series.

When you’re developing client-side applications, a problem you’ll almost inevitably have to deal with is how to work with services that reside outside your website’s domain. Though many modern APIs do support JSONP, a clever workaround that somewhat mitigates the cross-domain problem, JSONP brings problems of its own.

Worse, if you encounter an API with no JSONP support, the cross-domain barrier quickly becomes a formidable one. CORS is slowly becoming a viable alternative, but it requires that the remote service support it via special HTTP headers, and browser support for CORS is still not ubiquitous.

Until CORS is more broadly supported, an alternative solution is to bounce cross-domain requests through the web server that hosts your website. In ASP.NET, the best tool for implementing that sort of middleman endpoint is the HttpHandler.

In this post, I’ll show you how to create an HttpHandler to service cross-domain requests, how to use jQuery to communicate with the handler, and an example of one improvement that this approach makes possible.


An example remote API

To focus on an example that’s already familiar to many, I’m going to use Twitter. Twitter’s API does support JSONP, which is a viable alternative for consuming it across domains. In fact, the Twitter status that you see in my sidebar to the right was retrieved from Twitter’s API via JSONP.

However, not every service supports JSONP, its third-party script injection mechanism is sometimes problematic, and using JSONP robs us of niceties like local caching. So, for the sake of a good example, let’s find a way to use the Twitter API on the client-side without resorting to JSONP.

Specifically, I’m interested in querying the service for my last few status updates. The Twitter API request to accomplish that looks like this:

http://api.twitter.com/1/statuses/user_timeline.json?id=Encosia

Twitter will respond to that with a JSON array of objects representing my (or your) last 20 tweets, which is exactly what we’re after.
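
Each element in that array is a status object carrying far more fields than we’ll need here. Abbreviated down to the handful of fields used later in this post, and with placeholder values, each object’s shape looks roughly like this:

[
  {
    "created_at": "Wed Aug 03 12:34:56 +0000 2011",
    "text": "The text of one tweet goes here",
    "user": {
      "screen_name": "Encosia"
    }
  }
]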

The best tool for the job: HttpHandler

If you’re accustomed to using ASP.NET’s page methods and ScriptServices to facilitate communication between client and server, those tools begin to look like a hammer that matches every JSON-shaped nail in sight. However, when simply relaying an external API’s JSON through to the client, they often add unnecessary overhead and complexity.

Rather, a lower-level tool is more appropriate in this case.

HttpHandlers are one of ASP.NET’s most under-utilized tools. They’re simple to implement and allow you to handle requests closer to the metal than WebForms pages or MVC controller actions.

One place in particular where HttpHandlers shine is where you would otherwise consider writing Response.Write statements in a WebForms page’s code-behind. This anti-pattern of using ASPX’s code-behind to get closer to the metal looks similar to approaches that you’ll see on some other platforms, such as PHP, but is not equivalent.

Unfortunately, even if you don’t use WebForms controls or ASPX markup at all, executing that low-level code from an ASPX page’s code-behind requires that every request filter through the full page life cycle. That means even the simplest request still has to percolate all the way from PreInit to Unload, adding needless overhead.
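
To make that anti-pattern concrete, here’s a rough sketch of the sort of hypothetical code-behind I’m describing (don’t do this):

protected void Page_Load(object sender, EventArgs e) {
  // Hijacking a full WebForms page to emit raw JSON.
  Response.ContentType = "application/json";
  Response.Write("{\"message\":\"Hello World\"}");

  // Ending the response keeps the page's own markup out of the output,
  //  but the request still had to run through the page life cycle
  //  just to reach this point.
  Response.End();
}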

Instead, the HttpHandler is where you should write the sort of code that ultimately boils down to Response.Write calls.

Choosing the right handler type

A tricky issue when you’re writing your first HttpHandler is that Visual Studio presents you with two templates, “ASP.NET Handler” and “Generic Handler”:

The add item dialog presents two choices of HttpHandler templates

Both are similar, but the “ASP.NET Handler” template’s approach requires modifying your web.config to configure which URL your handler accepts requests at. Mucking around in the web.config isn’t terribly difficult, but it’s extra friction that makes the process less approachable.
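
To give you an idea of what that friction looks like, registering a handler that way involves adding something roughly like this to web.config (a sketch for IIS 7’s integrated pipeline; the handler name, path, and type here are hypothetical, and the exact section varies with your IIS version):

<system.webServer>
  <handlers>
    <add name="TwitterProxy" verb="*" path="twitter-proxy"
         type="MyApp.TwitterProxyHandler" />
  </handlers>
</system.webServer>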

In the spirit of keeping things simple, let’s stick with the more traditionally file-based “Generic Handler”.

Getting started with your first HttpHandler

After choosing that template, specifying a name, and adding the new file to your site, you’ll end up with a bit of boilerplate code that includes this method:

public void ProcessRequest(HttpContext context) {
  context.Response.ContentType = "text/plain";
  context.Response.Write("Hello World");
}

If you start the site up in Visual Studio and then request your newly created HttpHandler in a browser (Handler1.ashx, if you accept the default name), you’ll see “Hello World”, just as you’d expect.
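
For reference, the complete file that the “Generic Handler” template generates looks roughly like this (the exact boilerplate varies a bit between Visual Studio versions and project types):

<%@ WebHandler Language="C#" Class="Handler1" %>

using System;
using System.Web;

public class Handler1 : IHttpHandler {
  public void ProcessRequest(HttpContext context) {
    context.Response.ContentType = "text/plain";
    context.Response.Write("Hello World");
  }

  // Returning true here would let ASP.NET reuse one instance of the
  //  handler across requests; the template defaults to false.
  public bool IsReusable {
    get { return false; }
  }
}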

That’s not very impressive yet, but the response you saw made its way to your browser without touching WebForms’ page life cycle or filtering through ASP.NET MVC’s routing engine and action filters. While those things are worthwhile niceties for the majority of your application, they’re unwanted overhead when all you need is to efficiently relay some content through the server.

Bouncing a request to Twitter through the HttpHandler

To adapt an HttpHandler for relaying requests to the Twitter API, we can use .NET’s handy WebClient class to make the request to Twitter’s API, and then return the result back through as the handler’s response:

public void ProcessRequest(HttpContext context) {
  WebClient twitter = new WebClient();
 
  // The base URL for Twitter API requests.
  string baseUrl = "http://api.twitter.com/1/";
 
  // The specific API call that we're interested in.
  string request = "statuses/user_timeline.json?id=Encosia";
 
  // Make a request to the API and capture its result.
  string response = twitter.DownloadString(baseUrl + request);
 
  // Set the content-type so that libraries like jQuery can 
  //  automatically parse the result.
  context.Response.ContentType = "application/json";
 
  // Relay the API response back down to the client.
  context.Response.Write(response);
}

That code simply makes an HTTP request to the Twitter API and blindly bounces the result back through as the HttpHandler’s response. For the time being, everything is hard-coded, but we’ll improve on that soon enough.

Using the handler proxy on the client-side

With our web server doing the heavy lifting, using this server-side proxy to make a remote request is trivial:

$.getJSON('TwitterProxy.ashx', function(tweets) {
  // Call a magical function that does all the presentational work.
  displayTweets(tweets);
});
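
That “magical” displayTweets function is outside the scope of this post, but a minimal sketch might look something like this, assuming a #tweets list element on the page and using the text and user.screen_name fields from the API response:

function displayTweets(tweets) {
  // Append a simple list item for each status object in the response.
  $.each(tweets, function(i, tweet) {
    $('<li>').text(tweet.user.screen_name + ': ' + tweet.text)
             .appendTo('#tweets');
  });
}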

The jQuery code here is actually identical to what you’d use when requesting the API via JSONP. Whether that response is truly being fulfilled by the specified URL or it’s being relayed through our HttpHandler, it’s all the same to the jQuery code on the client-side.

As you’ll see when we add caching, this can easily be exploited for good.

Mixing things up with QueryString parameters

The hard-coded approach works well enough, but what if we wanted to be able to query any Twitter account’s recent updates instead of being limited to just that boring Encosia character?

Since HttpHandlers receive an instance of the current HttpContext as the parameter to their ProcessRequest method, it’s easy to access QueryString parameters and react accordingly. For example, this would allow us to request any Twitter account’s timeline via an id parameter on the QueryString:

public void ProcessRequest(HttpContext context) {
  WebClient twitter = new WebClient();
 
  string baseUrl = "http://api.twitter.com/1/";
 
  // Extract the desired account ID from the QueryString.
  string id = context.Request.QueryString["id"];
 
  // Make a request to the API for the specified id.
  string request = "statuses/user_timeline.json?id=" + id;
 
  // Same as before, from here on out:
  string response = twitter.DownloadString(baseUrl + request);
 
  context.Response.ContentType = "application/json";
  context.Response.Write(response);
}

Now it works exactly the same way as before, but we can choose which Twitter account’s timeline is requested. For example, this URL would request Scott Guthrie’s latest tweets:

TwitterProxy.ashx?id=ScottGu

Supplying parameters with $.getJSON

To pass this new parameter in from the client-side, you could handcraft the entire URL including the appropriate QueryString. Even better though, $.getJSON has an optional “data” argument that accepts a JavaScript object and converts it to QueryString parameters:

$.getJSON('TwitterProxy.ashx', { id: 'ScottGu' }, function(tweets) {
  displayTweets(tweets);
});

jQuery will automatically URL-encode the parameters you specify in the “data” argument and properly assemble them into the final URL to be requested:

Screenshot of the HttpHandler request generated by the jQuery code above.

Which is exactly what we need it to do.

Using a configuration object like this is cleaner than manually concatenating a string together and makes it easier to vary the parameter at runtime.
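
For comparison, the handcrafted alternative would look something like this, with the URL-encoding left entirely up to you:

// Works, but manual concatenation and encoding is easy to get wrong.
var url = 'TwitterProxy.ashx?id=' + encodeURIComponent('ScottGu');

$.getJSON(url, function(tweets) {
  displayTweets(tweets);
});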

Improving performance with server-side caching

An advantage the HttpHandler proxy has over CORS and JSONP is that you can perform any arbitrary server-side processing you wish, both before and after the remote service responds. A great way to take advantage of that is to add a server-side caching layer.

Server-side caching will reduce how often requests actually trigger API calls and can significantly improve performance for requests that are already cached. A caching middleman like this is especially valuable when dealing with rate-limited APIs like Twitter’s.

Let’s say that we wanted to cache Twitter responses for up to five minutes, for example:

public void ProcessRequest(HttpContext context) {
  // This will be the case whether there's a cache hit or not.
  context.Response.ContentType = "application/json";
 
  // Extract the desired account ID from the QueryString, same as before.
  string id = context.Request.QueryString["id"];
 
  // Check to see if this account's tweets are already cached,
  //   then retrieve and return the cached value if so.
  // 8/3/11: Updated with more robust test, thanks to ctolkien.
  object tweetsCache = context.Cache["tweets-" + id];
 
  if (tweetsCache != null) {
    string cachedTweets = tweetsCache.ToString();
 
    context.Response.Write(cachedTweets);
 
    // We're done here.
    return;
  }
 
  WebClient twitter = new WebClient();
 
  // Move along; nothing to see here. The concatenation is just
  //  to avoid horizontal scrolling within the meager 492
  //  pixels I have to work with here.
  string url = "http://api.twitter.com/1/statuses/" +
               "user_timeline.json?id=" + id;
 
  string tweets = twitter.DownloadString(url);
 
  // This monstrosity essentially just caches the WebClient result
  //  with a maximum lifetime of 5 minutes from now.
  // If you don't care about the expiration, this can be a simple
  //  context.Cache["tweets-" + id] = tweets; instead.
  context.Cache.Add("tweets-" + id, tweets,
    null, DateTime.Now.AddMinutes(5),
    System.Web.Caching.Cache.NoSlidingExpiration,
    System.Web.Caching.CacheItemPriority.Normal,
    null);
 
  context.Response.Write(tweets);
}

Adding the intermediate cache results in a tremendous performance improvement after the first request:

Screenshot of an initial uncached request to Twitter and then the subsequent, cached requests

With the server able to respond immediately to any request that falls within the five-minute caching window, subsequent $.getJSON requests are an order of magnitude faster!

Perhaps even more importantly in the case of Twitter, these four refreshes counted as only one API call against my hourly rate limit.

Conclusion

Using HttpHandlers as server-side proxies turns out to be a simple way around the pesky cross-domain restrictions that we’ve all run into from time to time. All said and done, using an HttpHandler to proxy third-party requests takes only a few lines of code, but offers nearly unlimited flexibility.

In addition to the obvious benefit of getting around the cross-domain restriction, bouncing requests through your own server potentially has a range of other benefits, including:

  • Error handling – This approach not only surfaces unhandled errors on the WebClient request back to the client-side as failed requests, but it also gives you the ability to enhance the error handling with your own sanity checks and constraints (see the sketch just after this list).
  • Caching – As shown in this post’s final example, you can very easily interject your own caching layer for requests passing through the HttpHandler proxy. That’s especially useful when working against rate-limited or potentially slow/flaky APIs (like Twitter’s).
  • Security – When you’re accustomed to server-side programming, the revealing nature of client-side JavaScript can be unnerving. Learning to appropriately partition sensitive algorithms and data between client and server is key to mitigating that issue. Along those lines, moving the remote request to code running on your server is one way to keep sensitive information like API keys and passwords safely hidden from view-source and client-side developer tools.
  • Reliability – One of JSONP’s less obvious drawbacks is the fact that it relies on injecting a third-party script. However, your users may be using something like NoScript to purposely block third-party scripts, effectively shutting down your ability to use JSONP. Even if you prefer JSONP in most cases, a local server-side proxy can be helpful as a fallback in case of unexpected JSONP failures.
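
For instance, a sketch of the error handling idea from the first bullet above might wrap the WebClient call in the handler like this (returning a 502 status is just one reasonable convention, not something the Twitter API dictates):

try {
  string tweets = twitter.DownloadString(baseUrl + request);

  context.Response.ContentType = "application/json";
  context.Response.Write(tweets);
}
catch (WebException) {
  // Twitter is down, slow, or rate limiting us. Surface that to the
  //  client with an error status that $.ajax's error callback can
  //  easily detect, instead of an ASP.NET error page.
  context.Response.StatusCode = 502;
  context.Response.ContentType = "application/json";
  context.Response.Write("{\"error\":\"The Twitter API request failed.\"}");
}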

That’s not to say there are no downsides to this approach. When you’re using an HttpHandler proxy, keep in mind that it can be slower, since every request now involves two connections in series instead of a single, direct one. You also lose the ability to make the remote request with the user’s cookies for that third-party site attached, which is helpful in some scenarios.

Overall, the server-side proxy is a very useful tool to have in your toolbox. I hope this post has introduced you to the approach and/or given you better insight into how you can use HttpHandlers to your advantage.

Get the source

If you’d like to browse through a complete working example of what’s been covered in this post, take a look at the companion project at GitHub. Or, if you’d like to download the entire project and run it in Visual Studio to see it in action yourself, grab the ZIP archive.

  • HttpHandler-Proxy on GitHub
  • HttpHandler-Proxy.zip