Improving Ajax Performance by Being Selective

I spent a few days over the holidays messing around with Rails 2.0 and the Google Maps API.  Sure, these technologies aren't the newest.  GMaps will be out of the terrible twos this February (2008, in case you're reading this in an archive), and while Rails 2.0 shipped in December 2007, most of the features that are new for me are related to resources, which shipped in 1.2, and DHH talked about in his keynote at RailsConf 2006.  That's not a typo -- it was time I caught up.  We do plenty of JavaScript in Komodo, but I don't get out in the HTML world as much as I used to.

I used the Apress book Beginning Google Maps Applications with Rails and Ajax as my guide through the GMaps API.  There isn't much Rails in this book; no surprise, as I found out afterwards that the authors also have a book called Beginning Google Maps Applications with PHP and Ajax, and I suspect the books aren't too different.  If this were going to be a review, I'd point out the total lack of testing in the text, but I'll skip that.  What I found interesting was that the code examples acted as a good springboard to getting me going, and left plenty of open-ended exercises of my own devising, one of which I'm writing up here.

I did all the work on an old laptop, mainly so I could take it to the local coffee shop a couple of times.  It's a slow machine, and I noticed performance problems while working on the part that interested me most, chapter 7, which deals with large data sets.  In this case, the large data set is a table of the locations of about 133,000 FCC-regulated antennae.  The samples in the book deal with Hawaii, which is conveniently isolated from any other districts, and also has just a few hundred of the towers.  Even when I zoomed out to contain all the islands in my view, there wasn't much of a delay.

The problem became much worse when I moved the map over to a more densely populated area, like the Atlantic Seaboard or Great Lakes states.  Every time I nudged the map, the machine would freeze as it redrew the latest set of data.  I was even using the clustering algorithm to reduce a thousand points or so to 25.  From the logs I could see that the image the client was rendering might lag the most recent request by as much as ten requests.  No wonder I was seeing so much beach ball.  It didn't take much code to make sure that the client only rendered the most recent request.  The code below uses the GMap Ajax namespace, but it's similar to raw Ajax.

var request_tag = 0;

function doSomething() {
    var url = "";
    request_tag = (new Date()).valueOf();
    url += "&tag=" + request_tag;
    var request = GXmlHttp.create();'GET', url, true);
    request.onreadystatechange = function() {
        if (request.readyState == 4 && request.status == 200) {
            var edata = eval("(" + request.responseText + ")");
            if (edata.requestTag == request_tag) {
                // update the display
            }
        }
    };
    request.send(null);
}

This code saved the client from writing soon-to-be-obsolete data to the screen.  The only change I needed to make to the Rails controller was to add a requestTag item to the return value, like so:

  # render :text=>{:result => resultData}.to_json   # Old code
  render :text=>{:requestTag => params[:tag], :result => resultData}.to_json

For those of you following along with the PHP book, the change is probably trivial there as well.

I've used this pattern in asynchronous coding before -- tag your requests, so you can tell whether a request is still useful when it finally arrives.  This is easy to do in this instance, since I assume that for this class of requests, only the most recent one is valid.  The JavaScript Date function is perfectly fine as a tag generator here: millisecond timestamps taken on user-driven events are increasing and, for this purpose, unique enough.
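Distilled out of the GMaps-specific code, the pattern looks something like this (a minimal sketch with names of my own; the real version embeds the tag in the request URL and reads it back out of the JSON response):

```javascript
// Tag-and-discard: remember the tag of the newest request, and only
// render a response that still carries that tag.
var newestTag = 0;

function startRequest() {
    // A millisecond timestamp is unique enough for user-driven events.
    newestTag = (new Date()).valueOf();
    return newestTag;   // embed this in the request URL
}

function shouldRender(responseTag) {
    // Anything but the newest tag means a later request superseded this one.
    return responseTag == newestTag;
}
```

The response handler then becomes a one-line guard: if shouldRender() says no, throw the data away without touching the DOM.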

That didn't solve all the problems.  While I was getting fewer beach-ball interludes in Firefox, my server was running non-stop.  I couldn't pan from one place to another without giving the server conniptions, and most of the data sets it had worked so hard to calculate would be summarily thrown away by the client.  So I added a delay on the client side, complicating the code only slightly: there's now a pause between when a request is formulated and when it's finally sent to the server.  The first cut looked like this:

function setup() {
    // ...
    GEvent.addListener(map, 'zoomend', function() {
        setTimeout(updateMarkers, 1000);
    });
    GEvent.addListener(map, 'moveend', function() {
        setTimeout(updateMarkers, 1000);
    });
}
The updateMarkers() function is what triggers the Ajax request above.  I changed the code so that the hooks call updateMarkers() immediately, but the actual request fires only after a delay, and only if no newer request has been formulated in the meantime.  I now used these global variables:

var request_tag = 0;    // Track the most recent tag sent to the server
var pendingRequest;     // Formulated request
var checkDelay = 1000;  // Experiment with this value
var map;

function setup() {
    map = // ... initialize global google map object
    // ...
    GEvent.addListener(map, 'zoomend', updateMarkers);
    GEvent.addListener(map, 'moveend', updateMarkers);
    updateMarkers(true);   // Start a request once we're done
}

function updateMarkers(do_now) {
    if (typeof(do_now) == "undefined")  do_now = false;
    var currRequestTag = (new Date()).valueOf();
    // this saves the current state of the google map if we want to go
    // with it later
    pendingRequest = {tag: currRequestTag, bounds: map.getBounds()};
    if (do_now) {
        finishUpdatingMarkers(currRequestTag);
    } else {
        setTimeout(finishUpdatingMarkers, checkDelay, currRequestTag);
    }
}

function finishUpdatingMarkers(expectedTag) {
    if (pendingRequest.tag != expectedTag) {
        // a newer request has been formulated since; skip this one
        GLog.write("tossing tag " + expectedTag);
        return;
    }
    request_tag = expectedTag;  // update the global to check ajax results
    var url = (""
               + "&tag=" + request_tag);

    // retrieve the points using Ajax
    var request = GXmlHttp.create();
    // ... rest is the same as above
}

That worked much better.  While I wormed my way through a map, the server wasn't making unnecessary calculations.  When I paused, the request was delayed for a second, but then the results appeared within the next second.

I played a bit more with this, adding a moving-average calculation to the client side, so it kept track of how long it was taking to render the current screen.  When I was in areas without many towers, I ended up waiting less time before sending a request, and presumably the server would need less time as well.  If I panned into an urban area, the client would encounter more resistance, and the hits to the server would drop commensurately.
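Here's roughly what that moving-average tweak looked like (a sketch with names of my own; checkDelay stands in for the same global the event hooks consult, and the multiplier and clamps are starting points to experiment with, not measured values):

```javascript
var renderSamples = [];     // most recent render times, in msec
var MAX_SAMPLES = 5;        // how many samples the moving average spans
var checkDelay = 1000;      // msec; the same global updateMarkers() uses

function recordRenderTime(msec) {
    // Keep a sliding window of the last few render times.
    renderSamples.push(msec);
    if (renderSamples.length > MAX_SAMPLES) {
        renderSamples.shift();
    }
    var sum = 0;
    for (var i = 0; i < renderSamples.length; i++) {
        sum += renderSamples[i];
    }
    var avg = sum / renderSamples.length;
    // Cheap screens (sparse areas) shorten the delay; expensive ones
    // stretch it.  Clamp it so the delay stays within reason.
    checkDelay = Math.min(3000, Math.max(250, avg * 4));
}
```

The caller just wraps the marker-drawing loop with a pair of Date() calls and feeds the difference to recordRenderTime().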

Next, I would modify the server so it had a better idea of how long the client needed to render a screen, and could adjust the amount of detail it sent back.  For example, if the server knew that the client needed an average of 5 msec to render one marker, it might decide not to start clustering until it had more than 100 items to return, while slower machines that reported an average of 20 msec per marker would get more clusters.
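The decision rule I have in mind is simple; here it is sketched in JavaScript for illustration (in practice it would live in the Rails controller, and the render budget is an assumption of mine, implied by the 5-msec/100-marker example above):

```javascript
// Assumed budget: how long one screen render is allowed to take.
var RENDER_BUDGET_MSEC = 500;

function clusteringThreshold(msecPerMarker) {
    // Don't start clustering until plain markers would blow the budget.
    return Math.floor(RENDER_BUDGET_MSEC / msecPerMarker);
}
```

A client reporting 5 msec per marker gets a threshold of 100 raw markers before clustering kicks in; one reporting 20 msec gets clustered results past 25.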

If you've read this far, you've earned yourself an example.  You probably know about geocaching, the hobby where people get rid of old yogurt containers and coffee cans by painting them brown, putting some old McDonald's toys and extra baseball cards in them (wrapped in plastic), hiding them in the woods, recording the coordinates, and posting them on the site.  It's actually a fun activity, but my kids never understood that the hunt was the thing, and any prizes were incidental.  The site was recently updated so you can see a map of where the various kinds of caches are hidden.  It's made the site cool enough that my kids are reconsidering heading out with me.  When the weather gets better.

You don't have to be a member of the site to see what I mean.  Bring up, type 1600 Pennsylvania Avenue, Washington, DC in the address box, and you'll see a host of different buttons on the map, each of which represents a different kind of hidden cache (no, those ghostlike icons surrounding the Capitol and the Mall represent puzzles that must be solved, not lobbyists' offices).  Now pan east until you find route 295 (Anacostia Fwy, turning into the Baltimore Washington Parkway).  Start moving northeast following highway 295 to Baltimore, and you'll see that the rendering on the map slows down periodically, with a "Retrieving geocaches" box appearing.  Zoom out one level, and keep following highway 95 (an area dense with both people and caches), and now panning becomes almost impossible.  Obviously while I'm panning I'm not really interested in seeing where the caches are, or in the fine detail the site offers.  This is on a fast machine, by the way, a quad-core 2.4GHz desktop machine, not my gerbil-powered laptop.  The geocaching people might want to consider adding clustering of dense pockets of caches as well, either on the server side or the client.