Blazing the Monkey with AJAX: Caching AJAX JavaScript libraries for sites where data changes often


Now that a lot of the logic behind Shockey Monkey has been solidified, I decided it was time to optimize the interface performance a little bit. The truth is, there is a lot of minor updating that goes on behind the scenes that should not require the entire page to be reloaded. Enter AJAX.

I was talking with Pablo the other night about XHR and told him I was trying to use the Script.aculo.us libraries but couldn’t really wrap my head around some of the stuff. So he showed me his blog, which is tricked out with the Yahoo UI library (which I find ugly, but to each his own), and explained to me how the whole thing works. Funny how things fall into place when someone explains them to you.

So for the past day I have been AJAX crazy, trying to nail down some of the effects and events. One of my cornerstone values for Shockey Monkey is that it be lightning fast. If I have to wait to get or input data I’ll open up Notepad (it used to be Outlook, before 2007 started going non-responsive all the time). And even though it’s a web app, I have to say that I’ve done a really good job of keeping it very lightweight and fast.

Speeding it up by slowing it down…

So the AJAX premise is that you can speed up the interface by doing partial page rendering. The smoke and mirrors of it is that in order to enable all the cool effects, events and transitions you have to load up a very hefty JavaScript file (mine comes out to about 268 KB).
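The partial-rendering idea itself is simple. A minimal sketch with a raw XMLHttpRequest (no library involved; the URL and element ID here are just placeholders, not anything from Shockey Monkey) looks something like this:

    <div id="ticket-summary">Loading...</div>

    <script type="text/javascript">
    // Fetch a fragment of HTML and drop it into the <div>
    // without reloading the whole page.
    function loadInto(url, elementId) {
        var xhr = new XMLHttpRequest();
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                document.getElementById(elementId).innerHTML = xhr.responseText;
            }
        };
        xhr.open("GET", url, true);
        xhr.send(null);
    }

    loadInto("/tickets/summary", "ticket-summary");
    </script>

Libraries like Script.aculo.us (via Prototype) and the Yahoo UI library wrap this same call, and that wrapping is where all the heft comes from.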

So yes, the partial screen rendering will be very fast because I am only loading some text into a <div>, but what happens when users go from page to page? You guessed it: the JS reloads. Because the documents themselves change often, I have a no-cache directive for the entire page, meaning every time someone clicks on something that triggers a page reload, the entire mountain of JavaScript gets pushed down again. Click through four pages and you’ve pulled down over 1 MB, which would effectively make the entire web app crawl and put quite a sticker shock on the bandwidth bill at the end of the month.
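For reference, that kind of no-cache directive is usually done with Apache’s mod_headers. A sketch of one common form (assuming mod_headers is enabled; the file extensions are just examples, not my actual setup):

    ### force browsers to revalidate the HTML pages themselves
    <FilesMatch "\.(html|php)$">
        Header set Cache-Control "no-cache, must-revalidate"
    </FilesMatch>

The trouble is that when the directive covers the whole page load, the script includes get dragged along for the ride.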

Guessing the right Google search string…

There is no way that Google, Yahoo and all the other big-league Web 2.0 players are constantly pushing down megs and megs of data during each session. Let’s face it, the events and effects code lives in libraries that will rarely change, so why make it take the pipe trip every time? After a little bit of searching I found a way to do selective JavaScript caching on Ajaxian. The answer is actually in the comments, but the question is outright scary:

Our users have to download a 2MB ajax application every time they visit our website. It would be much faster if we could cache 90% of our application in the browser cache using this LRU Cache so that the next time the user visits our website, they only need to download a small 10K file!

The answer comes further down from Vasili Sviridov:

    ### activate mod_expires
    ExpiresActive On
    ### Expire javascript 4 weeks (about a month) from when it's accessed
    ExpiresByType text/javascript "access plus 4 weeks"

Dumping that into the .htaccess file to override the default server config (hi Schrag) takes care of the problem completely. You just have to make sure you’re serving your JavaScript as text/javascript from an external file, and not as inline script inside your text/html page, and voilà.
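In other words, the library has to come down as its own cacheable file, referenced with the usual external script tag (the path here is just a placeholder):

    <script type="text/javascript" src="/scripts/effects.js"></script>

And if the server does not already map .js files to the right MIME type, one more line in the same .htaccess covers it:

    ### make sure .js files go out as text/javascript so ExpiresByType matches
    AddType text/javascript .js

Inline script embedded in the page is served as part of the text/html response, so the ExpiresByType rule would never match it and it would be re-downloaded with every page view.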