Performance Tricks, Tweaks and Lessons
I’ve spent the last couple of days trying to make a particular page (actually, a widget) of ours load in 2 seconds. It was taking upwards of 10. And now I’m happy to say it is consistently under 2 seconds. Here is what I did to solve the problem.
- Isolate – Part of a larger performance problem affecting our widget was the database periodically getting smoked as a result of a design decision in another product that shared the mysql instance. By recreating our product stack on a different machine, that issue can no longer affect the widget. This comes at the expense of extra server maintenance and deployment headaches, but that is a cost we’re willing to pay.
- Compress – Since this is a widget, everything non-image related is either a css file or a javascript one. Because the browser, not a human, is what parses and interprets these files, they do not need to be nicely human readable. Enter minification. I’m using the YUI Compressor to compress both css and js files upon deploy. The timing of this is key: minified files suck when you are developing, and you don’t want minification to be a manual step someone has to remember after updating a file; that’s just asking for missed revisions and confusion.
Here are the relevant capistrano tasks to do this minification:

```ruby
task :default do
  update
  minify
end

task :minify do
  run "cd #{current_path} && find . \( -name '*.js' \) -and \! \( -name '*-min.js' \) -print -exec bash -c '/usr/local/java/bin/java -jar jars/yuicompressor-2.4.2.jar --type js -o {} {}' \;"
  run "cd #{current_path} && find . \( -name '*.css' \) -and \! \( -name '*-min.css' \) -print -exec bash -c '/usr/local/java/bin/java -jar jars/yuicompressor-2.4.2.jar --type css -o {} {}' \;"
end
```
Those paying close attention will see that I don’t try to compress a file whose name ends in -min. Those files are already minified and are usually 3rd party libraries we are using.
- Compress some more – Through minification the files being sent down the pipe are smaller (which means faster to deliver, all things being equal), but we can make them even smaller. All the major, modern browsers and web servers support transmission of compressed data. Here is how you do it for Apache and IIS.
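For the Apache side, a minimal sketch using mod_deflate looks something like this (the module must be enabled; the exact list of MIME types is up to you):

```apache
<IfModule mod_deflate.c>
  # Compress text-based responses (html, css, js, json) before sending them
  AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json

  # Workarounds for ancient browsers that mishandle compressed responses
  BrowserMatch ^Mozilla/4 gzip-only-text/html
  BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
</IfModule>
```

Images are deliberately left out of the list; formats like png and jpeg are already compressed, so deflating them again wastes CPU for no gain.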
- Cache – Since client side issues are generally considered to have the biggest impact, any further speed improvements were now going to involve code changes. The first of these was to see what the various calls were doing and whether we could prevent them from hitting anything. In this case we were generating an xml file out of an activerecord object and then converting that to json. And the result was always the same. Always the same is a big flashing red sign that says ‘cache here!’, so both the xml and json were put into memcached. This was actually the 2nd largest performance improvement and will pay huge dividends in the future. The trick here was to cache both items. Originally we planned on only doing the xml, but after walking through the problem on paper the json became an obvious candidate as well. I suspect there is some sort of ‘when dealing with caching, think in layers’ heuristic that I’m backing into realizing.
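The ‘think in layers’ idea can be sketched in a few lines of Ruby. This is a toy version: a plain Hash stands in for memcached, and the key names and the XML/JSON-building steps are made-up placeholders, not our actual code.

```ruby
require 'json'

# Toy stand-in for memcached; in production this would be a memcached client.
CACHE = {}

# Return the cached value for key, or compute it via the block, store it, and return it.
def fetch(key)
  CACHE[key] ||= yield
end

# Layer 1: the (hypothetically expensive) XML built from a record.
def widget_xml(id)
  fetch("widget/#{id}/xml") do
    # Imagine an activerecord load plus to_xml here; this string is a placeholder.
    "<widget><id>#{id}</id></widget>"
  end
end

# Layer 2: the JSON derived from that XML. Caching this layer too means a
# JSON hit never even touches the XML step, let alone the database.
def widget_json(id)
  fetch("widget/#{id}/json") do
    # The real xml-to-json conversion is elided; wrapping stands in for it.
    { 'xml' => widget_xml(id) }.to_json
  end
end
```

The first call to `widget_json` populates both cache entries; every call after that is a pure cache hit at the outermost layer.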
- Thinking – This step I didn’t do as I’m not a developer; I just play one on teevee (as the expression goes). We had a call into the system which was taking 4 to 6 seconds on load and accounted for about 80% of the load time after all the other tricks were done. Someone (a real developer) had to actually look at the code, figure out the problem and create a solution. In this particular case it was as simple as merging a change from a different branch over to the one in production, though she had originally spent a couple of days solving that particular problem. (And it got the call into the 400ms range.)
Aside from the first and last one, these are quick and easy wins for performance and have now been added to my initial testing checklists. I’m almost at the point of thinking that not implementing these is a pretty severe bug.
And of course, the big lesson in all this: server tweaks and good client-delivery practices will get you a fair distance, but they ultimately pale in the face of a real human using their brain to tackle a problem.