Often it’s because someone wants to use words like “AJAX”, “JSON”, “dynamic”, and “Web 2.0” in marketing materials. Sometimes it’s because people believe that scripts are somehow more maintainable than the code-behind. The rest of the time it’s just pure laziness.
Thoughts on AJAX and JSON
AJAX and JSON are very powerful tools in the web developer’s arsenal: they give you the power to send and pull data from servers without forcing a page load. However, it is very easy to take it too far and break the golden rules. These technologies are best used on top of existing (working) web applications that only require a 1-step postback.
Every web developer knows of AJAX; it is synonymous with dynamic pages. AJAX is used to send standard GET or POST requests to web servers through the browser and pull XML data back. This offers almost limitless options for a web developer. You can literally build and run an entire web application on top of one HTML page.
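At its core, an AJAX call needs no library at all. A minimal sketch of a raw GET request (the URL and callback here are placeholders, not part of any real site):

```javascript
// Minimal AJAX GET with no library. URL and callback are placeholders.
function ajaxGet(url, onDone) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url, true); // true = asynchronous, no page load
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      // responseXML is populated when the server returns XML;
      // fall back to the raw text otherwise
      onDone(xhr.responseXML || xhr.responseText);
    }
  };
  xhr.send(null);
}

// Example usage in a browser (guarded so the sketch is self-contained):
if (typeof XMLHttpRequest !== 'undefined' && typeof document !== 'undefined') {
  ajaxGet('/search?q=example', function (data) {
    // update the page with the returned data
  });
}
```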
For a good example of AJAX and JSON taken too far, just take a look at Twitter, specifically their search. Every time you see a # symbol in the URL you are seeing a hack to support dynamic data and the back button. I understand why they did it: to save page loads, and because it is easier on the servers to publish XML/JSON than it is to craft and return an HTML layout. But they could have accomplished the same thing by publishing the XML data to the browser with an XSLT stylesheet for look and feel.
The World of Warcraft web site uses the XML/XSLT technique and it works great. Since Twitter’s whole service, API, and back-end is based on publishing XML data this technique would have been the perfect solution for them. All of the fancy dynamic loading only reduces the usability of the search feature.
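The technique itself is simple: the server publishes plain XML with a stylesheet processing instruction, and the browser applies the XSLT to render it as HTML. A hypothetical search-results feed (the element names and `results.xsl` file are illustrative, not from any real site) might look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="results.xsl"?>
<!-- results.xsl (hypothetical) transforms this feed into the HTML layout -->
<results query="example">
  <result>
    <title>First hit</title>
    <link>http://example.com/1</link>
  </result>
</results>
```

The same XML that feeds the API feeds the browser; only the stylesheet changes.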
Remember, Twitter doesn’t have to worry about SEO, accessibility, or backwards compatibility. You do. Don’t use AJAX or JSON for anything but gloss and finish on a functioning base web site. Relying on dynamic calls for widgets and decorations is fine, but don’t require it for any real content.
I will add that I think all of the AJAX in gmail is fine. Why? Because gmail is a web application, behind a login screen. The context is different than a page on a front-facing web site that you might find through a search engine. Web users understand that they are entering something different than a static page when they log in to their gmail accounts.
Thoughts on libraries
Forcing users to download 81kB of script just so you can call $('a#next').click() is a complete waste. You can accomplish the same thing just as easily with the native DOM functions and skip the extra bandwidth, server load, and page load times.
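For a one-off handler like that, the native DOM API is enough. A minimal sketch (the `next` element id is an assumption for illustration):

```javascript
// Native equivalent of $('a#next').click(handler) — no library needed.
function bindClick(el, handler) {
  if (el && el.addEventListener) {
    el.addEventListener('click', handler, false);
  } else if (el && el.attachEvent) {
    el.attachEvent('onclick', handler); // old-IE fallback
  }
}

// Assumes a link like <a id="next">…</a> exists in the page.
var nextLink = typeof document !== 'undefined'
  ? document.getElementById('next')
  : null;

bindClick(nextLink, function () {
  // handle the click
});
```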
However, if you know that you’re going to be doing a lot of advanced scripting on a site, then please, use a library, preferably jQuery. Having one browser-cached file with all of the fundamental work done can be a lifesaver. jQuery is very powerful and can save you lots of time and headaches on a big project.
Updated: Apr 27th, 2011
Famous websites, and especially “news” websites, are full of tons of scripts that are sometimes extremely slow.
The more famous they are, the more crappy and bloated they are.
I can’t understand that, because these scripts have, 99% of the time, no reason to exist whatsoever. If you disable scripts entirely, everything is still there, identical or almost.
So why do they do that? Web designer hubris? Obsession with automation? Unmonitored self-generating code?
Anyway, how much time and money has been wasted coding this crap is beyond me.
Even if these scripts are auto-generated by several separate entities, and multiplied without purpose just because nobody thought of deactivating them, someone still had to write them one day. Whoever wrote all these huge, useless, and terribly inefficient scripts must be a maniac, schizophrenic, or something. So are all web designers JS maniacs like that? Why do all the big companies and news sites hire them?
What exactly did these companies’ managers want from their websites? Surely not “please write me a 5,000-line JS file for the sake of coding”! I can’t even figure out the goal of these projects. Advertisement, of course, but why like this?
This is a nice idea until you have over a million visitors a month like we do. Then your server is screaming in pain and your hosting costs shoot through the roof.
We found this out the hard way. Q.E.D.
Balancing the load of server side cpu and client cpu is a judgement call. Not a rule.
I completely agree that there are times when offloading some work to the client side can be a real help.
For example, I’ve had to do this with portable ad calls before. We needed to show a random subset of the data to users (5 random ads) on every view, but the ad calls are hit so much that doing a random select for every hit was simply out of the question. To make this scale we did a random select of 50 ads and cached the results server-side for 10 minutes. Then on the client side I select 5 random ads from that data. The users see different random ads but I only need to query data once every 10 minutes, instead of querying data for every single hit.
This was a compromise I had to make to get this feature to scale and support the traffic.
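A sketch of the client-side half of that compromise; the `cachedAds` list here stands in for the 50-ad payload from the 10-minute server-side cache, and the helper name is illustrative:

```javascript
// Pick n distinct random items client-side from the server-cached list.
function pickRandom(items, n) {
  var pool = items.slice(); // copy so the cached list isn't mutated
  var picked = [];
  while (picked.length < n && pool.length > 0) {
    var i = Math.floor(Math.random() * pool.length);
    picked.push(pool.splice(i, 1)[0]); // remove the pick so no ad repeats
  }
  return picked;
}

// cachedAds stands in for the 50-ad list fetched from the 10-minute cache.
var cachedAds = ['ad1', 'ad2', 'ad3', 'ad4', 'ad5', 'ad6', 'ad7', 'ad8'];
var adsToShow = pickRandom(cachedAds, 5); // 5 random ads per page view
```

Every view gets a different-looking rotation, but the database only sees one query per cache window.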
But as you said, it’s a judgment call. I guess I should point out that my golden rules can be bent or broken in places. But it’s best practice to avoid it unless necessary.
Also, you have operations that are obviously suitable for the client and others for the server. It’s not difficult to know which are which.
But the best practice is still to keep things simple. To ask yourself, “Is this really necessary?”
“Forcing users to download 81kB of script just so you can $(‘a#next).click() is a complete waist.”
has 2 errors in it:
$(‘a#next).click() should be $(‘a#next’).click()
“waist” should be “waste”