Excessive JavaScript and AJAX: bad practice & broken pages


I’ve written tens of thousands of lines of JavaScript code and I love the language. I have used it on almost every site that I’ve worked on in the last 10 years. It makes web pages and web applications so much better in every way. That is, if it’s used correctly, in moderation.

However, for the last few years it feels like I’ve had to spend more time explaining why we should not use JavaScript than I do actually writing scripts. This is a trend that I’ve noticed more and more as the web matures. An ever-increasing number of people believe that JavaScript is the solution to everything. Many take this idea so far that they believe it’s okay to say that we don’t support users without JavaScript.

Often it’s because someone wants to use words like “AJAX”, “JSON”, “dynamic”, and “Web 2.0” in marketing materials. Sometimes it’s because people believe that scripts are somehow more maintainable than the code-behind. The rest of the time it’s just pure laziness.

These are the golden rules of JavaScript:

  • Pages should display all of the real content and media with JavaScript disabled.
  • All front-facing pages should be 100% functional with JavaScript disabled.
  • Pages should look (mostly) the same with JavaScript disabled.
  • Do not use JavaScript when server-side coding can accomplish the same thing.

If you take these rules to heart then you will understand the real point of JavaScript: to enhance pages and the user experience.

Thoughts on AJAX and JSON

AJAX and JSON are very powerful tools in the web developer’s arsenal: they give you the power to send data to and pull data from servers without forcing a page load. However, it is very easy to take them too far and break the golden rules. These technologies are best used on top of existing (working) web applications that only require a 1-step postback.

Every web developer knows of AJAX; it is synonymous with dynamic pages. AJAX is used to send standard GET or POST requests to web servers through the browser and pull XML data back. This offers almost limitless options for a web developer. You can literally build and run an entire web application on top of one HTML page.
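A minimal sketch of what such a request looks like under the hood, assuming a browser-style XMLHttpRequest object. The `/search.xml` endpoint and the helper name are hypothetical, not from the article:

```javascript
// Minimal sketch of an AJAX GET that pulls XML back. The XHR object is
// passed in so the logic works with any XMLHttpRequest-compatible object.
function fetchXML(xhr, url, onSuccess, onError) {
  xhr.open('GET', url, true); // true = asynchronous request
  xhr.onreadystatechange = function () {
    if (xhr.readyState !== 4) return; // wait until the response is complete
    if (xhr.status === 200) {
      onSuccess(xhr.responseXML);     // the parsed XML document
    } else if (onError) {
      onError(xhr.status);
    }
  };
  xhr.send(null);
}

// In a browser you would call:
// fetchXML(new XMLHttpRequest(), '/search.xml', function (doc) { /* ... */ });
```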

The lesser known of the pair, JSON is a form of serialized data for JavaScript. It allows you to pull data from the server to a front-end page. The limit of JSON is that you can only pull it through the browser with a GET request, so there are limits on the amount and type of data you can send to the server. The idea is basically the same as AJAX, without the XML processing or cross-domain issues.
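To make "serialized data for JavaScript" concrete, here is a small sketch of a JSON round trip; the tweet-like payload is made up for illustration:

```javascript
// A JSON string as it might arrive from a server.
var payload = '{"user": "st33d", "text": "hello", "retweets": 3}';

// Native parsing (ES5); older browsers used a json2.js shim instead.
var tweet = JSON.parse(payload);
console.log(tweet.user);     // "st33d"
console.log(tweet.retweets); // 3

// Going the other way: serialize an object back to a JSON string.
var serialized = JSON.stringify({ user: tweet.user, text: tweet.text });
```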

[Screenshot of the Twitter home page]

For a good example of AJAX and JSON taken too far, just take a look at Twitter, specifically their search. Every time you see a # symbol in the URL you are seeing a hack to support dynamic data and the back button. I understand why they did it: to save page loads, and because it is easier on the servers to publish XML/JSON than it is to craft and return an HTML layout. But they could have accomplished the same thing by publishing the XML data to the browser with an XSLT stylesheet for look and feel.

The World of Warcraft web site uses the XML/XSLT technique and it works great. Since Twitter’s whole service, API, and back-end is based on publishing XML data this technique would have been the perfect solution for them. All of the fancy dynamic loading only reduces the usability of the search feature.
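For illustration, the XML/XSLT technique comes down to one processing instruction in the published data; the file names and element names here are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="results.xsl"?>
<!-- The browser fetches results.xsl and renders this raw data with it,
     so the same XML feeds both the API and the human-readable page. -->
<results query="webdev">
  <tweet user="example">Raw data stays useful without JavaScript.</tweet>
</results>
```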

Remember, Twitter doesn’t have to worry about SEO, accessibility, or backwards compatibility. You do. Don’t use AJAX or JSON for anything but gloss and finish on a functioning base web site. Relying on dynamic calls for widgets and decorations is fine, but don’t require it for any real content.

I will add that I think all of the AJAX in Gmail is fine. Why? Because Gmail is a web application, behind a login screen. The context is different than a page on a front-facing web site that you might find through a search engine. Web users understand that they are entering something different than a static page when they log in to their Gmail accounts.

Thoughts on libraries

There are countless JavaScript libraries available for anyone to use. Some are better than others, some are truly terrible. When asked, I always recommend jQuery, along with the caveat that you should not use any library unless you are going to use a significant amount of the functionality that library makes available.

Forcing users to download 81kB of script just so you can $('a#next').click() is a complete waste. You can accomplish the same thing just as easily with the native DOM functions and skip the extra bandwidth, server load, and page load times.
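As a sketch of the native alternative, the same click hookup can be done without a library; the element is passed in so the helper stays testable, and the `attachEvent` branch is the old-IE fallback of the era:

```javascript
// Attach a click handler using native DOM APIs instead of a library.
function onClick(el, handler) {
  if (el.addEventListener) {
    el.addEventListener('click', handler, false); // standards browsers
  } else if (el.attachEvent) {
    el.attachEvent('onclick', handler);           // old IE fallback
  }
}

// In a browser (the id "next" matches the article's a#next selector):
// onClick(document.getElementById('next'), function () { /* ... */ });
```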

However, if you know that you’re going to be doing a lot of advanced scripting on a site, then please, use a library, preferably jQuery. Having one browser-cached file with all of the fundamental work done can be a lifesaver. jQuery is very powerful and can save you lots of time and headaches on a big project.


Do use JavaScript and love it, but don’t punish users for not having it, and don’t rely on it to make your consumer-facing web sites work. On those sites, rely on JavaScript only for garnish. Anything behind a login screen is fair game, though: go crazy. When you do use JavaScript, make sure to use unobtrusive techniques and follow graceful degradation ideals. Your work will be more professional, more portable, more accessible, and it will even have better SEO.

Updated: Apr 27th, 2011


  1. Hi, I appreciate that finally someone says something about excessive use of JavaScript (not only AJAX).
    Famous websites, and especially “news” websites, are full of tons of scripts that are sometimes extremely slow.
    The more famous they are, the more crappy and bloated they are.

    I can’t understand that, because these scripts have, 99% of the time, no reason to exist whatsoever. If you disable scripts totally, everything is still there, identical or almost.

    So why do they do that? Web designer hubris? An obsession with automation? Unmonitored self-generating code?
    Anyway, how much time and money have been wasted coding this crap is beyond me.

    Even if these scripts are auto-generated by several separate entities, and multiplied without purpose just because nobody thought of deactivating them, someone still had to code them one day. Whoever wrote all these huge, useless, and terribly inefficient scripts must be a maniac, schizophrenic, or something. So are all web designers JS maniacs like that? Why do all the big companies and news sites hire them?
    What did these companies’ managers want exactly from their websites? Surely not “please write me 5,000 lines of JS code for the sake of coding”! I can’t even figure out the goal of these projects. Advertisement, of course, but why like this?


  2. st33d

    “Do not use JavaScript when server-side coding can accomplish the same thing.”

    This is a nice idea until you have over a million visitors a month like we do. Then your server is screaming in pain and your hosting costs shoot through the roof.

    We found this out the hard way. Q.E.D.

    Balancing the load of server side cpu and client cpu is a judgement call. Not a rule.

  3. I completely agree that there are times when offloading some work to the client side can be a real help.

    For example, I’ve had to do this with portable ad calls before. We needed to show a random subset of the data to users (5 random ads) on every view, but the ad calls are hit so much that doing a random select for every hit was simply out of the question. To make this scale we did a random select of 50 ads and cached the results server-side for 10 minutes. Then on the client side I select 5 random ads from that data. The users see different random ads but I only need to query data once every 10 minutes, instead of querying data for every single hit.
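The client-side half of the scheme described above can be sketched like this; the helper name and the partial Fisher–Yates shuffle are illustrative, not the actual implementation:

```javascript
// Pick `count` distinct random ads from the server-cached pool
// (e.g. 5 out of the 50 cached results) without mutating the cache.
function pickRandomAds(ads, count) {
  var pool = ads.slice();              // copy so the cached array is untouched
  var picked = [];
  var n = Math.min(count, pool.length);
  for (var i = 0; i < n; i++) {
    // Choose a random index from the not-yet-picked tail and swap it forward.
    var j = i + Math.floor(Math.random() * (pool.length - i));
    var tmp = pool[i]; pool[i] = pool[j]; pool[j] = tmp;
    picked.push(pool[i]);
  }
  return picked;
}
```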

    This was a compromise I had to make to get this feature to scale and support the traffic.

    However, in my experience cases like this are fairly rare. This had to be a JavaScript call to begin with, so there wasn’t any lost usability. The vast majority of the time the best practice is to do the work on the server and implement server-side caching. Even highly dynamic web apps can use caching. In a very high-traffic environment, even a thirty-second cache might save you hundreds of hits to the database, and it will appear no less dynamic to the users.
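The short-lived cache idea above can be sketched in a few lines; this is a toy illustration in JavaScript (not the author's server-side code), with the clock passed in so the behavior is easy to verify:

```javascript
// A tiny TTL cache: `compute` (e.g. the database query) runs at most once
// per `ttlMs` window; every other request is served from memory.
function makeCache(ttlMs, now) {
  var value = null;
  var expires = -Infinity;
  return function (compute) {
    if (now() >= expires) {      // stale or empty: hit the database once
      value = compute();
      expires = now() + ttlMs;
    }
    return value;                // fresh: no database hit at all
  };
}

// With a 30-second TTL, hundreds of requests per second still cost
// only two database queries per minute.
```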

    But as you said, it’s a judgment call. I guess I should point out that my golden rules can be bent or broken in places. But it’s best practice to avoid it unless necessary.

    What I was ranting about in this article was web developers who use JavaScript and AJAX, not as a tool to enhance the app or make it scale… but simply because they can. I don’t like seeing complete JavaScript dependence on front-facing web sites just because someone thinks it’s cool.

  4. The best practice is to do it server side because there you are sure it will work as tested. With JavaScript you never know how the client’s browser will react.

    Also, some operations are obviously suitable for the client and others for the server. It’s not difficult to know which ones.

    But the best practice is still to keep things simple. To ask yourself, “is this really necessary?”.

  5. Jake

    This sentence:

    “Forcing users to download 81kB of script just so you can $(‘a#next).click() is a complete waist.”

    has 2 errors in it:
    $(‘a#next).click() should be $(‘a#next’).click()
    “waist” should be “waste”

    Other than that, great article. I agree completely with what you said here (too often sites will use too much JavaScript (Twitter and Facebook are near the top of the list), and it can really harm the user experience).
