Baselining, in IT, means recording one or more measurements so that they can be compared against later. If you know a statistic on one day – say, the average CPU usage on a server – you can measure the same data-point the next day and draw a comparison. It is most often used to measure the impact of some change or other, and to assess its success.
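To make the idea concrete, here is a minimal sketch of that record-then-compare step. The CPU figures are invented for illustration; real baselining would pull them from your monitoring tooling:

```python
# Minimal baselining sketch: record a metric today, re-measure later,
# and express the difference relative to the baseline.
# The CPU figures below are made up for illustration.

def percent_change(baseline: float, current: float) -> float:
    """Relative change of the new measurement against the baseline."""
    return (current - baseline) / baseline * 100

baseline_cpu = 42.0   # average CPU % recorded before the change
current_cpu = 54.6    # the same data-point, measured after the change

delta = percent_change(baseline_cpu, current_cpu)
print(f"CPU usage changed by {delta:+.1f}% against the baseline")
```

The point is not the arithmetic but the discipline: without the recorded `baseline_cpu`, the later figure tells you nothing about whether your change helped or hurt.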
Today, I want to talk about the importance of baselining with a counter-example – what can happen if you don’t baseline before making changes to a system, especially when those changes were intended to improve performance!
While involved in general monitoring of database activity on the client’s main server, our consultant noticed that some expensive queries were being run repeatedly. The specific situation was that employees would view customer details, one aspect of which was notes made by the system or by customer service staff. What we saw was almost the same query being run, sometimes several times for a single customer, within a few seconds of each other.
Further research showed that the control displaying notes was actually paging results: staff were being shown a small number of notes at a time and forced to page through the rest. Our analysis showed that the query needed to ‘page’ the results was essentially as expensive to run as one returning all the results for the customer. While we could have reorganised the query and added indexes to improve performance, further research showed that the client had no particular need for paged results, but had been told by the development team that paging would make things faster.
The assertion that paging results would be faster was undoubtedly based on the idea that displaying potentially hundreds of records to a viewer must surely take more time than displaying fewer records. That’s probably true, but it was naive to think that this was the only factor involved in displaying the page. What had happened was that this assumption led to a number of ‘performance enhancements’ that actually went on to slow the system down!
42 IT Solutions were able to unpick this situation and recognise the impact on both the database and the presentation of the web-page to the user (and indeed, on how the user then interacted with that page). We reorganised the query to remove paging – which cut out the repeated query calls – essentially undoing many of the so-called performance enhancements the developers had previously made. We then considered the business needs of the query, and identified a couple of relatively expensive things it undertook due to poor underlying data structures. One reorganisation of that specific data later, the query was dramatically faster.
In this specific case, the baselining we undertook used tools within SQL Server to monitor the cost of running a query. We could modify a query, run a comparison against the baseline, and check that the progress we were making was in the right direction!
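That compare-against-baseline loop can be sketched outside the database too. This is a hypothetical harness, not the SQL Server tooling we used: the two functions are stand-ins for running the old and reworked queries, and wall-clock timing stands in for the server-side cost statistics:

```python
import time

def run_trials(fn, trials: int = 5) -> float:
    """Return the best wall-clock time over several runs of fn."""
    best = float("inf")
    for _ in range(trials):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    return best

# Stand-ins for the baseline (paged) query and a reworked candidate;
# in practice these would each execute a real query against the server.
def paged_query():
    sum(i * i for i in range(200_000))

def reworked_query():
    sum(i * i for i in range(20_000))

baseline = run_trials(paged_query)      # measured BEFORE changing anything
candidate = run_trials(reworked_query)  # re-measured after each modification

# The comparison shows whether a change moved in the right direction.
print(f"baseline: {baseline:.4f}s  candidate: {candidate:.4f}s  "
      f"{'improved' if candidate < baseline else 'regressed'}")
```

Taking the best of several trials is a crude way to damp measurement noise; the server-side statistics we actually relied on report cost directly per execution.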
We could also have considered monitoring the load-time of the web-page, and probably many other characteristics, to see whether our changes were having the intended effect. Indeed, it may have been advisable to do so, because the side-effects of one change can sometimes lead to odd outcomes. For example, even if the ‘paging’ version of the query had been faster than the previous version, the overall time to view a customer’s details on-screen would still have gone up, because of the additional time it took the user to page through the notes!
Had the original development been undertaken with a performance baseline, it would have been simple to recognise that the planned improvements were not delivering what was hoped, and to revise the approach.