
Tuesday, May 28, 2013

Analytics - Measuring data relating to user information - Part 6

This is a series of posts about the use of analytics in your software application. My experience is mostly with desktop applications, but much of what has been written in the earlier posts also applies to analytics for web applications; there may be some differences, but the need for analytics is the same, as are the decisions that can be made on its basis. Some of the guidelines and warnings are the same as well; in short, whether you are working on desktop or web applications, the tools may differ, but there is a strong need to design a strategy for analytics rather than doing it on an ad hoc basis. In the previous post (Analytics - Measuring data relating to user information - Part 5), I talked about a problem where the team had made a strategy to collect data, but there were not enough people to actually analyze the data and take decisions based on it.
However, there are some pitfalls when it comes to analytics and taking decisions based on them. There is a joke about a manager who would scream for data before every decision: whenever a decision needed to be taken or planned, there would be a hunt for data, and if the data was not available, there was a high chance that the team would be sent off to collect it. It is a joke, but I have seen managers who get too data-oriented. This may be anathema to firm proponents of analytics, but there can be 2 problems with an analytics-oriented approach.
- The data may be incorrect
- There may be so much emphasis on data that it crosses a limit and common sense is lost

Sometimes these 2 problems can intersect, but let's take each of them separately. You cannot just wish for data to happen - an obvious statement, but it goes to the heart of the problem. We had a situation where we were collecting data sent by a particular dialog in the application, and the data was coming in beautifully. A full release went by, and nobody thought much about the code in that particular function. In the next release, however, there was a defect in that section of the dialog that also affected the data being collected, and the developer debugging that area came across something puzzling. It turned out that the data collection code did not cover one of the paths the application could take, which we speculated accounted for around 15% of customer interactions, but we had no real data. The net result was that we knew our data for that dialog was understated by some percentage, but we did not know that percentage accurately. Hence, any decision we made based on the data from that dialog had a margin of error that was unacceptable. We reviewed the test cases and their execution from the time the code was written, and realized that because of a paucity of time, the testing of this particular part was not done as it should have been. The learning from all this was that data can be incorrect even with the best of efforts. And this takes us, although not directly, to the next point.
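The bug pattern above can be sketched in code. This is a hypothetical illustration, not the actual application code - the function names (track_event, handle_dialog_choice) and the dialog actions are all made up for the example. The point is that if only some branches of a handler emit an analytics event, the missing branch silently undercounts the data, exactly as happened with our dialog:

```python
# Hypothetical sketch: instrumenting a dialog handler so that every
# code path records an analytics event. All names here are illustrative.

events = []

def track_event(name, **props):
    """Record an analytics event (here simply appended to a list)."""
    events.append({"name": name, **props})

def handle_dialog_choice(choice):
    if choice == "save":
        track_event("dialog_closed", action="save")
        return "saved"
    elif choice == "cancel":
        track_event("dialog_closed", action="cancel")
        return "cancelled"
    else:
        # The pitfall: in the buggy version, this branch sent no event,
        # so every interaction landing here was invisible in the data.
        track_event("dialog_closed", action="other", raw=choice)
        return "ignored"

for c in ["save", "cancel", "discard"]:
    handle_dialog_choice(c)

# With every branch instrumented, all three interactions are counted.
assert len(events) == 3
```

A code review checklist item as simple as "does every exit path of this function emit its event?" would have caught our gap before a full release went by.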
Basing business decisions on data analysis can be great if you have correct data, and can be suicidal if your data is incorrect. Further, when important decisions are being taken, it is important that there be some sort of confirmation - that data is used to confirm a decision rather than being the sole driver of the decision making. So, suppose the business side of the application wants to run a campaign based on the market information they are observing; analytics could be of great help in confirming some of the assumptions the team is making as part of that decision. But using only analytics as the basis for decisions, or creating an environment that encourages this, is not recommended.
Even when collecting data, there should be a thorough analysis of the data and the data collection methods to ensure that what is collected is correct. In fact, I had a colleague who was in favor of analytics but had been burnt before. His advice was simple - when you get data from analytics, assume that the data is wrong, prove that it is right, and only then use it.
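My colleague's "assume it is wrong until proven right" advice can be made concrete with a small sanity-check pass over incoming records before anyone analyzes them. This is a minimal sketch under assumed field names (session_id, event) and an assumed 10% tolerance - the real checks and thresholds would depend on your application, and the independent figure you compare against (here, a server-side session count) is whatever trusted source you happen to have:

```python
# Hypothetical sketch: refuse to trust analytics data until it passes
# basic consistency checks. Field names and the 10% tolerance are
# illustrative assumptions, not from any real system.

def validate_analytics(records, expected_sessions):
    """Return True only if the data passes simple sanity checks."""
    if not records:
        return False
    # 1. Every record must carry the fields we expect.
    if any("session_id" not in r or "event" not in r for r in records):
        return False
    # 2. The number of distinct sessions should be close to an
    #    independently known figure (e.g. server-side login counts).
    sessions = {r["session_id"] for r in records}
    ratio = len(sessions) / expected_sessions
    return 0.9 <= ratio <= 1.1

good = [{"session_id": i, "event": "open"} for i in range(98)]
print(validate_analytics(good, expected_sessions=100))  # True: 98% of expected
print(validate_analytics(good, expected_sessions=200))  # False: only 49%
```

A check like the second one would have flagged our understated dialog data early: the event counts would not have matched any independent measure of how often the dialog was actually shown.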

Read the next post in this series - Measuring data relating to user information - Part 7

