There is a dirty little secret that those who sell digital
analytics technology won’t tell you. More often than not, the data that initial
applications yield just sits on the servers and is never used effectively, if it
is used at all.
This probably will not come as a surprise to many of you
reading this. If you are an experienced
practitioner, you have likely seen it firsthand. So why are the expectations set so artfully in the technology sales cycle, with beautifully polished demos and proofs of concept, not fulfilled initially? Note that I say “initially” because expensive re-implementation efforts follow, and these tend to be more successful.
So what happens when you roll out your brand new analytics
technology (hopefully on time and budget ;-) to great fanfare and expectation,
and you find out a month later that it is getting very little user
traction? Of course, you do a survey to
discover what is wrong and you get responses like these:
- I can’t find what I need; too much noise in the data
- The tool is too hard to use
- It does not tell me the right things
- I don’t trust the data; it does not match what I had before
- The data is inconsistent
- I need more history for trending
- I can’t slice it the way I want
- I let my analyst get me what I need, and it takes them so long it is useless by the time I get it
- I cannot compare it to anything
- Data is too old
- I can’t easily export to Excel
- The graphics suck (see: export to Excel)
- I can’t make notes or track external events or share with others
- It’s great for clickstream, but I need survey results, ads served, and inventory too
These are just a few examples; I could go on for quite a while here. Feel free to comment with your own.
For many, the initial instinct is to blame the technology. This works really well if the folks who own the implementation are not the ones who selected it. In most cases, though, the technology is not at fault. That truth tends to be reinforced by the customer at the user conference (the one we got to attend for free last year) who proudly showcased all the wonderful things we cannot seem to do.
The real barrier to acceptance tends to lie in the failure
to properly define, document, and act upon the true functional requirements of
the application.
This fact has not gone unnoticed by the community. (See How to Choose a Web Analytics Tool: A Radical Alternative, and its comments, for one take on technology-selection failures, and Why Analytical Applications Fail for a view on the difficulties inherent in defining requirements.)
In my experience as a consultant and a practitioner, I have
found requirements definition failure generally results from one or more of the
following mistakes:
1. Assuming that purpose-built applications like digital analytics are useful out of the box and do not require a formal set of requirements
2. Thinking the requirements used to select the technology are sufficient to drive the implementation
3. Defining requirements without any defined process, or using a methodology that is completely ill-suited to analytic applications
Anyone with experience in this area will tell you that 1) and 2) are patently false for all but the simplest applications. As for 3), it happens quite often. Let’s break this down a bit by input, process, and output:
Input: All analytic applications are hungry for data that
needs to be captured within the underlying business process and
technology. For customer analytics,
think website clicks, survey responses, call center interactions, etc. Event
capture nearly always requires some custom coding beyond the plug-ins and
templates that the tools provide. Software developers are not mind readers; they need solid requirements to do this work well.
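To make that concrete, here is a minimal sketch in TypeScript of the kind of custom event capture I mean. The trackEvent helper and every field name below are hypothetical, not any vendor's actual API; the point is that each field represents a decision someone has to make and document before a developer can write this code.

```typescript
// Hypothetical event-capture helper, invented for this sketch.
interface CheckoutEvent {
  eventName: "checkout_step";   // naming convention: who defines it?
  stepNumber: number;           // which steps count: cart, shipping, payment?
  cartValue: number;            // pre- or post-discount? which currency?
  currency: string;             // ISO 4217 code, per requirements
  customerSegment?: string;     // only if the CRM/survey join needs it
}

function trackEvent(event: CheckoutEvent): void {
  // A real implementation would queue this payload to the analytics
  // collector; logging stands in for that here.
  console.log("capture", JSON.stringify(event));
}

// Without documented requirements, a developer has to guess at all of this.
trackEvent({
  eventName: "checkout_step",
  stepNumber: 2,
  cartValue: 149.95,
  currency: "USD",
});
```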
Process: Sticking with my customer analytics example, buying a tool in lieu of a custom build is an easy decision when the high-level requirements fit what is on the market. The tools themselves provide enough flexibility to satisfy most of the situations they encounter, but that flexibility comes at the price of complexity: what can be a bewildering array of configuration options and combinations, any number of which can be applied to solve the same problem. Finding the optimal combination, even for those with the deepest expertise in the tool, requires a well-communicated set of current and anticipated requirements.
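As a rough illustration (the configuration objects below are invented for this sketch, not any particular tool's settings), here are two equally plausible ways to answer the same question:

```typescript
// Two hypothetical configuration routes to the same business question:
// "which sessions came from paid search?"

// Route 1: classify the traffic at collection time with a processing rule.
const processingRule = {
  name: "Mark paid search",
  when: { queryParam: "utm_medium", equals: "cpc" },
  then: { setDimension: "trafficType", value: "paid-search" },
};

// Route 2: leave the raw data alone and define a reporting segment instead.
const reportingSegment = {
  name: "Paid search sessions",
  scope: "session",
  condition: { dimension: "utm_medium", operator: "equals", value: "cpc" },
};

// Both routes "work", but they differ in when the logic is applied, whether
// it can be changed retroactively, and who maintains it. Choosing between
// them is exactly the kind of decision documented requirements should drive.
```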
Output: Again, purchased analytic applications usually come with a slew of predefined reports, but their utility is directly tied to how well those report designs mesh with the application configuration and the preferences of the intended audiences, not to mention the devices those reports will be viewed on. As such, they too must be implemented and modified based on documented requirements.
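If it helps to picture what a documented output requirement might look like, here is a sketch expressed as a TypeScript type purely for illustration. The field names are mine, not a standard; each one is a question the implementation team needs answered before a predefined report becomes genuinely useful.

```typescript
// A made-up shape for a documented report requirement.
interface ReportRequirement {
  name: string;
  audience: "executive" | "analyst" | "marketing-ops";
  primaryDevice: "desktop" | "tablet" | "phone";
  metrics: string[];       // must exist in the configured data set
  dimensions: string[];    // must match how the events were captured
  comparisonPeriod: "week" | "month" | "year";
  deliveredVia: "dashboard" | "email" | "export";
  refresh: "hourly" | "daily" | "weekly";
}

const weeklyAcquisitionSummary: ReportRequirement = {
  name: "Weekly acquisition summary",
  audience: "executive",
  primaryDevice: "phone",
  metrics: ["sessions", "conversion rate", "revenue"],
  dimensions: ["channel", "campaign"],
  comparisonPeriod: "week",
  deliveredVia: "email",
  refresh: "weekly",
};
```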
With this in mind, I will detail in my next post why requirements definition for analytic applications tends to go off the rails, and share my thoughts on how to get it right.