Saturday, May 19, 2018

Is your digital presence failing? It’s probably not a technology problem


A few years ago, I wrote a post lamenting that analytics fails are often blamed on technology when the technology is not the problem and changing it solves nothing.

One of the main causes of failure I noted there was faulty data resulting not from a technology issue but from poor governance, which rendered the analytics untrustworthy – and thus nearly useless. In other words, it’s a content problem, not a packaging issue.

As a follow-up to that post, I want to discuss an analogous issue that I see in digital marketing.

Lately, I have been implementing analytics on the digital properties of organizations that do not have much experience with digital marketing. These clients launch their websites and social presence with high expectations of generating interest and strong engagement from their intended audiences.

Often, I can report that they do… at first.

And then it all falls off. Sometimes very quickly. Traffic starts to wither, and the visitors who do come no longer engage as deeply.

Storytelling is no fun when the data tells a sad story.

The reactions to this tend to follow a pattern: The technology must be to blame. First, they question the accuracy of the data. Once I assure them nothing has changed and the analytics are working properly, attention turns to the website design and the branding elements on the social platforms. What often follows is some modest experimentation with design changes that yields little or no improvement.

Soon, there is talk of an entire website redesign. It’s around this time that I suggest that maybe the technology platforms are not the problem. Perhaps the content that worked well at launch has become stale and needs regular updates to remain relevant and compelling.

Once that reality sets in, the idea of experimenting with fresh content in the form of messages, photos, video, etc. starts to look better than a time-consuming and likely expensive site redesign.

At this point, the problem becomes that the staffing plan does not include people dedicated to developing new content on an ongoing basis. This, in turn, creates resistance. A one-time design change is easier to pay for.

The only way I know to counter this flawed thinking is with a focused effort, even if it is temporary, to release new content and monitor the impact.

Results will vary, but fresh content nearly always produces a positive spike in all the important metrics. This improvement, however, can only be sustained with an ongoing effort and dedicated resources, along with an incremental design optimization program to maximize the impact of your content.
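For teams that want more than a gut feel for that spike, a simple before-and-after comparison around the release date is usually enough to show the effect. Here is a minimal sketch in TypeScript; the metric names and the shape of the daily export are my own assumptions, stand-ins for whatever your analytics tool actually provides:

// A rough way to quantify a content refresh: compare average daily metrics
// for the period before the release date against the period after it.
// The metric fields below are hypothetical placeholders.
interface DailyMetrics {
  date: string;             // ISO date, e.g. "2018-05-01"
  sessions: number;         // visits for the day
  pagesPerSession: number;  // a simple engagement proxy
}

function average(days: DailyMetrics[], pick: (d: DailyMetrics) => number): number {
  return days.reduce((sum, d) => sum + pick(d), 0) / days.length;
}

function refreshImpact(days: DailyMetrics[], releaseDate: string) {
  const before = days.filter(d => d.date < releaseDate);
  const after = days.filter(d => d.date >= releaseDate);
  const pctChange = (pick: (d: DailyMetrics) => number) =>
    ((average(after, pick) - average(before, pick)) / average(before, pick)) * 100;
  return {
    sessionsPct: pctChange(d => d.sessions),
    pagesPerSessionPct: pctChange(d => d.pagesPerSession),
  };
}

// Usage: feed it the daily export from your tool, split on the refresh date.
// refreshImpact(dailyExport, "2018-05-01");

Nothing fancy, but it turns "the new content seems to be working" into a number you can put in front of the people who control the staffing plan.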

That is, people and process will matter more than technology in the long run.

Friday, May 4, 2018

Analytics Fail? It's Probably Not a Technology Problem

This post was originally published on LinkedIn - re-posting here as it generated quite a few responses.


There is a dirty little secret that those who sell analytics technology won’t tell you. More often than not, the data that initial applications yield just sits on the servers and is never used effectively, if it is used at all.

This probably will not come as a surprise to many of you reading this. If you are an experienced practitioner, you have likely seen it firsthand. So why are expectations - so artfully set in the technology sales cycle with beautifully polished demos and proofs of concept - not fulfilled initially? Note: I say “initially” because expensive re-implementation efforts tend to follow, and these tend to be more successful.

So what happens when you roll out your brand new analytics technology (hopefully on time and on budget ;-) to great fanfare and expectation, only to find a month later that it is getting very little user traction? Of course, you do a survey to discover what is wrong, and you get responses like these:
  • I can’t find what I need; too much noise in the data
  • The tool is too hard to use
  • It does not tell me the right things
  • I don’t trust the data; it does not match what I had before
  • The data is inconsistent
  • I need more history for trending
  • I can’t slice it the way I want
  • I let my analyst get me what I need, and it takes them so long it is useless by the time I get it
  • I cannot compare it to anything
  • Data is too old
  • I can’t easily export to Excel
  • The graphics suck (see: export to Excel)
  • I can’t make notes or track external events or share with others
  • It's fine as far as it goes, but it's just telling me what happened, not why
  • It's just not actionable
These are just a few examples; I could go on for quite a while here. Feel free to comment with your own.

For many, the initial instinct is to blame the technology. This works really well if the folks who own the implementation are not the ones who selected the technology. In most cases, though, the technology is not at fault. That truth tends to be reinforced by the customer at the user conference (the one we got to attend for free last year) who proudly showcased all the wonderful things we cannot seem to do.

The real barrier to acceptance tends to lie in the failure to properly define, document, and act upon the true functional requirements of the application. 

This fact has not gone unnoticed by the community. (See How to Choose a Web Analytics Tool: A Radical Alternative and its comments for one take on technology selection failures, and Why Analytical Applications Fail for a view on the difficulties inherent in defining requirements.)

In my experience as a consultant and a practitioner, I have found requirements definition failure generally results from one or more of the following mistakes:
  1. Assuming that purpose built niche applications like digital analytics are useful out of the box and do not require a formal set of requirements
  2. Thinking the requirements used to select the technology are sufficient to drive the implementation
  3. Defining requirements without any defined process, or with a methodology that is ill-suited to analytic applications
Anyone with experience in this area will tell you that 1) and 2) are patently false for all but the simplest and most basic applications. As for 3), it happens quite often. Let’s break this down a bit by input, process, and output:

Input: All analytic applications are hungry for data that must be captured within the underlying business process and technology. For customer analytics, think website clicks, survey responses, call center interactions, etc. Event capture nearly always requires some custom coding beyond the plug-ins and templates that the tools provide. Software developers are not mind readers; they need solid requirements to build that capture correctly.
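To make that concrete, here is a minimal sketch in TypeScript of the kind of custom capture a requirements document needs to spell out. The event name, the data attributes, and the "/collect" endpoint are all hypothetical, standing in for whatever your tool's tagging API actually expects:

// Capture clicks the out-of-the-box page-view tag would miss, along with the
// business context (document ID, campaign) that the requirements call for.
interface TrackedEvent {
  name: string;                       // e.g. "download_click"
  attributes: Record<string, string>; // business context to report on later
  timestamp: string;
}

function track(event: TrackedEvent): void {
  // Most tools expose their own JavaScript API or collection URL;
  // sendBeacon to a placeholder endpoint keeps this sketch self-contained.
  navigator.sendBeacon("/collect", JSON.stringify(event));
}

document.querySelectorAll<HTMLAnchorElement>("a[data-doc-id]").forEach(link => {
  link.addEventListener("click", () => {
    track({
      name: "download_click",
      attributes: {
        docId: link.dataset.docId ?? "",
        campaign: link.dataset.campaign ?? "unknown",
      },
      timestamp: new Date().toISOString(),
    });
  });
});

None of this is hard to write, but notice how many decisions it encodes: which elements to instrument, which attributes matter, what the event is called. Those are exactly the things a developer cannot guess without documented requirements.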

Process: Sticking with my customer analytics example, buying a tool in lieu of a custom build is an easy decision when the high-level requirements fit what is on the market. The tools themselves provide enough flexibility to satisfy most of the situations they encounter. That flexibility comes at the price of complexity, which takes the form of what can be a bewildering array of configuration options and combinations, any number of which can be applied to solve the same problem. Finding the optimal combination, even for those with the deepest expertise in the tool, requires a well-communicated set of current and anticipated requirements.

Output: Again, purchased analytic applications usually come with a slew of predefined reports, but their utility is directly tied to how well those report designs mesh with the application configuration and the preferences of the intended audiences, not to mention the devices the reports will be viewed on. As such, they must also be implemented and modified based on documented requirements.

The takeaway here is: when analytic applications fail to gain adoption or prove ineffective, resist the temptation to blame the technology and replace it, thinking that will solve the problem. Most of the time, we choose technologies that can do the job. Focus the effort instead on establishing a baseline set of requirements that can deliver value relatively quickly and validate the technology. From there, it is just a matter of delivering frequent incremental value and letting the application evolve as needs change and new opportunities present themselves.