Thursday, November 17, 2016

Agile is dead: Analytic Applications Edition

For those of you not involved in developing software applications: the Agile movement was, in essence, a revolt against the then-common practice of taking on the entirety of large software projects all at once, adhering to a methodology that stressed a strict sequential progression of stages beginning with requirements gathering, followed by programming and testing, and culminating in a ‘big bang’ implementation. This came to be known as the ‘waterfall’ methodology. Among its many issues is the fact that requirements usually change over the course of these lengthy efforts, and what ends up being delivered often no longer adequately addresses the needs of the businesses that commissioned it.

Those of us who develop and manage analytic applications came to this conclusion decades ago out of necessity. One reason is that these applications support a unique business process: decision making. Unlike more conventional business processes such as booking airline reservations, decision making as a process can take many paths as data is analyzed and new information is discovered. The idea that one can precisely define requirements in advance of implementation simply does not apply to all but the most structured decisions, such as automated algorithm-based credit decisions. Even those require frequent updates as outcomes drive new learning and improved algorithms. Successful analytic application developers learned to use prototyping and relatively short incremental development cycles to keep their products relevant and achieve customer satisfaction. In essence, we adopted agility long before the Agile development hype cycle began. I went into more detail on this in a previous post on the impact of Agile on Analytics.

Since that time, there has been both a technology- and culture-driven boom in analytic applications development as businesses of all sizes and maturities adopt a more data-driven culture. At the same time, the tenets of Agile methodology have been zealously embraced by IT executives who bought into the hype as they sought to deliver better applications sooner and cheaper. In some shops, Agile certification became a requirement to work on any project. Collisions ensued as the realization set in that orthodox Agile methods were not developed with analytics in mind and often could not be applied successfully to Business Intelligence and Analytic application development projects.

BI/Analytic application project teams were put in a familiar and awkward position. They could either try to explain why a methodology designed for a different purpose does not apply, or create the illusion of compliance with Agile dogma, technology, and terminology that added little or no value to their efforts. Meanwhile, vendors and consultants in the Analytics space were all too happy to ride the wave, coining the term “Agile Analytics” in an attempt to reconcile Agile mandates with proven methods in the BI/Analytics discipline.

It now appears that the software developer community at large is having qualms about the Agile software ‘revolution’ and what it became. Even the original thought leaders of the Agile movement have reservations about what has come of it. There has even been something of a developer revolt against it. Then there are the chronicles of the Agile hype cycle and some very thoughtful pieces around how to move forward from Agile.
These critiques of the Agile movement as it is currently practiced have several points in common:
  • Agile has passed the peak of its hype cycle and now benefits résumés and consultants more than software projects
  • Agile, as it is currently practiced, has become more process-driven than objective-driven. This is exactly one of the faults it was designed to cure
  • Requirements definition (often in the form of vague ‘user stories’) has suffered to the point where it has degraded testing, compromised necessary documentation and caused a fall in overall quality of delivered products
  • Adoption of Agile in Name Only (AINO) practices where a waterfall mentality persists, development performance metrics remain a goal unto themselves, scrums become meetings, sprints become epics before defined value is actually delivered, and development teams remain as disconnected from the business as ever
  • Applications architecture suffers as semi-independent project teams ignore standards and governance to meet their time and cost constraints. This one in particular is deadly in the long run
Aside from sounding very familiar, what does this mean to us in the Analytics community? It means we need to stress that what works for more traditional business process applications often does not apply to the unique nature of decision support and analytics applications. We define capabilities, not user stories. We lay down a sustainable data platform before we attempt to build applications on it. We prove concepts and prototype before we make major investments, we govern our data relentlessly to preserve credibility, and we develop our environments with an eye towards how they will be maintained and enhanced. Most importantly, we must remain focused on the results we can achieve while they are still needed and not worry so much about how we achieve them.

Monday, February 29, 2016

BI Industry Research: Is There Magic in those Quadrants?

As a Business Intelligence and Analytics (BIA) consultant, I need to keep up with the latest developments in both analytics process and technology. This has become considerably easier in recent years as a wealth of useful information and wisdom is made freely available over the Internet. Premium fee-based research, however, remains an important source, as it is generally deeper and more detailed and tends to carry great weight with clients.

I’ve gotten to know several of the analysts who cover the BIA space, and I find them to be among the most hard-working folks around as they juggle their research, conference and consulting responsibilities. Their methods are rigorous and the surveys span a uniquely broad sample of industries and geography.

Given all that, I have become a bit skeptical of their influence, because I have seen firsthand how the research is often misused and/or misinterpreted, resulting in poor purchases or misdirected implementation programs.

Part of the problem is that some readers just look at the pictures. In the case of Gartner’s trademarked Magic Quadrants, they note the vendors in the Northeast corner, and automatically limit their purchase consideration set to the products fortunate enough to be there that year. This is a big mistake. The real value in this content comes from the detailed review of the relative strengths and weaknesses of each vendor’s offerings.

Others get too caught up in the feature-function comparisons. Quality and applicability, not quantity, is what matters. Checking off the boxes may be great for sales demos, but many customers get too far over their skis and don’t end up using half the capabilities of the tools they buy.

Another mistake I see is when buyers try to use the research as a substitute for their own reference checks and a well-executed proof of concept to prove out the technology in context.

Keep in mind this research is intended for both buyers and sellers. The vendors themselves are important clients and participants. They provide insight into their product plans and access to their reference clients. They also end up providing much of the research funding. When vendors are reviewed positively, they often purchase the work for redistribution. See here for one example.

Some of the best insights in research pieces are around the overall market as opposed to the individual products and vendors. For example, in this year’s Magic Quadrant, Gartner took the very significant step of redefining their market domain for BIA technology. They completely removed a set of traditional data analysis and reporting tools that have been a major market presence for many years (e.g. Oracle). They limited the coverage to BIA “Platforms” that enable user-driven, full-cycle data preparation, integration, visualization, and discovery capability. This change makes a strong statement about the direction of the market and the shift of spending away from IT-driven initiatives to user-driven and user-funded programs, where IT is expected to emphasize data provisioning and governance over technology enablement.

The wise consumer of this type of research takes many factors into account when evaluating the products that are covered, including:

  • Vertical solutions – does the vendor have a strong record creating solutions specific to your industry?
  • Partnerships – does the vendor provide a complete solution or do they rely on partners?
  • Costs and pricing – although there is often useful information around pricing models, every sales cycle is unique with regard to the effective costs of purchase and eventual ownership.  Negotiation skills, reference potential, sales incentives etc. all play a significant role.
  • Customer experience – how important are sales experience, documentation, training, and support to you as a customer?
  • Methodology notes – pay close attention to these. They are usually comprehensive and important, particularly when they detail the breadth of the surveys and the discussion of which vendors were included and excluded.
Careful consideration of these details allows you to match the technology to your specific situation, often preventing expensive mistakes.

Sunday, January 3, 2016

My 2016 Business Intelligence & Analytics (BI&A) Wish List

It's that time of year where we bid goodbye to what we did not like about last year and look forward to our best hopes and wishes for this coming year. As I look back on the past year in BI&A, here is what I'd like to see in 2016:

1.       Some real standards the software vendors will respect
I'll admit I've wanted this for a long time, but I can still hope. We are now in the big-data-driven third generation of BI&A technology. The first was relational databases. Then, the industry settled on SQL as the standard query language, and that facilitated a whole industry with interoperable query tools, ETL tools, database platforms, and a generation of expert professionals. The second was multi-dimensional technology. In this case, we never even saw a standard for query languages aside from some weak attempts to extend SQL. Now 'NoSQL' is the emerging non-standard. Metadata standards? Dream on. The one credible attempt at it, the Common Warehouse Metamodel in the 90's, never really caught on despite support from several major vendors. Those same vendors eventually decided that proprietary solutions could become de facto standards if they acquired enough market share through buying out smaller competitors. So metadata standardization is achievable, but mostly with single-vendor solutions. This brings me to my second wish.

2.       The next consolidation wave
As is always the case during a technology consolidation, you get a wave of startups with new technology and established players trying to reposition older technology in a fight to win over the early adopters in the market and some love from the industry analysts. In time, the weaker players drop out of sight and the stronger ones get swallowed up by the big enterprise players looking to buy technology and market share. It will be a little different this time, as SaaS deployments will allow more of the upstarts to thrive on their own. Some may end up pushing out older players in the process. For me, though, there are just too many incomplete solutions out there right now, and the consolidation wave cannot come fast enough.

3.     Fewer Big Data wannabees
Another consequence of this technology shift is the emergence of a huge number of résumés on the market promising a depth of knowledge and experience in big data technology stacks that is more hype than substance, as becomes clear ten minutes into an interview. I’d rather work with seasoned BI&A professionals who have mastered the basics of software engineering, project management, requirements development, data governance, and testing, and who have demonstrated the ability to learn and adapt to new technology quickly.

4.        More agile BI&A teams and less Agile methodology zealotry
BI&A pros have known for decades that agility is mandatory in our work and waterfall methods do not work. Requirements are generally not completely known in advance and are only revealed through prototyping and iterative development. Value should be delivered on an ongoing basis. On the other hand, some of the main tenets of the now-revered Agile movement do not work all that well in the BI&A space. We do not develop structured applications as much as we strive to create environments where our users can create their own applications. This limits the development of workable user stories in advance. The evolution of our platforms over time and the need for stable development and support teams do not lend themselves well to the scrum concept as it is typically advocated. Yes, we need to be agile (small a) but not necessarily Agile (capital A).

Business Intelligence and Analytics has never been more recognized as vital to success in business, government, science, and education. Our tools and technology are better than ever. Those wishes have come true. Now, I wish for all who read this a happy and successful 2016.

Sunday, November 1, 2015

DIY Analytics – Is it right for you?

As I consider the all-too-long list of things that need fixing in my house and watch yet another spot from the big-box home improvement retailers, I realize why the Do It Yourself (DIY) trend has legs. You know your own house, and DIY gives you control over the schedule, quality, cost, and outcome of the work. Of course, you had better have the necessary expertise and be able to acquire the right materials and tools. If you don’t have that level of confidence, you outsource the work and hope you can keep an eye on things and get your money’s worth.

The large scale home improvement retailers succeed by making the materials, tools and expertise available and supplying that confidence. Many times, you start with something easy, it works out and you feel empowered. Other times, you take on a little too much, your work does not hold up and you end up calling in the pros and putting your tools away.

We often see the same story in business analytics. Finance, marketing, HR, and supply chain specialists are lured by tool vendors into thinking they are better off with DIY analytics and freeing themselves of the IT pros with their long schedules, hefty price tags and results that can be, shall we say, less than satisfying.

In this case, the folks that make money are the BI and Analytics (BIA) technology vendors who convince business organizations that DIY Analytics is the way to go and happily sell tools and some training. Then they move on to the next sales cycle. Sometimes it works out, but often these applications fall apart over time for lack of reliable raw materials (data), or because the solutions are not built to last and cannot be adequately supported by those who developed them. At this point, the IT pros are brought back in to fix things, and the tools end up on the shelf.

There is a better way. IT organizations have found success by taking a lesson from the home improvement retailers and supporting DIY analytics successfully. They do not insist on building all the reports, dashboards, and analytic applications themselves. Instead, for those customers that prefer the DIY model, they provide a set of shared tools and trusted data. Their customers then build and enhance the application to their own preferences and on their own schedules; while controlling costs by paying for much of their own labor.

There is a critical ingredient that is necessary to make this model work, though. The home improvement retailers enable their customers with expert advice, training videos, and communities of other DIYers sharing knowledge. IT shops can provide that same kind of support structure. It often comes in the form of a formal dedicated organization within IT. I have seen it called a BI Community of Practice, a Competency Center, or a Center of Excellence. Whatever the name, the mission is the same: create a resource that gives its customers the necessary technology resources to succeed, the confidence to do it themselves, and a flexible set of support models and services best left to the pros. These include everything from security, backups, capacity planning, performance tuning, professional training, and documentation to proactive knowledge sharing that helps the entire community use their resources efficiently and effectively. This creates an environment where the entire business wins with better service, better decisions, and better performance. Oh, and when the pipes leak, just call a plumber.

Monday, July 6, 2015

Cyber Security and Business Analytics: Imperfect Together

This week, I was reading an excellent piece here about the cyclical nature of the Business Intelligence/Analytics (BI) industry. The assertion is that priority tends to swing between periods of high business-driven enablement and IT-driven governance. The former tends to be brought on by advances in technology, and the latter by external events, regulation, and necessity. We are currently at the apex of an enablement cycle at the expense of governance. One casualty of lax governance is often cyber security.

Recently, we have seen a rash of high-profile data breaches. One of these was the large scale theft of data from the health insurer Anthem. This one was notable as it was the result of a vulnerable data warehouse where sensitive data was left unencrypted.

Those of us who practice BI and Data Warehousing professionally have a paradox to deal with. We have always been evaluated on our ability to make more data available to more users on more devices with the least effort to support business decisions. In the process, we tend to create ‘one-stop shopping’ and a slew of potential vulnerabilities for those who would access proprietary data with criminal intent.

The software vendors in our space have been all too complicit in this. After all, what sounds better to the business decision-makers they market to: “multi-factor authentication” or “dashboards across all your mobile devices”? “Advanced animated visualizations” or “intrusion detection”? “Data blending” or “end-to-end data encryption”?

How about “self-service business analytics” or “help yourself to our data”? Consider how easy we make it for users in an enterprise to export just the useful parts of a customer database, along with summaries of transaction history, to a USB stick and walk out the door with it.

This idea that BI and data warehousing requires more attention to security is starting to gain traction, however. A quick web search reveals that the academics are starting to study it and the leading established vendors in the space are starting to feature it in their marketing in ways I have not seen before. See the current headline on the MicroStrategy website for one example.

The main takeaway here is that BI and data warehousing practitioners need to consider cyber security in architectures and applications the same way it is done in transaction processing:
  • Get a complete BI vulnerability assessment from a cyber-security professional
  • Calculate the expected value of an incident (probability of an event times the cost to recover) and allocate budgets accordingly
  • Demand proven security technology from your vendors and integrators around features such as authentication, end-to-end encryption, and selective access controls by organizational role
  • Don’t be afraid of the cloud. The leading vendors of cloud services employ state-of-the-art security technology out of market necessity and are often the most cost-effective solution available.
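The expected-value calculation above is simple enough to sketch in a few lines. This is a minimal illustration only; the probability and cost figures below are hypothetical placeholders, not data from any actual assessment:

```python
def expected_incident_loss(probability: float, recovery_cost: float) -> float:
    """Expected value of an incident: probability of an event
    times the cost to recover, as described in the bullet above."""
    if not 0.0 <= probability <= 1.0:
        raise ValueError("probability must be between 0 and 1")
    return probability * recovery_cost

# Hypothetical example: a 5% annual breach probability against a
# $2,000,000 recovery cost suggests budgeting on the order of $100,000.
budget_guide = expected_incident_loss(0.05, 2_000_000)
print(f"${budget_guide:,.0f}")  # prints "$100,000"
```

In practice, the inputs are the hard part: the probability comes from the vulnerability assessment, and the recovery cost should include notification, remediation, and reputational expenses, not just technical cleanup.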

What’s old is new again - BI edition

Those of us with a long history as business intelligence (BI) practitioners have pretty clear memories of the days when an overhyped technology promised to change the game by freeing business organizations of IT tyranny with a new class of products that made self-service reporting and analytics better, faster, and cheaper. We saw this with the arrival of relational databases. Believe it or not, they were originally all about data access, not transaction processing. We saw it again when Online Analytical Processing (OLAP) arrived on top of Online Transaction Processing (OLTP). OLAP brought data access directly to our spreadsheets and PowerPoints, where we really wanted it. In both cases, business organizations bought this technology and built organizations to use it, thinking they could declare their independence from IT. It worked splendidly for a while. IT created extract files from their applications and celebrated getting out from under a backlog of reporting requests. Businesses felt empowered and responsive as they created reports, dashboards, and even derivative databases integrating internal and external data within their siloed subject areas.

Then reality set in.

All these new products required maintenance, documentation, training, version control, and general governance. “Shadow IT” organizations sprang up. They often became, in aggregate, far more expensive and just as cumbersome as what they replaced. Worse, the software vendors happily exploited this balkanization of larger organizations by selling redundant technology that had to be rationalized over time, leaving licenses unused and non-transferable. Wouldn’t it be nice to buy a slightly used BI software license at a deep discount?

The fatal flaw in this arrangement is the proliferation of overlapping and inconsistent data presentations that we call multiple versions of the truth. These create mistrust and cause executives to go with their guts in lieu of their data.

Each of these technology advances, along with even faster hardware evolution, did have the impact of making decision support and analytics far more powerful even as the open source movement made it more accessible. This, in turn, created competitive advantage for those who learned to exploit it and made a strong analytics capability mandatory in today’s commercial climate.

One problem still remains. As we like to say, you can buy technology, but you can’t buy your data. Today’s analytics require integrated and governed data across finance, operations and marketing, online and offline, internal and external.

That brings us to the current generation of revolutionary BI tools like the latest data visualization technology that is all the rage right now. (I won’t name names, but think “T” and “Q”.) Just like the previous BI waves, they exploit technology advances very effectively with features like in-memory architectures, wonderful animated graphics for storytelling and dashboards, and even data integration through “blending” and Hadoop access. These products have been hugely successful in the marketplace and are forcing the bigger established players to emulate and/or acquire them. The buyers and advocates are usually not IT organizations, but business units who want to be empowered now.

What does this mean for business decision makers? Just like the technology waves that preceded them, these new visualization tools do not address the organizational and process requirements of a highly functional and sustainable BI capability. Data and tools must be governed and architected together to create effective decision support. Otherwise, you end up with unsupported applications producing powerful, independent presentations of untrustworthy data.

We have seen this movie before and we know how it ends.

Mr. Robinson is currently a Business Intelligence and Analytics consultant with Booz Allen Hamilton. He has previously held practice and consulting leadership positions with Ernst & Young, Oracle, Cox Automotive ( and Home