Thursday, November 17, 2016

Agile is dead: Analytic Applications Edition

For those of you who are not involved in developing software applications: the Agile movement was, in essence, a revolt against the then-common practice of taking on large software projects in their entirety, following a methodology that stressed a strict sequential progression of stages, beginning with requirements gathering, followed by programming and testing, and culminating in a ‘big bang’ implementation. This came to be known as the ‘waterfall’ methodology. Among its many problems is the fact that requirements usually change over the course of these lengthy efforts, so what ends up being delivered often no longer adequately addresses the needs of the businesses that commissioned it.

Those of us who develop and manage analytic applications came to this conclusion decades ago out of necessity. One reason is that these applications support a truly unique business process: decision making. Unlike more conventional business processes such as booking airline reservations, decision making can take many paths as data is analyzed and new information is discovered. The idea that one can precisely define requirements in advance of implementation simply does not apply to all but the most structured decisions, such as automated, algorithm-based credit decisions. Even those require frequent updates as outcomes drive new learning and improved algorithms. Successful analytic application developers learned to use prototyping and relatively short, incremental development cycles to keep their products relevant and achieve customer satisfaction. In essence, we adopted agility long before the Agile development hype cycle began. I went into more detail on this in a previous post on the impact of Agile on Analytics.

Since then, there has been both a technology- and culture-driven boom in analytic application development as businesses of all sizes and maturities adopt a more data-driven culture. At the same time, the tenets of Agile methodology have been zealously embraced by IT executives who bought into the hype as they sought to deliver better applications sooner and cheaper. In fact, in some shops Agile certification became a requirement to work on any project. Collisions ensued as the realization set in that orthodox Agile methods were not developed with analytics in mind and often could not be applied successfully to Business Intelligence and Analytic application development projects.

BI/Analytic application project teams were put in a familiar and awkward position. They could either try to explain why a methodology designed for a different purpose does not apply, or create the illusion of compliance with Agile dogma, technology and terminology that added little or no value to their efforts. Meanwhile, vendors and consultants in the Analytics space were all too happy to ride the wave, coining the term “Agile Analytics” in an attempt to reconcile Agile mandates with proven methods in the BI/Analytics discipline.

It now appears that the software developer community at large is having qualms about the Agile software ‘revolution’ and what it became. Even the original thought leaders of the Agile movement have reservations about what has come of it. There has even been something of a developer revolt against it. Then there are the chronicles of the Agile hype cycle and some very thoughtful pieces on how to move forward from Agile.
These critiques of the Agile movement as it is currently practiced have several points in common:
  • Agile has passed the peak of its hype cycle and now benefits résumés and consultants more than software projects
  • Agile, as it is currently practiced, has become driven more by process than by objectives, which is exactly one of the faults it was designed to cure
  • Requirements definition (often in the form of vague ‘user stories’) has suffered to the point where it has degraded testing, compromised necessary documentation and lowered the overall quality of delivered products
  • Adoption of Agile in Name Only (AINO) practices where a waterfall mentality persists, development performance metrics remain a goal unto themselves, scrums become meetings, sprints become epics before defined value is actually delivered, and development teams remain as disconnected from the business as ever
  • Application architecture suffers as semi-independent project teams ignore standards and governance to meet their time and cost constraints. This one in particular is deadly in the long run
Aside from sounding very familiar, what does this mean to us in the Analytics community? It means we need to stress that what works for more traditional business process applications often does not apply to the unique nature of decision support and analytics applications. We define capabilities, not user stories. We lay down a sustainable data platform before we attempt to build applications on it. We prove concepts and prototype before we make major investments, we govern our data relentlessly to preserve credibility, and we develop our environments with an eye towards how they will be maintained and enhanced. Most importantly, we must remain focused on the results we can achieve while they are still needed and worry less about how we achieve them.
