August 17th, 2013 by Chance Coble

According to Gartner, fewer than 1/3 of BI projects successfully meet their objectives. Below is a list of five behaviors that drive those failures, along with some helpful hints on how to avoid them.

Lack of Focus on Verification

Delivery in an analytics or business intelligence solution can be tricky. Often the output of these systems is considered done when the reports or dashboards are developed. But so much more goes into it, especially where errors are concerned. In these projects there is even less excuse for errors than in ordinary software development. Testing of the data and results should be built into the project from the start. Of course, testing these solutions is a challenge for a variety of reasons, but here I am focusing on correctness and quality of the data. In our experience, bad data getting into an analytics solution can require 2x-10x as much time to fix as was planned for the original project. Catching those data problems early can be a tremendous budget saver, and sometimes a project saver.

There are a few ways to approach this. Thanks to the analytics market’s growth, tools like DbFit and SQLUnit, along with integration testing tools like Fit, have brought some automation to testing the transformations and preparations that need to happen. Each of these tools takes a bit of upfront investment before you can start creating automation. However, the cost of propagating bad information into your analytics system is tremendous, and it grows as time passes.
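
Whatever the tool, the checks themselves are simple. Below is a minimal, hypothetical sketch using Python’s unittest module (standing in for DbFit or SQLUnit; the database, table names, and thresholds are invented for illustration) that validates a staging table after each load, before bad rows can propagate into reports:

```python
import sqlite3
import unittest

class StagingDataQualityTests(unittest.TestCase):
    """Hypothetical data-quality checks run after each ETL load,
    before results are promoted to the reporting layer."""

    @classmethod
    def setUpClass(cls):
        # In a real project this would point at the staging database.
        cls.db = sqlite3.connect("staging.db")

    def test_no_null_customer_ids(self):
        # Rows without a customer key can never be joined correctly.
        (nulls,) = self.db.execute(
            "SELECT COUNT(*) FROM orders WHERE customer_id IS NULL"
        ).fetchone()
        self.assertEqual(nulls, 0)

    def test_revenue_is_non_negative(self):
        # Negative revenue usually signals a broken transformation.
        (bad,) = self.db.execute(
            "SELECT COUNT(*) FROM orders WHERE revenue < 0"
        ).fetchone()
        self.assertEqual(bad, 0)

    def test_row_count_within_expected_range(self):
        # A sudden drop usually means an upstream feed silently failed.
        (rows,) = self.db.execute("SELECT COUNT(*) FROM orders").fetchone()
        self.assertGreater(rows, 1000)

if __name__ == "__main__":
    unittest.main()
```

Wired into every load, checks like these surface bad data at the point of entry, where it is cheapest to fix.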

Focusing on data instead of questions

Over-defining the reports in requirements is a frequent error in planning an analytics project. Tools are available today that make reporting simple to organize (even if just for testing purposes), and often report development can be self-service, which is ideal because it keeps the feedback loop very tight.

When you start an analytics project, attempt to write down all of the questions you are interested in answering. Be bold. They don’t have to be questions that you can immediately answer. Try to make the questions specific, but don’t define the report or anything about it in the question. A good way to structure these questions is the same way user stories are structured: for example, “As a regional sales manager, I want to know which accounts have declining orders so that I can prioritize outreach.” In the process you will scope your project more effectively toward answering those questions, and you will come to see your project as an opportunity to empower users to make evidence-based decisions.

[Image caption: This baby will be meeting your business objectives any moment now.]

Don’t start with the data available to you, or by reviewing your database to see what people might want access to in the system. That approach completely excludes opportunities for data enrichment from outside sources. Some of the most successful analytics projects in the world eventually reached out to partners for data, or used openly available data, to gain a tremendous leap in value. That kind of opportunity never presents itself if you start from the data and work outward to a report.

Intolerance to Change

Two of the key properties I look for in an analytics project are elasticity and plasticity. Plasticity is the ability to change direction in a project, and it is critical to being able to pivot without waste. I recommend a merciless evaluation of your infrastructure to build in as much plasticity as possible. Plasticity enables evolution: your team’s ability to make a change when that change would add value. What would happen if you found a better data solution? What if it was NoSQL instead of relational? Could you bring in another reporting tool? These are the kinds of questions that are valuable early in a project, so that the assumption from day one is that this system will change, and often.
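
One concrete way to buy plasticity is to keep a thin seam between your analytics logic and any one storage technology. The sketch below is a minimal, hypothetical Python example (class, table, and field names are invented): the report depends only on a small interface, so a relational store could be swapped for a NoSQL one without rewriting the reports.

```python
from abc import ABC, abstractmethod

class MetricStore(ABC):
    """A small seam between reports and storage: swapping backends
    means writing a new implementation, not rewriting every report."""

    @abstractmethod
    def monthly_revenue(self, year: int) -> dict:
        ...

class SqlMetricStore(MetricStore):
    """Relational implementation (e.g. over a sqlite3 connection)."""
    def __init__(self, conn):
        self.conn = conn

    def monthly_revenue(self, year):
        rows = self.conn.execute(
            "SELECT month, SUM(revenue) FROM orders "
            "WHERE year = ? GROUP BY month", (year,))
        return dict(rows.fetchall())

class DocumentMetricStore(MetricStore):
    """Stand-in for a NoSQL backend: same contract, different engine."""
    def __init__(self, documents):
        self.documents = documents  # e.g. dicts fetched from a document DB

    def monthly_revenue(self, year):
        totals = {}
        for doc in self.documents:
            if doc["year"] == year:
                totals[doc["month"]] = totals.get(doc["month"], 0.0) + doc["revenue"]
        return totals

def revenue_report(store: MetricStore, year: int) -> str:
    # Report code never mentions SQL or documents, only the interface.
    return "\n".join(f"{month:02d}: {total:,.2f}"
                     for month, total in sorted(store.monthly_revenue(year).items()))
```

The seam is deliberately small: when a better data solution appears, the cost of change is one new implementation rather than a rewrite.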

Elasticity is the ability to scale something up and then scale it back down, with costs and infrastructure shrinking when it scales back down. This is critical because of the testing it enables without a permanent investment, and because of the risk reduction it offers while a project’s outcome is still uncertain. Amazon’s EC2 is an obvious example of a system that does this well. The key is that you don’t make a permanent investment in your solution until you know it has permanent value (or something approaching it).
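
To make that concrete, here is a minimal sketch using the boto3 AWS SDK: a test environment exists only for the duration of one experiment. The AMI ID, instance type, and load-test function are placeholder assumptions, not a recommendation.

```python
import boto3

def run_load_test(instance_id: str) -> None:
    """Placeholder for whatever experiment the instance exists to run."""
    print(f"running load test against {instance_id}")

ec2 = boto3.client("ec2", region_name="us-east-1")

# Scale up: a throwaway instance for the duration of one experiment.
result = ec2.run_instances(
    ImageId="ami-00000000",   # placeholder AMI id
    InstanceType="m5.large",  # placeholder instance type
    MinCount=1,
    MaxCount=1,
)
instance_id = result["Instances"][0]["InstanceId"]

try:
    run_load_test(instance_id)
finally:
    # Scale back down: the cost stops when the experiment stops.
    ec2.terminate_instances(InstanceIds=[instance_id])
```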

Finally, virtualization of your platform (e.g. through a provider like Amazon or Rackspace) and virtualization of your data are both great ways to handle change. Bringing in the staffing to support a large infrastructure is a permanent investment. Being able to offload that responsibility to a partner, even temporarily, is a tremendous advantage in flexibility. Trying to reduce costs by making permanent investments is a strategy that ignores the high level of uncertainty involved in deploying an analytics project.

Unsustainable Pace

Teams that work beyond their capacity for more than short bursts lose efficiency. It is tempting, but destructive. Working hard is something we often pat ourselves on the back for, and reward our teams for, and it is a great way to build your solution. However, the pathology arises when you can’t get your job done without pushing past your capacity for productivity. At that point, increasing the time you put into the project will actually make the problem worse.

This happens especially in analytics projects because risk and uncertainty are so high. At some point a problem arises and the team decides to put in extra time. Either that extra time resolves the issue, or, more often, it reveals the problem is more widespread than first realized. If the same decision is made again to “catch up”, the ingredients for a burnt-out team and a failed analytics project have been planted. That trend only needs to continue for a short time before the company accommodates the constant sense of urgency that comes with always trying to catch up. That urgency leads to tactical thinking, and to choosing options that save time now rather than invest in your solution and your business.

The only cure is a dose of reality and communication. Changing back to a solid approach based on strategically investing your time is hard, and I have rarely seen a company accomplish it. It requires the support of the company’s leadership, which all too often is exacerbating the problem out of inexperience. Only when a successful return to solid progress and a sustainable pace is achieved will teams see how unnecessary and unhelpful working beyond capacity really is.

Make better the enemy of done

The only way I have seen large analytics projects succeed efficiently is to insist on delivering what is done. That means breaking the work down and delivering it incrementally. This can be a real challenge, because agile practice in analytics and business intelligence is unfortunately still immature. Understanding how to version control a database and its data, so that you can move back and forth between points in time, is confusing. Managing changing ETL, databases, data, business needs, and application interfaces comes with daunting complexity, and it is not supported by the robust toolset that change management in traditional software projects enjoys.
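
Even without robust tooling, a simple migration convention captures the core of database change management: schema changes live as ordered SQL scripts under version control, and each database records which scripts it has already applied. Below is a minimal, hypothetical Python sketch (the file layout and table names are invented for illustration):

```python
import pathlib
import sqlite3

def migrate(db_path: str, migrations_dir: str) -> None:
    """Apply any migration scripts (001_create_orders.sql, 002_..., etc.)
    this database has not yet seen, in order, recording each one."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_version (name TEXT PRIMARY KEY)")
    applied = {name for (name,) in conn.execute("SELECT name FROM schema_version")}

    for script in sorted(pathlib.Path(migrations_dir).glob("*.sql")):
        if script.name in applied:
            continue  # already applied in this environment
        conn.executescript(script.read_text())
        conn.execute("INSERT INTO schema_version (name) VALUES (?)",
                     (script.name,))
        conn.commit()

if __name__ == "__main__":
    migrate("analytics.db", "migrations")
```

The same discipline extends to ETL code and reference data: if every change is a script in version control, any environment can be rebuilt or rolled forward to a known point in time.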

To the best of the team’s ability, though, incorporating evolutionary design into the analytics project is a critical success factor. As the team’s understanding of the business questions, and of how to answer them, develops, change will occur. Key opportunities will be missed if the team can’t efficiently incorporate those new ideas into the system. Ken Collier has written some of the most comprehensive work on agile analytics, and it is recommended reading for teams that may be new to agile or to its incorporation into a lean analytics deployment.