Three reasons reporting and analytics projects fail and how to avoid the pitfalls

By Al Cordoba, Director of the SAS Center of Excellence at Truven Health Analytics

There are many reasons that reporting and analytics projects fail. Here are three frequent risk areas to consider before you dive into a reporting and analytics project, along with some actions that can help you avoid the potential pitfalls.

Data requirements are complex

Sometimes it's difficult to document the business requirements properly. It may be hard to understand the implications of how data elements relate, or whether all of the needed data elements have been captured. Some data elements are likely to be discovered missing after system construction has ended, which creates delays. The implementation team may also have a difficult time understanding project expectations, and when this happens there is a chance the resulting application will not meet all of the business needs. The more complex the business requirements, the longer it will take to analyze, design, construct and test the application.

What to do

To cope, the implementation team should use trained data architects to help collect the data and design the data structures; the architects have been down that road before. In addition, the implementation team can hold a formal design session to gather requirements from all stakeholders. This supports prototyping and iterative development and can help users discover requirements for the new application. The business sponsor and senior management should participate in this session and provide overall guidance. The final business requirements should be documented and approved in writing, and a change management procedure should be enforced from that point forward.

Quality of source data may not be good

Another common risk to reporting and analytics projects is the quality of the available data. It may be poor and difficult to use, which means unplanned work to move data from old operational systems to the new information system, generating cost overruns and implementation delays. Moving data from existing systems is usually done by getting extracts from the old system in simple formats such as comma-delimited files. This is a relatively simple method, and if the definition of the extract is clear, fewer problems will arise. However, even scrubbed data can still cause problems in a new system if adequate data validation is not in place. System migrations will also cause problems once the system is running, because all of the connected systems will have to be adjusted to account for the new mapping. There is a positive side to this situation, though: inaccuracies and inconsistencies in existing systems are often uncovered when the fields are remapped. Finally, consider data conversion problems, which can cause significant project delays.
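As an illustration of what that validation might look like, here is a minimal sketch in Python that checks a comma-delimited extract before it is loaded. The file name, column names and formats are hypothetical placeholders, not part of any real system or of the author's own process.

    # Minimal sketch: validate a comma-delimited extract before loading it.
    # The file name and column list are hypothetical examples, not a real layout.
    import csv
    from datetime import datetime

    REQUIRED_COLUMNS = ["member_id", "claim_date", "paid_amount"]

    def validate_extract(path):
        """Return a list of human-readable problems found in the extract."""
        problems = []
        with open(path, newline="") as f:
            reader = csv.DictReader(f)
            # 1. The extract must contain every column the new system expects.
            missing = [c for c in REQUIRED_COLUMNS if c not in (reader.fieldnames or [])]
            if missing:
                return ["missing columns: " + ", ".join(missing)]
            for line_no, row in enumerate(reader, start=2):
                # 2. Key fields must not be blank.
                if not row["member_id"].strip():
                    problems.append(f"line {line_no}: empty member_id")
                # 3. Dates and amounts must parse, or downstream reports will break.
                try:
                    datetime.strptime(row["claim_date"], "%Y-%m-%d")
                except ValueError:
                    problems.append(f"line {line_no}: bad claim_date {row['claim_date']!r}")
                try:
                    float(row["paid_amount"])
                except ValueError:
                    problems.append(f"line {line_no}: bad paid_amount {row['paid_amount']!r}")
        return problems

    if __name__ == "__main__":
        for problem in validate_extract("claims_extract.csv"):
            print(problem)

A simple check like this is cheap to run on every new extract and surfaces problems before they ever reach the new system.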

What to do

First, make sure that all of the old data elements are correctly mapped to the new system; a business analyst who is familiar with the operational systems can help. Also, rigorously test the file extracts before loading the current data. Sometimes it's possible to gather data from existing extracts or downstream data stores, which expedites the work and eliminates the need to access the operational systems directly. But it's important to determine whether the cost and trouble associated with the existing data are worth it; it might be better to start with a new data extract built to serve the needed metrics, reports or dashboards. Data may have to be manually cleansed, though this should be avoided as much as possible because it increases system maintenance.
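To make the mapping and extract testing concrete, here is a minimal Python sketch that compares an extract's header against a source-to-target field mapping; the mapping, field names and file name are hypothetical examples, not a recommended standard.

    # Minimal sketch: confirm a hypothetical source-to-target field mapping
    # covers the extract before any data is loaded into the new system.
    import csv

    # Hypothetical mapping from old-system field names to new-system field names.
    FIELD_MAPPING = {
        "MBR_ID": "member_id",
        "SVC_DT": "service_date",
        "PAID_AMT": "paid_amount",
    }

    def check_mapping(extract_path):
        """Report extract columns with no mapping, and mapped columns missing from the extract."""
        with open(extract_path, newline="") as f:
            header = next(csv.reader(f))
        unmapped = [col for col in header if col not in FIELD_MAPPING]
        missing = [col for col in FIELD_MAPPING if col not in header]
        return unmapped, missing

    unmapped, missing = check_mapping("legacy_extract.csv")
    if unmapped:
        print("Extract columns with no target field:", unmapped)
    if missing:
        print("Mapped source columns missing from the extract:", missing)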

Proposed software may be inadequate

A third common pitfall for a reporting and analytics project is the selection of inadequate software. If the technologies used to build the system are unfamiliar, a learning curve may lower initial productivity. If the technologies are inadequate, the project could face integration problems between old and new systems. In some organizations there may be resistance to certain technologies, which can cause delays. If the software is not readily available, how can it be tested? The proposed software may also not be installed or configured correctly, which will lead to further delays. And if the implementation team is asked to use new, unfamiliar software, delivery times can stretch and performance may suffer while the team gains expertise in optimizing and configuring the technology.

What to do

To cope, make sure there is an adequate test environment where the proposed software can be exercised easily. The consulting team should already be trained, or training should be provided as early as possible; the implementation plan should ensure that training is available to everyone who needs to install, use or support the new technology. The software should be easy to use and backed by vendor technical specialists, and ideally supported by a good ecosystem of outside consultants who are familiar with it. Before committing to a particular software platform, make sure a solid analysis of the new technology's functions, features and capabilities has been completed and that those capabilities map to the needs of your initiative. Finally, develop standard operating procedures for how the software will be used.

Start small

As you can see, there are several risks to consider when undertaking a reporting and analytics project. For that reason, think big but start small to lower your risk of falling into these pitfalls.


Al Cordoba has more than 25 years of experience designing, planning, implementing and consulting on analytics projects. He worked at SAS for 13 years and has also worked for SPSS, Lockheed, Steptoe & Johnson, Chevy Chase Bank, Blue Cross Blue Shield Association and Qualex. He has consulted for many firms worldwide and is the author of the book Understanding the Predictive Analytics Lifecycle.

Read More

  • As Al Cordoba points out, good data quality is imperative for successful reporting and analytics projects. This TDWI e-book examines how organizations are addressing their most pressing data quality issues.
  • Download the first chapter of Al Cordoba's book, Understanding the Predictive Analytics Lifecycle.
