Data quality: The Achilles' heel of risk management

SAS Risk Research and Quantitative Solutions

Do you have a sound data quality process in place to help you manage and govern big data? If not, prepare to face more than operational challenges – especially those related to risk. If your data is bad (incomplete, corrupt, outdated and so on), you won’t be able to use it to make sound business decisions – and it won’t matter how many talented analysts you throw at it.

A variety of factors highlight the importance of improving data quality:

  • Evolving regulatory requirements such as Basel III/IV and IFRS 9.
  • The need to increase profits, reduce costs and generate new business.
  • Growing demands for faster, more accurate financial data you can trust to run your business.

Along with keeping an eye on new technology developments that improve data management, you can take several concrete steps to improve the quality of your data.



Five best practices for improving data quality

1. Align data with risk management and regulatory requirements

High-performance data quality management and optimized data warehousing processes are what make standardized internal and external risk reporting possible. Given the interdependencies, it’s critical that your organization move toward a single source of truth for reporting, risk control and treasury activities. This may require investing in data cleansing before integrating operational data.
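As a minimal illustration of the kind of cleansing this can involve, the Python sketch below standardizes and deduplicates counterparty records from two feeds before they are merged into a shared store. The field names (counterparty_id, legal_name, country) and the rules themselves are illustrative assumptions, not a prescription for any particular system.

```python
import pandas as pd

# Hypothetical operational extracts from two source systems;
# column names and values are illustrative assumptions.
loans = pd.DataFrame({
    "counterparty_id": ["C-001", "c-002 ", "C-003"],
    "legal_name": ["ACME Corp.", "Beta Bank ", "Gamma Insurance"],
    "country": ["US", "DE", "fr"],
})
deposits = pd.DataFrame({
    "counterparty_id": ["C-001", "C-004"],
    "legal_name": ["Acme Corp.", "Delta Fund"],
    "country": ["US", "GB"],
})

def cleanse(df: pd.DataFrame) -> pd.DataFrame:
    """Apply simple standardization rules before integration."""
    out = df.copy()
    out["counterparty_id"] = out["counterparty_id"].str.strip().str.upper()
    out["legal_name"] = out["legal_name"].str.strip().str.title()
    out["country"] = out["country"].str.strip().str.upper()
    return out

# Cleanse each feed, then integrate and drop duplicate counterparties
# so the combined table can serve as a single source of truth.
combined = (
    pd.concat([cleanse(loans), cleanse(deposits)], ignore_index=True)
    .drop_duplicates(subset=["counterparty_id"], keep="first")
)
print(combined)
```

Cleansing each feed before the concatenation, rather than after, keeps the standardization rules reusable per source and makes the integrated table trustworthy by construction.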

2. Make the quality of risk data transparent

Missing, incomplete and inconsistent data can cause massive problems for financial institutions, especially when it comes to risk control and decision making. Banks depend on up-to-date, consistent data. But when you have several data silos – each with its own connection to the core banking system – it’s difficult to establish a complete, accurate and uniform view of the data. What’s needed is a way to adapt warehoused data quickly and efficiently to meet quality standards. This can only be achieved with industrialized, standardized and predefined business rules based on regulatory requirements.
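One small way to make cross-silo inconsistency visible is to reconcile the figures two systems report for the same counterparty. The Python sketch below does exactly that; the silo names, fields and tolerance are assumptions for illustration.

```python
# Hypothetical exposure figures for the same counterparties as reported
# by two silos with separate connections to the core banking system.
risk_silo = {"C-001": 1_000_000.0, "C-002": 250_000.0, "C-003": 75_000.0}
finance_silo = {"C-001": 1_000_000.0, "C-002": 260_000.0}

TOLERANCE = 0.01  # assumed: flag relative differences above 1%

def reconcile(a: dict, b: dict) -> list[str]:
    """Report counterparties whose figures disagree or are missing."""
    findings = []
    for cpty in sorted(set(a) | set(b)):
        if cpty not in a or cpty not in b:
            findings.append(f"{cpty}: present in only one silo")
        elif abs(a[cpty] - b[cpty]) / max(abs(a[cpty]), 1e-9) > TOLERANCE:
            findings.append(f"{cpty}: mismatch {a[cpty]:,.0f} vs {b[cpty]:,.0f}")
    return findings

for finding in reconcile(risk_silo, finance_silo):
    print(finding)
# C-002: mismatch 250,000 vs 260,000
# C-003: present in only one silo
```

Publishing findings like these, rather than silently patching the numbers, is what makes the quality of the data transparent to its owners.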

3. Create business rules for sustainable data quality improvement

Continuous monitoring of data is essential – but banks can see even faster improvements by moving to a real-time approach that incorporates a predefined set of business rules – created, shared and adapted to suit the needs of different departments or data sources.

A risk data mart (a quality-assured, standardized data warehouse) provides a uniform basis for master data management, reporting and risk control. Before building one, you need to create a glossary of predefined, relevant terms, data sources and responsibilities for the respective data sources. This glossary serves as an initial inventory of all available data sources and makes it easy to identify the ones relevant to risk management. In addition, business rules must be defined, developed and maintained to ensure continuous improvement in data quality.
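One lightweight way to keep such rules shareable across departments is to define them as data rather than burying them in ETL code. The Python sketch below pairs a small glossary of data sources with a reusable rule set; the glossary entries, rule names and fields are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

# A tiny "glossary": each data source, its owner and the fields it feeds.
# Entries are illustrative assumptions, not a real inventory.
GLOSSARY = {
    "core_banking": {"owner": "Treasury Ops", "fields": ["exposure", "rating"]},
    "trading_system": {"owner": "Front Office", "fields": ["exposure"]},
}

@dataclass(frozen=True)
class BusinessRule:
    name: str
    field: str
    check: Callable[[object], bool]  # returns True if the value passes

# Predefined, shareable rules; departments can extend or adapt this list.
RULES = [
    BusinessRule("exposure_non_negative", "exposure",
                 lambda v: v is not None and v >= 0),
    BusinessRule("rating_in_scale", "rating",
                 lambda v: v in {"AAA", "AA", "A", "BBB", "BB", "B", "CCC"}),
]

def apply_rules(record: dict) -> list[str]:
    """Return the names of all rules the record violates."""
    return [r.name for r in RULES
            if r.field in record and not r.check(record[r.field])]

record = {"exposure": -100.0, "rating": "Z"}
print(apply_rules(record))  # ['exposure_non_negative', 'rating_in_scale']
```

Because the rules are plain objects, adding a department-specific check means appending to a list, not rewriting a pipeline – which is what makes continuous improvement sustainable.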

4. Establish continuous monitoring to measure success

Updating data quality processes can significantly reduce costs (for example, in reconciliation and operational error remediation) and improve the accuracy of regulatory reporting. But to realize these benefits, data quality assessments – at both the system and department level – should be continuous. And the results of these assessments should be presented to stakeholders regularly via dashboards. These dashboards (one underlying metric is sketched below the list) should make it easy for stakeholders to:

  • Understand if data quality levels are falling.
  • Drill down to pinpoint root causes.
  • Run retrospective analyses.
  • Forecast future results.
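The kind of metric such a dashboard tracks can be computed quite simply. The Python sketch below measures the daily completeness of a required field and flags deterioration – the raw signal behind the "is quality falling?" question. The feed, field names and threshold are assumptions.

```python
import pandas as pd

# Hypothetical daily feed: each row is a trade record; 'rating' is required.
feed = pd.DataFrame({
    "as_of_date": ["2024-06-03"] * 4 + ["2024-06-04"] * 4,
    "rating": ["A", "BBB", None, "AA", None, None, "A", None],
})

# Completeness per day: the share of records with a populated rating.
completeness = (
    feed.assign(populated=feed["rating"].notna())
        .groupby("as_of_date")["populated"]
        .mean()
)

THRESHOLD = 0.90  # assumed service level; tune per data source

for day, score in completeness.items():
    status = "OK" if score >= THRESHOLD else "ALERT: quality falling"
    print(f"{day}: completeness {score:.0%} -> {status}")
```

Storing these daily scores gives the dashboard its time series: drill-downs group by source system instead of date, retrospective analyses replay the history, and forecasts extrapolate the trend.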

5. Implement end-to-end analysis of risk processes

By performing continual, end-to-end analysis of your risk processes, you can identify issues earlier – when they are less costly to fix. In many cases, such analysis reveals that while a bank may have data entry rules in place for its front-office systems, those systems vary greatly by vendor and age, creating a patchwork of data feed formats and content. To improve data quality, you need to apply business rules to the initial data entry process for each system – not just as data moves into the risk data mart. Doing so also eliminates the need to completely redevelop an in-house approach to data quality management.
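To make the idea concrete, the Python sketch below shows one common pattern: each front-office feed gets a thin adapter that maps its native format to a canonical record, and the same validation runs at the point of entry for every system rather than only at the risk data mart. The system names, field mappings and checks are invented for illustration.

```python
# Each source system delivers records in its own format; an adapter
# normalizes them, and one shared check runs at the point of entry.

def validate(record: dict) -> list[str]:
    """Shared entry-point checks applied to every canonical record."""
    errors = []
    if not record.get("trade_id"):
        errors.append("missing trade_id")
    if record.get("notional", 0) <= 0:
        errors.append("non-positive notional")
    return errors

# Adapters: map each system's native fields to the canonical schema.
def from_legacy_loans(raw: dict) -> dict:
    return {"trade_id": raw.get("LOAN_NO"), "notional": raw.get("AMT", 0)}

def from_new_trading(raw: dict) -> dict:
    return {"trade_id": raw.get("id"), "notional": raw.get("notional", 0)}

ADAPTERS = {"legacy_loans": from_legacy_loans, "new_trading": from_new_trading}

def ingest(system: str, raw: dict) -> dict:
    """Normalize and validate at entry; reject bad records immediately."""
    record = ADAPTERS[system](raw)
    errors = validate(record)
    if errors:
        raise ValueError(f"{system}: {', '.join(errors)}")
    return record

print(ingest("new_trading", {"id": "T-42", "notional": 1_000_000}))
```

The adapters absorb the vendor-by-vendor patchwork, while the single validate function means a new rule takes effect at the front door of every system at once.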

Data quality supports strong risk management

Given the tightly regulated environment banks face today, the importance of data quality cannot be overstated. Beyond the obvious benefits of staying one step ahead of regulatory mandates, having accurate, integrated and transparent data drives confident, proactive decisions and supports a solid risk management foundation.