Data quality: The Achilles' heel of risk management

SAS Risk Research and Quantitative Solutions

Without a sound data quality process in place to help you manage and govern big data, prepare yourself to address more than just operational challenges, especially those related to risk. And if the data is no good, you won’t be able to use the information to make sound business decisions – no matter how many talented analysts you throw at it.

Improvements in data quality are driven by a variety of factors:

  • Evolving regulatory requirements such as Basel III.
  • The need to increase profits, reduce costs and generate new business.
  • Growing management demand for faster, more accurate financial data they can trust.
  • New developments in technology focused on improving data management.

Five best practices for improving data quality

  1. Align data with risk management and regulatory requirements.
    High-performance data quality management and optimized data warehousing processes are what make standardized internal and external risk reporting possible. Given the interdependencies, it’s critical that your organization move toward the goal of creating a single source of the truth for reporting, risk control and treasury activities. This may require investing in data cleansing prior to integrating operational data.
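
    As a minimal illustration of that cleansing step, the Python sketch below normalizes and deduplicates two hypothetical operational extracts before integrating them; every table name, column and value is invented for the example.

```python
import pandas as pd

# Hypothetical operational extracts from two source systems; the
# table names, columns and values are invented for this example.
loans_a = pd.DataFrame({
    "counterparty_id": ["C001", "C002", "C002"],
    "currency": ["usd", "USD ", "USD"],
    "exposure": [1_000_000, 250_000, 250_000],
})
loans_b = pd.DataFrame({
    "counterparty_id": ["C003", None],
    "currency": ["EUR", "EUR"],
    "exposure": [500_000, 75_000],
})

def cleanse(frame: pd.DataFrame) -> pd.DataFrame:
    """Basic cleansing before integration: normalize codes, drop
    records missing the business key, and remove duplicates."""
    out = frame.copy()
    out["currency"] = out["currency"].str.strip().str.upper()
    out = out.dropna(subset=["counterparty_id"])
    return out.drop_duplicates()

# Integrate only after each feed has passed the same cleansing step,
# so the combined table can serve as a single source of the truth.
integrated = pd.concat([cleanse(loans_a), cleanse(loans_b)], ignore_index=True)
print(integrated)
```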

  2. Make data quality in risk management transparent.
    Missing, incomplete and inconsistent data can cause massive problems for financial institutions, especially when it comes to risk control and decision making. Banks depend on up-to-date, consistent data. But when you have several data silos – each with its own connection to the core banking system – it’s difficult to establish a complete, accurate and uniform view of the data. What’s needed is a way to adapt warehoused data quickly and efficiently to meet quality standards. This can only be achieved using industrialized, standardized and predefined business rules based on regulatory requirements.
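
    A hedged sketch of what such predefined business rules could look like, using pandas; the rule names and fields are assumptions made for illustration, not an actual regulatory rule set.

```python
import pandas as pd

# Illustrative warehouse extract; all fields are hypothetical.
exposures = pd.DataFrame({
    "trade_id": ["T1", "T2", "T3"],
    "rating": ["AA", None, "BBB"],
    "notional": [1_000_000, -5_000, 200_000],
})

# Predefined, shareable business rules. In practice these would be
# derived from regulatory requirements, not hard-coded like this.
RULES = {
    "rating_present": lambda df: df["rating"].notna(),
    "notional_positive": lambda df: df["notional"] > 0,
}

def assess(df: pd.DataFrame) -> pd.DataFrame:
    """Apply every rule and report failures per rule, making data
    quality visible instead of hidden inside individual silo loads."""
    results = []
    for name, check in RULES.items():
        passed = check(df)
        results.append({
            "rule": name,
            "failures": int((~passed).sum()),
            "failed_ids": df.loc[~passed, "trade_id"].tolist(),
        })
    return pd.DataFrame(results)

print(assess(exposures))
```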

  3. Create business rules for sustainable data quality improvement.
    Continuous monitoring of data is essential – but banks can see even faster improvements by moving to a real-time approach that incorporates a predefined set of business rules – created, shared and adapted to suit the needs of different departments or data sources.

    A risk data mart – a quality-assured, standardized subset of the data warehouse – provides a uniform basis for master data management, reporting and risk control. Before building one, you need to create a glossary of predefined, relevant terms, data sources and responsibilities for the respective data sources. This basic glossary serves as an initial inventory of all available data sources and makes it easy to identify the ones relevant to risk management. In addition, business rules must be defined, developed and maintained to ensure continuous improvement in data quality.
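
    One minimal way to represent such a glossary in code, assuming invented source names, owners and rule identifiers:

```python
from dataclasses import dataclass, field

@dataclass
class DataSourceEntry:
    """One glossary entry: a data source, its responsible owner and
    the business rules that apply to it. All fields are illustrative."""
    name: str
    description: str
    owner: str                       # responsible department or role
    relevant_to_risk: bool
    rules: list[str] = field(default_factory=list)

# Hypothetical initial inventory of all available data sources.
GLOSSARY = [
    DataSourceEntry("core_banking_loans", "Loan master data",
                    "Credit Operations", True,
                    ["rating_present", "notional_positive"]),
    DataSourceEntry("crm_contacts", "Marketing contact list",
                    "Marketing", False),
]

# Identifying the sources relevant to risk management becomes a filter.
risk_sources = [entry.name for entry in GLOSSARY if entry.relevant_to_risk]
print(risk_sources)  # ['core_banking_loans']
```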

  4. Establish continuous monitoring to measure success.
    Updating data quality processes can significantly reduce costs (such as in the areas of reconciliation and operational error remediation) as well as improve the accuracy of regulatory reporting.

    But to realize these benefits, data quality assessments – at both the system and department level – should be continuous. And the results of these assessments should be presented to stakeholders regularly via dashboards. These dashboards should make it easy for stakeholders to see whether data quality levels are falling, drill down to pinpoint root causes, run retrospective analyses and forecast future results.
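
    A small sketch of the monitoring logic that could sit behind such a dashboard; the scores, threshold and trend window below are illustrative assumptions.

```python
import pandas as pd

# Hypothetical monthly data quality scores (0-100) per system,
# as they might feed a stakeholder dashboard.
scores = pd.DataFrame({
    "month": pd.to_datetime(["2024-01-01", "2024-02-01", "2024-03-01"] * 2),
    "system": ["core_banking"] * 3 + ["trading"] * 3,
    "dq_score": [97.0, 96.5, 96.8, 94.0, 91.5, 88.0],
})

THRESHOLD = 95.0      # illustrative minimum acceptable score
TREND_WINDOW = 2      # consecutive declines that trigger review

def flag_systems(df: pd.DataFrame) -> list[str]:
    """Flag systems whose latest score is below threshold or that
    show a sustained decline, prompting root-cause drill-down."""
    flagged = []
    for system, grp in df.sort_values("month").groupby("system"):
        latest = grp["dq_score"].iloc[-1]
        declining = (grp["dq_score"].diff() < 0).tail(TREND_WINDOW).all()
        if latest < THRESHOLD or declining:
            flagged.append(system)
    return flagged

print(flag_systems(scores))  # ['trading']
```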

  5. Implement end-to-end analysis of risk processes.
    By performing continuous, end-to-end analysis of your risk processes, you can identify issues earlier – when they are less costly to fix. In many cases, such analysis reveals that while a bank may have data entry rules in place for front-office systems, those systems vary greatly by vendor and age, creating a patchwork of data feed formats and content. To improve data quality, you need to apply business rules to the initial data entry process for each system – not just as data moves into the risk data mart. This also eliminates the need to completely redevelop an in-house approach to data quality management.
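
    A minimal sketch of that idea: per-system adapters map heterogeneous feed formats to one canonical record, and the same entry rules run before anything reaches the risk data mart. The feed formats, field names and rules are hypothetical.

```python
# Each front-office system delivers a different feed format. A thin
# per-system adapter maps it to one canonical record, and the same
# entry rules run before anything reaches the risk data mart.
# All field names and rules here are hypothetical.

CANONICAL_FIELDS = {"trade_id", "counterparty", "notional"}

def from_system_a(raw: dict) -> dict:
    return {"trade_id": raw["id"], "counterparty": raw["cpty"],
            "notional": float(raw["amt"])}

def from_system_b(raw: dict) -> dict:
    return {"trade_id": raw["TradeRef"], "counterparty": raw["Party"],
            "notional": float(raw["Notional"])}

def validate_at_entry(record: dict) -> list[str]:
    """Shared rules applied at initial data entry for each system,
    not only later when data moves into the risk data mart."""
    errors = []
    if set(record) != CANONICAL_FIELDS:
        errors.append("schema mismatch")
    if not record.get("trade_id"):
        errors.append("missing trade_id")
    if record.get("notional", 0) <= 0:
        errors.append("non-positive notional")
    return errors

feed_a = {"id": "T1", "cpty": "ACME", "amt": "1000000"}
feed_b = {"TradeRef": "T2", "Party": "Globex", "Notional": "-1"}

for adapter, raw in [(from_system_a, feed_a), (from_system_b, feed_b)]:
    record = adapter(raw)
    print(record["trade_id"], validate_at_entry(record) or "ok")
```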

Given the tightly regulated environment banks face today, the importance of data quality cannot be overstated. Beyond the obvious benefits of staying one step ahead of regulatory mandates, having accurate, integrated and transparent data will drive confident, proactive decisions to support a solid risk management foundation.


